Search results for: hybrid forecasting models
7563 The Role of Information Technology in Supply Chain Management
Authors: V. Jagadeesh, K. Venkata Subbaiah, P. Govinda Rao
Abstract:
This paper explains the significance of information technology tools and software packages in supply chain management (SCM) for managing the entire supply chain. Material, financial, and information flows are managed effectively and efficiently with the aid of information technology tools and packages in order to deliver the right quantity and quality of goods at the right time, using the right methods and technology. Information technology plays a vital role in streamlining sales forecasting, demand planning, inventory control, and transportation in supply networks, and finally in production planning and scheduling. It achieves these objectives by streamlining business processes and integrating them within the enterprise and its extended enterprise. SCM starts with the customer and involves a sequence of activities spanning customer, retailer, distributor, manufacturer, and supplier within the supply chain framework. It is the process of integrating demand planning, supply network planning, and production planning and control. Forecasting indicates the direction for planning raw materials in order to meet the production planning requirements. Inventory control and transportation planning allocate the optimal or economic order quantity and utilize the shortest possible routes to deliver the goods to the customer. Production planning and control utilize the optimal resource mix in order to meet capacity requirement planning. The above operations can be achieved by using appropriate information technology tools and software packages for supply chain management.
Keywords: supply chain management, information technology, business process, extended enterprise
Procedia PDF Downloads 376
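Where the abstract invokes the optimal or economic order quantity, the classical Wilson EOQ formula is the usual quantitative anchor; it is quoted below as standard background, not as a derivation from the paper itself.

```latex
% Economic order quantity: D = annual demand, S = ordering cost per order,
% H = holding cost per unit per year
Q^{*} = \sqrt{\frac{2\,D\,S}{H}}
```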
7562 Unlocking the Puzzle of Borrowing Adult Data for Designing Hybrid Pediatric Clinical Trials
Authors: Rajesh Kumar G
Abstract:
A challenging aspect of any clinical trial is to carefully plan the study design to meet the study objective in an optimal way and to validate the assumptions made during protocol design. When it is a pediatric study, there is the added challenge of stringent guidelines and difficulty in recruiting the necessary subjects. Unlike adult trials, there is not much historical data available for pediatrics, which is required to validate assumptions for planning pediatric trials. Typically, pediatric studies are initiated as soon as approval is obtained for a drug to be marketed for adults, so the pediatric study can be well planned with the historical information from the adult study and with the available pediatric pilot study data or simulated pediatric data. Generalizing a historical adult study to a new pediatric study is a tedious task; however, it is possible by integrating various statistical techniques and utilizing the advantages of a hybrid study design, which helps to achieve the study objective smoothly even in the presence of many constraints. This research paper explains how the hybrid study design can be planned along with an integrated technique, SEV (Simulation, Estimation, Validation), to plan the pediatric study: the planned study data are simulated, the desired estimates are obtained using borrowed adult data and Bayesian methods, and these estimates are used to validate the assumptions. This method of validation can be used to improve the accuracy of data analysis, ensuring that results are as valid and reliable as possible, which allows informed decisions to be made well ahead of study initiation. With professional precision, this technique, based on the collected data, gives insight into best practices when using data from a historical study and simulated data alike.
Keywords: adaptive design, simulation, borrowing data, Bayesian model
Procedia PDF Downloads 76
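A minimal sketch of the SEV loop under a conjugate normal-normal model may make the idea concrete: adult data are discounted into a power prior, pediatric trials are simulated, posteriors are estimated, and the design assumption (here, a target probability of success) is validated. All numerical values, including the discount a0, are invented for illustration and are not the paper's settings.

```python
# Hedged sketch of Simulation-Estimation-Validation with borrowed adult data.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)

# Adult historical data (borrowed): observed treatment effect and its std. error
adult_mean, adult_se = 1.2, 0.15
a0 = 0.3  # power-prior discount: how much adult information to borrow (0..1)
prior_mean, prior_var = adult_mean, adult_se**2 / a0  # variance inflated by 1/a0

true_ped_effect, ped_sd, n_ped = 1.0, 1.5, 60  # assumed pediatric scenario

n_sims, success = 5000, 0
for _ in range(n_sims):
    # Simulation: generate one hypothetical pediatric trial
    y = rng.normal(true_ped_effect, ped_sd, n_ped)
    like_mean, like_var = y.mean(), ped_sd**2 / n_ped
    # Estimation: conjugate normal posterior combining prior and likelihood
    post_var = 1.0 / (1.0 / prior_var + 1.0 / like_var)
    post_mean = post_var * (prior_mean / prior_var + like_mean / like_var)
    # Validation: does the trial conclude a positive effect?
    if norm.sf(0.0, loc=post_mean, scale=np.sqrt(post_var)) > 0.975:
        success += 1

print(f"Estimated probability of study success: {success / n_sims:.2%}")
```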
7561 Preparation of Wireless Networks and Security; Challenges in Efficient Accession of Encrypted Data in Healthcare
Authors: M. Zayoud, S. Oueida, S. Ionescu, P. AbiChar
Abstract:
Background: Wireless sensor networks encompass diversified information technology tools and are widely applied in a range of domains, including military surveillance, weather forecasting, and earthquake forecasting. Strengthened grounds are always being developed for wireless sensor networks, which usually raises security issues during professional application. Thus, essential technological tools need to be assessed for the secure aggregation of data. Moreover, such practices have to be incorporated into healthcare practice so as to serve the mutual interest in the best way. Objective: Aggregation of encrypted data has been assessed through a homomorphic stream cipher to assure its effectiveness along with providing optimal solutions to the field of healthcare. Methods: An experimental design was employed, which utilized a newly developed cipher along with CPU-constrained devices. Modular additions were also employed to evaluate the nature of the aggregated data. The processes of the homomorphic stream cipher were highlighted through different sensors and modular additions. Results: The homomorphic stream cipher was recognized as a simple and secure process, which allowed efficient aggregation of encrypted data. In addition, the application has led the way to improvements in healthcare practice. Statistical values can be easily computed through aggregation on the basis of the selected cipher. Sensed data statistics such as variance, mean, and standard deviation were also computed through the selected tool. Conclusion: It can be concluded that a homomorphic stream cipher can be an ideal tool for appropriate aggregation of data. It shall also provide sound solutions to the healthcare sector.
Keywords: aggregation, cipher, homomorphic stream, encryption
Procedia PDF Downloads 260
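The core mechanism, additively homomorphic encryption via modular addition of a keystream, can be sketched in a few lines; the modulus, keys, and readings below are illustrative assumptions in the spirit of Castelluccia-style aggregation, not the paper's exact cipher.

```python
# Minimal sketch of an additively homomorphic stream cipher for in-network
# aggregation; all parameters are illustrative assumptions.
import secrets

M = 2**32  # modulus; must exceed the largest possible aggregate sum

def encrypt(value: int, key: int) -> int:
    # Ciphertext = (plaintext + keystream) mod M
    return (value + key) % M

# Each sensor encrypts its reading with its own pseudorandom keystream value.
readings = [37, 41, 36, 40]                      # e.g., sensed temperatures
keys = [secrets.randbelow(M) for _ in readings]  # shared with the sink only
ciphertexts = [encrypt(v, k) for v, k in zip(readings, keys)]

# The aggregator sums ciphertexts WITHOUT decrypting individual readings.
aggregate_ct = sum(ciphertexts) % M

# The sink knows all keystream values and recovers the sum of plaintexts,
# from which statistics such as the mean are computed.
aggregate = (aggregate_ct - sum(keys)) % M
assert aggregate == sum(readings)
print("aggregate sum:", aggregate, "mean:", aggregate / len(readings))
```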
7560 Predominance of Teaching Models Used by Math Teachers in Secondary Education
Authors: Verónica Diaz Quezada
Abstract:
This research examines the teaching models used by secondary math teachers when teaching logarithmic, quadratic, and exponential functions. For this, descriptive case studies were carried out on five secondary teachers. These teachers were chosen from three scientific-humanistic and technical schools in Chile. Data were obtained through non-participant class observation and the application of a questionnaire and a rubric to the teachers. According to the results, the didactic model that prevails is one that starts with an interactive strategy, moves to a more content-based structure, and ends with a reinforcement stage. Nonetheless, there is always influence from the teachers, their methods, and the group of students.
Keywords: teaching models, math teachers, functions, secondary education
Procedia PDF Downloads 189
7559 A Super-Efficiency Model for Evaluating Efficiency in the Presence of Time Lag Effect
Authors: Yanshuang Zhang, Byungho Jeong
Abstract:
In many cases, there is a time lag between the consumption of inputs and the production of outputs. This time lag effect should be considered in evaluating the performance of organizations. Recently, a couple of DEA models were developed for considering the time lag effect in the efficiency evaluation of research activities. The Multi-period Input (MpI) and Multi-period Output (MpO) models are integrated models that calculate simple efficiency while considering the time lag effect. However, these models cannot discriminate among efficient DMUs because of the nature of the basic DEA model, in which efficiency scores are capped at 1. That is, efficient DMUs cannot be distinguished because their efficiency scores are the same. Thus, this paper suggests a super-efficiency model for efficiency evaluation under consideration of the time lag effect, based on the MpO model. A case example using a long-term research project is given to compare the suggested model with the MpO model.
Keywords: DEA, super-efficiency, time lag, multi-periods input
Procedia PDF Downloads 473
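A worked formulation clarifies why super-efficiency restores discrimination among efficient DMUs: in the basic input-oriented CCR model the score of the evaluated unit is capped at 1, whereas the Andersen-Petersen super-efficiency variant excludes that unit from its own reference set, so efficient units can score above 1. The program below is the generic super-efficiency form, shown as background; it is not the paper's exact MpO-based model.

```latex
% Andersen-Petersen super-efficiency (input-oriented, CRS); DMU_0 is excluded
% from its own reference set so that \theta^{super}_0 may exceed 1:
\begin{aligned}
\theta^{super}_0 = \min_{\theta,\,\lambda} \;\; & \theta \\
\text{s.t.} \;\; & \sum_{j=1,\, j \neq 0}^{n} \lambda_j x_{ij} \le \theta\, x_{i0}, && i = 1,\dots,m \\
& \sum_{j=1,\, j \neq 0}^{n} \lambda_j y_{rj} \ge y_{r0}, && r = 1,\dots,s \\
& \lambda_j \ge 0, && j \neq 0
\end{aligned}
```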
7558 Hybrid Solutions in Physicochemical Processes for the Removal of Turbidity in Andean Reservoirs
Authors: María Cárdenas Gaudry, Gonzalo Ramces Fano Miranda
Abstract:
Sediment removal is very important in the purification of water, not only for reasons of visual perception but also because of its association with odor and taste problems. The Cuchoquesera reservoir, which is in the Andean region of Ayacucho (Peru) at an altitude of 3,740 meters above sea level, visually presents suspended particles and organic impurities, indicating water of dubious quality that cannot be assumed suitable for direct human consumption. In order to quantify the degree of impurities, water quality monitoring was carried out from February to August 2018, during which four sampling stations were established in the reservoir. The selected measured parameters were electrical conductivity, total dissolved solids, pH, color, turbidity, and sludge volume. The indicators of the studied parameters exceed the permissible limits, except for electrical conductivity (190 μS/cm) and total dissolved solids (255 mg/L). In this investigation, the best combination and the optimal doses of reagents that allow the removal of sediments from the waters of the Cuchoquesera reservoir were determined through the physicochemical process of coagulation-flocculation. In order to improve this process during the rainy season, six combinations of reagents were evaluated, made up of three coagulants (ferric chloride, ferrous sulfate, and aluminum sulfate) and two natural flocculants: prickly pear powder (Opuntia ficus-indica) and tara gum (Caesalpinia spinosa). For each combination of reagents, jar tests were developed following the central composite experimental design (CCED), where the design factors were the doses of coagulant and flocculant and the initial turbidity. The results of the jar tests were fitted to mathematical models, showing that, to treat water from the Cuchoquesera reservoir with a turbidity of 150 NTU and a color of 137 U Pt-Co, 27.9 mg/L of the coagulant aluminum sulfate with 3 mg/L of the natural tara gum flocculant produce purified water with a turbidity of 1.7 NTU and an apparent color of 3.2 U Pt-Co. The estimated cost of the dose of coagulant and flocculant found was 0.22 USD/m³. This is how "grey-green" technologies can be combined in nature-based solutions for water treatment, in this case to achieve potability, making it more sustainable, especially economically, when green technology is available at the application site of the nature-based hybrid solution. This research demonstrates the compatibility of natural coagulants/flocculants with other treatment technologies in an integrated/hybrid treatment process, including the possibility of hybridizing natural coagulants with other types of coagulants.
Keywords: prickly pear powder, tara gum, nature-based solutions, aluminum sulfate, jar test, turbidity, coagulation, flocculation
Procedia PDF Downloads 108
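A sketch may clarify how CCED jar-test results turn into a dose-optimization model: a full quadratic response surface is fitted to residual turbidity as a function of coagulant dose, flocculant dose, and initial turbidity, and then queried at candidate operating points. The data below are made-up placeholders; only the model form follows standard central-composite practice.

```python
# Hedged sketch of a second-order response-surface fit from jar-test data.
import numpy as np

# Columns: coagulant dose (mg/L), flocculant dose (mg/L), initial turbidity (NTU)
X = np.array([
    [20, 2, 100], [30, 2, 100], [20, 4, 100], [30, 4, 100],
    [20, 2, 200], [30, 2, 200], [20, 4, 200], [30, 4, 200],
    [25, 3, 150], [25, 3, 150],  # center points
])
y = np.array([9.5, 4.1, 8.2, 3.6, 14.0, 6.9, 12.1, 5.8, 2.0, 1.8])  # residual NTU

def design_matrix(X):
    c, f, t = X.T
    # Intercept, linear, two-way interaction, and pure quadratic terms
    return np.column_stack([np.ones(len(X)), c, f, t,
                            c*f, c*t, f*t, c**2, f**2, t**2])

beta, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)

# Predict residual turbidity for a candidate operating point
candidate = np.array([[27.9, 3.0, 150.0]])
print("predicted residual turbidity (NTU):", design_matrix(candidate) @ beta)
```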
7557 Exploring Tweet Geolocation: Leveraging Large Language Models for Post-Hoc Explanations
Authors: Sarra Hasni, Sami Faiz
Abstract:
In recent years, location prediction on social networks has gained significant attention, with short and unstructured texts like tweets posing additional challenges. Advanced geolocation models have been proposed, increasing the need to explain their predictions. In this paper, we provide explanations for a geolocation black-box model using LIME and SHAP, two state-of-the-art XAI (eXplainable Artificial Intelligence) methods. We extend our evaluations to Large Language Models (LLMs) as post hoc explainers for tweet geolocation. Our preliminary results show that LLMs outperform LIME and SHAP by generating more accurate explanations. Additionally, we demonstrate that prompts with examples and meta-prompts containing phonetic spelling rules improve the interpretability of these models, even with informal input data. This approach highlights the potential of advanced prompt engineering techniques to enhance the effectiveness of black-box models in geolocation tasks on social networks.
Keywords: large language model, post hoc explainer, prompt engineering, local explanation, tweet geolocation
Procedia PDF Downloads 25
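As a concrete baseline, the LIME side of such an evaluation can be sketched as follows: any classifier exposing class probabilities over location labels can be explained token by token. The pipeline, tweets, and city labels below are toy stand-ins, not the paper's model or data.

```python
# Hedged sketch of a post-hoc LIME explanation for a tweet-geolocation classifier.
from lime.lime_text import LimeTextExplainer
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy training data standing in for a real geotagged-tweet corpus
tweets = ["stuck on the tube again", "beignets at the quarter",
          "mind the gap", "mardi gras parade tonight"]
cities = ["London", "New Orleans", "London", "New Orleans"]

clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
clf.fit(tweets, cities)

explainer = LimeTextExplainer(class_names=["London", "New Orleans"])
exp = explainer.explain_instance("rainy day, taking the tube home",
                                 clf.predict_proba, num_features=3)
print(exp.as_list())  # token -> weight pairs behind the predicted city
```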
7556 Behavior of Composite Reinforced Concrete Circular Columns with Glass Fiber Reinforced Polymer I-Section
Authors: Hiba S. Ahmed, Abbas A. Allawi, Riyadh A. Hindi
Abstract:
Pultruded fiber-reinforced polymer (FRP) materials come in a broad range of shapes, such as bars, I-sections, C-sections, and other structural sections. These FRP materials are starting to compete with steel as structural materials because of their high resistance, low self-weight, and low maintenance costs, especially in corrosive conditions. This study aimed to evaluate the effectiveness of glass fiber reinforced polymer (GFRP) in hybrid columns built by combining GFRP profiles with concrete columns, because of their low cost and high structural efficiency. To achieve the aims of this study, nine circular columns with a diameter of 150 mm and a height of 1000 mm were cast using normal concrete with a compressive strength of 35 MPa. The research involved three types of reinforcement: hybrid circular columns (IG) with a GFRP I-section and a 1% steel bar reinforcement ratio; hybrid circular columns (IS) with a steel I-section and a 1% steel bar reinforcement ratio (where the cross-sectional areas of the GFRP and steel I-sections were the same); and a reference column (R) without an I-section. The ultimate capacity, axial and lateral deformation, strain in the longitudinal and transverse reinforcement, and failure mode of the circular columns were investigated under different loading conditions (concentric and eccentric, with eccentricities of 25 mm and 50 mm, respectively). In the second part, an analytical finite element model will be developed in ABAQUS to validate the experimental results.
Keywords: composite, columns, reinforced concrete, GFRP, axial load
Procedia PDF Downloads 55
7555 Classification of Business Models of Italian Bancassurance by Balance Sheet Indicators
Authors: Andrea Bellucci, Martina Tofi
Abstract:
The aim of this paper is to analyze business models of bancassurance in Italy for the life business. The life insurance business is very developed in the Italian market, and bank branches hold 80% of the market share. Given its maturity, the life insurance market needs to consolidate its organizational form to allow for the development of the non-life business, which nowadays collects few premiums but represents a great opportunity to enlarge the market share of bancassurance, using its strength in the distribution channel while the market share of independent agents is decreasing. Starting from the main business model of bancassurance for the life business, this paper analyzes the performance of life companies in the Italian market by means of balance sheet indicators and the main discriminant variables of business models. The study observes trends from 2013 to 2015 for the Italian market by exploiting a database managed by Associazione Nazionale delle Imprese di Assicurazione (ANIA). The applied approach is based on a bottom-up analysis, starting from variables and indicators to define a classification of business models. The hierarchical classification algorithm proposed by Ward is employed to design business model profiles. The results of the analysis are a representation of the main business models, each profiled by its indicators. In this way, an unsupervised analysis is developed; it has the limitation of a judgmental dimension based on the researchers' opinion, but it makes it possible to obtain a design of effective business models.
Keywords: bancassurance, business model, non life bancassurance, insurance business value drivers
Procedia PDF Downloads 298
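The bottom-up classification step can be sketched directly: Ward's hierarchical clustering on standardized balance-sheet indicators, with the dendrogram cut into a small number of business-model profiles. Indicator names and values below are invented placeholders for illustration only.

```python
# Hedged sketch of Ward clustering on balance-sheet indicators.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.stats import zscore

# Rows = life insurance companies; columns = balance-sheet indicators,
# e.g., premium growth, expense ratio, share of unit-linked reserves
indicators = np.array([
    [0.12, 0.045, 0.70],
    [0.10, 0.050, 0.65],
    [0.03, 0.030, 0.20],
    [0.02, 0.028, 0.25],
    [0.07, 0.060, 0.45],
])

Z = linkage(zscore(indicators, axis=0), method="ward")  # Ward's minimum-variance criterion
profiles = fcluster(Z, t=2, criterion="maxclust")       # cut the dendrogram into 2 profiles
print("business-model profile per company:", profiles)
```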
7554 Hybrid Quasi-Steady Thermal Lattice Boltzmann Model for Studying the Behavior of Oil in Water Emulsions Used in Machining Tool Cooling and Lubrication
Authors: W. Hasan, H. Farhat, A. Alhilo, L. Tamimi
Abstract:
Oil in water (O/W) emulsions are utilized extensively for cooling and lubricating cutting tools during parts machining. A robust lattice Boltzmann (LBM) thermal-surfactant model, which provides a useful platform for exploring the characteristics of complex emulsions under a variety of flow conditions, is used here for the study of fluid behavior during conventional tool cooling. The transient thermal capabilities of the model are employed to simulate the effects of the flow conditions of O/W emulsions on the cooling of cutting tools. The model results show that the temperature outcome is slightly affected by reversing the direction of the upper plate (workpiece). On the other hand, an important increase in effective viscosity is seen, which supports better lubrication during the work.
Keywords: hybrid lattice Boltzmann method, Gunstensen model, thermal, surfactant-covered droplet, Marangoni stress
Procedia PDF Downloads 303
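For readers unfamiliar with the method, the single-relaxation-time (BGK) evolution equation that lattice Boltzmann schemes build on is quoted below as generic background; the paper's thermal-surfactant extension adds further distribution functions and forcing terms that are not reproduced here.

```latex
% Generic lattice Boltzmann BGK update: streaming plus relaxation toward a
% local equilibrium; the relaxation time \tau sets the effective viscosity.
f_i(\mathbf{x} + \mathbf{e}_i \Delta t,\, t + \Delta t) - f_i(\mathbf{x}, t)
  = -\frac{1}{\tau}\left[ f_i(\mathbf{x}, t) - f_i^{\mathrm{eq}}(\mathbf{x}, t) \right],
\qquad \nu = c_s^2 \left( \tau - \tfrac{1}{2} \right) \Delta t
```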
7553 A Nonlinear Dynamical System with Application
Authors: Abdullah Eqal Al Mazrooei
Abstract:
In this paper, a nonlinear dynamical system is presented. This system belongs to the bilinear class. Bilinear systems are a very important kind of nonlinear system because they have many applications in real life. They are used in biology, chemistry, manufacturing, engineering, and economics, where linear models are ineffective or inadequate. They have also recently been used to analyze and forecast weather conditions. Bilinear systems have three advantages: first, they define many problems of great applied importance; second, they give us approximations to nonlinear systems; third, they have rich geometric and algebraic structures, which promise to be a fruitful field of research for scientists and applications. The type of nonlinearity that is treated and analyzed consists of a bilinear interaction between the state vector and the system input. By using some properties of the tensor product, these systems can be transformed into linear systems. Here, however, we discuss the nonlinearity that arises when the state vector is multiplied by itself. This model is therefore able to handle evolutions according to the Lotka-Volterra models or the Lorenz weather models, enabling a wider and more flexible application of such models. We apply the model by using an estimator to estimate temperatures. The results prove the efficiency of the proposed system.
Keywords: Lorenz models, nonlinear systems, nonlinear estimator, state-space model
Procedia PDF Downloads 254
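A compact statement of the distinction the abstract draws: the classical bilinear form couples state and input, while the variant considered here also carries a state-times-state term, which is what Lotka-Volterra and Lorenz dynamics require. The notation is a generic sketch, not the paper's exact model.

```latex
% Classical bilinear system: state-input coupling only
\dot{x} = A x + \sum_{i=1}^{m} u_i N_i x + B u
% Quadratic-in-state variant (state multiplied by itself), written with the
% Kronecker product; Lotka-Volterra, \dot{x}_k = x_k \big( r_k + \sum_j a_{kj} x_j \big),
% and the Lorenz equations both fit this template:
\dot{x} = A x + G\,(x \otimes x) + B u
```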
7552 Automatic Classification of Lung Diseases from CT Images
Authors: Abobaker Mohammed Qasem Farhan, Shangming Yang, Mohammed Al-Nehari
Abstract:
Pneumonia is a kind of lung disease that creates congestion in the chest. Such pneumonic conditions can lead to loss of life when the congestion is severe. Pneumonic lung disease is caused by viral pneumonia, bacterial pneumonia, or COVID-19-induced pneumonia. Early prediction and classification of such lung diseases help to reduce the mortality rate. In this paper, we propose an automatic computer-aided diagnosis (CAD) system using a deep learning approach. The proposed CAD system takes raw computed tomography (CT) scans of the patient's chest as input and automatically predicts the disease class. We designed the Hybrid Deep Learning Algorithm (HDLA) to improve accuracy and reduce processing requirements. The raw CT scans are pre-processed first to enhance their quality for further analysis. We then apply a hybrid model that consists of automatic feature extraction and classification. We propose a robust 2D convolutional neural network (CNN) model to extract automatic features from the pre-processed CT images. This CNN model ensures feature learning with extremely effective 1D feature extraction for each input CT image. The outcome of the 2D CNN model is then normalized using the min-max technique. The second step of the proposed hybrid model concerns training and classification using different classifiers. Simulation outcomes on a publicly available dataset prove the robustness and efficiency of the proposed model compared to state-of-the-art algorithms.
Keywords: CT scan, Covid-19, deep learning, image processing, lung disease classification
Procedia PDF Downloads 154
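The two-stage hybrid pattern described above can be sketched as follows: a 2D CNN used purely as a feature extractor, min-max normalization of the resulting vectors, then a separate classical classifier. The architecture sizes, the random stand-in data, and the SVM choice are illustrative assumptions, not the exact HDLA configuration.

```python
# Hedged sketch: CNN feature extraction -> min-max scaling -> classical classifier.
import numpy as np
import tensorflow as tf
from sklearn.preprocessing import MinMaxScaler
from sklearn.svm import SVC

# Small CNN ending in a flat (1D) feature vector per CT slice
feature_extractor = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(128, 128, 1)),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),  # -> 1D feature vector
])

# Stand-in data: 40 preprocessed CT slices, 3 classes (viral/bacterial/COVID-19)
X = np.random.rand(40, 128, 128, 1).astype("float32")
y = np.random.randint(0, 3, size=40)

features = feature_extractor.predict(X, verbose=0)   # step 1: CNN features
features = MinMaxScaler().fit_transform(features)    # step 2: min-max scaling
clf = SVC(kernel="rbf").fit(features, y)             # step 3: classical classifier
print("training accuracy:", clf.score(features, y))
```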
7551 Models of State Organization and Influence over Collective Identity and Nationalism in Spain
Authors: Victor Manuel Muñoz-Sanchez, Antonio Manuel Perez-Flores
Abstract:
The main objective of this paper is to establish the relationship between models of state organization and the various types of collective identity expressed by Spaniards. The question of nationalism and identity ascription in Spain has always been a topic of special importance due to the presence in that country of territories where the population expresses very different opinions on nationalist sentiment than the rest of Spain. The current sovereignty challenge of Catalonia to the central government exemplifies the importance of the subject matter. In order to analyze this process of interrelation, we use secondary data mining, applying the multiple correspondence analysis (MCA) technique. As the main result, a typology of four types of expression of collective identity based on models of state organization is shown, which are connected with party positions on this issue.
Keywords: models of organization of the state, nationalism, collective identity, Spain, political parties
Procedia PDF Downloads 443
7550 Mosaic Augmentation: Insights and Limitations
Authors: Olivia A. Kjorlien, Maryam Asghari, Farshid Alizadeh-Shabdiz
Abstract:
The goal of this paper is to investigate the impact of mosaic augmentation on the performance of object detection solutions. To carry out the study, the YOLOv4 and YOLOv4-Tiny models were selected; these are popular, advanced object detection models and representatives of two classes of model, complex and simple. The study was also carried out on two categories of objects, simple and complex. YOLOv4 and YOLOv4-Tiny were trained with and without mosaic augmentation for the two sets of objects. While mosaic augmentation improves the performance of simple object detection, it deteriorates the performance of complex object detection, with the largest negative impact on the false positive rate in the complex object detection case.
Keywords: accuracy, false positives, mosaic augmentation, object detection, YOLOV4, YOLOV4-Tiny
Procedia PDF Downloads 127
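For readers unfamiliar with the technique, mosaic augmentation tiles four training images into one canvas around a random split point and shifts their bounding boxes accordingly. The sketch below is a generic, dependency-light illustration of that idea, not the exact YOLOv4 implementation.

```python
# Hedged sketch of mosaic augmentation for object detection.
import numpy as np

def mosaic(images, boxes_list, out_size=416, rng=np.random.default_rng()):
    """images: 4 HxWx3 uint8 arrays; boxes_list: 4 arrays of [x1,y1,x2,y2]."""
    canvas = np.zeros((out_size, out_size, 3), dtype=np.uint8)
    cx = int(rng.uniform(0.3, 0.7) * out_size)   # random split point
    cy = int(rng.uniform(0.3, 0.7) * out_size)
    regions = [(0, 0, cx, cy), (cx, 0, out_size, cy),
               (0, cy, cx, out_size), (cx, cy, out_size, out_size)]
    all_boxes = []
    for img, boxes, (x1, y1, x2, y2) in zip(images, boxes_list, regions):
        h, w = y2 - y1, x2 - x1
        # naive nearest-neighbour resize by index sampling
        yi = np.arange(h) * img.shape[0] // h
        xi = np.arange(w) * img.shape[1] // w
        canvas[y1:y2, x1:x2] = img[yi][:, xi]
        # rescale and shift the boxes into the target quadrant
        sx, sy = w / img.shape[1], h / img.shape[0]
        b = boxes * np.array([sx, sy, sx, sy]) + np.array([x1, y1, x1, y1])
        all_boxes.append(b)
    return canvas, np.concatenate(all_boxes)

imgs = [np.full((300, 300, 3), c, np.uint8) for c in (50, 100, 150, 200)]
boxes = [np.array([[10.0, 10.0, 120.0, 120.0]]) for _ in imgs]
tile, tile_boxes = mosaic(imgs, boxes)
print(tile.shape, tile_boxes.round(1))
```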
7549 Investigation of Different Control Strategies for UPFC Decoupled Model and the Impact of Location on Control Parameters
Authors: S. A. Al-Qallaf, S. A. Al-Mawsawi, A. Haider
Abstract:
In order to evaluate the performance of a unified power flow controller (UPFC), mathematical models for steady-state and dynamic analysis need to be developed. The steady-state model is mainly concerned with the incorporation of the UPFC in load flow studies. Several load flow models for the UPFC have been introduced in the literature, and one of the most reliable is the decoupled UPFC model. In spite of its simplicity, the decoupled UPFC load flow model is more robust than other UPFC load flow models and has unique capabilities. It has some shortcomings, such as an additional set of nonlinear equations that must be solved separately after the load flow solution is obtained. The aim of this study is to investigate the different control strategies that can be realized in the decoupled load flow model (individual control and combined control), and the impact of the location of the UPFC in the network on its control parameters.
Keywords: UPFC, decoupled model, load flow, control parameters
Procedia PDF Downloads 554
7548 A Study on Characteristics of Hedonic Price Models in Korea Based on Meta-Regression Analysis
Authors: Minseo Jo
Abstract:
The purpose of this paper is to examine the factors in hedonic price models that have a significant impact in determining the price of apartments. Many variables are employed in hedonic price models, and their effectiveness varies according to the researchers and the regions being analyzed. In order to consider these various conditions, meta-regression analysis was selected for the study. In this paper, four meta-independent variables are drawn from 65 hedonic price models for analysis: the factors that influence apartment prices, the regions studied (divided into two groups), the years in which the research was performed, and the coefficients of the functions employed. The covariance between the four meta-variables and the p-values of the coefficients, and between the four meta-variables and the number of data points used in the 65 hedonic price models, were analyzed in this study. The six factors that are most important in deciding apartment prices are the positioning of the apartments, noise, orientation and views from the apartments, proximity to public transportation, the companies that constructed the apartments, and the social environment (such as schools, etc.).
Keywords: hedonic price model, housing price, meta-regression analysis, characteristics
Procedia PDF Downloads 402
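A compact statement of the two model layers involved may help: a typical hedonic specification regresses (log) price on attributes, and the meta-regression then models the estimates reported across studies. Both equations are generic textbook forms, assumed here rather than taken from the paper.

```latex
% Hedonic price model within one study (semi-log form is common):
\ln P_i = \beta_0 + \sum_{k=1}^{K} \beta_k x_{ik} + \varepsilon_i
% Meta-regression across the 65 studies, where \hat{\theta}_s is an estimate
% reported by study s (e.g., a coefficient or its p-value):
\hat{\theta}_s = \gamma_0 + \gamma_1\,\text{region}_s + \gamma_2\,\text{year}_s
               + \gamma_3\,\text{func}_s + \gamma_4\, n_s + u_s
```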
7547 Filtering Momentum Life Cycles, Price Acceleration Signals and Trend Reversals for Stocks, Credit Derivatives and Bonds
Authors: Periklis Brakatsoulas
Abstract:
Recent empirical research shows a growing interest in investment decision-making under market anomalies that contradict the rational paradigm. Momentum is undoubtedly one of the most robust anomalies in empirical asset pricing research and has remained surprisingly lucrative ever since it was first documented. Although momentum premia were predominantly identified across equities, they are now evident across various asset classes. Yet few attempts have been made so far to provide traders with a diversified portfolio of strategies across different assets and markets. Moreover, the literature focuses on patterns from past returns rather than on mechanisms that signal future price directions prior to momentum runs. The aim of this paper is to develop a diversified portfolio approach to price distortion signals using daily position data on stocks, credit derivatives, and bonds. An algorithm allocates assets periodically, and new investment tactics take over upon price momentum signals and across different ranking groups. We focus on momentum life cycles, trend reversals, and price acceleration signals. The main effort here concentrates on the density, time span, and maturity of momentum phenomena in order to identify consistent patterns over time and measure the predictive power of the buy-sell signals generated by these anomalies. To tackle this, we propose a two-stage modelling process. First, we generate forecasts of core macroeconomic drivers. Second, satellite models generate market risk forecasts using the core driver projections from the first stage as input. Moreover, using a combination of ARFIMA and FIGARCH models, we examine the dependence of consecutive observations across time and portfolio assets, since long-memory behavior in the volatilities of one market appears to trigger persistent volatility patterns across other markets. We believe that this is the first work that employs evidence of volatility transmission among derivatives, equities, and bonds to identify momentum life cycle patterns.
Keywords: forecasting, long memory, momentum, returns
Procedia PDF Downloads 102
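For reference, the two long-memory building blocks named above take the following standard forms; these are the textbook ARFIMA and FIGARCH(1,d,1) equations, quoted as background rather than the paper's estimated specification.

```latex
% ARFIMA(p,d,q): the fractional differencing operator (1-L)^d with 0 < d < 0.5
% gives the mean equation long memory:
\Phi(L)\,(1 - L)^{d}\,(y_t - \mu) = \Theta(L)\,\varepsilon_t
% FIGARCH(1,d,1): the same fractional filter applied to squared shocks makes
% volatility shocks decay hyperbolically rather than exponentially:
\sigma_t^2 = \omega + \beta\,\sigma_{t-1}^2
           + \left[\, 1 - \beta L - (1 - \phi L)(1 - L)^{d} \,\right]\varepsilon_t^2
```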
7546 An Integrated Approach to the Carbonate Reservoir Modeling: Case Study of the Eastern Siberia Field
Authors: Yana Snegireva
Abstract:
Carbonate reservoirs are known for their heterogeneity, resulting from various geological processes such as diagenesis and fracturing. These complexities may pose great challenges to understanding fluid flow behavior and predicting the production performance of naturally fractured reservoirs. The investigation of carbonate reservoirs is crucial, as many petroleum reservoirs are naturally fractured, and their study can be difficult due to the complexity of their fracture networks. This can lead to geological uncertainties, which are important for global petroleum reserves. The key challenges in carbonate reservoir modeling include the accurate representation of fractures and their connectivity, as well as capturing the impact of fractures on fluid flow and production. Traditional reservoir modeling techniques often oversimplify fracture networks, leading to inaccurate predictions. Therefore, there is a need for a modern approach that can capture the complexities of carbonate reservoirs and provide reliable predictions for effective reservoir management and production optimization. The modern approach to carbonate reservoir modeling involves the hybrid fracture modeling approach, combining the discrete fracture network (DFN) method and an implicit fracture network, which offers enhanced accuracy and reliability in characterizing complex fracture systems within these reservoirs. This study focuses on the application of the hybrid method in the Nepsko-Botuobinskaya anticline of the Eastern Siberia field, aiming to prove the appropriateness of this method under these geological conditions. The DFN method is adopted to model the fracture network within the carbonate reservoir. This method considers fractures as discrete entities, capturing their geometry, orientation, and connectivity. However, the method has significant disadvantages, since the number of fractures in the field can be very high: due to limitations in the amount of main memory, it is very difficult to represent all these fractures explicitly. By integrating data from image logs (formation micro imager), core data, and fracture density logs, a discrete fracture network (DFN) model can be constructed to represent the fracture characteristics of the hydraulically relevant fractures. The results obtained from the DFN modeling approach provide valuable insights into the behavior of the East Siberia field's carbonate reservoir. The DFN model accurately captures the fracture system, allowing for a better understanding of fluid flow pathways, connectivity, and potential production zones. The analysis of simulation results enables the identification of zones of increased fracturing and optimization opportunities for reservoir development, with the potential application of enhanced oil recovery techniques, which were considered in further simulations on dual porosity and dual permeability models. This approach considers fractures as separate, interconnected flow paths within the reservoir matrix, allowing for the characterization of dual-porosity media. The case study of the East Siberia field demonstrates the effectiveness of the hybrid model method in accurately representing fracture systems and predicting reservoir behavior. The findings from this study contribute to improved reservoir management and production optimization in carbonate reservoirs with the use of enhanced and improved oil recovery methods.
Keywords: carbonate reservoir, discrete fracture network, fracture modeling, dual porosity, enhanced oil recovery, implicit fracture model, hybrid fracture model
Procedia PDF Downloads 75
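As background for the dual-porosity/dual-permeability runs mentioned above, the classical Warren-Root idealization couples matrix and fracture pressures through a shape-factor transfer term; the form below is the standard textbook expression, not a field-specific result from this study.

```latex
% Warren-Root matrix-fracture transfer rate (per unit bulk volume), with shape
% factor \sigma, matrix permeability k_m, fluid viscosity \mu, and matrix and
% fracture pressures p_m, p_f:
q_{mf} = \sigma \, \frac{k_m}{\mu} \left( p_m - p_f \right)
```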
7545 Convolutional Neural Networks Architecture Analysis for Image Captioning
Authors: Jun Seung Woo, Shin Dong Ho
Abstract:
Image captioning models with attention technology have developed significantly compared to previous models, but they are still unsatisfactory in recognizing images. We perform an extensive search over seven interesting convolutional neural network (CNN) architectures to analyze the behavior of different models for image captioning. We compared seven different CNN architectures, according to batch size, on a public benchmark: the MS-COCO dataset. In our experimental results, DenseNet and InceptionV3 achieved about 14% loss and about 160 s of training time per epoch. These were the most satisfactory results among the seven CNN architectures after training for 50 epochs on a GPU.
Keywords: deep learning, image captioning, CNN architectures, DenseNet, InceptionV3
Procedia PDF Downloads 131
7544 A Hybrid Distributed Algorithm for Solving Job Shop Scheduling Problem
Authors: Aydin Teymourifar, Gurkan Ozturk
Abstract:
In this paper, a distributed hybrid algorithm is proposed for solving the job shop scheduling problem. The suggested method executes different artificial neural networks, heuristics, and meta-heuristics simultaneously on more than one machine. The neural networks are used to handle the constraints of the problem, the meta-heuristics search the global space, and the heuristics are used to prevent premature convergence. To attain an efficient distributed intelligent method for solving big and distributed job shop scheduling problems, the Apache Spark and Hadoop frameworks are used. New approaches are applied in the algorithm implementation and design steps. Comparison between the proposed algorithm and other efficient algorithms from the literature shows its efficiency; it is able to solve large-size problems in a short time.
Keywords: distributed algorithms, Apache Spark, Hadoop, job shop scheduling, neural network
Procedia PDF Downloads 387
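The distributed pattern can be sketched with Spark directly: several independent search workers run in parallel as Spark tasks, and the best schedule wins. The simple random-restart local search below stands in for the paper's neural-network/heuristic/meta-heuristic mix, and the toy single-machine cost function is an assumption for brevity, not a full JSSP evaluation.

```python
# Hedged sketch of parallel search over Spark; toy-sized problem and cost.
import random
from pyspark.sql import SparkSession

def cost(order, jobs):
    # naive proxy objective: total completion time of a job order
    t, total = 0, 0
    for j in order:
        t += jobs[j]
        total += t
    return total

def local_search(seed, jobs, iters=2000):
    rng = random.Random(seed)
    order = list(range(len(jobs)))
    rng.shuffle(order)
    best = cost(order, jobs)
    for _ in range(iters):
        i, k = rng.sample(range(len(jobs)), 2)       # swap-neighborhood move
        order[i], order[k] = order[k], order[i]
        c = cost(order, jobs)
        if c <= best:
            best = c
        else:
            order[i], order[k] = order[k], order[i]  # undo worsening move
    return best, order

spark = SparkSession.builder.appName("distributed-jssp-sketch").getOrCreate()
jobs = [random.randint(1, 20) for _ in range(12)]   # toy processing times
seeds = spark.sparkContext.parallelize(range(32))   # 32 parallel searchers
best_cost, best_order = seeds.map(lambda s: local_search(s, jobs)).min()
print("best cost:", best_cost, "order:", best_order)
spark.stop()
```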
7543 Fire Resistance of High Alumina Cement and Slag Based Ultra High Performance Fibre-Reinforced Cementitious Composites
Authors: A. Q. Sobia, M. S. Hamidah, I. Azmi, S. F. A. Rafeeqi
Abstract:
Fibre-reinforced polymer (FRP) strengthened reinforced concrete (RC) structures are susceptible to intense deterioration when exposed to elevated temperatures, particularly in the event of fire. FRP tends to lose bond with the substrate due to the low glass transition temperature of epoxy, the key component of the FRP matrix. In the past few decades, various types of high performance cementitious composites (HPCC) were explored for the protection of RC structural members against elevated temperature. However, there is inadequate information on the influence of elevated temperature on ultra high performance fibre-reinforced cementitious composites (UHPFRCC) containing ground granulated blast furnace slag (GGBS) as a replacement for high alumina cement (HAC) in conjunction with hybrid fibres (basalt and polypropylene), which could be a prospective fire-resisting material for structural components. The influence of elevated temperatures on the compressive as well as flexural strength of UHPFRCC, made of HAC-GGBS and hybrid fibres, was examined in this study. Besides the control sample (without fibres), three other samples, containing 0.5%, 1%, and 1.5% basalt fibres by total weight of mix together with 1 kg/m³ of polypropylene fibres, were prepared and tested. Another mix was also prepared with only 1 kg/m³ of polypropylene fibres. Each of the samples was kept at ambient temperature as well as exposed to 400, 700, and 1000 °C, followed by testing after 28 and 56 days of conventional curing. Investigation of the results disclosed that the use of hybrid fibres significantly helped to improve the ambient-temperature compressive and flexural strength of UHPFRCC, which were found to be 80 and 14.3 MPa, respectively. However, the optimum residual compressive strength was marked by UHPFRCC-CP (with polypropylene fibres only), equally after both curing periods (28 and 56 days), i.e., 41%. In addition, the highest residual flexural strength after 28 and 56 days of curing was marked by UHPFRCC-CP and UHPFRCC-CB2 (1 kg/m³ of PP fibres + 1% of basalt fibres), i.e., 39% and 48.5%, respectively.
Keywords: fibre reinforced polymer materials (FRP), ground granulated blast furnace slag (GGBS), high-alumina cement, hybrid fibres
Procedia PDF Downloads 287
7542 Models and Metamodels for Computer-Assisted Natural Language Grammar Learning
Authors: Evgeny Pyshkin, Maxim Mozgovoy, Vladislav Volkov
Abstract:
The paper follows a discourse on computer-assisted language learning. We examine problems of foreign language teaching and learning and introduce a metamodel that can be used to define learning models of language grammar structures in order to support teacher/student interaction. Special attention is paid to the concept of a virtual language lab. Our approach to language education encourages learners to experiment with a language and to learn by discovering patterns of grammatically correct structures created and managed by a language expert.
Keywords: computer-assisted instruction, language learning, natural language grammar models, HCI
Procedia PDF Downloads 519
7541 Automatic Calibration of Agent-Based Models Using Deep Neural Networks
Authors: Sima Najafzadehkhoei, George Vega Yon
Abstract:
This paper presents an approach for calibrating agent-based models (ABMs) efficiently, utilizing convolutional neural networks (CNNs) and long short-term memory (LSTM) networks. These machine learning techniques are applied to Susceptible-Infected-Recovered (SIR) models, which are a core framework in the study of epidemiology. Our method recovers parameter values from observed trajectory curves, enhancing the accuracy of predictions compared to traditional calibration techniques. Through the use of simulated data, we train the models to predict epidemiological parameters more accurately. Two primary approaches were explored: one where the numbers of susceptible, infected, and recovered individuals are fully known, and another using only the number of infected individuals. Our method shows promise for application in other ABMs where calibration is computationally intensive and expensive.
Keywords: ABM, calibration, CNN, LSTM, epidemiology
Procedia PDF Downloads 24
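The calibration idea can be sketched end to end: simulate many SIR trajectories with known (beta, gamma), then train an LSTM to map an infected-count curve back to those parameters, matching the "infected only" variant described above. Sizes, hyperparameters, and the deterministic SIR stand-in are illustrative assumptions.

```python
# Hedged sketch: LSTM regression from simulated SIR curves to (beta, gamma).
import numpy as np
import tensorflow as tf

def sir_infected_curve(beta, gamma, n=1000, i0=10, steps=60):
    s, i, curve = n - i0, i0, []
    for _ in range(steps):
        new_inf = beta * s * i / n
        new_rec = gamma * i
        s, i = s - new_inf, i + new_inf - new_rec
        curve.append(i)
    return np.array(curve)

rng = np.random.default_rng(0)
params = np.column_stack([rng.uniform(0.2, 0.6, 2000),    # beta
                          rng.uniform(0.05, 0.2, 2000)])  # gamma
X_raw = np.stack([sir_infected_curve(b, g) for b, g in params])
scale = X_raw.max()
X = X_raw[..., None] / scale  # (samples, timesteps, 1), scaled

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(60, 1)),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(2),  # outputs: (beta, gamma)
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, params, epochs=5, batch_size=64, verbose=0)

# Calibrate against an "observed" epidemic curve
obs = sir_infected_curve(0.4, 0.1)[None, :, None] / scale
print("estimated (beta, gamma):", model.predict(obs, verbose=0)[0])
```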
7540 Surface Modified Core–Shell Type Lipid–Polymer Hybrid Nanoparticles of Trans-Resveratrol, an Anticancer Agent, for Long Circulation and Improved Efficacy against MCF-7 Cells
Authors: M. R. Vijayakumar, K. Priyanka, Ramoji Kosuru, Lakshmi, Sanjay Singh
Abstract:
Trans-resveratrol (RES) is a non-flavonoid polyphenolic compound with proven therapeutic and preventive effects against various types of cancer. However, the practical application of RES in cancer treatment is limited because of its high dose requirement (up to 7.5 g/day in humans), low biological half-life, rapid metabolism, and fast elimination in mammals. PEGylated core-shell type lipid-polymer hybrid nanoparticles are novel drug delivery systems for long circulation and improved anticancer effect of their therapeutic payloads. Therefore, the main objective of this study is to extend the biological half-life (long circulation) and improve the therapeutic efficacy of RES through core-shell type nanoparticles. D-α-tocopheryl polyethylene glycol 1000 succinate (vitamin E TPGS), a novel surfactant, was applied for the preparation of the PEGylated lipid-polymer hybrid nanoparticles. The prepared nanoparticles were evaluated by various state-of-the-art techniques: dynamic light scattering (DLS) for particle size and zeta potential, TEM for shape, differential scanning calorimetry (DSC) for interaction analysis, and XRD for crystalline changes of the drug. Entrapment efficiency and in vitro drug release were determined by the ultracentrifugation and dialysis bag methods, respectively. Cancer cell viability studies were performed by MTT assay. Pharmacokinetic studies after i.v. administration were performed in Sprague-Dawley rats. The prepared NPs were found to be spherical in shape with smooth surfaces. Particle size and zeta potential of the prepared NPs were found to be in the range of 179.2±7.45 to 266.8±9.61 nm and -0.63 to -48.35 mV, respectively. DSC revealed the absence of potential interactions. XRD revealed the presence of the amorphous form in the nanoparticles. Entrapment efficiency was found to be 83.7%, and drug release occurred in a controlled manner. The MTT assay showed a lower MEC, and pharmacokinetic studies showed a higher AUC for the nanoformulation than for the pristine drug. All these studies revealed that RES-loaded PEG-modified core-shell type lipid-polymer hybrid nanoparticles can be an alternative tool for chemopreventive and therapeutic application of RES in cancer.
Keywords: trans resveratrol, cancer nanotechnology, long circulating nanoparticles, bioavailability enhancement, core shell nanoparticles, lipid polymer hybrid nanoparticles
Procedia PDF Downloads 472
7539 Brain Tumor Detection and Classification Using Pre-Trained Deep Learning Models
Authors: Aditya Karade, Sharada Falane, Dhananjay Deshmukh, Vijaykumar Mantri
Abstract:
Brain tumours pose a significant challenge in healthcare due to their complex nature and impact on patient outcomes. The application of deep learning (DL) algorithms in medical imaging has shown promise for accurate and efficient brain tumour detection. This paper explores the performance of various pre-trained DL models (ResNet50, Xception, InceptionV3, EfficientNetB0, DenseNet121, NASNetMobile, VGG19, VGG16, and MobileNet) on a brain tumour dataset sourced from Figshare. The dataset consists of MRI scans covering different types of brain tumours, including meningioma, pituitary, glioma, and no tumour. The study involves a comprehensive evaluation of these models' accuracy and effectiveness in classifying brain tumour images. Data preprocessing, augmentation, and fine-tuning techniques are employed to optimize model performance. Among the evaluated deep learning models for brain tumour detection, ResNet50 emerges as the top performer with an accuracy of 98.86%. Following closely is Xception, exhibiting a strong accuracy of 97.33%. These models showcase robust capabilities in accurately classifying brain tumour images. On the other end of the spectrum, VGG16 trails with the lowest accuracy at 89.02%.
Keywords: brain tumour, MRI image, detecting and classifying tumour, pre-trained models, transfer learning, image segmentation, data augmentation
Procedia PDF Downloads 74
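The transfer-learning recipe such comparisons rely on can be sketched as follows: a pre-trained ImageNet backbone (ResNet50 here, the best performer reported) with a fresh classification head for the four MRI classes. The input size, optimizer settings, and the freeze-then-fine-tune split are illustrative assumptions, not the paper's exact training protocol.

```python
# Hedged sketch of two-stage transfer learning with a pre-trained backbone.
import tensorflow as tf

NUM_CLASSES = 4  # meningioma, pituitary, glioma, no tumour

base = tf.keras.applications.ResNet50(
    weights="imagenet", include_top=False, input_shape=(224, 224, 3))
base.trainable = False  # stage 1: train only the new head

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.3),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),
              loss="sparse_categorical_crossentropy", metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=10)

# Stage 2 (fine-tuning): unfreeze the backbone at a much lower learning rate
base.trainable = True
model.compile(optimizer=tf.keras.optimizers.Adam(1e-5),
              loss="sparse_categorical_crossentropy", metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=5)
```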
7538 Seismic Analysis of Vertical Expansion Hybrid Structure by Response Spectrum Method Concern with Disaster Management and Solving the Problems of Urbanization
Authors: Gautam, Gurcharan Singh, Mandeep Kaur, Yogesh Aggarwal, Sanjeev Naval
Abstract:
The present ground reality of human suffering evidences the failure of wrong decisions taken to shape civilization, and of irresponsibility throughout history. A strong, positive will and the right responsibilities make the right civilizational structure, which affects itself and the whole world. The present suffering of humanity reflects the failure of past decisions to shape a sound social structure, due to the unplanned development of Indian cities and their rapid population growth, which leaves society unable to face all kinds of problems. India still suffers from disasters such as earthquakes, floods, droughts, and tsunamis, and has faced countless disaster deaths from the beginning of humanity to the present time. The focus of this research paper is a disaster-resistant structure offering a solution for densely populated urban areas through a high vertical-expansion hybrid structure. Reinforced concrete hybrid structures were analyzed for different seismic zones; these concrete frames were analyzed using the response spectrum method to calculate and compare seismic displacements and drifts. Seismic analysis by this method is generally based on the dynamic analysis of the building. The analysis results show that the reinforced concrete building in seismic zone V has the maximum peak storey shear, base shear, drift, and node displacement compared to the analytical results for the same building in seismic zones III and IV. These results indicate that structural drawings must be followed strictly at the construction site to build such a hybrid structure. The case study deals with a 10-storey vertical-expansion hybrid frame structure with a total height of 30 m, analyzed at seismic zones III, IV, and V, with 0.45 x 0.36 m columns and 0.6 x 0.36 m beams; to make the structure more stable, bracing techniques such as mega bracing and V-shaped bracing are applied. If this kind of structural drawing had been followed by builders and contractors, lives could have been saved during the earthquake disaster at Bhuj (Gujarat State, India) on 26 January 2001, which resulted in more than 19,000 deaths. This kind of disaster-resistant structure is capable of solving the problems of densely populated city areas through the utilization of area in a vertical-expansion hybrid structure. We request the Government of India to make and implement new plans to save lives from future disasters, instead of pursuing unnecessary development plans such as bullet trains.
Keywords: history, irresponsibility, unplanned social structure, humanity, hybrid structure, response spectrum analysis, drift, node displacement
Procedia PDF Downloads 211
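Since the study compares Indian seismic zones III to V, the zone dependence enters through the design horizontal seismic coefficient of IS 1893; the standard expression is sketched below as general background, not as the paper's computed values.

```latex
% IS 1893 (Part 1) design horizontal seismic coefficient and design base shear:
A_h = \frac{Z}{2} \cdot \frac{I}{R} \cdot \frac{S_a}{g},
\qquad V_B = A_h \, W
% Z: zone factor (0.16, 0.24, 0.36 for zones III, IV, V); I: importance factor;
% R: response reduction factor; S_a/g: spectral acceleration coefficient;
% W: seismic weight of the building.
```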
7537 Evaluation and Selection of Elite Jatropha Genotypes for Biofuel
Authors: Bambang Heliyanto, Rully Dyah Purwati, Hasnam, Fadjry Djufry
Abstract:
Jatropha curcas L., a drought-tolerant and monoecious perennial shrub, has received attention worldwide during the past decade. Recognizing this, the Indonesian government has decided to opt for jatropha and palm oil for in-country biofuel production. To support the program, the development of high-yielding jatropha varieties is necessary. This paper reviews the jatropha improvement program in Indonesia using mass selection and hybrid development. To start with, at the end of 2005, an in-country germplasm collection was mobilized to Lampung and Nusa Tenggara Barat (NTB) provinces, and 15 provenances/sub-provenances were successfully collected to serve as a base population for selection. A significant improvement was achieved through simple recurrent breeding selection during 2006 to 2007. Seed yield productivity more than doubled, from 0.36 to 0.97 tons of dry seed per hectare, during the first selection cycle (IP-1), and then increased to 2.2 tons per hectare during the second cycle (IP-2) in the Lampung provenance. A similar result was observed in the NTB provenance: seed yield productivity increased from 0.43 to 1 ton of dry seed per hectare in the first cycle (IP-1), and then to 1.9 tons in the second cycle (IP-2). In 2008, populations (IP-3) resulting from the third cycle of selection were identified that were capable of producing 2.2 to 2.4 tons of seed per hectare. To improve seed yield per hectare further, jatropha hybrid varieties were developed involving superior provenances. As a result, the Jatropha Energy Terbarukan (JET) variety 2 was released in 2017 with a seed yield potential of 2.6 tons per hectare. The use of these high-yielding genotypes for biofuel is discussed.
Keywords: Jatropha curcas, provenance, biofuel, improve population, hybrid
Procedia PDF Downloads 171
7536 Continuum-Based Modelling Approaches for Cell Mechanics
Authors: Yogesh D. Bansod, Jiri Bursa
Abstract:
The quantitative study of cell mechanics is of paramount interest, since mechanics regulates the behavior of living cells in response to the myriad of extracellular and intracellular mechanical stimuli. Novel experimental techniques, together with robust computational approaches, have given rise to new theories and models which describe cell mechanics as a combination of biomechanical and biochemical processes. This review paper encapsulates the existing continuum-based computational approaches that have been developed for interpreting the mechanical responses of living cells under different loading and boundary conditions. The salient features and drawbacks of each model are discussed from both structural and biological points of view. This discussion can contribute to the development of even more precise and realistic computational models of cell mechanics based on continuum approaches or on their combination with microstructural approaches, which in turn may provide a better understanding of mechanotransduction in living cells.
Keywords: cell mechanics, computational models, continuum approach, mechanical models
Procedia PDF Downloads 363
7535 Analyzing How Working From Home Can Lead to Higher Job Satisfaction for Employees Who Have Care Responsibilities Using Structural Equation Modeling
Authors: Christian Louis Kühner, Florian Pfeffel, Valentin Nickolai
Abstract:
Taking care of children, dependents, or pets can be a difficult and time-consuming task. Especially for part- and full-time employees, it can feel exhausting and overwhelming to meet these obligations besides working a job. Thus, working mostly at home, without having to drive to the company, can save valuable time and stress. This study aims to show the influence that the working model has on the job satisfaction of employees with care responsibilities in comparison to employees who do not have such obligations. Using structural equation modeling (SEM), the three work models, "working from home", "working remotely", and a hybrid model, were analyzed on the basis of 13 constructs that influence job satisfaction. These 13 factors were further grouped into "classic influencing factors", "influencing factors changed by remote working", and "new remote working influencing factors". Based on the influencing factors on job satisfaction, an online survey was conducted with n = 684 employees from the service sector. Cronbach's alpha of the individual constructs was shown to be suitable. Furthermore, the construct validity of the constructs was confirmed by face validity, content validity, convergent validity (AVE > 0.5; CR > 0.7), and discriminant validity. In addition, confirmatory factor analysis (CFA) confirmed the model fit for the investigated sample (CMIN/DF: 2.567; CFI: 0.927; RMSEA: 0.048). The SEM analysis showed that the most significant influencing factor on job satisfaction is "identification with the work" (β = 0.540), followed by "appreciation" (β = 0.151), "compensation" (β = 0.124), "work-life balance" (β = 0.116), and "communication and exchange of information" (β = 0.105). While the significance of each factor can vary depending on the work model, the SEM analysis shows that identification with the work is the most significant factor in all three work models and, in the case of the traditional office work model, the only significant influencing factor. The study shows that, among employees with care responsibilities, the higher the proportion of working from home compared to working from the office, the more satisfied the employees are with their job. Since the work models that meet the requirements of comprehensive care led to higher job satisfaction among employees with such obligations, adapting as a company to employees' private obligations can be crucial to sustained success. Conversely, the satisfaction level with the office-based work model is higher for workers without caregiving responsibilities.
Keywords: care responsibilities, home office, job satisfaction, structural equation modeling
Procedia PDF Downloads 83
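The kind of SEM specification behind the reported fit indices (CMIN/DF, CFI, RMSEA) and path coefficients can be sketched with the semopy package. The indicator names (js1..ap3) and this reduced two-predictor model are invented stand-ins for the study's 13-construct model, and survey_responses.csv is an assumed file of item-level questionnaire data.

```python
# Hedged sketch of a measurement + structural model in semopy.
import pandas as pd
import semopy

model_desc = """
# measurement model: latent constructs measured by questionnaire items
JobSatisfaction =~ js1 + js2 + js3
Identification  =~ id1 + id2 + id3
Appreciation    =~ ap1 + ap2 + ap3

# structural model: paths whose standardized estimates correspond to the betas
JobSatisfaction ~ Identification + Appreciation
"""

survey_df = pd.read_csv("survey_responses.csv")  # assumed item-level data

model = semopy.Model(model_desc)
model.fit(survey_df)
print(model.inspect(std_est=True))   # path estimates, incl. standardized betas
print(semopy.calc_stats(model).T)    # fit indices such as CFI and RMSEA
```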
Procedia PDF Downloads 83
7534 Credit Card Fraud Detection with Ensemble Model: A Meta-Heuristic Approach
Authors: Gong Zhilin, Jing Yang, Jian Yin
Abstract:
The purpose of this paper is to develop a novel system for credit card fraud detection based on sequential modeling of data using hybrid deep learning models. The projected model encapsulates five major phases: pre-processing, imbalanced-data handling, feature extraction, optimal feature selection, and fraud detection with an ensemble classifier. The collected raw data (input) are pre-processed to enhance their quality by alleviating missing data, noisy data, and null values. The pre-processed data are class-imbalanced in nature, and therefore they are handled effectively with a K-means clustering-based SMOTE model. From the class-balanced data, the most relevant features are extracted, including improved principal component analysis (PCA) features, statistical features (mean, median, standard deviation), and higher-order statistical features (skewness and kurtosis). Among the extracted features, the most optimal features are selected with the Self-Improved Arithmetic Optimization Algorithm (SI-AOA), a conceptual improvement of the standard Arithmetic Optimization Algorithm. The deep learning models are long short-term memory (LSTM), convolutional neural network (CNN), and an optimized quantum deep neural network (QDNN). The LSTM and CNN are trained with the extracted optimal features, and their outcomes enter as input to the optimized QDNN, which provides the final detection outcome. Since the QDNN is the ultimate detector, its weight function is also fine-tuned with the SI-AOA.
Keywords: credit card, data mining, fraud detection, money transactions
Procedia PDF Downloads 131
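The first stages of the pipeline above can be sketched with standard tooling: K-means-based SMOTE to rebalance fraud/non-fraud classes, followed by simple statistical features (mean, median, standard deviation, skewness, kurtosis) per transaction vector. The data shapes and threshold are invented; the SI-AOA feature selection and the LSTM/CNN/QDNN ensemble are out of scope here.

```python
# Hedged sketch: K-means SMOTE rebalancing plus statistical feature extraction.
import numpy as np
from imblearn.over_sampling import KMeansSMOTE
from scipy.stats import skew, kurtosis
from sklearn.datasets import make_classification

# Stand-in for raw transaction feature vectors with heavy class imbalance
X, y = make_classification(n_samples=2000, n_features=10, weights=[0.95, 0.05],
                           random_state=0)

# K-means clustering-based SMOTE: oversample the minority class inside clusters
sampler = KMeansSMOTE(random_state=0, cluster_balance_threshold=0.05)
X_bal, y_bal = sampler.fit_resample(X, y)
print("class counts after balancing:", np.bincount(y_bal))

def statistical_features(rows):
    # mean/median/std plus higher-order skewness and kurtosis, per sample
    return np.column_stack([rows.mean(axis=1), np.median(rows, axis=1),
                            rows.std(axis=1), skew(rows, axis=1),
                            kurtosis(rows, axis=1)])

feats = statistical_features(X_bal)
print("feature matrix shape:", feats.shape)
```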