Search results for: bilinear model
15581 Lineup Optimization Model of Basketball Players Based on the Prediction of Recursive Neural Networks
Authors: Wang Yichen, Haruka Yamashita
Abstract:
In recent years, in the field of sports, decision making on matters such as game lineups and game strategy based on the analysis of accumulated sports data has been widely attempted. In the NBA, the basketball league where the world's highest-level players gather, teams analyze data using various statistical techniques in order to win games. However, it is difficult to analyze per-play game data such as ball tracking or player motion, because the game situation changes rapidly and the structure of the data is complicated. An analysis method for real-time game play data is therefore needed. In this research, we propose an analytical model for determining the optimal lineup composition using real-time play data, a task that is difficult for any coach. Because replacing the entire lineup is too complicated, the practical questions are whether the lineup should be changed and whether a Small Ball lineup should be adopted. We therefore propose an analytical model for the optimal player selection problem based on Small Ball lineups. In basketball, scoring data can be accumulated for each play, which indicates a player's contribution to the game, and this scoring data can be treated as time series data. In order to compare the importance of players in different situations and lineups, we combine a Recurrent Neural Network (RNN) model, which can analyze time series data, with a Neural Network (NN) model, which can analyze the situation on the court, to build a score prediction model. This model is capable of identifying the current optimal lineup for different situations. In this research, we collected the accumulated NBA data from the 2019-2020 season. We then apply the method to actual basketball play data to verify the reliability of the proposed model.
Keywords: recurrent neural network, players lineup, basketball data, decision making model
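As an illustration of the architecture described above (not the authors' actual implementation), the sketch below combines a recurrent branch for the play-by-play scoring sequence with a feed-forward branch for lineup/situation features; all input shapes, layer sizes, and the synthetic data are assumed.

```python
# Minimal sketch (assumed shapes/sizes): an LSTM branch reads the recent
# play-by-play scoring sequence, a dense branch reads lineup/situation
# features, and a merged head predicts the expected score contribution.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

seq_input = keras.Input(shape=(50, 4), name="play_sequence")    # 50 plays x 4 features (assumed)
ctx_input = keras.Input(shape=(12,), name="lineup_context")     # lineup/situation encoding (assumed)

x = layers.LSTM(32)(seq_input)                        # recurrent branch (time series)
y = layers.Dense(16, activation="relu")(ctx_input)    # feed-forward branch (situation)

merged = layers.concatenate([x, y])
score = layers.Dense(1, name="predicted_score")(merged)

model = keras.Model(inputs=[seq_input, ctx_input], outputs=score)
model.compile(optimizer="adam", loss="mse")

# Dummy data just to show the training call; real inputs would come from NBA play data.
seqs = np.random.rand(8, 50, 4).astype("float32")
ctx = np.random.rand(8, 12).astype("float32")
target = np.random.rand(8, 1).astype("float32")
model.fit([seqs, ctx], target, epochs=1, verbose=0)
```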
Procedia PDF Downloads 134

15580 Elasto-Plastic Behavior of Rock during Temperature Drop
Authors: N. Reppas, Y. L. Gui, B. Wetenhall, C. T. Davie, J. Ma
Abstract:
A theoretical constitutive model describing the stress-strain behavior of rock subjected to different confining pressures is presented. A bounding surface plasticity model with hardening effects, which includes the effect of temperature drop, is proposed. The bounding surface is based on a mapping rule, and the temperature effect on rock is controlled by Poisson's ratio. Validation of the results against available experimental data is also presented. The relation between deviatoric stress and axial strain is illustrated at different temperatures to analyze the effect of temperature decrease on the stiffness of the material.
Keywords: bounding surface, cooling of rock, plasticity model, rock deformation, elasto-plastic behavior
Procedia PDF Downloads 129

15579 Application of Artificial Neural Network in Initiating Cleaning of Photovoltaic Solar Panels
Authors: Mohamed Mokhtar, Mostafa F. Shaaban
Abstract:
Among the challenges facing solar photovoltaic (PV) systems in the United Arab Emirates (UAE), dust accumulation on solar panels is considered the most severe problem facing the growth of solar power plants. The accumulation of dust on the solar panels significantly degrades their output. Hence, solar PV panels have to be cleaned manually or using costly automated cleaning methods. This paper focuses on initiating cleaning actions only when required, in order to reduce maintenance costs. The cleaning actions are triggered only when the dust level exceeds a threshold value. The amount of dust accumulated on the PV panels is estimated using an artificial neural network (ANN). Experiments were conducted to collect the required data, which are used to train the ANN model. The ANN model is then fed with the output power from the solar panels, the ambient temperature, and the solar irradiance, and is thus able to estimate the amount of dust accumulated on the solar panels under these conditions. The model was tested on different case studies to confirm its accuracy.
Keywords: machine learning, dust, PV panels, renewable energy
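As a hedged illustration of the workflow described above (not the authors' code), the snippet below trains a small feed-forward regressor on assumed inputs (output power, ambient temperature, irradiance) against a measured dust level and triggers a cleaning action when the predicted dust level exceeds a threshold; the data and threshold are invented.

```python
# Minimal sketch with assumed feature columns and an assumed threshold value.
import numpy as np
from sklearn.neural_network import MLPRegressor

# Synthetic stand-in for the experimental data:
# columns = [output_power_kW, ambient_temp_C, irradiance_W_m2], target = dust_level
X = np.random.rand(200, 3)
y = np.random.rand(200)              # measured dust level (arbitrary units)

ann = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
ann.fit(X, y)

DUST_THRESHOLD = 0.7                 # assumed threshold

def cleaning_required(power, temp, irradiance):
    """Return True when the estimated dust level exceeds the threshold."""
    dust = ann.predict([[power, temp, irradiance]])[0]
    return dust > DUST_THRESHOLD

print(cleaning_required(0.4, 0.6, 0.8))
```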
Procedia PDF Downloads 145

15578 A Bayesian Multivariate Microeconometric Model for Estimation of Price Elasticity of Demand
Authors: Jefferson Hernandez, Juan Padilla
Abstract:
Estimation of price elasticity of demand is a valuable tool for the task of price setting. Given its relevance, it is an active field of microeconomic and statistical research. Price elasticity in the oil and gas industry, in particular for fuels sold in gas stations, has proven to be a challenging topic given market and state restrictions and the underlying correlation structures between the types of fuels sold by the same gas station. This paper explores the Lotka-Volterra model for price elasticity estimation in the context of fuels; in addition, multivariate random effects are introduced to deal with errors, e.g., measurement or missing-data errors. In order to model the underlying correlation structures, the Inverse-Wishart, Hierarchical Half-t, and LKJ distributions are studied. Model estimation is carried out in the Bayesian paradigm through Markov Chain Monte Carlo (MCMC) algorithms. Simulation studies covering a wide range of situations were performed in order to evaluate parameter recovery for the proposed models and algorithms. The results revealed that the proposed algorithms recovered all model parameters quite well. A real data set analysis was also performed in order to illustrate the proposed approach.
Keywords: price elasticity, volume, correlation structures, Bayesian models
Procedia PDF Downloads 166

15577 Physicochemical Characterization of Coastal Aerosols over the Mediterranean: Comparison with Weather Research and Forecasting-Chem Simulations
Authors: Stephane Laussac, Jacques Piazzola, Gilles Tedeschi
Abstract:
Estimation of the impact of atmospheric aerosols on climate evolution is an important scientific challenge. One major source of particles is the ocean, through the generation of sea-spray aerosols. In coastal areas, marine aerosols can affect air quality through their ability to interact chemically and physically with other aerosol species and gases. The integration of accurate sea-spray emission terms in modeling studies is therefore required. However, it has been found that sea-spray concentrations are not represented with the necessary accuracy in some situations, particularly at short fetch. In this study, the WRF-Chem model was implemented over a north-western Mediterranean coastal region. WRF-Chem is the Weather Research and Forecasting (WRF) model online-coupled with chemistry for the investigation of regional-scale air quality; it simulates the emission, transport, mixing, and chemical transformation of trace gases and aerosols simultaneously with the meteorology. One of the objectives was to test the ability of the WRF-Chem model to represent the fine details of the coastal geography and to provide accurate predictions of sea-spray evolution for different fetches, as well as of the anthropogenic aerosols. To assess the performance of the model, a comparison is proposed between the model predictions, obtained using a local emission inventory, and the physicochemical analysis of aerosol concentrations measured for different wind directions on the island of Porquerolles, located 10 km south of the French Riviera.
Keywords: sea-spray aerosols, coastal areas, sea-spray concentrations, short fetch, WRF-Chem model
Procedia PDF Downloads 196

15576 Mathematical Model for Defection between Two Political Parties
Authors: Abdullahi Mohammed Auwal
Abstract:
Party formation and defection from one political party to another have now become a common trend in Nigeria. Many party members who could not secure positions or win elections within their parties, or who are not satisfied with the trends in their party's internal democratic principles and mechanisms, change parties. This paper develops, presents, and analyzes a nonlinear mathematical model for defection between two political parties using an epidemiological approach. The whole population is assumed to be constant and homogeneously mixed. Equilibria have been obtained analytically and their local and global stability discussed. Conditions for the co-existence of both political parties have been determined in a study of defections between the Peoples Democratic Party (PDP) and the All Progressives Congress (APC) in Nigeria, using numerical simulations to support the analytical results.
Keywords: model, political parties, defection, stability, equilibrium, epidemiology
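To make the epidemiological analogy concrete, the following is a minimal compartmental sketch (not the authors' model): members of one party induce defections from the other at contact-like rates, with a constant, homogeneously mixed population; all parameter values are illustrative.

```python
# Minimal compartmental sketch: A and B are party memberships, N = A + B is constant.
# beta_ab: rate at which contact with A-members induces defection from B to A,
# beta_ba: the reverse. Parameter values are purely illustrative.
import numpy as np
from scipy.integrate import odeint

N = 1.0                          # normalised total population
beta_ab, beta_ba = 0.35, 0.25

def defection(y, t):
    A, B = y
    dA = beta_ab * A * B / N - beta_ba * A * B / N
    dB = -dA                     # total population stays constant
    return [dA, dB]

t = np.linspace(0, 50, 500)
sol = odeint(defection, [0.6, 0.4], t)
print("final shares:", sol[-1])  # membership shares of parties A and B at the end
```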
Procedia PDF Downloads 639

15575 Analysis of Brain Signals Using Neural Networks Optimized by Co-Evolution Algorithms
Authors: Zahra Abdolkarimi, Naser Zourikalatehsamad
Abstract:
Until about 40 years ago, after epilepsy was first characterized, it was generally believed that seizures occurred randomly and suddenly. However, thanks to advances in mathematics and engineering, such attacks can now be predicted minutes or hours in advance. Accordingly, various algorithms for long-term prediction of the time and frequency of attacks have been presented. In this paper, considering the nonlinear and dynamic nature of recorded brain signals, an ANFIS model is presented to predict brain signals, since, given the physiological structure of seizure onset, more complex neural structures can better model the signal during attacks. The contribution of this work is a co-evolution algorithm for optimizing the ANFIS network parameters. Our objective is to predict brain signals based on time series obtained from the brain signals of people suffering from epilepsy using ANFIS. Results reveal that, compared to other methods, this method is less sensitive to uncertainties such as noise and interruptions in the recorded brain signals, and is more accurate. The long-term prediction capacity of the model illustrates the potential of implanted systems for issuing warnings, guiding medication, and preventing seizures.
Keywords: co-evolution algorithms, brain signals, time series, neural networks, ANFIS model, physiologic structure, time prediction, epilepsy
Procedia PDF Downloads 284

15574 Fuzzy Logic Based Fault Tolerant Model Predictive MLI Topology
Authors: Abhimanyu Kumar, Chirag Gupta
Abstract:
This work presents a comprehensive study on the employment of Model Predictive Control (MPC) for a three-phase voltage-source inverter to regulate the output voltage efficiently. The inverter is modeled via the Clarke Transformation, considering a scenario where the load is unknown. An LC filter model is developed, demonstrating its efficacy in Total Harmonic Distortion (THD) reduction. The system, when implemented with fault-tolerant multilevel inverter topologies, ensures reliable operation even under fault conditions, a requirement that is paramount with the increasing dependence on renewable energy sources. The research also integrates a Fuzzy Logic based fault tolerance system which identifies and manages faults, ensuring consistent inverter performance. The efficacy of the proposed methodology is substantiated through rigorous simulations and comparative results, shedding light on the voltage prediction efficiency and the robustness of the model even under fault conditions.
Keywords: total harmonic distortion, fuzzy logic, renewable energy sources, MLI
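As a small aside on one step mentioned above, the Clarke transformation maps three-phase quantities (a, b, c) into the stationary α-β frame; the amplitude-invariant convention is sketched below (this is a generic textbook form, not code from the paper).

```python
# Amplitude-invariant Clarke transformation (assumed convention with a 2/3 factor).
import numpy as np

def clarke(v_a, v_b, v_c):
    """Map three-phase quantities to the stationary alpha-beta(-zero) frame."""
    v_alpha = (2.0 / 3.0) * (v_a - 0.5 * v_b - 0.5 * v_c)
    v_beta = (2.0 / 3.0) * (np.sqrt(3.0) / 2.0) * (v_b - v_c)
    v_zero = (1.0 / 3.0) * (v_a + v_b + v_c)
    return v_alpha, v_beta, v_zero

# Balanced three-phase example: alpha and beta recover cos(theta) and sin(theta).
theta = np.deg2rad(30.0)
va, vb, vc = np.cos(theta), np.cos(theta - 2*np.pi/3), np.cos(theta + 2*np.pi/3)
print(clarke(va, vb, vc))
```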
Procedia PDF Downloads 134

15573 Calibration and Validation of ArcSWAT Model for Estimation of Surface Runoff and Sediment Yield from Dhangaon Watershed
Authors: M. P. Tripathi, Priti Tiwari
Abstract:
The Soil and Water Assessment Tool (SWAT) is a distributed-parameter continuous-time model and was tested on a daily and fortnightly basis for a small agricultural watershed (Dhangaon) of Chhattisgarh state in India. The SWAT model has recently been interfaced with ArcGIS and is called ArcSWAT. The watershed and sub-watershed boundaries, drainage networks, slope, and texture maps were generated in the ArcGIS environment of ArcSWAT. A supervised classification method was used for land use/cover classification from satellite imagery of the years 2009 and 2012. Manning's roughness coefficient 'n' for overland and channel flow and the Fraction of Field Capacity (FFC) were calibrated for the monsoon seasons of 2009 and 2010. The model was validated on a daily basis for the years 2011 and 2012 using the observed daily rainfall and temperature data. The calibration and validation results revealed that the model predicted the daily surface runoff and sediment yield satisfactorily. Sensitivity analysis showed that the annual sediment yield was inversely proportional to the overland and channel 'n' values, whereas annual runoff and sediment yields were directly proportional to the FFC. The model was also tested (calibrated and validated) for fortnightly runoff and sediment yield for the years 2009-10 and 2011-12, respectively. Simulated values of fortnightly runoff and sediment yield for the calibration and validation years compared well with their observed counterparts. The calibration and validation results revealed that the ArcSWAT model could be used for identification of critical sub-watersheds and for developing management scenarios for the Dhangaon watershed. Further, the model should be tested for simulating surface runoff and sediment yield using generated rainfall and temperature data before applying it to develop management scenarios for the critical or priority sub-watersheds.
Keywords: watershed, hydrologic and water quality, ArcSWAT model, remote sensing, GIS, runoff and sediment yield
Procedia PDF Downloads 381

15572 Experimental Evaluation of UDP in Wireless LAN
Authors: Omar Imhemed Alramli
Abstract:
Like the Transmission Control Protocol (TCP), the User Datagram Protocol (UDP) is a transport-layer protocol in the Open Systems Interconnection (OSI) model and in the TCP/IP model of networks. Unlike for TCP, the UDP aspects could not be evaluated using the pcattcp tool on the Windows operating system platform. The study was therefore carried out to find a tool which supports evaluation of the UDP aspects. After collecting information about different tools, the iperf tool was chosen and run under Cygwin, which was installed both on a Windows XP platform and on Windows XP in a VirtualBox virtual machine on a single computer. Iperf is used to carry out an experimental evaluation of UDP and to observe what happens while packets are sent between the host and the guest over wired and wireless networks. Many test scenarios were carried out, and the major UDP aspects such as jitter, packet loss, and throughput were evaluated.
Keywords: TCP, UDP, IPERF, wireless LAN
Procedia PDF Downloads 357

15571 Create a Dynamic Model in Project Control and Management
Authors: Hamed Saremi, Shahla Saremi
Abstract:
In this study, the control and management of construction projects is evaluated by developing a dynamic model. The model provides means for evaluating planning assumptions and reviewing the effectiveness of several project control policies, based on previous research on time, cost, project schedule pressure management, resource management, and project control, and it adds elements and sub-systems from cost management, such as estimating the consumption budget from the cost budget and the effects of budget shortage. Using sensitivity analysis, the researchers evaluated the introduced model through simulation in the VENSIM software. Assuming optimistic durations and adding information about work progress and change rates, the project duration is forecast at 373 days (2 days sooner than originally forecast) with a final profit of $1,960,670 (23% of the contract amount), assuming a 15% annual inflation rate and cost rates in accordance with the planned amounts and other input information.
Keywords: dynamic planning, cost, time, performance, project management
Procedia PDF Downloads 479

15570 Proposal of a Model Supporting Decision-Making Based on Multi-Objective Optimization Analysis on Information Security Risk Treatment
Authors: Ritsuko Kawasaki (Aiba), Takeshi Hiromatsu
Abstract:
Management is required to understand all information security risks within an organization and to decide which risks should be treated, to what level, and with how much cost allocated. However, such decision-making is not usually easy, because various measures for risk treatment must be selected with suitable application levels. In addition, some measures may have conflicting objectives, which also makes the selection difficult. Moreover, risks generally have trends, and these should also be considered in risk treatment. Therefore, this paper provides an extension of the model proposed in the previous study. The original model supports the selection of measures by applying a combination of the weighted average method and the goal programming method for multi-objective analysis to find an optimal solution. The extended model adds the notion of weights assigned to the risks, where a larger weight indicates a higher-priority risk.
Keywords: information security risk treatment, selection of risk measures, risk acceptance, multi-objective optimization
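As a hedged sketch of how such a weighted-average/goal-programming selection could be set up (with invented costs, goals, and weights, not the paper's formulation), the example below chooses application levels of two measures by minimizing the weighted sum of unwanted goal deviations.

```python
# Goal-programming sketch for selecting application levels x1, x2 in [0, 1]
# of two security measures. Costs, risk reductions, goals and weights are invented.
import numpy as np
from scipy.optimize import linprog

cost = np.array([40.0, 60.0])        # cost of fully applying each measure
reduction = np.array([30.0, 50.0])   # risk reduction of fully applying each measure
budget_goal = 70.0                   # cost goal
reduction_goal = 55.0                # risk-reduction goal
w_over_budget, w_under_reduction = 1.0, 2.0   # weights on unwanted deviations

# Variables: [x1, x2, d_cost_minus, d_cost_plus, d_red_minus, d_red_plus]
c = [0, 0, 0, w_over_budget, w_under_reduction, 0]    # minimize weighted unwanted deviations
A_eq = [
    [cost[0], cost[1], 1, -1, 0, 0],            # cost + d- - d+ = budget goal
    [reduction[0], reduction[1], 0, 0, 1, -1],  # reduction + d- - d+ = reduction goal
]
b_eq = [budget_goal, reduction_goal]
bounds = [(0, 1), (0, 1)] + [(0, None)] * 4

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
print("application levels:", res.x[:2])
```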
Procedia PDF Downloads 462

15569 Design and Analysis of Flexible Slider Crank Mechanism
Authors: Thanh-Phong Dao, Shyh-Chour Huang
Abstract:
This study presents the optimal design and formulation of a kinematic model of a flexible slider crank mechanism. The objective of the proposed design is to take full advantage of the compliant mechanism and maximize the fatigue life by applying the Taguchi method. A kinematic model is formulated using a Pseudo-Rigid-Body Model (PRBM). By means of mathematical models, the kinematic behavior of the flexible slider crank mechanism is captured using MATLAB software. Finite Element Analysis (FEA) is used to show the stress distribution. The results show that the optimal design of the flexible hinge corresponds to a force of 8.5 N, a width of 9 mm, and a thickness of 1.1 mm. Analysis of variance shows that the thickness of the proposed hinge is the most significant parameter, with an F value of 15.5. Finally, a prototype is manufactured in preparation for testing the kinematic and dynamic behaviors.
Keywords: kinematic behavior, fatigue life, pseudo-rigid-body model, flexible slider crank mechanism
Procedia PDF Downloads 461

15568 Sampled-Data Model Predictive Tracking Control for Mobile Robot
Authors: Wookyong Kwon, Sangmoon Lee
Abstract:
In this paper, a sampled-data model predictive tracking control method is presented for mobile robots, which are modeled as constrained continuous-time linear parameter varying (LPV) systems. The presented sampled-data predictive controller is designed by a linear matrix inequality approach. Based on the input delay approach, a controller design condition is derived by constructing a new Lyapunov function. Finally, a numerical example is given to demonstrate the effectiveness of the presented method.
Keywords: model predictive control, sampled-data control, linear parameter varying systems, LPV
Procedia PDF Downloads 312

15567 Estimation of Maize Yield by Using a Process-Based Model and Remote Sensing Data in the Northeast China Plain
Authors: Jia Zhang, Fengmei Yao, Yanjing Tan
Abstract:
Accurate estimation of crop yield is of great importance for food security. In this study, a process-based mechanistic model was modified to estimate the yield of a C4 crop by adapting the carbon metabolic pathway in the photosynthesis sub-module of the RS-P-YEC (Remote-Sensing-Photosynthesis-Yield Estimation for Crops) model. The yield was calculated by multiplying net primary productivity (NPP) by the harvest index (HI) derived from the ratio of grain to stalk yield. The modified RS-P-YEC model was used to simulate maize yield in the Northeast China Plain during the period 2002-2011. Statistical maize yield data from the study area were used to validate the simulated results at the county level. The results showed that the Pearson correlation coefficient (R) between the simulated yield and the statistical data was 0.827 (P < 0.01), and the root mean square error (RMSE) was 712 kg/ha with a relative error (RE) of 9.3%. From 2002 to 2011, the yield of the maize planting zone in the Northeast China Plain increased, with a decreasing coefficient of variation (CV). The spatial pattern of simulated maize yield was consistent with the actual distribution in the Northeast China Plain, with an increasing trend from the northeast to the southwest. Hence, the results demonstrated that the modified process-based model coupled with remote sensing data is suitable for yield prediction of maize in the Northeast China Plain at this spatial scale.
Keywords: process-based model, C4 crop, maize yield, remote sensing, Northeast China Plain
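For reference, the agreement statistics quoted above can be computed as in the short sketch below, with yield taken as NPP × HI as described; the numbers are synthetic and the relative-error definition (RMSE over the observed mean) is an assumption.

```python
# Agreement statistics between simulated and observed county-level yields (kg/ha).
# Numbers are synthetic; yield = NPP * HI as described in the abstract.
import numpy as np

npp = np.array([9000.0, 10500.0, 8800.0, 11200.0])    # assumed NPP values
hi = 0.72                                             # assumed harvest index
simulated = npp * hi
observed = np.array([6400.0, 7700.0, 6200.0, 8300.0])

r = np.corrcoef(simulated, observed)[0, 1]            # Pearson correlation coefficient
rmse = np.sqrt(np.mean((simulated - observed) ** 2))  # root mean square error
re = rmse / np.mean(observed) * 100.0                 # relative error (one common definition)

print(f"R = {r:.3f}, RMSE = {rmse:.0f} kg/ha, RE = {re:.1f}%")
```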
Procedia PDF Downloads 378

15566 Study on Constitutive Model of Particle Filling Material Considering Volume Expansion
Authors: Xu Jinsheng, Tong Xin, Zheng Jian, Zhou Changsheng
Abstract:
NEPE (nitrate ester plasticized polyether) propellant is a kind of particle-filled material with a relatively high filling fraction. Experimental results show that microcracks, microvoids, and dewetting can cause stress softening of the material. In this paper, a series of mechanical tests combined with a CCD imaging technique were conducted to analyze the evolution of the internal defects of the propellant. The volume expansion function of the particle-filled material was established by measuring the longitudinal and transverse strains with an optical deformation measurement system. By analyzing the defects and internal damage of the material, a visco-hyperelastic constitutive model based on free energy theory and incorporating a damage function is proposed. The proposed constitutive model accurately predicts the mechanical response in uniaxial tensile tests and tensile-relaxation tests.
Keywords: dewetting, constitutive model, uniaxial tensile tests, visco-hyperelastic, nonlinear
Procedia PDF Downloads 303

15565 Integrated Evaluation of Green Design and Green Manufacturing Processes Using a Mathematical Model
Authors: Yuan-Jye Tseng, Shin-Han Lin
Abstract:
In this research, a mathematical model for the integrated evaluation of green design and green manufacturing processes is presented. When designing a product, there can be alternative options for designing the detailed components to fulfill the same product requirements. In these design alternatives, the components of the product can be designed with different materials and detailed specifications. If several design alternatives are proposed, the different materials and specifications can affect the manufacturing processes. In this paper, a new concept for integrating green design and green manufacturing processes is presented. A green design can be determined based on the manufacturing processes of the designed product by evaluating green criteria, including energy usage and environmental impact, in addition to the traditional criterion of manufacturing cost. With this concept, a mathematical model is developed to find the green design and the associated green manufacturing processes. In the mathematical model, the cost items include material cost, manufacturing cost, and green-related cost. The green-related cost items include energy cost and environmental cost. The objective is to find the decisions on green design and green manufacturing processes that minimize the total cost. In practical applications, a decision can be made to select a good green design case and its green manufacturing processes. An example product is illustrated, showing that the model is practical and useful for the integrated evaluation of green design and green manufacturing processes.
Keywords: supply chain management, green supply chain, green design, green manufacturing, mathematical model
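To illustrate the structure of such an integrated evaluation (with invented design alternatives and cost figures, not the paper's data), the sketch below sums material, manufacturing, energy, and environmental costs for each alternative and selects the minimum-total-cost case.

```python
# Sketch: pick the green design alternative with minimum total cost, where
# total cost = material + manufacturing + energy + environmental cost.
# All alternatives and cost figures are invented for illustration.
design_alternatives = {
    "aluminium_case": {"material": 120.0, "manufacturing": 80.0, "energy": 25.0, "environmental": 15.0},
    "steel_case":     {"material": 90.0,  "manufacturing": 95.0, "energy": 35.0, "environmental": 22.0},
    "composite_case": {"material": 150.0, "manufacturing": 70.0, "energy": 18.0, "environmental": 10.0},
}

def total_cost(costs):
    return sum(costs.values())

best = min(design_alternatives, key=lambda name: total_cost(design_alternatives[name]))
print(best, total_cost(design_alternatives[best]))
```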
Procedia PDF Downloads 809

15564 Facility Data Model as Integration and Interoperability Platform
Authors: Nikola Tomasevic, Marko Batic, Sanja Vranes
Abstract:
Emerging Semantic Web technologies can be seen as the next step in the evolution of intelligent facility management systems. In particular, this involves increased use of open-source and/or standardized concepts for data classification and semantic interpretation. To deliver such facility management systems, providing a comprehensive integration and interoperability platform in the form of a facility data model is a prerequisite. In this paper, one possible modelling approach for providing such an integrative facility data model, based on the ontology modelling concept, is presented. The complete ontology development process is described, starting from input data acquisition, through the definition of ontology concepts, to the population of those concepts. First, a core facility ontology was developed, representing the generic facility infrastructure and comprising the common facility concepts relevant from the facility management perspective. To develop the data model of a specific facility infrastructure, extension and then population of the core facility ontology were performed. For the development of the full-blown facility data models, Malpensa and Fiumicino airports in Italy, two major European air-traffic hubs, were chosen as a test-bed platform. Furthermore, the way these ontology models supported the integration and interoperability of the overall airport energy management system is analyzed as well.
Keywords: airport ontology, energy management, facility data model, ontology modeling
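As a generic illustration of the ontology-based approach (not the authors' airport ontology), the snippet below defines a tiny core facility ontology with rdflib and then populates it with one instance; all class and property names are invented.

```python
# Tiny facility-ontology sketch using rdflib; class and property names are invented.
from rdflib import Graph, Namespace, Literal, RDF, RDFS, OWL

FAC = Namespace("http://example.org/facility#")
g = Graph()
g.bind("fac", FAC)

# Core ontology: generic facility concepts.
g.add((FAC.Facility, RDF.type, OWL.Class))
g.add((FAC.EnergyMeter, RDF.type, OWL.Class))
g.add((FAC.hasMeter, RDF.type, OWL.ObjectProperty))
g.add((FAC.hasMeter, RDFS.domain, FAC.Facility))
g.add((FAC.hasMeter, RDFS.range, FAC.EnergyMeter))

# Population step: a concrete facility instance with one meter.
g.add((FAC.TerminalA, RDF.type, FAC.Facility))
g.add((FAC.Meter01, RDF.type, FAC.EnergyMeter))
g.add((FAC.TerminalA, FAC.hasMeter, FAC.Meter01))
g.add((FAC.Meter01, RDFS.label, Literal("Main electricity meter")))

print(g.serialize(format="turtle"))
```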
Procedia PDF Downloads 450

15563 Global Stability Analysis of a Coupled Model for Healthy and Cancerous Cells Dynamics in Acute Myeloid Leukemia
Authors: Abdelhafid Zenati, Mohamed Tadjine
Abstract:
The mathematical formulation of biomedical problems is an important phase in understanding and predicting the dynamics of the controlled population. In this paper, we perform a stability analysis of a coupled model for the dynamics of healthy and cancerous cells in acute myeloid leukemia; this represents our first aim. Second, we illustrate the effect of the interconnection between healthy and cancer cells. The PDE-based model is transformed into a nonlinear distributed state-space model (a delay system). For an equilibrium point of interest, necessary and sufficient conditions for global asymptotic stability are given. We thus provide necessary and sufficient conditions for the global asymptotic stability of the origin and of the healthy situation, and for the control of the dynamics of normal hematopoietic stem cells and cancerous cells during acute myeloid leukemia. Simulation studies are given to illustrate the developed results.
Keywords: distributed delay, global stability, modelling, nonlinear models, PDE, state space
Procedia PDF Downloads 252

15562 The Impacts of Local Decision Making on Customisation Process Speed across Distributed Boundaries
Authors: Abdulrahman M. Qahtani, Gary. B. Wills, Andy. M. Gravell
Abstract:
Communicating and managing customers' requirements plays a vital role in software development projects. While this is difficult to do locally, it is even more difficult to communicate these requirements over distributed boundaries and to convey them to multiple distributed customers. This paper discusses the communication of multiple distributed customers' requirements in the context of customised software products. The main purpose is to understand the challenges of communicating and managing customisation requirements across distributed boundaries. We propose a model for Communicating Customisation Requirements of Multi-Clients in a Distributed Domain (CCRD). Thereafter, we evaluate that model by presenting the findings of a case study conducted with a company with customisation projects for 18 distributed customers. Then, we compare the outputs of the real case process and the outputs of the CCRD model using simulation methods. Our conjecture is that the CCRD model can reduce the challenge of communicating requirements over distributed organisational boundaries, as well as the delay in decision making and in the overall customisation process time.
Keywords: customisation software products, global software engineering, local decision making, requirement engineering, simulation model
Procedia PDF Downloads 430

15561 Study of the Relationship between the Roughness Configuration of Channel Bottom and the Creation of Vortices at the Rough Area: Numerical Modelling
Authors: Youb Said, Fourar Ali
Abstract:
To describe the influence of bottom roughness on free-surface flows by numerical modeling, a two-dimensional model was developed. The equations of continuity and momentum (Navier-Stokes equations) are solved by the finite volume method. We considered turbulent flow in an open channel with bottom roughness. For our simulations, the k-ε model was used. After setting the initial and boundary conditions and solving the equation set, we obtained the following results: vortices form in the hollows, causing substantial energy dissipation in the obstacle areas that constitute the bottom roughness. The comparison of our results with experimental ones shows good agreement in the rough area. However, in other areas, the differences were more or less significant. These differences occur in areas far from the bottom, especially in the free-surface area just after the rough bottom. These disagreements are probably due to the empirical constants used by the k-ε model.
Keywords: modeling, free surface flow, turbulence, bottom roughness, finite volume, K-ε model, energy dissipation
Procedia PDF Downloads 382

15560 Using of Particle Swarm Optimization for Loss Minimization of Vector-Controlled Induction Motors
Authors: V. Rashtchi, H. Bizhani, F. R. Tatari
Abstract:
This paper presents a new online loss minimization method for an induction motor drive. Among the many loss minimization algorithms (LMAs) for an induction motor, particle swarm optimization (PSO) has the advantages of fast response and high accuracy. However, the performance of PSO and other optimization algorithms depends on the accuracy of the modeling of the motor drive and its losses. In the development of the loss model, there is always a trade-off between accuracy and complexity. This paper presents a new online optimization to determine the optimum flux level for efficiency optimization of the vector-controlled induction motor drive. An induction motor (IM) model in d-q coordinates is referenced to the rotor magnetizing current. This transformation results in no leakage inductance on the rotor side; thus, the decomposition into d-q components in the steady-state motor model can be utilized in deriving the motor loss model. The suggested algorithm is simple to implement.
Keywords: induction machine, loss minimization, magnetizing current, particle swarm optimization
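A minimal PSO sketch (not the paper's implementation) is given below to show how a swarm could search for the flux level that minimizes a modeled loss; the loss function and all parameters are placeholders.

```python
# Minimal particle swarm optimization over a single variable (flux level).
# The loss model below is a placeholder, not an induction-motor loss model.
import numpy as np

rng = np.random.default_rng(0)

def loss(flux):
    # Placeholder convex loss with a minimum near flux = 0.6 (arbitrary units).
    return (flux - 0.6) ** 2 + 0.05

n_particles, n_iter = 20, 50
w, c1, c2 = 0.7, 1.5, 1.5                     # inertia and acceleration coefficients

pos = rng.uniform(0.1, 1.0, n_particles)      # candidate flux levels
vel = np.zeros(n_particles)
pbest = pos.copy()
pbest_val = loss(pbest)
gbest = pbest[np.argmin(pbest_val)]

for _ in range(n_iter):
    r1, r2 = rng.random(n_particles), rng.random(n_particles)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0.1, 1.0)
    vals = loss(pos)
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[np.argmin(pbest_val)]

print("optimum flux level:", gbest)
```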
Procedia PDF Downloads 634

15559 Estimation of the Length and Location of Ground Surface Deformation Caused by the Reverse Faulting
Authors: Nader Khalafian, Mohsen Ghaderi
Abstract:
Field observations have revealed many examples of structures damaged by ground surface deformation caused by faulting. In this paper, an effort is made to estimate the length and location of the ground surface zone where large displacements are created by reverse faulting. This research was conducted in two steps. (1) In the first step, a 2D explicit finite element model was developed using the ABAQUS software. A subroutine for the Mohr-Coulomb failure criterion with a strain-softening model was developed by the authors in order to properly model the stress-strain behavior of the soil in the fault rupture zone. The results of the numerical analysis were verified against the results of available centrifuge experiments, and reasonable agreement was found between the numerical and experimental data. (2) In the second step, the effects of the fault dip angle (δ), the depth of the soil layer (H), the dilation and friction angles of the sand (ψ and φ), and the amount of fault offset (d) on the soil surface displacement and fault rupture path were investigated. An artificial neural network (ANN), a powerful prediction tool, was developed to generate a general model for predicting faulting characteristics. A properly sized database was created to train and test the network. It was found that the length and location of the zone of displaced ground surface can be accurately estimated using the proposed model.
Keywords: reverse faulting, surface deformation, numerical, neural network
Procedia PDF Downloads 421

15558 A Convolutional Neural Network-Based Model for Lassa fever Virus Prediction Using Patient Blood Smear Image
Authors: A. M. John-Otumu, M. M. Rahman, M. C. Onuoha, E. P. Ojonugwa
Abstract:
A Convolutional Neural Network (CNN) model for predicting Lassa fever was built using the Python 3.8.0 programming language, with the Keras 2.2.4 and TensorFlow 2.6.1 libraries as the development environment, in order to reduce the current high risk of Lassa fever in West Africa, particularly in Nigeria. The study was prompted by major flaws in the existing conventional laboratory equipment for diagnosing Lassa fever (RT-PCR), as well as flaws in the AI-based techniques that have been used for probing and prognosis of Lassa fever reported in the literature. A total of 15,679 blood smear microscopic images were collected. The proposed model was trained on 70% of the dataset and tested on the remaining 30% of the microscopic images to avoid overfitting. A 3x3x3 convolution filter was used in the proposed system to extract features from the microscopic images. The proposed CNN-based model achieved a recall of 96%, a precision of 93%, an F1 score of 95%, and an accuracy of 94% in classifying the images as clean or infected samples. Based on empirical evidence from the literature consulted, the proposed model outperformed the other existing AI-based techniques evaluated. If properly deployed, the model will assist physicians, medical laboratory scientists, and patients in making accurate diagnoses of Lassa fever cases, allowing the mortality rate due to the Lassa fever virus to be reduced through sound decision-making.
Keywords: artificial intelligence, ANN, blood smear, CNN, deep learning, Lassa fever
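A hedged sketch (not the authors' network) of a small Keras CNN along the lines described: 3x3 convolutions over blood-smear images, a 70/30 train/test split, and binary clean/infected classification; the image size, layer widths, and synthetic data are assumed.

```python
# Small CNN sketch for binary (clean vs. infected) classification of blood-smear
# images. Image size, layer widths and the synthetic data are assumed.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

images = np.random.rand(100, 64, 64, 3).astype("float32")   # stand-in for microscope images
labels = np.random.randint(0, 2, size=(100,))

split = int(0.7 * len(images))                               # 70/30 train/test split
x_train, x_test = images[:split], images[split:]
y_train, y_test = labels[:split], labels[split:]

model = keras.Sequential([
    layers.Conv2D(16, (3, 3), activation="relu", input_shape=(64, 64, 3)),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(32, (3, 3), activation="relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),
    layers.Dense(32, activation="relu"),
    layers.Dense(1, activation="sigmoid"),                   # clean vs. infected
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy", keras.metrics.Precision(), keras.metrics.Recall()])
model.fit(x_train, y_train, epochs=1, verbose=0)
print(model.evaluate(x_test, y_test, verbose=0))
```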
Procedia PDF Downloads 120

15555 Developing Measurement Instruments for Enterprise Resources Planning (ERP) Post-Implementation Failure Model
Authors: Malihe Motiei, Nor Hidayati Zakaria, Davide Aloini
Abstract:
This study aims to present a method for developing a failure measurement model for ERP post-implementation. To achieve this outcome, the study first evaluates the suitability of the Technology-Organization-Environment framework for the proposed conceptual model. The study explains how to discover the constructs and subsequently how to design and evaluate them as formative or reflective. Both reflective and purely formative constructs are used. The risk dimensions are then investigated to determine the instruments for examining the impact of risk on ERP failure after implementation. Two formative constructs are inadequate implementation and poor organizational decision making. Six reflective constructs are technical risks, operational risks, managerial risks, top management risks, lack of external risks, and user inefficiency risks. A survey was conducted among Iranian industries to collect data; 69 responses were collected from manufacturing sectors, and the data were analyzed with the SmartPLS software. The results indicated that all measurements, comprising 39 critical risk factors, were acceptable for the ERP post-implementation failure model.
Keywords: critical risk factors (CRFs), ERP projects, ERP post-implementation, measurement instruments, ERP system failure measurement model
Procedia PDF Downloads 364

15554 The Methodology of System Modeling of Mechatronic Systems
Authors: Lakhoua Najeh
Abstract:
Aims of the work: After a presentation of the functionality of an example mechatronic system, a paint mixer system, we present the concepts of modeling and safe operation. This paper briefly discusses how to model and protect the functioning of a mechatronic system, relying mainly on functional analysis and safe operation techniques. Methods: For the study of the example mechatronic system, we use methods of external functional analysis that illustrate the relationships between a mechatronic system and its external environment. Thus, we present the Safe-Structured Analysis Design Technique (Safe-SADT) method, which allows the representation of a mechatronic system. A model of operating safety and automation is proposed. This model enables us to perform a functional analysis of the mechatronic system based on the GRAFCET (Graphe Fonctionnel de Commande des Etapes et Transitions: Step Transition Function Chart) method, a study of the safe operation of the mechatronic system based on the Safe-SADT method, and automation of the mechatronic system based on a software tool. Results: The expected result is to propose a model and safe-operation scheme for a mechatronic system. This methodology enables us to analyze the relevance of the different models based on Safe-SADT and GRAFCET in relation to the control and monitoring functions, and to study the means of exploiting their synergy. Conclusion: In order to propose a general model of a mechatronic system, a model of analysis, safe operation, and automation of a mechatronic system has been developed. We propose to validate this methodology through a case study of a paint mixer system.
Keywords: mechatronic systems, system modeling, safe operation, Safe-SADT
Procedia PDF Downloads 245

15553 Sensitive Analysis of the ZF Model for ABC Multi Criteria Inventory Classification
Authors: Makram Ben Jeddou
Abstract:
ABC classification is widely used by managers for inventory control. The classical ABC classification is based on the Pareto principle and considers only the criterion of annual use value. Single-criterion classification is often insufficient for close inventory control. Multi-criteria inventory classification models have therefore been proposed by researchers in order to take other important criteria into account. Among these models, we consider the ZF model and carry out a sensitivity analysis on the composite score calculated for each item. In fact, this score, based on a normalized average of a good and a bad optimized index, can affect the ABC classification of items. We then focus on the weights assigned to each index and propose a classification compromise.
Keywords: ABC classification, multi criteria inventory classification models, ZF-model
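To make the composite-score idea concrete, here is a simplified sketch (invented data and class split, not the ZF model's exact formulation): each item gets a 'good' and a 'bad' normalized index, the composite score is their weighted average, and shifting the weight can move items between ABC classes.

```python
# Simplified sketch of a ZF-style composite score: each item has an optimistic
# ("good") and a pessimistic ("bad") normalized index, combined with weight lam.
# Data, indices and the class split are invented for illustration.
import numpy as np

good = np.array([0.95, 0.88, 0.60, 0.40, 0.20])   # best-case normalized index per item
bad = np.array([0.60, 0.85, 0.30, 0.35, 0.10])    # worst-case normalized index per item

def abc_classes(lam):
    score = lam * good + (1.0 - lam) * bad        # composite score
    order = np.argsort(score)[::-1]               # rank items from highest to lowest
    classes = np.empty(len(score), dtype="<U1")
    classes[order[:1]] = "A"                      # top item -> A, next two -> B, rest -> C
    classes[order[1:3]] = "B"
    classes[order[3:]] = "C"
    return classes

print(abc_classes(0.5))   # balanced weighting
print(abc_classes(0.9))   # sensitivity check: weight shifted toward the good index
```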
Procedia PDF Downloads 508

15552 An Economic Order Quantity Model for Deteriorating Items with Ramp Type Demand, Time Dependent Holding Cost and Price Discount Offered on Backorders
Authors: Arjun Paul, Adrijit Goswami
Abstract:
In the present work, an economic order quantity inventory model with shortages is developed in which the holding cost is expressed as a linearly increasing function of time and the demand rate is a ramp-type function of time. The items considered in the model are deteriorating in nature, so that a small fraction of the items is depleted with the passage of time. In order to consider a more realistic situation, the deterioration rate is assumed to follow a continuous uniform distribution whose parameters are triangular fuzzy numbers. The inventory manager offers customers a discount if they are willing to backorder their demand when there is a stock-out. The optimum ordering policy and the optimum discount offered for each backorder are determined by minimizing the total cost in a replenishment interval. To better illustrate the proposed model in both the crisp and fuzzy senses and to provide richer insights, a numerical example is given to exemplify the policy and to analyze the sensitivity of the model parameters.
Keywords: fuzzy deterioration rate, price discount on backorder, ramp type demand, shortage, time varying holding cost
Procedia PDF Downloads 199

15551 Enhancing Model Interoperability and Reuse by Designing and Developing a Unified Metamodel Standard
Authors: Arash Gharibi
Abstract:
Mankind has always used models to solve problems. Essentially, models are simplified versions of reality, and the need for them stems from having to deal with complexity; many processes or phenomena are too complex to be described completely. Thus, a fundamental model requirement is that it contain the characteristic features that are essential in the context of the problem to be solved or described. Models are used in virtually every scientific domain to deal with various problems. During recent decades, the number of models has increased exponentially. Publication of models as part of original research has traditionally been in scientific periodicals, series, monographs, agency reports, national journals, and laboratory reports. This makes it difficult for interested groups and communities to stay informed about the state of the art. During the modeling process, many important decisions are made which impact the final form of the model. Without a record of these considerations, the final model remains ill-defined and open to varying interpretations. Unfortunately, the details of these considerations are often lost, and where information about a model does exist, it is likely to be written informally, in differing layouts and with differing degrees of detail. In order to overcome these issues, different domains have attempted to implement their own approaches to preserving model information in the form of model documentation. The most frequently cited model documentation approaches are domain-specific and not applicable to existing models, and evolutionary flexibility with intrinsic corrections and improvements is not possible with the current approaches. As a way forward, this research proposes a new standard for capturing and managing model information in a unified way so that interoperability and reusability of models become possible. This standard will also be evolutionary, meaning that members of the modeling community can contribute to its ongoing development and improvement. In this paper, three of the most common current metamodels are reviewed and, based on the pros and cons of each, a new metamodel is proposed.
Keywords: metamodel, modeling, interoperability, reuse
Procedia PDF Downloads 198

15550 Optical and Double Folding Model Analysis for Alpha Particles Elastically Scattered from 9Be and 11B Nuclei at Different Energies
Authors: Ahmed H. Amer, A. Amar, Sh. Hamada, I. I. Bondouk, F. A. El-Hussiny
Abstract:
Elastic scattering of α-particles from 9Be and 11B nuclei at different alpha energies has been analyzed. Optical model parameters (OMPs) for α-particle elastic scattering by these nuclei at different energies have been obtained. In the present calculations, the real part of the optical potential is derived by folding the nucleon-nucleon (NN) interaction into the nuclear matter density distributions of the projectile and target nuclei using the computer code FRESCO. A density-dependent version of the M3Y interaction (CDM3Y6), which is based on the G-matrix elements of the Paris NN potential, has been used. The volume integrals of the real and imaginary potential depths (JR, JW) have been calculated and found to be energy dependent. Good agreement is obtained between the experimental data and the theoretical predictions over the whole angular range. In the double folding (DF) calculations, the obtained normalization coefficient Nr is in the range 0.70–1.32.
Keywords: elastic scattering, optical model, double folding model, density distribution
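For reference, the double-folding real potential referred to above has the standard generic form (notation may differ from the paper's):

```latex
V_{DF}(R) = N_r \int \! d\mathbf{r}_p \int \! d\mathbf{r}_t \;
\rho_p(\mathbf{r}_p)\, \rho_t(\mathbf{r}_t)\, v_{NN}(\mathbf{s}),
\qquad \mathbf{s} = \mathbf{R} + \mathbf{r}_t - \mathbf{r}_p
```

where ρp and ρt are the projectile and target matter densities, vNN is the effective (here density-dependent CDM3Y6) nucleon-nucleon interaction, and Nr is the renormalization coefficient quoted above.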
Procedia PDF Downloads 291