Search results for: model based engineering MBE
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 38969

38009 Modeling and Simulation of Fluid Catalytic Cracking Process

Authors: Sungho Kim, Dae Shik Kim, Jong Min Lee

Abstract:

The fluid catalytic cracking (FCC) process is one of the most important processes in the modern refinery industry. Because the FCC process is difficult to model well, owing to its nonlinearities and the many interactions between its process variables, rigorous process modeling of the whole FCC plant is required for control and plant-wide optimization. In this study, a process design for an FCC plant comprising a riser reactor, main fractionator, and gas processing unit was developed. The reactor model is based on a four-lump kinetic scheme. The main fractionator, gas processing unit, and other process units were designed to reproduce real plant data using the process flowsheet simulator Aspen PLUS. The custom reactor model was then integrated with the flowsheet simulator to develop an integrated model of the whole plant.
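
A four-lump scheme of this kind is commonly written with gas oil cracking into gasoline, light gas, and coke, and gasoline further over-cracking into gas and coke. The sketch below integrates such a lumped kinetic model on the assumption of isothermal plug flow; the rate constants and stoichiometry are illustrative placeholders, not values from the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative four-lump riser kinetics (lumps: gas oil, gasoline, gas, coke).
# Rate constants k are hypothetical placeholders, not fitted plant values.
k1, k2, k3 = 0.30, 0.10, 0.05   # gas oil -> gasoline / gas / coke (2nd order)
k4, k5 = 0.02, 0.01             # gasoline -> gas / coke (1st order)

def four_lump(t, y):
    go, gl, gas, coke = y
    r_go = (k1 + k2 + k3) * go**2        # gas oil consumed by all three paths
    return [-r_go,
            k1 * go**2 - (k4 + k5) * gl,  # gasoline formed, then over-cracked
            k2 * go**2 + k4 * gl,         # light gas
            k3 * go**2 + k5 * gl]         # coke on catalyst

sol = solve_ivp(four_lump, (0.0, 10.0), [1.0, 0.0, 0.0, 0.0])
print(dict(zip(["gas oil", "gasoline", "gas", "coke"], sol.y[:, -1].round(3))))
```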

Keywords: fluid catalytic cracking, simulation, plant data, process design

Procedia PDF Downloads 526
38008 Numerical Solutions of Fractional Order Epidemic Model

Authors: Sadia Arshad, Ayesha Sohail, Sana Javed, Khadija Maqbool, Salma Kanwal

Abstract:

The dynamics of carriers play an essential role in the evolution and global transmission of infectious diseases and are discussed in this study. To make the approach novel, we consider a fractional-order model, which generalizes the integer-order derivative to an arbitrary (non-integer) order. Since the integration involved is non-local, this property of the fractional operator is very useful for studying epidemic models of infectious diseases. An extended numerical method (ODE solver) is implemented on the model equations, and simulations of the model are presented for different values of the fractional order to study the effect of carriers on transmission dynamics. The global dynamics of the fractional model are established using the reproduction number.
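
The abstract does not state which solver was extended, but a common explicit scheme for fractional-order systems is the Grünwald-Letnikov (GL) discretization, whose memory term makes the non-locality explicit. The sketch below applies it to a hypothetical fractional SIR-type model; the model form and parameters are illustrative, not taken from the paper.

```python
import numpy as np

# Hypothetical fractional SIR-type model solved with a Grunwald-Letnikov
# scheme. All parameters are illustrative placeholders.
alpha, h, n_steps = 0.9, 0.05, 2000          # fractional order, step, horizon
beta, gamma = 0.4, 0.1                       # transmission and recovery rates

def f(y):
    s, i, r = y
    return np.array([-beta * s * i, beta * s * i - gamma * i, gamma * i])

# GL binomial weights: w0 = 1, wj = w_{j-1} * (1 - (alpha + 1) / j)
w = np.ones(n_steps + 1)
for j in range(1, n_steps + 1):
    w[j] = w[j - 1] * (1.0 - (alpha + 1.0) / j)

y = np.zeros((n_steps + 1, 3))
y[0] = [0.99, 0.01, 0.0]
for n in range(1, n_steps + 1):
    memory = sum(w[j] * y[n - j] for j in range(1, n + 1))  # non-local term
    y[n] = f(y[n - 1]) * h**alpha - memory

print("final (S, I, R):", y[-1].round(4))
```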

Keywords: fractional differential equation, numerical simulations, epidemic model, transmission dynamics

Procedia PDF Downloads 594
38007 Fuzzy Time Series Forecasting Based on Fuzzy Logical Relationships, PSO Technique, and Automatic Clustering Algorithm

Authors: A. K. M. Kamrul Islam, Abdelhamid Bouchachia, Suang Cang, Hongnian Yu

Abstract:

Forecasting models have a great impact on prediction and will continue to do so in the future. Although many forecasting models have been studied in recent years, most researchers focus on fuzzy time series based methods to solve forecasting problems. The accuracy of such models depends largely on two factors: the lengths of the intervals partitioning the universe of discourse and the content of the forecast rules. Moreover, a hybrid forecasting method can be a more effective and efficient way to improve forecasts than an individual model. Several hybrid forecasting models combine fuzzy time series with evolutionary algorithms, but their performance is not quite satisfactory. In this paper, we propose a hybrid forecasting model that combines first-order as well as high-order fuzzy time series with particle swarm optimization to improve forecasting accuracy. The proposed method uses the historical enrollments of the University of Alabama as the dataset in the forecasting process. First, an automatic clustering algorithm is used to calculate appropriate intervals for the historical enrollments. Then, particle swarm optimization and fuzzy time series are combined, which shows better forecasting accuracy than other existing forecasting models.
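
For context, a minimal first-order fuzzy time series forecaster on the enrollment data works roughly as follows: partition the universe of discourse into intervals, fuzzify each observation to the interval containing it, collect fuzzy logical relationships A_i -> A_j, and forecast with the midpoints of the consequent intervals. The sketch below uses equal-width intervals for brevity; the paper's contribution is precisely to replace this fixed partition with one tuned by automatic clustering and PSO.

```python
import numpy as np
from collections import defaultdict

def fts_forecast(series, n_intervals=7):
    """Minimal first-order fuzzy time series forecast (equal-width partition)."""
    lo, hi = min(series) - 1, max(series) + 1
    edges = np.linspace(lo, hi, n_intervals + 1)
    mids = (edges[:-1] + edges[1:]) / 2

    # Fuzzify: index of the interval containing each observation.
    states = np.clip(np.searchsorted(edges, series, side="right") - 1,
                     0, n_intervals - 1)

    # Fuzzy logical relationship groups: A_i -> {A_j, ...}
    flrg = defaultdict(set)
    for a, b in zip(states[:-1], states[1:]):
        flrg[a].add(b)

    # Forecast each step as the mean midpoint of the consequents.
    preds = [np.mean([mids[j] for j in flrg[s]]) if s in flrg else mids[s]
             for s in states[:-1]]
    return np.array(preds)

# First years of the widely used University of Alabama enrollment series.
enroll = [13055, 13563, 13867, 14696, 15460, 15311, 15603,
          15861, 16807, 16919, 16388, 15433, 15497, 15145]
print(fts_forecast(enroll).round(0))
```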

Keywords: fuzzy time series (FTS), particle swarm optimization, clustering algorithm, hybrid forecasting model

Procedia PDF Downloads 245
38006 How to Perform Proper Indexing?

Authors: Watheq Mansour, Waleed Bin Owais, Mohammad Basheer Kotit, Khaled Khan

Abstract:

Efficient query processing is one of the utmost requisites in any business environment to satisfy consumer needs. This paper investigates the main types of indexing models, viz. primary, secondary, and multi-level, under the ambit of the various types of queries for which each indexing model performs with efficacy. The study also discusses the inherent advantages and disadvantages of each indexing model and how an indexing model can be chosen for a particular environment. The paper draws parallels between the various indexing models and provides recommendations that would help a database administrator zero in on the indexing model suited to the needs and requirements of the production environment. In addition, to cope with today's colossal data generation, this study proposes two novel indexing techniques that can index highly unstructured and structured Big Data with efficacy. The study also briefly discusses some best practices that industry should follow in order to choose an indexing model apposite to its prerequisites and requirements.
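
As a toy illustration of the primary/secondary distinction, the sketch below keeps records ordered by their primary key and adds a sorted secondary index on a non-key column, so both point and range lookups avoid a full scan. It is a pedagogical stand-in for the B-tree structures a real DBMS would use, not anything from the paper.

```python
from bisect import bisect_left, bisect_right

# Records stored in primary-key order (id): binary search over the sorted
# file acts as a primitive primary index.
rows = [(1, "ana", 41), (3, "bo", 25), (7, "cy", 33), (9, "di", 25)]
ids = [r[0] for r in rows]

def by_id(pk):
    i = bisect_left(ids, pk)
    return rows[i] if i < len(rows) and ids[i] == pk else None

# Secondary index on a non-key column (age): sorted (age, position) pairs,
# analogous to a dense secondary index with pointers into the file.
age_idx = sorted((r[2], pos) for pos, r in enumerate(rows))
age_keys = [k for k, _ in age_idx]

def by_age_range(lo, hi):
    i, j = bisect_left(age_keys, lo), bisect_right(age_keys, hi)
    return [rows[pos] for _, pos in age_idx[i:j]]

print(by_id(7))               # point query via primary index
print(by_age_range(25, 35))   # range query via secondary index
```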

Keywords: indexing, hashing, latent semantic indexing, B-tree

Procedia PDF Downloads 153
38005 Agent-Based Modelling to Improve Dairy-origin Beef Production: Model Description and Evaluation

Authors: Addisu H. Addis, Hugh T. Blair, Paul R. Kenyon, Stephen T. Morris, Nicola M. Schreurs, Dorian J. Garrick

Abstract:

Agent-based modeling (ABM) enables an in silico representation of complex systems and captures agent behavior resulting from interaction with other agents and their environment. This study developed an ABM to represent pasture-based beef cattle finishing systems in New Zealand (NZ) using attributes of the rearer, finisher, and processor, as well as specific attributes of dairy-origin beef cattle. The model was parameterized using values representing 1% of NZ dairy-origin cattle, and 10% of rearers and finishers in NZ. The cattle agent consisted of 32% Holstein-Friesian, 50% Holstein-Friesian–Jersey crossbred, and 8% Jersey, with the remainder being other breeds. Rearers and finishers repetitively and simultaneously interacted to determine the type and number of cattle populating the finishing system. Rearers brought in four-day-old spring-born calves and reared them until 60 calves (representing a full truckload) on average had a live weight of 100 kg before selling them on to finishers. Finishers mainly attained weaners from rearers, or directly from dairy farmers when weaner demand was higher than the supply from rearers. Fast-growing cattle were sent for slaughter before the second winter, and the remainder were sent before their third winter. The model finished a higher number of bulls than heifers and steers, although it was 4% lower than the industry-reported value. Holstein-Friesian and Holstein-Friesian–Jersey-crossbred cattle dominated the dairy-origin beef finishing system. Jersey cattle accounted for less than 5% of total processed beef cattle. Further studies to include retailer and consumer perspectives and other decision alternatives for finishing farms would improve the applicability of the model for decision-making processes.
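
A heavily simplified sketch of the rearer-finisher interaction described above might look as follows; the batch size of 60 and the 100 kg sale weight come from the abstract, while the growth rates, starting weight, and stocking limit are invented placeholders.

```python
import random

random.seed(1)
BATCH, SALE_WT = 60, 100.0  # truckload size and live weight from the abstract

class Rearer:
    def __init__(self):
        self.calves = []                 # live weights of calves on hand
    def buy_calves(self, n):
        self.calves += [40.0] * n        # 4-day-old calves, assumed ~40 kg
    def grow_and_sell(self):
        self.calves = [w + random.uniform(0.6, 0.9) for w in self.calves]
        ready = [w for w in self.calves if w >= SALE_WT]
        if len(ready) < BATCH:           # sell only in full truckloads
            return []
        sold = ready[:BATCH]
        self.calves = [w for w in self.calves if w < SALE_WT] + ready[BATCH:]
        return sold

class Finisher:
    def __init__(self, capacity=500):    # assumed stocking limit
        self.capacity, self.herd = capacity, []
    def buy_weaners(self, weaners):
        space = self.capacity - len(self.herd)
        self.herd += weaners[:space]

rearer, finisher = Rearer(), Finisher()
rearer.buy_calves(120)
for day in range(400):                   # daily interaction loop
    finisher.buy_weaners(rearer.grow_and_sell())
print(len(finisher.herd), "weaners transferred to the finisher")
```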

Keywords: agent-based modelling, dairy cattle, beef finishing, rearers, finishers

Procedia PDF Downloads 93
38004 Delineating Floodplain along the Nasia River in Northern Ghana Using HAND Contour

Authors: Benjamin K. Ghansah, Richard K. Appoh, Iliya Nababa, Eric K. Forkuo

Abstract:

The Nasia River is an important source of water for domestic and agricultural purposes for the inhabitants of its catchment. Major farming activities take place within the floodplain of the river and its network of tributaries. The actual inundation extent of the river system is, however, unknown. Reasons for this lack of information include financial constraints and inadequate human resources, as flood modelling is becoming increasingly complex. Knowledge of the inundation extent will help in assessing the risk posed by the annual flooding of the river and in planning flood recession agricultural activities. This study used a simple terrain-based algorithm, Height Above Nearest Drainage (HAND), to delineate the floodplain of the Nasia River and its tributaries. The HAND model is a drainage-normalized digital elevation model whose height reference is the local drainage network rather than mean sea level (AMSL). The underlying principle of the HAND model is that hillslope flow paths behave differently when the reference gradient is to the local drainage network rather than seaward. The new terrain model of the catchment was created using NASA's SRTM Digital Elevation Model (DEM) at 30 m resolution as the only data input. Contours (HAND contours) were then generated from the normalized DEM. Based on a field flood inundation survey, historical information on flooding of the area, and satellite images, a HAND contour of 2 m was found to correlate best with the flood inundation extent of the river and its tributaries. An accuracy of 75% was obtained when the surface area enclosed by the 2 m contour was compared with the floodplain area computed from a satellite image captured during the peak flooding season in September 2016. It was estimated that the flooding of the Nasia River and its tributaries created a floodplain area of 1011 km².
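
Conceptually, HAND subtracts from each cell's elevation the elevation of the drainage cell it drains to. A faithful implementation traces D8 flow directions; the sketch below substitutes the Euclidean-nearest drainage cell found with a distance transform, a rough but compact approximation, and uses a toy DEM and drainage mask in place of the SRTM data.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

# Toy DEM (m) and drainage mask; real inputs would be the 30 m SRTM DEM
# and a channel network extracted from it.
dem = np.array([[12., 11., 10., 11.],
                [11.,  9.,  8., 10.],
                [10.,  8.,  7.,  9.],
                [11.,  9.,  8., 10.]])
drainage = dem <= 8.0                      # assumed channel cells

# Nearest drainage cell for every cell (Euclidean stand-in for flow-path
# tracing): indices of the closest drainage cell in the inverted mask.
dist, idx = distance_transform_edt(~drainage, return_indices=True)
hand = dem - dem[idx[0], idx[1]]           # height above nearest drainage

floodplain = hand <= 2.0                   # the paper's 2 m HAND contour
print(np.round(hand, 1))
print("floodplain cells:", int(floodplain.sum()))
```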

Keywords: digital elevation model, floodplain, HAND contour, inundation extent, Nasia River

Procedia PDF Downloads 449
38003 A Comparative Study of Sampling-Based Uncertainty Propagation with First Order Error Analysis and Percentile-Based Optimization

Authors: M. Gulam Kibria, Shourav Ahmed, Kais Zaman

Abstract:

In system analysis, uncertainty in the input variables causes uncertainty in the system responses. Different probabilistic approaches for uncertainty representation and propagation in such cases exist in the literature, and different representation approaches yield different outputs; some may estimate the system response better than others. The NASA Langley Multidisciplinary Uncertainty Quantification Challenge (MUQC) has posed challenge problems in uncertainty quantification. Subproblem A, the uncertainty characterization subproblem, is addressed in this study. In this subproblem, the challenge is to gather knowledge about unknown model inputs, which carry inherent aleatory and epistemic uncertainties, from responses (outputs) of the given computational model. We use two methodologies to approach the problem. In the first, we use sampling-based uncertainty propagation with first-order error analysis. In the second, we place emphasis on Percentile-Based Optimization (PBO). The NASA Langley MUQC subproblem A is constructed so that both aleatory and epistemic uncertainties must be managed. The challenge problem classifies each uncertain parameter as one of the following three types: (i) an aleatory uncertainty modeled as a random variable with a fixed functional form and known coefficients; this uncertainty cannot be reduced. (ii) An epistemic uncertainty modeled as a fixed but poorly known physical quantity that lies within a given interval; this uncertainty is reducible. (iii) A parameter that is aleatory but for which sufficient data are not available to model it adequately as a single random variable. For example, the parameters of a normal variable, e.g., the mean and standard deviation, might not be precisely known but could be assumed to lie within some intervals. This results in a distributional p-box: the physical parameter has an aleatory uncertainty, but the parameters prescribing its mathematical model are subject to epistemic uncertainties, each being an unknown element of a known interval; this uncertainty is reducible. From the study, it is observed that, due to practical limitations and computational expense, the sampling in the sampling-based methodology is not exhaustive, which is why it has a high probability of underestimating the output bounds. Therefore, an optimization-based strategy to convert uncertainty described by interval data into a probabilistic framework is necessary; this is achieved in this study by using PBO.
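
To make the first methodology concrete, the sketch below propagates input uncertainty through a toy response function both by first-order (delta-method) error analysis and by Monte Carlo sampling; the function and the input statistics are invented for illustration and are unrelated to the actual MUQC model.

```python
import numpy as np

def f(x):
    # Toy response function (an assumed placeholder, not the MUQC model).
    return x[0]**2 + np.sin(x[1]) + x[0] * x[2]

mu = np.array([1.0, 0.5, 2.0])       # input means (illustrative)
sigma = np.array([0.1, 0.05, 0.2])   # input standard deviations (illustrative)

# First-order propagation: Var[y] ~ sum_i (df/dx_i)^2 * sigma_i^2,
# with gradients taken by central finite differences.
eps = 1e-6
grad = np.array([(f(mu + eps * e) - f(mu - eps * e)) / (2 * eps)
                 for e in np.eye(3)])
var_first_order = np.sum(grad**2 * sigma**2)

# Sampling-based propagation for comparison.
rng = np.random.default_rng(0)
samples = rng.normal(mu, sigma, size=(100_000, 3))
y = np.array([f(s) for s in samples])

print(f"first-order variance: {var_first_order:.4f}")
print(f"Monte Carlo variance: {y.var():.4f}")
```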

Keywords: aleatory uncertainty, epistemic uncertainty, first order error analysis, uncertainty quantification, percentile-based optimization

Procedia PDF Downloads 237
38002 A Fast Parallel and Distributed Type-2 Fuzzy Algorithm Based on Cooperative Mobile Agents Model for High Performance Image Processing

Authors: Fatéma Zahra Benchara, Mohamed Youssfi, Omar Bouattane, Hassan Ouajji, Mohamed Ouadi Bensalah

Abstract:

The aim of this paper is to present a distributed implementation of the Type-2 Fuzzy algorithm in a parallel and distributed computing environment based on mobile agents. The proposed algorithm is implemented on an SPMD (Single Program Multiple Data) architecture based on cooperative mobile agents, following the AVPE (Agent Virtual Processing Element) model, in order to improve the processing resources needed for big data image segmentation. In this work we focus on applying the algorithm to a big MRI (Magnetic Resonance Imaging) image of size (n x m). The image is encapsulated in the mobile agent team leader in order to be split into (n x m) pixels, one per AVPE. Each AVPE performs and exchanges its segmentation results and maintains asynchronous communication with its team leader until the algorithm converges. Interesting experimental results are obtained in terms of accuracy and efficiency of the proposed implementation, thanks to the several interesting skills of mobile agents introduced in this distributed computational model.

Keywords: distributed type-2 fuzzy algorithm, image processing, mobile agents, parallel and distributed computing

Procedia PDF Downloads 421
38001 A Composite Beam Element Based on Global-Local Superposition Theory for Prediction of Delamination in Composite Laminates

Authors: Charles Mota Possatti Júnior, André Schwanz de Lima, Maurício Vicente Donadon, Alfredo Rocha de Faria

Abstract:

An interlaminar damage model is combined with a beam element formulation based on global-local superposition to assess delamination in composite laminates. The variations in the mechanical properties of the laminate caused by the presence of delamination are calculated as a function of the displacements in the interface layers. The global-local superposition of displacement fields ensures the zig-zag behaviour of stresses and displacements, and the number of degrees of freedom (DOFs) is independent of the number of layers. The displacements and stresses are calculated as functions of the DOFs commonly used in traditional beam elements. Finally, the finite element (FE) formulation is extended to handle cases of different thicknesses, and the FE model predictions are compared with results obtained from analytical solutions and commercial finite element codes.

Keywords: delamination, global-local superposition theory, single beam element, zig-zag, interlaminar damage model

Procedia PDF Downloads 114
38000 Finite Element Modeling of Heat and Moisture Transfer in Porous Material

Authors: V. D. Thi, M. Li, M. Khelifa, M. El Ganaoui, Y. Rogaume

Abstract:

This paper presents a two-dimensional model to study heat and moisture transfer through porous building materials. Dynamic and static coupled models of heat and moisture transfer in porous material under low temperature are presented, and the coupled models, together with variable initial and boundary conditions, have been considered both analytically and using the finite element method. The resulting coupled model is converted into two nonlinear partial differential equations, which are then numerically solved by an implicit iterative scheme. The numerical results for temperature and moisture potential changes are compared with experimental measurements available in the literature. The predicted results validate the theoretical model and demonstrate the effectiveness of the developed numerical algorithms. The model is expected to provide useful information for porous building material design based on heat and moisture transfer.
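
As a reduced illustration of the implicit iterative approach, the sketch below solves a single 1D nonlinear diffusion equation with backward-Euler time stepping and Picard iteration on a temperature-dependent diffusivity; the real model couples two such equations in 2D, and the material law here is a made-up placeholder.

```python
import numpy as np

# Backward-Euler + Picard iteration for u_t = d(u) * u_xx on [0, 1].
# d(u) is a hypothetical temperature-dependent diffusivity.
nx, nt, dx, dt = 51, 200, 1.0 / 50, 0.01
d = lambda u: 0.1 + 0.05 * u                # placeholder material law

u = np.zeros(nx)
u[0], u[-1] = 1.0, 0.0                      # Dirichlet boundary conditions

for _ in range(nt):
    u_old, u_new = u.copy(), u.copy()
    for _ in range(20):                     # Picard loop: freeze d(u), solve
        r = d(u_new) * dt / dx**2
        A = (np.diag(1 + 2 * r[1:-1])
             - np.diag(r[1:-2], 1) - np.diag(r[2:-1], -1))
        rhs = u_old[1:-1].copy()
        rhs[0] += r[1] * u[0]               # boundary contributions
        rhs[-1] += r[-2] * u[-1]
        u_next = np.linalg.solve(A, rhs)
        done = np.max(np.abs(u_next - u_new[1:-1])) < 1e-10
        u_new[1:-1] = u_next
        if done:
            break
    u = u_new

print(u.round(3))
```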

Keywords: finite element method, heat transfer, moisture transfer, porous materials, wood

Procedia PDF Downloads 396
37999 Evaluating Key Attributes of Effective Digital Games in Tertiary Education

Authors: Roopali Kulkarni, Yuliya Khrypko

Abstract:

A major problem in educational digital game design is that game developers often focus on maintaining the fun and playability of an educational game, whereas educators are more concerned with the learning aspect of the game than with its entertaining characteristics. There is a clear need to understand what key aspects of digital learning games make them an effective learning medium in tertiary education. Through a systematic literature review and content analysis, this paper identifies, evaluates, and summarizes twenty-three key attributes of digital games used in tertiary education and presents a summary digital game-based learning (DGBL) model for designing and evaluating an educational digital game of any genre that promotes effective learning in tertiary education. The proposed solution overcomes limitations of previously designed models for digital game evaluation, such as a small number of game attributes considered or applicability to only a specific genre of digital games. The proposed DGBL model can be used to assist game designers and educators in creating effective and engaging educational digital games for the tertiary education curriculum.

Keywords: DGBL model, digital games, educational games, game-based learning, tertiary education

Procedia PDF Downloads 275
37998 Credit Risk Prediction Based on Bayesian Estimation of Logistic Regression Model with Random Effects

Authors: Sami Mestiri, Abdeljelil Farhat

Abstract:

The aim of this paper is to predict the credit risk of banks in Tunisia over the period 2000-2005. For this purpose, two methods for estimating the logistic regression model with random effects are applied: the Penalized Quasi-Likelihood (PQL) method and the Gibbs sampler algorithm. Using information on a sample of 528 Tunisian firms and 26 financial ratios, we show that the Bayesian approach improves the quality of model predictions in terms of correct classification as well as in terms of the ROC curve.
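
The abstract does not spell out the sampler, but a compact way to see the Bayesian route is a random-walk Metropolis sampler for a random-intercept logistic model (Gibbs steps would replace the random walk where full conditionals are tractable, e.g., with Polya-Gamma augmentation). Everything below, data included, is simulated for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: n firms in g groups (e.g., sectors), one financial ratio.
n, g = 500, 10
grp = rng.integers(0, g, n)
x = rng.normal(size=n)
b_true = rng.normal(0, 0.8, g)                       # random intercepts
p = 1 / (1 + np.exp(-(0.5 + 1.2 * x + b_true[grp])))
y = rng.binomial(1, p)

def log_post(beta, b, tau=1.0):
    eta = beta[0] + beta[1] * x + b[grp]
    loglik = np.sum(y * eta - np.log1p(np.exp(eta)))
    # Weak normal prior on beta, N(0, tau^2) prior on random effects.
    return loglik - 0.5 * np.sum(beta**2) / 100 - 0.5 * np.sum(b**2) / tau**2

beta, b = np.zeros(2), np.zeros(g)
lp, chain = log_post(beta, b), []
for it in range(20000):
    beta_p = beta + rng.normal(0, 0.05, 2)           # random-walk proposals
    b_p = b + rng.normal(0, 0.05, g)
    lp_p = log_post(beta_p, b_p)
    if np.log(rng.uniform()) < lp_p - lp:            # Metropolis accept step
        beta, b, lp = beta_p, b_p, lp_p
    chain.append(beta.copy())

chain = np.array(chain[5000:])                       # drop burn-in
print("posterior means (intercept, slope):", chain.mean(axis=0).round(2))
```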

Keywords: forecasting, credit risk, penalized quasi-likelihood, Gibbs sampler, logistic regression with random effects, ROC curve

Procedia PDF Downloads 537
37997 Influence of Security Attributes in Component-Based Software Development

Authors: Somayeh Zeinali

Abstract:

A component is generally defined as a piece of executable software with a published interface. Component-based software engineering (CBSE) has become recognized as a new sub-discipline of software engineering. In component-based software development, components cannot be made completely secure and can thus easily become vulnerable. Some researchers have investigated this issue and proposed approaches to detect component intrusions or protect distributed components. Software security also refers to the process of creating software that is considered secure. The terms "dependability", "trustworthiness", and "survivability" are used interchangeably to describe the properties of software security.

Keywords: component-based software development, component-based software engineering, software security attributes, dependability, component

Procedia PDF Downloads 551
37996 Using Structural Equation Modeling to Analyze the Impact of Remote Work on Job Satisfaction

Authors: Florian Pfeffel, Valentin Nickolai, Christian Louis Kühner

Abstract:

Digitalization has disrupted the traditional workplace by allowing many employees to work from anywhere at any time. This trend of working from home was further accelerated by the COVID-19 crisis, which forced companies to rethink their workplace models. While in many companies this shift happened out of pure necessity, many employees were left more satisfied with their jobs due to the opportunity to work from home. This study focuses on employees' job satisfaction in the service sector as a function of three work models: a "work from home" model, the traditional "work in office" model, and a hybrid model. Using structural equation modeling (SEM), these three work models were analyzed based on 13 factors influencing job satisfaction, summarized in the three groups "classic influencing factors", "influencing factors changed by remote working", and "new remote working influencing factors". Based on these factors, a survey was conducted with n = 684 employees in the service sector. Cronbach's alpha of the individual constructs was shown to be suitable. Furthermore, the construct validity of the constructs was confirmed by face validity, content validity, convergent validity (AVE > 0.5; CR > 0.7), and discriminant validity. Additionally, confirmatory factor analysis (CFA) confirmed the model fit for the investigated sample (CMIN/DF: 2.567; CFI: 0.927; RMSEA: 0.048). The SEM analysis showed that the most significant influencing factor on job satisfaction is "identification with the work" (β = 0.540), followed by "appreciation" (β = 0.151), "compensation" (β = 0.124), "work-life balance" (β = 0.116), and "communication and exchange of information" (β = 0.105). While the significance of each factor can vary depending on the work model, the SEM analysis shows that identification with the work is the most significant factor in all three work models and, in the case of the traditional office work model, the only significant one. The study shows that employees who work entirely remotely or in a hybrid model are significantly more satisfied with their jobs, with a job satisfaction score of 5.0 on a scale from 1 (very dissatisfied) to 7 (very satisfied), than employees who do not have the option to work from home, who score 4.6. This is a result of the lower identification with the work in the model without any remote working. Furthermore, the responses indicate that it is important to consider the individual preferences of each employee when it comes to the work model in order to achieve higher overall job satisfaction. Thus, it can be argued that companies can profit from greater motivation and higher productivity by considering individual work model preferences, thereby increasing identification with the respective work.
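
In Python, a structural model of this shape can be estimated with the semopy package using lavaan-style syntax. The specification below is a two-factor toy version of such a model: the construct names, indicator columns, and the survey_items.csv file are all invented for illustration, not the paper's instrument.

```python
import pandas as pd
import semopy

# Hypothetical measurement + structural model: identification with the work
# and appreciation as latent predictors of latent job satisfaction.
desc = """
Identification =~ ident1 + ident2 + ident3
Appreciation   =~ appr1 + appr2 + appr3
JobSat         =~ sat1 + sat2 + sat3
JobSat ~ Identification + Appreciation
"""

df = pd.read_csv("survey_items.csv")   # assumed file of Likert-scale items
model = semopy.Model(desc)
model.fit(df)
print(model.inspect())                 # path coefficients (the betas)
print(semopy.calc_stats(model))        # fit indices incl. CFI, RMSEA
```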

Keywords: home-office, identification with work, job satisfaction, new work, remote work, structural equation modeling

Procedia PDF Downloads 79
37995 A Proposal for a Combustion Model Considering the Lewis Number and Its Evaluation

Authors: Fujio Akagi, Hiroaki Ito, Shin-Ichi Inage

Abstract:

The aim of this study is to develop a combustion model that can be applied uniformly to laminar and turbulent premixed flames while considering the effect of the Lewis number (Le). The model considers the effect of Le in the transport equation of the reaction progress variable, which varies with the chemical species and temperature. The distribution of the reaction progress variable is approximated by a hyperbolic tangent function, while the other distribution of the reaction progress variable is estimated using the approximated distribution and the transport equation of the reaction progress variable considering Le. The validity of the model was evaluated under conditions of propane with Le > 1 and methane with Le = 1 (equivalence ratios of 0.5 and 1). The estimated results were found to be in good agreement with those of previous studies under all conditions. A method of introducing a turbulence model into this model is also described, and it was confirmed that conventional turbulence models can be expressed as approximate theories of this model in a unified manner.

Keywords: combustion model, laminar flame, Lewis number, turbulent flame

Procedia PDF Downloads 116
37994 Probabilistic Graphical Model for the Web

Authors: M. Nekri, A. Khelladi

Abstract:

The World Wide Web is a network with a complex topology, the main properties of which are a power-law degree distribution, a low clustering coefficient, and a small average distance. Modeling the web as a graph allows information to be located quickly and consequently helps in the construction of search engines. Here, we present a model based on already existing probabilistic graphs with all the aforesaid characteristics. This work consists of studying the web in order to understand its structure, which will enable us to model it more easily and to propose a possible algorithm for its exploration.
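
The preferential-attachment mechanism behind such models is easy to demonstrate: the sketch below grows a Barabási-Albert graph with networkx and checks the three properties the abstract lists (heavy-tailed degrees, low clustering, short average distance). The parameters are arbitrary.

```python
import networkx as nx
from collections import Counter

# Grow a preferential-attachment (Barabasi-Albert) graph: each new node
# attaches to m existing nodes with probability proportional to degree.
G = nx.barabasi_albert_graph(n=2000, m=3, seed=42)

degrees = [d for _, d in G.degree()]
print("max degree:", max(degrees))                         # heavy-tailed hubs
print("clustering:", round(nx.average_clustering(G), 4))   # low
print("avg distance:",
      round(nx.average_shortest_path_length(G), 2))        # small world
print("most common degrees:", Counter(degrees).most_common(3))
```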

Keywords: clustering coefficient, preferential attachment, small world, web community

Procedia PDF Downloads 269
37993 Derivation of Bathymetry from High-Resolution Satellite Images: Comparison of Empirical Methods through Geographical Error Analysis

Authors: Anusha P. Wijesundara, Dulap I. Rathnayake, Nihal D. Perera

Abstract:

Bathymetric information is of fundamental importance to coastal and marine planning and management, nautical navigation, and scientific studies of marine environments. Satellite-derived bathymetry provides detailed information in areas where sounding data are lacking and conventional surveys are inaccessible. Two empirical approaches, a log-linear bathymetric inversion model and a non-linear bathymetric inversion model, are applied to derive bathymetry from high-resolution multispectral satellite imagery. This study compares the two approaches by means of geographical error analysis for the site of Kankesanturai using WorldView-2 satellite imagery. The parameters of the non-linear inversion model were calibrated using the Levenberg-Marquardt method, and multiple linear regression was applied to calibrate the log-linear inversion model. Single Beam Echo Sounding (SBES) data in the study area were used as reference points for calibrating both models. Residuals were calculated as the difference between the derived depth values and the validation echo-sounder bathymetry data, and the geographical distribution of the model residuals was mapped. Spatial autocorrelation was calculated to compare the performance of the bathymetric models, with the results showing the geographic errors for both. A spatial error model was constructed from the initial bathymetry estimates and the estimates of autocorrelation; it generates more reliable estimates of bathymetry by quantifying the autocorrelation of the model error and incorporating it into an improved regression model. The log-linear model (R²=0.846) performs better than the non-linear model (R²=0.692). The spatial error models improved the bathymetric estimates derived from the linear and non-linear models up to R²=0.854 and R²=0.704, respectively. The Root Mean Square Error (RMSE) was calculated for all reference points in various depth ranges; the magnitude of the prediction error increases with depth for both inversion models. Overall RMSE for the log-linear and non-linear inversion models was ±1.532 m and ±2.089 m, respectively.
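
A widely used form of the log-linear inversion is the Stumpf band-ratio model, z = m1 * ln(n*Rw(blue)) / ln(n*Rw(green)) + m0, calibrated against echo-sounder depths by linear regression. Whether the paper uses exactly this form is not stated, so the sketch below is a generic illustration with synthetic reflectances.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic water-leaving reflectances for two bands and true depths (m);
# real inputs would be WorldView-2 bands and SBES reference depths.
z_true = rng.uniform(1, 20, 300)
r_blue = 0.08 * np.exp(-0.08 * z_true) * rng.lognormal(0, 0.03, 300)
r_green = 0.07 * np.exp(-0.12 * z_true) * rng.lognormal(0, 0.03, 300)

# Stumpf log-ratio predictor; n is a fixed scaling constant that keeps
# both logarithms positive.
n = 1000.0
x = np.log(n * r_blue) / np.log(n * r_green)

# Calibrate m1, m0 by least squares against the reference depths.
m1, m0 = np.polyfit(x, z_true, 1)
z_hat = m1 * x + m0

rmse = np.sqrt(np.mean((z_hat - z_true) ** 2))
r2 = 1 - np.sum((z_hat - z_true)**2) / np.sum((z_true - z_true.mean())**2)
print(f"RMSE = {rmse:.2f} m, R^2 = {r2:.3f}")
```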

Keywords: log-linear model, multispectral, residuals, spatial error model

Procedia PDF Downloads 293
37992 Real-Time Network Anomaly Detection Systems Based on Machine-Learning Algorithms

Authors: Zahra Ramezanpanah, Joachim Carvallo, Aurelien Rodriguez

Abstract:

This paper aims to detect anomalies in streaming data using machine learning algorithms. We designed two separate pipelines and evaluated the effectiveness of each. The first pipeline, based on supervised machine learning methods, consists of two phases. In the first phase, we trained several supervised models using the UNSW-NB15 dataset, measured the efficiency of each using different performance metrics, and selected the best model for the second phase. At the beginning of the second phase, we sniffed a local area network using Argus Server, simulated several types of attacks, and sent the sniffed data at short intervals to a running algorithm, which displays the result for each received packet in real time using the trained model. The second pipeline is based on unsupervised algorithms, in which a Temporal Graph Network (TGN) is used to monitor a local network. The TGN is trained to predict the probability of future states of the network based on its past behavior. Our contribution in this part is an indicator that identifies anomalies from these predicted probabilities.
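
The supervised phase can be pictured as a standard scikit-learn workflow. The sketch below trains and compares two classifiers on a feature table with the same shape as UNSW-NB15 flow records; the file name and the 'label' column are assumed stand-ins for the real dataset layout.

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score, roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Assumed flow-record table: numeric features plus a binary 'label' column
# (0 = normal, 1 = attack), as in UNSW-NB15.
df = pd.read_csv("unsw_nb15_features.csv")
X, y = df.drop(columns=["label"]), df["label"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          stratify=y, random_state=0)

candidates = {
    "logreg": make_pipeline(StandardScaler(),
                            LogisticRegression(max_iter=1000)),
    "forest": RandomForestClassifier(n_estimators=200, random_state=0),
}
for name, clf in candidates.items():
    clf.fit(X_tr, y_tr)
    proba = clf.predict_proba(X_te)[:, 1]
    print(name, "F1:", round(f1_score(y_te, proba > 0.5), 3),
          "AUC:", round(roc_auc_score(y_te, proba), 3))
```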

Keywords: temporal graph network, anomaly detection, cyber security, IDS

Procedia PDF Downloads 100
37991 The Establishment and Application of TRACE/FRAPTRAN Model for Kuosheng Nuclear Power Plant

Authors: S. W. Chen, W. K. Lin, J. R. Wang, C. Shih, H. T. Lin, H. C. Chang, W. Y. Li

Abstract:

The Kuosheng nuclear power plant (NPP) is a BWR/6 type NPP located on the northern coast of Taiwan. First, a Kuosheng NPP TRACE model was developed in this research. In order to assess the system response of the Kuosheng NPP TRACE model, startup test data were used to evaluate it. Second, an overpressurization transient analysis of the Kuosheng NPP TRACE model was performed. In addition, in order to confirm the mechanical properties and integrity of the fuel rods, a FRAPTRAN analysis was also performed in this study.

Keywords: TRACE, safety analysis, BWR/6, FRAPTRAN

Procedia PDF Downloads 558
37990 An Educational Program Based on the Health Belief Model to Prevent Non-alcoholic Fatty Liver Disease Among Iranian Women

Authors: Arezoo Fallahi

Abstract:

Background and purpose: Non-alcoholic fatty liver is one of the most common liver disorders and, as the most important cause of death from liver disease, has unpleasant consequences and complications. The aim of this study was to investigate the effect of an educational intervention based on the health belief model in preventing non-alcoholic fatty liver among women. Materials and methods: This experimental study was performed among 110 women referred to comprehensive health service centers in Malayer City, west of Iran, in 2023. Using the available sampling method, the 110 participants were divided into experimental and control groups. The data collection tool included demographic characteristics and a questionnaire based on the health belief model. In the experimental group, three one-hour training sessions were conducted using pamphlets, lectures, and group discussions. Data were analyzed using SPSS software version 21 with correlation tests, paired t-tests, and independent t-tests. Results: The mean age of the participants was 38.07±6.28 years, and most were middle-aged, married, housewives with academic education, middle-income, and overweight. After the educational intervention, the mean scores of the constructs, including perceived sensitivity (p=0.01), perceived severity (p=0.01), perceived benefits (p=0.01), cues to internal (p=0.01) and external action (p=0.01), and perceived self-efficacy (p=0.01), were significantly higher in the experimental group than in the control group. The perceived barriers score in the experimental group decreased after the training (15.2 ± 3.9 vs. 11.2 ± 3.3, p<0.01). Conclusion: The findings showed that the design and implementation of educational programs based on the constructs of the health belief model can be effective in preventing non-alcoholic fatty liver among women.

Keywords: health, education, belief, behaviour

Procedia PDF Downloads 44
37989 Heuristic Algorithms for Time Based Weapon-Target Assignment Problem

Authors: Hyun Seop Uhm, Yong Ho Choi, Ji Eun Kim, Young Hoon Lee

Abstract:

Weapon-target assignment (WTA) is the problem of assigning available launchers to appropriate targets in order to defend assets. Various algorithms for WTA have been developed over the past years for both static and dynamic environments (denoted SWTA and DWTA, respectively). Due to the requirement that the problem be solved within a relevant computational time, WTA has suffered from limited solution efficiency; as a result, SWTA and DWTA problems have been solved only for limited battlefield situations. In this paper, the general situation under continuous time is considered as the Time-based Weapon-Target Assignment (TWTA) problem. TWTA is studied using a mixed integer programming model, and three heuristic algorithms are suggested: decomposed opt-opt, decomposed opt-greedy, and greedy algorithms. Although the TWTA optimization model works inefficiently on large instances, the decomposed opt-opt algorithm, based on linearization and decomposition, extracted efficient solutions in a reasonable computation time. Because the computation time of the scheduling part is too long for the optimization model, several greedy-based algorithms are proposed; they show lower performance values than the decomposed opt-opt algorithm but require very short computation times. Hence, this paper proposes an improved method by applying decomposition to TWTA, and more practical and effectual methods can be developed for using TWTA on the battlefield.
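
The greedy baseline among the three heuristics can be illustrated compactly: repeatedly assign the weapon-target pair with the highest marginal kill value until weapons run out. The sketch below implements this static variant with made-up kill probabilities; the paper's time-based version would additionally schedule launch windows.

```python
import numpy as np

rng = np.random.default_rng(3)

n_weapons, n_targets = 6, 4
value = rng.uniform(50, 100, n_targets)             # asset value per target
pk = rng.uniform(0.3, 0.9, (n_weapons, n_targets))  # kill probabilities

survival = np.ones(n_targets)        # probability each target survives
assignment = {}
unused = set(range(n_weapons))
while unused:
    # Marginal gain of weapon w on target t: value_t * surv_t * pk[w, t]
    gain, w, t = max((value[t] * survival[t] * pk[w, t], w, t)
                     for w in unused for t in range(n_targets))
    assignment[w] = t
    survival[t] *= 1 - pk[w, t]      # update survival probability
    unused.remove(w)

expected_damage = np.sum(value * (1 - survival))
print(assignment, round(float(expected_damage), 2))
```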

Keywords: air and missile defense, weapon target assignment, mixed integer programming, piecewise linearization, decomposition algorithm, military operations research

Procedia PDF Downloads 334
37988 Integrated Vegetable Production Planning Considering Crop Rotation Rules Using a Mathematical Mixed Integer Programming Model

Authors: Mohammadali Abedini Sanigy, Jiangang Fei

Abstract:

In this paper, a mathematical optimization model was developed to maximize profit in a vegetable production planning problem. It serves as a decision support system that assists farmers with land allocation to crops and harvest scheduling decisions. The developed model can handle different rotation rules over two consecutive production cycles, which is common practice in organic production systems. Moreover, different production methods for the same crop were considered in the model formulation. The main strength of the model is that it is not restricted to predetermined production periods, which makes the planning more flexible. The model is a mixed integer programming (MIP) model, formulated in Pyomo (a Python package for formulating optimization models) and solved via the Gurobi and CPLEX optimizer packages. The model was tested with secondary data from Australian vegetable growing farms, and the results were obtained and discussed through computational test runs. The results show that the model can successfully provide reliable solutions for real-size problems.
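
A stripped-down Pyomo version of such a model might allocate plots to crops over two cycles, with a rotation rule forbidding the same crop family on a plot in consecutive cycles. The sets, profits, and the rule itself are illustrative placeholders, and the open-source CBC solver stands in for Gurobi/CPLEX.

```python
import pyomo.environ as pyo

PLOTS, CYCLES = ["p1", "p2", "p3"], [1, 2]
CROPS = ["lettuce", "carrot", "broccoli"]
FAMILY = {"lettuce": "leaf", "carrot": "root", "broccoli": "brassica"}
PROFIT = {"lettuce": 900, "carrot": 700, "broccoli": 1100}  # $/plot, assumed

m = pyo.ConcreteModel()
m.x = pyo.Var(PLOTS, CROPS, CYCLES, domain=pyo.Binary)  # 1 if crop planted

# Each plot grows exactly one crop per cycle.
m.one_crop = pyo.Constraint(
    PLOTS, CYCLES,
    rule=lambda m, p, t: sum(m.x[p, c, t] for c in CROPS) == 1)

# Rotation rule: no crop family repeated on a plot in consecutive cycles.
def rotation(m, p, c1, c2):
    if FAMILY[c1] == FAMILY[c2]:
        return m.x[p, c1, 1] + m.x[p, c2, 2] <= 1
    return pyo.Constraint.Skip
m.rotation = pyo.Constraint(PLOTS, CROPS, CROPS, rule=rotation)

m.obj = pyo.Objective(
    expr=sum(PROFIT[c] * m.x[p, c, t]
             for p in PLOTS for c in CROPS for t in CYCLES),
    sense=pyo.maximize)

pyo.SolverFactory("cbc").solve(m)
plan = {(p, t): c for p in PLOTS for t in CYCLES
        for c in CROPS if pyo.value(m.x[p, c, t]) > 0.5}
print(plan)
```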

Keywords: crop rotation, harvesting, mathematical model formulation, vegetable production

Procedia PDF Downloads 183
37987 A Methodological Approach to Development of Mental Script for Mental Practice of Micro Suturing

Authors: Vaikunthan Rajaratnam

Abstract:

Introduction: Motor imagery (MI) and mental practice (MP) can be an alternative route to acquiring mastery of surgical skills. One component of this technique is the use of a mental script. The aim of this study was to design and develop a mental script for basic micro-suturing training for skill acquisition using a low-fidelity rubber glove model, and to describe the detailed methodology of this process. Methods: This study was based on a design and development research framework. The mental script was developed with five expert surgeons performing a cognitive walkthrough of the repair of a vertical opening in a rubber glove model using 8/0 nylon, followed by a hierarchical task analysis. A draft script was created, and face and content validity were assessed with a checking-back process. The final script was validated by recruiting 28 participants, assessed using the Mental Imagery Questionnaire (MIQ). Results: The creation of the mental script is detailed in the full text. After assessment by the expert panel, the mental script had good face and content validity. The average overall MIQ score was 5.2 ± 1.1, demonstrating the validity of generating mental imagery from the mental script developed in this study for micro suturing on the rubber glove model. Conclusion: The methodological approach described in this study is based on an instructional design framework for teaching surgical skills. This MP model is inexpensive and easily accessible, addressing the challenge of reduced opportunities to practice surgical skills. However, while motor skills are important, other non-technical expertise required by the surgeon is not addressed by this model. Thus, this model should augment surgical training, not replace it.

Keywords: mental script, motor imagery, cognitive walkthrough, verbal protocol analysis, hierarchical task analysis

Procedia PDF Downloads 100
37986 Analysis of Risk-Based Disaster Planning in Local Communities

Authors: R. A. Temah, L. A. Nkengla-Asi

Abstract:

Planning for future disasters sets the stage for a variety of activities that may trigger multiple recurring operations and expose the community to opportunities to minimize risks. Local communities increasingly embrace the necessity of planning based on local risks, but they are also significantly challenged to plan for and respond to disasters effectively. This research examines a basic risk-based disaster planning model and compares it with advanced risk-based planning, which introduces the identification and alignment of a variety of local capabilities, within and outside the local community, that can be pivotal in managing local risks and cascading effects prior to a disaster. A critical review shows that the identification and alignment of capabilities can potentially enhance risk-based disaster planning. A tailored, holistic approach to risk-based disaster planning is pivotal to enhancing collective action and reducing collective disaster costs.

Keywords: capabilities, disaster planning, hazards, local community, risk-based

Procedia PDF Downloads 201
37985 Time Series Modelling and Prediction of River Runoff: Case Study of Karkheh River, Iran

Authors: Karim Hamidi Machekposhti, Hossein Sedghi, Abdolrasoul Telvari, Hossein Babazadeh

Abstract:

The rainfall-runoff phenomenon is a chaotic and complex outcome of nature that requires sophisticated modelling and simulation methods for explanation and use. Time series modelling allows analysis of runoff data and can be used as a forecasting tool. In this paper, an attempt is made to model river runoff data and predict the future behavioural pattern of the river based on past annual observations of river runoff. The analysis and prediction are done using an ARIMA model. To evaluate the efficiency of predictions for hydrological events such as rainfall and runoff, the applicable statistical formulae are used. The good agreement between predicted and observed river runoff, expressed by the coefficient of determination (R²), shows that ARIMA(4,1,1) is a suitable model for predicting Karkheh River runoff in Iran.
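
Fitting the reported ARIMA(4,1,1) specification and forecasting from it takes only a few lines with statsmodels; the synthetic series below stands in for the annual Karkheh runoff record, which is not reproduced in the abstract.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(11)

# Synthetic stand-in for annual runoff (e.g., million cubic metres).
years = pd.date_range("1960", periods=50, freq="YS")
runoff = pd.Series(5000 + np.cumsum(rng.normal(0, 150, 50)), index=years)

# Fit the ARIMA(p=4, d=1, q=1) model reported in the paper.
fit = ARIMA(runoff, order=(4, 1, 1)).fit()
print(fit.summary().tables[0])

# Forecast the next 5 years with 95% intervals.
fc = fit.get_forecast(steps=5)
print(pd.concat([fc.predicted_mean, fc.conf_int()], axis=1).round(0))
```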

Keywords: time series modelling, ARIMA model, river runoff, Karkheh River, CLS method

Procedia PDF Downloads 335
37984 Human Resources and Business Result: An Empirical Approach Based on RBV Theory

Authors: Xhevrie Mamaqi

Abstract:

Organizational learning capacity refers to the sum total of individual and collective learning through training programs, experience, and experimentation, among others. Today, ongoing in-business training is one of the most important strategies for human capital development and is crucial to sustaining and improving workers' knowledge and skills. Many organizations, firms, and businesses adopt a strategy of continuous learning, encouraging employees to learn new skills continually, to be innovative, and to try new processes and ways of working in order to achieve competitive advantage and superior business results. This paper uses the Resource-Based View (RBV) of resources and capacities to construct a hypothetical model of the relationships between training and business results. The model is tested on cross-sectional data from a sample of 266 businesses in the Spanish service sector. A structural equation model (SEM) is used to estimate the relationship between ongoing training, represented by two latent dimensions denominated human and social capital resources, and economic business results. The estimated coefficients show the effect of several training aspects in explaining the variation in business results.

Keywords: business results, human and social capital resources, training, RBV theory, SEM

Procedia PDF Downloads 296
37983 Mathematical Modelling and AI-Based Degradation Analysis of the Second-Life Lithium-Ion Battery Packs for Stationary Applications

Authors: Farhad Salek, Shahaboddin Resalati

Abstract:

The production of electric vehicles (EVs) featuring lithium-ion battery technology has escalated substantially over the past decade, on a steady and persistent upward trajectory. The imminent retirement of EV batteries after approximately eight years underscores the critical need to redirect them towards recycling, a task complicated by the current global inadequacy of recycling infrastructure. A potential solution involves extending the operational lifespan of EV batteries by utilizing them in stationary energy storage systems in second-life applications. Such adoption, however, requires addressing the safety concerns associated with batteries' knee points and thermal runaway. This paper develops an accurate mathematical model of second-life battery packs at cell-to-pack scale using an equivalent circuit model (ECM) methodology. Neural network algorithms are employed to forecast the degradation parameters based on the EV batteries' aging history, yielding a degradation model. The degradation model is integrated with the ECM to reflect the impact of cycle aging on battery parameters during operation. The developed model is tested under real-life load profiles to evaluate battery life span in various operating conditions. The methodology and algorithms introduced in this paper can be considered a basis for Battery Management System (BMS) design and for techno-economic analysis of such technologies.
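
A first-order Thevenin circuit is the simplest common form of such an ECM: an open-circuit voltage source in series with an ohmic resistance and one RC pair. The sketch below simulates a constant-current discharge with made-up cell parameters; in the paper's setting, a neural network would update R0, R1, and C1 as the cell ages.

```python
import numpy as np

# First-order Thevenin ECM: V = OCV(soc) - I*R0 - V1, with RC dynamics
# dV1/dt = I/C1 - V1/(R1*C1). All parameters are illustrative.
R0, R1, C1 = 0.015, 0.010, 2500.0      # ohm, ohm, farad
CAP_AH = 50.0                          # cell capacity (Ah)

def ocv(soc):
    # Toy open-circuit-voltage curve, not a fitted cell characteristic.
    return 3.2 + 0.9 * soc + 0.1 * np.tanh(8 * (soc - 0.1))

dt, current = 1.0, 25.0                # 1 s step, 0.5C discharge (A)
soc, v1 = 1.0, 0.0
log = []
while soc > 0.05:
    soc -= current * dt / (CAP_AH * 3600)          # coulomb counting
    v1 += dt * (current / C1 - v1 / (R1 * C1))     # RC branch update
    v_term = ocv(soc) - current * R0 - v1
    log.append((soc, v_term))

soc_arr, v_arr = np.array(log).T
i50 = np.argmin(abs(soc_arr - 0.5))
print(f"terminal voltage at 50% SOC: {v_arr[i50]:.3f} V")
```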

Keywords: second life battery, electric vehicles, degradation, neural network

Procedia PDF Downloads 59
37982 Early Requirement Engineering for Design of Learner Centric Dynamic LMS

Authors: Kausik Halder, Nabendu Chaki, Ranjan Dasgupta

Abstract:

We present a modelling framework that supports the engineering of early requirements specifications for the design of a learner-centric dynamic Learning Management System (LMS). The framework is based on the i* modelling tool and Means-End Analysis, adopting primitive concepts for modelling early requirements (such as actor, goal, and strategic dependency). We show how pedagogical and computational requirements for designing a learner-centric LMS can be adapted for automatic early requirement engineering specifications. Finally, we present a model of a Learner-Quanta-based adaptive courseware. Our early requirement analysis shows how means-end analysis reveals gaps and inconsistencies in early requirements specifications that are by no means trivial to discover without the help of a formal analysis tool.

Keywords: adaptive courseware, early requirement engineering, means end analysis, organizational modelling, requirement modelling

Procedia PDF Downloads 494
37981 1-g Shake Table Tests to Study the Impact of PGA on Foundation Settlement in Liquefiable Soil

Authors: Md. Kausar Alam, Mohammad Yazdi, Peiman Zogh, Ramin Motamed

Abstract:

Liquefaction-induced ground settlement has caused severe damage to structures in past decades. The amount of building settlement caused by liquefaction is directly proportional to the intensity of the ground shaking. To reduce this soil liquefaction effect, it is essential to examine the influence of peak ground acceleration (PGA); unfortunately, limited studies have been carried out on this issue. In this study, a series of moderate-scale 1g shake table experiments was conducted at the University of Nevada, Reno to evaluate the influence of PGA, at constant duration, on liquefiable soil layers. The model was prepared based on a large-scale shake table test with a scaling factor of N = 5 conducted at the University of California, San Diego. The model ground has three soil layers, with relative densities of 50% for the crust, 30% for the liquefiable layer, and 90% for the dense layer. In addition, a shallow foundation sits on the unsaturated crust layer. After preparing the model, input motions with various peak ground accelerations (0.16g, 0.25g, and 0.37g) and the same duration (10 s) were applied. The experimental results show that as the PGA increased from 0.16g to 0.37g, the foundation settlement increased from 20 mm to 100 mm. In addition, the foundation settlement expected from the scaling factor was 25 mm, while the actual settlement for a PGA of 0.25g over 10 seconds was 50 mm.

Keywords: foundation settlement, liquefaction, peak ground acceleration, shake table test

Procedia PDF Downloads 74
37980 The Concept of an Agile Enterprise Research Model

Authors: Maja Sajdak

Abstract:

The aim of this paper is to present the concept of an agile enterprise model and to initiate discussion of the research assumptions behind it. The implementation of the research project "The agility of enterprises in the process of adapting to the environment and its changes" began in August 2014 and is planned to last three years. The article takes the form of a work-in-progress paper intended to verify, and to initiate debate over, the proposed research model. There are very few publications in the literature relating to research into agility, and the most controversial issue in this regard is the method of measuring agility. In previous studies, the operationalization of agility was often fragmentary, focusing only on selected areas of agility, such as manufacturing, or analysing only selected sectors. As a result, the measures created to date can only be treated as contributions to the development of precise measurement tools. This research project aims to fill a cognitive gap in the literature regarding the conceptualization and operationalization of the agile company. Thus, the original contribution of the author of this project is the construction of a theoretical model that integrates manufacturing agility (consisting mainly of adaptation to the environment) and strategic agility (based on proactive measures). The author is primarily interested in the attributes of an agile enterprise indicating that the company is able to adapt rapidly to changing circumstances and behave proactively.

Keywords: agile company, acuity, entrepreneurship, flexibility, research model, strategic leadership

Procedia PDF Downloads 340