Search results for: lumped RC model
16140 Three-Dimensional Numerical Model of an Earth Air Heat Exchanger under a Constrained Urban Environment in India: Modeling and Validation
Authors: V. Rangarajan, Priyanka Kaushal
Abstract:
This study investigates the effectiveness of a typical Earth Air Heat Exchanger (EATHE) for energy efficient space cooling in an urban environment typified by space and soil-related constraints that preclude an optimal design. It involves the development of a three-dimensional numerical transient model that is validated by measurements at a live site in India. It is found that the model accurately predicts the soil temperatures at various depths as well as the EATHE outlet air temperature. The study shows that such an EATHE, even when designed under constraints, does provide effective space cooling especially during the hot months of the year.
Keywords: earth air heat exchanger (EATHE), India, MATLAB, model, simulation
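As a rough illustration of the heat-transfer principle behind an EATHE (not the authors' three-dimensional transient model), the following minimal Python sketch estimates the outlet air temperature of a buried pipe held at soil temperature using the standard NTU relation for a constant-wall-temperature exchanger; the pipe geometry, heat transfer coefficient and air flow rate are illustrative assumptions.

```python
import numpy as np

def eathe_outlet_temp(T_in, T_soil, h, D, L, m_dot, cp=1005.0):
    """Steady-state outlet air temperature of a buried pipe whose wall stays at
    soil temperature: T_out = T_soil + (T_in - T_soil) * exp(-NTU)."""
    A = np.pi * D * L                 # inner pipe surface area (m^2)
    NTU = h * A / (m_dot * cp)        # number of transfer units
    return T_soil + (T_in - T_soil) * np.exp(-NTU)

# Illustrative values only (not the site data from the study)
print(eathe_outlet_temp(T_in=42.0, T_soil=27.0, h=10.0, D=0.15, L=30.0, m_dot=0.05))
```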
Procedia PDF Downloads 322
16139 Urban Design via Estimation Model for Traffic Index of Cities Based on an Artificial Intelligence
Authors: Seyed Sobhan Alvani, Mohammad Gohari
Abstract:
As cities develop and populations grow, traffic congestion has become a critical problem. Urban designers therefore try to devise solutions to ease it, and accurate prediction models are essential for doing so. The current study presents an artificial-intelligence-based model that predicts a traffic index from city population, growth rate, and area. The accuracy of the model was evaluated and found acceptable, at around 90%. Urban designers and planners can thus employ it to predict future traffic indices and plan strategies accordingly.
Keywords: traffic index, population growth rate, cities wideness, artificial neural network
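The abstract does not give the network architecture or dataset, so the following Python sketch only illustrates the general idea: a small feed-forward neural network regressing a traffic index on population, growth rate and area, trained on synthetic stand-in data.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Synthetic stand-in data: [population (millions), growth rate (%), area (km^2)]
X = rng.uniform([0.1, 0.0, 50], [20.0, 5.0, 5000], size=(500, 3))
# Hypothetical index relation, used only to make the example runnable
y = 0.4 * X[:, 0] + 2.0 * X[:, 1] + 0.001 * X[:, 2] + rng.normal(0, 0.5, 500)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000, random_state=0)
model.fit(X_tr, y_tr)
print("R^2 on held-out data:", model.score(X_te, y_te))
```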
Procedia PDF Downloads 40
16138 Frailty Models for Modeling Heterogeneity: Simulation Study and Application to Quebec Pension Plan
Authors: Souad Romdhane, Lotfi Belkacem
Abstract:
When referring to actuarial analysis of lifetimes, only models accounting for observable risk factors have traditionally been developed. Within this context, the Cox proportional hazards (CPH) model is commonly used to assess the effects of observable covariates, such as gender, age, and smoking habits, on the hazard rates. These covariates may, however, fail to fully account for the true lifetime, owing to the existence of another random variable (frailty) that is usually ignored. The aim of this paper is to examine the shared frailty issue in the Cox proportional hazards model by including two different parametric forms of frailty in the hazard function. Four estimation methods are used to fit them. The performance of the parameter estimates is assessed and compared between the classical Cox model and these frailty models, first through a real-life data set from the Quebec Pension Plan and then using a more general simulation study. This performance is investigated in terms of the bias of the point estimates and their empirical standard errors in both the fixed- and random-effect parts. Both the simulation and the real-data studies showed differences between the classical Cox model and the shared frailty models.
Keywords: life insurance-pension plan, survival analysis, risk factors, cox proportional hazards model, multivariate failure-time data, shared frailty, simulation study
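A minimal sketch of the frailty idea, assuming Python with numpy, pandas and lifelines: clustered survival times are simulated with a gamma-distributed shared frailty and then fitted with a classical Cox model that ignores the frailty, which attenuates the estimated covariate effect. The paper's four estimation methods and its Quebec Pension Plan data are not reproduced here.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n_groups, group_size, beta = 200, 5, 0.7
frailty = rng.gamma(shape=2.0, scale=0.5, size=n_groups)    # shared frailty, E[Z] = 1
rows = []
for g in range(n_groups):
    for _ in range(group_size):
        x = rng.normal()
        rate = 0.1 * frailty[g] * np.exp(beta * x)           # hazard multiplied by the group frailty
        t = rng.exponential(1.0 / rate)
        c = rng.exponential(15.0)                            # independent censoring time
        rows.append({"T": min(t, c), "E": int(t <= c), "x": x})

df = pd.DataFrame(rows)
cph = CoxPHFitter()
cph.fit(df, duration_col="T", event_col="E")                 # classical Cox, frailty ignored
cph.print_summary()   # estimated beta is attenuated relative to the true 0.7
```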
Procedia PDF Downloads 359
16137 The Non-Stationary BINARMA(1,1) Process with Poisson Innovations: An Application on Accident Data
Authors: Y. Sunecher, N. Mamode Khan, V. Jowaheer
Abstract:
This paper considers the modelling of a non-stationary bivariate integer-valued autoregressive moving average process of order one (BINARMA(1,1)) with correlated Poisson innovations. The BINARMA(1,1) model is specified using the binomial thinning operator and by assuming that the cross-correlation between the two series is induced by the innovation terms only. Based on these assumptions, the non-stationary marginal and joint moments of the BINARMA(1,1) are derived iteratively from some initial stationary moments. As regards the estimation of the parameters of the proposed model, the conditional maximum likelihood (CML) estimation method is derived based on thinning and convolution properties. The forecasting equations of the BINARMA(1,1) model are also derived. A simulation study is proposed in which BINARMA(1,1) count data are generated using a multivariate Poisson R code for the innovation terms. The performance of the BINARMA(1,1) model is then assessed through this simulation experiment, and the mean estimates of the model parameters obtained are all efficient, based on their standard errors. The proposed model is then used to analyse real-life accident data from the motorway in Mauritius, based on some covariates: policemen, daily patrol, speed cameras, traffic lights and roundabouts. The CML estimates clearly indicate a significant impact of the covariates on the number of accidents on the motorway in Mauritius, and the forecasting equations provide reliable one-step-ahead forecasts.
Keywords: non-stationary, BINARMA(1, 1) model, Poisson innovations, conditional maximum likelihood, CML
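To make the binomial thinning operator concrete, the sketch below simulates a univariate INAR(1)-type recursion with Poisson innovations in Python; it is an illustration of the thinning mechanism only, not the full bivariate BINARMA(1,1) model or its CML estimation.

```python
import numpy as np

rng = np.random.default_rng(42)

def thin(alpha, x):
    """Binomial thinning: alpha o x is the sum of x Bernoulli(alpha) variables."""
    return rng.binomial(x, alpha)

def simulate_inar1(alpha=0.5, lam=2.0, n=500):
    """X_t = alpha o X_{t-1} + eps_t, with eps_t ~ Poisson(lam)."""
    x = np.empty(n, dtype=int)
    x[0] = rng.poisson(lam / (1 - alpha))       # start near the stationary mean
    for t in range(1, n):
        x[t] = thin(alpha, x[t - 1]) + rng.poisson(lam)
    return x

series = simulate_inar1()
print("sample mean:", series.mean(), "theoretical mean:", 2.0 / (1 - 0.5))
```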
Procedia PDF Downloads 129
16136 The Discriminant Analysis and Relevant Model for Mapping Export Potential
Authors: Jana Gutierez Chvalkovska, Michal Mejstrik, Matej Urban
Abstract:
There are ongoing discussions over the mapping of country export potential in order to refocus the export strategy of firms and its evidence-based promotion by Export Credit Agencies (ECAs) and other permitted government vehicles. In this paper we develop our version of an applied model that offers "stepwise" elimination of unattractive markets. We modify and calibrate the model for the particular features of the Czech Republic and specific pilot cases where we apply an individual approach to each sector.
Keywords: export strategy, modeling export, calibration, export promotion
Procedia PDF Downloads 498
16135 Control of an SIR Model for Basic Reproduction Number Regulation
Authors: Enrique Barbieri
Abstract:
The basic disease-spread model described by three states denoting the susceptible (S), infectious (I), and removed (recovered and deceased) (R) sub-groups of the total population N, or SIR model, has been considered. Heuristic mitigating action profiles of the pharmaceutical and non-pharmaceutical types may be developed in a control design setting for the purpose of reducing the transmission rate or improving the recovery rate parameters in the model. Even though the transmission and recovery rates are not control inputs in the traditional sense, a linear observer and feedback controller can be tuned to generate an asymptotic estimate of the transmission rate for a linearized, discrete-time version of the SIR model. Then, a set of mitigating actions is suggested to steer the basic reproduction number toward unity, in which case the disease does not spread, and the infected population state does not suffer from multiple waves. The special case of piecewise constant transmission rate is described and applied to a seventh-order SEIQRDP model, which segments the population into four additional states. The offline simulations in discrete time may be used to produce heuristic policies implemented by public health and government organizations.
Keywords: control of SIR, observer, SEIQRDP, disease spread
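A minimal discrete-time SIR simulation (forward Euler) is sketched below to show how the basic reproduction number R0 = beta/gamma governs spread; the rates are illustrative, and the sketch does not include the authors' observer-based controller or the SEIQRDP extension.

```python
import numpy as np

def simulate_sir(beta, gamma, N=1_000_000, I0=100, days=180, dt=1.0):
    """Forward-Euler discretisation of the SIR equations."""
    S, I, R = N - I0, I0, 0
    history = []
    for _ in range(int(days / dt)):
        new_inf = beta * S * I / N * dt      # new infections this step
        new_rec = gamma * I * dt             # new recoveries/removals this step
        S, I, R = S - new_inf, I + new_inf - new_rec, R + new_rec
        history.append((S, I, R))
    return np.array(history)

beta, gamma = 0.30, 0.20                     # illustrative rates -> R0 = 1.5
print("R0 =", beta / gamma)
print("peak infected:", simulate_sir(beta, gamma)[:, 1].max())
```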
Procedia PDF Downloads 110
16134 Open Innovation Strategy (OIS) Paradigm and an OIS Capabilities Model
Authors: Anastasis D. Petrou
Abstract:
Innovation and strategy discussions do highlight open innovation as a new paradigm in business. Yet, a number of stumbling blocks in the form of closed innovation principles weaved into the fabric of a traditional business model stand in the way of the new paradigm’s momentum to increase value in various business contexts. The paper argues that businesses considering an engagement with the open innovation paradigm would need to take steps to improve their multiplicative, absorptive and relational capabilities, respectively. The needed improvements would amount to an evolutionary transformation of the business model and eventually bring about a paradigm overhaul in business. The transformation is worth staging over time to ensure that open innovation is developed across interconnected and partnered areas of strategic importance. This article develops an open innovation strategy (OIS) capabilities model, and employs examples from different industries to briefly discuss OIS’s potential to augment business value in a number of suggested areas for future research.
Keywords: closed innovation, open innovation paradigm, open innovation strategy (OIS) paradigm, OIS capabilities model, multiplicative capability, absorptive capability, relational capability
Procedia PDF Downloads 520
16133 Electricity Demand Modeling and Forecasting in Singapore
Authors: Xian Li, Qing-Guo Wang, Jiangshuai Huang, Jidong Liu, Ming Yu, Tan Kok Poh
Abstract:
In the power industry, accurate electricity demand forecasting over a given lead time is important for system operation and control. In this paper, we investigate the modeling and forecasting of Singapore’s electricity demand. Several standard models, such as the HWT exponential smoothing model, the ARMA model and ANN models, have been proposed based on historical demand data. We applied them to the Singapore electricity market and proposed three refinements, based on simulation, to improve the modeling accuracy. Compared with existing models, our refined model produces better forecasting accuracy. The simulations demonstrate that adding the forecasting error into the forecasting equation greatly improves the modeling accuracy.
Keywords: power industry, electricity demand, modeling, forecasting
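As an example of the kind of baseline model the paper starts from, the sketch below fits a Holt-Winters (HWT) exponential smoothing model with statsmodels on synthetic daily demand data; it does not reproduce the paper's refinements, such as feeding the forecast error back into the forecasting equation.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

rng = np.random.default_rng(0)
# Synthetic daily demand with a weekly cycle and mild trend (MW); not Singapore market data
t = np.arange(3 * 365)
demand = 5000 + 300 * np.sin(2 * np.pi * t / 7) + 0.5 * t + rng.normal(0, 50, t.size)
series = pd.Series(demand, index=pd.date_range("2020-01-01", periods=t.size, freq="D"))

model = ExponentialSmoothing(series, trend="add", seasonal="add", seasonal_periods=7).fit()
forecast = model.forecast(14)     # two-week-ahead forecast
print(forecast.head())
```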
Procedia PDF Downloads 640
16132 Saltwater Intrusion Studies in the Cai River in the Khanh Hoa Province, Vietnam
Authors: B. Van Kessel, P. T. Kockelkorn, T. R. Speelman, T. C. Wierikx, C. Mai Van, T. A. Bogaard
Abstract:
Saltwater intrusion is a common problem in estuaries around the world, as it can hinder the freshwater supply of coastal zones, and the problem is likely to grow due to climate change and sea-level rise. The influence of these factors on saltwater intrusion was investigated for the Cai River in the Khanh Hoa province in Vietnam. The Cai River also has high seasonal fluctuations in discharge, leading to increased saltwater intrusion during the dry season. Sea-level rise, river discharge changes, river mouth widening and a proposed saltwater intrusion prevention dam can all influence the saltwater intrusion, but their effects have not been quantified for the Cai River estuary. This research used both an analytical and a numerical model to investigate them. The analytical model was based on a model proposed by Savenije and was calibrated using limited in situ data; the numerical model was a 3D hydrodynamic model built with the Delft3D4 software. Both models agreed with the in situ data, mostly for tidally averaged values, and indicated a roughly similar dependence on discharge, agreeing that this parameter has the most severe influence on the modeled saltwater intrusion. Especially for discharges below 10 m³/s, the saltwater was predicted to reach further than 10 km. In the models, both sea-level rise and river widening mainly resulted in salinity increments of up to 3 kg/m³ in the middle part of the river. The sea-level rise predicted for 2070 was simulated to lead to an increase of 0.5 km in saltwater intrusion length. Furthermore, the effect of the saltwater intrusion dam appeared significant in the model used, but only for the highest position of the gate.
Keywords: Cai River, hydraulic models, river discharge, saltwater intrusion, tidal barriers
Procedia PDF Downloads 111
16131 Quantum Statistical Machine Learning and Quantum Time Series
Authors: Omar Alzeley, Sergey Utev
Abstract:
Minimizing a constrained multivariate function is fundamental to machine learning, and such algorithms are at the core of data mining and data visualization techniques. The decision function that maps input points to output points is based on the result of optimization, and this optimization is central to learning theory. One approach to complex systems, in which the dynamics of the system are inferred from a statistical analysis of the fluctuations in time of some associated observable, is time series analysis. The purpose of this paper is a mathematical transition from the autoregressive model of classical time series to the matrix formalization of quantum theory. Firstly, we propose a quantum time series (QTS) model. Although the Hamiltonian technique has become an established tool to detect deterministic chaos, other approaches have emerged; the quantum probabilistic technique is used here to motivate the construction of our QTS model, which resembles the quantum dynamic model previously applied to financial data. Secondly, various statistical methods, including machine learning algorithms such as the Kalman filter, are applied to estimate and analyse the unknown parameters of the model. Finally, simulation techniques such as Markov chain Monte Carlo are used to support our investigations. The proposed model has been examined using real and simulated data. We establish the relation between the quantum statistical machine and quantum time series via random matrix theory. It is interesting to note that the primary focus of the application of QTS in the field of quantum chaos was to find a model that explains chaotic behaviour; this model may reveal further insight into quantum chaos.
Keywords: machine learning, simulation techniques, quantum probability, tensor product, time series
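As a concrete example of the Kalman filtering step mentioned above (not the quantum time series model itself), the sketch below implements a scalar Kalman filter for a local-level state-space model in Python; noise variances and data are illustrative.

```python
import numpy as np

def kalman_local_level(y, q=0.1, r=1.0, x0=0.0, p0=10.0):
    """Scalar Kalman filter for x_t = x_{t-1} + w_t (var q), y_t = x_t + v_t (var r)."""
    x, p = x0, p0
    estimates = []
    for obs in y:
        p = p + q                       # predict step: propagate state variance
        k = p / (p + r)                 # Kalman gain
        x = x + k * (obs - x)           # update step: correct with the new observation
        p = (1 - k) * p
        estimates.append(x)
    return np.array(estimates)

rng = np.random.default_rng(3)
truth = np.cumsum(rng.normal(0, np.sqrt(0.1), 200))
y = truth + rng.normal(0, 1.0, 200)
xhat = kalman_local_level(y)
print("RMSE raw:", np.sqrt(np.mean((y - truth) ** 2)),
      "RMSE filtered:", np.sqrt(np.mean((xhat - truth) ** 2)))
```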
Procedia PDF Downloads 469
16130 Methodology for Obtaining Static Alignment Model
Authors: Lely A. Luengas, Pedro R. Vizcaya, Giovanni Sánchez
Abstract:
In this paper, a methodology is presented to obtain the static alignment model for any transtibial amputee. The proposed methodology starts from experimental data collected at the Hospital Militar Central, Bogotá, Colombia. The effects of transtibial prosthesis malalignment on amputees were measured in terms of joint angles, center of pressure (COP) and weight distribution. Statistical tools are used to obtain the model parameters, and mathematical predictive models of prosthetic alignment were created. The proposed models were validated in amputees, with promising results for the static alignment of the prosthesis. Although the static alignment process is unique to each subject, the proposed methodology can be used for any transtibial amputee.
Keywords: information theory, prediction model, prosthetic alignment, transtibial prosthesis
Procedia PDF Downloads 256
16129 Design and Implementation of Low-code Model-building Methods
Authors: Zhilin Wang, Zhihao Zheng, Linxin Liu
Abstract:
This study proposes a low-code model-building approach that aims to simplify the development and deployment of artificial intelligence (AI) models. With an intuitive drag-and-drop interface for connecting components, users can easily build complex models and integrate multiple algorithms for training. After training is completed, the system automatically generates a callable model service API. This method not only lowers the technical threshold of AI development and improves development efficiency but also enhances the flexibility of algorithm integration and simplifies the model deployment process. The core strength of the method lies in its ease of use and efficiency: users do not need a deep programming background and can complete the design and implementation of complex models with simple drag-and-drop operations. This feature greatly expands the reach of AI technology, allowing more non-technical people to participate in the development of AI models. At the same time, the method performs well in algorithm integration, supporting many different types of algorithms working together, which further improves the performance and applicability of the models. In the experimental part, we performed several performance tests on the method. The results show that, compared with traditional model construction methods, this method makes more efficient use of computing resources and greatly shortens model training time. In addition, the system-generated model service interface has been optimized for high availability and scalability and can adapt to the needs of different application scenarios.
Keywords: low-code, model building, artificial intelligence, algorithm integration, model deployment
Procedia PDF Downloads 29
16128 Effect of Sand Particle Distribution in Oil and Gas Pipeline Erosion
Authors: Christopher Deekia Nwimae, Nigel Simms, Liyun Lao
Abstract:
Erosion in pipe bends caused by particles is a major obstacle in oil and gas fields and may cause the breakdown of production equipment. This work studied the effects of flow velocity and solid particle diameter on impact erosion in an elbow; the erosion rate was predicted using a computational fluid dynamics (CFD) approach and verified against experimental data. A two-way coupled Euler-Lagrange discrete phase model was employed to calculate the air/solid particle flow in the elbow. One erosion model and three particle rebound models were used to predict the erosion rate on 90° elbows. The generic erosion model was used in the CFD-based erosion model, and the predictions showed good agreement with the experimental data.
Keywords: erosion, prediction, elbow, computational fluid dynamics
Procedia PDF Downloads 157
16127 6D Posture Estimation of Road Vehicles from Color Images
Authors: Yoshimoto Kurihara, Tad Gonsalves
Abstract:
Currently, in the field of object posture estimation, research focuses on estimating the position and angle of an object by storing a 3D model of the object in advance and matching it against the observation. In this research, however, we have succeeded in creating a module that is much simpler, smaller in scale, and faster in operation. Our 6D pose estimation model consists of two different networks: a classification network and a regression network. From a single RGB image, the trained model estimates the class of the object in the image, the coordinates of the object, and its rotation angle in 3D space. In addition, we compared the estimation accuracy for each camera position, i.e., the angle from which the object was captured. The highest accuracy was recorded when the camera position was 75°: the classification accuracy was about 87.3%, and the regression accuracy was about 98.9%.
Keywords: 6D posture estimation, image recognition, deep learning, AlexNet
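A minimal PyTorch sketch of the two-branch idea, a classification head and a 6-DoF regression head sharing an AlexNet-style backbone, is shown below (torchvision >= 0.13 assumed); the layer sizes, number of classes and training details are assumptions, not the authors' exact architecture.

```python
import torch
import torch.nn as nn
import torchvision.models as models

class PoseNet(nn.Module):
    """Shared CNN backbone with a class head and a 6-DoF pose regression head."""
    def __init__(self, n_classes=10):
        super().__init__()
        backbone = models.alexnet(weights=None)     # untrained AlexNet feature extractor
        self.features = backbone.features
        self.pool = nn.AdaptiveAvgPool2d((6, 6))
        self.classifier = nn.Sequential(nn.Flatten(), nn.Linear(256 * 6 * 6, 512),
                                        nn.ReLU(), nn.Linear(512, n_classes))
        self.regressor = nn.Sequential(nn.Flatten(), nn.Linear(256 * 6 * 6, 512),
                                       nn.ReLU(), nn.Linear(512, 6))   # x, y, z + 3 angles

    def forward(self, x):
        f = self.pool(self.features(x))
        return self.classifier(f), self.regressor(f)

net = PoseNet()
logits, pose = net(torch.randn(2, 3, 224, 224))
print(logits.shape, pose.shape)    # torch.Size([2, 10]) torch.Size([2, 6])
```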
Procedia PDF Downloads 155
16126 A Robust Optimization Model for Multi-Objective Closed-Loop Supply Chain
Authors: Mohammad Y. Badiee, Saeed Golestani, Mir Saman Pishvaee
Abstract:
In recent years, consumers and governments have increasingly been pushing companies to design their activities in such a way as to reduce negative environmental impacts, whether by producing renewable products or by adopting threat-free disposal policies. It is therefore important to optimize the various aspects of the total supply chain more accurately. Modeling a supply chain can be a challenging process because a large number of factors need to be considered in the model. The use of multi-objective optimization can help overcome these problems since more information is used when designing the model. Uncertainty is inevitable in the real world; considering uncertainty on parameters, in addition to using multiple objectives, gives more flexibility to the decision-making process since the process can take into account many more constraints and requirements. In this paper we demonstrate a stochastic, scenario-based robust model to cope with uncertainty in a closed-loop multi-objective supply chain. By applying the proposed model to a real-world case, its power in handling data uncertainty is shown.
Keywords: supply chain management, closed-loop supply chain, multi-objective optimization, goal programming, uncertainty, robust optimization
Procedia PDF Downloads 415
16125 Generalized Additive Model Approach for the Chilean Hake Population in a Bio-Economic Context
Authors: Selin Guney, Andres Riquelme
Abstract:
The traditional bio-economic method for fisheries modeling uses estimates of the growth parameters and the system carrying capacity from a biological model of the population dynamics (usually a logistic population growth model), which is then analyzed as a traditional production function. The stock dynamics are transformed into a revenue function and compared with the extraction costs to estimate the maximum economic yield. In this paper, the logistic population growth model is combined with a forecast of the abundance and location of the stock obtained through a generalized additive model approach. The paper focuses on the Chilean hake population. This method allows for the incorporation of climatic variables and the interaction with other marine species, which in turn increases the reliability of the estimates and generates better extraction paths for different conservation objectives, such as the maximum biological yield or the maximum economic yield.
Keywords: bio-economic, fisheries, GAM, production
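For reference, the biological side of such a bio-economic model is the logistic (Schaefer-type) surplus production function, whose maximum sustainable yield is MSY = rK/4 at B = K/2. The Python sketch below uses illustrative parameters; the GAM abundance forecast and the economic layer described in the abstract are not included.

```python
import numpy as np

def surplus_production(B, r, K):
    """Logistic surplus production: dB/dt = r * B * (1 - B / K)."""
    return r * B * (1 - B / K)

r, K = 0.4, 1_000_000           # illustrative growth rate and carrying capacity (tonnes)
B_msy = K / 2                    # biomass giving maximum surplus production
MSY = r * K / 4                  # maximum sustainable yield of the logistic model
print("B_MSY =", B_msy, "tonnes, MSY =", MSY, "tonnes/yr")

# Ten years of biomass dynamics under a constant catch below MSY
B, catch = 600_000.0, 80_000.0
for _ in range(10):
    B = B + surplus_production(B, r, K) - catch
print("biomass after 10 years:", round(B))
```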
Procedia PDF Downloads 252
16124 A Model-Reference Sliding Mode for Dual-Stage Actuator Servo Control in HDD
Authors: S. Sonkham, U. Pinsopon, W. Chatlatanagulchai
Abstract:
This paper presents the design and development of a sliding mode control (SMC) method for the servo system of a dual-stage actuator (DSA) hard disk drive. Mathematical models of the hard disk drive actuators are obtained by measuring the frequency responses of the voice-coil motor (VCM) and the PZT micro-actuator separately. Matlab software tools are used for model estimation as well as for controller design and simulation. A model-reference approach is selected to meet the tracking requirement. The simulation results show that the model-reference SMC controller satisfies the tracking-error requirement and keeps the head position within the boundary of +/-5% of the track width in the presence of internal and external disturbances. The overall results of the model-reference SMC design for the DSA meet the requirement specifications, and a significant reduction in % off-track is found compared to the single-stage actuator (SSA).
Keywords: hard disk drive, dual-stage actuator, track following, hdd servo control, sliding mode control, model-reference, tracking control
Procedia PDF Downloads 365
16123 Stabilization Control of the Nonlinear AIDS Model Based on the Theory of Polynomial Fuzzy Control Systems
Authors: Shahrokh Barati
Abstract:
In this paper, we first introduce the AIDS disease and propose a dynamic model that illustrates its progression. After a short history of nonlinear modeling with polynomial fuzzy systems, we consider the stability conditions of such systems, which underpin a large body of research on modeling and controlling AIDS in a nonlinear dynamic form. In this approach, we use a framework for controlling any polynomial fuzzy model, which generalizes the Takagi-Sugeno (T-S) fuzzy model. To control the system more effectively, the stability conditions are obtained in terms of polynomial functions, and an appropriate controller is then designed. We first determine the equilibrium points of the system and their conditions and, to examine changes in the parameters, present the polynomial fuzzy model as a generalization of previous Takagi-Sugeno models. The equations are then evaluated in both open loop and closed loop, and, with the help of the controlling feedback, the closed-loop equations of the system are calculated. To simulate the nonlinear AIDS model, we use the output of the polynomial fuzzy controller, which is able to make the parameters of the nonlinear system follow a sustainable reference model properly.
Keywords: polynomial fuzzy, AIDS, nonlinear AIDS model, fuzzy control systems
Procedia PDF Downloads 468
16122 Vibration-Based Data-Driven Model for Road Health Monitoring
Authors: Guru Prakash, Revanth Dugalam
Abstract:
A road’s condition often deteriorates due to harsh loading, such as overloaded trucks, and severe environmental conditions, such as heavy rain, snow load, and cyclic loading. In the absence of proper maintenance planning, this results in potholes, wide cracks, bumps, and increased road roughness. In this paper, a data-driven model is developed to detect these damages using vibration and image signals. The key idea of the proposed methodology is that road anomalies manifest in these signals and can be detected by training a machine learning algorithm. The use of various machine learning techniques, such as the support vector machine and the Random Forest method, will be investigated. The proposed model will first be trained and tested with artificially simulated data, and the model architecture will be finalized by comparing the accuracies of the various models. Once a model is fixed, a field study will be performed and data will be collected. The field data will be used to validate the proposed model and to predict the future health condition of the road. The proposed model will help to automate the road condition monitoring process, repair cost estimation, and maintenance planning.
Keywords: SVM, data-driven, road health monitoring, pot-hole
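A minimal sketch of the data-driven classifier described above, assuming Python with scikit-learn: a Random Forest is trained on simulated vibration features (RMS and peak acceleration) with synthetic labels; the actual features, sensors and data of the study are not represented.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(7)
n = 1000
labels = rng.integers(0, 2, n)                       # 0 = smooth road, 1 = pothole/bump
rms = rng.normal(0.3 + 0.5 * labels, 0.1)            # RMS acceleration (g), higher when damaged
peak = rng.normal(1.0 + 2.0 * labels, 0.4)           # peak acceleration (g)
X = np.column_stack([rms, peak])

X_tr, X_te, y_tr, y_te = train_test_split(X, labels, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```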
Procedia PDF Downloads 86
16121 An Integrated Intuitionistic Fuzzy ELECTRE Model for Multi-Criteria Decision-Making
Authors: Babek Erdebilli
Abstract:
The aim of this study is to develop and describe a new methodology for Multi-Criteria Decision-Making (MCDM) problems using an intuitionistic fuzzy ELECTRE (Elimination Et Choix Traduisant la Realite) model. The proposed model enables Decision-Makers (DMs) to carry out assessments using Intuitionistic Fuzzy Numbers (IFNs). A numerical example is provided to demonstrate and clarify the proposed analysis procedure, and an empirical experiment is conducted to validate its effectiveness.
Keywords: multi-criteria decision-making, IFE, DM’s, fuzzy electre model
Procedia PDF Downloads 651
16120 Computationally Efficient Electrochemical-Thermal Li-Ion Cell Model for Battery Management System
Authors: Sangwoo Han, Saeed Khaleghi Rahimian, Ying Liu
Abstract:
Vehicle electrification is gaining momentum, and many car manufacturers promise to deliver more electric vehicle (EV) models to consumers in the coming years. In controlling the battery pack, the battery management system (BMS) must maintain optimal battery performance while ensuring the safety of the pack. Tasks related to battery performance include determining state-of-charge (SOC), state-of-power (SOP), and state-of-health (SOH), cell balancing, and battery charging. Safety-related functions include making sure cells operate within the specified static and dynamic voltage windows and temperature range, derating power, detecting faulty cells, and warning the user if necessary. The BMS often utilizes an RC circuit model of the Li-ion cell because of its robustness and low computation cost, among other benefits. Because an equivalent circuit model such as the RC model is not physics-based, however, it can never serve as a prognostic model that predicts battery state-of-health and avoids safety risks before they occur. A physics-based Li-ion cell model, on the other hand, is more capable, at the expense of computation cost. To avoid the high computation cost associated with a full-order model, many researchers have demonstrated the use of a single particle model (SPM) for BMS applications. One drawback of the single-particle approach is that it forces the use of the average current density in the calculation. The SPM is appropriate for simulating drive cycles, where there is insufficient time to develop a significant current distribution within an electrode; under a continuous or high-pulse electrical load, however, it may fail to predict the cell voltage or the Li⁺ plating potential. To overcome this issue, a multi-particle reduced-order model is proposed here. The use of multiple particles, combined with either linear or nonlinear charge-transfer reaction kinetics, enables capturing the current density distribution within an electrode under any type of electrical load. To keep the computational complexity comparable to that of an SPM, the governing equations are solved sequentially to minimize iterative solving. The model is validated against a full-order model implemented in COMSOL Multiphysics.
Keywords: battery management system, physics-based li-ion cell model, reduced-order model, single-particle and multi-particle model
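For contrast with the physics-based reduced-order model proposed here, the sketch below shows the kind of lumped first-order RC (Thevenin) equivalent-circuit update a BMS commonly uses, with coulomb counting for SOC; the resistance, capacitance and OCV curve are illustrative placeholders.

```python
import numpy as np

def rc_model_step(soc, v_rc, i, dt, capacity_ah=50.0, r0=0.002, r1=0.001, c1=2000.0):
    """One step of a 1-RC Thevenin cell model (discharge current positive).

    soc  : state of charge (0..1)
    v_rc : voltage across the RC pair (V)
    i    : cell current (A)
    """
    soc = soc - i * dt / (capacity_ah * 3600.0)               # coulomb counting
    tau = r1 * c1
    v_rc = v_rc * np.exp(-dt / tau) + r1 * (1 - np.exp(-dt / tau)) * i
    ocv = 3.0 + 1.2 * soc                                      # crude linear OCV(SOC) placeholder
    v_cell = ocv - v_rc - r0 * i                               # terminal voltage
    return soc, v_rc, v_cell

soc, v_rc = 0.9, 0.0
for _ in range(600):                                           # 10 minutes at 50 A discharge
    soc, v_rc, v = rc_model_step(soc, v_rc, i=50.0, dt=1.0)
print(f"SOC = {soc:.3f}, terminal voltage = {v:.3f} V")
```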
Procedia PDF Downloads 111
16119 Forecasting Model to Predict Dengue Incidence in Malaysia
Authors: W. H. Wan Zakiyatussariroh, A. A. Nasuhar, W. Y. Wan Fairos, Z. A. Nazatul Shahreen
Abstract:
Forecasting dengue incidence in a population can provide useful information to facilitate the planning of public health interventions. Many studies on dengue cases in Malaysia have been conducted, but few have modeled the outbreaks or forecast incidence. This article attempts to propose the most appropriate time series model to explain the behavior of dengue incidence in Malaysia for the purpose of forecasting future dengue outbreaks. Several seasonal auto-regressive integrated moving average (SARIMA) models were developed to model Malaysia’s dengue incidence using weekly data collected from January 2001 to December 2011. The SARIMA(2,1,1)(1,1,1)52 model was found to be the most suitable for Malaysia’s dengue incidence, with the smallest Akaike information criterion (AIC) and Bayesian information criterion (BIC) for in-sample fitting. The models were further evaluated for out-of-sample forecast accuracy using four different accuracy measures. The results indicate that SARIMA(2,1,1)(1,1,1)52 performed well for both in-sample fitting and out-of-sample evaluation.
Keywords: time series modeling, Box-Jenkins, SARIMA, forecasting
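A sketch of fitting the reported SARIMA(2,1,1)(1,1,1)52 order with statsmodels is given below; the weekly series is synthetic, since the Malaysian surveillance data are not available here.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(0)
weeks = pd.date_range("2001-01-01", periods=572, freq="W")    # ~11 years of weekly counts
seasonal = 50 * np.sin(2 * np.pi * np.arange(572) / 52)       # synthetic yearly cycle
cases = np.maximum(0, 200 + seasonal + rng.normal(0, 20, 572)).round()
series = pd.Series(cases, index=weeks)

model = SARIMAX(series, order=(2, 1, 1), seasonal_order=(1, 1, 1, 52))
result = model.fit(disp=False)
print(result.aic, result.bic)      # compare candidate orders by AIC/BIC
print(result.forecast(steps=4))    # four-week-ahead forecast
```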
Procedia PDF Downloads 484
16118 Optimization Model for Support Decision for Maximizing Production of Mixed Fresh Fruit Farms
Authors: Andrés I. Ávila, Patricia Aros, César San Martín, Elizabeth Kehr, Yovana Leal
Abstract:
Planning models for fresh products are very useful tools for improving net profits. To build an efficient supply chain model, several functions should be considered so that several operational units can be simulated in full. We consider a linear programming model that helps farmers decide whether it is advantageous to plant, and how much area to plant, for three kinds of export fruits, taking their future investment into account. We include area, investment, water, minimum-productivity and harvest restrictions to develop a monthly based model that computes the average income over five years. Conditions of the field, such as area, water availability, and initial investment, are also required as inputs. Using Chilean costs and the dollar-peso exchange rate, we can simulate several scenarios to understand the possible risks associated with this market. The tool also helps support decisions for the government and for individual farmers.
Keywords: mixed integer problem, fresh fruit production, support decision model, agricultural and biosystems engineering
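A toy version of such a planting-allocation linear program, written with PuLP, is sketched below; the crops, margins, water requirements and capital limits are placeholders, not the Chilean cost data or the full monthly five-year model.

```python
from pulp import LpProblem, LpVariable, LpMaximize, lpSum, value

crops = ["cherry", "apple", "blueberry"]
margin = {"cherry": 12000, "apple": 5000, "blueberry": 9000}        # USD/ha/yr (illustrative)
water = {"cherry": 7000, "apple": 6000, "blueberry": 5500}          # m3/ha/yr (illustrative)
invest = {"cherry": 30000, "apple": 15000, "blueberry": 25000}      # USD/ha establishment cost

prob = LpProblem("orchard_plan", LpMaximize)
area = {c: LpVariable(f"ha_{c}", lowBound=0) for c in crops}

prob += lpSum(margin[c] * area[c] for c in crops)                   # maximise annual margin
prob += lpSum(area[c] for c in crops) <= 50                         # total area (ha)
prob += lpSum(water[c] * area[c] for c in crops) <= 280_000         # water rights (m3/yr)
prob += lpSum(invest[c] * area[c] for c in crops) <= 1_200_000      # initial capital (USD)

prob.solve()
print({c: value(area[c]) for c in crops}, "margin:", value(prob.objective))
```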
Procedia PDF Downloads 438
16117 Analysis of the Impact of NVivo and EndNote on Academic Research Productivity
Authors: Sujit K. Basak
Abstract:
The aim of this paper is to analyze the impact of literature review software on researchers. This aim was achieved by analyzing models in terms of perceived usefulness, perceived ease of use, and acceptance level. The collected data were analyzed using WarpPLS 4.0 software. The study used two theoretical frameworks, namely the Technology Acceptance Model and the Training Needs Assessment Model; it was experimental and was conducted at a public university in South Africa. The results showed that acceptance level has the highest impact on research workload and productivity, followed by perceived usefulness and perceived ease of use.
Keywords: technology acceptance model, training needs assessment model, literature review software, research productivity
Procedia PDF Downloads 502
16116 A Spatial Approach to Model Mortality Rates
Authors: Yin-Yee Leong, Jack C. Yue, Hsin-Chung Wang
Abstract:
Human longevity has been experiencing its largest increase since the end of World War II, and modeling mortality rates is therefore often the focus of many studies. Among all mortality models, the Lee–Carter model is the most popular approach since it is fairly easy to use and has good accuracy in predicting mortality rates (e.g., for Japan and the USA). However, empirical studies from several countries have shown that the age parameters of the Lee–Carter model are not constant in time. Many modifications of the Lee–Carter model have been proposed to deal with this problem, including adding an extra cohort effect or another period effect. In this study, we propose a spatial modification and use clusters to explain why the age parameters of the Lee–Carter model are not constant. In spatial analysis, clusters are areas with unusually high or low mortality rates compared to their neighbors, where the “location” of a mortality rate is measured by age and time, that is, a two-dimensional coordinate. We use a popular cluster detection method, the spatial scan statistic, a local statistical test based on the likelihood ratio, to evaluate where there are locations with mortality rates that cannot be described well by the Lee–Carter model. We first use computer simulation to demonstrate that the cluster effect is a possible source of the problem of non-constant age parameters, and then show that adding the cluster effect can solve it. We also apply the proposed approach to mortality data from Japan, France, the USA, and Taiwan. The empirical results show that our approach has better-fitting results and smaller mean absolute percentage errors than the Lee–Carter model.
Keywords: mortality improvement, Lee–Carter model, spatial statistics, cluster detection
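For background, the classical Lee–Carter fit log m(x,t) = a_x + b_x k_t via a rank-1 SVD is sketched below on a synthetic mortality surface; the authors' spatial scan statistic and cluster adjustment are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)
ages, years = 100, 60
a_true = np.linspace(-8, -1, ages)                  # log mortality rises with age
b_true = np.linspace(0.02, 0.005, ages)             # age-specific sensitivity to the period index
k_true = np.linspace(20, -20, years)                # mortality improvement over time
log_m = a_true[:, None] + np.outer(b_true, k_true) + rng.normal(0, 0.02, (ages, years))

# Lee-Carter: a_x = row mean, then a rank-1 SVD of the centred matrix gives b_x and k_t
a_hat = log_m.mean(axis=1)
U, S, Vt = np.linalg.svd(log_m - a_hat[:, None], full_matrices=False)
b_hat = U[:, 0] / U[:, 0].sum()                     # normalise so sum(b_x) = 1
k_hat = S[0] * Vt[0] * U[:, 0].sum()                # rescale so the product b_x * k_t is unchanged

fitted = a_hat[:, None] + np.outer(b_hat, k_hat)
print("max abs error in log rates:", np.abs(fitted - log_m).max())
```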
Procedia PDF Downloads 171
16115 Impact of VARK Learning Model at Tertiary Level Education
Authors: Munazza A. Mirza, Khawar Khurshid
Abstract:
Individuals are generally associated with different learning styles, which have been explored extensively in the recent past. Learning styles refer to the ways in which an individual can most easily comprehend and retain information. Among the various learning style models, VARK is the most widely accepted; it categorizes learners with respect to their sensory characteristics. Based on the number of preferred learning modes, learners can be categorized as uni-modal, bi-modal, tri-modal, or quad/multi-modal. Although there is a prevalent belief in learning styles, the model is not frequently or effectively utilized in higher education. This research describes an identification model to validate the linkage between teachers’ didactic practices, students’ performance, and learning styles. The identification model is recommended for checking the effective application and evaluation of the various learning styles, and serves as a guideline for implementing a learning styles inventory so that any performance linkage with learning styles can be validated. If performance is linked with learning styles, this may help eradicate the distrust of learning style theory. For this purpose, a comprehensive study was conducted to compare and understand how the VARK inventory is used to identify learning preferences and their correlation with learners’ performance. A comparative analysis of the findings of these studies is presented to understand the learning styles of tertiary students in various disciplines. It is concluded with confidence that the learning styles of students cannot be associated with any specific discipline, and there is not enough empirical proof to link performance with learning styles.
Keywords: learning style, VARK, sensory preferences, identification model, didactic practices
Procedia PDF Downloads 277
16114 Applying the Extreme-Based Teaching Model in Post-Secondary Online Classroom Setting: A Field Experiment
Authors: Leon Pan
Abstract:
The first programming course within post-secondary education has long been recognized as a challenging endeavor for educators and students alike. Historically, these courses have exhibited high failure rates and a notable number of dropouts. Instructors often lament students' lack of effort in their coursework, and students often express frustration that the teaching methods employed are not effective. Drawing inspiration from the successful principles of Extreme Programming, this study introduces the Extreme-based teaching model, aimed at enhancing the teaching of introductory programming courses. To empirically determine the effectiveness of the model, a section taught using the Extreme-based model was compared with another utilizing traditional teaching methods. Notably, the Extreme-based class required students to work collaboratively on projects while also demanding continuous assessment and performance enhancement within groups. This paper details the application of the Extreme-based model within the post-secondary online classroom context and presents results that emphasize its effectiveness in advancing the teaching and learning experience. The Extreme-based model led to a significant increase of 13.46 points in the weighted total average and a 10% reduction in the failure rate.
Keywords: extreme-based teaching model, innovative pedagogical methods, project-based learning, team-based learning
Procedia PDF Downloads 59
16113 Functional Decomposition Based Effort Estimation Model for Software-Intensive Systems
Authors: Nermin Sökmen
Abstract:
An effort estimation model is needed for software-intensive projects that consist of hardware, embedded software, or some combination of the two, as well as high-level software solutions. This paper first focuses on functional decomposition techniques to measure the functional complexity of a computer system and investigates its impact on system development effort. It then examines the effects of technical difficulty and design team capability factors in order to construct the best effort estimation model. Using traditional regression analysis, the study develops a system development effort estimation model that takes functional complexity, technical difficulty, and design team capability as input parameters. Finally, the assumptions of the model are tested.
Keywords: functional complexity, functional decomposition, development effort, technical difficulty, design team capability, regression analysis
Procedia PDF Downloads 293
16112 Application of an Analytical Model to Obtain Daily Flow Duration Curves for Different Hydrological Regimes in Switzerland
Authors: Ana Clara Santos, Maria Manuela Portela, Bettina Schaefli
Abstract:
This work assesses the performance of an analytical model framework to generate daily flow duration curves (FDCs) based on the climatic characteristics of catchments and on their streamflow recession coefficients. In the analytical framework, precipitation is considered a stochastic process, modeled as a marked Poisson process, while recession is considered deterministic, with parameters that can be computed from different models. The framework was tested on three case studies with different hydrological regimes in Switzerland: pluvial, snow-dominated and glacier. Five time intervals were analyzed (the four meteorological seasons and the civil year) and two developments of the model were tested: one with a linear recession model and the other with a nonlinear recession model. These developments were combined with recession coefficients obtained from two different approaches: forward and inverse estimation. The performance of the analytical framework with forward parameter estimation is poor in comparison with inverse estimation for both the linear and the nonlinear models. For the pluvial catchment, inverse estimation shows exceptionally good results, especially for the nonlinear model, clearly suggesting that the model has the ability to describe FDCs. For the snow-dominated and glacier catchments, the seasonal results are better than the annual ones, suggesting that the model can describe streamflows in those conditions and that future efforts should focus on improving and combining seasonal curves instead of considering single annual ones.
Keywords: analytical streamflow distribution, stochastic process, linear and non-linear recession, hydrological modelling, daily discharges
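A minimal sketch of the linear-recession building block, assuming a linear reservoir Q = kS fed by Poisson-arriving recharge events, is given below; it builds an empirical FDC by simulation rather than reproducing the authors' analytical derivation, and all parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
days, p_wet, mean_depth, k = 3650, 0.3, 8.0, 0.05   # wet-day probability, event depth (mm), recession rate (1/day)
storage, q = 0.0, []
for _ in range(days):
    if rng.random() < p_wet:                     # Poisson-type arrival of recharge events
        storage += rng.exponential(mean_depth)   # exponentially distributed event depth (the "mark")
    discharge = k * storage                      # linear reservoir: Q = k * S
    storage -= discharge
    q.append(discharge)

q = np.sort(np.array(q))[::-1]                   # flow duration curve: flows sorted by exceedance
for p in (0.05, 0.5, 0.95):
    print(f"Q{int(p * 100)} = {q[int(p * q.size)]:.2f} mm/day")
```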
Procedia PDF Downloads 162
16111 Robustified Asymmetric Logistic Regression Model for Global Fish Stock Assessment
Authors: Osamu Komori, Shinto Eguchi, Hiroshi Okamura, Momoko Ichinokawa
Abstract:
Long time-series data on population assessments are essential for global ecosystem assessment because the temporal change of biomass in such a database properly reflects the status of the global ecosystem. However, the available assessment data usually have limited sample sizes, and the ratio of populations with low biomass abundance (collapsed) to those with high abundance (non-collapsed) is highly imbalanced. To allow for the imbalance and uncertainty involved in ecological data, we propose a binary regression model with mixed effects for inferring ecosystem status through an asymmetric logistic model. In the estimating equation, we observe that the weights for the non-collapsed populations are relatively reduced, which in turn puts more importance on the small number of observations of collapsed populations. Moreover, we extend the asymmetric logistic regression model using propensity scores to allow for the sample biases observed in the labeled and unlabeled datasets. This robustifies the estimation procedure and improves the model fit.
Keywords: double robust estimation, ecological binary data, mixed effect logistic regression model, propensity score
Procedia PDF Downloads 266