Search results for: a causal model
15715 Two-Dimensional Modeling of Seasonal Freeze and Thaw in an Idealized River Bank
Authors: Jiajia Pan, Hung Tao Shen
Abstract:
Freezing and thawing occur seasonally in river banks in northern countries. Little is known about how riverbank soil temperature responds to air temperature changes and how freeze and thaw develop in a river bank over the seasons. This study presents a two-dimensional heat conduction model for numerical investigation of seasonal freeze and thaw processes in an idealized river bank. The model uses the finite difference method and is convenient for applications. The model is validated against an analytical solution and a field case with soil temperature distributions. It is then applied to the idealized river bank under partially and fully saturated conditions, with and without ice cover influence. Simulated results illustrate how the river bank responds to seasonal air temperature variations. The work promotes the understanding of freeze and thaw processes in river banks and prepares for further investigation of frost and thaw impacts on riverbank stability.
Keywords: freeze and thaw, riverbanks, 2D model, heat conduction
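As an editorial illustration of the modeling approach this abstract describes, the following is a minimal sketch of an explicit finite-difference scheme for 2D heat conduction with a seasonally varying surface temperature. All parameter values are assumptions, and the latent heat of soil freezing, which the paper's model would need to treat freeze and thaw properly, is omitted here.

```python
import numpy as np

# Explicit finite-difference update for the 2D heat conduction equation.
# All values below are illustrative assumptions, not the paper's parameters.
alpha = 1e-6                      # soil thermal diffusivity (m^2/s), assumed
dx = dy = 0.1                     # grid spacing (m)
dt = 0.2 * dx**2 / (4 * alpha)    # time step well inside the stability limit
nx, ny = 50, 50
T = np.full((ny, nx), 5.0)        # initial soil temperature (deg C), assumed

def step(T, T_air):
    """Advance one time step; the top row follows the air temperature."""
    Tn = T.copy()
    Tn[1:-1, 1:-1] = T[1:-1, 1:-1] + alpha * dt * (
        (T[1:-1, 2:] - 2 * T[1:-1, 1:-1] + T[1:-1, :-2]) / dx**2 +
        (T[2:, 1:-1] - 2 * T[1:-1, 1:-1] + T[:-2, 1:-1]) / dy**2
    )
    Tn[0, :] = T_air          # Dirichlet condition at the ground surface
    return Tn                 # other boundaries stay at their initial value

t, year = 0.0, 365 * 86400.0
while t < year:
    T_air = -15.0 * np.cos(2 * np.pi * t / year)   # idealized seasonal cycle
    T = step(T, T_air)
    t += dt
```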
Procedia PDF Downloads 128
15714 Evaluating the Feasibility of Chemical Dermal Exposure Assessment Model
Authors: P. S. Hsi, Y. F. Wang, Y. F. Ho, P. C. Hung
Abstract:
The aim of the present study was to explore dermal exposure assessment models for chemicals that have been developed abroad and to evaluate the feasibility of a chemical dermal exposure assessment model for the manufacturing industry in Taiwan. We analyzed six semi-quantitative risk management tools: UK - Control of Substances Hazardous to Health (COSHH), Europe - Risk Assessment of Occupational Dermal Exposure (RISKOFDERM), Netherlands - Dose-Related Effect Assessment Model (DREAM), Netherlands - Stoffenmanager (STOFFEN), Nicaragua - Dermal Exposure Ranking Method (DERM), and USA/Canada - Public Health Engineering Department (PHED). Five types of manufacturing industry were selected for evaluation. Monte Carlo simulation was used to analyze the sensitivity of each factor, and the correlation between the assessment results of each semi-quantitative model and the exposure factors used in the model was analyzed to identify the important evaluation indicators of a dermal exposure assessment model. To assess the effectiveness of the semi-quantitative assessment models, this study also produced quantitative dermal exposure estimates using a prediction model and verified the correlation via Pearson's test. Results show that COSHH could not reveal the strength of its decision factors because the results for all evaluated industries fell into the same risk level. In the DERM model, the transmission process, the exposed area, and the clothing protection factor are all positively correlated. In the STOFFEN model, the fugitive emission, the operation, the near-field and far-field concentrations, and the operating time and frequency are positively correlated. In the DREAM model, there is a positive correlation between skin exposure, relative working time, and working environment. In the RISKOFDERM model, the actual exposure situation and exposure time are positively correlated. We also found high correlation for the DERM and RISKOFDERM models, with correlation coefficients of 0.92 and 0.93 (p<0.05), respectively. The STOFFEN and DREAM models correlate poorly, with coefficients of 0.24 and 0.29 (p>0.05), respectively. According to the results, both the DERM and RISKOFDERM models are suitable for use in the selected manufacturing industries. However, considering the small sample size evaluated in this study, more categories of industries should be evaluated in the future to reduce uncertainty and enhance applicability.
Keywords: dermal exposure, risk management, quantitative estimation, feasibility evaluation
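As an illustration of the validation step described above, here is a minimal sketch of a Pearson correlation check between semi-quantitative model scores and quantitative exposure estimates; all values are hypothetical stand-ins, not the study's data.

```python
from scipy.stats import pearsonr

# Hypothetical stand-in scores; the study's actual data are not reproduced.
semi_scores = {"DERM": [2.1, 3.4, 1.8, 4.0, 2.9],
               "RISKOFDERM": [1.9, 3.6, 1.7, 4.2, 3.0]}
quant_estimate = [0.8, 1.9, 0.6, 2.4, 1.5]   # quantitative estimates, assumed units

for name, scores in semi_scores.items():
    r, p = pearsonr(scores, quant_estimate)   # correlation and its p-value
    print(f"{name}: r = {r:.2f}, p = {p:.3f}")
```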
Procedia PDF Downloads 169
15713 Knowledge Sharing in Virtual Community: Societal Culture Considerations
Authors: Shahnaz Bashir, Abel Usoro, Imran Khan
Abstract:
Hofstede’s culture model is an important model for studying culture across societies. Hofstede collected data worldwide and performed a comprehensive study. His cultural model is widely accepted and has been used to study cross-cultural influences in areas such as cross-cultural psychology, cross-cultural management, information technology, and intercultural communication. This study investigates the societal cultural aspects of knowledge sharing in virtual communities.
Keywords: knowledge management, knowledge sharing, societal culture, virtual communities
Procedia PDF Downloads 405
15712 Economic Analysis of Endogenous Growth Model with ICT Capital
Authors: Shoji Katagiri, Hugang Han
Abstract:
This paper clarifies the role of ICT capital in economic growth. Although ICT contributes remarkably to economic growth, there are few theoretical studies on ICT capital in the ICT sector. In this paper, a production function for ICT, which serves as an intermediate-good input to both the final-good and ICT sectors, is incorporated into our model. In this setting, we analyze the role of ICT on the balanced growth path and show the possibility of general equilibrium solutions for the model. Through simulation of the equilibrium solutions, we find that for ICT to raise economic growth, increases in the efficiency of the ICT sector and in the accumulation of both non-ICT and ICT capital must occur simultaneously.
Keywords: endogenous economic growth, ICT, intensity, capital accumulation
Procedia PDF Downloads 455
15711 Plasma Actuator Application to Control Surfaces of a Model Aircraft
Authors: Yuta Moriyama, Etsuo Morishita
Abstract:
A plasma actuator is very effective for recovering stalled flow over the upper surface of an airfoil. We first manufacture the actuator, test the stability of the device on a trial-and-error basis, and find the conditions for steady operation. We visualize the flow around an airfoil in a smoke tunnel and observe the stall recovery. The plasma actuator is a stationary device with no moving parts, which might make it an ideal device for controlling a model aircraft. We can use the actuator not only as a stall recovery device but also as a spoiler. We place the actuator near the leading edge of the elevator of a model aircraft as a spoiler and measure the aerodynamic forces with a three-component balance. We observe the effect of the plasma actuator on the aerodynamic forces; the device's effectiveness changes depending on whether the angle of attack is positive or negative. We also visualize the flow induced by the plasma actuator with desktop Schlieren photography, which is otherwise very difficult in a low-speed wind tunnel experiment.
Keywords: aerodynamics, plasma actuator, model aircraft, wind tunnel
Procedia PDF Downloads 373
15710 Sorting Maize Haploids from Hybrids Using Single-Kernel Near-Infrared Spectroscopy
Authors: Paul R Armstrong
Abstract:
Doubled haploids (DHs) have become an important breeding tool for creating maize inbred lines, although several bottlenecks in the DH production process limit wider development, application, and adoption of the technique. DH kernels are typically sorted manually and represent about 10% of the seeds in a much larger pool in which the remaining 90% are hybrid siblings. This places time constraints on DH production, and manual sorting is often inaccurate. Automated sorting based on the chemical composition of the kernel can be effective, but devices such as NMR have not achieved the sorting speed needed to be a cost-effective replacement for manual sorting. This study evaluated a single-kernel near-infrared reflectance spectroscopy (skNIR) platform for accurately identifying DH kernels based on oil content. The skNIR platform is a higher-throughput device, approximately 3 seeds/s, that uses spectra to predict the oil content of each kernel from maize crosses intentionally developed to create larger-than-normal oil differences, 1.5%-2%, between DH and hybrid kernels. Spectra from the skNIR were used to construct a partial least squares regression (PLS) model for oil and for a categorical reference value of 1 (DH kernel) or 2 (hybrid kernel), and the models were then used to sort several crosses to evaluate performance. Two approaches were used for sorting. The first used a general PLS model developed from all crosses to predict oil content, which was then used to sort each induction cross; the second developed a specific model from a single induction cross in which approximately fifty DH and one hundred hybrid kernels were used. This second approach used the categorical reference value of 1 or 2, instead of oil content, for the PLS model, and kernels selected for the calibration set were manually referenced by traditional commercial methods using coloration of the tip cap and germ areas. The generalized PLS oil model statistics were R2 = 0.94 and RMSE = 0.93% for kernels spanning an oil content of 2.7% to 19.3%. Sorting with this model extracted 55% to 85% of the haploid kernels from the four induction crosses. The second method, generating a model for each cross, yielded model statistics ranging from R2 = 0.96 to 0.98 and RMSE from 0.08 to 0.10. Sorting in this case resulted in 100% correct classification but required cross-specific models. In summary, the first, generalized oil model could be used to sort a significant number of kernels from a kernel pool but did not approach the accuracy of a sorting model developed from a single cross. The penalty of the second method is that a PLS model must be developed for each individual cross. In conclusion, both methods could find useful application in sorting DH from hybrid kernels.
Keywords: NIR, haploids, maize, sorting
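For readers unfamiliar with PLS-based sorting, the following sketch shows the shape of the second approach: fit a PLS regression to a categorical 1/2 reference and threshold the prediction. The data, component count, and threshold are assumptions for illustration only.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Hypothetical stand-in data: rows are single-kernel NIR spectra; y holds the
# categorical reference value 1 (DH kernel) or 2 (hybrid kernel).
rng = np.random.default_rng(0)
X_train = rng.normal(size=(150, 200))        # 150 kernels x 200 wavelengths
y_train = np.repeat([1.0, 2.0], [50, 100])   # ~50 DH and ~100 hybrid references

pls = PLSRegression(n_components=10)         # component count is an assumption
pls.fit(X_train, y_train)

# Sort new kernels by thresholding the predicted categorical value at 1.5.
X_new = rng.normal(size=(20, 200))
pred = pls.predict(X_new).ravel()
is_haploid = pred < 1.5
```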
Procedia PDF Downloads 302
15709 Nonparametric Truncated Spline Regression Model on the Data of Human Development Index in Indonesia
Authors: Kornelius Ronald Demu, Dewi Retno Sari Saputro, Purnami Widyaningsih
Abstract:
The Human Development Index (HDI) is a standard measure of a country's human development. Several factors may influence it, such as life expectancy, gross domestic product (GDP) based on the province's annual expenditure, the number of poor people, and the percentage of illiterate people. The scatter plots between HDI and the influencing factors do not follow a specific pattern or form. Therefore, the HDI data for Indonesia can be fitted with a nonparametric regression model, in which the estimated regression curve is flexible because it follows the shape of the data pattern. One nonparametric regression method is the truncated spline. Truncated spline regression is a nonparametric approach based on a modification of segmented polynomial functions. The estimator of a truncated spline regression model depends on the selection of optimal knot points, the joining points of the truncated spline functions. The optimal knot points were determined by the minimum value of generalized cross validation (GCV). In this article, a truncated spline nonparametric regression model was applied to the Human Development Index data. The best truncated spline regression model for the HDI data in Indonesia was obtained with the combination of optimal knot points 5-5-5-4. Life expectancy and the percentage of illiterate people were the factors significantly related to HDI in Indonesia. The coefficient of determination is 94.54%, which means the regression model fits the HDI data for Indonesia well.
Keywords: generalized cross validation (GCV), Human Development Index (HDI), knot points, nonparametric regression, truncated spline
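As an illustration of the knot-selection idea, here is a minimal sketch of a linear truncated spline fit with GCV-based selection of a single knot; the data are synthetic stand-ins and the basis is the simplest possible case, not the article's multi-predictor model.

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0, 10, 100))
y = np.where(x < 5, 2 * x, 10 - x) + rng.normal(0, 0.5, 100)   # kinked truth

def design(x, knots):
    """Truncated linear spline basis: 1, x, and (x - k)_+ for each knot."""
    cols = [np.ones_like(x), x] + [np.maximum(x - k, 0.0) for k in knots]
    return np.column_stack(cols)

def gcv(x, y, knots):
    """Generalized cross validation score for a given knot set."""
    X = design(x, knots)
    H = X @ np.linalg.pinv(X.T @ X) @ X.T      # hat matrix
    resid = y - H @ y
    n = len(y)
    return (resid @ resid / n) / (1 - np.trace(H) / n) ** 2

# Pick the knot with the minimum GCV from a candidate grid.
candidates = np.linspace(1, 9, 33)
best_knot = min(candidates, key=lambda k: gcv(x, y, [k]))
```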
Procedia PDF Downloads 339
15708 Relationship between Electricity Consumption and Economic Growth: Evidence from Nigeria (1971-2012)
Authors: N. E Okoligwe, Okezie A. Ihugba
Abstract:
Few scholars disagree that electricity consumption is an important supporting factor for economic growth. However, according to previous studies, the relationship between electricity consumption and economic growth manifests differently in different countries. This paper examines the causal relationship between electricity consumption and economic growth for Nigeria. To do so, the paper tests the validity of the modernization or dependence hypothesis by employing econometric tools such as the Augmented Dickey-Fuller (ADF) and Johansen co-integration tests, the Error Correction Mechanism (ECM), and the Granger causality test on time series data from 1971-2012. Granger causality is found to run neither from electricity consumption to real GDP nor from GDP to electricity consumption during the period of study. The null hypothesis is accepted at the 5 per cent level of significance, as the probability values (0.2251 and 0.8251) exceed it. Both variables are probably determined by other factors, such as growth in the urban population, the unemployment rate, and the number of Nigerians who benefit from increases in GDP; the increase in electricity demand is not determined by the increase in GDP (income) over the period of study because electricity demand has always been greater than consumption. Consequently, policy makers in Nigeria should give priority in the early stages of reconstruction to capacity additions and infrastructure development in the electric power sector, as this would support sustainable economic growth in Nigeria.
Keywords: economic growth, electricity consumption, error correction mechanism, Granger causality test
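As an illustration of the test sequence the abstract names, here is a minimal sketch using statsmodels; the two series are synthetic stand-ins for the 1971-2012 annual data, and the lag choices are assumptions.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import adfuller, grangercausalitytests

# Synthetic stand-ins for log GDP and log electricity consumption, 42 years.
rng = np.random.default_rng(2)
n = 42
gdp = np.cumsum(rng.normal(0.03, 0.02, n))
elec = np.cumsum(rng.normal(0.02, 0.03, n))
df = pd.DataFrame({"elec": elec, "gdp": gdp})

# 1) ADF unit-root tests on the levels; p > 0.05 suggests non-stationarity.
for col in df:
    stat, pval, *_ = adfuller(df[col])
    print(col, "ADF p-value:", round(pval, 3))

# 2) Granger causality: the second column is tested as a cause of the first.
grangercausalitytests(df[["gdp", "elec"]], maxlag=2)   # elec -> gdp?
grangercausalitytests(df[["elec", "gdp"]], maxlag=2)   # gdp -> elec?
```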
Procedia PDF Downloads 309
15707 Numerical Investigation of Geotextile Application in Clay Reinforcement in ABAQUS Software
Authors: Seyed Abolhasan Naeini, Eisa Aliagahei
Abstract:
Today, the use of geosynthetic materials in geotechnical works is increasing significantly. One of the main uses of these materials is to increase the compressive strength of clay reinforced with geotextile layers. In the present study, the effect of reinforcing clay with geotextile layers on its compressive strength has been investigated through modeling in ABAQUS 6.11.3 software. For this purpose, the modified Drucker-Prager cap model was chosen to simulate the stress-strain behavior of the soil layers, and a linear elastic model was used for the geotextile layers. Unreinforced samples and samples reinforced with one, two, and three geotextile layers were modeled in the software. To validate the results, an article in the same field was used, and the numerical modeling results were calibrated against its laboratory results. Based on the results obtained, the software is well suited for this modeling, and the numerical results overlap with the laboratory results to a very acceptable extent; as the number of geotextile layers increases, the error between the laboratory sample and the software model increases. The largest error, 7.3%, occurs for the sample reinforced with three layers of geotextile.
Keywords: ABAQUS, cap model, clay, geotextile layer, reinforced soil
Procedia PDF Downloads 88
15706 Stochastic Richelieu River Flood Modeling and Comparison of Flood Propagation Models: WMS (1D) and SRH (2D)
Authors: Maryam Safrai, Tewfik Mahdi
Abstract:
This article presents the stochastic modeling of the Richelieu River flood in Quebec, Canada, which occurred in the spring of 2011. With the aid of the one-dimensional Watershed Modeling System (WMS, v.10.1) and HEC-RAS (v.4.1) as the flood simulator, the delineation of the probabilistic flooded areas was considered. Based on the Monte Carlo method, WMS (v.10.1) delineated the probabilistic flooded areas with corresponding occurrence percentages. Furthermore, the results of this one-dimensional model were compared with the results of a two-dimensional model (SRH-2D) to evaluate the efficiency and precision of each model. This comparison shows that the computational process of the two-dimensional model is longer and more complicated than that of the brief one-dimensional one. Although two-dimensional models are more accurate than the one-dimensional method, the delineation of probabilistic flooded areas based on the Monte Carlo method is achievable with a one-dimensional modeler. The software applied in this case study fully served the research objectives. As a result, flood risk maps of the Richelieu River produced with the two applied models (1D, 2D) can elucidate the flood risk factors in hydrological, hydraulic, and managerial terms.
Keywords: flood modeling, HEC-RAS, model comparison, Monte Carlo simulation, probabilistic flooded area, SRH-2D, WMS
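Conceptually, the Monte Carlo delineation runs the hydraulic model many times with sampled inputs and reports, per grid cell, the fraction of runs in which the cell floods. The sketch below illustrates that loop; run_flood_model is a purely hypothetical stand-in for a WMS/HEC-RAS or SRH-2D execution, and all distributions are assumed.

```python
import numpy as np

rng = np.random.default_rng(3)
ncells, nruns = 1000, 500
threshold = rng.uniform(2000, 4000, ncells)   # stand-in per-cell flood thresholds

def run_flood_model(peak_discharge, roughness):
    """Placeholder for one hydraulic-model run: boolean inundation mask."""
    return peak_discharge * (1 + roughness) > threshold

flood_count = np.zeros(ncells)
for _ in range(nruns):
    q = rng.normal(3000, 400)            # sampled peak discharge (m^3/s), assumed
    n_manning = rng.uniform(0.02, 0.05)  # sampled Manning roughness, assumed
    flood_count += run_flood_model(q, n_manning)

occurrence_pct = 100 * flood_count / nruns   # probabilistic flooded-area map
```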
Procedia PDF Downloads 140
15705 Performance of the Strong Stability Method in the Univariate Classical Risk Model
Authors: Safia Hocine, Zina Benouaret, Djamil A¨ıssani
Abstract:
In this paper, we study the performance of the strong stability method for the univariate classical risk model. We are interested in the stability bounds established using two approaches: the first is based on the strong stability method developed for general Markov chains, and the second is based on the theory of regenerative processes. By adopting an algorithmic procedure, we study the performance of the stability method in the case of exponentially distributed claim amounts. After presenting the stability bounds numerically and graphically, we interpret and compare the results.
Keywords: Markov chain, regenerative process, risk model, ruin probability, strong stability
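For orientation, a Monte Carlo estimate of the finite-horizon ruin probability in the classical (Cramer-Lundberg) risk model with exponential claims can be sketched as follows. The parameter values are illustrative, not those of the paper, whose stability bounds are analytical rather than simulated.

```python
import numpy as np

rng = np.random.default_rng(4)
u, c = 10.0, 1.2            # initial capital and premium rate (assumed)
lam, mu = 1.0, 1.0          # Poisson claim rate and mean claim size (assumed)
horizon, nsim = 100.0, 5000

ruined = 0
for _ in range(nsim):
    t, claims = 0.0, 0.0
    while True:
        t += rng.exponential(1 / lam)    # time to the next claim
        if t > horizon:
            break
        claims += rng.exponential(mu)    # exponential claim amount
        if u + c * t - claims < 0:       # reserve falls below zero: ruin
            ruined += 1
            break

print("estimated ruin probability:", ruined / nsim)
```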
Procedia PDF Downloads 324
15704 Flow Dynamics of Nanofluids in a Horizontal Cylindrical Annulus Using Nonhomogeneous Dynamic Model
Authors: M. J. Uddin, M. M. Rahman
Abstract:
The transient natural convective flow dynamics of nanofluids in a horizontal concentric annulus are investigated numerically using a nonhomogeneous dynamic model. The simulation is carried out for four different shapes of the inner wall: cylindrical, elliptical, square, and triangular. The outer surface of the annulus is maintained at a constant low temperature, while the inner wall is maintained at a uniform temperature higher than the outer one. The enclosure is permeated by a uniform magnetic field of variable orientation. Brownian motion and thermophoretic deposition of the nanoparticles are taken into account in the model construction. The governing nonlinear momentum, energy, and concentration equations are solved numerically using the Galerkin weighted residual finite element method. To find the best performer, the local Nusselt number is presented for the different shapes of the inner wall, and the heat transfer enhancement for different nanofluids is exhibited for each of the four inner-wall shapes.
Keywords: nanofluids, annulus, nonhomogeneous dynamic model, heat transfer
Procedia PDF Downloads 170
15703 Finite Element Modelling and Analysis of Human Knee Joint
Authors: R. Ranjith Kumar
Abstract:
Computer modeling and simulation of human movement plays an important role in sports and rehabilitation. Accurate modeling and analysis of the human knee joint is complex because of its complicated structure, whose geometry is not easily represented by a solid model. In this project, surface reconstruction is carried out from a series of CT scan images of the human knee joint using 3D Slicer, an open-source software package. From this surface reconstruction, triangular meshes are created on the reconstructed surface using MeshLab (another open-source package). The final triangular mesh model is imported into SolidWorks, a 3D mechanical CAD modeling package, and this CAD model is in turn imported into ABAQUS, a finite element analysis package, for analyzing the knee joint. The results obtained are encouraging and provide an accurate way of modeling and analyzing biological parts without human intervention.
Keywords: SolidWorks, CATIA, Pro-E, CAD
Procedia PDF Downloads 124
15702 Logistic Regression Based Model for Predicting Students’ Academic Performance in Higher Institutions
Authors: Emmanuel Osaze Oshoiribhor, Adetokunbo MacGregor John-Otumu
Abstract:
In recent years, there has been a desire to forecast students' academic achievement prior to graduation, to help them improve their grades, particularly individuals with poor performance. The goal of this study is to employ supervised learning techniques to construct a predictive model of student academic achievement. Many academics have already constructed models that predict student academic achievement from factors such as smoking, demography, culture, social media, parents' educational background, parents' finances, and family background, to name a few. Those features and models may not have correctly classified students in terms of their academic performance. The model in this study is built using a logistic regression classifier with basic features such as the previous semester's course score, class attendance, class participation, and the total number of course materials or resources the student is able to cover per semester, in order to predict whether the student will perform well in future related courses. The model outperformed other classifiers such as Naive Bayes, support vector machine (SVM), decision tree, random forest, and AdaBoost, returning 96.7% accuracy. It is available as a desktop application, allowing both instructors and students to benefit from user-friendly interfaces for predicting student academic achievement. As a result, it is recommended that both students and professors use this tool to better forecast outcomes.
Keywords: artificial intelligence, ML, logistic regression, performance, prediction
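A minimal sketch of the classifier setup described above, using the abstract's four features with synthetic stand-in records (the study's dataset is not reproduced):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Columns: previous course score, attendance, participation, share of course
# materials covered -- all scaled to [0, 1] as an assumption.
rng = np.random.default_rng(5)
X = rng.uniform(0, 1, size=(500, 4))
w = np.array([0.5, 0.2, 0.1, 0.2])                        # toy ground truth
y = (X @ w + rng.normal(0, 0.1, 500) > 0.5).astype(int)   # 1 = performs well

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
clf = LogisticRegression().fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```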
Procedia PDF Downloads 97
15701 A Curricular Approach to Organizational Mentoring Programs: The Integrated Mentoring Curriculum Model
Authors: Christopher Webb
Abstract:
This work presents a new model of mentoring in an organizational environment that has important implications for both practice and research. The model frames the organizational environment as an organizational curriculum, which includes the elements that affect learning within the organization: the organizational structure and culture, roles within the organization, and accessibility of knowledge. The program curriculum includes the elements of the mentoring program itself, including materials, training, and scheduled events for the program participants. The term dyadic curriculum is coined in this work. The dyadic curriculum describes the participation, behavior, and identities of the pairs participating in mentorships, including the identity work of the participants and their views of each other. Much of this curriculum is unprescribed and is unique within each dyad. It describes how participants mediate the elements of the organizational and program curricula. These three curricula interact and affect each other in predictable ways. A detailed example of a mentoring program framed in this model is provided.
Keywords: curriculum, mentoring, organizational learning and development, social learning
Procedia PDF Downloads 202
15700 Biocultural Biographies and Molecular Memories: A Study of Neuroepigenetics and How Trauma Gets under the Skull
Authors: Elsher Lawson-Boyd
Abstract:
In the wake of the Human Genome Project, the life sciences have undergone some fascinating changes. In particular, conventional beliefs about gene expression are being challenged by advances in the postgenomic sciences, especially by the field of epigenetics. Epigenetics is the modification of gene expression without changes in the DNA sequence. In other words, epigenetics dictates that gene expression, the process by which the instructions in DNA are converted into products like proteins, is not solely controlled by DNA itself. Unlike the gene-centric theories of heredity that characterized much of the 20th century (where genes were considered to have almost god-like power to create life), gene expression in epigenetics insists on environmental ‘signals’ or ‘exposures’, a point that radically deviates from gene-centric thinking. Science and Technology Studies (STS) scholars have shown that epigenetic research is having vast implications for the ways in which chronic, non-communicable diseases are conceptualized, treated, and governed. However, to the author’s knowledge, there have not yet been any in-depth sociological engagements with neuroepigenetics that examine how the field is affecting mental health and trauma discourse. In this paper, the author discusses preliminary findings from a doctoral ethnographic study of neuroepigenetics, trauma, and embodiment. Specifically, this study investigates the kinds of causal relations neuroepigenetic researchers are making between experiences of trauma and the development of mental illnesses like complex post-traumatic stress disorder (PTSD), both throughout a human’s lifetime and across generations. Using qualitative interviews and nonparticipant observation, the author focuses on two public-facing research centers based in Melbourne: the Florey Institute of Neuroscience and Mental Health (FNMH) and the Murdoch Children’s Research Institute (MCRI). Preliminary findings indicate that a great deal of ambiguity characterizes this nascent field, particularly when animal-model experiments are employed and the results are translated into human frameworks. Nevertheless, researchers at the FNMH and MCRI strongly suggest that adverse and traumatic life events have a significant effect on gene expression, especially when experienced during early development. Furthermore, they predict that neuroepigenetic research will have substantial implications for the ways in which mental illnesses like complex PTSD are diagnosed and treated. These preliminary findings shed light on why medical and health sociologists have good reason to be chiming in, engaging with, and de-black-boxing ideas emerging from the postgenomic sciences, as they may have significant effects for vulnerable populations not only in Australia but also in developing countries of the Global South.
Keywords: genetics, mental illness, neuroepigenetics, trauma
Procedia PDF Downloads 124
15699 A Stochastic Volatility Model for Optimal Market-Making
Authors: Zubier Arfan, Paul Johnson
Abstract:
The electronification of financial markets and the rise of algorithmic trading have sparked a lot of interest from the mathematical community, for the market-making problem in particular. The research presented in this short paper solves the classic stochastic control problem to derive the strategy for a market-maker. It also shows how to calibrate and simulate the strategy with real limit order book data for back-testing. The ambiguity of limit-order priority in back-testing is dealt with by considering optimistic and pessimistic priority scenarios. The classic model, although it does outperform a naive strategy, assumes constant volatility and is therefore not best suited to the LOB data. The Heston model is introduced to describe the price and variance processes of the asset. The trader's constant absolute risk aversion utility function is optimized by numerically solving a three-dimensional Hamilton-Jacobi-Bellman partial differential equation to find the optimal limit order quotes. The results show that the stochastic volatility market-making model is more suitable for a risk-averse trader and is also less sensitive to calibration error than the constant volatility model.
Keywords: market-making, market microstructure, stochastic volatility, quantitative trading
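As a pointer to the price dynamics involved, here is a minimal Euler discretization of the Heston price/variance process on which such a market-making model is built; the parameter values are illustrative only, and the HJB solver itself is not sketched.

```python
import numpy as np

rng = np.random.default_rng(6)
kappa, theta, xi, rho = 2.0, 0.04, 0.3, -0.5   # mean reversion, long-run
                                               # variance, vol-of-vol, correlation
s, v = 100.0, 0.04                             # initial price and variance
dt, nsteps = 1.0 / 23400, 23400                # one trading day of 1 s steps

for _ in range(nsteps):
    z1 = rng.normal()
    z2 = rho * z1 + np.sqrt(1 - rho**2) * rng.normal()  # correlated shocks
    v = max(v + kappa * (theta - v) * dt + xi * np.sqrt(v * dt) * z2, 0.0)
    s += s * np.sqrt(v * dt) * z1              # zero drift for simplicity
```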
Procedia PDF Downloads 150
15698 Using Lean Six-Sigma in the Improvement of Service Quality at Aviation Industry: Case Study at the Departure Area in KKIA
Authors: Tareq Al Muhareb, Jasper Graham-Jones
Abstract:
Service quality is a significant element of the aviation industry, especially in international airports. In this paper, the researchers built a model based on Lean Six Sigma methodologies and applied it in the departure area at KKIA (King Khalid International Airport) in order to assess service quality there. The model is characterized by many special features that help it overcome the cultural differences in the aviation industry, which are considered the most critical circumstance in this field. Applying the model depends on following the DMAIC procedure, systematized within lean thinking. As a managerial procedure, this Lean Six Sigma model focuses mostly on a change-management culture that requires a high level of planning, organizing, modifying, and controlling in order to build on strengths as well as remove weaknesses.
Keywords: Lean Six Sigma, service quality, aviation industry, KKIA (King Khalid International Airport), SERVQUAL
Procedia PDF Downloads 430
15697 Optimization of Syngas Quality for Fischer-Tropsch Synthesis
Authors: Ali Rabah
Abstract:
This research received no grant or financial support from any public, commercial, or non-governmental agency. The author conducted this work as part of his normal research activities as a professor of Chemical Engineering at the University of Khartoum, Sudan. While fossil oil reserves have been receding, the demand for diesel and gasoline has been growing. In recent years, syngas of biomass origin has been emerging as a viable feedstock for Fischer-Tropsch (FT) synthesis, a process for manufacturing synthetic gasoline and diesel. This paper reports the optimization of syngas quality to match FT synthesis requirements. The optimization model maximizes the thermal efficiency under the constraint H2/CO≥2.0 and operating conditions of equivalence ratio (0 ≤ ER ≤ 1.0), steam-to-biomass ratio (0 ≤ SB ≤ 5), and gasification temperature (500 °C ≤ Tg ≤ 1300 °C). The optimization model is executed using the optimization section of the Model Analysis Tools of the Aspen Plus simulator and is tested on eleven (11) types of MSW. The optimum operating conditions under which the objective function and the constraint are satisfied are ER = 0, SB = 0.66-1.22, and Tg = 679-763 °C. Under the optimum operating conditions, the syngas quality is H2 = 52.38-58.67 mole percent, LHV = 12.55-17.15 MJ/kg, N2 = 0.38-2.33 mole percent, and H2/CO≥2.15. The generalized optimization model reported here could be extended to any other type of biomass and coal.
Keywords: syngas, MSW, optimization, Fischer-Tropsch
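The constrained search can be pictured as below: maximize thermal efficiency over (ER, SB, Tg) subject to H2/CO ≥ 2. The gasifier function here is a toy stand-in for the Aspen Plus model evaluation, with made-up response surfaces; only the structure of the optimization is illustrative.

```python
from scipy.optimize import NonlinearConstraint, minimize

def gasifier(x):
    """Toy stand-in for the Aspen Plus model: returns (efficiency, H2/CO)."""
    er, sb, tg = x
    eff = 0.8 - 0.3 * er - 0.05 * (sb - 1.0) ** 2 - 1e-7 * (tg - 720.0) ** 2
    h2_co = 1.5 + sb - 2.0 * er
    return eff, h2_co

res = minimize(
    lambda x: -gasifier(x)[0],                 # maximize thermal efficiency
    x0=[0.2, 1.0, 800.0],                      # initial (ER, SB, Tg)
    bounds=[(0, 1), (0, 5), (500, 1300)],      # the abstract's operating ranges
    constraints=NonlinearConstraint(lambda x: gasifier(x)[1], 2.0, float("inf")),
)
print("optimal ER, SB, Tg:", res.x)
```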
Procedia PDF Downloads 80
15696 Extension of a Competitive Location Model Considering a Given Number of Servers and Proposing a Heuristic for Solving
Authors: Mehdi Seifbarghy, Zahra Nasiri
Abstract:
The competitive location problem deals with locating new facilities to provide a service (or goods) to the customers of a given geographical area where other facilities (competitors) offering the same service are already present. The new facilities have to compete with the existing facilities to capture market share. This paper proposes a new model for maximizing market share in which customers choose facilities based on traveling time, waiting time, and attractiveness. The attractiveness of a facility is treated as a parameter in the model, and a heuristic is proposed to solve the problem.
Keywords: competitive location, market share, facility attractiveness, heuristic
Procedia PDF Downloads 523
15695 Research on Air Pollution Spatiotemporal Forecast Model Based on LSTM
Authors: JingWei Yu, Hong Yang Yu
Abstract:
At present, increasingly serious air pollution in many cities of China has made people pay more attention to the air quality index (hereinafter referred to as AQI) of their living areas. In this situation, it is of great significance to predict air pollution in heavily polluted areas. In this paper, a spatiotemporal prediction model of PM2.5 concentration in Mianyang, Sichuan Province, is established based on the LSTM time series model. The model fully considers the temporal variability and spatial distribution characteristics of PM2.5 concentration. The spatial correlation of air quality across locations is captured by using the air quality status of nearby monitoring stations, including AQI and meteorological data, to predict the air quality of a given monitoring station. The experimental results show that the method has good prediction accuracy: the fit with the actual measured data reaches more than 0.7, so the model can be applied to the modeling and prediction of the spatiotemporal distribution of regional PM2.5 concentration.
Keywords: LSTM, PM2.5, neural networks, spatio-temporal prediction
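A minimal sketch of the kind of LSTM described, mapping a window of features from the target station and nearby stations to the next-hour PM2.5; the window length, feature count, and layer sizes are assumptions, not the paper's configuration.

```python
import numpy as np
from tensorflow import keras

timesteps, nfeatures = 24, 8   # 24-hour window; PM2.5/AQI/weather features
                               # from the target and nearby stations (assumed)
model = keras.Sequential([
    keras.layers.Input(shape=(timesteps, nfeatures)),
    keras.layers.LSTM(64),
    keras.layers.Dense(1),     # predicted next-hour PM2.5 concentration
])
model.compile(optimizer="adam", loss="mse")

# Synthetic stand-in training data in place of the Mianyang records.
X = np.random.rand(1000, timesteps, nfeatures).astype("float32")
y = np.random.rand(1000, 1).astype("float32")
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
```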
Procedia PDF Downloads 134
15694 Learning a Bayesian Network for Situation-Aware Smart Home Service: A Case Study with a Robot Vacuum Cleaner
Authors: Eu Tteum Ha, Seyoung Kim, Jeongmin Kim, Kwang Ryel Ryu
Abstract:
A smart home environment backed by IoT (Internet of Things) technologies enables intelligent services based on awareness of the situation a user is currently in. One convenient sensor for recognizing situations within a home is the smart meter, which can monitor the status of each electrical appliance in real time. This paper aims at learning a Bayesian network that models the causal relationship between user situations and the status of the electrical appliances. Using such a network, we can infer the current situation from the observed status of the appliances. However, learning the conditional probability tables (CPTs) of the network requires many training examples, which cannot be obtained unless the user's situations are closely monitored by some means. This paper proposes a method for learning the CPT entries of the network relying only on user feedback generated occasionally. In our case study with a robot vacuum cleaner, feedback comes in whenever the user gives the robot an order that deviates from its preprogrammed setting. Given a network with randomly initialized CPT entries, our proposed method uses this feedback information to adjust the relevant CPT entries in the direction of increasing the probability of recognizing the desired situations. Simulation experiments show that our method can rapidly improve the recognition performance of the Bayesian network using a relatively small number of feedbacks.
Keywords: Bayesian network, IoT, learning, situation-awareness, smart home
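A toy sketch of the feedback-driven update idea follows, with the situation as the parent of binary appliance states in a naive-Bayes-style network; the nudging rule shown is an illustrative assumption, not the paper's exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(7)
nsit, napp = 4, 5                       # situations and appliances (assumed)
prior = np.full(nsit, 1 / nsit)
cpt = rng.uniform(0.2, 0.8, size=(nsit, napp))   # P(appliance on | situation)

def infer(evidence):
    """Posterior over situations given on/off evidence for all appliances."""
    like = np.prod(np.where(evidence, cpt, 1 - cpt), axis=1)
    post = prior * like
    return post / post.sum()

def feedback_update(evidence, true_sit, lr=0.1):
    """Nudge the true situation's CPT row toward the observed evidence."""
    cpt[true_sit] += lr * (evidence - cpt[true_sit])

evidence = rng.random(napp) < 0.5       # current smart-meter reading (toy)
if infer(evidence).argmax() != 2:       # user feedback says situation is 2
    feedback_update(evidence.astype(float), true_sit=2)
```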
Procedia PDF Downloads 523
15693 Earnings Volatility and Earnings Predictability
Authors: Yosra Ben Mhamed
Abstract:
Most previous research investigating the importance of earnings volatility for a firm's value has focused on the effects of earnings volatility on the cost of capital. Many studies illustrate that earnings volatility can reduce a firm's value by raising the cost of capital. However, a few recent studies directly examine the relation between earnings volatility and subsequent earnings levels. In our study, we further explore the role of volatility in forecasting. Our study makes two primary contributions to the literature. First, taking into account the level of the firm's current performance, we provide a causal theory for the link between volatility and earnings predictability, whereas previous studies testing the linearity of this relationship have not mentioned any underlying theory. Second, our study contributes to the vast body of fundamental analysis research that identifies a set of variables that improve valuation, by showing that earnings volatility affects the estimation of future earnings. Projections of earnings are used in valuation research and practice to derive estimates of firm value. Since we want to examine the impact of volatility on earnings predictability, we sort the sample into three portfolios in ascending order of earnings volatility and present the predictability coefficient for each. In a second test, each of these portfolios is then sorted into three further portfolios based on the level of current earnings, yielding nine portfolios. We can thus observe whether volatility strongly predicts decreases in earnings predictability only for the highest level of earnings. In general, we find that earnings volatility has an inverse relationship with earnings predictability. Our results also show that the sensitivity of earnings predictability to ex-ante volatility is more pronounced among profitable firms. The findings are most consistent with overinvestment and persistence explanations.
Keywords: earnings volatility, earnings predictability, earnings persistence, current profitability
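The double sort can be sketched as below: bin firms by volatility, then by current earnings within each bin, and estimate a persistence slope per bin; the data-generating process is synthetic and built only to illustrate the mechanics.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(9)
n = 3000
vol = rng.uniform(0.01, 0.5, n)                   # earnings volatility
e_t = rng.normal(0.1, 0.05, n)                    # current earnings
# Toy DGP in which persistence falls as volatility rises.
e_t1 = 0.1 + (0.9 - vol) * (e_t - 0.1) + rng.normal(0, 0.02, n)

df = pd.DataFrame({"vol": vol, "e_t": e_t, "e_t1": e_t1})
df["vol_bin"] = pd.qcut(df["vol"], 3, labels=["low", "mid", "high"])
df["earn_bin"] = df.groupby("vol_bin", observed=True)["e_t"].transform(
    lambda s: pd.qcut(s, 3, labels=False))

# Slope of e_{t+1} on e_t (persistence) within each of the nine portfolios.
slopes = df.groupby(["vol_bin", "earn_bin"], observed=True).apply(
    lambda g: np.polyfit(g["e_t"], g["e_t1"], 1)[0])
print(slopes)
```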
Procedia PDF Downloads 433
15692 Evaluation of CERES Wheat and Rice Model for Climatic Conditions in Haryana, India
Authors: Mamta Rana, K. K. Singh, Nisha Kumari
Abstract:
Simulation models, with their interacting soil-weather-plant-atmosphere systems, are important tools for assessing crops under changing climate conditions. The CERES-Wheat and CERES-Rice v.4.6 models in DSSAT were calibrated and evaluated for Haryana, India, one of the country's major producers of wheat and rice. The simulation runs were made under irrigated conditions and three N-P-K fertilizer application doses to estimate crop yield and other growth parameters along with the phenological development of the crops. The genetic coefficients were derived by iteratively manipulating the relevant coefficients that characterize the phenological processes of the wheat and rice crops until the best match was obtained between the simulated and observed anthesis, physiological maturity, and final grain yield. The model was validated by plotting the simulated LAI against remote-sensing-derived LAI; the LAI product from remote sensing provides spatially explicit, timely, and accurate crop assessment. For validating yield and yield components, the error percentage between the observed and simulated data was calculated. The analysis shows that the model can be used to simulate crop yield and yield components for wheat and rice cultivars under different management practices. During validation, the error percentage was less than 10%, indicating the utility of the calibrated model for climate risk assessment in the selected region.
Keywords: simulation model, CERES-Wheat and Rice model, crop yield, genetic coefficient
Procedia PDF Downloads 305
15691 Single Ended Primary Inductance Converter with Internal Model Controller
Authors: Fatih Suleyman Taskincan, Ahmet Karaarslan
Abstract:
In this article, the study and analysis of a Single Ended Primary Inductance Converter (SEPIC) are presented for battery charging in military applications. This kind of converter is used for its advantage of non-inverted output polarity. As the capacitors charge and discharge through an inductance, peak currents do not occur in the capacitors; therefore, the efficiency is high compared to buck-boost converters. In this study, the SEPIC converter is designed to operate with an Internal Model Controller (IMC). Traditional controllers like the proportional-integral controller are not preferred because of their linear behavior; hence, an IMC is designed for this converter. This controller is model-based and provides more robustness and better set-point tracking. Moreover, it can be used for an unstable process whose dynamic operation a conventional controller cannot handle. The Matlab/Simulink environment is used to simulate the converter and its controller, and the results are shown and discussed.
Keywords: DC/DC converter, single ended primary inductance converter, SEPIC, internal model controller, IMC, switched mode power supply
Procedia PDF Downloads 629
15690 VISMA: A Method for System Analysis in Early Lifecycle Phases
Authors: Walter Sebron, Hans Tschürtz, Peter Krebs
Abstract:
The choice of applicable analysis methods in safety or systems engineering depends on the depth of knowledge about a system and on the respective lifecycle phase. However, the chain of analysis methods still shows gaps, as it should support system analysis throughout the lifecycle of a system, from a rough concept in the pre-project phase until end-of-life. This paper's goal is to discuss an analysis method, the VISSE Shell Model Analysis (VISMA) method, which aims at closing the gap in the early system lifecycle phases, like the conceptual, pre-project, or project start phase. It was originally developed to aid in the definition of the system boundary of electronic system parts, e.g., a control unit for a pump motor; it can also be applied to non-electronic system parts. The VISMA method is a graphical, sketch-like method that stratifies a system and its parts in inner and outer shells, like the layers of an onion. It analyses a system in a two-step approach, from the innermost to the outermost components, followed by the reverse direction. To ensure a complete view of a system and its environment, the VISMA should be performed by (multifunctional) development teams. To introduce the method, a set of rules and guidelines has been defined to enable a proper shell build-up. In the first step, the innermost system, named the system under consideration (SUC), is selected as the focus of the subsequent analysis. Then, its directly adjacent components, responsible for providing input to and receiving output from the SUC, are identified. These components are the content of the first shell around the SUC. Next, the input and output components of the components in the first shell are identified and form the second shell around the first one. Continuing this way, shell after shell is added with its respective parts until the border of the complete system (external border) is reached. Last, two external shells are added to complete the system view: the environment shell and the use case shell. This system view is also stored for future use. In the second step, the shells are examined in the reverse direction (outside to inside) in order to remove superfluous components or subsystems. Input chains to the SUC, as well as output chains from the SUC, are described graphically via arrows to highlight functional chains through the system. As a result, this method offers a clear graphical description and overview of a system, its main parts, and its environment, while the focus remains on a specific SUC. It helps to identify the interfaces and interfacing components of the SUC, as well as important external interfaces of the overall system. It supports the identification of the first internal and external hazard causes and causal chains. Additionally, the method promotes a holistic picture and cross-functional understanding of a system, its contributing parts, internal relationships, and possible dangers within a multidisciplinary development team.
Keywords: analysis methods, functional safety, hazard identification, system and safety engineering, system boundary definition, system safety
Procedia PDF Downloads 224
15689 Study of Inhibition of the End Effect Based on AR Model Prediction of Combined Data Extension and Window Function
Authors: Pan Hongxia, Wang Zhenhua
Abstract:
In this paper, the end effect that arises in EMD decomposition is effectively suppressed by combining AR-model-based data extension (prediction of the signal's continuation) with the window function method. Simulation on a synthetic signal shows that the combined method achieves the desired effect, and applying it to gearbox test data also works well, improving the calculation accuracy of the subsequent data processing. This lays a good foundation for gearbox fault diagnosis under various working conditions.
Keywords: gearbox, fault diagnosis, AR model, end effect
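A minimal sketch of the combined suppression scheme: extend the record at both ends with AR-model forecasts before EMD, and taper with a window so residual boundary error is attenuated. The lag order, extension length, and window choice are assumptions.

```python
import numpy as np
from scipy.signal import windows
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(8)
t = np.linspace(0, 1, 1024)
x = np.sin(2 * np.pi * 30 * t) + 0.5 * np.sin(2 * np.pi * 7 * t)  # test signal

n_ext = 64
fwd = AutoReg(x, lags=20).fit().forecast(n_ext)         # forward extension
bwd = AutoReg(x[::-1], lags=20).fit().forecast(n_ext)   # backward (on reversed x)
x_ext = np.concatenate([bwd[::-1], x, fwd])

w = windows.tukey(len(x_ext), alpha=0.1)   # taper only the extended ends
x_win = x_ext * w
# ... run EMD on x_win, then discard the extended segments of each IMF.
```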
Procedia PDF Downloads 366
15688 Media Planning Decisions and Preferences through a Goal Programming Model: An Application to a Media Campaign for a Mature Product in Italy
Authors: Cinzia Colapinto, Davide La Torre
Abstract:
Goal Programming (GP) and its variants have been applied to marketing and to specific marketing issues, such as media scheduling problems, over the last decades. The concept of satisfaction functions has been widely utilized in GP models to explicitly integrate the decision-maker's preferences, which can be guided by the information available about the decision-making situation. A GP model with satisfaction functions for media planning decisions is proposed and then illustrated through a case study of a marketing/media campaign for a mature product in the Italian market.
Keywords: goal programming, satisfaction functions, media planning, tourism management
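For orientation, a plain weighted goal programming formulation (deviation variables penalized linearly) can be written as a small LP; in the paper's variant, satisfaction functions would replace these linear penalties. All coefficients below (audiences, costs, targets, weights) are hypothetical.

```python
from scipy.optimize import linprog

# Decision variables: insertions x1, x2 in two media, plus deviations
# (d1-, d1+) for the reach goal and (d2-, d2+) for the budget goal.
audience = [500, 300]          # reach per insertion (thousands), assumed
cost = [20, 8]                 # cost per insertion (thousand euros), assumed
reach_goal, budget_goal = 6000, 200

# Goal constraints: a.x + d- - d+ = target, with variable order
# [x1, x2, d1-, d1+, d2-, d2+].
A_eq = [audience + [1, -1, 0, 0],
        cost + [0, 0, 1, -1]]
b_eq = [reach_goal, budget_goal]
c = [0, 0, 1.0, 0.0, 0.0, 1.5]   # penalize only the reach shortfall d1-
                                 # and the budget overrun d2+
res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * 6)
print("insertions per medium:", res.x[:2])
```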
Procedia PDF Downloads 399
15687 Measurement Tools of the Maturity Model for IT Service Outsourcing in Higher Education Institutions
Authors: Victoriano Valencia García, Luis Usero Aragonés, Eugenio J. Fernández Vicente
Abstract:
Nowadays, the successful implementation of ICTs is vital for almost any kind of organization. Good governance and ICT management are essential for delivering value, managing technological risks, managing resources, and measuring performance. In addition, outsourcing is a strategic IT service solution that complements the IT services provided internally in organizations. This paper proposes the measurement tools of a new holistic maturity model based on the standards ISO/IEC 20000 and ISO/IEC 38500 and on the frameworks and best practices of ITIL and COBIT, with a specific focus on IT outsourcing. These measurement tools allow independent validation and practical application in the field of higher education, using a questionnaire, metrics tables, and continuous improvement plan tables as part of the measurement process. Guidelines and standards are proposed in the model to facilitate adaptation to universities and to achieve excellence in the outsourcing of IT services.
Keywords: IT governance, IT management, IT services, outsourcing, maturity model, measurement tools
Procedia PDF Downloads 592
15686 Variability Management of Contextual Feature Model in Multi-Software Product Line
Authors: Muhammad Fezan Afzal, Asad Abbas, Imran Khan, Salma Imtiaz
Abstract:
The Software Product Line (SPL) paradigm is used for the development of families of software products that share common and variable features. The feature model is a domain artifact of SPL that consists of common and variable features with predefined relationships and constraints. Multiple SPLs may contain a number of similar common and variable features, as in mobile phones and tablets. Reusing common and variable features across different SPL domains is a complex task due to the external relationships and constraints between features in the feature model. To increase the reusability of feature model resources from domain engineering, the commonality of features needs to be managed at the level of SPL application development. In this research, we propose an approach that combines multiple SPLs into a single domain and converts them into a common feature model. Extracting the common features from different feature models is more effective and reduces the cost and time to market of application development. For extracting features from multiple SPLs, the proposed framework consists of three steps: 1) find the variation points, 2) find the constraints, and 3) combine the feature models into a single feature model on the basis of the variation points and constraints, as sketched below. Using this approach, the reusability of features from multiple feature models can be increased. The impact of this research is to reduce development cost and time to market and to increase the number of SPL products.
Keywords: software product line, feature model, variability management, multi-SPLs
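A toy sketch of the three-step merge referenced above: features common to every SPL become mandatory, the rest become variation points, and constraints whose features survive are pooled. The data structures are illustrative only.

```python
# Step 0: feature sets of two SPLs and their cross-tree constraints (toy data).
spl_features = {
    "phone": {"call", "camera", "gps", "touchscreen"},
    "tablet": {"camera", "gps", "touchscreen", "stylus"},
}
constraints = {("camera", "requires", "touchscreen")}

# Step 1: variation points are the features not shared by every SPL.
common = set.intersection(*spl_features.values())
variation_points = set.union(*spl_features.values()) - common

# Step 2: keep constraints whose features appear in the combined model.
all_features = common | variation_points
merged_constraints = {c for c in constraints
                      if c[0] in all_features and c[2] in all_features}

# Step 3: the combined feature model.
merged_model = {"mandatory": common, "optional": variation_points,
                "constraints": merged_constraints}
print(merged_model)
```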
Procedia PDF Downloads 69