Search results for: statistical models
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9784

9694 E-Consumers’ Attribute Non-Attendance Switching Behavior: Effect of Providing Information on Attributes

Authors: Leonard Maaya, Michel Meulders, Martina Vandebroek

Abstract:

Discrete Choice Experiments (DCE) are used to investigate how product attributes affect decision-makers’ choices. In DCEs, choice situations consisting of several alternatives are presented from which choice-makers select the preferred alternative. Standard multinomial logit models based on random utility theory can be used to estimate the utilities for the attributes. The overarching principle in these models is that respondents understand and use all the attributes when making choices. However, studies suggest that respondents sometimes ignore some attributes (commonly referred to as Attribute Non-Attendance/ANA). The choice modeling literature presents ANA as a static process, i.e., respondents’ ANA behavior does not change throughout the experiment. However, respondents may ignore attributes due to changing factors like availability of information on attributes, learning/fatigue in experiments, etc. We develop a dynamic mixture latent Markov model to model changes in ANA when information on attributes is provided. The model is illustrated on e-consumers’ webshop choices. The results indicate that the dynamic ANA model describes the behavioral changes better than modeling the impact of information using changes in parameters. Further, we find that providing information on attributes leads to an increase in the attendance probabilities for the investigated attributes.
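
As an illustration of the random-utility baseline mentioned above, the following is a minimal sketch of fitting a standard multinomial (conditional) logit on synthetic choice data; it does not reproduce the authors' dynamic mixture latent Markov extension, and the attribute levels, sample sizes and the "ignored" fourth attribute are assumptions for illustration only.

```python
# Minimal conditional logit sketch on synthetic DCE data (not the webshop dataset).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n_sets, n_alts, n_attr = 300, 3, 4
X = rng.normal(size=(n_sets, n_alts, n_attr))          # attribute levels per alternative
beta_true = np.array([1.0, -0.8, 0.5, 0.0])            # 4th attribute "not attended"
util = X @ beta_true + rng.gumbel(size=(n_sets, n_alts))
choice = util.argmax(axis=1)                           # chosen alternative in each set

def neg_loglik(beta):
    v = X @ beta                                       # deterministic utilities
    v -= v.max(axis=1, keepdims=True)                  # numerical stability
    logp = v - np.log(np.exp(v).sum(axis=1, keepdims=True))
    return -logp[np.arange(n_sets), choice].sum()

res = minimize(neg_loglik, np.zeros(n_attr), method="BFGS")
print("estimated attribute utilities:", np.round(res.x, 2))
```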

Keywords: choice models, discrete choice experiments, dynamic models, e-commerce, statistical modeling

Procedia PDF Downloads 108
9693 Analysis of Operating Speed on Four-Lane Divided Highways under Mixed Traffic Conditions

Authors: Chaitanya Varma, Arpan Mehar

Abstract:

The present study demonstrates a procedure to analyse speed data collected on various four-lane divided sections in India. Field data for the study were collected at different straight and curved sections on rural highways with the help of a radar speed gun and a video camera. The data collected at the sections were analysed, and parameters pertaining to the speed distributions were estimated. Different statistical distributions were fitted to the vehicle-type speed data and to the mixed traffic speed data. It was found that vehicle-type speed data follow either the normal or the log-normal distribution, whereas the mixed traffic speed data follow more than one type of statistical distribution. The most common fits observed for mixed traffic speed data were the beta and Weibull distributions. Separate operating speed models based on traffic and roadway geometric parameters were proposed in the present study. Operating speed models with traffic parameters and curve geometry parameters were established. Two different operating speed models were proposed with the variables 1/R and ln(R) and were found to be realistic over different ranges of curve radius. The models developed in the present study are simple and realistic and can be used for forecasting operating speed on four-lane highways.
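
A minimal sketch of the distribution-fitting step, assuming spot speeds in km/h, is shown below; the speed sample is synthetic, standing in for the radar-gun survey data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
speeds = rng.weibull(4.0, 500) * 80 + 20      # hypothetical mixed-traffic spot speeds (km/h)

candidates = {"normal": stats.norm, "lognormal": stats.lognorm,
              "beta": stats.beta, "weibull": stats.weibull_min}
for name, dist in candidates.items():
    params = dist.fit(speeds)                  # maximum-likelihood fit
    ks_stat, p_value = stats.kstest(speeds, dist.cdf, args=params)
    print(f"{name:10s}  KS={ks_stat:.3f}  p={p_value:.3f}")
```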

Keywords: highway, mixed traffic flow, modeling, operating speed

Procedia PDF Downloads 433
9692 Statistical Convergence of the Szasz-Mirakjan-Kantorovich-Type Operators

Authors: Rishikesh Yadav, Ramakanta Meher, Vishnu Narayan Mishra

Abstract:

The main aim of this article is to investigate the statistical convergence of the summation-integral type operators and to obtain the weighted statistical convergence. The rate of statistical convergence by means of the modulus of continuity and for functions belonging to the Lipschitz class is also studied. We discuss the convergence of the defined operators by graphical representation and obtain a better rate of convergence than the Szasz-Mirakjan-Kantorovich operators. In the last section, we extend the said operators to bivariate operators to study the rate of convergence in the sense of the modulus of continuity and by means of the Lipschitz class, using functions of two variables.
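
For orientation, the following numerical sketch evaluates the classical Szasz-Mirakjan-Kantorovich operator (not the authors' modified operator) on a test function and shows the pointwise error shrinking as n grows; the test function and evaluation point are arbitrary choices.

```python
import numpy as np
from scipy.stats import poisson

f = lambda t: np.exp(-t) * np.sin(2 * t)       # hypothetical test function
x = 0.7                                        # evaluation point

def kantorovich(f, x, n):
    k_max = int(n * x + 10 * np.sqrt(n * x) + 50)   # covers essentially all Poisson mass
    k = np.arange(k_max)
    weights = poisson.pmf(k, n * x)                 # e^{-nx} (nx)^k / k!
    t = np.linspace(k / n, (k + 1) / n, 50)         # fine grid on each [k/n, (k+1)/n]
    means = n * np.trapz(f(t), t, axis=0)           # n * integral of f over the interval
    return np.sum(weights * means)

for n in (10, 50, 250, 1000):
    print(n, abs(kantorovich(f, x, n) - f(x)))
```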

Keywords: The Szasz-Mirakjan-Kantorovich operators, statistical convergence, modulus of continuity, Peetre's K-functional, weighted modulus of continuity

Procedia PDF Downloads 170
9691 A Study of Population Growth Models and Future Population of India

Authors: Sheena K. J., Jyoti Badge, Sayed Mohammed Zeeshan

Abstract:

India is the second most populous country in the world, just behind China, and is expected to take first place by next year. The Indian population has grown at a remarkably higher rate than that of other countries over the past 20 years. Many scientists and demographers have formulated various models of population growth in order to study and predict the future population. Some of these models are the Fibonacci population growth model, the exponential growth model, the logistic growth model, the Lotka-Volterra model, etc. These models have been effective in the past, to an extent, in predicting the population. However, a detailed comparative study between the population models is essential to arrive at a more accurate one. This research study therefore analyses and compares the two population models under consideration, the exponential and logistic growth models, thereby identifying the more effective one. Using the census data of 2011, the approximate populations for 2016 to 2031 were calculated for 20 Indian states using both models and compared with the actual population figures. On comparing the results of both models, it is found that the logistic population model is more accurate than the exponential model, and using this model, we can predict the future population in a more effective way. This will give researchers insight into the effective models of population growth and how effective these population models are in predicting the future population.
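
A hedged sketch of fitting the two candidate curves with scipy is given below; the population figures are made up for illustration, whereas the study itself uses 2011 census data for 20 Indian states.

```python
import numpy as np
from scipy.optimize import curve_fit

years = np.array([1991, 2001, 2011, 2016, 2021])
pop = np.array([84.6, 101.4, 112.4, 118.0, 123.1])      # hypothetical state population, millions
t = years - years[0]

def exponential(t, p0, r):
    return p0 * np.exp(r * t)

def logistic(t, K, p0, r):
    return K / (1 + (K / p0 - 1) * np.exp(-r * t))

(p0_e, r_e), _ = curve_fit(exponential, t, pop, p0=[80, 0.02])
(K, p0_l, r_l), _ = curve_fit(logistic, t, pop, p0=[200, 80, 0.05], maxfev=10000)

for horizon in (2026, 2031):
    tt = horizon - years[0]
    print(horizon, round(exponential(tt, p0_e, r_e), 1), round(logistic(tt, K, p0_l, r_l), 1))
```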

Keywords: population growth, population models, exponential model, logistic model, fibonacci model, lotka-volterra model, future population prediction, demographers

Procedia PDF Downloads 90
9690 Comparison of Solar Radiation Models

Authors: O. Behar, A. Khellaf, K. Mohammedi, S. Ait Kaci

Abstract:

Up to now, most validation studies have been based on the MBE and RMSE and have therefore focused only on long- and short-term performance to test and classify solar radiation models. This traditional analysis does not take into account the quality of modeling and linearity. In our analysis, we have tested 22 solar radiation models that are capable of providing instantaneous direct and global radiation at any given location worldwide. We introduce a new indicator, which we have named the Global Accuracy Indicator (GAI), to examine the linear relationship between the measured and predicted values and the quality of modeling, in addition to long- and short-term performance. Note that the quality of the model is represented by the t-statistical test, the model linearity is given by the correlation coefficient, and the long- and short-term performance are given by the MBE and RMSE, respectively. An important finding of this research is that the use of the GAI allows avoiding the faulty validation that can occur with the traditional methodology, which might result in erroneous predictions of solar power conversion system performance.
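
The component statistics behind such a validation can be sketched as follows; the measured/predicted values are synthetic, the t-statistic shown is Stone's t-statistic commonly used in this literature, and the GAI itself combines these components according to the authors' paper, so its formula is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(2)
measured = rng.uniform(200, 900, 150)                 # hypothetical irradiance, W/m^2
predicted = measured * 0.97 + rng.normal(0, 30, 150)  # one candidate model's output

n = len(measured)
diff = predicted - measured
mbe = diff.mean()
rmse = np.sqrt((diff ** 2).mean())
r = np.corrcoef(measured, predicted)[0, 1]
t_stat = np.sqrt((n - 1) * mbe ** 2 / (rmse ** 2 - mbe ** 2))  # Stone's t-statistic

print(f"MBE={mbe:.1f}  RMSE={rmse:.1f}  r={r:.3f}  t={t_stat:.2f}")
```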

Keywords: solar radiation model, parametric model, performance analysis, Global Accuracy Indicator (GAI)

Procedia PDF Downloads 324
9689 Quantitative Structure–Activity Relationship Analysis of Some Benzimidazole Derivatives by Linear Multivariate Method

Authors: Strahinja Z. Kovačević, Lidija R. Jevrić, Sanja O. Podunavac Kuzmanović

Abstract:

The relationship between the antibacterial activity of eighteen differently substituted benzimidazole derivatives and their molecular characteristics was studied using a chemometric QSAR (Quantitative Structure–Activity Relationship) approach. The QSAR analysis was carried out on inhibitory activity towards Staphylococcus aureus, using molecular descriptors as well as the minimum inhibitory concentration (MIC). The molecular descriptors were calculated from the optimized structures. Principal component analysis (PCA) followed by hierarchical cluster analysis (HCA) and multiple linear regression (MLR) was performed in order to select the molecular descriptors that best describe the antibacterial behavior of the investigated compounds and to determine the similarities between molecules. The HCA grouped the molecules into separate clusters of similar inhibitory activity. PCA showed a classification of molecules very similar to the HCA and displayed which descriptors contribute to that classification. MLR equations that represent MIC as a function of the in silico molecular descriptors were established. The statistical significance of the estimated models was confirmed by standard statistical measures and cross-validation parameters (SD = 0.0816, F = 46.27, R = 0.9791, R²CV = 0.8266, R²adj = 0.9379, PRESS = 0.1116). These parameters indicate the possibility of applying the established chemometric models in the prediction of the antibacterial behaviour of the studied derivatives and of structurally very similar compounds.
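
A minimal sketch of this chemometric workflow (PCA, clustering, MLR with leave-one-out cross-validation) using scikit-learn is shown below; the descriptor matrix, selected descriptor indices and activity values are placeholders, not the paper's data.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import AgglomerativeClustering
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(3)
X = rng.normal(size=(18, 5))                   # 18 derivatives x 5 descriptors (hypothetical)
log_mic = 1.5 + 0.8 * X[:, 0] - 0.4 * X[:, 2] + rng.normal(0, 0.1, 18)

scores = PCA(n_components=2).fit_transform(X)                      # PCA score coordinates
clusters = AgglomerativeClustering(n_clusters=3).fit_predict(X)    # HCA analogue

mlr = LinearRegression().fit(X[:, [0, 2]], log_mic)                # MLR on "selected" descriptors
pred_cv = cross_val_predict(LinearRegression(), X[:, [0, 2]], log_mic, cv=LeaveOneOut())
press = ((log_mic - pred_cv) ** 2).sum()
r2_cv = 1 - press / ((log_mic - log_mic.mean()) ** 2).sum()
print("R2 =", round(mlr.score(X[:, [0, 2]], log_mic), 3), " R2cv =", round(r2_cv, 3))
```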

Keywords: antibacterial, benzimidazole, molecular descriptors, QSAR

Procedia PDF Downloads 336
9688 Different Data-Driven Bivariate Statistical Approaches to Landslide Susceptibility Mapping (Uzundere, Erzurum, Turkey)

Authors: Azimollah Aleshzadeh, Enver Vural Yavuz

Abstract:

The main goal of this study is to produce landslide susceptibility maps using different data-driven bivariate statistical approaches, namely the entropy weight method (EWM), evidence belief function (EBF), and information content model (ICM), at Uzundere county, Erzurum province, in the north-eastern part of Turkey. Past landslide occurrences were identified and mapped from an interpretation of high-resolution satellite images and earlier reports, as well as by carrying out field surveys. In total, 42 landslide incidence polygons were mapped using ArcGIS 10.4.1 software and randomly split into a construction dataset of 70% (30 landslide incidences) for building the EWM, EBF, and ICM models, while the remaining 30% (12 landslide incidences) were used for verification purposes. Twelve layers of landslide-predisposing parameters were prepared, including total surface radiation, maximum relief, soil groups, standard curvature, distance to stream/river sites, distance to the road network, surface roughness, land use pattern, engineering geological rock group, topographical elevation, orientation of slope, and terrain slope gradient. The relationships between the landslide-predisposing parameters and the landslide inventory map were determined using the different statistical models (EWM, EBF, and ICM). The model results were validated with landslide incidences which were not used during model construction. In addition, receiver operating characteristic curves were applied, and the area under the curve (AUC) was determined for the different susceptibility maps using the success (construction data) and prediction (verification data) rate curves. The results revealed that the AUC values for the success rate are 0.7055, 0.7221, and 0.7368, while those for the prediction rate are 0.6811, 0.6997, and 0.7105 for the EWM, EBF, and ICM models, respectively. Consequently, the landslide susceptibility maps were classified into five susceptibility classes: very low, low, moderate, high, and very high. Additionally, the proportion of construction and verification landslide incidences falling in the high and very high susceptibility classes of each map was determined. The results showed that the EWM, EBF, and ICM models produced satisfactory accuracy. The obtained landslide susceptibility maps may be useful for future natural hazard mitigation studies and planning purposes for environmental protection.
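
To make the bivariate statistical idea concrete, the sketch below computes information-value-style class weights for a single thematic layer and an AUC check on synthetic rasters; class codes, weights and landslide locations are hypothetical, and a real workflow would sum such weights over all twelve layers.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(4)
slope_class = rng.integers(1, 5, size=10000)              # one predisposing layer, 4 classes
landslide = rng.random(10000) < 0.01 * slope_class        # steeper classes -> more slides

weights = {}
for c in np.unique(slope_class):
    in_class = slope_class == c
    dens_class = landslide[in_class].mean()               # landslide density within the class
    dens_total = landslide.mean()                         # density over the whole area
    weights[c] = np.log(dens_class / dens_total) if dens_class > 0 else 0.0

susceptibility = np.vectorize(weights.get)(slope_class)   # summed over all layers in practice
print("class weights:", {int(c): round(w, 2) for c, w in weights.items()})
print("AUC:", round(roc_auc_score(landslide, susceptibility), 3))
```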

Keywords: entropy weight method, evidence belief function, information content model, landslide susceptibility mapping

Procedia PDF Downloads 104
9687 Dynamic vs. Static Bankruptcy Prediction Models: A Dynamic Performance Evaluation Framework

Authors: Mohammad Mahdi Mousavi

Abstract:

Bankruptcy prediction models have been implemented for the continuous evaluation and monitoring of firms. Given the huge number of bankruptcy models, an extensive number of studies have focused on answering the question of which of these models are superior in performance. In practice, one of the drawbacks of existing comparative studies is that the relative assessment of alternative bankruptcy models remains an exercise that is mono-criterion in nature. Further, a very restricted number of criteria and measures have been applied to compare the performance of competing bankruptcy prediction models. In this research, we overcome these methodological gaps by implementing an extensive range of criteria and measures for the comparison between dynamic and static bankruptcy models, and by proposing a multi-criteria framework to compare the relative performance of bankruptcy models in forecasting firm distress for UK firms.

Keywords: bankruptcy prediction, data envelopment analysis, performance criteria, performance measures

Procedia PDF Downloads 220
9686 Optimization of the Fabrication Process for Particleboards Made from Oil Palm Fronds Blended with Empty Fruit Bunch Using Response Surface Methodology

Authors: Ghazi Faisal Najmuldeen, Wahida Amat-Fadzil, Zulkafli Hassan, Jinan B. Al-Dabbagh

Abstract:

The objective of this study was to evaluate the optimum fabrication process variables to produce particleboards from oil palm frond (OPF) particles and empty fruit bunch (EFB) fiber. Response surface methodology was employed to analyse the effects of hot press temperature (150–190°C), press time (3–7 minutes) and EFB blending ratio (0–40%) on the particleboards' modulus of rupture, modulus of elasticity, internal bonding, water absorption and thickness swelling. A Box-Behnken experimental design was carried out to develop the statistical models used for the optimisation of the fabrication process variables. All factors were found to have a statistically significant effect on the particleboard properties. The statistical analysis indicated that all models showed a significant fit with the experimental results. The optimum particleboard properties were obtained at the optimal fabrication process conditions: press temperature of 186°C, press time of 5.7 min and EFB/OPF ratio of 30.4%. Incorporating oil palm fronds and empty fruit bunch to produce particleboards improved the particleboard properties. The OPF–EFB particleboards fabricated at the optimized conditions satisfied the ANSI A208.1–1999 specification for general purpose particleboards.
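
The sketch below builds a three-factor Box-Behnken design in coded units and fits a full quadratic response surface with numpy; the synthetic response and fitted optimum are illustrative only, as the study's real responses come from board testing.

```python
import numpy as np
from itertools import combinations

# 3-factor Box-Behnken design in coded units (+3 centre points)
edges = []
for i, j in combinations(range(3), 2):
    for a in (-1, 1):
        for b in (-1, 1):
            row = [0, 0, 0]
            row[i], row[j] = a, b
            edges.append(row)
design = np.array(edges + [[0, 0, 0]] * 3, dtype=float)

rng = np.random.default_rng(5)
T, t, r = design.T                                 # temperature, time, EFB ratio (coded)
response = (14 + 2*T + 1.5*t + r - 1.8*T**2 - t**2 - 0.5*r**2
            + rng.normal(0, 0.3, len(design)))     # hypothetical modulus-of-rupture values

# full quadratic model matrix: intercept, linear, interaction and squared terms
Xq = np.column_stack([np.ones(len(design)), T, t, r, T*t, T*r, t*r, T**2, t**2, r**2])
coef, *_ = np.linalg.lstsq(Xq, response, rcond=None)
print("fitted quadratic coefficients:", np.round(coef, 2))
```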

Keywords: empty fruit bunch fiber, oil palm fronds, particleboards, response surface methodology

Procedia PDF Downloads 192
9685 Investigating the Performance of Machine Learning Models on PM2.5 Forecasts: A Case Study in the City of Thessaloniki

Authors: Alexandros Pournaras, Anastasia Papadopoulou, Serafim Kontos, Anastasios Karakostas

Abstract:

The air quality of modern cities is an important concern, as poor air quality contributes to human health problems and environmental issues. Reliable air quality forecasting has thus gained scientific and governmental attention as an essential tool that enables authorities to take proactive measures for public safety. In this study, the potential of Machine Learning (ML) models to forecast PM2.5 at the local scale is investigated in the city of Thessaloniki, the second largest city in Greece, which has been struggling with the persistent issue of air pollution. ML models with a proven ability to address time-series forecasting are employed to predict the PM2.5 concentrations and the respective Air Quality Index (AQI) five days ahead, by learning from daily historical air quality and meteorological data from 2014 to 2016 gathered from two stations with different land-use characteristics in the urban fabric of Thessaloniki. The performance of the ML models on PM2.5 concentrations is evaluated with common statistical measures, such as R squared (r²) and Root Mean Squared Error (RMSE), utilizing a portion of the stations' measurements as a test set. A multi-categorical evaluation is utilized for the assessment of their performance on the respective AQIs. Several conclusions were drawn from the experiments conducted. Experimenting on the MLs' configuration revealed a moderate effect of the various parameters and training schemas on the models' predictions. All of these models were found to produce satisfactory results on PM2.5 concentrations. In addition, their application on untrained stations showed that these models can perform well, indicating generalized behavior. Moreover, their performance on the AQI was even better, showing that the ML models can be used as predictors of the AQI, which is the direct information provided to the general public.
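
A hedged sketch of a lag-feature ML forecaster for PM2.5 (five days ahead) with a held-out test split and r²/RMSE scoring is given below; the daily series is synthetic, and the window length and choice of a random forest are assumptions for illustration, not the study's configuration.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(6)
days = np.arange(1096)                                   # ~3 years of daily data
pm25 = 25 + 10 * np.sin(2 * np.pi * days / 365) + rng.normal(0, 5, days.size)

window, horizon = 7, 5                                   # 7 lag days -> 5 days ahead
X = np.array([pm25[i:i + window] for i in range(days.size - window - horizon)])
y = pm25[window + horizon:]

split = int(0.8 * len(X))
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X[:split], y[:split])
pred = model.predict(X[split:])
print("r2 =", round(r2_score(y[split:], pred), 2),
      " RMSE =", round(mean_squared_error(y[split:], pred) ** 0.5, 2))
```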

Keywords: Air Quality, AQ Forecasting, AQI, Machine Learning, PM2.5

Procedia PDF Downloads 43
9684 Mathematical Models for GMAW and FCAW Welding Processes for Structural Steels Used in the Oil Industry

Authors: Carlos Alberto Carvalho Castro, Nancy Del Ducca Barbedo, Edmilsom Otoni Côrrea

Abstract:

With the increase in oil production and the ongoing expansion of gas transmission lines, medium and large manufacturing industries in the transport sector have had to adapt to meet the demand in this fabrication segment. In this context, two welding processes have been used more extensively: GMAW (Gas Metal Arc Welding) and FCAW (Flux Cored Arc Welding). In this work, welds using these processes were carried out in the flat position on ASTM A-36 carbon steel plates in order to make a comparative evaluation between them concerning mechanical and metallurgical properties. A statistical tool based on technical analysis and design of experiments (DOE) from the Minitab software was adopted. For these analyses, the voltage, current and welding speed were varied in both processes. As a result, it was observed that the welds produced by the two processes have different characteristics in relation to metallurgical properties and performance, but both present good weldability and satisfactory mechanical strength, and mathematical models were developed.

Keywords: Flux Cored Arc Welding (FCAW), Gas Metal Arc Welding (GMAW), Design of Experiments (DOE), mathematical models

Procedia PDF Downloads 529
9683 The Impact of Public Open Space System on Housing Price in Chicago

Authors: Si Chen, Le Zhang, Xian He

Abstract:

The research explored the influence of the public open space system on housing prices through hedonic models, in order to support better open space plans and economic policies. We had three initial hypotheses: 1) the public open space system has an overall positive influence on surrounding housing prices; 2) different public open space types have different levels of influence on surrounding housing prices; 3) walking and driving accessibility from a property to public open spaces have different statistical relations with housing prices. Cook County, Illinois, was chosen as the study area because of data availability, sufficient open space types, and long-term open space preservation strategies. We considered housing attributes, driving and walking accessibility scores from houses to nearby public open spaces, and driving accessibility scores to hospitals as explanatory features, and used real housing sale prices in 2010 as the dependent variable in the hedonic models. Through ordinary least squares (OLS) regression analysis, global Moran's I analysis and geographically weighted regression analysis, we observed the statistical relations between public open spaces and housing sale prices in the three hedonic models and confirmed all three hypotheses.
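
A minimal hedonic-price sketch with a statsmodels OLS fit on synthetic sales records is shown below; the variable names (sqft, age, walk_access, drive_access) are placeholders, not the study's actual attribute set.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 500
df = pd.DataFrame({
    "sqft": rng.uniform(800, 3500, n),
    "age": rng.uniform(0, 80, n),
    "walk_access": rng.uniform(0, 100, n),    # walking accessibility score to open space
    "drive_access": rng.uniform(0, 100, n),   # driving accessibility score to open space
})
df["log_price"] = (11 + 0.0004 * df.sqft - 0.003 * df.age
                   + 0.002 * df.walk_access + 0.001 * df.drive_access
                   + rng.normal(0, 0.1, n))

model = smf.ols("log_price ~ sqft + age + walk_access + drive_access", data=df).fit()
print(model.params.round(4))
```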

Keywords: hedonic model, public open space, housing sale price, regression analysis, accessibility score

Procedia PDF Downloads 100
9682 Statistical Models and Time Series Forecasting on Crime Data in Nepal

Authors: Dila Ram Bhandari

Abstract:

Throughout the 20th century, new governments were created in which identities such as ethnic, religious, linguistic, caste, communal, tribal, and others played a part in the development of constitutions and the legal system of victim and criminal justice. Acute issues with extremism, poverty, environmental degradation, cybercrime, human rights violations, and crime against, and victimization of, both individuals and groups have recently plagued South Asian nations. Every day a massive number of crimes are committed, and these frequent crimes have made the lives of common citizens restless. Crimes are one of the major threats to society and to civilization. Crime is a bone of contention that can create societal disturbance. Old-style crime-solving practices are unable to live up to the requirements of the existing crime situation. Crime analysis is one of the most important activities of the majority of intelligence and law enforcement organizations all over the world. The South Asia region lacks such a regional coordination mechanism, unlike the Central Asia or Asia-Pacific regions, to facilitate criminal intelligence sharing and operational coordination related to organized crime, including illicit drug trafficking and money laundering. There have been numerous conversations in recent years about using data mining technology to combat crime and terrorism. The Data Detective program from the software company Sentient uses data mining techniques to support the police (Sentient, 2017). The goals of this work are to test several predictive model solutions and choose the most effective and promising one. First, extensive literature reviews on data mining, crime analysis, and crime data mining were conducted. Sentient offered a 7-year archive of crime statistics that was aggregated daily to produce a univariate dataset. Moreover, a daily incidence-type aggregation was performed to produce a multivariate dataset. Each solution's forecast period lasted seven days. The experiments were split into two main groups: statistical models and neural network models. For the crime data, neural networks fared better than statistical models. This study gives a general review of the applied statistical and neural network models. A comparative analysis of all the models on a comparable dataset provides a detailed picture of each model's performance on the available data and its generalizability. The studies demonstrated that, in comparison to other models, Gated Recurrent Units (GRU) produced better predictions. The crime records of 2005-2019 were collected from the Nepal Police headquarters and analysed with R programming. In conclusion, a gated recurrent unit implementation could benefit the police in predicting crime; hence, time series analysis using GRU could be a prospective additional feature in Data Detective.
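
A hedged sketch of a GRU forecaster for a daily crime-count series (7-day-ahead) is shown below, written with tf.keras rather than the R workflow used in the study; the series, window length and network size are illustrative assumptions.

```python
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(8)
days = np.arange(2000)
counts = 50 + 10 * np.sin(2 * np.pi * days / 7) + rng.poisson(5, days.size)  # weekly pattern

window, horizon = 28, 7
X = np.array([counts[i:i + window] for i in range(days.size - window - horizon)])[..., None]
y = np.array([counts[i + window:i + window + horizon]
              for i in range(days.size - window - horizon)])

model = tf.keras.Sequential([
    tf.keras.Input(shape=(window, 1)),
    tf.keras.layers.GRU(32),
    tf.keras.layers.Dense(horizon),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X[:-100], y[:-100], epochs=5, verbose=0)
print("test MSE:", float(model.evaluate(X[-100:], y[-100:], verbose=0)))
```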

Keywords: time series analysis, forecasting, ARIMA, machine learning

Procedia PDF Downloads 133
9681 On Hyperbolic Gompertz Growth Model (HGGM)

Authors: S. O. Oyamakin, A. U. Chukwu

Abstract:

We propose a Hyperbolic Gompertz Growth Model (HGGM), developed by introducing a stabilizing parameter called θ, based on the hyperbolic sine function, into the classical Gompertz growth equation. The resulting integral solution, obtained deterministically, was reprogrammed into a statistical model and used in modeling the height and diameter of pines (Pinus caribaea). Its predictive ability was compared with that of the classical Gompertz growth model using goodness-of-fit tests and model selection criteria; the proposed approach mimics the natural variability of height/diameter increment with respect to age and therefore provides more realistic height/diameter predictions. The Kolmogorov-Smirnov and Shapiro-Wilk tests were also used to test the compliance of the error term with normality assumptions, while the independence of the error term was tested using the runs test. The mean function of top height/Dbh over age under the two models predicted the observed values of top height/Dbh closely, with the hyperbolic Gompertz growth model performing better than its source model (the classical Gompertz growth model), while the results of R², adjusted R², MSE, and AIC confirmed the predictive power of the hyperbolic Gompertz growth model over its source model.
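
For reference, the sketch below fits the classical Gompertz growth curve to synthetic height-age data and computes AIC for model comparison; the hyperbolic variant with the sinh-based stabilizing parameter θ follows the authors' formulation and is not reproduced here.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(9)
age = np.linspace(2, 30, 40)                               # years (hypothetical stand data)
height = 25 * np.exp(-3 * np.exp(-0.15 * age)) + rng.normal(0, 0.6, age.size)

def gompertz(t, a, b, c):
    return a * np.exp(-b * np.exp(-c * t))

params, _ = curve_fit(gompertz, age, height, p0=[20, 2, 0.1])
resid = height - gompertz(age, *params)
n, k = age.size, len(params)
aic = n * np.log((resid ** 2).mean()) + 2 * k
print("a, b, c =", np.round(params, 3), " AIC =", round(aic, 1))
```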

Keywords: height, Dbh, forest, Pinus caribaea, hyperbolic, gompertz

Procedia PDF Downloads 415
9680 Analysis of Tactile Perception of Textiles by Fingertip Skin Model

Authors: Izabela L. Ciesielska-Wróbel

Abstract:

This paper presents finite element models of the fingertip skin which have been created to simulate the contact of textile objects with the skin, in order to gain a better understanding of the perception of textiles through the skin, the so-called Hand of Textiles (HoT). Many objective and subjective techniques have been developed to analyze HoT; however, none of them provides exact overall information concerning the sensation of textiles through the skin. As the human skin is a complex, heterogeneous, hyperelastic body composed of many particles, some simplifications had to be made at the stage of building the models. The same concerns the models of the woven structures; however, their utilitarian value was maintained. The models reflect only friction between the skin and woven textiles, deformation of the skin and fabrics when "touching" textiles, and heat transfer from the surface of the skin in the direction of the textiles.

Keywords: fingertip skin models, finite element models, modelling of textiles, sensation of textiles through the skin

Procedia PDF Downloads 438
9679 Analysis of Atomic Models in High School Physics Textbooks

Authors: Meng-Fei Cheng, Wei Feng

Abstract:

New Taiwan high school standards emphasize employing scientific models and modeling practices in physics learning. However, to our knowledge, few studies address how scientific models and modeling are approached in current science teaching, and they do not examine the views of scientific models portrayed in the textbooks. To explore the views of scientific models and modeling in textbooks, this study investigated the atomic unit in different textbook versions as an example and provides suggestions for a modeling curriculum. This study adopted a quantitative analysis of qualitative data in the atomic units of four mainstream versions of Taiwan high school physics textbooks. The models were further analyzed using five dimensions of the views of scientific models (nature of models, multiple models, purpose of the models, testing models, and changing models); each dimension had three levels (low, medium, high). Descriptive statistics were employed to compare the frequency of describing the five dimensions of the views of scientific models in the atomic unit, to understand the emphasis of the views, and to compare the frequency of use of the eight scientific models, to investigate the atomic model that was used most often in the textbooks. Descriptive statistics were further utilized to investigate the average levels of the five dimensions of the views of scientific models, to examine whether the textbook views were close to the scientific view. The average levels of the five dimensions of the eight atomic models were also compared to examine whether the views of the eight atomic models were close to the scientific views. The results revealed the following three major findings from the atomic unit. (1) Among the five dimensions of the views of scientific models, the most portrayed dimension was the 'purpose of models,' and the least portrayed dimension was 'multiple models.' The most diverse view was the 'purpose of models,' and the most sophisticated scientific view was the 'nature of models.' The least sophisticated scientific view was 'multiple models.' (2) Among the eight atomic models, the most mentioned model was the atomic nucleus model, and the least mentioned model was the three states of matter. (3) Among the correlations between the five dimensions, the dimension of 'testing models' was highly related to the dimension of 'changing models.' In short, this study examined the views of scientific models based on the atomic units of physics textbooks to identify the emphasized and disregarded views in the textbooks. The findings suggest how future textbooks and curricula can provide a thorough view of scientific models to enhance students' model-based learning.

Keywords: atomic models, textbooks, science education, scientific model

Procedia PDF Downloads 124
9678 Statistical and Land Planning Study of Tourist Arrivals in Greece during 2005-2016

Authors: Dimitra Alexiou

Abstract:

During the last 10 years, in spite of the economic crisis, the number of tourists arriving in Greece has increased, particularly during the tourist season from April to October. In this paper, the number of annual tourist arrivals is studied to explore tourists' preferences with regard to the month of travel and the selected destinations, as well as the amount of money spent. The collected data are processed with statistical methods, yielding numerical and graphical results. From the computation of statistical parameters and the forecasting with exponential smoothing, useful conclusions are drawn that can be used by the Greek tourism authorities, as well as by tourist organizations, for planning purposes for the coming years. The results of this paper and the computed forecasts can also be used for decision making by private tourist enterprises that are investing in Greece. With regard to the statistical methods, simple exponential smoothing of the time series data is employed. The search for the best forecast for 2017 and 2018 provides the value of the smoothing coefficient. All statistical computations and graphics were carried out with Microsoft Excel.
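
A minimal sketch of simple exponential smoothing with a grid search over the smoothing coefficient is shown below; the series of annual arrivals is made up, standing in for the published arrival statistics (the study itself uses Excel rather than Python).

```python
import numpy as np

arrivals = np.array([15.0, 15.9, 16.4, 17.9, 22.0, 23.6,
                     24.8, 27.2, 28.1, 30.2])            # millions, hypothetical

def ses_one_step_errors(x, alpha):
    level = x[0]
    errors = []
    for value in x[1:]:
        errors.append(value - level)                     # one-step-ahead forecast error
        level = alpha * value + (1 - alpha) * level      # smoothing update
    return np.array(errors), level

best = min(((np.mean(ses_one_step_errors(arrivals, a)[0] ** 2), a)
            for a in np.arange(0.05, 1.0, 0.05)))
mse, alpha = best
_, forecast = ses_one_step_errors(arrivals, alpha)
print(f"alpha={alpha:.2f}  MSE={mse:.2f}  next-year forecast={forecast:.1f}")
```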

Keywords: tourism, statistical methods, exponential smoothing, land spatial planning, economy

Procedia PDF Downloads 232
9677 Development of a Turbulent Boundary Layer Wall-pressure Fluctuations Power Spectrum Model Using a Stepwise Regression Algorithm

Authors: Zachary Huffman, Joana Rocha

Abstract:

Wall-pressure fluctuations induced by the turbulent boundary layer (TBL) developed over aircraft are a significant source of aircraft cabin noise. Since the power spectral density (PSD) of these pressure fluctuations is directly correlated with the amount of sound radiated into the cabin, the development of accurate empirical models that predict the PSD has been an important ongoing research topic. The sound emitted can be represented by the pressure fluctuation term in the Reynolds-averaged Navier-Stokes equations (RANS). Therefore, early TBL empirical models (including those from Lowson, Robertson, Chase, and Howe) were primarily derived by simplifying and solving the RANS for pressure fluctuations and adding appropriate scales. Most subsequent models (including the Goody, Efimtsov, Laganelli, Smol'yakov, and Rackl and Weston models) were derived by making modifications to these early models or from physical principles. Overall, these models have had varying levels of accuracy, but, in general, they are most accurate under the specific Reynolds and Mach numbers they were developed for, while being less accurate under other flow conditions. Despite this, recent research into the possibility of using alternative methods for deriving the models has been rather limited. More recent studies have demonstrated that an artificial neural network model was more accurate than traditional models and could be applied more generally, but the accuracy of other machine learning techniques has not been explored. In the current study, an original model is derived using a stepwise regression algorithm in the statistical programming language R and TBL wall-pressure fluctuation PSD data gathered at the Carleton University wind tunnel. The theoretical advantage of a stepwise regression approach is that it automatically filters out redundant or uncorrelated input variables (through the process of feature selection), and it is computationally faster than machine learning. The main disadvantage is the potential risk of overfitting. The accuracy of the developed model is assessed by comparing it to independently sourced datasets.
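
A hedged Python analogue of forward stepwise regression selecting predictors by AIC is sketched below (the study itself uses R's stepwise procedure); the flow-variable names and synthetic response are assumptions for illustration.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(10)
n = 300
df = pd.DataFrame(rng.normal(size=(n, 4)), columns=["u_tau", "delta", "Re_theta", "mach"])
df["log_psd"] = 2 * df.u_tau - 1.5 * df.delta + rng.normal(0, 0.5, n)  # two predictors matter

selected, remaining = [], list(df.columns[:-1])
current_aic = sm.OLS(df.log_psd, np.ones(n)).fit().aic        # intercept-only baseline
while remaining:
    trials = [(sm.OLS(df.log_psd, sm.add_constant(df[selected + [c]])).fit().aic, c)
              for c in remaining]
    best_aic, best_var = min(trials)
    if best_aic >= current_aic:
        break                                                  # no candidate improves AIC
    selected.append(best_var)
    remaining.remove(best_var)
    current_aic = best_aic
print("selected predictors:", selected, " AIC:", round(current_aic, 1))
```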

Keywords: aircraft noise, machine learning, power spectral density models, regression models, turbulent boundary layer wall-pressure fluctuations

Procedia PDF Downloads 111
9676 Methodology for Obtaining Static Alignment Model

Authors: Lely A. Luengas, Pedro R. Vizcaya, Giovanni Sánchez

Abstract:

In this paper, a methodology is presented for obtaining the static alignment model of any transtibial amputee. The proposed methodology starts from experimental data collected at the Hospital Militar Central, Bogotá, Colombia. The effects of transtibial prosthesis malalignment on amputees were measured in terms of joint angles, center of pressure (COP) and weight distribution. Statistical tools were used to obtain the model parameters, and mathematical predictive models of prosthetic alignment were created. The proposed models were validated on amputees, with promising results for prosthesis static alignment. The static alignment process is unique to each subject; nevertheless, the proposed methodology can be applied to any transtibial amputee.

Keywords: information theory, prediction model, prosthetic alignment, transtibial prosthesis

Procedia PDF Downloads 225
9675 A Systemic Maturity Model

Authors: Emir H. Pernet, Jeimy J. Cano

Abstract:

Maturity models, used descriptively to explain changes in reality or normatively to guide managers in making interventions that render organizations more effective and efficient, are based on the principles of statistical quality control promulgated by Shewhart in the 1930s and on the principles of PDCA continuous improvement (Plan, Do, Check, Act) developed by Deming and Juran. Frameworks developed around the concept of maturity models include COBIT, CMM, and ITIL. This paper presents some limitations of traditional maturity models, most of them based on points of reflection and analysis by several authors. Almost all of these limitations are related to the mechanistic and reductionist approach of the principles on which those models are built. As systems theory helps the understanding of the dynamics of organizations and organizational change, the development of a systemic maturity model can help to overcome some of those limitations. This document proposes a systemic maturity model, based on a systemic conceptualization of organizations, focused on the study of the functioning of the parts, the relationships among them, and their behavior as a whole. The concept of maturity is, from the systems theory perspective, conceptually defined as an emergent property of the organization, which arises as a result of the degree of alignment and integration of its processes. This concept is operationalized through a systemic function that measures the maturity of an organization, and it is finally validated by measuring maturity in organizations. For its operationalization and validation, the model was applied to measure the maturity of organizational Governance, Risk and Compliance (GRC) processes.

Keywords: GRC, maturity model, systems theory, viable system model

Procedia PDF Downloads 287
9674 Statistical and Analytical Comparison of GIS Overlay Modelings: An Appraisal on Groundwater Prospecting in Precambrian Metamorphics

Authors: Tapas Acharya, Monalisa Mitra

Abstract:

Overlay modeling is the most widely used conventional analysis for spatial decision support systems. Overlay modeling requires a set of themes with different weightings computed in various manners, which gives a resultant input for further integrated analysis. In spite of being the most popular and widely used technique, it gives inconsistent and erroneous results for similar inputs when processed with different GIS overlay techniques. This study is an attempt to compare and analyse the differences in the outputs of different overlay methods on a GIS platform, using the same set of themes for the Precambrian metamorphics, to obtain groundwater prospects in Precambrian metamorphic rocks. The objective of the study is to identify the most suitable overlay method for groundwater prospecting in older Precambrian metamorphics. Seven input thematic layers, namely slope, Digital Elevation Model (DEM), soil thickness, lineament intersection density, average groundwater table fluctuation, stream density and lithology, were used in the spatial overlay models of the fuzzy overlay, weighted overlay and weighted sum overlay methods to yield suitable groundwater prospective zones. Spatial concurrence analysis with the high-yielding wells of the study area and statistical comparison of the outputs of the various overlay models using RStudio reveal that the weighted overlay model is the most efficient GIS overlay model for delineating groundwater prospective zones in Precambrian metamorphic rocks.
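
To show how the three overlay operations differ, the sketch below applies them to two small synthetic rasters with numpy (the study uses seven layers in a GIS); the reclassification breakpoints, weights and fuzzy memberships are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(11)
slope = rng.uniform(0, 45, (50, 50))                       # degrees
lineament_density = rng.uniform(0, 5, (50, 50))            # km/km^2

# weighted overlay: reclassify each layer to a common suitability scale, then weight
slope_cls = np.digitize(slope, [10, 20, 30]) * -2 + 9      # gentler slope -> higher score
lin_cls = np.digitize(lineament_density, [1, 2.5, 4]) * 2 + 1
weighted_overlay = 0.6 * slope_cls + 0.4 * lin_cls

# weighted sum: weights applied to the raw (unreclassified) layers
weighted_sum = 0.6 * (45 - slope) + 0.4 * lineament_density

# fuzzy overlay: rescale layers to fuzzy memberships in [0, 1], then combine (fuzzy AND)
fuzzy_slope = 1 - slope / 45
fuzzy_lin = lineament_density / 5
fuzzy_and = np.minimum(fuzzy_slope, fuzzy_lin)

print(weighted_overlay.mean(), weighted_sum.mean(), fuzzy_and.mean())
```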

Keywords: fuzzy overlay, GIS overlay model, groundwater prospecting, Precambrian metamorphics, weighted overlay, weighted sum overlay

Procedia PDF Downloads 100
9673 Short Life Cycle Time Series Forecasting

Authors: Shalaka Kadam, Dinesh Apte, Sagar Mainkar

Abstract:

The life cycles of products are becoming shorter and shorter due to increased competition in the market, shorter product development times and increased product diversity. Short life cycles are normal in the retail industry, the style business, entertainment media, and the telecom and semiconductor industries. Accurate demand forecasting for short-life-cycle products is of special interest to many researchers and organizations. Due to the short life cycle of such products, the amount of historical data available for forecasting is minimal or even absent when new or modified products are launched in the market. Companies dealing with such products want to increase the accuracy of demand forecasting so that they can utilize the full potential of the market while avoiding oversupply. This creates the challenge of developing a forecasting model that can forecast accurately while handling large variations in the data and considering the complex relationships between the various parameters of the data. Many statistical models have been proposed in the literature for forecasting time series data. Traditional time series forecasting models do not work well for short life cycles due to the lack of historical data, and artificial neural network (ANN) models are very time-consuming for forecasting. We have studied the existing models used for forecasting and their limitations. This work proposes an effective and powerful approach for short-life-cycle time series forecasting. The proposed approach takes into consideration different scenarios related to data availability for short-life-cycle products, and we suggest a methodology which combines statistical analysis with structured judgement; the defined approach can also be applied across domains. We then describe the method of creating a profile from analogous products; this profile can be used for forecasting new products from the historical data of analogous products, as sketched below. We have designed an application which combines data, analytics and domain knowledge using point-and-click technology. The forecasting results generated are compared using MAPE, MSE and RMSE error scores. Conclusion: Based on the results, it is observed that no single approach is sufficient for short-life-cycle forecasting, and two or more approaches need to be combined to achieve the desired accuracy.
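
One plausible reading of the "profile from analogous products" idea is sketched here: normalize the demand curves of analogous products to a common life-cycle length and unit volume, average them, and scale by the new product's expected total demand. All numbers are hypothetical, and the actual profiling method follows the authors' methodology.

```python
import numpy as np

analog_sales = [
    np.array([5, 30, 55, 40, 20, 8, 2]),          # product A weekly sales
    np.array([8, 45, 70, 60, 25, 10]),            # product B (different cycle length)
    np.array([3, 20, 42, 38, 30, 12, 6, 2]),      # product C
]

horizon = 8                                        # life-cycle length of the new product (weeks)
profiles = []
for s in analog_sales:
    t = np.linspace(0, 1, s.size)
    resampled = np.interp(np.linspace(0, 1, horizon), t, s / s.sum())  # unit-volume profile
    profiles.append(resampled / resampled.sum())
mean_profile = np.mean(profiles, axis=0)

expected_total = 1200                              # judgemental estimate for the new product
forecast = expected_total * mean_profile
print(np.round(forecast, 0))
```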

Keywords: forecast, short life cycle product, structured judgement, time series

Procedia PDF Downloads 323
9672 Use of Statistical Correlations for the Estimation of Shear Wave Velocity from Standard Penetration Test-N-Values: Case Study of Algiers Area

Authors: Soumia Merat, Lynda Djerbal, Ramdane Bahar, Mohammed Amin Benbouras

Abstract:

Along with the shear wave velocity, many soil parameters are associated with the standard penetration test (SPT), a dynamic in situ experiment. SPT-N data and geophysical data often do not exist in the same area, and statistical analysis of the correlation between these parameters is an alternative method to estimate Vₛ conveniently and without additional investigations or data acquisition. Shear wave velocity is a basic engineering tool required to define the dynamic properties of soils. In many instances, engineers opt for empirical correlations between shear wave velocity (Vₛ) and reliable static field test data, such as standard penetration test (SPT) N values, CPT (Cone Penetration Test) values, etc., to estimate the shear wave velocity or dynamic soil parameters. The relation between Vₛ and SPT-N values for the Algiers area is predicted using the collected data, and it is also compared with previously suggested formulas for Vₛ determination by measuring the Root Mean Square Error (RMSE) of each model. The Algiers area is situated in a high seismic zone (Zone III [RPA 2003: règlement parasismique algérien]); therefore, the study is important for this region. The principal aim of this paper is to compare the field measurements of the down-hole test with the empirical models to show which of these proposed formulas are applicable for predicting and deducing shear wave velocity values.
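
The fitting-and-scoring step can be sketched as below, fitting a power-law correlation Vs = a·N^b and scoring it with RMSE; the SPT-N / down-hole Vs pairs are synthetic, not the Algiers dataset, and published correlations would be scored with the same RMSE expression.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(12)
N = rng.uniform(5, 60, 80)                                   # SPT blow counts
Vs = 90 * N ** 0.32 * np.exp(rng.normal(0, 0.08, N.size))    # "measured" Vs, m/s

power_law = lambda n, a, b: a * n ** b
(a, b), _ = curve_fit(power_law, N, Vs, p0=[100, 0.3])
rmse = np.sqrt(np.mean((Vs - power_law(N, a, b)) ** 2))
print(f"Vs = {a:.1f} * N^{b:.2f}   RMSE = {rmse:.1f} m/s")
```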

Keywords: empirical models, RMSE, shear wave velocity, standard penetration test

Procedia PDF Downloads 308
9671 Power MOSFET Models Including Quasi-Saturation Effect

Authors: Abdelghafour Galadi

Abstract:

In this paper, accurate power MOSFET models including the quasi-saturation effect are presented. These models have no internal node voltages determined by the circuit simulator and use one JFET or one depletion-mode MOSFET transistor controlled by an "effective" gate voltage that takes the quasi-saturation effect into account. The proposed models achieve accurate simulation results, with an average error of less than 9%, which is an improvement of 21 percentage points compared to the commonly used standard power MOSFET model. In addition, the models can be integrated into any available commercial circuit simulator by using their analytical equations. A description of the models is provided along with the parameter extraction procedure.

Keywords: power MOSFET, drift layer, quasi-saturation effect, SPICE model

Procedia PDF Downloads 169
9670 Count Data Regression Modeling: An Application to Spontaneous Abortion in India

Authors: Prashant Verma, Prafulla K. Swain, K. K. Singh, Mukti Khetan

Abstract:

Objective: In India, around 20,000 women die every year due to abortion-related complications. In the modelling of count variables, there is sometimes a preponderance of zero counts. This article concerns the estimation of various count regression models to predict the average number of spontaneous abortions among women in the Punjab state of India, and it also assesses the factors associated with the number of spontaneous abortions. Materials and methods: The study included 27,173 married women of Punjab, obtained from the DLHS-4 survey (2012-13). Poisson regression (PR), negative binomial (NB) regression, zero-hurdle negative binomial (ZHNB), and zero-inflated negative binomial (ZINB) models were employed to predict the average number of spontaneous abortions and to identify the determinants affecting the number of spontaneous abortions. Results: Statistical comparisons among the four estimation methods revealed that the ZINB model provides the best prediction for the number of spontaneous abortions. Antenatal care (ANC) place, place of residence, total children born to a woman, and the woman's education and economic status were found to be the most significant factors affecting the occurrence of spontaneous abortion. Conclusions: The study offers a practical demonstration of techniques designed to handle count variables. Statistical comparisons among the four estimation models revealed that the ZINB model provided the best prediction for the number of spontaneous abortions and is recommended for predicting the number of spontaneous abortions. The study suggests that women should receive institutional antenatal care to attain limited parity. It also advocates promoting higher education among women in Punjab, India.
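
A minimal count-regression sketch comparing Poisson and negative binomial fits by AIC on synthetic data with excess zeros is given below; the covariate names are placeholders, and a zero-inflated NB fit would follow the same pattern with statsmodels' zero-inflated count models, assuming a recent statsmodels version.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(13)
n = 2000
df = pd.DataFrame({"education_years": rng.integers(0, 16, n),
                   "rural": rng.integers(0, 2, n)})
lam = np.exp(-1.0 + 0.03 * df.education_years - 0.2 * df.rural)
counts = rng.poisson(lam)
counts[rng.random(n) < 0.3] = 0                      # extra zeros beyond the Poisson part
df["abortions"] = counts

poisson_fit = smf.poisson("abortions ~ education_years + rural", data=df).fit(disp=0)
nb_fit = smf.negativebinomial("abortions ~ education_years + rural", data=df).fit(disp=0)
print("Poisson AIC:", round(poisson_fit.aic, 1), " NB AIC:", round(nb_fit.aic, 1))
```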

Keywords: count data, spontaneous abortion, Poisson model, negative binomial model, zero hurdle negative binomial, zero-inflated negative binomial, regression

Procedia PDF Downloads 125
9669 Documentation Project on Boat Models from Saqqara, in the Grand Egyptian Museum

Authors: Ayman Aboelkassem, Mohamoud Ali, Rezq Diab

Abstract:

This project aims to document and preserve boat models which were discovered at Saqqara by the Czech Institute of Egyptology archaeological mission (GEM numbers 46007, 46008, 46009). These boat models date back to the Egyptian Old Kingdom and have been transferred to the Conservation Center of the Grand Egyptian Museum, to be displayed at the new museum. The project's objective is to make such boat models more visible to visitors through the use of 3D reconstructed models and high-resolution photos which describe the history of the use of boats throughout ancient Egyptian history, especially as the Grand Egyptian Museum is going to exhibit the second boat of King Khufu from the Old Kingdom. The project's goals are to document the boat models and arrange an exhibition where they will be displayed next to Khufu's second boat. The project shows the importance of boats in ancient Egypt and connects their usage through ancient Egyptian periods until now. Boat models had a unique symbolism in ancient Egypt and connected the public with their kings: the Egyptian kings allowed high-ranking officials to place boat models in their tombs, expressing the hope of following their kings on the journey of the afterlife.

Keywords: archaeology, boat models, 3D digital tools for heritage management, museums

Procedia PDF Downloads 99
9668 New Segmentation of Piecewise Linear Regression Models Using Reversible Jump MCMC Algorithm

Authors: Suparman

Abstract:

Piecewise linear regression models are very flexible models for modeling data. When a piecewise linear regression model is fitted to data, its parameters are generally not known. This paper studies the problem of parameter estimation for piecewise linear regression models. The method used to estimate the parameters is the Bayesian method, but the Bayes estimator cannot be found analytically. To overcome this problem, the reversible jump MCMC algorithm is proposed. The reversible jump MCMC algorithm generates a Markov chain that converges to the limiting distribution, namely the posterior distribution of the parameters of the piecewise linear regression model. The resulting Markov chain is used to calculate the Bayes estimator for the parameters of the piecewise linear regression model.
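
As a simplified, fixed-dimension relative of that sampler, the sketch below runs a random-walk Metropolis chain for a single-changepoint piecewise linear model with a known noise level and flat priors; the full reversible jump sampler additionally proposes moves that add or remove changepoints, which is omitted here, and the data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(14)
x = np.linspace(0, 10, 120)
y = np.where(x < 4, 1 + 0.5 * x, 3 + 1.5 * (x - 4)) + rng.normal(0, 0.3, x.size)
sigma = 0.3                                              # assumed known noise level

def log_post(theta):
    tau, a1, b1, a2, b2 = theta
    if not (x.min() < tau < x.max()):
        return -np.inf                                   # flat prior on a valid changepoint
    mean = np.where(x < tau, a1 + b1 * x, a2 + b2 * (x - tau))
    return -0.5 * np.sum((y - mean) ** 2) / sigma ** 2

theta = np.array([5.0, 0.5, 0.3, 2.0, 1.0])
step = np.array([0.2, 0.1, 0.05, 0.1, 0.05])
samples, lp = [], log_post(theta)
for _ in range(20000):
    proposal = theta + rng.normal(0, step)
    lp_new = log_post(proposal)
    if np.log(rng.random()) < lp_new - lp:               # Metropolis accept/reject
        theta, lp = proposal, lp_new
    samples.append(theta.copy())
samples = np.array(samples[5000:])                       # discard burn-in
print("posterior mean changepoint:", round(samples[:, 0].mean(), 2))
```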

Keywords: regression, piecewise, Bayesian, reversible jump MCMC

Procedia PDF Downloads 489
9667 A Hierarchical Bayesian Calibration of Data-Driven Models for Composite Laminate Consolidation

Authors: Nikolaos Papadimas, Joanna Bennett, Amir Sakhaei, Timothy Dodwell

Abstract:

Composite modeling of consolidation processes plays an important role in process and part design by indicating the formation of possible unwanted defects prior to expensive experimental iterative trial and development programs. Composite materials in their uncured state display complex constitutive behavior, which has received much academic interest, with different models proposed. Errors from modeling and from the statistical fitting of these models will propagate through any simulation in which the material model is used. A general hyperelastic polynomial representation was proposed, which can be readily implemented in various nonlinear finite element packages; in our case, FEniCS was chosen. The coefficients are assumed uncertain, and therefore the distribution of parameters is learned using Markov Chain Monte Carlo (MCMC) methods. In engineering, the approach often followed is to select a single set of model parameters which, on average, best fits a set of experiments; there are good statistical reasons why this is not a rigorous approach to take. To overcome these challenges, a hierarchical Bayesian framework was proposed in which the population distribution of model parameters is inferred from an ensemble of experimental tests. The resulting sampled distribution of hyperparameters is approximated using maximum entropy methods so that the distribution can be readily sampled when embedded within a stochastic finite element simulation. The methodology is validated and demonstrated on a set of consolidation experiments of AS4/8852 with various stacking sequences. The resulting distributions are then applied to stochastic finite element simulations of the consolidation of curved parts, leading to a distribution of possible model outputs. With this, the paper, as far as the authors are aware, represents the first stochastic finite element implementation in composite process modelling.

Keywords: data-driven models, material consolidation, stochastic finite elements, surrogate models

Procedia PDF Downloads 121
9666 Applying Genetic Algorithm in Exchange Rate Models Determination

Authors: Mehdi Rostamzadeh

Abstract:

Genetic algorithms (GAs) are adaptive heuristic search algorithms premised on the evolutionary ideas of natural selection and genetics. In this study, we apply GAs to fundamental and technical models of exchange rate determination in the foreign exchange market. In this framework, we estimated absolute and relative purchasing power parity, Mundell-Fleming, sticky and flexible prices (monetary models), equilibrium exchange rate and portfolio balance models as fundamental models, and autoregressive (AR), moving average (MA), autoregressive moving average (ARMA) and mean reversion (MR) models as technical models, for the Iranian Rial against the European Union's Euro, using monthly data from January 1992 to December 2014. We then put these models into the genetic algorithm system to measure the optimal weight for each model. These optimal weights were measured according to four criteria, i.e. R-squared (R²), mean square error (MSE), mean absolute percentage error (MAPE) and root mean square error (RMSE). Based on the obtained results, it seems that for explaining the behavior of the Iranian Rial against the EU Euro exchange rate, fundamental models perform better than technical models.
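
A toy sketch of a genetic algorithm searching for combination weights that minimize the RMSE of a weighted ensemble of model forecasts is given below; the "model forecasts" are synthetic stand-ins for the fundamental and technical models, and the population size, crossover and mutation settings are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(15)
T, n_models = 200, 4
actual = np.cumsum(rng.normal(0, 1, T))                        # synthetic exchange-rate series
forecasts = actual[:, None] + rng.normal(0, [0.5, 1.0, 2.0, 3.0], (T, n_models))

def rmse_of(weights):
    w = np.abs(weights) / np.abs(weights).sum()                # normalize weights to sum to 1
    combo = forecasts @ w
    return np.sqrt(np.mean((combo - actual) ** 2))

pop = rng.random((40, n_models))                               # initial population of weight vectors
for generation in range(100):
    fitness = np.array([rmse_of(ind) for ind in pop])
    parents = pop[np.argsort(fitness)[:20]]                    # selection: keep the best half
    children = []
    for _ in range(20):
        a, b = parents[rng.integers(0, 20, 2)]
        cut = rng.integers(1, n_models)                        # one-point crossover
        child = np.concatenate([a[:cut], b[cut:]])
        child += rng.normal(0, 0.05, n_models)                 # mutation
        children.append(np.clip(child, 1e-6, None))
    pop = np.vstack([parents, children])

best = pop[np.argmin([rmse_of(ind) for ind in pop])]
print("best weights:", np.round(np.abs(best) / np.abs(best).sum(), 2))
```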

Keywords: exchange rate, genetic algorithm, fundamental models, technical models

Procedia PDF Downloads 247
9665 Use of Predictive Food Microbiology to Determine the Shelf-Life of Foods

Authors: Fatih Tarlak

Abstract:

Predictive microbiology can be considered an important field in food microbiology, in which predictive models are used to describe microbial growth in different food products. Predictive models estimate the growth of microorganisms quickly, efficiently, and in a cost-effective way compared to traditional enumeration methods, which are long-lasting, expensive, and time-consuming. The mathematical models used in predictive microbiology are mainly categorised as primary and secondary models. The primary models are the mathematical equations that define the growth data as a function of time under constant environmental conditions. The secondary models describe the effects of environmental factors, such as temperature, pH, and water activity (aw), on the parameters of the primary models, including the maximum specific growth rate and the lag phase duration, which are the most critical growth kinetic parameters. The combination of primary and secondary models provides valuable information to set limits for the quantitative detection of microbial spoilage and to assess product shelf-life.
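
The primary/secondary pairing can be sketched with two models that are common in this field, a Zwietering-type modified Gompertz curve as the primary model and the Ratkowsky square-root model as the secondary model for temperature; the parameter values and spoilage limit below are illustrative, not fitted to real growth data.

```python
import numpy as np

def sqrt_model(T, b=0.035, T_min=2.0):
    """Secondary model: sqrt(mu_max) = b * (T - T_min)."""
    return (b * np.maximum(T - T_min, 0.0)) ** 2

def gompertz_log_growth(t, mu_max, lag=8.0, A=8.0):
    """Primary model: ln(N/N0) as a function of time (h), modified Gompertz form."""
    return A * np.exp(-np.exp(mu_max * np.e / A * (lag - t) + 1))

storage_temp = 10.0                                  # degrees C
mu_max = sqrt_model(storage_temp)                    # growth rate at this temperature
t = np.arange(0, 120, 12.0)                          # sampling times, hours
ln_increase = gompertz_log_growth(t, mu_max)

# shelf-life flag: time until the population has grown by ~2 log10 cycles
spoilage = t[ln_increase > 2 * np.log(10)]
print("first sampling time past the spoilage limit:",
      spoilage[0] if spoilage.size else "not reached within 120 h")
```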

Keywords: shelf-life, growth model, predictive microbiology, simulation

Procedia PDF Downloads 173