Search results for: statistical models
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9783

9543 Tracing Sources of Sediment in an Arid River, Southern Iran

Authors: Hesam Gholami

Abstract:

Elevated suspended sediment loads in riverine systems resulting from accelerated erosion due to human activities are a serious threat to the sustainable management of watersheds and ecosystem services therein worldwide. Therefore, mitigation of deleterious sediment effects as a distributed or non-point pollution source in the catchments requires reliable provenance information. Sediment tracing or sediment fingerprinting, as a combined process consisting of sampling, laboratory measurements, different statistical tests, and the application of mixing or unmixing models, is a useful technique for discriminating the sources of sediments. From 1996 to the present, different aspects of this technique, such as grouping the sources (spatial and individual sources), discriminating the potential sources by different statistical techniques, and modification of mixing and unmixing models, have been introduced and modified by many researchers worldwide, and have been applied to identify the provenance of fine materials in agricultural, rural, mountainous, and coastal catchments, and in large catchments with numerous lakes and reservoirs. In the last two decades, efforts exploring the uncertainties associated with sediment fingerprinting results have attracted increasing attention. The frameworks used to quantify the uncertainty associated with fingerprinting estimates can be divided into three groups comprising Monte Carlo simulation, Bayesian approaches and generalized likelihood uncertainty estimation (GLUE). Given the above background, the primary goal of this study was to apply geochemical fingerprinting within the GLUE framework in the estimation of sub-basin spatial sediment source contributions in the arid Mehran River catchment in southern Iran, which drains into the Persian Gulf. The accuracy of GLUE predictions generated using four different sets of statistical tests for discriminating three sub-basin spatial sources was evaluated against 10 virtual sediment (VS) samples with known source contributions, using the root mean square error (RMSE) and mean absolute error (MAE). Based on the results, the contributions modeled by GLUE for the western, central and eastern sub-basins are 1-42% (overall mean 20%), 0.5-30% (overall mean 12%) and 55-84% (overall mean 68%), respectively. According to the mean absolute fit (MAF; ≥ 95% for all target sediment samples) and goodness-of-fit (GOF; ≥ 99% for all samples), our suggested modeling approach is an accurate technique to quantify the source of sediments in the catchments. Overall, the estimated source proportions can help watershed engineers plan the targeting of conservation programs for soil and water resources.
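
As a rough illustration of the GLUE evaluation step described above, the Python sketch below draws candidate source contributions from a uniform Dirichlet prior, retains behavioral samples by a simple goodness-of-fit threshold, and scores the result against a virtual mixture with known contributions using RMSE and MAE. All tracer values, the threshold, and the fit measure are invented for the example and are not the study's own definitions of GOF or MAF.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical tracer means for three sub-basin sources (rows) and 4 tracers (columns).
sources = np.array([[12.0, 3.1, 45.0, 0.8],
                    [10.5, 2.4, 52.0, 1.1],
                    [14.2, 4.0, 39.0, 0.6]])

# A virtual sediment mixture with known contributions (e.g., 20/12/68 %).
true_props = np.array([0.20, 0.12, 0.68])
target = true_props @ sources            # linear mixing model

# GLUE: draw candidate contribution vectors from a uniform Dirichlet prior,
# keep the "behavioral" ones whose fit exceeds a likelihood threshold.
cands = rng.dirichlet(np.ones(3), size=100_000)
preds = cands @ sources
gof = 1.0 - np.sqrt(((preds - target) ** 2).mean(axis=1)) / target.mean()
behavioral = cands[gof > 0.97]

est = behavioral.mean(axis=0)            # posterior-like mean contribution
rmse = np.sqrt(((est - true_props) ** 2).mean())
mae = np.abs(est - true_props).mean()
print(f"estimated: {est.round(3)}, RMSE={rmse:.4f}, MAE={mae:.4f}")
```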

Keywords: sediment source tracing, generalized likelihood uncertainty estimation, virtual sediment mixtures, Iran

Procedia PDF Downloads 43
9542 Restricted Boltzmann Machines and Deep Belief Nets for Market Basket Analysis: Statistical Performance and Managerial Implications

Authors: H. Hruschka

Abstract:

This paper presents the first comparison of the performance of the restricted Boltzmann machine and the deep belief net on binary market basket data relative to binary factor analysis and the two best-known topic models, namely latent Dirichlet allocation and the correlated topic model. This comparison shows that the restricted Boltzmann machine and the deep belief net are superior to both binary factor analysis and topic models. Managerial implications that differ between the investigated models are treated as well. The restricted Boltzmann machine is defined as a joint Boltzmann distribution of hidden variables and observed variables (purchases). It comprises one layer of observed variables and one layer of hidden variables. Note that variables of the same layer are not connected. The comparison also includes deep belief nets with three layers. The first layer is a restricted Boltzmann machine based on category purchases. Hidden variables of the first layer are used as input variables by the second-layer restricted Boltzmann machine, which then generates second-layer hidden variables. Finally, in the third layer hidden variables are related to purchases. A public data set is analyzed which contains one month of real-world point-of-sale transactions in a typical local grocery outlet. It consists of 9,835 market baskets referring to 169 product categories. This data set is randomly split into two halves. One half is used for estimation, the other serves as holdout data. Each model is evaluated by the log likelihood for the holdout data. Performance of the topic models is disappointing as the holdout log likelihood of the correlated topic model – which is better than latent Dirichlet allocation – is lower by more than 25,000 compared to the best binary factor analysis model. On the other hand, binary factor analysis on its own is clearly surpassed by both the restricted Boltzmann machine and the deep belief net, whose holdout log likelihoods are higher by more than 23,000. Overall, the deep belief net performs best. We also interpret hidden variables discovered by binary factor analysis, the restricted Boltzmann machine and the deep belief net. Hidden variables characterized by the product categories to which they are related differ strongly between these three models. To derive managerial implications, we assess the effect of promoting each category on total basket size, i.e., the number of purchased product categories, due to each category's interdependence with all the other categories. The investigated models lead to very different implications as they disagree about which categories are associated with higher basket size increases due to a promotion. Of course, recommendations based on better performing models should be preferred. The impressive performance advantages of the restricted Boltzmann machine and the deep belief net suggest continuing research by appropriate extensions. Including predictors, especially marketing variables such as price, seems to be an obvious next step. It might also be feasible to take a more detailed perspective by considering purchases of brands instead of purchases of product categories.
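
A minimal sketch of the first building block, a restricted Boltzmann machine fitted to a binary basket matrix, is given below using scikit-learn's BernoulliRBM. The basket matrix is synthetic, the layer size is arbitrary, and score_samples returns a pseudo-likelihood proxy rather than the exact holdout log likelihood used in the paper.

```python
import numpy as np
from sklearn.neural_network import BernoulliRBM
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
# Stand-in for the 9,835 x 169 binary basket matrix (1 = category purchased).
baskets = (rng.random((9835, 169)) < 0.05).astype(float)

train, holdout = train_test_split(baskets, test_size=0.5, random_state=42)

# One visible layer (category purchases), one hidden layer; no intra-layer links.
rbm = BernoulliRBM(n_components=20, learning_rate=0.05, n_iter=30, random_state=42)
rbm.fit(train)

# score_samples gives a pseudo-likelihood proxy, playing the role of the
# holdout log likelihood comparison described in the abstract.
print("holdout pseudo-log-likelihood:", rbm.score_samples(holdout).sum())
```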

Keywords: binary factor analysis, deep belief net, market basket analysis, restricted Boltzmann machine, topic models

Procedia PDF Downloads 165
9541 Vehicles Analysis, Assessment and Redesign Related to Ergonomics and Human Factors

Authors: Susana Aragoneses Garrido

Abstract:

Every day, the roads are the scene of numerous accidents involving vehicles, causing thousands of deaths and serious injuries all over the world. Investigations have revealed that Human Factors (HF) are one of the main causes of road accidents in modern societies. Distracted driving (including external or internal aspects of the vehicle), which is considered a human factor, is a serious and emerging risk to road safety. Consequently, further analysis of this issue is essential due to its significance for today's society. The objectives of this investigation are the detection and assessment of the HF in order to provide solutions (including a better vehicle design) which might mitigate road accidents. The methodology of the project is divided into different phases. First, a statistical analysis of public databases from Spain and the UK is provided. Second, the data are classified in order to analyse the major causes involved in road accidents. Third, a simulation of different paths and vehicles is presented. The causes related to the HF are assessed by Failure Mode and Effects Analysis (FMEA). Fourth, different car models are evaluated using the Rapid Upper Limb Assessment (RULA). Additionally, the JACK SIEMENS PLM tool is used with the intention of evaluating the Human Factor causes and providing the redesign of the vehicles. Finally, improvements in the car design are proposed with the intention of reducing the implication of HF in traffic accidents. The results from the statistical analysis, the simulations and the evaluations confirm that accidents are an important issue in today's society, especially accidents caused by HF such as distractions. The results explore the reduction of external and internal HF through a global risk analysis of vehicle accidents. Moreover, the evaluation of the different car models using the RULA method and the JACK SIEMENS PLM tool proves the importance of a proper adjustment of the driver's seat in order to avoid harmful postures and therefore distractions. For this reason, a car redesign is proposed for the driver to acquire the optimum position, consequently reducing the human factors in road accidents.

Keywords: vehicle analysis, assessment, ergonomics, car redesign

Procedia PDF Downloads 308
9540 Modeling the Saltatory Conduction in Myelinated Axons by Order Reduction

Authors: Ruxandra Barbulescu, Daniel Ioan, Gabriela Ciuprina

Abstract:

The saltatory conduction is the way the action potential is transmitted along a myelinated axon. The potential diffuses along the myelinated compartments and is regenerated at the nodes of Ranvier, where ion channels allow flow across the membrane. For an efficient simulation of populations of neurons, it is important to use reduced order models both for myelinated compartments and for Ranvier nodes, and to have control over their accuracy and inner parameters. The paper presents a reduced order model of this neural system which allows an efficient simulation method for the saltatory conduction in myelinated axons. This model is obtained by concatenating reduced order linear models of 1D myelinated compartments and nonlinear 0D models of Ranvier nodes. The models for the myelinated compartments are selected from a series of spatially distributed models developed and hierarchized according to their modeling errors. The extracted model, described by a nonlinear PDE of hyperbolic type, is able to reproduce the saltatory conduction with acceptable accuracy and takes into account the finite propagation speed of the potential. Finally, this model is again reduced in order to make it suitable for inclusion in large-scale neural circuits.
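
The following toy simulation is a sketch of the concatenation idea rather than the paper's model: linear diffusion/leak compartments stand in for the reduced myelinated segments, and FitzHugh-Nagumo units (assumed here as a generic 0D nonlinear node model) regenerate the potential at regularly spaced nodes.

```python
import numpy as np

# Qualitative toy: passive (myelinated) segments are linear diffusion/leak
# compartments; every 20th compartment is an active "Ranvier node" modeled
# by FitzHugh-Nagumo dynamics as a stand-in reduced 0D model.
n, dt, steps = 200, 0.01, 20000
node = np.arange(n) % 20 == 0
v = np.full(n, -1.2)                 # membrane variable (dimensionless)
w = np.zeros(n)                      # recovery variable, active at nodes only
D, leak = 8.0, 0.05                  # coupling strength and passive leak

far_max = -np.inf
for t in range(steps):
    lap = np.zeros(n)
    lap[1:-1] = v[:-2] - 2 * v[1:-1] + v[2:]
    lap[0], lap[-1] = v[1] - v[0], v[-2] - v[-1]        # sealed ends
    dv = D * lap - leak * (v + 1.2)                      # passive cable part
    dv[node] += v[node] - v[node] ** 3 / 3 - w[node]     # regeneration at nodes
    w[node] += dt * 0.08 * (v[node] + 0.7 - 0.8 * w[node])
    if t < 500:
        dv[0] += 2.0                                     # stimulus at first node
    v += dt * dv
    far_max = max(far_max, v[-1])

print(f"peak potential reached at the distal end: {far_max:.2f}")
```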

Keywords: action potential, myelinated segments, nonlinear models, Ranvier nodes, reduced order models, saltatory conduction

Procedia PDF Downloads 130
9539 Statistical Analysis of Interferon-γ for the Effectiveness of an Anti-Tuberculous Treatment

Authors: Shishen Xie, Yingda L. Xie

Abstract:

Tuberculosis (TB) is a potentially serious infectious disease that remains a health concern. The Interferon Gamma Release Assay (IGRA) is a blood test to determine whether an individual is tuberculosis positive or negative. This study applies statistical analysis to the clinical data of interferon-gamma levels of seventy-three subjects diagnosed with pulmonary TB who underwent anti-tuberculous treatment. Data analysis is performed to determine if there is a significant decline in interferon-gamma levels for the subjects during a period of six months, and to infer if the anti-tuberculous treatment is effective.
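
A sketch of the kind of analysis described, assuming synthetic interferon-gamma values rather than the study's clinical data, could use SciPy's paired tests:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# Hypothetical IGRA interferon-gamma levels (IU/mL) for 73 TB patients,
# at treatment start and after six months (not the study's actual data).
baseline = rng.lognormal(mean=1.5, sigma=0.6, size=73)
month6 = baseline * rng.uniform(0.4, 1.0, size=73)

# Paired t-test on the decline; Wilcoxon signed-rank as a distribution-free check.
t_stat, t_p = stats.ttest_rel(baseline, month6)
w_stat, w_p = stats.wilcoxon(baseline, month6)
print(f"paired t-test p={t_p:.4g}, Wilcoxon p={w_p:.4g}")
# A small p-value supports a significant decline in interferon-gamma levels,
# consistent with an effective anti-tuberculous treatment.
```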

Keywords: data analysis, interferon gamma release assay, statistical methods, tuberculosis infection

Procedia PDF Downloads 278
9538 Assessment of Modern RANS Models for the C3X Vane Film Cooling Prediction

Authors: Mikhail Gritskevich, Sebastian Hohenstein

Abstract:

The paper presents the results of a detailed assessment of several modern Reynolds Averaged Navier-Stokes (RANS) turbulence models for prediction of C3X vane film cooling at various injection regimes. Three models are considered, namely the Shear Stress Transport (SST) model, the modification of the SST model accounting for streamline curvature (SST-CC), and the Explicit Algebraic Reynolds Stress Model (EARSM). It is shown that all the considered models face a problem in predicting the adiabatic effectiveness in the vicinity of the cooling holes; however, accounting for the Reynolds stress anisotropy within the EARSM model noticeably increases the solution accuracy. On the other hand, further downstream all the models provide a reasonable agreement with the experimental data for the adiabatic effectiveness, and among the considered models the most accurate results are obtained with the use of EARSM.

Keywords: discrete holes film cooling, Reynolds Averaged Navier-Stokes (RANS), Reynolds stress tensor anisotropy, turbulent heat transfer

Procedia PDF Downloads 394
9537 State of the Art in Software Requirement Negotiation Process Models

Authors: Shamsu Abdullahi, Nazir Yusuf, Hazrina Sofian, Abubakar Zakari, Amina Nura, Salisu Suleiman

Abstract:

Requirements negotiation process models help in resolving the conflicting requirements of heterogeneous stakeholders in the software development industry. This is to achieve a shared vision of the software projects to be developed by the industry. Negotiating stakeholder agreements is a serious and difficult task in the software development process. Many requirements negotiation process models that effectively negotiate stakeholder agreements have been proposed by the research community. Other issues in the requirements negotiation research domain include stakeholder communication, decision-making, lack of negotiation interoperability, and managing requirement changes and analysis. This study highlights the current state of the art in existing software requirements negotiation process models. The study also describes the issues and limitations in software requirements negotiation process models.

Keywords: requirements, negotiation, stakeholders, agreements

Procedia PDF Downloads 157
9536 Machine Learning Techniques for Estimating Ground Motion Parameters

Authors: Farid Khosravikia, Patricia Clayton

Abstract:

The main objective of this study is to evaluate the advantages and disadvantages of various machine learning techniques in forecasting ground-motion intensity measures given source characteristics, source-to-site distance, and local site condition. Intensity measures such as peak ground acceleration and velocity (PGA and PGV, respectively), as well as 5% damped elastic pseudospectral accelerations at different periods (PSA), are indicators of the strength of shaking at the ground surface. Estimating these variables for future earthquake events is a key step in seismic hazard assessment and potentially subsequent risk assessment of different types of structures. Typically, linear regression-based models, with pre-defined equations and coefficients, are used in ground motion prediction. However, due to the restrictions of the linear regression methods, such models may not capture more complex nonlinear behaviors that exist in the data. Thus, this study comparatively investigates the potential benefits of employing other machine learning techniques, such as Artificial Neural Networks, Random Forests, and Support Vector Machines, as statistical methods in ground motion prediction. The algorithms are adjusted to quantify event-to-event and site-to-site variability of the ground motions by implementing them as random effects in the proposed models to reduce the aleatory uncertainty. All the algorithms are trained using a selected database of 4,528 ground motions, including 376 seismic events with magnitudes 3 to 5.8, recorded over the hypocentral distance range of 4 to 500 km in Oklahoma, Kansas, and Texas since 2005. The choice of this database stems from the recent increase in the seismicity rate of these states, attributed to petroleum production and wastewater disposal activities, which necessitates further investigation of the ground motion models developed for these states. Accuracy of the models in predicting intensity measures, generalization capability of the models for future data, as well as usability of the models are discussed in the evaluation process. The results indicate the algorithms satisfy some physically sound characteristics, such as magnitude scaling and distance dependency, without requiring pre-defined equations or coefficients. Moreover, it is shown that, when sufficient data is available, all the alternative algorithms tend to provide more accurate estimates compared to the conventional linear regression-based method, and particularly, Random Forest outperforms the other algorithms. However, the conventional method is a better tool when limited data is available.
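
As an illustration of the approach, the sketch below trains a Random Forest on a synthetic ground-motion set with the study's predictor types (magnitude, hypocentral distance, a site term); the data generator, hyperparameters, and the omission of the paper's random-effects treatment are all simplifications.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(3)
n = 4528
# Synthetic stand-ins for the study's predictors: magnitude, hypocentral
# distance (km), and a Vs30-like site term (the real database is not used here).
mag = rng.uniform(3.0, 5.8, n)
dist = rng.uniform(4.0, 500.0, n)
vs30 = rng.uniform(150.0, 900.0, n)
# GMPE-like synthetic target: ln(PGA) with magnitude scaling and distance decay.
ln_pga = 1.1 * mag - 1.6 * np.log(dist) - 0.3 * np.log(vs30 / 760) \
         + rng.normal(0, 0.5, n)

X = np.column_stack([mag, dist, vs30])
X_tr, X_te, y_tr, y_te = train_test_split(X, ln_pga, test_size=0.25,
                                          random_state=42)

rf = RandomForestRegressor(n_estimators=300, random_state=42)
rf.fit(X_tr, y_tr)
print("holdout MAE (ln PGA units):", mean_absolute_error(y_te, rf.predict(X_te)))
```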

Keywords: artificial neural network, ground-motion models, machine learning, random forest, support vector machine

Procedia PDF Downloads 92
9535 A Long Short-Term Memory Based Deep Learning Model for Corporate Bond Price Predictions

Authors: Vikrant Gupta, Amrit Goswami

Abstract:

The fixed income market forms the basis of the modern financial market. All other assets in financial markets derive their value from the bond market. Owing to its over-the-counter nature, the corporate bond market has relatively little data publicly available and is thus researched far less compared to equities. Bond price prediction is a complex financial time series forecasting problem and is considered very crucial in the domain of finance. Bond prices are highly volatile and full of noise, which makes it very difficult for traditional statistical time-series models to capture the complexity in series patterns, leading to inefficient forecasts. To overcome the inefficiencies of statistical models, various machine learning techniques were initially used in the literature for more accurate forecasting of time series. However, simple machine learning methods such as linear regression, support vector machines, and random forests fail to provide efficient results when tested on highly complex sequences such as stock prices and bond prices. Hence, to capture these intricate sequence patterns, various deep learning-based methodologies have been discussed in the literature. In this study, a recurrent neural network-based deep learning model using long short-term memory (LSTM) networks for prediction of corporate bond prices is discussed. LSTMs have been widely used in the literature for various sequence learning tasks in domains such as machine translation, speech recognition, etc. In recent years, various studies have discussed the effectiveness of LSTMs in forecasting complex time-series sequences and have shown promising results when compared to other methodologies. LSTMs are a special kind of recurrent neural network capable of learning long-term dependencies due to their memory function, which traditional neural networks fail to capture. In this study, a simple LSTM, a stacked LSTM, and a masked LSTM based model are discussed with respect to varying input sequences (three days, seven days and 14 days). In order to facilitate faster learning and to gradually decompose the complexity of the bond price sequence, an Empirical Mode Decomposition (EMD) has been used, which has resulted in an accuracy improvement over the standalone LSTM model. With a variety of technical indicators and EMD-decomposed time series, the masked LSTM outperformed the other two counterparts in terms of prediction accuracy. To benchmark the proposed model, the results have been compared with traditional time series models (ARIMA), shallow neural networks and the three different LSTM models discussed above. In summary, our results show that the use of LSTM models provides more accurate results and should be explored more within the asset management industry.
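
A minimal stacked-LSTM sketch in Keras is shown below, assuming a synthetic price series and a 7-day input window; the paper's EMD preprocessing, technical indicators, and masked variant are omitted.

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

def make_windows(prices, lookback=7):
    """Turn a 1D price series into (samples, lookback, 1) inputs and next-day targets."""
    X = np.array([prices[i:i + lookback] for i in range(len(prices) - lookback)])
    return X[..., None], prices[lookback:]

rng = np.random.default_rng(5)
prices = np.cumsum(rng.normal(0, 0.1, 2000)) + 100.0  # synthetic bond-price series

X, y = make_windows(prices, lookback=7)               # 7-day input sequences
split = int(0.8 * len(X))

model = Sequential([
    LSTM(32, return_sequences=True, input_shape=(7, 1)),  # stacked LSTM
    LSTM(16),
    Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X[:split], y[:split], epochs=5, batch_size=64, verbose=0)
print("test MSE:", model.evaluate(X[split:], y[split:], verbose=0))
```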

Keywords: bond prices, long short-term memory, time series forecasting, empirical mode decomposition

Procedia PDF Downloads 105
9534 Harmonising the Circular Economy: An Analysis of 160 Papers

Authors: M. Novak, J. Dufourmount, D. Wildi, A. Sutherland, L. Sosa, J. Zimmer, E. Szabo

Abstract:

The circular economy has grounded itself amongst scholars and practitioners operating across governments and enterprises. The aim of this paper is to augment the circular economy concept by identifying common core and enabling circular business models. To this aim, we have analysed over 150 papers regarding circular activities and identified 8 clusters of business models and enablers. We have mapped and harmonised the most prominent frameworks conceptualising the circular economy. Our findings indicate that circular economy core business models include regenerative in addition to reduce, reuse and recycle activities. We further find enabling activities in design, digital technologies, knowledge development and sharing, multistakeholder collaborations, and extended corporate responsibility initiatives in various forms. We critically contrast the application of these business models across the European and African contexts. Overall, we find that seemingly varied circular economy definitions distill the same conceptual business models. We hope to contribute towards the coherence of the circular economy concept, and the continuous development of practical guidance to select and implement circular strategies.

Keywords: circular economy, content analysis, business models, definitions, enablers, frameworks

Procedia PDF Downloads 185
9533 Non-Linear Assessment of Chromatographic Lipophilicity and Model Ranking of Newly Synthesized Steroid Derivatives

Authors: Milica Karadzic, Lidija Jevric, Sanja Podunavac-Kuzmanovic, Strahinja Kovacevic, Anamarija Mandic, Katarina Penov Gasi, Marija Sakac, Aleksandar Okljesa, Andrea Nikolic

Abstract:

The present paper deals with chromatographic lipophilicity prediction of newly synthesized steroid derivatives. The prediction was achieved using in silico generated molecular descriptors and quantitative structure-retention relationship (QSRR) methodology with the artificial neural networks (ANN) approach. Chromatographic lipophilicity of the investigated compounds was expressed as the retention factor value log k. For QSRR modeling, a feedforward back-propagation ANN with a gradient descent learning algorithm was applied. The generated ANN models were ranked using the novel sum of ranking differences (SRD) method. The aim was to identify the most consistent QSRR model and to detect similarities or dissimilarities between the models. In this study, SRD was performed with average values of the retention factor log k as reference values. An excellent correlation between the experimentally observed retention factor log k and the values predicted by the ANN was obtained, with a correlation coefficient higher than 0.9890. Statistical results show that the established ANN models can be applied for the required purpose. This article is based upon work from COST Action (TD1305), supported by COST (European Cooperation in Science and Technology).
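
A sketch of the QSRR workflow, a feedforward network trained by stochastic gradient descent backpropagation on hypothetical descriptors and log k values, might look as follows with scikit-learn:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(11)
# Hypothetical in silico molecular descriptors for ~30 steroid derivatives
# and their measured retention factors log k (not the study's actual values).
X = rng.normal(size=(30, 6))
log_k = X @ np.array([0.5, -0.3, 0.2, 0.1, -0.4, 0.25]) + rng.normal(0, 0.05, 30)

# Feedforward network trained by (stochastic) gradient descent backpropagation.
ann = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(8,), solver="sgd", learning_rate_init=0.01,
                 max_iter=5000, random_state=42),
)
scores = cross_val_score(ann, X, log_k, cv=5, scoring="r2")
print("cross-validated R²:", scores.round(3))
```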

Keywords: artificial neural networks, liquid chromatography, molecular descriptors, steroids, sum of ranking differences

Procedia PDF Downloads 289
9532 Empirical Modeling of Air Dried Rubberwood Drying System

Authors: S. Khamtree, T. Ratanawilai, C. Nuntadusit

Abstract:

Rubberwood is a crucial commercial timber in Southern Thailand. All processes in rubberwood production depend on the knowledge and expertise of the technicians, especially the drying process. This research aims to develop an empirical model for the drying kinetics of rubberwood. During the experiment, the temperature of the hot air and the average air flow velocity were kept at 80-100 °C and 1.75 m/s, respectively. The drying target was a sample moisture content of less than 12% on a drying basis. The drying kinetics were simulated using an empirical solver. The experimental results illustrated that the moisture content was reduced as the drying temperature and time were increased. The agreement of the moisture ratio between the empirical model and the experiment was tested with three statistical parameters, R-square (R²), Root Mean Square Error (RMSE) and Chi-square (χ²), to assess the accuracy of the parameters. The experimental moisture ratio had a good fit with the empirical model. Additionally, the results indicated that the drying of rubberwood using the Henderson and Pabis model showed a suitable level of agreement. The model presented an excellent estimation (R² = 0.9963) for the moisture movement compared to the other models. Therefore, the empirical results were valid and can be implemented in future experiments.
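
The Henderson and Pabis model mentioned above has the form MR = a·exp(-kt); the sketch below fits it to invented drying data with SciPy and reports the same three statistics (R², RMSE, χ²):

```python
import numpy as np
from scipy.optimize import curve_fit

def henderson_pabis(t, a, k):
    """Henderson and Pabis thin-layer drying model: MR = a * exp(-k t)."""
    return a * np.exp(-k * t)

# Hypothetical drying data: time (h) and measured moisture ratio MR.
t = np.array([0, 1, 2, 4, 6, 9, 12, 16, 20], dtype=float)
mr = np.array([1.00, 0.82, 0.68, 0.47, 0.33, 0.20, 0.12, 0.06, 0.03])

(a, k), _ = curve_fit(henderson_pabis, t, mr, p0=(1.0, 0.1))
pred = henderson_pabis(t, a, k)

ss_res = ((mr - pred) ** 2).sum()
ss_tot = ((mr - mr.mean()) ** 2).sum()
r2 = 1 - ss_res / ss_tot
rmse = np.sqrt(((mr - pred) ** 2).mean())
chi2 = ss_res / (len(t) - 2)            # reduced chi-square, 2 fitted parameters
print(f"a={a:.4f}, k={k:.4f} 1/h, R²={r2:.4f}, RMSE={rmse:.4f}, χ²={chi2:.6f}")
```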

Keywords: empirical models, rubberwood, moisture ratio, hot air drying

Procedia PDF Downloads 242
9531 Linking Business Process Models and System Models Based on Business Process Modelling

Authors: Faisal A. Aburub

Abstract:

Organizations today need to invest in software in order to run their businesses, and, to achieve the organizations' objectives, the software should be in line with the business processes. This research presents an approach for linking process models and system models. In particular, the new approach aims to synthesize sequence diagrams based on the role activity diagram (RAD) model. The approach includes four steps, namely: create the business process model using RAD, identify computerized activities, identify entities in the sequence diagram, and identify messages in the sequence diagram. The new approach has been validated using the process of student registration at the University of Petra as a case study. Further research is required to validate the new approach using different domains.

Keywords: business process modelling, system models, role activity diagrams, sequence diagrams

Procedia PDF Downloads 351
9530 Capability of Available Seismic Soil Liquefaction Potential Assessment Models Based on Shear-Wave Velocity Using the Bachu Case History

Authors: Nima Pirhadi, Yong Bo Shao, Xusheng Wa, Jianguo Lu

Abstract:

Several models based on the simplified method introduced by Seed and Idriss (1971) have been developed to assess the liquefaction potential of saturated sandy soils. The procedure includes determining the cyclic resistance of the soil as the cyclic resistance ratio (CRR) and comparing it with earthquake loads as the cyclic stress ratio (CSR). Of all the methods to determine CRR, methods using shear-wave velocity (Vs) are common because of their low sensitivity to the penetration resistance reduction caused by fines content (FC). To evaluate the capability of the Vs-based models, new data from the Bachu-Jiashi earthquake case history were collected; the predictions of the models are then compared to the measured results, and consequently the accuracy of the models is discussed via three criteria and graphs. The evaluation demonstrates reasonable accuracy of the models in the Bachu region.
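
For readers unfamiliar with the Vs-based procedure, the sketch below computes CSR with the Seed-Idriss simplified expression and CRR from overburden-corrected Vs using commonly cited Andrus-Stokoe-style coefficients; the coefficients, layer values, and limiting velocity are illustrative assumptions, not the paper's calibration.

```python
import numpy as np

def csr(a_max_g, sigma_v, sigma_v_eff, depth_m):
    """Seed-Idriss simplified cyclic stress ratio."""
    rd = 1.0 - 0.00765 * depth_m      # common approximation for z <= 9.15 m
    return 0.65 * a_max_g * (sigma_v / sigma_v_eff) * rd

def crr_vs(vs, sigma_v_eff, pa=100.0, vs1_star=215.0):
    """Andrus-Stokoe-style CRR from overburden-corrected shear-wave velocity.
    vs1_star ~ 215 m/s for clean sands; coefficients are the commonly cited
    ones, used here for illustration only."""
    vs1 = vs * (pa / sigma_v_eff) ** 0.25
    return 0.022 * (vs1 / 100.0) ** 2 + 2.8 * (1.0 / (vs1_star - vs1) - 1.0 / vs1_star)

# Hypothetical layer: Vs = 160 m/s at 6 m depth, amax = 0.3 g, stresses in kPa.
sigma_v, sigma_v_eff = 110.0, 70.0
ratio_csr = csr(0.30, sigma_v, sigma_v_eff, 6.0)
ratio_crr = crr_vs(160.0, sigma_v_eff)
print(f"CSR={ratio_csr:.3f}, CRR={ratio_crr:.3f}, FS={ratio_crr/ratio_csr:.2f}")
# FS < 1 flags the layer as potentially liquefiable.
```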

Keywords: seismic liquefaction, Bachu-Jiashi earthquake, shear-wave velocity, liquefaction potential evaluation

Procedia PDF Downloads 200
9529 Compromising Relevance for Elegance: A Danger of Dominant Growth Models for Backward Economies

Authors: Givi Kupatadze

Abstract:

Backward economies face the challenge of achieving sustainably high economic growth rates. Dominant growth models represent a roadmap in framing economic development strategy. This paper examines the relevance of the dominant growth models for backward economies. The Cobb-Douglas production function, the Harrod-Domar model of economic growth, the Solow growth model, and the general formula of gross domestic product are examined to undertake a comprehensive study of the dominant growth models. The deductive research method makes it possible to uncover major weaknesses of the dominant growth models and to come up with practical implications for economic development strategy. The key finding of the paper shows, contrary to what used to be taught by textbooks of economics, that the constant returns to scale property of the dominant growth models is a mere coincidence, and its generalization over space and time can be regarded as one of the most unfortunate mistakes in the whole field of political economy. The major suggestion of the paper for backward economies is that understanding and considering a taxonomy of economic activities based on increasing and diminishing returns to scale represents a cornerstone of a successful economic development strategy.
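
For reference, the constant returns to scale property at issue can be stated for the Cobb-Douglas case as follows (illustrative notation, not the paper's own derivation):

```latex
% Cobb-Douglas production function and the returns-to-scale restriction.
\[
  Y = A K^{\alpha} L^{\beta}, \qquad
  Y(\lambda K, \lambda L) = A (\lambda K)^{\alpha} (\lambda L)^{\beta}
                          = \lambda^{\alpha+\beta}\, Y(K, L).
\]
% alpha + beta = 1 gives constant returns to scale; alpha + beta > 1 (< 1)
% gives increasing (diminishing) returns, the distinction on which the paper's
% taxonomy of economic activities rests.
```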

Keywords: backward economies, constant returns to scale, dominant growth models, taxonomy of economic activities

Procedia PDF Downloads 345
9528 Single Imputation for Audiograms

Authors: Sarah Beaver, Renee Bryce

Abstract:

Audiograms detect hearing impairment, but missing values pose problems. This work explores imputations in an attempt to improve accuracy. This work implements Linear Regression, Lasso, Linear Support Vector Regression, Bayesian Ridge, K Nearest Neighbors (KNN), and Random Forest machine learning techniques to impute audiogram frequencies ranging from 125 Hz to 8000 Hz. The data contains patients who had or were candidates for cochlear implants. Accuracy is compared across two different Nested Cross-Validation k values. Over 4000 audiograms were used from 800 unique patients. Additionally, training compares models built on combined left- and right-ear audiograms versus single-ear audiograms. The RMSE values for the best Random Forest models range from 4.74 to 6.37, with R² values from .91 to .96. The RMSE values for the best KNN models range from 5.00 to 7.72, with R² values from .89 to .95. The best imputation models received R² between .89 and .96 and RMSE values less than 8 dB. We also show that the accuracy of classification predictive models improved by two percent with our best imputation models versus constant imputation.
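
A sketch of one of the listed techniques, KNN imputation on synthetic audiograms with randomly removed thresholds, is shown below; the data, missingness rate, and neighbor count are invented.

```python
import numpy as np
from sklearn.impute import KNNImputer

rng = np.random.default_rng(9)
freqs = [125, 250, 500, 1000, 2000, 4000, 8000]
# Synthetic audiograms (hearing thresholds in dB HL), not the study's data.
audiograms = np.clip(rng.normal(55, 20, size=(800, len(freqs))), 0, 120)

# Knock out ~15% of entries to mimic untested frequencies.
mask = rng.random(audiograms.shape) < 0.15
observed = audiograms.copy()
observed[mask] = np.nan

imputer = KNNImputer(n_neighbors=5)      # one of the study's six techniques
imputed = imputer.fit_transform(observed)

rmse = np.sqrt(((imputed[mask] - audiograms[mask]) ** 2).mean())
print(f"imputation RMSE: {rmse:.2f} dB")
```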

Keywords: machine learning, audiograms, data imputations, single imputations

Procedia PDF Downloads 53
9527 On Bianchi Type Cosmological Models in Lyra’s Geometry

Authors: R. K. Dubey

Abstract:

Bianchi type cosmological models have been studied on the basis of Lyra's geometry. Exact solutions have been obtained by considering a time-dependent displacement field for a constant deceleration parameter and a varying cosmological term of the universe. The physical behavior of the different models has been examined for different cases.

Keywords: Bianchi type-I cosmological model, variable gravitational coupling, cosmological constant term, Lyra's model

Procedia PDF Downloads 327
9526 Statistical Model to Examine the Impact of the Inflation Rate and Real Interest Rate on the Bahrain Economy

Authors: Ghada Abo-Zaid

Abstract:

Introduction: Oil is one of the main income sources in Bahrain. Low oil prices influence economic growth and the investment rate in Bahrain. For example, economic growth was 3.7% in 2012, and it fell to 2.9% in 2015. The investment rate was 9.8% in 2012, and it fell to 5.9% and -12.1% in 2014 and 2015, respectively. The inflation rate peaked at 3.3% in 2013. Objectives: The objectives here are to build statistical models to examine the effect of the inflation rate and the real interest rate on economic growth in Bahrain from 2000 to 2018. Methods: This study is based on 18 years of data, and a multiple regression model is used for the analysis. All of the missing data are omitted from the analysis. Results: A regression model is used to examine the association between gross national product (GNP), the inflation rate, and the real interest rate. We found that (i) an increase in the real interest rate decreases GNP, and (ii) an increase in the inflation rate has no effect on economic growth in Bahrain, since the average inflation rate was almost 2%, which is considered low. Conclusion: The real interest rate has a significant impact on GNP in Bahrain, while the inflation rate does not show any negative influence on GNP, as the inflation rate was not large enough to negatively affect the economic growth rate in Bahrain.
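
A sketch of the stated method, an OLS regression of GNP growth on the inflation rate and the real interest rate using statsmodels, is given below with invented annual series for 2000-2018:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
# Illustrative annual series for 2000-2018 (not the study's actual figures).
years = np.arange(2000, 2019)
inflation = rng.normal(2.0, 1.0, len(years))          # ~2% on average
real_rate = rng.normal(3.0, 1.5, len(years))
gnp_growth = 4.0 - 0.5 * real_rate + 0.1 * inflation + rng.normal(0, 0.8, len(years))

df = pd.DataFrame({"gnp": gnp_growth, "inflation": inflation,
                   "real_rate": real_rate})
model = smf.ols("gnp ~ inflation + real_rate", data=df).fit()
print(model.summary().tables[1])         # coefficients, t-stats, p-values
```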

Keywords: gross national product, Bahrain, regression model, interest rate

Procedia PDF Downloads 126
9525 Influence of Optimization Method on Parameters Identification of Hyperelastic Models

Authors: Bale Baidi Blaise, Gilles Marckmann, Liman Kaoye, Talaka Dya, Moustapha Bachirou, Gambo Betchewe, Tibi Beda

Abstract:

This work highlights the capabilities of the particle swarm optimization (PSO) method to identify the parameters of hyperelastic models. The study compares this method with the Genetic Algorithm (GA) method, the Least Squares (LS) method, the Pattern Search Algorithm (PSA) method, the Beda-Chevalier (BC) method and the Levenberg-Marquardt (LM) method. Four classic hyperelastic models are used to test the different methods through parameter identification. Then, the study compares the ability of these models to reproduce the experimental Treloar data in simple tension, biaxial tension and pure shear.
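
A minimal PSO sketch is given below; it identifies the single parameter of an incompressible neo-Hookean law P = μ(λ − λ⁻²) from synthetic uniaxial data. The material law, data, and PSO settings are illustrative assumptions, not the paper's four models or Treloar's measurements.

```python
import numpy as np

# Synthetic uniaxial data from an incompressible neo-Hookean law,
# standing in for Treloar's data; PSO then recovers the parameter mu.
rng = np.random.default_rng(2)
lam = np.linspace(1.1, 4.0, 25)
mu_true = 0.4                                    # MPa
stress = mu_true * (lam - lam**-2) + rng.normal(0, 0.01, lam.size)

def cost(mu):
    return ((mu * (lam - lam**-2) - stress) ** 2).sum()

# Minimal PSO: positions x, velocities v, personal bests p, global best g.
n_p, iters = 30, 200
x = rng.uniform(0.01, 2.0, n_p); v = np.zeros(n_p)
p = x.copy(); p_cost = np.array([cost(xi) for xi in x])
g = p[p_cost.argmin()]
w_in, c1, c2 = 0.7, 1.5, 1.5                     # inertia and acceleration weights

for _ in range(iters):
    r1, r2 = rng.random(n_p), rng.random(n_p)
    v = w_in * v + c1 * r1 * (p - x) + c2 * r2 * (g - x)
    x = x + v
    c = np.array([cost(xi) for xi in x])
    better = c < p_cost
    p[better], p_cost[better] = x[better], c[better]
    g = p[p_cost.argmin()]

print(f"identified mu = {g:.4f} MPa (true {mu_true})")
```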

Keywords: particle swarm optimization, identification, hyperelastic model

Procedia PDF Downloads 139
9524 Bridging the Data Gap for Sexism Detection in Twitter: A Semi-Supervised Approach

Authors: Adeep Hande, Shubham Agarwal

Abstract:

This paper presents a study on identifying sexism in online texts using various state-of-the-art deep learning models based on BERT. We experimented with different feature sets and model architectures and evaluated their performance using precision, recall, F1 score, and accuracy metrics. We also explored the use of the pseudo-labeling technique to improve model performance. Our experiments show that the best-performing models were based on BERT, and the multilingual model achieved an F1 score of 0.83. Furthermore, the use of pseudo-labeling significantly improved the performance of the BERT-based models. Our findings suggest that BERT-based models with pseudo-labeling hold great promise for identifying sexism in online texts with high accuracy.
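
The pseudo-labeling idea can be sketched independently of BERT; below, a TF-IDF logistic regression (a deliberate stand-in for the paper's BERT models) adopts its own confident predictions on unlabeled texts as training labels. All texts and the confidence threshold are invented.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Tiny labeled seed and a larger unlabeled pool (all texts are made up).
labeled = ["women can't drive", "great talk by the professor"]
labels = np.array([1, 0])                       # 1 = sexist, 0 = not
unlabeled = ["she only got the job because she's a woman",
             "the lecture covered transformers",
             "girls are bad at math", "nice weather today"]

vec = TfidfVectorizer()
X_all = vec.fit_transform(labeled + unlabeled)
X_lab, X_unlab = X_all[: len(labeled)], X_all[len(labeled):]

clf = LogisticRegression().fit(X_lab, labels)

# Pseudo-labeling: adopt confident predictions on unlabeled data as labels,
# then retrain on the enlarged set (the same idea the abstract applies to BERT).
proba = clf.predict_proba(X_unlab)
confident = proba.max(axis=1) > 0.55            # low bar for this toy example
X_aug = np.vstack([X_lab.toarray(), X_unlab[confident].toarray()])
y_aug = np.concatenate([labels, proba[confident].argmax(axis=1)])
clf = LogisticRegression().fit(X_aug, y_aug)
print("pseudo-labels adopted:", int(confident.sum()))
```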

Keywords: large language models, semi-supervised learning, sexism detection, data sparsity

Procedia PDF Downloads 37
9523 Models of Innovation Processes and Their Evolution: A Literature Review

Authors: Maier Dorin, Maier Andreea

Abstract:

Today, any organization - regardless of its specific activity - must be prepared to face continuous radical changes, innovation thus becoming a condition of survival in a globalized market. Not all managers have an overall view of the real size of the necessary innovation potential. Unfortunately, there is still no common (and correct) understanding of the term innovation among managers. Moreover, not all managers are aware of the need for innovation. This article highlights and analyzes a series of models of innovation processes and their evolution. The models analyzed encompass both the strategic level and the operational one within an organization, indicating innovation performance at each level. As the literature review shows, there are no easy answers to the innovation process, just as there are no shortcuts to great results. Successful companies do not have an innovative silver bullet - they do not get results by making one or a few things better than others; they make everything better.

Keywords: innovation, innovation process, business success, models of innovation

Procedia PDF Downloads 367
9522 Towards Efficient Reasoning about Families of Class Diagrams Using Union Models

Authors: Tejush Badal, Sanaa Alwidian

Abstract:

Class diagrams are useful tools within the Unified Modelling Language (UML) to model and visualize the relationships between, and properties of, objects within a system. As a system evolves over time and space (e.g., products), a series of models with several commonalities and variabilities creates what is known as a model family. In circumstances where there are several versions of a model, examining each model individually becomes expensive in terms of computation resources. To avoid performing redundant operations, this paper proposes an approach for representing a family of class diagrams as a union model, a single generic model. The paper aims to analyze and reason about a family of class diagrams using union models, as opposed to individual analysis of each member model in the family. The union algorithm provides a holistic view of the model family that cannot otherwise be obtained from an individual analysis approach; this, in turn, enhances the analysis performed by speeding up the time needed to analyze a family of models together, as opposed to analyzing individual models one at a time.
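
A minimal sketch of the union idea, with invented element names, represents each class diagram as a set of elements and annotates every element of the union with the versions it appears in:

```python
from collections import defaultdict

# Each class diagram is a set of elements (classes and relations); names are
# invented for illustration and are not from the paper.
v1 = {("class", "Order"), ("class", "Customer"), ("assoc", "Order-Customer")}
v2 = {("class", "Order"), ("class", "Customer"), ("class", "Invoice"),
      ("assoc", "Order-Invoice")}

def union_model(family):
    """Union of all elements, annotated with the family members they occur in."""
    annotated = defaultdict(set)
    for version, diagram in family.items():
        for element in diagram:
            annotated[element].add(version)
    return annotated

um = union_model({"v1": v1, "v2": v2})
# One pass over the union answers "which elements are common to all versions?"
# without re-analyzing each diagram separately.
common = [e for e, vs in um.items() if vs == {"v1", "v2"}]
print(sorted(common))
```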

Keywords: analysis, class diagram, model family, unified modeling language, union model

Procedia PDF Downloads 46
9521 Applying Business Model Patterns: A Case Study in Latin American Building Industry

Authors: James Alberto Ortega Morales, Nelson Andrés Martínez Marín

Abstract:

The building industry is one of the most important sectors all around the world in terms of contribution to indices like GDP and labor. On the other hand, it is a major contributor to greenhouse gas (GHG) emissions and waste generation, contributing to global warming. In this sense, it is necessary to establish sustainable practices, from the strategic point of view to the operations point of view, in all businesses and industries. Business models do not escape this reality, given their mediating role between strategy and operations. Business models can shift from traditional practices seeking economic benefits to sustainable business models that generate both economic value and value for society and the environment. Recent advances in the analysis of sustainable business models provide different classifications that allow finding potential triple bottom line (economic, social and environmental) solutions applicable in every business sector. Among the mentioned advances, 11 groups and 45 patterns of sustainable business models have been identified; such patterns can be found either in the business models as a whole or concurrently in their components. This article presents the analysis of a case study, seeking to identify the components and elements that are part of it, using the ECO CANVAS conceptual model. The case study shows the concurrent existence of different patterns of business models for sustainability empirically, serving as an example and inspiration for other Latin American companies interested in integrating sustainability into their new and existing business models.

Keywords: sustainable business models, business sustainability, business model patterns, case study, construction industry

Procedia PDF Downloads 81
9520 Regional Flood-Duration-Frequency Models for Norway

Authors: Danielle M. Barna, Kolbjørn Engeland, Thordis Thorarinsdottir, Chong-Yu Xu

Abstract:

Design flood values give estimates of flood magnitude within a given return period and are essential to making adaptive decisions around land use planning, infrastructure design, and disaster mitigation. Often, design flood values are needed at locations with insufficient data. Additionally, in hydrologic applications where flood retention is important (e.g., floodplain management and reservoir design), design flood values are required at different flood durations. A statistical approach to this problem is the development of a regression model for extremes where some of the parameters are dependent on flood duration in addition to being covariate-dependent. In hydrology, this is called a regional flood-duration-frequency (regional QDF) model. Typically, the underlying statistical distribution is chosen to be the Generalized Extreme Value (GEV) distribution. However, as the support of the GEV distribution depends on both its parameters and the range of the data, special care must be taken in the development of the regional model. In particular, we find that the GEV is problematic when developing a GAMLSS-type analysis due to the difficulty of proposing a link function that is independent of the unknown parameters and the observed data. We discuss these challenges in the context of developing a regional QDF model for Norway.
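
For reference, a schematic of the duration-dependent GEV at issue (the paper's exact parameterization may differ):

```latex
% GEV with duration-dependent parameters, as used in QDF-type models.
\[
  F(x \mid d) = \exp\!\left\{-\left[1+\xi\,\frac{x-\mu(d)}{\sigma(d)}\right]^{-1/\xi}\right\},
  \qquad 1+\xi\,\frac{x-\mu(d)}{\sigma(d)} > 0.
\]
% The location mu(d) and scale sigma(d) vary with flood duration d (and with
% catchment covariates in the regional setting); the support constraint on the
% right is what complicates proposing a GAMLSS link function that is free of
% the unknown parameters and the observed data.
```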

Keywords: design flood values, Bayesian statistics, regression modeling of extremes, extreme value analysis, GEV

Procedia PDF Downloads 44
9519 Volatility Model with Markov Regime Switching to Forecast Baht/USD

Authors: Nop Sopipan

Abstract:

In this paper, we forecast the volatility of the Baht/USD exchange rate using Markov Regime Switching GARCH (MRS-GARCH) models. These models allow volatility to have different dynamics according to unobserved regime variables. The main purpose of this paper is to find out whether MRS-GARCH models are an improvement on GARCH-type models in terms of modeling and forecasting Baht/USD volatility. The MRS-GARCH model performs best for Baht/USD volatility in the short term, but the GARCH model performs best in the long term.
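
A single-regime GARCH(1,1) benchmark of the kind compared against can be sketched with the arch package; note that arch does not provide MRS-GARCH out of the box, and the return series here is synthetic.

```python
import numpy as np
from arch import arch_model

rng = np.random.default_rng(6)
# Synthetic daily Baht/USD log returns in percent (stand-in for the real series).
returns = rng.standard_t(df=8, size=1500) * 0.4

# Single-regime GARCH(1,1) benchmark; the paper's MRS-GARCH adds a hidden
# Markov regime variable on top of this specification.
am = arch_model(returns, vol="GARCH", p=1, q=1, dist="t")
res = am.fit(disp="off")
print(res.params)
print(res.forecast(horizon=5).variance.iloc[-1])  # 5-day-ahead variance forecast
```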

Keywords: volatility, Markov Regime Switching, forecasting, Baht/USD

Procedia PDF Downloads 266
9518 Variable Selection in a Data Envelopment Analysis Model by Multiple Proportions Comparison

Authors: Jirawan Jitthavech, Vichit Lorchirachoonkul

Abstract:

A statistical procedure using a multiple comparisons test for proportions is proposed for variable selection in a data envelopment analysis (DEA) model. The test statistic in the multiple comparisons is the proportion of efficient decision making units (DMUs) in a DEA model. Three methods of multiple comparisons testing for proportions are used in the proposed statistical procedure of iteratively eliminating the variables in a backward manner: multiple Z tests with Bonferroni correction, multiple tests in a 2×c crosstabulation, and the Marascuilo procedure. Two simulation populations of moderately and lowly correlated variables are used to compare the results of the statistical procedure using the three methods of multiple comparisons testing with the hypothesis testing of the efficiency contribution measure. From the simulation results, it can be concluded that the proposed statistical procedure using multiple Z tests for proportions with Bonferroni correction clearly outperforms the procedures using the remaining two methods of multiple comparisons and the hypothesis testing of the efficiency contribution measure.
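
A sketch of the first method, multiple Z tests for proportions with Bonferroni correction, is shown below with statsmodels; the DMU counts are invented.

```python
from statsmodels.stats.proportion import proportions_ztest

# Proportions of efficient DMUs in DEA models with and without a candidate
# variable (counts are invented for illustration).
n_dmu = 120
efficient_full = 34            # efficient DMUs with all variables included
efficient_drop = [31, 33, 18]  # after dropping variable 1, 2, 3 respectively

alpha, m = 0.05, len(efficient_drop)
bonferroni_alpha = alpha / m   # Bonferroni correction for m comparisons

for i, eff in enumerate(efficient_drop, start=1):
    stat, p = proportions_ztest([efficient_full, eff], [n_dmu, n_dmu])
    keep = p < bonferroni_alpha  # significant change => the variable matters
    print(f"drop x{i}: z={stat:.2f}, p={p:.4f}, retain variable: {keep}")
```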

Keywords: Bonferroni correction, efficient DMUs, Marascuilo procedure, Pastor et al. method, 2×c crosstabulation

Procedia PDF Downloads 279
9517 Day of the Week Patterns and the Financial Trends' Role: Evidence from the Greek Stock Market during the Euro Era

Authors: Nikolaos Konstantopoulos, Aristeidis Samitas, Vasileiou Evangelos

Abstract:

The purpose of this study is to examine whether financial trends influence not only stock markets' returns but also their anomalies. We choose to study the day of the week effect (DOW) for the Greek stock market during the Euro period (2002-12), because during this specific period there were no significant structural changes and there were long-term financial trends. Moreover, in order to avoid possible methodological counterarguments that usually arise in the literature, we apply several linear (OLS) and nonlinear (GARCH family) models to our sample until we reach the conclusion that the TGARCH model fits our sample better than any other. Our results suggest that in the Greek stock market there is a long-term predisposition for positive/negative returns depending on the weekday. However, the statistical significance is influenced by the financial trend. This influence may be the reason why there are conflicting findings in the literature over time. Finally, we combine the DOW's empirical findings from 1985-2012, and we may assume that in the Greek case there is a tendency for a long-lived day-of-the-week effect.
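
One way to sketch the modeling setup, day-of-the-week dummies in the mean equation with a threshold-type (GJR) GARCH variance via the arch package, is shown below on synthetic returns; the exact TGARCH specification in the study may differ.

```python
import numpy as np
import pandas as pd
from arch import arch_model

rng = np.random.default_rng(8)
# Synthetic daily index returns with a weekday seasonal in the mean
# (a stand-in for Athens Stock Exchange data, 2002-2012).
idx = pd.bdate_range("2002-01-01", "2012-12-31")
dow = idx.dayofweek.values
returns = rng.normal(0.02 * (dow == 4) - 0.03 * (dow == 0), 1.0, len(idx))

# Monday..Thursday dummies in the mean equation (Friday as the base case),
# with a GJR/threshold GARCH(1,1,1) variance, close to the abstract's TGARCH.
x = pd.DataFrame({f"d{d}": (dow == d).astype(float) for d in range(4)}, index=idx)
am = arch_model(pd.Series(returns, index=idx), x=x, mean="LS", vol="GARCH",
                p=1, o=1, q=1)
res = am.fit(disp="off")
print(res.params.filter(like="d"))   # day-of-the-week coefficients
```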

Keywords: day of the week effect, GARCH family models, Athens stock exchange, economic growth, crisis

Procedia PDF Downloads 382
9516 A Little-Known Side: Florence Nightingale as a Passionate Statistician

Authors: Gülcan Taşkıran, Ayla Bayık Temel

Abstract:

Background: Florence Nightingale, the founder of modern nursing, is most famous for her role as a nurse. But much less is known about her contributions as a mathematician and statistician. Aim: This conceptual article aims to examine Florence Nightingale's statistics education, how she used her passion for statistics and applied statistical data in nursing care, and her scientific contributions to statistical science. Design: The literature review method was used in the study. The databases of the Istanbul University Library Search Engine, Turkish Medical Directory, Thesis Scanning Center of the Higher Education Council, PubMed, Google Scholar, EBSCO Host, and Web of Science were scanned to reach the studies. The keywords 'statistics' and 'Florence Nightingale' were used in Turkish and English during screening. As a result of the screening, a total of 41 studies from the national and international literature were examined. Results: Florence Nightingale was interested in mathematics and statistics from an early age and received various training in these subjects. The lessons learned by Nightingale in a cultured family environment, her talent in mathematics and numbers, and her religious beliefs played a crucial role in directing her toward statistics. She was influenced by Quetelet's ideas in the formation of her statistical philosophy and received support from William Farr in her statistical studies. During the Crimean War, she applied statistical knowledge to nursing care and developed many statistical methods and graphics, thereby making revolutionary reforms in the health field. Conclusions: Nightingale's interest in statistics, her broad vision, her statistical ideas fused with religious beliefs, the innovative graphics she developed, and the extraordinary statistical projects she carried out underpinned her professional achievements. Florence Nightingale has also become a model for women in statistics. Today, the use and teaching of statistics and research in nursing care practice and education programs continue in the light she provided.

Keywords: Crimean war, Florence Nightingale, nursing, statistics

Procedia PDF Downloads 272
9515 Process Capability Analysis by Using Statistical Process Control of Rice Polished Cylinder Turning Practice

Authors: S. Bangphan, P. Bangphan, T.Boonkang

Abstract:

Quality control helps industries improve product quality and productivity. Statistical Process Control (SPC) is one of the tools used to control product quality; here, it is applied to bring a turning process in a department of industrial engineering under control. In this research, process control was studied for parts turned on workshop machines. Varying measurements were recorded for a number of samples of a rice polished cylinder obtained from a number of trials with the turning practice. The SPC technique has been adopted, the process is finally brought under control, and the process capability is improved.
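
A sketch of the X-bar/R control-chart and capability calculations behind SPC is given below, using the standard constants for subgroups of five; the measurements and specification limits are invented.

```python
import numpy as np

rng = np.random.default_rng(10)
# 25 subgroups of 5 diameter measurements (mm) from a turning process
# (synthetic values; the study's own measurements are not reproduced here).
data = rng.normal(40.0, 0.02, size=(25, 5))

xbar = data.mean(axis=1)
rranges = data.max(axis=1) - data.min(axis=1)
xbar_bar, r_bar = xbar.mean(), rranges.mean()

A2, D3, D4 = 0.577, 0.0, 2.114      # control-chart constants for subgroup size 5
ucl_x, lcl_x = xbar_bar + A2 * r_bar, xbar_bar - A2 * r_bar
ucl_r, lcl_r = D4 * r_bar, D3 * r_bar
print(f"X-bar chart: LCL={lcl_x:.4f}, CL={xbar_bar:.4f}, UCL={ucl_x:.4f}")
print(f"R chart:     LCL={lcl_r:.4f}, CL={r_bar:.4f}, UCL={ucl_r:.4f}")

# Process capability against hypothetical specification limits 40 +/- 0.08 mm.
usl, lsl = 40.08, 39.92
sigma_hat = r_bar / 2.326            # d2 constant for n = 5
cp = (usl - lsl) / (6 * sigma_hat)
cpk = min(usl - xbar_bar, xbar_bar - lsl) / (3 * sigma_hat)
print(f"Cp={cp:.2f}, Cpk={cpk:.2f}")
```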

Keywords: rice polished cylinder, statistical process control, control charts, process capability

Procedia PDF Downloads 466
9514 Use of Artificial Neural Networks to Estimate Evapotranspiration for Efficient Irrigation Management

Authors: Adriana Postal, Silvio C. Sampaio, Marcio A. Villas Boas, Josué P. Castro, Ralpho R. Reis

Abstract:

This study deals with the estimation of reference evapotranspiration (ET₀) in an agricultural context, focusing on efficient irrigation management to meet the growing interest in the sustainable management of water resources. Given the importance of water in agriculture and its scarcity in many regions, efficient use of this resource is essential to ensure food security and environmental sustainability. The methodology used involved the application of artificial intelligence techniques, specifically Multilayer Perceptron (MLP) Artificial Neural Networks (ANNs), to predict ET₀ in the state of Paraná, Brazil. The models were trained and validated with meteorological data from the Brazilian National Institute of Meteorology (INMET), together with data obtained from a producer's weather station in the western region of Paraná. Two optimizers (SGD and Adam) and different meteorological variables, such as temperature, humidity, solar radiation, and wind speed, were explored as inputs to the models. Nineteen configurations with different input variables were tested; among them, configuration 9, with 8 input variables, was identified as the most efficient of all. Configuration 10, with 4 input variables, was considered the most effective when considering the smallest number of variables. The main conclusions of this study show that MLP ANNs are capable of accurately estimating ET₀, providing a valuable tool for irrigation management in agriculture. Both configurations (9 and 10) showed promising performance in predicting ET₀. The validation of the models with producer data underlined the practical relevance of these tools and confirmed their ability to generalize to different field conditions. The results of the statistical metrics, including Mean Absolute Error (MAE), Mean Squared Error (MSE), Root Mean Squared Error (RMSE), and Coefficient of Determination (R²), showed excellent agreement between the model predictions and the observed data, with MAE values as low as 0.01 mm/day and 0.03 mm/day. In addition, the models achieved an R² between 0.99 and 1, indicating a satisfactory fit to the real data. This agreement was also confirmed by the Kolmogorov-Smirnov test, which evaluates the agreement of the predictions with the statistical behavior of the real data and yields values between 0.02 and 0.04 for the producer data. In addition, the results of this study suggest that the developed technique can be applied to other locations by using site-specific data to further improve ET₀ predictions and thus contribute to sustainable irrigation management in different agricultural regions. To summarize, this study has helped to advance research in the field of irrigation management in agriculture. It provides an accessible and effective approach to ET₀ estimation that has the potential to significantly improve water use efficiency and promote agricultural sustainability in different contexts.
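
The SGD-versus-Adam comparison described above can be sketched with scikit-learn's MLPRegressor on synthetic weather inputs (four variables, echoing configuration 10); the data generator and architecture are invented stand-ins.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(12)
# Synthetic weather inputs (temperature, humidity, radiation, wind) and an
# ET0-like target; the study's INMET/producer data are not reproduced here.
n = 2000
X = np.column_stack([rng.uniform(10, 35, n), rng.uniform(30, 95, n),
                     rng.uniform(5, 30, n), rng.uniform(0.5, 6, n)])
et0 = 0.08 * X[:, 0] - 0.02 * X[:, 1] + 0.12 * X[:, 2] + 0.3 * X[:, 3] \
      + rng.normal(0, 0.2, n)

# The comparison the abstract describes: the same MLP trained with SGD vs Adam.
for solver in ("sgd", "adam"):
    mlp = make_pipeline(StandardScaler(),
                        MLPRegressor(hidden_layer_sizes=(16, 8), solver=solver,
                                     max_iter=3000, random_state=42))
    r2 = cross_val_score(mlp, X, et0, cv=5, scoring="r2").mean()
    print(f"{solver}: mean R² = {r2:.3f}")
```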

Keywords: agricultural technology, neural networks in agriculture, water efficiency, water use optimization

Procedia PDF Downloads 15