Search results for: statistical model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 19878

19788 Series-Parallel Systems Reliability Optimization Using Genetic Algorithm and Statistical Analysis

Authors: Essa Abrahim Abdulgader Saleem, Thien-My Dao

Abstract:

The main objective of this paper is to optimize series-parallel system reliability using a Genetic Algorithm (GA) and statistical analysis, considering system reliability constraints that involve the redundancy levels of the selected components, total cost, and total weight. To perform this work, first the mathematical model that maximizes system reliability subject to maximum system cost and maximum system weight constraints is presented; second, statistical analysis is used to optimize the GA parameters; and third, the GA is used to optimize series-parallel system reliability. The objective is to determine the strategy for choosing the redundancy level of each subsystem so as to maximize overall system reliability subject to the total cost and total weight constraints. Finally, the reliability optimization results for a series-parallel system case study are shown, and comparisons with previous results are presented to demonstrate the performance of our GA.
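
As an illustration of the approach, here is a minimal sketch of GA-based redundancy allocation for a series-parallel system; the component reliabilities, costs, weights, and budget limits are invented for the example and are not the paper's case-study data.

```python
# Minimal GA sketch for series-parallel redundancy allocation
# (illustrative component data; not the paper's case study).
import random

r = [0.80, 0.85, 0.90]   # hypothetical component reliabilities per subsystem
c = [3.0, 4.0, 2.0]      # hypothetical component costs
w = [4.0, 3.0, 5.0]      # hypothetical component weights
C_MAX, W_MAX = 40.0, 50.0  # assumed cost and weight budgets

def reliability(n):
    # Series system of parallel-redundant subsystems.
    R = 1.0
    for ri, ni in zip(r, n):
        R *= 1.0 - (1.0 - ri) ** ni
    return R

def feasible(n):
    return (sum(ci * ni for ci, ni in zip(c, n)) <= C_MAX and
            sum(wi * ni for wi, ni in zip(w, n)) <= W_MAX)

def fitness(n):
    return reliability(n) if feasible(n) else 0.0

def ga(pop_size=30, generations=200, p_mut=0.2):
    pop = [[random.randint(1, 5) for _ in r] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(elite):
            a, b = random.sample(elite, 2)
            cut = random.randrange(1, len(r))
            child = a[:cut] + b[cut:]            # one-point crossover
            if random.random() < p_mut:          # integer mutation
                i = random.randrange(len(r))
                child[i] = max(1, child[i] + random.choice([-1, 1]))
            children.append(child)
        pop = elite + children
    best = max(pop, key=fitness)
    return best, reliability(best)

print(ga())
```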

Keywords: reliability, optimization, meta-heuristic, genetic algorithm, redundancy

Procedia PDF Downloads 337
19787 'Call Drop': A Problem for Handover Minimizing the Call Drop Probability Using Analytical and Statistical Method

Authors: Anshul Gupta, T. Shankar

Abstract:

In this paper, we analyzed call drop in order to provide a good quality of service to the user. By optimizing it, we can increase the coverage area and also reduce the interference and congestion created in a network. Handover is the transfer of a call from one cell site to another during the call. We analyzed the whole network by two methods: a statistical model and an analytical model. In the statistical model, we collected all the data of a network during the busy hour and a normal 24-hour period, and in the analytical model we derived the equation from which the call drop probability is found. By avoiding unnecessary handovers, we can increase the number of calls per hour. The most important parameter, around which the whole paper is built, is the coefficient of variation.
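
Since the analysis centres on the coefficient of variation, a small sketch of its computation follows; the hourly drop counts are synthetic stand-ins for the surveyed network data.

```python
# Sketch: coefficient of variation of hourly dropped-call counts
# (synthetic numbers standing in for the busy-hour survey data).
import statistics

drops_per_hour = [12, 15, 9, 20, 14, 11, 17]   # hypothetical network log
mean = statistics.mean(drops_per_hour)
std = statistics.stdev(drops_per_hour)          # sample standard deviation
cv = std / mean                                  # coefficient of variation
print(f"mean={mean:.2f}, std={std:.2f}, CV={cv:.3f}")
```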

Keywords: coefficient of variation, mean, standard deviation, call drop probability, handover

Procedia PDF Downloads 491
19786 The Examination of Organizational DNA of General Directorate of Youth and Sport Organization of Fars Province Based on Hnald Model

Authors: Mehdi Rastegari Ghiri, Mohammad Reza Baradaran, Zahra Mirsanjari

Abstract:

The aim of the present study was to investigate the corporate DNA of the General Administration of Sports and Youth in Fars province. The research method was a descriptive survey conducted in the field. For data collection, questionnaires designed based on the Hnald and Silverman model were used. In this model, organizational DNA is stated in four types: objective, individualistic, field-oriented, and spiritual. The reliability of the questionnaire, obtained by the researcher using Cronbach's alpha, was 0.89. The statistical population includes all managers and specialists of the Fars Province Directorate of Youth and Sport, of whom 48 were selected as the research sample. The results showed that the organizational DNA of the Directorate General for Youth and Sports of Fars province is field-oriented or nearly field-oriented.
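
For readers unfamiliar with the reliability measure used, the following sketch computes Cronbach's alpha from an item-score matrix; the response data and matrix shape are simulated, not the study's survey.

```python
# Sketch: Cronbach's alpha for questionnaire reliability
# (random Likert-style data in place of the actual survey responses).
import numpy as np

rng = np.random.default_rng(0)
scores = rng.integers(1, 6, size=(48, 20))  # 48 respondents x 20 items (assumed shape)

def cronbach_alpha(items):
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return k / (k - 1) * (1.0 - item_vars / total_var)

print(round(cronbach_alpha(scores), 3))
```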

Keywords: organizational DNA, Hnald and Silverman model

Procedia PDF Downloads 451
19785 Normalizing Logarithms of Realized Volatility in an ARFIMA Model

Authors: G. L. C. Yap

Abstract:

Modelling realized volatility with high-frequency returns is popular, as realized volatility is an unbiased and efficient estimator of return volatility. A computationally simple model fits the logarithms of the realized volatilities with a fractionally integrated long-memory Gaussian process. The Gaussianity assumption simplifies the parameter estimation using the Whittle approximation. Nonetheless, this assumption may not be met in finite samples, and there may be a need to normalize the financial series. Based on the empirical indices S&P 500 and DAX, this paper examines the performance of the linear volatility model pre-treated with normalization compared to its existing counterpart. The empirical results show that by including normalization as a pre-treatment procedure, the forecast performance outperforms the existing model in terms of statistical and economic evaluations.
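
One common normalization pre-treatment is the rank-based inverse-normal transform; the sketch below illustrates the idea on a synthetic skewed series, though the paper's exact normalization scheme may differ.

```python
# Sketch: rank-based inverse-normal transform as one possible normalization
# of a log realized-volatility series before fitting a Gaussian model.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
log_rv = rng.gamma(2.0, 1.0, 1000) - 2.0   # synthetic, skewed log-RV stand-in

ranks = stats.rankdata(log_rv)
normalized = stats.norm.ppf((ranks - 0.5) / len(log_rv))  # ~N(0,1) margins

print(stats.skew(log_rv), stats.skew(normalized))  # skew shrinks toward 0
```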

Keywords: Gaussian process, long-memory, normalization, value-at-risk, volatility, Whittle estimator

Procedia PDF Downloads 354
19784 An Improved Two-dimensional Ordered Statistical Constant False Alarm Detection

Authors: Weihao Wang, Zhulin Zong

Abstract:

Two-dimensional ordered-statistic constant false alarm rate (OS-CFAR) detection is a widely used method for detecting weak target signals in radar signal processing applications. The method analyzes the statistical characteristics of the noise and clutter present in the radar signal and uses this information to set an appropriate detection threshold. The reference cells around the cell under test are divided into several reference subunits, which are used to estimate the noise level and adjust the detection threshold with the aim of minimizing the false alarm rate. The detection process involves filtering the input radar signal to remove noise and clutter, estimating the noise level from the ordered statistics of the reference subunits, and setting the detection threshold based on the estimated noise level. By using an ordered-statistic approach, the method effectively suppresses the influence of clutter and noise, which allows it to detect weak target signals in the presence of strong interference while maintaining a low false alarm rate.
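
A compact sketch of a two-dimensional ordered-statistic CFAR detector follows; the window sizes, order-statistic fraction, and threshold multiplier are illustrative choices, not the paper's tuned parameters.

```python
# Minimal 2D ordered-statistic CFAR sketch (illustrative parameters).
import numpy as np

def os_cfar_2d(x, ref=4, guard=1, k_frac=0.75, alpha=5.0):
    rows, cols = x.shape
    det = np.zeros_like(x, dtype=bool)
    half = ref + guard
    for i in range(half, rows - half):
        for j in range(half, cols - half):
            window = x[i-half:i+half+1, j-half:j+half+1].copy()
            # blank the guard region and the cell under test
            window[half-guard:half+guard+1, half-guard:half+guard+1] = np.nan
            cells = np.sort(window[~np.isnan(window)])
            noise = cells[int(k_frac * len(cells))]  # k-th order statistic
            det[i, j] = x[i, j] > alpha * noise
    return det

rng = np.random.default_rng(2)
scene = rng.exponential(1.0, (64, 64))   # noise/clutter power
scene[32, 32] += 30.0                    # injected target
print(np.argwhere(os_cfar_2d(scene)))
```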

Keywords: two-dimensional, ordered statistical, constant false alarm, detection, weak target signals

Procedia PDF Downloads 79
19783 Detecting Earnings Management via Statistical and Neural Networks Techniques

Authors: Mohammad Namazi, Mohammad Sadeghzadeh Maharluie

Abstract:

Predicting earnings management is vital for capital market participants, financial analysts, and managers. The aim of this research is to respond to this query: Is there a significant difference between regression models and neural network models in predicting earnings management, and which one leads to a superior prediction? In approaching this question, a Linear Regression (LR) model was compared with two neural networks: a Multi-Layer Perceptron (MLP) and a Generalized Regression Neural Network (GRNN). The population of this study comprises 94 companies listed on the Tehran Stock Exchange (TSE) from 2003 to 2011. After the results of all models were acquired, ANOVA was used to test the hypotheses. In general, the statistical results showed that the precision of the GRNN did not differ significantly from that of the MLP. In addition, the mean square errors of the MLP and GRNN differed significantly from that of the multivariable LR model. These findings support the notion of nonlinear behavior in earnings management. Therefore, it is more appropriate for capital market participants to analyze earnings management using neural network techniques rather than linear regression models.
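
In the same spirit, the sketch below contrasts a linear regression with an MLP on a synthetic nonlinear target using scikit-learn; a GRNN is omitted because scikit-learn does not provide one, and the data are not the TSE sample.

```python
# Sketch: linear model vs. MLP on synthetic data, echoing the paper's
# LR vs. neural-network comparison (GRNN not available in scikit-learn).
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(3)
X = rng.normal(size=(500, 5))                         # stand-in firm features
y = np.tanh(X[:, 0]) + 0.5 * X[:, 1] ** 2 + rng.normal(0, 0.1, 500)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
for model in (LinearRegression(),
              MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000,
                           random_state=0)):
    mse = mean_squared_error(y_te, model.fit(X_tr, y_tr).predict(X_te))
    print(type(model).__name__, round(mse, 4))  # MLP wins on nonlinear data
```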

Keywords: earnings management, generalized regression, neural networks, multi-layer perceptron, Tehran stock exchange

Procedia PDF Downloads 424
19782 Crashworthiness Optimization of an Automotive Front Bumper in Composite Material

Authors: S. Boria

Abstract:

In recent years, it has become possible to improve the crashworthiness of an automotive body structure from the beginning of the design stage, thanks to the development of specific optimization tools. It is well known how finite element codes can help the designer investigate the crash performance of structures under dynamic impact. Therefore, by coupling nonlinear mathematical programming procedures and statistical techniques with FE simulations, it is possible to optimize the design with a reduced number of analytical evaluations. In engineering applications, many optimization methods that are based on statistical techniques and utilize estimated models, called meta-models, are quickly spreading. A meta-model is an approximation of a detailed simulation model based on a dataset of inputs identified by the design of experiments (DOE); the number of simulations needed to build it depends on the number of variables. Among the various types of meta-modeling techniques, the Kriging method appears excellent in accuracy, robustness, and efficiency compared to the others when applied to crashworthiness optimization. Therefore, such a meta-model was used in this work in order to improve the structural optimization of a composite bumper for a racing car subjected to frontal impact. The specific energy absorption represents the objective function to maximize, and the geometrical parameters subject to design constraints are the design variables. The LS-DYNA code was interfaced with the LS-OPT tool in order to find the optimized solution through the use of a domain reduction strategy. With the use of the Kriging meta-model, the crashworthiness characteristics of the composite bumper were improved.
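
A minimal illustration of the Kriging (Gaussian-process) surrogate idea on a toy one-dimensional objective, standing in for the expensive FE crash simulations; the objective function and DOE are invented.

```python
# Sketch: Kriging (Gaussian-process) surrogate over a small DOE, used to
# locate a promising design point on a toy objective.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def objective(x):                      # hypothetical crash response
    return -(x - 0.3) ** 2 + 0.1 * np.sin(12 * x)

X_doe = np.linspace(0.0, 1.0, 8).reshape(-1, 1)   # design of experiments
y_doe = objective(X_doe).ravel()

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2)).fit(X_doe, y_doe)

X_grid = np.linspace(0.0, 1.0, 500).reshape(-1, 1)
mu = gp.predict(X_grid)                # cheap surrogate predictions
print("surrogate optimum near x =", float(X_grid[mu.argmax()]))
```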

Keywords: composite material, crashworthiness, finite element analysis, optimization

Procedia PDF Downloads 256
19781 Predicting Automotive Interior Noise Including Wind Noise by Statistical Energy Analysis

Authors: Yoshio Kurosawa

Abstract:

The application of soundproofing materials to the reduction of high-frequency automobile interior noise has been researched. This paper presents a sound pressure prediction technique that includes wind noise, based on Hybrid Statistical Energy Analysis (HSEA), in order to reduce the weight of acoustic insulation. HSEA uses both analytical SEA and experimental SEA. Chassis dynamometer tests and road tests showed the validity of the SEA modeling and confirmed the utility of the method.

Keywords: vibration, noise, road noise, statistical energy analysis

Procedia PDF Downloads 351
19780 Generating Product Description with Generative Pre-Trained Transformer 2

Authors: Minh-Thuan Nguyen, Phuong-Thai Nguyen, Van-Vinh Nguyen, Quang-Minh Nguyen

Abstract:

Research on automatically generating descriptions for e-commerce products has gained increasing attention in recent years. However, the descriptions generated by existing systems are often uninformative and unattractive because of a lack of training data or the limitations of the approaches, which often rely on templates or statistical methods. In this paper, we explore a method to generate product descriptions by using the GPT-2 model. In addition, we apply text paraphrasing and task-adaptive pretraining techniques to improve the quality of the descriptions generated by the GPT-2 model. Experimental results show that our models outperform the baseline model in both automatic and human evaluation. In particular, our methods achieve promising results not only on the seen test set but also on the unseen test set.
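
A bare-bones generation step with the public Hugging Face GPT-2 checkpoint is sketched below; the paper's system additionally fine-tunes with task-adaptive pretraining and paraphrasing, which this does not show.

```python
# Sketch: generating a product description with the base GPT-2 checkpoint
# via the Hugging Face transformers library (prompt is invented).
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tok = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

prompt = "Product: wireless earbuds. Description:"
ids = tok(prompt, return_tensors="pt").input_ids
out = model.generate(ids, max_length=60, do_sample=True, top_p=0.9,
                     pad_token_id=tok.eos_token_id)   # nucleus sampling
print(tok.decode(out[0], skip_special_tokens=True))
```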

Keywords: GPT-2, product description, transformer, task-adaptive, language model, pretraining

Procedia PDF Downloads 198
19779 ARIMA-GARCH, A Statistical Modeling for Epileptic Seizure Prediction

Authors: Salman Mohamadi, Seyed Mohammad Ali Tayaranian Hosseini, Hamidreza Amindavar

Abstract:

In this paper, we provide a procedure to analyze and model the EEG (electroencephalogram) signal as a time series using ARIMA-GARCH in order to predict an epileptic attack. The heteroskedasticity of the EEG signal is examined through the ARCH or GARCH (autoregressive conditional heteroskedasticity, generalized autoregressive conditional heteroskedasticity) test. The best ARIMA-GARCH model in the AIC sense is utilized to measure the volatility of the EEG from epileptic canine subjects and to forecast the future values of the EEG. An ARIMA-only model can perform prediction, but an ARCH or GARCH model acting on the residuals of the ARIMA attains a considerably improved forecast horizon. First, we estimate the best ARIMA model; then different orders of ARCH and GARCH models are surveyed to determine the best heteroskedastic model for the residuals of that ARIMA. Using the simulated conditional variance of the selected ARCH or GARCH model, we suggest a procedure to predict oncoming seizures. The results indicate that GARCH modeling captures the dynamic changes of variance well before the onset of a seizure. It can be inferred that the prediction capability comes from the ability of the combined ARIMA-GARCH model to cover the heteroskedastic nature of EEG signal changes.
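
The two-stage modelling can be sketched with the statsmodels and arch Python packages (an assumption; the paper does not name its software), on a synthetic series standing in for EEG.

```python
# Sketch: ARIMA residuals passed to a GARCH(1,1), mirroring the paper's
# two-stage modelling (synthetic series in place of EEG data).
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from arch import arch_model

rng = np.random.default_rng(4)
x = np.cumsum(rng.normal(size=1000)) + rng.normal(size=1000)  # toy signal

arima = ARIMA(x, order=(2, 1, 1)).fit()   # order chosen arbitrarily here
resid = arima.resid

garch = arch_model(resid, vol="GARCH", p=1, q=1).fit(disp="off")
cond_vol = garch.conditional_volatility   # variance dynamics used for alerting
print(cond_vol[-5:])
```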

Keywords: epileptic seizure prediction, ARIMA, ARCH and GARCH modeling, heteroskedasticity, EEG

Procedia PDF Downloads 406
19778 Towards Integrating Statistical Color Features for Human Skin Detection

Authors: Mohd Zamri Osman, Mohd Aizaini Maarof, Mohd Foad Rohani

Abstract:

Human skin detection is recognized as the primary step in most applications such as face detection, illicit image filtering, hand recognition, and video surveillance. The performance of any skin detection application relies greatly on two components: feature extraction and the classification method. Skin color is the most vital information used for skin detection. However, a color feature alone sometimes cannot handle images whose color distribution is the same as that of skin. A pixel-based color feature does not eliminate skin-like colors, because the intensities of skin and skin-like colors fall under the same distribution. Hence, statistical color analysis, such as the mean and standard deviation, is exploited as an additional feature to increase the reliability of the skin detector. In this paper, we study the effectiveness of statistical color features for human skin detection. Furthermore, the paper analyzes integrated color and texture features using eight classifiers in three color spaces: RGB, YCbCr, and HSV. The experimental results show that integrating statistical features using a Random Forest classifier achieved a significant performance, with an F1-score of 0.969.
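
A toy version of the statistical-feature pipeline: per-patch channel means and standard deviations fed to a random forest; the "skin" and "non-skin" patches are simulated, so the score is not comparable to the paper's F1 of 0.969.

```python
# Sketch: per-patch colour mean/std features fed to a random forest
# (synthetic RGB patches; not the paper's dataset or feature set).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(5)

def patch_features(patch):
    # mean and standard deviation of each colour channel
    return np.concatenate([patch.mean(axis=(0, 1)), patch.std(axis=(0, 1))])

# Hypothetical patches: "skin" biased toward red tones, "non-skin" uniform.
skin = rng.normal([180, 120, 100], 20, (200, 8, 8, 3))
other = rng.uniform(0, 255, (200, 8, 8, 3))
X = np.array([patch_features(p) for p in np.vstack([skin, other])])
y = np.array([1] * 200 + [0] * 200)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print(clf.score(X, y))
```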

Keywords: color space, neural network, random forest, skin detection, statistical feature

Procedia PDF Downloads 462
19777 Evaluation of Egg Quality Parameters in the Isa Brown Line in Intensive Production Systems in the Ocaña Region, Norte de Santander

Authors: Meza-Quintero Myriam, Lobo Torrado Katty Andrea, Sanchez Picon Yesenia, Hurtado-Lugo Naudin

Abstract:

The objective of the study was to evaluate the internal and external quality of eggs in three production housing systems (floor, cage, and grazing) for laying birds of the Isa Brown line, in the laying period between weeks 35 and 41. A total of 135 hens distributed in 3 treatments of 45 birds per repetition were used (the replicates were the seven weeks of the trial). The feed supplied in the floor and cage systems contained 114 g/bird/day; in the grazing system, 14 grams less concentrate was provided. Nine eggs were collected to be studied and analyzed in the animal nutrition laboratory (3 eggs per housing system). A completely random statistical model was implemented; for the statistical analysis of the data, IBM® Statistical Products and Services Solution (SPSS) version 23 software was used. The evaluation and measurement instruments were a vernier caliper for measurements in millimeters, a DSM YolkFan™ 16 for the evaluation of egg yolk pigmentation, a digital scale for measurements in grams, and a micrometer for shell measurements in millimeters, with laboratory evaluation of dry matter, ash, and ether extract. The results for egg size (0.04 ± 3.55), shell thickness (0.46 ± 3.55), albumen weight (0.18 ± 3.55), albumen height (0.38 ± 3.55), yolk weight (0.64 ± 3.55), yolk height (0.54 ± 3.55), and yolk pigmentation (1.23 ± 3.55), all with P-value > 0.05, led to the conclusion that hens in the three production systems (floor, cage, and grazing) did not show statistically significant differences in the internal and external quality of the egg for the parameters studied.
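
The housing-system comparison is essentially a one-way ANOVA per trait; a sketch with synthetic egg weights follows (the study itself used SPSS).

```python
# Sketch: one-way ANOVA across the three housing systems for one egg trait
# (synthetic measurements in place of the study's data).
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(6)
floor = rng.normal(60.0, 2.0, 45)     # hypothetical egg weights (g)
cage = rng.normal(60.3, 2.0, 45)
grazing = rng.normal(59.8, 2.0, 45)

stat, p = f_oneway(floor, cage, grazing)
print(f"F={stat:.2f}, p={p:.3f}")     # p > 0.05 -> no significant difference
```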

Keywords: biological, territories, genetic resource, egg

Procedia PDF Downloads 84
19776 Impact of Green Bonds Issuance on Stock Prices: An Event Study on Respective Indian Companies

Authors: S. L. Tulasi Devi, Shivam Azad

Abstract:

The primary objective of this study is to analyze the impact of green bond issuance on the stock prices of the respective Indian companies. An event study methodology has been employed to study the effect of green bond issuance. For an in-depth analysis, this paper used different window frames, including 15-15 days, 10-10 days, 7-7 days, 6-6 days, and 5-5 days; for better clarity, it also used an uneven window of 7-5 days. The study covered all the companies that issued green bonds during the period 2017-2022: Adani Green Energy, State Bank of India, Power Finance Corporation, Jain Irrigation, and Rural Electrification Corporation, except Indian Renewable Energy Development Agency and Indian Railway Finance Corporation, because of data unavailability. The paper used all three event study methods discussed in the earlier literature: 1) the constant return model, 2) the market-adjusted model, and 3) the capital asset pricing model (CAPM). For a meaningful comparison of results, the study considered the cumulative average return (CAR) and buy-and-hold average return (BHAR) methodologies. A two-tailed t-statistic was used to check statistical significance, and all statistical calculations were performed in Microsoft Excel 2016. The study found that all companies except the State Bank of India showed positive returns on the event day. The results demonstrated that the constant return model outperformed the market-adjusted model and the CAPM. The p-values derived from all the methods showed an almost insignificant impact of green bond issuance on the stock prices of the respective companies. The overall analysis indicates that there is not much improvement in the market efficiency of the Indian stock markets.
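
A sketch of the constant-return-model event study with a cumulative abnormal return over a symmetric window; the return series and window placement are invented.

```python
# Sketch: constant-mean-return event study with CAR over a +/-5-day window
# (random returns standing in for the actual stock data).
import numpy as np

rng = np.random.default_rng(7)
returns = rng.normal(0.0005, 0.02, 250)   # daily returns; event at index 200
event, win = 200, 5

mu = returns[:event - win].mean()          # estimation-window mean return
abnormal = returns[event - win:event + win + 1] - mu
car = abnormal.cumsum()                    # cumulative abnormal return
t_stat = car[-1] / (abnormal.std(ddof=1) * np.sqrt(len(abnormal)))
print(f"CAR={car[-1]:.4f}, t={t_stat:.2f}")
```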

Keywords: green bonds, event study methodology, constant return model, market-adjusted model, CAPM

Procedia PDF Downloads 98
19775 Statistical Analysis of Surface Roughness and Tool Life Using (RSM) in Face Milling

Authors: Mohieddine Benghersallah, Lakhdar Boulanouar, Salim Belhadi

Abstract:

Currently, a higher production rate with the required quality at low cost is the basic principle in the competitive manufacturing industry. This is mainly achieved by using high cutting speeds and feed rates. Elevated temperatures in the cutting zone under these conditions shorten tool life and adversely affect the dimensional accuracy and surface integrity of the component. Thus it is necessary to find optimum cutting conditions (cutting speed, feed rate, machining environment, tool material, and geometry) that can produce components in accordance with the project requirements while maintaining a relatively high production rate. Response surface methodology is a collection of mathematical and statistical techniques that are useful for modelling and analyzing problems in which a response of interest is influenced by several variables and the objective is to optimize this response. The work presented in this paper examines the effects of the cutting parameters (cutting speed, feed rate, and depth of cut) on surface roughness through a mathematical model developed using data gathered from a series of milling experiments.
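
The RSM step amounts to fitting a second-order polynomial response surface; the sketch below does this for roughness versus speed and feed with made-up experimental points.

```python
# Sketch: second-order response surface for roughness Ra as a function of
# cutting speed and feed rate (hypothetical experimental points).
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

X = np.array([[150, 0.10], [150, 0.20], [200, 0.10],
              [200, 0.20], [175, 0.15], [175, 0.10]])  # speed (m/min), feed (mm/rev)
Ra = np.array([0.80, 1.40, 0.70, 1.25, 0.95, 0.75])    # assumed roughness (um)

rsm = make_pipeline(PolynomialFeatures(degree=2), LinearRegression()).fit(X, Ra)
print(rsm.predict([[180, 0.12]]))   # predicted Ra at an untried setting
```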

Keywords: statistical analysis (RSM), bearing steel, coated inserts, tool life, surface roughness, end milling

Procedia PDF Downloads 432
19774 Nonparametric Path Analysis with Truncated Spline Approach in Modeling Rural Poverty in Indonesia

Authors: Usriatur Rohma, Adji Achmad Rinaldo Fernandes

Abstract:

Nonparametric path analysis is a statistical method that does not rely on the assumption that the form of the curve is known. The purpose of this study is to determine the best nonparametric truncated spline path function between linear and quadratic polynomial degrees with 1, 2, and 3 knot points, and to determine the significance of estimating the best nonparametric truncated spline path function in the model of the effect of population migration and agricultural economic growth on rural poverty, via the unemployment rate, using the t-test statistic at the jackknife resampling stage. The data used in this study are secondary data obtained from statistical publications. The results show that the best nonparametric truncated spline path model is the quadratic polynomial with 3 knot points. In addition, the significance test of the best truncated spline nonparametric path function estimate using jackknife resampling shows that all exogenous variables have a significant influence on the endogenous variables.
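
For concreteness, a quadratic truncated power basis with three knots, the spline family compared in the paper, can be built as follows (toy inputs; the knot locations are arbitrary here).

```python
# Sketch: quadratic truncated power basis with three knots.
import numpy as np

def truncated_quadratic_basis(x, knots):
    cols = [np.ones_like(x), x, x ** 2]                   # polynomial part
    cols += [np.clip(x - k, 0, None) ** 2 for k in knots]  # truncated terms
    return np.column_stack(cols)

x = np.linspace(0, 1, 100)
B = truncated_quadratic_basis(x, knots=[0.25, 0.5, 0.75])
print(B.shape)   # 100 observations x 6 basis functions
```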

Keywords: nonparametric path analysis, truncated spline, linear, quadratic, rural poverty, jackknife resampling

Procedia PDF Downloads 50
19773 Uncertainty of the Brazilian Earth System Model for Solar Radiation

Authors: Elison Eduardo Jardim Bierhals, Claudineia Brazil, Deivid Pires, Rafael Haag, Elton Gimenez Rossini

Abstract:

This study evaluated the uncertainties involved in the solar radiation projections generated by the Brazilian Earth System Model (BESM) of the Weather and Climate Prediction Center (CPTEC), belonging to the Coupled Model Intercomparison Project Phase 5 (CMIP5), with the aim of assessing the efficiency of the model's solar radiation projections and thereby establishing the viability of its use. Two different scenarios elaborated by the Intergovernmental Panel on Climate Change (IPCC) were evaluated: RCP 4.5 (with more optimistic boundary conditions) and RCP 8.5 (with more pessimistic initial conditions). The methods used to verify the accuracy of the model were the Nash coefficient and the statistical bias, as they better represent these atmospheric patterns. The BESM showed a tendency to overestimate the solar radiation projections in most regions of the state of Rio Grande do Sul, and under the validation methods adopted by this study, BESM did not present satisfactory accuracy.
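
The two validation scores can be computed as in the sketch below; the observed and modelled radiation values are synthetic and merely mimic the reported overestimation tendency.

```python
# Sketch: Nash-Sutcliffe efficiency and bias, the two validation scores
# used in the study (synthetic observed/modelled radiation values).
import numpy as np

obs = np.array([18.2, 20.1, 16.5, 22.3, 19.8])   # hypothetical MJ/m2/day
mod = np.array([19.0, 21.5, 18.0, 23.9, 21.0])   # model overestimates

nse = 1 - np.sum((obs - mod) ** 2) / np.sum((obs - obs.mean()) ** 2)
bias = np.mean(mod - obs)                         # positive -> overestimation
print(f"NSE={nse:.3f}, bias={bias:.2f}")
```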

Keywords: climate changes, projections, solar radiation, uncertainty

Procedia PDF Downloads 252
19772 Automatic Tuning for a Systemic Model of Banking Originated Losses (SYMBOL) Tool on Multicore

Authors: Ronal Muresano, Andrea Pagano

Abstract:

Nowadays, mathematical and statistical applications are developed with greater complexity and accuracy. However, this precision and complexity mean that applications need more computational power in order to execute faster. In this sense, multicore environments play an important role in improving and optimizing the execution time of these applications, as they allow the inclusion of more parallelism inside the node. However, taking advantage of this parallelism is not an easy task, because we have to deal with problems such as core communications, data locality, memory sizes (cache and RAM), synchronization, data dependencies in the model, etc. These issues become more important when we wish to improve the application's performance and scalability. Hence, this paper describes an optimization method developed for the Systemic Model of Banking Originated Losses (SYMBOL) tool of the European Commission, which is based on analyzing the application's weaknesses in order to exploit the advantages of the multicore architecture. All these improvements are done in an automatic and transparent manner, with the aim of improving the performance metrics of the tool. Finally, experimental evaluations show the effectiveness of our optimized version, which achieved a considerable improvement in execution time: for the best case tested, the time was reduced by around 96% between the original serial version and the automatic parallel version.

Keywords: algorithm optimization, bank failures, OpenMP, parallel techniques, statistical tool

Procedia PDF Downloads 370
19771 Statistical Shape Analysis of the Human Upper Airway

Authors: Ramkumar Gunasekaran, John Cater, Vinod Suresh, Haribalan Kumar

Abstract:

The main objective of this project is to develop a statistical shape model, using principal component analysis, that can be used for analyzing the shape of the human airway. The ultimate goal is to identify geometric risk factors for the diagnosis and management of Obstructive Sleep Apnoea (OSA). Anonymized CBCT scans of 25 individuals were obtained from the Otago Radiology Group. The airways were segmented between the hard palate and the aryepiglottic fold using snake active-contour segmentation. The point cloud of the segmented images was then fitted with a bi-cubic mesh, and pseudo-landmarks were placed in order to perform PCA on the segmented airway, analyze its shape, and find the relationship between the shape and OSA risk factors. From the PCA results, the first four modes of variation were found to be significant. Mode 1 was interpreted as the overall length of the airway, Mode 2 related to the anterior-posterior width of the retroglossal region, Mode 3 to the lateral dimension of the oropharyngeal region, and Mode 4 to the anterior-posterior width of the oropharyngeal region. All these regions are associated with the risk factors of OSA.
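
The core of such a statistical shape model is a PCA over flattened landmark coordinates, as sketched here with random stand-ins for the 25 segmented airways.

```python
# Sketch: PCA on flattened pseudo-landmark coordinates, the core of a
# statistical shape model (random shapes in place of the airway meshes).
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(8)
shapes = rng.normal(size=(25, 300))   # 25 subjects x (100 landmarks * 3 coords)

pca = PCA(n_components=4).fit(shapes)
print(pca.explained_variance_ratio_)  # variance carried by modes 1-4
```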

Keywords: medical imaging, image processing, FEM/BEM, statistical modelling

Procedia PDF Downloads 514
19770 Statistical Correlation between Ply Mechanical Properties of Composite and Its Effect on Structure Reliability

Authors: S. Zhang, L. Zhang, X. Chen

Abstract:

Due to the large uncertainty in the mechanical properties of FRP (fibre-reinforced plastic), the reliability evaluation of FRP structures is currently receiving much attention in industry. However, the possible statistical correlation between ply mechanical properties has so far been overlooked, and the properties are mostly assumed to be independent random variables. In this study, the statistical correlation between the ply mechanical properties of uni-directional and plain-weave composites is first analyzed by a combination of Monte-Carlo simulation and finite element modeling of the FRP unit cell. Large linear correlation coefficients between the in-plane mechanical properties are observed, and the correlation coefficients depend heavily on the uncertainty of the fibre volume ratio. It is also observed that the correlation coefficients related to Poisson's ratio are negative, while the others are positive. To experimentally obtain the statistical correlation coefficients between the in-plane mechanical properties of FRP, all the in-plane mechanical properties of the same specimen need to be known. The in-plane shear modulus of FRP is experimentally derived by the approach suggested in ASTM standard D5379M. Tensile tests are conducted using the same specimens used for the shear test, and, due to non-uniform tensile deformation, a modification factor is derived via finite element modeling. Digital image correlation is adopted to characterize the non-uniform deformation of the specimen. The preliminary experimental results show good agreement with the numerical analysis of the statistical correlation. Then, the failure probability of laminate plates is calculated for cases considering and not considering the statistical correlation, using Monte-Carlo and Markov Chain Monte-Carlo methods, respectively. The results highlight the importance of accounting for the statistical correlation between ply mechanical properties in order to achieve an accurate failure probability for laminate plates. Furthermore, it is found that for a multi-layer laminate plate, the statistical correlation between the ply elastic properties significantly affects the laminate reliability, while the effect of statistical correlation between the ply strengths is minimal.
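
A toy Monte-Carlo comparison of failure probability with and without correlation between two ply properties; the means, scatter, correlation, and limit state are all invented for illustration.

```python
# Sketch: failure probability of a toy limit state with and without
# correlation between two ply properties (illustrative numbers only).
import numpy as np

rng = np.random.default_rng(9)
mean = [130e3, 9e3]                  # e.g. E1, E2 in MPa (assumed means)
std = [8e3, 0.8e3]

def prob_failure(rho, n=200_000):
    cov = np.array([[std[0]**2, rho*std[0]*std[1]],
                    [rho*std[0]*std[1], std[1]**2]])
    e1, e2 = rng.multivariate_normal(mean, cov, n).T
    # hypothetical limit state: stiffness-weighted sum below a threshold
    return np.mean(0.7 * e1 + 0.3 * e2 < 90e3)

print(prob_failure(0.0), prob_failure(0.8))  # correlation shifts the tail
```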

Keywords: failure probability, FRP, reliability, statistical correlation

Procedia PDF Downloads 162
19769 Detection of Chaos in General Parametric Model of Infectious Disease

Authors: Javad Khaligh, Aghileh Heydari, Ali Akbar Heydari

Abstract:

Mathematical epidemiological models for the spread of disease through a population are used to predict the prevalence of a disease or to study the impacts of treatment or prevention measures. Initial conditions for these models are measured from statistical data collected from a population. Since these initial conditions can never be exact, the presence of chaos in mathematical models has serious implications for the accuracy of the models as well as for how epidemiologists interpret their findings. This paper confirms the chaotic behavior of models for dengue fever and SI by investigating sensitive dependence, bifurcation, and the 0-1 test under a variety of initial conditions.
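
The 0-1 test mentioned above can be outlined as follows, applied here to the logistic map rather than the epidemiological models; this is a standard implementation outline, not the paper's code.

```python
# Sketch: the 0-1 test for chaos on the logistic map. The output K is
# near 1 for chaotic dynamics and near 0 for regular dynamics.
import numpy as np

def zero_one_test(phi, c=1.7):
    n = len(phi)
    j = np.arange(1, n + 1)
    p = np.cumsum(phi * np.cos(j * c))     # translation variables
    q = np.cumsum(phi * np.sin(j * c))
    ncut = n // 10
    M = np.array([np.mean((p[k:] - p[:-k]) ** 2 + (q[k:] - q[:-k]) ** 2)
                  for k in range(1, ncut)])  # mean square displacement
    ks = np.arange(1, ncut)
    return np.corrcoef(ks, M)[0, 1]          # growth-rate correlation K

x, series = 0.4, []
for _ in range(2000):
    x = 3.97 * x * (1 - x)                   # chaotic regime of logistic map
    series.append(x)
print(zero_one_test(np.array(series)))
```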

Keywords: epidemiological models, SEIR disease model, bifurcation, chaotic behavior, 0-1 test

Procedia PDF Downloads 326
19768 The Application of Lesson Study Model in Writing Review Text in Junior High School

Authors: Sulastriningsih Djumingin

Abstract:

This study has several objectives. First, it describes the ability of second-grade students at SMPN 18 Makassar to write review text without applying the Lesson Study model. Second, it describes the ability of second-grade students to write review text when the Lesson Study model is applied. Third, it tests the effectiveness of the Lesson Study model in writing review text at SMPN 18 Makassar. This research used a true experimental design with a posttest-only group design involving two groups: one control class and one experimental class. The research population comprised all second-grade students at SMPN 18 Makassar, 250 students in 8 classes, and the sampling technique was purposive sampling. The control class was VIII-2, consisting of 30 students, while the experimental class was VIII-8, also consisting of 30 students. The research instruments were observations and tests. The collected data were analyzed using descriptive and inferential statistical techniques, with t-tests processed using SPSS 21 for Windows. The results show that: (1) of the 30 students in the control class, only 14 (47%) scored more than 7.5, categorized as inadequate; (2) in the experimental class, 26 (87%) students obtained a score of 7.5, categorized as adequate; (3) the Lesson Study model is effective when applied to writing review text. The comparison of the abilities of the control class and the experimental class indicates that the calculated t-value is greater than the critical t-value (2.411 > 1.667), which means that the alternative hypothesis (H1) proposed by the researcher is accepted.

Keywords: application, lesson study, review text, writing

Procedia PDF Downloads 202
19767 Exploring the Applications of Neural Networks in the Adaptive Learning Environment

Authors: Baladitya Swaika, Rahul Khatry

Abstract:

Computer Adaptive Tests (CATs) are one of the most efficient ways of testing the cognitive abilities of students. CATs are based on Item Response Theory (IRT), which relies on item selection and ability estimation using statistical methods: maximum-information item selection (or selection from the posterior) and maximum-likelihood (ML) or maximum a posteriori (MAP) estimators, respectively. This study aims at combining classical and Bayesian approaches to IRT to create a dataset which is then fed to a neural network that automates the process of ability estimation, and at comparing the result to traditional CAT models designed using IRT. The study uses Python as the base coding language, PyMC for statistical modelling of the IRT, and scikit-learn for the neural network implementation. On creating the model and comparing, it is found that the neural-network-based model performs 7-10% worse than the IRT model for score estimation. Although it performs poorly compared to the IRT model, the neural network model can be beneficially used in back-ends to reduce time complexity, since the IRT model has to re-calculate the ability every time it receives a request, whereas the prediction from a neural network can be done in a single step by an existing trained regressor. This study also proposes a new kind of framework whereby the neural network model can be used to incorporate feature sets other than the normal IRT feature set, using a neural network's capacity to learn unknown functions to give rise to better CAT models. Categorical features, such as test type, could be learnt and incorporated in IRT functions with the help of techniques like logistic regression, and can be used to learn functions expressed as models that may not be trivial to express via equations. Such a framework, when implemented, would be highly advantageous in psychometrics and cognitive assessments. This study gives a brief overview of how neural networks can be used in adaptive testing, not only by reducing time complexity but also by being able to incorporate newer and better datasets, which would eventually lead to higher-quality testing.
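
For contrast with the neural network, the classical step being replaced, maximum-likelihood ability estimation under a 2PL IRT model, can be sketched as follows with made-up item parameters.

```python
# Sketch: ML ability estimation under a 2PL IRT model, the classical step
# the paper replaces with a trained neural network (made-up items).
import numpy as np
from scipy.optimize import minimize_scalar

a = np.array([1.2, 0.8, 1.5, 1.0])   # item discrimination (assumed)
b = np.array([-0.5, 0.0, 0.7, 1.2])  # item difficulty (assumed)
u = np.array([1, 1, 0, 0])           # observed correct/incorrect responses

def neg_log_lik(theta):
    p = 1 / (1 + np.exp(-a * (theta - b)))   # 2PL response probability
    return -np.sum(u * np.log(p) + (1 - u) * np.log(1 - p))

theta_hat = minimize_scalar(neg_log_lik, bounds=(-4, 4), method="bounded").x
print(round(theta_hat, 3))   # estimated ability
```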

Keywords: computer adaptive tests, item response theory, machine learning, neural networks

Procedia PDF Downloads 176
19766 Estimating the Value of Statistical Life under the Subsidization and Cultural Effects

Authors: Mohammad A. Alolayan, John S. Evans, James K. Hammitt

Abstract:

The value of statistical life has been estimated for a Middle Eastern country with a highly subsidized economy. In this study, in-person interviews were conducted with a stratified random sample to estimate the value of mortality risk. Double-bounded dichotomous choice questions, followed by an open-ended question, were used in the interview to investigate the respondents' willingness to pay for mortality risk reduction. High willingness to pay was found to be associated with high income and education. Also, females were found to have a lower willingness to pay than males. The estimated value of statistical life is larger than those estimated for Western countries, where taxation systems exist. This estimate provides a baseline for monetizing the health benefits of proposed policies or programs for decision makers in an Eastern country. The value of statistical life for other countries in the region can also be extrapolated from this estimate by using the benefit-transfer method.

Keywords: mortality, risk, VSL, willingness-to-pay

Procedia PDF Downloads 315
19765 Application of Statistical Linearized Models for Investigations of Digital Dynamic Pulse-Frequency Control Systems

Authors: B. H. Aitchanov, Sh. K. Aitchanova, O. A. Baimuratov

Abstract:

This paper is focused on dynamic pulse-frequency modulation (DPFM) control systems. Currently, control laws based on DPFM control signals are widely used in the direct digital control subsystems introduced in automated control systems for technological processes. Statistical analysis of automatic control systems reduces to the construction of functional relationships between the statistical characteristics of the error processes and the input processes. Structural and dynamic Volterra models of digital pulse-frequency control systems can be used to develop methods for generating these dependencies, which differ in accuracy, in the amount of information required about the statistical characteristics of the input processes, and in the computational labor intensity of their use.

Keywords: digital dynamic pulse-frequency control systems, dynamic pulse-frequency modulation, control object, discrete filter, impulse device, microcontroller

Procedia PDF Downloads 495
19764 Architectural Building Safety and Health Performance Model for Stratified Low-Cost Housing: Education and Management Tool for Building Managers

Authors: Zainal Abidin Akasah, Maizam Alias, Azuin Ramli

Abstract:

The safety and health performance aspects of a building are the most challenging aspects of facility management. They require a deep understanding by building managers of the factors that contribute to health and safety performance. This study attempted to develop an explanatory architectural safety performance model for stratified low-cost housing in Malaysia. The proposed Building Safety and Health Performance (BSHP) model was tested empirically through a survey of 308 construction practitioners using Partial Least Squares (PLS) and Structural Equation Modelling (SEM) tools. The statistical analysis results support the conclusion that architecture, building services, external environment, management approaches, and maintenance management have a positive influence on the safety and health performance of stratified low-cost housing in Malaysia. The findings provide valuable insights for the construction industry to introduce the BSHP model in the future, where the model could be used as a guideline for the training of managers and for better planning and implementation of building management.

Keywords: building management, stratified low-cost housing, safety, health model

Procedia PDF Downloads 558
19763 Shock Compressibility of Iron Alloys Calculated in the Framework of Quantum-Statistical Models

Authors: Maxim A. Kadatskiy, Konstantin V. Khishchenko

Abstract:

Iron alloys are widespread components in various types of structural materials that are exposed to intensive thermal and mechanical loads. Various quantum-statistical cell models with the self-consistent field approximation can be used to predict the behavior of these materials under extreme conditions; the application of these models is all the more valid the higher the temperature and the density of the matter. Results of Hugoniot calculations for iron alloys in the framework of three quantum-statistical models (Thomas–Fermi, Thomas–Fermi with quantum and exchange corrections, and Hartree–Fock–Slater) are presented. The results of the quantum-statistical calculations are compared with the results of other reliable models and with available experimental data. Good agreement between the calculations and the experimental data is revealed at terapascal pressures. Advantages and disadvantages of this approach are shown.
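
For context, a Hugoniot is constrained by the Rankine–Hugoniot jump conditions; a standard statement (general shock physics, not specific to the cell models above) is:

```latex
% Rankine–Hugoniot jump conditions across a steady shock
% (u_s: shock speed, u_p: particle speed, V = 1/\rho):
\begin{aligned}
\rho_0 u_s &= \rho_1 (u_s - u_p) && \text{(mass)}\\
P_1 - P_0 &= \rho_0 u_s u_p && \text{(momentum)}\\
E_1 - E_0 &= \tfrac{1}{2}\,(P_1 + P_0)(V_0 - V_1) && \text{(energy)}
\end{aligned}
```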

Keywords: alloy, Hugoniot, iron, terapascal pressure

Procedia PDF Downloads 344
19762 Jointly Optimal Statistical Process Control and Maintenance Policy for Deteriorating Processes

Authors: Lucas Paganin, Viliam Makis

Abstract:

With the advent of globalization, market competition has become a major issue for most companies. One of the main strategies for overcoming this situation is improving the quality of the product, at a lower cost, to meet customers' expectations. In order to achieve the desired quality of products, it is important to control the process to meet the specifications and to implement the optimal maintenance policy for the machines and the production lines. Thus, the overall objective is to reduce process variation and the production and maintenance costs. In this paper, an integrated model involving Statistical Process Control (SPC) and maintenance is developed to achieve this goal. The main focus is therefore to develop the jointly optimal maintenance and statistical process control policy minimizing the total long-run expected average cost per unit time. In our model, the production process can go out of control due either to the deterioration of equipment or to other assignable causes. The equipment is also subject to failures in any of the operating states due to deterioration and aging. The process mean is controlled by an Xbar control chart using equidistant sampling epochs. We assume that the machine inspection epochs are the times when the control chart signals an out-of-control condition, considering both true and false alarms. At these times, the production process is stopped, and an investigation is conducted not only to determine whether it is a true or false alarm, but also to identify the cause of a true alarm: a change in the machine setting, other assignable causes, or both. If the system is out of control, the proper actions are taken to bring it back to the in-control state. At these epochs, a maintenance action can be taken, which can be no action or preventive replacement of the unit. When the equipment is in the failure state, a corrective maintenance action is performed, which can be minimal repair or replacement of the machine, and the process is brought back to the in-control state. A semi-Markov decision process (SMDP) framework is used to formulate and solve the joint control problem. A numerical example is developed to demonstrate the effectiveness of the control policy.
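
The monitoring device in the policy is the Xbar chart; a sketch of its limits and signalling on synthetic in-control subgroups follows (the sigma estimate below ignores the usual c4 bias correction).

```python
# Sketch: Xbar chart limits from subgroup means, the monitoring device in
# the joint SPC-maintenance policy (synthetic in-control samples).
import numpy as np

rng = np.random.default_rng(10)
subgroups = rng.normal(10.0, 0.5, size=(50, 5))   # 50 samples of size n=5

xbar = subgroups.mean(axis=1)
center = xbar.mean()
# rough sigma of the subgroup mean (no c4 correction, for brevity)
sigma_xbar = subgroups.std(axis=1, ddof=1).mean() / np.sqrt(subgroups.shape[1])
ucl, lcl = center + 3 * sigma_xbar, center - 3 * sigma_xbar

signals = np.where((xbar > ucl) | (xbar < lcl))[0]  # out-of-control epochs
print(center, ucl, lcl, signals)
```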

Keywords: maintenance, semi-Markov decision process, statistical process control, Xbar control chart

Procedia PDF Downloads 92
19761 Impact of Crises on Official Statistics: Environmental Statistics at Statistical Centre for the Cooperation Council for the Arab Countries of the Gulf during the COVID-19 Pandemic: A Case Study

Authors: Ibtihaj Al-Siyabi

Abstract:

The COVID-19 crisis posed enormous challenges to statistical providers. While official statistics were disrupted by the pandemic and the related containment measures, there was a growing and pressing need for real-time data and statistics to inform decisions. This paper gives an account of the way the pandemic impacted the operations of National Statistical Offices (NSOs) in general, in terms of data collection and methods used, and of the main challenges they encountered, based on international surveys. It highlights the performance of the Statistical Centre for the Cooperation Council for the Arab Countries of the Gulf (GCC-STAT) and its responsiveness to the pandemic, placing special emphasis on environmental statistics. The paper concludes by confirming GCC-STAT's resilience and success in facing the challenges.

Keywords: NSO, COVID-19, statistics, crisis, pandemic

Procedia PDF Downloads 140
19760 A Fully Automated New-Fangled VESTAL to Label Vertebrae and Intervertebral Discs

Authors: R. Srinivas, K. V. Ramana

Abstract:

This paper presents a novel method called VESTAL to label vertebrae and intervertebral discs. Each vertebra has certain statistical feature properties. To label vertebrae and discs, a new equation modeling the path of the spinal cord is derived using statistical properties of the spinal canal, and VESTAL uses this equation for labeling. For each vertebra and intervertebral disc, the posterior and interior width and height are measured. The calculated values are compared with real values measured using vernier calipers, and the comparison produced 95% efficiency and accurate results. VESTAL was applied to 350 MR images from 50 patients and achieved 100% accuracy in labeling.

Keywords: spine, vertebrae, intervertebral disc, labeling, statistics, texture, disc

Procedia PDF Downloads 363
19759 Applying of an Adaptive Neuro-Fuzzy Inference System (ANFIS) for Estimation of Flood Hydrographs

Authors: Amir Ahmad Dehghani, Morteza Nabizadeh

Abstract:

This paper presents the application of an Adaptive Neuro-Fuzzy Inference System (ANFIS) to flood hydrograph modeling of the Shahid Rajaee reservoir dam, located in Iran. The modelling was carried out using 11 flood hydrographs recorded at the Tajan river gauging station. From this dataset, 9 flood hydrographs were chosen to train the model and 2 flood hydrographs to test it. Different architectures of the neuro-fuzzy model, varying in membership function and learning algorithm, were designed and trained with different numbers of epochs. The results were evaluated against the observed hydrographs, and the best model structure was chosen according to the least RMSE in each case. To evaluate the efficiency of the neuro-fuzzy model, various statistical indices, such as the Nash-Sutcliffe coefficient and the flood peak discharge error, were calculated. In this simulation, the coordinates of a flood hydrograph, including the peak discharge, were estimated using the discharge values occurring in earlier time steps as input values to the neuro-fuzzy model. These results indicate the satisfactory efficiency of the neuro-fuzzy model for flood simulation, and this performance demonstrates the suitability of the implemented approach for flood management projects.

Keywords: adaptive neuro-fuzzy inference system, flood hydrograph, hybrid learning algorithm, Shahid Rajaee reservoir dam

Procedia PDF Downloads 479