Search results for: standard normal variance
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8260

8260 Virtual Dimension Analysis of Hyperspectral Imaging to Characterize a Mining Sample

Authors: L. Chevez, A. Apaza, J. Rodriguez, R. Puga, H. Loro, Juan Z. Davalos

Abstract:

The Virtual Dimension (VD) procedure is used to analyze hyperspectral image (HSI) data in order to estimate the abundance of mineral components in a mining sample. Hyperspectral images derived from reflectance spectra (NIR region) are pre-treated using the Standard Normal Variance (SNV) and Minimum Noise Fraction (MNF) methodologies. The endmember components are identified by the Simplex Growing Algorithm (SGA) and then fitted to the reflectance spectra of reference databases using the Simulated Annealing (SA) methodology. The mineral abundances obtained for the studied sample are very close to those obtained using XRD, with a total relative error of 2%.
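A minimal sketch of the SNV pre-treatment step, assuming each spectrum is stored as a row of a NumPy array (the array shapes and variable names are illustrative, not taken from the paper):

```python
import numpy as np

def snv(spectra: np.ndarray) -> np.ndarray:
    """Standard Normal Variance: center and scale each spectrum
    (row) by its own mean and standard deviation."""
    mean = spectra.mean(axis=1, keepdims=True)
    std = spectra.std(axis=1, ddof=1, keepdims=True)
    return (spectra - mean) / std

# Example: 3 reflectance spectra with 100 NIR bands each
spectra = np.random.default_rng(0).random((3, 100))
corrected = snv(spectra)
print(corrected.mean(axis=1))  # ~0 for every spectrum after correction
```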

Keywords: hyperspectral imaging, minimum noise fraction, MNF, simplex growing algorithm, SGA, standard normal variance, SNV, virtual dimension, XRD

Procedia PDF Downloads 114
8259 Estimating the Mean of a Normal Distribution by Maximum Likelihood, Bayes, and Markov Chain Monte Carlo Methods

Authors: Autcha Araveeporn

Abstract:

This paper compares estimation of the mean of a normal distribution by the Maximum Likelihood (ML), Bayes, and Markov Chain Monte Carlo (MCMC) methods. The ML estimator is the sample average, the Bayes estimator is derived from the prior distribution, and the MCMC estimator is approximated by Gibbs sampling from the posterior distribution. After estimation, hypothesis testing is used to check the robustness of the estimators. Data are simulated from a normal distribution with true mean 2 and variance 4, 9, or 16, with sample sizes of 10, 20, 30, and 50. The results show that the ML and MCMC estimates differ perceivably from the true parameter when the sample size is 10 or 20 and the variance is 16. Furthermore, the Bayes estimator, computed from a prior distribution with mean 1 and variance 12, showed a significant difference in the mean with variance 9 at sample sizes 10 and 20.
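A Monte Carlo sketch of the three estimators under the abstract's simulation design (true mean 2; prior mean 1 and prior variance 12; variances 4, 9, 16; sample sizes 10 to 50). With the variance known and only the mean unknown, Gibbs sampling reduces to drawing directly from the normal posterior, which is how the MCMC column is produced here:

```python
import numpy as np

rng = np.random.default_rng(0)
mu_true, prior_mu, prior_var = 2.0, 1.0, 12.0

def estimators(x, sigma2):
    n = len(x)
    mle = x.mean()  # ML estimator: the sample average
    # Conjugate Bayes posterior for the mean (variance known)
    post_var = 1.0 / (1.0 / prior_var + n / sigma2)
    post_mu = post_var * (prior_mu / prior_var + x.sum() / sigma2)
    # MCMC: sample the posterior (Gibbs reduces to this single
    # conditional when the mean is the only unknown parameter)
    mcmc = rng.normal(post_mu, np.sqrt(post_var), 5000).mean()
    return mle, post_mu, mcmc

for sigma2 in (4, 9, 16):
    for n in (10, 20, 30, 50):
        x = rng.normal(mu_true, np.sqrt(sigma2), n)
        print(sigma2, n, [round(e, 3) for e in estimators(x, sigma2)])
```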

Keywords: Bayes method, Markov chain Monte Carlo method, maximum likelihood method, normal distribution

Procedia PDF Downloads 329
8258 Efficient Frontier: Comparing Different Volatility Estimators

Authors: Tea Poklepović, Zdravka Aljinović, Mario Matković

Abstract:

Modern Portfolio Theory (MPT), following Markowitz, states that investors form mean-variance efficient portfolios that maximize their utility. Markowitz proposed the standard deviation as a simple measure of portfolio risk and the lower semi-variance as the only risk measure of interest to rational investors. This paper adds a third volatility estimator, based on intraday data, and compares the three resulting efficient frontiers on the Croatian stock market. The results show that the range-based volatility estimator outperforms both the mean-variance and the lower semi-variance models.
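The abstract does not specify which range-based estimator was used; the Parkinson (1980) high-low estimator is a common choice and is sketched below on synthetic daily price data:

```python
import numpy as np

def parkinson_vol(high: np.ndarray, low: np.ndarray) -> float:
    """Parkinson (1980) range-based volatility:
    sigma^2 = mean(ln(H/L)^2) / (4 ln 2)."""
    return np.sqrt(np.mean(np.log(high / low) ** 2) / (4.0 * np.log(2.0)))

# Synthetic daily close prices with intraday highs and lows
rng = np.random.default_rng(1)
close = 100 * np.exp(np.cumsum(rng.normal(0, 0.01, 250)))
high = close * np.exp(np.abs(rng.normal(0, 0.01, 250)))
low = close * np.exp(-np.abs(rng.normal(0, 0.01, 250)))
print("daily Parkinson volatility:", round(parkinson_vol(high, low), 4))
```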

Keywords: variance, lower semi-variance, range-based volatility, MPT

Procedia PDF Downloads 483
8257 Comparison of Self-Efficacy and Life Satisfaction in Normal Users and Users with Internet Addiction

Authors: Mansour Abdi, Hadi Molaei Yasavoli

Abstract:

The purpose of this research is to compare self-efficacy and life satisfaction in normal users and users with internet addiction. The study was descriptive and causal-comparative. A sample of 304 students was selected by random sampling from the students of Semnan University; they completed the Young Internet Addiction Test, a self-efficacy questionnaire, and the Satisfaction with Life Scale (SWLS). Multivariate analysis of variance (MANOVA) was used for data analysis. The results showed that internet-addicted users have lower levels of self-efficacy and life satisfaction than normal users, with the difference significant at p = 0.0005. The findings showed that the grouping variable (internet-addicted vs. normal users) accounted for 78 percent of the variance in the dependent variables of self-efficacy and life satisfaction. Finally, given that self-efficacy and life satisfaction affect the incidence of internet addiction, it is proposed that measures be taken to enhance self-efficacy and life satisfaction in internet users.
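A sketch of the MANOVA step with statsmodels on mock data; the column names, group sizes, and score scales are illustrative, not the study's:

```python
import numpy as np
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

rng = np.random.default_rng(2)
n = 152  # per group, mock data standing in for the 304 students
df = pd.DataFrame({
    "group": ["addicted"] * n + ["normal"] * n,
    "self_efficacy": np.r_[rng.normal(50, 8, n), rng.normal(60, 8, n)],
    "life_satisfaction": np.r_[rng.normal(18, 5, n), rng.normal(24, 5, n)],
})

# Two dependent variables tested jointly against one grouping factor
fit = MANOVA.from_formula("self_efficacy + life_satisfaction ~ group", data=df)
print(fit.mv_test())
```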

Keywords: self-efficacy, life satisfaction, users, internet addiction, normal users

Procedia PDF Downloads 459
8256 Constructing the Joint Mean-Variance Regions for Univariate and Bivariate Normal Distributions: Approach Based on the Measure of Cumulative Distribution Functions

Authors: Valerii Dashuk

Abstract:

The use of confidence intervals in economics and econometrics is widespread. To investigate a random variable more thoroughly, joint tests are applied; one such example is the joint mean-variance test. A new approach for testing such hypotheses and constructing confidence sets is introduced. Exploring both the value of the random variable and its deviation with this technique allows checking the shift and the probability of that shift (i.e., portfolio risks) simultaneously. Another application is based on the normal distribution, which is fully defined by its mean and variance and can therefore be tested using the introduced approach. The method is based on the difference of probability density functions. The starting point is two sets of normal distribution parameters to be compared (whether they may be considered identical at a given significance level). The absolute difference in probabilities at each 'point' of the domain of these distributions is then calculated. This measure is transformed into a function of cumulative distribution functions and compared to critical values; the table of critical values was designed from simulations. The approach was compared with other techniques for the univariate case and differs qualitatively and quantitatively in ease of implementation, computation speed, and accuracy of the critical region (theoretical vs. real significance level). Stable results when working with outliers and non-normal distributions, as well as scaling possibilities, are further strengths of the method. Its main advantage is the possibility of extension to the infinite-dimensional case, which was not possible in most previous work. At present, the extension to the two-dimensional case is complete, allowing up to five parameters to be tested jointly. The derived technique is therefore equivalent to classic tests in standard situations but offers more efficient alternatives for nonstandard problems and large amounts of data.
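The test statistic is described only verbally above; the sketch below implements one plausible reading of it, the integrated absolute difference between two normal densities (for two normals this integral is a function of the CDF values at the densities' crossing points). The function name and grid are illustrative, and the critical values would come from simulation as the paper describes:

```python
import numpy as np
from scipy import stats
from scipy.integrate import trapezoid

def abs_density_difference(mu1, sd1, mu2, sd2):
    """Integrated absolute difference of two normal densities,
    int |f1(x) - f2(x)| dx, by numerical integration."""
    lo = min(mu1 - 8 * sd1, mu2 - 8 * sd2)
    hi = max(mu1 + 8 * sd1, mu2 + 8 * sd2)
    x = np.linspace(lo, hi, 20001)
    diff = np.abs(stats.norm.pdf(x, mu1, sd1) - stats.norm.pdf(x, mu2, sd2))
    return trapezoid(diff, x)

# 0 for identical parameters; grows toward 2 as the distributions
# separate, and would be compared against simulated critical values.
print(abs_density_difference(0, 1, 0, 1))      # 0.0
print(abs_density_difference(0, 1, 0.5, 1.2))  # > 0
```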

Keywords: confidence set, cumulative distribution function, hypotheses testing, normal distribution, probability density function

Procedia PDF Downloads 146
8255 On Confidence Intervals for the Difference between Inverse of Normal Means with Known Coefficients of Variation

Authors: Arunee Wongkhao, Suparat Niwitpong, Sa-aat Niwitpong

Abstract:

In this paper, we propose two new confidence intervals for the difference between the inverses of normal means with known coefficients of variation. One is constructed from the generalized confidence interval, and the other from the closed-form method of variance estimation. We examine the performance of these confidence intervals in terms of coverage probabilities and expected lengths via Monte Carlo simulation.
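A generic sketch of how coverage probability and expected length are estimated by Monte Carlo simulation; the plain t-interval for a single mean below is a stand-in for the paper's two proposed intervals:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
mu, sigma, n, reps, alpha = 5.0, 1.0, 30, 10_000, 0.05
t = stats.t.ppf(1 - alpha / 2, n - 1)

covered, lengths = 0, []
for _ in range(reps):
    x = rng.normal(mu, sigma, n)
    half = t * x.std(ddof=1) / np.sqrt(n)
    lo, hi = x.mean() - half, x.mean() + half
    covered += lo <= mu <= hi
    lengths.append(hi - lo)

print("coverage probability:", covered / reps)  # target: 0.95
print("expected length:", round(np.mean(lengths), 4))
```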

Keywords: coverage probability, expected length, inverse of normal mean, coefficient of variation, generalized confidence interval, closed form method of variance estimation

Procedia PDF Downloads 278
8254 The Study of Rapeseed Characteristics by Factor Analysis under Normal and Drought Stress Conditions

Authors: Ali Bakhtiari Gharibdosti, Mohammad Hosein Bijeh Keshavarzi, Samira Alijani

Abstract:

To understand the relationships among characteristics and to determine the factors that explain the characteristics under consideration in rapeseed varieties, 10 rapeseed genotypes were grown in a completely randomized design with three replications under drought stress in 2009-2010 at the research field of the College of Agriculture, Islamic Azad University, Karaj Branch. Eleven characteristics related to the growth, production, and yield stages were considered. Analysis of variance showed a significant difference among the rapeseed varieties for these characteristics. Simple correlation coefficients under both normal and drought-stress conditions indicate that seed yield per plant and pod number have a positive and significant correlation with seed yield at the 1% probability level, so selection based on these characteristics would be effective for improving yield. Factor analysis showed that under normal conditions five factors had eigenvalues greater than one, together accounting for 82.72% of the total variance, whereas under drought stress four factors were identified, accounting for 76.78% of the total variance; a sketch of this criterion follows below. Considering the overall results, the characteristics found effective in the factor analysis, and the selection of their different components, can be used in breeding programs to select suitable, drought-tolerant genotypes.
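The eigenvalue-greater-than-one rule (the Kaiser criterion) and the explained-variance figures quoted above can be sketched as follows on a mock trait matrix; all sizes and data are illustrative:

```python
import numpy as np

rng = np.random.default_rng(4)
# Mock trait matrix: 30 plots x 11 measured characteristics
X = rng.normal(size=(30, 11))

R = np.corrcoef(X, rowvar=False)       # correlation matrix of the traits
eigvals = np.linalg.eigvalsh(R)[::-1]  # eigenvalues, sorted descending
kept = eigvals[eigvals > 1.0]          # Kaiser criterion

print("factors retained:", len(kept))
print("variance explained: %.2f%%" % (100 * kept.sum() / eigvals.sum()))
```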

Keywords: correlation, drought stress, factor analysis, rapeseed

Procedia PDF Downloads 151
8253 Comparison of Some Robust Regression Methods with the OLS Method, with Application

Authors: Sizar Abed Mohammed, Zahraa Ghazi Sadeeq

Abstract:

The classical least-squares (OLS) method estimates linear regression parameters and, when its assumptions hold, yields estimators with good properties such as unbiasedness, minimum variance, and consistency. When the data are contaminated with outliers, however, alternative robust (resistant) statistical techniques for estimating the parameters are needed. In this paper, three robust methods are studied: the maximum-likelihood-type estimate (M-estimator), the modified maximum-likelihood-type estimate (MM-estimator), and the Least Trimmed Squares estimate (LTS-estimator); their results are compared with the OLS method. The methods were applied to real data from the Duhok company for manufacturing furniture, and the results were compared using three criteria: Mean Squared Error (MSE), Mean Absolute Percentage Error (MAPE), and Mean Sum of Absolute Error (MSAE). Important conclusions of this study are as follows. In the furniture line, the outlying values detected by the four methods were similar and close to the data, indicating that the distribution of the errors is close to normal; in the doors-line data, OLS detected fewer outliers than the robust methods, indicating that the error distribution departs markedly from normality. Another important conclusion is that, for the doors line, the OLS parameter estimates are very far from those of the robust methods; the LTS-estimator gave the best results under the MSE criterion, the M-estimator under the MAPE criterion, and the MM-estimator under the MSAE criterion. The programs S-Plus (version 8.0, Professional 2007), Minitab (version 13.2), and SPSS (version 17) were used to analyze the data.
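statsmodels provides M-estimation directly through RLM; MM- and LTS-estimation are not part of statsmodels (they are available in R's robustbase, for instance), so this sketch contrasts OLS with a Huber M-estimator only, on synthetic contaminated data:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
x = rng.uniform(0, 10, 50)
y = 2.0 + 0.5 * x + rng.normal(0, 1, 50)
y[:5] += 15  # contaminate the first five responses with outliers
X = sm.add_constant(x)

ols = sm.OLS(y, X).fit()
m_est = sm.RLM(y, X, M=sm.robust.norms.HuberT()).fit()  # Huber M-estimator

print("OLS coefficients:  ", np.round(ols.params, 3))
print("M-estimator coeffs:", np.round(m_est.params, 3))

# MSE computed on the clean observations only
mse = lambda fit: np.mean((y[5:] - fit.predict(X)[5:]) ** 2)
print("MSE (clean points): OLS %.3f, M %.3f" % (mse(ols), mse(m_est)))
```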

Keywords: robust regression, LTS, M-estimator, MSE

Procedia PDF Downloads 208
8252 Controlling the Process of a Chicken Dressing Plant through Statistical Process Control

Authors: Jasper Kevin C. Dionisio, Denise Mae M. Unsay

Abstract:

In a manufacturing firm, controlling the process ensures that optimum efficiency, productivity, and quality are achieved. An operation with no standardized procedure yields poor productivity, inefficiency, and an out-of-control process. This study focuses on controlling the small-intestine processing of a chicken dressing plant through Statistical Process Control (SPC). Since the operation employed no standard procedure and had no established standard time, assessment of the observed times of the overall small-intestine processing operation, using an X-bar R control chart, found the process to be out of control. To solve this problem, the researchers conducted a motion and time study aiming to establish a standard procedure for the operation. The normal operator was selected using the Westinghouse Rating System. Instead of relying on the traditional motion and time study alone, the researchers used the X-bar R control chart to determine the process average on which the standard time is based. The observed times of the normal operator were recorded and plotted on the X-bar R control chart; out-of-control points due to assignable causes were removed, and the process average, the average time in which the normal operator performed the process, now in control and free from outliers, was obtained. The process average was then used to determine the standard time of small-intestine processing. The researchers recommend implementing the established standard time together with the standard procedure adopted from the normal operator; with this recommendation, the whole operation is expected to achieve a 45.54% increase in productivity.
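A sketch of the X-bar R computation on mock cycle-time data; the control-chart constants A2, D3, and D4 below are the standard values for subgroups of size five, and the times are invented:

```python
import numpy as np

A2, D3, D4 = 0.577, 0.0, 2.114  # X-bar R constants for subgroup size n = 5

rng = np.random.default_rng(6)
# Mock observed times (minutes): 20 subgroups of 5 cycles each
times = rng.normal(4.0, 0.3, size=(20, 5))

xbar = times.mean(axis=1)
r = times.max(axis=1) - times.min(axis=1)
xbar_bar, r_bar = xbar.mean(), r.mean()

ucl_x, lcl_x = xbar_bar + A2 * r_bar, xbar_bar - A2 * r_bar
ucl_r, lcl_r = D4 * r_bar, D3 * r_bar

# Subgroups breaching the X-bar limits would be screened for
# assignable causes and removed before fixing the standard time.
out = np.where((xbar > ucl_x) | (xbar < lcl_x))[0]
print("process average:", round(xbar_bar, 3), "| out-of-control:", out)
```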

Keywords: motion and time study, process controlling, statistical process control, X-Bar R Control chart

Procedia PDF Downloads 180
8251 An Empirical Study of the Best Fitting Probability Distributions for Stock Returns Modeling

Authors: Jayanta Pokharel, Gokarna Aryal, Netra Kanaal, Chris Tsokos

Abstract:

Investment in stocks and shares aims to seek potential gains while weighing the risk of future needs, such as retirement or children's education. Analyzing the behavior of stock market returns and making predictions is important for investors to mitigate investment risk. Historically, normal variance models have been used to describe the behavior of stock market returns. However, the returns of financial assets are actually skewed, with higher kurtosis, heavier tails, and a higher peak than the normal distribution, so the Laplace distribution and its family are natural candidates for modeling stock returns. The Variance-Gamma (VG) distribution is among the most sought-after distributions for modeling asset returns and has been extensively discussed in the financial literature. In this paper, we explore other members of the Laplace family, such as the Asymmetric Laplace, Skewed Laplace, and Kumaraswamy Laplace (KS) distributions, together with the Variance-Gamma, to model the weekly returns of the S&P 500 Index and its eleven business sector indices. The method of maximum likelihood is employed to estimate the parameters of the distributions, and our empirical inquiry shows that the Kumaraswamy Laplace distribution performs much better for stock-return modeling than the other distributions used in this study; in practice, KS can be used as a strong alternative to the VG distribution.
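The Kumaraswamy Laplace and Variance-Gamma distributions are not shipped with SciPy, but the maximum-likelihood comparison itself can be sketched with the family members SciPy does provide (normal, Laplace, asymmetric Laplace), on synthetic heavy-tailed returns:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# Synthetic weekly returns standing in for the S&P 500 series
returns = rng.standard_t(df=4, size=520) * 0.02

candidates = {
    "normal": stats.norm,
    "Laplace": stats.laplace,
    "asymmetric Laplace": stats.laplace_asymmetric,
}
for name, dist in candidates.items():
    params = dist.fit(returns)  # maximum likelihood fit
    ll = dist.logpdf(returns, *params).sum()
    aic = 2 * len(params) - 2 * ll
    print(f"{name:20s} log-lik {ll:9.2f}  AIC {aic:9.2f}")
```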

Keywords: stock returns, variance-gamma, kumaraswamy laplace, maximum likelihood

Procedia PDF Downloads 41
8250 Estimation of the Mean of the Selected Population

Authors: Kalu Ram Meena, Aditi Kar Gangopadhyay, Satrajit Mandal

Abstract:

Two normal populations with different means and a common known variance are considered. The population with the smaller sample mean is selected, and various estimators are constructed for the mean of the selected normal population. Finally, the estimators are compared with respect to their bias and MSE risks by Monte Carlo simulation, and their performance is analysed with the help of graphs.
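The selection step is what makes the problem non-trivial: the sample mean of the selected population is biased downward. A Monte Carlo sketch of that baseline bias, with illustrative parameter values, which the estimators compared in the paper aim to correct:

```python
import numpy as np

rng = np.random.default_rng(8)
mu1, mu2, sigma, n, reps = 0.0, 0.5, 1.0, 20, 20_000

est = np.empty(reps)
truth = np.empty(reps)
for i in range(reps):
    x1 = rng.normal(mu1, sigma, n)
    x2 = rng.normal(mu2, sigma, n)
    # select the population with the smaller sample mean
    if x1.mean() < x2.mean():
        est[i], truth[i] = x1.mean(), mu1
    else:
        est[i], truth[i] = x2.mean(), mu2

print("bias:", round(np.mean(est - truth), 4))  # negative
print("MSE: ", round(np.mean((est - truth) ** 2), 4))
```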

Keywords: estimation after selection, Brewster-Zidek technique, estimators, selected populations

Procedia PDF Downloads 476
8249 Bias in the Estimation of Covariance Matrices and Optimality Criteria

Authors: Juan M. Rodriguez-Diaz

Abstract:

The precision of parameter estimators in the Gaussian linear model is traditionally described by the variance-covariance matrix of the asymptotic distribution. However, this measure can underestimate the true variance, especially for small samples. Optimal design theory traditionally addresses this variance through its relationship with the model's information matrix. For this reason it seems convenient, at least in some cases, to adapt the optimality criteria in order to obtain the best designs for the actual variance structure; otherwise, the loss in efficiency of the designs obtained with the traditional approach may be substantial.

Keywords: correlated observations, information matrix, optimality criteria, variance-covariance matrix

Procedia PDF Downloads 404
8248 The Comparison of Emotional Regulation Strategies and Psychological Symptoms in Patients with Multiple Sclerosis and Normal Individuals

Authors: Amir Salamatzade, Marhamet HematPour

Abstract:

Given the increasing importance of psychological factors in the incidence and exacerbation of chronic diseases such as multiple sclerosis, the aim of this study was to determine the differences in emotional regulation strategies and psychological symptoms between patients with multiple sclerosis and normal individuals. The research method was causal-comparative (ex post facto). The statistical population included all patients with multiple sclerosis referred to the MS Association of Rasht in the first quarter of 2021, approximately 350 people. The study sample comprised 120 people (60 patients with multiple sclerosis and 60 normal individuals), selected by convenience sampling, who completed an emotional regulation questionnaire and the depression, anxiety, and stress scales of Lovibond and Lovibond (1995). Data were analyzed using an independent t-test and multivariate analysis of variance. The results showed a significant difference between the two groups in mean emotional regulation strategies and in the components of emotional reappraisal and emotional suppression (p < 0.01). There was also a significant difference between the two groups in mean psychological symptoms and in the components of depression, anxiety, and stress (p < 0.01). It can therefore be concluded that patients with multiple sclerosis have lower levels of emotional regulation strategies and higher levels of psychological symptoms than normal individuals.

Keywords: emotional regulation strategies, psychological symptoms, multiple sclerosis, normal Individuals

Procedia PDF Downloads 183
8247 Reliability Prediction of Tires Using Linear Mixed-Effects Model

Authors: Myung Hwan Na, Ho- Chun Song, EunHee Hong

Abstract:

Normal linear mixed-effects models are widely used to analyze repeated-measurement data. When heteroscedasticity and non-normality of the population distribution are detected at the same time, however, the normal linear mixed-effects model can give improper results. To achieve more robust estimation, we use a heavy-tailed linear mixed-effects model, which gives more exact and reliable conclusions than the standard normal linear mixed-effects model.

Keywords: reliability, tires, field data, linear mixed-effects model

Procedia PDF Downloads 534
8246 Markov Switching of Conditional Variance

Authors: Josip Arneric, Blanka Skrabic Peric

Abstract:

Forecasting volatility, i.e., return fluctuations, has been a topic of interest to portfolio managers, option traders, and market makers seeking higher profits or less risky positions. Based on the facts that volatility is time-varying in high-frequency data and that periods of high volatility tend to cluster, the most commonly used models are GARCH-type models. As standard GARCH models show high volatility persistence, i.e., integrated behaviour of the conditional variance, it is difficult to predict volatility using them. Due to these practical limitations, different approaches based on Markov switching models have been proposed in the literature. In such situations, models in which the parameters are allowed to change over time are more appropriate because they let part of the model depend on the state of the economy. The empirical analysis demonstrates that the Markov switching GARCH model resolves the problem of excessive persistence and outperforms uni-regime GARCH models in forecasting volatility for selected emerging markets.
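A self-contained sketch of the mechanism at issue: returns whose variance follows a two-state Markov chain show volatility clustering, which a single-regime GARCH fit tends to read as near-unit persistence. The regime volatilities and transition matrix below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(9)
T = 2000
sigma = np.array([0.5, 2.0])  # low- and high-volatility regimes
P = np.array([[0.98, 0.02],   # P[i, j] = Pr(next state j | state i)
              [0.04, 0.96]])

states = np.zeros(T, dtype=int)
for t in range(1, T):
    states[t] = rng.choice(2, p=P[states[t - 1]])
returns = rng.normal(0.0, sigma[states])

# Clustering shows up as autocorrelated squared returns, the
# signature a uni-regime GARCH absorbs as excessive persistence.
sq = returns ** 2
acf1 = np.corrcoef(sq[:-1], sq[1:])[0, 1]
print("lag-1 autocorrelation of squared returns:", round(acf1, 3))
```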

Keywords: emerging markets, Markov switching, GARCH model, transition probabilities

Procedia PDF Downloads 430
8245 Cross-Validation of the Data Obtained for ω-6 Linoleic and ω-3 α-Linolenic Acids Concentration of Hemp Oil Using Jackknife and Bootstrap Resampling

Authors: Vibha Devi, Shabina Khanam

Abstract:

Hemp (Cannabis sativa) is rich in the essential fatty acids ω-6 linoleic and ω-3 α-linolenic acid in a ratio of 3:1, a rare and highly desirable ratio that enhances the quality of hemp oil. These components are beneficial for cell development and body growth, strengthen the immune system, possess anti-inflammatory action, lower the risk of heart problems owing to their anti-clotting property, and serve as a remedy for arthritis and various disorders. The present study applies supercritical fluid extraction (SFE) to hemp seed at various parameter conditions: temperature (40-80) °C, pressure (200-350) bar, flow rate (5-15) g/min, particle size (0.430-1.015) mm, and co-solvent amount (0-10)% of the solvent flow rate, arranged through a central composite design (CCD). The CCD suggested 32 sets of experiments, which were carried out. As the SFE process involves a large number of variables, the present study recommends resampling techniques for cross-validation of the obtained data. Cross-validation refits the model on each resample to obtain information about the error, variability, deviation, etc. Bootstrap and jackknife are the most popular resampling techniques; they create a large number of datasets by resampling from the original dataset and analyze them to check the validity of the obtained data. Jackknife resampling is based on eliminating one observation from the original sample of size N without replacement; here the resample size is 31 (one observation removed), repeated 32 times. Bootstrap is the frequently used statistical approach of estimating the sampling distribution of an estimator by resampling with replacement from the original sample; here the resample size is 32, repeated 100 times. The estimands considered are the mean, standard deviation, coefficient of variation, and standard error of the mean. For the ω-6 linoleic acid concentration, the mean value was approximately 58.5% for both resampling methods, the average (central value) of the resample means; similarly, for the ω-3 α-linolenic acid concentration, the mean was 22.5% under both resampling methods. Variance measures the spread of the data about the mean; a greater variance indicates a wider range of output data: 18 for ω-6 linoleic acid (ranging from 48.85 to 63.66%) and 6 for ω-3 α-linolenic acid (ranging from 16.71 to 26.2%). Further, the low standard deviation (approximately 1%), low standard error of the mean (< 0.8), and low coefficient of variation (< 0.2) reflect the accuracy of the sample for prediction. All estimates of the coefficient of variation, standard deviation, and standard error of the mean fall within the 95% confidence interval.
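A sketch of the two resampling schemes as described above (leave-one-out jackknife over the 32 runs; 100 bootstrap resamples of size 32), applied to a mock concentration sample:

```python
import numpy as np

rng = np.random.default_rng(10)
# Mock sample standing in for the 32 CCD runs (omega-6 conc., %)
sample = rng.normal(58.5, 1.0, 32)
n = sample.size

# Jackknife: n leave-one-out resamples of size n - 1
jack = np.array([np.delete(sample, i).mean() for i in range(n)])
se_jack = np.sqrt((n - 1) / n * np.sum((jack - jack.mean()) ** 2))

# Bootstrap: resample with replacement, 100 times as in the paper
boot = np.array([rng.choice(sample, n, replace=True).mean()
                 for _ in range(100)])
se_boot = boot.std(ddof=1)

print(f"mean {sample.mean():.2f}  jackknife SE {se_jack:.3f}  "
      f"bootstrap SE {se_boot:.3f}")
```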

Keywords: resampling, supercritical fluid extraction, hemp oil, cross-validation

Procedia PDF Downloads 120
8244 A Generalized Family of Estimators for Estimation of Unknown Population Variance in Simple Random Sampling

Authors: Saba Riaz, Syed A. Hussain

Abstract:

This paper addresses the estimation of the unknown population variance of the variable of interest. A new generalized class of estimators of the finite population variance is suggested using auxiliary information; to improve the precision of the proposed class, the known population variance of the auxiliary variable is used. Mathematical expressions for the biases and the asymptotic variances of the suggested class are derived under a large-sample approximation. Theoretical and numerical comparisons are made to investigate the performance of the proposed class of estimators. The empirical study reveals that the suggested class performs better than the usual estimator, the classical ratio estimator, the classical product estimator, and the classical linear regression estimator, and that it is also more efficient than some recently published estimators.
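The generalized class itself is not spelled out in the abstract, but the classical ratio estimator of the variance that it is benchmarked against, which rescales the sample variance by the known-to-sampled variance ratio of the auxiliary variable, can be sketched as follows:

```python
import numpy as np

rng = np.random.default_rng(11)
# Finite population with an auxiliary variable x correlated with y
N, n = 5000, 200
x_pop = rng.gamma(4.0, 2.0, N)
y_pop = 3.0 * x_pop + rng.normal(0, 4.0, N)
Sx2 = x_pop.var(ddof=1)  # known population variance of the auxiliary

idx = rng.choice(N, n, replace=False)  # simple random sample
y, x = y_pop[idx], x_pop[idx]

sy2, sx2 = y.var(ddof=1), x.var(ddof=1)
ratio_est = sy2 * Sx2 / sx2  # classical ratio estimator of S_y^2

print("true S_y^2:", round(y_pop.var(ddof=1), 2),
      "| usual:", round(sy2, 2), "| ratio:", round(ratio_est, 2))
```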

Keywords: study variable, auxiliary variable, finite population variance, bias, asymptotic variance, percent relative efficiency

Procedia PDF Downloads 193
8243 Effect of Steel Fibers on Flexural Behavior of Normal and High Strength Concrete

Authors: K. M. Aldossari, W. A. Elsaigh, M. J. Shannag

Abstract:

An experimental study was conducted to investigate the effect of hooked-end steel fibers on the flexural behavior of normal and high strength concrete matrices, and to determine the fiber content appropriate for the matrices investigated based on flexural tests on standard prisms. Parameters investigated include matrix compressive strength ranging from 45 MPa to 70 MPa, corresponding to normal and high strength concrete matrices respectively, and fiber volume fractions of 0, 0.5%, 0.76%, and 1%, equivalent to 0, 40, 60, and 80 kg/m3 of hooked-end steel fibers respectively. Test results indicated that the flexural strength and toughness of normal and high strength concrete matrices were significantly improved with increasing fiber content, whereas only a slight improvement in compressive strength was observed for the same matrices. Furthermore, the effect of increasing the fiber content was more pronounced on the flexural strength of high strength concrete than on that of normal concrete.

Keywords: concrete, flexural strength, toughness, steel fibers

Procedia PDF Downloads 462
8242 Distributed Energy Storage as a Potential Solution to Electrical Network Variance

Authors: V. Rao, A. Bedford

Abstract:

As the efficient performance of the national grid becomes increasingly important for maintaining electrical network stability, the balance between generation and demand must be effectively maintained, and any losses that occur in the power network must be reduced by compensating for them. In this paper, one of the main causes of losses in the network is identified as variance, which hinders the grid's power-carrying capacity. The source of this variance is investigated and identified as the growing integration of renewable energy sources (RES) such as wind and solar power: the intermittent nature of these RES, along with fluctuating demand, gives rise to variance in the electrical network. The losses that occur during this process are estimated by analyzing the network's power profiles. While researchers have identified different ways to tackle this problem, little consideration has been given to energy storage. This paper seeks to redress this by considering energy storage systems as potential solutions for reducing variance in the network. The implementation of suitable energy storage systems for different applications is presented as part of a variance-reduction method, thus contributing towards maintaining stable and efficient grid operation.

Keywords: energy storage, electrical losses, national grid, renewable energy, variance

Procedia PDF Downloads 285
8241 Normal Weight Obesity among Female Students: BMI as a Non-Sufficient Tool for Obesity Assessment

Authors: Krzysztof Plesiewicz, Izabela Plesiewicz, Krzysztof Chiżyński, Marzenna Zielińska

Abstract:

Background: Obesity is an independent risk factor for cardiovascular diseases. Several anthropometric parameters have been proposed to estimate the level of obesity, but there is as yet no agreement on which one best predicts cardiometabolic risk. Scientists have defined metabolically obese normal-weight individuals, who suffer from the same metabolic abnormalities as obese individuals, and named this syndrome normal weight obesity (NWO). Aim of the study: The aim of our study was to determine the occurrence of overweight and obesity in a cohort of young adult women, using standard and complementary methods of obesity assessment, and to identify those at risk of obesity. The second aim was to test additional methods of obesity assessment and to show that body mass index used alone is not a sufficient parameter for obesity assessment. Materials and methods: 384 young women, aged 18-32, were enrolled in the study. Standard anthropometric parameters (waist-to-hip ratio (WTH), waist-to-height ratio (WTHR)) and two further methods of body fat percentage measurement (BFPM) were used: electrical bioimpedance analysis (BIA) and the skinfold measurement test with a digital body fat caliper (SFM). Results: In the study group, 5% and 7% of participants had waist-to-hip and waist-to-height ratio values, respectively, associated with visceral obesity. According to BMI, 14% of participants were overweight or obese. Using the additional methods of body fat assessment, 54% (BIA) and 43% (SFM) were obese. Among participants with normal or below-normal BMI (not overweight, n = 340), there were individuals with BFPM above the upper limit: 49% (n = 164) by BIA and 36% (n = 125) by SFM. Statistical analysis revealed a strong correlation between the BIA and SFM methods. Conclusion: BMI used alone is not a sufficient parameter for obesity assessment. A high percentage of young women with normal BMI values appear to be normal-weight obese.

Keywords: electrical bioimpedance, normal weight obesity, skin-fold measurement test, women

Procedia PDF Downloads 242
8240 The Effect of Excel on Undergraduate Students’ Understanding of Statistics and the Normal Distribution

Authors: Masomeh Jamshid Nejad

Abstract:

Nowadays, statistical literacy is no longer merely a desirable skill but an essential one, with broad applications across diverse fields, especially in operational decision areas such as business management, finance, and economics. Learning and a deep understanding of statistical concepts are therefore essential in the context of business studies. One of the crucial topics in statistical theory and its application is the normal distribution, often called the bell-shaped curve. To interpret data and conduct hypothesis tests, comprehending the properties of the normal distribution (the mean and standard deviation) is essential for business students; undergraduate students in economics and business management must be able to visualize and work with normally distributed data. Since technology is now interconnected with education, it is important to teach statistics topics in the context of Python, RStudio, and Microsoft Excel. This research endeavours to shed light on the effect of Excel-based instruction on learners' knowledge of statistics, specifically the central concept of the normal distribution. Two groups of undergraduate students from the Business Management program were compared: one group underwent Excel-based instruction, while the other relied only on traditional teaching methods. We analyzed experimental data and BBA participants' responses to statistics questions focusing on the normal distribution and its key attributes, such as the mean and standard deviation. The results indicate that Excel-based learning helps students comprehend statistical concepts more effectively than traditional teaching. In addition, students receiving Excel-based instruction showed greater ability in visualizing and interpreting normally distributed data.

Keywords: statistics, excel-based instruction, data visualization, pedagogy

Procedia PDF Downloads 29
8239 Impact of Yogic Exercise on Cardiovascular Function on Selected College Students of High Altitude

Authors: Benu Gupta

Abstract:

The purpose of the study was to assess the impact of yogic exercise on the cardiovascular function of selected college students at high altitude. The research was conducted on college students at high altitude in Shimla, examining their cardiovascular function [blood pressure (BP), VO2 max (TLC), and pulse rate (PR)] with respect to yogic exercise. A total of 139 students were randomly selected from Himachal University colleges in Shimla. The study was conducted in three phases: subjects were identified in the first phase; in the next phase they were physiologically tested, and the yogic exercise battery was administered over different time frames. All subjects underwent three months of yogic exercise and were then re-evaluated physiologically [cardiovascular measurements: blood pressure (BP), VO2 max (TLC), and pulse rate (PR)] with standard equipment. Analysis of variance was carried out for PR, BP (SBP and DBP), and TLC. The results reveal a significant difference in TLC, whereas there was no significant difference in PR; for BP, the statistical analysis likewise suggests no significant difference. Participants' BP tended towards the normal standard of 120/80 mmHg.

Keywords: cardiovascular function, college students, high altitude, yogic exercise

Procedia PDF Downloads 203
8238 Using the Bootstrap for Problems in Statistics

Authors: Brahim Boukabcha, Amar Rebbouh

Abstract:

The bootstrap method, based on the idea of exploiting all the information provided by the initial sample, allows us to study the properties of estimators. In this article we present a theoretical study of the different bootstrap methods, using the resampling technique in statistical inference to calculate the standard error of an estimator of the mean and to determine a confidence interval for an estimated parameter. We apply and test these methods on regression models and the Pareto model, obtaining good approximations.
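A sketch of the two quantities named above, the bootstrap standard error and a percentile confidence interval, computed for the mean of a synthetic Pareto sample:

```python
import numpy as np

rng = np.random.default_rng(12)
# Heavy-tailed sample (Pareto), as in the article's application
sample = (rng.pareto(3.0, 60) + 1.0) * 2.0

B = 5000
boot_means = np.array([rng.choice(sample, sample.size, replace=True).mean()
                       for _ in range(B)])

se = boot_means.std(ddof=1)                      # bootstrap standard error
lo, hi = np.percentile(boot_means, [2.5, 97.5])  # percentile 95% CI

print(f"mean {sample.mean():.3f}  SE {se:.3f}  95% CI ({lo:.3f}, {hi:.3f})")
```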

Keywords: bootstrap, standard error, bias, jackknife, mean, median, variance, confidence interval, regression models

Procedia PDF Downloads 355
8237 Normal Meniscal Extrusion Using Ultrasonography during Different Ranges of Motion

Authors: Arash Sharafat Vaziri, Leila Aghaghazvini, Soodeh Jahangiri, Mohammad Tahami, Roham Borazjani, Mohammad Naghi Tahmasebi, Hamid Rabie, Hesan Jelodari Mamaghani, Fardis Vosoughi, Maryam Salimi

Abstract:

Aims: It is essential to know normal extrusion measures in order to detect pathological ones. In this study, we aimed to define normal reference values for meniscal extrusion in healthy knees during different ranges of motion. Methods: The anterior and posterior meniscal extrusion of twenty-one asymptomatic volunteers (42 knees) was measured at 0, 45, and 90 degrees of knee flexion using an ultrasound machine. Repeated-measures analysis of variance (ANOVA) was used to assess the interaction between the amount of meniscal extrusion and the degree of knee flexion. Results: The anterior portion of the lateral menisci at full knee extension (0.59 ± 1.40 mm) and the posterior portion of the medial menisci at 90° flexion (3.06 ± 2.36 mm) showed the smallest and largest mean extrusion, respectively. The normal mean anterior extrusion was 1.12 ± 1.17 mm for the medial and 0.99 ± 1.34 mm for the lateral menisci. Posterior extrusion increased significantly with flexion in both medial and lateral menisci (F = 20.250 and 11.298; both p < 0.001), measuring 2.37 ± 2.16 mm and 1.53 ± 2.18 mm, respectively. Conclusion: The medial meniscus can normally extrude 1.74 ± 1.84 mm, while the lateral meniscus can extrude 1.26 ± 1.82 mm. These measures commonly increased with greater knee flexion, and the posterior portion showed more extrusion than the anterior portion on both sides.

Keywords: meniscal extrusion, ultrasonography, knee

Procedia PDF Downloads 68
8236 Dimensionality Reduction in Modal Analysis for Structural Health Monitoring

Authors: Elia Favarelli, Enrico Testi, Andrea Giorgetti

Abstract:

Autonomous structural health monitoring (SHM) of structures and bridges has become a topic of paramount importance for maintenance purposes and safety reasons. This paper proposes a set of machine learning (ML) tools to perform automatic feature selection and anomaly detection in a bridge from vibrational data, and compares different feature extraction schemes to increase accuracy and reduce the amount of data collected. As a case study, the Z-24 bridge is considered because of its extensive database of accelerometric data in both standard and damaged conditions. The proposed framework starts from the first four fundamental frequencies extracted through operational modal analysis (OMA) and clustering, followed by density-based time-domain filtering (tracking). The extracted fundamental frequencies are then fed to a dimensionality reduction block implemented through two different approaches: feature selection (an intelligent multiplexer) that tries to estimate the most reliable frequencies based on the evaluation of statistical features (i.e., mean value, variance, kurtosis), and feature extraction (an auto-associative neural network (ANN)) that combines the fundamental frequencies to extract new damage-sensitive features in a low-dimensional feature space. Finally, one-class classifier (OCC) algorithms perform anomaly detection, trained on standard-condition points and tested on normal and anomalous ones. In particular, a new anomaly detection strategy is proposed, namely one-class classifier neural network two (OCCNN2), which exploits the classification capability of standard classifiers in an anomaly detection problem, finding the standard class (the boundary of the feature space in normal operating conditions) through a two-step approach: coarse and fine boundary estimation. The coarse estimation uses classic OCC techniques, while the fine estimation is performed through a feedforward neural network (NN) that exploits the boundaries estimated in the coarse step. The detection algorithms are then compared with known methods based on principal component analysis (PCA), kernel principal component analysis (KPCA), and the auto-associative neural network (ANN). In many cases, the proposed solution improves on the standard OCC algorithms in terms of F1 score and accuracy. In particular, by evaluating the correct features, anomalies can be detected with accuracy and an F1 score greater than 96% with the proposed method.
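One of the OCC baselines named above is PCA; a minimal sketch of that baseline, scoring anomalies by their reconstruction error against a subspace fitted on normal-condition data only (the frequencies, sizes, and threshold are illustrative):

```python
import numpy as np

rng = np.random.default_rng(13)
# Mock feature vectors: four tracked fundamental frequencies (Hz)
normal = rng.normal([3.9, 5.0, 9.8, 10.3], 0.05, size=(500, 4))
damaged = rng.normal([3.7, 4.8, 9.5, 10.0], 0.05, size=(50, 4))

# Fit the "standard class" on normal data only
mu = normal.mean(axis=0)
_, _, Vt = np.linalg.svd(normal - mu, full_matrices=False)
W = Vt[:2]  # keep two principal components

def score(X):
    """Reconstruction error: distance from the normal subspace."""
    Z = (X - mu) @ W.T
    return np.linalg.norm((X - mu) - Z @ W, axis=1)

thr = np.percentile(score(normal), 99)  # boundary from normal data
print("false-alarm rate:", np.mean(score(normal) > thr))
print("detection rate:  ", np.mean(score(damaged) > thr))
```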

Keywords: anomaly detection, frequencies selection, modal analysis, neural network, sensor network, structural health monitoring, vibration measurement

Procedia PDF Downloads 92
8235 Reliability Analysis of Construction Schedule Plan Based on Building Information Modelling

Authors: Lu Ren, You-Liang Fang, Yan-Gang Zhao

Abstract:

In recent years, the application of BIM (Building Information Modelling) to construction schedule planning has been the focus of more and more researchers. In order to assess whether a BIM-based construction schedule plan is reasonable, that is, whether the schedule can be completed on time, some researchers have introduced reliability theory for its evaluation. In this evaluation, the uncertain factors affecting the construction schedule plan are regarded as random variables, and their probability distributions are assumed to be normal, determined by two parameters evaluated from the mean and standard deviation of statistical data. However, in practical engineering, most of the uncertain influencing factors are not normal random variables, so the evaluation results will be unreasonable under the assumption that the random variables follow normal distributions. To obtain a more reasonable evaluation, it is necessary to describe the distributions of the random variables more comprehensively. For this purpose, the cubic normal distribution, which is determined by the first four moments (mean, standard deviation, skewness, and kurtosis), is introduced in this paper to describe the distribution of an arbitrary random variable. The BIM model is first built according to the design information of the structure, and the construction schedule plan is made based on BIM; the cubic normal distribution is then used to describe the distributions of the random variables, based on statistical data collected on the random factors influencing the schedule. The reliability analysis of the BIM-based construction schedule plan can then be carried out more reasonably, and more accurate evaluation results can be given, providing a reference for implementing the actual construction schedule plan. In the last part of the paper, the efficiency and accuracy of the proposed methodology are demonstrated through a practical engineering case.
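The cubic normal distribution is fitted from the first four moments of the observed data; a sketch of estimating those moments for one uncertain factor (the durations below are mock data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(14)
# Mock durations (days) of one activity in the schedule plan
durations = rng.lognormal(mean=2.0, sigma=0.3, size=200)

m = durations.mean()
s = durations.std(ddof=1)
skew = stats.skew(durations)
kurt = stats.kurtosis(durations, fisher=False)  # 3 for a normal variable

print(f"mean {m:.2f}  std {s:.2f}  skewness {skew:.2f}  kurtosis {kurt:.2f}")
# A normal fit would force skewness 0 and kurtosis 3, misstating the
# tail probabilities on which on-time completion reliability depends.
```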

Keywords: BIM, construction schedule plan, cubic normal distribution, reliability analysis

Procedia PDF Downloads 108
8234 Contrasted Mean and Median Models in Egyptian Stock Markets

Authors: Mai A. Ibrahim, Mohammed El-Beltagy, Motaz Khorshid

Abstract:

Emerging-market return distributions show significant departures from normality: they are characterized by fatter tails relative to the normal distribution and exhibit pronounced skewness and kurtosis. The classical Markowitz mean-variance model is therefore not applicable to emerging markets, since it assumes normally distributed returns (with zero skewness and excess kurtosis) and a quadratic utility function. Markowitz mean-variance analysis can still be used in cases of moderate non-normality, where it provides a good approximation of expected utility, but it may be ineffective under large departures from normality. Higher-moment models and median models have been suggested in the literature for asset allocation in this case. Higher-moment models account for the insufficiency of describing a portfolio by only its first two moments, while the median has been introduced as a robust statistic that is less affected by outliers than the mean. Tail-risk measures such as Value-at-Risk (VaR) and Conditional Value-at-Risk (CVaR) have been introduced in place of variance to capture the effect of risk. In this research, higher-moment models, including Mean-Variance-Skewness (MVS) and Mean-Variance-Skewness-Kurtosis (MVSK), are formulated as single-objective non-linear programming (NLP) problems, and median models, including Median-Value-at-Risk (MedVaR) and Median-Mean-Absolute-Deviation (MedMAD), are formulated as single-objective mixed-integer linear programming (MILP) problems. The higher-moment and median models are compared to several benchmark portfolios and tested on real financial data from the Egyptian main index, EGX30. The results show that all the median models outperform the higher-moment models, providing higher final wealth for the investor over the entire study period. In addition, the results confirm the inapplicability of the classical Markowitz mean-variance model to the Egyptian stock market, as it resulted in very low realized profits.
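A sketch of the two tail-risk measures mentioned above, computed empirically from synthetic fat-tailed returns (the parameters are illustrative):

```python
import numpy as np

rng = np.random.default_rng(15)
# Synthetic daily returns with fat tails, as in emerging markets
returns = rng.standard_t(df=3, size=2500) * 0.01

alpha = 0.05
var = -np.quantile(returns, alpha)       # Value-at-Risk: loss quantile
cvar = -returns[returns <= -var].mean()  # expected loss beyond VaR

print(f"95% VaR  {var:.4f}")
print(f"95% CVaR {cvar:.4f}")            # CVaR >= VaR by construction
```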

Keywords: Egyptian stock exchange, emerging markets, higher moment models, median models, mixed-integer linear programming, non-linear programming

Procedia PDF Downloads 284
8233 Detection and Classification of Myocardial Infarction Using New Extracted Features from Standard 12-Lead ECG Signals

Authors: Naser Safdarian, Nader Jafarnia Dabanloo

Abstract:

In this paper, four features, the Q-wave integral, QRS-complex integral, T-wave integral, and total integral, extracted from normal and patient ECG signals, are used to detect and localize myocardial infarction (MI) in the left ventricle of the heart. Our research focuses on detection and localization of MI in the standard ECG; the Q-wave and T-wave integrals are used because these features are important indicators in MI detection. Pattern recognition methods such as artificial neural networks (ANN), which have good accuracy for classifying normal and abnormal signals, were used to detect and localize the MI. One type of radial basis function (RBF) network, the Probabilistic Neural Network (PNN), was used because of its nonlinearity, along with other classifiers such as k-Nearest Neighbors (KNN), Multilayer Perceptron (MLP), and naive Bayes. The PhysioNet database was used for training and test data; we reached over 80% test accuracy for localization and over 95% for detection of MI. The main advantages of the method are its simplicity and good accuracy, and classification accuracy can be improved further by adding more features. A simple method based on only four features extracted from the standard ECG is thus presented that has good accuracy in MI localization.
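A sketch of the classification stage on mock feature vectors of the four integrals; the class means and spreads are invented, and KNN stands in for the classifiers compared in the paper:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(16)
# Mock 4-feature vectors: Q-wave, QRS, T-wave and total integrals
normal = rng.normal([0.8, 5.0, 1.2, 7.0], 0.3, size=(200, 4))
mi = rng.normal([1.4, 4.2, 0.6, 6.2], 0.3, size=(200, 4))
X = np.vstack([normal, mi])
y = np.r_[np.zeros(200), np.ones(200)]  # 0 = normal, 1 = MI

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          random_state=0)
clf = KNeighborsClassifier(n_neighbors=5).fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
```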

Keywords: ECG signal processing, myocardial infarction, features extraction, pattern recognition

Procedia PDF Downloads 428
8232 Spectral Analysis of Heart Rate Variability for Normal and Preeclamptic Pregnants

Authors: Abdulnasir Hossen, Alaa Barhoum, Deepali Jaju, V. Gowri, L. Al-Kharusi, M. Hassan, K. Al-Hashmi

Abstract:

Preeclampsia is a pregnancy disorder associated with increased blood pressure and an excess amount of protein in the urine. HRV analysis has been used by many researchers to distinguish preeclamptic from normal pregnancy. A study to identify preeclamptic pregnancy in Oman was conducted on 40 subjects (20 patients and 20 normal controls) collected from two hospitals in Oman. Fast Fourier transform (FFT) spectral analysis showed that patients with preeclamptic pregnancy have reduced power in the HF band and increased power in the LF band of HRV compared with subjects with normal pregnancy. The identification accuracy obtained was 80%.
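A sketch of how LF and HF band powers are commonly computed from an RR-interval series; Welch's method is used here in place of a raw FFT periodogram, and the synthetic series, band edges (0.04-0.15 Hz LF, 0.15-0.40 Hz HF), and sampling rate are illustrative:

```python
import numpy as np
from scipy.integrate import trapezoid
from scipy.signal import welch

rng = np.random.default_rng(17)
fs = 4.0  # Hz: RR-interval series resampled to 4 Hz
t = np.arange(0, 300, 1 / fs)
# Synthetic RR series with LF (0.1 Hz) and HF (0.25 Hz) oscillations
rr = (0.8 + 0.03 * np.sin(2 * np.pi * 0.10 * t)
          + 0.02 * np.sin(2 * np.pi * 0.25 * t)
          + 0.005 * rng.standard_normal(t.size))

f, pxx = welch(rr - rr.mean(), fs=fs, nperseg=512)
lf_band = (f >= 0.04) & (f < 0.15)
hf_band = (f >= 0.15) & (f < 0.40)
lf = trapezoid(pxx[lf_band], f[lf_band])
hf = trapezoid(pxx[hf_band], f[hf_band])
print(f"LF {lf:.2e}  HF {hf:.2e}  LF/HF {lf / hf:.2f}")
```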

Keywords: preelampsia, pregnancy hypertension, normal pregnant, FFT, spectral analysis, HRV

Procedia PDF Downloads 526
8231 Appropriate Depth of Needle Insertion during Rhomboid Major Trigger Point Block

Authors: Seongho Jang

Abstract:

Objective: To investigate the appropriate depth of needle insertion during trigger point injection into the rhomboid major muscle. Methods: Sixty-two patients who visited our department with shoulder or upper back pain participated in this study. The distance between the skin and the rhomboid major muscle (SM) and the distance between the skin and the rib (SB) were measured using ultrasonography. The subjects were divided into three groups according to BMI: less than 23 kg/m2 (underweight or normal group); 23 kg/m2 or more to less than 25 kg/m2 (overweight group); and 25 kg/m2 or more (obese group). The mean ± standard deviation (SD) of SM and SB were calculated for each group, and the range between the mean + 1 SD of SM and the mean - 1 SD of SB was defined as the safe margin. Results: The underweight or normal group's SM, SB, and safe margin were 1.2 ± 0.2 cm, 2.1 ± 0.4 cm, and 1.4 to 1.7 cm, respectively. The overweight group's SM and SB were 1.4 ± 0.2 cm and 2.4 ± 0.9 cm, respectively; a safe margin could not be calculated for this group. The obese group's SM, SB, and safe margin were 1.8 ± 0.3 cm, 2.7 ± 0.5 cm, and 2.1 to 2.2 cm, respectively. Conclusion: This study helps set a standard depth for safe and effective needle insertion into the rhomboid major muscle without causing complications.
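The safe-margin rule reduces to simple arithmetic; a sketch for the underweight/normal group using the means and SDs reported above (for the overweight group the lower bound exceeds the upper bound, which is why no safe margin could be given):

```python
# Safe margin: (mean SM + 1 SD) to (mean SB - 1 SD), here for the
# underweight/normal BMI group from the reported values.
sm_mean, sm_sd = 1.2, 0.2  # skin-to-muscle distance (cm)
sb_mean, sb_sd = 2.1, 0.4  # skin-to-rib distance (cm)

lower = sm_mean + sm_sd    # deep enough to reach the muscle: 1.4 cm
upper = sb_mean - sb_sd    # shallow enough to avoid the rib: 1.7 cm
print(f"safe needle depth: {lower:.1f} to {upper:.1f} cm")
```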

Keywords: pneumothorax, rhomboid major muscle, trigger point injection, ultrasound

Procedia PDF Downloads 263