Search results for: the statistical measure
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6890

6710 Statistical Channel Modeling for Multiple-Input-Multiple-Output Communication System

Authors: M. I. Youssef, A. E. Emam, M. Abd Elghany

Abstract:

The performance of a wireless communication system is affected mainly by the environment of its channel, which exhibits dynamic and unpredictable behavior. In this paper, different statistical earth-satellite channel models are studied, with emphasis on two main models: the Rice-lognormal model, chosen because it represents an environment with both shadowing and multipath components affecting the propagated signal along its path, and a three-state model that takes into account different fading conditions (clear area, moderate shadowing, and heavy shadowing). The models are based on the AWGN, Rician, Rayleigh, and lognormal distributions, whose Probability Density Functions (PDFs) are presented. The transmission system's Bit Error Rate (BER), Peak-to-Average Power Ratio (PAPR), and channel capacity are measured and analyzed across the fading models. The simulations are implemented in MATLAB, and the results show the performance of the transmission system over the different channel models.
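As an illustrative aside, the BER gap between an AWGN and a flat Rayleigh fading channel can be sketched with a short Monte Carlo simulation. The paper's simulations are in MATLAB and cover a full MIMO system; the Python/NumPy snippet below is a hypothetical single-antenna BPSK stand-in, not the authors' code.

```python
import numpy as np

rng = np.random.default_rng(0)

def ber_bpsk(snr_db, n_bits=200_000, fading="awgn"):
    """Monte Carlo BER of BPSK over an AWGN or flat Rayleigh fading channel."""
    snr = 10 ** (snr_db / 10)
    bits = rng.integers(0, 2, n_bits)
    x = 2 * bits - 1                                  # BPSK symbols +/-1
    noise = rng.normal(0, np.sqrt(1 / (2 * snr)), n_bits)
    if fading == "rayleigh":
        # |h| is Rayleigh with E[h^2] = 1; coherent detection with known h
        h = np.abs(rng.normal(0, np.sqrt(0.5), n_bits)
                   + 1j * rng.normal(0, np.sqrt(0.5), n_bits))
        decided = (h * x + noise) / h > 0
    else:
        decided = x + noise > 0
    return np.mean(decided != (bits == 1))

for snr_db in (0, 5, 10):
    print(snr_db, ber_bpsk(snr_db), ber_bpsk(snr_db, fading="rayleigh"))
```

At any moderate SNR the Rayleigh curve sits well above the AWGN one, which is the qualitative behavior the channel-model comparison in the abstract measures.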

Keywords: fading channels, MIMO communication, RNS scheme, statistical modeling

Procedia PDF Downloads 136
6709 Portfolio Optimization with Reward-Risk Ratio Measure Based on the Mean Absolute Deviation

Authors: Wlodzimierz Ogryczak, Michal Przyluski, Tomasz Sliwinski

Abstract:

In portfolio selection problems, the reward-risk ratio criterion is optimized to search for a risky portfolio offering the maximum increase of the mean return, in proportion to the increase of the risk measure, when compared to risk-free investments. In the classical Markowitz model, risk is measured by the variance, thus representing Sharpe ratio optimization and leading to quadratic optimization problems. Several Linear Programming (LP) computable risk measures have been introduced and applied in portfolio optimization; in particular, the Mean Absolute Deviation (MAD) measure has been widely recognized. The reward-risk ratio optimization with the MAD measure can be transformed into an LP formulation with the number of constraints proportional to the number of scenarios and the number of variables proportional to the total of the number of scenarios and the number of instruments. This may lead to LP models with a huge number of variables and constraints for real-life financial decisions based on several thousand scenarios, decreasing their computational efficiency and making them hardly solvable by general LP tools. We show that the computational efficiency can then be dramatically improved by an alternative model based on the inverse risk-reward ratio minimization, taking advantage of LP duality. In the introduced LP model, the number of structural constraints is proportional to the number of instruments; the number of scenarios therefore does not seriously affect simplex method efficiency, guaranteeing easy solvability. Moreover, we show that under a natural restriction on the target value, the MAD risk-reward ratio optimization is consistent with second-order stochastic dominance rules.
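The scenario-based MAD formulation the abstract describes can be sketched as a small LP. The snippet below uses synthetic returns and shows the plain minimum-MAD model with a return target (one absolute-deviation variable per scenario), not the authors' inverse ratio dual reformulation.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
T, n = 200, 4                          # scenarios x instruments (synthetic data)
R = rng.normal(0.001, 0.02, (T, n)) + np.array([0.0, 0.0005, 0.001, 0.0015])
mu = R.mean(axis=0)
target = mu.mean()                     # required portfolio mean return

# Variables: [w_1..w_n, d_1..d_T]; objective: mean absolute deviation sum(d)/T
c = np.concatenate([np.zeros(n), np.ones(T) / T])
D = R - mu                             # scenario deviations from the mean
# d_t >= +(D w)_t and d_t >= -(D w)_t, written in A_ub x <= b_ub form
A_ub = np.block([[ D, -np.eye(T)],
                 [-D, -np.eye(T)]])
b_ub = np.zeros(2 * T)
# return constraint mu.w >= target  ->  -mu.w <= -target
A_ub = np.vstack([A_ub, np.concatenate([-mu, np.zeros(T)])])
b_ub = np.append(b_ub, -target)
A_eq = np.concatenate([np.ones(n), np.zeros(T)])[None, :]   # full investment
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
              bounds=[(0, None)] * (n + T))                 # long-only assumed
w = res.x[:n]
print("weights:", w.round(3), "MAD:", res.fun)
```

Note how the constraint matrix grows with the number of scenarios T, which is exactly the blow-up the abstract's dual reformulation avoids.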

Keywords: portfolio optimization, reward-risk ratio, mean absolute deviation, linear programming

Procedia PDF Downloads 394
6708 Evaluation of Egg Quality Parameters in the Isa Brown Line in Intensive Production Systems in the Ocaña Region, Norte de Santander

Authors: Meza-Quintero Myriam, Lobo Torrado Katty Andrea, Sanchez Picon Yesenia, Hurtado-Lugo Naudin

Abstract:

The objective of the study was to evaluate the internal and external quality of the egg in three production housing systems (floor, cage, and grazing) for laying birds of the Isa Brown line, during the laying period between weeks 35 and 41. A total of 135 hens were distributed across 3 treatments of 45 birds each, with the seven weeks of the trial serving as replicates. The feed supplied in the floor and cage systems was 114 g/bird/day; the grazing system received 14 g less concentrate. Nine eggs (3 per housing system) were collected for study and analysis in the animal nutrition laboratory. A completely random statistical model was implemented, and the data were analyzed with the IBM Statistical Products and Services Solution (SPSS) software, version 2.3. The measurement instruments were a vernier caliper (millimeters), a DSM Roche YolkFan™ 16 for evaluating yolk pigmentation, a digital scale (grams), a micrometer (millimeters), and laboratory evaluation of dry matter, ash, and ether extract. The results showed no significant differences (P-value > 0.05) for egg size (0.04 ± 3.55), shell thickness (0.46 ± 3.55), albumen weight (0.18 ± 3.55), albumen height (0.38 ± 3.55), yolk weight (0.64 ± 3.55), yolk height (0.54 ± 3.55), or yolk pigmentation (1.23 ± 3.55). It was concluded that hens in the three production systems (floor, cage, and grazing) showed no statistically significant differences in the internal and external egg quality parameters studied.
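The across-system comparison described above amounts to a one-way ANOVA per egg trait. A minimal sketch with invented egg-weight data (all values hypothetical, not the study's measurements):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# hypothetical egg-weight samples (g) for the three housing systems
floor   = rng.normal(60.1, 2.0, 45)
cage    = rng.normal(60.4, 2.0, 45)
grazing = rng.normal(59.8, 2.0, 45)

f_stat, p_value = stats.f_oneway(floor, cage, grazing)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
# p > 0.05 would indicate no significant difference among systems,
# mirroring the study's conclusion for the traits it examined
```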

Keywords: biological, territories, genetic resource, egg

Procedia PDF Downloads 70
6707 Measurement of Intellectual Capital in an Algerian Company

Authors: S. Brahmi, S. Aitouche, M. D. Mouss

Abstract:

Every modern company should measure the value of its intellectual capital and report it to complement the traditional annual balance sheet. The purpose of this work is to measure the intellectual capital of an Algerian company (or production system) using the Weightless Wealth Tool Kit (WWTK). The results of the intellectual capital measurement are supplemented by traditional financial ratios. The method was applied to the National Company of Wells Services (ENSP) in Hassi Messaoud, in the south of Algeria. We calculated the intellectual capital (intangible resources) of the ENSP to help the organization better capitalize on the potential of its workers and their know-how. The intangible value of the ENSP is evaluated at 16,936,173,345 DA in 2015.

Keywords: financial valuation, intangible capital, intellectual capital, intellectual capital measurement

Procedia PDF Downloads 273
6706 Comparative Analysis of Canal Centering Ratio, Apical Transportation, and Remaining Dentin Thickness between Single File Systems Using Cone Beam Computed Tomography: An in vitro Study

Authors: Aditi Jain

Abstract:

Aim: To compare the canal transportation, centering ability, and remaining dentin thickness of the OneShape and WaveOne systems using CBCT. Objective: To identify the rotary system that best respects the original canal anatomy. Materials and Methods: Forty extracted human single-rooted premolars were used in the present study. Pre-instrumentation scans of all teeth were taken, canal curvatures were calculated, and the samples were randomly divided into two groups of twenty samples each: Group 1, the WaveOne system, and Group 2, the ProTaper rotary system. Post-instrumentation scans were performed, and the two scans were compared to determine canal transportation, centering ability, and remaining dentin thickness at 1, 3, and 5 mm from the root apex. Results: Using Student's unpaired t-test, the results were as follows. For canal transportation, Group 1 showed a statistically significant difference at 3 mm and 6 mm and a non-significant difference at 9 mm, whereas Group 2 showed no statistically significant difference at 3 mm, 6 mm, or 9 mm. For centering ability and remaining dentin thickness, Group 1 showed no statistically significant difference at 3 mm and 9 mm, but a statistically significant difference at 6 mm. When remaining dentin thickness was compared at the three levels between the WaveOne and ProTaper groups, there was no statistically significant difference. Conclusion: The WaveOne single reciprocating file respects the original canal anatomy better than ProTaper and showed the best centering ability.

Keywords: OneShape, WaveOne, transportation, centering ability, dentin thickness, CBCT (cone beam computed tomography)

Procedia PDF Downloads 190
6705 Analysis of the Statistical Characterization of Significant Wave Data Exceedances for Designing Offshore Structures

Authors: Rui Teixeira, Alan O’Connor, Maria Nogal

Abstract:

The statistical theory of extreme events is a topic of growing interest in all fields of science and engineering. The economic and environmental changes the world is currently experiencing emphasize the importance of dealing with extreme occurrences with improved accuracy. When it comes to the design of offshore structures, particularly offshore wind turbines, efficiently characterizing extreme events is of major relevance. Extreme events are commonly characterized by extreme value theory. Alternatively, accurate modeling of the tails of statistical distributions and characterization of low-occurrence events can be achieved with the Peak-Over-Threshold (POT) methodology. The POT methodology allows a more refined fit of the statistical distribution by truncating the data at a predefined threshold u. The Generalised Pareto distribution is widely used to approximate the tail of the empirical distribution, although for exceedances of significant wave height data (H_s) the two-parameter Weibull and the Exponential distribution, the latter a special case of the Generalised Pareto, are frequently used as alternatives. The Generalised Pareto, despite the practical cases where it is applied, is not universally recognized as the adequate model for exceedances over a threshold u, and references that treat it as a secondary solution for significant wave data can be found in the literature. In this framework, the current study addresses the choice of statistical model for characterizing exceedances of wave data. The Generalised Pareto, the two-parameter Weibull, and the Exponential distribution are compared for different values of the threshold u.
Real wave data obtained from four buoys along the Irish coast was used in the comparative analysis. The results show that statistical distributions must be applied to significant wave data with care: in each particular case, one of the models fits the data better than the others, and different values of the threshold u yield different results. Other aspects of the fit, such as the number of points and the estimation of the model parameters, are analyzed and the respective conclusions drawn, and some guidelines on the application of the POT method are presented. Modeling the tail of the distributions proves, for the present case, a highly non-linear task and, given its growing importance, should be addressed carefully for efficient estimation of very low occurrence events.
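A minimal sketch of the POT fitting step the abstract compares, using a synthetic wave-height record in place of the Irish buoy data (threshold choice, seed, and all sample values are assumptions for illustration):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# synthetic significant-wave-height record (m), standing in for buoy data
hs = stats.weibull_min.rvs(c=1.6, scale=1.5, size=5000, random_state=rng)

u = np.quantile(hs, 0.95)              # threshold u: here the 95th percentile
exc = hs[hs > u] - u                   # exceedances over u

# the three candidate tail models fitted to the exceedances
gp_shape, _, gp_scale = stats.genpareto.fit(exc, floc=0)
exp_scale = exc.mean()                 # MLE of the Exponential scale
wb_c, _, wb_scale = stats.weibull_min.fit(exc, floc=0)

print(f"u = {u:.2f} m, {exc.size} exceedances")
print(f"GPD shape {gp_shape:.3f}, Exp scale {exp_scale:.3f}, Weibull c {wb_c:.3f}")
```

Re-running with a different quantile for u illustrates the threshold sensitivity the abstract reports: the fitted parameters, and which model fits best, change with u.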

Keywords: extreme events, offshore structures, peak-over-threshold, significant wave data

Procedia PDF Downloads 258
6704 A Method of Detecting the Difference in Two States of Brain Using Statistical Analysis of EEG Raw Data

Authors: Digvijaysingh S. Bana, Kiran R. Trivedi

Abstract:

This paper introduces methods based on the alpha wave to detect the difference between two states of the brain. One healthy subject participated in the experiment. EEG was measured on the forehead above the eye (FP1 position), with the reference and ground electrodes on the ear clip. The data were obtained as EEG raw samples, each reading lasting one minute, and readings were taken at different times throughout the day. Various statistical tests were then performed on the alpha-band EEG raw data.
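One common way to quantify the alpha band in such one-minute readings is a Welch power spectrum. The sketch below uses a synthetic 10 Hz tone plus noise as a stand-in for the FP1 raw data; the sampling rate and amplitudes are assumptions, not values from the paper.

```python
import numpy as np
from scipy import signal

fs = 256                               # assumed sampling rate (Hz)
t = np.arange(0, 60, 1 / fs)           # one-minute reading, as in the study
rng = np.random.default_rng(4)
# toy stand-in for FP1 raw EEG: a 10 Hz alpha rhythm buried in noise
eeg = 20e-6 * np.sin(2 * np.pi * 10 * t) + 10e-6 * rng.standard_normal(t.size)

f, psd = signal.welch(eeg, fs, nperseg=2 * fs)
alpha = (f >= 8) & (f <= 12)           # alpha band mask
rel_alpha = psd[alpha].sum() / psd.sum()
print(f"relative alpha power: {rel_alpha:.2f}")
```

Statistics such as this relative alpha power, computed per reading, are the kind of quantity one would then compare across the two brain states.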

Keywords: electroencephalogram (EEG), biometrics, authentication, EEG raw data

Procedia PDF Downloads 452
6703 Efficient Frontier: Comparing Different Volatility Estimators

Authors: Tea Poklepović, Zdravka Aljinović, Mario Matković

Abstract:

Modern Portfolio Theory (MPT), according to Markowitz, states that investors form mean-variance efficient portfolios that maximize their utility. Markowitz proposed the standard deviation as a simple measure of portfolio risk and the lower semi-variance as the only risk measure of interest to rational investors. This paper adds a third volatility estimator, based on intraday data, and compares the three resulting efficient frontiers on the Croatian stock market. The results show that the range-based volatility estimator outperforms both the mean-variance and the lower semi-variance model.
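The abstract does not name the specific range-based estimator; assuming the classic Parkinson (1980) high-low estimator as an example, it can be sketched on simulated intraday paths (all data synthetic):

```python
import numpy as np

rng = np.random.default_rng(5)
# simulate intraday log-price paths to get daily highs and lows
n_days, n_ticks, sigma = 250, 390, 0.01          # sigma: true daily volatility
steps = rng.normal(0, sigma / np.sqrt(n_ticks), (n_days, n_ticks))
log_paths = np.cumsum(steps, axis=1)
high, low = log_paths.max(axis=1), log_paths.min(axis=1)

# Parkinson range-based daily variance estimator: (H-L)^2 / (4 ln 2)
park_var = (high - low) ** 2 / (4 * np.log(2))
print(f"true daily vol {sigma:.4f}, Parkinson estimate {np.sqrt(park_var.mean()):.4f}")
```

The appeal of range-based estimators is that the daily high-low range carries more information about intraday variation than the close-to-close return alone.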

Keywords: variance, lower semi-variance, range-based volatility, MPT

Procedia PDF Downloads 502
6702 Impact of Climate on Sugarcane Yield Over Belagavi District, Karnataka Using a Statistical Model

Authors: Girish Chavadappanavar

Abstract:

The impact of climate on agriculture could result in problems with food security and may threaten the livelihood activities on which much of the population depends. In the present study, a statistical yield forecast model was developed for sugarcane production over Belagavi district, Karnataka, using weather variables of the crop growing season and past observed yield data for the period 1971 to 2010. The study shows that this type of statistical yield forecast model can efficiently forecast sugarcane yield 5 and even 10 weeks in advance of the harvest, within an acceptable limit of error. The model's performance in predicting district-level sugarcane yields is quite satisfactory for both validation (2007 and 2008) and forecasting (2009 and 2010). In addition, the climate variability of the area was studied by applying the Mann-Kendall rank statistical test to the data series. The maximum and minimum temperatures were found to show significant but opposite trends (a decreasing trend in maximum temperature and an increasing trend in minimum temperature), while the other three variables showed insignificant trends (increasing rainfall and evening relative humidity, decreasing morning relative humidity).
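The Mann-Kendall rank test used for the climate-variability part can be sketched as follows, here on a synthetic minimum-temperature series with an imposed warming trend (not the Belagavi data; no tie correction is applied):

```python
import numpy as np
from scipy import stats

def mann_kendall(x):
    """Mann-Kendall trend test (no tie correction): S statistic and p-value."""
    x = np.asarray(x, dtype=float)
    n = x.size
    s = 0.0
    for i in range(n - 1):
        s += np.sign(x[i + 1:] - x[i]).sum()      # pairwise sign comparisons
    var_s = n * (n - 1) * (2 * n + 5) / 18
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    p = 2 * stats.norm.sf(abs(z))                  # two-sided p-value
    return s, p

rng = np.random.default_rng(6)
years = np.arange(1971, 2011)
# hypothetical annual minimum temperature with a 0.03 C/yr warming trend
t_min = 20 + 0.03 * (years - 1971) + rng.normal(0, 0.3, years.size)
s, p = mann_kendall(t_min)
print(f"S = {s:.0f}, p = {p:.4f}")
```

A positive S with a small p-value indicates a significant increasing trend, the pattern the study reports for minimum temperature.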

Keywords: climate impact, regression analysis, yield and forecast model, sugar models

Procedia PDF Downloads 59
6701 Evaluating the Factors Controlling the Hydrochemistry of Gaza Coastal Aquifer Using Hydrochemical and Multivariate Statistical Analysis

Authors: Madhat Abu Al-Naeem, Ismail Yusoff, Ng Tham Fatt, Yatimah Alias

Abstract:

Groundwater in the Gaza Strip is increasingly exposed to anthropogenic and natural factors that have seriously degraded groundwater quality. Physiochemical groundwater data can offer important information on changes in groundwater quality that is useful for improving water management tactics. Integrated hydrochemical and statistical techniques (hierarchical cluster analysis (HCA) and factor analysis (FA)) were applied to ten physiochemical parameters of 84 samples collected in 2000/2001, using the STATA, AquaChem, and Surfer software, to: 1) provide valuable insight into the salinization sources and the hydrochemical processes controlling the chemistry of the groundwater, and 2) differentiate the influence of natural processes from that of man-made activities. The recorded large diversity of water facies, with the Na-Cl type dominant, reveals a highly saline aquifer impacted by multiple complex hydrochemical processes. Based on WHO standards, only 15.5% of the wells were suitable for drinking. HCA yielded three clusters. Cluster 1 is the highest in salinity, mainly due to the impact of Eocene saline water invasion mixed with human inputs. Cluster 2 is the lowest in salinity, also due to Eocene saline water invasion but mixed with recent rainfall recharge, limited carbonate dissolution, and nitrate pollution. Cluster 3 is similar in salinity to Cluster 2, but with a high diversity of facies due to many sources of salinity: seawater invasion, carbonate dissolution, and human inputs. Factor analysis yielded two factors accounting for 88% of the total variance. Factor 1 (59%) is a salinization factor demonstrating the mixed contribution of natural saline water and human inputs. Factor 2, which explains 29% of the total variance, measures hardness and pollution. The negative relationship between NO3- and pH may reveal a denitrification process in a heavily polluted aquifer recharged by limited oxygenated rainfall.
Multivariate statistical analysis combined with hydrochemical analysis indicates that the main factors controlling groundwater chemistry are Eocene saline invasion, seawater invasion, sewage invasion, and rainfall recharge, and that the main hydrochemical processes are base ion exchange and reverse ion exchange with clay minerals (water-rock interactions), nitrification, carbonate dissolution, and a limited denitrification process.
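The HCA step can be sketched as follows, on a two-parameter toy stand-in for the 84 wells x 10 parameters (all values synthetic; Ward linkage is an assumption, as the abstract does not name the linkage method):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.stats import zscore

rng = np.random.default_rng(7)
# synthetic wells: two hypothetical parameters, e.g. [Cl- (mg/L), NO3- (mg/L)]
fresher = rng.normal([200, 50], [20, 10], (30, 2))   # less saline wells
saline  = rng.normal([900, 20], [60, 5], (30, 2))    # saline-impacted wells
X = zscore(np.vstack([fresher, saline]))             # standardize before clustering

Z = linkage(X, method="ward")                        # Ward hierarchical clustering
labels = fcluster(Z, t=2, criterion="maxclust")      # cut dendrogram at 2 clusters
print("cluster sizes:", np.bincount(labels)[1:])
```

Cutting the dendrogram at three clusters instead of two mirrors the study's grouping of wells into three salinity regimes.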

Keywords: dendrogram and cluster analysis, water facies, Eocene saline invasion and sea water invasion, nitrification and denitrification

Procedia PDF Downloads 348
6700 Modelling Causal Effects from Complex Longitudinal Data via Point Effects of Treatments

Authors: Xiaoqin Wang, Li Yin

Abstract:

Background and purpose: In many practices, one estimates causal effects arising from a complex stochastic process, where a sequence of treatments is assigned to influence a certain outcome of interest and time-dependent covariates exist between treatments. When covariates are plentiful and/or continuous, statistical modeling is needed to reduce the huge dimensionality of the problem and allow the estimation of causal effects. Recently, Wang and Yin (Annals of Statistics, 2020) derived a new general formula that expresses these causal effects in terms of the point effects of treatments in single-point causal inference, making it possible to conduct the modeling via point effects. The purpose of this work is to study the modeling of these causal effects via point effects. Challenges and solutions: The time-dependent covariates are often influenced by earlier treatments and in turn influence subsequent treatments. Consequently, the standard parameters, i.e., the means of the outcome given all treatments and covariates, are essentially all different (the null paradox). Furthermore, the dimension of the parameter space is huge (the curse of dimensionality). It can therefore be difficult to conduct the modeling in terms of standard parameters. Instead of standard parameters, we use point effects of treatments to develop a likelihood-based parametric approach, and we are able to model the causal effects of a sequence of treatments by modeling a small number of point effects of individual treatments. Achievements: We are able to conduct the modeling of the causal effects of a sequence of treatments in the familiar framework of single-point causal inference. Simulation shows that our method achieves not only an unbiased estimate of the causal effect but also the nominal level of type I error and a low level of type II error in hypothesis testing.
We applied this method to a longitudinal study of COVID-19 mortality among Scandinavian countries and found that the Swedish approach performed far worse than those of the other countries, with the poor performance largely due to its early measures during the initial period of the pandemic.

Keywords: causal effect, point effect, statistical modelling, sequential causal inference

Procedia PDF Downloads 193
6699 Developing Fault Tolerance Metrics of Web and Mobile Applications

Authors: Ahmad Mohsin, Irfan Raza Naqvi, Syda Fatima Usamn

Abstract:

Applications with a higher fault tolerance index are considered more reliable and trustworthy. In recent years, application development has shifted from traditional desktop and web software to native and hybrid applications for the web and mobile platforms. With the emergence of the Internet of Things (IoT), cloud, and big data trends, the need to measure fault tolerance for these complex applications has increased. There is a phenomenal gap between fault tolerance metrics development and measurement: classic quality metric models focused on metrics for traditional systems, ignoring the software, hardware, and deployment characteristics of today's applications. In this paper, we propose simple metrics to measure fault tolerance that consider the general requirements of web and mobile applications. We align factors and subfactors using the Goal-Question-Metric (GQM) approach for metrics development, considering the nature of mobile and web apps, and give a systematic mathematical formulation so that the metrics can be measured quantitatively. Three web/mobile applications were selected, their fault tolerance factors were measured using the formulated metrics, and the applications were then analysed on the basis of observations in a controlled environment on different mobile devices. Quantitative results depicting the fault tolerance of the respective applications are presented.

Keywords: web and mobile applications, reliability, fault tolerance metric, quality metrics, GQM based metrics

Procedia PDF Downloads 329
6698 Methods to Measure the Quality of 2D Image Compression Techniques

Authors: Mohammed H. Rasheed, Hussein Nadhem Fadhel, Mohammed M. Siddeq

Abstract:

In this paper, we suggest image quality measurement tools that provide metrics accurate and close to the perceived quality of the tested images; such metrics can be used to compare the performance of image compression algorithms. Two new metrics for measuring the quality of decompressed images are proposed, based on combined data (CD) between the original and decompressed images. Compared with other metrics, e.g., PSNR and RMSE, the proposed metrics give values that most closely reflect the image quality perceived by the human eye.
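For reference, the baseline metrics the proposal is compared against, RMSE and PSNR, can be computed as below. The proposed CD metric itself is not specified in the abstract, so only the baselines are shown, on a synthetic image pair.

```python
import numpy as np

def rmse(a, b):
    """Root mean squared error between two images."""
    return np.sqrt(np.mean((a.astype(float) - b.astype(float)) ** 2))

def psnr(a, b, peak=255.0):
    """Peak signal-to-noise ratio in dB (infinite for identical images)."""
    e = rmse(a, b)
    return np.inf if e == 0 else 20 * np.log10(peak / e)

rng = np.random.default_rng(8)
original = rng.integers(0, 256, (64, 64), dtype=np.uint8)
# stand-in for a decompressed image: original plus mild compression-like noise
decompressed = np.clip(original + rng.normal(0, 5, original.shape), 0, 255).astype(np.uint8)
print(f"RMSE = {rmse(original, decompressed):.2f}, "
      f"PSNR = {psnr(original, decompressed):.2f} dB")
```

Both metrics are purely pixel-wise, which is exactly the limitation, poor correlation with perceived quality, that motivates perception-oriented metrics like the one proposed.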

Keywords: RMSE, PSNR, image quality metrics, image compression

Procedia PDF Downloads 8
6697 Reexamining Contrarian Trades as a Proxy of Informed Trades: Evidence from China's Stock Market

Authors: Dongqi Sun, Juan Tao, Yingying Wu

Abstract:

This paper reexamines the appropriateness of contrarian trades as a proxy for informed trades, using high-frequency Chinese stock data. Employing this measure over 5-minute intervals, a U-shaped intraday pattern of the probability of informed trading (PIN) is found for CSI300 stocks, consistent with previous findings for other markets. However, when the trades are divided by size, a reversed U-shaped PIN is observed for large-sized trades, opposed to the U-shaped pattern for small- and medium-sized trades. Drawing on this mixed evidence across trade sizes, the price impact of trades is further investigated. By examining the relationship between trade imbalances and unexpected returns, large-sized trades are found to have a significant price impact. This implies that in intervals with large trades, it is non-contrarian trades that are more likely to be informed. Taking this price impact into account, non-contrarian trades are used to proxy informed trading in intervals with large trades, while contrarian trades are still used in other intervals. This modification yields a stronger U-shaped PIN. Auto-correlation and information advantage tests for robustness also support the modified informed trading measure.

Keywords: contrarian trades, informed trading, price impact, trade imbalance

Procedia PDF Downloads 157
6696 Callous-Unemotional Traits in Preschoolers: Distinct Associations with Empathy Subcomponents

Authors: E. Stylianopoulou, A. K. Fanti

Abstract:

Objective: Children scoring high on callous-unemotional (CU) traits exhibit a lack of empathy; more specifically, they appear to show deficits in affective empathy or in other constructs. However, little is known about cognitive empathy and its relation to CU traits in preschoolers: although empathy is measurable at a very young age, relatively little research has focused on empathy in preschoolers with CU traits compared to older children. The present study examines cognitive and affective empathy in preschoolers with CU traits, with the aim of examining the differences between the two in these individuals. Based on previous research in children with CU traits, it was hypothesized that preschoolers scoring high on CU traits would show deficits in both cognitive and affective empathy, with more pronounced deficits in affective empathy. Method: The sample comprised 209 children, 109 male and 100 female, between the ages of 3 and 7 (M = 4.73, SD = 0.71); of these, 175 completed all the items. The Inventory of Callous-Unemotional Traits was used to measure CU traits, and the Griffith Empathy Measure (GEM) Affective and Cognitive Scales were used to measure affective and cognitive empathy, respectively. Results: Linear regression was applied to examine the preceding hypotheses. Overall, there was a significant, moderate negative association between CU traits and empathy. More specifically, a significant, moderate negative relation was found between CU traits and cognitive empathy. Surprisingly, no significant relation was found between CU traits and affective empathy.
Conclusion: The current findings indicate that preschoolers with CU traits show deficits in understanding others' emotions, reflecting a significant association between CU traits and cognitive empathy; no such relation was found with affective empathy. These results highlight the need to focus more on cognitive empathy in preschoolers with CU traits, a component that appears to have been underestimated until now.

Keywords: affective empathy, callous-unemotional traits, cognitive empathy, preschoolers

Procedia PDF Downloads 138
6695 Insulin Resistance in Early Postmenopausal Women Can Be Attenuated by Regular Practice of 12 Weeks of Yoga Therapy

Authors: Praveena Sinha

Abstract:

Context: Diabetes is a global public health burden, particularly affecting postmenopausal women. Insulin resistance (IR) is prevalent in this population and is associated with an increased risk of developing type 2 diabetes. Yoga therapy is gaining attention as a complementary intervention for diabetes due to its potential to address stress psychophysiology. This study focuses on the efficacy of a 12-week yoga practice in attenuating insulin resistance in early postmenopausal women. Research aim: To investigate the effect of a 3-month yoga practice on insulin resistance in early postmenopausal women. Methodology: The study used a prospective longitudinal design with 67 women within five years of menopause, divided into two groups based on their willingness to practice yoga. The yoga group (n = 37) received routine gynecological management along with an integrated yoga module, while the non-yoga group (n = 30) received only routine management. Insulin resistance was measured by the homeostasis model assessment of insulin resistance (HOMA-IR) before and after the intervention. Statistical analysis was performed using GraphPad Prism version 5, with statistical significance set at P < 0.05. Findings: Serum fasting insulin levels and HOMA-IR decreased in the yoga group, although the decrease did not reach statistical significance. In contrast, the non-yoga group showed a significant rise in serum fasting insulin and HOMA-IR after 3 months, suggesting worsening insulin resistance in these postmenopausal women. Theoretical importance: This study provides evidence that a 12-week yoga practice can attenuate the increase in insulin resistance in early postmenopausal women.
It highlights the potential of yoga as a preventive measure against the early onset of insulin resistance and the development of type 2 diabetes mellitus. Regular yoga practice can be a valuable tool for addressing the hormonal imbalances associated with early postmenopause, reducing the morbidity and mortality related to insulin resistance and type 2 diabetes mellitus in this population. Data collection and analysis: Data collection involved measuring serum fasting insulin levels and calculating HOMA-IR. Statistical analysis was performed using GraphPad Prism version 5, and mean values with standard error of the mean were reported; the significance level was set at P < 0.05. Question addressed: Whether a 3-month yoga practice can attenuate insulin resistance in early postmenopausal women. Conclusion: The findings support the efficacy of a 12-week yoga practice in attenuating insulin resistance in early postmenopausal women. Regular yoga practice has the potential to prevent the early onset of insulin resistance and the development of type 2 diabetes mellitus in this population; by addressing the hormonal imbalances associated with early postmenopause, it could significantly decrease the related morbidity and mortality in these subjects.
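The HOMA-IR outcome measure follows the standard Matthews et al. formula; a one-line sketch (the example glucose and insulin values are hypothetical, not data from the study):

```python
def homa_ir(fasting_glucose_mg_dl, fasting_insulin_uU_ml):
    """HOMA-IR (Matthews et al.): glucose (mg/dL) x insulin (uU/mL) / 405."""
    return fasting_glucose_mg_dl * fasting_insulin_uU_ml / 405.0

# hypothetical example: fasting glucose 95 mg/dL, fasting insulin 12 uU/mL
print(round(homa_ir(95, 12), 2))   # 2.81; values above ~2.5 are commonly taken to flag IR
```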

Keywords: post menopause, insulin resistance, HOMA-IR, yoga, type 2 diabetes mellitus

Procedia PDF Downloads 55
6694 Statistical Scientific Investigation of Popular Cultural Heritage in the Relationship between Astronomy and Weather Conditions in the State of Kuwait

Authors: Ahmed M. AlHasem

Abstract:

Kuwaiti society has long been aware of climatic changes and their annual dates and has tried to link them to astronomy in an attempt to forecast future weather conditions. The reason for this concern is that many of the economic, social, and living activities of the society depend deeply, directly and indirectly, on the weather conditions. In other words, Kuwaiti society, like many human societies, has in the past tried to predict climatic conditions by linking them to astronomy or to popular sayings indicating the timing of climate changes. Accordingly, this study was devoted to a scientific investigation, based on statistical analysis of climatic data, of the accuracy and compatibility of some of the most important elements of this cultural heritage in relation to climate change, relating them scientifically to precise climatic measurements spanning decades. The research is divided into 10 topics, each focused on one legacy, whether the appearance or disappearance of a star or a popular saying inherited through generations; each is explained in its nature and timing, and statistical analysis against official climatic data since 1962 indicates its degree of accuracy. The study concludes that the relationship between the popular heritage and the actual climatic data is weak and, in some cases, non-existent; the popular heritage therefore does not provide a dependable or scientifically reliable prediction of weather conditions.

Keywords: astronomy, cultural heritage, statistical analysis, weather prediction

Procedia PDF Downloads 111
6693 Impact of Gaming Environment in Education

Authors: Md. Ataur Rahman Bhuiyan, Quazi Mahabubul Hasan, Md. Rifat Ullah

Abstract:

In this research, we explored the effectiveness of a gaming environment in education and compared it with the traditional education system. We conducted several workshops in both learning environments and measured students' performance by having professional academics assign grading scores against different criteria. We also collected data from survey questionnaires to understand students' experiences of and attitudes towards education and study. Finally, we examined the impact of the different learning environments by applying statistical hypothesis tests: the t-test and the ANOVA test.
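The two hypothesis tests named in the abstract can be sketched from first principles. This is a generic illustration of the statistics being computed, not the authors' analysis code, and the scores in the example are invented:

```python
from statistics import mean, variance

def welch_t(a, b):
    """Welch's two-sample t statistic (unequal variances assumed)."""
    va, vb = variance(a), variance(b)
    return (mean(a) - mean(b)) / ((va / len(a) + vb / len(b)) ** 0.5)

def anova_f(*groups):
    """One-way ANOVA F statistic for k groups."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = mean(x for g in groups for x in g)
    ssb = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)  # between-group
    ssw = sum((x - mean(g)) ** 2 for g in groups for x in g)    # within-group
    return (ssb / (k - 1)) / (ssw / (n - k))

# Invented grading scores for two hypothetical groups:
gaming, traditional = [85, 90, 78, 92], [70, 75, 80, 72]
print(welch_t(gaming, traditional), anova_f(gaming, traditional))
```

The t (or F) statistic would then be compared against the appropriate reference distribution to obtain a p-value.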

Keywords: gamification, game-based learning, education, statistical analysis, human-computer interaction

Procedia PDF Downloads 209
6692 A Statistical Energy Analysis Model of an Automobile for the Prediction of the Internal Sound Pressure Level

Authors: El Korchi Ayoub, Cherif Raef

Abstract:

Interior noise in vehicles is an essential factor affecting occupant comfort. Over recent decades, much work has been done to develop simulation tools for vehicle NVH (noise, vibration, and harshness). In the mid-to-high frequency range, the statistical energy analysis (SEA) method is notably effective in predicting the noise and vibration responses of mechanical systems. In this paper, the sound pressure level (SPL) inside an automobile cabin is evaluated numerically using the SEA method. Measurements were performed in a test car cabin using a monopole sound source. The decay rate method was employed to obtain the damping loss factor (DLF) of each subsystem of the developed SEA model. These parameters were then used to predict the sound pressure level in the interior cabin. The results show satisfactory agreement with the directly measured SPL. The developed SEA vehicle model can be used in early design phases and allows the engineer to identify the sources contributing to the total noise and the transmission paths.
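The decay rate method mentioned above conventionally relates a subsystem's damping loss factor to its measured reverberation time T60. A minimal sketch of that standard relation follows, assuming the usual formula eta = 2.2 / (f * T60); the numbers are illustrative, not the paper's measurements:

```python
def dlf_from_t60(frequency_hz: float, t60_s: float) -> float:
    """Damping loss factor from reverberation time T60 (decay rate method):
    eta = 2.2 / (f * T60), equivalent to eta = DR / (27.3 * f) with DR = 60 / T60 dB/s."""
    return 2.2 / (frequency_hz * t60_s)

# Illustrative: a subsystem with T60 = 0.5 s at 1 kHz
print(dlf_from_t60(1000.0, 0.5))  # 0.0044
```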

Keywords: SEA, SPL, DLF, NVH

Procedia PDF Downloads 81
6691 Development of the Family Capacity of Management of Patients with Autism Spectrum Disorder Diagnosis

Authors: Marcio Emilio Dos Santos, Kelly C. F. Dos Santos

Abstract:

Caregivers of patients diagnosed with ASD are subjected to high-stress situations owing to the complexity and the many levels of daily activities that require organizing events, behaviors and socioemotional situations, including immediate decision-making in public spaces. The cognitive and emotional capacity required to fulfill this caregiving role exceeds what adults ordinarily acquire through the cultural preparation they receive for conjugal and parental life. Consequently, many caregivers present a high level of overload and a poor capacity to organize and mediate the developmental process of the child or patient in their care. Aims: Improvement in the cognitive and emotional capacities related to the caregiver role, allowing a reduction of the overload, of the feeling of incompetence, and of the characteristic stress level, and developing more organized conduct and decision-making better oriented towards the objectives and procedural gains necessary for the integral development of the patient diagnosed with ASD. Method: The study was performed with 20 relatives, randomly selected from a total of 140 patients attended. The family members took the Wechsler Adult Intelligence Scale III intelligence test and completed the Family Assessment Management Measure (FaMM) questionnaire as a baseline evaluation. They then participated in therapeutic activity in small groups of family members or caregivers, held weekly with a minimum workload of two hours, using the Feuerstein Instrumental Enrichment cognitive development program for ten months. The tests were then reapplied to verify the gains obtained. Results and Discussion: There was a change in the level of caregiver overload, an improvement in the results of the Family Assessment Management Measure, and, notably, an increase in performance on the cognitive aspects related to problem solving, planned behavior and the management of behavioral crises.
These results point to the need to invest in the integrated care of patients and their caregivers, chiefly by equipping caregivers cognitively to deal with the complexity of autism; this goes beyond simple therapeutic guidance about adjustments to family and school routines. The study showed that when caregivers improve their capacity of management, the results of the treatment are potentiated and the caregivers' level of overload is reduced. Importantly, the study ran for only ten months, and the number of family members in the study (n = 20) needs to be expanded for statistical strength.

Keywords: caregiver overload, cognitive development program ASD caregivers, feuerstein instrumental enrichment, family assessment management measure

Procedia PDF Downloads 112
6690 Earthquake Classification in Molluca Collision Zone Using Conventional Statistical Methods

Authors: H. J. Wattimanela, U. S. Passaribu, A. N. T. Puspito, S. W. Indratno

Abstract:

The Molluca Collision Zone is located at the junction of the Eurasian, Australian, Pacific, and Philippine plates. Between the Sangihe arc to the west and the Halmahera arc to the east, the collision is active and convex toward the Molluca Sea. This research analyzes the behavior of earthquake occurrence in the Molluca Collision Zone: the distribution of earthquakes in each partition region, the type of distribution of earthquake occurrence in each partition region, the mean occurrence of earthquakes in each partition region, and the correlation between the partition regions. We count earthquakes using a partition method and analyze their behavior using conventional statistical methods. The data used are shallow earthquakes with magnitudes ≥ 4 on the Richter scale for the period 1964-2013 in the Molluca Collision Zone. From the results, we can classify the partitioned regions, based on the correlation, into two classes: strong and very strong. This classification can be used for an early warning system in disaster management.
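The correlation-based classification described above can be sketched generically. The Pearson formula is standard, but the class thresholds below are assumptions for illustration only, since the abstract does not state its cut-offs:

```python
from statistics import mean

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def classify(r):
    """Assumed cut-offs for illustration (not the paper's): |r| >= 0.8 very strong,
    |r| >= 0.6 strong."""
    return "very strong" if abs(r) >= 0.8 else "strong" if abs(r) >= 0.6 else "weaker"

# Invented yearly earthquake counts for two hypothetical partition regions:
region_a, region_b = [12, 15, 9, 20, 18], [10, 14, 8, 19, 17]
print(classify(pearson(region_a, region_b)))
```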

Keywords: molluca collision zone, partition regions, conventional statistical methods, earthquakes, classifications, disaster management

Procedia PDF Downloads 481
6689 Dispersion Rate of Spilled Oil in Water Column under Non-Breaking Water Waves

Authors: Hanifeh Imanian, Morteza Kolahdoozan

Abstract:

The purpose of this study is to derive a mathematical expression for calculating the dispersion rate of spilled oil in the water column under non-breaking waves. To this end, a multiphase numerical model, whose hydraulic calculations had previously been validated, was applied to compute the waves and the oil phase concurrently. More than 200 scenarios of oil spilling in wavy waters were simulated with this model, and the outcomes were collected in a database. The recorded results were examined to identify the major parameters affecting vertical oil dispersion, and six parameters were ultimately identified as the main independent factors. Statistical tests were then conducted to identify the relationship between the dependent variable (dispersed oil mass in the water column) and the independent variables (the water wave specifications, comprising height, length and wave period, and the spilled oil characteristics, including density, viscosity and spilled oil mass). Finally, a mathematical-statistical relationship is proposed to predict dispersed oil in marine waters. To verify the proposed relationship, a laboratory case available in the literature was selected; the oil mass rate penetrating the water body computed by the suggested regression showed good agreement with the experimental data. The validated mathematical-statistical expression is a useful tool for predicting oil dispersion in marine oil spill events.
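The regression step described above can be illustrated with a one-predictor least-squares fit. This is a deliberately simplified stand-in for the six-parameter relationship the paper derives, shown only to make the fitting procedure concrete; the data points are synthetic:

```python
from statistics import mean

def ols(x, y):
    """Ordinary least squares for y ~ a + b * x: returns (intercept, slope).
    A one-predictor stand-in for the multi-parameter regression in the abstract."""
    mx, my = mean(x), mean(y)
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

# Synthetic example: dispersed oil mass vs. wave height (invented units/values)
wave_height = [0.1, 0.2, 0.3, 0.4]
dispersed_mass = [1.1, 2.0, 2.9, 4.1]
a, b = ols(wave_height, dispersed_mass)
print(f"mass ~= {a:.2f} + {b:.2f} * height")
```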

Keywords: dispersion, marine environment, mathematical-statistical relationship, oil spill

Procedia PDF Downloads 224
6688 Clinical Use of Opioid Analgesics in China: An Adequacy of Consumption Measure

Authors: Mengjia Zhi, Xingmei Wei, Xiang Gao, Shiyang Liu, Zhiran Huang, Li Yang, Jing Sun

Abstract:

Background: To understand the consumption trend of opioid analgesics, the adequacy of opioid analgesic treatment of moderate to severe pain in China, and China's level of pain control from an international perspective. Importance: To the authors' best knowledge, this is the first study in China to measure the adequacy of opioid analgesic treatment of moderate to severe pain that takes the disease pattern into account and follows the standardized pain treatment guideline. Methods: A retrospective analysis was carried out on the consumption frequency (defined daily doses, DDDs) of opioid analgesics and its trend in China from 2006 to 2016. The Adequacy of Consumption Measure (ACM) was used to estimate the number of needed morphine equivalents and the overall adequacy of opioid analgesic treatment of moderate to severe pain in China, and the results were compared with international data. Results: The consumption frequency of opioid analgesics in China rose from 13.2 million DDDs in 2006 to 44.2 million DDDs in 2016, an overall increasing trend. Growth was faster at first, especially in 2013, then slowed, with a slight decrease in 2015. The ACM of China increased from 0.0032 in 2006 to 0.0074 in 2016, an overall growth trend, yet throughout 2006-2016 it remained at a very poor level. Conclusion: The consumption of opioid analgesics for the treatment of moderate to severe pain in China has always been inadequate, and there is a huge gap between China and the international level. The reasons lie in several aspects: medical staff, patients and the public, health systems, and social and cultural factors.
It is necessary to strengthen the training and education of medical staff and patients, to use mass media to disseminate scientific knowledge of pain management, to encourage communication between doctors and patients, to improve the regulatory system for controlled medicines and the overall health systems, and to balance the regulatory goal of avoiding abuse with the social goal of meeting people's increasing needs for a better life.
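The ACM itself is a simple ratio; a sketch of the computation under the usual definition (actual morphine-equivalent consumption divided by estimated need) follows. The absolute quantities in the example are hypothetical; only the 2016 ratio of 0.0074 comes from the abstract:

```python
def acm(morphine_equiv_consumed_mg: float, estimated_need_mg: float) -> float:
    """Adequacy of Consumption Measure: actual morphine-equivalent consumption
    divided by the estimated need for it; 1.0 would mean fully adequate."""
    return morphine_equiv_consumed_mg / estimated_need_mg

# Hypothetical quantities chosen to reproduce the paper's 2016 ratio:
print(acm(74.0, 10_000.0))   # 0.0074
# With that figure, consumption covered well under 1% of estimated need:
print(f"{0.0074:.2%}")       # 0.74%
```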

Keywords: opioid analgesics, adequacy of consumption measure, pain control, China

Procedia PDF Downloads 199
6687 Algorithms Minimizing Total Tardiness

Authors: Harun Aydilek, Asiye Aydilek, Ali Allahverdi

Abstract:

Total tardiness is a widely used performance measure in the scheduling literature. It is particularly important in situations where there is a cost to completing a job beyond its due date: the cost of a schedule increases as the gap between a job's due date and its completion time grows. Such costs may include contractual penalties or loss of goodwill. The measure matters because the fulfillment of customers' due dates has to be taken into account while making scheduling decisions. The problem has been addressed in the literature, but under the assumption of zero setup times. Even though this assumption may be valid for some environments, it is not valid for others. When setup times are treated as separate from processing times, it is possible to increase machine utilization and to reduce total tardiness; therefore, non-zero setup times need to be considered separately. A dominance relation is developed and several algorithms that utilize it are proposed. Extensive computational experiments are conducted to evaluate the algorithms. The experiments indicate that the developed algorithms perform much better than the existing algorithms in the literature; in particular, one of the newly proposed algorithms reduces the error of the best existing algorithm by 40 percent.
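Computing total tardiness with setup times treated separately from processing times can be sketched as follows. This toy example assumes sequence-independent setups, a simplification of the paper's setting, but it already shows that the job order changes the objective value:

```python
def total_tardiness(sequence, processing, setup, due):
    """Total tardiness of a single-machine job sequence with separate setup
    times: setup[j] is incurred before job j starts (sequence-independent,
    a simplified stand-in for the paper's setting)."""
    t = tardiness = 0
    for j in sequence:
        t += setup[j] + processing[j]      # job j completes at time t
        tardiness += max(0, t - due[j])    # lateness counts only past the due date
    return tardiness

processing = {0: 3, 1: 2}
setup = {0: 1, 1: 1}
due = {0: 5, 1: 4}
print(total_tardiness([0, 1], processing, setup, due))  # 3
print(total_tardiness([1, 0], processing, setup, due))  # 2 -- order matters
```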

Keywords: algorithm, assembly flowshop, dominance relation, total tardiness

Procedia PDF Downloads 341
6686 Evaluating Hourly Sulphur Dioxide and Ground Ozone Simulated with the Air Quality Model in Lima, Peru

Authors: Odón R. Sánchez-Ccoyllo, Elizabeth Ayma-Choque, Alan Llacza

Abstract:

Sulphur dioxide (SO₂) and surface ozone (O₃) concentrations are associated with disease. The objective of this research is to evaluate the effectiveness of the WRF-Chem air quality model at a horizontal resolution of 5 km x 5 km. For this purpose, hourly SO₂ and O₃ measurements available from three air quality monitoring stations in Lima, Peru were used to validate the SO₂ and O₃ concentrations simulated with the WRF-Chem model for February 2018. For the quantitative evaluation of the simulations of these gases, statistical metrics were computed: the average of the simulations; the average of the measurements; the Mean Bias (MeB); the Mean Error (MeE); and the Root Mean Square Error (RMSE). These metrics indicated that the simulated SO₂ and O₃ values over-predicted the measurements. For SO₂, the MeB values ranged from 0.58 to 26.35 µg/m³, the MeE values from 8.75 to 26.5 µg/m³, and the RMSE values from 13.3 to 31.79 µg/m³; for O₃, the MeB values ranged from 37.52 to 56.29 µg/m³, the MeE values from 37.54 to 56.70 µg/m³, and the RMSE values from 43.05 to 69.56 µg/m³.
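The three evaluation metrics named above can be written down directly from their usual definitions. A generic sketch follows; the paired values in the example are invented, not the Lima data:

```python
def mean_bias(sim, obs):
    """MeB: average of (simulated - observed); sign shows over/under-prediction."""
    return sum(s - o for s, o in zip(sim, obs)) / len(obs)

def mean_error(sim, obs):
    """MeE: average absolute difference between simulated and observed."""
    return sum(abs(s - o) for s, o in zip(sim, obs)) / len(obs)

def rmse(sim, obs):
    """Root mean square error between simulated and observed."""
    return (sum((s - o) ** 2 for s, o in zip(sim, obs)) / len(obs)) ** 0.5

# Invented hourly concentrations (ug/m3), simulated vs. observed:
sim, obs = [40.0, 55.0, 60.0], [35.0, 45.0, 50.0]
print(mean_bias(sim, obs), mean_error(sim, obs), round(rmse(sim, obs), 2))
```

A positive MeB, as here, corresponds to the over-prediction reported in the abstract.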

Keywords: ground-ozone, lima, sulphur dioxide, WRF-chem

Procedia PDF Downloads 120
6685 A Study of Achievement and Attitude on Learning Science in English by Using Co – Teaching Method

Authors: Sakchai Rachniyom

Abstract:

Since the ASEAN community will formally take effect in a few months, Thais should recognize the importance of the English language, as it is regarded as a working language of the community. To promote science students' English proficiency, teachers should be able to teach in English appropriately and effectively. The purposes of this quasi-experimental research were (1) to measure learning achievement, (2) to evaluate students' satisfaction with the teaching and learning, and (3) to study the consequences of the co-teaching method in order to understand achievement and improvement in learning. The participants were 40 general science student teachers. Two types of research instruments were used: (1) an achievement test and (2) a questionnaire. The research was conducted over one semester. The statistics used were the arithmetic mean and the standard deviation. The findings revealed that students' achievement scores increased significantly at the .05 statistical level, and that students were satisfied with the teaching and learning at the highest level. Students' involvement and teachers' support were promoted, and students' learning was reported to have improved under the co-teaching method.

Keywords: co – teaching method, learning science in english, teacher, education

Procedia PDF Downloads 467
6684 Comparing Friction Force between Track and Spline Using Graphite, MoS2, PTFE, and Silicone Dry Lubricants

Authors: M. De Maaijer, Wenxuan Shi, Dolores Pose, Ditmar, F. Barati

Abstract:

Friction has several detrimental effects on blind performance. Ziptrak, a leading company in the blind manufacturing sector, therefore began investigating how to overcome this problem in its next generation of blinds. The problem is most pronounced in extremely severe conditions; although Ziptrak advises against operating the blind in such conditions, improving the blind and its associated parts remained a priority for the company. The purpose of this article is to measure the effect of the lubrication process on reducing the friction force between spline and track, especially in windy conditions. Four different dry lubricants were applied, and their efficiency in reducing the friction force was measured.

Keywords: lubricant, ziptrak, blind, spline

Procedia PDF Downloads 74
6683 Wind Farm Power Performance Verification Using Non-Parametric Statistical Inference

Authors: M. Celeska, K. Najdenkoski, V. Dimchev, V. Stoilkov

Abstract:

Accurate determination of wind turbine performance is necessary for the economic operation of a wind farm. At present, the procedure for power performance verification of wind turbines is based on a standard of the International Electrotechnical Commission (IEC). In this paper, nonparametric statistical inference is applied to design a simple, inexpensive method of verifying the power performance of a wind turbine. A statistical test is explained and examined, and its adequacy is tested on real data. The method uses the information collected by the SCADA system (Supervisory Control and Data Acquisition) from sensors embedded in the wind turbines to carry out the power performance verification of a wind farm. The study used data on the monthly output of a wind farm in the Republic of Macedonia over the measurement interval from January 1, 2016, to December 31, 2016. In the end, it is concluded whether the power performance of a wind turbine differed significantly from what would be expected. The results of the proposed methods showed that the power performance of the specific wind farm under assessment was acceptable.
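The abstract does not name the specific nonparametric test. As one simple option in this spirit, an exact sign test could compare each month's measured output against the expected (power-curve) output and ask whether the months above and below expectation split evenly; the counts below are illustrative assumptions, not the study's data:

```python
from math import comb

def sign_test_p(successes: int, n: int) -> float:
    """Two-sided exact sign test p-value: probability, under a fair 50/50 null,
    of a split at least as extreme as `successes` out of `n` non-tied pairs."""
    k = min(successes, n - successes)
    tail = sum(comb(n, i) for i in range(k + 1)) / 2 ** n
    return min(1.0, 2 * tail)

# Illustrative: say 4 of 12 months exceeded the expected power-curve output
p = sign_test_p(4, 12)
print(f"p = {p:.3f}  ->  {'reject' if p < 0.05 else 'cannot reject'} the null")
```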

Keywords: canonical correlation analysis, power curve, power performance, wind energy

Procedia PDF Downloads 322
6682 Detect Circles in Image: Using Statistical Image Analysis

Authors: Fathi M. O. Hamed, Salma F. Elkofhaifee

Abstract:

The aim of this work is to detect geometrically shaped objects in an image; in this paper, the object is taken to be circular. Identification requires finding three characteristics: the number, size, and location of the objects. To achieve this goal, the paper presents an algorithm that combines statistical approaches with image analysis techniques. The algorithm was implemented to meet the major objectives of the paper. It was first evaluated on simulated data, where it yielded good results, and was then applied to real data.
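One common way to recover a circle's number, size, and location from edge points is a brute-force Hough-style accumulator; this toy stand-in is an assumption about the general approach, not the authors' algorithm, which layers median filtering, projection, and scale-space techniques on top:

```python
from collections import Counter

def detect_circle(points, radii):
    """Hough-style voting: each edge point votes for the candidate centres
    (cx, cy) at each radius r that could have produced it; the cell with the
    most votes gives the detected (cx, cy, r)."""
    votes = Counter()
    for (x, y) in points:
        for r in radii:
            for cx in range(x - r, x + r + 1):
                dy2 = r * r - (cx - x) ** 2
                if dy2 >= 0:
                    dy = round(dy2 ** 0.5)
                    votes[(cx, y - dy, r)] += 1
                    if dy:
                        votes[(cx, y + dy, r)] += 1
    return votes.most_common(1)[0][0]

# Exact lattice points of a circle centred at (10, 10) with radius 5:
pts = [(15, 10), (5, 10), (10, 15), (10, 5), (14, 13), (14, 7),
       (6, 13), (6, 7), (13, 14), (13, 6), (7, 14), (7, 6)]
print(detect_circle(pts, [5]))  # (10, 10, 5)
```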

Keywords: image processing, median filter, projection, scale-space, segmentation, threshold

Procedia PDF Downloads 418
6681 Statistical Design of Synthetic VP X-Bar Control Chart Using Markov Chain Approach

Authors: Ali Akbar Heydari

Abstract:

Control charts are an important tool of statistical quality control. These charts are used to detect and eliminate unwanted special causes of variation that occur during a period of time. The design and operation of control charts require the determination of three design parameters: the sample size (n), the sampling interval (h), and the width coefficient of the control limits (k). The variable parameters (VP) x-bar control chart is an x-bar chart in which all the design parameters vary between two values that are a function of the most recent process information: the position of each sample point on the chart establishes the size of the next sample and the time of its sampling. The synthetic x-bar control chart, which integrates the x-bar chart and the conforming run length (CRL) chart, provides a significant improvement in detection power over the basic x-bar chart for all levels of mean shift. In this paper, we introduce the synthetic VP x-bar control chart for monitoring changes in the process mean. To determine the design parameters, we use a statistical design based on the criterion of minimizing the out-of-control average run length (ARL). The optimal chart parameters of the proposed chart are obtained using the Markov chain approach. A numerical example is also given to show the performance of the proposed chart and to compare it with other control charts. The results show that the proposed synthetic VP x-bar control chart performs better than the synthetic x-bar control chart for all shift parameter values, and better than the VP x-bar control chart for moderate or large shift parameter values.
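The ARL criterion used in the design has a closed form for the basic x-bar chart with fixed parameters (the proposed synthetic VP chart requires the Markov chain computation instead). A sketch of that baseline quantity, useful as a point of comparison, follows:

```python
from math import erf, sqrt

def arl_xbar(k: float, shift: float = 0.0) -> float:
    """Average run length of a basic x-bar chart with k-sigma limits when the
    standardized mean shift is `shift`: ARL = 1 / P(point outside the limits)."""
    def phi(z):  # standard normal CDF
        return 0.5 * (1 + erf(z / sqrt(2)))
    p_signal = 1 - (phi(k - shift) - phi(-k - shift))
    return 1 / p_signal

print(round(arl_xbar(3.0), 1))       # in-control ARL0 ~ 370.4 for 3-sigma limits
print(round(arl_xbar(3.0, 1.0), 1))  # much shorter ARL once the mean has shifted
```

A good design keeps the in-control ARL large (few false alarms) while making the out-of-control ARL, the quantity minimized here, as small as possible.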

Keywords: control chart, markov chain approach, statistical design, synthetic, variable parameter

Procedia PDF Downloads 147