Search results for: statistical approaches
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7422

7242 Immobilization of Lipase Enzyme by Low Cost Material: A Statistical Approach

Authors: Md. Z. Alam, Devi R. Asih, Md. N. Salleh

Abstract:

Immobilization of lipase enzyme produced from palm oil mill effluent (POME) onto activated carbon (AC), selected as the most suitable of the low-cost support materials tested, was optimized. The results indicated that an immobilization of 94% was achieved with AC. A sequential optimization strategy based on statistical experimental design, beginning with the one-factor-at-a-time (OFAT) method, was used to determine the equilibrium time. Three components influencing lipase immobilization were then optimized by response surface methodology (RSM) based on the face-centered central composite design (FCCCD). On statistical analysis of the results, the optimum enzyme concentration loading, agitation rate, and activated carbon dosage were found to be 30 U/ml, 300 rpm, and 8 g/L respectively, with a maximum immobilization activity of 3732.9 U/g-AC after 2 hrs of immobilization. Analysis of variance (ANOVA) showed a high regression coefficient (R²) of 0.999, which indicated a satisfactory fit of the model to the experimental data. The parameters were statistically significant at p<0.05.
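
As an illustrative sketch of the RSM step described above, a second-order response surface can be fitted by least squares over a face-centred design and its stationary point solved for analytically. The design points, the response values, and the reduction to two factors below are hypothetical, not the study's data.

```python
import numpy as np

# Hypothetical face-centred design points for two coded factors
# (e.g. enzyme loading x1 and agitation x2); the paper used three factors.
x1, x2 = np.meshgrid([-1, 0, 1], [-1, 0, 1])
x1, x2 = x1.ravel(), x2.ravel()

# Synthetic response with a known optimum at (0.5, -0.25)
y = 100 - 4*(x1 - 0.5)**2 - 8*(x2 + 0.25)**2

# Second-order (quadratic) response-surface model, fitted by least squares
X = np.column_stack([np.ones_like(x1), x1, x2, x1*x2, x1**2, x2**2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Stationary point: set the gradient of the fitted quadratic to zero
b = beta[1:3]
B = np.array([[2*beta[4], beta[3]], [beta[3], 2*beta[5]]])
optimum = np.linalg.solve(B, -b)
print(optimum)  # recovers the optimum (0.5, -0.25)
```

In practice the fitted surface would come from measured immobilization activities, and ANOVA on the regression would assess the fit, as the abstract reports.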

Keywords: activated carbon, POME based lipase, immobilization, adsorption

Procedia PDF Downloads 218
7241 Large Amplitude Vibration of Sandwich Beam

Authors: Youssef Abdelli, Rachid Nasri

Abstract:

The large amplitude free vibration analysis of three-layered symmetric sandwich beams is carried out using two different approaches. The governing nonlinear partial differential equations of motion in free natural vibration are derived using Hamilton's principle. The formulation leads to two nonlinear partial differential equations that are coupled in both axial and bending deformations. In the first approach, the method of multiple scales is applied directly to the governing equation, which is a nonlinear partial differential equation. In the second approach, we discretize the governing equation by using Galerkin's procedure and then apply the shooting method to the resulting ordinary differential equations. In order to check the validity of the solutions obtained by the two approaches, they are compared with the solutions obtained numerically by the finite difference method.
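
The shooting step mentioned above can be sketched on a toy boundary value problem: integrate with a guessed initial slope and adjust the guess until the far boundary condition is satisfied. The ODE below is a simple linear stand-in for the discretised beam equations, chosen so that the exact answer is known.

```python
import numpy as np

# Shooting method sketch on y'' = -y, y(0) = 0, y(1) = sin(1).
# The exact solution is y = sin(x), so the unknown initial slope is y'(0) = 1.

def rk4_final(slope, n=200):
    """Integrate from x=0 to 1 with y(0)=0, y'(0)=slope via RK4; return y(1)."""
    h = 1.0 / n
    y, v = 0.0, slope
    f = lambda y, v: (v, -y)          # first-order system (y' = v, v' = -y)
    for _ in range(n):
        k1 = f(y, v)
        k2 = f(y + h/2*k1[0], v + h/2*k1[1])
        k3 = f(y + h/2*k2[0], v + h/2*k2[1])
        k4 = f(y + h*k3[0], v + h*k3[1])
        y += h/6*(k1[0] + 2*k2[0] + 2*k3[0] + k4[0])
        v += h/6*(k1[1] + 2*k2[1] + 2*k3[1] + k4[1])
    return y

# Bisection on the initial slope until the far boundary condition is met
target, lo, hi = np.sin(1.0), 0.0, 2.0
for _ in range(60):
    mid = 0.5*(lo + hi)
    if rk4_final(mid) < target:
        lo = mid
    else:
        hi = mid
slope = 0.5*(lo + hi)
print(slope)  # converges to 1.0, the slope of the exact solution sin(x)
```

For the nonlinear beam equations, the same idea applies with a nonlinear ODE integrator and a root-finder on the shooting parameter.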

Keywords: finite difference method, large amplitude vibration, multiple scales, nonlinear vibration

Procedia PDF Downloads 427
7240 Analysis of the Significance of Multimedia Channels Using Sparse PCA and Regularized SVD

Authors: Kourosh Modarresi

Abstract:

The abundance of media channels and devices has given users a variety of options to extract, discover, and explore information in the digital world. Since there is often a long and complicated path that a typical user may take before any (significant) action, such as purchasing goods and services, it is critical to know how each node (media channel) in the user's path has contributed to the final action. In this work, the significance of each media channel is computed using statistical analysis and machine learning techniques. More specifically, regularized singular value decomposition and sparse principal component analysis have been used to compute the significance of each channel toward the final action. The results of this work show a considerable improvement over present approaches.
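
A minimal sketch of one common flavour of "regularized SVD" is soft-thresholding of the singular values (the solution of the nuclear-norm-penalised low-rank approximation). The synthetic matrix, the penalty value, and the energy-based significance measure below are illustrative assumptions, not the paper's method.

```python
import numpy as np

# Synthetic user-by-channel matrix (stand-in for real attribution data)
rng = np.random.default_rng(0)
A = rng.normal(size=(8, 5)) @ rng.normal(size=(5, 6))  # rank <= 5

U, s, Vt = np.linalg.svd(A, full_matrices=False)
lam = 1.0
s_shrunk = np.maximum(s - lam, 0.0)   # soft-threshold the singular values
A_reg = (U * s_shrunk) @ Vt           # regularized low-rank reconstruction

# "Significance" of each channel (column) taken here as its share of the
# energy in the regularized reconstruction, normalised to sum to 1.
significance = (A_reg**2).sum(axis=0)
significance /= significance.sum()
print(significance)
```

Sparse PCA would additionally force many loadings to exactly zero, making the per-channel contributions easier to interpret.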

Keywords: multimedia attribution, sparse principal component, regularization, singular value decomposition, feature significance, machine learning, linear systems, variable shrinkage

Procedia PDF Downloads 280
7239 Benchmarking Machine Learning Approaches for Forecasting Hotel Revenue

Authors: Rachel Y. Zhang, Christopher K. Anderson

Abstract:

A critical aspect of revenue management is a firm’s ability to predict demand as a function of price. Historically, hotels have used simple time series models (regression and/or pick-up based models) owing to the complexity of trying to build causal models of demand. Machine learning approaches are slowly attracting attention owing to their flexibility in modeling relationships. This study provides an overview of approaches to forecasting hospitality demand, focusing on the opportunities created by machine learning approaches, including K-Nearest-Neighbors, Support Vector Machine, Regression Tree, and Artificial Neural Network algorithms. The out-of-sample performance of the above approaches to forecasting hotel demand is illustrated using a proprietary sample of market-level (24 properties) transactional data for Las Vegas, NV. Causal predictive models can be built and evaluated owing to the availability of market-level (versus firm-level) data. This research also compares and contrasts the accuracy of firm-level models (i.e., predictive models for hotel A using only hotel A’s data) with that of models using market-level data (prices, review scores, location, chain scale, etc. for all hotels within the market). The resulting models will be valuable for predicting the revenue of a hotel property from its basic characteristics, and can also be applied in performance evaluation for an existing hotel. The findings will unveil the features that play key roles in a hotel’s revenue performance, which has considerable potential usefulness in both revenue prediction and evaluation.
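
As a hedged illustration of the kind of benchmarking described above, a k-nearest-neighbours learner can be compared with a linear price-response model on synthetic demand data. The data-generating process, the train/test split, and the choice of k are assumptions made here for illustration only.

```python
import numpy as np

# Synthetic demand curve with a nonlinear component the linear model misses
rng = np.random.default_rng(1)
price = rng.uniform(50, 300, size=200)
demand = 400 - 1.2*price + 0.004*(price - 150)**2 + rng.normal(0, 5, 200)

# simple train/test split for out-of-sample evaluation
Xtr, Xte = price[:150], price[150:]
ytr, yte = demand[:150], demand[150:]

def knn_predict(x, k=5):
    """Average demand of the k training prices nearest to x."""
    idx = np.argsort(np.abs(Xtr - x))[:k]
    return ytr[idx].mean()

knn_pred = np.array([knn_predict(x) for x in Xte])

# ordinary least squares linear price-response model
A = np.column_stack([np.ones_like(Xtr), Xtr])
coef, *_ = np.linalg.lstsq(A, ytr, rcond=None)
lin_pred = coef[0] + coef[1]*Xte

mae_knn = np.abs(knn_pred - yte).mean()
mae_lin = np.abs(lin_pred - yte).mean()
print(mae_knn, mae_lin)  # KNN adapts to the curvature the linear model misses
```

The same train/test protocol extends directly to SVMs, regression trees, and neural networks, which is how the out-of-sample comparison in the study is framed.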

Keywords: hotel revenue, k-nearest-neighbors, machine learning, neural network, prediction model, regression tree, support vector machine

Procedia PDF Downloads 105
7238 Detecting Covid-19 Fake News Using Deep Learning Technique

Authors: Anjali A. Prasad

Abstract:

Nowadays, social media plays an important role in spreading misinformation or fake news. This study analyzes fake news related to the COVID-19 pandemic spread on social media. This paper aims to evaluate and compare different approaches used to mitigate this issue, including popular deep learning approaches such as CNN, RNN, LSTM, and the BERT algorithm, for classification. To evaluate the models’ performance, we used accuracy, precision, recall, and F1-score as the evaluation metrics, and finally compared which of the four algorithms shows the best results.
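
The evaluation metrics named above follow directly from the confusion matrix; a minimal sketch on hypothetical binary labels (1 = fake, 0 = genuine) is:

```python
# Hypothetical predictions from one of the classifiers
y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0]

# confusion-matrix counts
tp = sum(t == p == 1 for t, p in zip(y_true, y_pred))
tn = sum(t == p == 0 for t, p in zip(y_true, y_pred))
fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))

accuracy = (tp + tn) / len(y_true)
precision = tp / (tp + fp)
recall = tp / (tp + fn)
f1 = 2 * precision * recall / (precision + recall)
print(accuracy, precision, recall, f1)  # 0.75 0.75 0.75 0.75
```

Computing all four metrics per model, rather than accuracy alone, matters here because fake-news datasets are often class-imbalanced.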

Keywords: BERT, CNN, LSTM, RNN

Procedia PDF Downloads 176
7237 Punishment In Athenian Forensic Oratory

Authors: Eleni Volonaki

Abstract:

In Athenian forensic speeches, the argumentation on punishment of wrongdoers constitutes a fundamental ideal of exacting justice in court. The present paper explores the variety of approaches to punishment as a means of reformation, revenge, correction, education, example, and a chance for the restoration of justice. As will be shown, all these approaches reflect the social and political ideology of Athenian justice in the classical period and enhance the role of the courts and the importance of rhetoric in the process of decision-making. Punishment entails a wide range of penalties but also of ideological principles related to the Athenian constitution of democracy.

Keywords: punishment, athenian forensic speeches, justice, athenian democracy

Procedia PDF Downloads 156
7236 The Impact of Project Management Approaches in Enhancing Entrepreneurial Growth: A Study Using the Theory of Planned Behaviour as a Lens to Understand

Authors: Akunna Agunwah, Kevin Gallimore, Kathryn Kinnmond

Abstract:

Entrepreneurship and project management are widely associated and seen as vehicles for economic growth, but they are studied separately. A few authors have considered the interconnectivity between these two fields, but relatively little empirical data currently exists in the literature. The purpose of the present empirical study is to explore whether successful entrepreneurs utilise project management approaches in enhancing enterprise growth, by examining the working practices and experiences of entrepreneurs through the lens of the Theory of Planned Behaviour (TPB). In order to understand those experiences, ten successful entrepreneurs in various business sectors in the North West of England were interviewed using a face-to-face semi-structured interview method. The audio-recorded data were transcribed and analysed using a deductive thematic technique (a qualitative approach). The themes were viewed through the lens of the Theory of Planned Behaviour to identify the three intentional antecedents (attitude, subjective norms, and perceived behavioural control) and to understand how they relate to the project management approaches (planning, execution, and monitoring). The findings are twofold. First, evidence of the three intentional antecedents that make up the Theory of Planned Behaviour was present. Secondly, the analysis of the project management approach themes (planning, execution, and monitoring) through the lens of the Theory of Planned Behaviour showed evidence of the three intentional antecedents. More than one intentional antecedent was found in a given project management theme, which indicates that entrepreneurs utilise these approaches without categorising them into definite themes; rather, they utilise these intentional antecedents as processes to enhance business growth.
In conclusion, the work presented here shows a way of understanding the interconnectivity between entrepreneurship and project management towards enhancing enterprise growth, by examining the working practices and experiences of successful entrepreneurs in the North West of England.

Keywords: business growth, entrepreneurship, project management approaches, theory of planned behaviour

Procedia PDF Downloads 175
7235 A Review of Benefit-Risk Assessment over the Product Lifecycle

Authors: M. Miljkovic, A. Urakpo, M. Simic-Koumoutsaris

Abstract:

Benefit-risk assessment (BRA) is a valuable tool applied at multiple stages of a medicine's lifecycle, and the assessment can be conducted in a variety of ways. The aim was to summarize current BRA methods used in approval decisions and in post-approval settings, and to identify possible future directions. Relevant reviews, recommendations, and guidelines published in the medical literature and by regulatory agencies over the past five years were examined. BRA involves the review of two dimensions: the dimension of benefits (determined mainly by therapeutic efficacy) and the dimension of risks (comprising the safety profile of a drug). Regulators, industry, and academia have developed various approaches, ranging from descriptive textual (qualitative) to decision-analytic (quantitative) models, to facilitate the BRA of medicines during the product lifecycle (from Phase I trials, through the authorization procedure, to post-marketing surveillance and health technology assessment for inclusion in public formularies). These approaches can be classified into the following categories: stepwise structured approaches (frameworks); measures for benefits and risks that are usually endpoint-specific (metrics); simulation techniques and meta-analysis (estimation techniques); and utility survey techniques to elicit stakeholders’ preferences (utilities). All these approaches share two common goals, to assist the analysis and to improve the communication of decisions, but each is subject to its own specific strengths and limitations. Before using any method, its utility, its complexity, the extent to which it is established, and the ease of interpreting its results should be considered. Despite widespread and long-time use, BRA is subject to debate, suffers from a number of limitations, and is still under development.
The use of formal, systematic structured approaches to BRA for regulatory decision-making, and of quantitative methods to support BRA during the product lifecycle, is a standard practice in medicine that is subject to continuous improvement and modernization, not only in methodology but also in cooperation between organizations.

Keywords: benefit-risk assessment, benefit-risk profile, product lifecycle, quantitative methods, structured approaches

Procedia PDF Downloads 117
7234 Simulation of Government Management Model to Increase Financial Productivity System Using Govpilot

Authors: Arezou Javadi

Abstract:

The use of algorithmic models based on software calculations, and the simulation of new government management designs with the help of specialized software, has recently increased the productivity and efficiency of the government management system. This has caused the management approach to change from the old patch-and-fix model, which has low efficiency and limited usefulness, to a more capable management model with higher efficiency called the partnership-with-residents model. Using GovPilot software, the relationship between the people in a system and the government was examined. The method of two-tailed interaction was the outsourcing of a goal in a system, formed in the order of goals, qualified executive people, an optimal executive model, and, finally, a summary of additional activities at the different statistical levels. The results showed that the participation of people in a financial implementation system with a statistical potential of P≥5% caused a significant increase in investment and initial capital in the government system, with maximum project implementation in a smart government.

Keywords: machine learning, financial income, statistical potential, govpilot

Procedia PDF Downloads 60
7232 Bayesian Borrowing Methods for Count Data: Analysis of Incontinence Episodes in Patients with Overactive Bladder

Authors: Akalu Banbeta, Emmanuel Lesaffre, Reynaldo Martina, Joost Van Rosmalen

Abstract:

Including data from previous studies (historical data) in the analysis of a current study may reduce the sample size requirement and/or increase the power of the analysis. The most common example is incorporating historical control data in the analysis of a current clinical trial. However, this only applies when the historical control data are similar enough to the current control data. Recently, several Bayesian approaches for incorporating historical data have been proposed, such as the meta-analytic-predictive (MAP) prior and the modified power prior (MPP), both for a single control arm as well as for multiple historical control arms. Here, we examine the performance of the MAP and MPP approaches for the analysis of (over-dispersed) count data. To this end, we propose a computational method for the MPP approach for the Poisson and negative binomial models. We conducted an extensive simulation study to assess the performance of these Bayesian approaches. Additionally, we illustrate our approaches on an overactive bladder data set. For similar data across the control arms, the MPP approach outperformed the MAP approach with respect to statistical power. When the means across the control arms are different, the MPP yielded a slightly inflated type I error (TIE) rate, whereas the MAP did not. In contrast, when the dispersion parameters are different, the MAP gave an inflated TIE rate, whereas the MPP did not. We conclude that the MPP approach is more promising than the MAP approach for incorporating historical count data.
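
A simplified sketch of the power-prior idea for Poisson counts: with a conjugate Gamma prior, raising the historical likelihood to a fixed power a0 keeps the posterior in Gamma form. The MPP studied in the abstract additionally places a prior on a0 and handles over-dispersion; the counts and hyperparameters below are hypothetical.

```python
import numpy as np

# Fixed-weight power prior for a Poisson rate with Gamma(a, b) initial prior.
a, b = 1.0, 1.0                # vague Gamma prior on the Poisson rate
hist = np.array([4, 6, 5, 7])  # historical control counts (hypothetical)
curr = np.array([5, 5, 6])     # current control counts (hypothetical)
a0 = 0.5                       # discounting weight on the historical data

# Conjugate update: historical sufficient statistics enter scaled by a0
post_a = a + a0 * hist.sum() + curr.sum()
post_b = b + a0 * len(hist) + len(curr)
post_mean = post_a / post_b
print(post_mean)  # borrowing-adjusted posterior mean of the rate
```

Setting a0 = 0 ignores the historical arm entirely, while a0 = 1 pools it fully; intermediate values formalise partial borrowing.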

Keywords: count data, meta-analytic prior, negative binomial, poisson

Procedia PDF Downloads 88
7231 Statistical Channel Modeling for Multiple-Input-Multiple-Output Communication System

Authors: M. I. Youssef, A. E. Emam, M. Abd Elghany

Abstract:

The performance of wireless communication systems is affected mainly by the environment of the associated channel, which is characterized by dynamic and unpredictable behavior. In this paper, different statistical earth-satellite channel models are studied, with emphasis on two main models: first, the Rice-lognormal model, chosen because it represents an environment including the shadowing and multi-path components that affect the propagated signal along its path, and second, a three-state model that takes into account different fading conditions (clear area, moderate shadowing, and heavy shadowing). The provided models are based on AWGN, Rician, Rayleigh, and log-normal distributions, whose probability density functions (PDFs) are presented. The transmission system's bit error rate (BER), peak-to-average power ratio (PAPR), and channel capacity are measured and analyzed across the fading models. The simulations are implemented in MATLAB, and the results show the performance of the transmission system over the different channel models.

Keywords: fading channels, MIMO communication, RNS scheme, statistical modeling

Procedia PDF Downloads 117
7230 Lexical Based Method for Opinion Detection on Tripadvisor Collection

Authors: Faiza Belbachir, Thibault Schienhinski

Abstract:

The massive development of online social networks allows users to post and share their opinions on various topics. With this huge volume of opinions, it is interesting to extract and interpret this information for different domains, e.g., product and service benchmarking, politics, and recommender systems. This is why opinion detection is one of the most important research tasks. It consists of differentiating between opinion data and factual data. The difficulty of this task lies in determining an approach that returns opinionated documents. Generally, two kinds of approaches are used for opinion detection: lexical-based approaches and machine learning-based approaches. In lexical-based approaches, a dictionary of sentiment words is used, with each word associated with a weight; the opinion score of a document is derived from the occurrence of words from this dictionary. In machine learning approaches, a classifier is usually trained on a set of annotated documents containing sentiment, with features such as n-grams of words, part-of-speech tags, and logical forms. The majority of these works rely on the document text alone to determine the opinion score, without taking into account whether these texts are really reliable. Thus, it is interesting to exploit other information to improve opinion detection. In our work, we develop a new way to compute the opinion score. We introduce the notion of a trust score. We determine not only which documents are opinionated but also whether these opinions are really trustworthy information in relation to the topics. For this, we use the lexical resource SentiWordNet to calculate opinion and trust scores, and we compute different features about users (the number of their comments, the number of their useful comments, and their average useful reviews). After that, we combine the opinion score and the trust score to obtain a final score. We applied our method to detect trusted opinions in the TripAdvisor collection.
Our experimental results show that the combination of the opinion score and the trust score improves opinion detection.
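
A minimal sketch of the score combination described above, with a tiny hand-made lexicon standing in for SentiWordNet; the weighting alpha and the user features are illustrative assumptions, not the paper's tuned values.

```python
# Toy sentiment lexicon: word -> polarity weight (stand-in for SentiWordNet)
lexicon = {"great": 0.8, "bad": -0.7, "clean": 0.5, "terrible": -0.9}

def opinion_score(text):
    """Density of opinionated words in the text."""
    words = text.lower().split()
    hits = [abs(lexicon[w]) for w in words if w in lexicon]
    return sum(hits) / len(words)

def trust_score(n_reviews, n_useful):
    """Fraction of the user's reviews that other users marked useful."""
    return n_useful / n_reviews if n_reviews else 0.0

def final_score(text, n_reviews, n_useful, alpha=0.6):
    """Weighted combination of opinion and trust scores."""
    return alpha*opinion_score(text) + (1 - alpha)*trust_score(n_reviews, n_useful)

s = final_score("great clean room but bad service", 10, 7)
print(s)  # 0.48
```

The combination lets a strongly opinionated review from an untrusted reviewer rank below a milder review from a consistently useful one.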

Keywords: Tripadvisor, opinion detection, SentiWordNet, trust score

Procedia PDF Downloads 165
7229 Eclectic Therapy in Approach to Clients’ Problems and Application of Multiple Intelligence Theory

Authors: Mohamed Sharof Mostafa, Atefeh Ahmadi

Abstract:

Most traditional single-modality psychotherapy and counselling approaches to clients’ problems are based on the application of one therapy in all sessions. Modern developments in these sciences focus on eclectic and integrative interventions that consider all dimensions of an issue and all characteristics of the clients. This paper presents an overview of eclectic therapy and its pros and cons. In addition, multiple intelligence theory and its application in eclectic therapy approaches are discussed.

Keywords: eclectic therapy, client, multiple intelligence theory, dimensions

Procedia PDF Downloads 669
7228 Select Communicative Approaches and Speaking Skills of Junior High School Students

Authors: Sonia Arradaza-Pajaron

Abstract:

Speaking English as a medium of instruction poses a real challenge for students who are non-native English speakers to achieve proficiency, especially when it is a requirement in most communicative classroom instruction. It becomes a real burden for students whose English language orientation is not well facilitated and encouraged by teachers in national high schools. This study, which utilized a descriptive-correlational design, examined the relationship between the select communicative approaches commonly utilized in classroom instruction and the level of speaking skills among the identified high school students. Survey questionnaires, interviews, and observation sheets were the research instruments used to generate salient information. Data were analyzed statistically using weighted means for speaking-skill levels and Pearson's r to determine the relationship between the two identified variables of the study. Findings revealed that the English speaking skills of the high school students are just average. Further, among the identified speaking sub-skills, namely grammar, pronunciation, and fluency, the students rated above average. There was also a clear relationship between some communicative approaches and the respondents’ speaking skills. Most notable among the select approaches is role-playing, compared to storytelling, informal debate, brainstorming, oral reporting, and others; this may be because role-playing is the most commonly used approach in the classroom. This implies that when these high school students are given enough time and autonomy in how they express their ideas or their comprehension of a lesson, they show a spontaneous manner of expression through maximization of the second language. It can be concluded that high school students have the capacity to express ideas even in the second language, provided they are encouraged and well facilitated by teachers.
Also, when a better communicative approach is identified and better implemented, students’ classroom engagement will improve.

Keywords: communicative approaches, comprehension, role playing, speaking skills

Procedia PDF Downloads 149
7227 Evaluation of Egg Quality Parameters in the Isa Brown Line in Intensive Production Systems in the Ocaña Region, Norte de Santander

Authors: Meza-Quintero Myriam, Lobo Torrado Katty Andrea, Sanchez Picon Yesenia, Hurtado-Lugo Naudin

Abstract:

The objective of the study was to evaluate the internal and external quality of the egg in three production housing systems, floor, cage, and grazing, for laying birds of the Isa Brown line, in the laying period between weeks 35 and 41; 135 hens distributed in 3 treatments of 45 birds per repetition were used (the replicates were the seven weeks of the trial). The feed supplied in the floor and cage systems contained 114 g/bird/day; in the grazing system, 14 grams less concentrate was provided. Nine eggs were collected to be studied and analyzed in the animal nutrition laboratory (3 eggs per housing system). A randomized statistical model was implemented; for the statistical analysis of the data, the IBM® Statistical Product and Service Solutions (SPSS) software, version 2.3, was used. The evaluation and follow-up instruments were a vernier caliper for measurements in millimeters, a YolkFan™ 16 from Roche DSM for the evaluation of egg yolk pigmentation, a digital scale for measurements in grams, a micrometer for measurements in millimeters, and laboratory evaluation of dry matter, ashes, and ethereal extract. The results showed no significant differences (P-value > 0.05) for egg size (0.04 ± 3.55), shell thickness (0.46 ± 3.55), albumen weight (0.18 ± 3.55), albumen height (0.38 ± 3.55), yolk weight (0.64 ± 3.55), yolk height (0.54 ± 3.55), or yolk pigmentation (1.23 ± 3.55). It was concluded that the hens in the three production systems, floor, cage, and grazing, did not show significant statistical differences in the internal and external quality of the egg for the parameters studied.

Keywords: biological, territories, genetic resource, egg

Procedia PDF Downloads 51
7226 Comparison of Classical Computer Vision vs. Convolutional Neural Networks Approaches for Weed Mapping in Aerial Images

Authors: Paulo Cesar Pereira Junior, Alexandre Monteiro, Rafael da Luz Ribeiro, Antonio Carlos Sobieranski, Aldo von Wangenheim

Abstract:

In this paper, we present a comparison between convolutional neural networks and classical computer vision approaches for the specific precision agriculture problem of weed mapping in aerial images of sugarcane fields. A systematic literature review was conducted to find which computer vision methods are being used for this specific problem. The most cited methods were implemented, as well as four models of convolutional neural networks. All implemented approaches were tested using the same dataset, and their results were quantitatively and qualitatively analyzed. The obtained results were compared for validation against a ground truth produced by a human expert. The results indicate that the convolutional neural networks achieve better precision and generalize better than the classical models.
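
One classical baseline commonly used for vegetation segmentation can be sketched with the Excess Green index and a fixed threshold; whether this particular index was among the implemented methods is an assumption here, and the tiny "image" below is synthetic.

```python
import numpy as np

# 2x2 synthetic RGB "image": two green (vegetation) and two grey (soil) pixels
img = np.array([[[ 40, 160,  50], [200, 190, 180]],
                [[ 30, 140,  40], [120, 110, 100]]], dtype=float) / 255.0

r, g, b = img[..., 0], img[..., 1], img[..., 2]
total = r + g + b + 1e-9                  # avoid division by zero
rn, gn, bn = r/total, g/total, b/total    # chromatic coordinates

exg = 2*gn - rn - bn                      # Excess Green index
mask = exg > 0.1                          # vegetation mask (fixed threshold)
print(mask)  # [[ True False], [ True False]]
```

Real pipelines typically follow the index with Otsu thresholding and morphological cleanup; the CNN approaches in the paper learn the segmentation end-to-end instead.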

Keywords: convolutional neural networks, deep learning, digital image processing, precision agriculture, semantic segmentation, unmanned aerial vehicles

Procedia PDF Downloads 216
7225 Condition for Plasma Instability and Stability Approaches

Authors: Ratna Sen

Abstract:

Because of the very high temperature of a plasma, it is very difficult to confine it long enough for nuclear fusion reactions to take place; as is well known, plasma escapes faster than the binary collision rate. We studied the ball analogy and the 'energy principle' and calculated the total potential energy for the whole plasma. If δW is negative, that is, if the potential energy decreases, then the plasma will be unstable. We also discuss different approaches to stability analysis, such as the Nyquist method, the MHD approximation, and the Vlasov approach to plasma stability, so that by using suitable magnetic field configurations we can create a stable plasma in a tokamak for generating energy for future generations.
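
For reference, the ideal-MHD energy principle invoked above is usually written in the following standard textbook form, given here as a sketch rather than this paper's own derivation:

```latex
\delta W = \frac{1}{2}\int_V \Bigg[
  \frac{|\mathbf{Q}_\perp|^2}{\mu_0}
  + \frac{B^2}{\mu_0}\,\big|\nabla\cdot\boldsymbol{\xi}_\perp
      + 2\,\boldsymbol{\xi}_\perp\cdot\boldsymbol{\kappa}\big|^2
  + \gamma p\,\big|\nabla\cdot\boldsymbol{\xi}\big|^2
  - 2\,(\boldsymbol{\xi}_\perp\cdot\nabla p)\,(\boldsymbol{\kappa}\cdot\boldsymbol{\xi}_\perp^{\,*})
  - J_\parallel\,(\boldsymbol{\xi}_\perp^{\,*}\times\mathbf{b})\cdot\mathbf{Q}_\perp
\Bigg]\, dV
```

where ξ is the plasma displacement, Q = ∇×(ξ×B) is the perturbed magnetic field, κ is the field-line curvature, and b = B/B. The equilibrium is stable if and only if δW ≥ 0 for all admissible displacements; the last two (destabilizing) terms are driven by the pressure gradient and the parallel current, respectively.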

Keywords: jello, magnetic field configuration, MHD approximation, energy principle

Procedia PDF Downloads 411
7224 Correlation between Potential Intelligence Explanatory Study in the Perspective of Multiple Intelligence Theory by Using Dermatoglyphics and Culture Approaches

Authors: Efnie Indrianie

Abstract:

Potential Intelligence constitutes one essential factor in every individual. This intelligence can be a foundation for the development of Performance Intelligence if it is supported by the surrounding environment. Fingerprint analysis is one method of recognizing this Potential Intelligence. The method is grounded in the pattern and number of fingerprint ridge lines, which are assumed to correspond to the distribution of nerves in the brain, where each area has its own function. These brain functions are then mapped onto intelligence components in accordance with the Multiple Intelligences theory. This research tested the correlation between Potential Intelligence and the components of Performance Intelligence. Statistical tests using Pearson correlation showed that the following components of Potential Intelligence correlated with Performance Intelligence: Logic-Math, Logic, Linguistic, Music, Kinesthetic, and Intrapersonal. This research also indicated that cultural factors play a large role in shaping intelligence.

Keywords: potential intelligence, performance intelligence, multiple intelligences, fingerprint, environment, brain

Procedia PDF Downloads 496
7223 A Survey on Types of Noises and De-Noising Techniques

Authors: Amandeep Kaur

Abstract:

Digital image processing is a fundamental tool for performing various operations on digital images for pattern recognition, noise removal, and feature extraction. In this paper, noise removal techniques are described for various types of noise. The paper discusses the various noises that can appear in an image due to different environmental and accidental factors. Various de-noising approaches are discussed that utilize different wavelets and filters. From an analysis of various papers on image de-noising, we conclude that wavelet-based de-noising approaches are more effective than the others.
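
The wavelet-thresholding idea surveyed above can be sketched with a one-level Haar transform written out directly (real pipelines use deeper decompositions and other wavelets via libraries such as PyWavelets); the signal and threshold rule here are illustrative.

```python
import numpy as np

# Piecewise-constant test signal plus Gaussian noise
rng = np.random.default_rng(3)
clean = np.repeat([0., 4., 0., 4.], 64)
noisy = clean + rng.normal(0, 0.5, clean.size)

# one-level Haar analysis: pairwise averages and differences
approx = (noisy[0::2] + noisy[1::2]) / np.sqrt(2)
detail = (noisy[0::2] - noisy[1::2]) / np.sqrt(2)

# soft-threshold the detail coefficients (universal-threshold style rule)
t = 0.5 * np.sqrt(2 * np.log(detail.size))
detail = np.sign(detail) * np.maximum(np.abs(detail) - t, 0)

# Haar synthesis (inverse transform)
denoised = np.empty_like(noisy)
denoised[0::2] = (approx + detail) / np.sqrt(2)
denoised[1::2] = (approx - detail) / np.sqrt(2)

err_noisy = np.mean((noisy - clean)**2)
err_den = np.mean((denoised - clean)**2)
print(err_noisy, err_den)  # thresholding reduces the mean-squared error
```

The detail band of a piecewise-smooth signal is mostly noise, so shrinking it toward zero removes noise while the approximation band preserves the structure, which is the essence of wavelet de-noising.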

Keywords: de-noising techniques, edges, image, image processing

Procedia PDF Downloads 304
7222 Multimodal Deep Learning for Human Activity Recognition

Authors: Ons Slimene, Aroua Taamallah, Maha Khemaja

Abstract:

In recent years, human activity recognition (HAR) has been a key area of research due to its diverse applications, and it has garnered increasing attention in the field of computer vision. HAR plays an important role in people’s daily lives, as it makes it possible to learn advanced knowledge about human activities from data. In HAR, activities are usually represented by exploiting different types of sensors, such as embedded sensors or visual sensors. However, these sensors have limitations, such as local obstacles, image-related obstacles, sensor unreliability, and consumer concerns. Recently, several deep learning-based approaches have been proposed for HAR; these approaches are classified into two categories based on the type of data used: vision-based approaches and sensor-based approaches. This research paper highlights the importance of multimodal data fusion, combining skeleton data obtained from videos with data generated by embedded sensors, using deep neural networks to achieve HAR. We propose a deep multimodal fusion network based on a two-stream architecture. The two streams use a Convolutional Neural Network combined with a Bidirectional LSTM (CNN-BiLSTM) to process the skeleton data and the data generated by embedded sensors, and fusion at the feature level is considered. The proposed model was evaluated on the public OPPORTUNITY++ dataset and produced an accuracy of 96.77%.
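
Feature-level fusion, as used in the proposed network, amounts to concatenating the two streams' feature vectors before a shared classifier; the dimensions, random features, and untrained weights below are purely illustrative, not the paper's trained CNN-BiLSTM outputs.

```python
import numpy as np

rng = np.random.default_rng(4)
f_skeleton = rng.normal(size=16)  # stand-in for the skeleton-stream features
f_sensor = rng.normal(size=8)     # stand-in for the sensor-stream features

fused = np.concatenate([f_skeleton, f_sensor])   # feature-level fusion

W = rng.normal(size=(5, fused.size))             # 5 hypothetical activity classes
logits = W @ fused
probs = np.exp(logits - logits.max())
probs /= probs.sum()                             # softmax over activities
print(probs.argmax(), probs.sum())
```

Fusing at the feature level, rather than averaging the two streams' final predictions, lets the classifier exploit cross-modal correlations between the skeleton and sensor representations.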

Keywords: human activity recognition, action recognition, sensors, vision, human-centric sensing, deep learning, context-awareness

Procedia PDF Downloads 68
7221 Comparative Analysis of Canal Centering Ratio, Apical Transportation, and Remaining Dentin Thickness between Single-File Systems Using Cone Beam Computed Tomography: An in vitro Study

Authors: Aditi Jain

Abstract:

Aim: To compare the canal transportation, centering ability, and remaining dentin thickness of the OneShape and WaveOne systems using CBCT. Objective: To identify the rotary system that best respects the original canal anatomy. Materials and Methods: Forty extracted human single-rooted premolars were used in the present study. Pre-instrumentation scans of all teeth were taken, canal curvatures were calculated, and the samples were randomly divided into two groups of twenty samples each: Group 1, the WaveOne system, and Group 2, the ProTaper rotary system. Post-instrumentation scans were performed, and the two scans were compared to determine canal transportation, centering ability, and remaining dentin thickness at 1, 3, and 5 mm from the root apex. Results: Using Student’s unpaired t-test, the results were as follows: for canal transportation, Group 1 showed a statistically significant difference at 3 mm and 6 mm and a non-significant difference at 9 mm, while for Group 2 no statistically significant difference was obtained at 3 mm, 6 mm, or 9 mm. For centering ability and remaining dentin thickness, Group 1 showed no statistically significant difference at 3 mm and 9 mm, while a statistically significant difference was obtained at 6 mm. When the remaining dentin thickness was compared at the three levels between the WaveOne and ProTaper groups, there was no statistically significant difference between the two groups. Conclusion: The WaveOne single reciprocating file respects the original canal anatomy better than ProTaper, and WaveOne showed the best centering ability.
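
The standard CBCT measures of transportation and centering (following the commonly used Gambill et al. formulas) can be sketched numerically; the dentin thicknesses below are hypothetical values in mm, not the study's measurements.

```python
# Canal transportation = (a1 - a2) - (b1 - b2), where a and b are the shortest
# mesial and distal dentin thicknesses before (1) and after (2) instrumentation.
# The centering ratio divides the smaller of the two differences by the larger.
a1, a2 = 1.20, 1.05   # mesial dentin before / after instrumentation (mm)
b1, b2 = 1.10, 1.00   # distal dentin before / after instrumentation (mm)

transportation = (a1 - a2) - (b1 - b2)
d_mesial, d_distal = a1 - a2, b1 - b2
centering = min(d_mesial, d_distal) / max(d_mesial, d_distal)
print(round(transportation, 3), round(centering, 3))
```

A transportation of 0 and a centering ratio of 1 would mean the file removed dentin perfectly symmetrically, i.e. fully respected the original canal anatomy.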

Keywords: OneShape, WaveOne, transportation, centering ability, dentin thickness, CBCT (Cone Beam Computed Tomography)

Procedia PDF Downloads 173
7220 Analysis of the Statistical Characterization of Significant Wave Data Exceedances for Designing Offshore Structures

Authors: Rui Teixeira, Alan O’Connor, Maria Nogal

Abstract:

The statistical theory of extreme events is a topic of growing interest in all fields of science and engineering. The economic and environmental changes the world is currently experiencing emphasize the importance of dealing with extreme occurrences with improved accuracy. When it comes to the design of offshore structures, particularly offshore wind turbines, efficiently characterizing extreme events is of major relevance. Extreme events are commonly characterized by extreme value theory. As an alternative, accurate modeling of the tails of statistical distributions and characterization of low-occurrence events can be achieved by applying the Peak-Over-Threshold (POT) methodology. The POT methodology allows a more refined fit of the statistical distribution by truncating the data at a predefined threshold u. The Generalised Pareto distribution is widely used to approximate the tail of the empirical statistical distribution, although in the case of exceedances of significant wave data (H_s) the two-parameter Weibull distribution and the Exponential distribution, the latter a specific case of the Generalised Pareto distribution, are frequently used as alternatives. Despite the practical cases where it is applied, the Generalised Pareto is not universally recognized as the adequate model for exceedances over a threshold u, and references that treat it as a secondary solution for significant wave data can be identified in the literature. In this framework, the current study tackles the discussion of the application of statistical models to characterize exceedances of wave data. A comparison of the Generalised Pareto, the two-parameter Weibull, and the Exponential distribution is presented for different values of the threshold u.
Real wave data obtained from four buoys along the Irish coast were used in the comparative analysis. Results show that the application of statistical distributions to characterize significant wave data needs to be addressed carefully: in each particular case, one of the statistical models mentioned fits the data better than the others, and different results are obtained depending on the value of the threshold u. Other variables of the fit, such as the number of points and the estimation of the model parameters, are analyzed and the respective conclusions drawn. Some guidelines on the application of the POT method are presented. Modeling the tail of the distributions proves to be, for the present case, a highly non-linear task and, given its growing importance, should be addressed carefully for an efficient estimation of very low-occurrence events.
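The distribution comparison described above can be sketched numerically. The snippet below is an illustration assuming SciPy and a synthetic wave-height record (not the study's buoy data): it extracts the exceedances over a threshold u and ranks the three candidate distributions by log-likelihood:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic stand-in for a significant wave height record H_s.
hs = 2.0 * rng.weibull(1.5, size=5000)

u = np.quantile(hs, 0.95)            # predefined threshold u
exc = hs[hs > u] - u                 # peaks over threshold

# Candidate models for the exceedances, with location fixed at zero.
dists = {"Generalised Pareto": stats.genpareto,
         "2-parameter Weibull": stats.weibull_min,
         "Exponential": stats.expon}

loglik = {}
for name, dist in dists.items():
    params = dist.fit(exc, floc=0)   # maximum-likelihood fit
    loglik[name] = np.sum(dist.logpdf(exc, *params))

best = max(loglik, key=loglik.get)
print(best, {k: round(v, 1) for k, v in loglik.items()})
```

As the abstract notes, which model ranks best depends on the data and on the chosen threshold; rerunning the comparison over a grid of u values reproduces the kind of sensitivity analysis the study performs.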

Keywords: extreme events, offshore structures, peak-over-threshold, significant wave data

Procedia PDF Downloads 225
7219 Tracing Sources of Sediment in an Arid River, Southern Iran

Authors: Hesam Gholami

Abstract:

Elevated suspended sediment loads in riverine systems, resulting from accelerated erosion due to human activities, are a serious threat to the sustainable management of watersheds and the ecosystem services therein worldwide. Therefore, mitigation of deleterious sediment effects as a distributed or non-point pollution source in catchments requires reliable provenance information. Sediment tracing or sediment fingerprinting, a combined process consisting of sampling, laboratory measurements, different statistical tests, and the application of mixing or unmixing models, is a useful technique for discriminating the sources of sediments. From 1996 to the present, different aspects of this technique, such as grouping the sources (spatial and individual sources), discriminating the potential sources by different statistical techniques, and modification of mixing and unmixing models, have been introduced and modified by many researchers worldwide, and have been applied to identify the provenance of fine materials in agricultural, rural, mountainous, and coastal catchments, and in large catchments with numerous lakes and reservoirs. In the last two decades, efforts exploring the uncertainties associated with sediment fingerprinting results have attracted increasing attention. The frameworks used to quantify the uncertainty associated with fingerprinting estimates can be divided into three groups: Monte Carlo simulation, Bayesian approaches, and generalized likelihood uncertainty estimation (GLUE). Given the above background, the primary goal of this study was to apply geochemical fingerprinting within the GLUE framework to estimate sub-basin spatial sediment source contributions in the arid Mehran River catchment in southern Iran, which drains into the Persian Gulf.
The accuracy of GLUE predictions generated using four different sets of statistical tests for discriminating three sub-basin spatial sources was evaluated with 10 virtual sediment (VS) samples of known source contributions, using the root mean square error (RMSE) and mean absolute error (MAE). Based on the results, the contributions modeled by GLUE for the western, central, and eastern sub-basins are 1-42% (overall mean 20%), 0.5-30% (overall mean 12%), and 55-84% (overall mean 68%), respectively. According to the mean absolute fit (MAF; ≥ 95% for all target sediment samples) and goodness-of-fit (GOF; ≥ 99% for all samples), the suggested modeling approach is an accurate technique for quantifying the sources of sediment in catchments. Overall, the estimated source proportions can help watershed engineers plan targeted conservation programs for soil and water resources.
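The GLUE mechanics applied to a three-source mixing model can be sketched as below. The tracer concentrations, "measured" mixture, and acceptance tolerance are hypothetical and only illustrate the sampling-and-screening idea, not the Mehran catchment data or the study's exact goodness-of-fit measure:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical mean tracer concentrations for three sub-basin sources
# (rows: west, central, east; columns: three geochemical tracers).
sources = np.array([[12.0, 3.0, 40.0],
                    [ 8.0, 5.0, 25.0],
                    [20.0, 1.5, 60.0]])
true_p = np.array([0.2, 0.1, 0.7])
target = true_p @ sources                # "measured" sediment mixture

# GLUE: sample candidate contribution vectors uniformly from the simplex,
# score each with a goodness-of-fit, and keep the behavioural sets.
cand = rng.dirichlet(np.ones(3), size=20000)
pred = cand @ sources
gof = 1.0 - np.abs(pred - target).sum(axis=1) / target.sum()
behavioural = cand[gof > 0.95]

# Report the contribution ranges spanned by the behavioural sets,
# analogous to the min-mean-max ranges quoted in the abstract.
low, high = behavioural.min(axis=0), behavioural.max(axis=0)
mean = behavioural.mean(axis=0)
print(mean.round(2), low.round(2), high.round(2))
```

Reporting a range rather than a single vector is the point of GLUE: every candidate whose fit clears the behavioural threshold is retained as a plausible source apportionment.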

Keywords: sediment source tracing, generalized likelihood uncertainty estimation, virtual sediment mixtures, Iran

Procedia PDF Downloads 43
7218 A Method of Detecting the Difference in Two States of Brain Using Statistical Analysis of EEG Raw Data

Authors: Digvijaysingh S. Bana, Kiran R. Trivedi

Abstract:

This paper introduces methods based on the alpha wave for detecting the difference between two states of the brain. One healthy subject participated in the experiment. EEG was measured on the forehead above the eye (FP1 position), with the reference and ground electrodes on the ear clip. The data samples were obtained in the form of EEG raw data, with a reading duration of one minute. Various tests were performed on the alpha-band EEG raw data. Readings were taken at different times throughout the day, and statistical analysis was carried out on the EEG sample data in the form of various tests.
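The kind of analysis described can be illustrated as follows. This sketch assumes SciPy and uses synthetic signals rather than the study's recordings; the 256 Hz sampling rate and the two state labels are assumptions. It band-passes a raw trace to the alpha band (8-13 Hz) and compares alpha power between two states with a t-test:

```python
import numpy as np
from scipy import signal, stats

fs = 256                                  # sampling rate in Hz (assumed)
rng = np.random.default_rng(2)

def alpha_power(eeg, fs):
    # Band-pass the raw trace to the alpha band (8-13 Hz) and
    # return the mean power of the filtered signal.
    sos = signal.butter(4, [8, 13], btype="bandpass", fs=fs, output="sos")
    return np.mean(signal.sosfiltfilt(sos, eeg) ** 2)

# Two sets of synthetic one-minute "FP1" readings: one brain state with a
# strong 10 Hz (alpha) component, one with noise only. Illustrative only.
t = np.arange(60 * fs) / fs
state_a = [alpha_power(np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1, t.size), fs)
           for _ in range(10)]
state_b = [alpha_power(rng.normal(0, 1, t.size), fs) for _ in range(10)]

# A two-sample t-test on alpha power separates the two states.
t_stat, p_val = stats.ttest_ind(state_a, state_b)
print(p_val < 0.05)
```

The same pattern (filter to a band, compute a summary statistic per reading, then apply a hypothesis test across readings) carries over to other bands and other test statistics.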

Keywords: electroencephalogram (EEG), biometrics, authentication, EEG raw data

Procedia PDF Downloads 438
7217 Revolutionary Solutions for Modeling and Visualization of Complex Software Systems

Authors: Jay Xiong, Li Lin

Abstract:

Existing software modeling and visualization approaches using UML are outdated: they are outcomes of reductionism and the superposition principle, which holds that the whole of a system is the sum of its parts, so that all tasks of software modeling and visualization are performed linearly, partially, and locally. This paper introduces revolutionary solutions for the modeling and visualization of complex software systems, which make such systems much easier to understand, test, and maintain. The solutions are based on complexity science, offering holistic, automatic, dynamic, virtual, and executable approaches about a thousand times more efficient than the traditional ones.

Keywords: complex systems, software maintenance, software modeling, software visualization

Procedia PDF Downloads 372
7216 Impact of Climate on Sugarcane Yield over Belagavi District, Karnataka, Using a Statistical Model

Authors: Girish Chavadappanavar

Abstract:

The impact of climate on agriculture could result in problems with food security and may threaten the livelihood activities upon which much of the population depends. In the present study, a statistical yield forecast model was developed for sugarcane production over Belagavi district, Karnataka, using weather variables of the crop-growing season and past observed yield data for the period 1971 to 2010. The study shows that this type of statistical yield forecast model can forecast sugarcane yield 5 weeks, and even 10 weeks, in advance of the harvest within an acceptable limit of error. The performance of the model in predicting yields at the district level is quite satisfactory for both validation (2007 and 2008) and forecasting (2009 and 2010). In addition, the climate variability of the area has also been studied; the data series were tested with the Mann-Kendall rank statistical test. The maximum and minimum temperatures were found to be significant, with opposite trends (a decreasing trend in maximum and an increasing trend in minimum temperature), while the other three variables were found to be insignificant, with differing trends (rainfall and evening relative humidity increasing, morning relative humidity decreasing).
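The Mann-Kendall trend test mentioned above can be sketched as follows. This is a simplified implementation without tie correction, applied to a synthetic minimum-temperature series standing in for the district's 1971-2010 data:

```python
import numpy as np
from scipy import stats

def mann_kendall(x):
    # Mann-Kendall trend test: S statistic, normal-approximation Z,
    # and a two-sided p-value (no tie correction in this sketch).
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    p = 2 * (1 - stats.norm.cdf(abs(z)))
    return s, z, p

rng = np.random.default_rng(3)
years = np.arange(1971, 2011)
# Synthetic minimum-temperature series with a slight warming trend.
tmin = 20 + 0.05 * (years - 1971) + rng.normal(0, 0.3, years.size)
s, z, p = mann_kendall(tmin)
print(s > 0, p < 0.05)
```

A positive S with a small p-value indicates a significant increasing trend, which is the pattern the abstract reports for minimum temperature.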

Keywords: climate impact, regression analysis, yield and forecast model, sugar models

Procedia PDF Downloads 34
7215 Statistical Scientific Investigation of Popular Cultural Heritage in the Relationship between Astronomy and Weather Conditions in the State of Kuwait

Authors: Ahmed M. AlHasem

Abstract:

Kuwaiti society has long been aware of climatic changes and their annual dates and has tried to link them to astronomy in an attempt to forecast future weather conditions. The reason for this concern is that many of the economic, social, and livelihood activities of the society depend deeply on the nature of the weather conditions, directly and indirectly. In other words, Kuwaiti society, like many human societies, has in the past tried to predict climatic conditions by linking them to astronomy or to popular statements indicating the timing of climate changes. Accordingly, this study was devoted to a scientific investigation, based on statistical analysis of climatic data, of the accuracy and compatibility of some of the most important elements of this cultural heritage in relation to climate change, relating them scientifically to precise climatic measurements spanning decades. The research is divided into 10 topics, each focused on one legacy, whether it links climate changes to the appearance or disappearance of a star or is a popular statement inherited through generations; each topic explains the nature and timing of the legacy and then applies statistical analysis to official climatic data since 1962 to indicate its degree of accuracy. The study concludes that the relationship between the popular heritage and the actual climatic data is weak and, in some cases, non-existent. Therefore, the popular heritage does not provide a dependable relationship or a reliable scientific basis for forecasting weather conditions.

Keywords: astronomy, cultural heritage, statistical analysis, weather prediction

Procedia PDF Downloads 91
7214 The Use of Different Methodological Approaches to Teaching Mathematics at Secondary Level

Authors: M. Rodionov, N. Sharapova, Z. Dedovets

Abstract:

The article describes methods of preparing future teachers that include the entire diversity of traditional and computer-oriented methodological approaches. The authors reveal how, in a specific educational environment, a teacher can choose the most effective combination of educational technologies based on the nature of the learning task. The key conditions that determine such a choice are that the methodological approach corresponds to the specificity of the problem being solved and that it is responsive to the individual characteristics of the students. The article refers to the training of students in the proper use of mathematical electronic tools for educational purposes. The preparation of future mathematics teachers should be a step-by-step process, building on specific examples. At the first stage, students optimally solve problems aided by electronic means of teaching. At the second stage, the main emphasis is on modeling lessons. At the third stage, students develop and implement strategies in the study of one of the topics within a school mathematics curriculum. The article also recommends implementing this strategy in the preparation of future teachers and outlines its possible benefits.

Keywords: education, methodological approaches, teacher, secondary school

Procedia PDF Downloads 142
7213 Impact of Gaming Environment in Education

Authors: Md. Ataur Rahman Bhuiyan, Quazi Mahabubul Hasan, Md. Rifat Ullah

Abstract:

In this research, we explored the effectiveness of a gaming environment in education and compared it with the traditional education system. We conducted several workshops in both learning environments and measured students’ performance through grading scores assigned by professional academics on their attitudes across different criteria. We also collected data from survey questionnaires to understand students’ experiences of education and study. Finally, we examined the impact of the different learning environments by applying statistical hypothesis tests: the t-test and the ANOVA test.
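The hypothesis tests mentioned can be illustrated as follows, with synthetic scores standing in for the workshop grades (which are not reproduced here); the group labels and score distributions are assumptions for the sake of the example:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# Synthetic grading scores (out of 100) for the two environments.
traditional = rng.normal(68, 6, 30)      # traditional environment
gamified = rng.normal(78, 6, 30)         # gaming environment

# Two-sample t-test: do mean scores differ between the two environments?
t_stat, p_t = stats.ttest_ind(gamified, traditional)

# One-way ANOVA across three hypothetical grading criteria.
attitude = rng.normal(75, 6, 30)
participation = rng.normal(70, 6, 30)
comprehension = rng.normal(72, 6, 30)
f_stat, p_f = stats.f_oneway(attitude, participation, comprehension)

print(p_t < 0.05, 0.0 <= p_f <= 1.0)
```

The t-test addresses the two-group comparison (gamified vs. traditional), while ANOVA handles comparisons across more than two groups in a single test.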

Keywords: gamification, game-based learning, education, statistical analysis, human-computer interaction

Procedia PDF Downloads 188