Search results for: conventional statistical methods
20383 Implant Guided Surgery and Immediate Loading
Authors: Omid Tavakol, Mahnaz Gholami
Abstract:
Introduction: The main goal of this oral presentation is to discuss immediate loading in dental implants, from treatment planning and surgical guide design to delivery, follow-up, and occlusal considerations. Methods and materials: First, systematic reviews on immediate loading will be considered. In addition, a comparison will be made between immediate loading and conventional loading in terms of success rate and complications. After that, the different methods, prosthetic options, and materials best used in immediate loading will be explained, in particular multi-unit abutments and their mechanism of function. Digital impressions and the design of temporaries are the next topics to be explicated, followed by the differences between single-unit, multiple-unit, and full-arch implantation in immediate loading. Methods for tissue engineering and papilla formation after extraction will then be described. The last slides present a full-mouth rehabilitation via the immediate loading technique, from surgical design to follow-up. Finally, potential complications will be discussed: how to prevent their occurrence and what to do when they arise.
Keywords: guided surgery, digital implantology, immediate loading, digital dentistry
Procedia PDF Downloads 43
20382 Role of DatScan in the Diagnosis of Parkinson's Disease
Authors: Shraddha Gopal, Jayam Lazarus
Abstract:
Aims: To study the referral practice and the impact of DAT scan in the diagnosis or exclusion of Parkinson's disease. Settings and design: A retrospective study. Materials and methods: A retrospective study of the results of 60 patients referred for a DAT scan over a period of 2 years from the Department of Neurology at Northern Lincolnshire and Goole NHS Trust. The reason for DAT scan referral was noted under 5 categories weighed against Parkinson's disease: drug-induced Parkinsonism, essential tremor, diagnostic dilemma, not responding to Parkinson's treatment, and others. We assessed the number of patients diagnosed with Parkinson's disease against the number in whom Parkinson's disease was excluded or an alternative diagnosis was made. Statistical methods: Microsoft Excel was used for data collection and statistical analysis. Results: 30 of the 60 scans were performed to confirm a diagnosis of early Parkinson's disease, 13 to differentiate essential tremor from Parkinsonism, 6 to exclude drug-induced Parkinsonism, 5 to look for an alternative diagnosis in patients not responding to anti-Parkinson medication, and 6 indications were outside the recommended guidelines. 55% of cases had a diagnosis of Parkinson's disease confirmed, and 43.33% had Parkinson's disease excluded. 33 of the 60 scans showed bilateral abnormalities, confirming the clinical diagnosis of Parkinson's disease. Conclusion: The DAT scan provides valuable information, confirming Parkinson's disease in 55% of patients and excluding the diagnosis in 43.33%, aiding an alternative diagnosis.
Keywords: DAT scan, Parkinson's disease, diagnosis, essential tremors
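The headline percentages can be reproduced directly from the reported counts; in this minimal sketch, the total of 60 scans and the 33 confirmed cases come from the abstract, while the 26 excluded cases are derived from the reported 43.33%:

```python
# Reproduce the reported DAT scan outcome percentages from raw counts.
# 60 scans in total; 33 confirmed Parkinson's disease (the bilateral
# abnormalities); 26 is inferred from the reported 43.33% exclusion rate.
total_scans = 60
confirmed = 33   # scans confirming Parkinson's disease
excluded = 26    # Parkinson's disease excluded / alternative diagnosis

pct_confirmed = 100 * confirmed / total_scans
pct_excluded = 100 * excluded / total_scans

print(f"Confirmed: {pct_confirmed:.2f}%")   # 55.00%
print(f"Excluded:  {pct_excluded:.2f}%")    # 43.33%
```

The one remaining scan (60 - 33 - 26) accounts for the percentages not summing to 100%.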
Procedia PDF Downloads 230
20381 From Theory to Practice: Harnessing Mathematical and Statistical Sciences in Data Analytics
Authors: Zahid Ullah, Atlas Khan
Abstract:
The rapid growth of data in diverse domains has created an urgent need for effective utilization of mathematical and statistical sciences in data analytics. This abstract explores the journey from theory to practice, emphasizing the importance of harnessing mathematical and statistical innovations to unlock the full potential of data analytics. Drawing on a comprehensive review of existing literature and research, this study investigates the fundamental theories and principles underpinning mathematical and statistical sciences in the context of data analytics. It delves into key mathematical concepts such as optimization, probability theory, statistical modeling, and machine learning algorithms, highlighting their significance in analyzing and extracting insights from complex datasets. Moreover, this abstract sheds light on the practical applications of mathematical and statistical sciences in real-world data analytics scenarios. Through case studies and examples, it showcases how mathematical and statistical innovations are being applied to tackle challenges in various fields such as finance, healthcare, marketing, and social sciences. These applications demonstrate the transformative power of mathematical and statistical sciences in data-driven decision-making. The abstract also emphasizes the importance of interdisciplinary collaboration, as it recognizes the synergy between mathematical and statistical sciences and other domains such as computer science, information technology, and domain-specific knowledge. Collaborative efforts enable the development of innovative methodologies and tools that bridge the gap between theory and practice, ultimately enhancing the effectiveness of data analytics. Furthermore, ethical considerations surrounding data analytics, including privacy, bias, and fairness, are addressed within the abstract. 
It underscores the need for responsible and transparent practices in data analytics and highlights the role of mathematical and statistical sciences in ensuring ethical data handling and analysis. In conclusion, this abstract traces the journey from theory to practice in harnessing mathematical and statistical sciences in data analytics, showcasing their practical applications, the importance of interdisciplinary collaboration, and the need for ethical safeguards. By bridging the gap between theory and practice, mathematical and statistical sciences help unlock the full potential of data analytics, empowering organizations and decision-makers with valuable insights for informed decision-making.
Keywords: data analytics, mathematical sciences, optimization, machine learning, interdisciplinary collaboration, practical applications
Procedia PDF Downloads 93
20380 Parents' Preferences about Technology-Based Therapy for Children and Young People on the Autism Spectrum – a UK Survey
Authors: Athanasia Kouroupa, Karen Irvine, Sivana Mengoni, Shivani Sharma
Abstract:
Exploring parents’ preferences towards technology-based interventions for children on the autism spectrum can inform future research and support technology design. The study aimed to provide a comprehensive description of parents’ knowledge and preferences about innovative technology to support children on the autism spectrum. Survey data were collected from parents (n = 267) internationally. The survey included information about the use of conventional (e.g., smartphone, iPod, tablets) and non-conventional (e.g., virtual reality, robot) technologies. Parents appeared to prefer conventional technologies such as tablets and dislike non-conventional ones. They highlighted the positive contribution technology brought to the children’s lives during the pandemic. A few parents were equally concerned that the compulsory introduction of technology during the pandemic was associated with elongated time on devices. The data suggested that technology-based interventions are not widely known, need to be financially approachable and achieve a high standard of design to engage users.
Keywords: autism, intervention, preferences, technology
Procedia PDF Downloads 131
20379 Heavy Vehicle Traffic Estimation Using Automatic Traffic Recorders/Weigh-In-Motion Data: Current Practice and Proposed Methods
Authors: Muhammad Faizan Rehman Qureshi, Ahmed Al-Kaisy
Abstract:
Accurate estimation of traffic loads is critical for pavement and bridge design, among other transportation applications. Given the disproportionate impact of heavier axle loads on pavement and bridge structures, truck and heavy vehicle traffic is expected to be a major determinant of traffic load estimation. Further, heavy vehicle traffic is also a major input in transportation planning and economic studies. The traditional method for estimating heavy vehicle traffic relies primarily on AADT estimation using Monthly Day-of-the-Week (MDOW) adjustment factors, together with the percent of heavy vehicles observed using statewide data collection programs. The MDOW factors are developed from daily and seasonal (or monthly) variation patterns for total traffic, which consists predominantly of passenger cars and other smaller vehicles. Therefore, while these factors may yield reasonable estimates for total traffic (AADT), such estimates may involve a great deal of approximation when applied to heavy vehicle traffic. This research assesses the approximation involved in estimating heavy vehicle traffic using MDOW adjustment factors for total traffic (the conventional approach), along with three other methods: MDOW adjustment factors for total trucks (classes 5-13), for combination-unit trucks (classes 8-13), and adjustment factors for each vehicle class separately. Results clearly indicate that the conventional method is outperformed by the other three methods by a large margin. Further, using the most detailed and data-intensive method (class-specific adjustment factors) does not necessarily yield a more accurate estimation of heavy vehicle traffic.
Keywords: traffic loads, heavy vehicles, truck traffic, adjustment factors, traffic data collection
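The adjustment-factor arithmetic behind the comparison above can be sketched in a few lines. All factor values and the short-duration count below are illustrative assumptions, not values from the study:

```python
# Sketch of AADT estimation from a short-duration count using month and
# day-of-week (MDOW) adjustment factors, as discussed in the abstract.
# Every number here is an illustrative assumption.

def estimate_aadt(daily_count, month, dow, factors_month, factors_dow):
    """Expand a single daily count to an annual-average estimate."""
    return daily_count * factors_month[month] * factors_dow[dow]

heavy_count = 480  # trucks counted on a Wednesday in July (hypothetical)

# Conventional method: factors derived from TOTAL traffic patterns.
month_factor = {"July": 0.85, "January": 1.20}     # summer totals run high
dow_factor = {"Wednesday": 1.02, "Sunday": 1.30}
aadt_conventional = estimate_aadt(heavy_count, "July", "Wednesday",
                                  month_factor, dow_factor)

# Class-specific method: factors derived from TRUCK traffic, which varies
# less seasonally and quite differently by day of week than car traffic.
truck_month_factor = {"July": 0.95, "January": 1.05}
truck_dow_factor = {"Wednesday": 0.98, "Sunday": 1.60}
aadt_truck_specific = estimate_aadt(heavy_count, "July", "Wednesday",
                                    truck_month_factor, truck_dow_factor)

print(round(aadt_conventional))    # estimate using total-traffic factors
print(round(aadt_truck_specific))  # estimate using truck-specific factors
```

The gap between the two estimates for the same raw count is exactly the approximation the study quantifies.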
Procedia PDF Downloads 21
20378 A Robust Frequency Offset Estimator for Orthogonal Frequency Division Multiplexing
Authors: Keunhong Chae, Seokho Yoon
Abstract:
We address integer frequency offset (IFO) estimation under the influence of a timing offset (TO) in orthogonal frequency division multiplexing (OFDM) systems. Incorporating the IFO and TO into the symbol set used to represent the received OFDM symbol, we investigate the influence of the TO on the IFO and then propose a combining method between two consecutive OFDM correlations that reduces this influence. The proposed scheme has almost the same complexity as the conventional schemes, yet, unlike them, it does not require knowledge of the TO. Numerical results confirm that the proposed scheme is insensitive to the TO and consequently yields an improvement in IFO estimation performance over the conventional schemes when a TO exists.
Keywords: estimation, integer frequency offset, OFDM, timing offset
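The underlying correlation idea can be illustrated with a minimal sketch: an integer frequency offset cyclically shifts the frequency-domain symbol, so a bank of shifted correlations against a known training symbol recovers it. This is the textbook single-correlation estimator, not the paper's two-symbol combining scheme, and all parameters (64 subcarriers, QPSK training, noiseless channel) are assumptions:

```python
# Minimal sketch of integer frequency offset (IFO) estimation in OFDM.
# An IFO of d subcarriers cyclically shifts the frequency-domain symbol;
# the estimator correlates the received symbol against shifted replicas
# of the known training symbol and picks the shift with maximum magnitude.
import random

random.seed(1)
N = 64  # number of subcarriers (assumption)

# Known training symbol: random QPSK value on each subcarrier.
X = [complex(random.choice((-1, 1)), random.choice((-1, 1))) for _ in range(N)]

true_ifo = 5                                    # offset to be recovered
Y = [X[(k - true_ifo) % N] for k in range(N)]   # received symbol (noiseless)

def estimate_ifo(received, reference, max_offset=10):
    """Return the candidate shift maximizing the correlation magnitude."""
    best_d, best_metric = 0, -1.0
    for d in range(-max_offset, max_offset + 1):
        corr = sum(received[k] * reference[(k - d) % N].conjugate()
                   for k in range(N))
        if abs(corr) > best_metric:
            best_metric, best_d = abs(corr), d
    return best_d

print(estimate_ifo(Y, X))   # -> 5
```

A timing offset would add a phase ramp across subcarriers and degrade this single correlation, which is the weakness the paper's consecutive-symbol combining is designed to remove.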
Procedia PDF Downloads 472
20377 Comparative Study of Stone Column with and without Encasement Using Waste Aggregate
Authors: V. K. Stalin, V. Paneerselvam, M. Bharath, M. Kirithika
Abstract:
In developing countries like India, rapid urbanization produces large amounts of waste materials every year. These waste materials can be utilized in the improvement of problematic soils. The stone column is one of the best methods to improve soft clay deposits. In this study, load tests were conducted to assess the suitability of waste as a column material. The variable parameters studied are material, number of columns, and encasement. The materials used for the study are stone aggregate, copper slag, and construction waste, for one, two, and three columns, with geotextile and geogrid encasement. It was found that the performance of waste as a column material is comparable to that of the conventional stone column, with and without encasement. Hence, it is concluded that copper slag and construction waste may be used as column materials in place of conventional stone aggregate to improve soft clay, the advantage being the utilization of waste.
Keywords: stone column, geocomposite, construction waste, copper slag
Procedia PDF Downloads 378
20376 Data Mining in Medicine Domain Using Decision Trees and Support Vector Machine
Authors: Djamila Benhaddouche, Abdelkader Benyettou
Abstract:
In this paper, we used data mining to extract biomedical knowledge. In general, complex biomedical data collected in population studies are treated by statistical methods; although these are robust, they are not sufficient in themselves to harness the potential wealth of the data. To that end, two supervised learning algorithms were used: Decision Trees and the Support Vector Machine (SVM). These supervised classification methods are used to diagnose thyroid disease. In this context, we propose to promote the study and use of symbolic data mining techniques.
Keywords: biomedical data, learning, classifier, decision tree algorithms, knowledge extraction
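The supervised-classification idea can be shown with a toy, self-contained example: a one-level decision tree (a decision stump) learning a threshold on a single synthetic feature standing in for a thyroid-hormone measurement. The data are fabricated; the study itself used full decision trees and SVMs on real biomedical features:

```python
# Toy supervised classifier for a diagnosis task, in the spirit of the
# decision-tree approach above. A decision stump learns the threshold
# on one synthetic "hormone level" feature that best separates labels.

# (feature value, label): 1 = disease, 0 = healthy  (synthetic data)
train = [(0.5, 0), (0.8, 0), (1.1, 0), (1.3, 0),
         (2.4, 1), (2.9, 1), (3.1, 1), (3.6, 1)]

def fit_stump(data):
    """Pick the threshold (among feature values) maximizing accuracy."""
    best_t, best_acc = None, -1.0
    for t, _ in data:
        acc = sum((x > t) == bool(y) for x, y in data) / len(data)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

threshold = fit_stump(train)

def predict(x):
    return int(x > threshold)

print(predict(0.9), predict(3.0))  # -> 0 1
```

An SVM would instead choose the separating boundary with maximum margin between the two classes; on this one-dimensional toy set both approaches pick a cut between 1.3 and 2.4.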
Procedia PDF Downloads 556
20375 A Cross-Gender Statistical Analysis of Tuvinian Intonation Features in Comparison With Uzbek and Azerbaijani
Authors: Daria Beziakina, Elena Bulgakova
Abstract:
The paper deals with cross-gender and cross-linguistic comparison of pitch characteristics for Tuvinian with two other Turkic languages - Uzbek and Azerbaijani, based on the results of statistical analysis of pitch parameter values and intonation patterns used by male and female speakers. The main goal of our work is to obtain the ranges of pitch parameter values typical for Tuvinian speakers for the purpose of automatic language identification. We also propose a cross-gender analysis of declarative intonation in the poorly studied Tuvinian language. The ranges of pitch parameter values were obtained by means of specially developed software that deals with the distribution of pitch values and allows us to obtain statistical language-specific pitch intervals.
Keywords: speech analysis, statistical analysis, speaker recognition, identification of person
Procedia PDF Downloads 347
20374 The Effect of Enamel Surface Preparation on the Self-Etch Bonding of Orthodontic Tubes: An in Vitro Study
Authors: Fernandes A. C. B. C. J., de Jesus V. C., Sepideh N., Vilela OFGG, Somarin K. K., França R., Pinheiro F. H. S. L.
Abstract:
Objective: The purpose of this study was to evaluate the effect of pre-treatment of enamel with pumice and/or 37% phosphoric acid on the shear bond strength (SBS) of orthodontic tubes bonded to enamel, while simultaneously evaluating the efficacy of orthodontic tubes bonded with a self-etch primer (SEP). Materials and Methods: 39 crown halves were divided into 3 groups at random. Group I was the control group, utilizing both prophy paste and the conventional double-etching pre-treatment method. Group II excluded the use of prophy paste prior to double etching. Group III excluded the use of both prophy paste and double etching and utilized only SEP. The bond strength of the orthodontic tubes was measured by SBS. One-way ANOVA and Tukey's HSD test were used to compare SBS values between the three groups. Statistical significance was set at p < 0.05. Results: The differences in SBS values between groups I (36.672 ± 9.315 MPa), II (34.242 ± 9.986 MPa), and III (39.055 ± 5.565 MPa) were not statistically significant (p > 0.05). Conclusion: This study suggested that the use of prophy paste or a pre-acid etch of the enamel surface did not produce a statistically significant difference in SBS between the three groups.
Keywords: shear bond strength, orthodontic bracket, self-etch primer, pumice, prophy
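The group comparison above uses a one-way ANOVA, which can be computed from scratch in a few lines. The three SBS-like samples below are invented stand-ins chosen so the arithmetic is easy to follow; with these numbers the F statistic falls well below the 5% critical value, mirroring the study's non-significant finding:

```python
# From-scratch one-way ANOVA, the test used above to compare the three
# bonding groups. The shear-bond-strength values (MPa) are invented.
groups = {
    "I (pumice + double etch)": [34.0, 38.0, 36.0],
    "II (double etch only)":    [33.0, 39.0, 35.0],
    "III (SEP only)":           [35.0, 40.0, 37.0],
}

data = [x for g in groups.values() for x in g]
grand_mean = sum(data) / len(data)

# Between-group and within-group sums of squares.
ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
                 for g in groups.values())
ss_within = sum((x - sum(g) / len(g)) ** 2
                for g in groups.values() for x in g)

df_between = len(groups) - 1        # 2
df_within = len(data) - len(groups) # 6
f_stat = (ss_between / df_between) / (ss_within / df_within)
print(f"F = {f_stat:.3f}")  # far below F_crit(2, 6) of about 5.14 at 0.05
```

A small F means the between-group spread is small relative to the within-group scatter, so the null hypothesis of equal means is not rejected; Tukey's HSD would then not be needed.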
Procedia PDF Downloads 177
20373 Statistical Comparison of Machine and Manual Translation: A Corpus-Based Study of Gone with the Wind
Authors: Yanmeng Liu
Abstract:
This article analyzes and compares the linguistic differences between machine translation and manual translation through a case study of the book Gone with the Wind. As an important carrier of human feeling and thinking, literary translation poses a huge difficulty for machine translation, which is therefore expected to exhibit distinct translation features compared with manual translation. In order to display linguistic features objectively, this study applies computerized, statistical, and quantitative methods to the systematic investigation of large-scale translation corpora. The study compiles a bilingual corpus with four Chinese translations of Gone with the Wind: Piao by Chunhai Fan, Piao by Huairen Huang, and translations by Google Translate and Baidu Translate. After processing the corpus with tools such as Stanford Segmenter, Stanford POS Tagger, and AntConc, the study analyzes the linguistic data and answers the following questions: 1. How does machine translation differ linguistically from manual translation? 2. Why do these deviances happen? This paper combines translation studies with the knowledge of corpus linguistics and concretizes divergent linguistic dimensions in translated text analysis in order to present linguistic deviances in manual and machine translation. Consequently, this study provides a more accurate and fine-grained understanding of machine translation products, and it proposes several suggestions for future machine translation development.
Keywords: corpus-based analysis, linguistic deviances, machine translation, statistical evidence
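Two of the simplest corpus statistics used in this kind of comparison, mean sentence length and type-token ratio, can be sketched as follows. The two snippets are invented English stand-ins; the study itself compared Chinese translations processed with Stanford Segmenter and AntConc:

```python
# Sketch of a corpus-based comparison between a manual and a machine
# translation via two lexical statistics: mean sentence length and
# type-token ratio (a crude measure of lexical variety).

manual = "She walked slowly home. The wind whispered through the trees."
machine = "She walked home. The wind blew through the trees."

def corpus_stats(text):
    sentences = [s for s in text.split(".") if s.strip()]
    tokens = text.lower().replace(".", "").split()
    types = set(tokens)
    return {
        "mean_sentence_len": len(tokens) / len(sentences),
        "type_token_ratio": len(types) / len(tokens),
    }

for name, text in (("manual", manual), ("machine", machine)):
    stats = corpus_stats(text)
    print(name, round(stats["mean_sentence_len"], 2),
          round(stats["type_token_ratio"], 2))
```

A real pipeline would tokenize with a proper segmenter (essential for Chinese) and compute many more dimensions, but the comparison logic is the same: identical statistics computed over each version of the corpus, then contrasted.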
Procedia PDF Downloads 141
20372 Evaluation of the Mechanical Behavior of a Retaining Wall Structure on a Weathered Soil through Probabilistic Methods
Authors: P. V. S. Mascarenhas, B. C. P. Albuquerque, D. J. F. Campos, L. L. Almeida, V. R. Domingues, L. C. S. M. Ozelim
Abstract:
Retaining slope structures are increasingly considered in geotechnical engineering projects due to extensive urban growth. These kinds of engineering constructions may present instabilities over time and may require reinforcement or even rebuilding of the structure. In this context, statistical analysis is an important tool for decision making regarding retaining structures. This study examines the failure probability of constructing a retaining wall over the debris of an old, collapsed one. The new solution will extend approximately 350 m and will be located on the margins of Lake Paranoá in Brasília, the capital of Brazil. The building process must also account for the utilization of the ruins as a caisson. A series of in situ and laboratory experiments defined the local soil strength parameters, and a Standard Penetration Test (SPT) defined the in situ soil stratigraphy. The parameters obtained were also verified against soil data from a collection of master's and doctoral works from the University of Brasília on similar soils. Initial studies show that the concrete wall is the proper solution for this case, taking into account the technical, economic, and deterministic analyses. On the other hand, in order to better analyze the statistical significance of the factors of safety obtained, a Monte Carlo analysis was performed for the concrete wall and two other initial solutions. A comparison between the statistical and risk results generated for the different solutions indicated that a gabion solution would better fit the financial and technical feasibility of the project.
Keywords: economic analysis, probability of failure, retaining walls, statistical analysis
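The Monte Carlo step can be sketched with a deliberately simplified sliding check: soil parameters are drawn from assumed normal distributions, the factor of safety FS = resisting / driving is computed per draw, and the failure probability is the fraction of draws with FS < 1. Every distribution and the FS model itself are illustrative assumptions, not the study's:

```python
# Monte Carlo sketch of a retaining-wall failure probability, in the
# spirit of the analysis above. All distributions and the simplified
# sliding model are illustrative assumptions.
import random
import statistics

random.seed(42)
N_SIM = 20_000

def factor_of_safety(friction_coeff, cohesion_kpa, driving_force_kn):
    """Very simplified sliding check: resisting = friction + cohesion."""
    weight_kn = 500.0       # wall self-weight (assumed)
    base_area_m2 = 10.0     # base contact area (assumed)
    resisting = friction_coeff * weight_kn + cohesion_kpa * base_area_m2
    return resisting / driving_force_kn

failures = 0
fs_samples = []
for _ in range(N_SIM):
    mu = random.gauss(0.45, 0.08)      # base friction coefficient (assumed)
    c = random.gauss(12.0, 4.0)        # cohesion in kPa (assumed)
    drive = random.gauss(280.0, 30.0)  # driving force in kN (assumed)
    fs = factor_of_safety(mu, c, drive)
    fs_samples.append(fs)
    failures += fs < 1.0

p_f = failures / N_SIM
print(f"mean FS = {statistics.mean(fs_samples):.2f}, P_f = {p_f:.4f}")
```

The point the abstract makes is that a deterministic mean FS above 1 can coexist with a non-trivial failure probability once parameter scatter is accounted for, which is exactly what this toy run shows.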
Procedia PDF Downloads 405
20371 Integration of EEG and Motion Tracking Sensors for Objective Measure of Attention-Deficit Hyperactivity Disorder in Pre-Schoolers
Authors: Neha Bhattacharyya, Soumendra Singh, Amrita Banerjee, Ria Ghosh, Oindrila Sinha, Nairit Das, Rajkumar Gayen, Somya Subhra Pal, Sahely Ganguly, Tanmoy Dasgupta, Tanusree Dasgupta, Pulak Mondal, Aniruddha Adhikari, Sharmila Sarkar, Debasish Bhattacharyya, Asim Kumar Mallick, Om Prakash Singh, Samir Kumar Pal
Abstract:
Background: We aim to develop an integrated device comprising a single-probe EEG and CCD-based motion sensors for a more objective measure of Attention-Deficit Hyperactivity Disorder (ADHD). While the integrated device (MAHD) relies on the EEG signal (spectral density of the beta wave) for the assessment of attention during a given structured task (painting three segments of a circle using three different colors, namely red, green, and blue), the CCD sensor captures the movement patterns of the subjects engaged in a continuous performance task (CPT). A statistical analysis of the attention and movement patterns was performed, and the accuracy of the completed tasks was analysed using indigenously developed software. The device with the embedded software, called MAHD, is intended to improve certainty with criterion E (i.e., whether symptoms are better explained by another condition). Methods: We used the EEG signal from a single-channel dry sensor placed on the frontal lobe of the subjects (3-5-year-old pre-schoolers). During the painting of the three segments of the circle, the absolute power of the delta and beta EEG waves was found to correlate with the relaxation and attention/cognitive-load conditions, respectively. While the relaxation condition of the subject hints at hyperactivity, a more direct CCD-based motion sensor is used to track the physical movement of the subject engaged in a CPT, i.e., the separation of variously colored balls from one table to another. We used our indigenously developed software for the statistical analysis to derive a scale for the objective assessment of ADHD, and we compared our scale with conventional clinical ADHD evaluation. Results: In a limited clinical trial with preliminary statistical analysis, we found a significant correlation between the objective assessment of the ADHD subjects and the clinician's conventional evaluation.
Conclusion: MAHD, the integrated device, is intended as an auxiliary tool to improve the accuracy of ADHD diagnosis by supporting greater criterion E certainty.
Keywords: ADHD, CPT, EEG signal, motion sensor, psychometric test
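The EEG feature named above, band-limited spectral power, can be sketched with a plain DFT on a synthetic one-channel signal: beta-band (13-30 Hz) power relative to delta-band (0.5-4 Hz) power. The 128 Hz sampling rate, band edges, and signal composition are illustrative assumptions:

```python
# Sketch of the EEG feature used above: spectral power of the beta band
# versus the delta band, via a plain DFT on a synthetic signal.
import cmath
import math

FS = 128   # sampling rate in Hz (assumption)
N = 256    # two seconds of signal

# Synthetic "attentive" signal: strong 20 Hz (beta) plus weak 2 Hz (delta).
signal = [math.sin(2 * math.pi * 20 * n / FS)
          + 0.3 * math.sin(2 * math.pi * 2 * n / FS) for n in range(N)]

def band_power(x, f_lo, f_hi):
    """Sum |DFT bin|^2 over bins whose frequency lies in [f_lo, f_hi]."""
    n = len(x)
    power = 0.0
    for k in range(n // 2 + 1):
        freq = k * FS / n
        if f_lo <= freq <= f_hi:
            coeff = sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                        for t in range(n))
            power += abs(coeff) ** 2
    return power

beta = band_power(signal, 13.0, 30.0)
delta = band_power(signal, 0.5, 4.0)
print("beta dominates:", beta > delta)   # True for this synthetic signal
```

A production pipeline would use an FFT with windowing (e.g., Welch's method) rather than this O(n^2) DFT, but the attention proxy is the same: relative power concentrated in the beta band.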
Procedia PDF Downloads 97
20370 Ontological Modeling Approach for Statistical Databases Publication in Linked Open Data
Authors: Bourama Mane, Ibrahima Fall, Mamadou Samba Camara, Alassane Bah
Abstract:
At the level of National Statistical Institutes, there is a large volume of data whose format generally conditions how the information it contains is published. Each household or business data collection project includes its own dissemination platform. The dissemination methods previously used therefore do not promote rapid access to information and, especially, do not offer the option of linking data for in-depth processing. In this paper, we present an approach to modeling these data in order to publish them in a format intended for the Semantic Web. Our objective is to publish all these data on a single platform and to offer the option of linking them with other external data sources. The approach will be applied to data from major national surveys, such as those on employment, poverty, and child labor, and the general census of the population of Senegal.
Keywords: Semantic Web, linked open data, database, statistics
Procedia PDF Downloads 174
20369 Intra-miR-ExploreR, a Novel Bioinformatics Platform for Integrated Discovery of MiRNA:mRNA Gene Regulatory Networks
Authors: Surajit Bhattacharya, Daniel Veltri, Atit A. Patel, Daniel N. Cox
Abstract:
miRNAs have emerged as key post-transcriptional regulators of gene expression; however, identification of biologically relevant target genes for this epigenetic regulatory mechanism remains a significant challenge. To address this knowledge gap, we have developed a novel tool in R, Intra-miR-ExploreR, that facilitates integrated discovery of miRNA targets by incorporating target databases and novel target prediction algorithms, using statistical methods, including Pearson and distance correlation, on microarray data to arrive at high-confidence intragenic miRNA target predictions. We have explored the efficacy of this tool using Drosophila melanogaster as a model organism for bioinformatics analyses and functional validation. A number of putative targets were obtained and validated using qRT-PCR analysis. Additional features of the tool include downloadable text files containing GO analysis from DAVID and PubMed links to literature related to the gene sets. Moreover, we are constructing interaction maps of intragenic miRNAs, using both microarray and RNA-seq data and focusing on neural tissues, to uncover the regulatory codes via which these molecules regulate gene expression to direct cellular development.
Keywords: miRNA, miRNA:mRNA target prediction, statistical methods, miRNA:mRNA interaction network
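The correlation filter at the heart of such target calling can be sketched simply: a miRNA that downregulates a target should show a strongly negative Pearson correlation with that target's expression across samples. The expression vectors below are invented; the tool itself is written in R and additionally uses distance correlation and curated target databases:

```python
# Sketch of the Pearson-correlation filter for miRNA:mRNA target calls.
# Expression values across six hypothetical microarray samples.
import math

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

mirna   = [1.0, 2.1, 3.0, 4.2, 5.1, 6.0]   # rising miRNA expression
target  = [5.9, 5.0, 4.1, 2.9, 2.2, 1.1]   # drops as the miRNA rises
bystand = [3.1, 2.8, 3.3, 2.9, 3.2, 3.0]   # unrelated gene, roughly flat

print(round(pearson(mirna, target), 3))    # strongly negative
print(round(pearson(mirna, bystand), 3))   # near zero
```

In practice, a correlation threshold is combined with database support (and, in this tool, distance correlation, which also detects non-linear dependence) before a pair is reported as a high-confidence target.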
Procedia PDF Downloads 506
20368 Economic Forecasting Analysis for Solar Photovoltaic Application
Authors: Enas R. Shouman
Abstract:
Economic development with population growth is leading to a continuous increase in energy demand. At the same time, growing global concern for the environment is driving a decrease in the use of conventional energy sources and an increase in the use of renewable energy sources. The objective of this study is to present the worldwide market trends of solar photovoltaic (PV) technology and to present economic methods for PV financial analysis, on the basis of expectations for the expansion of PV in many applications. In the course of this study, detailed information about the current PV market was gathered and analyzed to find factors influencing the penetration of PV energy. The methodology relies on five economic financial-analysis methods that are often used by investment decision makers: payback analysis, net benefit analysis, savings-to-investment ratio, adjusted internal rate of return, and life-cycle cost. The results of this study may be considered a marketing guide that supports the diffusion of PV energy. The study showed that PV cost is economically viable: consumers pay a higher purchase price for PV system installation but get a lower electricity bill.
Keywords: photovoltaic, financial methods, solar energy, economics, PV panel
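Two of the appraisal methods named above, simple payback and the savings-to-investment ratio (SIR), can be sketched for a hypothetical rooftop PV system. All money figures, the lifetime, and the discount rate are illustrative assumptions, not values from the study:

```python
# Sketch of simple payback and savings-to-investment ratio (SIR) for a
# hypothetical PV installation. All figures are illustrative assumptions.

install_cost = 6000.0    # upfront PV system cost
annual_saving = 900.0    # yearly electricity-bill saving
lifetime_years = 20
discount_rate = 0.05

# Simple payback: years of undiscounted savings to recover the cost.
payback_years = install_cost / annual_saving

# SIR: present value of lifetime savings divided by the investment.
pv_savings = sum(annual_saving / (1 + discount_rate) ** t
                 for t in range(1, lifetime_years + 1))
sir = pv_savings / install_cost

print(f"payback = {payback_years:.1f} years, SIR = {sir:.2f}")
```

Under these assumptions the SIR exceeds 1, i.e., discounted lifetime savings outweigh the purchase price, which is the pattern the abstract describes: a higher upfront cost repaid through lower electricity bills.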
Procedia PDF Downloads 108
20367 Statistical Comparison of Ensemble Based Storm Surge Forecasting Models
Authors: Amin Salighehdar, Ziwen Ye, Mingzhe Liu, Ionut Florescu, Alan F. Blumberg
Abstract:
Storm surge is an abnormal water level caused by a storm, and accurate prediction of a storm surge is a challenging problem. Researchers have developed various ensemble modeling techniques that combine several individual forecasts to produce an overall, presumably better, forecast. Some simple ensemble modeling techniques exist in the literature; for instance, Model Output Statistics (MOS) and running mean-bias removal are widely used in the storm surge prediction domain. However, these methods have drawbacks: MOS, for example, is based on multiple linear regression and needs a long period of training data. To overcome the shortcomings of these simple methods, researchers have proposed more advanced ones. For instance, ENSURF (Ensemble SURge Forecast) is a multi-model application for sea level forecasting that creates a better forecast using a combination of several instances of Bayesian Model Averaging (BMA). An ensemble dressing method is based on identifying the best member forecast and using it for prediction. Our contribution in this paper can be summarized as follows. First, we investigate whether ensemble models perform better than any single forecast; for this we need to identify the single best forecast, and we present a methodology based on a simple Bayesian selection method to do so. Second, we present several new and simple ways to construct ensemble models, using correlation and standard deviation as weights in combining different forecast models. Third, we use these ensembles, and compare them with several existing models in the literature, to forecast storm surge levels. We then investigate whether developing a complex ensemble model is indeed needed; to achieve this goal, we use a simple average (one of the simplest and most widely used ensemble models) as a benchmark.
Predicting the peak surge level during a storm, as well as the precise time at which this peak occurs, is crucial, so we develop a statistical platform to compare the performance of the various ensemble methods. This statistical analysis is based on the root mean square error of the ensemble forecast during the testing period and on the magnitude and timing of the forecasted peak surge compared with the actual peak and its timing. In this work, we analyze four hurricanes: hurricanes Irene and Lee in 2011, hurricane Sandy in 2012, and hurricane Joaquin in 2015. Since hurricane Irene developed at the end of August 2011 and hurricane Lee started just after Irene at the beginning of September 2011, we consider them a single contiguous hurricane event in this study. The data set used for this study was generated by the New York Harbor Observing and Prediction System (NYHOPS). We find that even the simplest possible way of creating an ensemble produces results superior to any single forecast. We also show that the ensemble models we propose generally perform better than the simple average ensemble technique.
Keywords: Bayesian learning, ensemble model, statistical analysis, storm surge prediction
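The correlation-weighted combination and the simple-average benchmark described in this abstract can be sketched in a few lines; all series below are invented toy numbers, not NYHOPS data:

```python
# Sketch of a correlation-weighted surge ensemble versus the simple
# average benchmark. The observation and model series are invented.
import math

obs     = [1.0, 1.4, 2.1, 2.8, 2.2, 1.5]   # observed surge (m)
model_a = [0.9, 1.5, 2.0, 2.9, 2.1, 1.4]   # well-calibrated model
model_b = [1.4, 1.0, 2.6, 2.2, 2.9, 1.0]   # noisier model

def rmse(pred, truth):
    return math.sqrt(sum((p - t) ** 2
                         for p, t in zip(pred, truth)) / len(truth))

def corr(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / math.sqrt(vx * vy)

# Correlation-based weights (normalized to sum to 1).
w_a, w_b = corr(model_a, obs), corr(model_b, obs)
w_a, w_b = w_a / (w_a + w_b), w_b / (w_a + w_b)
weighted = [w_a * a + w_b * b for a, b in zip(model_a, model_b)]

# Benchmark: the simple average ensemble.
average = [(a + b) / 2 for a, b in zip(model_a, model_b)]

print(f"RMSE weighted = {rmse(weighted, obs):.3f}")
print(f"RMSE average  = {rmse(average, obs):.3f}")
```

With these toy series the weighted ensemble beats the simple average because it leans toward the better-correlated model, which is the abstract's second point; its first point, that even the simple average beats any single forecast, would be checked the same way with `rmse(model_a, obs)` and `rmse(model_b, obs)`.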
Procedia PDF Downloads 307
20366 Various Advanced Statistical Analyses of Index Values Extracted from Outdoor Agricultural Workers Motion Data
Authors: Shinji Kawakura, Ryosuke Shibasaki
Abstract:
We have been grouping and developing various kinds of practical, promising applied sensing systems for agricultural advancement and the handing down of technical skills (guidance). These include advanced devices that secure real-time data on worker motion, which we analyze with various advanced statistical and human-dynamics methods (e.g., principal component analysis, Ward-linkage cluster analysis, and mapping). We have also been considering workers' daily health and safety issues. The targeted fields are mainly common farms, meadows, and gardens. We then observed and discussed the time-series changes in the data and made several suggestions. The entire plan makes it possible to improve both the aforementioned applied systems and the farms themselves.
Keywords: advanced statistical analysis, wearable sensing system, tradition of skill, support for workers, detecting crisis
Procedia PDF Downloads 392
20365 The Differences in the Surface Roughness of Glass Ionomer Cement as a Result of Brushing with Whitening and Conventional Toothpaste
Authors: Aulina R. Rahmi, Farid Yuristiawan, Annisa Ibifadillah, Ummu H. Amri, Hidayati Gunawan
Abstract:
Glass ionomer cement (GIC) is one of the filling materials often used in dentistry because it is relatively inexpensive and widely available. Restorative materials can undergo changes in their clinical properties, such as changes in surface roughness, and an increase in surface roughness accelerates bacterial colonization and plaque maturation. In the oral cavity, GIC is exposed to various substances, such as toothpaste, an oral care product used during toothbrushing. One popular type is whitening toothpaste, whose abrasive and chemical agents, such as hydrogen peroxide, can increase the surface roughness of restorative materials. Objective: To determine the difference in the surface roughness of glass ionomer cement brushed with whitening versus conventional toothpaste. Method: This study used an experimental laboratory method with a pre- and post-test design. There were 36 samples divided into 2 groups: the first group was brushed with whitening toothpaste and the second with conventional toothpaste, each for 2 minutes. The surface roughness of the specimens was measured using a roughness tester. Result: The data were analyzed using an independent t-test, and the result showed a significant difference between the surfaces of glass ionomer cement brushed with whitening and with conventional toothpaste (p = 0.000). Conclusion: Glass ionomer cement brushed with whitening toothpaste showed greater surface roughness than that brushed with conventional toothpaste.
Keywords: glass ionomer cement, surface roughness, toothpaste, roughness tester
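The independent two-sample comparison used above can be sketched with Welch's t statistic on roughness-like values. The Ra numbers below are invented stand-ins (the study compared two groups of 18 specimens); a large |t| indicates the group means differ relative to the within-group scatter:

```python
# Sketch of the independent two-sample comparison used above: Welch's
# t statistic on surface-roughness values (Ra, micrometers). The data
# are synthetic, not the study's measurements.
import math
import statistics

whitening    = [0.42, 0.47, 0.51, 0.45, 0.49, 0.52]   # rougher (synthetic)
conventional = [0.31, 0.29, 0.35, 0.33, 0.30, 0.34]   # smoother (synthetic)

def welch_t(a, b):
    ma, mb = statistics.mean(a), statistics.mean(b)
    va, vb = statistics.variance(a), statistics.variance(b)  # sample variance
    se = math.sqrt(va / len(a) + vb / len(b))
    return (ma - mb) / se

t_stat = welch_t(whitening, conventional)
print(f"t = {t_stat:.2f}")  # large |t| -> group means clearly differ
```

Converting t to the p-value the study reports requires the t-distribution CDF (e.g., `scipy.stats.ttest_ind`); here the statistic alone already shows the separation.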
Procedia PDF Downloads 28620364 Generation of Quasi-Measurement Data for On-Line Process Data Analysis
Authors: Hyun-Woo Cho
Abstract:
To ensure the safety of a manufacturing process, one should quickly identify an assignable cause of a fault on an on-line basis. To this end, many statistical techniques, including linear and nonlinear methods, have been utilized. However, such methods suffer from the major problem of small sample size, which is mostly attributable to the empirical models used as reference models. This work presents a new method to overcome the insufficiency of measurement data in monitoring and diagnosis tasks. Quasi-measurement data are generated from existing data based on two indices, similarity and importance. The performance of the method is demonstrated using a real data set. The results show that the presented method handles the insufficiency problem successfully. In addition, it is quite efficient in terms of computational speed and memory usage, so on-line implementation for monitoring and diagnosis purposes is straightforward.Keywords: data analysis, diagnosis, monitoring, process data, quality control
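The abstract does not give the exact definitions of its similarity and importance indices, so the sketch below assumes a simple Euclidean nearest-neighbour similarity (and uniform importance) to illustrate the idea of generating quasi-measurements from existing samples.

```python
# Hedged sketch: create quasi-measurement rows by interpolating between
# each randomly chosen sample and its most similar existing sample.
import numpy as np

def generate_quasi_samples(X, n_new, rng):
    """Return n_new synthetic rows, each a convex combination of a
    sample and its nearest neighbour (assumed similarity index)."""
    out = []
    for _ in range(n_new):
        i = rng.integers(len(X))
        d = np.linalg.norm(X - X[i], axis=1)
        d[i] = np.inf                      # exclude the sample itself
        j = int(np.argmin(d))              # most similar sample
        w = rng.uniform()                  # interpolation weight
        out.append(w * X[i] + (1 - w) * X[j])
    return np.array(out)

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 5))               # 20 measurements, 5 variables
X_aug = generate_quasi_samples(X, 40, rng)
print(X_aug.shape)
```

Each generated row stays inside the range of the existing data, which keeps the augmented reference model physically plausible.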
Procedia PDF Downloads 48020363 Statistical Characteristics of Code Formula for Design of Concrete Structures
Authors: Inyeol Paik, Ah-Ryang Kim
Abstract:
In this research, a statistical analysis is carried out to examine the statistical properties of the formulas given in design codes for concrete structures. The design formulas of the Korea highway bridge design code - the limit state design method (KHBDC), the current national bridge design code - and of the design code for concrete structures by the Korea Concrete Institute (KCI) are analyzed. The safety levels provided by the strength formulas of the design codes are defined based on probabilistic and statistical theory. KHBDC is a reliability-based design code whose load and resistance factors were calibrated to attain a target reliability index; defining the statistical properties of the design formulas is essential to this calibration process. In general, the statistical characteristics of a member strength arise from three factors: first, the difference between the material strength of the actual construction and that used in the design calculation; second, the difference between the actual dimensions of the constructed sections and those used in the design calculation; and third, the difference between the strength of the actual member and the formula simplified for the design calculation. This paper focuses on the third difference. The formulas for calculating the shear strength of concrete members are presented differently in KHBDC and KCI. In this study, the statistical properties of the design formulas were obtained through comparison with a database comprising experimental results from the reference publications. The test specimens were either reinforced with shear stirrups or not. For the applied database, the bias factor was about 1.12 and the coefficient of variation was about 0.18.
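The bias factor and coefficient of variation quoted above are, in essence, the mean and relative spread of the ratio of measured to formula-predicted strength over the test database. A sketch with illustrative numbers (not the paper's database):

```python
# Per-specimen ratio of measured shear strength to the code-formula
# prediction; its mean is the bias factor, its relative spread the COV.
import numpy as np

rng = np.random.default_rng(2)
predicted = rng.uniform(100, 400, size=50)          # formula strength (kN)
measured  = predicted * rng.normal(1.12, 0.20, size=50)  # test results

ratio = measured / predicted       # professional (bias) factor per specimen
bias = ratio.mean()
cov  = ratio.std(ddof=1) / bias    # coefficient of variation
print(round(bias, 2), round(cov, 2))
```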
By applying the statistical properties of the design formula to reliability analysis, it is shown that the resistance factors of the current design codes satisfy the target reliability indexes of both codes. The minimum resistance factors of KHBDC, which is written in the material resistance factor format, and of KCI, which is written in the member resistance factor format, are also obtained and presented. Further research is underway to calibrate the resistance factors of the high-strength and high-performance concrete design guide.Keywords: concrete design code, reliability analysis, resistance factor, shear strength, statistical property
Procedia PDF Downloads 31920362 Clinical Feature Analysis and Prediction on Recurrence in Cervical Cancer
Authors: Ravinder Bahl, Jamini Sharma
Abstract:
The paper demonstrates an analysis of cervical cancer based on a probabilistic model. It involves a technique for classification and prediction that recognizes the typical and diagnostically most important test features relating to cervical cancer. The main contribution of the research is predicting the probability of recurrence in no-recurrence (first-time detection) cases. A combination of conventional statistical and machine learning tools is applied for the analysis. An experimental study with real data demonstrates the feasibility and potential of the proposed approach for this task.Keywords: cervical cancer, recurrence, no recurrence, probabilistic, classification, prediction, machine learning
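The abstract does not specify the probabilistic model, so the sketch below assumes a Gaussian naive Bayes classifier purely for illustration of how a recurrence probability can be produced from test features; the data are synthetic placeholders.

```python
# Gaussian naive Bayes: fit per-class feature means/stds, then return a
# posterior probability of recurrence for a new feature vector.
import numpy as np

def fit_gnb(X, y):
    params = {}
    for c in np.unique(y):
        Xc = X[y == c]
        params[c] = (Xc.mean(0), Xc.std(0) + 1e-9, len(Xc) / len(y))
    return params

def predict_proba(params, x):
    logp = {}
    for c, (mu, sd, prior) in params.items():
        ll = -0.5 * np.sum(((x - mu) / sd) ** 2 + np.log(2 * np.pi * sd ** 2))
        logp[c] = np.log(prior) + ll
    m = max(logp.values())                       # stabilise the exponent
    z = {c: np.exp(v - m) for c, v in logp.items()}
    s = sum(z.values())
    return {c: v / s for c, v in z.items()}

rng = np.random.default_rng(3)
X = np.vstack([rng.normal(0, 1, (30, 4)), rng.normal(2, 1, (30, 4))])
y = np.array([0] * 30 + [1] * 30)   # 0 = no recurrence, 1 = recurrence
p = predict_proba(fit_gnb(X, y), np.full(4, 2.0))
print(round(p[1], 2))
```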
Procedia PDF Downloads 35720361 Choice Experiment Approach on Evaluation of Non-Market Farming System Outputs: First Results from Lithuanian Case Study
Authors: A. Novikova, L. Rocchi, G. Startiene
Abstract:
Market and non-market outputs are produced jointly in agriculture, and their supply depends on the intensity and type of production. The role of agriculture as an economic activity and its effects are important for the Lithuanian case study, as agricultural land covers more than half of the country. Positive and negative externalities created in agriculture are not priced by the market. Therefore, specific techniques such as stated-preference methods, in particular choice experiments (CE), are used for the evaluation of non-market outputs in agriculture. The main aim of this paper is to present the construction of a research path for the evaluation of non-market farming system outputs in Lithuania. Conventional and organic farming were selected, covering crop production (both cereal and industrial crops) and livestock production (both dairy and cattle). The CE method and the nested logit (NL) model were selected as appropriate for the evaluation of non-market outputs of different farming systems in Lithuania. A pilot survey was implemented between October and November 2018 in order to test and improve the CE questionnaire. The results of the survey showed that the questionnaire was accepted and well understood by the respondents. The econometric modelling showed that the selected NL model could be used for the main survey. The survey also identified how residents understand the differences between organic and conventional farming: they are more willing to choose organic farming than conventional farming.Keywords: choice experiments, farming system, Lithuania, market outputs, non-market outputs
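The core of a logit choice model like the one used here is that the probability of choosing an alternative is a softmax of its systematic utility. The sketch below uses a simple (non-nested) logit and made-up attributes and taste parameters; the paper's nesting structure and estimated coefficients are not reproduced.

```python
# Choice probabilities from a simple logit: utility = attributes @ beta,
# probability = softmax of utilities across alternatives.
import numpy as np

def choice_probabilities(attributes, beta):
    """attributes: (n_alternatives, n_attributes); beta: taste coefficients."""
    v = attributes @ beta              # systematic utility per alternative
    e = np.exp(v - v.max())            # numerically stable softmax
    return e / e.sum()

# columns: [environmental quality, price]; rows: conventional, organic
attrs = np.array([[0.2, 1.0],
                  [0.8, 1.6]])
beta = np.array([2.0, -0.8])           # assumed, illustrative coefficients
p = choice_probabilities(attrs, beta)
print(np.round(p, 3))
```

With these assumed coefficients, the organic alternative gets the higher choice probability, mirroring the survey's qualitative finding.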
Procedia PDF Downloads 12820360 128-Multidetector CT for Assessment of Optimal Depth of Electrode Array Insertion in Cochlear Implant Operations
Authors: Amina Sultan, Mohamed Ghonim, Eman Oweida, Aya Abdelaziz
Abstract:
Objective: To assess the diagnostic reliability of multidetector CT in the pre- and post-operative evaluation of cochlear implant candidates. Material and Methods: The study includes 40 patients (18 males and 22 females) with a mean age of 5.6 years. They were classified into two groups: in Group A (20 patients) the cochlear implant device was Nucleus-22, and in Group B (20 patients) the device was MED-EL. Cochlear length (CL) and cochlear height (CH) were measured pre-operatively by 128-multidetector CT. Electrode length (EL) and insertion depth angle (α) were measured post-operatively by MDCT. Results: For Group A, mean CL was 9.1 mm ± 0.4 SD; mean CH was 4.1 ± 0.3 SD; mean EL was 18 ± 2.7 SD; mean α angle was 299.05 ± 37 SD. A statistically significant correlation (P < 0.05) was found between pre-operative CL and post-operative EL (r²=0.6), as well as between EL and α angle (r²=0.7). For Group B, mean CL was 9.1 mm ± 0.3 SD; mean CH was 4.1 ± 0.4 SD; mean EL was 27 ± 2.1 SD; mean α angle was 287.6 ± 41.7 SD. A statistically significant correlation was found between CL and both EL (r²=0.6) and α angle (r²=0.5), and a strong correlation was found between EL and α angle (r²=0.8). A statistically significant difference was detected between the two devices with regard to electrode length. Conclusion: Multidetector CT is a reliable tool for pre-operative planning and post-operative evaluation of the outcomes of cochlear implant operations. Cochlear length is a valuable prognostic parameter for predicting the depth of electrode array insertion, which can influence the criteria for device selection.Keywords: angle of insertion (α angle), cochlear implant (CI), cochlear length (CL), Multidetector Computed Tomography (MDCT)
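The r² values above come from Pearson correlations between the measured quantities. A minimal sketch of that computation, on synthetic stand-in data rather than the study's measurements:

```python
# Pearson correlation between pre-operative cochlear length (CL) and
# post-operative electrode length (EL), with r-squared as effect size.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
cl = rng.normal(9.1, 0.4, size=20)            # cochlear length (mm)
el = 2.0 * cl + rng.normal(0, 0.5, size=20)   # electrode length (mm), synthetic

r, p = stats.pearsonr(cl, el)
r_squared = r ** 2
print(round(float(r_squared), 2), p < 0.05)
```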
Procedia PDF Downloads 19120359 Human Identification Using Local Roughness Patterns in Heartbeat Signal
Authors: Md. Khayrul Bashar, Md. Saiful Islam, Kimiko Yamashita, Yano Midori
Abstract:
Despite some progress in human authentication, conventional biometrics (e.g., facial features, fingerprints, retinal scans, gait, voice patterns) are not robust against falsification because they are neither confidential nor secret to an individual. As a non-invasive tool, the electrocardiogram (ECG) has recently shown great potential for human recognition due to its unique rhythms, which characterize the variability of human heart structures (chest geometry, sizes, and positions). Moreover, the ECG has a real-time vitality characteristic that signifies live signs, ensuring that a legitimate individual is being identified. However, the detection accuracy of current ECG-based methods is insufficient due to the high variability of an individual's heartbeats at different instances in time. These variations may occur due to muscle flexure, changes of mental or emotional state, and changes of sensor position or long-term baseline shift during the recording of the ECG signal. In this study, a new method is proposed for human identification based on the extraction of the local roughness of ECG heartbeat signals. First, the ECG signal is preprocessed using a second-order band-pass Butterworth filter with cut-off frequencies of 0.00025 and 0.04. A number of local binary patterns are then extracted by applying a moving neighborhood window along the ECG signal. At each instant of the ECG signal, a pattern is formed by comparing the ECG intensities at neighboring time points with the central intensity in the moving window. Binary weights are then multiplied with the pattern to obtain the local roughness description of the signal. Finally, histograms are constructed that describe the heartbeat signals of the individual subjects in the database. One advantage of the proposed feature is that it does not depend on the accuracy of detecting the QRS complex, unlike conventional methods.
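The windowed comparison-and-weighting step described above is a one-dimensional local binary pattern. The sketch below illustrates it on a synthetic signal; the window size and weighting order are assumptions, not taken from the paper.

```python
# 1-D local binary patterns: compare neighbours with the window centre,
# weight the bits into a code, and histogram the codes over the signal.
import numpy as np

def local_roughness_histogram(signal, n_neighbors=8):
    half = n_neighbors // 2
    codes = []
    for t in range(half, len(signal) - half):
        center = signal[t]
        neighbors = np.concatenate(
            [signal[t - half:t], signal[t + 1:t + half + 1]])
        bits = (neighbors >= center).astype(int)       # binary pattern
        code = int((bits * (2 ** np.arange(n_neighbors))).sum())
        codes.append(code)
    hist, _ = np.histogram(codes, bins=2 ** n_neighbors,
                           range=(0, 2 ** n_neighbors))
    return hist / hist.sum()                           # normalised histogram

rng = np.random.default_rng(5)
ecg = np.sin(np.linspace(0, 8 * np.pi, 400)) + 0.05 * rng.normal(size=400)
h = local_roughness_histogram(ecg)
print(h.shape)
```

Per-subject histograms of this kind can then be compared with minimum-distance or Bayesian classifiers, as the next paragraph describes.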
Supervised recognition methods are then designed using minimum-distance-to-mean and Bayesian classifiers to identify authentic human subjects. An experiment with sixty (60) ECG signals from sixty adult subjects in the PTB database of the National Metrology Institute of Germany showed that the proposed new method is promising compared to a conventional interval- and amplitude-feature-based method.Keywords: human identification, ECG biometrics, local roughness patterns, supervised classification
Procedia PDF Downloads 40420358 A User Interface for Easiest Way Image Encryption with Chaos
Authors: D. López-Mancilla, J. M. Roblero-Villa
Abstract:
Since 1990, research on chaotic dynamics has received considerable attention, particularly in light of potential applications of this phenomenon in secure communications. Data encryption using chaotic systems was reported in the 1990s as a new approach to signal encoding that differs from conventional methods, which use numerical algorithms as the encryption key. Algorithms for image encryption have received much attention because of the need for secure real-time image transmission over the internet and wireless networks. Known algorithms for image encryption, like the Data Encryption Standard (DES), have the drawback of low efficiency when the image is large. Chaos-based encryption offers a new and efficient way to achieve fast and highly secure image encryption. In this work, a user interface for image encryption and a novel, straightforward way to encrypt images using chaos are presented. The main idea is to reshape any image into an n-dimensional vector and combine it with a vector extracted from a chaotic system, in such a way that the image vector is hidden within the chaotic vector. Once this is done, an array with the original dimensions of the image is formed again. A security analysis of the encrypted images using statistical measures is carried out, and an optimization stage is applied to improve the encryption security while still allowing the image to be accurately recovered. The user interface applies the algorithms designed for image encryption, allowing the user to read an image from the hard drive or another external device. It encrypts the image with one of three encryption modes, given by three different chaotic systems from which the user can choose. Once the image is encrypted, the user can inspect the security analysis and save the result to the hard disk.
The main results of this study show that this simple encryption method, using the optimization stage, achieves encryption security competitive with the more complicated encryption methods used in other works. In addition, the user interface allows the user to encrypt an image with chaos and transmit it through any public communication channel, including the internet.Keywords: image encryption, chaos, secure communications, user interface
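A hedged sketch of the underlying idea (flatten the image, combine it with a chaotic sequence, reshape back): here a logistic-map keystream is XORed with the image bytes. The paper's three actual chaotic systems and its combination/optimization scheme are not reproduced; the logistic map and XOR combination are assumptions for illustration.

```python
# Logistic-map keystream XORed with the flattened image; XOR with the
# same key is self-inverse, so the same routine also decrypts.
import numpy as np

def logistic_keystream(n, x0=0.37, r=3.99):
    x, out = x0, np.empty(n, dtype=np.uint8)
    for i in range(n):
        x = r * x * (1.0 - x)          # chaotic logistic map iterate
        out[i] = int(x * 256) % 256    # quantise state to a byte
    return out

def encrypt(image, x0=0.37):
    flat = image.reshape(-1)                       # image -> vector
    ks = logistic_keystream(flat.size, x0)
    return (flat ^ ks).reshape(image.shape)        # combine and reshape back

rng = np.random.default_rng(6)
img = rng.integers(0, 256, size=(16, 16), dtype=np.uint8)
enc = encrypt(img)
dec = encrypt(enc)                                 # same key recovers image
print(np.array_equal(dec, img))
```

The initial condition `x0` acts as the secret key: a slightly different value yields a completely different keystream, which is the sensitivity property chaos-based schemes rely on.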
Procedia PDF Downloads 48920357 The Impact of Inflation Rate and Interest Rate on Islamic and Conventional Banking in Afghanistan
Authors: Tareq Nikzad
Abstract:
Since the first bank was established in 1933, Afghanistan's banking sector has seen a number of changes but has not been able to grow to its full potential because of the civil war. This study investigates the operation of dual banking in Afghanistan in relation to the effects of inflation and interest rates, drawing on World Bank Data (WBD) covering a period of nineteen years. Inflation, the general rise in the prices of goods and services over time, presents considerable difficulties for the banking sector. The objectives of this research are to analyze the effect of inflation and interest rates on conventional and Islamic banks in Afghanistan, identify potential differences between these two banking models, and provide insights for policymakers and practitioners. A mixed-methods approach is used to analyze quantitative data and to qualitatively examine the unique difficulties that banks encounter in Afghanistan's economic environment. The findings contribute to the understanding of the relationship between the interest rate, the inflation rate, and the performance of both banking systems in Afghanistan. The paper concludes with recommendations for policymakers and banking institutions to enhance the stability and growth of the banking sector in Afghanistan. From an Islamic perspective, interest is described as "a prefixed rate for the use or borrowing of money". This "prefixed rate", known in Islamic economics as "riba", is regarded as undesirable. Applying a time-series regression technique to annual data from 2003 to 2021, this research examines the effect of the CPI inflation rate and the interest rate on banking in Afghanistan.Keywords: inflation, Islamic banking, conventional banking, interest, Afghanistan, impact
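The time-series regression described above can be sketched as an ordinary least squares fit of a bank-performance measure on the inflation and interest rates. The series below are synthetic placeholders spanning nineteen observations, not the 2003-2021 World Bank data, and the coefficient signs are assumed for illustration.

```python
# OLS regression: performance ~ intercept + inflation + interest rate.
import numpy as np

rng = np.random.default_rng(7)
years = 19
inflation = rng.normal(6.0, 3.0, size=years)    # CPI inflation, % (synthetic)
interest  = rng.normal(12.0, 2.0, size=years)   # lending rate, % (synthetic)
performance = (5.0 - 0.3 * inflation + 0.2 * interest
               + rng.normal(0, 0.5, size=years))  # e.g. return on assets

X = np.column_stack([np.ones(years), inflation, interest])
coef, *_ = np.linalg.lstsq(X, performance, rcond=None)
print(np.round(coef, 2))    # [intercept, inflation slope, interest slope]
```

Fitting the same specification separately for Islamic and conventional banks would let the two sets of coefficients be compared.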
Procedia PDF Downloads 7120356 Effect of Formative Evaluation with Feedback on Students Economics Achievement in Secondary Education
Authors: Salihu Abdullahi Galle
Abstract:
Students' performance in Economics in Nigerian schools and on standardized exams has been worrying over the years, owing in part to some teachers' reliance on conventional lecture-based teaching methods. Other obstacles include a lack of training, standardized-testing pressure, and aversion to change, all of which can affect students' cognitive ability in Economics and their future careers. The researchers employed formative evaluation with feedback (FEFB) to support the teaching and learning process by providing constant feedback to both teachers and students. A quasi-experimental research design was used to examine two teaching methods (FEFB and conventional). Pre-test and post-test interaction effects were evaluated between students in the experimental (FEFB) group and those in the conventional group, and the interaction effects for male and female students in the two groups were also examined, with 90 participants in total. The findings show that students taught with the FEFB approach outperform students taught in a conventional classroom setting, and that there is no gender interaction effect between the two groups. In light of these findings, the researchers urge Economics teachers to employ FEFB during teaching and learning to ensure timely feedback, and policymakers to ensure that Economics teachers receive training and re-training on FEFB approaches.Keywords: formative evaluation with feedback (FEFB), students, economics achievement, secondary education
Procedia PDF Downloads 4820355 Development and Characterization of Ethiopian Bamboo Fiber Polypropylene Composite
Authors: Tigist Girma Kedane
Abstract:
The purpose of this paper is to evaluate the properties of Ethiopian bamboo fiber polymer composites as headliner materials for the automobile industry. Accurate evaluation of their mechanical properties is critical for predicting their behavior during a vehicle's interior impact assessment. Conventional headliner materials are heavier, non-biodegradable, more expensive, and less eco-friendly to process than the materials investigated here. Representative bamboo culms were harvested from three regions, at three ages, and in two harvesting months. A statistical analysis was performed to test for significant differences between the mean fiber strengths across bamboo ages, harvesting seasons, and growing regions. Two-year-old bamboo fibers had the highest mechanical properties of all ages, and fibers harvested in November had higher mechanical properties than those harvested in February. Fibers from Injibara and Kombolcha had the highest and lowest mechanical properties, respectively. Bamboo fiber epoxy composites had higher mechanical properties than bamboo fiber polypropylene composites, and the flexural strength of the bamboo fiber polymer composites was higher than their tensile strength. Ethiopian bamboo fibers and their polymer composites offer better mechanical properties for headliner applications in the automobile industry than conventional headliner materials.Keywords: bamboo species, culm age, harvesting seasons, mechanical properties, polymer composite
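A significance test across the three growing regions of the kind implied above can be sketched as a one-way ANOVA. The strength values and the specific test are assumptions for illustration (the abstract names only Injibara and Kombolcha and does not state which test was used).

```python
# One-way ANOVA on fibre tensile strength across three growing regions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
injibara  = rng.normal(450, 30, size=10)   # MPa, highest in the study
region_c  = rng.normal(400, 30, size=10)   # third region (name not given)
kombolcha = rng.normal(350, 30, size=10)   # MPa, lowest in the study

f_stat, p_value = stats.f_oneway(injibara, region_c, kombolcha)
print(p_value < 0.05)
```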
Procedia PDF Downloads 5820354 On the Fourth-Order Hybrid Beta Polynomial Kernels in Kernel Density Estimation
Authors: Benson Ade Eniola Afere
Abstract:
This paper introduces a family of fourth-order hybrid beta polynomial kernels developed for statistical analysis. The assessment of these kernels' performance centers on two critical metrics: asymptotic mean integrated squared error (AMISE) and kernel efficiency. Through the utilization of both simulated and real-world datasets, a comprehensive evaluation was conducted, facilitating a thorough comparison with conventional fourth-order polynomial kernels. The evaluation procedure encompassed the computation of AMISE and efficiency values for both the proposed hybrid kernels and the established classical kernels. The consistently observed trend was the superior performance of the hybrid kernels when compared to their classical counterparts. This trend persisted across diverse datasets, underscoring the resilience and efficacy of the hybrid approach. By leveraging these performance metrics and conducting evaluations on both simulated and real-world data, this study furnishes compelling evidence in favour of the superiority of the proposed hybrid beta polynomial kernels. The discernible enhancement in performance, as indicated by lower AMISE values and higher efficiency scores, strongly suggests that the proposed kernels offer heightened suitability for statistical analysis tasks when compared to traditional kernels.Keywords: AMISE, efficiency, fourth-order kernels, hybrid kernels, kernel density estimation
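A minimal sketch of kernel density estimation with a fourth-order kernel follows. The kernel used is the classical fourth-order Epanechnikov-type polynomial, not the paper's hybrid beta kernels, which are not reproduced here; its vanishing second moment is what reduces the bias term in the AMISE expansion.

```python
# KDE with a fourth-order polynomial kernel on [-1, 1].
import numpy as np

def k4(u):
    """Fourth-order Epanechnikov-type kernel: integrates to one and has
    zero second moment (it can take small negative values)."""
    return np.where(np.abs(u) <= 1,
                    (15.0 / 32.0) * (3.0 - 10.0 * u**2 + 7.0 * u**4),
                    0.0)

def kde(x_grid, data, h):
    u = (x_grid[:, None] - data[None, :]) / h
    return k4(u).sum(axis=1) / (len(data) * h)

rng = np.random.default_rng(9)
data = rng.normal(0.0, 1.0, size=500)
grid = np.linspace(-4, 4, 201)
f_hat = kde(grid, data, h=0.8)

dx = grid[1] - grid[0]
integral = f_hat.sum() * dx        # should be close to 1
print(round(float(integral), 2))
```

Efficiency comparisons between kernels of the same order then reduce to comparing functionals of K (its roughness and fourth moment), which is what the AMISE criterion captures.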
Procedia PDF Downloads 69