Search results for: statistical inference
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4157

3587 The 1st Personal Pronouns as Evasive Devices in the 2016 Taiwanese Presidential Debate

Authors: Yan-Chi Chen

Abstract:

This study investigates the first-person pronouns used as evasive devices by presidential candidates in the 2016 Taiwanese Presidential Debate within the framework of critical discourse analysis (CDA). The study finds that the personal pronoun ‘I’ is the most frequent personal pronoun in the debate; in general, first-person pronouns were used more than second- and third-person pronouns. Hence, a further quantitative analysis is conducted to explore the correlation between the frequencies of the two first-person pronouns and the other pronouns. Results show that the count of the personal pronoun ‘I’ increases from 26 to 49 during the debate, while the count of ‘we’ decreases from 43 to 15. Although ‘I’ appears to dominate the pronominal choice, statistical evidence shows that ‘we’ has the greater statistical significance (p<0.0002), compared with ‘I’ (p<0.0116). The comparatively small p-value for ‘we’ means it has a stronger correlation with the overall pronominal choice, and ‘we’ is more likely to be used than ‘I’. Therefore, this study concludes that pronominal choice varies with different evasive strategies. The ingrained functions of these personal pronouns are mainly categorized as ‘agreement’ and ‘justification’: ‘we’ is preferred in agreement evasive strategies, whereas ‘I’ is used for justifying oneself. In addition, ‘we’ can function as both an ‘inclusive’ and an ‘exclusive’ pronoun, which gives it functions beyond the agreement evasive strategies. In conclusion, although ‘I’ has the highest number of occurrences, ‘we’ is more closely related to the overall first-person pronominal choice.
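A minimal sketch of the kind of frequency comparison reported above is a chi-square test on the pronoun counts. The counts for ‘I’ (26 → 49) and ‘we’ (43 → 15) are taken from the abstract; the split of the debate into an earlier and a later phase is an assumption made for illustration.

```python
# Illustrative sketch: chi-square test of the shift in first-person pronoun use
# between an earlier and a later phase of the debate (the phase split is assumed).
from scipy.stats import chi2_contingency

# Rows: pronouns 'I' and 'we'; columns: earlier vs. later phase (counts from the abstract).
counts = [[26, 49],
          [43, 15]]

chi2, p_value, dof, expected = chi2_contingency(counts)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.4f}")
# A small p-value indicates that the distribution of 'I' vs. 'we' differs between phases.
```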

Keywords: critical discourse analysis (CDA), evasive devices, the 1st personal pronouns, the 2016 Taiwanese Presidential Debate

Procedia PDF Downloads 150
3586 Detecting Earnings Management via Statistical and Neural Networks Techniques

Authors: Mohammad Namazi, Mohammad Sadeghzadeh Maharluie

Abstract:

Predicting earnings management is vital for capital market participants, financial analysts, and managers. This research attempts to answer the following question: Is there a significant difference between the regression model and neural network models in predicting earnings management, and which one leads to a superior prediction? To approach this question, a Linear Regression (LR) model was compared with two neural networks: a Multi-Layer Perceptron (MLP) and a Generalized Regression Neural Network (GRNN). The population of this study includes 94 companies listed on the Tehran Stock Exchange (TSE) from 2003 to 2011. After the results of all models were obtained, ANOVA was used to test the hypotheses. Overall, the statistical results showed that the precision of the GRNN did not differ significantly from that of the MLP, whereas the mean square errors of the MLP and GRNN differed significantly from that of the multivariable LR model. These findings support the notion of nonlinear behavior in earnings management. It is therefore more appropriate for capital market participants to analyze earnings management with neural network techniques rather than linear regression models.
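A rough sketch of this model comparison is shown below, assuming a synthetic, nonlinear data-generating process in place of the TSE sample. It contrasts a linear regression with an MLP; a GRNN is not available in scikit-learn and would need a separate (e.g., Nadaraya-Watson style) implementation, so it is omitted here.

```python
# Minimal sketch comparing a linear regression with an MLP for predicting a
# continuous earnings-management proxy. Data are synthetic stand-ins.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6))                                   # hypothetical firm-level predictors
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + rng.normal(scale=0.1, size=500)  # nonlinear target

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "Linear regression": LinearRegression(),
    "MLP": MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    mse = mean_squared_error(y_te, model.predict(X_te))
    print(f"{name}: test MSE = {mse:.4f}")
# With a nonlinear data-generating process, the MLP's error is typically lower,
# mirroring the paper's finding that earnings management behaves nonlinearly.
```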

Keywords: earnings management, generalized linear regression, neural networks multi-layer perceptron, Tehran stock exchange

Procedia PDF Downloads 404
3585 Intra-miR-ExploreR, a Novel Bioinformatics Platform for Integrated Discovery of MiRNA:mRNA Gene Regulatory Networks

Authors: Surajit Bhattacharya, Daniel Veltri, Atit A. Patel, Daniel N. Cox

Abstract:

miRNAs have emerged as key post-transcriptional regulators of gene expression; however, identification of biologically relevant target genes for this epigenetic regulatory mechanism remains a significant challenge. To address this knowledge gap, we have developed a novel tool in R, Intra-miR-ExploreR, that facilitates integrated discovery of miRNA targets by incorporating target databases and novel target prediction algorithms, using statistical methods including Pearson and distance correlation on microarray data, to arrive at high-confidence intragenic miRNA target predictions. We have explored the efficacy of this tool using Drosophila melanogaster as a model organism for bioinformatics analyses and functional validation. A number of putative targets were obtained and validated using qRT-PCR analysis. Additional features of the tool include downloadable text files containing GO analysis from DAVID and PubMed links to literature related to gene sets. Moreover, we are constructing interaction maps of intragenic miRNAs, using both microarray and RNA-seq data, focusing on neural tissues to uncover the regulatory codes via which these molecules regulate gene expression to direct cellular development.

Keywords: miRNA, miRNA:mRNA target prediction, statistical methods, miRNA:mRNA interaction network

Procedia PDF Downloads 484
3584 An Investigation on Smartphone-Based Machine Vision System for Inspection

Authors: They Shao Peng

Abstract:

A machine vision system for inspection is an automated technology normally used to analyze items on the production line for quality control purposes; it is also known as an automated visual inspection (AVI) system. By applying automated visual inspection, the existence of items, defects, contaminants, flaws, and other irregularities in manufactured products can be detected quickly and accurately. However, AVI systems remain inflexible and expensive because they are built for a specific task and consume a lot of set-up time and space. With the rapid development of mobile devices, smartphones can be an alternative platform for the vision system that addresses these problems. Since smartphone-based AVI is still at a nascent stage, this motivated the present investigation. This study aims to provide a low-cost AVI system with high efficiency and flexibility. In this project, two object detection models, You Only Look Once (YOLO) and Single Shot MultiBox Detector (SSD), are trained, evaluated, and integrated with smartphone and webcam devices. The performance of the smartphone-based AVI is compared with the webcam-based AVI in terms of precision and inference time. Additionally, a mobile application is developed that allows users to perform real-time object detection and object detection on stored images.

Keywords: automated visual inspection, deep learning, machine vision, mobile application

Procedia PDF Downloads 105
3583 On the Fourth-Order Hybrid Beta Polynomial Kernels in Kernel Density Estimation

Authors: Benson Ade Eniola Afere

Abstract:

This paper introduces a family of fourth-order hybrid beta polynomial kernels developed for statistical analysis. The assessment of these kernels' performance centers on two critical metrics: asymptotic mean integrated squared error (AMISE) and kernel efficiency. Through the utilization of both simulated and real-world datasets, a comprehensive evaluation was conducted, facilitating a thorough comparison with conventional fourth-order polynomial kernels. The evaluation procedure encompassed the computation of AMISE and efficiency values for both the proposed hybrid kernels and the established classical kernels. The consistently observed trend was the superior performance of the hybrid kernels when compared to their classical counterparts. This trend persisted across diverse datasets, underscoring the resilience and efficacy of the hybrid approach. By leveraging these performance metrics and conducting evaluations on both simulated and real-world data, this study furnishes compelling evidence in favour of the superiority of the proposed hybrid beta polynomial kernels. The discernible enhancement in performance, as indicated by lower AMISE values and higher efficiency scores, strongly suggests that the proposed kernels offer heightened suitability for statistical analysis tasks when compared to traditional kernels.
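To make the notion of a fourth-order kernel concrete, the sketch below implements a kernel density estimator with a classical fourth-order polynomial kernel on [-1, 1]. This is only an illustration of the general technique; the hybrid beta polynomial kernels proposed in the paper, and their AMISE/efficiency computations, are not reproduced here, and the bandwidth is chosen arbitrarily.

```python
# Sketch of kernel density estimation with a classical fourth-order polynomial
# kernel, K(u) = (15/32)(3 - 10u^2 + 7u^4) on |u| <= 1 (unit integral, zero second moment).
import numpy as np

def fourth_order_kernel(u):
    u = np.asarray(u)
    inside = np.abs(u) <= 1
    return np.where(inside, (15 / 32) * (3 - 10 * u**2 + 7 * u**4), 0.0)

def kde(x_grid, data, bandwidth):
    # Average of scaled kernels centred at each observation.
    u = (x_grid[:, None] - data[None, :]) / bandwidth
    return fourth_order_kernel(u).mean(axis=1) / bandwidth

rng = np.random.default_rng(1)
data = rng.normal(loc=0.0, scale=1.0, size=400)   # simulated sample
grid = np.linspace(-4, 4, 201)
density = kde(grid, data, bandwidth=0.6)          # illustrative bandwidth
print(f"area under estimate ~ {np.trapz(density, grid):.3f}")
# Note: higher-order kernels can yield locally negative density estimates; this is
# expected and is part of the bias-variance trade-off these kernels exploit.
```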

Keywords: AMISE, efficiency, fourth-order Kernels, hybrid Kernels, Kernel density estimation

Procedia PDF Downloads 58
3582 Challenging the Theory of Mind: Autism Spectrum Disorder, Social Construction, and Biochemical Explanation

Authors: Caroline Kim

Abstract:

The designation autism spectrum disorder (ASD) groups complex disorders in the development of the brain. Autism is defined essentially as a condition in which an individual lacks a theory of mind. The theory of mind, in this sense, explains the ability of an individual to attribute feelings, emotions, or thoughts to another person. An autistic patient is characteristically unable to determine what an interlocutor is feeling, or to understand the beliefs of others. However, it is possible that autism cannot plausibly be characterized as the lack of a theory of mind in an individual. Genes, the brain, and their interplay with environmental factors may also cause autism. A mutation in a gene may be hereditary, or instigated by diseases such as mumps. Though an autistic patient may experience abnormalities in the cerebellum and the cortical regions, these are in fact only possible theories as to a biochemical explanation behind the disability. The prevailing theory identifying autism with lacking a theory of mind is supported by behavioral observation, but this form of observation is itself shaped by socially constructed standards, limiting the possibility for empirical verification. The theory-of-mind approach infers people's beliefs and emotions causally from their behavior. This paper demonstrates the fallacy of this inference, critiquing its basis in socially constructed values, and arguing instead for a biochemical approach free from the conceptual apparatus of language and social expectation.

Keywords: autism spectrum disorder, sociology of psychology, social construction, the theory of mind

Procedia PDF Downloads 384
3581 Frame to Frameless: Stereotactic Operation Progress in Robot Time

Authors: Zengmin Tian, Bin Lv, Rui Hui, Yupeng Liu, Chuan Wang, Qing Liu, Hongyu Li, Yan Qi, Li Song

Abstract:

Objective: Robots have been used to replace the stereotactic frame in recent years. This paper investigates the safety and effectiveness of frameless stereotactic surgery in the treatment of children with cerebral palsy. Methods: Clinical data of 425 children with spastic cerebral palsy were retrospectively analyzed. The patients were treated with robot-assisted frameless stereotactic surgery of nuclear mass destruction. Motor function was evaluated with the Gross Motor Function Measure-88 (GMFM-88) before the operation and at 1 week and 3 months after the operation, and statistical analysis was performed. Results: Postoperative CT showed that the destruction area covered the predetermined target in all patients. Minimal bleeding of the puncture channel occurred in 2 patients, and mild fever in 3 cases; no other severe surgical complication occurred. The GMFM-88 scores were 49.1±22.5 before the operation, and 52.8±24.2 and 64.2±21.4 at 1 week and 3 months after the operation, respectively. The difference between pre- and postoperative scores was statistically significant (P<0.01). After 3 months, the total effective rate was 98.1%, and the average improvement in motor function was 24.3%. Conclusion: Replacing the traditional frame, robot-assisted frameless stereotactic surgery is safe and reliable for children with spastic cerebral palsy and has positive significance in improving patients’ motor function.

Keywords: cerebral palsy, robotics, stereotactic techniques, frameless operation

Procedia PDF Downloads 69
3580 Effect of White Roofing on Refrigerated Buildings

Authors: Samuel Matylewicz, K. W. Goossen

Abstract:

The deployment of white or cool (high albedo) roofing is a common energy savings recommendation for a variety of buildings all over the world. Here, the effect of a white roof on the energy savings of an ice rink facility in the northeastern US is determined by measuring the effect of solar irradiance on the consumption of the rink's ice refrigeration system. The consumption of the refrigeration system was logged over a year, along with multiple weather vectors, and a statistical model was applied. The experimental model indicates that the expected savings on refrigeration consumption from replacing the existing grey roof with a white roof are only 4.7%. This overall result of the statistical model is confirmed by isolated pairs of otherwise similar weather days, one cloudy and one sunny, where there was no measurable difference in refrigeration consumption up to the noise in the local data, which was a few percent. This compares with a simple theoretical calculation that indicates 30% savings. The difference is attributed to a lack of convective cooling of the roof in the theoretical model. The best experimental model shows a relative effect of the weather vectors dry bulb temperature, solar irradiance, wind speed, and relative humidity on refrigeration consumption of 1, 0.026, 0.163, and -0.056, respectively. This result can have an impact on decisions to apply white roofing to refrigerated buildings in general.
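A minimal sketch of the kind of statistical model described above is a multiple regression of refrigeration consumption on standardized weather vectors, so that coefficients are directly comparable. The data below are synthetic and the coefficient magnitudes are assumptions; only the variable names mirror the abstract.

```python
# OLS of refrigeration consumption on standardized weather predictors (synthetic data).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 1000
df = pd.DataFrame({
    "dry_bulb_temp": rng.normal(15, 10, n),
    "solar_irradiance": rng.uniform(0, 900, n),
    "wind_speed": rng.gamma(2.0, 2.0, n),
    "relative_humidity": rng.uniform(20, 100, n),
})
# Hypothetical consumption: dominated by dry-bulb temperature, with a small solar term.
df["consumption"] = (1.0 * df.dry_bulb_temp + 0.03 * df.solar_irradiance
                     + 0.2 * df.wind_speed - 0.05 * df.relative_humidity
                     + rng.normal(0, 3, n))

predictors = df.drop(columns="consumption")
X = (predictors - predictors.mean()) / predictors.std()       # standardize for comparability
model = sm.OLS(df["consumption"], sm.add_constant(X)).fit()
print(model.params)   # standardized coefficients: relative effect of each weather vector
```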

Keywords: cool roofs, solar cooling load, refrigerated buildings, energy-efficient building envelopes

Procedia PDF Downloads 111
3579 Statistical Comparison of Ensemble Based Storm Surge Forecasting Models

Authors: Amin Salighehdar, Ziwen Ye, Mingzhe Liu, Ionut Florescu, Alan F. Blumberg

Abstract:

Storm surge is an abnormal water level caused by a storm. Accurate prediction of a storm surge is a challenging problem. Researchers have developed various ensemble modeling techniques to combine several individual forecasts to produce an overall, presumably better forecast. There exist some simple ensemble modeling techniques in the literature. For instance, Model Output Statistics (MOS) and running mean-bias removal are widely used techniques in the storm surge prediction domain. However, these methods have some drawbacks. For instance, MOS is based on multiple linear regression and it needs a long period of training data. To overcome the shortcomings of these simple methods, researchers have proposed some advanced methods. For instance, ENSURF (Ensemble SURge Forecast) is a multi-model application for sea level forecasting. This application creates a better forecast of sea level using a combination of several instances of Bayesian Model Averaging (BMA). An ensemble dressing method is based on identifying the best member forecast and using it for prediction. Our contribution in this paper can be summarized as follows. First, we investigate whether the ensemble models perform better than any single forecast. Therefore, we need to identify the single best forecast. We present a methodology based on a simple Bayesian selection method to select the best single forecast. Second, we present several new and simple ways to construct ensemble models. We use correlation and standard deviation as weights in combining different forecast models. Third, we use these ensembles and compare them with several existing models in the literature to forecast storm surge level. We then investigate whether developing a complex ensemble model is indeed needed. To achieve this goal, we use a simple average (one of the simplest and most widely used ensemble models) as a benchmark. Predicting the peak level of surge during a storm, as well as the precise time at which this peak level takes place, is crucial; thus we develop a statistical platform to compare the performance of various ensemble methods. This statistical analysis is based on the root mean square error of the ensemble forecast during the testing period and on the magnitude and timing of the forecasted peak surge compared to the actual time and peak. In this work, we analyze four hurricanes: hurricanes Irene and Lee in 2011, hurricane Sandy in 2012, and hurricane Joaquin in 2015. Since hurricane Irene developed at the end of August 2011 and hurricane Lee started just after Irene at the beginning of September 2011, in this study we consider them as a single contiguous hurricane event. The data set used for this study is generated by the New York Harbor Observing and Prediction System (NYHOPS). We find that even the simplest possible way of creating an ensemble produces results superior to any single forecast. We also show that the ensemble models we propose generally have better performance compared to the simple average ensemble technique.
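A minimal sketch of the two simplest ideas above, a plain average of member forecasts and a correlation-weighted combination scored by RMSE, is given below. Forecasts and observations are synthetic stand-ins, not NYHOPS data.

```python
# Simple-average vs. correlation-weighted ensemble of surge forecasts (synthetic data).
import numpy as np

rng = np.random.default_rng(3)
t = np.arange(200)
obs = 1.5 * np.sin(2 * np.pi * t / 50) + 0.1 * t / 200          # "observed" surge
members = np.stack([obs + rng.normal(0, s, t.size) + b           # individual model forecasts
                    for s, b in [(0.15, 0.05), (0.30, -0.10), (0.50, 0.20)]])

def rmse(pred, truth):
    return np.sqrt(np.mean((pred - truth) ** 2))

simple_avg = members.mean(axis=0)

# Correlation-based weights estimated on a training window, applied to the rest.
train = slice(0, 100)
corr = np.array([np.corrcoef(m[train], obs[train])[0, 1] for m in members])
weights = corr / corr.sum()
weighted = np.tensordot(weights, members, axes=1)

print("simple average RMSE :", round(rmse(simple_avg[100:], obs[100:]), 4))
print("corr-weighted RMSE  :", round(rmse(weighted[100:], obs[100:]), 4))
```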

Keywords: Bayesian learning, ensemble model, statistical analysis, storm surge prediction

Procedia PDF Downloads 292
3578 The Importance of Knowledge Innovation for External Audit on Anti-Corruption

Authors: Adel M. Qatawneh

Abstract:

This paper aimed to determine the importance of knowledge innovation for external audit on anti-corruption in all Jordanian banks listed on the Amman Stock Exchange (ASE). The importance of the study arises from the need, given developments in the world of business, to recognize knowledge innovation for external audit and anti-corruption; the variables expected to be affected by external audit innovation are reliability of financial data, relevance of financial data, consistency of financial data, full disclosure of financial data, and protection of the rights of investors. To achieve the objectives of the study, a questionnaire was designed and distributed to the Jordanian banks listed on the Amman Stock Exchange. The data analysis found that the banks in Jordan attach positive importance to knowledge innovation for external audit on anti-corruption and agree on its benefit. The statistical analysis showed that knowledge innovation for external audit had a positive impact on anti-corruption and that external audit has a statistically significant relationship with anti-corruption, reliability of financial data, consistency of financial data, full disclosure of financial data, and protection of the rights of investors.

Keywords: knowledge innovation, external audit, anti-corruption, Amman Stock Exchange

Procedia PDF Downloads 449
3577 The Effects of a Mathematics Remedial Program on Mathematics Success and Achievement among Beginning Mathematics Major Students: A Regression Discontinuity Analysis

Authors: Kuixi Du, Thomas J. Lipscomb

Abstract:

Proficiency in mathematics skills is fundamental to success in the STEM disciplines. In the US, beginning college students who are placed in remedial/developmental mathematics courses frequently struggle to achieve academic success. Therefore, mathematics remediation in college has become an important concern, and providing mathematics remediation is a prevalent way to help students who may not be fully prepared for college-level courses. Programs vary, however, and the effectiveness of a particular remedial mathematics program must be empirically demonstrated. The purpose of this study was to apply the sharp regression discontinuity (RD) technique to determine the effectiveness of the Jack Leaps Summer (JLS) mathematics remediation program in supporting improved mathematics learning outcomes among newly admitted mathematics students at South Dakota State University. The researchers studied the newly admitted Fall 2019 cohort of mathematics majors (n=423). The results indicated that students whose pretest score was lower than the cut-off point and who were assigned to the JLS program earned significantly higher scores on the post-test (Math 101 final score). Based on these results, there is evidence that the JLS program is effective in meeting its primary objective.
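A minimal sketch of a sharp RD analysis of this kind is shown below: treatment is assigned whenever the placement pretest falls below a cut-off, and the outcome is regressed on a treatment indicator, the centred running variable, and their interaction. The data, the cut-off value, and the assumed treatment effect are all synthetic; only the cohort size matches the abstract.

```python
# Sharp regression discontinuity sketch with simulated placement and outcome scores.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n, cutoff = 423, 60.0
pretest = rng.uniform(30, 90, n)
treated = (pretest < cutoff).astype(int)                              # sharp assignment rule
posttest = 40 + 0.5 * pretest + 8.0 * treated + rng.normal(0, 5, n)   # assumed true effect = 8

df = pd.DataFrame({"post": posttest,
                   "run": pretest - cutoff,                            # centred running variable
                   "treated": treated})
fit = smf.ols("post ~ treated + run + treated:run", data=df).fit()
print(fit.params["treated"], fit.pvalues["treated"])                   # estimated jump at the cut-off
```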

Keywords: causal inference, mathematics remedial program evaluation, quasi-experimental research design, regression discontinuity design, cohort studies

Procedia PDF Downloads 76
3576 Comparing Groundwater Fluoride Level with WHO Guidelines and Classifying At-Risk Age Groups; Based on Health Risk Assessment

Authors: Samaneh Abolli, Kamyar Yaghmaeian, Ali Arab Aradani, Mahmood Alimohammadi

Abstract:

The main route of fluoride uptake is drinking water. Fluoride absorption in the acceptable range (0.5-1.5 mg L⁻¹) is beneficial for the body, but excessive consumption can have irreversible health effects. To compare fluoride concentrations with the WHO guidelines, 112 water samples were taken from groundwater aquifers in 22 villages of Garmsar County, in the central part of Iran, during 2018 and 2019. Fluoride concentration was measured by the SPADNS method, and its non-carcinogenic impacts were calculated using the estimated daily intake (EDI) and hazard quotient (HQ). The statistical population was divided into four categories: infants, children, teenagers, and adults. Linear regression and Spearman rank correlation tests were used to investigate the relationship between well depth and fluoride concentration in the water samples. The annual mean fluoride concentrations in 2018 and 2019 were 0.75 and 0.64 mg L⁻¹, respectively, and the mean concentrations in samples from the cold and hot seasons of the studied years were 0.709 and 0.689 mg L⁻¹, respectively. The fluoride level in 27% of the samples in both years was less than the acceptable minimum (0.5 mg L⁻¹). Also, 11% of the samples in 2018 (6 samples) had fluoride levels higher than 1.5 mg L⁻¹. The HQ showed that children were the most vulnerable group; teenagers and adults were in the next ranks, respectively. Statistical tests showed an inverse and significant correlation (R² = 0.02, p < 0.0001) between well depth and fluoride content. The border between the usefulness and harmfulness of fluoride is very narrow and requires extensive studies.
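The non-carcinogenic calculation referred to above follows the standard form EDI = (C × IR) / BW and HQ = EDI / RfD. The sketch below uses the 2018 mean concentration from the abstract, but the intake rates, body weights, and fluoride reference dose are illustrative assumptions, not the values used in the study.

```python
# Illustrative EDI / HQ calculation per age group (parameter values are assumed).
fluoride_conc = 0.75          # mg/L, e.g. the 2018 annual mean from the abstract
rfd = 0.06                    # mg/kg/day, assumed reference dose for fluoride

age_groups = {                # (water intake L/day, body weight kg) -- assumed values
    "infant":   (0.8, 10),
    "children": (1.5, 25),
    "teenager": (2.0, 50),
    "adult":    (2.5, 70),
}

for group, (intake, bw) in age_groups.items():
    edi = fluoride_conc * intake / bw       # estimated daily intake, mg/kg/day
    hq = edi / rfd                          # hazard quotient; HQ > 1 flags potential risk
    print(f"{group:9s} EDI = {edi:.4f} mg/kg/day, HQ = {hq:.2f}")
```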

Keywords: fluoride, groundwater, health risk assessment, hazard quotient, Garmsar

Procedia PDF Downloads 48
3575 Applicability of Fuzzy Logic for Intrusion Detection in Mobile Adhoc Networks

Authors: Ruchi Makani, B. V. R. Reddy

Abstract:

Mobile Adhoc Networks (MANETs) are gaining popularity due to their potential for providing low-cost mobile connectivity solutions to real-world communication problems. Integrating Intrusion Detection Systems (IDS) in MANETs is a tedious task because of their distinctive features, such as dynamic topology, decentralized authority, and a highly constrained/limited resource environment. IDS primarily use automated soft-computing techniques to monitor the inflow/outflow of traffic packets in a given network to detect intrusion. The use of machine learning techniques in IDS enables the system to make decisions on intrusion while continuously learning about its dynamic environment. Selecting an appropriate IDS model is essential to address these application challenges. Thus, this paper focuses on fuzzy-logic-based machine learning IDS techniques for MANETs and presents their applicability for effectively identifying intrusions. Further, the selection of appropriate protocol attributes and the generation of fuzzy rules, which play a significant role in the accuracy of a fuzzy-logic-based IDS, are discussed. This paper also presents the critical attributes of MANET routing protocols and their applicability in fuzzy-logic-based IDS.
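A hand-rolled sketch of a Mamdani-style fuzzy inference step for anomaly-based intrusion detection is given below. The two monitored protocol attributes (packet-drop rate and route-request rate), the membership breakpoints, and the rule base are illustrative assumptions, not the attributes or rules from the paper.

```python
# Toy fuzzy inference for an anomaly score from two normalized AODV-style attributes.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0.0)

def intrusion_score(drop_rate, rreq_rate):
    # Fuzzify the two monitored attributes (both assumed normalized to [0, 1]).
    drop_high = tri(drop_rate, 0.4, 1.0, 1.6)
    rreq_high = tri(rreq_rate, 0.5, 1.0, 1.5)
    drop_low  = tri(drop_rate, -0.6, 0.0, 0.6)
    rreq_low  = tri(rreq_rate, -0.5, 0.0, 0.5)

    # Rule base (min for AND, max for OR), each rule mapped to an output level.
    rules = [
        (min(drop_high, rreq_high), 0.9),   # both high   -> likely intrusion
        (max(drop_high, rreq_high), 0.6),   # either high -> suspicious
        (min(drop_low, rreq_low),   0.1),   # both low    -> normal
    ]
    # Weighted-average (centroid-like) defuzzification.
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules) + 1e-9
    return num / den

print(intrusion_score(drop_rate=0.8, rreq_rate=0.9))    # high anomaly score
print(intrusion_score(drop_rate=0.1, rreq_rate=0.05))   # low anomaly score
```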

Keywords: AODV, mobile adhoc networks, intrusion detection, anomaly detection, fuzzy logic, fuzzy membership function, fuzzy inference system

Procedia PDF Downloads 160
3574 Optimization of the Fabrication Process for Particleboards Made from Oil Palm Fronds Blended with Empty Fruit Bunch Using Response Surface Methodology

Authors: Ghazi Faisal Najmuldeen, Wahida Amat-Fadzil, Zulkafli Hassan, Jinan B. Al-Dabbagh

Abstract:

The objective of this study was to determine the optimum fabrication process variables for producing particleboards from oil palm frond (OPF) particles and empty fruit bunch (EFB) fiber. Response surface methodology was employed to analyse the effect of hot press temperature (150–190°C), press time (3–7 minutes), and EFB blending ratio (0–40%) on the particleboards' modulus of rupture, modulus of elasticity, internal bonding, water absorption, and thickness swelling. A Box-Behnken experimental design was carried out to develop statistical models used for the optimisation of the fabrication process variables. All factors were found to have a statistically significant effect on particleboard properties. The statistical analysis indicated that all models showed a significant fit with the experimental results. The optimum particleboard properties were obtained at the optimal fabrication condition: a press temperature of 186°C, a press time of 5.7 min, and an EFB/OPF ratio of 30.4%. Incorporating oil palm fronds and empty fruit bunch fiber to produce particleboards improved the particleboard properties. The OPF-EFB particleboards fabricated at the optimized conditions satisfied the ANSI A208.1-1999 specification for general-purpose particleboards.
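A minimal sketch of the response-surface step is shown below: fit a full quadratic model to coded factors (press temperature, press time, EFB ratio) and search a grid for the settings that maximise one property, such as modulus of rupture. The design matrix and responses are synthetic placeholders, not the Box-Behnken data from the study.

```python
# Quadratic response surface fit and grid search for the optimum (synthetic data).
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(5)
X = rng.uniform(-1, 1, size=(15, 3))                    # coded factor levels
y = (10 + 2 * X[:, 0] + 1.5 * X[:, 1] + X[:, 2]
     - 3 * X[:, 0]**2 - 2 * X[:, 1]**2 - 1.5 * X[:, 2]**2
     + rng.normal(0, 0.2, 15))                          # hypothetical MOR response

poly = PolynomialFeatures(degree=2, include_bias=False)
model = LinearRegression().fit(poly.fit_transform(X), y)

# Grid search the fitted surface for the optimum within the coded region.
grid = np.array(np.meshgrid(*[np.linspace(-1, 1, 41)] * 3)).reshape(3, -1).T
pred = model.predict(poly.transform(grid))
best = grid[pred.argmax()]
print("optimum (coded temperature, time, EFB ratio):", np.round(best, 2))
```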

Keywords: empty fruit bunch fiber, oil palm fronds, particleboards, response surface methodology

Procedia PDF Downloads 206
3573 A Machine Learning Pipeline for Real-Time Activity Detection on Low Computational Power Devices for Metaverse Applications

Authors: Amit Kumar, Amanpreet Chander, Ashish Sahani

Abstract:

This paper presents our recent work on real-time human activity detection based on the MediaPipe pipeline and machine learning algorithms. The proposed system can detect human activities including running, jumping, squatting, bending to the left or right, and standing still. This is a robust solution for developing yoga, dance, metaverse, and fitness applications that check pose correctness without requiring additional monitoring, such as a personal trainer. MediaPipe offers an open-source, cross-platform solution that utilizes a two-step detector-tracker ML pipeline for live detection of key landmarks on the body, which can be used for motion data collection. The prediction of real-time poses uses a variety of machine learning techniques and different types of analysis. Without relying primarily on powerful desktop environments for inference, our method achieves real-time performance on the majority of contemporary mobile phones, desktops/laptops, in Python, or even on the web. Experimental results show that our method outperforms the existing method in terms of accuracy and real-time capability, achieving an accuracy of 99.92% on the testing datasets.

Keywords: human activity detection, media pipe, machine learning, metaverse applications

Procedia PDF Downloads 156
3572 Reed: An Approach Towards Quickly Bootstrapping Multilingual Acoustic Models

Authors: Bipasha Sen, Aditya Agarwal

Abstract:

A multilingual automatic speech recognition (ASR) system is a single entity capable of transcribing multiple languages that share a common phone space. The performance of such a system is highly dependent on the compatibility of the languages. State-of-the-art speech recognition systems are built using sequential architectures based on recurrent neural networks (RNN), limiting the computational parallelization in training. This poses a significant challenge in terms of the time taken to bootstrap and validate the compatibility of multiple languages for building a robust multilingual system. Complex architectural choices based on self-attention networks have been made to improve parallelization, thereby reducing training time. In this work, we propose Reed, a simple system based on 1D convolutions that uses a very short context to improve training time. To improve the performance of our system, we use raw time-domain speech signals directly as input. This enables the convolutional layers to learn feature representations rather than relying on handcrafted features such as MFCC. We report improvements in training and inference times by at least factors of 4x and 7.4x, respectively, with comparable WERs against standard RNN-based baseline systems on SpeechOcean's multilingual low-resource dataset.
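A rough sketch of the architectural idea, a small stack of 1D convolutions applied directly to a raw waveform that produces per-frame phone logits, is given below. The layer sizes, strides, and phone-set size are illustrative assumptions, not the Reed configuration.

```python
# Minimal 1D-convolutional acoustic model over raw audio (PyTorch).
import torch
import torch.nn as nn

class Conv1dAcousticModel(nn.Module):
    def __init__(self, n_phones=60):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(1, 64, kernel_size=80, stride=10), nn.ReLU(),  # filterbank-like features from raw audio
            nn.Conv1d(64, 128, kernel_size=7, stride=2), nn.ReLU(),
            nn.Conv1d(128, 128, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv1d(128, n_phones, kernel_size=1),                 # per-frame phone logits
        )

    def forward(self, waveform):          # waveform: (batch, 1, samples)
        return self.net(waveform)         # -> (batch, n_phones, frames)

model = Conv1dAcousticModel()
x = torch.randn(4, 1, 16000)              # one second of 16 kHz audio per item
logits = model(x)
print(logits.shape)
# Training would typically apply a CTC-style loss over these frame-level logits.
```

Because every layer is convolutional, all frames are processed in parallel, which is the source of the training-time advantage over sequential RNN layers that the abstract emphasizes.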

Keywords: convolutional neural networks, language compatibility, low resource languages, multilingual automatic speech recognition

Procedia PDF Downloads 102
3571 A Development of Creative Instruction Model through Digital Media

Authors: Kathaleeya Chanda, Panupong Chanplin, Suppara Charoenpoom

Abstract:

The purposes of the development of a creative instruction model through digital media are to: 1) enable learners to learn from an instruction media application; 2) help learners implement instruction media correctly and appropriately; and 3) facilitate learners in applying technology to search for information and practice skills so as to implement technology creatively. The sample group consists of 130 secondary students studying at Bo Kluea School, Bo Kluea Nuea Sub-district, Bo Kluea District, Nan Province, selected through simple random sampling. The statistics used in this research are percentage, mean, and standard deviation, within a one-group pretest-posttest design. The findings are summarized as follows: the congruence index of the instruction media for the occupation and technology subjects is appropriate. Comparing learning achievements before and after implementing the instruction media, the posttest achievements (mean 26.28) are higher than the pretest achievements (mean 16.24), with statistical significance at the .05 level. This can be interpreted as the learners achieving better learning progress.

Keywords: teaching learning model, digital media, creative instruction model, Bo Kluea school

Procedia PDF Downloads 126
3570 Presenting a Model Of Empowering New Knowledge-based Companies In Iran Insurance Industry

Authors: Pedram Saadati, Zahra Nazari

Abstract:

In the last decade, the role and importance of knowledge-based technological businesses in the insurance industry have greatly increased, and given the weakness of previous studies in Iran, the current research deals with the design of an InsurTech empowerment model. A hybrid framework was used to obtain the conceptual model of the research. The statistical population in the qualitative part consisted of experts, and in the quantitative part, of InsurTech practitioners. The data collection tools in the qualitative part were in-depth and semi-structured interviews and a structured self-interaction matrix, and in the quantitative part, a researcher-made questionnaire. In the qualitative part, 55 indicators, 20 components, and 8 concepts (dimensions) were obtained using the content analysis method; the relationships among the concepts and the levels of the components were then investigated. In the quantitative part, the data were analyzed using descriptive-analytical methods, namely path analysis and confirmatory factor analysis. The proposed model consists of eight dimensions: supporter capability, supervision of the insurance innovation ecosystem, and managerial, financial, technological, marketing, opportunity-identification, and innovative InsurTech capabilities. The results of the statistical tests identifying the relationships among the concepts are examined in detail, and suggestions are presented in the conclusion section.

Keywords: insurTech, knowledge-base, empowerment model, factor analysis, insurance

Procedia PDF Downloads 25
3569 Lipid Emulsion versus DigiFab in a Rat Model of Acute Digoxin Toxicity

Authors: Cansu Arslan Turan, Tuba Cimilli Ozturk, Ebru Unal Akoglu, Kemal Aygun, Ecem Deniz Kırkpantur, Ozge Ecmel Onur

Abstract:

Although its mechanism of action is not well known, Intravenous Lipid Emulsion (ILE) has been shown to be effective in the treatment of lipophilic drug intoxications. It is thought that ILE probably separates lipophilic drugs from the target tissue by creating a lipid-rich compartment in the plasma. A second theory is that ILE provides energy to the myocardium through high-dose free fatty acids, activating the voltage-gated calcium channels in the myocytes. In this study, the effects of ILE treatment on digoxin overdose, which is frequently observed in emergency departments, were examined in an animal model in terms of cardiac side effects and survival. The study was carried out at the Yeditepe University Faculty of Medicine Experimental Animals Research Center labs in December 2015. Forty Sprague-Dawley rats weighing 300-400 g were randomly divided into 5 groups. As pre-treatment, the first group received saline, the second group received lipid, the third group received DigiFab, and the fourth group received DigiFab and lipid. Digoxin was then infused into all groups except the control group until death. Times to first arrhythmia and to cardiac arrest were recorded. As no arrhythmia-inducing medication was infused into Group 5, it was excluded from the statistical analysis comparing time to first arrhythmia and time to death. Although the statistical analysis comparing the four groups showed no significant difference, when the rats exposed only to digoxin intoxication were compared with the rats pre-treated with ILE in terms of time to first arrhythmia and time to cardiac arrest, a significant difference was observed between the groups. According to our results, using DigiFab, intralipid, or intralipid plus DigiFab as treatment for rats exposed to digoxin intoxication makes no significant difference in terms of time to first arrhythmia and death. However, it is also not possible to conclude that, at the doses used in this study, ILE treatment is at least as successful as the known antidote. Since the statistical significance found between two groups was not observed in the comparisons across all groups, the study should be repeated with larger groups.

Keywords: arrhythmia, cardiac arrest, DigiFab, digoxin intoxication

Procedia PDF Downloads 216
3568 Analysis of Energy Efficiency Behavior with the Use of Train Dynamics Simulator and Statistical Tools: Case Study of Vitoria Minas Railway, Brazil

Authors: Eric Wilson Santos Cabral, Marta Monteiro Da Costa Cruz, Fabio Luis Maciel Machado, Henrique Andrade, Rodrigo Pirola Pestana, Vivian Andrea Parreira

Abstract:

The large variation in the price of diesel in Brazil directly affects the variable cost of companies operating in the transportation sector. In rail transport, the great challenge is to meet the annual budget for cargo and ore transported while reducing costs in relation to previous years, becoming more efficient every year. Effective measures are necessary to reduce the ratio of liters of diesel consumed per KTKB (thousand gross ton-kilometers), the energy efficiency indicator used by several railroads around the world. This study is divided into two parts. The first uses statistical tools to identify which of the variables controlled on the railway correlate with the energy efficiency indicator, in order to aid decision-making. The second uses a train dynamics simulator, within scenarios defined by the operational reality of a railroad, to optimize train formations and the train stop model for the change of train drivers. With the completion of the study, companies in the rail sector are expected to be able to reduce some of their transportation costs.

Keywords: railway transport, railway simulation, energy efficiency, fuel consumption

Procedia PDF Downloads 316
3567 Modeling of Surface Roughness in Hard Turning of DIN 1.2210 Cold Work Tool Steel with Ceramic Tools

Authors: Mehmet Erdi Korkmaz, Mustafa Günay

Abstract:

Nowadays, grinding is frequently replaced with hard turning to reduce set-up time and achieve higher accuracy. This paper focuses on mathematical modeling of the average surface roughness (Ra) in hard turning of AISI L2 grade (DIN 1.2210) cold work tool steel with ceramic tools. The steel was hardened to 60±1 HRC in the heat treatment process. Cutting speed, feed rate, depth of cut, and tool nose radius were chosen as the cutting conditions. Uncoated ceramic cutting tools were used in the machining experiments, which were performed according to a Taguchi L27 orthogonal array on a CNC lathe. Ra values were calculated by averaging three roughness values obtained from three different points on the machined surface. The influence of the cutting conditions on surface roughness was evaluated both statistically and experimentally. Analysis of variance (ANOVA) at a 95% confidence level was applied for the statistical analysis of the experimental results. Finally, mathematical models were developed using artificial neural networks (ANN). The ANOVA results show that feed rate is the dominant factor affecting surface roughness, followed by tool nose radius and cutting speed.

Keywords: ANN, hard turning, DIN 1.2210, surface roughness, Taguchi method

Procedia PDF Downloads 351
3566 Predictive Analytics for Theory Building

Authors: Ho-Won Jung, Donghun Lee, Hyung-Jin Kim

Abstract:

Predictive analytics (data analysis) uses a subset of measurements (the features, predictors, or independent variables) to predict another measurement (the outcome, target, or dependent variable) for a single person or unit. It applies empirical methods from statistics, operations research, and machine learning to predict future or otherwise unknown events or outcomes for a single person or unit, based on patterns in data. Most analyses of metabolic syndrome are not predictive analytics but statistical explanatory studies that build a proposed model (theory building) and then validate the hypothesized metabolic syndrome predictors (theory testing). A proposed theoretical model is formed with causal hypotheses that specify how and why certain empirical phenomena occur. Predictive analytics and explanatory modeling have their own territories in analysis. However, predictive analytics can perform vital roles in explanatory studies, i.e., scientific activities such as theory building, theory testing, and relevance assessment. In this context, this study demonstrates how to use our predictive analytics to support theory building (i.e., hypothesis generation). For this purpose, the study utilized a big-data predictive analytics platform based on a co-occurrence graph. The co-occurrence graph is depicted with nodes (e.g., items in a basket) and arcs (direct connections between two nodes), where items in a basket are fully connected. A cluster is a collection of fully connected items, where the specific group of items has co-occurred in several rows of a data set. Clusters can be ranked using importance metrics such as node size (number of items), frequency, and surprise (observed frequency vs. expected), among others. The size of a graph can be represented by the numbers of nodes and arcs. Since the size of a co-occurrence graph does not depend directly on the number of observations (transactions), huge amounts of transactions can be represented and processed efficiently. For a demonstration, a total of 13,254 metabolic syndrome training observations are plugged into the analytics platform to generate rules (potential hypotheses). Each observation includes 31 predictors, for example, associated with sociodemographics, habits, and activities. Some are intentionally included to gain predictive analytics insights on variable selection, such as cancer examination, house type, and vaccination. The platform automatically generates plausible hypotheses (rules) without statistical modeling. The rules are then validated with an external testing dataset of 4,090 observations. The results, as a kind of inductive reasoning, show potential hypotheses extracted as a set of association rules. Most statistical models generate just one estimated equation. On the other hand, a set of rules (many estimated equations from a statistical perspective) in this study may imply heterogeneity in the population (i.e., different subpopulations with unique features are aggregated). The next step of theory development, i.e., theory testing, statistically tests whether a proposed theoretical model is a plausible explanation of the phenomenon of interest. If the generated hypotheses are tested statistically with several thousand observations, most of the variables will become significant as the p-values approach zero. Thus, theory validation needs statistical methods that utilize only part of the observations, such as bootstrap resampling with an appropriate sample size.
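A toy sketch of the co-occurrence idea is shown below: count how often item pairs appear together across baskets, compare with the count expected under independence, and rank pairs by that "surprise" ratio. The transactions are made up, and the actual platform used in the study is not reproduced here.

```python
# Pairwise co-occurrence counting with a simple observed/expected "surprise" metric.
from itertools import combinations
from collections import Counter

baskets = [
    {"smoker", "high_bp", "no_exercise"},
    {"smoker", "high_bp"},
    {"high_bp", "no_exercise", "vaccinated"},
    {"smoker", "no_exercise"},
    {"vaccinated", "exercise"},
]
n = len(baskets)

item_counts = Counter(item for b in baskets for item in b)
pair_counts = Counter(pair for b in baskets for pair in combinations(sorted(b), 2))

for (a, b), observed in pair_counts.most_common():
    expected = item_counts[a] * item_counts[b] / n      # expected co-occurrences under independence
    surprise = observed / expected
    print(f"{a} & {b}: observed={observed}, expected={expected:.2f}, surprise={surprise:.2f}")
# Pairs with surprise well above 1 are candidate hypotheses for later statistical testing.
```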

Keywords: explanatory modeling, metabolic syndrome, predictive analytics, theory building

Procedia PDF Downloads 253
3565 Attitudinal Change: A Major Therapy for Non–Technical Losses in the Nigerian Power Sector

Authors: Fina O. Faithpraise, Effiong O. Obisung, Azele E. Peter, Chris R. Chatwin

Abstract:

This study investigates and identifies consumer attitude as a major influence that results in non-technical losses in the Nigerian electricity supply sector. This finding is revealed by combining quantitative and qualitative research in a survey. The dataset is a simple random sample of households using electricity (public power supply), with the number of units chosen based on statistical power analysis. The units were subdivided into two categories: households with and without electricity meters. The hypothesis formulated was tested and analyzed using the chi-square statistical method. The results obtained show that the computed statistic for households with a prepaid electricity meter (EPM) exceeded the critical value (9.488 < 427.4), as did that for households without a prepaid meter (EPMn) (9.488 < 436.1), with a p-value of 0.01%. The analysis establishes that the wrong attitude towards handling the electricity supplied (not turning off light bulbs and electrical appliances when not in use, indoors and outdoors, within 12 hours of the day) characterizes the non-technical losses in the power sector. Therefore, the adoption of efficient lighting attitudes in individual households, as recommended by the researchers, is greatly encouraged. The results of this study should serve as a model for energy efficiency and for improving electricity consumption as well as supporting a stable economy.
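A minimal sketch of the chi-square test used above is given below: observed response frequencies are compared with the frequencies expected under the null hypothesis, here taken as a uniform split over a five-point scale, which is an assumption for illustration. With 4 degrees of freedom the 5% critical value is 9.488, as quoted in the abstract; the counts themselves are hypothetical.

```python
# Chi-square goodness-of-fit test against a uniform expectation (illustrative counts).
from scipy.stats import chisquare, chi2

observed = [120, 95, 40, 25, 20]           # hypothetical counts for one survey item
stat, p_value = chisquare(observed)        # expected defaults to a uniform distribution
critical = chi2.ppf(0.95, df=len(observed) - 1)
print(f"chi2 = {stat:.1f}, critical value = {critical:.3f}, p = {p_value:.4g}")
# stat > critical  ->  reject the null hypothesis of no attitudinal effect
```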

Keywords: attitudinal change, household, non-technical losses, prepaid meter

Procedia PDF Downloads 164
3564 Statistical Analysis of Parameters Effects on Maximum Strain and Torsion Angle of FRP Honeycomb Sandwich Panels Subjected to Torsion

Authors: Mehdi Modabberifar, Milad Roodi, Ehsan Souri

Abstract:

In recent years, honeycomb fiber-reinforced plastic (FRP) sandwich panels have been increasingly used in various industries. Low weight, low price, and high mechanical strength are the benefits of these structures. However, their mechanical properties and behavior have not been fully explored. The objective of this study is to conduct a combined numerical-statistical investigation of honeycomb FRP sandwich beams subjected to torsion load. In this paper, the effect of the geometric parameters of the sandwich panel on the maximum shear strain in both the face and the core, and on the angle of torsion, of honeycomb FRP sandwich structures in torsion is investigated. The effects of parameters including core thickness, face skin thickness, cell shape, cell size, and cell thickness on the mechanical behavior of the structure were numerically investigated. The main effects of the factors were considered and regression equations were derived. The Taguchi method was employed as the experimental design, and an optimum parameter combination for maximum structure stiffness was obtained. The results showed that cell size and face skin thickness have the most significant impact on the torsion angle and on the maximum shear strain in the face and core.

Keywords: finite element, honeycomb FRP sandwich panel, torsion, civil engineering

Procedia PDF Downloads 400
3563 Music Reading Expertise Facilitates Implicit Statistical Learning of Sentence Structures in a Novel Language: Evidence from Eye Movement Behavior

Authors: Sara T. K. Li, Belinda H. J. Chung, Jeffery C. N. Yip, Janet H. Hsiao

Abstract:

Music notation and text reading both involve statistical learning of music or linguistic structures. However, it remains unclear how music reading expertise influences text reading behavior. The present study examined this issue through an eye-tracking study. Chinese-English bilingual musicians and non-musicians read English sentences, Chinese sentences, musical phrases, and sentences in Tibetan, a language novel to the participants, with their eye movements recorded. Each set of stimuli consisted of two conditions in terms of structural regularity: syntactically correct and syntactically incorrect musical phrases/sentences. They then completed a sentence comprehension task (for syntactically correct sentences) or a musical segment/word recognition task afterwards to test their comprehension/recognition abilities. The results showed that in reading musical phrases, as compared with non-musicians, musicians had higher accuracy in the recognition task, and had shorter reading time, fewer fixations, and shorter fixation duration when reading syntactically correct (i.e., in diatonic key) than incorrect (i.e., in non-diatonic key/atonal) musical phrases. This result reflects their expertise in music reading. Interestingly, in reading Tibetan sentences, which were novel to both participant groups, while non-musicians did not show any behavioral differences between reading syntactically correct and incorrect Tibetan sentences, musicians showed a shorter reading time and had marginally fewer fixations when reading syntactically correct sentences than syntactically incorrect ones. However, none of the musicians reported discovering any structural regularities in the Tibetan stimuli after the experiment when asked explicitly, suggesting that they may have implicitly acquired the structural regularities in Tibetan sentences. This group difference was not observed when they read English or Chinese sentences. This result suggests that music reading expertise facilitates reading texts in a novel language (i.e., Tibetan), but not in languages that the readers are already familiar with (i.e., English and Chinese). This phenomenon may be due to the similarities between reading music notation and reading texts in a novel language, as in both cases the stimuli follow particular statistical structures but do not involve semantic or lexical processing. Thus, musicians may transfer their statistical learning skills stemming from music notation reading experience to implicitly discover structures of sentences in a novel language. This speculation is consistent with a recent finding showing that music reading expertise modulates the processing of English nonwords (i.e., words that do not follow morphological or orthographic rules) but not pseudo- or real words. These results suggest that the modulation of music reading expertise on language processing depends on the similarities in the cognitive processes involved. It also has important implications for the benefits of music education on language and cognitive development.

Keywords: eye movement behavior, eye-tracking, music reading expertise, sentence reading, structural regularity, visual processing

Procedia PDF Downloads 360
3562 The Impact of Public Open Space System on Housing Price in Chicago

Authors: Si Chen, Le Zhang, Xian He

Abstract:

This research explored the influence of the public open space system on housing prices through hedonic models, in order to support better open space plans and economic policies. We had three initial hypotheses: 1) the public open space system has an overall positive influence on surrounding housing prices; 2) different public open space types have different levels of influence in raising surrounding housing prices; and 3) walking and driving accessibility from a property to public open spaces have different statistical relationships with housing prices. Cook County, Illinois, was chosen as the study area because of data availability, a sufficient range of open space types, and long-term open space preservation strategies. We considered housing attributes, driving and walking accessibility scores from houses to nearby public open spaces, and driving accessibility scores to hospitals as explanatory features, and used actual housing sale prices in 2010 as the dependent variable in the hedonic model. Through ordinary least squares (OLS) regression analysis, global Moran's I analysis, and geographically weighted regression analysis, we observed the statistical relationships between public open spaces and housing sale prices in the three hedonic models and confirmed all three hypotheses.

Keywords: hedonic model, public open space, housing sale price, regression analysis, accessibility score

Procedia PDF Downloads 113
3561 Comparative Study to Evaluate Chronological Age and Dental Age in North Indian Population Using Cameriere Method

Authors: Ranjitkumar Patil

Abstract:

Age estimation is important in forensic dentistry. Dental age estimation has emerged as an alternative to skeletal age determination, and methods based on the stages of tooth formation, as seen on radiographs, seem more appropriate for assessing age than those based on skeletal development. This study was done to evaluate dental age in a north Indian population using Cameriere's method. Aims/Objectives: The study was conducted to assess the dental age of north Indian children using Cameriere's method and to compare chronological age with dental age in order to validate Cameriere's method in the north Indian population. A comparative study of two years' duration was carried out on the OPG data (acquired using a PLANMECA Promax 3D unit) of 497 individuals aged 5 to 15 years, selected by a simple random technique, with ethical approval obtained from the institutional ethical committee. The data, obtained according to inclusion and exclusion criteria, were analyzed with software for dental age estimation. Statistical analysis: Student's t-test was used to compare the morphological variables of males with those of females and to compare observed age with estimated age. A regression formula was also calculated. Results: The present study was a comparative study of 497 subjects, distributed between males and females, whose dental age was assessed from panoramic radiographs following the widely accepted method described by Cameriere. Statistical analysis indicated that gender does not have a significant influence on age estimation (R² = 0.787). Conclusion: This implies that Cameriere's method can be effectively applied in the north Indian population.

Keywords: Forensic, Chronological Age, Dental Age, Skeletal Age

Procedia PDF Downloads 75
3560 Statistical Analysis of Extreme Flow (Regions of Chlef)

Authors: Bouthiba Amina

Abstract:

The estimation of statistics related to precipitation is a vast domain, which poses numerous challenges to meteorologists and hydrologists. It is sometimes necessary to approximate extreme events, and their return periods, for sites where there is little or no data. The search for a model of the frequency of daily rainfall depths is of great importance in operational hydrology: it establishes a basis for predicting the frequency and intensity of floods by estimating the amount of precipitation in past years. The best-known and most common approach is the statistical one: it consists of looking for a probability law that best fits the values observed for the random variable "daily maximum rainfall", after comparing various probability laws and estimation methods by means of goodness-of-fit tests. Therefore, a frequency analysis of the annual series of daily maximum rainfall was carried out on the data of 54 rainfall stations in the upper and middle Chlef basin, over the period from 1970 to 2013, and was used to forecast quantiles. The analysis considered five laws usually applied to the study of maximum daily rainfall: the generalized extreme value law with three parameters, the two-parameter extreme value laws (Gumbel and log-normal), and the three-parameter Pearson type III and Log-Pearson III laws. In Algeria, Gumbel's law has been used for a long time to estimate the quantiles of maximum flows; here we check this choice and select the most reliable law.
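A minimal sketch of the frequency-analysis step described above is given below: fit a Gumbel law to a series of annual maximum daily rainfalls and read off quantiles for chosen return periods. The rainfall series is simulated, not station data, and only Gumbel fitting is shown; comparing candidate laws with goodness-of-fit tests is left out.

```python
# Gumbel fit of annual maxima and return-period quantiles (simulated series, 1970-2013 length).
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
annual_max = stats.gumbel_r.rvs(loc=40.0, scale=12.0, size=44, random_state=rng)  # mm

loc, scale = stats.gumbel_r.fit(annual_max)        # maximum-likelihood fit
for T in (10, 50, 100):                            # return periods in years
    quantile = stats.gumbel_r.ppf(1 - 1 / T, loc, scale)
    print(f"T = {T:3d} yr  ->  estimated daily maximum ~ {quantile:.1f} mm")
```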

Keywords: return period, extreme flow, statistics laws, Gumbel, estimation

Procedia PDF Downloads 58
3559 Experimental Design for Formulation Optimization of Nanoparticle of Cilnidipine

Authors: Arti Bagada, Kantilal Vadalia, Mihir Raval

Abstract:

Cilnidipine is practically insoluble in water, which results in insufficient oral bioavailability. The purpose of the present investigation was to formulate cilnidipine nanoparticles by the nanoprecipitation method to increase the aqueous solubility and dissolution rate, and hence the bioavailability, using statistical experimental design. Experimental designs were used to investigate the specific effects of the independent variables during the preparation of cilnidipine nanoparticles and the corresponding responses in optimizing the formulation. A Plackett-Burman design for the independent variables was successfully employed for the optimization of cilnidipine nanoparticles. The independent variables studied were drug concentration, solvent-to-antisolvent ratio, polymer concentration, stabilizer concentration, and stirring speed; the dependent variables were the average particle size, polydispersity index, zeta potential value, and saturation solubility of the formulated nanoparticles. The experiments were carried out in 13 runs involving 5 independent variables (at higher and lower levels) according to the Plackett-Burman design. The optimized cilnidipine nanoparticles showed an average particle size of 149 nm, a polydispersity index of 0.314, a zeta potential value of 43.24, and a saturation solubility of 0.0379 mg/ml. The experimental results correlated well with the data predicted by the Plackett-Burman statistical method.

Keywords: dissolution enhancement, nanoparticles, Plackett-Burman design, nanoprecipitation

Procedia PDF Downloads 145
3558 Antibacterial Evaluation, in Silico ADME and QSAR Studies of Some Benzimidazole Derivatives

Authors: Strahinja Kovačević, Lidija Jevrić, Miloš Kuzmanović, Sanja Podunavac-Kuzmanović

Abstract:

In this paper, various derivatives of benzimidazole were evaluated against the Gram-negative bacterium Escherichia coli, and the minimum inhibitory concentration (MIC) was determined for all investigated compounds. Quantitative structure-activity relationship (QSAR) analysis attempts to find consistent relationships between variations in the values of molecular properties and the biological activity for a series of compounds, so that these rules can be used to evaluate new chemical entities. The correlation between MIC and several absorption, distribution, metabolism, and excretion (ADME) parameters was investigated, and mathematical models for predicting the antibacterial activity of this class of compounds were developed. The quality of the multiple linear regression (MLR) models was validated by the leave-one-out (LOO) technique, as well as by the calculation of statistical parameters for the developed models, and the results are discussed on the basis of the statistical data. The results of this study indicate that ADME parameters have a significant effect on the antibacterial activity of this class of compounds. Principal component analysis (PCA) and agglomerative hierarchical clustering (HCA) confirmed that the investigated molecules can be classified into groups on the basis of the ADME parameters: Madin-Darby canine kidney cell permeability (MDCK), plasma protein binding (PPB%), human intestinal absorption (HIA%), and human colon carcinoma (Caco-2) cell permeability.
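A minimal sketch of this QSAR workflow is shown below: a multiple linear regression of activity on ADME descriptors, validated with leave-one-out cross-validation. The descriptor values, activities, and number of compounds are random placeholders; only the descriptor names echo the abstract.

```python
# MLR QSAR model with leave-one-out validation (synthetic descriptors and activities).
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from sklearn.metrics import r2_score

rng = np.random.default_rng(8)
n = 24                                               # assumed number of benzimidazole derivatives
X = rng.normal(size=(n, 4))                          # MDCK, PPB%, HIA%, Caco-2 (standardized, synthetic)
y = 1.2 * X[:, 0] - 0.8 * X[:, 2] + rng.normal(0, 0.3, n)   # hypothetical activity (e.g., log 1/MIC)

model = LinearRegression().fit(X, y)
print("fitted R^2         :", round(r2_score(y, model.predict(X)), 3))

loo_pred = cross_val_predict(LinearRegression(), X, y, cv=LeaveOneOut())
print("LOO predictive R^2 :", round(r2_score(y, loo_pred), 3))
# A large gap between the fitted and LOO statistics signals an over-fitted QSAR model.
```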

Keywords: benzimidazoles, QSAR, ADME, in silico

Procedia PDF Downloads 358