Search results for: elaboration likelihood model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 17321

17171 Exploring the Applications of Neural Networks in the Adaptive Learning Environment

Authors: Baladitya Swaika, Rahul Khatry

Abstract:

Computer Adaptive Tests (CATs) are one of the most efficient ways of testing the cognitive abilities of students. CATs are based on Item Response Theory (IRT), in which item selection relies on maximum-information selection (or selection from the posterior) and ability estimation on maximum-likelihood (ML) or maximum a posteriori (MAP) estimators. This study aims at combining the classical and Bayesian approaches to IRT to create a dataset which is then fed to a neural network that automates the process of ability estimation, and at comparing this network to traditional CAT models designed using IRT. The study uses Python as the base coding language, pymc for statistical modelling of the IRT and scikit-learn for the neural network implementation. On creation of the model and on comparison, it is found that the neural-network-based model performs 7-10% worse than the IRT model for score estimation. Although it performs worse than the IRT model, the neural network model can be beneficially used in back-ends for reducing time complexity: the IRT model has to re-calculate the ability every time it receives a request, whereas the prediction from a neural network can be done in a single step for an existing trained regressor. This study also proposes a new kind of framework whereby the neural network model could incorporate feature sets other than the normal IRT feature set and use a neural network's capacity for learning unknown functions to give rise to better CAT models. Categorical features, such as test type, could be learnt and incorporated in IRT functions with the help of techniques like logistic regression, and could be used to learn functions and express models that are not trivial to express via equations. Such a framework, when implemented, would be highly advantageous in psychometrics and cognitive assessment. This study gives a brief overview of how neural networks can be used in adaptive testing, not only by reducing time complexity but also by being able to incorporate newer and better datasets, which would eventually lead to higher-quality testing.
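
The pipeline described above can be illustrated with a short, hypothetical sketch: a 2PL response model with simulated item parameters is assumed, MAP ability estimates serve as the IRT targets, and a scikit-learn regressor learns to map response patterns to abilities in a single prediction step. All parameter values and the network architecture below are illustrative assumptions, not the authors' code.

```python
# Hedged sketch (not the authors' code): simulate 2PL IRT responses, estimate
# ability by MAP with a standard-normal prior, then train a neural regressor
# that maps response patterns directly to ability estimates.
import numpy as np
from scipy.optimize import minimize_scalar
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n_items, n_persons = 30, 2000
a = rng.uniform(0.5, 2.0, n_items)      # discrimination parameters (assumed)
b = rng.normal(0.0, 1.0, n_items)       # difficulty parameters (assumed)
theta_true = rng.normal(0.0, 1.0, n_persons)

def p_correct(theta):
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

X = (rng.random((n_persons, n_items)) < p_correct(theta_true[:, None])).astype(float)

def map_ability(responses):
    # negative log-posterior for the 2PL model with a N(0, 1) prior on ability
    def neg_log_post(theta):
        p = p_correct(theta)
        ll = np.sum(responses * np.log(p) + (1 - responses) * np.log(1 - p))
        return -(ll - 0.5 * theta**2)
    return minimize_scalar(neg_log_post, bounds=(-4, 4), method="bounded").x

y = np.array([map_ability(row) for row in X])            # "IRT" ability targets
net = MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=1000, random_state=0)
net.fit(X[:1500], y[:1500])                               # train once ...
print("single-step predictions:", net.predict(X[1500:1505]))  # ... then predict cheaply
```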

Keywords: computer adaptive tests, item response theory, machine learning, neural networks

Procedia PDF Downloads 175
17170 Downside Risk Analysis of the Nigerian Stock Market: A Value at Risk Approach

Authors: Godwin Chigozie Okpara

Abstract:

This paper estimates Value at Risk (VaR) using standard GARCH, EGARCH, and TARCH model variants fitted to a day-of-the-week return series (246 days) from the Nigerian stock market. The asymmetric return distribution and fat-tail phenomenon in financial time series were accounted for by estimating the models with normal, Student-t and generalized error distributions. The analysis, based on the Akaike Information Criterion, suggests that the EGARCH model with Student-t innovations furnishes the most accurate estimate of VaR. In light of this, we apply the Kupiec likelihood ratio test of proportional failure rates to the VaR derived from the EGARCH model in order to assess its performance for short and long positions. The results show that, as alpha ranges from 0.05 to 0.005, the failure rate for short positions significantly exceeds the prescribed quantiles, whereas for long positions there is no significant difference between the failure rate and the prescribed quantiles. This suggests that investors and portfolio managers in the Nigerian stock market can hold long trading positions, buying assets while remaining attentive to when asset prices will fall. Specifically, the VaR estimates for the long position range from -4.7% at the 95 percent confidence level to -10.3% at the 99.5 percent confidence level.
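
As a rough illustration of the back-testing step, the sketch below fits an EGARCH(1,1) model with Student-t innovations to a placeholder return series and applies the Kupiec proportion-of-failures likelihood-ratio test to the one-day 95% long-position (lower-tail) VaR. It assumes the third-party `arch` package is available; the data, model order and thresholds are illustrative choices, not the paper's setup.

```python
# Illustrative sketch (assumes the `arch` package): fit EGARCH(1,1) with Student-t
# innovations and back-test the one-day 95% VaR with the Kupiec POF likelihood-ratio test.
import numpy as np
from scipy import stats
from arch import arch_model

rng = np.random.default_rng(1)
returns = rng.standard_t(df=5, size=500)                  # placeholder daily return series

res = arch_model(returns, vol="EGARCH", p=1, o=1, q=1, dist="t").fit(disp="off")
nu = res.params["nu"]
sigma = res.conditional_volatility
alpha = 0.05
# quantile of a standardized (unit-variance) Student-t times the conditional volatility
var_95 = res.params["mu"] + sigma * stats.t.ppf(alpha, nu) * np.sqrt((nu - 2) / nu)

violations = int((returns < var_95).sum())                # long-position failures
n = len(returns)
pi_hat = violations / n

# Kupiec POF statistic: -2 log[ L(alpha) / L(pi_hat) ] ~ chi2(1) under H0
lr_pof = -2 * (violations * np.log(alpha) + (n - violations) * np.log(1 - alpha)
               - violations * np.log(pi_hat) - (n - violations) * np.log(1 - pi_hat))
print("failures:", violations, "LR_POF:", lr_pof, "p-value:", 1 - stats.chi2.cdf(lr_pof, 1))
```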

Keywords: downside risk, value-at-risk, failure rate, kupiec LR tests, GARCH models

Procedia PDF Downloads 443
17169 Revealing the Risks of Obstructive Sleep Apnea

Authors: Oyuntsetseg Sandag, Lkhagvadorj Khosbayar, Naidansuren Tsendeekhuu, Densenbal Dansran, Bandi Solongo

Abstract:

Introduction: Obstructive sleep apnea (OSA) is a common disorder affecting at least 2% to 4% of the adult population. It is estimated that nearly 80% of men and 93% of women with moderate to severe sleep apnea are undiagnosed. A number of screening questionnaires and clinical screening models have been developed to help identify patients with OSA and are needed in clinical practice. Purpose of study: To determine the association between OSA risk severity and risk factors. Material and Methods: A cross-sectional study included 114 patients presenting from the Central State Third Hospital and the Central State First Hospital. Patients who had obstructive sleep apnea (OSA) were selected for this study. The standard Stop-Bang questionnaire was obtained from all patients. According to their responses to the Stop-Bang questionnaire, patients were divided into low risk, intermediate risk, and high risk groups. Descriptive statistics were presented as mean ± standard deviation (SD). Each questionnaire was compared on the likelihood ratio for a positive result and the likelihood ratio for a negative result, and regression was applied. Statistical analyses were performed using SPSS 16. Results: 114 patients were included (mean age 48 ± 16 years; 57 male) and divided into low risk 54 (47.4%), intermediate risk 33 (28.9%), and high risk 27 (23.7%) groups. Across increasing risk groups, mean age (38 ± 13 vs. 54 ± 14 vs. 59 ± 10, p < 0.05), blood pressure (115 ± 18 vs. 133 ± 19 vs. 142 ± 21, p < 0.05), BMI (median 24, IQR 22-26 vs. 24, IQR 22-29 vs. 28, IQR 25-34, p < 0.001), and neck circumference (35 ± 3.4 vs. 38 ± 4.7 vs. 41 ± 4.4, p < 0.05) increased significantly. Results from multiple logistic regression showed that age is a significant independent factor for OSA (odds ratio 1.07, 95% CI 1.02-1.23, p < 0.01). The predictive value of age was significantly high for OSA (AUC = 0.833, 95% CI 0.758-0.909, p < 0.001). Our study shows that the risk of OSA begins at 47 years of age (sensitivity 78.3%, specificity 74.1%). Conclusions: Most patients' responses indicated intermediate or high risk. As age, blood pressure, neck circumference and BMI increased, the risk of OSA increased. Age in particular is an independent factor with the highest significance for OSA: each additional year of age increases the likelihood of OSA risk by 1.1 times.
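
A minimal sketch of the kind of analysis reported above is shown below: a multiple logistic regression on age, BMI, neck circumference and blood pressure, followed by an ROC curve to select an age cut-off. The data and coefficient values are simulated assumptions, not the study data.

```python
# Hedged sketch of the analysis pipeline (hypothetical data): multiple logistic
# regression for independent predictors, plus an ROC curve to pick an age cut-off.
import numpy as np
import statsmodels.api as sm
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(2)
n = 114
age = rng.normal(48, 16, n)
bmi = rng.normal(26, 4, n)
neck = rng.normal(38, 4, n)
sbp = rng.normal(130, 20, n)
logit = -10.5 + 0.07 * age + 0.05 * bmi + 0.1 * neck + 0.01 * sbp   # assumed effects
high_risk = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = sm.add_constant(np.column_stack([age, bmi, neck, sbp]))
fit = sm.Logit(high_risk, X).fit(disp=0)
print("odds ratios (age, BMI, neck, SBP):", np.exp(fit.params[1:]))

auc = roc_auc_score(high_risk, age)
fpr, tpr, thr = roc_curve(high_risk, age)
best = np.argmax(tpr - fpr)                   # Youden index to choose the cut-off
print("AUC for age:", auc, "age cut-off:", thr[best])
```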

Keywords: obstructive sleep apnea, Stop-Bang, BMI (Body Mass Index), blood pressure

Procedia PDF Downloads 310
17168 Prediction Factor of Recurrence Supraventricular Tachycardia After Adenosine Treatment in the Emergency Department

Authors: Chaiyaporn Yuksen

Abstract:

Background: Supraventricular tachycardia (SVT) is an abnormally fast atrial tachycardia characterized by narrow (≤ 120 ms) and constant QRS complexes. Adenosine is the drug of choice; the first dose is 6 mg, and it can be repeated with second and third doses of 12 mg, with greater than 90% success. Previous work found that patients observed for 4 hours after restoration of normal sinus rhythm had no recurrence within 24 hours. The objective of this study was to investigate the factors that influence the recurrence of SVT after adenosine in the emergency department (ED). Method: This was a retrospective, exploratory, prognostic study conducted at the Emergency Department (ED) of the Faculty of Medicine, Ramathibodi Hospital, a university-affiliated super tertiary care hospital in Bangkok, Thailand, over a ten-year period between 2010 and 2020. The inclusion criteria were age > 15 years, visiting the ED with SVT, and treatment with adenosine. Recurrence of SVT in the ED was recorded for these patients. A multivariable logistic regression model was used to develop the predictive model and prediction score for recurrent PSVT. Result: 264 patients met the study criteria. Of those, 24 patients (10%) had recurrent PSVT. Independent predictive factors included age > 65 years, heart rate (after adenosine) > 100 per min, structural heart disease, and dose of adenosine. The clinical risk score developed to predict recurrent PSVT had an accuracy of 74.41%. A score of > 6 was associated with a likelihood ratio for recurrent PSVT of 5.71. Conclusion: A clinical predictive score of > 6 was associated with recurrent PSVT in the ED.
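
To make the reported figures concrete, the toy calculation below shows how a positive likelihood ratio for the score > 6 cut-off follows from a 2x2 table; the counts are hypothetical, chosen only to be consistent with the 264 patients, 24 recurrences and LR+ of about 5.71 quoted above.

```python
# Toy illustration: positive likelihood ratio for a score cut-off from a 2x2 table.
# Counts are made up (consistent with the reported totals), not the study data.
def positive_likelihood_ratio(tp, fn, fp, tn):
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity / (1 - specificity)

# hypothetical counts: recurrence vs. no recurrence, split at score > 6
tp, fn, fp, tn = 12, 12, 21, 219
print("total patients:", tp + fn + fp + tn, "recurrences:", tp + fn)
print("LR+ for score > 6:", round(positive_likelihood_ratio(tp, fn, fp, tn), 2))
```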

Keywords: clinical prediction score, SVT, recurrence, emergency department

Procedia PDF Downloads 155
17167 Operating Characteristics of Point-of-Care Ultrasound in Identifying Skin and Soft Tissue Abscesses in the Emergency Department

Authors: Sathyaseelan Subramaniam, Jacqueline Bober, Jennifer Chao, Shahriar Zehtabchi

Abstract:

Background: Emergency physicians frequently evaluate skin and soft tissue infections in order to differentiate abscess from cellulitis, which helps determine which patients will benefit from incision and drainage. Our objective was to determine the operating characteristics of point-of-care ultrasound (POCUS) compared to clinical examination in identifying abscesses in emergency department (ED) patients with features of skin and soft tissue infections. Methods: We performed a comprehensive search in the following databases: Medline, Web of Science, EMBASE, CINAHL and the Cochrane Library. Trials were included if they compared the operating characteristics of POCUS with clinical examination in identifying skin and soft tissue abscesses. Trials that included patients with oropharyngeal abscesses or that required abscess drainage in the operating room were excluded. The presence of an abscess was determined by pus drainage; no pus seen on incision, or resolution of symptoms without pus drainage at follow-up, determined the absence of an abscess. Quality of included trials was assessed using GRADE criteria. Operating characteristics of POCUS are reported as sensitivity, specificity, positive likelihood ratio (LR+) and negative likelihood ratio (LR-) with the respective 95% confidence intervals (CI). Summary measures were calculated by generating a hierarchical summary receiver operating characteristic (HSROC) model. Results: Out of 3203 references identified, 5 observational studies with 615 patients in aggregate were included (2 adult and 3 pediatric). We rated the quality of 3 trials as low and 2 as very low. The operating characteristics of POCUS and clinical examination in identifying soft tissue abscesses are presented in the table. The HSROC for POCUS revealed a sensitivity of 96% (95% CI = 89-98%), specificity of 79% (95% CI = 71-86%), LR+ of 4.6 (95% CI = 3.2-6.8), and LR- of 0.06 (95% CI = 0.02-0.2). Conclusion: Existing evidence indicates that POCUS is useful in identifying abscesses in ED patients with skin or soft tissue infections.
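
The sketch below illustrates, with made-up counts rather than the pooled data of the five studies, how sensitivity, specificity and the likelihood ratios (with a standard log-scale confidence interval for LR+) are derived from a 2x2 table.

```python
# Hedged sketch: operating characteristics from a single pooled 2x2 table.
# Counts are illustrative; the paper's summary estimates come from an HSROC model.
import math

def two_by_two_summary(tp, fn, fp, tn, z=1.96):
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    lr_pos = sens / (1 - spec)
    lr_neg = (1 - sens) / spec
    # log-scale standard error for LR+ (standard delta-method formula)
    se_log_pos = math.sqrt(1/tp - 1/(tp + fn) + 1/fp - 1/(fp + tn))
    ci_pos = (lr_pos * math.exp(-z * se_log_pos), lr_pos * math.exp(z * se_log_pos))
    return {"sensitivity": sens, "specificity": spec,
            "LR+": lr_pos, "LR+ 95% CI": ci_pos, "LR-": lr_neg}

print(two_by_two_summary(tp=330, fn=14, fp=55, tn=216))
```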

Keywords: abscess, point-of-care ultrasound, pocus, skin and soft tissue infection

Procedia PDF Downloads 369
17166 Phylogenetic Analysis Based On the Internal Transcribed Spacer-2 (ITS2) Sequences of Diadegma semiclausum (Hymenoptera: Ichneumonidae) Populations Reveals Significant Adaptive Evolution

Authors: Ebraheem Al-Jouri, Youssef Abu-Ahmad, Ramasamy Srinivasan

Abstract:

The parasitoid Diadegma semiclausum (Hymenoptera: Ichneumonidae) is one of the most effective exotic parasitoids of the diamondback moth (DBM), Plutella xylostella, in the lowland areas of Homs, Syria. Molecular evolution studies are useful tools to shed light on the molecular bases of insect geographical spread and adaptation to new hosts and environments, and for designing better control strategies. In this study, molecular evolution analysis was performed on 42 nuclear internal transcribed spacer-2 (ITS2) sequences representing D. semiclausum and eight other Diadegma spp. from Syria and worldwide. Possible recombination events were identified with the RDP4 program, and four potential recombinants of the American D. insulare and D. fenestrale (Jeju) were detected. After detecting and removing recombinant sequences, the ratio of non-synonymous (dN) to synonymous (dS) substitutions per site (dN/dS = ω) was used to identify codon positions involved in adaptive processes. Bayesian techniques were applied to detect selective pressures at the codon level using five different approaches: fixed effects likelihood (FEL), internal fixed effects likelihood (IFEL), random effects likelihood (REL), the mixed effects model of evolution (MEME) and phylogenetic analysis by maximum likelihood (PAML). Among the 40 positively selected amino acids (aa) that differed significantly between clades of Diadegma species, three aa under positive selection were identified only in D. semiclausum. Additionally, all D. semiclausum tree branches were found to be under episodic diversifying selection (EDS) at p ≤ 0.05. Our study provides evidence that both recombination and positive selection have contributed to the molecular diversity of Diadegma spp. and highlights the significant contribution of adaptive evolution to the fitness of D. semiclausum, the DBM parasitoid.

Keywords: diadegma sp, DBM, ITS2, phylogeny, recombination, dN/dS, evolution, positive selection

Procedia PDF Downloads 416
17165 Aerodynamic Modeling Using Flight Data at High Angle of Attack

Authors: Rakesh Kumar, A. K. Ghosh

Abstract:

The paper presents the modeling of linear and nonlinear longitudinal aerodynamics using real flight data of the Hansa-3 aircraft gathered at low and high angles of attack. The Neural-Gauss-Newton (NGN) method has been applied to model the linear and nonlinear longitudinal dynamics and estimate parameters from flight data. Unsteady aerodynamics due to flow separation at high angles of attack near stall has been included in the aerodynamic model using Kirchhoff's quasi-steady stall model. The NGN method is an algorithm that utilizes a Feed Forward Neural Network (FFNN) and Gauss-Newton optimization to estimate the parameters, and it does not require any a priori postulation of a mathematical model or solving of the equations of motion. The NGN method was validated on real flight data generated at moderate angles of attack before application to the data at high angles of attack. The estimates obtained from compatible flight data using the NGN method were validated by comparing them with wind tunnel values and the maximum likelihood estimates. Validation was also carried out by comparing the response of measured motion variables with the response generated using the estimates for a different control input. Next, the NGN method was applied to real flight data generated by executing a well-designed quasi-steady stall maneuver. The results obtained in terms of stall characteristics and aerodynamic parameters were encouraging and reasonably accurate, establishing NGN as a method for modeling nonlinear aerodynamics from real flight data at high angles of attack.

Keywords: parameter estimation, NGN method, linear and nonlinear, aerodynamic modeling

Procedia PDF Downloads 445
17164 Logistic Regression Model versus Additive Model for Recurrent Event Data

Authors: Entisar A. Elgmati

Abstract:

Recurrent infant diarrhea is studied using daily data collected in Salvador, Brazil over one year and three months. A logistic regression model is fitted instead of Aalen's additive model, using the same covariates that were used in the analysis with the additive model. The model gives results reasonably similar to those from the additive regression model. In addition, the problem of the estimated conditional probabilities not being constrained between zero and one in the additive model is solved here. Also, martingale residuals, which have been used to judge the goodness of fit of the additive model, are shown to be useful for judging the goodness of fit of the logistic model.

Keywords: additive model, cumulative probabilities, infant diarrhoea, recurrent event

Procedia PDF Downloads 635
17163 Quantitative Ranking Evaluation of Wine Quality

Authors: A. Brunel, A. Kernevez, F. Leclere, J. Trenteseaux

Abstract:

Today, wine quality is only evaluated by wine experts with their own different personal tastes, even if they may agree on some common features, so producers do not have any unbiased way to independently assess the quality of their products. A tool is proposed here to evaluate wine quality by an objective ranking based upon the variables entering wine elaboration and analysed through the principal component analysis (PCA) method. Actual climatic data are compared by measuring the relative distance between each considered wine, out of which the general ranking is performed.
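
A minimal sketch of this idea is given below: each wine is described by its elaboration and climate variables, projected onto principal components, and ranked by its distance to a reference point in that space. The data, number of components and choice of reference are placeholder assumptions, not the authors' procedure.

```python
# Hedged sketch: PCA projection of wine variables followed by a distance-based ranking.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(8)
X = rng.normal(size=(20, 6))               # 20 wines x 6 elaboration/climate variables (toy)
X = (X - X.mean(axis=0)) / X.std(axis=0)   # standardise before PCA

scores = PCA(n_components=2).fit_transform(X)
reference = scores.mean(axis=0)            # assumed reference point (centroid wine)
distance = np.linalg.norm(scores - reference, axis=1)
ranking = np.argsort(distance)             # closest-to-reference wines first
print("wine ranking:", ranking)
```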

Keywords: wine, grape, weather conditions, rating, climate, principal component analysis, metric analysis

Procedia PDF Downloads 318
17162 Heavy Vehicles Crash Injury Severity at T-Intersections

Authors: Sivanandan Balakrishnan, Sara Moridpour, Richard Tay

Abstract:

Heavy vehicles make a significant contribution to many developed economies, including Australia, because they are a major means of transporting goods within these countries. With the increase in road freight, there will be an increase in the proportion of heavy vehicle traffic and, consequently, an increase in the possibility of collisions involving heavy vehicles. Crashes involving heavy vehicles are a major road safety concern because of the higher likelihood of fatal and serious injury, especially to any occupant of a smaller vehicle involved. The primary objective of this research is to identify the factors influencing injury severity to occupants in collisions involving heavy vehicles at T-intersections in Victoria, Australia, using a binary logit model. Our results show that the factors influencing injury severity include occupants' gender, age and restraint use. In addition, vehicle type, movement, point of impact and damage, time of day, day of week and season, a higher percentage of trucks in the traffic volume, hitting pedestrians, the number of occupants involved and the type of collision are associated with severe injury.

Keywords: binary logit model, heavy vehicle, injury severity, T-intersections

Procedia PDF Downloads 394
17161 Intersection of Racial and Gender Microaggressions: Social Support as a Coping Strategy among Indigenous LGBTQ People in Taiwan

Authors: Ciwang Teyra, A. H. Y. Lai

Abstract:

Introduction: Indigenous LGBTQ individuals face significant life stressors such as racial and gender discrimination and microaggressions, which may negatively affect their mental health. Although studies relevant to Taiwanese indigenous LGBTQ people are gradually increasing, most of them are primarily conceptual or qualitative in nature. This research aims to fill the gap by offering empirical quantitative evidence, specifically investigating the impact of racial and gender microaggressions on mental health among Taiwanese indigenous LGBTQ individuals from an intersectional perspective, and examining whether social support can help them cope with microaggressions. Methods: Participants were n = 200 (mean age = 29.51; female = 31%, male = 61%, others = 8%). A cross-sectional quantitative design was implemented using data collected in 2020. Standardised measurements were used, including the Racial Microaggression Scale (10 items), Gender Microaggression Scale (9 items), Social Support Questionnaire-SF (6 items), Patient Health Questionnaire (9 items), and Generalised Anxiety Disorder scale (7 items). Covariates were age, gender, and perceived economic hardship. Structural equation modelling (SEM) was employed using Mplus 8.0, with the latent variables of depression and anxiety as outcomes. A main-effect SEM model was first established (Model 1). To test the moderation effects of perceived social support, an interaction-effect model (Model 2) was created with interaction terms entered into Model 1. Numerical integration was used with maximum likelihood estimation to estimate the interaction model. Results: Model fit statistics of Model 1 were χ²(df) = 1308.1 (795), p < .05; CFI/TLI = 0.92/0.91; RMSEA = 0.06; SRMR = 0.06. The AIC value of Model 2 improved slightly compared to Model 1 (AIC = 15631 for Model 1 vs. 15629 for Model 2; BIC = 16098 for Model 1 vs. 16103 for Model 2), and Model 2 was adopted as the final model. In the main-effect Model 1, racial microaggression and perceived social support were associated with depression and anxiety, but sexual orientation microaggression was not (indigenous microaggression: b = 0.27 for depression, b = 0.38 for anxiety; social support: b = -0.37 for depression, b = -0.34 for anxiety). Thus, an interaction term between social support and indigenous microaggression was added in Model 2. In the final Model 2, indigenous microaggression and perceived social support continued to be statistically significant predictors of both depression and anxiety. Social support moderated the effect of indigenous microaggression on depression (b = -0.22), but not anxiety. All covariates were not statistically significant. Implications: Results indicated that racial microaggressions have a significant impact on indigenous LGBTQ people's mental health, and that social support plays a crucial role in buffering the negative impact of racial microaggression. To promote indigenous LGBTQ people's wellbeing, it is important to consider how to support them in developing social support network systems.

Keywords: microaggressions, intersectionality, indigenous population, mental health, social support

Procedia PDF Downloads 146
17160 Bidirectional Long Short-Term Memory-Based Signal Detection for Orthogonal Frequency Division Multiplexing With All Index Modulation

Authors: Mahmut Yildirim

Abstract:

This paper proposes bidirectional long short-term memory (Bi-LSTM) network-aided deep learning (DL)-based signal detection for orthogonal frequency division multiplexing with all index modulation (OFDM-AIM), namely Bi-DeepAIM. OFDM-AIM is developed to increase the spectral efficiency of OFDM with index modulation (OFDM-IM), a promising multi-carrier technique for communication systems beyond 5G. In this paper, due to its strong classification ability, Bi-LSTM is considered as an alternative to the maximum likelihood (ML) algorithm, which is used for signal detection in the classical OFDM-AIM scheme. The performance of Bi-DeepAIM is compared with LSTM network-aided DL-based OFDM-AIM (DeepAIM) and the classic OFDM-AIM that uses ML-based signal detection, in terms of bit error rate (BER) performance and computation time. Simulation results show that Bi-DeepAIM achieves better BER performance than DeepAIM and lower computation time in signal detection than ML-based OFDM-AIM.
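
The sketch below illustrates the detector structure in a generic way (assuming PyTorch is available): a bidirectional LSTM reads the received subblock samples and outputs class logits in a single forward pass, in contrast to an exhaustive maximum-likelihood search over candidate subblocks. Input dimensions, class count and network size are placeholder assumptions, not the paper's configuration.

```python
# Conceptual sketch (assumes PyTorch): a Bi-LSTM maps a received OFDM-AIM subblock
# (real/imag pairs per subcarrier) to an index/symbol class in one forward pass.
import torch
import torch.nn as nn

class BiLSTMDetector(nn.Module):
    def __init__(self, n_subcarriers=4, n_classes=16, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(input_size=2, hidden_size=hidden,
                            batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, n_classes)

    def forward(self, x):                  # x: (batch, n_subcarriers, 2)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])    # classify from the last time step

model = BiLSTMDetector()
rx = torch.randn(32, 4, 2)                # 32 received subblocks (toy data)
logits = model(rx)                        # one shot per subblock, no exhaustive ML search
loss = nn.CrossEntropyLoss()(logits, torch.randint(0, 16, (32,)))
loss.backward()
```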

Keywords: bidirectional long short-term memory, deep learning, maximum likelihood, OFDM with all index modulation, signal detection

Procedia PDF Downloads 72
17159 Parameter Estimation of Gumbel Distribution with Maximum-Likelihood Based on Broyden Fletcher Goldfarb Shanno Quasi-Newton

Authors: Dewi Retno Sari Saputro, Purnami Widyaningsih, Hendrika Handayani

Abstract:

Extreme data in a set of observations can occur due to unusual circumstances during the observation. Such data can provide important information that cannot be provided by other data, so their existence needs to be further investigated. One method for obtaining extreme data is the block maxima method, and the distribution of extreme data sets taken with the block maxima method is called the extreme value distribution. The extreme value distribution considered here is the Gumbel distribution with two parameters. Maximum likelihood (ML) estimates of the Gumbel distribution parameters are difficult to determine exactly, so a numerical approximation is necessary. The purpose of this study was to determine the parameter estimates of the Gumbel distribution with the quasi-Newton BFGS method. The quasi-Newton BFGS method is a numerical method for unconstrained nonlinear optimization, so it can be used for parameter estimation of the Gumbel distribution, whose distribution function has a double exponential form. The quasi-Newton BFGS method is a development of Newton's method. Newton's method uses the second derivative to calculate the parameter value changes at each iteration, and is modified with the addition of a step length to provide a guarantee of convergence when the second derivative requires complex calculations. In the quasi-Newton BFGS method, Newton's method is further modified by replacing the exact second derivative with an approximation that is updated at each iteration. Parameter estimation of the Gumbel distribution with the quasi-Newton BFGS method is done by calculating the parameter values that maximise the likelihood; this requires the gradient vector and the Hessian matrix. This research combines theory and application, drawing on several journals and textbooks. The results of this study are the quasi-Newton BFGS algorithm and estimates of the Gumbel distribution parameters. The estimation method was then applied to daily rainfall data in Purworejo District to estimate the distribution parameters. The estimates indicate that the high rainfall that occurred in Purworejo District decreased in intensity and that the range of rainfall that occurred decreased.
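
A minimal numerical sketch of this estimation step is shown below: the Gumbel negative log-likelihood is minimised with SciPy's BFGS optimiser on simulated block-maxima data standing in for the rainfall series. The starting values and the log-scale reparameterisation are implementation choices, not taken from the paper.

```python
# Hedged sketch: maximum-likelihood estimation of the Gumbel location (mu) and scale
# (beta) parameters via a BFGS quasi-Newton optimiser, on simulated block maxima.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import gumbel_r

rng = np.random.default_rng(3)
x = gumbel_r.rvs(loc=80.0, scale=25.0, size=120, random_state=rng)  # block maxima (toy)

def neg_log_likelihood(params):
    mu, log_beta = params                     # optimise log(beta) so that beta stays > 0
    beta = np.exp(log_beta)
    z = (x - mu) / beta
    return np.sum(np.log(beta) + z + np.exp(-z))

res = minimize(neg_log_likelihood, x0=[x.mean(), np.log(x.std())], method="BFGS")
mu_hat, beta_hat = res.x[0], np.exp(res.x[1])
print("mu:", mu_hat, "beta:", beta_hat)       # cross-check with gumbel_r.fit(x)
```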

Keywords: parameter estimation, Gumbel distribution, maximum likelihood, Broyden Fletcher Goldfarb Shanno (BFGS) quasi-Newton

Procedia PDF Downloads 323
17158 Residual Analysis and Ground Motion Prediction Equation Ranking Metrics for Western Balkan Strong Motion Database

Authors: Manuela Villani, Anila Xhahysa, Christopher Brooks, Marco Pagani

Abstract:

The geological structure of the Western Balkans is strongly affected by the collision between the Adria microplate and the southwestern Eurasia margin, resulting in a considerably active seismic region. The Harmonization of Seismic Hazard Maps in the Western Balkan Countries Project (BSHAP) (2007-2011, 2012-2015), supported by NATO, enabled the preparation of new seismic hazard maps of the Western Balkans, but when inspecting the seismic hazard models produced later by these countries on a national scale, significant differences in design PGA values are observed at the borders, for instance between northern Albania and Montenegro, or southern Albania and Greece. Considering that the catalogues were unified and the seismic sources were defined within the BSHAP framework, the differences evidently arise from the selection of Ground Motion Prediction Equations (GMPEs), which is generally the component with the highest impact on seismic hazard assessment. At the time of the project, only a modest database was available, namely 672 three-component records, whereas the strong motion database has since increased considerably, up to 20,939 records with Mw ranging from 3.7 to 7 and epicentral distances from 0.47 km to 490 km. Statistical analysis of the strong motion database showed a lack of recordings in the moderate-to-large magnitude and short distance ranges; therefore, there is a need to re-evaluate the GMPEs in light of the recently updated database and the new generations of ground motion models. In some cases, it was observed that some events were more extensively documented in one database than the other, like the 1979 Montenegro earthquake, which has a considerably larger number of records in the BSHAP analogue strong motion database than in ESM23. Therefore, the strong motion flat-file provided by the BSHAP project was merged with the ESM23 database for the polygon studied in this project. After performing the preliminary residual analysis, the candidate GMPEs were identified. This process was done using the GMPE performance metrics available within the SMT in the OpenQuake platform; the likelihood model and Euclidean distance-based ranking (EDR) were used. Finally, a GMPE logic tree was selected for this study and, following the selection of candidate GMPEs, model weights were assigned using the average sample log-likelihood approach of Scherbaum.
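
As a rough illustration of the last step, the sketch below computes the average sample log-likelihood (LLH) of each candidate GMPE against a handful of observations and converts the scores into normalised weights via 2^(-LLH), following the Scherbaum-style data-driven weighting mentioned above. The observations and per-record GMPE predictions are placeholder values, not output of the OpenQuake SMT.

```python
# Hedged sketch of Scherbaum-type LLH ranking: each GMPE predicts a mean and sigma in
# log ground-motion units, the LLH scores the observations under that normal density,
# and weights follow 2**(-LLH). Predictions below are placeholders, not real GMPE output.
import numpy as np
from scipy.stats import norm

obs = np.array([-1.2, -0.8, -1.5, -0.3, -1.1])            # observed ln(PGA), toy values

gmpes = {
    "GMPE_A": {"mean": obs + 0.1, "sigma": 0.6},           # assumed per-record predictions
    "GMPE_B": {"mean": obs - 0.4, "sigma": 0.7},
}

llh = {name: -np.mean(np.log2(norm.pdf(obs, g["mean"], g["sigma"])))
       for name, g in gmpes.items()}

raw = {name: 2.0 ** (-value) for name, value in llh.items()}
weights = {name: w / sum(raw.values()) for name, w in raw.items()}
print("LLH:", llh)
print("normalised weights:", weights)
```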

Keywords: residual analysis, GMPE, western balkan, strong motion, openquake

Procedia PDF Downloads 88
17157 Maximum-likelihood Inference of Multi-Finger Movements Using Neural Activities

Authors: Kyung-Jin You, Kiwon Rhee, Marc H. Schieber, Nitish V. Thakor, Hyun-Chool Shin

Abstract:

It remains unknown whether M1 neurons encode multi-finger movements independently or as a neural combination of single finger movements, even though multi-finger movements are physically a combination of single finger movements. We present evidence of correlation between single and multi-finger movements and also attempt the challenging task of semi-blind decoding of neural data with minimal training of the neural decoder. Data were collected from 115 task-related neurons in M1 of a trained rhesus monkey performing flexion and extension of each finger and the wrist (12 single-digit and 6 two-finger movements). By exploiting the correlation of temporal firing patterns between movements, we found that the correlation coefficient for physically related movement pairs is greater than for others; neurons tuned to single finger movements increased their firing rate when multi-finger commands were instructed. Based on this knowledge, neural semi-blind decoding is done by choosing the greatest and second greatest likelihoods among canonical candidates. We achieved a decoding accuracy of about 60% for multi-finger movements without a corresponding training data set. These results suggest that neural activity recorded during single finger movements alone can be exploited to control dexterous multi-fingered neuroprosthetics.
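
The toy sketch below shows the basic form of such a maximum-likelihood decoder: each movement class is summarised by mean firing rates per neuron under a Poisson assumption, and a trial is assigned to the classes with the greatest log-likelihoods. The rates and class structure are simulated; the semi-blind variant in the study scores multi-finger candidates built from single-finger activity, which is not reproduced here.

```python
# Toy maximum-likelihood decoder: Poisson class-conditional firing rates, pick the
# top-2 classes by log-likelihood (the "greatest and second greatest" candidates).
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(4)
n_neurons, n_classes = 115, 18
rates = rng.uniform(1, 20, size=(n_classes, n_neurons))      # class-conditional mean counts

def decode(spike_counts, rates):
    log_lik = poisson.logpmf(spike_counts, rates).sum(axis=1)   # one value per class
    order = np.argsort(log_lik)[::-1]
    return order[0], order[1]                                   # best and second-best class

true_class = 7
trial = rng.poisson(rates[true_class])                          # simulated trial counts
print("decoded (top-2 candidates):", decode(trial, rates), "true:", true_class)
```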

Keywords: finger movement, neural activity, blind decoding, M1

Procedia PDF Downloads 320
17156 Elaboration Development Strategy and the Analysis of Trends Shaping the Information Economy in Azerbaijan on the Basis of the Experience of Foreign Countries

Authors: Rasim M. Alguliyev, Alovsat G. Aliyev

Abstract:

In this paper, information on economic development trends in developed countries is analyzed. The current status of the information society and economy of the country is reviewed, and some recommendations are given for future development. The problems of the information society and the establishment of its innovative economy are studied, and development trends of the information economy in developed countries are examined.

Keywords: information economy, ICT sector, ICT infrastructure, innovation, innovation system, hi-tech products, antimonopoly policy

Procedia PDF Downloads 338
17155 Prediction Factor of Recurrence Supraventricular Tachycardia After Adenosine Treatment in the Emergency Department

Authors: Welawat Tienpratarn, Chaiyaporn Yuksen, Rungrawin Promkul, Chetsadakon Jenpanitpong, Pajit Bunta, Suthap Jaiboon

Abstract:

Supraventricular tachycardia (SVT) is an abnormally fast atrial tachycardia characterized by narrow (≤ 120 ms) and constant QRS complexes. Adenosine is the drug of choice; the first dose is 6 mg, and it can be repeated with second and third doses of 12 mg, with greater than 90% success. Previous work found that patients observed for 4 hours after restoration of normal sinus rhythm had no recurrence within 24 hours. The objective of this study was to investigate the factors that influence the recurrence of SVT after adenosine in the emergency department (ED). This was a retrospective, exploratory, prognostic study conducted at the Emergency Department (ED) of the Faculty of Medicine, Ramathibodi Hospital, a university-affiliated super tertiary care hospital in Bangkok, Thailand, over a ten-year period between 2010 and 2020. The inclusion criteria were age > 15 years, visiting the ED with SVT, and treatment with adenosine. Recurrence of SVT in the ED was recorded for these patients. A multivariable logistic regression model was used to develop the predictive model and prediction score for recurrent PSVT. 264 patients met the study criteria. Of those, 24 patients (10%) had recurrent PSVT. Independent predictive factors included age > 65 years, heart rate (after adenosine) > 100 per min, structural heart disease, and dose of adenosine. The clinical risk score developed to predict recurrent PSVT had an accuracy of 74.41%. A score of > 6 was associated with a likelihood ratio for recurrent PSVT of 5.71. A clinical predictive score of > 6 was associated with recurrent PSVT in the ED.

Keywords: supraventricular tachycardia, recurrence, emergency department, adenosine

Procedia PDF Downloads 117
17154 Diffusion Magnetic Resonance Imaging and Magnetic Resonance Spectroscopy in Detecting Malignancy in Maxillofacial Lesions

Authors: Mohamed Khalifa Zayet, Salma Belal Eiid, Mushira Mohamed Dahaba

Abstract:

Introduction: Malignant tumors may not be easily detected by traditional radiographic techniques, especially in an anatomically complex area like the maxillofacial region. At the same time, the advent of biological functional MRI was a significant step forward in the diagnostic imaging field. Objective: The purpose of this study was to define the malignant metabolic profile of maxillofacial lesions using diffusion MRI and magnetic resonance spectroscopy as adjunctive aids for diagnosing such lesions. Subjects and Methods: Twenty-one patients with twenty-two lesions were enrolled in this study. Both morphological and functional MRI scans were performed, where T1- and T2-weighted images and diffusion-weighted MRI with four apparent diffusion coefficient (ADC) maps were constructed for analysis, and magnetic resonance spectroscopy with qualitative and semi-quantitative analyses of choline and lactate peaks was applied. All patients then underwent incisional or excisional biopsies within two weeks of the MR scans. Results: Statistical analysis revealed that not all the parameters had the same diagnostic performance: lactate had the highest area under the curve (AUC) of 0.9, and choline was the lowest, with insignificant diagnostic value. The best cut-off value suggested for lactate was 0.125, above which a lesion is considered malignant with 90% sensitivity and 83.3% specificity. Although the ADC maps had comparable AUCs, the statistical measure that had the final say was the interpretation of the likelihood ratios. As expected, lactate again showed the best combination of positive and negative likelihood ratios, whereas among the maps, the ADC map with b-values of 500 and 1000 showed the most realistic combination of likelihood ratios, although with lower sensitivity and specificity than lactate. Conclusion: Diffusion-weighted imaging and magnetic resonance spectroscopy are state of the art in the diagnostic arena, and they have manifested themselves as key players in the differentiation of orofacial tumors. The complete biological profile of malignancy can be decoded as low ADC values, high choline and/or high lactate, whereas that of benign entities can be translated as high ADC values, low choline and no lactate.

Keywords: diffusion magnetic resonance imaging, magnetic resonance spectroscopy, malignant tumors, maxillofacial

Procedia PDF Downloads 171
17153 Effects of Temperature and the Use of Bacteriocins on Cross-Contamination from Animal Source Food Processing: A Mathematical Model

Authors: Benjamin Castillo, Luis Pastenes, Fernando Cerdova

Abstract:

The contamination of food by microbial agents is a common problem in the industry, especially regarding the elaboration of animal source products. Incorrect manipulation of the machinery or of the raw materials can cause a decrease in production or an epidemiological outbreak due to intoxication. In order to improve food product quality, different methods have been used to reduce or, at least, to slow down the growth of pathogens, especially deteriorating, infectious or toxigenic bacteria. These methods are usually carried out under low temperatures and short processing times (abiotic agents), along with the application of antibacterial substances such as bacteriocins (biotic agents), in a controlled and efficient way that fulfills the purpose of bacterial control without damaging the final product. Therefore, the objective of the present study is to design a secondary mathematical model that allows the prediction of the impact of both the biotic and abiotic factors associated with animal source food processing. In order to accomplish this objective, the authors propose a three-dimensional differential equation model whose components are: bacterial growth; release, production and artificial incorporation of bacteriocins; and changes in the pH level of the medium. These three dimensions are constantly influenced by the temperature of the medium. Secondly, this model is adapted to an idealized situation of cross-contamination in animal source food processing, the study agents being both the animal product and the contact surface. Thirdly, the stochastic simulations and the parametric sensitivity analysis are compared with reference data. The main results obtained from the analysis and simulations of the mathematical model were that, although bacterial growth can be stopped at lower temperatures, even lower ones are needed to eradicate it; however, this can be not only expensive but also counterproductive in terms of the quality of the raw materials, while higher temperatures accelerate bacterial growth. In other respects, the use of bacteriocins is an effective alternative in the short and medium term. Moreover, an indicator of bacterial growth is a low pH level, since many deteriorating bacteria are lactic acid bacteria. Lastly, the processing times are a secondary agent of concern when the rest of the aforementioned agents are under control. Our main conclusion is that adapting a mathematical model to the context of the industrial process can generate new tools that predict bacterial contamination, the impact of bacterial inhibition, and processing times. In addition, the proposed mathematical model offers a logistic tool of broad application, which can be replicated for non-meat food products, other pathogens, or even contamination by cross-contact with allergenic foods.
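
An illustrative sketch of the kind of model described above is given below: a three-state ordinary differential equation system (bacterial density, bacteriocin concentration, medium pH) with a temperature-dependent growth rate, integrated numerically. The functional forms and all parameter values are assumptions for illustration, not the authors' calibrated model.

```python
# Illustrative sketch only: a three-state ODE (bacteria, bacteriocin, pH) with a
# temperature-dependent growth rate, integrated with SciPy.
import numpy as np
from scipy.integrate import solve_ivp

def model(t, y, T):
    N, B, pH = y                                  # bacteria, bacteriocin, pH
    mu = 0.8 * max(0.0, (T - 4.0) / 30.0)         # growth slows toward refrigeration temps
    inhibition = 0.05 * B * N                     # bacteriocin kill term
    dN = mu * N * (1 - N / 1e9) - inhibition      # logistic growth minus inhibition
    dB = 0.002 * N - 0.01 * B                     # release/production minus decay
    dpH = -1e-10 * N * (pH - 4.0)                 # lactic acid producers lower pH toward 4
    return [dN, dB, dpH]

sol = solve_ivp(model, (0, 48), y0=[1e4, 0.0, 6.5], args=(25.0,), dense_output=True)
print("bacterial density after 48 h at 25 °C:", sol.y[0, -1])
```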

Keywords: bacteriocins, cross-contamination, mathematical model, temperature

Procedia PDF Downloads 144
17152 Modified CUSUM Algorithm for Gradual Change Detection in a Time Series Data

Authors: Victoria Siriaki Jorry, I. S. Mbalawata, Hayong Shin

Abstract:

The main objective in a change detection problem is to develop algorithms for efficient detection of gradual and/or abrupt changes in the parameter distribution of a process or time series data. In this paper, we present a modified cumulative sum (MCUSUM) algorithm to detect the start and end of a time-varying linear drift in the mean value of a time series, based on a likelihood ratio test procedure. The design, implementation and performance of the proposed algorithm for linear drift detection are evaluated and compared to the existing CUSUM algorithm using different performance measures. An approach to accurately approximate the threshold of the MCUSUM is also provided. The performance of the MCUSUM for gradual change-point detection is compared to that of the standard cumulative sum (CUSUM) control chart designed for abrupt shift detection using Monte Carlo simulations. In terms of the expected time to detection, the MCUSUM procedure is found to perform better than a standard CUSUM chart for detection of a gradual change in mean. The algorithm is then applied and tested on a randomly generated time series with a gradual linear trend in mean to demonstrate its usefulness.
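
For reference, the sketch below implements the standard one-sided CUSUM recursion that the MCUSUM modifies; the reference value k, threshold h and the simulated drift are illustrative choices, not the paper's design, and the MCUSUM itself would replace the fixed target shift with a time-varying (linear-drift) alternative derived from the likelihood ratio.

```python
# Minimal sketch of the standard upper CUSUM recursion applied to a series with a
# gradual linear drift in mean; k and h are illustrative tuning parameters.
import numpy as np

def cusum_upper(x, target_mean, k=0.5, h=5.0):
    """Return the first index at which the upper CUSUM statistic exceeds threshold h."""
    s = 0.0
    for i, xi in enumerate(x):
        s = max(0.0, s + (xi - target_mean - k))
        if s > h:
            return i
    return None

rng = np.random.default_rng(5)
n = 300
drift = np.concatenate([np.zeros(150), 0.02 * np.arange(150)])   # gradual linear drift
data = rng.normal(0.0, 1.0, n) + drift
print("alarm at index:", cusum_upper(data, target_mean=0.0))
```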

Keywords: average run length, CUSUM control chart, gradual change detection, likelihood ratio test

Procedia PDF Downloads 298
17151 Using the Yield-SAFE Model to Assess the Impacts of Climate Change on Yield of Coffee (Coffea arabica L.) Under Agroforestry and Monoculture Systems

Authors: Tesfay Gidey Bezabeh, Tânia Sofia Oliveira, Josep Crous-Duran, João H. N. Palma

Abstract:

Ethiopia's economy depends strongly on Coffea arabica production, and coffee, like many other crops, is sensitive to climate change. The urgent development and application of strategies against the negative impacts of climate change on coffee production is therefore important. Agroforestry-based systems are one of the strategies that may ensure sustainable coffee production amidst the likely future impacts of climate change: such systems combine coffee with trees that buffer climatic extremes, thereby modifying microclimate conditions. This paper assessed coffee production under 1) coffee monoculture and 2) coffee grown in an agroforestry system, under a) the current climate and b) two different future climate change scenarios. The study focused on two representative coffee-growing regions of Ethiopia with different soil, climate, and elevation conditions. A process-based growth model (Yield-SAFE) was used to simulate coffee production for a time horizon of 40 years. The climate change scenarios considered were representative concentration pathways (RCP) 4.5 and 8.5. The results revealed that in monoculture systems, current coffee yields are between 1200-1250 kg ha⁻¹ yr⁻¹, with an expected decrease of 4-38% and 20-60% under scenarios RCP 4.5 and 8.5, respectively. In agroforestry systems, however, current yields are between 1600-2200 kg ha⁻¹ yr⁻¹, and the decrease is lower, ranging between 4-13% and 16-25% under the RCP 4.5 and 8.5 scenarios, respectively. From these results, it can be concluded that coffee production under agroforestry systems has a higher level of resilience to future climate change, which reinforces the idea of using this type of management in the near future to adapt to the negative impacts of climate change on coffee production.

Keywords: Albizia gummifera, CORDEX, Ethiopia, HADCM3 model, process-based model

Procedia PDF Downloads 118
17150 Crickets as Social Business Model for Rural Women in Colombia

Authors: Diego Cruz, Helbert Arevalo, Diana Vernot

Abstract:

In 2013, the Food and Agriculture Organization of the United Nations (FAO) stated that insect production for food and feed could become an economic opportunity for rural women in developing countries. Since then, however, only a few initiatives worldwide have tried to implement this kind of project in tropical countries without previous experience in cricket production and insect consumption, such as Colombia. In this project, the ArthroFood company and the University of La Sabana join efforts to make a holistic, multi-perspective analysis of Gryllodes sigillatus production by rural women of the municipality of La Mesa, Cundinamarca, Colombia, from biological, economic, culinary, and social standpoints. From a biological and economic perspective, G. sigillatus production in a 60 m² greenhouse was evaluated considering the effect of rearing density and substrates on final weight and length, development time, survival rate, and proximate composition, and the production costs and labor hours were recorded for five months. From a socio-economic perspective, the intention of the rural women to establish cricket farms or micro-entrepreneurship around insect production was evaluated after conducting ethnographies and workshops on empowerment, entrepreneurship, and cricket production. Finally, the results of the elaboration of culinary recipes with cricket powder, incorporating cultural aspects of the context of La Mesa, Cundinamarca, will be presented. This project represents Colombia's first attempt to create a social business model of cricket production involving rural women, academia, the private sector, and local authorities.

Keywords: cricket production, developing country, edible insects, entrepreneurship, insect culinary recipes

Procedia PDF Downloads 104
17149 A Risk Management Approach to the Diagnosis of Attention Deficit-Hyperactivity Disorder

Authors: Lloyd A. Taylor

Abstract:

An increase in the prevalence of Attention Deficit-Hyperactivity Disorder (ADHD) highlights the need to consider factors that may be exacerbating symptom presentation. Traditional diagnostic criteria provide little framework for healthcare providers to consider as they attempt to diagnose and treat children with behavioral problems. In fact, aside from exclusion criteria, few alternative considerations are available, and current approaches fail to consider the impact of outside factors that could increase or decrease the likelihood of appropriate diagnosis and the success of interventions. This paper considers specific systems-based factors that influence behavior and intervention success and that, when not considered, could account for the upsurge of diagnoses. These include understanding (1) challenges in the healthcare system, (2) the influence and impact of educators and the educational system, (3) technology use, and (4) patient and parental attitudes about the diagnosis of ADHD. These factors must be considered both individually and as a whole when examining both the increase in diagnoses and the subsequent increases in prescriptions for psychostimulant medication. A theoretical model based on a risk management approach will be presented. Finally, data will be presented that demonstrate pediatric provider satisfaction with this approach to the diagnosis and treatment of ADHD as it relates to practice trends.

Keywords: ADHD, diagnostic criteria, risk management model, pediatricians

Procedia PDF Downloads 94
17148 The Role of Teacher-Student Relationship on Teachers’ Attitudes towards School Bullying

Authors: Ghada Shahrour, Nusiebeh Ananbh, Heyam Dalky, Mohammad Rababa, Fatmeh Alzoubi

Abstract:

Positive teacher-student relationships have been found to affect students' attitudes towards bullying and, in turn, their engagement in bullying behavior. However, no investigation has been conducted to explore whether the teacher-student relationship affects teachers' attitudes towards bullying. The aim of this study was to examine the role of the teacher-student relationship in teachers' attitudes towards bullying in terms of bullying seriousness, empathic responding, and likelihood to intervene in a bullying situation. A cross-sectional, descriptive design was employed among a convenience sample of 173 school teachers (50.9% female) of 12 to 17-year-old students. The teachers were recruited from secondary public schools of three governorates in the northern district of Jordan; each group of students has multiple teachers for different subjects. Results showed that the teacher-student relationship is partially related to teachers' attitudes towards bullying. More specifically, having a close teacher-student relationship significantly increased teachers' perception of bullying seriousness and empathy, but not their likelihood to intervene. Research is needed to examine the obstacles that keep teachers from providing bullying interventions, as these barriers may be culturally contextualized. Meanwhile, interventions that promote a quality teacher-student relationship are necessary to increase teachers' perception of bullying seriousness and empathy. Students have been found to adopt the values of their teachers, and this may deter them from engaging in bullying behavior.

Keywords: school bullying, teachers’ attitudes, teacher-student relationship, adolescent students

Procedia PDF Downloads 100
17147 Metaheuristics to Solve Tasks Scheduling

Authors: Rachid Ziteuni, Selt Omar

Abstract:

In this paper, we propose a new polynomial metaheuristic (tabu search) for solving scheduling problems. This method allows us to solve the problem of scheduling n tasks on m identical parallel machines with unavailability periods. This problem is NP-complete in the strong sense, and finding an optimal solution appears unlikely. Note that all data in this problem are integer and deterministic. The performance criterion to optimize in this problem, which we denote Pm/N-c/Σ(wjCj), is the weighted sum of the completion times of the tasks.
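
To make the approach concrete, the sketch below gives a compact tabu search for assigning weighted tasks to identical parallel machines so as to minimise Σ wjCj, with jobs sequenced by the WSPT rule on each machine. Machine unavailability periods, the tabu tenure and the neighbourhood structure are simplifying assumptions for illustration, not the authors' algorithm.

```python
# Compact tabu-search sketch for Pm || sum(wj*Cj); unavailability periods omitted.
import random

def weighted_completion(assign, p, w, m):
    total = 0.0
    for k in range(m):
        jobs = sorted((j for j in range(len(p)) if assign[j] == k),
                      key=lambda j: p[j] / w[j])          # WSPT order on each machine
        t = 0.0
        for j in jobs:
            t += p[j]
            total += w[j] * t
    return total

def tabu_search(p, w, m, iters=200, tenure=7):
    n = len(p)
    assign = [j % m for j in range(n)]                    # initial round-robin solution
    best, best_cost = assign[:], weighted_completion(assign, p, w, m)
    tabu = {}                                             # (job, machine) -> expiry iteration
    for it in range(iters):
        candidates = []
        for j in range(n):
            for k in range(m):
                if k != assign[j] and tabu.get((j, k), -1) < it:
                    trial = assign[:]; trial[j] = k
                    candidates.append((weighted_completion(trial, p, w, m), j, k, trial))
        if not candidates:
            break
        cost, j, k, trial = min(candidates)               # best admissible move
        tabu[(j, assign[j])] = it + tenure                # forbid moving job j back for a while
        assign = trial
        if cost < best_cost:
            best, best_cost = trial[:], cost
    return best, best_cost

random.seed(0)
p = [random.randint(1, 10) for _ in range(12)]            # processing times (toy instance)
w = [random.randint(1, 5) for _ in range(12)]             # weights
print(tabu_search(p, w, m=3))
```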

Keywords: scheduling, parallel identical machines, unavailability periods, metaheuristic, tabu search

Procedia PDF Downloads 330
17146 Role of Business Incubators and Social Capital on Innovation and Growth of Firms: Evidence from Ethiopia

Authors: Hailemariam Gebremichael Gebretsadik, Abrham Hagos Tesfaslasea

Abstract:

To satisfy the high need for ICT entrepreneurship and to rectify the weak entrepreneurial culture in Ethiopia, the country has established ICT business incubation centers with the intention of preventing business failures, promoting innovation, and accelerating the growth and success of firms. This study investigates the role of business incubators and social capital in the innovation and growth of firms in Ethiopia. In this research, innovation and growth of firms were considered as dependent variables, whereas business incubation and social capital were treated as independent variables. The researchers employed an e-mail survey among 137 tenant firms (firms that joined and/or graduated from the business incubation centers available in Ethiopia) to collect the data and obtained 113 responses that were appropriate for this research. The results of this study reveal that the dimensions of business incubation (physical resources, business support, and networking) have a significant effect on the innovation of firms, but these dimensions do not show a significant effect on the growth of firms. On the other hand, the dimensions of social capital (structural, cognitive, and relational) show a significant positive impact on the likelihood of firms' growth, but not on their innovation. Moreover, the results indicate that the dimensions of business incubation and social capital together have a significant effect on the likelihood of tenant firms innovating and growing.

Keywords: business incubation, innovation, social capital, tenant firms

Procedia PDF Downloads 83
17145 Injury Prediction for Soccer Players Using Machine Learning

Authors: Amiel Satvedi, Richard Pyne

Abstract:

Injuries in professional sports occur on a regular basis. Some may be minor, while others can have a huge impact on a player's career and earning potential. In soccer, there is a high risk of players picking up injuries during game time. This research work seeks to help soccer players reduce the risk of getting injured by predicting the likelihood of injury while playing in the near future and then providing recommendations for intervention. The injury prediction tool uses a soccer player's number of minutes played on the field, number of appearances, distance covered and performance data for the current and previous seasons as variables to conduct statistical analysis and provide injury predictions using a machine learning linear regression model.

Keywords: injury predictor, soccer injury prevention, machine learning in soccer, big data in soccer

Procedia PDF Downloads 182
17144 The Reproducibility and Repeatability of Modified Likelihood Ratio for Forensics Handwriting Examination

Authors: O. Abiodun Adeyinka, B. Adeyemo Adesesan

Abstract:

The forensic use of handwriting depends on the analysis, comparison, and evaluation decisions made by forensic document examiners. When using biometric technology in forensic applications, it is necessary to compute a likelihood ratio (LR) to quantify the strength of evidence under two competing hypotheses, namely the prosecution and the defense hypotheses, for which a set of assumptions and methods for a given data set must be made. It is therefore important to know how repeatable and reproducible the estimated LR is. This paper evaluated the accuracy and reproducibility of examiners' decisions. Confidence intervals for the estimated LR are presented so as to avoid an incorrect estimate being used to deliver a wrong judgment in a court of law. The LR is fundamentally a Bayesian concept, and we used two LR estimators for this paper, namely logistic regression (LoR) and the kernel density estimator (KDE). The repeatability evaluation was carried out by retesting the initial experiment after an interval of six months to observe whether examiners would repeat their decisions for the estimated LR. The experimental results, which are based on a handwriting dataset, show that the LR has different confidence intervals, which implies that the LR cannot be estimated with the same certainty everywhere. Although the LoR performed better than the KDE when tested on the same dataset, the two LR estimators investigated showed a consistent region in which the LR value can be estimated confidently. These two findings advance our understanding of the LR when used to compute the strength of evidence in forensic handwriting examination.
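
A toy sketch of the two score-based LR estimators named above is given below: kernel density estimation of same-writer versus different-writer comparison scores, and logistic regression on the same scores. The scores are simulated placeholders, not real handwriting data, and the evaluation point is arbitrary.

```python
# Toy sketch: score-based likelihood ratios via KDE and logistic regression (LoR).
import numpy as np
from scipy.stats import gaussian_kde
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(6)
same = rng.normal(0.8, 0.15, 200)        # similarity scores under the prosecution hypothesis
diff = rng.normal(0.4, 0.15, 200)        # similarity scores under the defence hypothesis

def lr_kde(score):
    return gaussian_kde(same)(score)[0] / gaussian_kde(diff)(score)[0]

scores = np.concatenate([same, diff]).reshape(-1, 1)
labels = np.concatenate([np.ones(200), np.zeros(200)])
logreg = LogisticRegression().fit(scores, labels)

def lr_logistic(score):
    # posterior odds from the model divided by the (here equal) prior odds
    p = logreg.predict_proba([[score]])[0, 1]
    return p / (1 - p)

print("KDE LR at score 0.7:", lr_kde(0.7), " LoR LR at score 0.7:", lr_logistic(0.7))
```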

Keywords: confidence interval, handwriting, kernel density estimator, KDE, logistic regression, LoR, repeatability, reproducibility

Procedia PDF Downloads 124
17143 Identification of Flooding Attack (Zero Day Attack) at Application Layer Using Mathematical Model and Detection Using Correlations

Authors: Hamsini Pulugurtha, V.S. Lakshmi Jagadmaba Paluri

Abstract:

Distributed denial of service (DDoS) attacks are currently one of the top-rated cyber threats. They run down victim server resources, such as bandwidth and buffer size, by preventing the server from supplying resources to legitimate clients. In this paper, we propose a mathematical model of a DDoS attack and discuss its relevance to features such as the inter-arrival time, or arrival rate, of the attacking clients accessing the server. We further analyze the attack model in the context of exhausting the bandwidth and buffer size of the victim server. The proposed technique uses an unsupervised learning technique, the self-organizing map, to build clusters of similar features. Finally, it applies mathematical correlation and the normal probability distribution to the clusters and analyzes their behavior to detect a DDoS attack. Modern interconnected systems link not only small devices exchanging personal data but also critical infrastructures reporting the status of nuclear facilities. Although this interconnection brings many benefits and advantages, it also creates new vulnerabilities and threats which can be used to mount attacks. In such sophisticated interconnected systems, the ability to detect attacks as early as possible is of paramount importance.
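
A rough sketch of the detection idea is shown below (assuming the third-party `minisom` package is available): per-source traffic features are clustered with a self-organizing map, and sources whose inter-arrival statistics are improbable under the fitted normal profile of clean traffic mark their map cells as suspicious. The feature construction, thresholds and data are assumptions for illustration, not the authors' detector.

```python
# Hedged sketch (assumes `minisom`): SOM clustering of per-source traffic features,
# combined with a normal-probability check on inter-arrival times to flag flooding.
import numpy as np
from scipy.stats import norm
from minisom import MiniSom

rng = np.random.default_rng(7)
# per-source features over a time window: (mean inter-arrival time, mean request size)
normal_src = np.column_stack([rng.normal(1.00, 0.20, 200), rng.normal(200, 20, 200)])
attack_src = np.column_stack([rng.normal(0.05, 0.01, 40),  rng.normal(180, 5, 40)])
data = np.vstack([normal_src, attack_src])
scaled = (data - data.mean(axis=0)) / data.std(axis=0)

som = MiniSom(6, 6, input_len=2, sigma=1.0, learning_rate=0.5, random_seed=0)
som.random_weights_init(scaled)
som.train_random(scaled, 2000)

# normal traffic profile learned from a clean training period
mu, sd = normal_src[:, 0].mean(), normal_src[:, 0].std()
suspicious = set()
for features, point in zip(data, scaled):
    if norm.cdf(features[0], mu, sd) < 0.001:     # improbably short inter-arrival times
        suspicious.add(som.winner(point))          # SOM cells dominated by flooding sources
print("suspicious SOM cells:", suspicious)
```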

Keywords: application layer attack, bandwidth, buffer, correlation, DDoS, flooding, intrusion prevention, normal probability distribution

Procedia PDF Downloads 224
17142 Predicting Expectations of Non-Monogamy in Long-Term Romantic Relationships

Authors: Michelle R. Sullivan

Abstract:

Positive romantic relationships and marriages offer a buffer against a host of physical and emotional difficulties. Conversely, poor relationship quality and marital discord can have deleterious consequences for individuals and families. Research has described non-monogamy, infidelity, and consensual non-monogamy as both consequential and causal of relationship difficulty, or as a unique way a couple strives to make a relationship work. Much research on consensual non-monogamy has built on feminist theory and critique. To the author's best knowledge, to date, no studies have examined the predictive relationship between individual and relationship characteristics and expectations of non-monogamy. The current longitudinal study: 1) estimated the prevalence of expectations of partner non-monogamy and 2) evaluated whether gender, sexual identity, age, education, how a couple met, and relationship quality were predictive of expectations of partner non-monogamy. This study utilized the publicly available longitudinal dataset How Couples Meet and Stay Together. Adults aged 18 to 98 years old (n = 4002) were surveyed by phone over 5 waves from 2009-2014. Demographics and how a couple met were gathered through self-report in Wave 1, and relationship quality and expectations of partner non-monogamy were gathered through self-report in Waves 4 and 5 (n = 1047). The prevalence of expectations of partner non-monogamy (encompassing both infidelity and consensual non-monogamy) was 4.8%. Logistic regression models indicated that sexual identity, gender, education, and relationship quality were significantly predictive of expectations of partner non-monogamy. Specifically, male gender, lower education, identifying as lesbian, gay, or bisexual, and lower relationship quality scores were predictive of expectations of partner non-monogamy; male gender, however, was not predictive in the follow-up logistic regression model. Age and whether a couple met online were not associated with expectations of partner non-monogamy. Clinical implications include awareness of the increased likelihood of lesbian, gay, and bisexual individuals to have an expectation of non-monogamy and the related sequelae of relationship dissatisfaction. Future research directions could differentiate between non-monogamy subtypes and the personal and relationship variables that lead to the likelihood of consensual non-monogamy and infidelity as separate constructs, as well as explore the relationship between predicted partner behavior and actual partner behavioral outcomes.

Keywords: open relationship, polyamory, infidelity, relationship satisfaction

Procedia PDF Downloads 159