Search results for: predictive coding
1249 CMPD: Cancer Mutant Proteome Database
Authors: Po-Jung Huang, Chi-Ching Lee, Bertrand Chin-Ming Tan, Yuan-Ming Yeh, Julie Lichieh Chu, Tin-Wen Chen, Cheng-Yang Lee, Ruei-Chi Gan, Hsuan Liu, Petrus Tang
Abstract:
Whole-exome sequencing, which focuses on the protein-coding regions of disease/cancer-associated genes selected on the basis of a priori knowledge, is the most cost-effective method to study the association between genetic alterations and disease. Recent advances in high-throughput sequencing technologies and proteomic techniques have provided an opportunity to integrate genomics and proteomics, allowing mutated peptides corresponding to mutated genes to be readily detected. Since sequence database search is the most widely used method for protein identification in mass spectrometry (MS)-based proteomics, a mutant proteome database is required to better approximate the real protein pool and improve the identification of disease-associated mutated proteins. Large-scale whole-exome/genome sequencing studies launched by the National Cancer Institute (NCI), the Broad Institute, and The Cancer Genome Atlas (TCGA) provide not only comprehensive reports on coding variants in diverse samples and cell lines but also an invaluable resource for the wider research community. No existing database collects the mutant protein sequences corresponding to the variants identified in these studies. CMPD is designed to address this issue, serving as a bridge between genomic data and proteomic studies and focusing on protein sequence-altering variations originating from both germline and cancer-associated somatic variations. Keywords: TCGA, cancer, mutant, proteome
Procedia PDF Downloads 593
1248 Distributed Coordination of Connected and Automated Vehicles at Multiple Interconnected Intersections
Authors: Zhiyuan Du, Baisravan Hom Chaudhuri, Pierluigi Pisu
Abstract:
In connected vehicle systems, where wireless communication is available among the involved vehicles and intersection controllers, it is possible to design an intersection coordination strategy that leads connected and automated vehicles (CAVs) through road intersections without conventional traffic light control. In this paper, we present a distributed coordination strategy for CAVs at multiple interconnected intersections that aims at improving system fuel efficiency and system mobility. We present a distributed control solution in which, at the higher level, the intersection controllers calculate the desired average velocity for each road and optimally assign a reference velocity to each vehicle. At the lower level, every vehicle uses model predictive control (MPC) to track the reference velocity obtained from the higher-level controller. The proposed method has been implemented in a simulation-based case study with a network of two interconnected intersections. Additionally, the effect of mixed vehicle types on the coordination strategy has been explored. Simulation results indicate the improvement in vehicle fuel efficiency and traffic mobility achieved by the proposed method. Keywords: connected vehicles, automated vehicles, intersection coordination systems, multiple interconnected intersections, model predictive control
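The lower-level velocity-tracking step can be illustrated with a small finite-horizon MPC. The sketch below is a minimal, hypothetical formulation (the plant model, horizon length, weights, and acceleration bound are all assumptions, not the authors' exact design), solved as a convex program with cvxpy:

```python
import cvxpy as cp

# Assumed discrete-time longitudinal model: v[k+1] = v[k] + a[k] * dt
dt, N = 0.5, 20          # step size [s] and prediction horizon (assumptions)
v_ref = 12.0             # reference velocity from the intersection controller [m/s]
v0 = 8.0                 # current measured velocity [m/s]
a_max = 2.0              # comfort/actuator acceleration bound [m/s^2] (assumption)

v = cp.Variable(N + 1)   # predicted velocities
a = cp.Variable(N)       # acceleration inputs

cost = cp.sum_squares(v[1:] - v_ref) + 0.1 * cp.sum_squares(a)  # tracking + effort
constraints = [v[0] == v0, v >= 0]
for k in range(N):
    constraints += [v[k + 1] == v[k] + a[k] * dt,
                    cp.abs(a[k]) <= a_max]

cp.Problem(cp.Minimize(cost), constraints).solve()
print("first MPC input (applied acceleration):", round(float(a.value[0]), 3))
```

In receding-horizon fashion only the first input would be applied, after which the problem is re-solved with the newly measured velocity.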
Procedia PDF Downloads 356
1247 The Biological Function and Clinical Significance of Long Non-coding RNA LINC AC008063 in Head and Neck Squamous Carcinoma
Authors: Maierhaba Mijiti
Abstract:
Objective: The aim is to understand the relationship between the expression level of the long non-coding RNA LINC AC008063 and the clinicopathological parameters of patients with head and neck squamous cell carcinoma (HNSCC), and to clarify the biological function of LINC AC008063 in HNSCC cells. Moreover, it provides a potential biomarker for the diagnosis, treatment, and prognosis evaluation of HNSCC. Methods: The expression level of LINC AC008063 in HNSCC was analyzed using transcriptome sequencing data from the TCGA (The Cancer Genome Atlas) database. The expression levels of LINC AC008063 in human embryonic lung diploid cells 2BS, human immortalized keratinocytes HACAT, and the HNSCC cell lines CAL-27, Detroit562, AMC-HN-8, FD-LSC-1, FaDu and WSU-HN30 were determined by real-time quantitative PCR (qPCR). RNAi (RNA interference) was used to knock down LINC AC008063 in HNSCC cell lines, the localization and abundance of LINC AC008063 were determined by RT-qPCR, and the biological functions were examined by CCK-8, clone formation, flow cytometry, transwell invasion and migration assays, and Seahorse assay. Results: LINC AC008063 was upregulated in HNSCC tissue (P<0.001), which was verified by qPCR in HNSCC cell lines. The survival analysis revealed that the overall survival (OS) of patients in the high LINC AC008063 expression group was significantly lower than that in the low expression group; the median survival times for the two groups were 33.10 months and 61.27 months, respectively (P=0.002). The clinical correlation analysis revealed that its expression was positively correlated with the age of patients with HNSCC (P<0.001) and positively correlated with pathological stage (T3+T4>T1+T2, P=0.03). The RT-qPCR results showed that LINC AC008063 was mainly enriched in the cytoplasm (P=0.01). Knockdown of LINC AC008063 inhibited proliferation, colony formation, migration and invasion; the glycolytic capacity was significantly decreased in HNSCC cell lines (P<0.05). Conclusion: A high level of LINC AC008063 was associated with the malignant progression of HNSCC and promoted the key biological functions of proliferation, colony formation, migration and invasion; in particular, glycolytic capacity decreased upon its knockdown in HNSCC cells. Therefore, LINC AC008063 may serve as a potential biomarker for HNSCC and a distinct molecular target to inhibit glycolysis. Keywords: head and neck squamous cell carcinoma, oncogene, long non-coding RNA, LINC AC008063, invasion and metastasis
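Relative expression from RT-qPCR data of this kind is usually computed with the 2^(-ddCt) (Livak) method. The following sketch shows the arithmetic with purely hypothetical Ct values; the reference gene and the numbers are illustrative, not taken from the study:

```python
def fold_change(ct_target_sample, ct_ref_sample, ct_target_control, ct_ref_control):
    """2^-ddCt relative expression (Livak method)."""
    d_ct_sample = ct_target_sample - ct_ref_sample      # normalize to reference gene
    d_ct_control = ct_target_control - ct_ref_control
    dd_ct = d_ct_sample - d_ct_control                  # compare tumor vs. normal
    return 2 ** (-dd_ct)

# Hypothetical Ct values: lncRNA vs. GAPDH in tumor and normal cells
fc = fold_change(ct_target_sample=24.1, ct_ref_sample=18.0,
                 ct_target_control=27.6, ct_ref_control=18.2)
print(f"fold change (tumor vs. normal): {fc:.2f}")  # > 1 indicates upregulation
```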
Procedia PDF Downloads 12
1246 Concussion: Clinical and Vocational Outcomes from Sport Related Mild Traumatic Brain Injury
Authors: Jack Nash, Chris Simpson, Holly Hurn, Ronel Terblanche, Alan Mistlin
Abstract:
There is an increasing incidence of mild traumatic brain injury (mTBI) cases throughout sport and, with this, a growing interest from governing bodies to ensure these are managed appropriately and player welfare is prioritised. The Berlin consensus statement on concussion in sport recommends a multidisciplinary approach when managing those patients who do not have full resolution of mTBI symptoms. There are as yet no standardised guidelines to follow in the treatment of complex cases of mTBI in athletes. The aim of this project was to analyse the outcomes, both clinical and vocational, of all patients admitted to the mild Traumatic Brain Injury (mTBI) service at the UK's Defence Military Rehabilitation Centre Headley Court between 1st June 2008 and 1st February 2017 as a result of a sport-induced injury, and to evaluate potential predictive indicators of outcome. Patients were identified from a database maintained by the mTBI service. Clinical and occupational outcomes were ascertained from medical and occupational employment records, recorded prospectively at the time of discharge from the mTBI service. Outcomes were graded based on the vocational independence scale (VIS) and clinical documentation at discharge. Predictive indicators, including referral time, age at time of injury, previous mental health diagnosis, and a financial claim in place at time of entry to the service, were assessed using logistic regression. Forty-five patients were treated for sport-related mTBI during this time frame. Clinically, 96% of patients had full resolution of their mTBI symptoms after input from the mTBI service. 51% of patients returned to work at their previous vocational level, 4% had ongoing mTBI symptoms, 22% had ongoing physical rehabilitation needs, 11% required mental health input, and 11% required further vestibular rehabilitation. Neither age, time to referral, pre-existing mental health condition, nor compensation seeking had a significant impact on either vocational or clinical outcome in this population. The vast majority of patients reviewed in the mTBI clinic had persistent symptoms which could not be managed in primary care. A consultant-led, multidisciplinary approach to the diagnosis and management of mTBI has resulted in excellent clinical outcomes in these complex cases. High levels of symptom resolution suggest that this referral and treatment pathway is successful, and it is a model which could be replicated in other organisations with consultant-led input. Further understanding of both predictive and individual factors would allow clinicians to focus treatments on those who are most likely to develop long-term complications following mTBI. A consultant-led, multidisciplinary service ensures that a large number of patients will have complete resolution of mTBI symptoms after sport-related mTBI. Further research is now required to ascertain the key predictive indicators of outcome following sport-related mTBI. Keywords: brain injury, concussion, neurology, rehabilitation, sports injury
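The predictor assessment described here is a standard binary logistic regression. A minimal sketch with statsmodels follows; the variable names and the synthetic data are assumptions for illustration, not the study's records:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 45  # cohort size reported in the abstract

# Synthetic predictors: referral delay (weeks), age, prior MH diagnosis, claim in place
df = pd.DataFrame({
    "referral_weeks": rng.integers(1, 52, n),
    "age": rng.integers(18, 45, n),
    "prior_mh": rng.integers(0, 2, n),
    "claim": rng.integers(0, 2, n),
    "returned_to_work": rng.integers(0, 2, n),  # outcome (vocational level regained)
})

X = sm.add_constant(df[["referral_weeks", "age", "prior_mh", "claim"]])
model = sm.Logit(df["returned_to_work"], X).fit(disp=0)
print(model.summary())       # coefficient p-values per predictor
print(np.exp(model.params))  # odds ratios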
Procedia PDF Downloads 157
1245 Modern Scotland Yard: Improving Surveillance Policies Using Adversarial Agent-Based Modelling and Reinforcement Learning
Authors: Olaf Visker, Arnout De Vries, Lambert Schomaker
Abstract:
Predictive policing refers to the usage of analytical techniques to identify potential criminal activity. It has been widely implemented by various police departments. Being a relatively new area of research, there are, to the authors' knowledge, no absolute tried-and-true methods, and existing ones still exhibit a variety of potential problems. One of those problems is closely related to the lack of understanding of how acting on these predictions influences crime itself. The goal of law enforcement is ultimately crime reduction. As such, a policy needs to be established that best facilitates this goal. This research aims to find such a policy by using adversarial agent-based modeling in combination with modern reinforcement learning techniques. We present baseline models for both law enforcement and criminal agents and compare their performance to that of their respective reinforcement learning models. The experiments show that our smart law enforcement model is capable of reducing crime by making more deliberate choices regarding the locations of potential criminal activity. Furthermore, it is shown that the smart criminal model presents behavior consistent with popular crime theories and outperforms the baseline model in terms of crimes committed and time to capture. It does, however, still suffer from the difficulties of capturing long-term rewards and learning how to handle multiple opposing goals. Keywords: adversarial, agent based modelling, predictive policing, reinforcement learning
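A common building block for such patrol agents is tabular value learning over candidate locations (here reduced to its simplest bandit form). The toy sketch below is assumption-laden (five locations, a fixed hidden crime distribution, epsilon-greedy updates) and is not the authors' model:

```python
import numpy as np

rng = np.random.default_rng(1)
n_locations = 5
crime_prob = np.array([0.05, 0.30, 0.10, 0.40, 0.15])  # hidden hot spots (assumed)

Q = np.zeros(n_locations)  # learned value of patrolling each location
alpha, eps = 0.1, 0.1

for step in range(5000):
    # epsilon-greedy choice of patrol location
    loc = rng.integers(n_locations) if rng.random() < eps else int(np.argmax(Q))
    # reward 1 if a crime was intercepted at the patrolled location
    reward = 1.0 if rng.random() < crime_prob[loc] else 0.0
    Q[loc] += alpha * (reward - Q[loc])

print("learned patrol values:", np.round(Q, 3))  # should rank locations by crime rate
```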
Procedia PDF Downloads 148
1244 Design of Two-Channel Quadrature Mirror Filter Banks Using a Transformation Approach
Authors: Ju-Hong Lee, Yi-Lin Shieh
Abstract:
Two-dimensional (2-D) quadrature mirror filter (QMF) banks have been widely considered for high-quality coding of image and video data at low bit rates. Without implementing subband coding, a 2-D QMF bank is required to have an exactly linear-phase response without magnitude distortion, i.e., the perfect reconstruction (PR) characteristics. The design problem of 2-D QMF banks with the PR characteristics has been considered in the literature for many years. This paper presents a transformation approach for designing 2-D two-channel QMF banks. Under a suitable one-dimensional (1-D) to two-dimensional (2-D) transformation with a specified decimation/interpolation matrix, the analysis and synthesis filters of the QMF bank are composed of 1-D causal and stable digital allpass filters (DAFs) and possess the 2-D doubly complementary half-band (DC-HB) property. This facilitates the design problem of the two-channel QMF banks by finding the real coefficients of the 1-D recursive DAFs. The design problem is formulated based on the minimax phase approximation for the 1-D DAFs. A novel objective function is then derived to obtain an optimization for 1-D minimax phase approximation. As a result, the problem of minimizing the objective function can be simply solved by using the well-known weighted least-squares (WLS) algorithm in the minimax (L∞) optimal sense. The novelty of the proposed design method is that the design procedure is very simple and the designed 2-D QMF bank achieves perfect magnitude response and possesses satisfactory phase response. Simulation results show that the proposed design method provides much better design performance and much less design complexity as compared with the existing techniques.Keywords: Quincunx QMF bank, doubly complementary filter, digital allpass filter, WLS algorithm
Procedia PDF Downloads 225
1243 Enzymatic Repair Prior To DNA Barcoding, Aspirations, and Restraints
Authors: Maxime Merheb, Rachel Matar
Abstract:
Retrieving ancient DNA sequences, which in turn permits entire-genome sequencing from fossils, has improved extraordinarily in recent years, thanks to sequencing technology and other methodological advances. In any case, the quest for ancient DNA is still obstructed by the damage inflicted on DNA, which accumulates after the death of a living organism. We can characterize this damage into three main categories: (i) physical abnormalities such as strand breaks, which lead to the presence of short DNA fragments; (ii) modified bases (mainly cytosine deamination), which cause errors in the sequence due to the incorporation of a false nucleotide during DNA amplification; (iii) DNA modifications referred to as blocking lesions, which halt PCR extension and thereby also affect the amplification and sequencing process. We can clearly see that the issues arising from breakage and coding errors have been significantly reduced in recent years. Fast sequencing of short DNA fragments was enabled by platforms for high-throughput sequencing, and most of the coding errors were found to be the consequence of cytosine deamination, which can be easily removed from the DNA using enzymatic treatment. The methodology to repair DNA sequences is still in development; it can be basically explained as the process of reintroducing cytosine rather than uracil. This technique is thus restricted to amplifiable DNA molecules. Eliminating every type of damage (particularly lesions that block PCR) still awaits complete repair methodologies; DNA detection right after extraction is therefore highly needed. Before investing resources in extensive, unreasonable, and uncertain repair techniques, it is vital to distinguish between two possible hypotheses: (i) the DNA is non-existent and cannot be amplified to begin with, making it completely un-repairable; (ii) the DNA is refractory to PCR and is worth being repaired and amplified. Hence, it is extremely important to develop a non-enzymatic technique to detect the most degraded DNA. Keywords: ancient DNA, DNA barcoding, enzymatic repair, PCR
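Cytosine deamination shows up in sequencing data as an excess of C-to-T mismatches, typically concentrated near read ends. The sketch below counts position-wise C-to-T mismatch rates from reads already aligned to a reference; the toy read/reference pairs are hypothetical:

```python
from collections import Counter

def ct_mismatch_profile(pairs, read_len):
    """Position-wise C->T mismatch rate from (reference, read) aligned string pairs."""
    ct = Counter()       # C->T mismatches per position
    c_total = Counter()  # reference C sites per position
    for ref, read in pairs:
        for i, (r, q) in enumerate(zip(ref, read)):
            if r == "C":
                c_total[i] += 1
                if q == "T":
                    ct[i] += 1
    return [ct[i] / c_total[i] if c_total[i] else 0.0 for i in range(read_len)]

# Hypothetical aligned pairs: damage concentrated at the 5' end (position 0)
pairs = [("CCGA", "TCGA"), ("CAGC", "TAGC"), ("CCTC", "CCTC")]
print(ct_mismatch_profile(pairs, read_len=4))  # high rate at position 0 only
```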
Procedia PDF Downloads 400
1242 Use of a New Multiplex Quantitative Polymerase Chain Reaction Based Assay for Simultaneous Detection of Neisseria Meningitidis, Escherichia Coli K1, Streptococcus agalactiae, and Streptococcus pneumoniae
Authors: Nastaran Hemmati, Farhad Nikkhahi, Amir Javadi, Sahar Eskandarion, Seyed Mahmuod Amin Marashi
Abstract:
Neisseria meningitidis, Escherichia coli K1, Streptococcus agalactiae, and Streptococcus pneumoniae cause 90% of bacterial meningitis cases. Almost all infected people die or suffer irreversible neurological complications. Therefore, it is essential to have a diagnostic kit with the ability to quickly detect these fatal infections. The project involved 212 patients from whom cerebrospinal fluid samples were obtained. After total genome extraction and multiplex quantitative polymerase chain reaction (qPCR), the presence or absence of each infectious agent was determined by comparison with standard strains. The specificity, sensitivity, positive predictive value, and negative predictive value calculated were 100%, 92.9%, 50%, and 100%, respectively. Thus, owing to their high specificity and sensitivity, the designed primers can be used instead of bacterial culture, which takes at least 24 to 48 hours. A remarkable benefit of this method is the speed (up to 3 hours) at which the procedure can be completed. It is also worth noting that this method can reduce the unintentional personnel errors which may occur in the laboratory. On the other hand, as this method simultaneously identifies four common agents that cause bacterial meningitis, it could be used as an auxiliary diagnostic technique in laboratories, particularly in emergency medicine. Keywords: cerebrospinal fluid, meningitis, quantitative polymerase chain reaction, simultaneous detection, diagnosis testing
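These four figures derive from a 2x2 confusion matrix against the reference method (here, bacterial culture). A small helper shows the arithmetic; the counts below are made up for illustration, not the study's data:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV, NPV from confusion-matrix counts."""
    return {
        "sensitivity": tp / (tp + fn),  # true positive rate
        "specificity": tn / (tn + fp),  # true negative rate
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

# Hypothetical counts for one pathogen vs. culture as gold standard
print(diagnostic_metrics(tp=13, fp=1, fn=1, tn=197))
```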
Procedia PDF Downloads 116
1241 AI Predictive Modeling of Excited State Dynamics in OPV Materials
Authors: Pranav Gunhal, Krish Jhurani
Abstract:
This study tackles the significant computational challenge of predicting excited state dynamics in organic photovoltaic (OPV) materials—a pivotal factor in the performance of solar energy solutions. Time-dependent density functional theory (TDDFT), though effective, is computationally prohibitive for larger and more complex molecules. As a solution, the research explores the application of transformer neural networks, a type of artificial intelligence (AI) model known for its superior performance in natural language processing, to predict excited state dynamics in OPV materials. The methodology involves a two-fold process. First, the transformer model is trained on an extensive dataset comprising over 10,000 TDDFT calculations of excited state dynamics from a diverse set of OPV materials. Each training example includes a molecular structure and the corresponding TDDFT-calculated excited state lifetimes and key electronic transitions. Second, the trained model is tested on a separate set of molecules, and its predictions are rigorously compared to independent TDDFT calculations. The results indicate a remarkable degree of predictive accuracy. Specifically, for a test set of 1,000 OPV materials, the transformer model predicted excited state lifetimes with a mean absolute error of 0.15 picoseconds, a negligible deviation from TDDFT-calculated values. The model also correctly identified key electronic transitions contributing to the excited state dynamics in 92% of the test cases, signifying a substantial concordance with the results obtained via conventional quantum chemistry calculations. The practical integration of the transformer model with existing quantum chemistry software was also realized, demonstrating its potential as a powerful tool in the arsenal of materials scientists and chemists. The implementation of this AI model is estimated to reduce the computational cost of predicting excited state dynamics by two orders of magnitude compared to conventional TDDFT calculations. The successful utilization of transformer neural networks to accurately predict excited state dynamics provides an efficient computational pathway for the accelerated discovery and design of new OPV materials, potentially catalyzing advancements in the realm of sustainable energy solutions.Keywords: transformer neural networks, organic photovoltaic materials, excited state dynamics, time-dependent density functional theory, predictive modeling
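A minimal transformer-encoder regressor of the kind described can be sketched in PyTorch as follows. Everything here, including the vocabulary size, embedding width, mean-pooling choice, and the idea of tokenizing a molecular string representation, is an assumption for illustration, not the authors' architecture:

```python
import torch
import torch.nn as nn

class LifetimeRegressor(nn.Module):
    """Toy transformer encoder: tokenized molecule -> predicted excited-state lifetime (ps)."""
    def __init__(self, vocab=64, d_model=128, nhead=4, layers=4):
        super().__init__()
        self.embed = nn.Embedding(vocab, d_model)
        enc_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=layers)
        self.head = nn.Linear(d_model, 1)

    def forward(self, tokens):              # tokens: (batch, seq_len) integer ids
        h = self.encoder(self.embed(tokens))
        return self.head(h.mean(dim=1))     # mean-pool over sequence, then regress

model = LifetimeRegressor()
tokens = torch.randint(0, 64, (8, 40))      # a fake batch of tokenized structures
target = torch.rand(8, 1)                   # fake TDDFT lifetimes for training
loss = nn.L1Loss()(model(tokens), target)   # MAE, matching the reported 0.15 ps metric
loss.backward()
print(float(loss))
```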
Procedia PDF Downloads 118
1240 Species Distribution Modelling for Assessing the Effect of Land Use Changes on the Habitat of Endangered Proboscis Monkey (Nasalis larvatus) in Kalimantan, Indonesia
Authors: Wardatutthoyyibah, Satyawan Pudyatmoko, Sena Adi Subrata, Muhammad Ali Imron
Abstract:
The proboscis monkey is endemic to the island of Borneo, with an IUCN (International Union for Conservation of Nature) conservation status of endangered. The monkey has a specific habitat and is sensitive to habitat disturbances. As a consequence of increasing rates of land-use change in the last four decades, its population was reported to have decreased significantly. We quantified the effect of land-use change on the proboscis monkey's habitat through a species distribution modeling (SDM) approach with the Maxent software. We collected presence data and environmental variables, i.e., land cover, topography, bioclimate, distance to the river, distance to the road, and distance to anthropogenic disturbance, to generate predictive distribution maps of the monkeys. We compared prediction maps built from the 2000 and 2015 data to represent the current habitat of the monkey. We overlaid the monkey's predictive distribution map with the existing protected areas to investigate whether the habitat of the monkey is protected under the protected-area networks. The results showed that almost 50% of the monkey's habitat was lost as an effect of land-use change, and only 9% of the current proboscis monkey habitat lies within protected areas. These results are important for the master plan of conservation of the endangered proboscis monkey and provide scientific guidance for future development that incorporates biodiversity issues. Keywords: endemic species, land use change, maximum entropy, spatial distribution
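Maxent fits a maximum-entropy distribution over grid cells from presence and background points; a closely related and commonly used stand-in is a logistic regression on presence-versus-background data. The sketch below assumes a small table of environmental covariates per point and is illustrative only, not a reproduction of the study's model:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)

# Hypothetical covariates per point: [dist_to_river, elevation, forest_cover]
presence = rng.normal([0.5, 50, 0.8], [0.3, 30, 0.1], size=(200, 3))    # occurrences
background = rng.normal([3.0, 200, 0.4], [1.5, 120, 0.2], size=(1000, 3))

X = np.vstack([presence, background])
y = np.concatenate([np.ones(len(presence)), np.zeros(len(background))])

sdm = LogisticRegression(max_iter=1000).fit(X, y)

# Relative habitat suitability for new grid cells (higher = more suitable)
cells = np.array([[0.4, 60, 0.75], [4.0, 250, 0.3]])
print(sdm.predict_proba(cells)[:, 1])
```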
Procedia PDF Downloads 157
1239 Machine Learning Approaches Based on Recency, Frequency, Monetary (RFM) and K-Means for Predicting Electrical Failures and Voltage Reliability in Smart Cities
Authors: Panaya Sudta, Wanchalerm Patanacharoenwong, Prachya Bumrungkun
Abstract:
With the evolution of smart grids, ensuring the reliability and efficiency of electrical systems in smart cities has become crucial. This paper proposes a distinct approach that combines advanced machine learning techniques to accurately predict electrical failures and address voltage reliability issues, aiming to improve the accuracy and efficiency of reliability evaluations in smart cities. The aim of this research is to develop a comprehensive predictive model that accurately predicts electrical failures and voltage reliability in smart cities; the model integrates RFM analysis, K-means clustering, and LSTM networks to achieve this objective. RFM analysis, traditionally used in customer value assessment, is applied to categorize and analyze electrical components based on their failure recency, frequency, and monetary impact. K-means clustering is employed to segment electrical components into distinct groups with similar characteristics and failure patterns. LSTM networks are used to capture the temporal dependencies and patterns in the data. This integration of RFM, K-means, and LSTM results in a robust predictive tool for electrical failures and voltage reliability. The proposed model has been tested and validated on diverse electrical utility datasets. The results show a significant improvement in prediction accuracy and reliability compared to traditional methods, achieving an accuracy of 92.78% and an F1-score of 0.83. The research addresses the question of how accurately electrical failures and voltage reliability can be predicted in smart cities, and investigates the effectiveness of integrating RFM analysis, K-means clustering, and LSTM networks in achieving this goal. By significantly improving prediction accuracy and reliability over traditional methods, the proposed approach contributes to the proactive maintenance and optimization of electrical infrastructures, enhances overall energy management and sustainability, and demonstrates the potential of advanced machine learning techniques to transform the landscape of electrical system management within smart cities. Keywords: electrical state prediction, smart grids, data-driven method, long short-term memory, RFM, k-means, machine learning
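The RFM-plus-clustering stage can be sketched directly with pandas and scikit-learn. The event-log schema below (component id, failure timestamp, repair cost) is an assumption; the study's actual features may differ:

```python
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical failure log: one row per component failure
log = pd.DataFrame({
    "component": ["T1", "T1", "T2", "T3", "T3", "T3"],
    "failed_at": pd.to_datetime(["2024-01-05", "2024-06-01", "2024-03-10",
                                 "2024-02-01", "2024-04-15", "2024-06-20"]),
    "repair_cost": [500, 700, 1200, 300, 250, 400],
})
now = pd.Timestamp("2024-07-01")

rfm = log.groupby("component").agg(
    recency=("failed_at", lambda s: (now - s.max()).days),  # days since last failure
    frequency=("failed_at", "size"),                        # number of failures
    monetary=("repair_cost", "sum"),                        # total repair cost
)

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0)
rfm["cluster"] = kmeans.fit_predict(StandardScaler().fit_transform(rfm))
print(rfm)  # clusters group components with similar failure behavior
```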
Procedia PDF Downloads 56
1238 Precision Pest Management by the Use of Pheromone Traps and Forecasting Module in Mobile App
Authors: Muhammad Saad Aslam
Abstract:
In 2021, our organization launched its proprietary mobile app, the Farm Intelligence platform, an industry-first precision agriculture solution, in Pakistan. It was piloted at 47 locations (spanning around 1,200 hectares of land), addressing growers' pain points by bringing the benefits of precision agriculture to their doorsteps. This year, we have extended its reach by more than 10 times (nearly 130,000 hectares of land) to almost 600 locations across the country. The project team selected highly infested areas to set up pheromone traps, which then enabled the sales team to initiate evidence-based conversations with the grower community about preventive crop protection products, including pesticides and insecticides. Mega farmer meetings, field visits, and demonstration plots, coupled with extensive marketing activities, were set up to engage the farmer community. With the help of the app's real-time pest monitoring (using heat maps and infestation prediction through predictive analytics), we have equipped our growers with on-the-spot insights that help them optimize pesticide applications. Heat maps allow growers to identify infestation hot spots to fine-tune pesticide delivery, while predictive analytics enable preventive application of pesticides before the situation escalates. Ultimately, they empower growers to keep their crops safe for a healthy harvest. Keywords: precision pest management, precision agriculture, real time pest tracking, pest forecasting
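The heat-map layer reduces to binning georeferenced trap catches onto a grid. A minimal sketch (the coordinates and counts are made up, not field data):

```python
import numpy as np

# Hypothetical trap records: (longitude, latitude, moth count)
lon = np.array([68.1, 68.3, 68.2, 68.9, 68.85])
lat = np.array([25.4, 25.5, 25.45, 25.9, 25.88])
count = np.array([12, 30, 22, 80, 65])

# Sum catches into a 5x5 grid; high cells are infestation hot spots
heat, xedges, yedges = np.histogram2d(lon, lat, bins=5, weights=count)
hot = np.unravel_index(np.argmax(heat), heat.shape)
print(heat)
print("hottest cell:", hot)
```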
Procedia PDF Downloads 91
1237 Stress Hyperglycemia: A Predictor of Major Adverse Cardiac Events in Non-Diabetic Patients With Acute Heart Failure
Authors: Fahad Raj Khan, Suleman Khan
Abstract:
There is a lack of consensus about the predictive value of raised blood glucose levels for major adverse cardiac events (MACEs) in non-diabetic patients admitted for acute decompensated heart failure. The purpose of this research was to examine the long-term prognosis of acute decompensated heart failure (ADHF) in non-diabetic persons who had increased blood glucose levels, i.e., stress hyperglycemia, at the time of their ADHF hospitalization. The research involved 650 non-diabetic patients. Based on their admission stress hyperglycemia, they were divided into two groups, i.e., with and without stress hyperglycemia (SHGL). The two groups' one-year outcomes for major adverse cardiac events (MACEs) were compared, and key predictors of MACEs were identified. For statistical analysis, the two-tailed Mann-Whitney U test, Fisher's exact test, and binary logistic regression analysis were utilized. SHGL was found in 353 (54.3%) individuals and was more frequent in men than in women. About 27% of patients with SHGL had previously been admitted for ADHF; almost 62% were hypertensive, whereas 14% had CKD. MACEs were significantly predicted by SHGL, HTN, prior hospitalization for ADHF, CKD, and cardiogenic shock upon admission. SHGL at the time of ADHF admission, independent of DM status, may be a predictive indicator of MACEs. Keywords: stress hyperglycemia, acute heart failure, major adverse cardiac events, MACEs
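The named tests are available directly in scipy. The sketch below runs a Mann-Whitney U test on a continuous variable and a Fisher's exact test on a 2x2 table; all numbers are invented for illustration:

```python
import numpy as np
from scipy.stats import mannwhitneyu, fisher_exact

rng = np.random.default_rng(3)

# Hypothetical admission glucose (mg/dL) in patients with vs. without MACEs
glucose_mace = rng.normal(180, 30, 60)
glucose_no_mace = rng.normal(150, 25, 90)
u_stat, p_u = mannwhitneyu(glucose_mace, glucose_no_mace, alternative="two-sided")

# Hypothetical 2x2 table: rows = SHGL yes/no, columns = MACE yes/no
table = [[45, 308],   # SHGL: MACE vs. no MACE
         [15, 282]]   # no SHGL
odds_ratio, p_f = fisher_exact(table)

print(f"Mann-Whitney U p={p_u:.4f}, Fisher OR={odds_ratio:.2f} p={p_f:.4f}")
```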
Procedia PDF Downloads 94
1236 Smart Disassembly of Waste Printed Circuit Boards: The Role of IoT and Edge Computing
Authors: Muhammad Mohsin, Fawad Ahmad, Fatima Batool, Muhammad Kaab Zarrar
Abstract:
The integration of the Internet of Things (IoT) and edge computing devices offers a transformative approach to electronic waste management, particularly in the dismantling of printed circuit boards (PCBs). This paper explores how these technologies optimize operational efficiency and improve environmental sustainability by addressing challenges such as data security, interoperability, scalability, and real-time data processing. Proposed solutions include advanced machine learning algorithms for predictive maintenance, robust encryption protocols, and scalable architectures that incorporate edge computing. Case studies from leading e-waste management facilities illustrate benefits such as improved material recovery efficiency, reduced environmental impact, improved worker safety, and optimized resource utilization. The findings highlight the potential of IoT and edge computing to revolutionize e-waste dismantling and make the case for a collaborative approach between policymakers, waste management professionals, and technology developers. This research provides important insights into the use of IoT and edge computing to make significant progress in the sustainable management of electronic waste. Keywords: Internet of Things, edge computing, waste PCB disassembly, electronic waste management, data security, interoperability, machine learning, predictive maintenance, sustainable development
Procedia PDF Downloads 31
1235 Patient Outcomes Following Out-of-Hospital Cardiac Arrest
Authors: Scott Ashby, Emily Granger, Mark Connellan
Abstract:
Background: In-hospital management of Out-of-Hospital Cardiac Arrest (OHCA) is complex as the aetiologies are varied. Acute coronary angiography has been shown to improve outcomes for patients with coronary occlusion as the cause; however, these patients are difficult to identify. ECG results may help identify these patients, but the accuracy of this diagnostic test is under debate, and requires further investigation. Methods: Arrest and hospital management information was collated retrospectively for OHCA patients who presented to a single clinical site between 2009 and 2013. Angiography results were then collected and checked for significance with survival to discharge. The presence of a severe lesion (>70%) was then compared to categorised ECG findings, and the accuracy of the test was calculated. Results: 104 patients were included in this study, 44 survived to discharge, 52 died and 8 were transferred to other clinical sites. Angiography appears to significantly correlate with survival to discharge. ECG showed 54.8% sensitivity for detecting the presence of a severe lesion within the group that received angiography. A combined criterion including any ECG pathology showed 100% sensitivity and negative predictive value, however, a low specificity and positive predictive value. Conclusion: In the cohort investigated, ST elevation on ECG is not a sensitive enough screening test to be used to determine whether OHCA patients have coronary stenosis as the likely cause of their arrest, and more investigation into whether screening with a combined ECG criterion, or whether all patients should receive angiography routinely following OHCA is needed.Keywords: out of hospital cardiac arrest, coronary angiography, resuscitation, emergency medicine
Procedia PDF Downloads 395
1234 Emotional Awareness and Working Memory as Predictive Factors for the Habitual Use of Cognitive Reappraisal among Adolescents
Authors: Yuri Kitahara
Abstract:
Background: Cognitive reappraisal refers to an emotion regulation strategy in which one changes the interpretation of emotion-eliciting events. Numerous studies show that cognitive reappraisal is associated with mental health and better social functioning. However, the examination of the predictive factors of adaptive emotion regulation remains an open issue. The present study examined the factors contributing to the habitual use of cognitive reappraisal, with a focus on emotional awareness and working memory. Methods: Data were collected from 30 junior high school students, using a Japanese version of the Emotion Regulation Questionnaire (ERQ), the Levels of Emotional Awareness Scale for Children (LEAS-C), and an N-back task. Results: A positive correlation between emotional awareness and cognitive reappraisal was observed in the high-working-memory group (r = .54, p < .05), whereas no significant relationship was found in the low-working-memory group. In addition, the results of the analysis of variance (ANOVA) showed a significant interaction between emotional awareness and working memory capacity (F(1, 26) = 7.74, p < .05). Subsequent analysis of simple main effects confirmed that high working memory capacity significantly increases the use of cognitive reappraisal for high-emotional-awareness subjects and significantly decreases it for low-emotional-awareness subjects. Discussion: These results indicate that when one has an adequate ability for simultaneous processing of information, explicit understanding of emotion contributes to adaptive cognitive emotion regulation. The findings are discussed along with neuroscientific claims. Keywords: cognitive reappraisal, emotional awareness, emotion regulation, working memory
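The reported interaction test corresponds to a two-way factorial ANOVA. A minimal sketch with statsmodels on synthetic data (group assignments, scores, and the injected interaction effect are made up):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

rng = np.random.default_rng(4)
n = 30  # sample size reported in the abstract

df = pd.DataFrame({
    "awareness": rng.choice(["low", "high"], n),  # emotional awareness group
    "wm": rng.choice(["low", "high"], n),         # working-memory group
})
# Synthetic reappraisal scores with an injected interaction effect
df["reappraisal"] = (rng.normal(4, 1, n)
                     + ((df.awareness == "high") & (df.wm == "high")) * 1.5)

model = ols("reappraisal ~ C(awareness) * C(wm)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))  # main effects and awareness:wm interaction
```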
Procedia PDF Downloads 232
1233 On Consolidated Predictive Model of the Natural History of Breast Cancer Considering Primary Tumor and Primary Distant Metastases Growth
Authors: Ella Tyuryumina, Alexey Neznanov
Abstract:
Finding algorithms to predict the growth of tumors has piqued the interest of researchers ever since the early days of cancer research. A number of studies were carried out as an attempt to obtain reliable data on the natural history of breast cancer growth. Mathematical modeling can play a very important role in the prognosis of the tumor process of breast cancer. However, existing mathematical models describe primary tumor growth and metastases growth separately. Consequently, we propose a mathematical growth model for the primary tumor and primary metastases which may help to improve the predictive accuracy of breast cancer progression, using an original mathematical model referred to as CoM-IV and corresponding software. We are interested in: 1) modelling the whole natural history of the primary tumor and primary metastases; 2) developing an adequate and precise CoM-IV which reflects the relations between PT and MTS; 3) analyzing the CoM-IV scope of application; 4) implementing the model as a software tool. The CoM-IV is based on an exponential tumor growth model, consists of a system of determinate nonlinear and linear equations, and corresponds to the TNM classification. It allows different growth periods of the primary tumor and primary metastases to be calculated: 1) the 'non-visible period' for the primary tumor; 2) the 'non-visible period' for primary metastases; 3) the 'visible period' for primary metastases. The new predictive tool: 1) is a solid foundation for future studies of breast cancer models; 2) does not require any expensive diagnostic tests; 3) is the first predictor which makes its forecast using only current patient data, whereas the others are based on additional statistical data. Thus, the CoM-IV model and predictive software: a) detect different growth periods of the primary tumor and primary metastases; b) forecast the period of primary metastases appearance; c) have higher average prediction accuracy than the other tools; d) can improve forecasts of survival in BC and facilitate optimization of diagnostic tests. The following are calculated by CoM-IV: the number of doublings for the 'non-visible' and 'visible' growth periods of primary metastases, and the tumor volume doubling time (days) for the 'non-visible' and 'visible' growth periods of primary metastases. The CoM-IV makes it possible, for the first time, to predict the whole natural history of primary tumor and primary metastases growth at each stage (pT1, pT2, pT3, pT4), relying only on primary tumor sizes. Summarizing: a) CoM-IV correctly describes primary tumor and primary distant metastases growth of the IV (T1-4N0-3M1) stage with (N1-3) or without regional metastases in lymph nodes (N0); b) it facilitates the understanding of the appearance period and manifestation of primary metastases. Keywords: breast cancer, exponential growth model, mathematical modelling, primary metastases, primary tumor, survival
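Under the exponential growth assumption used by the model, the tumor volume doubling time follows from two volume measurements (the standard Schwartz formula), and the number of doublings from an initial volume is a simple logarithm. A sketch with hypothetical measurements:

```python
import math

def doubling_time(v1, v2, days_between):
    """Tumor volume doubling time (days) under exponential growth (Schwartz formula)."""
    return days_between * math.log(2) / math.log(v2 / v1)

def n_doublings(v_start, v_end):
    """Number of volume doublings between two sizes."""
    return math.log2(v_end / v_start)

# Hypothetical: volume grows from 0.5 cm^3 to 2.0 cm^3 in 180 days
dt = doubling_time(0.5, 2.0, 180)
print(f"doubling time: {dt:.1f} days")  # 90 days here (two doublings in 180 days)
print(f"doublings from one cell (~1e-9 cm^3) to 1 cm^3: {n_doublings(1e-9, 1.0):.1f}")
```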
Procedia PDF Downloads 335
1232 Evaluation of the CRISP-DM Business Understanding Step: An Approach for Assessing the Predictive Power of Regression versus Classification for the Quality Prediction of Hydraulic Test Results
Authors: Christian Neunzig, Simon Fahle, Jürgen Schulz, Matthias Möller, Bernd Kuhlenkötter
Abstract:
Digitalisation in production technology is a driver for the application of machine learning methods. Through the application of predictive quality, the great potential for saving necessary quality control can be exploited through the data-based prediction of product quality and states. However, the serial use of machine learning applications is often prevented by various problems. Fluctuations occur in real production data sets, which are reflected in trends and systematic shifts over time. To counteract these problems, data preprocessing includes rule-based data cleaning, the application of dimensionality reduction techniques, and the identification of comparable data subsets to extract stable features. Successful process control of the target variables aims to centre the measured values around a mean and minimise variance. Competitive leaders claim to have mastered their processes. As a result, much of the real data has a relatively low variance. For the training of prediction models, the highest possible generalisability is required, which is at least made more difficult by this data availability. The implementation of a machine learning application can be interpreted as a production process. The CRoss Industry Standard Process for Data Mining (CRISP-DM) is a process model with six phases that describes the life cycle of data science. As in any process, the costs to eliminate errors increase significantly with each advancing process phase. For the quality prediction of hydraulic test steps of directional control valves, the question arises in the initial phase whether a regression or a classification is more suitable. In the context of this work, the initial phase of the CRISP-DM, the business understanding, is critically compared for the use case at Bosch Rexroth with regard to regression and classification. The use of cross-process production data along the value chain of hydraulic valves is a promising approach to predict the quality characteristics of workpieces. Suitable methods for leakage volume flow regression and classification for inspection decision are applied. Impressively, classification is clearly superior to regression and achieves promising accuracies.Keywords: classification, CRISP-DM, machine learning, predictive quality, regression
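The business-understanding question posed here, whether to regress the leakage volume flow or to classify the pass/fail inspection decision, can be prototyped quickly on one feature table. A hedged sketch on synthetic data (the real features would come from cross-process production measurements):

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier, GradientBoostingRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
X = rng.normal(size=(500, 8))                               # stand-in process features
leakage = X @ rng.normal(size=8) + rng.normal(0, 0.3, 500)  # leakage volume flow
passed = (leakage < np.quantile(leakage, 0.9)).astype(int)  # inspection decision

r2 = cross_val_score(GradientBoostingRegressor(), X, leakage, cv=5, scoring="r2")
acc = cross_val_score(GradientBoostingClassifier(), X, passed, cv=5, scoring="accuracy")
print(f"regression R2: {r2.mean():.2f} | classification accuracy: {acc.mean():.2f}")
```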
Procedia PDF Downloads 144
1231 A Hierarchical Method for Multi-Class Probabilistic Classification Vector Machines
Authors: P. Byrnes, F. A. DiazDelaO
Abstract:
The Support Vector Machine (SVM) has become widely recognised as one of the leading algorithms in machine learning for both regression and binary classification. It expresses predictions in terms of a linear combination of kernel functions centred on the support vectors. Despite its popularity amongst practitioners, SVM has some limitations, with the most significant being the generation of point predictions as opposed to predictive distributions. Stemming from this issue, a probabilistic model, namely Probabilistic Classification Vector Machines (PCVM), has been proposed which respects the original functional form of SVM whilst also providing a predictive distribution. As physical system designs become more complex, an increasing number of classification tasks involving industrial applications consist of more than two classes. Consequently, this research proposes a framework which allows for the extension of PCVM to a multi-class setting. Additionally, the original PCVM framework relies on the use of type-II maximum likelihood to provide estimates for both the kernel hyperparameters and the model evidence. In a high-dimensional multi-class setting, however, this approach has been shown to be ineffective due to bad scaling as the number of classes increases. Accordingly, we propose the application of Markov Chain Monte Carlo (MCMC) based methods to provide a posterior distribution over both parameters and hyperparameters. The proposed framework will be validated against current multi-class classifiers through synthetic and real-life implementations. Keywords: probabilistic classification vector machines, multi class classification, MCMC, support vector machines
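The MCMC ingredient can be illustrated with a bare random-walk Metropolis sampler over a single hyperparameter (here a positive kernel width; the target log-posterior is invented purely for illustration, not derived from the PCVM model):

```python
import math
import random

def log_post(theta):
    """Toy unnormalized log-posterior for a positive kernel hyperparameter."""
    if theta <= 0:
        return -math.inf
    return -0.5 * (math.log(theta) - 0.5) ** 2 - math.log(theta)

random.seed(0)
theta, samples = 1.0, []
for _ in range(20000):
    prop = theta + random.gauss(0.0, 0.3)  # random-walk proposal
    if math.log(random.random()) < log_post(prop) - log_post(theta):
        theta = prop                       # accept; otherwise keep current value
    samples.append(theta)

post = samples[5000:]  # discard burn-in
print("posterior mean of kernel width:", sum(post) / len(post))
```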
Procedia PDF Downloads 221
1230 A Comparative Study of Optimization Techniques and Models to Forecasting Dengue Fever
Abstract:
Dengue is a serious public health issue that causes significant annual economic and welfare burdens on nations. However, enhanced optimization techniques and quantitative modeling approaches can predict the incidence of dengue. By advocating for a data-driven approach, public health officials can make informed decisions, thereby improving the overall effectiveness of sudden disease outbreak control efforts. The National Oceanic and Atmospheric Administration and the Centers for Disease Control and Prevention are two of the U.S. Federal Government agencies from which this study uses environmental data. Based on environmental data that describe changes in temperature, precipitation, vegetation, and other factors known to affect dengue incidence, several predictive models are constructed that use different machine learning methods to estimate weekly dengue cases. The first step involves preparing the data, which includes handling outliers and missing values to make sure the data is ready for subsequent processing and the creation of an accurate forecasting model. In the second phase, multiple feature selection procedures are applied using various machine learning models and optimization techniques. During the third phase of the research, machine learning models like the Huber Regressor, Support Vector Machine, Gradient Boosting Regressor (GBR), and Support Vector Regressor (SVR) are compared with several optimization techniques for feature selection, such as Harmony Search and the Genetic Algorithm. In the fourth stage, the model's performance is evaluated using Mean Square Error (MSE), Mean Absolute Error (MAE), and Root Mean Square Error (RMSE) as metrics. The goal of optimization is to select the strategy with the fewest errors, the lowest cost, the greatest productivity, or the best potential results. In a variety of industries, including engineering, science, management, mathematics, finance, and medicine, optimization is widely employed. An effective optimization method based on harmony search and an integrated genetic algorithm is introduced for input feature selection, and it shows an important improvement in the model's predictive accuracy. The predictive models with the Huber Regressor as the foundation perform the best for both optimization and prediction. Keywords: deep learning model, dengue fever, prediction, optimization
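Genetic-algorithm feature selection treats each candidate feature subset as a bitmask evolved against a cross-validated fitness. A compact sketch on synthetic data (the population size, mutation rate, and Huber Regressor fitness are illustrative choices, not the study's settings):

```python
import numpy as np
from sklearn.linear_model import HuberRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(6)
X = rng.normal(size=(300, 12))                            # environmental features
y = X[:, 0] * 3 + X[:, 4] * 2 + rng.normal(0, 0.5, 300)   # weekly cases (synthetic)

def fitness(mask):
    """Cross-validated negative MAE of a Huber Regressor on the selected features."""
    if not mask.any():
        return -np.inf
    return cross_val_score(HuberRegressor(), X[:, mask], y, cv=3,
                           scoring="neg_mean_absolute_error").mean()

pop = rng.integers(0, 2, size=(20, 12)).astype(bool)      # initial random bitmasks
for gen in range(15):
    scores = np.array([fitness(m) for m in pop])
    parents = pop[np.argsort(scores)[-10:]]               # keep the best half
    children = []
    for _ in range(10):
        a, b = parents[rng.integers(10)], parents[rng.integers(10)]
        cut = rng.integers(1, 12)
        child = np.concatenate([a[:cut], b[cut:]])        # one-point crossover
        flip = rng.random(12) < 0.05                      # mutation
        children.append(child ^ flip)
    pop = np.vstack([parents, np.array(children)])

best = pop[np.argmax([fitness(m) for m in pop])]
print("selected features:", np.flatnonzero(best))         # should recover 0 and 4
```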
Procedia PDF Downloads 65
1229 Character Development Outcomes: A Predictive Model for Behaviour Analysis in Tertiary Institutions
Authors: Rhoda N. Kayongo
Abstract:
As behavior analysts in education continue to debate how higher education institutions can continue to benefit from their social and academic programs, higher education is facing challenges in the area of character development. This is manifested in college completion rates and in the prevalence of teen pregnancies, drug abuse, sexual abuse, suicide, plagiarism, lack of academic integrity, and violence among students. Attending college is a perceived opportunity to positively influence the actions and behaviors of the next generation of society; thus, colleges and universities have to provide opportunities to develop students' values and behaviors. Prior studies were mainly conducted in private institutions, and more so in developed countries. However, with the complexity of the current student body in a changing world, a multidimensional approach combining the multiple factors that enhance character development outcomes is needed to suit the changing trends. The main purpose of this study was to identify opportunities in colleges and develop a model for predicting character development outcomes. A survey questionnaire composed of 7 scales, with in-classroom interaction, out-of-classroom interaction, school climate, personal lifestyle, home environment, and peer influence as independent variables and character development outcomes as the dependent variable, was administered to a total of five hundred and one 3rd- and 4th-year students in selected public colleges and universities in the Philippines and Rwanda. Using structural equation modelling, a predictive model explained 57% of the variance in character development outcomes. Findings from the analysis showed that in-classroom interactions have a substantial direct influence on students' character development outcomes (r = .75, p < .05). In addition, out-of-classroom interaction, school climate, and home environment contributed to students' character development outcomes, but in an indirect way. The study concluded that the classroom offers many opportunities for teachers to teach, model, and integrate character development among their students. Thus, public colleges and universities are encouraged to deliberately implement experiences that cultivate character within the classroom. These may contribute tremendously to students' character development outcomes and hence render effective models of behaviour analysis in higher education. Keywords: character development, tertiary institutions, predictive model, behavior analysis
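Structural equation modelling itself requires a dedicated package, but the headline figure, the share of outcome variance explained by the six predictors, can be approximated with an ordinary multiple-regression R². A simplified sketch on synthetic data (all scale scores and weights are invented, and plain regression is a stand-in for the full SEM):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(7)
n = 501  # sample size reported in the abstract

# Synthetic scale scores for the six predictors
X = rng.normal(3.5, 0.6, size=(n, 6))
weights = np.array([0.75, 0.2, 0.2, 0.1, 0.15, 0.1])  # in-classroom dominates
outcome = X @ weights + rng.normal(0, 0.55, n)

model = LinearRegression().fit(X, outcome)
print(f"variance explained (R^2): {model.score(X, outcome):.2f}")
```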
Procedia PDF Downloads 136
1228 The Predictive Power of Successful Scientific Theories: An Explanatory Study on Their Substantive Ontologies through Theoretical Change
Authors: Damian Islas
Abstract:
Debates on realism in science concern two different questions: (I) whether the unobservable entities posited by theories can be known; and (II) whether any knowledge we have of them is objective or not. Question (I) arises from the doubt that, since observation is the basis of all our factual knowledge, unobservable entities cannot be known. Question (II) arises from the doubt that, since scientific representations are inextricably laden with the subjective, idiosyncratic, and a priori features of human cognition and scientific practice, they cannot convey any reliable information on how their objects are in themselves. A way of understanding scientific realism (SR) is through three lines of inquiry: ontological, semantic, and epistemological. Ontologically, scientific realism asserts the existence of a world independent of the human mind. Semantically, scientific realism assumes that theoretical claims about reality have truth values and, thus, should be construed literally. Epistemologically, scientific realism holds that theoretical claims offer us knowledge of the world. Nowadays, the literature on scientific realism has proceeded rather far beyond the realism versus antirealism debate. One such development, structural realism, represents a middle-ground position between the two, according to which science can attain justified true beliefs concerning relational facts about the unobservable realm but cannot attain justified true beliefs concerning the intrinsic nature of any objects occupying that realm. That is, the structural content of scientific theories about the unobservable can be known, but facts about the intrinsic nature of the entities that figure as place-holders in those structures cannot be known. There are two possible versions of SR: Epistemological Structural Realism (ESR) and Ontic Structural Realism (OSR). On ESR, an agnostic stance is preserved with respect to the natures of unobservable entities, but the possibility of knowing the relations obtaining between those entities is affirmed. OSR includes the rather striking claim that when it comes to the unobservables theorized about within fundamental physics, relations exist, but objects do not. Focusing on ESR, questions arise concerning its ability to explain the empirical success of a theory. Empirical success certainly involves predictive success, and predictive success implies a theory's power to make accurate predictions. But a theory's power to make any predictions at all seems to derive precisely from its core axioms or laws concerning unobservable entities and mechanisms, and not simply from the sort of structural relations often expressed in equations. The specific challenge to ESR concerns its ability to explain the explanatory and predictive power of successful theories without appealing to their substantive ontologies, which are often not preserved by their successors. The response to this challenge will depend on the various and subtly different versions of ESR and OSR, which show a sort of progression, through eliminativist OSR to moderate OSR, of gradual increase in the ontological status accorded to objects. Knowing the relations between unobserved entities is methodologically identical to asserting that these relations between unobserved entities exist. Keywords: eliminativist ontic structural realism, epistemological structuralism, moderate ontic structural realism, ontic structuralism
Procedia PDF Downloads 118
1227 Refining Scheme Using Amphibious Epistemologies
Authors: David Blaine, George Raschbaum
Abstract:
The evaluation of DHCP has synthesized SCSI disks, and current trends suggest that the exploration of e-business that would allow for further study into robots will soon emerge. Given the current status of embedded algorithms, hackers worldwide obviously desire the exploration of replication, which embodies the confusing principles of programming languages. In our research we concentrate our efforts on arguing that erasure coding can be made "fuzzy", encrypted, and game-theoretic. Keywords: SCSI disks, robot, algorithm, hacking, programming language
Procedia PDF Downloads 429
1226 Benefits of Tourist Experiences for Families: A Systematic Literature Review Using Nvivo
Authors: Diana Cunha, Catarina Coelho, Ana Paula Relvas, Elisabeth Kastenholz
Abstract:
Context: Tourist experiences have a recognized impact on the well-being of individuals. However, studies on the specific benefits of tourist experiences for families are scattered across different disciplines. This study aims to systematically review the literature to synthesize the evidence on the benefits of tourist experiences for families. Research Aim: The main objective is to systematize the evidence in the literature regarding the benefits of tourist experiences for families. Methodology: A systematic literature review was conducted using NVivo, analyzing 33 scientific studies obtained from various databases. The search terms used were "family"/"couple" and "tourist experience". The studies included quantitative, qualitative, and mixed methods designs as well as literature reviews. All works prior to the year 2000 were excluded, and the search was restricted to full text. A language filter was also used, considering articles in Portuguese, English, and Spanish. For the NVivo analysis, information was coded from both deductive and inductive perspectives. To minimize the subjectivity of the selection and coding process, two of the authors discussed the process and agreed on criteria that would make the coding more objective. Once the coding process in NVivo was completed, the data relating to the identification/characterization of the works were exported to the Statistical Package for the Social Sciences (SPSS) to characterize the sample. Findings: The results highlight that tourist experiences have several benefits for family systems, including the strengthening of family and marital bonds, the creation of family memories, and overall well-being and life satisfaction. These benefits contribute both to immediate improvement in relationship quality and to long-term family identity construction and transgenerational transmission. Theoretical Importance: This study emphasizes the systemic nature of the effects and relationships within family systems. It also shows that no harm was reported within these experiences, only some challenges related to positive outcomes. Data Collection and Analysis Procedures: The study collected data from 33 scientific studies published predominantly after 2013. The data were analyzed using NVivo, employing a systematic review approach. Question Addressed: The study addresses the question of the benefits of tourist experiences for families and how these experiences contribute to family functioning and individual well-being. Conclusion: Tourist experiences provide opportunities for families to enhance their interpersonal relationships and create lasting memories. The findings suggest that formal evidence-based interventions could further enhance the potential benefits of these experiences and be a valuable preventive tool in therapeutic interventions. Keywords: family systems, individual and family well-being, marital satisfaction, tourist experiences
Procedia PDF Downloads 69
1225 6-Degree-Of-Freedom Spacecraft Motion Planning via Model Predictive Control and Dual Quaternions
Authors: Omer Burak Iskender, Keck Voon Ling, Vincent Dubanchet, Luca Simonini
Abstract:
This paper presents a Guidance and Control (G&C) strategy to approach and synchronize with potentially rotating targets. The proposed strategy generates and tracks a safe trajectory for space servicing missions, including tasks like approaching, inspecting, and capturing. The main objective of this paper is to validate the G&C laws using a Hardware-In-the-Loop (HIL) setup with realistic rendezvous and docking equipment. Throughout this work, the assumption of full relative state feedback is relaxed through onboard sensors that introduce realistic errors and delays, and the proposed closed-loop approach demonstrates robustness to this challenge. Moreover, the G&C blocks are unified via the Model Predictive Control (MPC) paradigm, and the coupling between translational motion and rotational motion is addressed via a dual quaternion based kinematic description. In this work, G&C is formulated as a convex optimization problem where constraints such as thruster limits and output constraints are explicitly handled. Furthermore, the Monte-Carlo method is used to evaluate the robustness of the proposed method to initial condition errors, uncertainty in the target's motion and attitude, and actuator errors. A capture scenario is tested with the robotic test bench, whose onboard sensors estimate the position and orientation of a drifting satellite through camera imagery. Finally, the approach is compared with currently used robust H-infinity controllers and the guidance profile provided by the industrial partner. The HIL experiments demonstrate that the proposed strategy is a potential candidate for future space servicing missions because 1) the algorithm is real-time implementable, as convex programming offers deterministic convergence properties and guarantees a finite-time solution; 2) critical physical and output constraints are respected; 3) robustness to sensor errors and uncertainties in the system is proven; and 4) it couples translational motion with rotational motion. Keywords: dual quaternion, model predictive control, real-time experimental test, rendezvous and docking, spacecraft autonomy, space servicing
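Dual quaternions encode rotation and translation in a single algebraic object, which is what lets the MPC treat the coupled motion uniformly. A minimal sketch of the composition rule follows (the example pose values are arbitrary, and this is standard dual quaternion algebra, not the paper's controller):

```python
import numpy as np

def qmul(a, b):
    """Hamilton product of quaternions (w, x, y, z)."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                     w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2])

def pose_to_dq(q_rot, t):
    """Rigid transform -> unit dual quaternion (real part, dual part)."""
    t_quat = np.array([0.0, *t])
    return q_rot, 0.5 * qmul(t_quat, q_rot)

def dq_mul(dq1, dq2):
    """Dual quaternion product: composes two rigid transforms."""
    r1, d1 = dq1
    r2, d2 = dq2
    return qmul(r1, r2), qmul(r1, d2) + qmul(d1, r2)

# Example: 90-degree yaw plus a 1 m translation along x, composed with itself
q_yaw = np.array([np.cos(np.pi/4), 0.0, 0.0, np.sin(np.pi/4)])
dq = pose_to_dq(q_yaw, [1.0, 0.0, 0.0])
print(dq_mul(dq, dq))  # the pose after applying the transform twice
```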
Procedia PDF Downloads 146
1224 The Regulation of the Cancer Epigenetic Landscape Lies in the Realm of the Long Non-coding RNAs
Authors: Ricardo Alberto Chiong Zevallos, Eduardo Moraes Rego Reis
Abstract:
Pancreatic adenocarcinoma (PDAC) patients have a less than 10% 5-year survival rate. PDAC has no defined diagnostic and prognostic biomarkers. Gemcitabine is the first-line drug in PDAC and several other cancers. Long non-coding RNAs (lncRNAs) contribute to tumorigenesis and are potential biomarkers for PDAC. Although lncRNAs aren’t translated into proteins, they have important functions. LncRNAs can decoy or recruit proteins from the epigenetic machinery, act as microRNA sponges, participate in protein translocation through different cellular compartments, and even promote chemoresistance. The chromatin remodeling enzyme EZH2 is a histone methyltransferase that catalyzes the methylation of histone 3 at lysine 27, silencing local expression. EZH2 is ambivalent: it can also activate gene expression independently of its histone methyltransferase activity. EZH2 is overexpressed in several cancers and interacts with lncRNAs, being recruited to a specific locus. EZH2 can be recruited to activate an oncogene or silence a tumor suppressor. The misregulation of lncRNAs in cancer can result in the differential recruitment of EZH2 and in a distinct epigenetic landscape, promoting chemoresistance. The relevance of the EZH2-lncRNA interaction to chemoresistant PDAC was assessed by Real Time quantitative PCR (RT-qPCR) and RNA Immunoprecipitation (RIP) experiments with naïve and gemcitabine-resistant PDAC cells. The expression of several lncRNAs and EZH2 gene targets was evaluated contrasting naïve and resistant cells. Selection of candidate genes was made by bioinformatic analysis and literature curation. Indeed, the resistant cell line showed higher expression of chemoresistance-associated lncRNAs and protein coding genes. RIP detected lncRNAs interacting with EZH2 with varying intensity levels in the cell lines. During RIP, the nuclear fraction of the cells was incubated with an antibody for EZH2 and with magnetic beads. The RNA precipitated with the beads-antibody-EZH2 complex was isolated and reverse transcribed. The presence of candidate lncRNAs was detected by RT-qPCR, and the enrichment was calculated relative to INPUT (total lysate control sample collected before RIP). The enrichment levels varied across the several lncRNAs and cell lines. The EZH2-lncRNA interaction might be responsible for the regulation of chemoresistance-associated genes in multiple cancers. The relevance of the lncRNA-EZH2 interaction to PDAC was assessed by siRNA knockdown of a lncRNA, followed by the analysis of the EZH2 target expression by RT-qPCR. The chromatin immunoprecipitation (ChIP) of EZH2 and H3K27me3 followed by RT-qPCR with primers for EZH2 targets also assesses the specificity of the EZH2 recruitment by the lncRNA. This is the first report of the interaction of EZH2 and the lncRNAs HOTTIP and PVT1 in chemoresistant PDAC. HOTTIP and PVT1 were described as promoting chemoresistance in several cancers, but the role of EZH2 has not been clarified. For the first time, the lncRNA LINC01133 was detected in a chemoresistant cancer. The interactions of EZH2 with LINC02577, LINC00920, LINC00941, and LINC01559 have never been reported in any context. The novel lncRNA-EZH2 interactions regulate chemoresistance-associated genes in PDAC and might be relevant to other cancers.
Therapies targeting EZH2 alone have not been successful, and a combinatorial approach that also targets the lncRNAs interacting with it might be key to overcoming chemoresistance in several cancers. Keywords: epigenetics, chemoresistance, long non-coding RNAs, pancreatic cancer, histone modification
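The abstract computes RIP enrichment relative to the INPUT control but does not state the exact formula used; a common choice is the percent-input method, sketched below with hypothetical Ct values and input fraction.

```python
# Percent-input calculation for RIP-qPCR enrichment. The Ct values and the
# 1% input fraction are hypothetical; the study's exact method is not stated.
import math

def percent_input(ct_ip: float, ct_input: float, input_fraction: float = 0.01) -> float:
    """Enrichment of the IP sample relative to the INPUT control.

    The INPUT Ct is first adjusted to represent 100% of the lysate
    (1% input -> subtract log2(100)), then 100 * 2^(adjusted INPUT - IP).
    """
    ct_input_adj = ct_input - math.log2(1.0 / input_fraction)
    return 100.0 * 2.0 ** (ct_input_adj - ct_ip)

# Hypothetical Ct values for one lncRNA in naive vs. gemcitabine-resistant cells
print(f"naive:     {percent_input(ct_ip=27.5, ct_input=24.0):.3f}% of input")
print(f"resistant: {percent_input(ct_ip=25.1, ct_input=23.8):.3f}% of input")
```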
Procedia PDF Downloads 96
1223 Educational Leadership Preparation Program Review of Employer Satisfaction
Authors: Glenn Koonce
Abstract:
There is a need to improve university educational leadership preparation programs through the processes of accreditation and continuous improvement. The program faculty at a university in the eastern United States have incorporated an employer-satisfaction focus group to address the national accreditation standard requiring that employers be satisfied with completers' preparation for the position of principal or assistant principal. Using the proficiencies required by the Council for the Accreditation of Educator Preparation (CAEP), the following research questions are investigated: 1) On which proficiencies do completers perform most strongly? 2) Which proficiencies need to be strengthened? 3) What strengths beyond the required proficiencies do completers demonstrate? 4) What areas of responsibility beyond the required proficiencies do completers take on? 5) How can the program better prepare candidates for their positions? This study focuses on employers in one public school district that employs a large number of the program's completers as principals and assistant principals. The focus group participants are central office directors who evaluate principals and principals who evaluate assistant principals. The focus group questions were constructed from the recommendations of an accreditation regulatory specialist, reviewed by an expert panel, and piloted by an experienced focus group leader. The focus group session was audio-recorded, transcribed, and analyzed using NVivo Version 14. After folders were constructed in NVivo, the transcript was loaded and reviewed to identify significant statements and assess core ideas for developing primary themes, which were then aligned with the research questions. Codes from the transcript were assigned to the themes, and NVivo provided a coding hierarchy chart, a graphical illustration that frames the coding, as sketched below. A final report of the coding process was designed using the primary themes and pertinent codes, supported by excerpts from the transcript. The outcome of this study is to identify themes that provide evidence that the educational leadership program is meeting its mission to improve PreK-12 student achievement through well-prepared completers who have achieved the position of principal or assistant principal. These considerations will be used to derive a composite profile of employers' satisfaction with program completers and their capacity to serve, influence, and thrive as educational leaders. Analysis of the identified themes will surface issues that may challenge university educational leadership programs to improve. Results, conclusions, and recommendations are used for continuous improvement, which is another national accreditation standard required of the program. Keywords: educational leadership preparation, CAEP accreditation, principal & assistant principal evaluations, continuous improvement
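NVivo builds the coding hierarchy interactively; purely as an illustration of the tally that underlies such a chart, here is a sketch in which the theme and code names are hypothetical, not the study's findings.

```python
# Illustrative tally behind a coding-hierarchy chart. NVivo performs this
# interactively; these theme/code names are hypothetical, not the study's.
from collections import Counter

coded_excerpts = [                      # (primary theme, code) per excerpt
    ("instructional leadership", "data-driven decision making"),
    ("instructional leadership", "teacher evaluation"),
    ("school climate", "communication with staff"),
    ("school climate", "communication with staff"),
    ("areas to strengthen", "school finance"),
]

theme_counts = Counter(theme for theme, _ in coded_excerpts)
code_counts = Counter(coded_excerpts)

for theme, n in theme_counts.most_common():     # hierarchy: theme -> its codes
    print(f"{theme} ({n} excerpts)")
    for (t, code), m in code_counts.items():
        if t == theme:
            print(f"  - {code}: {m}")
```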
Procedia PDF Downloads 28
1222 Predictive Machine Learning Model for Assessing the Impact of Untreated Teeth Grinding on Gingival Recession and Jaw Pain
Authors: Joseph Salim
Abstract:
This paper proposes the development of a supervised machine learning system to predict the consequences of untreated bruxism (teeth grinding) on gingival (gum) recession and jaw pain (most often bilateral jaw pain, with possible headaches and a limited ability to open the mouth). As a general dentist in a multi-specialty practice, the author has encountered many patients suffering from these issues due to uncontrolled bruxism at night. The most effective treatment for managing this problem involves wearing a nightguard during sleep and receiving therapeutic Botox injections to relax the masseter muscles responsible for grinding. However, some patients choose to postpone these treatments, leading to potentially irreversible and costlier consequences in the future. The proposed machine learning model aims to track patients who forgo the recommended treatments and estimate the percentage of individuals who will experience worsening jaw pain, gingival recession, or both within a 3-to-5-year timeframe. By accurately predicting these outcomes, the model seeks to motivate patients to address the root cause proactively, saving time and pain, improving quality of life, and avoiding much costlier treatments such as gingival grafts or full-mouth rehabilitation to recover the vertical dimension of occlusion lost to clinical crowns shortened by bruxism. Keywords: artificial intelligence, machine learning, predictive insights, bruxism, teeth grinding, therapeutic botox, nightguard, gingival recession, gum recession, jaw pain
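The abstract does not specify the model class or feature set; as a sketch of what such a supervised predictor could look like, here is a logistic-regression baseline in which the file name, feature columns, and label are all hypothetical.

```python
# Hypothetical sketch of the proposed supervised predictor. The file name,
# feature columns, and label are invented for illustration; the paper does
# not specify the algorithm or the data schema.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("untreated_bruxism_patients.csv")       # hypothetical data
features = ["age", "grinding_severity_score", "years_untreated",
            "baseline_recession_mm", "nightly_clenching_frequency"]
X, y = df[features], df["worsened_within_5y"]            # 1 = pain/recession worsened

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

# Probability of worsening within the 3-to-5-year window, per patient
p_worse = model.predict_proba(X_te)[:, 1]
print("AUC:", roc_auc_score(y_te, p_worse))
```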
Procedia PDF Downloads 93
1221 Big Data in Telecom Industry: Effective Predictive Techniques on Call Detail Records
Authors: Sara ElElimy, Samir Moustafa
Abstract:
Mobile network operators face many challenges in the digital era, especially high demands from customers. Mobile network operators are a major source of big data, and traditional techniques are not effective in the new era of big data, the Internet of Things (IoT), and 5G; as a result, handling diverse big datasets effectively becomes a vital task for operators as data volumes grow continuously and networks move from Long Term Evolution (LTE) to 5G. There is therefore an urgent need for effective big data analytics to predict future demand, traffic, and network performance in order to fulfill the requirements of the fifth generation of mobile network technology. In this paper, we introduce data science techniques using machine learning and deep learning algorithms: the autoregressive integrated moving average (ARIMA) model, Bayesian-based curve fitting, and a recurrent neural network (RNN) are employed in a data-driven application for mobile network operators. The framework for each model covers parameter identification, estimation, prediction, and finally a data-driven application of the predictions to business and network-performance use cases. These models are applied to call detail record (CDR) datasets from the Telecom Italia Big Data Challenge. Evaluation using well-known criteria shows that ARIMA (a machine learning-based model) is more accurate as a predictive model on such a dataset than the RNN (a deep learning model). Keywords: big data analytics, machine learning, CDRs, 5G
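The abstract names an identification, estimation, and prediction pipeline without giving the model orders; the sketch below walks an ARIMA model through those stages on a synthetic hourly traffic series, with the (p, d, q) order and the data both hypothetical.

```python
# ARIMA walk-through of the estimation -> prediction stages on a synthetic
# hourly traffic series; the (2, 1, 2) order and the data are hypothetical,
# not the paper's settings.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
hours = pd.date_range("2013-11-01", periods=24 * 14, freq="h")
daily_cycle = 100 + 40 * np.sin(2 * np.pi * hours.hour / 24)   # diurnal load
traffic = pd.Series(daily_cycle + rng.normal(0, 5, len(hours)), index=hours)

train, test = traffic[:-24], traffic[-24:]        # hold out the final day
model = ARIMA(train, order=(2, 1, 2)).fit()       # estimation
forecast = model.forecast(steps=24)               # prediction

mae = np.mean(np.abs(forecast.values - test.values))
print(f"MAE over the held-out day: {mae:.2f}")
```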
Procedia PDF Downloads 139
1220 Enhancing the Pricing Expertise of an Online Distribution Channel
Authors: Luis N. Pereira, Marco P. Carrasco
Abstract:
Dynamic pricing is a revenue management strategy in which hotel suppliers define, over time, flexible and differentiated prices for their services for different potential customers, considering the profile of e-consumers as well as demand and market supply. The fundamentals of dynamic pricing thus rest on economic theory (the price elasticity of demand) and market segmentation. This study aims to define a dynamic pricing strategy and an offer contextualized to each e-consumer's profile in order to increase the number of reservations on an online distribution channel. Segmentation methods (hierarchical and non-hierarchical) were used to identify and validate an optimal number of market segments. A profile of each market segment was studied, considering the characteristics of the e-consumers and the probability of reserving a room. In addition, the price elasticity of demand was estimated for each segment using econometric models. Finally, predictive models were used to define rules for classifying new e-consumers into the pre-defined segments. The empirical study illustrates how the intelligence of an online distribution channel system can be improved through an optimal dynamic pricing strategy and an offer contextualized to the profile of each new e-consumer. A database of 11 million e-consumers of an online distribution channel was used in this study. The results suggest that an appropriate market segmentation policy in online reservation systems benefits service suppliers, because it yields a higher probability of reservation and generates more profit than fixed pricing. Keywords: dynamic pricing, e-consumers segmentation, online reservation systems, predictive analytics
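The abstract describes two stages, non-hierarchical segmentation and per-segment elasticity estimation, without naming the exact methods; the sketch below uses k-means and a log-log regression as stand-ins, with the file name, feature columns, and number of segments all hypothetical.

```python
# Sketch of the two stages: non-hierarchical segmentation (k-means stands in)
# and per-segment price elasticity from a log-log regression. File name,
# feature columns, and the number of segments are hypothetical.
import numpy as np
import pandas as pd
from sklearn.cluster import KMeans

df = pd.read_csv("e_consumer_bookings.csv")       # hypothetical dataset
profile = df[["lead_time_days", "stay_length", "party_size", "past_bookings"]]
df["segment"] = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(profile)

for seg, grp in df.groupby("segment"):
    # log(bookings) = a + e * log(price): the slope e is the price elasticity
    e, a = np.polyfit(np.log(grp["price"]), np.log(grp["bookings"]), 1)
    note = "  (elastic: lower prices raise revenue)" if e < -1 else ""
    print(f"segment {seg}: elasticity ~ {e:.2f}{note}")
```

A segment with elasticity below -1 responds strongly to discounts, so the channel can price it aggressively, while inelastic segments tolerate higher rates; this is the economic rationale the abstract attributes to dynamic pricing.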
Procedia PDF Downloads 234