Search results for: game outcome prediction
1393 Membrane-Localized Mutations as Predictors of Checkpoint Blockade Efficacy in Cancer
Authors: Zoe Goldberger, Priscilla S. Briquez, Jeffrey A. Hubbell
Abstract:
Tumor cells carry mutations, resulting from genetic instability, that the immune system can actively recognize. Immune checkpoint immunotherapy (ICI) is commonly used in the clinic to re-activate immune reactions against mutated proteins, called neoantigens, resulting in tumor remission in cancer patients. However, only around 20% of patients show a durable response to ICI. While tumor mutational burden (TMB) has been approved by the Food and Drug Administration (FDA) as a criterion for ICI therapy, the relevance of the subcellular localization of the mutated proteins within the tumor cell has not been investigated. Here, we hypothesized that the localization of mutations impacts immune responsiveness to ICI. We analyzed publicly available tumor mutation sequencing data of ICI-treated patients from 3 independent datasets. We extracted the subcellular localization from the UniProtKB/Swiss-Prot database and quantified the proportion of membrane, cytoplasmic, nuclear, or secreted mutations per patient. We analyzed this information in relation to response to ICI treatment and overall survival, showing with 1722 ICI-treated patients that a high mutational burden localized at the membrane (mTMB) correlates with ICI responsiveness and improved overall survival in multiple cancer types. We anticipate that our results will improve the predictability of cancer patient response to ICI, with potential implications in clinical guidelines to tailor ICI treatment. This would not only increase survival for those receiving ICI, but also improve patients' quality of life by reducing the number of patients enduring non-effective ICI treatments.
Keywords: cancer, immunotherapy, membrane neoantigens, efficacy prediction, biomarkers
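The per-patient quantification step described in this abstract can be sketched in a few lines. A minimal illustration with hypothetical annotations (not the study's actual pipeline or data):

```python
from collections import Counter

def localization_proportions(mutations):
    """Given a list of subcellular localizations (one per mutated
    protein) for a single patient, return the proportion of each of
    the four localization classes used in the study."""
    classes = ["membrane", "cytoplasmic", "nuclear", "secreted"]
    counts = Counter(loc for loc in mutations if loc in classes)
    total = sum(counts.values())
    return {c: (counts[c] / total if total else 0.0) for c in classes}

# Hypothetical patient with 5 annotated mutations
patient = ["membrane", "membrane", "nuclear", "cytoplasmic", "membrane"]
props = localization_proportions(patient)
print(props["membrane"])  # 0.6 -> this patient's membrane fraction
```

In the study itself, the localization labels would come from UniProtKB/Swiss-Prot lookups rather than a hand-written list.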
Procedia PDF Downloads 108
1392 The Effect of Ice in Pain Control before Digital Nerve Block
Authors: Fatemeh Rasooli, Behzad Simiari, Pooya Payandemehr, Amir Nejati, Maryam Bahreini, Atefeh Abdollahi
Abstract:
Introduction: Pain is a complex physiological reaction to tissue injury. In the course of painful procedures such as nerve blocks, ice has been shown to be a feasible and inexpensive means of pain control. It delays nerve conduction, activates other senses, and reduces inflammatory and painful responses. This study assessed the effect of ice in reducing the pain caused by needling and infiltration during digital block. Patient satisfaction was recorded as a secondary outcome. Methods: This study was designed as a non-blinded randomized clinical trial approved by the Tehran University of Medical Sciences Ethical Committee. Informed consent was obtained from all participants, who were then randomly divided into two groups. Digital block was performed by the standard approach in selected patients. Tubes of ice were prepared in gloves and were fragmented at the time of application for circling around the finger. In the case group, the tubes were applied for 6 minutes at the site of needling before digital nerve block. Patients in the control group underwent digital nerve block by the conventional method without ice. The Numeric Rating Scale (NRS) was used for grading pain, with 0 denoting no pain and 10 the worst pain the patient had ever experienced. Scores were analyzed by the Wilcoxon rank sum test and compared between the case and control groups. Results: 100 patients aged 16-50 years were enrolled. Mean NRS scores with and without ice were 1.5 (SD ± 1.44) and 6.8 (SD ± 1.40) for needling pain, and 2.7 (SD ± 1.65) and 8.5 (SD ± 1.47) for infiltration pain, respectively (p < 0.001). Patient satisfaction was also significantly higher in the ice group (p < 0.001). Conclusion: Application of ice for 6 minutes significantly reduced the pain of needling and infiltration in digital nerve block; thus, ice appears to be a feasible and inexpensive material that acts effectively to decrease pain and stress before the procedure.
Keywords: digital block, ice, needle, pain
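The comparison of NRS scores rests on the Wilcoxon rank sum test. As a rough illustration of the statistic behind it, here is a pure-Python computation of the equivalent Mann-Whitney U on hypothetical scores (not the trial's data; in practice one would use a statistical package):

```python
def rank_sum_u(a, b):
    """Mann-Whitney U (equivalent to the Wilcoxon rank sum statistic)
    for two independent samples, using average ranks for ties."""
    combined = sorted((v, i) for i, v in enumerate(a + b))
    ranks = [0.0] * len(combined)
    i = 0
    while i < len(combined):
        j = i
        while j + 1 < len(combined) and combined[j + 1][0] == combined[i][0]:
            j += 1
        avg = (i + j) / 2 + 1  # average rank over the tied run
        for k in range(i, j + 1):
            ranks[combined[k][1]] = avg
        i = j + 1
    w_a = sum(ranks[: len(a)])               # rank sum of group a
    return w_a - len(a) * (len(a) + 1) / 2   # U statistic

# Hypothetical NRS needling-pain scores: ice group vs control
ice = [1, 2, 1, 3, 2]
control = [6, 7, 5, 8, 7]
print(rank_sum_u(ice, control))  # 0.0 -> the groups do not overlap
```

A U of zero (complete separation of the two score distributions) mirrors the extreme group difference the trial reports.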
Procedia PDF Downloads 234
1391 Prognostic Value of Tumor Markers in Younger Patients with Breast Cancer
Authors: Lola T. Alimkhodjaeva, Lola T. Zakirova, Soniya S. Ziyavidenova
Abstract:
Background: Breast cancer occupies first place among cancers in women worldwide. It is urgent today to study the role of molecular markers capable of predicting the dynamics and outcome of the disease. The aim of this study is to define the prognostic value of the content of estrogen receptor (ER) and progesterone receptor (PgR), and of amplification of the HER-2/neu oncoprotein, by studying 3- and 5-year overall and relapse-free survival in 470 patients with primary operable and 280 patients with locally advanced breast cancer. Materials and methods: Results of 3- and 5-year overall and relapse-free survival, depending on the content of ER and PgR in primary operable patients, showed that for ER-positive (+) and PgR-positive (+) tumors survival was 100 (96.2%) and 97.3 (94.6%); for ER-negative (-) and PgR-negative (-), 69.2 (60.3%) and 65.4 (57.7%); for ER (+) and PgR (-), 87.4 (80.1%) and 81.5 (79.3%); and for ER (-) and PgR (+), 97.4 (93.4%) and 90.4 (88.5%), respectively. Survival results also depended on the level of HER-2/neu expression. In HER-2/neu-negative patients the survival rates were 98.6 (94.7%) and 96.2 (92.3%). In the group with HER-2/neu (2+) expression these figures were 45.3 (44.3%) and 45.1 (40.2%), and in the group with HER-2/neu (3+) expression, 41.2 (33.1%) and 34.3 (29.4%). For the combination of ER (-), PgR (-), and HER-2/neu (-) they were 27.2 (25.4%) and 19.5 (15.3%), respectively. In patients with locally advanced breast cancer, the 3- and 5-year overall and relapse-free survival results for ER (+) and PgR (+) were 76.3 (69.3%) and 62.2 (61.4%); for ER (-) and PgR (-), 29.1 (23.7%) and 18.3 (12.6%); for ER (+) and PgR (-), 61.2 (47.2%) and 39.4 (25.6%); and for ER (-) and PgR (+), 54.3 (43.1%) and 41.3 (18.3%), respectively. The level of HER-2/neu expression also affected survival: in HER-2/neu-negative patients the survival rates were 74.1 (67.6%) and 65.1 (57.3%); with (2+) expression, 20.4 (14.2%) and 8.6 (6.4%); and with (3+) expression, 6.2 (3.1%) and 1.2 (1.5%), respectively. For the ER-, PgR-, and HER-2/neu-negative combination the rates were 22.1 (14.3%) and 8.4 (1.2%). Conclusion: Thus, the presence of steroid hormone receptors in breast tumor tissue, in both primary operable and locally advanced disease, like the absence of the HER-2/neu oncoprotein, correlates with the highest rates of 3- and 5-year overall and relapse-free survival. The absence of steroid hormone receptors, as well as HER-2/neu overexpression, in malignant breast tissue significantly degrades 3- and 5-year overall and relapse-free survival. ER-, PgR-, and HER-2/neu-negative tumors have the most unfavorable prognosis.
Keywords: breast cancer, estrogen receptor, oncoprotein, progesterone receptor
Procedia PDF Downloads 188
1390 The Impact of Temporal Impairment on Quality of Experience (QoE) in Video Streaming: A No Reference (NR) Subjective and Objective Study
Authors: Muhammad Arslan Usman, Muhammad Rehan Usman, Soo Young Shin
Abstract:
Live video streaming is one of the most widely used services among end users, yet it poses a big challenge for network operators in terms of quality. The only way to provide excellent Quality of Experience (QoE) to end users is continuous monitoring of the live video stream. For this purpose, several objective algorithms are available that monitor the quality of the video in a live stream. Subjective tests play a very important role in fine-tuning the results of objective algorithms. As human perception is considered the most reliable source for assessing the quality of a video stream, subjective tests are conducted in order to develop more reliable objective algorithms. Temporal impairments in a live video stream can have a negative impact on end users. In this paper, we have conducted subjective evaluation tests on a set of video sequences containing the temporal impairment known as frame freezing. Frame freezing can arise from transmission errors as well as hardware errors, resulting in the loss of video frames on the receiving side of a transmission system. In our subjective tests, we evaluated videos containing a single freezing event as well as videos containing multiple freezing events. We recorded the subjective test results for all videos in order to compare the available No Reference (NR) objective algorithms. Finally, we report the performance of the NR algorithms used for objective evaluation of the videos and suggest the algorithm that performs best. The outcome of this study shows the importance of QoE and its effect on human perception. The subjective evaluation results can serve to validate objective algorithms.
Keywords: objective evaluation, subjective evaluation, quality of experience (QoE), video quality assessment (VQA)
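Frame-freeze detection of the kind a no-reference monitor performs can be sketched by flagging runs of identical consecutive frames. Below is a toy illustration in which each frame is reduced to a single value; real detectors operate on decoded frames with similarity thresholds, so this is an assumption-laden sketch rather than the paper's algorithm:

```python
def detect_freezes(frames, min_run=2):
    """Flag freezing events in a decoded frame sequence.
    A freeze is a run of >= min_run identical consecutive frames;
    returns one (start_index, run_length) pair per event."""
    events = []
    i = 0
    while i < len(frames) - 1:
        j = i
        while j + 1 < len(frames) and frames[j + 1] == frames[j]:
            j += 1
        run = j - i + 1
        if run >= min_run:
            events.append((i, run))
        i = j + 1
    return events

# Toy "video": each frame summarized by a single luma value.
video = [10, 11, 12, 12, 12, 13, 14, 14, 15]
print(detect_freezes(video))  # [(2, 3), (6, 2)]
```

The output distinguishes a single long freeze from multiple short ones, the two conditions compared in the subjective tests.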
Procedia PDF Downloads 600
1389 Simulation of Optimal Runoff Hydrograph Using Ensemble of Radar Rainfall and Blending of Runoffs Model
Authors: Myungjin Lee, Daegun Han, Jongsung Kim, Soojun Kim, Hung Soo Kim
Abstract:
Recently, localized heavy rainfall and typhoons have occurred frequently due to climate change, and the resulting damage is growing, so more accurate prediction of rainfall and runoff is needed. However, gauge rainfall has limited spatial accuracy. Radar rainfall explains the spatial variability of rainfall better than gauge rainfall, but it is mostly underestimated and carries uncertainty. Therefore, an ensemble of radar rainfall was simulated using an error structure, together with gauge rainfall, to overcome this uncertainty. The simulated ensemble was used as input data to rainfall-runoff models to obtain an ensemble of runoff hydrographs. Previous studies have discussed the accuracy of rainfall-runoff models: even when the same input rainfall is used for runoff analysis in the same basin, different models can produce different results because of the uncertainty involved in the models themselves. Therefore, we used two models, the SSARR model, a lumped model, and the Vflo model, a distributed model, and tried to simulate the optimum runoff considering the uncertainty of each rainfall-runoff model. The study basin is located in the Han River basin, and we obtained one integrated, optimum runoff hydrograph using blending methods such as the Multi-Model Super Ensemble (MMSE), Simple Model Average (SMA), and Mean Square Error (MSE) weighting. From this study, we could confirm the accuracy of the rainfall and rainfall-runoff models using the ensemble scenario and various rainfall-runoff models, and this result can support the study of flood control measures under climate change. Acknowledgements: This work is supported by the Korea Agency for Infrastructure Technology Advancement (KAIA) grant funded by the Ministry of Land, Infrastructure and Transport (Grant 18AWMP-B083066-05).
Keywords: radar rainfall ensemble, rainfall-runoff models, blending method, optimum runoff hydrograph
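Of the blending methods named above, the simple model average and an inverse-MSE weighting can be sketched directly (MMSE proper trains regression weights, which is omitted here; all hydrograph numbers below are hypothetical stand-ins, not SSARR or Vflo output):

```python
def mse(pred, obs):
    """Mean-square error of a predicted hydrograph vs observations."""
    return sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs)

def blend(models, obs):
    """Blend runoff hydrographs from several models.
    SMA: simple average at each time step. Inverse-MSE weighting:
    weight each model by 1/MSE against observations, normalized."""
    n = len(models[0])
    sma = [sum(m[t] for m in models) / len(models) for t in range(n)]
    inv = [1.0 / mse(m, obs) for m in models]
    s = sum(inv)
    w = [v / s for v in inv]
    wgt = [sum(w[k] * models[k][t] for k in range(len(models)))
           for t in range(n)]
    return sma, wgt

# Hypothetical hydrographs (m^3/s) from a lumped and a distributed model
obs = [10.0, 50.0, 120.0, 80.0, 30.0]
ssarr = [12.0, 55.0, 110.0, 75.0, 28.0]   # stand-in values only
vflo = [8.0, 40.0, 130.0, 90.0, 35.0]
sma, wgt = blend([ssarr, vflo], obs)
```

The inverse-MSE blend pulls the integrated hydrograph toward the model that fits the observations better, which is the intent of an "optimum" blended hydrograph.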
Procedia PDF Downloads 279
1388 Supply Chain Analysis with Product Returns: Pricing and Quality Decisions
Authors: Mingming Leng
Abstract:
Wal-Mart has allocated considerable human resources to its quality assurance program, in which the largest retailer serves its supply chains as a quality gatekeeper. Asda Stores Ltd., the second largest supermarket chain in Britain, is now investing £27m in significantly increasing the frequency of quality control checks in its supply chains, thus enhancing quality across its fresh food business. Moreover, Tesco, the largest British supermarket chain, has already constructed a quality assessment center to carry out its gatekeeping responsibility. Motivated by the above practices, we consider a supply chain in which a retailer plays the gatekeeping role in quality assurance by identifying defects among a manufacturer's products prior to selling them to consumers. The impact of a retailer's gatekeeping activity on pricing and quality assurance in a supply chain has not been investigated in the operations management area. We draw a number of managerial insights that are expected to help practitioners judiciously consider the quality gatekeeping effort at the retail level. As in practice, when the retailer identifies a defective product, she immediately returns it to the manufacturer, who then replaces the defect with a good-quality product and pays a penalty to the retailer. If the retailer does not recognize a defect but sells it to a consumer, then the consumer will identify the defect and return it to the retailer, who then passes the returned 'unidentified' defect to the manufacturer. The manufacturer also incurs a penalty cost. Accordingly, we analyze a two-stage pricing and quality decision problem, in which the manufacturer and the retailer bargain over the manufacturer's average defective rate and wholesale price at the first stage, and the retailer decides on her optimal retail price and gatekeeping intensity at the second stage. We also compare the results when the retailer performs quality gatekeeping with those when she does not.
Our supply chain analysis exposes some important managerial insights. For example, the retailer's quality gatekeeping can effectively reduce the channel-wide defective rate if her penalty charge for each identified defect is larger than or equal to the market penalty for each unidentified defect. When the retailer implements quality gatekeeping, the change in the negotiated wholesale price depends only on the manufacturer's 'individual' benefit, and the change in the retailer's optimal retail price is related only to the channel-wide benefit. The retailer is willing to take on the quality gatekeeping responsibility when the impact of quality relative to retail price on demand is high and/or the retailer has strong bargaining power. We conclude that the retailer's quality gatekeeping can help reduce the defective rate for consumers, which becomes more significant when the retailer's bargaining position in her supply chain is stronger. Retailers with stronger bargaining power can benefit more from quality gatekeeping in their supply chains.
Keywords: bargaining, game theory, pricing, quality, supply chain
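The retailer's second-stage decision, choosing a retail price and a gatekeeping intensity given the first-stage wholesale price and defective rate, can be illustrated numerically. The demand function and every parameter value below are hypothetical assumptions for illustration only, not the paper's model:

```python
def retailer_profit(p, g, w=4.0, defect_rate=0.1, penalty=2.0,
                    gate_cost=1.5, a=100.0, b=8.0, c=20.0):
    """Retailer's second-stage profit under illustrative assumptions:
    demand falls in the retail price p and in the defect rate reaching
    consumers; gatekeeping intensity g in [0, 1] catches a fraction g
    of defects, each caught defect earning a penalty payment from the
    manufacturer but costing gate_cost per unit inspected."""
    reaching = defect_rate * (1 - g)            # defects consumers see
    demand = max(a - b * p - c * reaching, 0.0)
    caught = defect_rate * g * demand
    return (p - w) * demand + penalty * caught - gate_cost * g * demand

# Exhaustive search over a coarse grid (second stage only)
grid_p = [x / 10 for x in range(40, 121)]       # prices 4.0 .. 12.0
grid_g = [k / 10 for k in range(0, 11)]         # intensity 0.0 .. 1.0
best = max((retailer_profit(p, g), p, g) for p in grid_p for g in grid_g)
print(best)  # (profit, price, gatekeeping intensity)
```

In the paper the second stage is solved analytically and nested inside a first-stage bargaining problem; the grid search is only a way to see the trade-off between margin, demand, and gatekeeping cost.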
Procedia PDF Downloads 277
1387 Classification for Obstructive Sleep Apnea Syndrome Based on Random Forest
Authors: Cheng-Yu Tsai, Wen-Te Liu, Shin-Mei Hsu, Yin-Tzu Lin, Chi Wu
Abstract:
Background: Obstructive sleep apnea syndrome (OSAS) is a common respiratory disorder during sleep, and body parameters have been identified as being of high predictive importance for OSAS severity. However, the effects of body parameters on OSAS severity remain unclear. Objective: The objective of this study is to establish a prediction model for OSAS using body parameters and to investigate the effects of body parameters on OSAS. Methodologies: Severity was quantified by polysomnography and the mean hourly number of dips in oxygen saturation greater than 3% during examination in a hospital in New Taipei City (Taiwan). Four levels of OSAS severity were classified by the apnea-hypopnea index (AHI) according to the American Academy of Sleep Medicine (AASM) guidelines. Body parameters, including neck circumference, waist size, and body mass index (BMI), were obtained from a questionnaire. The collected subjects were then divided into two groups: a training group, used to build the random forest (RF) predictor, and a test group, used to evaluate classification accuracy. Results: 3330 subjects who had undergone polysomnography for evaluation of OSAS severity were recruited in this study. An RF of 1000 trees correctly classified 79.94% of test cases. When further evaluated on the test cohort, the RF identified waist size and BMI as the most important factors in OSAS. Conclusion: It is possible to prescreen patients using body parameters, which can pre-evaluate health risks.
Keywords: apnea and hypopnea index, body parameters, obstructive sleep apnea syndrome, random forest
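The described workflow, a random forest over neck circumference, waist size, and BMI predicting a four-level severity class, can be sketched as follows. The data here are synthetic stand-ins generated from an assumed rule, not patient records, so the accuracy and importances only illustrate the mechanics:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 600

# Synthetic stand-ins for the three predictors (not patient data):
# neck circumference (cm), waist (cm), BMI.
neck = rng.normal(38, 4, n)
waist = rng.normal(90, 12, n)
bmi = rng.normal(26, 5, n)
X = np.column_stack([neck, waist, bmi])

# Hypothetical rule tying a 4-level severity class (mimicking the
# AHI-based AASM levels) to waist and BMI, plus noise.
score = 0.04 * waist + 0.10 * bmi + rng.normal(0, 0.5, n)
y = np.digitize(score, np.quantile(score, [0.25, 0.5, 0.75]))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
rf = RandomForestClassifier(n_estimators=1000, random_state=0).fit(X_tr, y_tr)
acc = rf.score(X_te, y_te)
importances = dict(zip(["neck", "waist", "bmi"], rf.feature_importances_))
print(acc, importances)
```

Because the synthetic labels depend only on waist and BMI, the fitted importances echo the study's finding that those two parameters dominate.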
Procedia PDF Downloads 151
1386 The Next Generation’s Learning Ability, Memory, as Well as Cognitive Skills Is under the Influence of Paternal Physical Activity (An Intergenerational and Trans-Generational Effect): A Systematic Review and Meta-Analysis
Authors: Parvin Goli, Amirhosein Kefayat, Rezvan Goli
Abstract:
Background: It is well established that parents can influence their offspring's neurodevelopment. The paternal environment and lifestyle have been shown to benefit the progeny's fitness and might affect their metabolic mechanisms; however, the effects of paternal exercise on the offspring's brain have not been explored in detail. Objective: This study aims to review the impact of paternal physical exercise on memory and learning, neuroplasticity, and DNA methylation levels in the offspring's hippocampus. Study design: In this systematic review and meta-analysis, an electronic literature search was conducted in databases including PubMed, Scopus, and Web of Science. Eligible studies were those with an experimental design, including an exercise intervention arm, with the assessment of any type of memory function, learning ability, or brain plasticity as the outcome measures. Standardized mean differences (SMD) and 95% confidence intervals (CI) were computed as effect sizes. Results: The systematic review revealed the important role of environmental enrichment in the behavioral development of the next generation. Offspring of exercised fathers displayed higher levels of memory ability and lower levels of brain-derived neurotrophic factor. A significant effect of paternal exercise on hippocampal volume was also reported in the few available studies. Conclusion: These results suggest an intergenerational effect of paternal physical activity on cognition, which may be associated with hippocampal epigenetic programming in offspring. However, the biological mechanisms of this modulation remain to be determined.
Keywords: hippocampal plasticity, learning ability, memory, parental exercise
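The effect size pooled in such meta-analyses, the standardized mean difference with Hedges' small-sample correction, can be computed as follows (the scores below are hypothetical, not drawn from the reviewed studies):

```python
import math

def hedges_g(a, b):
    """Standardized mean difference (Cohen's d with Hedges'
    small-sample correction), as commonly pooled in meta-analysis."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)   # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    sp = math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    d = (ma - mb) / sp                              # Cohen's d
    j = 1 - 3 / (4 * (na + nb) - 9)                 # Hedges' correction
    return d * j

# Hypothetical memory scores: offspring of exercised vs sedentary sires
exercised = [14, 16, 15, 17, 18, 15]
sedentary = [12, 13, 14, 12, 13, 14]
print(hedges_g(exercised, sedentary))  # positive: exercised offspring higher
```

With small per-study samples, which is typical of the animal studies reviewed here, the correction factor j noticeably shrinks the raw d.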
Procedia PDF Downloads 208
1385 Enhancing the Interpretation of Group-Level Diagnostic Results from Cognitive Diagnostic Assessment: Application of Quantile Regression and Cluster Analysis
Authors: Wenbo Du, Xiaomei Ma
Abstract:
With the development of Cognitive Diagnostic Assessment (CDA), various domains of language testing and assessment have been investigated to extract richer diagnostic information. Notably, most extant empirical CDA-based research emphasizes individual-level diagnosis, with very little concerned with learners' group-level performance. Even though personalized diagnostic feedback is the unique feature that differentiates CDA from other assessment tools, group-level diagnostic information cannot be overlooked, as it may be more practical in classroom settings. Additionally, the group-level diagnostic information obtained via current CDA often results in a 'flat pattern', that is, mastery or non-mastery of all tested skills accounts for the two highest proportions. In that case, the outcome offers little benefit beyond the original total score. To address these issues, the present study applies cluster analysis for group classification and quantile regression analysis to pinpoint learners' performance at different proficiency levels (beginner, intermediate, and advanced), thus enhancing the interpretation of CDA results extracted from a group of EFL learners' performance on a diagnostic reading test designed by the PELDiaG research team from a key university in China. The results show that the EM method in cluster analysis yields more appropriate classification results than CDA, and quantile regression analysis paints a more insightful picture of the characteristics of learners with different reading proficiencies. The findings are helpful and practical for instructors in refining the EFL reading curriculum and tailoring instructional plans based on the group classification and quantile regression results. Meanwhile, these statistical methods could also compensate for the deficiencies of CDA and push forward the development of language testing and assessment.
Keywords: cognitive diagnostic assessment, diagnostic feedback, EFL reading, quantile regression
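The two techniques combined in this study, EM-based cluster analysis followed by per-group quantile summaries, can be sketched with scikit-learn's `GaussianMixture`. The mastery-probability data below are synthetic stand-ins for real CDA output, and plain per-cluster quantiles substitute for a full quantile regression:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(42)

# Synthetic skill-mastery probabilities for 300 EFL readers across
# 4 reading sub-skills (real CDA output tends to pile up near
# all-mastery or all-non-mastery -- the "flat pattern").
low = rng.beta(2, 6, size=(120, 4))
mid = rng.beta(4, 4, size=(100, 4))
high = rng.beta(6, 2, size=(80, 4))
X = np.vstack([low, mid, high])

# EM-based clustering into beginner / intermediate / advanced
gm = GaussianMixture(n_components=3, random_state=0).fit(X)
labels = gm.predict(X)

# Group-level summary: per-cluster quantiles of each sub-skill,
# a lightweight stand-in for quantile regression by proficiency.
for k in range(3):
    members = X[labels == k]
    if len(members):
        q = np.quantile(members, [0.25, 0.5, 0.75], axis=0)
        print(k, np.round(q[1], 2))  # median mastery per sub-skill
```

Sorting the clusters by their median profiles recovers a beginner/intermediate/advanced ordering, which is the group-level report the study argues teachers need.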
Procedia PDF Downloads 145
1384 The Development of a Nursing Model for Pregnant Women to Prevent Early Postpartum Hemorrhage
Authors: Wadsana Sarakarn, Pimonpan Charoensri, Baliya Chaiyara
Abstract:
Objectives: To study the outcomes of a nursing model developed to prevent early postpartum hemorrhage (PPH). Materials and Methods: This analytical study was conducted in Sunpasitthiprasong Hospital from October 1, 2015, to May 31, 2017. After reviewing the prevalence, risk factors, and outcomes of postpartum hemorrhage among parturients who gave birth in Sunpasitthiprasong Hospital, the nursing model was developed under the action research framework of Kemmis & McTaggart, using 4 operating steps: 1) analyzing the problem situation and gathering data, 2) creating the plan, 3) observing and performing, and 4) reflecting on the result of the operation. The nursing model consisted of screening tools for risk factors associated with PPH, a clinical nursing practice guideline (CNPG), and a collecting bag for measuring postpartum blood loss. The primary outcome was early postpartum hemorrhage. Secondary outcomes were postpartum hysterectomy, maternal mortality, and personnel's practice, knowledge, and satisfaction with the nursing model. The data were analyzed using content analysis for qualitative data and descriptive statistics for quantitative data. Results: Before the nursing model was used, the prevalence of early postpartum hemorrhage was underestimated (2.97%), and there were 5 cases of postpartum hysterectomy and 2 cases of maternal death due to postpartum hemorrhage. During the study period, the prevalence of postpartum hemorrhage was 22.7% among 220 pregnant women delivered vaginally at Sunpasitthiprasong Hospital. No maternal death or postpartum hysterectomy was reported after the nursing model was adopted. The 16 registered nurses in the delivery room who evaluated the nursing model reported high levels of practice, knowledge, and satisfaction. Conclusion: The nursing model for the prevention of early PPH is effective in decreasing early PPH and other serious complications.
Keywords: the development of a nursing model, prevention of postpartum hemorrhage, pregnant women, postpartum hemorrhage
Procedia PDF Downloads 98
1383 Educational Attainment of Owner-Managers and Performance of Micro- and Small Informal Businesses in Nigeria
Authors: Isaiah Oluranti Olurinola, Michael Kayode Bolarinwa, Ebenezer Bowale, Ifeoluwa Ogunrinola
Abstract:
While much literature exists on microfinancing and its impact on the development of micro, small and medium-scale enterprises (MSME), little is known about the impact of different types of education of owner-managers on the performance and innovative possibilities of such enterprises. This paper aims to contribute to the understanding of how different types of education (academic, technical, apprenticeship, etc.) influence the performance of micro, small and medium-sized enterprises (MSME). The study utilises a recent, larger data-set collected in six states and FCT Abuja, Nigeria, in 2014, and carries out a comparative analysis of business performance among the different geo-political zones in Nigeria, given the educational attainment of the owner-managers. The data set was enterprise-based and was collected by the Nigerian Institute for Social and Economic Research (NISER) in 2014. Six hundred and eighty-eight enterprises were covered in the survey. The methods of data analysis are basic descriptive statistics and a logistic regression model predicting the log-odds of business performance in relation to each identified educational attainment of the owner-managers in the sampled enterprises. An OLS econometric technique is also used to determine the effects of owner-managers' different educational types on the performance of the sampled MSME. Policy measures that will further enhance the contributions of education to MSME performance are put forward.
Keywords: business performance, education, microfinancing, micro, small and medium scale enterprises
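The logistic-regression step predicts the log-odds of good business performance from education type. A minimal sketch of how fitted log-odds convert to probabilities, using made-up coefficients (illustrative only, not NISER estimates):

```python
import math

# Hypothetical logistic-regression coefficients for the log-odds of
# "good business performance": intercept, then dummies for academic,
# technical, and apprenticeship education of the owner-manager
# (base category: no formal education). Values are invented.
beta = {"intercept": -1.2, "academic": 0.9, "technical": 1.1,
        "apprenticeship": 0.4}

def predicted_probability(education):
    """Convert the log-odds for a given education type into the
    predicted probability of good performance."""
    z = beta["intercept"] + beta.get(education, 0.0)
    return 1.0 / (1.0 + math.exp(-z))

for e in ["none", "academic", "technical", "apprenticeship"]:
    print(e, round(predicted_probability(e), 3))
```

Each coefficient exponentiates to an odds ratio against the base category, which is the quantity such a model is usually reported with.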
Procedia PDF Downloads 520
1382 A Two-Arm Double Parallel Randomized Controlled Trial of the Effects of Health Education Intervention on Insecticide-Treated Nets Use and Its Practices among Pregnant Women Attending Antenatal Clinic: Study Protocol
Authors: Opara Monica, Suriani Ismail, Ahmad Iqmer Nashriq Mohd Nazan
Abstract:
The true magnitude of the mortality and morbidity attributable to malaria worldwide is, at best, a scientific guess, although it is not disputable that the greatest burden is in sub-Saharan Africa. Those at highest risk are children younger than 5 years and pregnant women, particularly primigravidae. Nationally, malaria remains the third leading cause of death and is still considered a major public health problem. Therefore, this study aims to assess the effectiveness of a health education intervention on insecticide-treated net use and its practices among pregnant women attending antenatal clinics. Materials and Methods: This will be an intervention study with a two-arm, double parallel, randomized controlled (blinded) design conducted in 3 stages. The first stage will develop a health belief model (HBM) program; in the second stage, pregnant women will be recruited, assessed (baseline data), randomized into the two arms of the study, and followed up for six months. The third stage will evaluate the impact of the intervention on the HBM and disseminate the findings. Data will be collected with a structured questionnaire containing validated tools. The main outcome measurement will be the treatment effect using the HBM, and data will be analysed using SPSS, version 22. Discussion: The study will contribute to the existing knowledge on hospital-based care programs for pregnant women in developing countries, where the literature is scanty. It will generally give insight into the importance of HBM measurement in interventional studies on malaria and other related infectious diseases in this setting.
Keywords: malaria, health education, insecticide-treated nets, sub-Saharan Africa
Procedia PDF Downloads 121
1381 Efficient DNN Training on Heterogeneous Clusters with Pipeline Parallelism
Abstract:
Pipeline parallelism has been widely used to accelerate distributed deep learning, to alleviate GPU memory bottlenecks, and to ensure that models can be trained and deployed smoothly under limited graphics memory. However, in highly heterogeneous distributed clusters, traditional model partitioning methods cannot achieve load balancing, and overlapping communication with computation is also a big challenge. In this paper, HePipe is proposed, an efficient pipeline parallel training method for highly heterogeneous clusters. Based on the characteristics of neural network pipeline training tasks and oriented to a 2-level heterogeneous cluster computing topology, a training method using 2-level stage division of neural network modeling and partitioning is designed to improve parallelism. Additionally, a multi-forward 1F1B scheduling strategy is designed to accelerate the training time of each stage by executing computation units in advance, maximizing the overlap between forward propagation communication and backward propagation computation. Finally, a dynamic recomputation strategy based on task memory requirement prediction is proposed to improve the fit between tasks and memory, which raises cluster throughput and solves the memory shortfall caused by memory differences in heterogeneous clusters. The empirical results show that HePipe improves training speed by 1.6×-2.2× over existing asynchronous pipeline baselines.
Keywords: pipeline parallelism, heterogeneous cluster, model training, 2-level stage partitioning
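The 1F1B scheduling that HePipe extends can be sketched per stage: a warm-up of forward passes (deeper stages warm up less), a steady state alternating one forward with one backward, and a cool-down draining the remaining backwards. This is plain 1F1B for reference; the paper's multi-forward variant executes additional forwards in advance:

```python
def one_f1b_schedule(stage, num_stages, num_microbatches):
    """Per-stage operation order under plain 1F1B scheduling.
    Returns a list of ("F"|"B", microbatch_index) pairs."""
    warmup = min(num_stages - stage - 1, num_microbatches)
    ops = [("F", m) for m in range(warmup)]      # warm-up forwards
    f, b = warmup, 0
    while f < num_microbatches:                  # steady state: 1F, 1B
        ops.append(("F", f)); f += 1
        ops.append(("B", b)); b += 1
    while b < num_microbatches:                  # cool-down backwards
        ops.append(("B", b)); b += 1
    return ops

# 4 stages, 6 micro-batches
print(one_f1b_schedule(0, 4, 6))  # first stage warms up 3 forwards
print(one_f1b_schedule(3, 4, 6))  # last stage alternates immediately
```

Limiting in-flight forwards to the warm-up depth is what bounds per-stage activation memory, which is the constraint HePipe's recomputation strategy then trades against.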
Procedia PDF Downloads 16
1380 Bi-Liquid Free Surface Flow Simulation of Liquid Atomization for Bi-Propellant Thrusters
Authors: Junya Kouwa, Shinsuke Matsuno, Chihiro Inoue, Takehiro Himeno, Toshinori Watanabe
Abstract:
Bi-propellant thrusters use impinging jet atomization to atomize liquid fuel and oxidizer. The atomized propellants mix and combust through auto-ignition. It is therefore important, for predicting thruster performance, to simulate the primary atomization phenomenon; in particular, the local mixture ratio can be used as an indicator of thrust performance, so it is useful to evaluate it from numerical simulations. In this research, we propose a numerical method that accounts for two liquids and their mixing, and install it in CIP-LSM, a two-phase flow simulation solver that uses the level-set and MARS methods for interfacial tracking and can predict the local mixture ratio distribution downstream of the impingement point. A new parameter, beta, defined as the volume fraction of one liquid in the mixed liquid within a cell, is introduced, and the solver calculates the advection of beta and its inflow and outflow flux for each cell. To validate this solver, we conducted a simple experiment and the corresponding simulation. The solver correctly predicts the penetration length of a liquid jet, confirming that it can simulate the mixing of liquids. We then apply the solver to the numerical simulation of impinging jet atomization. The inclination angle of the fan after impingement in the bi-liquid condition reasonably agrees with the theoretical value, and the mixing of the liquids is captured in the result. Furthermore, the simulation results clarify that the injection conditions drastically affect the atomization process and the local mixture ratio distribution downstream.
Keywords: bi-propellant thrusters, CIP-LSM, free-surface flow simulation, impinging jet atomization
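The advection of the volume-fraction parameter beta between cells can be illustrated with a first-order upwind scheme on a 1-D periodic grid, a deliberately simplified analogue of what the full two-phase solver does in 3-D:

```python
def advect_beta(beta, u, dt, dx, steps):
    """First-order upwind advection of a volume fraction field on a
    periodic 1-D grid (uniform velocity u > 0 assumed): a toy
    analogue of transporting the mixed-liquid fraction beta by its
    inflow/outflow flux at each cell face."""
    c = u * dt / dx                     # CFL number, must be <= 1
    assert 0 < c <= 1
    for _ in range(steps):
        beta = [beta[i] - c * (beta[i] - beta[i - 1])
                for i in range(len(beta))]
    return beta

# A blob of liquid A (beta = 1) carried through cells of liquid B
field = [1.0 if 2 <= i < 5 else 0.0 for i in range(20)]
out = advect_beta(field, u=1.0, dt=0.5, dx=1.0, steps=10)
```

The scheme conserves the total amount of liquid A and keeps beta bounded in [0, 1], the two properties any volume-fraction transport must preserve; the first-order scheme is, however, diffusive, which is why CIP-LSM pairs beta transport with sharp interface tracking.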
Procedia PDF Downloads 277
1379 A Comprehensive Characterization of Cell-free RNA in Spent Blastocyst Medium and Quality Prediction for Blastocyst
Authors: Huajuan Shi
Abstract:
Background: Biopsy of the preimplantation embryo may increase potential risks and concerns about embryo viability. Clinically discarded spent embryo medium (SEM) has therefore entered researchers' view, sparking interest in noninvasive embryo screening. However, one major restriction is the extremely low quantity of cf-RNA, which is difficult to amplify efficiently and without bias using traditional methods. Hence, an efficient, low-bias amplification method that can comprehensively and accurately capture cf-RNA information, truly revealing the state of SEM cf-RNA, is urgently needed. Result: In this study, we established an agarose PCR amplification system that significantly improved amplification sensitivity and efficiency by ~90-fold and 9.29%, respectively. We applied agarose to sequencing library preparation (named AG-seq) to quantify and characterize cf-RNA in SEM. The number of detected cf-RNAs (3533 vs 598) and the coverage of the 3' end were significantly increased, and the noise in low-abundance gene detection was reduced. An increasing percentage of 5' end adenine and alternative splicing (AS) events in short fragments (< 400 bp) were discovered by AG-seq. Further, the profiles and characterizations of cf-RNA in spent cleavage medium (SCM) and spent blastocyst medium (SBM) indicated that 4-mer end motifs of cf-RNA fragments could remarkably differentiate different embryo developmental stages. Significance: This study established an efficient and low-cost SEM amplification and library preparation method. Moreover, we successfully described the characteristics of preimplantation embryo SEM cf-RNA using AG-seq, including abundance features and fragment lengths. AG-seq facilitates the study of cf-RNA as a noninvasive embryo screening biomarker and opens up potential clinical utilities for trace samples.
Keywords: cell-free RNA, agarose, spent embryo medium, RNA sequencing, non-invasive detection
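The 4-mer end-motif feature can be computed with a few lines of counting. A sketch over made-up reads (not SEM data; a real pipeline would work from aligned fragment ends):

```python
from collections import Counter

def end_motif_profile(fragments, k=4):
    """Count k-mer motifs at the 5' end of each cf-RNA fragment and
    return their relative frequencies -- the kind of feature the study
    uses to separate embryo developmental stages."""
    counts = Counter(f[:k] for f in fragments if len(f) >= k)
    total = sum(counts.values())
    return {m: c / total for m, c in counts.items()}

# Hypothetical fragment reads (not real SEM data)
reads = ["ACGTTTAG", "ACGTAAAA", "TTAGCCGA", "ACGGTTAA"]
profile = end_motif_profile(reads)
print(profile)  # {'ACGT': 0.5, 'TTAG': 0.25, 'ACGG': 0.25}
```

Comparing such frequency vectors between SCM and SBM samples (e.g., by distance or a simple classifier) is the stage-discrimination idea the abstract describes.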
Procedia PDF Downloads 63
1378 In Stemming Out Societal Depravity: Existentialism, Realism, and Contrapuntal Criticism in Nigerian Arabic Poetry: Ibn Yusuf’s Anthology as Paradigm
Authors: Izzudeen Adetunji
Abstract:
The intrinsic nexus between man and society is apparently unknown to many people, despite man's real responsibility and immense roles in society. Among the in-depth roles of man as an agent of societal reform is to be a driving force toward installing normalcy and socio-cultural change in society. The paradoxical attitudes of man, engaging in social vices, illicit characters, and unwanted attitudes, have given birth to decay and an ill society. However, social change or socio-cultural evolution may be necessary to install normalcy and social order. Since the 19th century, Nigerian Arabic poets have tremendously engaged their poetry for social change through socio-cultural, religious, economic, scientific, or technological forces. This engagement has hitherto yielded positive outcomes for societal reform. The anthology of Ibn Yusuf is one such compendium of poetry, revealing societal depravity, man's social vices, and atrocities, and afterwards calling for flawlessness. The theoretical framework is examined through the Heraclitan model, which draws a parallel to a living organism that, in order to remain alive, must constantly change. Therefore, the thrust of this paper is to examine the societal maladies elucidated in Ibn Yusuf's anthology and proffer a contrapuntal criticism of it. Before delving into the main discussion, the paper examines the concepts of existentialism and realism as a philosophical interface. Likewise, the issues of man and social change and an overview of Nigerian Arabic poetry are discussed. Ibn Yusuf's biography and scholarship and a review of his anthology are then studied. The paper concludes by critically examining the contrapuntal criticism of societal maladies through Ibn Yusuf's anthology. Keywords: societal depravity, existentialism, realism, Nigerian Arabic poetry, Ibn Yusuf’s anthology, contrapuntal criticism
Procedia PDF Downloads 24
1377 Large Eddy Simulation of Hydrogen Deflagration in Open Space and Vented Enclosure
Authors: T. Nozu, K. Hibi, T. Nishiie
Abstract:
This paper discusses the applicability of a numerical model for predicting damage from an accidental hydrogen explosion occurring in a hydrogen facility. The numerical model was based on the unstructured finite volume method (FVM) code “NuFD/FrontFlowRed”. For simulating unsteady turbulent combustion of leaked hydrogen gas, a combination of Large Eddy Simulation (LES) and a combustion model was used. The combustion model was based on a two-scalar flamelet approach, in which a G-equation model and a conserved scalar model expressed the propagation of the premixed flame surface and the diffusion combustion process, respectively. For validation of this numerical model, we simulated two previous hydrogen explosion tests. One is an open-space explosion test, in which the source was a prismatic 5.27 m³ volume with a 30% hydrogen-air mixture. A reinforced concrete wall was set 4 m away from the front surface of the source. The source was ignited at the bottom center by a spark. The other is a vented enclosure explosion test, in which the chamber was 4.6 m × 4.6 m × 3.0 m with a vent opening on one side. A vent area of 5.4 m² was used. The test was performed with ignition at the center of the wall opposite the vent. Hydrogen-air mixtures with hydrogen concentrations close to 18% vol. were used in the tests. The results from the numerical simulations are compared with the previous experimental data to assess the accuracy of the numerical model, and we have verified that the simulated overpressures and flame time-of-arrival data were in good agreement with the results of the two explosion tests. Keywords: deflagration, large eddy simulation, turbulent combustion, vented enclosure
Procedia PDF Downloads 243
1376 Analyzing Changes in Runoff Patterns Due to Urbanization Using SWAT Models
Authors: Asawari Ajay Avhad
Abstract:
The Soil and Water Assessment Tool (SWAT) is a hydrological model designed to predict the complex interactions within natural and human-altered watersheds. This research applies the SWAT model to the Ulhas River basin, a small watershed undergoing urbanization and characterized by bowl-like topography. Three simulation scenarios (LC17, LC22, and LC27) are investigated, each representing a different land use and land cover (LULC) configuration, to assess the impact of urbanization on runoff. The LULC for the year 2027 is generated using the MOLUSCE plugin of QGIS, incorporating spatial factors such as the DEM, distance from roads, distance from the river, slope, and distance from settlements. Future climate data is simulated within the SWAT model using historical data spanning 30 years. A susceptibility map for runoff across the basin is created, classifying runoff into five susceptibility levels ranging from very low to very high. Sub-basins corresponding to major urban settlements are identified as highly susceptible to runoff. Under future climate projections, a slight increase in runoff is forecast. The reliability of the methodology was validated against sub-basins with a track record of severe flood events: the susceptibility map successfully pinpointed these sub-basins as highly susceptible to runoff, reinforcing the credibility of the assessment. This study suggests that the methodology employed could serve as a valuable tool in flood management planning. Keywords: future land use impact, flood management, runoff prediction, ArcSWAT
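The surface-runoff step that SWAT applies is the SCS curve-number relation; a minimal sketch follows. The curve numbers and storm depth below are illustrative assumptions, not the Ulhas basin's calibrated values.

```python
# Hedged sketch of the SCS curve-number runoff relation used by SWAT.
# CN values and the 80 mm storm depth are assumptions for illustration.

def scs_runoff(rainfall_mm: float, cn: float) -> float:
    """Surface runoff Q (mm) for storm depth P (mm) and curve number CN."""
    s = 25400.0 / cn - 254.0          # potential maximum retention (mm)
    ia = 0.2 * s                      # initial abstraction
    if rainfall_mm <= ia:
        return 0.0
    return (rainfall_mm - ia) ** 2 / (rainfall_mm - ia + s)

# Urbanization raises CN (more impervious cover), which raises runoff:
print(round(scs_runoff(80, 70), 1))   # pre-urban land cover (assumed CN)
print(round(scs_runoff(80, 90), 1))   # urbanized land cover (assumed CN)
```

The comparison mirrors the study's finding qualitatively: the same storm produces markedly more runoff under the higher (urbanized) curve number.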
Procedia PDF Downloads 45
1375 Design and Optimization of an Electromagnetic Vibration Energy Converter
Authors: Slim Naifar, Sonia Bradai, Christian Viehweger, Olfa Kanoun
Abstract:
Vibration provides an interesting source of energy since it is available in many indoor and outdoor applications. Nevertheless, in order to have an efficient design of the harvesting system, vibration converters have to satisfy some criteria in terms of robustness, compactness, and energy outcome. In this work, an electromagnetic converter based on a mechanical spring principle is proposed. The designed harvester is formed by a coil oscillating around ten ring magnets using a mechanical spring. The proposed design overcomes one of the main limitations of the moving coil by avoiding contact between the coil wires and the mechanical spring, which leads to better robustness for the converter. In addition, the whole system can be implemented in the cavity of a screw. Different parameters of the harvester were investigated by the finite element method, including the magnet size, the coil winding number and diameter, and the excitation frequency and amplitude. A prototype was realized and tested. Experiments were performed for 0.5 g to 1 g acceleration. The experimental setup consists of an electrodynamic shaker as an external artificial vibration source controlled by a laser sensor to measure the applied displacement and excitation frequency. Together with the laser sensor, a controller unit, and an amplifier, the shaker is operated in a closed loop, which allows controlling the vibration amplitude. The resonance frequency of the proposed design is in the range of 24 Hz. Results indicate that the harvester can generate 612 mV and 1150 mV maximum open-circuit peak-to-peak voltage at resonance for 0.5 g and 1 g acceleration, respectively, which correspond to 4.75 mW and 1.34 mW output power. Tuning the frequency to other values is also possible by adding mass to the moving part or by changing the mechanical spring stiffness. Keywords: energy harvesting, electromagnetic principle, vibration converter, moving coil
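The ~24 Hz resonance and its tuning by added mass follow from the spring-mass relation f = (1/2π)√(k/m); a minimal sketch, where the stiffness and moving mass are illustrative assumptions, not the prototype's measured values:

```python
import math

# Hedged sketch: resonance frequency of a spring-mass harvester.
# k and m below are chosen only to reproduce a ~24 Hz resonance;
# they are NOT the paper's measured parameters.

def resonance_hz(stiffness_n_per_m: float, moving_mass_kg: float) -> float:
    """f = (1/2*pi) * sqrt(k/m) for an undamped spring-mass oscillator."""
    return math.sqrt(stiffness_n_per_m / moving_mass_kg) / (2 * math.pi)

k, m = 455.0, 0.020   # assumed spring stiffness (N/m) and moving mass (kg)
print(round(resonance_hz(k, m), 1))          # ~24 Hz with these assumptions
print(round(resonance_hz(k, m + 0.010), 1))  # added mass lowers the frequency
```

This is why adding mass to the moving part, or changing spring stiffness, shifts the harvester's operating frequency.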
Procedia PDF Downloads 294
1374 Derivation of a Risk-Based Level of Service Index for Surface Street Network Using Reliability Analysis
Authors: Chang-Jen Lan
Abstract:
The current Level of Service (LOS) index adopted in the Highway Capacity Manual (HCM) for signalized intersections on surface streets is based on the intersection average delay. The delay thresholds defining LOS grades are subjective and unrelated to critical traffic conditions. For example, an intersection delay of 80 sec per vehicle for the failing LOS grade F does not necessarily correspond to the intersection capacity. Also, a specific measure of average delay may result from delay minimization, delay equalization, or other meaningful optimization criteria. To that end, a reliability version of the intersection critical degree of saturation (v/c) is introduced as the LOS index. Traditionally, the level of saturation at a signalized intersection is defined as the ratio of the critical volume sum (per lane) to the average saturation flow (per lane) during all available effective green time within a cycle. The critical sum is the sum of the maximal conflicting movement-pair volumes in the northbound-southbound and eastbound-westbound rights of way. In this study, both movement volume and saturation flow are assumed to follow log-normal distributions. Because, when the conditions of the central limit theorem hold, the product of independent, positive random variables tends to a log-normal distribution in the limit, the critical degree of saturation is expected to be log-normally distributed as well. Derivation of the risk index predictive limits is complex due to the maximum and absolute value operators, as well as the ratio of random variables. A fairly accurate functional form for the predictive limit at a user-specified significance level is obtained. The predictive limit is then compared with the designated LOS thresholds for the intersection critical degree of saturation (denoted as X). Keywords: reliability analysis, level of service, intersection critical degree of saturation, risk-based index
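The distributional claim can be checked numerically. Below is a Monte Carlo sketch of the upper predictive limit of the critical degree of saturation; all distribution parameters are illustrative assumptions, not the paper's derived functional form.

```python
import numpy as np

# Hedged sketch: if movement-pair volumes and capacity are log-normal,
# the critical degree of saturation X = (critical volume sum) / capacity
# is approximately log-normal, and a risk-based LOS limit is an upper
# percentile of X. All parameters below are illustrative assumptions.

rng = np.random.default_rng(7)
n = 200_000

# Maximal conflicting movement-pair volumes (veh/h/lane), log-normal
ns = np.maximum(rng.lognormal(6.0, 0.15, n), rng.lognormal(5.9, 0.15, n))
ew = np.maximum(rng.lognormal(5.8, 0.15, n), rng.lognormal(5.7, 0.15, n))
critical_sum = ns + ew

capacity = rng.lognormal(7.2, 0.05, n)   # effective capacity (veh/h/lane)
x = critical_sum / capacity              # critical degree of saturation

# 95th-percentile predictive limit as the risk-based LOS index
print(round(float(np.quantile(x, 0.95)), 3))
```

The maximum operator makes the exact distribution of X intractable, which is why the simulated percentile (rather than a closed form) is the natural check here.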
Procedia PDF Downloads 130
1373 Evaluating Traffic Congestion Using the Bayesian Dirichlet Process Mixture of Generalized Linear Models
Authors: Ren Moses, Emmanuel Kidando, Eren Ozguven, Yassir Abdelrazig
Abstract:
This study applied traffic speed and occupancy to develop clustering models that identify different traffic conditions. In particular, these models are based on the Dirichlet Process Mixture of Generalized Linear regression models (DML) and change-point regression (CR). The model frameworks were implemented using 2015 historical traffic data aggregated at a 15-minute interval from the Interstate 295 freeway in Jacksonville, Florida. Using the deviance information criterion (DIC) to identify the appropriate number of mixture components, three traffic states were identified: free-flow, transitional, and congested conditions. Results of the DML revealed that traffic occupancy is statistically significant in influencing the reduction of traffic speed in each of the identified states. Its influence on the free-flow and congested states was estimated to be higher than on the transitional flow condition in both the evening and morning peak periods. Estimation of the critical speed threshold using CR revealed that 47 mph and 48 mph are the speed thresholds for the congested and transitional traffic conditions during the morning peak hours and evening peak hours, respectively. Free-flow speed thresholds for the morning and evening peak hours were estimated at 64 mph and 66 mph, respectively. The proposed approaches will facilitate accurate detection and prediction of traffic congestion for developing effective countermeasures. Keywords: traffic congestion, multistate speed distribution, traffic occupancy, Dirichlet process mixtures of generalized linear model, Bayesian change-point detection
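The clustering step can be illustrated with a deliberately simplified stand-in: a one-dimensional k-means on synthetic speeds in place of the Bayesian Dirichlet process mixture. The speed distributions below are assumptions, not the I-295 data.

```python
import numpy as np

# Hedged sketch: recover three traffic states from synthetic 15-min speed
# observations. k-means is a crude stand-in for the paper's DML model and
# does not choose the number of states via DIC; three states are assumed.

rng = np.random.default_rng(0)
speeds = np.concatenate([
    rng.normal(65, 3, 400),   # assumed free-flow regime
    rng.normal(55, 3, 200),   # assumed transitional regime
    rng.normal(35, 5, 150),   # assumed congested regime
])

centers = np.array([30.0, 50.0, 70.0])          # initial state means
for _ in range(50):                              # Lloyd iterations
    labels = np.argmin(np.abs(speeds[:, None] - centers[None, :]), axis=1)
    centers = np.array([speeds[labels == k].mean() for k in range(3)])

print(np.round(np.sort(centers), 1))   # congested, transitional, free-flow means
```

A full reproduction would place a Dirichlet process prior over the mixture components so the number of states is inferred rather than fixed; the sketch only shows why speed observations separate cleanly into regimes.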
Procedia PDF Downloads 292
1372 The Impact of Window Opening Occupant Behavior Models on Building Energy Performance
Authors: Habtamu Tkubet Ebuy
Abstract:
Purpose: Conventional dynamic energy simulation tools go beyond the static dimension of simplified methods by providing better and more accurate predictions of building performance. However, their ability to forecast actual performance is undermined by a poor representation of human interactions. The purpose of this study is to examine the potential benefits of incorporating information on occupant diversity into occupant behavior models used to simulate building performance. Co-simulating the stochastic behavior of the occupants substantially increases the accuracy of the simulation. Design/methodology/approach: In this article, probabilistic models of the opening and closing behavior of windows by inhabitants were developed in a separate multi-agent platform, SimOcc, and implemented in the building simulation, TRNSYS, in such a way that window behavior and its interconnectivity can be reflected in the simulation analysis of the building. Findings: The results of the study prove that modeling complex behaviors is important for predicting actual building performance. The results aid in the identification of the gap between reality and existing simulation methods. We hope this study and its results will serve as a guide for researchers interested in investigating occupant behavior in the future. Research limitations/implications: Further case studies involving multi-user behavior in complex commercial buildings are needed to better understand the impact of occupant behavior on building performance. Originality/value: This study is considered a good opportunity to advance the national strategy by showing a suitable tool to help stakeholders in the design phase of new or retrofitted buildings to improve the performance of office buildings. Keywords: occupant behavior, co-simulation, energy consumption, thermal comfort
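A minimal sketch of the kind of stochastic window "opening and closing" model described above; the logistic coefficients and temperature setpoints are illustrative assumptions, not SimOcc's actual parameters.

```python
import math
import random

# Hedged sketch: a two-state (open/closed) stochastic window model of the
# kind co-simulated with a tool such as TRNSYS. Coefficients and setpoints
# are illustrative assumptions.

def p_open(indoor_temp_c: float) -> float:
    """Probability of opening a closed window this timestep (logistic)."""
    return 1.0 / (1.0 + math.exp(-0.8 * (indoor_temp_c - 26.0)))

def p_close(indoor_temp_c: float) -> float:
    """Probability of closing an open window this timestep (logistic)."""
    return 1.0 / (1.0 + math.exp(0.8 * (indoor_temp_c - 22.0)))

random.seed(1)
window_open = False
for hour, temp in enumerate([22, 24, 26, 28, 30, 27, 24, 21]):
    if not window_open and random.random() < p_open(temp):
        window_open = True              # occupant opens as it warms up
    elif window_open and random.random() < p_close(temp):
        window_open = False             # occupant closes as it cools down
    print(hour, temp, window_open)
```

In a co-simulation, the building model would feed the indoor temperature to this agent each timestep and receive the window state back, which is what makes the resulting airflow stochastic rather than scheduled.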
Procedia PDF Downloads 102
1371 Ozone Therapy for Disc Herniation: A Non-surgical Option
Authors: Shahzad Karim Bhatti
Abstract:
Background: Ozone is an allotrope of oxygen that can be used in the treatment of low back pain due to a herniated disc. It is a minimally invasive procedure that uses the biochemical properties of ozone to reduce disc volume and inflammation, resulting in significant pain relief. Aim: The purpose of this study was to evaluate the effectiveness of ozone therapy in combination with peri-ganglionic injection of local anesthetic and corticosteroid. Material and Methods: This retrospective study was done at the Interventional Radiology Department of Mayo Hospital, Lahore. A total of 49000 patients were included from January 2008 to March 2022. All the patients presented with clinical signs and symptoms of lumbar disc herniation, which was confirmed by an MRI scan of the lumbosacral spine. Pain reduction was assessed using the modified MacNab method. All the patients underwent percutaneous injection of ozone at a concentration of 27 micrograms/ml into the lumbar disc under fluoroscopic guidance, in combination with local anesthetic and corticosteroid in the peri-ganglionic space. Results were evaluated by two expert observers who were blinded to patient treatment. Results: A satisfactory therapeutic outcome was obtained. 55% of the patients showed complete recovery with resolution of symptoms. 20% of the patients complained of occasional episodic pain with no limitation of occupational activity. 15% of cases showed insufficient improvement. 5% of cases had insufficient improvement and went for surgery. 10% of cases never returned after the first visit. Conclusion: Intradiscal ozone for the treatment of herniated discs has revolutionized the percutaneous approach to nerve root compression, making it safer, more economical, and easier to repeat without side effects than treatments currently used in Pakistan. Keywords: pain, prolapse, ozone, back pain
Procedia PDF Downloads 27
1370 Comparative Study in Treatment of Distal Humerus Fracture with Lateral Column Plate Percutaneous Medial Screw and Intercondylar Screw
Authors: Sameer Gupta, Prant Gupta
Abstract:
Context: Fractures of the distal humerus are complex and challenging injuries for orthopaedic surgeons that can be effectively treated with open reduction and internal fixation. Aims: The study analyses clinical outcomes in patients with intra-articular distal humerus fractures (AO type 13 C3 excluded) treated using a different method of fixation (LCPMS). Subjects and Methods: A study was performed, and the authors' personal experiences were reported. Thirty patients were treated using an intercondylar screw with lateral column plating and percutaneous medial column screw fixation. Detailed analysis was done for functional outcomes (average arc of motion, union rate, and complications). Statistical Analysis Used: SPSS software version 22.0 was used for statistical analysis. Results: In our study, at the end of 6 months, overall good to excellent results were achieved in 28 of 30 patients on the basis of the MEP score. The majority of patients regained a full arc of motion, achieved fracture union without any major complications, and were able to perform almost all activities of daily living (which require good elbow joint movement and function). Conclusion: We concluded that this novel method provides adequate stability and anatomical reconstruction, with an early union rate observed at the end of 6 months. Excellent functional outcome was observed in almost all the patients because of shorter operating time and initiation of early physiotherapy, as most of the patients experienced only mild pain post-surgery. Keywords: intra-articular distal humerus fracture, percutaneous medial screw, lateral column plate, arc of motion
Procedia PDF Downloads 58
1369 Dewatering of Brewery Sludge through the Use of Biopolymers
Authors: Audrey Smith, M. Saifur Rahaman
Abstract:
The waste crisis has become a global issue, forcing many industries to reconsider their disposal methods and environmental practices. Sludge is a form of waste created in many fields, including water and wastewater treatment, pulp and paper, and brewing. The composition of this sludge differs between sources and can, therefore, have varying disposal methods or future applications. The brewery industry produces a significant amount of sludge with a high water content. In order to avoid landfilling, this waste can be further processed into a valuable material. Specifically, the sludge must undergo dewatering, a process which typically involves the addition of coagulants like aluminum sulfate or ferric chloride. These chemicals, however, limit the potential uses of the sludge since it will contain traces of metals. In this case, the desired outcome for the brewery sludge would be to produce animal feed; these conventional coagulants, however, would add a toxic component to the sludge. The use of biopolymers like chitosan, which acts as a coagulant, can dewater brewery sludge while allowing it to remain safe for animal consumption. Chitosan is itself a by-product of the shellfish processing industry and therefore reduces the environmental footprint, since it involves using the waste from one industry to treat the waste from another. In order to prove the effectiveness of this biopolymer, jar tests are utilised to determine the optimal dosages and conditions, while variations in contaminants like ammonium are also observed. The efficiency of chitosan can also be compared to other polysaccharides to determine which is best suited for this waste. Overall, a significant separation has been achieved between the solid and liquid content of the waste during the coagulation-flocculation process when applying chitosan.
This biopolymer can, therefore, be used to dewater brewery sludge such that it can be repurposed as animal feed. The use of biopolymers can also be applied to treat sludge from other industries, which can reduce the amount of waste produced and allow for more diverse options for reuse. Keywords: animal feed, biopolymer, brewery sludge, chitosan
Procedia PDF Downloads 156
1368 The Role of Phase Morphology on the Corrosion Fatigue Mechanism in Marine Steel
Authors: Victor Igwemezie, Ali Mehmanparast
Abstract:
Correct knowledge of the corrosion fatigue mechanism in marine steel is very important: it enables the design, selection, and use of steels for offshore applications, and it supports realistic corrosion fatigue life prediction of marine structures. A study has been conducted to increase the understanding of the corrosion fatigue mechanism in marine steels. The materials investigated are normalized and advanced S355 thermomechanical control process (TMCP) steels commonly used in the design of offshore wind turbine support structures. The experimental study was carried out by conducting corrosion fatigue tests under conditions pertinent to offshore wind turbine operations, using state-of-the-art facilities. A careful microstructural study of the crack growth path was conducted using a metallurgical optical microscope (OM), a scanning electron microscope (SEM), and Energy Dispersive X-Ray Spectroscopy (EDX). The tests were conducted on three subgrades of S355 steel: S355J2+N, S355G8+M, and S355G10+M, and the data were compared with similar studies in the literature. The results show that the ferrite-pearlite morphology primarily controls the corrosion-fatigue crack growth path in marine steels. A corrosion fatigue mechanism which relies on hydrogen embrittlement of the grain boundaries and the pearlite phase is used to explain the crack propagation behaviour. The crack growth trend in the Paris region of the da/dN vs. ΔK curve is used to explain the dependency of the corrosion-fatigue crack growth rate on the ferrite-pearlite morphology. Keywords: corrosion-fatigue mechanism, fatigue crack growth rate, ferritic-pearlitic steel, microstructure, phase morphology
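Crack growth in the Paris region referred to above follows da/dN = C(ΔK)^m; a minimal sketch, where C, m, and the corrosion acceleration factor are illustrative assumptions rather than values fitted to the S355 data:

```python
# Hedged sketch of the Paris law, da/dN = C * (dK)^m, which governs the
# mid-range ("Paris region") of the da/dN vs. dK curve. C and m below are
# illustrative placeholders, not constants fitted to the S355 subgrades.

def growth_rate_mm_per_cycle(delta_k_mpa_sqrt_m: float,
                             c: float = 1.0e-8, m: float = 3.0) -> float:
    """Fatigue crack growth rate da/dN (mm/cycle) at stress intensity range dK."""
    return c * delta_k_mpa_sqrt_m ** m

# A corrosive environment typically shifts the curve upward, which can be
# modelled as a larger effective C (the 3x factor here is an assumption):
air = growth_rate_mm_per_cycle(20.0)
seawater = growth_rate_mm_per_cycle(20.0, c=3.0e-8)
print(air, seawater)
```

On a log-log plot of da/dN against ΔK, m is the slope of the Paris region and a change in C is a vertical shift, which is how corrosion-accelerated growth appears in such data.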
Procedia PDF Downloads 158
1367 Acute Cartilage Defects of the Knee Treated With Chondral Restoration Procedures and Patellofemoral Stabilisation
Authors: John Scanlon, Antony Raymond, Randeep Aujla, Peter D’Alessandro, Satyen Gohil
Abstract:
Background: The incidence of significant acute chondral injury with patella dislocation is around 10-15%. It is accepted that chondral procedures should only be performed in the presence of joint stability. Methods: Patients were identified from surgeon/hospital logs. Patient demographics, lesion size and location, surgical procedure, patient-reported outcome measures (PROMs), post-operative MR imaging, and complications were recorded. PROMs and patient satisfaction were obtained. Results: 20 knees (18 patients) were included. Mean age was 18.6 years (range 11-39), and the mean follow-up was 16.6 months (range 2-70). The defect locations were the lateral femoral condyle (9/20; 45%), patella (9/20; 45%), medial femoral condyle (1/20; 5%), and the trochlea (1/20; 5%). The mean defect size was 2.6 cm². Twelve knees were treated with cartilage fixation, 5 with microfracture, and 3 with OATS. At follow-up, the overall mean Lysholm score was 77.4 (± 17.1), with no chondral regenerative procedure being statistically superior. There was no difference in Lysholm scores between patients having acute medial patellofemoral ligament reconstruction and those having medial soft tissue plication (p=0.59). Five (25%) knees required re-operation (one arthroscopic arthrolysis; one patella chondroplasty; two removals of loose bodies; one implant adjustment). Overall, 90% responded as being satisfied with surgery. Conclusion: Our aggressive pathway to identify and treat acute cartilage defects with early operative intervention and patella stabilisation has shown high rates of satisfaction and good Lysholm scores. The full range of chondral restoration options should be considered by surgeons managing these patients. Keywords: patella dislocation, chondral restoration, knee, patella stabilisation
Procedia PDF Downloads 126
1366 Comparing the Educational Effectiveness of eHealth to Deliver Health Knowledge between Higher Literacy Users and Lower Literacy Users
Authors: Yah-Ling Hung
Abstract:
eHealth is emerging as a promising vehicle to provide information for individual self-care management. However, the accessing ability, reading strategies, and navigating behavior of higher literacy users and lower literacy users are significantly different. How to tailor eHealth to audiences' health literacy and develop material appropriate to their needs therefore becomes a big challenge. The purpose of this study is to compare the educational effectiveness of eHealth in delivering health knowledge between higher literacy users and lower literacy users, thus establishing useful design strategies for eHealth aimed at users with different levels of health literacy. The study was implemented in four stages, the first of which developed a website as the testing medium to introduce health care knowledge relating to children's allergies. Secondly, a reliability and validity test was conducted to make sure that all of the questions in the questionnaire were good indicators. Thirdly, a pre-post knowledge test was conducted with 66 participants: 33 users with higher literacy and 33 users with lower literacy. Finally, a usability evaluation survey was undertaken to explore the criteria used by users with different levels of health literacy to evaluate eHealth. The results demonstrated that the eHealth intervention had a positive outcome in both groups. There was no significant difference in the effectiveness of the eHealth intervention between users with higher literacy and users with lower literacy, although the average mean of the lower literacy group was marginally higher than that of the higher literacy group. The findings also showed that the criteria used to evaluate eHealth can be analyzed in terms of the quality of information, appearance, appeal, and interaction, but that users with lower literacy have different evaluation criteria from those with higher literacy.
This is interdisciplinary research which proposes the sequential key steps, incorporating the planning, developing, and accessing issues, that need to be considered when designing eHealth for patients with varying degrees of health literacy. Keywords: eHealth, health intervention, health literacy, usability evaluation
Procedia PDF Downloads 139
1365 The Development, Validation, and Evaluation of the Code Blue Simulation Module in Improving the Code Blue Response Time among Nurses
Authors: Siti Rajaah Binti Sayed Sultan
Abstract:
Managing a code blue event is stressful for nurses, the patient, and the patient's family. A rapid response from the first and second responders in a code blue event will improve patient outcomes and prevent the tissue hypoxia that leads to brain injury and other organ failure. Initiating cardiac massage within 1 minute and defibrillation within 2 minutes will significantly improve patient outcomes. The American Heart Association has published guidelines for managing cardiac arrest patients, and the hospital must provide competent staff to manage this situation. This can be achieved when the staff are well equipped with the skills, attitude, and knowledge to manage the situation with well-planned strategies, i.e., clear guidelines for managing the code blue event, competent staff, and functional equipment. Code blue simulation (CBS) was chosen for the code blue management training program because it can mimic real scenarios. A code blue simulation module allows staff to appreciate what they will face during a code blue event, especially since such events rarely happen in a given area. CBS module training helps staff familiarize themselves with the activities that happen during actual events and enables them to operate the equipment accordingly. Being challenged and independent in managing the code blue in the early phase gives the patient a better outcome. The CBS module also provides the assessor and the hospital management team with the proper tools and guidelines for managing the code blue drill. Prompt action will benefit the patient and their family; it also indirectly increases confidence and job satisfaction among the nurses, raises the standard of care, reduces complications and hospital burden, and enhances cost-effective care. Keywords: code blue simulation module, development of code blue simulation module, code blue response time, code blue drill, cardiorespiratory arrest, managing code blue
Procedia PDF Downloads 64
1364 Prediction Modeling of Alzheimer’s Disease and Its Prodromal Stages from Multimodal Data with Missing Values
Authors: M. Aghili, S. Tabarestani, C. Freytes, M. Shojaie, M. Cabrerizo, A. Barreto, N. Rishe, R. E. Curiel, D. Loewenstein, R. Duara, M. Adjouadi
Abstract:
A major challenge in medical studies, especially longitudinal ones, is the problem of missing measurements, which hinders the effective application of many machine learning algorithms. Furthermore, recent Alzheimer's disease studies have focused on the delineation of Early Mild Cognitive Impairment (EMCI) and Late Mild Cognitive Impairment (LMCI) from cognitively normal controls (CN), which is essential for developing effective and early treatment methods. To address the aforementioned challenges, this paper explores the potential of using the eXtreme Gradient Boosting (XGBoost) algorithm to handle missing values in multiclass classification. We seek a generalized classification scheme where all prodromal stages of the disease are considered simultaneously in the classification and decision-making processes. Given the large number of subjects (1631) included in this study, and in the presence of almost 28% missing values, we investigated the performance of XGBoost on the classification of the four classes AD, CN, EMCI, and LMCI. Using a 10-fold cross-validation technique, XGBoost is shown to outperform other state-of-the-art classification algorithms by 3% in terms of accuracy and F-score. Our model achieved an accuracy of 80.52%, a precision of 80.62%, and a recall of 80.51%, supporting the more natural and promising multiclass classification. Keywords: eXtreme gradient boosting, missing data, Alzheimer's disease, early mild cognitive impairment, late mild cognitive impairment, multiclass classification, ADNI, support vector machine, random forest
Procedia PDF Downloads 186