Search results for: unmanned intervention algorithm
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6278

1148 The Effect of Surgical Intervention on Pediatric and Adolescent Obstructive Sleep Apnea Syndrome

Authors: Ching-Yi Yiu, Hui-Chen Hsu

Abstract:

Objectives: Obstructive sleep apnea syndrome (OSAS) is a common problem in modern society. It usually leads to sleep disorder and excessive daytime sleepiness and is associated with cardiovascular disease, cognitive dysfunction, and even death. Nonsurgical therapies include continuous positive airway pressure (CPAP), diet, and oral appliances. Surgical approaches include nasal surgery, tonsillectomy, adenoidectomy, uvulopalatopharyngoplasty (UPPP), and transoral robotic surgery (TORS). We compare the impact of surgical treatments on these patients. Methods: Between January 2018 and September 2022, we enrolled 125 OSAS patients (82 male, 43 female) at Chi Mei Medical Center, Liouying, Taiwan. Ages ranged from 6 to 71 years, with a mean age of 36.1 years. The average body mass index (BMI) was 25 kg/m2 in males and 25.5 kg/m2 in females. In this cohort, we evaluated the upper airway obstruction sites with nasopharyngoscopy and scheduled a planned surgery. Some cases received polysomnography (PSG) preoperatively; the average apnea-hypopnea index (AHI) was 37.7 events/hour. Sixty-eight patients received tonsillectomy, 9 received UPPP, 42 received UPPP and septomeatoplasty (SMP), and 6 received adenoidectomy and tonsillectomy (A and T). Subjective daytime sleepiness was evaluated with the Epworth Sleepiness Scale (ESS). Results: In the 68-patient tonsillectomy group, the average BMI was 24.9 kg/m2. In the UPPP group, the average BMI was 28.9 kg/m2. In the UPPP and SMP group, the average BMI was 27.9 kg/m2. In the A and T group, the average BMI was 17.2 kg/m2. A postoperative reduction of the AHI to less than 20 was achieved in 58% of patients. The ESS decreased from 10.9 to 4.9 after surgery. Conclusion: Obstructive sleep apnea syndrome is a common upper airway disturbance in the general population. Its prevalence varies widely by region, age, sex, and race. It leads to severe morbidity and mortality, including car accidents, stroke, nocturnal desaturation, and sudden death, and should be considered a major public health problem. CPAP is effective in improving daytime sleepiness, but long-term compliance is low. Surgical treatment with different modalities can produce a 50% decrease in AHI and ESS within the 6- to 12-month short-term period after surgery.

Keywords: apnea-hypopnea index, obstructive sleep apnea syndrome, polysomnography, uvulopalatopharyngoplasty

Procedia PDF Downloads 95
1147 Coding and Decoding versus Space Diversity for Rayleigh Fading Radio Frequency Channels

Authors: Ahmed Mahmoud Ahmed Abouelmagd

Abstract:

Diversity is the usual remedy for variations in transmitted signal level (fading) in radio frequency channels. Diversity techniques utilize two or more copies of a signal and combine those signals to combat fading. The basic concept of diversity is to transmit the signal via several independent diversity branches to obtain independent signal replicas in the time, frequency, space, and polarization diversity domains. Coding and decoding processes can be an alternative remedy for fading: they cannot increase the channel capacity, but they can improve the error performance. In this paper, we propose the use of replication decoding with the BCH code class, and the Viterbi decoding algorithm with convolutional coding, as examples of coding and decoding processes. The results are compared to those obtained from two optimized selection space diversity techniques. The performance over a Rayleigh fading channel, the model considered for radio frequency channels, is evaluated for each case. The evaluation results show that the coding and decoding approaches, especially the BCH coding approach with the replication decoding scheme, give better performance than the selection space diversity optimization approaches. An approach combining coding and decoding diversity with space diversity is also considered; its main disadvantage is its complexity, but it yields good performance results.
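
As a rough illustration of the space-diversity baseline discussed above (not the paper's simulator), the following sketch estimates the BPSK bit error rate over a flat Rayleigh fading channel with and without two-branch selection combining; the branch count, SNR grid, and symbol count are assumed values.

```python
# Monte-Carlo BER of BPSK over flat Rayleigh fading, with and without
# two-branch selection combining (illustrative sketch, assumed parameters).
import numpy as np

rng = np.random.default_rng(0)
n_sym = 200_000                      # symbols per SNR point
snr_db = np.arange(0, 21, 5)         # average SNR per branch in dB

def ber_rayleigh(snr_lin, n_branches):
    bits = rng.integers(0, 2, n_sym)
    s = 2 * bits - 1                                  # BPSK symbols (+1/-1)
    # independent Rayleigh branches: complex Gaussian gains, unit average power
    h = (rng.standard_normal((n_branches, n_sym)) +
         1j * rng.standard_normal((n_branches, n_sym))) / np.sqrt(2)
    noise = (rng.standard_normal((n_branches, n_sym)) +
             1j * rng.standard_normal((n_branches, n_sym))) / np.sqrt(2 * snr_lin)
    r = h * s + noise
    # selection combining: keep the branch with the strongest instantaneous gain
    best = np.argmax(np.abs(h), axis=0)
    idx = np.arange(n_sym)
    # coherent detection on the selected branch
    decision = np.real(np.conj(h[best, idx]) * r[best, idx]) > 0
    return np.mean(decision != bits)

for db in snr_db:
    lin = 10 ** (db / 10)
    print(f"SNR {db:2d} dB  no diversity BER={ber_rayleigh(lin, 1):.4f}  "
          f"2-branch selection BER={ber_rayleigh(lin, 2):.4f}")
```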

Keywords: Rayleigh fading, diversity, BCH codes, replication decoding, convolutional coding, Viterbi decoding, space diversity

Procedia PDF Downloads 438
1146 Interior Architecture in the Anthropocene: Engaging the Subnature through the Intensification of Body-Surface Interaction

Authors: Verarisa Ujung

Abstract:

The Anthropocene – defined by scientists as a new geological epoch in which human intervention has the dominant influence on geological, atmospheric, and ecological processes – challenges the contemporary discourse in architecture and interiors. This dominant influence blurs the distinction between the notions of nature, subnature, human, and non-human. Consequently, living in the Anthropocene demands sensitivity and responsiveness to heighten our sense of the rhythm of transformation and to recognise our environment as a product of natural, social, and historical processes. The notion of subnature is particularly emphasised in this paper to investigate the poetic sense of living with subnature. It can serve as a critical tool for exploring the aesthetic and programmatic implications of subnature on interiority. The ephemeral immateriality attached to subnature promotes an atmospheric delineation of interiority, the very inner significance of body-surface interaction, which is central to interior architecture discourse. This in turn reflects human activities, examining transformative change, architectural motion, and the traces left between moments. In this way, engaging the notion of subnature enables us to better understand the critical subject of interiority and can provide an in-depth study of interior architecture. Incorporating an exploration of the form, materiality, and pattern of subnature, this research seeks to grasp the inner significance of micro-to-macro approaches, so that the future of the interior might depend more on the investigation and development of responsive environments. To reflect upon the form, materiality, and intensity of subnature specifically characterised by natural, social, and historical processes, this research examines a volcanic land, White Island/Whakaari, New Zealand, as the chosen site of investigation. Emitting various forms and intensities of subnatures – smoke, mud, sulphur gas – this volcanic land is also open to new inhabitation within the sulphur factory ruins that reflect past human occupation. In this way, temporal and naturally selected manifestations of materiality, artefact, and performance can be traced out and might reveal the meaningful relations among space, inhabitation, and the well-being of inhabitants in the Anthropocene.

Keywords: anthropocene, body, intensification, intensity, interior architecture, subnature, surface

Procedia PDF Downloads 174
1145 A Protocol for Usability of Teaching to Students with Learning Difficulties at University: An Italian Research

Authors: Tamara Zappaterra

Abstract:

Learning difficulties have an evolutionary nature. International research has focused its analysis on the characteristics of learning difficulties in childhood, but we are still far from a thorough understanding of the nature of such disorders in adolescence and adulthood. These issues become even more urgent in the university context. Spelling, meaning, and appropriate use of the specific vocabulary of the various disciplines represent an additional challenge for the dyslexic student. This paper explores the characteristics of learning difficulties in adulthood and their impact on university teaching. It presents the results of an interdisciplinary project (spanning the educational, medical, and engineering areas) at the University of Florence. The purpose of the project is to design a protocol for the usability of teaching and individual study at the university level. The project, after an initial survey of user needs carried out with the participation of the protagonists themselves, is at the stage of drafting guidelines for inclusion and education, to be used by teachers, students, and administrative staff. The methodologies used are a purpose-built questionnaire and a series of focus groups with users. For collecting data during the focus groups, it was decided to use a method typical of Quality Function Deployment, a tool originally used for quality management whose versatility makes it easy to apply in a number of different contexts. The paper furthermore presents the findings of the project and the most significant elements of the guidelines for teaching, i.e., the section for teachers, whose aim is to implement learning-difficulties-friendly teaching at the university level, in compliance with Italian Law 170/2010. The guidelines for the didactics and inclusion of students with learning difficulties of the University of Florence are articulated around a global and systemic plan of action, meant to accompany and protect students throughout their study career, even before enrolling at the university, and considered at different levels: logistical, relational, educational, and didactic. In Italy, these guidelines received the endorsement of the CNUDD. It is a systemic intervention plan for students with learning difficulties, which has roused, and keeps rousing, the interest of the whole university system, prompting a radical reflection on academic teaching: while we try to provide the best learning-difficulties-friendly didactics in compliance with the rules, no one can be exempted from a wider reflection on the nature and quality of the university teaching offered to all students.

Keywords: didactic tools, learning difficulties, special and inclusive education, university teaching

Procedia PDF Downloads 282
1144 Development of Precise Ephemeris Generation Module for Thaichote Satellite Operations

Authors: Manop Aorpimai, Ponthep Navakitkanok

Abstract:

In this paper, the development of the ephemeris generation module used for the Thaichote satellite operations is presented. It is a vital part of the flight dynamics system, which comprises the orbit determination, orbit propagation, event prediction, and station-keeping maneuver modules. In generating the spacecraft ephemeris data, the estimated orbital state vector from the orbit determination module is used as an initial condition. The equations of motion are then integrated forward in time to predict the satellite states. The higher geopotential harmonics, as well as other disturbing forces, are taken into account to resemble the low-Earth-orbit environment. Using a highly accurate numerical integrator based on the Bulirsch-Stoer algorithm, the ephemeris data can be generated for long-term predictions with a relatively small computational burden and short calculation time. Events occurring during the prediction that are relevant to mission operations, such as the satellite's rise/set as viewed from the ground station, Earth and Moon eclipses, the drift in ground track, and the drift in the local solar time of the orbital plane, are all detected and reported. When combined with other modules to form a flight dynamics system, this application is intended for the Thaichote satellite and successive Thai Earth-observation missions.
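
The propagation step can be sketched as follows. This is an illustration only, not the Thaichote flight dynamics module: the state vector is integrated under two-body gravity plus the J2 zonal harmonic, the initial state is a hypothetical example, and SciPy's DOP853 integrator stands in for the Bulirsch-Stoer scheme mentioned above.

```python
# Minimal orbit-propagation sketch: two-body gravity + J2, ephemeris every 60 s.
import numpy as np
from scipy.integrate import solve_ivp

MU = 398600.4418        # Earth's gravitational parameter, km^3/s^2
RE = 6378.137           # Earth equatorial radius, km
J2 = 1.08262668e-3      # second zonal harmonic

def accel(t, y):
    r, v = y[:3], y[3:]
    rn = np.linalg.norm(r)
    a = -MU * r / rn**3                      # point-mass gravity
    # J2 perturbation (standard closed-form expression)
    z2 = (r[2] / rn) ** 2
    k = 1.5 * J2 * MU * RE**2 / rn**5
    a += k * np.array([r[0] * (5 * z2 - 1),
                       r[1] * (5 * z2 - 1),
                       r[2] * (5 * z2 - 3)])
    return np.concatenate([v, a])

# hypothetical LEO initial state [x, y, z, vx, vy, vz] in km and km/s
y0 = np.array([-7000.0, 0.0, 0.0, 0.0, -1.0, 7.4])
t_eval = np.arange(0.0, 86400.0, 60.0)       # one day, ephemeris points every 60 s

sol = solve_ivp(accel, (0.0, 86400.0), y0, t_eval=t_eval,
                method="DOP853", rtol=1e-10, atol=1e-12)
ephemeris = np.column_stack([sol.t, sol.y.T])  # time, position (km), velocity (km/s)
print(ephemeris[:3])
```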

Keywords: flight dynamics system, orbit propagation, satellite ephemeris, Thailand’s Earth Observation Satellite

Procedia PDF Downloads 375
1143 Pediatric Hearing Aid Use: A Study Based on Data Logging Information

Authors: Mina Salamatmanesh, Elizabeth Fitzpatrick, Tim Ramsay, Josee Lagacé, Lindsey Sikora, JoAnne Whittingham

Abstract:

Introduction: Hearing loss (HL) is one of the most common disorders present at birth and in early childhood. Universal newborn hearing screening (UNHS) has been adopted on the assumption that, with early identification of HL, children will have access to optimal amplification and intervention at younger ages and can therefore take advantage of the brain's maximal plasticity. One particular challenge for parents in the early years is achieving consistent hearing aid (HA) use, which is critical to the child's development and constitutes the first step in the rehabilitation process. This study examined the consistency of hearing aid use in young children based on data logging information documented during audiology sessions in the first three years after hearing aid fitting. Methodology: The first 100 children diagnosed with bilateral HL before 72 months of age between 2003 and 2015 in a pediatric audiology clinic, and who had at least two hearing aid follow-up sessions with available data logging information, were included in the study. Data from each audiology session (the child's age at the session and the average hours of use per day for each ear in the first three years after HA fitting) were collected. Clinical characteristics (degree of hearing loss, age at HA fitting) were also documented to further the understanding of factors that affect HA use. Results: Preliminary analysis of the first 20 children shows that all of them (100%) have at least one data logging session recorded in the clinical audiology system (Noah). Of the 20 children, 17 (85%) have three data logging events recorded in the first three years after HA fitting. Based on the statistical analysis of these first 20 cases, the median hours of use in the first follow-up session after the hearing aid fitting was 3.9 hours for the right ear, with an interquartile range (IQR) of 10.2 hours, and 4.4 hours for the left ear, with an IQR of 9.7 hours. In the first session, 47% of the children used their hearing aids ≤5 hours a day, 12% used them between 5 and 10 hours, and 22% used them ≥10 hours a day. However, these children showed increased use by the third follow-up session, with a median of 9.1 hours (IQR 2.5) for the right ear and 8.2 hours (IQR 5.6) for the left ear. By the third follow-up session, 14% of children used their hearing aids ≤5 hours, while 38% used them ≥10 hours. Based on the preliminary results, factors such as age and level of HL significantly affect the hours of use. Conclusion: The use of data logging information to assess actual hours of HA use provides an opportunity to examine (a) the challenges faced by families of young children with HAs and (b) the factors that affect use in very young children. Data logging, when used collaboratively with parents, can be a powerful tool to identify problems and to encourage and assist families in maximizing their child's hearing potential.

Keywords: hearing loss, hearing aid, data logging, hours of use

Procedia PDF Downloads 229
1142 Introduce a New Model of Anomaly Detection in Computer Networks Using Artificial Immune Systems

Authors: Mehrshad Khosraviani, Faramarz Abbaspour Leyl Abadi

Abstract:

Computer networks are a fundamental component of the modern information society, and they are generally connected to the Internet. Because the Internet was not originally designed with security as a primary purpose, these networks have been exposed to many significant attacks in recent decades. Today, different security tools and systems, including intrusion detection systems, are used to provide network security. In this paper, an anomaly detection system based on artificial immunity is designed and evaluated. The idea of using artificial immune methods for detecting abnormalities in computer networks is motivated by properties of the immune system that match the needs of such detection: for example, the ability to detect previously unseen abnormalities and a variety of attacks, as well as the memory, learning ability, and self-regulation of artificial immune algorithms. The detection system proposed in this paper requires only normal samples of network traffic for training; no additional data about the types of attacks is needed. In the proposed system, positive selection and negative selection processes are used to select samples that create a distinction between the normal population and attacks. Evaluation on a real data collection indicates that the false alarm rate of the proposed system is often low compared to other methods, while the detection rate shows some variation.
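
A minimal sketch of the negative-selection idea mentioned above is given below; it is an illustration, not the authors' system, and the feature dimension, detector radius, and thresholds are assumed values.

```python
# Negative selection sketch: random detectors that do not match normal ("self")
# training samples are kept; new samples covered by a detector are flagged.
import numpy as np

rng = np.random.default_rng(42)
DIM, RADIUS, N_DETECTORS = 4, 0.15, 1000     # assumed values

# training: only normal traffic samples (feature vectors scaled into [0, 1])
self_samples = rng.normal(loc=0.5, scale=0.05, size=(300, DIM)).clip(0, 1)

def min_dist(points, x):
    return np.min(np.linalg.norm(points - x, axis=1))

# negative selection: discard any candidate detector that matches a self sample
detectors = []
while len(detectors) < N_DETECTORS:
    cand = rng.uniform(0, 1, DIM)
    if min_dist(self_samples, cand) > RADIUS:
        detectors.append(cand)
detectors = np.array(detectors)

def is_anomalous(x):
    """A sample is flagged when some surviving detector covers it."""
    return min_dist(detectors, x) <= RADIUS

# illustrative evaluation: traffic resembling "self" vs. arbitrary traffic
normal_test = rng.normal(0.5, 0.05, size=(100, DIM)).clip(0, 1)
other_test = rng.uniform(0, 1, size=(100, DIM))
flag_rate = lambda X: np.mean([is_anomalous(x) for x in X])
print(f"flagged fraction of normal-like traffic: {flag_rate(normal_test):.2f}")
print(f"flagged fraction of arbitrary traffic:   {flag_rate(other_test):.2f}")
```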

Keywords: artificial immune system, abnormality detection, intrusion detection, computer networks

Procedia PDF Downloads 353
1141 Comparison of Two Maintenance Policies for a Two-Unit Series System Considering General Repair

Authors: Seyedvahid Najafi, Viliam Makis

Abstract:

In recent years, maintenance optimization has attracted special attention due to the growing complexity of industrial systems. Maintenance costs are high for many systems, and preventive maintenance is effective when it increases operational reliability and safety at a reduced cost. The novelty of this research is to consider general repair in the modeling of multi-unit series systems and to solve the maintenance problem for such systems using the semi-Markov decision process (SMDP) framework. We propose an opportunistic maintenance policy for a series system composed of two main units. Unit 1, which is more expensive than unit 2, is subjected to condition monitoring, and its deterioration is modeled using a gamma process. The hazard rate of unit 1 is estimated by the proportional hazards model (PHM), and two hazard rate control limits are considered as the thresholds of maintenance interventions for unit 1. Maintenance is performed on unit 2 according to an age control limit. The objective is to find the optimal control limits and minimize the long-run expected average cost per unit time. The proposed algorithm is applied to a numerical example to compare the effectiveness of the proposed policy (policy Ⅰ) with policy Ⅱ, which is similar to policy Ⅰ but performs replacement instead of general repair. Results show that policy Ⅰ leads to a lower average cost than policy Ⅱ.

Keywords: condition-based maintenance, proportional hazards model, semi-Markov decision process, two-unit series systems

Procedia PDF Downloads 122
1140 The Use of Remotely Sensed Data to Extract Wetlands Area in the Cultural Park of Ahaggar, South of Algeria

Authors: Y. Fekir, K. Mederbal, M. A. Hammadouche, D. Anteur

Abstract:

The cultural park of the Ahaggar, occupying a large area of Algeria, is characterized by rich wetlands that must be preserved and managed in both time and space. Managing such a large area is complex and requires large amounts of data, most of which are spatially localized (DEMs, satellite images, socio-economic information, etc.), so the use of conventional and traditional methods is quite difficult. Remote sensing, because of its efficiency in environmental applications, has become an indispensable solution for this kind of study. Remotely sensed imaging data have been very useful over the last decade in a range of applications: they can aid in several domains, such as the detection and identification of diverse wetland surface targets, topographical details, and geological features. In this work, we attempt to extract wetland areas automatically using multispectral remotely sensed data from the Earth Observing-1 (EO-1) and Landsat satellites. Both carry high-resolution multispectral imagers with a 30 m resolution that image the study area. We used images acquired over several areas of interest in the National Park of Ahaggar in the south of Algeria. An extraction algorithm is applied to several spectral indices, obtained from combinations of different spectral bands, to extract the wetland fraction of land use. The obtained results show that wetland areas can be accurately distinguished from other land use themes through a fine exploitation of the spectral indices.
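
A minimal sketch of spectral-index thresholding of the kind described above is given below; it is an illustration, not the authors' extraction algorithm, and the band arrays and thresholds are assumed placeholders (with real EO-1 or Landsat scenes, the bands would be read from the delivered rasters).

```python
# NDWI/NDVI thresholding sketch for flagging likely wetland/water pixels.
import numpy as np

def ndwi(green, nir):
    """McFeeters NDWI = (green - NIR) / (green + NIR); high values -> water."""
    return (green - nir) / (green + nir + 1e-9)

def ndvi(red, nir):
    """NDVI = (NIR - red) / (NIR + red); high values -> vegetation."""
    return (nir - red) / (nir + red + 1e-9)

# toy reflectance rasters standing in for 30 m Landsat / EO-1 bands
rng = np.random.default_rng(1)
shape = (200, 200)
green = rng.uniform(0.02, 0.30, shape)
red = rng.uniform(0.02, 0.30, shape)
nir = rng.uniform(0.02, 0.45, shape)

wi = ndwi(green, nir)
vi = ndvi(red, nir)

# assumed decision rule: open water where NDWI > 0, wetter vegetated margins
# where NDWI is moderately high and NDVI is low-to-moderate
water_mask = wi > 0.0
wet_margin_mask = (wi > -0.1) & (vi < 0.3) & ~water_mask

pixel_area_ha = (30 * 30) / 10_000          # 30 m pixels -> hectares
print(f"open water area: {water_mask.sum() * pixel_area_ha:.1f} ha")
print(f"wet margin area: {wet_margin_mask.sum() * pixel_area_ha:.1f} ha")
```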

Keywords: multispectral data, EO-1, Landsat, wetlands, Ahaggar, Algeria

Procedia PDF Downloads 375
1139 Burnout and Salivary Cortisol Among Laboratory Personnel in Klang Valley, Malaysia During COVID-19 Pandemic

Authors: Maznieda Mahjom, Rohaida Ismail, Masita Arip, Mohd Shaiful Azlan, Nor’Ashikin Othman, Hafizah Abdullah, Nor Zahrin Hasran, Joshita Jothimanickam, Syaqilah Shawaluddin, Nadia Mohamad, Raheel Nazakat, Tuan Mohd Amin, Mizanurfakhri Ghazali, Rosmanajihah Mat Lazim

Abstract:

The COVID-19 outbreak has been particularly detrimental to everyone's mental health and has left a long, devastating crisis in the healthcare sector. The daily increase in COVID-19 cases and close contacts necessitated the testing of a large number of samples, increasing the workload and burden on laboratory personnel. This study aims to determine the prevalence of personal-, work- and client-related burnout and to measure the concentration of salivary cortisol among laboratory personnel in the main laboratories in Klang Valley, Malaysia. This cross-sectional study was conducted in late 2021 and recruited a total of 404 respondents from three laboratories in Klang Valley, Malaysia. The level of burnout was assessed using the Copenhagen Burnout Inventory (CBI), comprising three sub-dimensions of personal-, work- and client-related burnout; a score of 50 or above indicated possible burnout. Salivary cortisol was measured using a competitive enzyme immunoassay kit (Salimetrics, State College, PA, USA). Normal salivary cortisol concentrations in adults are within 0.094 to 1.551 μg/dl in the morning and range from non-detectable to 0.359 μg/dl in the evening. The prevalence of personal-, work- and client-related burnout among laboratory personnel was 36.1%, 17.8% and 7.2%, respectively. Abnormal morning and evening cortisol concentrations were recorded in 29.5% and 21.8% of respondents, respectively, with 6.9%-7.4% missing data excluded. IgA levels were normal for most respondents (95.53%). Laboratory personnel were at risk of suffering burnout during the COVID-19 pandemic. Thus, mental health programs need to be implemented at the department and hospital level by regularly screening healthcare workers and designing intervention programs. It is also vital to improve the coping skills of laboratory personnel by increasing awareness of good coping-skill techniques. The training must be delivered in an innovative way to ensure that laboratory personnel can internalise the techniques and practise them in real life.

Keywords: burnout, COVID-19, laboratory personnel, salivary cortisol

Procedia PDF Downloads 68
1138 The Preventive Effect of Metformin on Paclitaxel-Induced Peripheral Neuropathy

Authors: AliAkbar Hafezi, Jamshid Abedi, Jalal Taherian, Behnam Kadkhodaei, Mahsa Elahi

Abstract:

Background. Peripheral neuropathy is a common side effect of the administration of neurotoxic chemotherapy agents. This adverse effect is a major dose-limiting factor of many commonly used chemotherapy drugs. Currently, there are no Food and Drug Administration (FDA)-approved medications for the prevention or treatment of chemotherapy-induced peripheral neuropathy. Therefore, this study was performed to investigate the efficacy and safety of metformin against paclitaxel-induced peripheral neuropathy (PIPN). Methods. In this randomized clinical trial, cancer patients who were candidates for chemotherapy with paclitaxel and were referred to radiation oncology departments in Iran from 2022 to 2023 were studied. Patients were randomly divided into two groups: 1) the case group (n = 30) received metformin 500 mg orally twice a day after meals during chemotherapy with paclitaxel, and 2) the control group (n = 30) received chemotherapy without metformin or any additional medication. Patients were assessed for numbness or other neurological symptoms two weeks before chemotherapy, 1-2 days before and weekly during chemotherapy, and at the end of the study. They were also assessed by nerve conduction study (NCS) before the intervention and one week after the end of chemotherapy. The primary outcome was efficacy in reducing PIPN, and the secondary outcome was adverse effects. The outcomes were then compared between the two groups of patients. Results. A total of 60 female cancer patients receiving chemotherapy with paclitaxel were evaluated in two groups. The groups were matched in terms of age, body mass index, fasting blood sugar, smoking, pathologic stage, and creatinine levels. The results showed that 18 patients (60.0%) in the case group and 23 patients (76.6%) in the control group had PIPN clinically (P = 0.267), and NCS showed that 11 patients (36.6%) in the case group and 15 patients (50.0%) in the control group suffered from PIPN, with no significant difference observed between the two groups (P = 0.435). Diarrhea (n = 3; 10.0%) and nausea (n = 3; 10.0%) were the most common side effects of metformin in the case group, and no serious side effects (lactic acidosis or anemia) were found in these patients. Conclusion. This study indicated that metformin did not significantly prevent PIPN in cancer patients receiving chemotherapy, although the frequency of peripheral neuropathy in the case group was lower than in the control group. The use of metformin in these patients had acceptable safety, and no serious side effects were reported.

Keywords: peripheral neuropathy, chemotherapy, paclitaxel, metformin

Procedia PDF Downloads 42
1137 A Pipeline for Detecting Copy Number Variation from Whole Exome Sequencing Using Comprehensive Tools

Authors: Cheng-Yang Lee, Petrus Tang, Tzu-Hao Chang

Abstract:

Copy number variations (CNVs) play an important role in many kinds of human diseases, such as autism, schizophrenia, and a number of cancers. Many disease-related variants are found in genome coding regions, and whole exome sequencing (WES) is a cost-effective and powerful technology for detecting variants that are enriched in exons, with potential applications in the clinical setting. Although several algorithms have been developed to detect CNVs using WES, and have been compared with other algorithms on their authors' own samples to find the most suitable methods, there has been no consistent dataset across most algorithms for evaluating CNV detection ability. Moreover, most algorithms use a command-line interface, which may greatly limit the analysis capability of many laboratories. We created a series of simulated WES datasets from UCSC hg19 chromosome 22 and then evaluated the CNV detection ability of 19 algorithms from the OMICtools database using our simulated WES datasets. We computed the sensitivity, specificity, and accuracy of each algorithm for validation of the exome-derived CNVs. After comparing the 19 algorithms from the OMICtools database, we constructed a platform that installs all of the algorithms in a virtual machine, such as VirtualBox, which can be set up conveniently on local computers, and then created a simple script that can easily be used to detect CNVs with the algorithms selected by users. We also built a table that elaborates many kinds of properties, such as input requirements and CNV detection ability, for all of the algorithms, providing users with a specification for choosing the optimal algorithm.

Keywords: whole exome sequencing, copy number variations, omictools, pipeline

Procedia PDF Downloads 317
1136 Classification of Business Models of Italian Bancassurance by Balance Sheet Indicators

Authors: Andrea Bellucci, Martina Tofi

Abstract:

The aim of this paper is to analyze the business models of bancassurance in Italy for the life business. The life insurance business is very well developed in the Italian market, and bank branches hold 80% of the market share. Given its maturity, the life insurance market needs to consolidate its organizational form to allow for the development of the non-life business, which nowadays collects few premiums but represents a great opportunity to enlarge the market share of bancassurance by using its strength in the distribution channel while the market share of independent agents is decreasing. Starting with the main business models of bancassurance for the life business, this paper analyzes the performance of life companies in the Italian market through balance sheet indicators and the main discriminant variables of business models. The study observes trends from 2013 to 2015 for the Italian market by exploiting a database managed by the Associazione Nazionale delle Imprese di Assicurazione (ANIA). The applied approach is based on a bottom-up analysis, starting with variables and indicators to define the business models' classification. The statistical classification algorithm proposed by Ward is employed to design business model profiles. The results of the analysis are a representation of the main business models, each built from its profile of indicators. In this way, an unsupervised analysis is developed; it is limited by its judgmental dimension, based on the researchers' opinion, but it makes it possible to obtain a design of effective business models.
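
A minimal sketch of the bottom-up classification step is shown below: a few balance-sheet indicators per company are standardised and Ward's hierarchical clustering is applied to obtain business-model profiles. The indicator values are invented placeholders, not ANIA data.

```python
# Ward hierarchical clustering of balance-sheet indicators (illustrative data).
import pandas as pd
from scipy.cluster.hierarchy import linkage, fcluster

# hypothetical indicators per bancassurance company
data = pd.DataFrame({
    "premium_growth":    [0.12, 0.10, 0.35, 0.33, 0.05, 0.40],
    "expense_ratio":     [0.08, 0.09, 0.15, 0.14, 0.07, 0.16],
    "unit_linked_share": [0.20, 0.25, 0.70, 0.65, 0.10, 0.75],
    "solvency_ratio":    [1.8, 1.9, 1.4, 1.5, 2.1, 1.3],
}, index=["Co_A", "Co_B", "Co_C", "Co_D", "Co_E", "Co_F"])

# standardise so no indicator dominates the Euclidean distances
z = (data - data.mean()) / data.std(ddof=0)

# Ward's method minimises the within-cluster variance at each merge
Z = linkage(z.values, method="ward")
labels = fcluster(Z, t=2, criterion="maxclust")   # cut the tree into two profiles

for profile in sorted(set(labels)):
    members = data.index[labels == profile].tolist()
    print(f"business-model profile {profile}: {members}")
    print(data.loc[members].mean().round(3).to_dict())
```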

Keywords: bancassurance, business model, non-life bancassurance, insurance business value drivers

Procedia PDF Downloads 296
1135 Full-Field Estimation of Cyclic Threshold Shear Strain

Authors: E. E. S. Uy, T. Noda, K. Nakai, J. R. Dungca

Abstract:

Cyclic threshold shear strain is the cyclic shear strain amplitude that serves as the indicator of the development of pore water pressure. The parameter can be obtained by performing a cyclic triaxial test, a shaking table test, a cyclic simple shear test, or a resonant column test. In a cyclic triaxial test, other researchers install measuring devices in close proximity to the soil to measure the parameter. In this study, an attempt was made to estimate the cyclic threshold shear strain parameter using a full-field measurement technique. The technique uses a camera to monitor and measure the movement of the soil. For this study, the technique was incorporated into a strain-controlled consolidated undrained cyclic triaxial test. Calibration of the camera was first performed to ensure that the camera could properly measure deformation under cyclic loading. Its capacity to measure deformation was also investigated using a cylindrical rubber dummy. Two-dimensional image processing was implemented, and the Lucas-Kanade optical flow algorithm was applied to track the movement of the soil particles. Results from the full-field measurement technique were compared with the results from a linear variable displacement transducer. A range of values was determined from the estimation, owing to the nonhomogeneous deformation of the soil observed during cyclic loading. The minimum values were on the order of 10⁻²% in some areas of the specimen.
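
A minimal sketch of the Lucas-Kanade tracking step is shown below; the frame file names, camera calibration factor, and feature-detection settings are assumptions and would come from the actual triaxial-test setup.

```python
# Track corner features between two consecutive frames with pyramidal
# Lucas-Kanade optical flow and report their displacements (illustrative).
import cv2
import numpy as np

frame0 = cv2.imread("cycle_t0.png", cv2.IMREAD_GRAYSCALE)   # assumed file names
frame1 = cv2.imread("cycle_t1.png", cv2.IMREAD_GRAYSCALE)
if frame0 is None or frame1 is None:
    raise SystemExit("provide two consecutive grayscale frames of the specimen")

# detect trackable points on the specimen surface
p0 = cv2.goodFeaturesToTrack(frame0, maxCorners=400, qualityLevel=0.01,
                             minDistance=7, blockSize=7)

# pyramidal Lucas-Kanade optical flow
p1, status, err = cv2.calcOpticalFlowPyrLK(
    frame0, frame1, p0, None,
    winSize=(21, 21), maxLevel=3,
    criteria=(cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 30, 0.01))

good_old = p0[status.flatten() == 1].reshape(-1, 2)
good_new = p1[status.flatten() == 1].reshape(-1, 2)
disp = good_new - good_old                     # pixel displacements

mm_per_pixel = 0.05                            # assumed calibration factor
print(f"tracked points: {len(disp)}")
print(f"mean |displacement|: "
      f"{np.linalg.norm(disp, axis=1).mean() * mm_per_pixel:.4f} mm")
```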

Keywords: cyclic loading, cyclic threshold shear strain, full-field measurement, optical flow

Procedia PDF Downloads 233
1134 Multi-Objective Optimization of a Solar-Powered Triple-Effect Absorption Chiller for Air-Conditioning Applications

Authors: Ali Shirazi, Robert A. Taylor, Stephen D. White, Graham L. Morrison

Abstract:

In this paper, a detailed simulation model of a solar-powered triple-effect LiBr–H2O absorption chiller is developed to supply both the cooling and heating demand of a large-scale building, aiming to reduce fossil fuel consumption and greenhouse gas emissions in the building sector. TRNSYS 17 is used to simulate the performance of the system over a typical year. A combined energetic-economic-environmental analysis is conducted to determine the system's annual primary energy consumption and total cost, which are considered as two conflicting objectives. A multi-objective optimization of the system is performed using a genetic algorithm to minimize these objectives simultaneously. The optimization results show that the final optimal design of the proposed plant has a solar fraction of 72% and leads to an annual primary energy saving of 0.69 GWh and an annual CO2 emissions reduction of ~166 tonnes compared to a conventional HVAC system. The economics of this design, however, are not appealing without public funding, which is often the case for many renewable energy systems. The results show that a good funding policy is required in order for these technologies to achieve satisfactory payback periods within the lifetime of the plant.
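
A minimal sketch of the multi-objective genetic algorithm idea follows; it is not the TRNSYS-coupled optimisation of the paper. Two assumed design variables (collector area and storage volume) are evolved against two toy surrogate objectives standing in for annual primary energy use and total cost, and the non-dominated set is reported.

```python
# Simple multi-objective GA: domination-count fitness, blend crossover,
# Gaussian mutation; all models, bounds and settings are illustrative.
import numpy as np

rng = np.random.default_rng(7)
LOW, HIGH = np.array([100.0, 5.0]), np.array([1000.0, 50.0])   # m^2, m^3

def objectives(x):
    area, vol = x
    energy = 900.0 - 0.6 * area - 2.0 * vol + 0.0004 * area * vol  # MWh/yr (toy)
    cost = 0.4 * area + 1.5 * vol + 150.0                          # k$ (toy)
    return np.array([energy, cost])

def dominates(a, b):
    return np.all(a <= b) and np.any(a < b)

def non_dominated(F):
    return [i for i, fi in enumerate(F)
            if not any(dominates(fj, fi) for j, fj in enumerate(F) if j != i)]

pop = rng.uniform(LOW, HIGH, size=(40, 2))
for gen in range(60):
    F = np.array([objectives(x) for x in pop])
    # fitness = number of individuals that dominate me (lower is better)
    ranks = np.array([sum(dominates(fj, fi) for fj in F) for fi in F])
    children = []
    while len(children) < len(pop):
        i, j = rng.integers(0, len(pop), 2)
        p1 = pop[i] if ranks[i] <= ranks[j] else pop[j]      # tournament
        i, j = rng.integers(0, len(pop), 2)
        p2 = pop[i] if ranks[i] <= ranks[j] else pop[j]
        w = rng.uniform(size=2)
        child = w * p1 + (1 - w) * p2                        # blend crossover
        child += rng.normal(0, 0.02, 2) * (HIGH - LOW)       # Gaussian mutation
        children.append(np.clip(child, LOW, HIGH))
    pop = np.array(children)

F = np.array([objectives(x) for x in pop])
for i in non_dominated(F):
    print(f"area={pop[i][0]:6.1f} m^2  volume={pop[i][1]:4.1f} m^3  "
          f"energy={F[i][0]:6.1f} MWh  cost={F[i][1]:6.1f} k$")
```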

Keywords: economic, environmental, multi-objective optimization, solar air-conditioning, triple-effect absorption chiller

Procedia PDF Downloads 238
1133 Application of a Theoretical Framework as a Context for a Travel Behavior Change Policy Intervention

Authors: F. Moghtaderi, M. Burke, J. Troelsen

Abstract:

There has been a significant decline in active travel and a massive increase in the use of car-dependent travel modes in many countries during the past two decades. This increased use of motorized travel modes is accompanied by evident risks to people's physical and mental health, ranging from overweight and obesity to increasing air pollution. In response to these rising concerns, local councils and other interested organizations around the world have introduced a variety of initiatives to reduce the dominance of cars in daily journeys. However, the nature of these interventions, which relate to human behavior, introduces considerable complexity. People's travel behavior, and changing that behavior, has two different aspects: people's attitudes and perceptions toward sustainable and healthy modes of travel and toward motorized travel modes (especially private car use), and people's behavior change processes. There is no comprehensive model to guide policy interventions and increase their likelihood of success. A comprehensive theoretical framework is required to facilitate and guide the processes of data collection and analysis and to achieve the best possible guidelines for policy makers. Given this gap in travel behavior change research, this paper attempts to identify and suggest a multidimensional framework to facilitate the planning of interventions. A structured mixed method is suggested to expand the scope and improve the analytic power of the results, given the complexity of human behavior. To capture people's attitudes, a theory focused on attitudes toward a particular travel behavior was needed; the literature around the theory of planned behavior (TPB) was the most useful, as it has been proven to be a good predictor of behavior change. The other aspect of the research relates to people's decision-making processes and to exploring guidelines for further interventions; therefore, a theory was needed to facilitate and direct intervention design. The transtheoretical model of behavior change (TTM) was used to derive a set of useful guidelines for further interventions aimed at increasing active travel and sustainable modes of travel. Consequently, a combination of these two theories (TTM and TPB) is presented as an appropriate concept for identifying and designing travel behavior change interventions.

Keywords: behavior change theories, theoretical framework, travel behavior change interventions, urban research

Procedia PDF Downloads 373
1132 Use of Telehealth for Facilitating the Diagnostic Assessment of Autism Spectrum Disorder: A Scoping Review

Authors: Manahil Alfuraydan, Jodie Croxall, Lisa Hurt, Mike Kerr, Sinead Brophy

Abstract:

Autism Spectrum Disorder (ASD) is a developmental condition characterised by impairment in social communication and social interaction and by a repetitive or restricted pattern of interests, behaviour, and activity. There is a significant delay between seeking help and a confirmed diagnosis of ASD. This may result in a delay in receiving early intervention services, which are critical for positive outcomes, and the long wait times also cause stress for individuals and their families. Telehealth potentially offers a way of improving the diagnostic pathway for ASD. This review of the literature aims to examine which telehealth approaches have been used in the diagnosis and assessment of autism in children and adults, whether they are feasible and acceptable, and how they compare with face-to-face diagnosis and assessment methods. A comprehensive search of the following databases – MEDLINE, CINAHL Plus with Full Text, Business Source Complete, Web of Science, Scopus, and PsycINFO – and of trial and systematic review databases, including the Cochrane Library, Health Technology Assessment, the Database of Abstracts of Reviews of Effects, and the NHS Economic Evaluation Database, was conducted, combining the terms autism and telehealth, from 2000 to 2018. A total of 10 studies were identified for inclusion in the review. The review found two methods of using telehealth: (a) video conferencing, enabling teams in different areas to consult with families and assess the child or adult in real time, and (b) video upload to a web portal, enabling clinical assessment of behaviours in the family home. The findings were positive, with high agreement between the remote and face-to-face methods in terms of diagnosis and high levels of satisfaction among families and clinicians. This field is in its very early stages, so only studies with small sample sizes were identified, but the findings suggest that telehealth methods have the potential to improve the assessment and diagnosis of autism when used in conjunction with existing methods, especially for those with clear autism traits and for adults with autism. Larger randomised controlled trials of this technology are warranted.

Keywords: assessment, autism spectrum disorder, diagnosis, telehealth

Procedia PDF Downloads 127
1131 Dialectical Behavior Therapy in Managing Emotional Dysregulation, Depression, and Suicidality in Autism Spectrum Disorder Patients: A Systematic Review

Authors: Alvin Saputra, Felix Wijovi

Abstract:

Background: Adults with Autism Spectrum Disorder (ASD) often experience emotional dysregulation and heightened suicidality. Dialectical Behavior Therapy (DBT) and Radically Open DBT (RO-DBT) have shown promise in addressing these challenges, though research on their effectiveness in ASD populations remains limited. This systematic review aims to evaluate the impact of DBT and RO-DBT on emotional regulation, depression, and suicidality in adults with ASD. Methods: A systematic review was conducted by searching databases such as PubMed, PsycINFO, and Scopus for studies published on DBT and RO-DBT interventions in adults with Autism Spectrum Disorder (ASD). Inclusion criteria were peer-reviewed studies that reported on emotional regulation, suicidality, or depression outcomes. Data extraction focused on sample characteristics, intervention details, and outcome measures. Quality assessment was performed using standard systematic review criteria to ensure reliability and relevance of findings. Results: 4 studies comprising a total of 343 participants were included in this study. DBT and RO-DBT interventions demonstrated a medium effect size (Cohen's d = 0.53) in improving emotional regulation for adults with ASD, with ASD participants achieving significantly better outcomes than non-ASD individuals. RO-DBT was particularly effective in reducing maladaptive overcontrol, though high attrition and a predominantly White British sample limited generalizability. At end-of-treatment, DBT significantly reduced suicidal ideation (z = −2.24; p = 0.025) and suicide attempts (z = −3.15; p = 0.002) compared to treatment as usual (TAU), although this effect did not sustain at 12 months. Depression severity decreased with DBT (z = −1.99; p = 0.046), maintaining significance at follow-up (z = −2.46; p = 0.014). No significant effects were observed for social anxiety, and two suicides occurred in the TAU group. Conclusions: DBT and RO-DBT show potential efficacy in reducing emotional dysregulation, suicidality, and depression in adults with ASD, though the effects on suicidality may diminish over time. High dropout rates and limited sample diversity suggest further research is needed to confirm long-term benefits and improve applicability across broader populations.

Keywords: dialectical behaviour therapy, emotional dysregulation, autism spectrum disorder, suicidality

Procedia PDF Downloads 4
1130 Glaucoma Detection in Retinal Tomography Using the Vision Transformer

Authors: Sushish Baral, Pratibha Joshi, Yaman Maharjan

Abstract:

Glaucoma is a chronic eye condition that causes irreversible vision loss. Early detection and treatment are critical to preventing vision loss because the condition can be asymptomatic. Multiple deep learning algorithms are used for the identification of glaucoma. Transformer-based architectures, which use the self-attention mechanism to encode long-range dependencies and acquire extremely expressive representations, have recently become popular. Convolutional architectures, on the other hand, lack knowledge of long-range dependencies in the image due to their intrinsic inductive biases. The aforementioned statements inspire this thesis to look at transformer-based solutions and investigate the viability of adopting transformer-based network designs for glaucoma detection. Using retinal fundus images of the optic nerve head to develop a viable algorithm for assessing the severity of glaucoma necessitates a large number of well-curated images. Initially, data are generated by augmenting the ocular images. The ocular images are then pre-processed to make them ready for further processing. The system is trained using the pre-processed images, and it classifies the input images as normal or glaucoma based on the features retrieved during training. The Vision Transformer (ViT) architecture is well suited to this task, as its self-attention mechanism allows structural modeling of the image. Extensive experiments are run on a common dataset, and the results are thoroughly validated and visualized.
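
A minimal Vision-Transformer-style classifier sketch is given below as an illustration of the architecture family, not the trained model from this work; all layer sizes and hyper-parameters are assumed.

```python
# Tiny ViT-style classifier in PyTorch: patch embedding, class token,
# transformer encoder, and a normal-vs-glaucoma head.
import torch
import torch.nn as nn

class TinyViT(nn.Module):
    def __init__(self, img_size=224, patch=16, dim=192, depth=6, heads=6, classes=2):
        super().__init__()
        n_patches = (img_size // patch) ** 2
        self.patch_embed = nn.Conv2d(3, dim, kernel_size=patch, stride=patch)
        self.cls_token = nn.Parameter(torch.zeros(1, 1, dim))
        self.pos_embed = nn.Parameter(torch.zeros(1, n_patches + 1, dim))
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=heads,
                                           dim_feedforward=4 * dim,
                                           batch_first=True, norm_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=depth)
        self.head = nn.Linear(dim, classes)

    def forward(self, x):                            # x: (B, 3, H, W)
        tokens = self.patch_embed(x)                 # (B, dim, H/p, W/p)
        tokens = tokens.flatten(2).transpose(1, 2)   # (B, n_patches, dim)
        cls = self.cls_token.expand(x.size(0), -1, -1)
        tokens = torch.cat([cls, tokens], dim=1) + self.pos_embed
        tokens = self.encoder(tokens)                # self-attention over patches
        return self.head(tokens[:, 0])               # classify from the class token

model = TinyViT()
dummy_fundus = torch.randn(4, 3, 224, 224)           # a batch of pre-processed images
logits = model(dummy_fundus)
print(logits.shape)                                  # torch.Size([4, 2])
```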

Keywords: glaucoma, vision transformer, convolutional architectures, retinal fundus images, self-attention, deep learning

Procedia PDF Downloads 189
1129 Rescaling Global Health and International Relations: Globalization of Health in a Low Security Environment

Authors: F. Argurio, F. G. Vaccaro

Abstract:

In a global environment defined by ever-increasing health issues, in spite of the progress made by modern medicine, this paper seeks to readdress the question of global health from an international relations perspective. The research hypothesis is: the lower the security environment, the higher the spread of communicable diseases. This question is channeled by re-scaling the connotations of the 'global' and 'international' dimensions through the theoretical lens of glocalization, a theory by Bauman that starts its analysis from simple systems and proceeds to the most complex ones. Glocalization theory is operationalized by analyzing health in an armed-conflict context. In this respect, the independent variable 'low security environment' translates into the cases of Syria and Yemen, which provide a clear example of the all-encompassing effect of conflict on national health and of its consequences for regional development. In fact, Syria and Yemen have been affected by poliomyelitis and cholera outbreaks, respectively. The dependent variable is constructed on these communicable diseases, which belong to the families of sanitation-related and vaccine-preventable diseases. The research is both qualitative and quantitative, based on primary (interviews) and secondary (WHO and other NGO reports) sources. The methodology is based on the assessment of vaccine coverage and on case analysis in time and space using epidemiological data; the functioning and efficiency of local health facilities are also studied. The article posits that the intervention of international organizations and their cooperation with local authorities become crucial to providing local populations with their primary health needs. In Yemen, the majority of fatal cholera cases were in regions controlled by the Houthi rebels, not officially accredited by the international community. Similarly, the polio outbreak in Syria primarily affected areas not controlled by the Syrian Arab Republic forces, recognized as the leading interlocutor by the WHO. The jeopardized possibility of accessing these areas has been pivotal in creating the problem of controlling sanitation-related and vaccine-preventable diseases. This represents a potential threat to global health.

Keywords: health in conflict-affected areas, cholera, polio, Yemen, Syria, glocalization

Procedia PDF Downloads 133
1128 Real-Time Pedestrian Detection Method Based on Improved YOLOv3

Authors: Jingting Luo, Yong Wang, Ying Wang

Abstract:

Pedestrian detection in image or video data is a very important and challenging task in security surveillance. The difficulty of this task is to accurately locate and detect pedestrians of different scales in complex scenes. To solve these problems, a deep neural network (RT-YOLOv3) is proposed to realize real-time pedestrian detection at different scales in security monitoring. RT-YOLOv3 improves the traditional YOLOv3 algorithm. Firstly, a deep residual network is added to extract pedestrian features. Then, six convolutional neural networks with different scales are designed and fused with the corresponding scale feature maps in the residual network to form the final feature pyramid for the pedestrian detection task. This method characterizes pedestrians better. In order to further improve the accuracy and generalization ability of the model, a hybrid pedestrian dataset training method is used, extracting pedestrian data from the VOC dataset and training together with the INRIA pedestrian dataset. Experiments show that the proposed RT-YOLOv3 method achieves a mAP (mean average precision) of 93.57% and a speed of 46.52 frames per second. In terms of accuracy, RT-YOLOv3 performs better than Fast R-CNN, Faster R-CNN, YOLO, SSD, YOLOv2, and YOLOv3. The method reduces the missed detection rate and false detection rate, improves positioning accuracy, and meets the requirements of real-time detection of pedestrian objects.
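
A small sketch of the evaluation metric quoted above (average precision at an IoU threshold of 0.5) follows; it illustrates the metric, not the RT-YOLOv3 network, and the boxes are invented examples.

```python
# IoU matching and PASCAL-VOC-style 11-point average precision (illustrative).
import numpy as np

def iou(a, b):
    """IoU of two boxes given as [x1, y1, x2, y2]."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter + 1e-9)

def average_precision(detections, truths, thr=0.5):
    """detections: list of (score, box); truths: list of boxes."""
    detections = sorted(detections, key=lambda d: -d[0])
    matched = [False] * len(truths)
    tp, fp = [], []
    for score, box in detections:
        ious = [iou(box, t) for t in truths]
        best = int(np.argmax(ious)) if ious else -1
        if best >= 0 and ious[best] >= thr and not matched[best]:
            matched[best] = True
            tp.append(1); fp.append(0)
        else:
            tp.append(0); fp.append(1)
    tp, fp = np.cumsum(tp), np.cumsum(fp)
    recall = tp / max(len(truths), 1)
    precision = tp / np.maximum(tp + fp, 1e-9)
    # 11-point interpolation, as in the PASCAL VOC protocol
    return float(np.mean([precision[recall >= r].max() if np.any(recall >= r) else 0.0
                          for r in np.linspace(0, 1, 11)]))

truths = [[10, 10, 60, 120], [200, 40, 260, 160]]
detections = [(0.92, [12, 12, 58, 118]), (0.80, [198, 45, 255, 150]),
              (0.30, [300, 300, 340, 380])]
print(f"AP@0.5 for the pedestrian class: {average_precision(detections, truths):.3f}")
```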

Keywords: pedestrian detection, feature detection, convolutional neural network, real-time detection, YOLOv3

Procedia PDF Downloads 141
1127 Efficient Frequent Itemset Mining Methods over Real-Time Spatial Big Data

Authors: Hamdi Sana, Emna Bouazizi, Sami Faiz

Abstract:

In recent years, there has been a huge increase in the use of spatio-temporal applications where data and queries are continuously moving. As a result, the need to process real-time spatio-temporal data is clear, and real-time stream data management has become a hot topic. The sliding window model and frequent itemset mining over dynamic data are among the most important problems in the context of data mining. The sliding window model is widely used for frequent itemset mining over data streams due to its emphasis on recent data and its bounded memory requirement. Existing methods use the traditional transaction-based sliding window model, where the window size is based on a fixed number of transactions. This model assumes that all transactions arrive at a constant rate, which is not suited to real-time applications, and using it in such applications endangers their performance. Based on these observations, this paper relaxes the notion of window size and proposes the use of a timestamp-based sliding window model. In our proposed frequent itemset mining algorithm, support conditions are used to differentiate frequent from infrequent patterns. Thereafter, a tree is developed to incrementally maintain the essential information. We evaluate our contribution, and the preliminary results are quite promising.
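
A small sketch of the timestamp-based window idea follows; it is illustrative and does not reproduce the paper's tree-based algorithm. The minimum-support threshold and the toy stream are assumed values.

```python
# Timestamp-based sliding window: evict by age in seconds, not by a fixed
# transaction count, and recompute itemset supports over the live window.
from collections import deque
from itertools import combinations

WINDOW_SECONDS = 60
MIN_SUPPORT = 0.5          # fraction of transactions in the current window

window = deque()           # (timestamp, frozenset of items)

def add_transaction(ts, items):
    window.append((ts, frozenset(items)))
    # evict transactions older than the time-based window
    while window and window[0][0] <= ts - WINDOW_SECONDS:
        window.popleft()

def frequent_itemsets(max_size=2):
    n = len(window)
    counts = {}
    for _, items in window:
        for size in range(1, max_size + 1):
            for combo in combinations(sorted(items), size):
                counts[combo] = counts.get(combo, 0) + 1
    return {k: v / n for k, v in counts.items() if v / n >= MIN_SUPPORT}

# toy spatio-temporal stream: (seconds, items observed in a region)
stream = [(0, {"cafe", "park"}), (20, {"cafe", "museum"}),
          (45, {"cafe", "park"}), (70, {"park", "museum"}),
          (95, {"cafe", "park"})]
for ts, items in stream:
    add_transaction(ts, items)
    print(f"t={ts:3d}s window={len(window)} frequent={frequent_itemsets()}")
```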

Keywords: real-time spatial big data, frequent itemset, transaction-based sliding window model, timestamp-based sliding window model, weighted frequent patterns, tree, stream query

Procedia PDF Downloads 160
1126 Finding the Longest Common Subsequence in Normal DNA and Disease Affected Human DNA Using Self Organizing Map

Authors: G. Tamilpavai, C. Vishnuppriya

Abstract:

Bioinformatics is an active research area that combines biological subject matter with computer science research. The longest common subsequence (LCSS) is one of the major challenges in various bioinformatics applications. The computation of the LCSS plays a vital role in biomedicine and is an essential task in DNA sequence analysis in genetics, including a wide range of disease-diagnosis steps. The objective of the proposed system is to find the longest common subsequence present in normal and various disease-affected human DNA sequences using a Self Organizing Map (SOM) and the LCSS. The human DNA sequences are collected from the National Center for Biotechnology Information (NCBI) database. Initially, each human DNA sequence is separated into k-mers using a k-mer separation rule. Mean and median values are calculated for each separated k-mer. These calculated values are fed as input to the Self Organizing Map for clustering. The obtained clusters are then given to the longest common subsequence (LCSS) algorithm to find the common subsequences present in each cluster. It returns n×(n−1)/2 subsequences for each cluster, where n is the number of k-mers in a specific cluster. Experimental outcomes of the proposed system produce the possible number of longest common subsequences of normal and disease-affected DNA data. Thus, the proposed system will be a good initial aid for finding disease-causing sequences. Finally, a performance analysis is carried out for different DNA sequences. The obtained values show that the retrieval of the LCSS is done in a shorter time than in the existing system.
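
A small sketch of the longest-common-subsequence step (standard dynamic programming) is shown below, applied to short invented DNA fragments rather than NCBI data or SOM cluster members.

```python
# Classic LCS dynamic programming with backtracking.
def lcs(a: str, b: str) -> str:
    m, n = len(a), len(b)
    # dp[i][j] = length of the LCS of a[:i] and b[:j]
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if a[i - 1] == b[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    # backtrack to recover one longest common subsequence
    out, i, j = [], m, n
    while i > 0 and j > 0:
        if a[i - 1] == b[j - 1]:
            out.append(a[i - 1])
            i, j = i - 1, j - 1
        elif dp[i - 1][j] >= dp[i][j - 1]:
            i -= 1
        else:
            j -= 1
    return "".join(reversed(out))

normal_kmer = "ATGCCGTAAGCT"     # invented example fragments
disease_kmer = "ATGCGTTAGGCT"
common = lcs(normal_kmer, disease_kmer)
print(common, len(common))
```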

Keywords: clustering, k-mers, longest common subsequence, SOM

Procedia PDF Downloads 265
1125 Prospective Validation of the FibroTest Score in Assessing Liver Fibrosis in Hepatitis C Infection with Genotype 4

Authors: G. Shiha, S. Seif, W. Samir, K. Zalata

Abstract:

FibroTest (FT) is a non-invasive score of liver fibrosis that combines the quantitative results of 5 serum biochemical markers (alpha-2-macroglobulin, haptoglobin, apolipoprotein A1, gamma glutamyl transpeptidase (GGT), and bilirubin), adjusted for the patient's age and sex in a patented algorithm, to generate a measure of fibrosis. FT has been validated in patients with chronic hepatitis C (CHC) (Halfon et al., Gastroenterol. Clin. Biol. (2008), 32, 6 suppl 1, 22-39). The validation of FibroTest (FT) in genotype 4 is not well studied. Our aim was to evaluate the performance of FibroTest in an independent prospective cohort of hepatitis C patients with genotype 4. Subjects were 122 patients with CHC. All liver biopsies were scored using the METAVIR system. The fibrosis score (FT) was measured, and the performance of the cut-off score was assessed using a ROC curve. Among patients with advanced fibrosis, the FT matched the liver biopsy exactly in 18.6% of cases, overestimated the stage of fibrosis in 44.2%, and underestimated the stage of fibrosis in 37.7% of cases. In patients with no or mild fibrosis, an exact match was detected in 39.2% of cases, with overestimation in 48.1% and underestimation in 12.7%. Overall, the test showed exact matching, overestimation, and underestimation in 32%, 46.7%, and 21.3% of cases, respectively. Using the ROC curve, it was found that the FT at a cut-off point of 0.555 could discriminate early from advanced stages of fibrosis with an area under the ROC curve (AUC) of 0.72, sensitivity of 65%, specificity of 69%, PPV of 68%, NPV of 66%, and accuracy of 67%. As the FibroTest score overestimates the stage of advanced fibrosis, it should not be considered a reliable surrogate for liver biopsy in hepatitis C infection with genotype 4.
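
A small sketch of the ROC analysis described above follows, using invented scores and labels rather than the study data: it computes the AUC and the sensitivity, specificity, and accuracy obtained at an assumed cut-off of 0.555.

```python
# AUC plus sensitivity/specificity/accuracy at a fixed cut-off (illustrative).
import numpy as np
from sklearn.metrics import roc_auc_score

# 1 = advanced fibrosis on biopsy, 0 = no/mild fibrosis; ft = FibroTest score
biopsy = np.array([0, 0, 0, 0, 1, 1, 1, 0, 1, 1, 0, 1])
ft = np.array([0.20, 0.41, 0.62, 0.35, 0.70, 0.58,
               0.49, 0.30, 0.81, 0.66, 0.52, 0.44])

auc = roc_auc_score(biopsy, ft)

cutoff = 0.555
pred = (ft >= cutoff).astype(int)
tp = np.sum((pred == 1) & (biopsy == 1))
tn = np.sum((pred == 0) & (biopsy == 0))
fp = np.sum((pred == 1) & (biopsy == 0))
fn = np.sum((pred == 0) & (biopsy == 1))

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
accuracy = (tp + tn) / len(biopsy)
print(f"AUC={auc:.2f} sensitivity={sensitivity:.2f} "
      f"specificity={specificity:.2f} accuracy={accuracy:.2f}")
```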

Keywords: fibrotest, chronic Hepatitis C, genotype 4, liver biopsy

Procedia PDF Downloads 413
1124 Health Trajectory Clustering Using Deep Belief Networks

Authors: Farshid Hajati, Federico Girosi, Shima Ghassempour

Abstract:

We present a Deep Belief Network (DBN) method for clustering health trajectories. A Deep Belief Network (DBN) is a deep architecture that consists of a stack of Restricted Boltzmann Machines (RBMs). In a deep architecture, each layer learns more complex features than the previous layers. The proposed method relies on the DBN for clustering without using the backpropagation learning algorithm. The proposed DBN has better performance than a deep neural network due to the initialization of the connecting weights. We use the Contrastive Divergence (CD) method for training the RBMs, which increases the performance of the network. The performance of the proposed method is evaluated extensively on the Health and Retirement Study (HRS) database. The University of Michigan Health and Retirement Study (HRS) is a nationally representative longitudinal study that has surveyed more than 27,000 elderly and near-elderly Americans since its inception in 1992. Participants are interviewed every two years, and data are collected on physical and mental health, insurance coverage, financial status, family support systems, labor market status, and retirement planning. The dataset is publicly available, and we use the RAND HRS version L, which is an easy-to-use and cleaned-up version of the data. The size of the sample data set is 268, and the length of the trajectories is 10. The trajectories do not stop when a patient dies and represent 10 different interviews of living patients. Compared to state-of-the-art benchmarks, the experimental results show the effectiveness and superiority of the proposed method in clustering health trajectories.
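
A compact sketch of one building block of the DBN follows: a Bernoulli RBM trained with one step of Contrastive Divergence (CD-1). The trajectory encoding, layer size, and learning settings are illustrative assumptions, not the HRS configuration used in the paper.

```python
# Bernoulli RBM trained with CD-1 in NumPy (illustrative stack element of a DBN).
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_rbm(data, n_hidden=16, lr=0.05, epochs=200):
    n_visible = data.shape[1]
    W = 0.01 * rng.standard_normal((n_visible, n_hidden))
    b_v = np.zeros(n_visible)
    b_h = np.zeros(n_hidden)
    for _ in range(epochs):
        v0 = data
        # positive phase
        p_h0 = sigmoid(v0 @ W + b_h)
        h0 = (rng.random(p_h0.shape) < p_h0).astype(float)
        # negative phase: one Gibbs step (CD-1)
        p_v1 = sigmoid(h0 @ W.T + b_v)
        p_h1 = sigmoid(p_v1 @ W + b_h)
        # contrastive divergence gradient approximation
        W += lr * (v0.T @ p_h0 - p_v1.T @ p_h1) / len(data)
        b_v += lr * np.mean(v0 - p_v1, axis=0)
        b_h += lr * np.mean(p_h0 - p_h1, axis=0)
    return W, b_v, b_h

# toy binary "health trajectories": 268 subjects x 10 interviews (binarised flag)
trajectories = (rng.random((268, 10)) < 0.4).astype(float)
W, b_v, b_h = train_rbm(trajectories)

# hidden representation of each trajectory; in a DBN this would feed the next
# stacked RBM and could then be clustered (e.g. with k-means)
hidden_repr = sigmoid(trajectories @ W + b_h)
print(hidden_repr.shape)
```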

Keywords: health trajectory, clustering, deep learning, DBN

Procedia PDF Downloads 368
1123 Comparison of Two Anesthetic Methods during Interventional Neuroradiology Procedure: Propofol versus Sevoflurane Using Patient State Index

Authors: Ki Hwa Lee, Eunsu Kang, Jae Hong Park

Abstract:

Background: Interventional neuroradiology (INR) has been a rapidly growing and evolving part of neurosurgery during the past few decades. Sevoflurane and propofol are both suitable anesthetics for INR procedures. Monitoring of the depth of anesthesia is used very widely. The SEDLine™ monitor, a 4-channel processed EEG monitor, uses a proprietary algorithm to analyze the raw EEG signal and displays Patient State Index (PSI) values. Only a few studies have examined the PSI in neuro-anesthesia. We aimed to investigate differences in PSI values and hemodynamic variables between sevoflurane and propofol anesthesia during INR procedures. Methods: We retrospectively reviewed the medical records of patients scheduled to undergo embolization of a non-ruptured intracranial aneurysm by a single operator from May 2013 to December 2014. Sixty-five patients were categorized into two groups: sevoflurane (n = 33) and propofol (n = 32). The PSI values, hemodynamic variables, and the use of hemodynamic drugs were analyzed. Results: Significant differences were seen between PSI values obtained during different perioperative stages in both groups (P < 0.0001). The PSI values of the propofol group were lower than those of the sevoflurane group during the INR procedure (P < 0.01). The patients in the propofol group had a longer time to extubation and a greater phenylephrine requirement than those in the sevoflurane group (p < 0.05). Anti-hypertensive drugs were administered more often during extubation in the sevoflurane group (p < 0.05). Conclusions: The PSI can detect the depth of anesthesia and changes in anesthetic concentration during INR procedures. Extubation was faster in the sevoflurane group, but a smoother recovery was seen in the propofol group.

Keywords: interventional neuroradiology, patient state index, propofol, sevoflurane

Procedia PDF Downloads 179
1122 Modelling Biological Treatment of Dye Wastewater in SBR Systems Inoculated with Bacteria by Artificial Neural Network

Authors: Yasaman Sanayei, Alireza Bahiraie

Abstract:

This paper presents a systematic methodology based on the application of artificial neural networks to a sequencing batch reactor (SBR). The SBR is a fill-and-draw biological wastewater technology that is especially suited for nutrient removal. Treating reactive dye with Sphingomonas paucimobilis bacteria in a sequencing batch reactor is a novel approach to dye removal. The influent COD, MLVSS, and reaction time were selected as the process inputs and the effluent COD and BOD as the process outputs. The best possible result for the discrete pole parameter was a = 0.44. In order to adjust the parameters of the ANN, the Levenberg-Marquardt (LM) algorithm was employed. The results predicted by the model were compared to the experimental data and showed a high correlation, with R² > 0.99, and a low mean absolute error (MAE). The results of this study reveal that the developed model is accurate and efficacious in predicting the COD and BOD parameters of dye-containing wastewater treated by the SBR. The proposed modeling approach can be applied to other industrial wastewater treatment systems to predict effluent characteristics. Note that SBRs are normally operated with a constant predefined duration of the stages, which results in low-efficiency operation. Data obtained from the on-line electronic sensors installed in the SBR and from the quality control laboratory analysis have been used to develop the optimal architecture of two different ANNs. The results have shown that the developed models can be used as efficient and cost-effective predictive tools for the system analysed.
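
A small sketch of the data-driven modelling idea follows, as an illustration only: a feed-forward network maps influent COD, MLVSS, and reaction time to effluent COD and BOD on synthetic placeholder data, and scikit-learn's L-BFGS optimiser stands in for the Levenberg-Marquardt training used in the paper, since scikit-learn does not provide LM.

```python
# MLP regression of effluent COD/BOD from influent COD, MLVSS and reaction time
# (synthetic data; L-BFGS used as a stand-in for Levenberg-Marquardt).
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n = 300
influent_cod = rng.uniform(300, 1200, n)       # mg/L
mlvss = rng.uniform(1500, 4500, n)             # mg/L
reaction_time = rng.uniform(4, 24, n)          # h

# synthetic "true" responses standing in for SBR measurements
effluent_cod = influent_cod * np.exp(-0.004 * mlvss * reaction_time / 1000) \
               + rng.normal(0, 10, n)
effluent_bod = 0.45 * effluent_cod + rng.normal(0, 5, n)

X = np.column_stack([influent_cod, mlvss, reaction_time])
y = np.column_stack([effluent_cod, effluent_bod])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

scaler = StandardScaler().fit(X_tr)
model = MLPRegressor(hidden_layer_sizes=(8,), solver="lbfgs",
                     max_iter=5000, random_state=0)
model.fit(scaler.transform(X_tr), y_tr)

r2 = model.score(scaler.transform(X_te), y_te)
mae = np.mean(np.abs(model.predict(scaler.transform(X_te)) - y_te))
print(f"R^2={r2:.3f}  MAE={mae:.1f} mg/L")
```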

Keywords: artificial neural network, COD removal, SBR, Sphingomonas paucimobilis

Procedia PDF Downloads 412
1121 Combat Capability Improvement Using Sleep Analysis

Authors: Gabriela Kloudova, Miloslav Stehlik, Peter Sos

Abstract:

The quality of sleep can affect combat performance, where vigilance, accuracy, and reaction time are decisive factors. In the present study, members of airborne and special units are measured on duty using an actigraphy fingerprint scoring algorithm and QEEG (quantitative EEG). The actigraphic variables of interest are: mean nightly sleep duration, mean napping duration, mean 24-h sleep duration, mean sleep latency, mean sleep maintenance efficiency, mean sleep fragmentation index, mean sleep onset time, mean sleep offset time, and mean midpoint time. To determine the individual somnotype of each subject, data such as sleep pattern, chronotype (morning and evening lateness), biological need for sleep (daytime and anytime sleepability), and trototype (daytime and anytime wakeability) will be extracted. Subsequently, a series of recommendations will be included in the training plan, covering daily routine, timing of day and night activities, duration of sleep, and the number of sleep blocks in a defined time. The aim of these modifications to the training plan is to reduce daytime sleepiness; improve vigilance, attention, accuracy, and speed on the conducted tasks; and optimize energy supplies. Regular improvement in training is expected to have long-term neurobiological consequences, including changes in neuronal activity measured by QEEG. That, in turn, should enhance cognitive functioning, assessed in subjects with digital cognitive test batteries, and improve their overall performance.
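
Two of the actigraphic variables listed above could be computed from a per-epoch sleep/wake series along the lines of the sketch below. The actual actigraph scoring algorithm is proprietary and the definitions used here (maintenance efficiency as sleep time over the onset-to-offset period, fragmentation as sleep-to-wake transitions per hour of sleep) are illustrative assumptions, as is the placeholder night of data.

```python
# Hedged illustration of two actigraphic metrics from a binary sleep/wake epoch series.
import numpy as np

def sleep_metrics(epochs: np.ndarray, epoch_minutes: float = 1.0) -> dict:
    """epochs: 1 = scored asleep, 0 = scored awake, from sleep onset to offset."""
    total_period = len(epochs) * epoch_minutes
    total_sleep = epochs.sum() * epoch_minutes
    # Maintenance efficiency: share of the sleep period actually spent asleep
    efficiency = total_sleep / total_period if total_period else 0.0
    # Fragmentation: sleep-to-wake transitions per hour of sleep
    transitions = np.count_nonzero(np.diff(epochs) == -1)
    fragmentation = transitions / (total_sleep / 60) if total_sleep else 0.0
    return {"sleep_efficiency": efficiency, "fragmentation_per_hour": fragmentation}

night = np.array([1] * 120 + [0] * 10 + [1] * 200 + [0] * 5 + [1] * 100)  # placeholder night
print(sleep_metrics(night))
```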

Keywords: sleep quality, combat performance, actigraph, somnotype

Procedia PDF Downloads 164
1120 Efficient Credit Card Fraud Detection Based on Multiple ML Algorithms

Authors: Neha Ahirwar

Abstract:

In the contemporary digital era, the rise of credit card fraud poses a significant threat to both financial institutions and consumers. As fraudulent activities become more sophisticated, the demand for robust and effective fraud detection mechanisms grows, and advanced machine learning algorithms have become crucial tools in addressing this challenge. This paper examines the design and evaluation of a credit card fraud detection system using four prominent machine learning algorithms: random forest, logistic regression, decision tree, and XGBoost. The surge in digital transactions has opened avenues for fraudsters to exploit vulnerabilities within payment systems, creating an urgent need for proactive and adaptable fraud detection systems. This study addresses that need by exploring the efficacy of machine learning algorithms in identifying fraudulent credit card transactions. Random forest, logistic regression, decision tree, and XGBoost were selected because of their documented effectiveness in diverse domains, particularly in credit card fraud detection, and their capability to model intricate patterns and provide accurate predictions. Each algorithm is implemented and evaluated in a controlled environment on a diverse dataset comprising both genuine and fraudulent credit card transactions.
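
A comparison of the four named models could be set up along the lines of the following sketch. The data here are synthetic and heavily imbalanced as a stand-in for a real transactions table, the hyperparameters are illustrative defaults rather than the paper's settings, and xgboost is an external dependency.

```python
# Hedged sketch: train and score the four classifiers named in the abstract
# on a synthetic, imbalanced stand-in for genuine vs. fraudulent transactions.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score
from xgboost import XGBClassifier

X, y = make_classification(n_samples=5000, n_features=20, weights=[0.97, 0.03],
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

models = {
    "random_forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "logistic_regression": LogisticRegression(max_iter=1000),
    "decision_tree": DecisionTreeClassifier(random_state=0),
    "xgboost": XGBClassifier(eval_metric="logloss", random_state=0),
}
for name, clf in models.items():
    clf.fit(X_tr, y_tr)
    print(f"{name}: F1 = {f1_score(y_te, clf.predict(X_te)):.3f}")  # F1 suits class imbalance
```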

Keywords: efficient credit card fraud detection, random forest, logistic regression, XGBoost, decision tree

Procedia PDF Downloads 64
1119 Time to Second Line Treatment Initiation Among Drug-Resistant Tuberculosis Patients in Nepal

Authors: Shraddha Acharya, Sharad Kumar Sharma, Ratna Bhattarai, Bhagwan Maharjan, Deepak Dahal, Serpahine Kaminsa

Abstract:

Background: Drug-resistant (DR) tuberculosis (TB) continues to be a threat in Nepal, with an estimated 2,800 new cases every year. Treatment of DR-TB with second-line TB drugs is complex, takes longer, and has a comparatively lower treatment success rate than treatment of drug-susceptible TB. Delays in treatment initiation for DR-TB patients may further result in unfavorable treatment outcomes and increased transmission. This study therefore aims to determine the median time to initiation of second-line treatment among patients diagnosed with rifampicin-resistant (RR) TB and to assess the proportion of treatment delays among the various types of DR-TB cases. Method: A retrospective cohort study was conducted using national routine electronic data (DRTB and TB Laboratory Patient Tracking System-DHIS2) on drug-resistant tuberculosis patients between January 2020 and December 2022. The time to treatment initiation was computed as the number of days from first diagnosis of RR TB by Xpert MTB/RIF test to enrollment on second-line treatment. Treatment delay was defined as initiation more than 7 days after diagnosis. Results: Among the RR TB cases diagnosed via Xpert nationwide (N = 954), 61.4% were enrolled under the shorter treatment regimen (STR), 33.0% under the longer treatment regimen (LTR), 5.1% under treatment for pre-extensively drug-resistant TB (Pre-XDR), and 0.4% under treatment for extensively drug-resistant TB (XDR). Among these cases, the median time from diagnosis to treatment initiation was 6 days (IQR: 2-15.8). The median time was 5 days (IQR: 2.0-13.3) for STR, 6 days (IQR: 3.0-15.0) for LTR, 30 days (IQR: 5.5-66.8) for Pre-XDR, and 4 days (IQR: 2.5-9.0) for XDR TB cases. Overall, treatment delay (>7 days after diagnosis) was observed in 42.4% of patients; cases enrolled under Pre-XDR treatment contributed the most to delay (72.0%), followed by LTR (43.6%), STR (39.1%), and XDR (33.3%). Conclusion: Timely diagnosis and prompt treatment initiation remain a fundamental focus of the National TB Programme. The findings of this study, however, suggest gaps in the timeliness of treatment initiation for drug-resistant TB patients, which could lead to adverse treatment outcomes; in particular, there is an alarming delay in second-line treatment initiation for Pre-XDR TB patients. This study therefore generates evidence on existing gaps in treatment initiation and highlights the need for specific policies and interventions to create an effective linkage between RR TB diagnosis and enrollment on second-line TB treatment, with intensified follow-up by health providers and expansion of more decentralized, adequate, and accessible diagnostic and treatment services for DR-TB, especially for Pre-XDR TB cases, given the long treatment delays observed.
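
The timing computation described in the Method could be reproduced along the lines of the sketch below, assuming a patient register with diagnosis and enrollment dates plus a regimen column. The column names and the five example rows are hypothetical placeholders, not the DHIS2 field names or the study's records.

```python
# Hedged sketch: days from RR TB diagnosis to second-line treatment initiation,
# median/IQR per regimen, and proportion delayed (>7 days). Placeholder data.
import pandas as pd

df = pd.DataFrame({
    "regimen": ["STR", "STR", "LTR", "Pre-XDR", "XDR"],
    "diagnosis_date": pd.to_datetime(["2021-03-01", "2021-04-10", "2021-05-02",
                                      "2021-06-01", "2021-07-15"]),
    "treatment_start": pd.to_datetime(["2021-03-05", "2021-04-30", "2021-05-08",
                                       "2021-07-20", "2021-07-18"]),
})
df["days_to_treatment"] = (df["treatment_start"] - df["diagnosis_date"]).dt.days
df["delayed"] = df["days_to_treatment"] > 7          # delay defined as >7 days

summary = df.groupby("regimen")["days_to_treatment"].describe(percentiles=[0.25, 0.5, 0.75])
print(summary[["25%", "50%", "75%"]])                 # median and IQR bounds per regimen
print(df.groupby("regimen")["delayed"].mean())        # proportion delayed per regimen
```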

Keywords: drug-resistant, tuberculosis, treatment initiation, Nepal, treatment delay

Procedia PDF Downloads 82