Search results for: computer aided diagnosis
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4322

1352 Impact of Weather Conditions on Generalized Frequency Division Multiplexing over Gamma Gamma Channel

Authors: Muhammad Sameer Ahmed, Piotr Remlein, Tansal Gucluoglu

Abstract:

Generalized frequency division multiplexing (GFDM) applied over the free-space optical channel is a good candidate for implementing free-space optical communication systems. The technique has several strengths, e.g., good spectral efficiency, low peak-to-average power ratio (PAPR), adaptability, and low co-channel interference. In this paper, the impact of weather conditions such as haze, rain, and fog on GFDM over the gamma-gamma channel model is discussed. A trade-off between link distance and system performance under intense weather conditions is also analysed. The symbol error probability (SEP) of GFDM over the gamma-gamma turbulence channel is derived and verified with computer simulations.
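The gamma-gamma turbulence model referenced above treats the received irradiance as the product of two independent gamma-distributed factors. As an illustrative Monte Carlo sketch (the α and β values below are arbitrary; in practice they are derived from the Rytov variance for a given weather condition), one can verify the model's known scintillation index, σ² = 1/α + 1/β + 1/(αβ):

```python
import random

def gamma_gamma_samples(alpha, beta, n, seed=42):
    rng = random.Random(seed)
    # irradiance I = X * Y, both factors gamma-distributed with unit mean
    return [rng.gammavariate(alpha, 1.0 / alpha) * rng.gammavariate(beta, 1.0 / beta)
            for _ in range(n)]

alpha, beta = 4.0, 2.0  # illustrative turbulence parameters only
samples = gamma_gamma_samples(alpha, beta, 200_000)
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
si_sim = var / mean ** 2                            # scintillation index from samples
si_theory = 1 / alpha + 1 / beta + 1 / (alpha * beta)  # closed form
print(round(mean, 3), round(si_sim, 3), round(si_theory, 3))
```

The simulated mean stays near 1 (unit-mean normalization) and the empirical scintillation index matches the closed form, which is the usual sanity check before deriving SEP over such a channel.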

Keywords: free space optics, generalized frequency division multiplexing, weather conditions, gamma gamma distribution

Procedia PDF Downloads 156
1351 Methods Used to Perform Requirements Elicitation for FinTech Application Development

Authors: Zhao Pengcheng, Yin Siyuan

Abstract:

Fintech is a hot topic of the 21st century: a discipline that combines financial theory with computer modelling. It can provide both digital analysis methods for investment banks and investment decisions for users. Given the variety of services available, a superior method of requirements elicitation is needed to ensure that users' needs are addressed during software development. Because the accuracy of traditional requirements elicitation methods is insufficient, this study proposes a multi-perspective requirements elicitation framework. Methods such as combined interviews and questionnaires, card sorting, and model-driven elicitation are proposed. The collected results, analysed with principal component analysis (PCA), show that the new methods better support requirements elicitation. The approach still has some limitations and efficiency issues; nevertheless, this research provides a useful theoretical extension that offers researchers new methods and perspectives.

Keywords: requirement elicitation, FinTech, mobile application, survey, interview, model-driven

Procedia PDF Downloads 91
1350 Bioinformatic Prediction of Hub Genes by Analysis of Signaling Pathways, Transcriptional Regulatory Networks and DNA Methylation Pattern in Colon Cancer

Authors: Ankan Roy, Niharika, Samir Kumar Patra

Abstract:

Anomalous nexus of complex topological assemblies and spatiotemporal epigenetic choreography at chromosomal territories may form the most sophisticated regulatory layer of gene expression in cancer. Colon cancer is one of the leading malignant neoplasms of the lower gastrointestinal tract worldwide. There is still a paucity of information about the complex molecular mechanisms of colonic cancerogenesis. Bioinformatic prediction and analysis help to identify essential genes and significant pathways for monitoring and conquering this deadly disease. The present study investigates potential hub genes as biomarkers and effective therapeutic targets for colon cancer treatment. Colon cancer patient gene expression profile datasets, GSE44076, GSE20916, and GSE37364, were downloaded from the Gene Expression Omnibus (GEO) database and thoroughly screened using the GEO2R tool and Funrich software to find common differentially expressed genes (DEGs). Other approaches, including Gene Ontology (GO) and KEGG pathway analysis, protein-protein interaction (PPI) network construction and hub gene investigation, overall survival (OS) analysis, gene correlation analysis, methylation pattern analysis, and hub gene-transcription factor regulatory network construction, were performed and validated using various bioinformatics tools. Initially, we identified 166 DEGs, including 68 up-regulated and 98 down-regulated genes. Up-regulated genes are mainly associated with cytokine-cytokine receptor interaction, the IL17 signaling pathway, ECM-receptor interaction, focal adhesion, and the PI3K-Akt pathway. Down-regulated genes are enriched in metabolic pathways, retinol metabolism, steroid hormone biosynthesis, and bile secretion. From the protein-protein interaction network, thirty hub genes with high connectivity were selected using the MCODE and cytoHubba plugins.
Survival analysis, expression validation, correlation analysis, and methylation pattern analysis were further verified using TCGA data. Finally, we predicted COL1A1, COL1A2, COL4A1, SPP1, SPARC, and THBS2 as potential master regulators in colonic cancerogenesis. Moreover, our experimental data highlight that disruption of lipid rafts and the RAS/MAPK signaling cascade affects this gene hub at the mRNA level. We identified COL1A1, COL1A2, COL4A1, SPP1, SPARC, and THBS2 as determinant hub genes in colon cancer progression. They can be considered biomarkers for diagnosis and promising therapeutic targets in colon cancer treatment. Additionally, our experimental data suggest that signaling pathways act as a connecting link between the membrane hub and the gene hub.
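The screening step described above, intersecting DEGs across several GEO datasets, can be sketched as follows. This is an illustrative reconstruction rather than the authors' pipeline: the gene names and cut-offs (|log2FC| ≥ 1, adjusted p < 0.05 are common defaults) are assumptions.

```python
def screen_degs(results, lfc_cut=1.0, p_cut=0.05):
    """Split one dataset's results (gene -> (log2 fold change, adjusted p))
    into up- and down-regulated DEG sets."""
    up = {g for g, (lfc, p) in results.items() if lfc >= lfc_cut and p < p_cut}
    down = {g for g, (lfc, p) in results.items() if lfc <= -lfc_cut and p < p_cut}
    return up, down

# toy per-dataset results; real input would come from GEO2R exports
gse_a = {"COL1A1": (2.3, 0.001), "SPP1": (1.8, 0.004), "CYP2B6": (-2.1, 0.002)}
gse_b = {"COL1A1": (1.9, 0.010), "SPP1": (0.4, 0.300), "CYP2B6": (-1.6, 0.020)}

up_a, down_a = screen_degs(gse_a)
up_b, down_b = screen_degs(gse_b)
common_up = up_a & up_b      # consistently up-regulated across datasets
common_down = down_a & down_b
print(sorted(common_up), sorted(common_down))
```

Requiring genes to pass the cut-offs in every dataset is what makes the final DEG list robust to dataset-specific noise, the same motivation as intersecting GSE44076, GSE20916, and GSE37364 in the study.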

Keywords: hub genes, colon cancer, DNA methylation, epigenetic engineering, bioinformatic predictions

Procedia PDF Downloads 110
1349 Family Resilience of Children with Cancer: A Latent Profile Analysis

Authors: Bowen Li, Dan Shu, Shiguang Pang, Li Wang, Qian Liu

Abstract:

Background: Every year, approximately 429,000 adolescents aged 0-19 are diagnosed with cancer worldwide. The diagnosis brings substantial psychological pressure and caregiving responsibilities for family members and significantly impacts the family. Family resilience has been found to reduce caregiver distress and can also foster post-traumatic growth in cancer survivors. However, current research on family resilience in childhood cancer mainly focuses on individual caregiver resilience and child adaptation, with less attention given to categorizing family resilience among caregivers of children with cancer. Method: A total of 292 caregivers of children with cancer were recruited from four tertiary hospitals in central China from July 2022 to March 2024. This study was approved by the ethics committee, and participants provided informed consent, with the option to withdraw at any time. The Family Resilience Assessment Scale was used to measure family resilience among caregivers of children with cancer. The Quality of Life Scale-Family, the Perceived Social Support Scale, and the Connor-Davidson Resilience Scale were used to measure potential influencing factors. This study used latent profile analysis (LPA) to identify latent categories of family resilience among caregivers of children with cancer. Binary logistic regression was used to analyze the influencing factors of family resilience. Results: The results reveal two distinct categories: "high family resilience" and "low family resilience." The "low family resilience" group accounts for 85.96% of the sample, while the "high family resilience" group accounts for 14.04%. The "high family resilience" group scores higher across all dimensions than the "low family resilience" group. Within-group comparisons reveal that "family communication and problem-solving" and "empowering the meaning of adversity" received the highest scores, while "utilizing social and economic resources" scored the lowest.
"Maintaining a positive attitude" scores similarly high to "family communication and problem-solving" in the high family resilience group, whereas it scores similarly low to "utilizing social and economic resources" in the low family resilience group. In single-factor analysis, residence, number of siblings, caregiver's education level, resilience, social support, quality of life, physical well-being, and psychological well-being showed significant differences between the two categories. In binary logistic regression analysis, households with only one child were more likely to exhibit low family resilience, whereas high personal resilience was associated with high family resilience. Conclusion: Most families of children with cancer require strengthened family resilience. Support for utilizing socio-economic resources is important for families with both high and low family resilience. Single-child families and caregivers with lower resilience require more attention. These findings support the development of targeted interventions to enhance family resilience among families of children with cancer. Future studies could involve children and other family members for a more comprehensive understanding of family resilience. Longitudinal studies are necessary to explore the dynamic changes in family resilience throughout the cancer journey.

Keywords: cancer children, caregivers, family resilience, latent profile analysis
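The binary logistic regression reported in the abstract above can be illustrated with a minimal gradient-descent fit in plain Python. The data below are fabricated for demonstration only (coded as only-child status and a scaled personal-resilience score predicting membership in the low-resilience class); the study itself presumably used a statistical package, and the exact coding of its predictors is not given in the abstract.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(X, y, lr=0.5, epochs=4000):
    # plain batch gradient descent on the logistic log-loss
    n, d = len(X), len(X[0])
    w, b = [0.0] * d, 0.0
    for _ in range(epochs):
        gw, gb = [0.0] * d, 0.0
        for xi, yi in zip(X, y):
            err = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b) - yi
            for j in range(d):
                gw[j] += err * xi[j]
            gb += err
        w = [wj - lr * gj / n for wj, gj in zip(w, gw)]
        b -= lr * gb / n
    return w, b

# columns: [only_child (1 = yes), scaled personal resilience 0-1]; invented data
X = [[1, 0.2], [1, 0.3], [1, 0.4], [0, 0.7], [0, 0.8], [0, 0.6], [1, 0.5], [0, 0.9]]
y = [1, 1, 1, 0, 0, 0, 1, 0]   # 1 = "low family resilience" profile
w, b = fit_logistic(X, y)
print("coefficients:", [round(wj, 2) for wj in w])
```

On this toy data the only-child coefficient comes out positive (higher odds of low resilience) and the personal-resilience coefficient negative, mirroring the direction of the study's findings.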

Procedia PDF Downloads 19
1348 Efficient Heuristic Algorithm to Speed Up GraphCut in GPU for Image Stitching

Authors: Tai Nguyen, Minh Bui, Huong Ninh, Tu Nguyen, Hai Tran

Abstract:

The GraphCut algorithm has been widely utilized to solve various computer vision problems. Its expensive computational cost has encouraged many researchers to improve the speed of the algorithm. Recent works proposed schemes that work on parallel computing platforms such as CUDA. However, low convergence speed prevents the usage of GraphCut in real-time applications. In this paper, we propose a global suppression heuristic to boost the convergence of the algorithm. A parallel implementation of the GraphCut algorithm on CUDA, designed for the image stitching problem, is introduced. Our method achieves up to a 3× speed-up on a graph of size 80 × 480 compared to the best sequential GraphCut algorithm while producing satisfactory stitched images, suitable for panorama applications. Our source code will soon be available for further research.
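For reference, the maxflow/mincut computation at the heart of GraphCut-based seam finding can be sketched in pure Python with the Edmonds-Karp algorithm. The paper's contribution is a CUDA-parallel, heuristically accelerated version; this small sequential sketch on a hypothetical three-pixel overlap only illustrates the underlying optimization (edge weights stand in for color-difference costs across the seam):

```python
from collections import deque

def max_flow(cap, s, t):
    """Edmonds-Karp: repeatedly augment along BFS-shortest residual paths.
    cap is a dict-of-dicts of residual capacities, mutated in place."""
    flow = 0
    while True:
        parent = {s: None}
        q = deque([s])
        while q and t not in parent:
            u = q.popleft()
            for v, c in cap[u].items():
                if c > 0 and v not in parent:
                    parent[v] = u
                    q.append(v)
        if t not in parent:          # no augmenting path left: flow == min cut
            return flow
        path, v = [], t
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        bottleneck = min(cap[u][v] for u, v in path)
        for u, v in path:
            cap[u][v] -= bottleneck
            cap[v][u] = cap[v].get(u, 0) + bottleneck
        flow += bottleneck

# toy seam graph: source side of the overlap, three pixels, sink side
INF = 10 ** 9
cap = {
    "s":  {"p0": INF},
    "p0": {"p1": 5},
    "p1": {"p0": 5, "p2": 2},
    "p2": {"p1": 2, "t": INF},
    "t":  {},
}
result = max_flow(cap, "s", "t")
print(result)   # min-cut cost = cheapest seam through the overlap
```

By the max-flow/min-cut theorem, the returned flow equals the cost of the cheapest seam (here the p1-p2 edge of weight 2), which is exactly the quantity a GraphCut stitcher minimizes.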

Keywords: CUDA, graph cut, image stitching, texture synthesis, maxflow/mincut algorithm

Procedia PDF Downloads 110
1347 Gamification of a Business Intelligence Tool

Authors: Stephen Miller

Abstract:

The act of applying game mechanics and dynamics (traditionally used in video games) to business applications is being widely trialed in an effort to make conventional business software more participative, fun, and engaging. This new trend, named ‘gamification’, has its believers and, of course, its critics, who still need convincing that the concept is an effective and beneficial business tool worthy of investment. The literature reveals that user engagement with business intelligence (BI) tools is much lower than expected and that investors are failing to get a good return on their investment (ROI). A software prototype will therefore be designed and developed to add gamification to a BI tool and determine its effect on the user engagement levels of test participants. The experimental study will be evaluated using the comprehensive User Engagement Scale (UES) to see if there are improvements in areas such as aesthetics, perceived usability, endurability, novelty, felt involvement, and focused attention. The results of this study should demonstrate whether or not ‘gamifying’ a BI tool has the potential to increase an individual’s motivation to use BI software more often.

Keywords: business intelligence, gamification, human computer interaction, user engagement

Procedia PDF Downloads 568
1346 Quantitative Wide-Field Swept-Source Optical Coherence Tomography Angiography and Visual Outcomes in Retinal Artery Occlusion

Authors: Yifan Lu, Ying Cui, Ying Zhu, Edward S. Lu, Rebecca Zeng, Rohan Bajaj, Raviv Katz, Rongrong Le, Jay C. Wang, John B. Miller

Abstract:

Purpose: Retinal artery occlusion (RAO) is an ophthalmic emergency that can lead to poor visual outcomes and is associated with an increased risk of cerebral stroke and cardiovascular events. Fluorescein angiography (FA) is the traditional diagnostic tool for RAO; however, wide-field swept-source optical coherence tomography angiography (WF SS-OCTA), a nascent imaging technology, can provide quick, non-invasive angiographic information with a wide field of view. In this study, we looked for associations between OCT-A vascular metrics and visual acuity in patients with a prior diagnosis of RAO. Methods: Patients with diagnoses of central retinal artery occlusion (CRAO) or branch retinal artery occlusion (BRAO) were included. A 6 mm x 6 mm Angio and a 15 mm x 15 mm AngioPlex Montage OCT-A image were obtained for both eyes of each patient using the Zeiss Plex Elite 9000 WF SS-OCTA device. Each 6 mm x 6 mm image was divided into nine Early Treatment Diabetic Retinopathy Study (ETDRS) subfields. The average measurement of the central foveal subfield, inner ring, and outer ring was calculated for each parameter. Non-perfusion area (NPA) was manually measured using the 15 mm x 15 mm Montage images. A linear regression model was utilized to identify correlations between the imaging metrics and visual acuity. A p-value less than 0.05 was considered statistically significant. Results: Twenty-five subjects were included in the study. For RAO eyes, there was a statistically significant negative correlation between vision and retinal thickness, as well as superficial capillary plexus vessel density (SCP VD). A negative correlation was found between vision and deep capillary plexus vessel density (DCP VD) without statistical significance. There was a positive correlation between vision and choroidal thickness, as well as choroidal volume, without statistical significance.
No statistically significant correlation was found between vision and the above metrics in contralateral eyes. For NPA measurements, no significant correlation was found between vision and NPA. Conclusions: This is the first study, to the best of our knowledge, to investigate the utility of WF SS-OCTA in RAO and to demonstrate correlations between various retinal vascular imaging metrics and visual outcomes. Further investigations should explore the associations between these imaging findings and cardiovascular risk, as RAO patients are at elevated risk of symptomatic stroke. The results of this study provide a basis for understanding the structural changes involved in visual outcomes in RAO. Furthermore, they may help guide the management of RAO and the prevention of cerebral stroke and cardiovascular events in patients with RAO.
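The linear regression used to relate imaging metrics to acuity can be reproduced with a closed-form simple OLS fit. The numbers below are fabricated for illustration (a hypothetical SCP vessel density in % against logMAR acuity, where higher logMAR means worse vision); the study's actual data are not given in the abstract.

```python
def ols(x, y):
    # closed-form simple linear regression plus Pearson correlation
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx, sxy / (sxx * syy) ** 0.5

scp_vd = [30, 35, 40, 45, 50]          # hypothetical vessel density (%)
logmar = [1.0, 0.8, 0.5, 0.4, 0.1]     # hypothetical visual acuity (logMAR)
slope, intercept, r = ols(scp_vd, logmar)
print(round(slope, 3), round(r, 3))
```

A negative slope and strongly negative r on data like this is what a "significant negative correlation between vision and SCP VD" looks like when acuity is expressed in logMAR units.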

Keywords: OCTA, swept-source OCT, retinal artery occlusion, Zeiss Plex Elite

Procedia PDF Downloads 120
1345 Current Applications of Artificial Intelligence (AI) in Chest Radiology

Authors: Angelis P. Barlampas

Abstract:

Learning Objectives: The purpose of this study is to briefly inform the reader about the applications of AI in chest radiology. Background: Currently, there are 190 FDA-approved radiology AI applications, with 42 (22%) pertaining specifically to thoracic radiology. Imaging findings/procedure details: AI aids chest radiology in many ways. It detects and segments pulmonary nodules. It subtracts bone to provide an unobstructed view of the underlying lung parenchyma and provides further information on nodule characteristics, such as nodule location, two-dimensional size or three-dimensional (3D) volume, change in nodule size over time, attenuation data (i.e., mean, minimum, and/or maximum Hounsfield units [HU]), morphological assessments, or combinations of the above. It reclassifies indeterminate pulmonary nodules into low or high risk with higher accuracy than conventional risk models. It detects pleural effusion and differentiates tension pneumothorax from non-tension pneumothorax. It detects cardiomegaly, calcification, consolidation, mediastinal widening, atelectasis, fibrosis, and pneumoperitoneum. It automatically localizes vertebral segments, labels ribs, and detects rib fractures. It measures the distance from the tube tip to the carina and localizes both endotracheal tubes and central vascular lines. It detects consolidation and progression of parenchymal diseases such as pulmonary fibrosis or chronic obstructive pulmonary disease (COPD) and can evaluate lobar volumes. It identifies and labels pulmonary bronchi and vasculature, quantifies air-trapping, and offers emphysema evaluation. It provides functional respiratory imaging, whereby high-resolution CT images are post-processed to quantify airflow by lung region and may be used to quantify key biomarkers such as airway resistance, air-trapping, ventilation mapping, lung and lobar volume, and blood vessel and airway volume. It assesses the lung parenchyma by way of density evaluation.
It provides percentages of tissue within defined attenuation (HU) ranges, in addition to automated lung segmentation and lung volume information. It improves the quality of noisy images with a built-in denoising function. It detects emphysema, a common condition seen in patients with a history of smoking, and hyperdense or opacified regions, thereby aiding in the diagnosis of certain pathologies, such as COVID-19 pneumonia. It aids in cardiac segmentation and calcium detection, aorta segmentation and diameter measurements, and vertebral body segmentation and density measurements. Conclusion: The future is yet to come, but AI is already a helpful tool for daily practice in radiology. It is expected that the continuing progress of computerized systems and improvements in software algorithms will render AI the radiologist's second pair of hands.
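The density-evaluation step, reporting the percentage of lung voxels within defined HU ranges, is straightforward to sketch. The band boundaries below are illustrative (a cut-off near -950 HU is a commonly used emphysema index on inspiratory CT, but exact band definitions vary by vendor and protocol), and the voxel list is a toy stand-in for a segmented lung volume.

```python
def attenuation_fractions(hu_values, bands):
    """Fraction of voxels falling in each half-open HU band [lo, hi)."""
    n = len(hu_values)
    return {label: sum(lo <= v < hi for v in hu_values) / n
            for label, lo, hi in bands}

# toy voxel sample; a real segmented lung volume has millions of voxels
voxels = [-980, -960, -940, -890, -870, -820, -760, -400, -60, 40]
bands = [("emphysema (< -950 HU)", -1100, -950),
         ("aerated lung (-950 to -700 HU)", -950, -700),
         ("soft tissue (> -100 HU)", -100, 3000)]
res = attenuation_fractions(voxels, bands)
print(res)
```

Run over an automatically segmented lung mask, exactly this kind of histogram summary yields the density biomarkers (e.g., percent low-attenuation area) that the tools described above report.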

Keywords: artificial intelligence, chest imaging, nodule detection, automated diagnoses

Procedia PDF Downloads 49
1344 A 4-Month Low-carb Nutrition Intervention Study Aimed to Demonstrate the Significance of Addressing Insulin Resistance in 2 Subjects with Type-2 Diabetes for Better Management

Authors: Shashikant Iyengar, Jasmeet Kaur, Anup Singh, Arun Kumar, Ira Sahay

Abstract:

Insulin resistance (IR) is a condition in which cells in the body become less responsive to insulin, leading to higher levels of both insulin and glucose in the blood. This condition is linked to metabolic syndromes, including diabetes. It is crucial to address IR promptly after diagnosis to prevent the long-term complications associated with high insulin and high blood glucose. This four-month case study highlights the importance of treating the underlying condition to manage diabetes effectively. Insulin is essential for regulating blood sugar levels by facilitating the uptake of glucose into cells for energy or storage. In individuals with IR, cells are less efficient at taking up glucose from the blood, resulting in elevated blood glucose levels. As a result of IR, beta cells produce more insulin to compensate for the body's inability to use insulin effectively. This leads to high insulin levels, a condition known as hyperinsulinemia, which further impairs glucose metabolism and can contribute to various chronic diseases. In addition to regulating blood glucose, insulin has anti-catabolic effects, preventing the breakdown of molecules in the body: it inhibits glycogen breakdown in the liver, gluconeogenesis, and lipolysis. In a person who is insulin-sensitive or metabolically healthy, an optimal level of insulin prevents fat cells from releasing fat and promotes the storage of glucose and fat in the body. Optimal insulin levels are thus crucial for maintaining energy balance and play a key role in metabolic processes. During the four-month study, researchers examined the impact of a low-carb diet (LCD) intervention on two male individuals (A and B) with Type-2 diabetes. Although neither individual was obese, both were slightly overweight and had abdominal fat deposits.
Before the trial began, important markers such as fasting blood glucose (FBG), triglycerides (TG), high-density lipoprotein (HDL) cholesterol, and HbA1c were measured. These markers, their individual values, and their variability are integral to assessing metabolic health. The ratio of TG to HDL is used as a surrogate marker for IR; it correlates strongly with the prevalence of metabolic syndrome and with IR itself. It is a convenient measure because it can be calculated from a standard lipid profile and does not require more complex tests. In this four-month trial, an improvement in insulin sensitivity was observed through the TG/HDL ratio, which in turn improved fasting blood glucose levels and HbA1c. For subject A, HbA1c dropped from 13 to 6.28, and for subject B, it dropped from 9.4 to 5.7. During the trial, neither subject was taking any diabetic medications. The significant improvements in their health markers, such as better glucose control, along with increased energy levels, demonstrate that LCD interventions can effectively help manage diabetes.
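The TG/HDL surrogate marker is a one-line calculation from a standard lipid panel. The sketch below uses invented lipid values (not the subjects' actual results, which the abstract does not report), and the flag threshold is illustrative: published cut-offs vary roughly between 2.5 and 3.5 in mg/dL units and differ by population and unit system.

```python
def tg_hdl_ratio(tg_mg_dl, hdl_mg_dl):
    """TG/HDL ratio from a standard lipid profile (both in mg/dL)."""
    return tg_mg_dl / hdl_mg_dl

def flags_insulin_resistance(ratio, threshold=3.0):
    # threshold is illustrative only; cut-offs vary across studies and units
    return ratio >= threshold

baseline = tg_hdl_ratio(210, 38)   # hypothetical pre-intervention lipids
followup = tg_hdl_ratio(110, 48)   # hypothetical post-intervention lipids
print(round(baseline, 2), flags_insulin_resistance(baseline))
print(round(followup, 2), flags_insulin_resistance(followup))
```

Note the unit caveat: if TG and HDL are reported in mmol/L, the ratio takes different numerical cut-offs, so the threshold must match the unit system of the lipid panel.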

Keywords: metabolic disorder, insulin resistance, type-2 diabetes, low-carb nutrition

Procedia PDF Downloads 18
1343 Using Group Concept Mapping to Identify a Pharmacy-Based Trigger Tool to Detect Adverse Drug Events

Authors: Rodchares Hanrinth, Theerapong Srisil, Peeraya Sriphong, Pawich Paktipat

Abstract:

The trigger tool is a low-cost, low-tech method to detect adverse events through clues called triggers. The Institute for Healthcare Improvement (IHI) has developed the Global Trigger Tool for measuring and preventing adverse events. However, this tool is not specific for detecting adverse drug events (ADEs), so a pharmacy-based trigger tool is needed. Group concept mapping is an effective method for conceptualizing various ideas from diverse stakeholders, and this technique was used to identify pharmacy-based triggers to detect ADEs. The aim of this study was to involve pharmacists in conceptualizing, developing, and prioritizing a feasible trigger tool to detect adverse drug events in a provincial hospital in the northeastern part of Thailand. The study was conducted during the 6-month period between April 1 and September 30, 2017. Study participants comprised 20 pharmacists (17 hospital pharmacists and 3 pharmacy lecturers) engaging in three concept mapping workshops. In these meetings, the concept mapping technique created by Trochim, a highly structured qualitative group technique for generating and sharing ideas, was used to elicit and organize participants' views on which triggers had the potential to detect ADEs. During the workshops, participants (n = 20) were asked to individually rate the feasibility and potentiality of each trigger and to group the triggers into relevant categories to enable multidimensional scaling and hierarchical cluster analysis. The outputs of the analysis included the trigger list, cluster list, point map, point rating map, cluster map, and cluster rating map. The three workshops together produced 21 different triggers, structured in a framework of 5 clusters: drug allergy, drug-induced diseases, dosage adjustment in renal disease, potassium concern, and drug overdose.
The first cluster, drug allergy, includes triggers such as a doctor's order for dexamethasone injection combined with chlorpheniramine injection. A diagnosis of drug-induced hepatitis in a patient taking anti-tuberculosis drugs is a trigger in the 'drug-induced diseases' cluster. For the third cluster, a doctor's order for enalapril combined with ibuprofen in a patient with chronic kidney disease is an example trigger. A doctor's order for digoxin in a patient with hypokalemia is a trigger in the 'potassium concern' cluster, and a doctor's order for naloxone in narcotic overdose was classified as a trigger in the 'drug overdose' cluster. This study generated triggers similar to some in the IHI Global Trigger Tool, especially in the medication module, such as drug allergy and drug overdose. However, this tool has some specific aspects, including drug-induced diseases, dosage adjustment in renal disease, and potassium concern, which are not contained in other trigger tools. The pharmacy-based trigger tool is suitable for hospital pharmacists to detect potential adverse drug events using trigger clues.
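The hierarchical cluster analysis behind the cluster map can be illustrated with a tiny single-linkage implementation. The triggers and distances below are invented stand-ins (real input would be a distance matrix derived from how often participants sorted two triggers into the same pile), shown only to convey how trigger groupings emerge:

```python
from itertools import combinations

def single_linkage(labels, dist, k):
    # agglomerative clustering: repeatedly merge the two closest clusters
    # (closest = smallest distance between any pair of members) until k remain
    clusters = [{l} for l in labels]
    def cdist(a, b):
        return min(dist[tuple(sorted((x, y)))] for x in a for y in b)
    while len(clusters) > k:
        i, j = min(combinations(range(len(clusters)), 2),
                   key=lambda ij: cdist(clusters[ij[0]], clusters[ij[1]]))
        clusters[i] |= clusters.pop(j)
    return [sorted(c) for c in clusters]

triggers = ["dexamethasone+CPM", "digoxin+hypokalemia",
            "naloxone order", "pethidine reversal"]
# smaller distance = sorted together more often by participants (invented values)
dist = {("dexamethasone+CPM", "digoxin+hypokalemia"): 0.9,
        ("dexamethasone+CPM", "naloxone order"): 0.8,
        ("dexamethasone+CPM", "pethidine reversal"): 0.85,
        ("digoxin+hypokalemia", "naloxone order"): 0.7,
        ("digoxin+hypokalemia", "pethidine reversal"): 0.75,
        ("naloxone order", "pethidine reversal"): 0.1}
res = single_linkage(triggers, dist, 3)
print(res)
```

With these distances the two overdose-related triggers merge first, the analogue of them falling into the same cluster on the study's cluster map.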

Keywords: adverse drug events, concept mapping, hospital, pharmacy-based trigger tool

Procedia PDF Downloads 149
1342 Smart Grid Simulator

Authors: Ursachi Andrei

Abstract:

The Smart Grid Simulator is computer software, based on advanced algorithms, whose main purpose is to lower the energy bill in the most cost-efficient way possible for private households, companies, or energy providers. It combines the energy provided by a number of solar modules and wind turbines with the consumption of one household or a cluster of nearby households, together with information on weather conditions and energy prices, in order to predict the amount of energy that can be produced by renewable sources and the amount of energy that will have to be bought from the distributor for the following day. The user of the system will not only be able to minimize expenditure on energy but will also be informed about hourly consumption, electricity price fluctuations, and money spent on purchased energy, as well as how much money was saved each day and since the system was installed. The paper outlines the algorithm that supports the Smart Grid Simulator idea and presents preliminary test results that support the discussion and implementation of the system.
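The day-ahead accounting described above, netting forecast renewable production against forecast consumption and pricing the shortfall, can be sketched in a few lines. The hourly figures and prices below are invented for illustration and do not come from the paper's test results:

```python
def day_ahead_plan(forecast_gen, forecast_load, price):
    """Per-hour energy to buy from the distributor, its cost, and the
    savings versus buying the whole load (energy in kWh, price per kWh)."""
    bought = [max(load - gen, 0.0) for gen, load in zip(forecast_gen, forecast_load)]
    cost = sum(b * p for b, p in zip(bought, price))
    baseline = sum(l * p for l, p in zip(forecast_load, price))
    return bought, cost, baseline - cost

gen   = [0.0, 0.5, 2.0, 1.2]    # solar + wind forecast (kWh), illustrative
load  = [1.0, 1.0, 1.5, 2.0]    # household consumption forecast (kWh)
price = [0.10, 0.12, 0.20, 0.25]
bought, cost, saved = day_ahead_plan(gen, load, price)
print(bought, round(cost, 3), round(saved, 3))
```

Surplus hours (generation above load) are simply clipped to zero purchases here; a fuller simulator would also model feed-in tariffs or storage, which this sketch deliberately omits.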

Keywords: smart grid, sustainable energy, applied science, renewable energy sources

Procedia PDF Downloads 331
1341 Automated System: Managing the Production and Distribution of Radiopharmaceuticals

Authors: Shayma Mohammed, Adel Trabelsi

Abstract:

Radiopharmacy is the art of preparing high-quality, radioactive, medicinal products for use in diagnosis and therapy. Unlike normal medicines, radiopharmaceuticals have a dual aspect (radioactive and medicinal) that makes their management highly critical. One of the most convincing applications of modern technologies is the ability to delegate the execution of repetitive tasks to programming scripts. Automation has found its way into the most skilled jobs, improving a company's overall performance by allowing human workers to focus on more important tasks than document filling. This project aims to implement a comprehensive system that ensures rigorous management of radiopharmaceuticals through a platform linking the Nuclear Medicine Service Management System to the Nuclear Radiopharmacy Management System, in accordance with the recommendations of the World Health Organization (WHO) and the International Atomic Energy Agency (IAEA). In this project we build a web application that targets radiopharmacies; the platform is built atop the inherently compatible web stack, which allows it to work in virtually any environment. Different technologies are used in this project (PHP, Symfony, MySQL Workbench, Bootstrap, Angular 7, Visual Studio Code, and TypeScript). The operating principle of the platform is based on two main parts: a Radiopharmaceutical Backoffice for the radiopharmacist, who is responsible for the realization of radiopharmaceutical preparations and their delivery, and a Medical Backoffice for the doctor, who holds the authorization for the possession and use of radionuclides and is responsible for ordering radioactive products. The application consists of seven modules: Production, Quality Control/Quality Assurance, Release, General Management, References, Transport, and Stock Management.
It allows 8 classes of users: the Production Manager (PM), Quality Control Manager (QCM), Stock Manager (SM), General Manager (GM), Client (Doctor), Parking and Transport Manager (PTM), Qualified Person (QP), and Technical and Production Staff. The result is a digital platform bringing together all players involved in the use of radiopharmaceuticals and integrating the stages of preparation, production, and distribution. Web technologies in particular promise to offer all the benefits of automation while requiring no more than a web browser as the user client, a strength because the web stack is by nature multi-platform. This platform will provide a traceability system for radiopharmaceutical products to ensure the safety and radioprotection of staff and patients. The new integrated platform is an alternative to writing all the boilerplate paperwork manually, which is a tedious and error-prone task. It would minimize manual human manipulation, which has proven to be the main source of error in nuclear medicine. A codified electronic transfer of information from radiopharmaceutical preparation to delivery will further reduce the risk of maladministration.

Keywords: automated system, management, radiopharmacy, technical papers

Procedia PDF Downloads 143
1340 Abnormal Pap Smear Detection by Application of Revised Bethesda System in Commercial Sex Workers and a Control Group: A Comparative Study

Authors: Priyanka Manghani, Manthan Patel, Rahul Peddawad

Abstract:

Cervical cancer is a major public health hurdle in the area of women's health. The most common cause of cervical cancer is the human papilloma virus (HPV). HPV has various genotypes, with HPV 16 and HPV 18 being the major etiological factors causing carcinoma of the cervix. Early screening and detection by Papanicolaou (Pap) smears is an effective method for identifying premalignant and malignant lesions. Where premalignant lesions/cervical dysplasias are found with HPV 16 or 18, appropriate follow-up can prevent them from developing into a neoplasm. Aims and Objectives: Primary aim: to study the abnormal cervical cytology reports detected by Pap smear tests, using the Bethesda system, in women at a tertiary care hospital. Secondary aim: to discuss the importance of the Pap smear in cervical cancer screening programs. Materials and Methods: Our study is a prospective study of 101 women aged 20-40 years who attended the outpatient department of Obstetrics and Gynecology at a tertiary care hospital with chief complaints of white/foul vaginal discharge, post-coital bleeding, low back pain, irregular menstruation, etc. Of these, 60 women were commercial sex workers, a high-risk group for HPV infection. All women underwent conventional cytology. For all abnormal smears, cervical biopsies were performed, and the final diagnosis was made on the basis of histopathology (the gold standard). Results: Of these patients, 16 presented with normal smears, of whom 2 were commercial sex workers (3.33%) and 14 belonged to the normal/control group (34.15%). Forty-four women presented with inflammatory smears, of whom 30 were commercial sex workers (50%) and 14 were from the control group (34.15%). A total of 11 women presented with infectious etiology, 6 being commercial sex workers (10%) and 5 (12.2%) from the control group.
A total of 8 patients presented with low-grade squamous intraepithelial lesion (LSIL), 7 (11.7%) being commercial sex workers and 1 (2.44%) from the control group. A total of 7 patients presented with high-grade squamous intraepithelial lesion (HSIL), 6 (10%) being commercial sex workers and 1 (2.44%) from the control group. Nine patients presented with atypical squamous cells of undetermined significance (ASCUS), 6 (10%) being commercial sex workers and 3 (7.32%) from the control group. Squamous cell carcinoma (SCC) was found in only 1 (1.7%) commercial sex worker. Conclusion: We conclude that HSIL, LSIL, SCC, and sexually transmitted infections are comparatively more common in vulnerable groups such as sex workers due to factors such as multiple sexual partners and poor genital hygiene. Early screening and follow-up interventions are highly needed for this group, along with health education on risk factors and emphasis on the importance of Pap smear screening.

Keywords: cervical cancer, papanicolaou (pap) smear, bethesda system, neoplasm

Procedia PDF Downloads 209
1339 Virtual Platform for Joint Amplitude Measurement Based on MEMS

Authors: Mauro Callejas-Cuervo, Andrea C. Alarcon-Aldana, Andres F. Ruiz-Olaya, Juan C. Alvarez

Abstract:

Motion capture (MC) is the construction of a precise and accurate digital representation of a real motion. MC systems have been used in recent years in a wide range of applications, from film special effects and animation, interactive entertainment and medicine, to high-level competitive sport, where maximum performance and low injury risk during training and competition are sought. This paper presents an inertial and magnetic sensor based technological platform, intended for joint amplitude monitoring and telerehabilitation processes, with an efficient compromise between cost and technical considerations. The platform has high potential social impact because it makes telerehabilitation accessible to large population sectors in marginal socio-economic conditions, especially in underdeveloped countries where, in contrast to developed countries, specialists are scarce and high technology is unavailable or non-existent. The platform integrates high-resolution, low-cost inertial and magnetic sensors with adequate user interfaces and communication protocols to deliver a diagnosis service over the web or other available communication networks. The amplitude information generated by the sensors is transferred to a computing device with interfaces that make it accessible to inexperienced personnel, providing high social value. Amplitude measurements of the virtual platform system showed a good fit to the respective reference system. Analyzing the robotic arm results (estimation error RMSE 1 = 2.12° and estimation error RMSE 2 = 2.28°), it can be observed that during arm motion in either direction the estimation error is negligible; in fact, error appears only during inversion of the direction of motion, which can easily be explained by the nature of inertial sensors and their relation to acceleration. Inertial sensors present a time-constant delay that acts as a first-order filter, attenuating signals at large acceleration values, as is the case at a change of direction of motion. 
A damped response of the virtual platform can be seen in other images, where error analysis shows that at maximum amplitude an underestimation is present, whereas at minimum amplitude an overestimation is observed. This work presents and describes the virtual platform as a motion capture system suitable for telerehabilitation, with the cost-quality and precision-accessibility trade-offs optimized. These characteristics, achieved by efficiently using the state of the art in accessible generic sensor and hardware technology, together with adequate software for capture, transmission, analysis and visualization, provide the capacity to offer good telerehabilitation services, reaching large and often marginalized populations where technologies and specialists are unavailable but basic communication networks are accessible.
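
The RMSE figures quoted for the robotic arm comparison (2.12° and 2.28°) are root-mean-square errors between the platform's joint-angle estimates and the reference system. A minimal sketch of that comparison, with made-up angle series standing in for the real recordings:

```python
import math

def rmse(estimated, reference):
    """Root-mean-square error between two equal-length angle series (degrees)."""
    assert len(estimated) == len(reference)
    return math.sqrt(sum((e - r) ** 2 for e, r in zip(estimated, reference))
                     / len(estimated))

# Hypothetical joint-amplitude traces: a reference (e.g., robotic arm encoder)
# vs. the inertial/magnetic estimate. These numbers are invented.
reference = [0.0, 10.0, 20.0, 30.0, 20.0, 10.0]
estimated = [0.5, 9.0, 21.0, 32.0, 18.0, 11.0]
error = rmse(estimated, reference)  # a single error figure in degrees
```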

Keywords: inertial sensors, joint amplitude measurement, MEMS, telerehabilitation

Procedia PDF Downloads 242
1338 A Low Cost and Reconfigurable Experimental Platform for Engineering Lab Education

Authors: S. S. Kenny Lee, C. C. Kong, S. K. Ting

Abstract:

Teaching in an engineering lab gives students the opportunity to put theory into practice through physical experiments in the laboratory. However, building laboratories to accommodate increased numbers of students is expensive, often beyond what an educational institution can afford. In this paper, we develop a low-cost remote platform to aid the teaching of undergraduate students. The platform is constructed so that a real experiment set up in the laboratory can be reconfigured and accessed remotely; the aim is to increase students' desire to learn by letting them interact with the physical experiment using network-enabled devices from anywhere on campus. The platform uses a Raspberry Pi as the main control board, providing communication between the computer interface and the actual experiment preset in the laboratory. The interface allows real-time remote viewing and triggering of the physical experiment and also provides instructions and a learning guide about the experiment.

Keywords: engineering lab, low cost, network, remote platform, reconfigure, real-time

Procedia PDF Downloads 291
1337 It’s about Cortana, Microsoft’s Virtual Assistant

Authors: Aya Idriss, Esraa Othman, Lujain Malak

Abstract:

Artificial intelligence (AI) is the emulation of human intelligence processes by machines, particularly computer systems that act logically. Specific applications of AI include natural language processing, speech recognition, and machine vision. Cortana, Microsoft’s virtual assistant, is an example of an AI application. Microsoft made the app accessible not only on laptops and PCs but also on mobile phones, where it can be downloaded and used as a virtual assistant, which was a huge success. Cortana can offer a lot beyond basic commands such as setting alarms and marking the calendar. Its capabilities extend further: for example, it plays music and podcasts on the go, manages to-do lists and emails, connects with contacts hands-free when simply told to call somebody, and gives instant answers to questions. To perform the study, a questionnaire was sent online to numerous friends and family members, which is critical in evaluating Cortana's recognition capacity; the majority of the answers were in favor of Cortana’s capabilities. The results of the questionnaire assisted us in determining the level of Cortana's skills.

Keywords: artificial intelligence, Cortana, AI, abstract

Procedia PDF Downloads 162
1336 Subarray Based Multiuser Massive MIMO Design Adopting Large Transmit and Receive Arrays

Authors: Tetsiki Taniguchi, Yoshio Karasawa

Abstract:

This paper describes a subarray-based, low-computational-cost design method for a multiuser massive multiple input multiple output (MIMO) system. In our previous work, the use of a large array was assumed only at the transmitter, but this study considers the case where both the transmitter and receiver sides are equipped with large array antennas. To this end, the receive arrays are also divided into several subarrays, and the previously proposed method is modified for the synthesis of a large array from subarrays at both ends. Through computer simulations, it is verified that although the performance of the proposed method is degraded compared with the original approach, it achieves an improvement in complexity, namely, a significant reduction of the computational load to a practical level.
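
Since zero forcing appears among the paper's keywords, the basic zero-forcing step can be sketched generically (a toy NumPy example, not the authors' subarray construction): for a multiuser downlink channel matrix H, the ZF precoder is the right pseudo-inverse H^H (H H^H)^{-1}, which removes inter-user interference. The dimensions below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy multiuser MIMO downlink: K = 4 single-antenna users, M = 8 transmit antennas.
K, M = 4, 8
H = rng.standard_normal((K, M)) + 1j * rng.standard_normal((K, M))  # channel

# Zero-forcing precoder: right pseudo-inverse of H.
W = H.conj().T @ np.linalg.inv(H @ H.conj().T)

# The effective channel H @ W is (numerically) the identity, so each
# user's symbol arrives free of inter-user interference.
effective = H @ W
```

In the subarray setting of the paper, the same idea is applied after assembling the large arrays from subarrays; this sketch shows only the interference-cancellation principle.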

Keywords: large array, massive multiple input multiple output (MIMO), multiuser, singular value decomposition, subarray, zero forcing

Procedia PDF Downloads 385
1335 Attitudes toward Programming Languages Based on Characteristics

Authors: Mohammad Shokoohi-Yekta, Hamid Mirebrahim

Abstract:

A body of research has been devoted to investigating the preferences of computer programmers. These studies used various questionnaires to find out which programming language is most popular among programmers. The problem with such research is that programmers are usually familiar with only a few languages, thereby disregarding a number of other languages whose characteristics might match their preferences more closely. To overcome this problem, we investigated programmers' preferences with regard to the characteristics of languages, which helps us discover the languages that include the most characteristics preferred by users. We conducted a user study to measure programmers' preferences for different characteristics of programming languages and then compared existing languages in the areas of application, web and system programming. Overall, the results of our study indicated that the Ruby programming language has the highest preference score in the application and web areas, and C++ has the highest score in the system area. The results can also help programming language designers know which characteristics they should consider when developing new programming languages in order to attract more programmers.

Keywords: object orientation, programming language design, programmers' preferences, characteristic

Procedia PDF Downloads 476
1334 Improving Binding Selectivity in Molecularly Imprinted Polymers from Templates of Higher Biomolecular Weight: An Application in Cancer Targeting and Drug Delivery

Authors: Ben Otange, Wolfgang Parak, Florian Schulz, Michael Alexander Rubhausen

Abstract:

This research demonstrates the feasibility of extending the molecular imprinting technique to complex biomolecules. The technique is promising in diverse areas such as drug delivery, disease diagnosis, catalysis, and impurity detection, as well as the treatment of various complications. While molecularly imprinted polymers (MIPs) remain robust for synthesizing materials with remarkable binding sites that have high affinities for specific molecules of interest, extending their use to complex biomolecules has remained elusive. This work reports on the successful synthesis of MIPs from complex proteins: BSA, transferrin, and MUC1. We show that, despite the heterogeneous binding sites and higher conformational flexibility of the chosen proteins, relying on their respective epitopes and motifs rather than the whole template produces highly sensitive and selective MIPs for specific molecular binding. Introduction: Proteins are vital in most biological processes, ranging from cell structure and structural integrity to complex functions such as transport and immunity in biological systems. Unlike other imprinting templates, proteins have heterogeneous binding sites in their complex long-chain structure, which makes their imprinting fraught with challenges. In addressing this challenge, our attention is directed toward targeted delivery, which uses molecular imprinting on the particle surface so that these particles recognize overexpressed proteins on the target cells. Our goal is thus to make nanoparticle surfaces that specifically bind to the target cells. Results and Discussion: Using epitopes of the BSA and MUC1 proteins and motifs with conserved receptors of transferrin as the respective templates for MIPs, significant improvement in MIP sensitivity to the binding of the complex protein templates was noted. 
Through fluorescence correlation spectroscopy (FCS) measurements of the size of the protein corona after incubation of the synthesized nanoparticles with proteins, we noted a high affinity of the MIPs for their respective complex proteins. In addition, quantitative analysis of the hard corona using SDS-PAGE showed that only the specific protein was strongly bound on the respective MIPs when incubated with similar concentrations of a protein mixture. Conclusion: Our findings show that the merits of MIPs can be extended to complex molecules of higher biomolecular mass. As such, the unique merits of the technique, including high sensitivity and selectivity, relative ease of synthesis, production of materials with greater physical robustness, and higher stability, can be extended to templates that were previously not suitable candidates despite their abundance and use within the body.

Keywords: molecularly imprinted polymers, specific binding, drug delivery, high biomolecular mass-templates

Procedia PDF Downloads 34
1333 Applying Big Data to Understand Urban Design Quality: The Correlation between Social Activities and Automated Pedestrian Counts in Dilworth Park, Philadelphia

Authors: Jae Min Lee

Abstract:

The presence of people and the intensity of activities have been widely accepted as indicators of successful public spaces in the urban design literature. This study attempts to predict these qualitative indicators with quantitative pedestrian-count measurements. We conducted participant observation in Dilworth Park, Philadelphia, to collect the total number of people and activities in the park. The participant observation data were then compared with detailed pedestrian counts at 10 exit locations to estimate the number of park users. The study found a clear correlation between the intensity of social activities and automated pedestrian counts.

Keywords: automated pedestrian count, computer vision, public space, urban design

Procedia PDF Downloads 380
1332 Detection and Identification of Antibiotic Resistant UPEC Using FTIR-Microscopy and Advanced Multivariate Analysis

Authors: Uraib Sharaha, Ahmad Salman, Eladio Rodriguez-Diaz, Elad Shufan, Klaris Riesenberg, Irving J. Bigio, Mahmoud Huleihel

Abstract:

Antimicrobial drugs have played an indispensable role in controlling the illness and death associated with infectious diseases in animals and humans. However, the increasing resistance of bacteria to a broad spectrum of commonly used antibiotics has become a global healthcare problem. Many antibiotics have lost their effectiveness since the beginning of the antibiotic era because many bacteria have adapted defenses against them. Rapid determination of the antimicrobial susceptibility of a clinical isolate is often crucial for optimal antimicrobial therapy of infected patients and in many cases can save lives. The conventional methods for susceptibility testing require isolation of the pathogen from a clinical specimen by culturing on the appropriate media (this first culturing stage lasts 24 h). Chosen colonies are then grown on media containing antibiotic(s), using micro-diffusion discs (a second culturing stage, also 24 h), in order to determine bacterial susceptibility. Other approaches, such as genotyping methods, the E-test and automated systems, have also been developed for testing antimicrobial susceptibility. Most of these methods are expensive and time-consuming. Fourier transform infrared (FTIR) microscopy is a rapid, safe, effective and low-cost method that has been widely and successfully used in various studies for the identification of biological samples, including bacteria; nonetheless, its true potential in routine clinical diagnosis has not yet been established. Modern infrared (IR) spectrometers with high spectral resolution enable the measurement of unprecedented biochemical information from cells at the molecular level. Moreover, new bioinformatics analyses combined with IR spectroscopy form a powerful technique that enables the detection of structural changes associated with resistance. 
The main goal of this study is to evaluate the potential of FTIR microscopy, in tandem with machine learning algorithms, for rapid and reliable identification of bacterial susceptibility to antibiotics within a time span of a few minutes. The UTI E. coli bacterial samples, which were identified at the species level by MALDI-TOF and examined for susceptibility by the routine assay (micro-diffusion discs), were obtained from the bacteriology laboratories of Soroka University Medical Center (SUMC). These samples were examined by FTIR microscopy and analyzed by advanced statistical methods. Our results, based on 700 E. coli samples, were promising and showed that by using infrared spectroscopy together with multivariate analysis, it is possible to classify the tested bacteria as sensitive or resistant with a success rate higher than 90% for eight different antibiotics. Based on these preliminary results, it is worthwhile to continue developing FTIR microscopy as a rapid and reliable method for identifying antibiotic susceptibility.
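
The classification step, pairing spectra with a multivariate classifier, can be sketched generically. The toy example below uses synthetic "spectra" and a nearest-centroid rule as a deliberately simple stand-in for the advanced multivariate analysis in the paper; every number and variable in it is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-ins for preprocessed FTIR absorbance spectra: resistant
# isolates differ from sensitive ones in a few "bands" (features 50-59).
n_per_class, n_wavenumbers = 50, 200
base = rng.standard_normal(n_wavenumbers)
shift = np.zeros(n_wavenumbers)
shift[50:60] = 1.5  # hypothetical resistance-associated spectral change
sensitive = base + 0.3 * rng.standard_normal((n_per_class, n_wavenumbers))
resistant = base + shift + 0.3 * rng.standard_normal((n_per_class, n_wavenumbers))

# Nearest-centroid classifier trained on half of each class.
train_s, test_s = sensitive[:25], sensitive[25:]
train_r, test_r = resistant[:25], resistant[25:]
c_s, c_r = train_s.mean(axis=0), train_r.mean(axis=0)

def predict(x):
    """Label a spectrum by its distance to the class centroids."""
    return "resistant" if np.linalg.norm(x - c_r) < np.linalg.norm(x - c_s) else "sensitive"

accuracy = (sum(predict(x) == "sensitive" for x in test_s)
            + sum(predict(x) == "resistant" for x in test_r)) / 50
```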

Keywords: antibiotics, E.coli, FTIR, multivariate analysis, susceptibility, UTI

Procedia PDF Downloads 160
1331 Cost Based Analysis of Risk Stratification Tool for Prediction and Management of High Risk Choledocholithiasis Patients

Authors: Shreya Saxena

Abstract:

Background: Choledocholithiasis is a common complication of gallstone disease. Risk scoring systems exist to guide the need for further imaging or endoscopy in managing choledocholithiasis. We completed an audit reviewing the American Society for Gastrointestinal Endoscopy (ASGE) scoring system for the prediction and management of choledocholithiasis against current practice at a tertiary hospital, to assess its utility in resource optimisation. We have now conducted a cost-focused sub-analysis of patients categorised as high risk for choledocholithiasis according to the guidelines, to determine any associated cost benefits. Method: Data from our prior audit were used to retrospectively identify thirteen patients considered high risk for choledocholithiasis. Their ongoing management was mapped against the guidelines. Individual costs for the key investigations were obtained from our hospital financial data. Total costs for the different management pathways identified in clinical practice were calculated and compared against the predicted costs of the pathways recommended in the guidelines. We excluded the cost of laparoscopic cholecystectomy and used a set figure for per-day hospital admission expenses. Results: Based on our previous audit data, we identified a 77% positive predictive value for the ASGE risk stratification tool in determining patients at high risk of choledocholithiasis. 47% (6/13) had a magnetic resonance cholangiopancreatography (MRCP) prior to endoscopic retrograde cholangiopancreatography (ERCP), whilst 53% (7/13) went straight to ERCP. The average length of stay in hospital was 7 days, with an additional day and cost of £328.00 (£117 for ERCP) for patients awaiting an MRCP prior to ERCP. Per-day hospital admission was valued at £838.69. When calculating total cost, we assumed all patients had admission bloods and ultrasound done as the gold standard. 
Doing an MRCP prior to ERCP incurred a 130% increase in cost (£580.04 vs £252.04) per patient. When hospital admission and the average length of stay were also considered, it was an additional £1,166.69 per patient. We then calculated the exact costs incurred by the department over a three-month period for all patients, for the key investigations and procedures done in the management of choledocholithiasis. This was compared with an estimated cost derived from the pathways recommended in the ASGE guidelines. Overall, an 81% (£2,048.45) saving was associated with following the guidelines compared with clinical practice. Conclusion: MRCP is the most expensive test associated with the diagnosis and management of choledocholithiasis. The ASGE guidelines recommend endoscopy without an MRCP in patients stratified as high risk for choledocholithiasis. Our audit, which focused on assessing the utility of the ASGE risk scoring system, showed it to be relatively reliable for identifying high-risk patients. Our cost analysis has shown significant savings per patient, and in average length of stay, associated with direct endoscopy rather than an additional MRCP; part of this is because of the increased average length of stay associated with waiting for an MRCP. These data support the ASGE guidelines for the management of patients at high risk for choledocholithiasis from a cost perspective. The only caveat is our small data set, which may affect the validity of our average length-of-stay figures and hence the total cost calculations.
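
The headline figures in this abstract are straightforward arithmetic on the quoted unit costs. A quick sketch reproducing them (all values taken from the abstract):

```python
# Reported per-patient investigation costs (from the abstract).
cost_with_mrcp = 580.04    # £, MRCP-prior-to-ERCP pathway
cost_direct_ercp = 252.04  # £, straight-to-ERCP pathway
per_day_admission = 838.69 # £, one extra inpatient day while awaiting MRCP

extra_cost = cost_with_mrcp - cost_direct_ercp    # £328.00 per patient
pct_increase = extra_cost / cost_direct_ercp * 100  # ~130% increase
extra_incl_stay = extra_cost + per_day_admission  # £1,166.69 incl. the extra day
```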

Keywords: cost-analysis, choledocholithiasis, risk stratification tool, general surgery

Procedia PDF Downloads 82
1330 Translation and Validation of the Thai Version of the Japanese Sleep Questionnaire for Preschoolers

Authors: Natcha Lueangapapong, Chariya Chuthapisith, Lunliya Thampratankul

Abstract:

Background: There is a need to find an appropriate tool to help healthcare providers determine sleep problems in children for early diagnosis and management. The Japanese Sleep Questionnaire for Preschoolers (JSQ-P) is a parent-reported sleep questionnaire that has good psychometric properties and can be used in the context of Asian culture, which is likely suitable for Thai children. Objectives: This study aimed to translate the JSQ-P into a Thai version, to validate that version, and to evaluate factors associated with sleep disorders in preschoolers. Methods: After approval by the original developer, the cross-cultural adaptation process of the JSQ-P was performed, including forward translation, reconciliation, backward translation, and final approval of the Thai version (TH-JSQ-P) by the original creator. The study was conducted between March 2021 and February 2022. The TH-JSQ-P was completed by 2,613 guardians of children aged 2-6 years, twice within 10-14 days, to assess its reliability and validity. Content validity was measured by an index of item-objective congruence (IOC) and a content validity index (CVI). Face validity, content validity, structural validity, construct validity (discriminant validity), criterion validity and predictive validity were assessed. The sensitivity and specificity of the TH-JSQ-P were also measured, using the total JSQ-P score cutoff point of 84 recommended by the original JSQ-P as well as each subscale score, among clinical samples with obstructive sleep apnea syndrome. Results: Internal consistency, evaluated by Cronbach’s α coefficient, showed acceptable reliability in all subscales of the JSQ-P. The questionnaire also had good test-retest reliability, as the intraclass correlation coefficient (ICC) for all items ranged between 0.42 and 0.84. The content validity was acceptable. 
For structural validity, our results indicated that the final factor solution for the TH-JSQ-P was comparable to the original JSQ-P. For construct validity, age group was one of the clinical parameters associated with some sleep problems: in detail, parasomnias, insomnia, excessive daytime sleepiness and sleep-habit problems significantly decreased as the children got older, while insufficient sleep significantly increased with age. For criterion validity, all subscales showed a correlation with the Epworth Sleepiness Scale (r = -0.049 to 0.349). For predictive validity, the Epworth Sleepiness Scale was a significantly strong factor correlated with sleep problems in all JSQ-P subscales except sleep habits. The sensitivity and specificity of the total JSQ-P score were 0.72 and 0.66, respectively. Conclusion: The Thai version of the JSQ-P has good internal consistency and test-retest reliability. It passed six validity tests and can be used to evaluate sleep problems in preschool children in Thailand. Furthermore, it has satisfactory general psychometric properties and good reliability and validity. The data collected in examining the sensitivity of the Thai version revealed that the JSQ-P could detect differences in sleep problems among children with obstructive sleep apnea syndrome, confirming that the measure is sensitive and can discriminate sleep problems among different children.
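
The sensitivity (0.72) and specificity (0.66) quoted here are standard confusion-matrix ratios at the score cutoff of 84. A generic sketch of that computation, using hypothetical scores rather than the study data:

```python
def sensitivity_specificity(scores, has_condition, cutoff):
    """Sensitivity and specificity of a score-based screen at a given cutoff.

    scores        -- total questionnaire scores
    has_condition -- parallel booleans, True if the reference diagnosis
                     (e.g. clinically confirmed OSA) is positive
    cutoff        -- score at or above which the screen is positive
    """
    tp = sum(s >= cutoff and d for s, d in zip(scores, has_condition))
    fn = sum(s < cutoff and d for s, d in zip(scores, has_condition))
    tn = sum(s < cutoff and not d for s, d in zip(scores, has_condition))
    fp = sum(s >= cutoff and not d for s, d in zip(scores, has_condition))
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical scores and reference diagnoses; cutoff 84 as in the abstract.
scores = [90, 80, 100, 70, 85, 60, 95, 83]
truth = [True, True, True, False, True, False, False, False]
sens, spec = sensitivity_specificity(scores, truth, 84)
```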

Keywords: preschooler, questionnaire, validation, Thai version

Procedia PDF Downloads 73
1329 Telomerase, a Biomarker in Oral Cancer Cell Proliferation and Tool for Its Prevention at Initial Stage

Authors: Shaista Suhail

Abstract:

As the cancer population increases sharply, the incidence of oral squamous cell carcinoma (OSCC) is also expected to increase. Oral carcinogenesis is a highly complex, multistep process involving the accumulation of genetic alterations that lead to the induction of proteins promoting cell growth (encoded by oncogenes) and increased enzymatic (telomerase) activity promoting cancer cell proliferation. The global increase in frequency and mortality, as well as the poor prognosis of oral squamous cell carcinoma, has intensified current research efforts in the field of prevention and early detection of this disease. Advances in understanding the molecular basis of oral cancer should help in the identification of new markers. The study of the carcinogenic process of oral cancer, including continued analysis of new genetic alterations along with their temporal sequencing during initiation, promotion and progression, will allow us to identify new diagnostic and prognostic factors, providing a promising basis for the application of more rational and efficient treatments. Telomerase activity has readily been found in most cancer biopsies, in premalignant lesions, and in germ cells, whereas it is generally absent in normal tissues. It is known to be induced upon immortalization or malignant transformation of human cells, such as oral cancer cells. Maintenance of telomeres plays an essential role during the transformation from precancer to the malignant stage. Mammalian telomeres, specialized nucleoprotein structures, are composed of large concatemers of the guanine-rich sequence 5′-TTAGGG-3′. The roles of telomeres in regulating both genome stability and replicative immortality seem to contribute in essential ways to cancer initiation and progression. It is concluded that telomerase activity can be used as a biomarker for the diagnosis of malignant oral cancer and as a target for inactivation in chemotherapy or gene therapy. 
Its expression will also prove to be an important diagnostic tool as well as a novel target for cancer therapy. The activation of telomerase may be an important step in tumorigenesis, which could be controlled by inactivating telomerase during chemotherapy. The expression and activity of telomerase are indispensable for cancer development. No existing drugs are highly effective in treating oral cancers, and there is a general call for new drugs or methods that are highly effective against cancer, possess low toxicity, and have a minor environmental impact. Novel natural products also offer opportunities for innovation in drug discovery. Natural compounds isolated from medicinal plants, as rich sources of novel anticancer drugs, have been of increasing interest, some with enzyme (telomerase) blocking properties. The alarming increase in reported cancer cases raises awareness among clinicians and researchers of the need to investigate new drugs with low toxicity.

Keywords: oral carcinoma, telomere, telomerase, blockage

Procedia PDF Downloads 156
1328 An Integrative Review on Effects of Educational Interventions for Children with Eczema

Authors: Nam Sze Cheng, P. C. Janita Chau

Abstract:

Background: Eczema is a chronic inflammatory skin disease with high global prevalence rates in many childhood populations; it is also the most common paediatric skin problem. Although eczema education and proper skin care are effective in controlling eczema symptoms, the lack of both sufficient consultation time and a structured eczema education programme hinders the transfer of knowledge to patients and their parents. As a result, these young patients and their families suffer significant physical disability and psychological distress, which can substantially impair their quality of life. Objectives: This integrative review examines the effects of educational interventions for children with eczema and identifies the core elements associated with an effective intervention. Methods: This integrative review targeted all articles published in 10 databases between May 2016 and February 2017 that reported the outcomes of disease interventions of any format for children and adolescents under 18 years of age with a clinical diagnosis of eczema. Five randomized controlled trials (RCTs) and one systematic review of 10 RCTs were identified for review. All these publications had high methodological quality, except one study of web-based eczema education that was limited by selection bias and poor subject blinding. Findings: This review found that most studies adopted a nurse-led or multidisciplinary parental eczema education programme in the outpatient clinic setting. The formats of these programmes included individual lectures, demonstrations and group sharing, and the educational materials covered basic eczema knowledge and management as well as methods to interrupt the itch-scratch cycle. The main outcome measures of these studies included severity of eczema symptoms, treatment adherence and the quality of life of both patients and their families. 
Nine of the included studies reported statistically significant improvement in the primary outcome of symptom severity in these eczematous children. On the other hand, the reviews failed to identify an effective intervention dosage for these educational programmes, which was attributed to the heterogeneity of the interventions. One study whose interventional content was designed on the basis of social cognitive theory yielded statistically significant results. The systematic review recommended measuring parental self-efficacy. Implication: This integrative review concludes that structured educational programmes can help nurses understand the theories behind different health interventions, allowing them to deliver eczema education to their patients in a consistent manner; these interventions also produce behavioural changes through patient education. Given the lack of validated educational programmes in Chinese, it is imperative to conduct an RCT of an eczema educational programme to investigate its effects on eczema severity, quality of life and treatment adherence in Hong Kong children, as well as to promote the importance of parental self-efficacy.

Keywords: children, eczema, education, intervention

Procedia PDF Downloads 100
1327 Valorization of Surveillance Data and Assessment of the Sensitivity of a Surveillance System for an Infectious Disease Using a Capture-Recapture Model

Authors: Jean-Philippe Amat, Timothée Vergne, Aymeric Hans, Bénédicte Ferry, Pascal Hendrikx, Jackie Tapprest, Barbara Dufour, Agnès Leblond

Abstract:

The surveillance of infectious diseases is necessary to describe their occurrence and to help the planning, implementation and evaluation of risk mitigation activities. However, the exact number of detected cases may remain unknown when surveillance is based on serological tests, because identifying seroconversion may be difficult. Moreover, incomplete detection of cases or outbreaks is a recurrent issue in the field of disease surveillance. This study addresses these two issues. Using a viral animal disease as an example (equine viral arteritis), the goals were to establish suitable rules for identifying seroconversion in order to estimate the number of cases and outbreaks detected by a surveillance system in France between 2006 and 2013, and to assess the sensitivity of this system by estimating the total number of outbreaks that occurred during this period (including unreported outbreaks) using a capture-recapture model. Data from horses that exhibited at least one positive serological result by viral neutralization test between 2006 and 2013 were used for analysis (n=1,645). The data consisted of the annual antibody titers and the locations (towns) of the subjects. A consensus among multidisciplinary experts (specialists in the disease and its laboratory diagnosis, and epidemiologists) was reached to define seroconversion as a change in antibody titer from negative to at least 32, or as a three-fold or greater increase. The number of seroconversions was counted for each town and modeled using a unilist zero-truncated binomial (ZTB) capture-recapture model in the R software. The binomial denominator was the number of horses tested in each infected town. Using the defined rules, 239 cases located in 177 towns (outbreaks) were identified from 2006 to 2013. 
Subsequently, the sensitivity of the surveillance system was estimated as the ratio of the number of detected outbreaks to the total number of outbreaks that occurred (including unreported outbreaks), the latter estimated using the ZTB model. The total number of outbreaks was estimated at 215 (95% credible interval, CrI95%: 195-249) and the surveillance sensitivity at 82% (CrI95%: 71-91%). The rules proposed for identifying seroconversion may serve as a basis for future research. Such rules, adjusted to the local environment, could conceivably be applied in other countries with surveillance programs dedicated to this disease. More generally, defining ad hoc algorithms for interpreting antibody titers could be useful for other human and animal diseases and zoonoses when the literature lacks accurate information about the serological response in naturally infected subjects. This study shows how capture-recapture methods may help to estimate the sensitivity of an imperfect surveillance system and to valorize surveillance data. The sensitivity of the surveillance system for equine viral arteritis is relatively high and supports its relevance for preventing the spread of the disease.
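The estimation described above can be sketched in a few lines; the data values, the grid-search estimator, and the Horvitz-Thompson correction below are illustrative assumptions for a common-p unilist ZTB model, not the study's actual R implementation:

```python
from math import comb, log

def ztb_loglik(p, data):
    """Zero-truncated binomial log-likelihood.

    data: list of (k, n) pairs -- k >= 1 seroconversions observed
    among n horses tested in a detected town.
    """
    return sum(
        log(comb(n, k)) + k * log(p) + (n - k) * log(1 - p)
        - log(1 - (1 - p) ** n)  # condition on at least one detection
        for k, n in data
    )

def estimate_total_outbreaks(data):
    # crude grid-search MLE of the per-horse detection probability p
    p_hat = max((i / 1000 for i in range(1, 1000)),
                key=lambda p: ztb_loglik(p, data))
    # Horvitz-Thompson: each detected town stands for 1/P(detection) towns
    total = sum(1 / (1 - (1 - p_hat) ** n) for _, n in data)
    return p_hat, total

# hypothetical data: (seroconversions, horses tested) per detected town
towns = [(1, 8), (2, 10), (1, 5), (3, 12), (1, 6)]
p_hat, total = estimate_total_outbreaks(towns)
sensitivity = len(towns) / total  # detected outbreaks / estimated total
```

The estimated total is never smaller than the number of detected towns, so the sensitivity estimate is bounded by 1; the study's Bayesian credible intervals would require a posterior over p rather than this point estimate.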

Keywords: Bayesian inference, capture-recapture, epidemiology, equine viral arteritis, infectious disease, seroconversion, surveillance

Procedia PDF Downloads 279
1326 An Exhaustive All-Subsets Examination of Trade Theory on WTO Data

Authors: Masoud Charkhabi

Abstract:

We examine trade theory empirically using the full set of World Trade Organization data, organized into country-year pairs, each treated as a distinct entity. Topological Data Analysis reveals that, among the 16 regions and 240 region-year pairs, there exists a distinguishable group of region-period pairs. The generally accepted periods of shifts from dissimilar-dissimilar to similar-similar trade in goods among regions are examined from this new perspective. The period breaks are treated as cumulative and are flexible. This type of all-subsets analysis is motivated by computer science and is made possible with lossy compression and graph theory. The results question many patterns in the shift from similar-similar to dissimilar-dissimilar trade. They also show indications of economic shifts that only later become evident in other economic metrics.
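The flexible, cumulative treatment of period breaks can be illustrated with a small enumeration; the year range and break limit below are hypothetical, not taken from the study:

```python
from itertools import combinations

def cumulative_break_sets(years, max_breaks=3):
    """Enumerate every way to place up to max_breaks cut points,
    splitting the year range into contiguous (cumulative) periods."""
    interior = years[1:]  # a break before the first year is meaningless
    for r in range(1, max_breaks + 1):
        for cuts in combinations(interior, r):
            yield cuts

years = list(range(1995, 2015))  # hypothetical 20-year WTO window
subsets = list(cumulative_break_sets(years))
```

Even with only three breaks over twenty years, the number of candidate period structures runs above a thousand, which is why the paper leans on lossy compression and graph-theoretic pruning to make the all-subsets search tractable.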

Keywords: econometrics, globalization, network science, topological data analysis, trade theory, visualization, world trade

Procedia PDF Downloads 352
1325 Performance Evaluation of Flexible Manufacturing System: A Simulation Study

Authors: Mohammed Ali

Abstract:

In this paper, a flexible manufacturing system is evaluated under different manufacturing strategies. The objective is to test the impact of the number of pallets and of routing flexibility on system performance under different sequencing rules, dispatching rules, and unbalanced load conditions. A computer simulation model is developed to evaluate the effects of these strategies on the make-span of the flexible manufacturing system. The impact of the number of pallets is shown at different levels of routing flexibility, and the same system is modeled under different combinations of sequencing and dispatching rules. A series of simulation experiments is conducted and the results are analyzed. The results show that both the number of pallets and routing flexibility affect the performance of the system.
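As a minimal illustration of how a dispatching rule changes make-span, the sketch below schedules hypothetical jobs on identical parallel machines under FIFO and shortest-processing-time (SPT) orderings; it is a toy model, not the paper's simulation:

```python
import heapq

def makespan(job_times, n_machines, rule="FIFO"):
    """Schedule jobs on identical parallel machines, return the make-span.

    rule: "FIFO" keeps arrival order; "SPT" dispatches shortest jobs first.
    """
    queue = sorted(job_times) if rule == "SPT" else list(job_times)
    machines = [0.0] * n_machines  # earliest-free time per machine
    heapq.heapify(machines)
    finish = 0.0
    for t in queue:
        start = heapq.heappop(machines)  # next machine to free up
        end = start + t
        finish = max(finish, end)
        heapq.heappush(machines, end)
    return finish

jobs = [4, 2, 7, 1, 3]  # hypothetical processing times
fifo = makespan(jobs, 2, "FIFO")
spt = makespan(jobs, 2, "SPT")
```

On this toy instance SPT actually lengthens the make-span relative to FIFO (placing the longest job last delays completion), which is exactly the kind of rule-dependent behaviour a simulation study of this sort measures.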

Keywords: flexibility, flexible manufacturing system, pallets, make-span, simulation

Procedia PDF Downloads 406
1324 SENSE-SEAT: Improving Creativity and Productivity through the Redesign of a Multisensory Technological Office Chair

Authors: Fernando Miguel Campos, Carlos Ferreira, João Pestana, Pedro Campos, Nils Ehrenberg, Wojciech Hydzik

Abstract:

The current trend of organizations offering their workers open-plan and co-working offices is premised on stimulating teamwork and collaboration. However, this does not always hold, as these spaces bring other challenges that compromise workers' productivity and creativity. We present an approach for improving creativity and productivity in the workspace by redesigning an office chair to incorporate subtle technological elements that help users focus, relax, and become more productive and creative. This sheds light on how to better design interactive furniture for such popular contexts, as we develop this new chair through a multidisciplinary approach combining ergonomics, interior design, interaction design, hardware and software engineering, and psychology.

Keywords: creativity, co-working, ergonomics, human-computer interaction, interactive furniture, productivity

Procedia PDF Downloads 309
1323 Detecting Characters as Objects Towards Character Recognition on Licence Plates

Authors: Alden Boby, Dane Brown, James Connan

Abstract:

Character recognition is a well-researched topic across disciplines. Nevertheless, creating a solution that can cater to multiple situations remains challenging. Vehicle licence plates lack an international standard: different countries and regions define their own formats, and the resulting variety of typefaces and designs makes it difficult to build a solution that covers a wide range of plates. The main bottleneck is the character recognition stage. This paper aims to create an object detection-based character recognition model trained on a custom dataset consisting of licence-plate typefaces from various regions. Given that characters have features that are maintained consistently across an array of fonts, YOLO can be trained to recognise characters from these features, which may provide better performance than OCR methods such as Tesseract OCR.
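A detector that treats each character as an object still needs a post-processing step to assemble the plate string from per-character bounding boxes; a minimal sketch of that step (with made-up detection tuples, not the paper's model output) might look like:

```python
def boxes_to_plate(detections, min_conf=0.5):
    """Assemble a plate string from per-character detections.

    detections: (x_min, y_min, x_max, y_max, char, confidence) tuples,
    e.g. as produced by a YOLO-style character detector.
    """
    kept = [d for d in detections if d[5] >= min_conf]
    kept.sort(key=lambda d: d[0])  # read characters left to right
    return "".join(d[4] for d in kept)

# hypothetical, unordered detections for the plate "ABC123"
dets = [
    (120, 10, 140, 40, "1", 0.91),
    (0,   10, 20,  40, "A", 0.88),
    (40,  10, 60,  40, "B", 0.93),
    (160, 10, 180, 40, "2", 0.87),
    (80,  10, 100, 40, "C", 0.90),
    (200, 10, 220, 40, "3", 0.95),
    (60,  12, 70,  38, "8", 0.30),  # low-confidence false positive, dropped
]
plate = boxes_to_plate(dets)
```

A single left-to-right sort suffices for single-row plates; two-row plates would first need the boxes clustered by their vertical coordinate.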

Keywords: computer vision, character recognition, licence plate recognition, object detection

Procedia PDF Downloads 102