Search results for: predictive tracking

17 Auditory Rehabilitation via a VR Serious Game for Children with Cochlear Implants: Bio-Behavioral Outcomes

Authors: Areti Okalidou, Paul D. Hatzigiannakoglou, Aikaterini Vatou, George Kyriafinis

Abstract:

Young children are nowadays adept at using technology. Hence, computer-based auditory training programs (CBATPs) have become increasingly popular in aural rehabilitation for children with hearing loss and/or with cochlear implants (CI). Yet, their clinical utility for prognostic, diagnostic, and monitoring purposes has not been explored. The purposes of the study were: a) to develop an updated version of the auditory rehabilitation tool for Greek-speaking children with cochlear implants, b) to develop a database for behavioral responses, and c) to compare accuracy rates and reaction times in children differing in hearing status and other medical and demographic characteristics, in order to assess the tool’s clinical utility in prognosis, diagnosis, and progress monitoring. The updated version of the auditory rehabilitation tool was developed on a tablet, retaining the User-Centered Design approach and the elements of the Virtual Reality (VR) serious game. The visual stimuli were farm animals acting in simple game scenarios designed to trigger children’s responses to animal sounds, names, and relevant sentences. Based on an extended version of Erber’s auditory development model, the VR game consisted of six stages, i.e., sound detection, sound discrimination, word discrimination, identification, comprehension of words in a carrier phrase, and comprehension of sentences. A familiarization (learning) stage was set prior to the game. Children’s tactile responses were recorded as correct, false, or impulsive, based on a child-dependent valid delay time after stimulus offset that defined valid responses. Reaction times were also recorded, and the database was in Excel format. The tablet version of the auditory rehabilitation tool was piloted in 22 preschool children with Normal Hearing (NH), which led to improvements. The study took place in clinical settings or at children’s homes. Fifteen children with CI, aged 5;7-12;3 years, with post-implantation ages of 0;11-5;1 years, used the auditory rehabilitation tool. Eight children with CI were monolingual, two were bilingual, and five had additional disabilities. The control group consisted of 13 children with NH, aged 2;6-9;11 years. Accuracy rates (percent correct) and reaction times (in seconds) were compared at each stage across hearing status and age and, within the CI group, by presence of additional disability and bilingualism. Both monolingual Greek-speaking children with CI with no additional disabilities and their hearing peers showed high accuracy rates at all stages, with performances falling above the 3rd quartile. However, children with normal hearing scored higher than the children with CI, especially in the detection and word discrimination tasks. The reaction time differences between the two groups decreased in language-based tasks. Results for children with CI with additional disability or bilingualism varied. Finally, older children scored higher than younger ones in both groups (CI, NH), but larger differences occurred in children with CI. The interactions between familiarization with the software, age, hearing status, and demographic characteristics are discussed. Overall, the VR game is a promising tool for tracking the development of auditory skills, as it provides multi-level longitudinal empirical data. Acknowledgment: This work is part of a project that has received funding from the Research Committee of the University of Macedonia under the Basic Research 2020-21 funding programme.
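As a rough illustration of how such responses could be scored, the Python sketch below classifies each touch as correct, false, or impulsive against a child-dependent valid-response window and summarizes accuracy and reaction time. The field names and window logic are assumptions for illustration; the abstract does not specify the exact classification rules, and the actual tool logs responses to an Excel database.

```python
from statistics import mean

# Hypothetical response log: one entry per stimulus presentation.
# Field names and the classification window are illustrative only.
responses = [
    {"touch_time": 1.8, "stimulus_offset": 1.2, "valid_delay": 3.0, "target_hit": True},
    {"touch_time": 0.4, "stimulus_offset": 1.2, "valid_delay": 3.0, "target_hit": True},
    {"touch_time": 2.5, "stimulus_offset": 1.2, "valid_delay": 3.0, "target_hit": False},
]

def classify(r):
    """Label a touch as impulsive, false, or correct using the valid-response window."""
    if r["touch_time"] < r["stimulus_offset"]:
        return "impulsive"                            # touched before the stimulus ended
    if r["touch_time"] > r["stimulus_offset"] + r["valid_delay"]:
        return "false"                                # touched after the window closed
    return "correct" if r["target_hit"] else "false"  # inside the window: depends on the target

labels = [classify(r) for r in responses]
accuracy = 100 * labels.count("correct") / len(labels)
rts = [r["touch_time"] - r["stimulus_offset"]
       for r, lab in zip(responses, labels) if lab == "correct"]
print(f"accuracy {accuracy:.1f}%, mean reaction time {mean(rts):.2f} s")
```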

Keywords: VR serious games, auditory rehabilitation, auditory training, children with cochlear implants

Procedia PDF Downloads 82
16 Risks for Cyanobacteria Harmful Algal Blooms in Georgia Piedmont Waterbodies Due to Land Management and Climate Interactions

Authors: Sam Weber, Deepak Mishra, Susan Wilde, Elizabeth Kramer

Abstract:

The frequency and severity of cyanobacteria harmful algal blooms (CyanoHABs) have been increasing over time, with point and non-point source eutrophication and shifting climate paradigms being blamed as the primary culprits. Excessive nutrients, warm temperatures, quiescent water, and heavy and less regular rainfall create more conducive environments for CyanoHABs. CyanoHABs have the potential to produce a spectrum of toxins that cause gastrointestinal stress, organ failure, and even death in humans and animals. To promote enhanced, proactive CyanoHAB management, risk modeling using geospatial tools can act as a predictive mechanism to supplement current CyanoHAB monitoring, management, and mitigation efforts. The risk maps would empower water managers to focus their efforts on high-risk waterbodies in an attempt to prevent CyanoHABs before they occur, and/or to observe those waterbodies more diligently. For this research, exploratory spatial data analysis techniques were used to identify the strongest predictors of CyanoHABs based on remote sensing-derived cyanobacteria cell density values for 771 waterbodies in the Georgia Piedmont and landscape characteristics of their watersheds. In-situ datasets for cyanobacteria cell density, nutrients, temperature, and rainfall patterns are not widely available, so free gridded geospatial datasets were used as proxy variables for assessing CyanoHAB risk. For example, the percent of a watershed that is agriculture was used as a proxy for nutrient loading, and the summer precipitation within a watershed was used as a proxy for water quiescence. Cyanobacteria cell density values were calculated using atmospherically corrected images from the European Space Agency’s Sentinel-2A satellite and multispectral instrument sensor at a 10-meter ground resolution. Seventeen explanatory variables were calculated for each watershed utilizing the multi-petabyte geospatial catalogs available within the Google Earth Engine cloud computing interface. The seventeen explanatory variables included land cover composition, winter and summer temperature and precipitation data, topographic derivatives, vegetation index anomalies, and soil characteristics. These variables were then used in a multiple linear regression model, and the strongest predictors of cyanobacteria cell density were selected for the final regression model. Watershed maximum summer temperature, percent agriculture, percent forest, percent impervious, and waterbody area emerged as the strongest predictors of cyanobacteria cell density, with an adjusted R-squared value of 0.31 and a p-value ~ 0. The final regression equation was used to make a normalized cyanobacteria cell density index, and a Jenks Natural Breaks classification was used to assign waterbodies designations of low, medium, or high risk. Of the 771 waterbodies, 24.38% were low risk, 37.35% were medium risk, and 38.26% were high risk. This study showed that there are significant relationships between cyanobacteria cell density and free geospatial datasets representing summer maximum temperatures, nutrient loading associated with land use and land cover, and waterbody area. This data analytics approach to CyanoHAB risk assessment corroborated the literature-established environmental triggers for CyanoHABs and presents a novel approach for CyanoHAB risk mapping in waterbodies across the greater southeastern United States.
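A minimal sketch of the pipeline described above, using synthetic placeholder values rather than the study's watershed variables: a multiple linear regression is fit, its predictions are rescaled into a normalized cyanobacteria cell density index, and the index is split into low, medium, and high risk classes. One-dimensional k-means is used here as a stand-in for the Jenks Natural Breaks classification.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.cluster import KMeans

# Synthetic watershed predictors (e.g., max summer temperature, % agriculture,
# % forest, % impervious, waterbody area) and a synthetic cell density response.
rng = np.random.default_rng(0)
X = rng.random((771, 5))
y = 2.0 * X[:, 0] + 1.5 * X[:, 1] - 1.0 * X[:, 2] + rng.normal(0, 0.5, 771)

model = LinearRegression().fit(X, y)                         # multiple linear regression
index = model.predict(X)
index = (index - index.min()) / (index.max() - index.min())  # normalized risk index, 0-1

# Jenks Natural Breaks is equivalent to optimal 1-D clustering; 3-cluster k-means
# stands in for it here to assign low / medium / high risk designations.
classes = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(index.reshape(-1, 1))
rank = np.argsort(np.argsort([index[classes == c].mean() for c in range(3)]))
labels = np.array(["low", "medium", "high"])[rank][classes]
print({lab: int((labels == lab).sum()) for lab in ("low", "medium", "high")})
```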

Keywords: cyanobacteria, land use/land cover, remote sensing, risk mapping

Procedia PDF Downloads 208
15 Understanding Systemic Barriers (and Opportunities) to Increasing Uptake of Subcutaneous Medroxy Progesterone Acetate Self-Injection in Health Facilities in Nigeria

Authors: Oluwaseun Adeleke, Samuel O. Ikani, Fidelis Edet, Anthony Nwala, Mopelola Raji, Simeon Christian Chukwu

Abstract:

Background: The DISC project collaborated with partners to implement demand creation and service delivery interventions, including the Moment of Truth (MoT) innovation, in over 500 health facilities across 15 states. This has increased the voluntary conversion rate to self-injection among women who opt for injectable contraception. While some facilities recorded an increasing trend in key performance indicators, a few others persistently performed sub-optimally due to provider and system-related barriers. Methodology: Twenty-two facilities performing sub-optimally were selected purposively from three Nigerian states. Low productivity was appraised using low reporting rates and poor self-injection (SI) conversion rates as indicators. Interviews were conducted with health providers across these health facilities using a rapid diagnosis tool. The project also conducted a data quality assessment that evaluated the veracity of data elements reported across the three major sources of family planning data in the facility. Findings: The inability, and sometimes refusal, of providers to support clients to self-inject effectively was associated with a misunderstanding of its value to their work experience. It was also observed that providers still held a strong influence over clients’ method choices. Furthermore, providers held biases and misconceptions about DMPA-SC that restricted the access of obese clients and new acceptors to services, a clear departure from the recommendations of the national guidelines. Additionally, quality-of-care standards were compromised because job aids were not used to inform service delivery. Facilities performing sub-optimally often under-reported DMPA-SC utilization data, and there were multiple uncoordinated responsibilities for recording and reporting. Additionally, data validation meetings were not regularly convened and were ineffective in authenticating data received from health facilities. Other reasons for sub-optimal performance included poor documentation and tracking of stock inventory resulting in commodity stockouts, low client flow because of poor positioning of health facilities, and ineffective messaging. Some facilities lacked adequate human and material resources to provide services effectively and received very few supportive supervision visits. Supportive supervision visits and Data Quality Audits have been useful in addressing the aforementioned performance barriers. The project has deployed digital DMPA-SC self-injection checklists that have been aligned with nationally approved templates. During visits, each provider and community mobilizer is accorded special attention by the supervisor until he/she can perform procedures in line with best practice (protocol). Conclusion: This narrative provides a summary of a range of factors that identify health facilities performing sub-optimally in their provision of DMPA-SC services. Findings from this assessment will be useful during project design to inform effective strategies. As the project enters its final stages of implementation, it is transitioning high-impact activities to state institutions in the quest to sustain the quality of service beyond the tenure of the project. The project has flagged activities, as well as created protocols and tools, aimed at placing state-level stakeholders at the forefront of improving productivity in health facilities.

Keywords: family planning, contraception, DMPA-SC, self-care, self-injection, barriers, opportunities, performance

Procedia PDF Downloads 75
14 Low-Cost Aviation Solutions to Strengthen Counter-Poaching Efforts in Kenya

Authors: Kuldeep Rawat, Michael O'Shea, Maureen McGough

Abstract:

The paper will discuss a National Institute of Justice (NIJ) funded project to provide cost-effective aviation technologies and research to support counter-poaching operations related to endangered, protected, and/or regulated wildlife. The goal of this project is to provide cost-effective aviation technology and research support to the Kenya Wildlife Service (KWS) in their counter-poaching efforts. In pursuit of this goal, Elizabeth City State University (ECSU) is assisting the National Institute of Justice (NIJ) in enhancing the Kenya Wildlife Service’s aviation technology and related capacity to meet its counter-poaching mission. Poaching, at its core, is systemic, as poachers go to the most extreme lengths to kill high-target species such as elephants and rhinos. These high-target wildlife species live in underdeveloped or impoverished nations, where poachers find fewer barriers to their operations. In Kenya, with fifty-nine (59) parks and reserves spread over an area of 225,830 square miles (584,897 square kilometers), adequate surveillance on the ground is next to impossible. Cost-effective aviation surveillance technologies, based on a comprehensive needs assessment and operational evaluation, are needed to curb poaching and effectively prevent wildlife trafficking. As one of the premier law enforcement Air Wings in East Africa, KWS plays a crucial role in Kenya, not only in counter-poaching and wildlife conservation efforts, but in aerial surveillance, counterterrorism, and national security efforts as well. While the Air Wing has done a remarkable job conducting aerial patrols with limited resources, additional aircraft and upgraded technology should significantly advance the Air Wing’s ability to achieve its wildlife protection mission. The project includes: (i) a Needs Assessment of the KWS Air Wing, to include the identification of resources, current and prospective capacity, operational challenges, and priority goals for expansion, (ii) Acquisition of Low-Cost Aviation Technology to meet priority needs, and (iii) Operational Evaluation of technology performance, with a focus on implementation and effectiveness. The Needs Assessment reflects the priorities identified through two site visits to the KWS Air Wing in Nairobi, Kenya, as well as field visits to multiple national parks receiving aerial support and interviews and surveys with KWS Air Wing pilots and leadership. The Needs Assessment identified several immediate technology needs, including: GPS upgrades, including a weather application; night-flying capabilities, including runway lights and night vision technology; cameras and surveillance equipment; a flight tracking system and/or Emergency Position Indicating Radio Beacon; lightweight ballistic-resistant body armor; and medical equipment, including a customized stretcher and standard medical evacuation equipment. Results of this assessment, along with significant input from the KWS Air Wing, will guide the second phase of this project: technology acquisition. Acquired technology will then be evaluated in the field, with a focus on implementation and effectiveness. Results will ultimately be translated for rural or tribal law enforcement agencies with comparable aerial surveillance missions, operational environments, and jurisdictional challenges that seek to implement low-cost aviation technology. Results from the Needs Assessment phase, including survey results, as well as the ongoing technology acquisition and baseline operational evaluation, will be discussed in the paper.

Keywords: aerial surveillance mission, aviation technology, counter-poaching, wildlife protection

Procedia PDF Downloads 270
13 A Human Factors Approach to Workload Optimization for On-Screen Review Tasks

Authors: Christina Kirsch, Adam Hatzigiannis

Abstract:

Rail operators and maintainers worldwide are increasingly replacing walking patrols in the rail corridor with mechanized track patrols (essentially data capture on trains) and on-screen reviews of track infrastructure in centralized review facilities. The benefit is that infrastructure workers are less exposed to the dangers of the rail corridor. The impact is a significant change in work design, from walking track sections and direct observation in the real world to sedentary jobs in the review facility reviewing captured data on screens. Defects in rail infrastructure can have catastrophic consequences. Reviewer performance regarding accuracy and efficiency of reviews within the available time frame is essential to ensure safety and operational performance. Rail operators must optimize workload and resource loading to transition to on-screen reviews successfully. Therefore, they need to know what workload assessment methodologies will provide reliable and valid data to optimize resourcing for on-screen reviews. This paper compares objective workload measures, including track difficulty ratings and review distance covered per hour, with subjective workload assessments (NASA TLX) and analyses the link between workload and reviewer performance, including sensitivity, precision, and overall accuracy. An experimental study was completed with eight on-screen reviewers, including infrastructure workers and engineers, reviewing track sections with different levels of track difficulty over nine days. Each day the reviewers completed four 90-minute sessions of on-screen inspection of the track infrastructure. Data regarding the speed of review (km/hour), detected defects, false negatives, and false positives were collected. Additionally, all reviewers completed a subjective workload assessment (NASA TLX) after each 90-minute session and a short employee engagement survey at the end of the study period that captured impacts on job satisfaction and motivation. The results showed that objective measures of track difficulty align with subjective mental demand, temporal demand, effort, and frustration in the NASA TLX. Interestingly, review speed correlated with subjective assessments of physical and temporal demand, but not with mental demand. Subjective performance ratings correlated with all accuracy measures and review speed. The results showed that subjective NASA TLX workload assessments accurately reflect objective workload. The analysis of the impact of workload on performance showed that subjective mental demand correlated with high precision, that is, with accurately detected defects rather than false positives. Conversely, high temporal demand was negatively correlated with sensitivity, the percentage of existing defects that were detected. Review speed was significantly correlated with false negatives. With an increase in review speed, accuracy declined. On the other hand, review speed correlated with subjective performance assessments. Reviewers thought their performance was higher when they reviewed the track sections faster, despite the decline in accuracy. The study results were used to optimize resourcing and ensure that reviewers had enough time to review the allocated track sections to improve defect detection rates in accordance with the efficiency-thoroughness trade-off.
Overall, the study showed the importance of a multi-method approach to workload assessment and optimization, combining subjective workload assessments with objective workload and performance measures to ensure that recommendations for work system optimization are evidence-based and reliable.
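As an illustration of the workload-versus-performance analysis described, the sketch below correlates per-session NASA TLX subscales and review speed with accuracy measures. The numbers are fabricated placeholders, not the study's data (which came from eight reviewers over nine days of 90-minute sessions).

```python
import pandas as pd

# Fabricated per-session records: NASA TLX subscales (0-100), review speed (km/h),
# and accuracy measures derived from detected vs. existing defects.
sessions = pd.DataFrame({
    "mental_demand":   [55, 60, 70, 40, 65, 50],
    "temporal_demand": [45, 70, 80, 35, 60, 55],
    "review_speed":    [12, 18, 22, 10, 17, 14],
    "sensitivity":     [0.92, 0.85, 0.78, 0.95, 0.83, 0.90],  # share of existing defects found
    "precision":       [0.88, 0.86, 0.80, 0.90, 0.84, 0.87],  # share of flagged defects that are real
})

# Pearson correlations between workload measures and accuracy, mirroring the
# workload-versus-performance analysis reported in the abstract.
workload = ["mental_demand", "temporal_demand", "review_speed"]
accuracy = ["sensitivity", "precision"]
print(sessions.corr(method="pearson").loc[workload, accuracy])
```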

Keywords: automation, efficiency-thoroughness trade-off, human factors, job design, NASA TLX, performance optimization, subjective workload assessment, workload analysis

Procedia PDF Downloads 116
12 Establishment of a Classifier Model for Early Prediction of Acute Delirium in the Adult Intensive Care Unit Using Machine Learning

Authors: Pei Yi Lin

Abstract:

Objective: The objective of this study is to use machine learning methods to build an early prediction classifier model for acute delirium and thereby improve the quality of medical care for intensive care patients. Background: Delirium is a common acute and sudden disturbance of consciousness in critically ill patients. Once it occurs, it tends to prolong the length of hospital stay and increase medical costs and mortality. In 2021, the incidence of delirium in the internal medicine intensive care unit was as high as 59.78%, which indirectly prolonged the average length of hospital stay by 8.28 days, and the associated mortality rate was about 2.22% over the past three years. Therefore, this study aims to build a delirium prediction classifier through big data analysis and machine learning methods to detect delirium early. Method: This is a retrospective study that used an artificial intelligence big data database to extract the characteristic factors related to delirium in intensive care unit patients for machine learning. The study included patients aged over 20 years who were admitted to the intensive care unit between May 1, 2022, and December 31, 2022, excluding those with a GCS assessment of less than 4 points, those admitted to the ICU for less than 24 hours, and those without a CAM-ICU evaluation. CAM-ICU delirium assessment results collected every 8 hours within 30 days of hospitalization were regarded as events, and the cumulative data from ICU admission to the prediction time point were extracted to predict the possibility of delirium occurring in the next 8 hours. A total of 63,754 case records were collected, and 12 features were selected to train the model, including age, sex, average ICU stay hours, visual and auditory abnormalities, RASS assessment score, APACHE-II score, number of indwelling invasive catheters, restraint use, and sedative and hypnotic drugs. After feature data cleaning, processing, and supplementation with the KNN interpolation method, a total of 54,595 case events were available for machine learning model analysis. Events from May 1 to November 30, 2022, were used as the model training data, of which 80% formed the training set and 20% the internal verification set, while events from December 1 to December 31, 2022, formed the external verification set. Model inference and performance evaluation were then performed, and the model was retrained by adjusting the model parameters. Results: In this study, four machine learning models were analyzed and compared: XGBoost, Random Forest, Logistic Regression, and Decision Tree. The average accuracy of internal verification was highest for Random Forest (AUC = 0.86); the average accuracy of external verification was highest for Random Forest and XGBoost (AUC = 0.86); and the average accuracy of cross-validation was highest for Random Forest (ACC = 0.77). Conclusion: Clinically, medical staff usually conduct CAM-ICU assessments at the bedside of critically ill patients, but there is a lack of machine learning classification methods to assist with real-time assessment of ICU patients, so clinical staff cannot draw on more objective and continuous monitoring data to accurately identify and predict the occurrence of delirium. It is hoped that the development of predictive models through machine learning can predict delirium early, support clinical decisions at the best time, and work alongside PADIS delirium care measures to provide individualized non-pharmacological interventions that maintain patient safety and improve the quality of care.
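A simplified sketch of the modelling pipeline described, using synthetic stand-in data rather than the ICU records: KNN imputation of missing features, an 80/20 split for training and internal verification, and a Random Forest evaluated by AUC. The study additionally compared XGBoost, Logistic Regression, and Decision Tree models and validated externally on a later month of events; only the Random Forest arm is sketched here.

```python
import numpy as np
from sklearn.impute import KNNImputer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Synthetic stand-in for the 8-hourly ICU feature snapshots (age, sex, RASS score,
# APACHE-II score, indwelling catheters, restraint, sedatives, ...): not real data.
rng = np.random.default_rng(42)
X = rng.normal(size=(5000, 12))
X[rng.random(X.shape) < 0.05] = np.nan               # simulate missing values
y = (np.nan_to_num(X[:, 2]) > 0.8).astype(int)       # synthetic "delirium in next 8 h" label

X = KNNImputer(n_neighbors=5).fit_transform(X)       # KNN interpolation of missing features

# 80% training / 20% internal verification split; the study also held out a later
# month of events as an external verification set.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("internal verification AUC:", round(roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]), 2))
```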

Keywords: critically ill patients, machine learning methods, delirium prediction, classifier model

Procedia PDF Downloads 66
11 Leveraging Digital Transformation Initiatives and Artificial Intelligence to Optimize Readiness and Simulate Mission Performance across the Fleet

Authors: Justin Woulfe

Abstract:

Siloed logistics and supply chain management systems throughout the Department of Defense (DoD) have led to disparate approaches to modeling and simulation (M&S), a lack of understanding of how one system impacts the whole, and issues with “optimal” solutions that are good for one organization but have dramatic negative impacts on another. Many different systems have evolved to try to understand and account for uncertainty and try to reduce the consequences of the unknown. As the DoD undertakes expansive digital transformation initiatives, there is an opportunity to fuse and leverage traditionally disparate data into a centrally hosted source of truth. With a streamlined process incorporating machine learning (ML) and artificial intelligence (AI), advanced M&S will enable informed decisions guiding program success via optimized operational readiness and improved mission success. One of the current challenges is to leverage the terabytes of data generated by monitored systems to provide actionable information for all levels of users. The implementation of a cloud-based application analyzing data transactions, learning and predicting future states from current and past states in real-time, and communicating those anticipated states is an appropriate solution for the purposes of reduced latency and improved confidence in decisions. Decisions made from an ML and AI application combined with advanced optimization algorithms will improve the mission success and performance of systems, which will improve the overall cost and effectiveness of any program. The Systecon team constructs and employs model-based simulations, cutting across traditional silos of data, aggregating maintenance and supply data, incorporating sensor information, and applying optimization and simulation methods to an as-maintained digital twin with the ability to aggregate results across a system’s lifecycle and across logical and operational groupings of systems. This coupling of data throughout the enterprise enables tactical, operational, and strategic decision support, detachable and deployable logistics services, and configuration-based automated distribution of digital technical and product data to enhance supply and logistics operations. As a complete solution, this approach significantly reduces program risk by allowing flexible configuration of data, data relationships, business process workflows, and early test and evaluation, especially budget trade-off analyses. A true capability to tie resources (dollars) to weapon system readiness in alignment with the real-world scenarios a warfighter may experience has been an objective yet to be realized to date. By developing and solidifying an organic capability to directly relate dollars to readiness and to inform the digital twin, the decision-maker is now empowered through valuable insight and traceability. This type of educated decision-making provides an advantage over the adversaries who struggle with maintaining system readiness at an affordable cost. The M&S capability developed allows program managers to independently evaluate system design and support decisions by quantifying their impact on operational availability and operations and support cost, resulting in the ability to simultaneously optimize readiness and cost. This will allow the stakeholders to make data-driven decisions when trading cost and readiness throughout the life of the program.
Finally, sponsors are available to validate product deliverables with efficiency and much higher accuracy than in previous years.

Keywords: artificial intelligence, digital transformation, machine learning, predictive analytics

Procedia PDF Downloads 155
10 Towards Dynamic Estimation of Residential Building Energy Consumption in Germany: Leveraging Machine Learning and Public Data from England and Wales

Authors: Philipp Sommer, Amgad Agoub

Abstract:

The construction sector significantly impacts global CO₂ emissions, particularly through the energy usage of residential buildings. To address this, various governments, including Germany's, are focusing on reducing emissions via sustainable refurbishment initiatives. This study examines the application of machine learning (ML) to estimate energy demands dynamically in residential buildings and enhance the potential for large-scale sustainable refurbishment. A major challenge in Germany is the lack of extensive publicly labeled datasets for energy performance, as energy performance certificates, which provide critical data on building-specific energy requirements and consumption, are not available for all buildings or require on-site inspections. Conversely, England and other countries in the European Union (EU) have rich public datasets, providing a viable alternative for analysis. This research adapts insights from these English datasets to the German context by developing a comprehensive data schema and calibration dataset capable of predicting building energy demand effectively. The study proposes a minimal feature set, determined through feature importance analysis, to optimize the ML model. Findings indicate that ML significantly improves the scalability and accuracy of energy demand forecasts, supporting more effective emissions reduction strategies in the construction industry. Integrating energy performance certificates into municipal heat planning in Germany highlights the transformative impact of data-driven approaches on environmental sustainability. The goal is to identify and utilize key features from open data sources that significantly influence energy demand, creating an efficient forecasting model. Using Extreme Gradient Boosting (XGB) and data from energy performance certificates, effective features such as building type, year of construction, living space, insulation level, and building materials were incorporated. These were supplemented by data derived from descriptions of roofs, walls, windows, and floors, integrated into three datasets. The emphasis was on features accessible via remote sensing, which, along with other correlated characteristics, greatly improved the model's accuracy. The model was further validated using SHapley Additive exPlanations (SHAP) values and aggregated feature importance, which quantified the effects of individual features on the predictions. The refined model using remote sensing data showed a coefficient of determination (R²) of 0.64 and a mean absolute error (MAE) of 4.12, indicating predictions based on efficiency class 1-100 (G-A) may deviate by 4.12 points. This R² increased to 0.84 with the inclusion of more samples, with wall type emerging as the most predictive feature. After optimizing and incorporating related features like estimated primary energy consumption, the R² score for the training and test set reached 0.94, demonstrating good generalization. The study concludes that ML models significantly improve prediction accuracy over traditional methods, illustrating the potential of ML in enhancing energy efficiency analysis and planning. This supports better decision-making for energy optimization and highlights the benefits of developing and refining data schemas using open data to bolster sustainability in the building sector. 
The study underscores the importance of supporting open data initiatives to collect similar features and support the creation of comparable models in Germany, enhancing the outlook for environmental sustainability.
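A compact sketch of the modelling approach described, assuming the xgboost and shap packages are available and using fabricated records in place of the English and Welsh energy performance certificate data: an XGBoost regressor predicts the 1-100 (G-A) efficiency score, R² and MAE are reported, and mean absolute SHAP values rank feature importance.

```python
import numpy as np
import pandas as pd
import xgboost as xgb
import shap
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_absolute_error

# Fabricated records standing in for EPC-derived features; names follow the abstract.
rng = np.random.default_rng(1)
n = 2000
X = pd.DataFrame({
    "year_of_construction": rng.integers(1900, 2020, n),
    "living_space_m2": rng.uniform(40, 250, n),
    "insulation_level": rng.integers(0, 4, n),
    "wall_type": rng.integers(0, 5, n),
    "building_type": rng.integers(0, 3, n),
})
# Synthetic efficiency score on the 1-100 (G-A) scale used by the certificates.
y = (20 + 0.2 * (X["year_of_construction"] - 1900) + 10 * X["insulation_level"]
     - 3 * X["wall_type"] + rng.normal(0, 5, n)).clip(1, 100)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = xgb.XGBRegressor(n_estimators=300, max_depth=4, learning_rate=0.1).fit(X_tr, y_tr)

pred = model.predict(X_te)
print("R2:", round(r2_score(y_te, pred), 2), "MAE:", round(mean_absolute_error(y_te, pred), 2))

# Mean absolute SHAP values rank how strongly each feature drives the predictions,
# as used in the study to validate the feature set.
shap_values = shap.TreeExplainer(model).shap_values(X_te)
print(pd.Series(np.abs(shap_values).mean(axis=0), index=X.columns).sort_values(ascending=False))
```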

Keywords: machine learning, remote sensing, residential building, energy performance certificates, data-driven, heat planning

Procedia PDF Downloads 51
9 Advancing Dialysis Care Access and Health Information Management: A Blueprint for Nairobi Hospital

Authors: Kimberly Winnie Achieng Otieno

Abstract:

The Nairobi Hospital plays a pivotal role in healthcare provision in East and Central Africa, yet it faces challenges in providing accessible dialysis care. This paper explores strategic interventions to enhance dialysis care, improve access, and streamline health information management, with the aim of fostering an integrated and patient-centered healthcare system in our region. Challenges at The Nairobi Hospital: The Nairobi Hospital currently grapples with insufficient dialysis machines, which results in extended turnaround times. This issue stems from both staffing bottlenecks and infrastructural limitations, given our growing demand for renal care services. Our paper-based record-keeping system and the fragmented flow of information downstream hinder the hospital’s ability to manage health data effectively. There is also a need for investment in expanding The Nairobi Hospital dialysis facilities to far-reaching communities. Setting up satellite clinics that are closer to people who live in areas far from the main hospital will ensure better access to underserved areas. Community Outreach and Education: Implementing education programs on kidney health within local communities is vital for early detection and prevention. Collaborating with local leaders and organizations can establish a proactive approach to renal health, hence reducing the demand for acute dialysis interventions. We can amplify this effort by expanding The Nairobi Hospital’s corporate social responsibility outreach program with weekend engagement activities such as walks, awareness classes, and fund drives. Enhancing Efficiency in Dialysis Care: Demand for dialysis services continues to rise due to an aging Kenyan population and the increasing prevalence of chronic kidney disease (CKD). Present at this year’s International Nursing Conference are a diverse group of caregivers from around the world who can share with us their process optimization strategies, patient engagement techniques, and resource utilization efficiencies to catapult The Nairobi Hospital into the 21st century and beyond. Plans are underway to offer ongoing education opportunities to keep staff updated on best practices and emerging technologies, in addition to utilizing a patient feedback mechanism to identify areas for improvement and enhance satisfaction. Staff empowerment and suggestion boxes address The Nairobi Hospital’s organizational challenges. Current financial constraints may limit a leapfrog in technology integration, such as the acquisition of new dialysis machines and an investment in predictive analytics to forecast patient needs and optimize resource allocation. Streamlining Health Information Management: Fully embracing a shift to 100% Electronic Health Records (EHRs) is a transformative step toward efficient health information management. Shared information promotes a holistic understanding of patients’ medical history, minimizing redundancies and enhancing overall care quality. To manage the transition to community-based care and EHRs effectively, a phased implementation approach is recommended. Conclusion: By strategically enhancing dialysis care access and streamlining health information management, The Nairobi Hospital can strengthen its position as a leading healthcare institution in both East and Central Africa. This comprehensive approach aligns with the hospital’s commitment to providing high-quality, accessible, and patient-centered care in an evolving landscape of healthcare delivery.

Keywords: Africa, urology, dialysis, healthcare

Procedia PDF Downloads 53
8 The Use of Rule-Based Cellular Automata to Track and Forecast the Dispersal of Classical Biocontrol Agents at Scale, with an Application to the Fopius arisanus Fruit Fly Parasitoid

Authors: Agboka Komi Mensah, John Odindi, Elfatih M. Abdel-Rahman, Onisimo Mutanga, Henri Ez Tonnang

Abstract:

Ecosystems are networks of organisms and populations that form a community of various species interacting within their habitats. Such habitats are defined by abiotic and biotic conditions that establish the initial limits to a population's growth, development, and reproduction. The habitat’s conditions explain the context in which species interact to access resources such as food, water, space, shelter, and mates, allowing for feeding, dispersal, and reproduction. Dispersal is an essential life-history strategy that affects gene flow, resource competition, population dynamics, and species distributions. Despite the importance of dispersal in population dynamics and survival, understanding the mechanisms underpinning the dispersal of organisms remains challenging. For instance, when an organism moves into an ecosystem for survival and resource competition, its progression is highly influenced by extrinsic factors such as its physiological state, climatic variables, and its ability to evade predation. Therefore, greater spatial detail is necessary to understand organism dispersal dynamics. Organism dispersal can be addressed using empirical and mechanistic modelling approaches, with the adopted approach depending on the study's purpose. Cellular automata (CA) are an example of these approaches and have been successfully used in biological studies to analyze the dispersal of living organisms. A cellular automaton can be briefly described as a grid of cells, each of which may be occupied by an individual, that evolves over time according to a set of rules based on the states of neighbouring cells. However, for modelling the dispersal of individual organisms at the landscape scale, we lack user-friendly tools that do not require expertise in mathematical modelling and computing, such as a visual analytics framework for tracking and forecasting the dispersal behaviour of organisms. The term "visual analytics" (VA) describes a semiautomated approach to electronic data processing that is guided by users who can interact with data via an interface. Essentially, VA converts large amounts of quantitative or qualitative data into graphical formats that can be customized based on the operator's needs. Additionally, this approach can be used to enhance the ability of users from various backgrounds to understand data, communicate results, and disseminate information across a wide range of disciplines. To support effective analysis of the dispersal of organisms at the landscape scale, we therefore designed Pydisp, a free visual data analytics tool for spatiotemporal dispersal modeling built in Python. Its user interface allows users to perform quick and interactive spatiotemporal analyses of species dispersal using bioecological and climatic data. Pydisp enables reuse and upgrading through the use of simple principles such as fuzzy cellular automata algorithms. The potential of dispersal modeling is demonstrated in a case study predicting the dispersal of Fopius arisanus (Sonan), an endoparasitoid used to control Bactrocera dorsalis (Hendel) (Diptera: Tephritidae), in Kenya. The results obtained from our example clearly illustrate the parasitoid's dispersal process at the landscape level and confirm that dynamic processes in an agroecosystem are better understood when designed using mechanistic modelling approaches. Furthermore, as demonstrated in the example, the software is highly effective in portraying the dispersal of organisms despite the unavailability of detailed data on the species' dispersal mechanisms.
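A toy sketch of the rule-based cellular automaton idea, not the Pydisp implementation itself (which uses fuzzy cellular automata driven by bioecological and climatic data): occupancy spreads from a release point to neighbouring cells whose habitat suitability exceeds a threshold.

```python
import numpy as np

rng = np.random.default_rng(7)
size = 50
suitability = rng.random((size, size))       # stand-in for habitat suitability (0-1)
occupied = np.zeros((size, size), dtype=bool)
occupied[size // 2, size // 2] = True         # parasitoid release point

def step(occupied, suitability, threshold=0.4):
    """One CA update: a cell becomes occupied if any of its 8 neighbours is
    occupied and its habitat suitability exceeds the threshold."""
    padded = np.pad(occupied, 1)
    neighbours = np.zeros((size, size), dtype=int)
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            if di or dj:
                neighbours += padded[1 + di:1 + di + size, 1 + dj:1 + dj + size]
    return occupied | ((neighbours > 0) & (suitability > threshold))

for _ in range(20):                            # simulate 20 dispersal time steps
    occupied = step(occupied, suitability)
print("cells colonised after 20 steps:", int(occupied.sum()))
```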

Keywords: cellular automata, fuzzy logic, landscape, spatiotemporal

Procedia PDF Downloads 75
7 SEAWIZARD: Multiplex AI-Enabled Graphene-Based Lab-On-Chip Sensing Platform for Heavy Metal Ion Monitoring in Marine Water

Authors: M. Moreno, M. Alique, D. Otero, C. Delgado, P. Lacharmoise, L. Gracia, L. Pires, A. Moya

Abstract:

Marine environments are increasingly threatened by heavy metal contamination, including mercury (Hg), lead (Pb), and cadmium (Cd), posing significant risks to ecosystems and human health. Traditional monitoring techniques often fail to provide the spatial and temporal resolution needed for real-time detection of these contaminants, especially in remote or harsh environments. SEAWIZARD addresses these challenges by leveraging the flexibility, adaptability, and cost-effectiveness of printed electronics with the integration of microfluidics to develop a compact, portable, and reusable sensor platform designed specifically for real-time monitoring of heavy metal ions in seawater. The SEAWIZARD sensor is a multiparametric Lab-on-Chip (LoC) device, a miniaturized system that integrates several laboratory functions into a single chip, drastically reducing sample volumes and improving adaptability. This platform integrates three screen-printed graphene electrodes for the simultaneous detection of Hg, Cd and Pb via square wave voltammetry. These electrodes share the reference and the counter electrodes to improve space efficiency. Additionally, it integrates printed pH and temperature sensors to correct environmental interferences that may impact the accuracy of metal detection. The pH sensor is based on a carbon electrode with electrodeposited iridium oxide, while the temperature sensor is graphene-based. A protective dielectric layer is printed on top of the sensor to safeguard it in harsh marine conditions. The use of flexible polyethylene terephthalate (PET) as the substrate enables the sensor to conform to various surfaces and operate in challenging environments. One of the key innovations of SEAWIZARD is its integrated microfluidic layer fabricated from cyclic olefin copolymer (COC). This microfluidic component allows a controlled flow of seawater over the sensing area, enabling significantly improved detection limits compared to direct water sampling. The system’s dual-channel design separates the detection of heavy metals from the measurement of pH and temperature, ensuring that each parameter is measured under optimal conditions. In addition, the temperature sensor is finely tuned with a serpentine-shaped microfluidic channel to ensure precise thermal measurements. SEAWIZARD also incorporates custom electronics that allow for wireless data transmission via Bluetooth, facilitating rapid data collection and user interface integration. Embedded artificial intelligence further enhances the platform by providing an automated alarm system capable of detecting predefined metal concentration thresholds and issuing warnings when limits are exceeded. This predictive feature enables early warnings of potential environmental disasters, such as industrial spills or toxic levels of heavy metal pollutants, making SEAWIZARD not just a detection tool but a comprehensive monitoring and early intervention system. In conclusion, SEAWIZARD represents a significant advancement in printed electronics applied to environmental sensing. By combining flexible, low-cost materials with advanced microfluidics, custom electronics, and AI-driven intelligence, SEAWIZARD offers a highly adaptable and scalable solution for real-time, high-resolution monitoring of heavy metals in marine environments.
Its compact and portable design makes it an accessible, user-friendly tool with the potential to transform water quality monitoring practices and provide critical data to protect marine ecosystems from contamination-related risks.
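The threshold alarm can be pictured with a few lines of logic, sketched below; the limits and field names are placeholders, not the regulatory thresholds or the AI model embedded in the SEAWIZARD electronics.

```python
# Illustrative alarm logic only; the limits below are placeholders, not regulatory
# thresholds, and this is not the AI model embedded in the SEAWIZARD electronics.
LIMITS_UG_L = {"Hg": 0.07, "Pb": 1.3, "Cd": 0.2}     # hypothetical seawater limits, in µg/L

def check_sample(concentrations):
    """Return the metals whose measured concentration exceeds its predefined limit."""
    return {metal: value for metal, value in concentrations.items()
            if value > LIMITS_UG_L.get(metal, float("inf"))}

reading = {"Hg": 0.05, "Pb": 2.1, "Cd": 0.15}         # concentrations derived from voltammetry peaks
alarms = check_sample(reading)
if alarms:
    print("ALARM - limits exceeded:", alarms)          # would be pushed to the user over Bluetooth
```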

Keywords: lab-on-chip, printed electronics, real-time monitoring, microfluidics, heavy metal contamination

Procedia PDF Downloads 5
6 Developing a Place-Name Gazetteer for Singapore by Mining Historical Planning Archives and Selective Crowd-Sourcing

Authors: Kevin F. Hsu, Alvin Chua, Sarah X. Lin

Abstract:

In Singapore’s multilingual society, names for different parts of the city have changed over time. Residents included Indigenous Malays, dialect-speakers from China, European settler-colonists, and Tamil-speakers from South India. Each group would name locations in their own languages. Today, as ancestral tongues are increasingly supplanted by English, contemporary Singaporeans’ understanding of once-common place names is disappearing. After demolition or redevelopment, some urban places will only exist in archival records or in human memory. United Nations conferences on the standardization of geographic names have called attention to how place names relate to identity, well-being, and a sense of belonging. The Singapore Place-Naming Project responds to these imperatives by capturing past and present place names through digitizing historical maps, mining archival records, and applying selective crowd-sourcing to trace the evolution of place names throughout the city. The project ensures that both formal and vernacular geographical names remain accessible to historians, city planners, and the public. The project is compiling a gazetteer, a geospatial archive of place names, with streets, buildings, landmarks, and other points of interest (POI) appearing in the historic maps and planning documents of Singapore, currently held by the National Archives of Singapore, the National Library Board, university departments, and the Urban Redevelopment Authority. To create a spatial layer of information, the project links each place name to either a geo-referenced point, line segment, or polygon, along with the original source material in which the name appears. This record is supplemented by crowd-sourced contributions from civil service officers and heritage specialists, drawing from their collective memory to (1) define geospatial boundaries of historic places that appear in past documents but may be unfamiliar to users today, and (2) identify and record vernacular place names not captured in formal planning documents. An intuitive interface allows participants to demarcate feature classes, vernacular phrasings, time periods, and other knowledge related to historical or forgotten spaces. Participants are stratified into age bands and ethnicity to improve representativeness. Future iterations could allow additional public contributions. Names reveal meanings that communities assign to each place. While existing historical maps of Singapore allow users to toggle between present-day and historical raster files, this project goes a step further by adding layers of social understanding and planning documents. Tracking place names illuminates linguistic, cultural, commercial, and demographic shifts in Singapore, in the context of transformations of the urban environment. The project also demonstrates how a moderated, selectively crowd-sourced effort can solicit useful geospatial data at scale, sourced from different generations, and at higher granularity than traditional surveys, while mitigating negative impacts of unmoderated crowd-sourcing.
Stakeholder agencies believe the project will achieve several objectives, including supporting heritage conservation and public education; safeguarding intangible cultural heritage; providing historical context for street, place, or development-renaming requests; enhancing place-making with deeper historical knowledge; facilitating emergency and social services by tagging legal addresses to vernacular place names; and encouraging public engagement with heritage by eliciting multi-stakeholder input.
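One way to picture a single gazetteer record, linking a name to a geo-referenced geometry, its source, language, feature class, and time period, is sketched below as a GeoJSON-style feature; the field names and values are hypothetical, not the project's actual schema.

```python
import json

# Hypothetical gazetteer entry; field names and values are illustrative, not the
# project's actual schema or data.
entry = {
    "type": "Feature",
    "geometry": {"type": "Point", "coordinates": [103.859, 1.302]},
    "properties": {
        "name": "Kampong Glam",
        "language": "Malay",
        "name_variants": ["Kampung Gelam"],
        "feature_class": "district",
        "time_period": {"from": 1822, "to": None},    # None = still in use
        "source": "historical planning map (archival reference)",
        "contributed_by": "crowd-sourced (heritage specialist)",
    },
}
print(json.dumps(entry, indent=2, ensure_ascii=False))
```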

Keywords: collective memory, crowd-sourced, digital heritage, geospatial, geographical names, linguistic heritage, place-naming, Singapore, Southeast Asia

Procedia PDF Downloads 121
5 Unleashing Potential in Pedagogical Innovation for STEM Education: Applying Knowledge Transfer Technology to Guide a Co-Creation Learning Mechanism for the Lingering Effects Amid COVID-19

Authors: Lan Cheng, Harry Qin, Yang Wang

Abstract:

Background: COVID-19 has induced the largest digital learning experiment in history. There is also emerging research evidence that students have paid a high cost in learning loss from virtual learning. University-wide survey results demonstrate that digital learning remains difficult for students who struggle with learning challenges, isolation, or a lack of resources. Large-scale efforts are therefore increasingly utilized for digital education. To better prepare students in higher education for this grand scientific and technological transformation, STEM education has been prioritized and promoted as a strategic imperative in the ongoing curriculum reform essential for unfinished learning needs and whole-person development. Building upon five key elements identified in the STEM education literature (Problem-based Learning, Community and Belonging, Technology Skills, Personalization of Learning, and Connection to the External Community), this case study explores the potential of pedagogical innovation that integrates computational and experimental methodologies to support, enrich, and navigate STEM education. Objectives: The goal of this case study is to create a high-fidelity prototype design for STEM education with knowledge transfer technology that contains a Cooperative Multi-Agent System (CMAS), with the objectives of (1) conducting assessment to reveal a virtual learning mechanism and establish strategies to facilitate scientific learning engagement, accessibility, and connection within and beyond the university setting, (2) exploring and validating an interactional co-creation approach embedded in project-based learning activities under the STEM learning context, which is being transformed by both digital technology and student behavior change, and (3) formulating and implementing a STEM-oriented campaign to guide learning network mapping, mitigate the loss of learning, enhance the learning experience, and scale up inclusive participation. Methods: This study applied a case study strategy and a methodology informed by Social Network Analysis Theory within a cross-disciplinary communication paradigm (students, peers, educators). Knowledge transfer technology is introduced to address learning challenges and to increase the efficiency of Reinforcement Learning (RL) algorithms. A co-creation learning framework was identified and investigated in a context-specific way with a learning analytics tool designed in this study. Findings: The results show that (1) CMAS-empowered learning support reduced students’ confusion, difficulties, and gaps during problem-solving scenarios while increasing learner capacity and empowerment, (2) the co-creation learning phenomenon, examined through the lens of the campaign, reveals that an interactive virtual learning environment helps students navigate scientific challenges independently and collaboratively, and (3) the deliverables of the STEM educational campaign provide a methodological framework both within the context of curriculum design and for external community engagement applications. Conclusion: This study brings a holistic and coherent pedagogy to cultivate students’ interest in STEM and develop a knowledge base for integrating and applying knowledge across different STEM disciplines.
Through the co-design of cross-disciplinary educational content and campaign promotion, the findings suggest factors that empower evidence-based learning practice while also piloting and tracking the scholastic value of co-creation in a dynamic learning environment. The data nested under the knowledge transfer technology situate learners’ scientific journeys and could pave the way for theoretical advancement and broader scientific endeavors within larger datasets, projects, and communities.

Keywords: co-creation, cross-disciplinary, knowledge transfer, STEM education, social network analysis

Procedia PDF Downloads 110
4 Improving Patient Journey in the Obstetrics and Gynecology Emergency Department: A Comprehensive Analysis of Patient Experience

Authors: Lolwa Alansari, Abdelhamid Azhaghdani, Sufia Athar, Hanen Mrabet, Annaliza Cruz, Tamara Alshadafat, Almunzer Zakaria

Abstract:

Introduction: Improving the patient experience is a fundamental pillar of healthcare's quadruple aims. Recognizing the importance of patient experiences and perceptions in healthcare interactions is pivotal for driving quality improvement. This abstract centers around the Patient Experience Program, an endeavor crafted with the purpose of comprehending and elevating the experiences of patients in the Obstetrics & Gynecology Emergency Department (OB/GYN ED). Methodology: This comprehensive endeavor unfolded through a structured sequence of phases following Plan-Do-Study-Act (PDSA) model, spanning over 12 months, focused on enhancing patient experiences in the Obstetrics & Gynecology Emergency Department (OB/GYN ED). The study meticulously examined the journeys of patients with acute obstetrics and gynecological conditions, collecting data from over 100 participants monthly. The inclusive approach covered patients of different priority levels (1-5) admitted for acute conditions, with no exclusions. Historical data from March and April 2022 serves as a benchmark for comparison, strengthening causality claims by providing a baseline understanding of OB/GYN ED performance before interventions. Additionally, the methodology includes the incorporation of staff engagement surveys to comprehensively understand the experiences of healthcare professionals with the implemented improvements. Data extraction involved administering open-ended questions and comment sections to gather rich qualitative insights. The survey covered various aspects of the patient journey, including communication, emotional support, timely access to care, care coordination, and patient-centered decision-making. The project's data analysis utilized a mixed-methods approach, combining qualitative techniques to identify recurring themes and extract actionable insights and quantitative methods to assess patient satisfaction scores and relevant metrics over time, facilitating the measurement of intervention impact and longitudinal tracking of changes. From the themes we discovered in both the online and in-person patient experience surveys, several key findings emerged that guided us in initiating improvements, including effective communication and information sharing, providing emotional support and empathy, ensuring timely access to care, fostering care coordination and continuity, and promoting patient-centered decision-making. Results: The project yielded substantial positive outcomes, significantly improving patient experiences in the OB/GYN ED. Patient satisfaction levels rose from 62% to a consistent 98%, with notable improvements in satisfaction with care plan information and physician care. Waiting time satisfaction increased from 68% to a steady 97%. The project positively impacted nurses' and midwives' job satisfaction, increasing from 64% to an impressive 94%. Operational metrics displayed positive trends, including a decrease in the "left without being seen" rate from 3% to 1%, the discharge against medical advice rate dropping from 8% to 1%, and the absconded rate reducing from 3% to 0%. These outcomes underscore the project's effectiveness in enhancing both patient and staff experiences in the healthcare setting. Conclusion: The use of a patient experience questionnaire has been substantiated by evidence-based research as an effective tool for improving the patient experience, guiding interventions, and enhancing overall healthcare quality in the OB/GYN ED. 
The project's interventions have resulted in a more efficient allocation of resources, reduced hospital stays, and minimized unnecessary resource utilization. This, in turn, contributes to cost savings for the healthcare facility.

Keywords: patient experience, patient survey, person centered care, quality initiatives

Procedia PDF Downloads 52
3 Translation of Self-Inject Contraception Training Objectives Into Service Performance Outcomes

Authors: Oluwaseun Adeleke, Samuel O. Ikani, Simeon Christian Chukwu, Fidelis Edet, Anthony Nwala, Mopelola Raji, Simeon Christian Chukwu

Abstract:

Background: Health service providers are offered in-service training periodically to strengthen their ability to deliver services that are ethical, quality, timely, and safe. Not all capacity-building courses have successfully resulted in intended service delivery outcomes because of poor training content, design, approach, and ambiance. The Delivering Innovations in Selfcare (DISC) project developed a Moment of Truth innovation, a proven training model focused on improving consumer/provider interaction that leads to an increase in the voluntary uptake of subcutaneous depot medroxyprogesterone acetate (DMPA-SC) self-injection among women who opt for injectable contraception. Methodology: Six months after training on the Moment of Truth (MoT) training manual, the project conducted two intensive rounds of qualitative data collection and triangulation that included provider, client, and community mobilizer interviews, facility observations, and routine program data collection. Respondents were sampled according to a convenience sampling approach, and the data collected were analyzed using a codebook and Atlas-TI. Providers and clients were interviewed to understand their experience, perspective, attitude, and awareness about DMPA-SC self-injection. Data were collected from 12 health facilities in three states (eight directly trained and four cascade-trained). The research team members came together for a participatory analysis workshop to explore and interpret emergent themes. Findings: Quality of service delivery and performance outcomes were observed to be significantly better in facilities whose providers were directly trained by the DISC project than in sites that received indirect training through master trainers. Directly trained facilities recorded self-injection (SI) proportions that were more than twice those of cascade-trained sites. Direct training comprised full-day, standalone didactic and interactive sessions constructed to evoke commitment, passion, and conviction, as well as to eliminate provider bias and misconceptions, by utilizing human interest stories and values clarification exercises. Sessions also created compelling arguments using evidence and national guidelines. The training also prioritized demonstration sessions, utilized job aids (particularly videos), strengthened empathetic counseling to allay client fears and concerns about SI, and trained providers to position self-injection first and to manage side effects. Role plays and practicums were particularly useful in enabling providers to retain and internalize new knowledge. These sessions provided experiential learning and the opportunity to apply one's expertise in a supervised environment where supportive feedback is provided in real time. Cascade training was often a shorter, abridged form of MoT training that leveraged existing training already planned by master trainers. This training was held over a four-hour period and was less emotive, focusing more on foundational DMPA-SC knowledge such as a reorientation to DMPA-SC, comparison of DMPA-SC variants, the counseling framework and skills, data reporting, and commodity tracking/requisition, with no facility practicums. Training on self-injection was not as robust, presumably because it was not directed at methods in the contraceptive mix that align with state/organizational sponsored objectives, in this instance fostering LARC services.
Conclusion: To achieve better performance outcomes, consideration should be given to providing training that prioritizes practice-based and emotive content. Furthermore, a firm understanding of, and conviction about, the value that training offers improves motivation and commitment to accomplish and surpass service-related performance outcomes.
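As a rough illustration of how the self-injection proportions could be compared across training modalities from the routine program data, the sketch below uses pandas; the column names and records are hypothetical and are not the DISC project's actual schema.

```python
import pandas as pd

# Hypothetical routine program data: one row per injectable client visit.
# Column names and values are illustrative only.
visits = pd.DataFrame({
    "facility":      ["A", "A", "B", "B", "C", "C"],
    "training_mode": ["direct", "direct", "cascade", "cascade", "direct", "direct"],
    "method":        ["DMPA-SC SI", "DMPA-SC PA", "DMPA-SC SI", "DMPA-SC PA",
                      "DMPA-SC SI", "DMPA-SC SI"],
})

# SI proportion = self-injection visits / all injectable visits, per training mode.
visits["is_si"] = visits["method"].eq("DMPA-SC SI")
si_by_mode = visits.groupby("training_mode")["is_si"].mean().mul(100).round(1)
print(si_by_mode)  # percent of injectable visits that were self-injections
```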

Keywords: training, performance outcomes, innovation, family planning, contraception, DMPA-SC, self-care, self-injection.

Procedia PDF Downloads 80
2 Innovative Practices That Have Significantly Scaled up Depot Medroxy Progesterone Acetate-SC Self-Inject Services

Authors: Oluwaseun Adeleke, Samuel O. Ikani, Fidelis Edet, Anthony Nwala, Mopelola Raji, Simeon Christian Chukwu

Abstract:

Background: The Delivering Innovations in Selfcare (DISC) project promotes universal access to quality selfcare services, beginning with the subcutaneous depot medroxyprogesterone acetate (DMPA-SC) contraceptive self-injection (SI) option. Self-injection offers women a highly effective and convenient option that saves them frequent trips to providers. Its increased use has the potential to improve the efficiency of an overstretched healthcare system by reducing provider workloads. State Social and Behavioral Change Communications (SBCC) Officers lead the project's demand creation and service delivery innovations, which have resulted in significant increases in SI uptake among women who opt for injectables. Strategies – Service Delivery Innovations: The implementation of the "Moment of Truth (MoT)" innovation helped providers overcome biases and address client fear and reluctance to self-inject. Bi-annual program audits and supportive mentoring visits helped providers retain their competence and motivation. Proper documentation, tracking, and replenishment of commodities were ensured through effective engagement with State Logistics Units. The project supported existing state monitoring and evaluation structures to effectively record and report DMPA-SC service utilization. Demand Creation Innovations: SBCC Officers provide oversight, routinely evaluate performance, train, and provide feedback for the demand creation activities implemented by community mobilizers (CMs). The scope and intensity of training given to CMs affect the outcome of their work. The project operates a demand creation model that uses a schedule to inform the conduct of interpersonal and group events. Health education sessions are specifically designed to counter misinformation, address questions and concerns, and educate the target audience in an informed-choice context. The project mapped facilities and their catchment areas and engaged identified influencers and gatekeepers to secure their buy-in prior to entry. Each mobilization event began with pre-mobilization sensitization activities, particularly targeting male groups. Context-specific interventions were informed by the religious, traditional, and cultural peculiarities of target communities. Mobilizers also support clients to engage with and navigate digital Family Planning (FP) resources such as the DiscoverYourPower website, Facebook page, digital companion (chatbot), interactive voice response (IVR), and radio and television (TV) messaging. This improves compliance and provides linkages to nearby facilities. Results: The project recorded 136,950 SI visits, and the SI proportion increased from 13 percent before the implementation of interventions in 2021 to 62 percent currently. The project cost-effectively demonstrated catalytic impact by leveraging state and partner resources, institutional platforms, and geographic scope to sustainably scale up these strategies. Conclusion: Evidence-informed iterations of service delivery and demand creation models have been useful in significantly driving SI uptake. It will be useful to consider this implementation model during program design.
Consideration should also be given to the systematic and strategic execution of these strategies to optimize impact.
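For illustration only, the reported shift in SI proportion could be tracked from routine service statistics along the following lines; the counts are invented to show the calculation and are not project data.

```python
# Illustrative only: SI proportion = SI visits / total injectable visits.
# The counts below are made up to demonstrate the arithmetic.
baseline = {"si_visits": 1_300, "injectable_visits": 10_000}   # ~13 percent
current  = {"si_visits": 6_200, "injectable_visits": 10_000}   # ~62 percent

def si_proportion(period: dict) -> float:
    """Return the self-injection proportion as a percentage."""
    return 100 * period["si_visits"] / period["injectable_visits"]

print(f"baseline: {si_proportion(baseline):.0f}%, current: {si_proportion(current):.0f}%")
```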

Keywords: family planning, contraception, DMPA-SC, self-care, self-injection, innovation, service delivery, demand creation.

Procedia PDF Downloads 71
1 An Intelligent Search and Retrieval System for Mining Clinical Data Repositories Based on Computational Imaging Markers and Genomic Expression Signatures for Investigative Research and Decision Support

Authors: David J. Foran, Nhan Do, Samuel Ajjarapu, Wenjin Chen, Tahsin Kurc, Joel H. Saltz

Abstract:

The large-scale data and computational requirements of investigators throughout the clinical and research communities demand an informatics infrastructure that supports both existing and new investigative and translational projects in a robust, secure environment. In some subspecialties of medicine and research, the capacity to generate data has outpaced the methods and technology used to aggregate, organize, access, and reliably retrieve this information. Leading health care centers now recognize the utility of establishing an enterprise-wide clinical data warehouse. The primary benefits that can be realized through such efforts include cost savings, efficient tracking of outcomes, advanced clinical decision support, improved prognostic accuracy, and more reliable clinical trial matching. The overarching objective of the work presented here is the development and implementation of a flexible Intelligent Retrieval and Interrogation System (IRIS) that exploits the combined use of computational imaging, genomics, and data-mining capabilities to facilitate clinical assessments and translational research in oncology. The proposed System includes a multi-modal Clinical & Research Data Warehouse (CRDW) that is tightly integrated with a suite of computational and machine-learning tools to provide insight into underlying tumor characteristics that are not apparent by human inspection alone. A key distinguishing feature of the System is a configurable Extract, Transform and Load (ETL) interface that enables it to adapt to different clinical and research data environments. This project is motivated by the growing emphasis on establishing Learning Health Systems, in which cyclical hypothesis generation and evidence evaluation become integral to improving the quality of patient care. To facilitate iterative prototyping and optimization of the algorithms and workflows for the System, the team has already implemented a fully functional Warehouse that can reliably aggregate information originating from multiple data sources, including EHRs, Clinical Trial Management Systems, Tumor Registries, Biospecimen Repositories, Radiology PACS, Digital Pathology archives, Unstructured Clinical Documents, and Next Generation Sequencing services. The System enables physicians to systematically mine and review the molecular, genomic, image-based, and correlated clinical information about patient tumors, individually or as part of large cohorts, to identify patterns that may influence treatment decisions and outcomes. The CRDW core system has facilitated peer-reviewed publications and funded projects, including an NIH-sponsored collaboration to enhance the cancer registries in Georgia, Kentucky, New Jersey, and New York with machine-learning-based classifications and quantitative pathomics feature sets. The CRDW has also resulted in a collaboration with the Massachusetts Veterans Epidemiology Research and Information Center (MAVERIC) at the U.S. Department of Veterans Affairs to develop algorithms and workflows to automate the analysis of lung adenocarcinoma. Those studies showed that combining computational nuclear signatures with traditional WHO criteria through the use of deep convolutional neural networks (CNNs) led to improved discrimination among tumor growth patterns. The team has also leveraged the Warehouse to support studies investigating the potential of utilizing a combination of genomic and computational imaging signatures to characterize prostate cancer.
The results of those studies show that integrated image biomarkers and genomic pathway scores correlate more strongly with disease recurrence than standard clinical markers do.
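As a sketch of the kind of configurable ETL mapping described above – the field names, source systems, and mapping format are assumptions for illustration, not the actual IRIS interface – a mapping-driven transform might look like this:

```python
# Hypothetical configuration: each source system maps its own field names
# onto a common warehouse schema, so one loader can serve multiple sources.
FIELD_MAP = {
    "ehr_extract":    {"patient_id": "mrn",         "diagnosis": "icd10_code", "date": "encounter_dt"},
    "tumor_registry": {"patient_id": "registry_id", "diagnosis": "site_code",  "date": "dx_date"},
}

def transform(record: dict, source: str) -> dict:
    """Rename source-specific fields to the warehouse schema."""
    mapping = FIELD_MAP[source]
    return {target: record.get(src) for target, src in mapping.items()}

row = {"mrn": "12345", "icd10_code": "C61", "encounter_dt": "2020-05-01"}
print(transform(row, "ehr_extract"))
# {'patient_id': '12345', 'diagnosis': 'C61', 'date': '2020-05-01'}
```

Adding a new data source then amounts to adding a new entry to the configuration rather than changing the loader code.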
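Likewise, the reported association between combined imaging/genomic signatures and disease recurrence can be examined with a simple cross-validated comparison; the sketch below uses synthetic data and scikit-learn and is not the authors' actual analysis.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 200

# Synthetic stand-ins for an image-derived biomarker and a genomic pathway score.
image_marker = rng.normal(size=n)
pathway_score = rng.normal(size=n)
recurrence = (0.8 * image_marker + 0.8 * pathway_score + rng.normal(size=n) > 0).astype(int)

X_combined = np.column_stack([image_marker, pathway_score])
X_clinical = rng.normal(size=(n, 2))  # stand-in for standard clinical markers

for name, X in [("image + genomic", X_combined), ("clinical only", X_clinical)]:
    auc = cross_val_score(LogisticRegression(), X, recurrence, cv=5, scoring="roc_auc").mean()
    print(f"{name}: mean cross-validated AUC = {auc:.2f}")
```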

Keywords: clinical data warehouse, decision support, data-mining, intelligent databases, machine-learning.

Procedia PDF Downloads 117