Search results for: online tools
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6334

34 Analysis Of Fine Motor Skills in Chronic Neurodegenerative Models of Huntington’s Disease and Amyotrophic Lateral Sclerosis

Authors: T. Heikkinen, J. Oksman, T. Bragge, A. Nurmi, O. Kontkanen, T. Ahtoniemi

Abstract:

Motor impairment is an inherent phenotypic feature of several chronic neurodegenerative diseases, and pharmacological therapies aimed at counterbalancing the motor disability have great market potential. Animal models of chronic neurodegenerative diseases display a number of deteriorating motor phenotypes during disease progression. There is a wide array of behavioral tools to evaluate motor functions in rodents. However, currently existing methods to study motor functions in rodents are often limited to evaluating gross motor functions, and only at advanced stages of the disease phenotype. The most commonly applied traditional motor assays used in CNS rodent models lack the sensitivity to capture fine motor impairments or improvements. Fine motor skill characterization in rodents provides a more sensitive tool to capture subtle motor dysfunctions and therapeutic effects. Importantly, a similar approach, kinematic movement analysis, is also used in the clinic, applied both in diagnosis and in determining the therapeutic response to pharmacological interventions. The aim of this study was to apply kinematic gait analysis, a novel and automated high-precision movement analysis system, to characterize phenotypic deficits in three different chronic neurodegenerative animal models: a transgenic mouse model (SOD1 G93A) for amyotrophic lateral sclerosis (ALS), and the R6/2 and Q175KI mouse models for Huntington’s disease (HD). The readouts from walking behavior included gait properties with kinematic data and body movement trajectories, including analysis of various points of interest such as the movement and position of landmarks on the torso, tail and joints. Mice (transgenic and wild-type) from each model were analyzed for fine motor kinematic properties at young ages, prior to the age when gross motor deficits are clearly pronounced. Fine motor kinematic evaluation was continued in the same animals until clear motor dysfunction was evident with conventional motor assays. 
Time course analysis revealed clear fine motor skill impairments in each transgenic model earlier than what is seen with conventional gross motor tests. Motor changes were quantitatively analyzed for up to ~80 parameters, and the largest data sets, from the HD models, were further processed with principal component analysis (PCA) to transform the pool of individual parameters into a smaller, focused set of mutually uncorrelated gait parameters showing a strong genotype difference. Kinematic fine motor analysis of the transgenic animal models described in this presentation shows that this method is a sensitive, objective and fully automated tool that allows earlier and more sensitive detection of progressive neuromuscular and CNS disease phenotypes. As a result of the analysis, a comprehensive set of fine motor parameters for each model is created, and these parameters provide a better understanding of disease progression and enhanced sensitivity of this assay for therapeutic testing compared to classical motor behavior tests. In SOD1 G93A, R6/2, and Q175KI mice, the alterations in gait were evident several weeks earlier than with traditional gross motor assays. Kinematic testing can be applied to a wider set of motor readouts beyond gait in order to study whole-body movement patterns, such as those relating to joints and various body parts, longitudinally, providing a sophisticated and translatable method for dissecting motor components in rodent disease models and evaluating therapeutic interventions.
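The PCA step described in the abstract can be sketched as follows. The data here are synthetic stand-ins for the ~80 gait parameters, and the array shapes and number of retained components are illustrative assumptions, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical data: 40 mice x 80 gait parameters (stride length, swing time, ...)
X = rng.normal(size=(40, 80))

# Standardize each parameter (zero mean, unit variance) so all contribute comparably
Xs = (X - X.mean(axis=0)) / X.std(axis=0)

# PCA via SVD: principal component scores are U * S
U, S, Vt = np.linalg.svd(Xs, full_matrices=False)
scores = U[:, :5] * S[:5]  # project onto the first 5 components

# The component scores are mutually uncorrelated by construction
corr = np.corrcoef(scores, rowvar=False)
off_diag = corr - np.diag(np.diag(corr))
print(scores.shape, np.abs(off_diag).max() < 1e-8)
```

The retained components can then be compared between genotypes (e.g., by a group test on each score), which is the "strong genotype difference" criterion the abstract describes.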

Keywords: Gait analysis, kinematic, motor impairment, inherent feature

Procedia PDF Downloads 354
33 Surface Functionalization Strategies for the Design of Thermoplastic Microfluidic Devices for New Analytical Diagnostics

Authors: Camille Perréard, Yoann Ladner, Fanny D'Orlyé, Stéphanie Descroix, Vélan Taniga, Anne Varenne, Cédric Guyon, Michael Tatoulian, Frédéric Kanoufi, Cyrine Slim, Sophie Griveau, Fethi Bedioui

Abstract:

The development of micro total analysis systems is of major interest for contaminant and biomarker analysis. As a lab-on-chip integrates all steps of an analysis procedure in a single device, analysis can be performed in an automated format with reduced time and cost, while maintaining performance comparable to that of conventional chromatographic systems. Moreover, these miniaturized systems are compatible with either field work or glovebox manipulation. This work is aimed at developing an analytical microsystem for trace and ultra-trace quantitation in complex matrices. The strategy consists in integrating a sample pretreatment step within the lab-on-chip via a confinement zone where selective ligands are immobilized for target extraction and preconcentration. Aptamers were chosen as selective ligands because of their high affinity for all types of targets (from small ions to viruses and cells) and their ease of synthesis and functionalization. This integrated target extraction and concentration step will be followed in the microdevice by an electrokinetic separation step and on-line detection. Polymers consisting of cyclic olefin copolymer (COC) or fluoropolymer (Dyneon THV) were selected as they are easy to mold, transparent in the UV-visible range and highly resistant to solvents and extreme pH conditions. However, because of their low chemical reactivity, surface treatments are necessary. For the design of this miniaturized diagnostic device, we aimed at modifying the microfluidic system at two scales: (1) over the entire surface of the microsystem, to control surface hydrophobicity (so as to avoid any sample adsorption on the walls) and the fluid flows during electrokinetic separation, or (2) locally, so as to immobilize selective ligands (aptamers) on restricted areas for target extraction and preconcentration. 
We developed several novel strategies for the surface functionalization of COC and Dyneon, based on plasma, chemical and/or electrochemical approaches. In a first approach, plasma-induced immobilization of brominated derivatives was performed on the entire surface. Further substitution of the bromine by an azide functional group led to covalent immobilization of ligands through a “click” chemistry reaction between azides and terminal alkynes. The COC and Dyneon materials were characterized at each step of the surface functionalization procedure by various complementary techniques (contact angle, XPS, ATR) to evaluate the quality and homogeneity of the functionalization. With the objective of local (micrometric-scale) aptamer immobilization, we developed an original electrochemical strategy on an engraved Dyneon THV microchannel. Through local electrochemical carbonization, followed by adsorption of azide-bearing diazonium moieties and covalent linkage of alkyne-bearing aptamers through a click chemistry reaction, typical dimensions of the immobilization zones reached the 50 µm range. Other functionalization strategies, such as sol-gel encapsulation of aptamers, are currently being investigated and may also be suitable for the development of the analytical microdevice. The development of these functionalization strategies is the first crucial step in the design of the entire microdevice. These strategies allow the grafting of a large number of molecules for the development of new analytical tools in various domains such as the environment and healthcare.

Keywords: alkyne-azide click chemistry (CuAAC), electrochemical modification, microsystem, plasma bromination, surface functionalization, thermoplastic polymers

Procedia PDF Downloads 441
32 Smart Interior Design: A Revolution in Modern Living

Authors: Fatemeh Modirzare

Abstract:

Smart interior design represents a transformative approach to creating living spaces that integrate technology seamlessly into our daily lives, enhancing comfort, convenience, and sustainability. This paper explores the concept of smart interior design, its principles, benefits, challenges, and future prospects. It also highlights various examples and applications of smart interior design to illustrate its potential in shaping the way we live and interact with our surroundings. In an increasingly digitized world, the boundaries between technology and interior design are blurring. Smart interior design, also known as intelligent or connected interior design, involves the incorporation of advanced technologies and automation systems into residential and commercial spaces. This innovative approach aims to make living environments more efficient, comfortable, and adaptable while promoting sustainability and user well-being. Smart interior design seamlessly integrates technology into the aesthetics and functionality of a space, ensuring that devices and systems do not disrupt the overall design. Sustainable materials, energy-efficient systems, and eco-friendly practices are central to smart interior design, reducing environmental impact. Spaces are designed to be adaptable, allowing for reconfiguration to suit changing needs and preferences. Smart homes and spaces offer greater comfort through features like automated climate control, adjustable lighting, and customizable ambiance. Smart interior design can significantly reduce energy consumption through optimized heating, cooling, and lighting systems. Smart interior design integrates security systems, fire detection, and emergency response mechanisms for enhanced safety. Sustainable materials, energy-efficient appliances, and waste reduction practices contribute to a greener living environment. Implementing smart interior design can be expensive, particularly when retrofitting existing spaces with smart technologies. 
The increased connectivity raises concerns about data privacy and cybersecurity, requiring robust measures to protect user information. Rapid advancements in technology may lead to obsolescence, necessitating updates and replacements. Users must be familiar with smart systems to fully benefit from them, requiring education and ongoing support. Residential spaces incorporate features like voice-activated assistants, automated lighting, and energy management systems. Intelligent office design enhances productivity and employee well-being through smart lighting, climate control, and meeting room booking systems. Hospitals and healthcare facilities use smart interior design for patient monitoring, wayfinding, and energy conservation. Smart retail design includes interactive displays, personalized shopping experiences, and inventory management systems. The future of smart interior design holds exciting possibilities, including AI-powered design tools that create personalized spaces based on user preferences. Smart interior design will increasingly prioritize factors that improve physical and mental health, such as air quality monitoring and mood-enhancing lighting. Smart interior design is revolutionizing the way we interact with our living and working spaces. By embracing technology, sustainability, and user-centric design principles, smart interior design offers numerous benefits, from increased comfort and convenience to energy efficiency and sustainability. Despite challenges, the future holds tremendous potential for further innovation in this field, promising a more connected, efficient, and harmonious way of living and working.

Keywords: smart interior design, home automation, sustainable living spaces, technological integration, user-centric design

Procedia PDF Downloads 67
31 New Hybrid Process for Converting Small Structural Parts from Metal to CFRP

Authors: Yannick Willemin

Abstract:

Carbon fibre-reinforced plastic (CFRP) offers outstanding value. However, like all materials, CFRP also has its challenges. Many forming processes are largely manual and hard to automate, making it challenging to control repeatability and reproducibility (R&R); they generate significant scrap and are too slow for high-series production; fibre costs are relatively high and subject to supply and cost fluctuations; the supply chain is fragmented; many forms of CFRP are not recyclable, and many materials have yet to be fully characterized for accurate simulation; shelf-life and out-life limitations add cost; continuous-fibre forms have design limitations; many materials are brittle; and small and/or thick parts are costly to produce and difficult to automate. A majority of small structural parts are metal, owing to the high cost of fabricating CFRP parts in this size class. The fact that the CFRP manufacturing processes that produce the highest-performance parts also tend to be the slowest and least automated is another reason CFRP parts are generally more costly than comparably performing metal parts, which are easier to produce. Fortunately, industry is in the midst of a major manufacturing evolution, Industry 4.0, and one technology seeing rapid growth is additive manufacturing/3D printing, thanks to new processes and materials plus an ability to harness Industry 4.0 tools. No longer limited to prototype parts, metal-additive technologies are used to produce tooling and mold components for high-volume manufacturing, and polymer-additive technologies can incorporate fibres to produce true composites and can be used to produce end-use parts with high aesthetics, unmatched complexity, mass customization opportunities, and high mechanical performance. 
A new hybrid manufacturing process combines the best capabilities of additive technologies (high complexity, low energy usage and waste, 100% traceability, faster time to market) and post-consolidation technologies (tight tolerances, high R&R, established materials and supply chains). The platform was developed by Zürich-based 9T Labs AG and is called Additive Fusion Technology (AFT). It consists of design software that determines the optimal fibre layup and exports files back to check predicted performance, plus two pieces of equipment: a 3D printer, which lays up (near-)net-shape preforms using neat thermoplastic filaments and slit, roll-formed unidirectional carbon fibre-reinforced thermoplastic tapes, and a post-consolidation module, which consolidates and then shapes the preforms into final parts using a compact compression press fitted with a heating unit and matched metal molds. Matrices (currently including PEKK, PEEK, PA12, and PPS, although nearly any high-quality commercial thermoplastic tapes and filaments can be used) are matched between filaments and tapes to assure excellent bonding. Since thermoplastics are used exclusively, larger assemblies can be produced by bonding or welding together smaller components, and end-of-life parts can be recycled. By combining compression molding with 3D printing, parts of higher quality, with very low void content and excellent surface finish on both A and B sides, can be produced. Tight tolerances (min. section thickness = 1.5 mm, min. section height = 0.6 mm, min. fibre radius = 1.5 mm) with high R&R can be cost-competitively held at production volumes of 100 to 10,000 parts/year on a single set of machines.

Keywords: additive manufacturing, composites, thermoplastic, hybrid manufacturing

Procedia PDF Downloads 94
30 DH-Students Promoting Underage Asylum Seekers' Oral Health in Finland

Authors: Eeva Wallenius-Nareneva, Tuula Toivanen-Labiad

Abstract:

Background: An oral health promotion event was organised for forty Afghan, Iraqi and Bangladeshi underage asylum seekers in Finland. The invitation to arrange this coaching occasion was accepted by the Degree Programme in Oral Hygiene at Metropolia. The personnel in the reception center had identified the need to improve oral health among the youngsters. The purpose was to strengthen the boys’ health literacy in their oral self-care and to reduce dental fears. Finnish studies, especially oral health terminology, were integrated into the coaching with the help of interpreters. Cooperative learning was applied. Methods: Oral health was interactively discussed in four study group sessions: (1) the importance of healthy eating habits (good and bad diets, regular meals, acid attack, xylitol); (2) oral diseases and their connection to general health (aetiology of gingivitis, periodontitis and caries; harmfulness of smoking); (3) tools and techniques for oral self-care (brushing and interdental cleaning); and (4) sharing earlier dental care experiences (cultural differences, dental fear, regular check-ups). Results: During the coaching, deficiencies appeared in brushing and interdental cleaning techniques. Some boys were accustomed to washing their mouths with salt, justifying it by salt’s antiseptic properties. Many brushed their teeth with vertical movements. The boys took the feedback positively when a demonstration with model jaws revealed the inefficiency of the technique. The advantages of fluoride toothpaste were explained. Dental care procedures were new and frightening for many boys, so the Finnish dental care system was clarified. The safety and painlessness of the treatments and informed consent were highlighted. Video presentations and dialog substantially lowered the threshold to visit a dental clinic. The occasion gave the students means for meeting patients from different cultural and language backgrounds. 
The information hidden behind the oral health problems of the asylum seekers was valuable. Conclusions: Learning the dental care practices used in different cultures is essential for dental professionals. The project was a good start towards multicultural oral health care, but more experience is needed before graduation. Health education themes should be kept simple regardless of the target group, and the heterogeneity of the group does not pose a problem. Open discussion, with questions leading to the theme, works well in clarifying the target group’s knowledge level. Sharing one’s own experiences strengthens the sense of equality among the participants and encourages them to express their own opinions. The motivational interviewing method turned out to be successful. In the future, coaching occasions must ensure the active participation of everyone. This could be realized by dividing the participants into even smaller groups. The different languages impose challenges, but these can be solved by using more interpreters. Their presence ensures that everyone understands the issues properly, although the use of plain and sign languages is also helpful. In further development, it would be crucial to arrange a rehearsal occasion for the same participants in two to three months’ time. This would strengthen the adoption of self-care practices and give the youngsters the opportunity to pose more open questions. The students would gain valuable feedback regarding the effectiveness of their work.

Keywords: cooperative learning, interactive methods, motivational interviewing, oral health promotion, underage asylum seekers

Procedia PDF Downloads 288
29 Health and Climate Changes: "Ippocrate" a New Alert System to Monitor and Identify High Risk

Authors: A. Calabrese, V. F. Uricchio, D. di Noia, S. Favale, C. Caiati, G. P. Maggi, G. Donvito, D. Diacono, S. Tangaro, A. Italiano, E. Riezzo, M. Zippitelli, M. Toriello, E. Celiberti, D. Festa, A. Colaianni

Abstract:

Climate change has a severe impact on human health. A vast literature demonstrates that temperature increase is causally related to cardiovascular problems and represents a high risk for human health, but few studies propose a solution. In this work, we studied how climate influences human health parameters through the analysis of climatic conditions in an area of the Apulia Region, the Capurso Municipality. At the same time, the medical personnel involved identified a set of variables useful for defining an index describing health condition. These scientific studies are the basis of an innovative alert system, IPPOCRATE, whose aim is to assess climate risk and share information with the population at risk to support prevention and mitigation actions. IPPOCRATE is an e-health system designed to provide technological support for the analysis of health risks related to climate and to provide tools for the prevention and management of critical events. It is the first integrated system for the prevention of human risk caused by climate change. IPPOCRATE calculates risk by weighting meteorological data with the vulnerability of monitored subjects, and it uses mobile and cloud technologies to acquire and share information over different data channels. It is composed of four components. The Multichannel Hub is the ICT infrastructure used to feed the IPPOCRATE cloud with different types of data coming from remote monitoring devices or imported from meteorological databases. Such data are ingested, transformed and elaborated in order to be dispatched towards the mobile app and VoIP phone systems. The IPPOCRATE Multichannel Hub uses open communication protocols to create a set of APIs useful for interfacing IPPOCRATE with third-party applications. Internally, it uses a non-relational paradigm to create a flexible and highly scalable database. 
The wearable device WeHeart is equipped with sensors designed to measure the following biometric variables: heart rate, systolic and diastolic blood pressure, blood oxygen saturation, body temperature and, for diabetic subjects, blood glucose. WeHeart is designed to be easy to use and non-invasive. For data acquisition, users need only wear it and connect it to the Smart Application via the Bluetooth protocol. EasyBox was designed to take advantage of new technologies related to e-health care and allows the user to fully exploit all IPPOCRATE features. Its name reveals its purpose as a container for the various devices that may be included depending on user needs. The Territorial Registry is the IPPOCRATE web module reserved for medical personnel for monitoring, research and analysis activities. The Territorial Registry allows access to all information gathered by IPPOCRATE, using a GIS to execute spatial analyses combining geographical data (climatological information and monitored data) with information regarding the clinical history of users and their personal details. The Territorial Registry was designed for different types of users: control rooms managed by wide-area health facilities, single health care centers, or single doctors. The Territorial Registry manages this hierarchy by diversifying access to system functionalities. IPPOCRATE is the first e-health system focused on climate risk prevention.
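The risk-weighting idea (meteorological data weighted by subject vulnerability) can be illustrated with a minimal sketch. The formula, the 0-1 scales, and the 0.6 alert threshold below are invented for illustration and are not taken from the IPPOCRATE system:

```python
# Hypothetical risk index for an IPPOCRATE-style alert: weather severity
# (0-1) weighted by a subject's clinical vulnerability score (0-1).
def risk_index(weather_severity, vulnerability):
    """Scale weather severity by the subject's vulnerability."""
    return weather_severity * (0.5 + 0.5 * vulnerability)

# Two hypothetical monitored subjects during the same heat event
subjects = {"subject_A": 0.9, "subject_B": 0.2}
alerts = [s for s, v in subjects.items() if risk_index(0.8, v) > 0.6]
print(alerts)  # ['subject_A']
```

The point of such a weighting is that the same meteorological event triggers an alert only for subjects whose clinical history makes them vulnerable to it.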

Keywords: climate change, health risk, new technological system

Procedia PDF Downloads 867
28 Sinhala Sign Language to Grammatically Correct Sentences using NLP

Authors: Anjalika Fernando, Banuka Athuraliya

Abstract:

This paper presents a comprehensive approach for converting Sinhala Sign Language (SSL) into grammatically correct sentences using Natural Language Processing (NLP) techniques in real time. While previous studies have explored various aspects of SSL translation, the research gap lies in the absence of grammar checking for SSL. This work aims to bridge this gap by proposing a two-stage methodology that leverages deep learning models to detect signs and translate them into coherent sentences, ensuring grammatical accuracy. The first stage of the approach involves the utilization of a Long Short-Term Memory (LSTM) deep learning model to recognize and interpret SSL signs. By training the LSTM model on a dataset of SSL gestures, it learns to accurately classify and translate these signs into textual representations. The LSTM model achieves a commendable accuracy rate of 94%, demonstrating its effectiveness in recognizing and translating SSL gestures. Building upon the successful recognition and translation of SSL signs, the second stage of the methodology focuses on improving the grammatical correctness of the translated sentences. The project employs a Neural Machine Translation (NMT) architecture, consisting of an encoder and decoder with LSTM components, to enhance the syntactical structure of the generated sentences. By training the NMT model on a parallel corpus of grammatically incorrect Sinhala sentences and their corresponding grammatically correct translations, it learns to generate coherent and grammatically accurate sentences. The NMT model achieves an impressive accuracy rate of 98%, affirming its capability to produce linguistically sound translations. The proposed approach offers significant contributions to the field of SSL translation and grammar correction. 
Addressing the critical issue of grammar checking, it enhances the usability and reliability of SSL translation systems, facilitating effective communication between hearing-impaired and non-sign-language users. Furthermore, the integration of deep learning techniques, such as LSTM and NMT, ensures the accuracy and robustness of the translation process. This research holds great potential for practical applications, including educational platforms, accessibility tools, and communication aids for the hearing-impaired. Furthermore, it lays the foundation for future advancements in SSL translation systems, fostering inclusive and equal opportunities for the deaf community. Future work includes expanding the existing datasets to further improve the accuracy and generalization of the SSL translation system. Additionally, the development of a dedicated mobile application would enhance the accessibility and convenience of SSL translation on handheld devices. Furthermore, efforts will be made to enhance the current application for educational purposes, enabling individuals to learn and practice SSL more effectively. Another area of future exploration involves enabling two-way communication, allowing seamless interaction between sign-language users and non-sign-language users. In conclusion, this paper presents a novel approach for converting Sinhala Sign Language gestures into grammatically correct sentences using NLP techniques in real time. The two-stage methodology, comprising an LSTM model for sign detection and translation and an NMT model for grammar correction, achieves high accuracy rates of 94% and 98%, respectively. By addressing the lack of grammar checking in existing SSL translation research, this work contributes significantly to the development of more accurate and reliable SSL translation systems, thereby fostering effective communication and inclusivity for the hearing-impaired community.
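The two-stage structure of the pipeline can be sketched as a skeleton. The function names, the toy gloss vocabulary, and the lookup tables below are invented stand-ins: in the actual system, stage 1 is a trained LSTM classifier over gesture data and stage 2 is a trained LSTM encoder-decoder NMT model.

```python
def recognize_signs(gesture_frames):
    """Stage 1 stand-in: map each gesture's features to a sign gloss.

    In the paper this is an LSTM classifier trained on SSL gestures;
    here a toy lookup table plays that role.
    """
    toy_classifier = {"wave_pattern": "HELLO", "point_self": "I", "eat_motion": "EAT"}
    return [toy_classifier[f] for f in gesture_frames]

def correct_grammar(gloss_sequence):
    """Stage 2 stand-in: rewrite the raw gloss string as a grammatical sentence.

    In the paper this is an NMT encoder-decoder trained on pairs of
    incorrect and corrected sentences; here a toy mapping stands in.
    """
    toy_nmt = {"I EAT": "I am eating.", "HELLO I": "Hello, it is me."}
    raw = " ".join(gloss_sequence)
    return toy_nmt.get(raw, raw)  # fall back to the raw glosses if unseen

glosses = recognize_signs(["point_self", "eat_motion"])
print(correct_grammar(glosses))  # I am eating.
```

The design point the skeleton captures is the separation of concerns: sign recognition produces an ungrammatical gloss sequence, and a second, independently trained model is responsible for grammatical correctness.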

Keywords: Sinhala sign language, sign language, NLP, LSTM, NMT

Procedia PDF Downloads 103
27 Understanding Systemic Barriers (and Opportunities) to Increasing Uptake of Subcutaneous Medroxy Progesterone Acetate Self-Injection in Health Facilities in Nigeria

Authors: Oluwaseun Adeleke, Samuel O. Ikani, Fidelis Edet, Anthony Nwala, Mopelola Raji, Simeon Christian Chukwu

Abstract:

Background: The DISC project collaborated with partners to implement demand creation and service delivery interventions, including the MoT (Moment of Truth) innovation, in over 500 health facilities across 15 states. This has increased the voluntary conversion rate to self-injection among women who opt for injectable contraception. While some facilities recorded an increasing trend in key performance indicators, a few others persistently performed sub-optimally due to provider and system-related barriers. Methodology: Twenty-two facilities performing sub-optimally were selected purposively from three Nigerian states. Low productivity was appraised using low reporting rates and poor self-injection (SI) conversion rates as indicators. Interviews were conducted with health providers across these health facilities using a rapid diagnosis tool. The project also conducted a data quality assessment that evaluated the veracity of data elements reported across the three major sources of family planning data in the facility. Findings: The inability and sometimes refusal of providers to support clients to self-inject effectively was associated with a misunderstanding of its value to their work experience. It was also observed that providers still held a strong influence over clients’ method choices. Furthermore, providers held biases and misconceptions about DMPA-SC that restricted the access of obese clients and new acceptors to services – a clear departure from the recommendations of the national guidelines. Additionally, quality of care standards were compromised because job aids were not used to inform service delivery. Facilities performing sub-optimally often under-reported DMPA-SC utilization data, and there were multiple uncoordinated responsibilities for recording and reporting. Additionally, data validation meetings were not regularly convened, and these meetings were ineffective in authenticating data received from health facilities. 
Other reasons for sub-optimal performance included poor documentation and tracking of stock inventory resulting in commodity stockouts, low client flow because of poor positioning of health facilities, and ineffective messaging. Some facilities lacked adequate human and material resources to provide services effectively and received very few supportive supervision visits. Supportive supervision visits and Data Quality Audits have been useful to address the aforementioned performance barriers. The project has deployed digital DMPA-SC self-injection checklists that have been aligned with nationally approved templates. During visits, each provider and community mobilizer is accorded special attention by the supervisor until he/she can perform procedures in line with best practice (protocol). Conclusion: This narrative provides a summary of a range of factors that identify health facilities performing sub-optimally in their provision of DMPA-SC services. Findings from this assessment will be useful during project design to inform effective strategies. As the project enters its final stages of implementation, it is transitioning high-impact activities to state institutions in the quest to sustain the quality of service beyond the tenure of the project. The project has flagged activities, as well as created protocols and tools aimed at placing state-level stakeholders at the forefront of improving productivity in health facilities.

Keywords: family planning, contraception, DMPA-SC, self-care, self-injection, barriers, opportunities, performance

Procedia PDF Downloads 77
26 Predicting Open Chromatin Regions in Cell-Free DNA Whole Genome Sequencing Data by Correlation Clustering  

Authors: Fahimeh Palizban, Farshad Noravesh, Amir Hossein Saeidian, Mahya Mehrmohamadi

Abstract:

In the recent decade, the emergence of liquid biopsy has significantly improved cancer monitoring and detection. Dying cells, including those originating from tumors, shed their DNA into the blood and contribute to a pool of circulating fragments called cell-free DNA (cfDNA). Accordingly, identifying the tissue of origin of these DNA fragments from the plasma can result in more accurate and faster disease diagnosis and more precise treatment protocols. Open chromatin regions (OCRs) are important epigenetic features of DNA that reflect the cell type of origin. Profiling these features by DNase-seq, ATAC-seq, and histone ChIP-seq provides insights into tissue-specific and disease-specific regulatory mechanisms. There have been several studies in the area of cancer liquid biopsy that integrate distinct genomic and epigenomic features for early cancer detection along with tissue-of-origin detection. However, multimodal analysis requires several types of experiments to cover the genomic and epigenomic aspects of a single sample, which leads to substantial cost and time. To overcome these limitations, the idea of predicting OCRs from whole genome sequencing (WGS) data is of particular importance. In this regard, we proposed a computational approach to predict open chromatin regions, an important epigenetic feature, from cell-free DNA whole genome sequencing data. To fulfill this objective, local sequencing depth is fed to our proposed algorithm, which predicts the most probable open chromatin regions from whole genome sequencing data. Our method applies signal processing to sequencing depth data and includes count normalization, Discrete Fourier Transform conversion, graph construction, graph-cut optimization by linear programming, and clustering. 
To validate the proposed method, we compared the output of the clustering (open chromatin region+, open chromatin region-) with previously validated open chromatin regions from human blood samples in the ATAC-DB database. The overlap between the predicted open chromatin regions and the experimentally validated regions obtained by ATAC-seq in ATAC-DB is greater than 67%, which indicates meaningful prediction. OCRs are known to be located mostly at the transcription start sites (TSS) of genes. Accordingly, we compared the concordance between the predicted OCRs and the human TSS regions obtained from refTSS, finding agreement of around 52.04% with all genes and ~78% with housekeeping genes. Accurately detecting open chromatin regions from plasma cell-free DNA-seq data is a very challenging computational problem due to several confounding factors, such as technical and biological variation. Although this approach is in its infancy, there has already been one attempt to apply it, the tool OCRDetector, which has some restrictions, such as the need for high-depth cfDNA WGS data, prior information about the OCR distribution, and reliance on multiple features. In contrast, we implemented graph signal clustering based on a single depth feature in an unsupervised learning manner, which results in faster performance and decent accuracy. Overall, we investigated the epigenomic pattern of a cell-free DNA sample from a new computational perspective that can be used alongside other tools to examine the genetic and epigenetic aspects of a single whole genome sequencing dataset for efficient liquid biopsy analyses.
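The depth-to-OCR pipeline above (count normalization, DFT features, correlation graph, two-way partition) can be sketched on synthetic data. Everything below is an illustrative assumption, not the authors' implementation: the window sizes, the synthetic depth signal, and a simple correlation-threshold partition standing in for the paper's linear-programming graph cut.

```python
import numpy as np

# Hypothetical sketch of the pipeline: depth normalization, Fourier
# features per genomic window, and a correlation-based two-way split
# into OCR+ / OCR- windows. All parameters are illustrative.

rng = np.random.default_rng(0)
n_windows, bins_per_window = 200, 64

# Synthetic read-depth: OCR+ windows carry a periodic coverage dip,
# OCR- windows are flat noise (purely illustrative).
truth = rng.random(n_windows) < 0.5
t = np.arange(bins_per_window)
depth = rng.normal(1.0, 0.1, (n_windows, bins_per_window))
depth[truth] += 0.5 * np.sin(2 * np.pi * t / bins_per_window)

# 1. Count normalization: scale each window to unit mean coverage.
norm = depth / depth.mean(axis=1, keepdims=True)

# 2. Discrete Fourier Transform: keep low-frequency magnitudes as features.
spectra = np.abs(np.fft.rfft(norm, axis=1))[:, 1:16]

# 3. Correlation graph: windows are nodes, edge weights are Pearson
#    correlations between their spectra.
corr = np.corrcoef(spectra)

# 4. Two-way partition: windows strongly correlated with the most
#    periodic window form the OCR+ cluster (a stand-in for the
#    linear-programming graph cut described in the abstract).
seed = int(np.argmax(spectra[:, 0]))
ocr_plus = corr[seed] > 0.5

accuracy = max((ocr_plus == truth).mean(), (ocr_plus != truth).mean())
```

On this toy signal the partition recovers nearly all window labels; real cfDNA depth is far noisier, which is where the optimization-based cut earns its keep.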

Keywords: open chromatin regions, cancer, cell-free DNA, epigenomics, graph signal processing, correlation clustering

Procedia PDF Downloads 148
25 Stabilizing Additively Manufactured Superalloys at High Temperatures

Authors: Keivan Davami, Michael Munther, Lloyd Hackel

Abstract:

The control of properties and material behavior by implementing thermal-mechanical processes is based on mechanical deformation and annealing according to a precise schedule that will produce a unique and stable combination of grain structure, dislocation substructure, texture, and dispersion of precipitated phases. The authors recently developed a thermal-mechanical technique to stabilize the microstructure of additively manufactured nickel-based superalloys even after exposure to high temperatures. However, the mechanism(s) controlling this stability is still under investigation. Laser peening (LP), also called laser shock peening (LSP), is a shock-based (50 ns pulse duration) post-processing technique used for extending performance levels and improving the service life of critical components by developing deep levels of plastic deformation, thereby generating a high density of dislocations and inducing compressive residual stresses in the surface and deep subsurface of components. These compressive residual stresses are usually accompanied by an increase in hardness and enhanced resistance to surface-related failures such as creep, fatigue, contact damage, and stress corrosion cracking. While the LP process enhances the life span and durability of the material, the induced compressive residual stresses relax at high temperatures (>0.5Tm, where Tm is the absolute melting temperature), limiting the applicability of the technology. At temperatures above 0.5Tm, the compressive residual stresses relax, and yield strength begins to drop dramatically. The principal reason is the increasing rate of solid-state diffusion, which affects both the dislocations and the microstructural barriers. Dislocation configurations commonly recover by mechanisms such as climb and recombination, which proceed rapidly at high temperatures.
Furthermore, precipitates coarsen, and grains grow; virtually all of the available microstructural barriers become ineffective. Our results indicate that by using “cyclic” treatments with sequential LP and annealing steps, the compressive stresses survive, and the microstructure is stable after exposure to temperatures exceeding 0.5Tm for a long period of time. When the laser peening process is combined with annealing, dislocations formed as a result of LP and precipitates formed during annealing have a complex interaction that provides further stability at high temperatures. From a scientific point of view, this research lays the groundwork for studying a variety of physical, materials science, and mechanical engineering concepts. This research could lead to metals operating at higher sustained temperatures, enabling improved system efficiencies. The strengthening of metals by a variety of means (alloying, work hardening, and other processes) has been of interest for a wide range of applications. However, mechanistic understanding of the often complex interactions between dislocations and solute atoms, and between dislocations and precipitates, during plastic deformation has largely remained scattered in the literature. In this research, the actual mechanisms involved in the novel cyclic LP/annealing processes are elucidated through parallel studies of dislocation theory and the implementation of advanced experimental tools. The results of this research help validate a novel laser processing technique for high-temperature applications. This will greatly expand the applications of laser peening technology, originally devised only for temperatures lower than half of the melting temperature.
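As a rough illustration of why peening-induced compressive stresses survive below ~0.5Tm but relax quickly above it, one can plug numbers into a Zener-Wert-Avrami relaxation law, a form commonly used in the literature for thermal relaxation of residual stresses. This is not the authors' model, and every parameter value below is a hypothetical placeholder.

```python
import math

# Hedged, illustrative sketch (not the authors' model): thermal
# relaxation of residual stresses via a Zener-Wert-Avrami form,
#   sigma/sigma0 = exp(-(A * t * exp(-dH/(k*T)))**m).
# All parameter values are hypothetical placeholders.

K_B = 8.617e-5   # Boltzmann constant, eV/K
DH = 2.5         # activation enthalpy, eV (assumed)
A = 1e9          # rate prefactor, 1/s (assumed)
M = 0.3          # Avrami exponent (assumed)
TM = 1600.0      # absolute melting temperature of the alloy, K (assumed)

def retained_stress_fraction(temp_k, hours):
    """Fraction of the initial compressive residual stress retained."""
    rate = A * math.exp(-DH / (K_B * temp_k))
    return math.exp(-(rate * hours * 3600.0) ** M)

# Below 0.5*Tm the stresses largely survive; above it they relax quickly.
low = retained_stress_fraction(0.4 * TM, 100)   # 640 K for 100 h
high = retained_stress_fraction(0.6 * TM, 100)  # 960 K for 100 h
```

With these placeholder values, a 100-hour soak at 0.4Tm leaves most of the stress in place, while at 0.6Tm only a small fraction survives, mirroring the 0.5Tm threshold discussed above.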

Keywords: laser shock peening, mechanical properties, indentation, high temperature stability

Procedia PDF Downloads 148
24 Research Project of National Interest (PRIN-PNRR) DIVAS: Developing Methods to Assess Tree Vitality after a Wildfire through Analyses of Cambium Sugar Metabolism

Authors: Claudia Cocozza, Niccolò Frassinelli, Enrico Marchi, Cristiano Foderi, Alessandro Bizzarri, Margherita Paladini, Maria Laura Traversi, Eleftherious Touloupakis, Alessio Giovannelli

Abstract:

The development of tools to quickly identify the fate of injured trees after stress is highly relevant when biodiversity restoration of damaged sites is based on nature-based solutions. In this context, an approach to assess irreversible physiological damage within trees could help support management decisions for perturbed sites, in order to restore biodiversity, ensure the safety of the environment, and understand functional adjustments of ecosystems. Tree vitality can be estimated from a series of physiological proxies, such as cambium activity and the amounts of starch and soluble sugars in C-sinks, whilst the accumulation of ethanol within the cambial cells and phloem is considered an alert of cell death. However, their determination requires time-consuming laboratory protocols, which makes the approach unfeasible as a practical option in the field. The project aims to develop biosensors to assess the concentrations of soluble sugars and ethanol in stem tissues. These concentrations will be used to characterize injured trees and to discriminate between compromised and recovering trees directly in the forest. To reach this goal, we selected study sites subjected to prescribed fires or recent wildfires as experimental set-ups. Indeed, in Mediterranean countries, forest fire is a recurrent event that must be considered a central component of regional and global strategies in forest management and biodiversity restoration programs. A biosensor will be developed through a multistep process covering target-analyte characterization, bioreceptor selection, and, finally, calibration and testing of the sensor. To validate the biosensor signals, soluble sugars and ethanol will be quantified by HPLC and GC using synthetic media (in the lab) and phloem sap (in the field), whilst cambium vitality will be assessed by anatomical observations.
On burnt trees, stem growth will be monitored by dendrometers and/or estimated by tree-ring analyses, whilst the tree response to past fire events will be assessed by isotopic discrimination. Moreover, the fire characterization and the visual assessment procedure will be used to assign burnt trees to a vitality class. At the end of the project, a well-defined procedure combining biosensor signals and visual assessment will be produced and applied to a case study. The project outcomes and results will be properly packaged to reach, engage, and address the needs of the final users, and will be widely shared with relevant stakeholders involved in the optimal use of biosensors and in the management of post-fire areas. This project was funded by the National Recovery and Resilience Plan (NRRP), Mission 4, Component C2, Investment 1.1 - Call for tender No. 1409 of 14 September 2022 – ‘Progetti di Ricerca di Rilevante interesse Nazionale – PRIN’ of the Italian Ministry of University and Research, funded by the European Union – NextGenerationEU; Grant No. P2022Z5742, CUP B53D23023780001.

Keywords: phloem, scorched crown, conifers, prescribed burning, biosensors

Procedia PDF Downloads 15
23 ARGO: An Open Designed Unmanned Surface Vehicle Mapping Autonomous Platform

Authors: Papakonstantinou Apostolos, Argyrios Moustakas, Panagiotis Zervos, Dimitrios Stefanakis, Manolis Tsapakis, Nektarios Spyridakis, Mary Paspaliari, Christos Kontos, Antonis Legakis, Sarantis Houzouris, Konstantinos Topouzelis

Abstract:

For years, unmanned and remotely operated robots have been used as tools in industry, research, and education. The rapid development and miniaturization of sensors that can be attached to remotely operated vehicles has allowed industry leaders and researchers to use them as an affordable means of data acquisition in air, on land, and at sea. Despite recent developments in ground and unmanned airborne vehicles, only a small number of Unmanned Surface Vehicle (USV) platforms target the mapping and monitoring of environmental parameters for research and industry purposes. The ARGO project developed an open-design USV equipped with a multi-level control hardware architecture and state-of-the-art sensors and payloads for the autonomous monitoring of environmental parameters in large sea areas. The proposed USV is a catamaran-type vessel controlled over a wireless radio link (5G) from a ground-based control station, providing long-range mapping capabilities. The ARGO USV uses two fully redundant electric trolling motors with active vector thrust for omnidirectional movement, navigates with an open-source autopilot system and a high-accuracy GNSS device, and communicates over a 2.4 GHz digital link providing up to 20 km of line-of-sight (LoS) range. The 3-meter dual-hull composite structure offers well above 80 kg of usable payload capacity. Furthermore, solar and friction energy-harvesting methods provide clean energy to the propulsion system. The design is highly modular: each component or payload can be replaced or modified according to the desired task (industrial or research). The system can be equipped with a multiparameter sonde measuring up to 20 water parameters simultaneously, such as conductivity, salinity, turbidity, and dissolved oxygen. Furthermore, a high-end multibeam echo sounder can be installed at a specific boat datum for high-resolution shallow-water seabed mapping.
The system is designed to operate in the Aegean Sea. The developed USV is planned to be used as a system for autonomous data acquisition, mapping, and monitoring of bathymetry and various environmental parameters. The ARGO USV can operate in small or large ports, with the maneuverability and endurance to map large geographical extents at sea. The system presents state-of-the-art solutions in the following areas: i) on-board, real-time data processing and analysis capabilities; ii) an energy-independent and environmentally friendly platform made entirely of the latest aeronautical and marine materials; iii) the integration of advanced sensors in one system (photogrammetric and radiometric footprint, as well as connections to various environmental and inertial sensors); and iv) the information management application. The ARGO web-based application enables the system to depict the results of the data acquisition process in near real-time. All recorded environmental variables and indices are presented, allowing users to remotely access all raw and processed information through the implemented web-based GIS application.

Keywords: marine environment monitoring, unmanned surface vehicle, bathymetry mapping, sea environmental monitoring

Procedia PDF Downloads 138
22 Nigeria Rural Water Supply Management: Participatory Process as the Best Option

Authors: E. O. Aluta, C. A. Booth, D. G. Proverbs, T. Appleby

Abstract:

Challenges in the effective management of potable water have attracted global attention in recent years and remain a major priority in many world regions. Scarcity and unavailability of potable water may escalate poverty, obviate democratic expression of views, and militate against inter-sectoral development. These challenges stand in contrast to the inherent potential of the resource. Thus, while the creation of poverty may be regarded as a broad-based problem, it can manifest in life-shortening disease, frictions of interest that escalate into threats and warfare, the displacement of democratic principles by authoritarianism, and human rights abuses. These challenges may be seen as manifestations of ineffective management of the potable water resource and are therefore major problems in environmental protection. In reaction, some nations have re-examined their laws and policies, while others have developed innovative projects that seek to ameliorate the difficulties of providing sustainable potable water. These problems resonate in Nigeria, where the legal framework supporting the supply and management of potable water has been criticized as ineffective. This has impacted most heavily on rural community members, often regarded as ‘voiceless’. At that level, the participation of non-state actors has been identified as an effective strategy that can improve water supply. However, there are indications that this is not applied pragmatically, resulting in over-centralization and top-down management. Thus, this study focuses on how the participatory process may enable the development of a participatory water governance framework for use in Nigerian rural communities. The Rural Advisory Board (RAB) is proposed as a governing body to promote proximal relationships and institute democratisation borne out of participation, while enabling effective accountability and information.
The RAB establishes mechanisms for effectiveness, taking into consideration Transparency, Accountability and Participation (TAP), advocated as guiding principles for decision-makers. Other tools that may be explored in achieving these are the laws and policies supporting the water sector, under the direction of the ministries and the law courts, which ensure that laws are not violated. Community norms and values, consisting of the Nigerian traditional belief system, perceptions, attitudes and reality (often undermined in favour of legislation), are relied upon to pave the way for enforcement. Task forces consist of community members with specific designations of duties to ensure compliance and enforceability, and a cross-section of community members are assigned duties; thus, the principle of participation is pragmatically reflected. A review of the literature provided information on the potential of the participatory process in potable water governance. A qualitative methodology was adopted, using semi-structured interviews as the strategy for inquiry. A purposive sampling strategy, combining homogeneous, heterogeneous and criterion techniques, was applied. The samples, drawn from diverse walks of life, came from the study area of Delta State, Nigeria, involving the three local governments of Oshimili South, Uvwie and Warri South. The findings indicate that the participatory process empowers rural community members to make legitimate demands for TAP, including moving away from mono-decision-making in the supply and management of potable water. This is capable of restructuring the top-down management into a combined top-down/bottom-up system.

Keywords: participation, participatory process, participatory water governance, rural advisory board

Procedia PDF Downloads 382
21 Risks for Cyanobacteria Harmful Algal Blooms in Georgia Piedmont Waterbodies Due to Land Management and Climate Interactions

Authors: Sam Weber, Deepak Mishra, Susan Wilde, Elizabeth Kramer

Abstract:

The frequency and severity of cyanobacteria harmful algal blooms (CyanoHABs) have been increasing over time, with point- and non-point-source eutrophication and shifting climate paradigms blamed as the primary culprits. Excessive nutrients, warm temperatures, quiescent water, and heavy but less regular rainfall create more conducive environments for CyanoHABs. CyanoHABs can produce a spectrum of toxins that cause gastrointestinal stress, organ failure, and even death in humans and animals. To promote enhanced, proactive CyanoHAB management, risk modeling using geospatial tools can act as a predictive mechanism to supplement current CyanoHAB monitoring, management, and mitigation efforts. The risk maps would empower water managers to focus their efforts on high-risk waterbodies in an attempt to prevent CyanoHABs before they occur, and/or to observe those waterbodies more diligently. For this research, exploratory spatial data analysis techniques were used to identify the strongest predictors of CyanoHABs based on remote sensing-derived cyanobacteria cell density values for 771 waterbodies in the Georgia Piedmont and landscape characteristics of their watersheds. In-situ datasets for cyanobacteria cell density, nutrients, temperature, and rainfall patterns are not widely available, so free gridded geospatial datasets were used as proxy variables for assessing CyanoHAB risk. For example, the percent of a watershed that is agriculture was used as a proxy for nutrient loading, and the summer precipitation within a watershed was used as a proxy for water quiescence. Cyanobacteria cell density values were calculated using atmospherically corrected images from the European Space Agency’s Sentinel-2A satellite and its multispectral instrument sensor at a 10-meter ground resolution.
Seventeen explanatory variables were calculated for each watershed using the multi-petabyte geospatial catalogs available within the Google Earth Engine cloud computing interface. The seventeen variables were then used in a multiple linear regression model, and the strongest predictors of cyanobacteria cell density were selected for the final regression model. The seventeen explanatory variables included land cover composition, winter and summer temperature and precipitation data, topographic derivatives, vegetation index anomalies, and soil characteristics. Watershed maximum summer temperature, percent agriculture, percent forest, percent impervious surface, and waterbody area emerged as the strongest predictors of cyanobacteria cell density, with an adjusted R-squared value of 0.31 and a p-value near zero. The final regression equation was used to compute a normalized cyanobacteria cell density index, and a Jenks natural breaks classification was used to assign waterbodies designations of low, medium, or high risk. Of the 771 waterbodies, 24.38% were low risk, 37.35% were medium risk, and 38.26% were high risk. This study showed significant relationships between cyanobacteria cell density and free geospatial datasets representing summer maximum temperatures, nutrient loading associated with land use and land cover, and waterbody area. This data-analytics approach to CyanoHAB risk assessment corroborates the literature-established environmental triggers for CyanoHABs and presents a novel approach for CyanoHAB risk mapping in waterbodies across the greater southeastern United States.
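The workflow above, fitting a multiple linear regression, normalizing the fitted values into a risk index, and splitting the index into low/medium/high classes, can be sketched on synthetic data. The predictors, coefficients, and the 1D k-means used as a stand-in for the Jenks natural-breaks classifier are all illustrative assumptions.

```python
import numpy as np

# Illustrative sketch of the risk-mapping steps on synthetic data.
rng = np.random.default_rng(42)
n = 771                      # number of waterbodies, as in the study
X = rng.random((n, 5))       # stand-ins for max temp, % agriculture, etc.
beta_true = np.array([2.0, 3.0, -1.5, 1.0, 0.5])
y = X @ beta_true + rng.normal(0, 0.3, n)   # cell-density stand-in

# Multiple linear regression via least squares (with intercept).
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
fitted = A @ coef

# Normalized cyanobacteria cell-density index in [0, 1].
index = (fitted - fitted.min()) / (fitted.max() - fitted.min())

# 1D k-means as an approximation of Jenks natural breaks (k = 3 classes).
centers = np.quantile(index, [0.2, 0.5, 0.8])
for _ in range(50):
    labels = np.argmin(np.abs(index[:, None] - centers[None, :]), axis=1)
    centers = np.array([index[labels == k].mean() for k in range(3)])

shares = np.bincount(labels, minlength=3) / n   # low / medium / high shares
```

Jenks natural breaks is equivalent to optimal 1D clustering, so k-means is a reasonable stand-in here; a production analysis would use an exact breaks implementation.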

Keywords: cyanobacteria, land use/land cover, remote sensing, risk mapping

Procedia PDF Downloads 210
20 Analysis of Composite Health Risk Indicators Built at a Regional Scale and Fine Resolution to Detect Hotspot Areas

Authors: Julien Caudeville, Muriel Ismert

Abstract:

Analyzing the relationship between environment and health has become a major preoccupation for public health, as evidenced by the emergence of the French national plans for health and environment. These plans have identified two priorities: (1) to identify and manage geographic areas where hotspot exposures are suspected to generate a potential hazard to human health; (2) to reduce exposure inequalities. At a regional scale and the fine resolution required for exposure outcomes, environmental monitoring networks are not sufficient to characterize the multidimensionality of the exposure concept. To increase the representativeness of spatial exposure assessment approaches, composite risk indicators can be built using additional available databases and theoretical frameworks for combining risk factors. Achieving those objectives requires combining data processing and transfer modeling with a spatial approach, which in turn implies overcoming several scientific limitations: defining the variables and indicators of interest that can be built to associate and describe the global source-effect chain; linking and processing data from different sources and different spatial supports; and developing adapted methods to improve the representativeness and resolution of spatial data. A GIS-based modeling platform for quantifying human exposure to chemical substances (PLAINE: environmental inequalities analysis platform) was used to build health risk indicators within the Lorraine region (France). These indicators combined chemical substances (in soil, air and water) and noise risk factors. Tools were developed using modeling, spatial analysis and geostatistical methods to build and discretize the variables of interest from different supports and resolutions on a 1 km2 regular grid within the Lorraine region.
For example, surface soil concentrations were estimated with a kriging method able to integrate surface and point spatial supports. Then, an exposure model developed by INERIS was used to assess the transfer from soil to individual exposure through ingestion pathways. The distance from a polluted soil site was used as a proxy for contaminated sites. The air indicator combined modeled concentrations and estimated emissions to take into account 30 pollutants in the analysis. For water, drinking water concentrations were compared to drinking water standards to build a score spatialized using a distribution-unit service map. The Lden (day-evening-night) indicator was used to map noise around road infrastructure. The different risk factors were aggregated using several methodologies, to examine how the weighting and aggregation procedures affect the effectiveness of risk maps for decisions safeguarding citizens' health. The results make it possible to identify pollutant sources, determinants of exposure, and potential hotspot areas. A diagnostic tool was developed for stakeholders to visualize and analyze the composite indicators in an operational and accurate manner. The designed support system will be used in many applications and contexts: (1) mapping environmental disparities throughout the Lorraine region; (2) identifying vulnerable populations and determinants of exposure to set priorities and targets for pollution prevention, regulation and remediation; (3) providing an exposure database to quantify relationships between environmental indicators and the cancer mortality data provided by the French Regional Health Observatories.
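The aggregation step, combining normalized soil, air, water, and noise layers into one composite indicator, can be sketched as follows. The layers, weights, and the comparison of two aggregation schemes are illustrative assumptions, meant only to show why the weighting/aggregation choice matters for hotspot detection.

```python
import numpy as np

# Minimal sketch: four synthetic risk layers on a regular grid are
# min-max normalized and combined into a composite indicator under two
# aggregation schemes. Weights and layers are assumed, not PLAINE's.
rng = np.random.default_rng(7)
grid = (50, 50)
layers = {name: rng.random(grid) for name in ("soil", "air", "water", "noise")}

def minmax(a):
    return (a - a.min()) / (a.max() - a.min())

norm = np.stack([minmax(a) for a in layers.values()])   # shape (4, 50, 50)
weights = np.array([0.4, 0.3, 0.2, 0.1])                # assumed weights

# Weighted arithmetic mean (compensatory: a low factor can be offset)
# vs. weighted geometric mean (a single extreme factor dominates).
arith = np.tensordot(weights, norm, axes=1)
geom = np.exp(np.tensordot(weights, np.log(norm + 1e-9), axes=1))

# Hotspots: top 5% of cells under each scheme; the two sets differ.
hot_a = arith >= np.quantile(arith, 0.95)
hot_g = geom >= np.quantile(geom, 0.95)
overlap = (hot_a & hot_g).sum() / hot_a.sum()
```

That the two hotspot sets only partially overlap is exactly the sensitivity to aggregation choice that the abstract says was examined.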

Keywords: health risk, environment, composite indicator, hotspot areas

Procedia PDF Downloads 247
19 EcoTeka, an Open-Source Software for Urban Ecosystem Restoration through Technology

Authors: Manon Frédout, Laëtitia Bucari, Mathias Aloui, Gaëtan Duhamel, Olivier Rovellotti, Javier Blanco

Abstract:

Ecosystems must be resilient to ensure cleaner air, better water and soil quality, and thus healthier citizens. Technology can be an excellent tool to support urban ecosystem restoration projects, especially when based on open source and promoting open data. This is the goal of the ecoTeka application: a single digital tool for tree management that allows decision-makers to improve their urban forestry practices, enabling more responsible urban planning and climate change adaptation. EcoTeka provides city councils with three main functionalities tackling three of their challenges: easier biodiversity inventories, better green space management, and more efficient planning. To answer the cities’ need for reliable tree inventories, the application was first built with open data from OpenStreetMap and OpenTrees, but it will soon also support the creation of new data. To achieve this, a multi-source algorithm will be developed, based on the existing DeepForest artificial-intelligence model, integrating open-source satellite images, 3D representations from LiDAR, and street views from Mapillary. This data processing will make it possible to identify each tree's position, height, crown diameter, and taxonomic genus. To support urban forestry management, ecoTeka offers a dashboard for monitoring the city’s tree inventory and triggering alerts about upcoming interventions that are due. This tool was co-constructed with the green space departments of the French cities of Alès, Marseille, and Rouen. The third functionality of the application is a decision-making tool for urban planning, promoting biodiversity and landscape connectivity metrics to drive ecosystem restoration roadmaps. Based on landscape graph theory, we are currently experimenting with new methodological approaches to scale down regional ecological connectivity principles to local biodiversity conservation and urban planning policies.
This methodological framework will couple a graph-theoretic approach with biological data, mainly biodiversity occurrence (presence/absence) data available on international (e.g., GBIF), national (e.g., Système d’Information Nature et Paysage) and local (e.g., Atlas de la Biodiversité Communale) biodiversity data-sharing platforms, in order to support decisions on the conservation and restoration of ecological networks in urban areas. An experiment on this subject is currently ongoing with Montpellier Mediterranee Metropole. These projects and studies have shown that only 26% of tree inventory data is currently geo-localized in France; the rest is still kept on paper or in Excel sheets. Technology is evidently not yet used enough to enrich the knowledge city councils have about biodiversity in their cities, and existing open biodiversity data (e.g., occurrence, telemetry, or genetic data), species distribution models, and landscape graph connectivity metrics are still underexploited in making rational decisions for landscape and urban planning projects. This is the goal of ecoTeka: to support easier inventories of urban biodiversity and better management of urban spaces through rational planning and decisions relying on open databases. Future studies and projects will focus on the development of tools for reducing the artificialization of soils, selecting plant species adapted to climate change, and highlighting the need for ecosystem and biodiversity services in cities.
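One widely used landscape-graph connectivity metric that such a framework could compute is the Integral Index of Connectivity (IIC), where habitat patches are nodes weighted by area and links join patches within a species' dispersal distance. The toy habitat graph below (patch areas, links, landscape area) is entirely hypothetical and simply shows how reconnecting an isolated patch raises the index.

```python
from collections import deque

# Toy landscape graph: patch areas (ha) and links are hypothetical.
areas = {0: 10.0, 1: 5.0, 2: 8.0, 3: 2.0, 4: 6.0}
edges = [(0, 1), (1, 2), (2, 3)]        # patch 4 starts isolated
total_area = 100.0                      # AL: total landscape area, ha

adj = {n: set() for n in areas}
for u, v in edges:
    adj[u].add(v)
    adj[v].add(u)

def hops(src):
    """BFS shortest-path length (number of links) from src to all patches."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def iic():
    # IIC = sum_ij a_i * a_j / (1 + nl_ij) / AL^2, unreachable pairs -> 0.
    num = 0.0
    for i in areas:
        d = hops(i)
        for j in areas:
            if j in d:
                num += areas[i] * areas[j] / (1 + d[j])
    return num / total_area ** 2

iic_before = iic()
# Restoration scenario: a corridor reconnects the isolated patch 4.
adj[2].add(4)
adj[4].add(2)
iic_after = iic()
```

Comparing `iic_before` and `iic_after` quantifies the connectivity gain of one candidate corridor, which is the kind of scenario ranking a restoration roadmap needs.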

Keywords: digital software, ecological design of urban landscapes, sustainable urban development, urban ecological corridor, urban forestry, urban planning

Procedia PDF Downloads 69
18 The Use of Rule-Based Cellular Automata to Track and Forecast the Dispersal of Classical Biocontrol Agents at Scale, with an Application to the Fopius arisanus Fruit Fly Parasitoid

Authors: Agboka Komi Mensah, John Odindi, Elfatih M. Abdel-Rahman, Onisimo Mutanga, Henri Ez Tonnang

Abstract:

Ecosystems are networks of organisms and populations that form a community of various species interacting within their habitats. Such habitats are defined by abiotic and biotic conditions that establish the initial limits to a population's growth, development, and reproduction. The habitat’s conditions explain the context in which species interact to access resources such as food, water, space, shelter, and mates, allowing for feeding, dispersal, and reproduction. Dispersal is an essential life-history strategy that affects gene flow, resource competition, population dynamics, and species distributions. Despite the importance of dispersal in population dynamics and survival, understanding the mechanisms underpinning the dispersal of organisms remains challenging. For instance, when an organism moves into an ecosystem for survival and resource competition, its progression is highly influenced by factors such as its physiological state, climatic variables, and its ability to evade predation. Therefore, greater spatial detail is necessary to understand organism dispersal dynamics. Organism dispersal can be studied using empirical and mechanistic modelling approaches, with the adopted approach depending on the study's purpose. Cellular automata (CA) are an example of these approaches and have been successfully used in biological studies to analyze the dispersal of living organisms. A cellular automaton can be briefly described as a grid of cells, each potentially occupied by an individual, whose states evolve according to a set of neighbourhood rules. However, for modelling the dispersal of individual organisms at the landscape scale, we lack user-friendly tools that do not require expertise in mathematical modelling and computing, such as a visual analytics framework for tracking and forecasting the dispersal behaviour of organisms.
The term "visual analytics" (VA) describes a semi-automated approach to electronic data processing that is guided by users who can interact with the data via an interface. Essentially, VA converts large amounts of quantitative or qualitative data into graphical formats that can be customized to the operator's needs. Additionally, this approach can enhance the ability of users from various backgrounds to understand data, communicate results, and disseminate information across a wide range of disciplines. To support effective analysis of the dispersal of organisms at the landscape scale, we therefore designed Pydisp, a free visual data analytics tool for spatiotemporal dispersal modeling built in Python. Its user interface allows users to perform quick, interactive spatiotemporal analyses of species dispersal using bioecological and climatic data. Pydisp enables reuse and upgrading through simple principles such as fuzzy cellular automata algorithms. The potential of dispersal modeling is demonstrated in a case study predicting the dispersal of Fopius arisanus (Sonan), an endoparasitoid used to control Bactrocera dorsalis (Hendel) (Diptera: Tephritidae) in Kenya. The results obtained from this example clearly illustrate the parasitoid's dispersal process at the landscape level and confirm that dynamic processes in an agroecosystem are better understood when modeled using mechanistic approaches. Furthermore, as the example demonstrates, the software portrays the dispersal of organisms effectively even when detailed data on the species' dispersal mechanisms are unavailable.
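A fuzzy cellular automaton of the kind described can be sketched in a few lines: each cell holds a membership degree in [0, 1] for parasitoid presence, and at each step a cell's state is the fuzzy OR of its own state and its neighbours' states, damped by a local habitat-suitability surface. The grid, suitability values, and damping factor below are illustrative assumptions, not Pydisp's actual rules or the parasitoid's parameters.

```python
import numpy as np

# Illustrative fuzzy-CA dispersal sketch (assumed parameters throughout).
rng = np.random.default_rng(1)
size = 40
suitability = rng.random((size, size))       # climate/host suitability proxy
state = np.zeros((size, size))
state[size // 2, size // 2] = 1.0            # single release point

def step(s):
    # Max over the 4-neighbourhood = fuzzy OR of neighbouring presences.
    # (np.roll wraps at the edges, i.e. a toroidal grid; fine for a sketch.)
    nb = np.maximum.reduce([
        np.roll(s, 1, 0), np.roll(s, -1, 0),
        np.roll(s, 1, 1), np.roll(s, -1, 1),
    ])
    # Spread is damped by local suitability; cells never lose presence here.
    return np.maximum(s, 0.8 * nb * suitability)

for _ in range(15):
    state = step(state)

occupied = (state > 0.05).sum()   # cells with non-negligible presence
```

Replacing the random `suitability` grid with real bioecological and climatic layers turns this toy into the kind of landscape-scale forecast the abstract describes.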

Keywords: cellular automata, fuzzy logic, landscape, spatiotemporal

Procedia PDF Downloads 77
17 Northern Nigeria Vaccine Direct Delivery System

Authors: Evelyn Castle, Adam Thompson

Abstract:

Background: In 2013, the Kano State Primary Health Care Management Board redesigned its routine immunization supply chain from a diffused "pull" model to direct-delivery "push". The redesign addressed stockouts and reduced the time health facility staff spent collecting vaccines and reporting on vaccine usage. The board sought the help of a third-party logistics provider (3PL) for twice-monthly deliveries from its cold store to 484 facilities across 44 local governments. eHA's Health Delivery Systems group formed a 3PL to serve 326 of these facilities in partnership with the State. We focused on designing and implementing a technology system throughout. Basic methodologies: GIS Mapping: - Planning the delivery of vaccines to hundreds of health facilities requires detailed route planning for delivery vehicles. Mapping the road networks across Kano and Bauchi with a custom routing tool provided the information needed to optimize deliveries, reducing the number of kilometres driven each round by 20% and thereby reducing cost and delivery time. Direct Delivery Information System: - Vaccine direct deliveries are facilitated through pre-round planning (driven by a health facility database, extensive GIS, and inventory workflow rules), a manager and driver control panel for customizing delivery routines and reporting, a progress dashboard, schedules/routes, packing lists, delivery reports, and driver data collection applications. Move: Last Mile Logistics Management System: - MOVE has made vaccine supply information management timely, accurate, and actionable. It provides stock management workflow support, alerts management for cold chain exceptions/stockouts, and on-device analytics for health and supply chain staff. The software was built to be offline-first with a user-validated interface and experience. Deployed to hundreds of vaccine storage sites, the improved information tools help facilitate the process of system redesign and change management.
Findings: - Stockouts reduced from 90% to 33%. - Redesigned current health systems and managed vaccine supply for 68% of Kano's wards. - Near-real-time reporting and data availability to track stock. - Paperwork burdens on health staff dramatically reduced. - Medicine available when the community needs it. - Consistent vaccination dates for children under one to prevent polio, yellow fever, and tetanus. - Higher immunization rates = lower infection rates. - Hundreds of millions of Naira worth of vaccines successfully transported. - Fortnightly service to 326 facilities in 326 wards across 30 Local Government Areas. - 6,031 cumulative deliveries. - Over 3.44 million doses transported. - Minimum travel distance covered in a round of delivery of 2,000 km and a maximum of 6,297 km. - 153,409 km travelled by 6 drivers. - 500 facilities in 326 wards. - Data captured and synchronized for the first time. - Data-driven decision making now possible. Conclusion: eHA's vaccine direct delivery has met the challenges in Kano and Bauchi States and provided a reliable delivery service of vaccinations that ensures health facilities can run vaccination clinics for children under one. eHA uses innovative technology that delivers vaccines from Northern Nigerian zonal stores straight to healthcare facilities. It has helped healthcare workers spend less time managing supplies and more time delivering care, and will be rolled out nationally across Nigeria.
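The route-planning idea behind the custom GIS tool can be illustrated with the simplest ordering heuristic. This sketch is not the project's actual routing algorithm (which used road-network distances); it is a hedged illustration using straight-line (haversine) distance and a greedy nearest-neighbour order, with all names my own:

```python
import math

def nearest_neighbour_route(depot, facilities):
    """Greedy nearest-neighbour ordering of delivery stops from a depot.

    depot and facilities are (lat, lon) pairs in degrees. Distances here
    are great-circle (haversine) approximations; a production routing tool
    would use road-network distances and a stronger optimizer.
    """
    def haversine(a, b):
        lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
        h = (math.sin((lat2 - lat1) / 2) ** 2
             + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
        return 2 * 6371 * math.asin(math.sqrt(h))  # Earth radius 6371 km

    route, remaining, current = [], list(facilities), depot
    while remaining:
        # Always drive to the closest not-yet-visited facility.
        nxt = min(remaining, key=lambda f: haversine(current, f))
        route.append(nxt)
        remaining.remove(nxt)
        current = nxt
    return route
```

Even a heuristic like this, applied per delivery round, shows how an ordering change alone can cut kilometres driven; the reported 20% reduction came from optimizing over mapped road networks.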

Keywords: direct delivery information system, health delivery system, GIS mapping, Northern Nigeria, vaccines

Procedia PDF Downloads 371
16 Design of DNA Origami Structures Using LAMP Products as a Combined System for the Detection of Extended Spectrum B-Lactamases

Authors: Kalaumari Mayoral-Peña, Ana I. Montejano-Montelongo, Josué Reyes-Muñoz, Gonzalo A. Ortiz-Mancilla, Mayrin Rodríguez-Cruz, Víctor Hernández-Villalobos, Jesús A. Guzmán-López, Santiago García-Jacobo, Iván Licona-Vázquez, Grisel Fierros-Romero, Rosario Flores-Vallejo

Abstract:

β-lactam antibiotics include some of the most frequently used small drug molecules against bacterial infections. Nevertheless, an alarming decrease in their efficacy has been reported due to the emergence of antibiotic-resistant bacteria. Infections caused by bacteria expressing extended-spectrum β-lactamases (ESBLs) are difficult to treat and account for higher morbidity and mortality rates, delayed recovery, and a high economic burden. According to the Global Report on Antimicrobial Resistance Surveillance, it is estimated that mortality due to resistant bacteria will rise to 10 million cases per year worldwide. These facts highlight the importance of developing low-cost and readily accessible methods for detecting ESBL-producing bacteria to prevent their spread and promote accurate and fast diagnosis. Bacterial detection is commonly done using molecular diagnostic techniques, where PCR stands out for its high performance. However, this technique requires specialized equipment that is not available everywhere, is time-consuming, and has a high cost. Loop-mediated isothermal amplification (LAMP) is an alternative technique that works at a constant temperature, significantly decreasing the equipment cost. It yields double-stranded DNA of several lengths with repetitions of the target DNA sequence as a product. Although positive and negative LAMP results can be discriminated by colorimetry, fluorescence, and turbidity, there is still considerable room for improvement in point-of-care implementation. DNA origami is a technique that allows the formation of 3D nanometric structures by folding a long single-stranded DNA (the scaffold) into a determined shape with the help of short DNA sequences (staples), which hybridize with the scaffold. This research aimed to generate DNA origami structures using LAMP products as scaffolds to improve the sensitivity of ESBL detection in point-of-care diagnosis. For this study, the coding sequence of the CTX-M-15 ESBL of E. coli was used to generate the LAMP products. The set of LAMP primers was designed using PrimerExplorerV5. As a result, a target sequence of 200 nucleotides from CTX-M-15 ESBL was obtained. Afterward, eight different DNA origami structures were designed from the target sequence in SDCadnano and analyzed with CanDo to evaluate the stability of the 3D structures. The designs were constructed minimizing the total number of staples to reduce cost and complexity for point-of-care applications. After analyzing the DNA origami designs, two structures were selected: the first a zig-zag flat structure, the second a wall-like shape. Given the sequence repetitions in the scaffold, each could be assembled with only six different staples, ranging from 18 to 80 nucleotides. Simulations of both structures were performed using scaffolds of different sizes, yielding stable structures in all cases. The generation of the LAMP products was verified by colorimetry and electrophoresis, and the formation of the DNA structures was analyzed using electrophoresis and colorimetry. The modeling of novel detection methods through bioinformatics tools allows reliable control and prediction of results. To our knowledge, this is the first study that combines LAMP products and DNA origami to detect ESBL-producing bacterial strains, which represents a promising methodology for point-of-care diagnosis.
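The staple economy described above rests on one property: a staple hybridizes wherever its reverse complement occurs in the scaffold, and a LAMP product's tandem repeats give each staple multiple binding sites. As a hedged illustration (toy sequences and function names of my own, not the authors' design pipeline), this check can be written directly:

```python
def reverse_complement(seq):
    """Reverse complement of an uppercase A/C/G/T DNA sequence."""
    comp = {"A": "T", "T": "A", "C": "G", "G": "C"}
    return "".join(comp[b] for b in reversed(seq))

def staple_binding_sites(scaffold, staple):
    """0-based scaffold positions where a staple can hybridize.

    A staple binds wherever its reverse complement appears in the
    scaffold; repeated target copies in a LAMP product therefore give a
    single staple several binding sites, which is what lets only six
    staples fold the whole scaffold.
    """
    site = reverse_complement(staple)
    return [i for i in range(len(scaffold) - len(site) + 1)
            if scaffold[i:i + len(site)] == site]
```

For a scaffold containing two tandem copies of a 7-mer, the staple complementary to that 7-mer reports two sites, one per repeat.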

Keywords: beta-lactamases, antibiotic resistance, DNA origami, isothermal amplification, LAMP technique, molecular diagnosis

Procedia PDF Downloads 219
15 Analysis of Capillarity Phenomenon Models in Primary and Secondary Education in Spain: A Case Study on the Design, Implementation, and Analysis of an Inquiry-Based Teaching Sequence

Authors: E. Cascarosa-Salillas, J. Pozuelo-Muñoz, C. Rodríguez-Casals, A. de Echave

Abstract:

This study focuses on improving the understanding of the capillarity phenomenon among Primary and Secondary Education students. Despite being a common concept in daily life and covered in various subjects, students’ comprehension remains limited. This work explores inquiry-based teaching methods to build a conceptual foundation of capillarity by examining the forces involved. The study adopts an inquiry-based teaching approach supported by research emphasizing the importance of modeling in science education. Scientific modeling aids students in applying knowledge across varied contexts and developing systemic thinking, allowing them to construct scientific models applicable to everyday situations. This methodology fosters the development of scientific competencies such as observation, hypothesis formulation, and communication. The research was structured as a case study with activities designed for Spanish Primary and Secondary Education students aged 9 to 13. The process included curriculum analysis, the design of an activity sequence, and its implementation in classrooms. Implementation began with questions that students needed to resolve using available materials, encouraging observation, experimentation, and the re-contextualization of activities to everyday phenomena where capillarity is observed. Data collection tools included audio and video recordings of the sessions, which were transcribed and analyzed alongside the students' written work. Students' drawings on capillarity were also collected and categorized. Qualitative analyses of the activities showed that, through inquiry, students managed to construct various models of capillarity, reflecting an improved understanding of the phenomenon. Initial activities allowed students to express prior ideas and formulate hypotheses, which were then refined and expanded in subsequent sessions. 
The generalization and use of graphical representations of their ideas on capillarity, analyzed alongside their written work, enabled the categorization of capillarity models. Intuitive Model: a visual and straightforward representation without explanations of how or why the phenomenon occurs. Simple symbolic elements, such as arrows to indicate water rising, are used without detailed or causal understanding. It reflects an initial, immediate perception of the phenomenon, interpreted as something that happens "on its own" without delving into the microscopic level. Explanatory Intuitive Model: students begin to incorporate causal explanations, though still limited and without complete scientific accuracy. They represent the role of materials and use basic terms such as "absorption" or "attraction" to describe the rise of water. This model shows a more complex understanding in which the phenomenon is not only observed but also partially explained in terms of interaction, though without microscopic detail. School Scientific Model: this model reflects a more advanced and detailed understanding. Students represent the phenomenon using specific scientific concepts like "surface tension," "cohesion," and "adhesion," including structured explanations connecting microscopic and macroscopic levels. At this level, students model the phenomenon as a coherent system, demonstrating how various forces or properties interact in the capillarity process, with representations at the microscopic level. The study demonstrated that the capillarity phenomenon can be effectively approached in class through the experimental observation of everyday phenomena, explained through guided inquiry learning. The methodology facilitated students' construction of capillarity models and served to analyze an interaction phenomenon of different forces occurring at the microscopic level.

Keywords: capillarity, inquiry-based learning, scientific modeling, primary and secondary education, conceptual understanding, drawing analysis

Procedia PDF Downloads 12
14 A Case Study on Utility of 18FDG-PET/CT Scan in Identifying Active Extra Lymph Nodes and Staging of Breast Cancer

Authors: Farid Risheq, M. Zaid Alrisheq, Shuaa Al-Sadoon, Karim Al-Faqih, Mays Abdulazeez

Abstract:

Breast cancer is the most frequently diagnosed cancer worldwide and a common cause of death among women. Various conventional anatomical imaging tools are utilized for diagnosis, histological assessment, and TNM (Tumor, Node, Metastasis) staging of breast cancer. Biopsy of the sentinel lymph node is becoming an alternative to axillary lymph node dissection. Advances in 18F-fluorodeoxyglucose positron emission tomography/computed tomography (18FDG-PET/CT) imaging have facilitated breast cancer diagnosis by exploiting the biological trapping of 18FDG inside lesion cells, expressed as the maximum standardized uptake value (SUVmax). Objective: To present the utility of 18FDG-PET/CT scans in detecting active extra lymph nodes and distant occult metastases for breast cancer staging. Subjects and Methods: Four female patients presented with TNM stages of breast cancer initially classified using conventional anatomical diagnostic techniques. 18FDG-PET/CT scans were performed one hour after intravenous injection of 300-370 MBq of 18FDG, with 7-8 bed positions at 130 s each. Transverse, sagittal, and coronal views, fused PET/CT, and MIP modalities were reconstructed for each patient. Results: A total of twenty-four lesions in breast, extended lesions to lung, liver, and bone, and active extra lymph nodes were detected among the patients. The initial TNM stage was significantly changed post 18FDG-PET/CT scan for each patient, as follows: Patient 1: Initial TNM stage: T1N1M0 (stage I). Finding: two lesions in the right breast (3.2 cm2, SUVmax=10.2; 1.8 cm2, SUVmax=6.7), associated with metastases to two right axillary lymph nodes. Final TNM stage: T1N2M0 (stage II). Patient 2: Initial TNM stage: T2N2M0 (stage III). Finding: right breast lesion (6.1 cm2, SUVmax=15.2), associated with metastases to the right internal mammary lymph node, two right axillary lymph nodes, and sclerotic lesions in the right scapula. Final TNM stage: T2N3M1 (stage IV). Patient 3: Initial TNM stage: T2N0M1 (stage III).
Finding: left breast lesion (11.1 cm2, SUVmax=18.8), associated with metastases to two lymph nodes in the left hilum and three lesions across both lungs. Final TNM stage: T2N2M1 (stage IV). Patient 4: Initial TNM stage: T4N1M1 (stage III). Finding: four lesions in the upper outer quadrant of the right breast (largest: 12.7 cm2, SUVmax=18.6), in addition to one lesion in the left breast (4.8 cm2, SUVmax=7.1), associated with metastases to multiple lesions in the liver (largest: 11.4 cm2, SUV=8.0) and two bony lytic lesions in the left scapula and the C1 vertebra. No evidence of regional or distant lymph node involvement. Final TNM stage: T4N0M1 (stage IV). Conclusions: Our results demonstrated that 18FDG-PET/CT scans significantly changed the TNM stages of breast cancer patients. While the T factor was unchanged, the N and M factors showed significant variations. A single PET/CT session was effective in detecting active extra lymph nodes and distant occult metastases that were not identified by conventional diagnostic techniques, and might advantageously replace bone scans and contrast-enhanced CT of the chest, abdomen, and pelvis. Applying 18FDG-PET/CT early in the investigation might shorten diagnosis time, help in deciding on an adequate treatment protocol, and improve patients' quality of life and survival. Trapping of 18FDG in malignant lesion cells after a PET/CT scan increases the retention index (RI%) for a considerable time, which might help localize the sentinel lymph node for biopsy using a handheld gamma probe detector. Future work is required to demonstrate this utility.
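The two quantities the abstract leans on, SUV and the retention index, have standard textbook definitions that the text does not spell out. The sketch below uses the common body-weight normalization for SUV and the usual dual-time-point definition of RI%; function names and unit choices are mine, for illustration only:

```python
def suv(activity_kbq_per_ml, injected_dose_mbq, body_weight_kg):
    """Body-weight-normalized standardized uptake value.

    SUV = tissue activity concentration / (injected dose / body weight).
    With activity in kBq/ml, dose in MBq, and weight in kg (1 g of tissue
    taken as 1 ml), the unit conversions cancel and SUV is dimensionless.
    """
    return activity_kbq_per_ml * body_weight_kg / injected_dose_mbq

def retention_index(suv_early, suv_delayed):
    """RI% between an early and a delayed scan of the same lesion.

    Malignant lesions tend to keep accumulating FDG over time, so a
    positive RI% supports the prolonged-trapping behaviour described in
    the conclusions.
    """
    return 100.0 * (suv_delayed - suv_early) / suv_early
```

For example, a lesion at 5 kBq/ml in a 70 kg patient injected with 350 MBq gives SUV = 1.0, and a lesion whose SUVmax rises from 8.0 to 10.0 between scans has RI% = 25.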

Keywords: axillary lymph nodes, breast cancer staging, fluorodeoxyglucose positron emission tomography/computed tomography, lymph nodes

Procedia PDF Downloads 311
13 Development of a Core Set of Clinical Indicators to Measure Quality of Care for Thyroid Cancer: A Modified-Delphi Approach

Authors: Liane J. Ioannou, Jonathan Serpell, Cino Bendinelli, David Walters, Jenny Gough, Dean Lisewski, Win Meyer-Rochow, Julie Miller, Duncan Topliss, Bill Fleming, Stephen Farrell, Andrew Kiu, James Kollias, Mark Sywak, Adam Aniss, Linda Fenton, Danielle Ghusn, Simon Harper, Aleksandra Popadich, Kate Stringer, David Watters, Susannah Ahern

Abstract:

BACKGROUND: There are significant variations in the management, treatment, and outcomes of thyroid cancer, particularly in the role of: diagnostic investigation and pre-treatment scanning; optimal extent of surgery (total or hemi-thyroidectomy); use of active surveillance for small low-risk cancers; central lymph node dissection (therapeutic or prophylactic); outcomes following surgery (e.g. recurrent laryngeal nerve palsy, hypocalcaemia, hypoparathyroidism); post-surgical hormone, calcium, and vitamin D therapy; and provision and dosage of radioactive iodine treatment. A proven strategy for reducing variations in outcomes and improving survival is to measure and compare them using high-quality clinical registry data. Clinical registries provide the most effective means of collecting high-quality data and are a tool for quality improvement. Where they have been introduced at a state or national level, registries have become one of the most clinically valued tools for quality improvement. To benchmark clinical care, clinical quality registries require systematic measurement at predefined intervals and the capacity to report information back to participating clinical units. OBJECTIVE: The aim of this study was to develop a core set of clinical indicators that enable measurement and reporting of quality of care for patients with thyroid cancer. We hypothesise that measuring clinical quality indicators, developed to identify differences in quality of care across sites, will reduce variation and improve patient outcomes and survival, thereby lessening costs and the healthcare burden to the Australian community. METHOD: Preparatory work and scoping were conducted to identify existing high-quality clinical guidelines and best practice for thyroid cancer both nationally and internationally, as well as relevant literature. A bi-national panel was invited to participate in a modified Delphi process.
Panelists were asked to rate each proposed indicator on a Likert scale of 1-9 in a three-round iterative process. RESULTS: A total of 236 potential quality indicators were identified. One hundred and ninety-two indicators were removed to reflect the data captured by the Australian and New Zealand Thyroid Cancer Registry (ANZTCR) (from diagnosis to 90 days post-surgery). The remaining 44 indicators were presented to the panelists for voting. A further 21 indicators were later added by the panelists, bringing the total number of potential quality indicators to 65. Of these, 21 were considered the most important and feasible indicators for measuring quality of care in thyroid cancer, of which 12 were recommended for inclusion in the final set. The consensus indicator set spans the spectrum of care, including: preoperative care; surgery; surgical complications; staging and post-surgical treatment planning; and post-surgical treatment. CONCLUSIONS: This study provides a core set of quality indicators for measuring quality of care in thyroid cancer. This indicator set can be applied as a tool for internal quality improvement, comparative quality reporting, public reporting, and research. Inclusion of these quality indicators in monitoring databases such as clinical quality registries will enable opportunities for benchmarking and feedback on best-practice care to clinicians involved in the management of thyroid cancer.
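A modified-Delphi round reduces each indicator's 1-9 Likert ratings to an include/exclude/re-vote decision. The abstract does not state the panel's exact decision rule, so the sketch below uses the common RAND/UCLA-style median thresholds as an assumed example (median 7-9 include, 1-3 exclude, otherwise carried to the next round):

```python
from statistics import median

def classify_indicator(ratings, include_cut=7, exclude_cut=3):
    """Classify one candidate indicator from panelists' 1-9 Likert ratings.

    Thresholds follow the widely used RAND/UCLA convention and are an
    assumption here, not the rule reported by this study's panel.
    """
    m = median(ratings)
    if m >= include_cut:
        return "include"
    if m <= exclude_cut:
        return "exclude"
    return "revote"   # equivocal: re-presented in the next Delphi round
```

Applying such a rule round by round is how a 65-indicator candidate pool converges to the small consensus set reported above.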

Keywords: clinical registry, Delphi survey, quality indicators, quality of care

Procedia PDF Downloads 179
12 Optimizing AI Voice for Adolescent Health Education: Preferences and Trustworthiness Across Teens and Parents

Authors: Yu-Lin Chen, Kimberly Koester, Marissa Raymond-Flesh, Anika Thapar, Jay Thapar

Abstract:

Purpose: Effectively communicating adolescent health topics to teens and their parents is crucial. This study critically evaluates the optimal use of artificial intelligence (AI) tools, which are increasingly prevalent in disseminating health information. By fostering a deeper understanding of AI voice preference in the context of health, the research aspires to have a ripple effect, enhancing the collective health literacy and decision-making capabilities of both teenagers and their parents. The study explores the potential of AI voices within health learning modules for annual well-child visits. We aim to identify preferred voice characteristics and understand factors influencing perceived trustworthiness, ultimately aiming to improve health literacy and decision-making in both demographics. Methods: A cross-sectional study assessed preferences and trust perceptions of AI voices in learning modules among teens (aged 11-18) and their parents/guardians in Northern California. The study involved the development of four distinct learning modules covering adolescent health-related topics: general communication, sexual and reproductive health communication, parental monitoring, and well-child check-ups. Participants evaluated eight AI voices across the modules on six factors: intelligibility, naturalness, prosody, social impression, trustworthiness, and overall appeal, using Likert scales ranging from 1 to 10 (the higher, the better). They were also asked to select their preferred voice for each module. Descriptive statistics summarized participant demographics. Chi-square and t-tests explored differences in voice preferences between groups. Regression models identified factors affecting the perceived trustworthiness of the top-selected voice per module. Results: Data from 104 participants (63 teens; 41 parents/guardians) were included in the analysis.
The mean age was 14.9 for teens (54% male) and 41.9 for parents/guardians (12% male). Voice quality ratings were similar across groups, but preferences varied by topic. For instance, in general communication, teens leaned towards young female voices, while parents preferred mature female tones. Interestingly, this trend reversed for parental monitoring, with teens favoring mature male voices and parents opting for mature female ones. Both groups, however, converged on mature female voices for sexual and reproductive health topics. Beyond preferences, the study delved into factors influencing perceived trustworthiness. Social impression and sound appeal emerged as the most significant contributors across all modules, jointly explaining 71-75% of the variance in trustworthiness ratings. Conclusion: The study emphasizes the importance of catering AI voices to specific audiences and topics. Social impression and sound appeal emerged as critical factors influencing perceived trustworthiness across all modules. These findings highlight the need to tailor AI voices to the audience's age and the specific health information being delivered. Ensuring AI voices resonate with both teens and their parents can foster engagement and trust, ultimately leading to improved health literacy and decision-making for both groups. Limitations and future research: This study lays the groundwork for understanding AI voice preferences among teenagers and their parents in healthcare settings. However, limitations exist. The sample represents a specific geographic location, and cultural variations might influence preferences. Additionally, the modules focused on topics related to well-child visits, and preferences might differ for more sensitive health topics. Future research should explore these limitations and investigate the long-term impact of AI voices on user engagement, health outcomes, and health behaviors.
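The "variance explained" figure above comes from regressing trustworthiness ratings on the other factors. The authors' exact model specification is not given in the abstract, so the following is only a sketch of the general approach: ordinary least squares with two predictors, returning the R² that corresponds to the 71-75% figure (variable names are mine):

```python
import numpy as np

def trust_variance_explained(social, appeal, trust):
    """Share of variance in trust ratings explained jointly by social
    impression and sound appeal, via ordinary least squares with intercept.
    """
    t = np.asarray(trust, dtype=float)
    X = np.column_stack([np.ones(len(t)), social, appeal])  # design matrix
    beta, *_ = np.linalg.lstsq(X, t, rcond=None)            # OLS fit
    resid = t - X @ beta
    ss_res = float(resid @ resid)
    ss_tot = float(((t - t.mean()) ** 2).sum())
    return 1.0 - ss_res / ss_tot                            # R-squared
```

Run per module on the Likert-scale columns, a value of 0.71-0.75 from this statistic matches the joint contribution the study reports.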

Keywords: artificial intelligence, trustworthiness, voice, adolescent

Procedia PDF Downloads 54
11 Structural Characteristics of HPDSP Concrete on Beam Column Joints

Authors: Hari Krishan Sharma, Sanjay Kumar Sharma, Sushil Kumar Swar

Abstract:

Inadequate transverse reinforcement is considered the main reason for the beam-column joint shear failures observed during recent earthquakes. The DSP matrix consists of cement and a high content of micro-silica with a low water-to-cement ratio, while the aggregates are graded quartz sand. The use of reinforcing fibres leads not only to increased tensile/bending strength and specific fracture energy, but also to reduced brittleness and, consequently, to non-explosive rupture. Besides, fibre-reinforced materials are more homogeneous and less sensitive to small defects and flaws. Recent work on the freeze-thaw durability (also in the presence of de-icing salts) of fibre-reinforced DSP confirms its excellent behaviour over the expected long-term service life. DSP materials, including fibre-reinforced DSP and CRC (Compact Reinforced Composites), are obtained by using high quantities of superplasticizers and high volumes of micro-silica. Steel fibres of high tensile yield strength, small diameter, and short length, in different fibre volume percentages and aspect ratios, are utilized to improve performance by reducing the brittleness of the matrix material. In the case of High Performance Densified Small Particle Concrete (HPDSPC), the concrete is dense at the micro-structural level, and its tensile strain capacity is much higher than that of conventional SFRC, SIFCON, and SIMCON. Beam-column sub-assemblages used as moment-resisting frames were constructed using HPDSPC in the joint region with varying quantities of steel fibres, fibre aspect ratios, and fibre orientations in the critical section. These sub-assemblages were tested under cyclic/earthquake loading.
Besides loading measurements, frame displacements, diagonal joint strain, and rebar strain adjacent to the joint were also measured to investigate the stress-strain behaviour, load-deformation characteristics, joint shear strength, failure mechanism, ductility-associated parameters, and stiffness and energy-dissipation parameters of the beam-column sub-assemblages. Finally, a design procedure for the optimum design of HPDSPC beam-column joint sub-assemblages under moment, shear forces, and axial forces is proposed. It is well recognized in structural design and research that implementing a material brittleness measure in the design of RC structures can improve structural reliability by providing uniform safety margins over a wide range of structural sizes and material compositions. This has led to the development of high-performance concrete with an optimized combination of structural properties. The structural applications of HPDSPC, because of its extremely high strength, will reduce dead load significantly compared to normal-weight concrete, offering substantial cost savings by providing improved seismic response, longer spans, thinner sections, less reinforcing steel, and lower foundation costs. These cost-effective parameters make this material versatile for use in various structural applications such as beam-column joints in industry, airports, parking areas, docks, and harbours, as well as containers for hazardous materials, safety boxes, and moulds and tools for polymer composites and metals.

Keywords: high performance densified small particle concrete (HPDSPC), steel fibre reinforced concrete (SFRC), slurry infiltrated concrete (SIFCON), Slurry infiltrated mat concrete (SIMCON)

Procedia PDF Downloads 301
10 Ultra-Rapid and Efficient Immunomagnetic Separation of Listeria Monocytogenes from Complex Samples in High-Gradient Magnetic Field Using Disposable Magnetic Microfluidic Device

Authors: L. Malic, X. Zhang, D. Brassard, L. Clime, J. Daoud, C. Luebbert, V. Barrere, A. Boutin, S. Bidawid, N. Corneau, J. Farber, T. Veres

Abstract:

The incidence of infections caused by foodborne pathogens such as Listeria monocytogenes (L. monocytogenes) poses a great potential threat to public health and safety. These issues are further exacerbated by legal repercussions due to “zero tolerance” food safety standards adopted in developed countries. Unfortunately, a large number of related disease outbreaks are caused by pathogens present in extremely low counts currently undetectable by available techniques. The development of highly sensitive and rapid detection of foodborne pathogens is therefore crucial, and requires robust and efficient pre-analytical sample preparation. Immunomagnetic separation is a popular approach to sample preparation. Microfluidic chips combined with external magnets have emerged as viable high throughput methods. However, external magnets alone are not suitable for the capture of nanoparticles, as very strong magnetic fields are required. Devices that incorporate externally applied magnetic field and microstructures of a soft magnetic material have thus been used for local field amplification. Unfortunately, very complex and costly fabrication processes used for integration of soft magnetic materials in the reported proof-of-concept devices would prohibit their use as disposable tools for food and water safety or diagnostic applications. We present a sample preparation magnetic microfluidic device implemented in low-cost thermoplastic polymers using fabrication techniques suitable for mass-production. The developed magnetic capture chip (M-chip) was employed for rapid capture and release of L. monocytogenes conjugated to immunomagnetic nanoparticles (IMNs) in buffer and beef filtrate. The M-chip relies on a dense array of Nickel-coated high-aspect ratio pillars for capture with controlled magnetic field distribution and a microfluidic channel network for sample delivery, waste, wash and recovery. 
The developed nickel-coating and passivation process allows the generation of switchable local perturbations within the uniform magnetic field produced by a pair of permanent magnets placed at opposite edges of the chip. This leads to a strong and reversible trapping force, wherein high local magnetic field gradients allow efficient capture of IMNs conjugated to L. monocytogenes flowing through the microfluidic chamber. Experimental optimization of the M-chip was performed using commercially available magnetic microparticles and fabricated silica-coated iron-oxide nanoparticles. The fabricated nanoparticles were optimized to achieve the desired magnetic moment, and their surface functionalization was tailored to allow efficient capture-antibody immobilization. The integration, validation, and further optimization of the capture and release protocol are demonstrated using both dead and live L. monocytogenes through fluorescence microscopy and the plate-culture method. The capture efficiency of the chip was found to vary as a function of the Listeria-to-nanoparticle concentration ratio. A maximum capture efficiency of 30% was obtained, and the 24-hour plate-culture method allowed detection from an initial sample concentration of only 16 cfu/ml. The device was also very efficient in concentrating the sample from a 10 ml initial volume: 280% concentration efficiency was achieved in only 17 minutes, demonstrating the suitability of the system for food safety applications. In addition, the flexible design and low-cost fabrication process will allow rapid sample preparation for applications beyond food and water safety, including point-of-care diagnosis.
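The two figures of merit quoted above use different denominators, which is why one can exceed 100%. The abstract does not give the authors' exact definitions, so the sketch below uses the plausible standard ones (captured cells over input cells, and recovered concentration over input concentration), with names of my own:

```python
def capture_efficiency(cfu_captured, cfu_input):
    """Fraction of input cells retained on the pillar array, in percent."""
    return 100.0 * cfu_captured / cfu_input

def concentration_efficiency(conc_recovered, conc_input):
    """Recovered-to-input concentration ratio, in percent.

    Values above 100% mean the eluate is more concentrated than the
    original sample, e.g. when cells from a 10 ml sample are released
    into a much smaller recovery volume.
    """
    return 100.0 * conc_recovered / conc_input
```

Under these definitions, recovering 48 of 160 input cfu gives the reported 30% capture efficiency, and an eluate 2.8 times more concentrated than the input gives the reported 280%.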

Keywords: array of pillars, bacteria isolation, immunomagnetic sample preparation, polymer microfluidic device

Procedia PDF Downloads 279
9 Teacher Collaboration Impact on Bilingual Students’ Oral Communication Skills in Inclusive Contexts

Authors: Diana González, Marta Gràcia, Ana Luisa Adam-Alcocer

Abstract:

Incorporating digital tools into educational practices represents a valuable approach for enriching the quality of teachers' educational practices in oral competence and fostering improvements in student learning outcomes. This study aims to promote a collaborative and culturally sensitive approach to professional development between teachers and a speech therapist to enhance their self-awareness and reflection on high-quality educational practices that integrate school components to strengthen children's oral communication and pragmatic skills. The study involved five bilingual teachers fluent in both English and Spanish, three specializing in special education and two in general education. It focused on Spanish-English bilingual students, aged 3-6, who were experiencing speech delays or disorders in a New York City public school, with the collaboration of a speech therapist. Using EVALOE-DSS (Assessment Scale of Oral Language Teaching in the School Context - Decision Support System), teachers conducted self-assessments of their teaching practices, reflected, and made decisions across six classes from March to June, focusing on students' communicative competence in various activities. Concurrently, the speech therapist observed and evaluated six classes per teacher using EVALOE-DSS during the same period. Additionally, monthly professional development meetings were held between the speech therapist and teachers, centering on classroom interactions, instructional strategies, and the progress of both teachers and students. Findings highlight the value of the digital tool EVALOE-DSS in analyzing communication patterns and trends among bilingual children in inclusive settings. It helps identify areas for improvement through teacher and speech therapist collaboration. After self-reflection meetings, teachers demonstrated increased awareness of students' needs in oral language and pragmatic skills.
They also exhibited enhanced use of strategies outlined in EVALOE-DSS, such as actively guiding and orienting students during oral language activities, promoting student-initiated communicative interactions, teaching students how to seek and provide information, and managing turn-taking to ensure inclusive participation. Teachers participating in the professional development program showed positive progress in assessing their classes across all dimensions of the training tool, including instructional design, teacher conversation management, pupil conversation management, communicative functions, teacher strategies, and pupil communication functions. This includes aspects related to both teacher actions and child actions, particularly in child language development. This progress underscores the effectiveness of individual reflection (conducted weekly or biweekly using EVALOE-DSS) as well as of collaborative reflection among the teachers and the speech therapist during meetings. EVALOE-DSS has proven effective in supporting teachers' self-reflection, decision-making, and classroom changes, leading to improved development of students' oral language and pragmatic skills. It has facilitated culturally sensitive evaluations of communication among bilingual children, cultivating collaboration between teachers and the speech therapist to identify areas of growth. Participants demonstrated substantial progress across all dimensions assessed by EVALOE-DSS, including improved management of pupil communication functions, implementation of effective teaching strategies, and better classroom dynamics. Regular reflection sessions using EVALOE-DSS supported continuous improvement in instructional practices, highlighting the tool's role in fostering reflective teaching and enriching student learning experiences. 
Overall, EVALOE-DSS has proven invaluable for enhancing teaching effectiveness and promoting meaningful student interactions in diverse educational settings.

Keywords: bilingual students, collaboration, culturally sensitive, oral communication skills, self-reflection

Procedia PDF Downloads 34
8 XAI Implemented Prognostic Framework: Condition Monitoring and Alert System Based on RUL and Sensory Data

Authors: Faruk Ozdemir, Roy Kalawsky, Peter Hubbard

Abstract:

Accurate estimation of remaining useful life (RUL) provides a basis for effective predictive maintenance, reducing unexpected downtime for industrial equipment. However, while models such as the Random Forest have strong predictive capabilities, they are so-called ‘black box’ models whose limited interpretability hinders the critical diagnostic decisions required in industries such as aviation. The purpose of this work is to present a prognostic framework that embeds Explainable Artificial Intelligence (XAI) techniques in order to provide essential transparency into the decision-making mechanisms of machine learning methods based on sensor data, with the objective of procuring actionable insights for the aviation industry. Sensor readings are gathered from critical equipment such as turbofan jet engines and landing gear, and RUL is predicted by a Random Forest model through steps of data gathering, feature engineering, model training, and evaluation. Separate models are trained and evaluated independently on each critical component’s dataset. Although these models deliver suitable predictions with reasonably good performance metrics, such complex models obscure the reasoning behind their predictions and may undermine the confidence of decision-makers or maintenance teams. In the second phase, global explanations using SHAP and local explanations using LIME bridge this reliability gap within industrial contexts. These tools analyze model decisions, highlighting feature importance and explaining how each input variable affects the output. This dual approach offers both a general comprehension of overall model behavior and detailed insight into specific predictions. In its third component, the proposed framework incorporates causal analysis in the form of Granger causality tests in order to move beyond correlation toward causation. 
This allows the framework not only to predict failures but also to present reasons, linking key sensor features to possible failure mechanisms for the relevant personnel. Establishing causality between sensor behaviors and equipment failures creates considerable value for maintenance teams through better root-cause identification and more effective preventive measures, and makes the system more explainable. In a further stage, several simple surrogate models, including Decision Trees and Linear Models, are used to approximate the complex Random Forest model. These simpler models replicate key aspects of the original model's behavior; when the feature explanations obtained from a surrogate are cross-validated against the primary model, the derived insights become more reliable and give an intuitive sense of how the input variables affect the predictions. We then create an iterative explainable feedback loop, in which the knowledge learned from the explainability methods feeds back into model training, driving a cycle of continuous improvement in both accuracy and interpretability over time. By systematically integrating new findings, the model is expected to adapt to changed conditions and further develop its prognostic capability. These components are finally presented to decision-makers through a fully transparent condition monitoring and alert system. The system provides a holistic tool for maintenance operations by leveraging RUL predictions, feature importance scores, persistent sensor threshold values, and autonomous alert mechanisms. Since the system provides explanations for its predictions along with active alerts, maintenance personnel can make informed decisions about the correct interventions to extend the life of critical machinery.
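The surrogate-model stage described in the abstract can be illustrated with a minimal sketch. Everything here is hypothetical (synthetic sensor data, arbitrary feature names, scikit-learn only; the SHAP, LIME, and Granger-causality stages are omitted): a Random Forest is fitted to predict RUL, a shallow Decision Tree is then fitted to the forest's own predictions, and the surrogate's fidelity to the black-box model is checked with R².

```python
# Minimal, hypothetical sketch of the surrogate-model stage:
# a Random Forest predicts RUL from (synthetic) sensor features,
# and a shallow Decision Tree is fitted to the forest's predictions
# as an interpretable surrogate whose fidelity is checked with R^2.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.tree import DecisionTreeRegressor
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))  # four hypothetical sensor channels
# Synthetic RUL: mostly driven by the first sensor, plus noise.
rul = 100 - 20 * X[:, 0] + 5 * X[:, 1] + rng.normal(scale=2, size=500)

forest = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, rul)

# Surrogate: mimic the black-box *predictions*, not the raw labels.
surrogate = DecisionTreeRegressor(max_depth=3, random_state=0)
surrogate.fit(X, forest.predict(X))

fidelity = r2_score(forest.predict(X), surrogate.predict(X))
print(f"surrogate fidelity (R^2 vs. forest): {fidelity:.2f}")
```

A high fidelity score indicates that the interpretable tree tracks the forest closely enough for its feature explanations to be cross-validated against the primary model, as the abstract proposes.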

Keywords: predictive maintenance, explainable artificial intelligence, prognostic, RUL, machine learning, turbofan engines, C-MAPSS dataset

Procedia PDF Downloads 4
7 Achieving Sustainable Lifestyles Based on the Spiritual Teaching and Values of Buddhism from Lumbini, Nepal

Authors: Purna Prasad Acharya, Madhav Karki, Sunta B. Tamang, Uttam Basnet, Chhatra Katwal

Abstract:

The paper outlines the idea behind achieving sustainable lifestyles based on the spiritual values and teachings of Lord Buddha. This objective is to be achieved by spreading the tenets and teachings of Buddhism throughout the Asia-Pacific region and the world from the sacred birthplace of Buddha - Lumbini, Nepal. There is an urgent need to advance the relevance of Buddhist philosophy in tackling the triple planetary crisis of climate change, nature’s decline, and pollution. Today, the world is facing an existential crisis due to these crises, exacerbated by hunger, poverty, and armed conflict. To address their multi-dimensional impacts, global communities have to adopt simple lifestyles that respect nature and universal human values. These were the basic teachings of Gautam Buddha. Lumbini, Nepal has the moral obligation to widely disseminate Buddha’s teaching to the world and to receive constant feedback and learning, developing human and ecosystem resilience by molding the lifestyles of current and future generations through adaptive learning and simplicity across geographies and nationalities, based on spirituality and environmental stewardship. By promoting Buddhism, Nepal has developed a pro-nature tourism industry that focuses on both its spiritual and bio-cultural heritage. Nepal is a country rich in ancient wisdom, where sages have sought knowledge, practiced meditation, and followed spiritual paths for thousands of years. It can spread the teachings of Buddha in a way that helps people search for and adopt ways of living in harmony with nature. Using the tools of the natural and social sciences, the team will package knowledge and share the idea of community well-being within a framework of environmental sustainability, social harmony, and universal respect for nature and people in a holistic manner. 
This notion takes into account key elements of sustainable development such as food-energy-water-biodiversity interconnections, environmental conservation, ecological integrity, ecosystem health, community resilience, adaptive capacity, and indigenous culture, knowledge, and values. This inclusive concept has garnered a strong network of supporters locally, regionally, and internationally. The key objectives behind this concept are: a) to leverage the expertise and passion of a network of global collaborators to advance research, education, and policy outreach in human sustainability based on lifestyle change, using the power of spirituality and Buddha’s teaching, resilient lifestyles, and adaptive living; b) to help develop creative short courses for multi-disciplinary teaching in educational institutions worldwide in collaboration with Lumbini Buddha University and other relevant partners in Nepal; c) to help build local and regional intellectual and cultural teaching and learning capacity by improving professional collaborations that promote nature-based and Buddhist value-based lifestyles, connecting Lumbini to Nepal’s rich nature; d) to promote research avenues that provide policy-relevant knowledge that is creative, innovative, practical, and locally viable; and e) to connect local research and outreach work with academic and cultural partners in South Korea so as to open up Lumbini-based Buddhist heritage and the unique natural landscape of Nepal’s Karnali River basin to Korean scholars and students, promoting sustainable lifestyles and human living in harmony with nature.

Keywords: triple planetary crisis, spirituality, sustainable lifestyles, living in harmony with nature, resilience

Procedia PDF Downloads 32
6 Open Science Philosophy, Research and Innovation

Authors: C. Ardil

Abstract:

Open Science translates the understanding and application of various theories and practices in open science philosophy, systems, paradigms, and epistemology. Open Science originates with the premise that universal scientific knowledge is the product of collective scholarly and social collaboration involving all stakeholders, and that knowledge belongs to global society. Scientific outputs generated by public research are a public good that should be available to all at no cost and without barriers or restrictions. Open Science can increase the quality, impact, and benefits of science and accelerate the advancement of knowledge by making it more reliable, efficient, and accurate, better understood by society, and more responsive to societal challenges; it can also enable growth and innovation through the reuse of scientific results by all stakeholders at all levels of society, ultimately contributing to the growth and competitiveness of global society. Open Science is a global movement to improve the accessibility and reusability of research practices and outputs. In its broadest definition, it encompasses open access to publications, open research data and methods, open source, open educational resources, open evaluation, and citizen science. The implementation of open science provides an excellent opportunity to renegotiate the social roles and responsibilities of publicly funded research and to rethink the science system as a whole. Open Science is the practice of science in such a way that others can collaborate and contribute, where research data, lab notes, and other research processes are freely available, under terms that enable reuse, redistribution, and reproduction of the research and its underlying data and methods. 
Open Science represents a novel systematic approach to the scientific process, shifting from the standard practice of publishing research results in scientific publications toward sharing and using all available knowledge at an earlier stage of the research process, based on cooperative work and the diffusion of scholarly knowledge without barriers or restrictions. Open Science refers to efforts to make the primary outputs of publicly funded research (publications and research data) publicly accessible in digital format without limitations. Open Science is about extending the principles of openness to the whole research cycle, fostering sharing and collaboration as early as possible, and thus entails a systemic change to the way science and research are done. Open Science is the ongoing transition in how research is carried out, disseminated, deployed, and transformed to make scholarly research more open, global, collaborative, creative, and closer to society. Open Science involves various movements aiming to remove the barriers to sharing any kind of output, resource, method, or tool at any stage of the research process. Open Science embraces open access to publications, research data, source software, collaboration, peer review, notebooks, educational resources, monographs, citizen science, and research crowdfunding. The recognition and adoption of open science practices, including policies that increase open access to scientific literature and encourage data and code sharing, is increasing. Revolutionary open science policies are motivated by ethical, moral, or utilitarian arguments, such as the right to access digital research literature, open-source research, the accumulation of science data, research indicators, transparency in academic practice, and reproducibility. Open science philosophy is adopted primarily to demonstrate the benefits of open science practices. 
Researchers use open science applications to their own advantage in order to attract more offers, increase citations, and gain media attention, potential collaborators, career opportunities, donations, and funding. In open science philosophy, open data findings are evidence that open science practices provide significant benefits to researchers in research creation, collaboration, communication, and evaluation compared with more traditional closed-science practices. Open science also raises concerns, such as the rigor of peer review, practical matters such as financing and career development, and the sacrifice of author rights. Therefore, researchers are encouraged to implement open science research within the framework of existing academic evaluation and incentives. As a result, open science research issues are addressed in the areas of publishing, financing, collaboration, resource management and sharing, career development, and the discussion of open science questions and conclusions.

Keywords: Open Science, Open Science Philosophy, Open Science Research, Open Science Data

Procedia PDF Downloads 129
5 Surface Acoustic Wave (SAW)-Induced Mixing Enhances Biomolecules Kinetics in a Novel Phase-Interrogation Surface Plasmon Resonance (SPR) Microfluidic Biosensor

Authors: M. Agostini, A. Sonato, G. Greco, M. Travagliati, G. Ruffato, E. Gazzola, D. Liuni, F. Romanato, M. Cecchini

Abstract:

Since their first demonstration in the early 1980s, surface plasmon resonance (SPR) sensors have been widely recognized as useful tools for detecting chemical and biological species, and the interest of the scientific community in this technology has grown rapidly over the past two decades owing to its high sensitivity, label-free operation, and capability for real-time detection. Recent works have suggested that a turning point in SPR sensor research would be the combination of SPR strategies with other technologies in order to reduce human handling of samples, improve integration, and enhance plasmonic sensitivity. In this light, microfluidics has been attracting growing interest. By properly designing microfluidic biochips, it is possible to miniaturize the analyte-sensitive areas with an overall reduction of the chip dimensions, reduce the volumes of liquid reagents and samples, improve automation, and increase the number of experiments in a single biochip through multiplexing approaches. However, as fluidic channel dimensions approach the micron scale, laminar flows become dominant owing to the low Reynolds numbers that typically characterize microfluidics. In these environments, mixing times are usually dominated by diffusion, which can be prohibitively long and lead to long-lasting biochemistry experiments. An elegant method to overcome these issues is to actively perturb the laminar flow by exploiting surface acoustic waves (SAWs). With this work, we demonstrate a new approach to SPR biosensing based on the combination of microfluidics, SAW-induced mixing, and real-time phase-interrogation grating-coupled SPR technology. On a single lithium niobate (LN) substrate, the nanostructured SPR sensing areas, the interdigital transducer (IDT) for SAW generation, and the polydimethylsiloxane (PDMS) microfluidic chambers were fabricated. 
SAWs impinging on the microfluidic chamber generate acoustic streaming inside the fluid, leading to chaotic advection and thus improved fluid mixing, whilst analyte binding is detected via an SPR method based on surface plasmon polariton (SPP) excitation through a gold metallic grating under azimuthal orientation and phase interrogation. Our device has been fully characterized in order to separate, for the first time, the unwanted SAW heating effect from the fluid stirring inside the microchamber, both of which affect the molecule binding dynamics. An avidin/biotin assay and thiol-polyethylene glycol (bPEG-SH) were exploited as the model biological interaction and the non-fouling layer, respectively. With SAW-enhanced mixing, biosensing kinetics times improved by ≈ 82% for bPEG-SH adsorption onto gold and ≈ 24% for avidin/biotin binding (≈ 50% and ≈ 18%, respectively, compared with the heating-only condition). These results demonstrate that our biochip can significantly reduce the duration of bioreactions that usually require long times (e.g., PEG-based sensing layers, low-concentration analyte detection). The sensing architecture proposed here represents a promising new technology that satisfies the major biosensing requirements of scalability and high-throughput capability. The size of the detection system and the biochip dimensions could be further reduced and integrated; in addition, the possibility of reducing the duration of biological experiments via SAW-driven active mixing could easily be combined with multiplexing platforms for parallel real-time sensing. In general, the technology reported in this study can be straightforwardly adapted to a great number of biological systems and sensing geometries.
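The low-Reynolds-number argument above can be made concrete with a back-of-the-envelope estimate. The values below are assumed, representative microchannel parameters (water, a ~100 µm channel, mm/s flow), not figures taken from this abstract:

```python
# Rough estimate of the channel Reynolds number Re = rho * v * D_h / mu
# for a water-filled microchannel (all values assumed, not from the paper),
# illustrating why flow stays laminar and mixing is diffusion-limited
# without active stirring such as SAW-induced acoustic streaming.
rho = 1000.0   # water density, kg/m^3
mu = 1.0e-3    # dynamic viscosity of water, Pa*s
v = 1.0e-3     # assumed flow speed, m/s (1 mm/s)
d_h = 100e-6   # assumed hydraulic diameter, m (100 um channel)

re = rho * v * d_h / mu
print(f"Re ~ {re:.2f}")  # orders of magnitude below ~2000, the laminar limit
```

With Re ≪ 1, inertial effects are negligible and transverse transport is governed by molecular diffusion, which is why an active perturbation such as SAW streaming is needed to accelerate mixing.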

Keywords: biosensor, microfluidics, surface acoustic wave, surface plasmon resonance

Procedia PDF Downloads 276