Search results for: monitoring tool
96 Genome-Scale Analysis of Streptomyces Caatingaensis CMAA 1322 Metabolism, a New Abiotic Stress-Tolerant Actinomycete
Authors: Suikinai Nobre Santos, Ranko Gacesa, Paul F. Long, Itamar Soares de Melo
Abstract:
Extremophilic microorganisms are adapted to biotopes that combine several stress factors (temperature, pressure, radiation, salinity and pH), which makes them a valuable resource for the exploitation of novel biotechnological processes and unique models for investigating their biomolecules (1, 2). This encouraged us to investigate, through bioprospecting, the compounds synthesized by a novel thermotolerant actinomycete, designated Streptomyces caatingaensis CMAA 1322, isolated from a soil sample of a tropical dry forest (Caatinga) in the Brazilian semiarid region (3-17°S and 35-45°W). This set of contrasting physical and climatic factors provides unique conditions and a diversity of well-adapted species, making it an interesting site for biotechnological purposes. Preliminary studies have shown great potential for the production of cytotoxic, pesticidal and antimicrobial molecules (3). Thus, to extend knowledge of the gene clusters responsible for the biosynthetic pathways of natural products in strain CMAA1322, whole-genome shotgun (WGS) DNA sequencing was performed using long-read sequencing with the PacBio RS (Pacific Biosciences). Genomic DNA was extracted from a pure culture grown overnight on LB medium using the PureLink genomic DNA kit (Life Technologies). An approximately 3- to 20-kb-insert PacBio library was constructed and sequenced on 8 single-molecule real-time (SMRT) cells, yielding 116,269 reads (average length, 7,446 bp), which were assembled into 18 contigs, with 142.11x coverage and an N50 value of 20,548 bp (BioProject number PRJNA288757). The assembled data were analyzed with Rapid Annotations using Subsystems Technology (RAST) (4); the genome size was found to be 7,055,077 bp, comprising 6,167 open reading frames (ORFs) and 413 subsystems. The G+C content was estimated to be 72 mol%. The closest-neighbors tool, available in RAST through functional comparison of the genome, revealed that strain CMAA1322 is most closely related to Streptomyces hygroscopicus ATCC 53653 (similarity score, 537), S. violaceusniger Tu 4113 (score, 483), S. avermitilis MA-4680 (score, 475), and S. albus J1074 (score, 447). The Streptomyces sp. CMAA1322 genome contains 98 tRNA genes and 135 gene copies related to stress response, mainly osmotic stress (14), heat shock (16) and oxidative stress (49). Functional annotation by antiSMASH version 3.0 (5) identified 41 clusters for secondary metabolites (including two clusters for lanthipeptides, ten clusters for nonribosomal peptide synthetases [NRPS], three clusters for siderophores, fourteen for polyketide synthases [PKS], six clusters encoding terpenes, two clusters encoding bacteriocins, and one cluster encoding a phenazine). Our work provides a comparative analysis of the genome and of the extracts produced (data not published) by lineage CMAA1322, revealing the potential of microorganisms accessed from extreme environments such as the Caatinga to produce a wide range of biotechnologically relevant compounds.
Keywords: caatinga, streptomyces, environmental stresses, biosynthetic pathways
Procedia PDF Downloads 243
95 Pluripotent Stem Cells as Therapeutic Tools for Limbal Stem Cell Deficiencies and Drug Testing
Authors: Aberdam Edith, Sangari Linda, Petit Isabelle, Aberdam Daniel
Abstract:
Background and Rationale: A transparent, avascularised cornea is essential for normal vision and depends on limbal stem cells (LSC) that reside between the cornea and the conjunctiva. Ocular burns or injuries may destroy the limbus, causing limbal stem cell deficiency (LSCD). The cornea becomes vascularised by invading conjunctival cells and the stroma scars, resulting in corneal opacity and loss of vision. Grafted autologous limbus or cultivated autologous LSC can restore vision, unless both eyes are affected. Alternative cellular sources have been tested in the last decades, including oral mucosa or hair follicle epithelial cells. However, only partial success has been achieved with these cells, since they were not able to uniformly commit into corneal epithelial cells. Human induced pluripotent stem cells (iPSCs) display both unlimited growth capacity and the ability to differentiate into any cell type. Our goal was to design a standardized and reproducible protocol to produce transplantable autologous LSC from patients through cell reprogramming technology. Methodology: First, a keratinocyte primary culture was established from a small number of plucked hair follicles of healthy donors. The resulting epithelial cells were reprogrammed into induced pluripotent stem cells (iPSCs) and further differentiated into corneal epithelial cells (CEC), according to a robust protocol that recapitulates the main steps of corneal embryonic development. qRT-PCR analysis and immunofluorescent staining during the course of differentiation confirmed the expression of stage-specific markers of the corneal embryonic lineage. Ectodermal progenitor-specific cytokeratins K8/K18 appear first, followed at day 7 by limbal-specific PAX6, TP63 and cytokeratins K5/K14. At day 15, K3/K12+ corneal cells are present. To amplify the iPSC-derived LSC (named COiPSC), intact small epithelial colonies were detached and cultivated in limbal cell-specific medium. Under these culture conditions, the COiPSC can be frozen and thawed at any passage while retaining their corneal characteristics for at least eight passages. To evaluate the potential of COiPSC as an alternative ocular toxicity model, COiPSC were treated at passages P0 to P4 with increasing amounts of SDS and benzalkonium. Cell proliferation and apoptosis of treated cells were compared to LSC and to the SV40-immortalized human corneal epithelial cell line (HCE) routinely used by the cosmetics industry. Of note, HCE are more resistant to toxicity than LSC. At P0, COiPSC were systematically more resistant to chemical toxicity than LSC and even than HCE. Remarkably, this behavior changed with passage, since COiPSC at P2 became identical to LSC and thus closer to physiology than HCE. Comparative transcriptome analysis confirmed that COiPSC from P2 are similar to a mixture of LSC and CEC. Finally, by an organotypic reconstitution assay, we demonstrated the ability of COiPSC to produce a 3D corneal epithelium on a stromal equivalent made of keratocytes. Conclusion: COiPSC could become valuable for two main applications: (1) an alternative robust tool to perform, in a reproducible and physiological manner, toxicity assays for cosmetic products and pharmacological tests of drugs; (2) an alternative autologous source for cornea transplantation in LSCD.
Keywords: Limbal stem cell deficiency, iPSC, cornea, limbal stem cells
Procedia PDF Downloads 414
94 Decrease in Olfactory Cortex Volume and Alterations in Caspase Expression in the Olfactory Bulb in the Pathogenesis of Alzheimer’s Disease
Authors: Majed Al Otaibi, Melissa Lessard-Beaudoin, Amel Loudghi, Raphael Chouinard-Watkins, Melanie Plourde, Frederic Calon, C. Alexandre Castellano, Stephen Cunnane, Helene Payette, Pierrette Gaudreau, Denis Gris, Rona K. Graham
Abstract:
Introduction: Alzheimer's disease (AD) is a chronic disorder that affects millions of individuals worldwide. Symptoms include memory dysfunction as well as alterations in attention, planning, language and overall cognitive function. Olfactory dysfunction is a common symptom of several neurological disorders, including AD. Studying the mechanisms underlying olfactory dysfunction may therefore lead to the discovery of potential biomarkers and/or treatments for neurodegenerative diseases. Objectives: To determine whether olfactory dysfunction predicts future cognitive impairment in the aging population and to characterize the olfactory system in a murine model expressing a genetic risk factor of AD. Method: For the human study, quantitative olfactory tests (UPSIT and OMT) were administered to 93 subjects (aged 80 to 94 years) from the Quebec Longitudinal Study on Nutrition and Successful Aging (NuAge) cohort who agreed to participate in the ORCA secondary study. The telephone Modified Mini Mental State examination (t-MMSE) was used to assess cognition, and an olfactory self-report was also collected. In a separate cohort, olfactory cortical volume was calculated using MRI results from healthy older adults (n=25) and patients with AD (n=18) using the AAL single-subject atlas and performed with the PNEURO tool (PMOD 3.7). For the murine study, we are using Western blotting, RT-PCR and immunohistochemistry. Results: Human study: Based on the self-report, 81% of the participants claimed not to suffer from any problem with olfaction. However, based on the UPSIT, 94% of those subjects showed poor olfactory performance and different forms of microsmia. Moreover, the results confirm that olfactory function declines with age. We also detected a significant decrease in olfactory cortical volume in AD individuals compared to controls. Murine study: Preliminary data demonstrate a significant decrease in expression levels of the proform of caspase-3 and the caspase substrate STK3 in the olfactory bulb of mice expressing human APOE4 compared with controls. In addition, there is a significant decrease in the expression level of the caspase-9 proform and the caspase-8 active fragment. Analysis of the mature neuron marker NeuN shows decreased expression levels of both isoforms. The data also suggest that Iba-1 immunostaining is increased in the olfactory bulb of APOE4 mice compared to wild-type mice. Conclusions: The activation of caspase-3 may be the cause of the decreased levels of STK3 through caspase cleavage and may play a role in the inflammation observed. In the clinical study, our results suggest that seniors are unaware of their olfactory function status and that it is therefore not sufficient to measure olfaction using self-report in the elderly. Studying olfactory function and cognitive performance in the aging population will help to discover biomarkers in the early stages of AD.
Keywords: Alzheimer's disease, APOE4, cognition, caspase, brain atrophy, neurodegenerative, olfactory dysfunction
Procedia PDF Downloads 258
93 Predicting and Obtaining New Solvates of Curcumin, Demethoxycurcumin and Bisdemethoxycurcumin Based on the CCDC Statistical Tools and Hansen Solubility Parameters
Authors: J. Ticona Chambi, E. A. De Almeida, C. A. Andrade Raymundo Gaiotto, A. M. Do Espírito Santo, L. Infantes, S. L. Cuffini
Abstract:
The solubility of active pharmaceutical ingredients (APIs) is challenging for the pharmaceutical industry. New multicomponent crystalline forms, such as cocrystals and solvates, present an opportunity to improve the solubility of APIs. Commonly, the procedure to obtain multicomponent crystalline forms of a drug starts by screening the drug molecule with different coformers/solvents. However, it is necessary to develop methods to obtain multicomponent forms in an efficient way and with the least possible environmental impact. The Hansen Solubility Parameters (HSPs) are considered a tool to obtain theoretical knowledge of the solubility of the target compound in a chosen solvent. H-Bond Propensity (HBP), Molecular Complementarity (MC) and Coordination Values (CV) are tools for the statistical prediction of cocrystals developed by the Cambridge Crystallographic Data Centre (CCDC). The HSPs and the CCDC tools are based on inter- and intra-molecular interactions. Curcumin (Cur), the target molecule, is commonly used as an anti-inflammatory agent. Demethoxycurcumin (Demcur) and bisdemethoxycurcumin (Biscur) are natural analogues of Cur from turmeric. These target molecules differ in their solubilities. The work therefore aimed to analyze and compare different tools for predicting multicomponent forms (solvates) of Cur, Demcur and Biscur. The HSP values were calculated for Cur, Demcur and Biscur using chemical group contribution methods and statistical optimization from experimental data. The HSPmol software was used. From the HSPs of the target molecules and fifty solvents (listed in the HSP books), the relative energy difference (RED) was determined. The probability that the target molecules would interact with the solvent molecules was determined using the CCDC tools. A dataset of fifty different organic solvents was ranked for each prediction method and by a consensus ranking of different combinations of HSP, CV, HBP and MC values. Based on the prediction, 15 solvents were selected, including dimethyl sulfoxide (DMSO), tetrahydrofuran (THF), acetonitrile (ACN), 1,4-dioxane (DOX) and others. In an initial analysis, the slow evaporation technique (from 50°C to room temperature and to 4°C) was used to obtain solvates. Single crystals were collected using a Bruker D8 Venture diffractometer with a Photon100 detector. Data processing and crystal structure determination were performed using the APEX3 and Olex2-1.5 software. According to the results, the HSPs (theoretical and optimized) and the Hansen solubility spheres for Cur, Demcur and Biscur were obtained. With respect to the prediction analyses, the prediction methods were evaluated through the ranking and consensus-ranking positions of solvates already reported in the literature. It was observed that the combination HSP-CV gave the best results when compared to the other methods. Furthermore, from the selected solvents, six new solvates (Cur-DOX, Cur-DMSO, Biscur-DOX, Biscur-THF, Demcur-DOX and Demcur-ACN) and a new Biscur hydrate were obtained. Crystal structures were determined for Cur-DOX, Biscur-DOX, Demcur-DOX and Biscur-water. Moreover, unit-cell parameter information for Cur-DMSO, Biscur-THF and Demcur-ACN was obtained. These preliminary results show that the prediction approach is a promising strategy to evaluate the possibility of forming multicomponent crystal forms. Work is currently ongoing to obtain further multicomponent single crystals.
Keywords: curcumin, HSPs, prediction, solvates, solubility
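To make the screening step above concrete, the following minimal Python sketch computes the Hansen distance Ra and the relative energy difference (RED = Ra/R0) used to rank candidate solvents; the HSP values and interaction radius below are placeholder figures for illustration, not the parameters determined in this study.

```python
import math

def red(solute, solvent, r0):
    """Relative energy difference between a solute and a solvent.

    solute, solvent: (dD, dP, dH) Hansen parameters in MPa^0.5
    r0: interaction radius of the solute's solubility sphere
    """
    dD1, dP1, dH1 = solute
    dD2, dP2, dH2 = solvent
    ra = math.sqrt(4 * (dD1 - dD2) ** 2 + (dP1 - dP2) ** 2 + (dH1 - dH2) ** 2)
    return ra / r0

# Placeholder parameters (illustrative only, not the fitted values of this study).
curcumin = (18.5, 9.0, 11.0)
r0_curcumin = 8.0
solvents = {
    "DMSO": (18.4, 16.4, 10.2),
    "THF": (16.8, 5.7, 8.0),
    "ACN": (15.3, 18.0, 6.1),
    "1,4-dioxane": (17.5, 1.8, 9.0),
}

# RED < 1 suggests the solvent lies inside the solubility sphere (good candidate).
for name, hsp in sorted(solvents.items(), key=lambda kv: red(curcumin, kv[1], r0_curcumin)):
    print(f"{name:12s} RED = {red(curcumin, hsp, r0_curcumin):.2f}")
```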
Procedia PDF Downloads 63
92 Introducing, Testing, and Evaluating a Unified JavaScript Framework for Professional Online Studies
Authors: Caspar Goeke, Holger Finger, Dorena Diekamp, Peter König
Abstract:
Online-based research has recently gained increasing attention from various fields of research in the cognitive sciences. Technological advances in the form of online crowdsourcing (Amazon Mechanical Turk), open data repositories (Open Science Framework), and online analysis (IPython notebook) offer rich possibilities to improve, validate, and speed up research. However, until today there has been no cross-platform integration of these subsystems. Furthermore, online studies still suffer from complex implementation requirements (server infrastructure, database programming, security considerations, etc.). Here we propose and test a new JavaScript framework that enables researchers to conduct any kind of behavioral research in the browser without the need to program a single line of code. In particular, our framework offers the possibility to manipulate and combine the experimental stimuli via a graphical editor, directly in the browser. Moreover, we included an action-event system that can be used to handle user interactions, interactively change stimulus properties, or store participants’ responses. Besides traditional recordings such as reaction time and mouse and keyboard presses, the tool offers webcam-based eye and face tracking. On top of these features, our framework also takes care of participant recruitment via crowdsourcing platforms such as Amazon Mechanical Turk. Furthermore, the built-in Google Translate functionality ensures automatic text translation of the experimental content. Thereby, thousands of participants from different cultures and nationalities can be recruited literally within hours. Finally, the recorded data can be visualized and cleaned online and then exported into the desired formats (csv, xls, sav, mat) for statistical analysis. Alternatively, the data can also be analyzed online within our framework using the integrated IPython notebook. The framework was designed such that studies can be exchanged between researchers. This supports not only the idea of open data repositories but also the possibility to share and reuse experimental designs and analyses, so that the validity of the paradigms will be improved. In particular, sharing and integrating experimental designs and analyses will lead to an increased consistency of experimental paradigms. To demonstrate the functionality of the framework, we present the results of a pilot study in the field of spatial navigation that was conducted using the framework. Specifically, we recruited over 2000 subjects with various cultural backgrounds and analyzed performance differences depending on the factors culture, gender and age. Overall, our results demonstrate a strong influence of cultural factors on spatial cognition. Such an influence has not been reported before and would not have been possible to show without the massive amount of data collected via our framework. In fact, these findings shed new light on cultural differences in spatial navigation. As a consequence, we conclude that our new framework offers a wide range of advantages for online research and constitutes a methodological innovation by which new insights can be revealed on the basis of massive data collection.
Keywords: cultural differences, crowdsourcing, JavaScript framework, methodological innovation, online data collection, online study, spatial cognition
Procedia PDF Downloads 257
91 Diffusion MRI: Clinical Application in Radiotherapy Planning of Intracranial Pathology
Authors: Pomozova Kseniia, Gorlachev Gennadiy, Chernyaev Aleksandr, Golanov Andrey
Abstract:
In clinical practice, and especially in stereotactic radiosurgery planning, the significance of diffusion-weighted imaging (DWI) is growing. This makes necessary the existence of software capable of quickly processing and reliably visualizing diffusion data, as well as equipped with tools for their analysis for different tasks. We are developing the «MRDiffusionImaging» software in standard C++. The domain-specific part has been moved into separate class libraries and can be used on various platforms. The user interface uses Windows WPF (Windows Presentation Foundation), which is a technology for managing Windows applications with access to all components of the .NET 5 or .NET Framework platform ecosystem. One of its important features is the use of a declarative markup language, XAML (eXtensible Application Markup Language), with which one can conveniently create, initialize and set properties of objects with hierarchical relationships. Graphics are generated using the DirectX environment. The MRDiffusionImaging software package has been implemented for processing diffusion magnetic resonance imaging (dMRI), which allows loading and viewing images sorted by series. An algorithm for "masking" dMRI series based on T2-weighted images was developed using a deformable surface model to exclude tissues that are not related to the area of interest from the analysis. An algorithm for distortion correction using deformable image registration based on autocorrelation of local structure has been developed. The maximum voxel dimension was 1.03 ± 0.12 mm. In an elementary volume of the brain, the diffusion tensor is geometrically interpreted using an ellipsoid, which is an isosurface of the probability density of a molecule's diffusion. For the first time, non-parametric intensity distributions, neighborhood correlations, and inhomogeneities are combined in a single segmentation algorithm for white matter (WM), grey matter (GM), and cerebrospinal fluid (CSF). A tool for calculating the mean diffusivity and fractional anisotropy has been created, on the basis of which quantitative maps can be built for solving various clinical problems. Functionality has been created that allows clustering and segmenting images to individualize the clinical volume of radiation treatment and further assess the response (median Dice score = 0.963 ± 0.137). White matter tracts of the brain were visualized using two algorithms: a deterministic one (fiber assignment by continuous tracking) and a probabilistic one using the Hough transform. The proposed algorithms test candidate curves in each voxel, assigning to each one a score computed from the diffusion data, and then select the curves with the highest scores as the potential anatomical connections. In the context of functional radiosurgery, it is possible to reduce the irradiation volume of the internal capsule receiving 12 Gy from 0.402 cc to 0.254 cc. The «MRDiffusionImaging» software will improve the efficiency and accuracy of diagnostics and stereotactic radiotherapy of intracranial pathology. We are developing software with integrated, intuitive support for processing, analysis, and inclusion in the process of radiotherapy planning and the evaluation of its results.
Keywords: diffusion-weighted imaging, medical imaging, stereotactic radiosurgery, tractography
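As a minimal illustration of the quantitative maps mentioned above (not the MRDiffusionImaging implementation itself), the following Python sketch computes the mean diffusivity (MD) and fractional anisotropy (FA) from the eigenvalues of a single-voxel diffusion tensor; the example tensor values are illustrative only.

```python
import numpy as np

def md_fa(tensor):
    """Mean diffusivity and fractional anisotropy of one 3x3 diffusion tensor."""
    # Eigenvalues of the (symmetric) diffusion tensor
    lam = np.linalg.eigvalsh(tensor)
    md = lam.mean()
    # Standard FA definition; guard against a zero tensor
    denom = np.sqrt((lam ** 2).sum())
    if denom == 0:
        return md, 0.0
    fa = np.sqrt(1.5 * ((lam - md) ** 2).sum()) / denom
    return md, fa

# Example voxel tensor (units of mm^2/s), values for illustration only
D = np.array([[1.7e-3, 0.0,    0.0],
              [0.0,    0.3e-3, 0.0],
              [0.0,    0.0,    0.3e-3]])
md, fa = md_fa(D)
print(f"MD = {md:.2e} mm^2/s, FA = {fa:.2f}")
```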
Procedia PDF Downloads 85
90 OpenFOAM Based Simulation of High Reynolds Number Separated Flows Using Bridging Method of Turbulence
Authors: Sagar Saroha, Sawan S. Sinha, Sunil Lakshmipathy
Abstract:
The Reynolds-averaged Navier-Stokes (RANS) model is a popular computational tool for the prediction of turbulent flows. Being computationally less expensive than direct numerical simulation (DNS), RANS has received wide acceptance in industry as well as in the research community. However, for high Reynolds number flows, the traditional RANS approach based on the Boussinesq hypothesis cannot capture all the essential flow characteristics, and thus its performance is restricted in high Reynolds number flows of practical interest. RANS performance turns out to be inadequate in regimes like flow over curved surfaces, flows with rapid changes in the mean strain rate, duct flows involving secondary streamlines, and three-dimensional separated flows. In the recent decade, the partially averaged Navier-Stokes (PANS) methodology has gained acceptance among seamless bridging methods of turbulence, placed between DNS and RANS. The PANS methodology, being a scale-resolving bridging method, is inherently more suitable than RANS for simulating turbulent flows. The superior ability of the PANS method has been demonstrated for some cases, such as swirling flows, high-speed mixing environments, and high Reynolds number turbulent flows. In our work, we intend to evaluate PANS in the case of separated turbulent flows past bluff bodies, which is of broad aerodynamic research and industrial interest. The PANS equations, being derived from the base RANS equations, inherit the inadequacies of the parent RANS model based on the linear eddy-viscosity model (LEVM) closure. To enhance the capabilities of PANS for simulating separated flows, the shortcomings of the LEVM closure need to be addressed. The limitations of LEVMs have inspired the development of non-linear eddy-viscosity models (NLEVM). To explore the potential improvement in PANS performance, in our study we evaluate the PANS behavior in conjunction with an NLEVM. Our work can be categorized into three significant steps: (i) extraction of the PANS version of the NLEVM from the RANS model, (ii) testing the model in a homogeneous turbulence environment, and (iii) application and evaluation of the model in the canonical case of separated non-homogeneous flow fields (flow past prismatic bodies and bodies of revolution at high Reynolds number). The PANS version of the NLEVM will be derived and implemented in OpenFOAM, an open-source solver. The homogeneous flow evaluation will comprise a study of the influence of the PANS filter-width control parameter on the turbulent stresses, homogeneous analysis performed over typical velocity fields, and an asymptotic analysis of the Reynolds stress tensor. The non-homogeneous flow case will include a study of mean integrated quantities and various instantaneous flow field features, including wake structures. The performance of PANS + NLEVM will be compared against LEVM-based PANS and LEVM-based RANS. This assessment will contribute to a significant improvement in the predictive ability of computational fluid dynamics (CFD) tools for massively separated turbulent flows past bluff bodies.
Keywords: bridging methods of turbulence, high Re-CFD, non-linear PANS, separated turbulent flows
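For readers unfamiliar with the bridging idea, the sketch below shows how the PANS filter-width control parameters enter a k-epsilon type closure in the commonly cited formulation (the unresolved-to-total ratios fk and f-epsilon rescale the dissipation coefficient); the constants are the usual textbook values and are given only for illustration, not the NLEVM coefficients derived in this work.

```python
# Minimal sketch of PANS coefficient rescaling for a k-epsilon based closure.
# fk = ku/k (unresolved-to-total kinetic energy), fe = eps_u/eps.
# Standard model constants, given here only for illustration.
C_MU, C_E1, C_E2 = 0.09, 1.44, 1.92

def pans_ce2_star(fk, fe=1.0):
    """Modified destruction coefficient in the unresolved dissipation equation."""
    return C_E1 + (fk / fe) * (C_E2 - C_E1)

def unresolved_eddy_viscosity(ku, eps_u):
    """Eddy viscosity computed from the unresolved (PANS) variables."""
    return C_MU * ku ** 2 / eps_u

# fk = 1 recovers RANS; lowering fk resolves more of the turbulent spectrum.
for fk in (1.0, 0.7, 0.4):
    print(f"fk = {fk:.1f} -> C_eps2* = {pans_ce2_star(fk):.3f}")
```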
Procedia PDF Downloads 145
89 Embedded Test Framework: A Solution Accelerator for Embedded Hardware Testing
Authors: Arjun Kumar Rath, Titus Dhanasingh
Abstract:
Embedded product development requires software to test hardware functionality during development and to find issues during manufacturing in larger quantities. As the components get integrated, the devices are tested for their full functionality using advanced software tools. Benchmarking tools are used to measure and compare the performance of product features. At present, these tests are based on a variety of methods involving varying hardware and software platforms. Typically, these tests are custom built for every product and remain unusable for other variants. A majority of the tests go undocumented, are not updated, and become unusable when the product is released. To bridge this gap, a solution accelerator in the form of a framework can address these issues by running all these tests from one place, using an off-the-shelf test library in a continuous integration environment. There are many open-source test frameworks and tools (Fuego, LAVA, Autotest, KernelCI, etc.) designed for testing embedded system devices, each with several unique and useful features, but no single tool or framework can satisfy all of the testing needs of embedded systems; hence the need for an extensible framework integrating a multitude of tools. Embedded product testing includes board bring-up testing, testing during manufacturing, firmware testing, application testing, and assembly testing. Traditional test methods include developing test libraries and support components for every new hardware platform that belongs to the same domain with identical hardware architecture. This approach has drawbacks such as non-reusability, where platform-specific libraries cannot be reused, the need to maintain source infrastructure for individual hardware platforms, and, most importantly, the time taken to re-develop test cases for new hardware platforms. These limitations create challenges in environment set-up for testing, scalability, and maintenance. A desirable strategy is certainly one that is focused on maximizing reusability, continuous integration, and leveraging artifacts across the complete development cycle, during all phases of testing and across a family of products. To overcome the stated challenges of the conventional method and to offer the benefits of embedded testing, an embedded test framework (ETF), a solution accelerator, is designed, which can be deployed in embedded-system-related products with minimal customization and maintenance to accelerate hardware testing. The embedded test framework supports testing of different hardware, including microprocessors and microcontrollers. It offers benefits such as (1) time-to-market: it accelerates board bring-up time with prepacked test suites supporting all necessary peripherals, which can speed up the design and development stages (board bring-up, manufacturing, and device drivers); (2) reusability: framework components isolated from the platform-specific hardware initialization and configuration make the adaptation of test cases across various platforms quick and simple; (3) an effective build and test infrastructure with multiple test interface options, pre-integrated with the Fuego framework; (4) continuous integration: pre-integration with Jenkins enables continuous testing and an automated software update feature.
Applying the embedded test framework accelerator throughout the design and development phase enables the development of well-tested systems before functional verification and improves time to market to a large extent.
Keywords: board diagnostics software, embedded system, hardware testing, test frameworks
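To illustrate the reusability principle described above (platform-specific bring-up isolated from generic test logic), here is a minimal hypothetical Python sketch; the class and method names are invented for illustration and are not part of the ETF itself.

```python
from abc import ABC, abstractmethod

class BoardConfig(ABC):
    """Platform-specific bring-up kept separate from the test logic."""
    @abstractmethod
    def init_uart(self): ...
    @abstractmethod
    def read_temperature(self) -> float: ...

class BoardA(BoardConfig):
    def init_uart(self):
        print("BoardA: UART0 @ 115200")
    def read_temperature(self) -> float:
        return 42.0  # stub sensor reading for the sketch

def temperature_sensor_test(board: BoardConfig, low=0.0, high=85.0) -> bool:
    """Generic test case: runs unchanged on any board implementing BoardConfig."""
    board.init_uart()
    value = board.read_temperature()
    return low <= value <= high

if __name__ == "__main__":
    assert temperature_sensor_test(BoardA())
    print("temperature test passed")
```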
Procedia PDF Downloads 145
88 Exploring the Cultural Values of Nursing Personnel Utilizing Hofstede's Cultural Dimensions
Authors: Ma Chu Jui
Abstract:
Culture plays a pivotal role in shaping societal responses to change and fostering adaptability. In the realm of healthcare provision, hospitals serve as dynamic settings molded by the cultural consciousness of healthcare professionals. This intricate interplay extends to their expectations of leadership, communication styles, and attitudes towards patient care. Recognizing the cultural inclinations of healthcare professionals becomes imperative in navigating this complex landscape. This study will utilize Hofstede's Value Survey Module 2013 (VSM 2013) as a comprehensive analytical tool. The targeted participants for this research are in-service nursing professionals with a tenure of at least three months, specifically employed in the nursing department of an Eastern hospital. This quantitative approach seeks to quantify diverse cultural tendencies among the targeted nursing professionals, elucidating not only abstract cultural concepts but also revealing their cultural inclinations across different dimensions. The study anticipates gathering between 400 and 500 responses, ensuring a robust dataset for a comprehensive analysis. The focus on nursing professionals within the Eastern hospital setting enhances the relevance and specificity of the cultural insights obtained. The research aims to contribute valuable knowledge to the understanding of cultural tendencies among in-service nursing personnel in the nursing department of this specific Eastern hospital. The VSM 2013 will initially be distributed to this specific group to collect responses, with the aim of calculating scores on each of Hofstede's six cultural dimensions—Power Distance Index (PDI), Individualism vs. Collectivism (IDV), Uncertainty Avoidance Index (UAI), Masculinity vs. Femininity (MAS), Long-Term Orientation vs. Short-Term Normative Orientation (LTO), and Indulgence vs. Restraint (IVR). The study unveils a significant correlation between different cultural dimensions and healthcare professionals' tendencies: understanding leadership expectations through PDI, grasping behavioral patterns via IDV, acknowledging risk acceptance through UAI, and understanding their long-term and short-term behaviors through LTO. These tendencies extend to communication styles and attitudes towards patient care. These findings provide valuable insights into the nuanced interconnections between cultural factors and healthcare practices. Through a detailed analysis of the varying levels of these cultural dimensions, we gain a comprehensive understanding of the predominant inclinations among the majority of healthcare professionals. This nuanced perspective adds depth to our comprehension of how cultural values shape their approach to leadership, communication, and patient care, contributing to a more holistic understanding of the healthcare landscape. A profound comprehension of the cultural paradigms embraced by healthcare professionals holds transformative potential. Beyond mere understanding, it acts as a catalyst for elevating the caliber of healthcare services. This heightened awareness fosters cohesive collaboration among healthcare teams, paving the way for the establishment of a unified healthcare ethos. By cultivating shared values, our study envisions a healthcare environment characterized by enhanced quality, improved teamwork, and ultimately, a more favorable and patient-centric healthcare landscape.
In essence, our research underscores the critical role of cultural awareness in shaping the future of healthcare delivery.
Keywords: hofstede's cultural, cultural dimensions, cultural values in healthcare, cultural awareness in nursing
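For orientation, the VSM 2013 manual defines each dimension index as a weighted difference of mean item scores plus a constant; the Python sketch below shows this general form only, with placeholder item pairings, a placeholder constant, and toy responses (the actual item numbers and constants are those published in the VSM 2013 manual, not the ones used here).

```python
from statistics import mean

def dimension_score(responses, item_a, item_b, item_c, item_d, c=0):
    """General form of a VSM-style index: weighted differences of mean item scores.

    The item indices and the constant c are placeholders; the real pairings are
    defined in the VSM 2013 manual.
    """
    m = lambda i: mean(r[i] for r in responses)
    return 35 * (m(item_a) - m(item_b)) + 25 * (m(item_c) - m(item_d)) + c

# Toy data: each respondent answers survey items on a 1-5 scale (illustrative only).
respondents = [
    {1: 2, 2: 3, 7: 4, 20: 2, 23: 3},
    {1: 3, 2: 2, 7: 3, 20: 3, 23: 4},
]
pdi_like = dimension_score(respondents, item_a=7, item_b=2, item_c=20, item_d=23, c=0)
print(f"Illustrative dimension score = {pdi_like:.1f}")
```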
Procedia PDF Downloads 65
87 Best Practices and Recommendations for CFD Simulation of Hydraulic Spool Valves
Authors: Jérémy Philippe, Lucien Baldas, Batoul Attar, Jean-Charles Mare
Abstract:
The proposed communication deals with the research and development of a rotary direct-drive servo valve for aerospace applications. A key challenge of the project is to downsize the electromagnetic torque motor by reducing the torque required to drive the rotary spool. It is intended to optimize the spool and the sleeve geometries by combining a Computational Fluid Dynamics (CFD) approach with commercial optimization software. The present communication addresses an important phase of the project, which consists firstly of gaining confidence in the simulation results. It is well known that the force needed to pilot a sliding spool valve comes from several physical effects: hydraulic forces, friction, and the inertia/mass of the moving assembly. Among them, the flow force is usually a major contributor to the steady-state (or Root Mean Square) driving torque. In recent decades, CFD has gradually become a standard simulation tool for studying fluid-structure interactions. However, in the particular case of high-pressure valve design, the authors have experienced that the calculated overall hydraulic force depends on the parameterization and options used to build and run the CFD model. To solve this issue, the authors have selected the standard case of the linear spool valve, which is addressed in detail in numerous scientific references (analytical models, experiments, CFD simulations). The first CFD simulations run by the authors have shown that the evolution of the equivalent discharge coefficient vs. Reynolds number at the metering orifice corresponds well to the values that can be predicted by the classical analytical models. Conversely, the simulated flow force was found to be quite different from the value calculated analytically. This drove the authors to investigate in detail the influence of the studied domain and the settings of the CFD simulation. It was first shown that the flow recirculates in the inlet and outlet channels if their length is not sufficient relative to their hydraulic diameter. The dead volume on the uncontrolled orifice side also plays a significant role. These examples highlight the influence of the geometry of the fluid domain considered. The second action was to investigate the influence of the type of mesh, the turbulence models and near-wall approaches, and the numerical solver and discretization scheme order. Two approaches were used to determine the overall hydraulic force acting on the moving spool. First, the force was deduced from the momentum balance on a control domain delimited by the valve inlet and outlet and the spool walls. Second, the overall hydraulic force was calculated from the integral of pressure and shear forces acting at the boundaries of the fluid domain. This underlined the significant contribution of the viscous forces acting on the spool between the inlet and outlet orifices, which are generally not considered in the literature. This also emphasized the influence of the choices made for the implementation of the CFD calculation and the analysis of its results. With the step-by-step process adopted to increase confidence in the CFD simulations, the authors propose a set of best practices and recommendations for the efficient use of CFD to design high-pressure spool valves.
Keywords: computational fluid dynamics, hydraulic forces, servovalve, rotary servovalve
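For reference, the classical analytical estimate mentioned above for the steady-state axial flow force on a sharp-edged metering orifice is F = 2·Cd·Cv·A·Δp·cos(θ), with a jet angle θ of about 69°. The short Python sketch below evaluates it for illustrative geometry and fluid values, not those of the studied servovalve.

```python
import math

RHO = 850.0               # hydraulic oil density, kg/m^3 (illustrative)
THETA = math.radians(69)  # classical jet angle for a sharp-edged orifice

def steady_flow_force(cd, cv, area, dp):
    """Classical steady-state axial flow force, equivalent to 2*Cd*Cv*A*dp*cos(theta)."""
    q = cd * area * math.sqrt(2 * dp / RHO)   # volumetric flow rate, m^3/s
    v_jet = cv * math.sqrt(2 * dp / RHO)      # jet velocity, m/s
    return RHO * q * v_jet * math.cos(THETA)  # momentum-balance form of the flow force

# Illustrative opening: rectangular port, width 8 mm, spool travel 0.5 mm, dp = 100 bar
area = 8e-3 * 0.5e-3
force = steady_flow_force(cd=0.7, cv=0.98, area=area, dp=100e5)
print(f"Steady flow force ~ {force:.1f} N")
```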
Procedia PDF Downloads 43
86 Exploring Antimicrobial Resistance in the Lung Microbial Community Using Unsupervised Machine Learning
Authors: Camilo Cerda Sarabia, Fernanda Bravo Cornejo, Diego Santibanez Oyarce, Hugo Osses Prado, Esteban Gómez Terán, Belén Diaz Diaz, Raúl Caulier-Cisterna, Jorge Vergara-Quezada, Ana Moya-Beltrán
Abstract:
Antimicrobial resistance (AMR) represents a significant and rapidly escalating global health threat. Projections estimate that by 2050, AMR infections could claim up to 10 million lives annually. Respiratory infections, in particular, pose a severe risk not only to individual patients but also to the broader public health system. Despite the alarming rise in resistant respiratory infections, AMR within the lung microbiome (microbial community) remains underexplored and poorly characterized. The lungs, as a complex and dynamic microbial environment, host diverse communities of microorganisms whose interactions and resistance mechanisms are not fully understood. Unlike studies that focus on individual genomes, analyzing the entire microbiome provides a comprehensive perspective on microbial interactions, resistance gene transfer, and community dynamics, which are crucial for understanding AMR. However, this holistic approach introduces significant computational challenges and exposes the limitations of traditional analytical methods, such as the difficulty of identifying AMR. Machine learning has emerged as a powerful tool to overcome these challenges, offering the ability to analyze complex genomic data and uncover novel insights into AMR that might be overlooked by conventional approaches. This study investigates microbial resistance within the lung microbiome using unsupervised machine learning approaches to uncover resistance patterns and potential clinical associations. We downloaded and selected lung microbiome data from HumanMetagenomeDB based on metadata characteristics such as relevant clinical information, patient demographics, environmental factors, and sample collection methods. The metadata were further complemented by details on antibiotic usage, disease status, and other relevant descriptions. The sequencing data underwent stringent quality control, followed by functional profiling focused on identifying resistance genes through specialized databases such as the Comprehensive Antibiotic Resistance Database (CARD), which contains AMR gene sequences and resistance profiles. Subsequent analyses employed unsupervised machine learning techniques to unravel the structure and diversity of resistomes in the microbial community. Among the methods employed were clustering approaches such as K-Means and hierarchical clustering, which enabled the identification of sample groups based on their resistance gene profiles. The work was implemented in Python, leveraging a range of libraries such as Biopython for biological sequence manipulation, NumPy for numerical operations, scikit-learn for machine learning, Matplotlib for data visualization and pandas for data manipulation. The findings from this study provide insights into the distribution and dynamics of antimicrobial resistance within the lung microbiome. By leveraging unsupervised machine learning, we identified novel resistance patterns and potential drivers within the microbial community.
Keywords: antibiotic resistance, microbial community, unsupervised machine learning, AMR gene sequences
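A minimal sketch of the clustering step described above, grouping sample-by-gene resistance profiles with scikit-learn's K-Means; the matrix here is randomly generated as a stand-in for the actual CARD-derived abundance data.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Stand-in for a samples x AMR-genes abundance matrix (e.g. derived from CARD hits).
n_samples, n_genes = 60, 40
profiles = rng.poisson(lam=3.0, size=(n_samples, n_genes)).astype(float)

# Scale features so highly abundant genes do not dominate the distance metric.
X = StandardScaler().fit_transform(profiles)

# Group samples by the similarity of their resistome profiles.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
labels = kmeans.labels_

for k in range(3):
    print(f"cluster {k}: {np.sum(labels == k)} samples")
```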
Procedia PDF Downloads 23
85 Influence of Cryo-Grinding on Antioxidant Activity and Amount of Free Phenolic Acids, Rutin and Tyrosol in Whole Grain Buckwheat and Pumpkin Seed Cake
Authors: B. Voucko, M. Benkovic, N. Cukelj, S. Drakula, D. Novotni, S. Balbino, D. Curic
Abstract:
Oxidative stress is considered one of the causes leading to metabolic disorders in humans. Therefore, the ability of antioxidants to inhibit free radical production is their primary role in the human organism. Antioxidants originating from cereals, especially flavonoids and polyphenols, are mostly bound and indigestible. Micronization damages the cell wall, which consequently makes the bioactive material more accessible in vivo. In order to ensure complete fragmentation, micronization is often combined with high temperatures (e.g., 200°C for bran), which can lead to degradation of bioactive compounds. The innovative non-thermal technology of cryo-milling is an ultra-fine micronization method that uses liquid nitrogen (LN2) at a temperature of -195°C to freeze and cool the sample during milling. Freezing at such low temperatures causes the material to become brittle, which ensures the generation of fine particles while preserving the bioactive content of the material. The aim of this research was to determine whether the production of ultra-fine material by cryo-milling would result in an augmentation of the available bioactive compounds of buckwheat and pumpkin seed cake. For that reason, buckwheat and pumpkin seed cake were ground in a ball mill (CryoMill, Retsch, Germany) with and without the use of LN2 for 8 minutes, in a 50 mL stainless steel jar containing one grinding ball (Ø 25 mm) at an oscillation frequency of 30 Hz. The cryo-milled samples were cooled with LN2 for 2 minutes prior to milling, followed by the first cycle of milling (4 minutes), intermediary cooling (2 minutes), and finally the second cycle of milling (a further 4 minutes). A continuous milling process was applied to the samples ground without freezing with LN2. Particle size distribution was determined using the Scirocco 2000 dry dispersion unit (Malvern Instruments, UK). Antioxidant activity was determined by the 2,2-diphenyl-1-picrylhydrazyl (DPPH) test and the ferric reducing antioxidant power (FRAP) assay, while the total phenol content was determined using the Folin-Ciocalteu method, with an ultraviolet-visible spectrophotometer (Specord 50 Plus, Germany). The content of the free phenolic acids, rutin in buckwheat and tyrosol in pumpkin seed cake, was determined with an HPLC-PDA method (Agilent 1200 series, Germany). Cryo-milling resulted in 11 times smaller buckwheat particles and 3 times smaller pumpkin seed particles than milling without the use of LN2, but also in a lower uniformity of the particle size distribution. The lack of freezing during milling of pumpkin seed cake caused the formation of agglomerates due to its high fat content (21%). Cryo-milling increased the antioxidant activity of buckwheat flour measured by the DPPH test (by 23.9%) and the available rutin content (by 14.5%). It also resulted in an augmentation of the total phenol content (36.9%) and the available tyrosol content (12.5%) of pumpkin seed cake. Antioxidant activity measured with the FRAP test, as well as the content of phenolic acids, remained unchanged independent of the milling process. The results of this study showed the potential of cryo-milling for complete raw material utilization in the food industry, as well as its potential as a tool for the extraction of targeted bioactive components.
Keywords: bioactive, ball-mill, buckwheat, cryo-milling, pumpkin seed cake
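For reference, the DPPH radical-scavenging activity reported above is commonly computed from absorbance readings as a percentage of inhibition; a minimal sketch with illustrative absorbance values follows.

```python
def dpph_inhibition(a_control: float, a_sample: float) -> float:
    """Radical scavenging activity (%) from DPPH absorbance readings (~517 nm)."""
    return (a_control - a_sample) / a_control * 100.0

# Illustrative absorbances for the DPPH solution alone and with an extract added.
a_control, a_sample = 0.842, 0.513
print(f"DPPH inhibition = {dpph_inhibition(a_control, a_sample):.1f} %")
```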
Procedia PDF Downloads 132
84 Improved Anatomy Teaching by the 3D Slicer Platform
Authors: Ahmedou Moulaye Idriss, Yahya Tfeil
Abstract:
Medical imaging technology has become an indispensable tool in many branches of the biomedical and health areas and in research, and it is vitally important for the training of professionals in these fields. It is not only about the tools, technologies, and knowledge provided but also about the community that this training project proposes. In order to raise the level of anatomy teaching in the medical school of Nouakchott in Mauritania, it is necessary and even urgent to facilitate access to modern technology for African countries. The role of technology as a key driver of sustainable development has long been recognized. Anatomy is an essential discipline for the training of medical students; it is a key element for the training of medical specialists. The quality and results of the work of a young surgeon depend on his or her knowledge of anatomical structures. The teaching of anatomy is difficult, as the discipline is neglected by medical students in many academic institutions. However, anatomy remains a vital part of any medical education program. When anatomy is presented in various planes, medical students report difficulties in understanding. They do not increase their ability to visualize and mentally manipulate 3D structures. They are sometimes not able to correctly identify neighbouring or associated structures. This is the case when they have to identify structures related to the caudate lobe when the liver is moved to different positions. In recent decades, modern educational tools using digital sources have tended to replace the old methods. One of the main reasons for this change is the lack of cadavers in laboratories with poorly qualified staff. The emergence of increasingly sophisticated mathematical models, image processing, and visualization tools in biomedical imaging research has enabled sophisticated three-dimensional (3D) representations of anatomical structures. In this paper, we report our current experience in the Faculty of Medicine in Nouakchott, Mauritania. One of our main aims is to create a local learning community in the field of anatomy. The main technological platform used in this project is called 3D Slicer. The 3D Slicer platform is an open-source application, available for free, for viewing, analyzing, and interacting with biomedical imaging data. Using the 3D Slicer platform, we created anatomical atlases of parts of the human body from real medical images, including the head, thorax, abdomen, liver, pelvis, and upper and lower limbs. Data were collected from several local hospitals and also from the web. We used MRI and CT scan imaging data from children and adults. Many different anatomy atlases exist, both in print and in digital form. An anatomy atlas displays three-dimensional anatomical models, image cross-sections of labelled structures and the source radiological imaging, and a text-based hierarchy of structures. Open and free online anatomical atlases developed by our anatomy laboratory team will be available to our students. This will allow pedagogical autonomy and remedy the shortcomings by responding more fully to the objectives of sustainable local development of quality education and good health at the national level. To make this work a reality, our team produced several atlases available in our faculty in the form of research projects.
Keywords: anatomy, education, medical imaging, three dimensional
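As a usage illustration of the platform described above, the following snippet can be run in 3D Slicer's built-in Python console to load a volume and a labelled segmentation and list its structures; the file paths are placeholders, and this is a generic example rather than the atlas-building pipeline of this work.

```python
# Run inside 3D Slicer's Python console (the `slicer` module is provided by the application).
import slicer

# Load a source CT/MRI volume and a segmentation with labelled anatomical structures.
# The paths below are placeholders.
volume = slicer.util.loadVolume("/data/abdomen_ct.nrrd")
segmentation = slicer.util.loadSegmentation("/data/abdomen_structures.seg.nrrd")

# Show the segmentation as 3D surfaces alongside the cross-sectional views.
segmentation.CreateClosedSurfaceRepresentation()

# List the labelled structures available in the atlas.
seg = segmentation.GetSegmentation()
for i in range(seg.GetNumberOfSegments()):
    print(seg.GetNthSegment(i).GetName())
```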
Procedia PDF Downloads 241
83 Combination of Modelling and Environmental Life Cycle Assessment Approach for Demand Driven Biogas Production
Authors: Juan A. Arzate, Funda C. Ertem, M. Nicolas Cruz-Bournazou, Peter Neubauer, Stefan Junne
Abstract:
One of the biggest challenges the world faces today is global warming, which is caused by greenhouse gases (GHGs) coming from the combustion of fossil fuels for energy generation. In order to mitigate climate change, the European Union has committed to reducing GHG emissions to 80–95% below 1990 levels by the year 2050. Renewable technologies are vital to diminish energy-related GHG emissions. Since water and biomass are limited resources, the largest contributions to renewable energy (RE) systems will have to come from wind and solar power. Nevertheless, high proportions of fluctuating RE will present a number of challenges, especially regarding the need to balance the variable energy demand with the weather-dependent fluctuation of energy supply. Biogas plants would therefore play an important role in this context, since they are easily adaptable. Feedstock availability varies locally and seasonally; however, there is a lack of knowledge on how biogas plants should be operated in a stable manner on local feedstock. This problem may be prevented through suitable control strategies. Such strategies require the development of convenient mathematical models that fairly describe the main processes. Modelling allows us to predict the system behavior of biogas plants when different feedstocks are used at different loading rates. Life cycle assessment (LCA) is a technique for analyzing several aspects of the life of a product, up to its disposal, from an environmental point of view. It is highly recommended as a decision-making tool. The combination of flexible energy generation provided by biogas plants, a secure production process, and the maximization of the environmental benefits can be obtained by combining process modelling and LCA approaches. For this reason, this study focuses on a biogas plant that flexibly generates the required energy from the co-digestion of maize, grass and cattle manure, while emitting the lowest amount of GHGs. To achieve this goal, the AMOCO model was combined with LCA. The program was structured in Matlab to simulate any biogas process based on the AMOCO model and combined with the equations necessary to obtain the climate change, acidification and eutrophication potentials of the whole production system based on the ReCiPe midpoint v.1.06 methodology. The developed simulation was optimized based on real data from operating biogas plants and existing literature research. The results prove that the AMOCO model can successfully imitate the system behavior of biogas plants and the time required for the process to adapt in order to generate the demanded energy from the available feedstock. Combination with the LCA approach provided the opportunity to keep the resulting emissions from operation at the lowest possible level. This would allow for a prediction of the process when feedstock utilization supports the establishment of closed material circles within a smart bio-production grid, under the constraint of minimal drawbacks for the environment and maximal sustainability.
Keywords: AMOCO model, GHG emissions, life cycle assessment, modelling
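To give a feel for the kind of model behind this simulation, here is a highly simplified Python sketch of a two-step anaerobic digestion model in the spirit of AMOCO (acidogenic biomass with Monod kinetics, methanogenic biomass with Haldane kinetics); all parameter values are illustrative placeholders, not the calibrated values of the study, which was implemented in Matlab.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative parameters only (not calibrated values from this study).
MU1_MAX, KS1 = 1.2, 7.1             # acidogenesis: Monod kinetics
MU2_MAX, KS2, KI2 = 0.74, 9.3, 256  # methanogenesis: Haldane kinetics
ALPHA, D = 0.5, 0.05                # biomass retention factor, dilution rate (1/d)
K1, K2, K3, K6 = 42.1, 116.5, 268.0, 453.0
S1_IN, S2_IN = 30.0, 50.0           # inlet substrate concentrations

def adm(t, y):
    x1, x2, s1, s2 = y
    mu1 = MU1_MAX * s1 / (KS1 + s1)                    # Monod growth rate
    mu2 = MU2_MAX * s2 / (KS2 + s2 + (s2 ** 2) / KI2)  # Haldane growth rate
    dx1 = (mu1 - ALPHA * D) * x1
    dx2 = (mu2 - ALPHA * D) * x2
    ds1 = D * (S1_IN - s1) - K1 * mu1 * x1
    ds2 = D * (S2_IN - s2) + K2 * mu1 * x1 - K3 * mu2 * x2
    return [dx1, dx2, ds1, ds2]

sol = solve_ivp(adm, (0, 100), [0.5, 0.5, 5.0, 5.0])
x2_end, s2_end = sol.y[1, -1], sol.y[3, -1]
mu2_end = MU2_MAX * s2_end / (KS2 + s2_end + s2_end ** 2 / KI2)
q_ch4 = K6 * mu2_end * x2_end  # methane production estimate
print(f"steady-state methane production rate ~ {q_ch4:.1f} (model units)")
```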
Procedia PDF Downloads 188
82 Bio-Inspired Information Complexity Management: From Ant Colony to Construction Firm
Authors: Hamza Saeed, Khurram Iqbal Ahmad Khan
Abstract:
Effective information management is crucial for any construction project and its success. The primary areas of information generation are either the construction site or the design office. Different types of information are required at different stages of construction, involving various stakeholders and creating complexity. There is a need for effective management of information flows to reduce the uncertainty that creates complexity. Nature provides a unique perspective in terms of dealing with complexity, in particular information complexity. System dynamics methodology provides modeling and simulation tools and techniques that help address complexity. Nature has been dealing with complex systems since its creation 4.5 billion years ago. It has perfected its systems through evolution, resilience towards sudden changes, and the extinction of unadaptable and outdated species that are no longer fit for the environment. Nature has been accommodating changing factors and handling complexity forever. Humans have started to look at their natural counterparts for inspiration and solutions to their problems. This brings forth the possibility of using a biomimetics approach to improve the management practices used in the construction sector. Ants inhabit different habitats: Cataglyphis and Pogonomyrmex live in deserts, leafcutter ants reside in rainforests, and pharaoh ants are native to the urban developments of tropical areas. Detailed studies have been done on fifty species out of the fourteen thousand discovered. They provide the opportunity to study the interactions that generate collective behavior in diverse environments. Animals evolve to better adapt to their environment. The collective behavior of ants emerges from feedback through interactions among individuals, based on a combination of three basic factors: the patchiness of resources in time and space, operating cost, and environmental stability together with the threat of rupture. If resources appear in patches through time and space, the response is accelerating and non-linear; if resources are scattered, the response follows a linear pattern. If the acquisition of energy through food is faster than the energy spent to get it, the default is to continue with an activity unless it is halted for some reason. If the energy spent is higher than the energy gained, the default changes to staying put unless activated. Finally, if the environment is stable and the threat of rupture is low, the activation and amplification rate is slow but steady; otherwise, it is fast and sporadic. To further study these effects and to eliminate environmental bias, the behavior of four different ant species was studied, namely red harvester ants (Pogonomyrmex barbatus), Argentine ants (Linepithema humile), turtle ants (Cephalotes goniodontus), and leafcutter ants (genus Atta). This study aims to improve the information system in the construction sector by providing a guideline inspired by nature with a systems-thinking approach, using system dynamics as a tool. The identified factors and their interdependencies were analyzed in the form of a causal loop diagram (CLD), and construction industry professionals were interviewed based on the developed CLD, which was validated with a significant response.
These factors and interdependencies in the natural system correspond with those in man-made systems, providing a guideline for the effective use and flow of information.
Keywords: biomimetics, complex systems, construction management, information management, system dynamics
Procedia PDF Downloads 137
81 Developing a Performance Measurement System for Arts-Based Initiatives: Action Research on Italian Corporate Museums
Authors: Eleonora Carloni, Michela Arnaboldi
Abstract:
In academia, the investigation of the relationship between cultural heritage and corporations is ubiquitous in several fields of studies. In practice corporations are more and more integrating arts and cultural heritage in their strategies for disparate benefits, such as: to foster customer’s purchase intention with authentic and aesthetic experiences, to improve their reputation towards local communities, and to motivate employees with creative thinking. There are diverse forms under which corporations set these artistic interventions, from sponsorships to arts-based training centers for employees, but scholars agree that the maximum expression of this cultural trend are corporate museums, growing in number and relevance. Corporate museums are museum-like settings, hosting artworks of corporations’ history and interests. In academia they have been ascribed as strategic asset and they have been associated with diverse uses for corporations’ benefits, from place for preservation of cultural heritage, to tools for public relations and cultural flagship stores. Previous studies have thus extensively but fragmentally studied the diverse benefits of corporate museum opening to corporations, with a lack of comprehensive approach and a digression on how to evaluate and report corporate museum’s performances. Stepping forward, the present study aims to investigate: 1) what are the key performance measures corporate museums need to report to the associated corporations; 2) how are the key performance measures reported to the concerned corporations. This direction of study is not only suggested as future direction in academia but it has solid basis in practice, aiming to answer to the need of corporate museums’ directors to account for corporate museum’s activities to the concerned corporation. Coherently, at an empirical level the study relies on action research method, whose distinctive feature is to develop practical knowledge through a participatory process. This paper indeed relies on the experience of a collaborative project between the researchers and a set of corporate museums in Italy, aimed at co-developing a performance measurement system. The project involved two steps: a first step, in which researchers derived the potential performance measures from literature along with exploratory interviews; a second step, in which researchers supported the pool of corporate museums’ directors in co-developing a set of key performance indicators for reporting. Preliminary empirical findings show that while scholars insist on corporate museums’ capability to develop networking relations, directors insist on the role of museums as internal supplier of knowledge for innovation goals. Moreover, directors stress museums’ cultural mission and outcomes as potential benefits for corporation, by remarking to include both cultural and business measures in the final tool. In addition, they give relevant attention to the wording used in humanistic terms while struggling to express all measures in economic terms. The paper aims to contribute to corporate museums’ and more broadly to arts-based initiatives’ literature in two directions. Firstly, it elaborates key performance measures with related indicators to report on cultural initiatives for corporations. 
Secondly, it provides evidence of the challenges and practices involved in reporting on these initiatives, given the tensions arising from the co-existence of diverse perspectives, namely the arts and business worlds. Keywords: arts-based initiative, corporate museum, hybrid organization, performance measurement
Procedia PDF Downloads 17680 Foucault and Governmentality: International Organizations and State Power
Authors: Sara Dragisic
Abstract:
Using the theoretical analysis of the birth of biopolitics that Foucault performed through the history of liberalism and neoliberalism, in this paper we will try to show how, precisely through problematizing the role of international institutions, the model of governance differs from previous ways of objectifying the body and life. Are the state and its mechanisms still a Leviathan to fight against, or can they even be the driver of resistance against the proponents of modern governance and biopolitical power? Do paradigmatic examples of biopolitics still appear through sovereignty and (international) law, or is it precisely this sphere that shows a significant dose of incompetence and powerlessness in relation not only to the economic sphere (Foucault's critique of neoliberalism) but also to the new politics of freedom? Have the struggle for freedom and human rights, as well as the war on terrorism, opened a new spectrum of biopolitical processes, manifested precisely through new international institutions and humanitarian discourse? We will try to answer these questions in the following way. On the one hand, we will show that the views of authors such as Agamben and Hardt and Negri, for whom the state and sovereignty are enemies to be defeated or overcome, fail to see how such attempts could translate into the politicization of life, as has happened in many cases through the doctrine of liberal interventionism and humanitarianism. On the other hand, we will point out that it is precisely the humanitarian discourse and the defense of the right to intervention that can be the incentive and basis for the politicization of the category of life and lead to the selective application of human rights. Zizek's example of the killing of United Nations workers and doctors in a village during the Vietnam War, who were targeted even before police or soldiers precisely because they were seen as a powerful instrument of American imperialism (as they were sincerely trying to help the population), will be the focus of this part of the analysis. We will ask whether such an interpretation is a kind of liquidation of the extreme left of the political (Laclau), or whether it can at least partly explain the need to review the functioning of international organizations, ranging from those dealing with humanitarian aid (and humanitarian military interventions) to those dealing with the protection and security of the population, primarily from growing terrorism. Based on the above examples, we will also explain how the discourse of terrorism itself plays a dual role: it can appear as a tool of liberal biopolitics, although, more superficially, it mostly appears as an enemy that wants to destroy the liberal system and its values. This brings us to the basic problem that this paper will tackle: do the mechanisms of the institutional struggle for human rights and freedoms, which are often seen as opposed to the security mechanisms of the state, serve the governance of citizens in such a way that the latter themselves participate in producing biopolitical governmental practices? Is freedom today "nothing but the correlative development of apparatuses of security" (Foucault)? Or we can continue this line of Foucault's argumentation and extend the interpretation with the important question of what, today, precisely reflects the change in the rationality of governance through which society is transformed from a passive object into a subject of its own production. 
Finally, in order to understand the techniques of biopolitical governance in modern civil society, it is necessary to pay attention to the status of international organizations, which seem to have become a significant site for the implementation of global governance. In this sense, the power of sovereignty can turn out to be insufficiently strong in relation to security policy, which can go hand in hand with freedom policies through neoliberal governmental techniques. Keywords: neoliberalism, Foucault, sovereignty, biopolitics, international organizations, NGOs, Agamben, Hardt&Negri, Zizek, security, state power
Procedia PDF Downloads 20679 Digitization and Morphometric Characterization of Botanical Collection of Indian Arid Zones as Informatics Initiatives Addressing Conservation Issues in Climate Change Scenario
Authors: Dipankar Saha, J. P. Singh, C. B. Pandey
Abstract:
The Indian Thar desert, the seventh largest in the world and the country's main hot sand desert, occupies nearly 385,000 km2, about 9% of the area of the country, and harbours a flora of 682 species (63 introduced species) belonging to 352 genera and 87 families. The degree of endemism of plant species in the Thar desert is 6.4 percent, relatively higher than that of the Sahara desert, which is very significant for conservationists to envisage. The advent and development of computer technology for digitization and database management, coupled with the rapidly increasing importance of biodiversity conservation, resulted in the emergence of biodiversity informatics as a basic-science discipline with multiple applications. Aichi Target 19, an outcome of the Convention on Biological Diversity (CBD), specifically mandates the development of an advanced and shared biodiversity knowledge base. Information on species distributions in space is the crux of effective biodiversity management in a rapidly changing world. The efficiency of biodiversity management is being increased rapidly by various stakeholders, such as researchers, policymakers, and funding agencies, through the knowledge and application of biodiversity informatics. Herbarium specimens are a vital repository for biodiversity conservation, especially in a climate change scenario; the digitization process usually aims to improve access and to preserve delicate specimens, and in doing so it creates large sets of images as part of the existing repository, an arid plant information facility, for long-term future use. Leaf characters are important for describing and distinguishing taxa, and they can be measured from herbarium specimens as well. As part of this activity, laminar characterization (leaves being among the most important characters for assessing climate change impact) initially resulted in the classification of more than a thousand collections belonging to ten families, including Acanthaceae, Aizoaceae, Amaranthaceae, Asclepiadaceae, Anacardiaceae, Apocynaceae, Asteraceae, Aristolochiaceae, Burseraceae and Bignoniaceae. Taxonomic diversity indices have also been worked out, this being one of the important domains of biodiversity informatics approaches. The digitization process also encompasses workflows incorporating automated systems that enable us to expand and speed up digitization. The digitization workflows follow a modular system with the potential to be scaled up; they are being developed with a geo-referencing tool and additional quality control elements, finally placing specimen images and data into a fully searchable, web-accessible database. Our effort in this paper is to elucidate the role of biodiversity informatics and to present the ongoing effort of database development for the existing botanical collection in the institute repository. This effort is expected to form part of various global initiatives toward an effective biodiversity information facility. This will enable access to plant biodiversity data that are fit for use by scientists and decision makers working on biodiversity conservation and sustainable development in the region and in iso-climatic situations of the world. Keywords: biodiversity informatics, climate change, digitization, herbarium, laminar characters, web accessible interface
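The abstract does not specify which taxonomic diversity indices were worked out; purely as an illustrative sketch, a Shannon-type diversity index can be computed over family-level counts drawn from a digitized specimen table. All record identifiers and field names below are hypothetical, not part of the repository described.

```python
import math
from collections import Counter

# Hypothetical digitized specimen records (only the family field is shown);
# in a real workflow these rows would come from the web-accessible database.
specimens = [
    {"id": "ARID-0001", "family": "Acanthaceae"},
    {"id": "ARID-0002", "family": "Asteraceae"},
    {"id": "ARID-0003", "family": "Asteraceae"},
    {"id": "ARID-0004", "family": "Amaranthaceae"},
    {"id": "ARID-0005", "family": "Apocynaceae"},
]

def shannon_diversity(records, key="family"):
    """Shannon diversity H' = -sum(p_i * ln p_i) over taxon proportions."""
    counts = Counter(r[key] for r in records)
    total = sum(counts.values())
    return -sum((n / total) * math.log(n / total) for n in counts.values())

print(f"Shannon diversity (family level): {shannon_diversity(specimens):.3f}")
```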
Procedia PDF Downloads 22978 Times2D: A Time-Frequency Method for Time Series Forecasting
Authors: Reza Nematirad, Anil Pahwa, Balasubramaniam Natarajan
Abstract:
Time series data consist of successive data points collected over a period of time. Accurate prediction of future values is essential for informed decision-making in several real-world applications, including electricity load demand forecasting, lifetime estimation of industrial machinery, traffic planning, weather prediction, and the stock market. Due to their critical relevance and wide application, there has been considerable interest in time series forecasting in recent years. However, the proliferation of sensors and IoT devices, real-time monitoring systems, and high-frequency trading data introduces significant intricate temporal variations, rapid changes, noise, and non-linearities, making time series forecasting more challenging. Classical methods such as Autoregressive Integrated Moving Average (ARIMA) and Exponential Smoothing aim to extract pre-defined temporal variations, such as trends and seasonality. While these methods are effective for capturing well-defined seasonal patterns and trends, they often struggle with more complex, non-linear patterns present in real-world time series data. In recent years, deep learning has made significant contributions to time series forecasting. Recurrent Neural Networks (RNNs) and their variants, such as Long Short-Term Memory (LSTM) and Gated Recurrent Units (GRUs), have been widely adopted for modeling sequential data. However, they often struggle to capture local trends and rapid fluctuations. Convolutional Neural Networks (CNNs), particularly Temporal Convolutional Networks (TCNs), leverage convolutional layers to capture temporal dependencies by applying convolutional filters along the temporal dimension. Despite their advantages, TCNs struggle to capture relationships between distant time points due to the locality of one-dimensional convolution kernels. Transformers have revolutionized time series forecasting with their powerful attention mechanisms, effectively capturing long-term dependencies and relationships between distant time points. However, the attention mechanism may struggle to discern dependencies directly from scattered time points due to intricate temporal patterns. Lastly, Multi-Layer Perceptrons (MLPs) have also been employed, with models like N-BEATS and LightTS demonstrating success. Despite this, MLPs often face high volatility and computational complexity challenges in long-horizon forecasting. To address intricate temporal variations in time series data, this study introduces Times2D, a novel framework that integrates 2D spectrogram and derivative heatmap techniques in parallel. The spectrogram focuses on the frequency domain, capturing periodicity, while the derivative patterns emphasize the time domain, highlighting sharp fluctuations and turning points. This 2D transformation enables the utilization of powerful computer vision techniques to capture various intricate temporal variations. To evaluate the performance of Times2D, extensive experiments were conducted on standard time series datasets and compared with various state-of-the-art algorithms, including DLinear (2023), TimesNet (2023), Non-stationary Transformer (2022), PatchTST (2023), N-HiTS (2023), Crossformer (2023), MICN (2023), LightTS (2022), FEDformer (2022), FiLM (2022), SCINet (2022a), Autoformer (2021), and Informer (2021), under the same modeling conditions. The initial results demonstrated that Times2D achieves consistent state-of-the-art performance in both short-term and long-term forecasting tasks. 
Furthermore, the generality of the Times2D framework allows it to be applied to various tasks such as time series imputation, clustering, classification, and anomaly detection, offering potential benefits in any domain that involves sequential data analysis.Keywords: derivative patterns, spectrogram, time series forecasting, times2D, 2D representation
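The exact 2D construction used inside Times2D is not detailed in the abstract; the following minimal sketch only illustrates the general idea of pairing a frequency-domain spectrogram with a time-domain derivative heatmap for a 1D series, assuming standard NumPy/SciPy tooling rather than the authors' implementation.

```python
import numpy as np
from scipy.signal import spectrogram

def to_2d_representations(x, fs=1.0, nperseg=64):
    """Build two 2D views of a 1D series: a spectrogram (frequency domain)
    and a derivative heatmap (time domain), as rough analogues of the two
    parallel branches described for Times2D."""
    # Frequency-domain view: short-time Fourier magnitude, capturing periodicity.
    f, t, sxx = spectrogram(x, fs=fs, nperseg=nperseg)

    # Time-domain view: stack first and second derivatives to highlight
    # sharp fluctuations and turning points.
    d1 = np.gradient(x)
    d2 = np.gradient(d1)
    deriv_heatmap = np.stack([d1, d2])   # shape (2, len(x))

    return sxx, deriv_heatmap

# Toy usage: a noisy sine wave with a slow trend.
series = np.sin(np.linspace(0, 20 * np.pi, 1024)) + 0.001 * np.arange(1024)
spec, deriv = to_2d_representations(series)
print(spec.shape, deriv.shape)
```

Both views are ordinary 2D arrays, so any image-style (computer vision) backbone could, in principle, be applied to them downstream.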
Procedia PDF Downloads 4277 Food Processing Technology and Packaging: A Case Study of Indian Cashew-Nut Industry
Authors: Parashram Jakappa Patil
Abstract:
India is the global leader in the world cashew business, and the cashew-nut industry is one of the important food processing industries in the world. India is the largest producer, processor, exporter and importer of cashew in the world, supplying cashew to the rest of the world and meeting world demand. India has tremendous potential for cashew production and export to other countries. Every year India earns more than 2,000 crore rupees through the cashew trade. The cashew industry is one of the important small-scale industries in the country, playing a significant role in rural development. It generates more than 400,000 jobs in remote areas, 95% of cashew workers are women, it provides income to poor cashew farmers, most cashew processing units are small and cottage-scale, it helps stop the migration of young farmers in search of employment opportunities, it motivates rural entrepreneurship development, and it also helps protect the environment. Hence the cashew business is a very important agribusiness in India, with the potential to drive inclusive development. The World Bank and IMF have recognized the cashew-nut industry as an important tool for poverty eradication at the global level. This shows the importance of the cashew business and its strong presence in India. In spite of such huge potential, the cashew processing industry faces various problems, such as inadequate infrastructure, short supply of raw cashew, limited availability of finance, difficulties in collecting raw cashew, unavailability of warehouses, marketing of cashew kernels, lack of technical knowledge, and especially processing technology and packaging of finished products. The industry has great prospects, such as scope for more cashew cultivation and production, employment generation, formation of cashew processing units, alcohol production from cashew apple, shell oil production, rural development, poverty elimination, development of socially and economically backward classes, and environmental protection. The industry has domestic as well as foreign markets; India has tremendous potential in this regard. Cashew is a poor man's crop but a rich man's food. Cashew is a source of income and livelihood for poor farmers. The cashew-nut industry may play a very important role in the development of hilly regions. The objectives of this paper are to identify the problems of cashew processing and the use of processing technology, the problems of cashew kernel packaging, the evolution of cashew processing technology over the years and its impact on the final product, and the impact of good processing, through appropriate technology and packaging, on the international trade of cashew-nut. The most important problems of the cashew processing industry are processing and packaging. Poor processing greatly reduces the quality of cashew kernels, especially through breakage; broken kernels fetch a much lower market price than whole kernels and are not eligible for export. On the other hand, without good packaging, cashew kernels absorb moisture, which destroys their taste. International trade in cashew-nut depends on two things: cashew processing and packaging. This study has strong relevance because the cashew-nut industry is labour-oriented; processing technology has not played an important role, since 95% of processing work is manual. Hence processing work has depended on the physical performance of workers, which makes the presence of a large workforce inevitable. 
Many cashew processing units have closed because they cannot secure a sufficient workforce. However, due to advancements in technology, this picture is slowly changing and processing work is improving. It is therefore interesting to explore all these aspects in the context of processing and packaging in the cashew business. Keywords: cashew, processing technology, packaging, international trade, change
Procedia PDF Downloads 42276 A Review on Cyberchondria Based on Bibliometric Analysis
Authors: Xiaoqing Peng, Aijing Luo, Yang Chen
Abstract:
Background: Cyberchondria, as an "emerging risk" accompanying the information era, is a new abnormal pattern characterized by excessive or repeated online searches for health-related information and escalating health anxiety, which endangers people's physical and mental health and poses a huge threat to public health. Objective: To explore and discuss the research status, hotspots and trends of Cyberchondria. Methods: Based on a total of 77 articles on "Cyberchondria" extracted from Web of Science from its inception until October 2019, literature trends, countries, institutions and hotspots are analyzed by bibliometric analysis; the conceptual definition of Cyberchondria, instruments, relevant factors, and treatment and intervention are discussed as well. Results: Since "Cyberchondria" was first put forward in 2001, the last two decades have witnessed a noticeable increase in the amount of literature; during 2014-2019 it quadrupled dramatically to 62 articles, compared with only 15 before 2014, which shows that Cyberchondria has become a new theme and hot topic in recent years. The United States was the most active contributor with the largest number of publications (23), followed by England (11) and Australia (11), while the leading institutions were Baylor University (7) and the University of Sydney (7), followed by Florida State University (4) and the University of Manchester (4). The WoS categories "Psychiatry/Psychology" and "Computer/Information Science" were the areas of greatest influence. The conceptual definition of Cyberchondria is not completely unified worldwide, but it is generally considered an abnormal behavioral pattern and emotional state and has been invoked to refer to the anxiety-amplifying effects of online health-related searches. The first and most frequently cited scale for measuring the severity of Cyberchondria, the Cyberchondria Severity Scale (CSS), was developed in 2014; it conceptualized Cyberchondria as a multidimensional construct consisting of compulsion, distress, excessiveness, reassurance, and mistrust of medical professionals, the last of which was later shown not to be necessary for the construct. Since then, Brazilian, German, Turkish, Polish and Chinese versions have been developed, improved and culturally adapted, while the CSS was optimized into a simplified version (CSS-12) in 2019, all of which are worthy of further verification. Research hotspots mainly focus on relevant factors such as intolerance of uncertainty, anxiety sensitivity, obsessive-compulsive disorder, internet addiction, abnormal illness behavior, the Whiteley Index and problematic internet use, trying to clarify the roles played by "associated factors" and "anxiety-amplifying factors" in the development of Cyberchondria, in order to better understand the aetiological links and pathways in the relationships between hypochondriasis, health anxiety and online health-related searches. Although the treatment and intervention of Cyberchondria are still at an initial, exploratory stage, there are meaningful attempts to seek effective strategies from different angles, such as online psychological treatment, network technology management, health information literacy improvement and public health services. Conclusion: Research on Cyberchondria is in its infancy but deserves more attention. 
A conceptual consensus on Cyberchondria, a refined assessment tool, prospective studies conducted in various populations, and targeted treatments would be the main research directions in the near future. Keywords: cyberchondria, hypochondriasis, health anxiety, online health-related searches
Procedia PDF Downloads 12275 Pre-Cancerigene Injuries Related to Human Papillomavirus: Importance of Cervicography as a Complementary Diagnosis Method
Authors: Denise De Fátima Fernandes Barbosa, Tyane Mayara Ferreira Oliveira, Diego Jorge Maia Lima, Paula Renata Amorim Lessa, Ana Karina Bezerra Pinheiro, Cintia Gondim Pereira Calou, Glauberto Da Silva Quirino, Hellen Lívia Oliveira Catunda, Tatiana Gomes Guedes, Nicolau Da Costa
Abstract:
The aim of this study is to evaluate the use of Digital Cervicography (DC) in the diagnosis of precancerous lesions related to Human Papillomavirus (HPV). This is a cross-sectional, evaluative study with a quantitative approach, conducted in a health unit linked to the Pro-Deanship of Extension of the Federal University of Ceará from July to August 2015, with a sample of 33 women. Data collection was conducted through interviews with the application of a data collection instrument. Franco (2005) standardized the technique used for DC. Polymerase Chain Reaction (PCR) was performed to identify high-risk HPV genotypes. The DC images were evaluated and classified by three judges. The results of DC and PCR were classified as positive, negative or inconclusive. The data from the collection instruments were compiled and analyzed with the Statistical Package for the Social Sciences (SPSS), using descriptive statistics and cross-tabulations. Sociodemographic, sexual and reproductive variables were analyzed through absolute frequencies (N) and their respective percentages (%). The Kappa coefficient (κ) was applied to determine agreement between the judges' DC reports and PCR, and also among the judges regarding the DC results. Pearson's chi-square test was used to analyze sociodemographic, sexual and reproductive variables against the PCR reports; p<0.05 was considered statistically significant. The ethical aspects of research involving human beings were respected, according to Resolution 466/2012. Regarding the sociodemographic profile, the most prevalent age groups, equally represented, were 21-30 and 41-50 years (24.2% each). Most participants self-reported as brown-skinned (84.8%), and 96.9% had completed, or were attending, primary or secondary school. 51.5% were married, 72.7% Catholic, 54.5% employed and 48.5% had an income between one and two minimum wages. As for sexual and reproductive characteristics, most were heterosexual (93.9%) and did not use condoms during sexual intercourse (72.7%). 51.5% had a previous history of Sexually Transmitted Infection (STI), with HPV the most prevalent STI (76.5%). 57.6% did not use contraception, 78.8% had undergone cervical cancer screening (PCCU) within an interval of one year or less, 72.7% had no family history of cervical cancer, 63.6% were multiparous and 97% were not vaccinated against HPV. DC showed a good level of agreement between raters (κ=0.542), with a specificity of 77.8% and a sensitivity of 25% when its results were compared with PCR. Only the variable race showed a statistically significant association with PCR (p=0.042). DC had 100% acceptance among the women in the sample, revealing the potential for further studies using this method so that it can prove to be a viable technique. The DC positivity criteria were developed by nurses, and these professionals also perform PCCU in Brazil, which means that DC can be an important complementary diagnostic method for enhancing the quality of these professionals' examinations. Keywords: gynecological examination, human papillomavirus, nursing, papillomavirus infections, uterine neoplasms
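The agreement and accuracy statistics reported above (kappa, sensitivity, specificity) follow standard definitions; the minimal sketch below shows how they might be computed with scikit-learn, using made-up binary labels that are not the study data.

```python
from sklearn.metrics import cohen_kappa_score, confusion_matrix

# Illustrative labels only (1 = positive, 0 = negative); not the study data.
pcr       = [1, 0, 0, 1, 0, 0, 1, 1, 0, 0]   # reference test
dc_result = [0, 0, 1, 1, 0, 0, 0, 1, 0, 1]   # Digital Cervicography reading

kappa = cohen_kappa_score(pcr, dc_result)

# Rows of the confusion matrix are the PCR (reference) labels.
tn, fp, fn, tp = confusion_matrix(pcr, dc_result).ravel()
sensitivity = tp / (tp + fn)   # true positive rate against PCR
specificity = tn / (tn + fp)   # true negative rate against PCR

print(f"kappa={kappa:.3f}, sensitivity={sensitivity:.1%}, specificity={specificity:.1%}")
```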
Procedia PDF Downloads 30074 Provotyping Futures Through Design
Authors: Elisabetta Cianfanelli, Maria Claudia Coppola, Margherita Tufarelli
Abstract:
Design practices throughout history return a critical understanding of society, since they have always conveyed values and meanings aimed at (re)framing reality by acting in everyday life: here, design gains a cultural and normative character, since its artifacts, services, and environments hold the power to intercept, influence and inspire thoughts, behaviors, and relationships. In this sense, design can be persuasive, engaging in the production of worlds and, as such, acting in the space between poietics and politics, so that chasing preferable futures and their aesthetic strategies becomes a matter full of political responsibility. This resonates with contemporary landscapes of radical interdependencies challenging designers to focus on complex socio-technical systems and to better support values such as equality and justice for both humans and nonhumans. In fact, it is in times of crisis and structural uncertainty that designers turn into visionaries at the service of society, envisioning scenarios and dwelling in the territories of imagination to conceive new fictions and frictions to be added to the thickness of the real. Here, design's main tasks are to develop options, to increase the variety of choices, and to cultivate its role as scout, jester, and agent provocateur for the public, so that design for transformation emerges, making an explicit commitment to society and furthering structural change in a proactive and synergic manner. However, the exploration of possible futures is both a trap and a trampoline because, although it embodies a radical research tool, it raises various challenges when the design process goes further in translating such a vision into an artefact, whether tangible or intangible, through which it should deliver that bit of future into everyday experience. Today designers are devising new tools and practices to tackle current wicked challenges, combining their approaches with other disciplinary domains: futuring through design thus arises from research strands like speculative design, design fiction, and critical design, where the blending of design approaches and futures thinking brings an action-oriented and product-based approach to strategic insights. The contribution is positioned at the intersection of those approaches, aiming to discuss design's tools of inquiry through which it is possible to grasp the agency of imagined futures in the present. Since futures are not remote, they actively participate in creating path-dependent decisions, crystallized into designed artifacts par excellence, prototypes, and their conceptual other, provotypes: with both being unfinished and multifaceted, the former are effective in reiterating solutions to problems already framed, while the latter prove useful when the goal is to explore and break boundaries, bringing preferable futures closer. By focusing on some provotypes throughout history which challenged markets and, above all, social and cultural structures, the contribution's final aim is to understand the knowledge produced by provotypes, understood as design spaces where design's humanistic side might help develop a deeper sensibility about uncertainty and, most of all, the unfinished nature of societal artifacts, whose experimentation would leave marks and traces to build up f(r)ictions as vital sparks of plurality and collective life. Keywords: speculative design, provotypes, design knowledge, political theory
Procedia PDF Downloads 13273 Describing Cognitive Decline in Alzheimer's Disease via a Picture Description Writing Task
Authors: Marielle Leijten, Catherine Meulemans, Sven De Maeyer, Luuk Van Waes
Abstract:
For the diagnosis of Alzheimer's disease (AD), a large variety of neuropsychological tests are available. In some of these tests, linguistic processing - both oral and written - is an important factor. Language disturbances might serve as a strong indicator for an underlying neurodegenerative disorder like AD. However, the current diagnostic instruments for language assessment mainly focus on product measures, such as text length or number of errors, ignoring the importance of the process that leads to written or spoken language production. In this study, our aim is to describe and test differences between cognitively healthy and impaired elderly on the basis of a selection of writing process variables (inter- and intrapersonal characteristics). These process variables are mainly related to pause times, because the number, length, and location of pauses have proven to be an important indicator of the cognitive complexity of a process. Method: Participants enrolled in our research were chosen on the basis of a number of basic criteria necessary to collect reliable writing process data. Furthermore, we opted to match the thirteen cognitively impaired patients (8 MCI and 5 AD) with thirteen cognitively healthy elderly. At the start of the experiment, participants were each given a number of tests, such as the Mini-Mental State Examination (MMSE), the Geriatric Depression Scale (GDS), the forward and backward digit span, and the Edinburgh Handedness Inventory (EHI). Also, a questionnaire was used to collect socio-demographic information (age, gender, education) on the subjects as well as more details on their level of computer literacy. The tests and questionnaire were followed by two typing tasks and two picture description tasks. For the typing tasks participants had to copy (type) characters, words and sentences from a screen, whereas the picture description tasks each consisted of an image they had to describe in a few sentences. Both the typing and the picture description tasks were logged with Inputlog, a keystroke logging tool that allows us to log and time-stamp keystroke activity to reconstruct and describe text production processes. The main rationale behind keystroke logging is that writing fluency and flow reveal traces of the underlying cognitive processes. This explains the analytical focus on pause (length, number, distribution, location, etc.) and revision (number, type, operation, embeddedness, location, etc.) characteristics. As in speech, pause times are seen as indexical of cognitive effort. Results: Preliminary analysis already showed some promising results concerning pause times before, within and after words. For all variables, mixed effects models were used that included participants as a random effect and MMSE scores, GDS scores and word categories (such as determiners and nouns) as fixed effects. For pause times before and after words, cognitively impaired patients paused longer than healthy elderly. These variables did not show an interaction effect between the group participants belonged to (cognitively impaired or healthy elderly) and word category. However, pause times within words did show an interaction effect, which indicates that pause times within certain word categories differ significantly between patients and healthy elderly. Keywords: Alzheimer's disease, keystroke logging, matching, writing process
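A minimal sketch of the kind of mixed-effects model described (a random intercept per participant, with group and word category as fixed effects) is shown below, assuming a long-format pause-time table and the statsmodels library; the column names and values are hypothetical, not the study data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format keystroke data: one row per pause observation.
df = pd.DataFrame({
    "participant":   ["p01", "p01", "p02", "p02", "p03", "p03"] * 5,
    "group":         ["impaired", "impaired", "healthy", "healthy", "impaired", "impaired"] * 5,
    "word_category": ["noun", "determiner", "noun", "determiner", "noun", "determiner"] * 5,
    "pause_ms":      [850.0, 420.0, 510.0, 330.0, 940.0, 460.0] * 5,
})
rng = np.random.default_rng(0)
df["pause_ms"] = df["pause_ms"] + rng.normal(0, 50, len(df))  # add jitter to the toy data

# Pause time modelled with a group x word-category interaction as fixed effects
# and a random intercept per participant.
model = smf.mixedlm("pause_ms ~ group * word_category", df, groups=df["participant"])
result = model.fit()
print(result.summary())
```

A significant interaction term in such a model corresponds to the finding that within-word pause differences between groups depend on the word category.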
Procedia PDF Downloads 36672 Innovative Practices That Have Significantly Scaled up Depot Medroxy Progesterone Acetate-SC Self-Inject Services
Authors: Oluwaseun Adeleke, Samuel O. Ikani, Fidelis Edet, Anthony Nwala, Mopelola Raji, Simeon Christian Chukwu
Abstract:
Background: The Delivering Innovations in Selfcare (DISC) project promotes universal access to quality selfcare services, beginning with the subcutaneous depot medroxyprogesterone acetate (DMPA-SC) contraceptive self-injection (SI) option. Self-injection offers women a highly effective and convenient option that saves them frequent trips to providers. Its increased use has the potential to improve the efficiency of an overstretched healthcare system by reducing provider workloads. State Social and Behavioral Change Communications (SBCC) Officers lead the project's demand creation and service delivery innovations, which have resulted in significant increases in SI uptake among women who opt for injectables. Strategies - Service Delivery Innovations: The implementation of the "Moment of Truth (MoT)" innovation helped providers overcome biases and address clients' fear and reluctance to self-inject. Bi-annual program audits and supportive mentoring visits helped providers retain their competence and motivation. Proper documentation, tracking, and replenishment of commodities were ensured through effective engagement with State Logistics Units. The project supported existing state monitoring and evaluation structures to effectively record and report DMPA-SC service utilization. Demand Creation Innovations: SBCC Officers provide oversight, routinely evaluate performance, train, and provide feedback for the demand creation activities implemented by community mobilizers (CMs). The scope and intensity of training given to CMs affect the outcome of their work. The project operates a demand creation model that uses a schedule to inform the conduct of interpersonal and group events. Health education sessions are specifically designed to counter misinformation, address questions and concerns, and educate the target audience in an informed-choice context. The project mapped facilities and their catchment areas and secured the buy-in of identified influencers and gatekeepers prior to entry. Each mobilization event began with pre-mobilization sensitization activities, particularly targeting male groups. Context-specific interventions were informed by the religious, traditional, and cultural peculiarities of target communities. Mobilizers also support clients to engage with and navigate online digital family planning (FP) portals such as the DiscoverYourPower website, Facebook page, digital companion (chatbot), interactive voice response (IVR), and radio and television (TV) messaging. This improves compliance and provides linkages to nearby facilities. Results: The project recorded 136,950 self-injection visits and an SI proportion rate that increased from 13 percent before the implementation of interventions in 2021 to 62 percent currently. The project cost-effectively demonstrated catalytic impact by leveraging state and partner resources, institutional platforms, and geographic scope to sustainably scale up these strategies. Conclusion: Using evidence-informed iterations of service delivery and demand creation models has been useful in significantly driving SI uptake. It will be useful to consider this implementation model during program design. 
Consideration should also be given to the systematic and strategic execution of these strategies to optimize impact. Keywords: family planning, contraception, DMPA-SC, self-care, self-injection, innovation, service delivery, demand creation
Procedia PDF Downloads 7571 Climate Safe House: A Community Housing Project Tackling Catastrophic Sea Level Rise in Coastal Communities
Authors: Chris Fersterer, Col Fay, Tobias Danielmeier, Kat Achterberg, Scott Willis
Abstract:
New Zealand, an island nation, has an extensive coastline peppered with small communities of iconic buildings known as bachs. Post WWII, these modest buildings were constructed by their owners as retreats; they were generally small and low cost, often used recycled materials, and often fell below currently acceptable building standards. In the latter part of the 20th century, real estate prices in many of these communities remained low, and these areas became permanent residences for people attracted to this affordable lifestyle choice. The Blueskin Resilient Communities Trust (BRCT) is an organisation that recognises the vulnerability of communities in low-lying settlements, now prone to the increased flood threat brought about by climate change and sea level rise. Some of the inhabitants of Blueskin Bay, Otago, NZ have already found their properties to be uninsurable because of the increased frequency of flood events, and property values have slumped accordingly. Territorial authorities also acknowledge this increased risk and have created additional compliance measures for new buildings that are less than 2 m above tidal peaks. Community resilience becomes an additional concern where inhabitants are attracted to a lifestyle associated with a specific location and its people, when this lifestyle cannot be met in a suburban or city context. Traditional models of social housing fail to provide the sense of community connectedness and identity enjoyed by the current residents of Blueskin Bay. BRCT has partnered with the Otago Polytechnic Design School to design a new form of community housing that can react to this environmental change. It is a longitudinal project incorporating participatory approaches as a means of getting people 'on board', to understand complex systems and co-develop solutions. In the first period, they are seeking industry support and funding to develop a transportable and fully self-contained housing model that exploits current technologies. BRCT also hopes that the building will become an educational tool to highlight the climate change issues facing us today. This paper uses the Climate Safe House (CSH) as a case study for education in architectural sustainability through experiential learning offered as part of Otago Polytechnic's Bachelor of Design. Students engage with the project through research methodologies, including site surveys, resident interviews, data sourced from government agencies, and physical modelling. The process involves collaboration across design disciplines, including product and interior design, but also includes connections with industry, both within the education institution and with stakeholder industries introduced through BRCT. This project offers a rich learning environment where students become engaged through project-based learning within a community of practice, including architecture, construction, energy and other related fields. The design outcomes are expressed in a series of public exhibitions and forums where community input is sought in a truly participatory process. Keywords: community resilience, problem based learning, project based learning, case study
Procedia PDF Downloads 28870 Use of Machine Learning Algorithms to Pediatric MR Images for Tumor Classification
Authors: I. Stathopoulos, V. Syrgiamiotis, E. Karavasilis, A. Ploussi, I. Nikas, C. Hatzigiorgi, K. Platoni, E. P. Efstathopoulos
Abstract:
Introduction: Brain and central nervous system (CNS) tumors form the second most common group of cancer in children, accounting for 30% of all childhood cancers. MRI is the key imaging technique used for the visualization and management of pediatric brain tumors. Initial characterization of tumors from MRI scans is usually performed via a radiologist's visual assessment. However, different brain tumor types do not always demonstrate clear differences in visual appearance. Using only conventional MRI to provide a definite diagnosis could potentially lead to inaccurate results, so histopathological examination of biopsy samples is currently considered the gold standard for obtaining definite diagnoses. Machine learning is defined as the study of computational algorithms that can use mathematical relationships and patterns, complex or not, from empirical and scientific data to make reliable decisions. Given the above, machine learning techniques could provide effective and accurate ways to automate and speed up the analysis and diagnosis of medical images. Machine learning applications in radiology are, or could potentially be, useful in practice for medical image segmentation and registration, computer-aided detection and diagnosis systems for CT, MR or radiography images, and functional MR (fMRI) images for brain activity analysis and neurological disease diagnosis. Purpose: The objective of this study is to provide an automated tool, which may assist in the imaging evaluation and classification of brain neoplasms in pediatric patients by determining the glioma type and grade and differentiating between different brain tissue types. Moreover, a future purpose is to present an alternative way of quick and accurate diagnosis in order to save time and resources in the daily medical workflow. Materials and Methods: A cohort of 80 pediatric patients with a diagnosis of posterior fossa tumor was used: 20 ependymomas, 20 astrocytomas, 20 medulloblastomas and 20 healthy children. The MR sequences used for every patient were the following: axial T1-weighted (T1), axial T2-weighted (T2), Fluid-Attenuated Inversion Recovery (FLAIR), axial diffusion-weighted images (DWI), and axial contrast-enhanced T1-weighted (T1ce). From every sequence only a principal slice was used, manually traced by two expert radiologists. Image acquisition was carried out on a GE HDxt 1.5-T scanner. The images were preprocessed following a number of steps, including noise reduction, bias-field correction, thresholding, co-registration of all sequences (T1, T2, T1ce, FLAIR, DWI), skull stripping, and histogram matching. A large number of features were chosen for investigation, including age, tumor shape characteristics, image intensity characteristics and texture features. After selecting the features that achieve the highest accuracy using the least number of variables, four machine learning classification algorithms were used: k-Nearest Neighbour, Support Vector Machines, C4.5 Decision Tree and Convolutional Neural Network. The machine learning schemes and the image analysis are implemented in the WEKA and MATLAB platforms, respectively. Results-Conclusions: The results and the accuracy of image classification for each type of glioma by the four different algorithms are still in progress. Keywords: image classification, machine learning algorithms, pediatric MRI, pediatric oncology
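The study implements its classifiers in WEKA and its image analysis in MATLAB; purely as an illustrative sketch of the same classification step (not the authors' pipeline), comparable models can be cross-validated on an extracted-feature table with scikit-learn. The feature matrix and labels below are synthetic placeholders for the selected shape, intensity and texture features.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Hypothetical feature matrix: rows = patients, columns = selected features
# (age, tumor shape, intensity and texture descriptors); labels = tissue class.
rng = np.random.default_rng(42)
X = rng.normal(size=(80, 12))
y = rng.integers(0, 4, size=80)  # 0: ependymoma, 1: astrocytoma, 2: medulloblastoma, 3: healthy

models = {
    "k-Nearest Neighbour": KNeighborsClassifier(n_neighbors=5),
    "Support Vector Machine": SVC(kernel="rbf"),
    "Decision tree": DecisionTreeClassifier(max_depth=5),
}

for name, clf in models.items():
    # Standardize features before each classifier, then estimate accuracy by 5-fold CV.
    pipeline = make_pipeline(StandardScaler(), clf)
    scores = cross_val_score(pipeline, X, y, cv=5)
    print(f"{name}: mean CV accuracy = {scores.mean():.2f}")
```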
Procedia PDF Downloads 14969 Particle Size Characteristics of Aerosol Jets Produced by a Low Powered E-Cigarette
Authors: Mohammad Shajid Rahman, Tarik Kaya, Edgar Matida
Abstract:
Electronic cigarettes, also known as e-cigarettes, may have become a tool to improve smoking cessation due to their ability to provide nicotine at a selected rate. Unlike traditional cigarettes, which produce toxic elements from tobacco combustion, e-cigarettes generate aerosols by heating a liquid solution (commonly a mixture of propylene glycol, vegetable glycerin, nicotine and some flavoring agents). However, caution still needs to be taken when using e-cigarettes due to the presence of addictive nicotine and some harmful substances produced by the heating process. The particle size distribution (PSD) and associated velocities generated by e-cigarettes have a significant influence on aerosol deposition in different regions of the human respiratory tract. On another note, low actuation power is beneficial in aerosol-generating devices, since it results in reduced emission of toxic chemicals. In the case of e-cigarettes, low heating powers can be considered as powers lower than 10 W, compared with the wide range of powers (0.6 to 70.0 W) studied in the literature. Given its importance for inhalation risk reduction, a deeper understanding of the particle size characteristics of e-cigarettes demands thorough investigation. However, a comprehensive study of the PSD and velocities of e-cigarettes under standard testing conditions at relatively low heating powers is still lacking. The present study aims to measure the particle number count and size distribution of undiluted aerosols of a recent fourth-generation e-cigarette at low powers, within 6.5 W, using a real-time particle counter (time-of-flight method). Also, the temporal and spatial evolution of the particle size and velocity distribution of aerosol jets is examined using the phase Doppler anemometry (PDA) technique. To the authors' best knowledge, the application of PDA to e-cigarette aerosol measurement is rarely reported. In the present study, preliminary results on the particle number count of undiluted aerosols measured by the time-of-flight method showed that an increase of heating power from 3.5 W to 6.5 W resulted in enhanced asymmetry in the PSD, deviating from a log-normal distribution. This can be considered an artifact of the rapid vaporization, condensation and coagulation processes acting on the aerosols at higher heating power. A novel mathematical expression combining exponential, Gaussian and polynomial (EGP) distributions was proposed to describe the asymmetric PSD successfully. The count median aerodynamic diameter and geometric standard deviation lay within ranges of about 0.67 μm to 0.73 μm and 1.32 to 1.43, respectively, as the power varied from 3.5 W to 6.5 W. Laser Doppler velocimetry (LDV) and PDA measurements suggested a typical centerline streamwise mean velocity decay of the aerosol jet along with a reduction in particle size. In the final submission, a thorough literature review, a detailed description of the experimental procedure and a discussion of the results will be provided. The particle size and turbulent characteristics of aerosol jets will be further examined by analyzing the arithmetic mean diameter, volumetric mean diameter, volume-based mean diameter, streamwise mean velocity and turbulence intensity. The present study has potential implications for PSD simulation and the validation of aerosol dosimetry models, leading to improvements in related aerosol-generating devices. Keywords: E-cigarette aerosol, laser Doppler velocimetry, particle size distribution, particle velocity, phase Doppler anemometry
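The count median aerodynamic diameter (CMAD) and geometric standard deviation (GSD) quoted above follow standard aerosol-statistics definitions; the sketch below shows one way to estimate them from binned count data, approximating the median by the count-weighted geometric mean (exact for a log-normal PSD). The bin values are illustrative, not the measured data.

```python
import numpy as np

# Illustrative binned PSD: bin-centre aerodynamic diameters (um) and particle counts.
diameters = np.array([0.3, 0.5, 0.7, 0.9, 1.1, 1.5, 2.0])
counts    = np.array([ 40, 180, 260, 150,  60,  20,   5])

log_d = np.log(diameters)

# CMAD approximated by the count-weighted geometric mean of the bin diameters;
# a stricter estimate would interpolate the cumulative count distribution at 50%.
cmad = np.exp(np.average(log_d, weights=counts))

# GSD from the count-weighted spread of log-diameters about log(CMAD).
gsd = np.exp(np.sqrt(np.average((log_d - np.log(cmad)) ** 2, weights=counts)))

print(f"CMAD ~ {cmad:.2f} um, GSD ~ {gsd:.2f}")
```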
Procedia PDF Downloads 4968 Generating Individualized Wildfire Risk Assessments Utilizing Multispectral Imagery and Geospatial Artificial Intelligence
Authors: Gus Calderon, Richard McCreight, Tammy Schwartz
Abstract:
Forensic analysis of community wildfire destruction in California has shown that reducing or removing flammable vegetation in proximity to buildings and structures is one of the most important wildfire defenses available to homeowners. State laws specify the requirements for homeowners to create and maintain defensible space around all structures. Unfortunately, this decades-long effort has had limited success due to noncompliance and minimal enforcement. As a result, vulnerable communities continue to experience escalating human and economic costs along the wildland-urban interface (WUI). Quantifying vegetative fuels at both the community and parcel scale requires detailed imaging from an aircraft with remote sensing technology to reduce uncertainty. FireWatch has been delivering high spatial resolution (5” ground sample distance) wildfire hazard maps annually to the community of Rancho Santa Fe, CA, since 2019. FireWatch uses a multispectral imaging system mounted onboard an aircraft to create georeferenced orthomosaics and spectral vegetation index maps. Using proprietary algorithms, the vegetation type, condition, and proximity to structures are determined for 1,851 properties in the community. Secondary data processing combines object-based classification of vegetative fuels, assisted by machine learning, to prioritize mitigation strategies within the community. The remote sensing data for the 10 sq. mi. community is divided into parcels and sent to all homeowners in the form of defensible space maps and reports. Follow-up aerial surveys are performed annually using repeat station imaging of fixed GPS locations to address changes in defensible space, vegetation fuel cover, and condition over time. These maps and reports have increased wildfire awareness and mitigation efforts from 40% to over 85% among homeowners in Rancho Santa Fe. To assist homeowners fighting increasing insurance premiums and non-renewals, FireWatch has partnered with Black Swan Analytics, LLC, to leverage the multispectral imagery and increase homeowners’ understanding of wildfire risk drivers. For this study, a subsample of 100 parcels was selected to gain a comprehensive understanding of wildfire risk and the elements that can be mitigated. Geospatial data from FireWatch’s defensible space maps was combined with Black Swan’s patented approach using 39 other risk characteristics into a 4score Report. The 4score Report helps property owners understand risk sources and potential mitigation opportunities by assessing four categories of risk: fuel sources, ignition sources, susceptibility to loss, and hazards to fire protection efforts (FISH). This study has shown that susceptibility to loss is the category on which residents and property owners must focus their efforts. The 4score Report also provides a tool to measure the impact of homeowner actions on risk levels over time. Resiliency is the only solution to breaking the cycle of community wildfire destruction, and it starts with high-quality data and education. Keywords: defensible space, geospatial data, multispectral imaging, Rancho Santa Fe, susceptibility to loss, wildfire risk.
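The abstract refers to spectral vegetation index maps without naming a specific index; purely as an illustrative sketch, a common index such as NDVI can be computed per pixel from the red and near-infrared bands of a multispectral orthomosaic. The band arrays below are toy values, not FireWatch data.

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index, (NIR - Red) / (NIR + Red),
    computed per pixel; values approaching 1 indicate dense green vegetation."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    denom = nir + red
    out = np.zeros_like(denom)
    # Avoid division by zero where both bands are zero (e.g. no-data pixels).
    np.divide(nir - red, denom, out=out, where=denom != 0)
    return out

# Toy 2x2 reflectance bands; real inputs would be whole georeferenced orthomosaics.
nir_band = np.array([[0.60, 0.55], [0.20, 0.05]])
red_band = np.array([[0.10, 0.12], [0.18, 0.05]])
print(ndvi(nir_band, red_band))
```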
Procedia PDF Downloads 10867 Service Blueprinting: A New Application for Evaluating Service Provision in the Hospice Sector
Authors: L. Sudbury-Riley, P. Hunter-Jones, L. Menzies, M. Pyrah, H. Knight
Abstract:
Just as manufacturing firms aim for zero defects, service providers strive to avoid service failures where customer expectations are not met. However, because services comprise unique human interactions, service failures are almost inevitable. Consequently, firms focus on service recovery strategies to fix problems and retain their customers for the future. Because a hospice offers care to terminally ill patients, it may not get the opportunity to correct a service failure. This situation makes the identification of what hospice users really need and want, and the ascertainment of perceptions of the hospice’s service delivery from the user’s perspective, even more important than for other service providers. A well-documented and fundamental barrier to improving end-of-life care is a lack of service quality measurement tools that capture users’ experiences from their own perspective. In palliative care, many quantitative measures are used, and these focus on issues such as how quickly patients are assessed, whether they receive information leaflets, whether a discussion about their emotional needs is documented, and so on. Consequently, quality of service from the user’s perspective is overlooked. The current study was designed to overcome these limitations by adapting service blueprinting - never before used in the hospice sector - in order to undertake a ‘deep dive’ to examine the impact of hospice services upon different users. Service blueprinting is a customer-focused approach for service innovation and improvement, where the ‘onstage’ visible service user and provider interactions must be supported by the ‘backstage’ employee actions and support processes. The study was conducted in conjunction with East Cheshire Hospice in England. The Hospice provides specialist palliative care for patients with progressive life-limiting illnesses, offering services to patients, carers and families via inpatient and outpatient units. Using service blueprinting to identify every service touchpoint, in-depth qualitative interviews with 38 in-patients, outpatients, visitors and bereaved families enabled a ‘deep dive’ to uncover perceptions of the whole service experience among these diverse users. Interviews were recorded and transcribed, and thematic analysis of over 104,000 words of data revealed many excellent aspects of the Hospice’s service. Staff frequently exceed people’s expectations. Striking, gratifying comparisons to hospitals emerged. The Hospice makes people feel safe. Nevertheless, the technique uncovered many areas for improvement, including the serendipity of referral processes, the need for better communication with external agencies, improvements to the daunting arrival and admissions process, a desperate need for more depression counselling, clarity of communication pertaining to the actual end of life, and shortcomings in systems dealing with bereaved families. The study reveals that the adapted service blueprinting tool has major advantages over alternative quantitative evaluation techniques, including uncovering the complex nature of service users’ experiences in health-care service systems, highlighting more fully the interconnected configurations within the system, and making greater sense of the impact of the service upon different service users. Unlike other tools, this in-depth examination reveals areas for improvement, many of which have already been implemented by the Hospice. 
The technique has the potential to improve experiences of palliative and end-of-life care among patients and their families. Keywords: hospices, end-of-life-care, service blueprinting, service delivery
Procedia PDF Downloads 193