Search results for: increased ecosystem persistence
108 Benzenepropanamine Analogues as Non-detergent Microbicidal Spermicide for Effective Pre-exposure Prophylaxis
Authors: Veenu Bala, Yashpal S. Chhonker, Bhavana Kushwaha, Rabi S. Bhatta, Gopal Gupta, Vishnu L. Sharma
Abstract:
According to UNAIDS 2013 estimates, nearly 52% of all individuals living with HIV are now women of reproductive age (15–44 years). Seventy-five percent of HIV acquisitions occur through heterosexual contact and sexually transmitted infections (STIs), attributable to unsafe sexual behaviour. Each year, an estimated 500 million people acquire at least one of four STIs: chlamydia, gonorrhoea, syphilis and trichomoniasis. Trichomonas vaginalis (TV) is exclusively sexually transmitted in adults, accounting for 30% of STI cases and associated with pelvic inflammatory disease (PID), vaginitis and pregnancy complications in women. TV infection impairs the vaginal milieu, eventually favouring HIV transmission. In the absence of an effective prophylactic HIV vaccine, prevention of new infections has become a priority. It was thought worthwhile to integrate HIV prevention and reproductive health services, including protection against unintended pregnancy, for women, as both are related to unprotected sex. Initially, nonoxynol-9 (N-9) had been proposed as a spermicidal agent with microbicidal activity but, on the contrary, it increased HIV susceptibility due to its surfactant action. Thus, to address the urgent need for novel, woman-controlled, non-detergent microbicidal spermicides, benzenepropanamine analogues were synthesized. First, five benzenepropanamine–dithiocarbamate hybrids were synthesized and evaluated for their spermicidal, anti-Trichomonas and antifungal activities, along with safety profiling against cervicovaginal cells. To further broaden the scope of the above study, benzenepropanamine was hybridized with thiourea so as to introduce anti-HIV potential. The synthesized hybrid molecules were evaluated for their reverse transcriptase (RT) inhibition and spermicidal, anti-Trichomonas and antimicrobial activities, as well as their safety against vaginal flora and cervical cells.
Simulated vaginal fluid (SVF) stability and the pharmacokinetics of the most potent compound versus N-9 were examined in female New Zealand (NZ) rabbits to observe absorption into systemic circulation and subsequent exposure in blood plasma through the vaginal wall. The study yielded a most promising compound, N-butyl-4-(3-oxo-3-phenylpropyl)piperazine-1-carbothioamide (29), exhibiting a better activity profile than N-9: it showed RT inhibition (72.30%), anti-Trichomonas activity (MIC, 46.72 µM against the MTZ-susceptible and MIC, 187.68 µM against the resistant strain), spermicidal activity (MEC, 0.01%) and antifungal activity (MIC, 3.12–50 µg/mL) against four fungal strains. High safety against vaginal epithelium (HeLa cells), compatibility with vaginal flora (Lactobacillus), SVF stability and minimal vaginal absorption supported its suitability for topical vaginal application. A docking study was performed to gain insight into the binding mode and interactions of the most promising compound (29) with HIV-1 reverse transcriptase; it revealed that compound (29) interacted with HIV-1 RT similarly to the standard drug nevirapine. It may be concluded that hybridization of the benzenepropanamine and thiourea moieties resulted in a novel lead with multiple activities, including RT inhibition. Further lead optimization may yield effective vaginal microbicides with spermicidal, anti-Trichomonas, antifungal and anti-HIV potential altogether, with enhanced safety to cervicovaginal cells in comparison to nonoxynol-9.
Keywords: microbicidal, nonoxynol-9, reverse transcriptase, spermicide
107 Navigating Rapids and Collecting Medical Insights: A Data Collection of Athletes Presenting to the Medical Team at the International Canoe Federation Canoe Slalom World Championships 2023
Authors: Grace Scaplehorn, Muhammad Adeel Akhtar, Jane Gibson
Abstract:
Background: Canoe Slalom entails the skilful navigation of a carbon composite canoe or kayak through a series of 18–25 hanging gates, strategically positioned along the course, either upstream or downstream, amidst currents of whitewater rapids in natural and man-made river settings. Athletes compete individually in timed trials, racing for the fastest course time, typically around 80 to 120 seconds. In the new discipline of Kayak Cross, descents of the course are initiated by groups of four athletes free-falling simultaneously from a starting platform situated 3 m above the river. Kayak Cross athletes, in contrast to Canoe Slalom, can make physical contact with suspended gates without incurring time penalties and are required to perform a kayak roll halfway down the course. The Canoe Slalom World Championships were held at Lee Valley Whitewater Centre, London, from 19th to 24th September 2023. The event comprised 299 international athletes competing for 10 World Championship titles in Canoe/Kayak Slalom events (Olympic debut Munich 1972) and the new Kayak Cross discipline (Olympic debut Paris 2024). Kayak Cross first appeared at the World Championships in 2017, in Pau, France. There is limited literature on Kayak Cross and the incidence of athlete injuries compared to traditional Canoe Slalom, hence it was felt important to undertake this review to address the perception that the event is dangerous. Aim: The study aimed to quantify and collate data collected from athletes presenting to the event medical centre. Methods: Athletes' details were collected at initial assessments from the start of the practice period (16th–18th September) and throughout the event. Demographics such as age, sex and nationality were recorded along with presenting complaints, treatment, medication administered and outcome. Injuries were then sub-classified into body regions.
The data do not include athletes who sought medical attention from their own governing body's medical team. Results: During the 8-day period, there were 11 individual presentations to the medical centre, 3.7% of the athlete population (n=299). The mean age was 23.9 years (n=7), and 6 were male (n=10). The most common presentation was minor injury (n=9), 6 being musculoskeletal and 3 comprising skin damage, followed by insect sting/allergy (n=1) and requests for pain relief (n=1). Five presentations were event-related, all musculoskeletal injuries: 2 shoulder/arm, 1 head/neck, 1 hand/wrist and 1 other (data not recorded). Of these injuries, the only intervention was 2 doses of 400 mg ibuprofen, given for both shoulder/arm injuries. Four of the 11 presentations were pre-existing injuries exacerbated by the increased intensity of practice. Two patients were advised to return for review, with 100% compliance. There were no unplanned re-presentations and no emergency transfers to secondary care. The Kayak Cross and Canoe Slalom competitions each resulted in 1 new event-related athlete presentation. Conclusion: The event resulted in a negligible incidence of presentations at the medical centre for both Kayak Cross and Canoe Slalom. These data hold significance in informing the risk assessments and medical protocols necessary for the organisation of canoe slalom events.
Keywords: canoe slalom, kayak cross, athlete injuries, event injuries
106 An Integrated Real-Time Hydrodynamic and Coastal Risk Assessment Model
Authors: M. Reza Hashemi, Chris Small, Scott Hayward
Abstract:
The Northeast Coast of the US faces the damaging effects of coastal flooding and winds due to Atlantic tropical and extratropical storms each year. Historically, several large storm events have produced substantial damage in the region, the most notable of which were the Great Atlantic Hurricane of 1938, Hurricane Carol, Hurricane Bob, and, recently, Hurricane Sandy (2012). The objective of this study was to develop an integrated modeling system that could be used as a forecasting/hindcasting tool to evaluate and communicate the risk coastal communities face from these storms. This modeling system utilizes the ADvanced CIRCulation (ADCIRC) model for storm surge predictions and the Simulating WAves Nearshore (SWAN) model for the wave environment. These models were coupled, passing information to each other and computing over the same unstructured domain, allowing for the most accurate representation of the physical storm processes. The coupled SWAN-ADCIRC model was validated and has been set up to perform real-time forecast (as well as hindcast) simulations. Modeled storm parameters were then passed to a coastal risk assessment tool. This tool, which is generic and universally applicable, generates spatial structural damage estimate maps on an individual-structure basis for an area of interest. The required inputs for the coastal risk model included detailed information about the individual structures, inundation levels, and wave heights for the selected region. Additionally, calculation of wind damage to structures was incorporated. The integrated coastal risk assessment system was then tested and applied to Charlestown, a small, vulnerable coastal town along the southern shore of Rhode Island. The modeling system was applied to Hurricane Sandy and a synthetic storm. In both storm cases, the effect of natural dunes on coastal risk was investigated.
The resulting damage maps for Charlestown clearly showed that the dune-eroded scenarios affected more structures and increased the estimated damage. The system was also tested in forecast mode for a large Nor'easter, Stella (March 2017). The results showed good performance of the coupled model in forecast mode when compared to observations. Finally, the nearshore model XBeach was nested within the regional grid (ADCIRC-SWAN) to simulate nearshore sediment transport processes and coastal erosion. Hurricane Irene (2011) was used to validate XBeach, on the basis of a unique beach profile dataset for the region. XBeach showed relatively good performance, estimating eroded volumes along the beach transects with a mean error of 16%. The validated model was then used to analyze the effectiveness of several erosion mitigation methods recommended in a recent study of coastal erosion in New England: beach nourishment, a coastal bank (engineered core), and a submerged breakwater as well as an artificial surfing reef. It was shown that beach nourishment and coastal banks perform best in mitigating shoreline retreat and coastal erosion.
Keywords: ADCIRC, coastal flooding, storm surge, coastal risk assessment, living shorelines
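The per-structure damage mapping described above can be illustrated with a minimal sketch: given a modelled water surface elevation, the flood depth at each structure is interpolated against a depth-damage curve. The curve values, structure records, and surge level below are hypothetical placeholders, not data from the Charlestown study.

```python
# Minimal per-structure flood damage estimate from inundation depth.
# The depth-damage curve and structure inventory are hypothetical.

DEPTH_DAMAGE = [(0.0, 0.0), (0.5, 0.1), (1.0, 0.3), (2.0, 0.6), (3.0, 0.9)]

def damage_fraction(depth_m):
    """Piecewise-linear interpolation of damage fraction (0-1) vs depth (m)."""
    if depth_m <= DEPTH_DAMAGE[0][0]:
        return DEPTH_DAMAGE[0][1]
    for (d0, f0), (d1, f1) in zip(DEPTH_DAMAGE, DEPTH_DAMAGE[1:]):
        if depth_m <= d1:
            return f0 + (f1 - f0) * (depth_m - d0) / (d1 - d0)
    return DEPTH_DAMAGE[-1][1]  # cap at the last curve point

# Hypothetical structures: ground elevation and replacement value.
structures = [
    {"id": "A1", "ground_elev_m": 1.2, "value_usd": 250_000},
    {"id": "A2", "ground_elev_m": 2.5, "value_usd": 300_000},
]
surge_m = 2.7  # modelled water surface elevation from the surge model

for s in structures:
    depth = max(0.0, surge_m - s["ground_elev_m"])
    print(s["id"], round(damage_fraction(depth) * s["value_usd"]))
```

A production tool such as the one described would add wave-height and wind damage terms and structure-class-specific curves; the sketch shows only the inundation component.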
105 Methodology for Temporary Analysis of Production and Logistic Systems on the Basis of Distance Data
Authors: M. Mueller, M. Kuehn, M. Voelker
Abstract:
In small and medium-sized enterprises (SMEs), the challenge is to create a well-grounded and reliable basis for process analysis, optimization and planning despite a lack of data. SMEs have limited access to methods with which they can effectively and efficiently analyse processes and identify cause-and-effect relationships in order to generate the necessary database and derive optimization potential from it. The implementation of digitalization within the framework of Industry 4.0 thus becomes a particular necessity for SMEs. For these reasons, this abstract presents an analysis methodology whose objective is an SME-appropriate approach to efficient, temporarily feasible data collection and evaluation in flexible production and logistics systems, as a basis for process analysis and optimization. The overall methodology focuses on retrospective, event-based tracing and analysis of material flow objects. The technological basis consists of Bluetooth Low Energy (BLE) transmitters, so-called beacons, and smart mobile devices (SMDs), e.g. smartphones, as receivers; distance data can be measured between them and motion profiles derived. The distance is determined using the Received Signal Strength Indicator (RSSI), a measure of signal field strength between transmitter and receiver. The focus is the development of a software-based methodology for interpreting relative movements of transmitters and receivers based on distance data. The main research concerns the selection and implementation of pattern recognition methods for automatic process recognition, as well as methods for the visualization of relative distance data. Because the database is already categorized by process type, classification methods from the field of supervised learning (e.g. Support Vector Machines) are used.
The necessary data quality requires the selection of suitable methods as well as filters for smoothing signal variations in the RSSI, the integration of methods for determining correction factors that depend on possible sources of signal interference (columns, pallets), and the configuration of the technology used. The parameter settings on which the respective algorithms are based have a further significant influence on the result quality of the classification methods, correction models and methods for visualizing the position profiles. Studies have already shown that the accuracy of classification algorithms can be improved by up to 30% through selected parameter variation; similar potential can be observed with parameter variation of methods and filters for signal smoothing. Thus, there is increased interest in obtaining detailed results on the influence of parameter and factor combinations on data quality in this area. The overall methodology is realized with a modular software architecture consisting of independent modules for data acquisition, data preparation and data storage. The demonstrator for initialization and data acquisition is available as a mobile Java-based application. Data preparation, including methods for signal smoothing, is Python-based, with the possibility of varying parameter settings and storing them in the database (SQLite). The evaluation is divided into two separate software modules with database connections: the automated assignment of defined process classes to distance data using selected classification algorithms, and visualization and reporting in a graphical user interface (GUI).
Keywords: event-based tracing, machine learning, process classification, parameter settings, RSSI, signal smoothing
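As a rough sketch of the distance derivation and smoothing the methodology describes, RSSI can be converted to a distance estimate with the log-distance path-loss model and damped with a moving-average filter. The reference RSSI at 1 m and the path-loss exponent below are illustrative assumptions, not calibrated values from the study.

```python
# Sketch: RSSI -> distance via the log-distance path-loss model,
# plus a trailing moving average to damp RSSI fluctuations.
# rssi_ref_dbm (RSSI at 1 m) and path_loss_exp are assumed values.

def rssi_to_distance(rssi_dbm, rssi_ref_dbm=-59.0, path_loss_exp=2.0):
    """Estimated transmitter-receiver distance in metres."""
    return 10 ** ((rssi_ref_dbm - rssi_dbm) / (10 * path_loss_exp))

def moving_average(samples, window=3):
    """Trailing moving average over up to `window` previous samples."""
    out = []
    for i in range(len(samples)):
        lo = max(0, i - window + 1)
        out.append(sum(samples[lo:i + 1]) / (i + 1 - lo))
    return out

raw = [-60.0, -64.0, -59.0, -72.0, -70.0]     # example RSSI readings (dBm)
distances = [rssi_to_distance(r) for r in moving_average(raw)]
```

In the actual pipeline, the smoothed distance series would additionally be corrected for interference sources and fed to the classifier; the filter window and path-loss exponent are exactly the kind of parameter settings whose variation the abstract discusses.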
104 Date Palm Wastes Turning into Biochars for Phosphorus Recovery from Aqueous Solutions: Static and Dynamic Investigations
Authors: Salah Jellali, Nusiba Suliman, Yassine Charabi, Jamal Al-Sabahi, Ahmed Al Raeesi, Malik Al-Wardy, Mejdi Jeguirim
Abstract:
Huge amounts of agricultural biomass are produced worldwide. At the same time, large quantities of phosphorus are discharged annually into water bodies, with possibly serious effects on environmental quality. The main objective of this work is to turn a local Omani biomass (date palm frond wastes: DPFW) into an effective material for phosphorus recovery from aqueous solutions, and to reuse this P-loaded material in agriculture as an eco-friendly amendment. To this end, the raw DPFW were first impregnated for 24 h with separate 1 M salt solutions of CaCl₂, MgCl₂, FeCl₃, AlCl₃, and a mixture of MgCl₂/AlCl₃, and then pyrolyzed under N₂ flow at 500 °C for 2 hours in an adapted tubular furnace (Carbolite, UK). The synthesized biochars were characterized in depth with respect to their morphology, structure, texture, and surface chemistry. These analyses included scanning electron microscopy (SEM) coupled with energy-dispersive X-ray spectrometry (EDS), X-ray diffraction (XRD), Fourier-transform infrared spectroscopy (FTIR), gas sorption analysis, and X-ray fluorescence (XRF). Their efficiency in recovering phosphorus was then investigated in batch mode for various contact times (1 min to 3 h), aqueous pH values (from 3 to 11), initial phosphorus concentrations (10–100 mg/L), and in the presence of anions (nitrates, sulfates, and chlorides). In a second step, dynamic assays using laboratory columns (height of 30 cm and diameter of 3 cm) were performed to investigate phosphorus recovery by the biochar modified with the Mg/Al mixture. The effects of the initial P concentration (25–100 mg/L), the bed depth (3 to 8 g), and the flow rate (10–30 mL/min) were assessed. Experimental results showed that the physico-chemical properties of the biochars were strongly dependent on the type of modifying salt used.
The parameters most affected were the specific surface area, the micropore area, and the surface chemistry (pH of zero-point charge and available functional groups). These characteristics significantly affected the phosphorus recovery efficiency from aqueous solutions. Indeed, the P removal capacity in batch mode varies from about 5 mg/g for the Fe-modified biochar to more than 13 mg/g for the biochar functionalized with Mg/Al layered double hydroxides. Moreover, P recovery appears to be a time-dependent process, significantly affected by the pH of the aqueous media and by the presence of foreign anions due to competition phenomena. The laboratory column study of phosphorus recovery by the biochar functionalized with Mg/Al layered double hydroxides showed that this process is affected by the phosphorus concentration used, the flow rate, and especially the column bed depth. Indeed, the recovered phosphorus amount increased from about 4.9 to more than 9.3 mg/g for biochar masses of 3 and 8 g, respectively. This work proved that salt-modified palm-frond-derived biochars can be considered attractive and promising materials for phosphorus recovery from aqueous solutions, even under dynamic conditions. The valorization of these P-loaded modified biochars as an eco-friendly amendment for agricultural soils will promote sustainability and circular economy concepts in the management of both liquid and solid wastes.
Keywords: date palm wastes, Mg/Al double-layered hydroxides functionalized biochars, phosphorus, recovery, sustainability, circular economy
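The batch capacities quoted above (mg of P per g of biochar) follow from the standard sorption mass balance q = (C₀ − Cₑ)·V/m. A minimal sketch, with illustrative concentrations and masses rather than the study's measured values:

```python
def sorption_capacity(c0_mg_per_l, ce_mg_per_l, volume_l, mass_g):
    """Batch sorption capacity q (mg recovered per g of biochar):
    (initial - equilibrium concentration) * solution volume / sorbent mass."""
    return (c0_mg_per_l - ce_mg_per_l) * volume_l / mass_g

# Illustrative batch run: a 50 mg/L P solution drops to 24 mg/L at
# equilibrium in 50 mL of solution containing 0.1 g of biochar.
q = sorption_capacity(50.0, 24.0, 0.050, 0.1)  # ~13 mg/g
```

The same balance, summed over breakthrough samples, gives the per-gram recovery reported for the column experiments.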
103 Establishment of Farmed Fish Welfare Biomarkers Using an Omics Approach
Authors: Pedro M. Rodrigues, Claudia Raposo, Denise Schrama, Marco Cerqueira
Abstract:
Farmed fish welfare is a very recent concept, widely discussed among the scientific community. Consumers' interest in farmed animal welfare standards has increased significantly in recent years, posing a huge challenge to producers, who must maintain an equilibrium between good welfare principles and productivity while simultaneously achieving public acceptance. The major bottleneck of standard aquaculture is that it considerably impairs fish welfare throughout the production cycle and, with this, the quality of fish protein. Welfare assessment in farmed fish is undertaken through the evaluation of fish stress responses. Primary and secondary stress responses include the release of cortisol, and of glucose and lactate, into the bloodstream, respectively; these are currently the most commonly used indicators of stress exposure. However, the reliability of these indicators is highly dubious, due to the high variability of fish responses to acute stress and the adaptation of the animal to repetitive chronic stress. Our objective is to use comparative proteomics to identify and validate a fingerprint of proteins that can present a more reliable alternative to the established welfare indicators. In this way, culture conditions will improve, and there will be a better understanding of the mechanisms and metabolic pathways involved in the welfare of the produced organism. Due to its high economic importance in Portuguese aquaculture, gilthead seabream is the species elected for this study. Protein extracts from gilthead seabream muscle, liver and plasma, from fish reared for a 3-month period under optimized culture conditions (control) and induced stress conditions (handling, high densities, and hypoxia), are collected and used to identify a putative fish welfare protein marker fingerprint using a proteomics approach. Three tanks per condition and 3 biological replicates per tank are used for each analysis.
Briefly, proteins from the target tissue/fluid are extracted using standard established protocols. Protein extracts are then separated using 2D-DIGE (difference gel electrophoresis). Proteins differentially expressed between control and induced stress conditions are identified by mass spectrometry (LC-MS/MS) using the NCBInr databank (taxonomic level: Actinopterygii) and the Mascot search engine. The statistical analysis is performed in the R software environment, using a one-tailed Mann-Whitney U-test (p < 0.05) to assess which proteins were differentially expressed in a statistically significant way. Validation of these proteins will be done by comparing the RT-qPCR (quantitative reverse transcription polymerase chain reaction) gene expression pattern with the proteomic profile. Cortisol, glucose, and lactate are also measured in order to confirm or refute the reliability of these indicators. The liver proteins identified under handling and high-density induced stress conditions are involved in several metabolic pathways, such as primary metabolism (i.e. glycolysis, gluconeogenesis), ammonia metabolism, cytoskeletal proteins, signaling proteins, and lipid transport. Validation of these proteins, as well as identical analyses in muscle and plasma, are underway. Proteomics is a promising high-throughput technique that can be successfully applied to identify putative welfare protein biomarkers in farmed fish.
Keywords: aquaculture, fish welfare, proteomics, welfare biomarkers
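The Mann-Whitney U-test used for the differential-expression screen is rank-based; a minimal sketch of the U statistic itself (without the p-value machinery that the R implementation provides) might look like the following. The spot-abundance values are invented for illustration.

```python
def mann_whitney_u(x, y):
    """U statistic for sample x versus y: the number of (xi, yj) pairs
    with xi > yj, counting ties as one half. No tie correction or
    p-value; a full test would add the normal approximation or exact tables."""
    u = 0.0
    for xi in x:
        for yj in y:
            if xi > yj:
                u += 1.0
            elif xi == yj:
                u += 0.5
    return u

# Hypothetical spot abundances for one protein, stressed vs control fish.
stressed = [8.1, 9.4, 7.7]
control = [5.2, 6.0]
u_stat = mann_whitney_u(stressed, control)  # 6.0: every stressed value exceeds every control
```

Because U depends only on the ordering of values, the test tolerates the non-normal, high-variance abundance distributions typical of 2D-DIGE spot data.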
102 The Need for a More Defined Role for Psychologists in Adult Consultation Liaison Services in Hospital Settings
Authors: Ana Violante, Jodie Maccarrone, Maria Fimiani
Abstract:
In the United States, over 30 million people are hospitalized annually for conditions that require acute, 24-hour, supervised care. The experience of hospitalization can be traumatic, exposing the patient to loss of control, autonomy, and productivity. Furthermore, 40% of patients admitted to hospitals for general medical illness have a comorbid psychiatric diagnosis. Research suggests individuals admitted with psychiatric comorbidities experience poorer health outcomes, higher utilization rates and increased overall cost of care. Empirical work suggests hospital settings that include a consultation liaison (CL) service report reduced length of stay, lower costs per patient, improved medical staff and patient satisfaction, and reduced readmission after 180 days. Despite the overall positive impact CL services can have on patient care, it is estimated that only 1%–2.8% of hospital admissions receive these services, and most research has been conducted by the field of psychiatry. Health psychologists could play an important role in increasing access to this valuable service, though the extent to which health psychologists participate in CL settings is not well known. Objective: Outline the preliminary findings from an empirical study of how many APPIC internship training programs offer adult consultation liaison rotations within inpatient hospital settings nationally, and describe the specific nature of these training experiences. Research Method/Design: Data were exported into Excel from the 2022–2023 APPIC Directory entries categorized as "health psychology" sites. This initially returned a total of 537 health training programs out of 1,518 total programs (35% of all APPIC programs). A full review comprised a comprehensive quantitative and qualitative examination of the APPIC program summary, the site website, and program brochures.
The quantitative review extracted the number of training positions, stipend amount, program location (state), patient population, and rotation. The qualitative review examined the nature of the training experience. Results: 29 programs offering CL internship training were identified, representing 5% of all APPIC health psychology internship training programs and 2% of all APPIC internship training programs. Of the 29 internship training programs, 16 were exclusively within a pediatric setting (55%), 11 were exclusively within an adult setting (38%), and two were a mix of pediatric and adult settings (7%). CL training sites were located in 19 states and offered a total of 153 positions nationally, with Florida containing the largest number of programs (4). Only six programs offered 12-month training opportunities, while the rest offered CL as a major (6-month) or minor (3–4-month) rotation. The programs' stipends for CL training positions ranged from $25,000 to $62,400, with an average of $32,056. Conclusions: These preliminary findings suggest CL training and services are currently limited. The training opportunities that do exist are mostly minor, short rotations governed by psychiatry. Health psychologists are well positioned to better define the role of psychology in consultation liaison services and to enhance and formalize existing training protocols. Future research should explore in more detail the empirical outcomes of CL services that employ psychologists and delineate the contributions of psychology from those of psychiatry and other disciplines within an inpatient hospital setting.
Keywords: consultation liaison, health psychology, hospital setting, training
101 Construction of an Assessment Tool for Early Childhood Development in the World of Discovery™ Curriculum
Authors: Divya Palaniappan
Abstract:
Early childhood assessment tools must measure the quality and appropriateness of a curriculum with respect to the culture and age of the children. Preschool assessment tools often lack psychometric properties and were developed to measure only a few areas of development, such as specific skills in music, art and adaptive behavior. Existing preschool assessment tools in India are predominantly informal and fraught with the judgmental bias of observers. The World of Discovery™ curriculum focuses on accelerating the physical, cognitive, language, social and emotional development of preschoolers in India through various activities. The curriculum caters to every child irrespective of their dominant intelligence, as per Gardner's theory of multiple intelligences, which concluded that "even students as young as four years old present quite distinctive sets and configurations of intelligences". The curriculum introduces a new theme every week in which concepts are explained through various activities so that children with different dominant intelligences can understand them. For example, the 'Insects' theme is explained through rhymes, craft and a counting corner, so children with one of these dominant intelligences (musical, bodily-kinesthetic or logical-mathematical) can grasp the concept. The child's progress is evaluated using an assessment tool that measures a cluster of interdependent developmental areas (physical, cognitive, language, social and emotional development), which for the first time provides a multi-domain approach. The assessment tool is a 5-point rating scale covering these developmental aspects: cognitive, language, physical, social and emotional. Each activity strengthens one or more of the developmental aspects. During the cognitive corner, the child's perceptual reasoning, pre-math abilities, hand-eye coordination and fine motor skills can be observed and evaluated.
The tool differs from traditional assessment methodologies by providing a framework that allows teachers to assess a child's continuous development with respect to specific activities, in real time and objectively. A pilot study of the tool was conducted with a sample of 100 children in the age group 2.5 to 3.5 years. The data were collected over a period of 3 months across 10 centers in Chennai, India, scored by the class teacher once a week. The teachers were trained by psychologists on age-appropriate developmental milestones to minimize observer bias. The norms were calculated from the mean and standard deviation of the observed data. The results indicated high internal consistency among parameters and that cognitive development improved with physical development. A significant positive relationship between physical and cognitive development has also been observed among children in a study conducted by Sibley and Etnier. In the children, 'comprehension' ability was found to be greater than 'reasoning' and pre-math abilities, as indicated by the preoperational stage of Piaget's theory of cognitive development. The average scores for the various parameters obtained through the tool corroborate psychological theories on child development, offering strong face validity. The study provides a comprehensive mechanism to assess a child's development and differentiate high performers from the rest. Based on the average scores, the difficulty level of activities can be increased or decreased to nurture the development of preschoolers, and appropriate teaching methodologies can be devised.
Keywords: child development, early childhood assessment, early childhood curriculum, quantitative assessment of preschool curriculum
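Since the norms were derived from the mean and standard deviation of the observed scores, the banding can be sketched as follows. The cut-off rule of one standard deviation above or below the mean, and the example ratings, are illustrative assumptions, not the study's actual norming procedure.

```python
from statistics import mean, stdev

def norm_bands(scores):
    """Norm reference from observed ratings: mean, SD, and hypothetical
    cut-offs one SD above/below the mean for flagging high/low performers."""
    m, s = mean(scores), stdev(scores)
    return {"mean": m, "sd": s, "high_cutoff": m + s, "low_cutoff": m - s}

# Illustrative weekly ratings on the 5-point scale for one parameter.
ratings = [3, 4, 2, 5, 3, 4, 3, 2, 4, 3]
bands = norm_bands(ratings)
```

A child scoring above `high_cutoff` would be the kind of "high performer" the study differentiates, and the activity difficulty could be adjusted accordingly.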
100 Thermal Characterisation of Multi-Coated Lightweight Brake Rotors for Passenger Cars
Authors: Ankit Khurana
Abstract:
Sufficient heat storage capacity, or the ability to dissipate heat, is the most decisive parameter for the effective and efficient functioning of friction-based brake disc systems. The primary aim of the research was to analyse the effect of multiple coatings on the surface of lightweight disc rotors, which not only reduces the mass of the vehicle but also augments heat transfer. This research is intended to give the automotive community a clearer view of the thermal aspects of a braking system. The results of the project indicate that, with the advent of modern coating technologies, a brake system's thermal limitations can be removed and, together with forced convection, heat transfer processes can see a drastic improvement, leading to an increased lifetime of the brake rotor. Other advantages of modifying the surface of a lightweight rotor substrate are a reduced overall vehicle weight, a decreased risk of thermal brake failure (brake fade and fluid vaporization), longer component life, and lower noise and vibration. A mathematical model was constructed in MATLAB encompassing the thermal characteristics of the proposed coatings and substrate materials, required to approximate the heat flux values in free and forced convection environments; this resembles a real braking event and could easily be carried over to a full-scale model of the alloy brake rotor in ABAQUS. The finite element model of the brake rotor was built in a constrained environment such that the nodal temperatures between the contact surfaces of the coatings and substrate (wrought aluminum alloy) resemble an amalgamated solid brake rotor element. The initial results were obtained for a plasma electrolytic oxidized (PEO) substrate, wherein the aluminum alloy has a hard ceramic oxide layer grown on its transitional phase.
The rotor was modelled and then evaluated in real time for a constant-'g' braking event (based upon the mathematical heat flux input and convective surroundings), which revealed the necessity of depositing a sacrificial conducting coat above the PEO layer in order to inhibit premature thermal degradation of the barrier coating. A Taguchi study was then used to identify critical factors that may influence the maximum operating temperature of a multi-coated brake disc by simulating brake tests: a) an Alpine descent lasting 50 seconds; b) an Autobahn stop lasting 3.53 seconds; c) six high-speed repeated stops in accordance with FMVSS 135, lasting 46.25 seconds. Thermal barrier coating thickness and vane heat transfer coefficient were the two most influential factors, and owing to their design and manufacturing constraints a final optimized model was obtained that survived the six-stop high-speed test per the FMVSS 135 specifications. The simulation data highlighted the merits of preferring wrought aluminum alloy 7068 over grey cast iron and aluminum metal matrix composite in combination with the multiple coating depositions.
Keywords: lightweight brakes, surface modification, simulated braking, PEO, aluminum
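The mathematical heat flux input for a constant-'g' stop can be approximated from the vehicle's kinetic energy, the fraction assumed to enter the rotors, and the swept friction area. The vehicle mass, speed, area, and energy split in the sketch below are illustrative assumptions, not the parameters of the study's MATLAB model.

```python
def mean_rotor_heat_flux(mass_kg, v0_m_s, stop_time_s, swept_area_m2,
                         brake_share=0.9, n_rotors=2):
    """Mean heat flux (W/m^2) into one rotor during a constant-deceleration
    stop to rest: kinetic energy * assumed brake-energy share, split across
    n_rotors, divided by swept friction area and stop time."""
    kinetic_j = 0.5 * mass_kg * v0_m_s ** 2
    per_rotor_j = kinetic_j * brake_share / n_rotors
    return per_rotor_j / (swept_area_m2 * stop_time_s)

# Illustrative Autobahn-style stop: a 1500 kg car from 38.9 m/s (140 km/h)
# in 3.53 s, with 0.04 m^2 swept area per front rotor.
q = mean_rotor_heat_flux(1500.0, 38.9, 3.53, 0.04)
```

A transient model such as the one described would apply this energy as a time-varying flux (braking power falls linearly as the car slows) rather than the mean value used here.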
Procedia PDF Downloads: 407
Partnering With Faith-Based Entities to Improve Mental Health Awareness and Decrease Stigma in African American Communities
Authors: Bryana Woodard, Monica Mitchell, Kasey Harry, Ebony Washington, Megan Harris, Marcia Boyd, Regina Lynch, Daphene Baines, Surbi Bankar
Abstract:
Introduction: African Americans experience mental health illnesses (i.e., depression, anxiety, etc.) at higher rates than their white counterparts. Despite this, they utilize mental health resources less and have lower mental health literacy, perhaps due to cultural barriers, including but not limited to mistrust. Research acknowledges African Americans’ close ties to community networks, identifying these linkages as key to establishing comfort and trust. Similarly, the church has historically been a space that creates unity and community among African Americans. Studies show that longstanding academic-community partnerships with organizations such as churches and faith-based entities have the capability to effectively address health and mental health barriers and needs in African Americans. The importance of implementing faith-based approaches is supported in the literature; however, few empirical studies exist. This project describes the First Ladies for Health and Cincinnati Children's Hospital Medical Center (CCHMC) Partnership (FLFH-CCHMC Partnership) and the implementation and assessment of an annual Mental Health Symposium, the overall aim of which was to increase mental health awareness and decrease stigma in African American communities. Methods: The specific goals of the FLFH Mental Health Symposium were to (1) collaborate with trusted partners to build trust with community participants; (2) increase mental health literacy and decrease mental health stigma; (3) understand the barriers to improving mental health and improving trust; (4) assess the short-term outcomes two months following the symposium. Data were collected through post-event and follow-up surveys using a mixed-methods approach. Results: More than 100 participants attended each year, with over 350 total participants over three years.
98.7% of participants were African American, 86.67% female, 11.6% male, and 11.6% LGBTQ+/non-binary; 10.5% of participants were teens, with the remainder aged 20 to 80 plus. The event was successful in achieving its goals: (1a) eleven different speakers from 8 community and church organizations presented; (1b) 93% of participants rated the overall symposium as very good or excellent; (2a) mental health literacy significantly increased each year, with over 90% of participants reporting improvement in their “understanding” and “awareness” of mental health; (2b) participants' personal stigma surrounding mental health illness decreased each year, with 92.3% of participants reporting changes in their “willingness to talk about and share” mental health challenges; (3) barriers to mental health care were identified and included social stigma, lack of trust, and the cost of care; these data were used to develop priorities and an action plan for the FLFH-CCHMC Mental Health Partnership; (4) follow-up data showed that participants sustained the benefits of the FLFH Symposium and took actionable steps (e.g., meditation, referrals, etc.). Additional quantitative and qualitative data will be shared. Conclusions: Lower rates of mental health literacy and higher rates of stigma among participants in this initiative demonstrate the importance of mental health providers building trust and partnerships in communities. Working with faith-based entities provides an opportunity to mitigate and address mental health equity in African American communities.
Keywords: community psychology, faith-based, African-American, culturally competent care, mental health equity
High Purity Lignin for Asphalt Applications: Using the Dawn Technology™ Wood Fractionation Process
Authors: Ed de Jong
Abstract:
Avantium is a leading technology development company and a frontrunner in renewable chemistry. Avantium develops disruptive technologies that enable the production of sustainable, high-value products from renewable materials, and actively seeks out collaborations and partnerships with like-minded companies and academic institutions globally to speed up the introduction of chemical innovations into the marketplace. In addition, Avantium helps companies accelerate their catalysis R&D to improve efficiencies and deliver increased sustainability, growth, and profits, by providing proprietary systems and services in this regard. Many chemical building blocks and materials can be produced from biomass, nowadays mainly from 1st-generation carbohydrates, but the potential for competition with the human food chain leads brand-owners to look for strategies to transition from 1st- to 2nd-generation feedstock. The use of non-edible lignocellulosic feedstock is an equally attractive route to chemical intermediates and an important part of the solution to these global issues (the Paris targets). Avantium’s Dawn Technology™ separates the glucose, mixed sugars, and lignin available in non-food agricultural and forestry residues such as wood chips, wheat straw, bagasse, empty fruit bunches, or corn stover. The resulting very pure lignin is dense in energy and can be used for energy generation. However, such a material might preferably be deployed in higher-added-value applications. Bitumen, which is fossil-based, is mostly used in paving applications. Traditional hot-mix asphalt emits large quantities of the GHGs CO₂, CH₄, and N₂O, which is unfavorable for obvious environmental reasons. Another challenge for the bitumen industry is that the petrochemical industry is becoming more and more efficient at breaking down higher-chain hydrocarbons into lower-chain hydrocarbons with higher added value than bitumen. This has a negative effect on the availability of bitumen.
The asphalt market, as well as governments, is looking for alternatives with higher sustainability in terms of GHG emissions. The use of alternative sustainable binders, which can (partly) replace bitumen, contributes to reducing GHG emissions and at the same time broadens the availability of binders. As lignin is a major component (around 25-30%) of lignocellulosic material, which includes terrestrial plants (e.g., trees, bushes, and grass) and agricultural residues (e.g., empty fruit bunches, corn stover, sugarcane bagasse, straw, etc.), it is highly available globally. Its chemical structure resembles that of bitumen, and it could therefore be used as an alternative to bitumen in applications like roofing or asphalt. Applications such as the use of lignin in asphalt need both fundamental research and practical proof under relevant use conditions. From a fundamental point of view, rheological aspects as well as mixing are key criteria. From a practical point of view, behavior under real road conditions is key (how easily can the asphalt be prepared, how easily can it be applied on the road, what is its durability, etc.). The paper will discuss the fundamentals of using lignin as a bitumen replacement as well as the status of the different demonstration projects in Europe using lignin as a partial bitumen replacement in asphalt, and will especially present the results of using Dawn Technology™ lignin as a partial replacement of bitumen.
Keywords: biorefinery, wood fractionation, lignin, asphalt, bitumen, sustainability
Impact of Interdisciplinary Therapy Allied to Online Health Education on Cardiometabolic Parameters and Inflammation Factor Rating in Obese Adolescents
Authors: Yasmin A. M. Ferreira, Ana C. K. Pelissari, Sofia De C. F. Vicente, Raquel M. Da S. Campos, Deborah C. L. Masquio, Lian Tock, Lila M. Oyama, Flavia C. Corgosinho, Valter T. Boldarine, Ana R. Dâmaso
Abstract:
The prevalence of overweight and obesity is growing around the world and is currently considered a global epidemic. Food and nutrition are essential requirements for promoting health and protecting against non-communicable chronic diseases, such as obesity and cardiovascular disease. Specific dietary components may modulate inflammation and oxidative stress in obese individuals. Few studies have investigated the dietary Inflammation Factor Rating (IFR) in obese adolescents. The IFR was developed to characterize an individual´s diet on an anti- to pro-inflammatory score; this evaluation helps investigate the effects of an inflammatory diet on the metabolic profile in several individual conditions. Objectives: The present study aims to investigate the effects of a multidisciplinary weight loss therapy on the inflammation factor rating and cardiometabolic risk in obese adolescents. Methods: A total of 26 volunteers (14-19 years old) were recruited and submitted to 20 weeks of interdisciplinary therapy allied to a health education website (Ciclo do Emagrecimento®), including clinical, nutritional, and psychological counseling and exercise training. Body weight was monitored weekly by self-report and photo. The adolescents answered a test to evaluate their knowledge of the topics covered in the videos. A 24-h dietary record was applied at baseline and after 20 weeks to assess food intake and to calculate the IFR. A negative IFR suggests that the diet may have inflammatory effects, and a positive IFR indicates an anti-inflammatory effect. Statistical analysis was performed using STATISTICA version 12.5 for Windows, with significance set at α ≤ 5%. Data normality was verified with the Kolmogorov-Smirnov test. Data are expressed as mean±SD. Student's t-test was applied to analyze the effects of the intervention, and Pearson's correlation test was performed.
Results: After 20 weeks of treatment, body mass index (BMI), body weight, body fat (kg and %), and abdominal and waist circumferences decreased significantly. Mean high-density lipoprotein cholesterol (HDL-c) increased after the therapy. Moreover, an improvement in the inflammation factor rating from -427.27±322.47 to -297.15±240.01 was found, suggesting beneficial effects of the nutritional counselling. In the correlation analysis, a pro-inflammatory diet was associated with increases in BMI, very low-density lipoprotein cholesterol (VLDL), triglycerides, insulin, and the insulin resistance index (HOMA-IR), while an anti-inflammatory diet was associated with improvement of HDL-c and the quantitative insulin sensitivity check index (QUICKI). Conclusion: The 20-week blended multidisciplinary therapy was effective in reducing body weight and anthropometric circumferences and improving inflammatory markers in obese adolescents. In addition, our results showed that a more inflammatory diet profile is associated with worse cardiometabolic parameters, suggesting the relevance of stimulating anti-inflammatory dietary habits as an effective strategy to treat and control obesity and related comorbidities. Financial Support: FAPESP (2017/07372-1) and CNPq (409943/2016-9)
Keywords: cardiometabolic risk, inflammatory diet, multidisciplinary therapy, obesity
Polysaccharide Polyelectrolyte Complexation: An Engineering Strategy for the Development of Commercially Viable Sustainable Materials
Authors: Jeffrey M. Catchmark, Parisa Nazema, Caini Chen, Wei-Shu Lin
Abstract:
Sustainable and environmentally compatible materials are needed for a wide variety of high-volume commercial applications. Current synthetic materials such as plastics, fluorochemicals (such as PFAS), adhesives, and resins in the form of sheets, laminates, coatings, foams, fibers, molded parts, and composites are used for countless products such as packaging, food handling, textiles, biomedical, construction, automotive, and general consumer devices. Synthetic materials offer distinct performance advantages including stability, durability, and low cost. These attributes are associated with the physical and chemical properties of these materials, which, once formed, can be resistant to water, oils, solvents, harsh chemicals, salt, temperature, impact, wear, and microbial degradation. These advantages become disadvantages at the end of life of these products, which generate significant land and water pollution when disposed of, and few are recycled. Agriculturally and biologically derived polymers offer the potential of remediating these environmental and life-cycle difficulties but face numerous challenges including feedstock supply, scalability, performance, and cost. Such polymers include microbial biopolymers like polyhydroxyalkanoates and polyhydroxybutyrate; polymers produced using biomonomer chemical synthesis like polylactic acid; proteins like soy, collagen, and casein; lipids like waxes; and polysaccharides like cellulose and starch. Although these materials, and combinations thereof, exhibit the potential for meeting some of the performance needs of various commercial applications, only cellulose and starch have both the production feedstock volume and the cost to compete with petroleum-derived materials. Over 430 million tons of plastic are produced each year, and plastics like low-density polyethylene cost ~$1500 to $1800 per ton.
Over 400 million tons of cellulose and over 100 million tons of starch are produced each year at a volume cost as low as ~$500 to $1000 per ton, with the capability of increased production. Celluloses and starches, however, are hygroscopic materials that do not exhibit the needed performance in most applications. They can be chemically modified to carry positive and negative surface charges, and such modified versions are used in papermaking, foods, and cosmetics. Although these modified polysaccharides exhibit the same performance limitations, recent research has shown that composite materials comprised of cationic and anionic polysaccharides in polyelectrolyte complexation exhibit significantly improved performance, including stability in diverse environments. Moreover, starches with added plasticizers can exhibit thermoplasticity, presenting the possibility of improved thermoplastic starches when comprised of starches in polyelectrolyte complexation. In this work, the potential for numerous volume commercial products based on polysaccharide polyelectrolyte complexes (PPCs) will be discussed, including the engineering design strategy used to develop them. Research results will be detailed, including the development and demonstration of starch PPC compositions for paper coatings to replace PFAS; adhesives; foams for packaging, insulation, and biomedical applications; and thermoplastic starches. In addition, efforts to demonstrate the potential for volume manufacturing with industrial partners will be discussed.
Keywords: biomaterials engineering, commercial materials, polysaccharides, sustainable materials
The Healthcare Costs of BMI-Defined Obesity among Adults Who Have Undergone a Medical Procedure in Alberta, Canada
Authors: Sonia Butalia, Huong Luu, Alexis Guigue, Karen J. B. Martins, Khanh Vu, Scott W. Klarenbach
Abstract:
Obesity is associated with significant personal impacts on health and has a substantial economic burden on payers due to increased healthcare use. A contemporary estimate of the healthcare costs associated with obesity at the population level is lacking. This evidence may provide further rationale for weight management strategies. Methods: Adults who underwent a medical procedure between 2012 and 2019 in Alberta, Canada were categorized into the investigational cohort (had body mass index [BMI]-defined class 2 or 3 obesity based on a procedure-associated code) and the control cohort (did not have the BMI procedure-associated code); those who had bariatric surgery were excluded. Characteristics were presented and healthcare costs ($CDN) determined over a 1-year observation period (2019/2020). Logistic regression and a generalized linear model with log link and gamma distribution were used to assess total healthcare costs (comprised of hospitalizations, emergency department visits, ambulatory care visits, physician visits, and outpatient prescription drugs); potential confounders included age, sex, region of residence, and whether the medical procedure was performed within 6 months before the observation period in the partial adjustment, plus the type of procedure performed, socioeconomic status, Charlson Comorbidity Index (CCI), and seven obesity-related health conditions in the full adjustment. Cost ratios and estimated cost differences with 95% confidence intervals (CI) were reported; incremental cost differences within the adjusted models represent referent cases.
Results: The investigational cohort (n=220,190) was older (mean age: 53 standard deviation [SD]±17 vs 50 SD±17 years), had more females (71% vs 57%), lived in rural areas to a greater extent (20% vs 14%), experienced a higher overall burden of disease (CCI: 0.6 SD±1.3 vs 0.3 SD±0.9), and was less socioeconomically well-off (material/social deprivation was lower [14%/14%] in the most well-off quintile vs 20%/19%) compared with controls (n=1,955,548). Unadjusted total healthcare costs were estimated to be 1.77-times (95% CI: 1.76, 1.78) higher in the investigational versus control cohort; each healthcare resource contributed to the higher cost ratio. After adjusting for potential confounders, the total healthcare cost ratio decreased but remained higher in the investigational versus control cohort (partial adjustment: 1.57 [95% CI: 1.57, 1.58]; full adjustment: 1.21 [95% CI: 1.20, 1.21]); each healthcare resource contributed to the higher cost ratio. Among urban-dwelling 50-year-old females who previously had non-operative procedures, no procedures performed within 6 months before the observation period, a social deprivation index score of 3, a CCI score of 0.32, and no history of select obesity-related health conditions, the predicted cost difference between those living with and without obesity was $386 (95% CI: $376, $397). Conclusions: If these findings hold for the Canadian population, one would expect an estimated additional $3.0 billion per year in healthcare costs nationally related to BMI-defined obesity (based on an adult obesity rate of 26% and an estimated annual incremental cost of $386 [21%]); incremental costs are higher when obesity-related health conditions are not adjusted for. Results of this study provide additional rationale for investment in interventions that are effective in preventing and treating obesity and its complications.
Keywords: administrative data, body mass index-defined obesity, healthcare cost, real world evidence
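The cost model named above, a generalized linear model with log link and gamma distribution, can be sketched as follows. This is a minimal illustration on synthetic data, not the study's analysis: for the log link with gamma variance V(μ) = μ², the IRLS working weights are identically one, so each iteration reduces to ordinary least squares on a working response, and the exponentiated coefficient is a multiplicative cost ratio of the kind the abstract reports (e.g., 1.77). The cohort indicator, baseline cost, and gamma shape below are all illustrative assumptions.

```python
# Hedged sketch of a gamma GLM with log link fit by IRLS on synthetic,
# right-skewed cost data (illustrative values only, not study data).
import numpy as np

def gamma_glm_log_link(X, y, n_iter=50):
    """Fit log-link gamma GLM; IRLS weights are 1 for this link/variance."""
    X = np.column_stack([np.ones(len(y)), X])  # prepend intercept column
    beta = np.zeros(X.shape[1])
    beta[0] = np.log(y.mean())                 # start from the null model
    for _ in range(n_iter):
        eta = X @ beta
        mu = np.exp(eta)
        z = eta + (y - mu) / mu                # working response
        beta, *_ = np.linalg.lstsq(X, z, rcond=None)
    return beta

rng = np.random.default_rng(0)
obese = rng.integers(0, 2, 5000)               # hypothetical cohort indicator
mu = 1000.0 * 1.77**obese                      # true cost ratio of 1.77
costs = rng.gamma(shape=2.0, scale=mu / 2.0)   # gamma-distributed costs
beta = gamma_glm_log_link(obese.reshape(-1, 1), costs)
ratio = float(np.exp(beta[1]))                 # recovered cost ratio
```

Exponentiating coefficients rather than reporting additive differences is what makes the model's output a "cost ratio", matching how the abstract summarizes its results.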
Effects of Heart Rate Variability Biofeedback to Improve Autonomic Nerve Function, Inflammatory Response and Symptom Distress in Patients with Chronic Kidney Disease: A Randomized Control Trial
Authors: Chia-Pei Chen, Yu-Ju Chen, Yu-Juei Hsu
Abstract:
The prevalence and incidence of end-stage renal disease in Taiwan rank the highest in the world. According to a statistical survey by the Ministry of Health and Welfare in 2019, kidney disease is the ninth leading cause of death in Taiwan. It leads to autonomic dysfunction, inflammatory responses, and symptom distress, and further increases damage to the structure and function of the kidneys, leading to increased demand for renal replacement therapy and risks of cardiovascular disease, which also carries medical costs for society. A feasible intervention that effectively regulates the autonomic nerve function of CKD patients, reduces the inflammatory response and symptom distress, and thereby slows the progression of the disease would be a main goal of caring for CKD patients. This study aims to test the effect of heart rate variability biofeedback (HRVBF) on improving autonomic nerve function (heart rate variability, HRV), inflammatory response (interleukin-6 [IL-6], C-reactive protein [CRP]), and symptom distress (Piper Fatigue Scale, Pittsburgh Sleep Quality Index [PSQI], and Beck Depression Inventory-II [BDI-II]) in patients with chronic kidney disease. This study was experimental research with convenience sampling. Participants were recruited from the nephrology clinic at a medical center in northern Taiwan. With signed informed consent, participants were randomly assigned to the HRVBF or control group by using the Excel BINOMDIST function. The HRVBF group received four weekly hospital-based HRVBF training sessions, and 8 weeks of home-based self-practice was done with a StressEraser. The control group received usual care.
We followed all participants for 3 months, repeatedly measuring their autonomic nerve function (HRV), inflammatory response (IL-6, CRP), and symptom distress (Piper Fatigue Scale, PSQI, and BDI-II) on their first day of study participation (baseline) and at 1 month and 3 months after the intervention to test the effects of HRVBF. The results were analyzed with SPSS version 23.0. The demographic, HRV, IL-6, CRP, Piper Fatigue Scale, PSQI, and BDI-II data were analyzed by descriptive statistics. To test for differences between and within groups in all outcome variables, paired-sample t-tests, independent-sample t-tests, the Wilcoxon signed-rank test, and the Mann-Whitney U test were used. Results: Thirty-four patients with chronic kidney disease were enrolled, but three were lost to follow-up. The remaining 31 patients completed the study, including 15 in the HRVBF group and 16 in the control group. The characteristics of the two groups were not significantly different. The four-week hospital-based HRVBF training combined with eight weeks of home-based self-practice effectively enhanced parasympathetic performance in patients with chronic kidney disease, which may counteract disease-related parasympathetic inhibition. In the inflammatory response, IL-6 and CRP in the HRVBF group did not improve significantly compared with the control group. Self-reported fatigue and depression decreased significantly in the HRVBF group but still failed to reach a significant difference between the two groups. HRVBF had no significant effect on improving sleep quality for CKD patients.
Keywords: heart rate variability biofeedback, autonomic nerve function, inflammatory response, symptom distress, chronic kidney disease
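The within-group (pre vs post) and between-group (HRVBF vs control) comparisons described above can be sketched with SciPy. The data here are synthetic illustrations under stated assumptions: a hypothetical parasympathetic HRV index, an assumed training effect in the HRVBF arm, and none in controls; this is not the study's SPSS analysis or data.

```python
# Hedged sketch of the abstract's testing scheme on synthetic data
# (group sizes 15 and 16 as reported; all values illustrative).
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Simulated parasympathetic HRV index before/after the intervention.
hrvbf_pre = rng.normal(200, 40, 15)
hrvbf_post = hrvbf_pre + rng.normal(50, 20, 15)   # assumed training effect
ctrl_pre = rng.normal(200, 40, 16)
ctrl_post = ctrl_pre + rng.normal(0, 20, 16)      # assumed no change

# Within-group change: paired-sample t-test
# (the Wilcoxon signed-rank test is the non-parametric fallback).
t_within, p_within = stats.ttest_rel(hrvbf_post, hrvbf_pre)

# Between-group comparison of change scores: independent-sample t-test
# (the Mann-Whitney U test is the non-parametric fallback).
t_between, p_between = stats.ttest_ind(hrvbf_post - hrvbf_pre,
                                       ctrl_post - ctrl_pre)
```

Testing change scores between groups, rather than post values alone, is what lets such a design attribute the parasympathetic improvement to the biofeedback training rather than to baseline differences.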
Unveiling the Dynamics of Preservice Teachers’ Engagement with Mathematical Modeling through Model Eliciting Activities: A Comprehensive Exploration of Acceptance and Resistance Towards Modeling and Its Pedagogy
Authors: Ozgul Kartal, Wade Tillett, Lyn D. English
Abstract:
Despite its global significance in curricula, mathematical modeling encounters persistent disparities in recognition and emphasis within regular mathematics classrooms and teacher education across countries with diverse educational and cultural traditions, including variations in the perceived role of mathematical modeling. Over the past two decades, increased attention has been given to the integration of mathematical modeling into national curriculum standards in the U.S. and other countries. Therefore, the mathematics education research community has dedicated significant efforts to investigate various aspects associated with the teaching and learning of mathematical modeling, primarily focusing on exploring the applicability of modeling in schools and assessing students', teachers', and preservice teachers' (PTs) competencies and engagement in modeling cycles and processes. However, limited attention has been directed toward examining potential resistance hindering teachers and PTs from effectively implementing mathematical modeling. This study focuses on how PTs, without prior modeling experience, resist and/or embrace mathematical modeling and its pedagogy as they learn about models and modeling perspectives, navigate the modeling process, design and implement their modeling activities and lesson plans, and experience the pedagogy enabling modeling. Model eliciting activities (MEAs) were employed due to their high potential to support the development of mathematical modeling pedagogy. The mathematical modeling module was integrated into a mathematics methods course to explore how PTs embraced or resisted mathematical modeling and its pedagogy. The module design included reading, reflecting, engaging in modeling, assessing models, creating a modeling task (MEA), and designing a modeling lesson employing an MEA. Twelve senior undergraduate students participated, and data collection involved video recordings, written prompts, lesson plans, and reflections. 
An open coding analysis revealed acceptance and resistance toward teaching mathematical modeling. The study identified four overarching themes, each including both acceptance and resistance: pedagogy, affordance of modeling (tasks), modeling actions, and adjusting modeling. In the category of pedagogy, PTs displayed acceptance based on potential pedagogical benefits and resistance due to various concerns. The affordance of modeling (tasks) category emerged from instances when PTs showed acceptance or resistance while discussing the nature and quality of modeling tasks, often debating whether modeling is considered mathematics. PTs demonstrated both acceptance and resistance in their modeling actions, engaging in modeling cycles as students and designing/implementing MEAs as teachers. The adjusting modeling category captured instances where PTs accepted or resisted maintaining the qualities and nature of the modeling experience or converted modeling into a typical structured mathematics experience for students. While PTs displayed a mix of acceptance and resistance in their modeling actions, limitations were observed in embracing complexity and adhering to model principles. The study provides valuable insights into the challenges and opportunities of integrating mathematical modeling into teacher education, emphasizing the importance of addressing pedagogical concerns and providing support for effective implementation. In conclusion, this research offers a comprehensive understanding of PTs' engagement with modeling, advocating for a more focused discussion on the distinct nature and significance of mathematical modeling in the broader curriculum to establish a foundation for effective teacher education programs.
Keywords: mathematical modeling, model eliciting activities, modeling pedagogy, secondary teacher education
Learning from Dendrites: Improving the Point Neuron Model
Authors: Alexander Vandesompele, Joni Dambre
Abstract:
The diversity in dendritic arborization, as first illustrated by Santiago Ramón y Cajal, has always suggested a role for dendrites in the functionality of neurons. In the past decades, thanks to new recording techniques and optical stimulation methods, it has become clear that dendrites are not merely passive electrical components. They are observed to integrate inputs in a non-linear fashion and to actively participate in computations. Regardless, in simulations of neural networks, dendritic structure and functionality are often overlooked. Especially in a machine learning context, when designing artificial neural networks, point neuron models such as the leaky integrate-and-fire (LIF) model are dominant. These models mimic the integration of inputs at the neuron soma and ignore the existence of dendrites. In this work, the LIF point neuron model is extended with a simple form of dendritic computation. This gives the LIF neuron increased capacity to discriminate spatiotemporal input sequences, a dendritic functionality observed in another study. Simulations of the spiking neurons are performed using the BindsNET framework. In the common LIF model, incoming synapses are independent. Here, we introduce a dependency between incoming synapses such that the post-synaptic impact of a spike is determined not only by the weight of the synapse but also by the activity of other synapses. This is a form of short-term plasticity where synapses are potentiated or depressed by the preceding activity of neighbouring synapses, and a straightforward way to prevent inputs from simply summing linearly at the soma. To implement this, each pair of synapses on a neuron is assigned a variable representing the synaptic relation. This variable determines the magnitude of the short-term plasticity. These variables can be chosen randomly or, more interestingly, can be learned using a form of Hebbian learning.
We use spike-timing-dependent plasticity (STDP), commonly used to learn synaptic strength magnitudes. If all neurons in a layer receive the same input, they tend to learn the same thing through STDP. Adding inhibitory connections between the neurons creates a winner-take-all (WTA) network, which causes the different neurons to learn different input sequences. To illustrate the impact of the proposed dendritic mechanism, even without learning, we attach five input neurons to two output neurons. One output neuron is a regular LIF neuron; the other is a LIF neuron with dendritic relationships. The five input neurons are allowed to fire in a particular order; the membrane potentials are then reset, and the five input neurons are fired in the reversed order. As the regular LIF neuron linearly integrates its inputs at the soma, the membrane potential response to both sequences is similar in magnitude. In the other output neuron, due to the dendritic mechanism, the membrane potential response is different for the two sequences. Hence, the dendritic mechanism improves the neuron's capacity for discriminating spatiotemporal sequences. Dendritic computations improve LIF neurons even if the relationships between synapses are established randomly. Ideally, however, a learning rule is used to improve the dendritic relationships based on input data. It is possible to learn synaptic strength with STDP, to make a neuron more sensitive to its input. Similarly, it is possible to learn dendritic relationships with STDP, to make the neuron more sensitive to spatiotemporal input sequences. Feeding structured data to a WTA network with dendritic computation leads to a significantly higher number of discriminated input patterns. Without the dendritic computation, output neurons are less specific and may, for instance, be activated by a sequence in reverse order.
Keywords: dendritic computation, spiking neural networks, point neuron model
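A minimal sketch of the described mechanism follows. This is an assumed implementation, not the authors' BindsNET code: a LIF neuron whose per-spike synaptic impact is scaled by the decaying activity traces of the other synapses via a pairwise relation matrix R. With an asymmetric R, the forward sequence evokes a larger peak membrane potential than the reversed one, while a plain LIF (R = 0) responds identically to both; all constants are illustrative.

```python
# Hedged sketch (not the authors' BindsNET code) of a LIF neuron with
# pairwise synaptic relations implementing short-term plasticity.
import numpy as np

class DendriticLIF:
    def __init__(self, n_inputs, weights, relations, tau_m=20.0,
                 tau_trace=10.0, v_thresh=1.0):
        self.w = np.asarray(weights, dtype=float)
        # R[i, j] is the effect of synapse j's recent activity on synapse i.
        self.R = np.asarray(relations, dtype=float)
        self.tau_m, self.tau_trace, self.v_thresh = tau_m, tau_trace, v_thresh
        self.v = 0.0
        self.trace = np.zeros(n_inputs)  # decaying record of recent spikes

    def step(self, spikes, dt=1.0):
        spikes = np.asarray(spikes, dtype=float)
        # Each active synapse is potentiated/depressed by its neighbours'
        # traces; positive R entries potentiate, negative ones depress.
        gain = 1.0 + self.R @ self.trace
        self.v += float(spikes @ (self.w * gain))
        self.trace = self.trace * np.exp(-dt / self.tau_trace) + spikes
        self.v *= np.exp(-dt / self.tau_m)  # membrane leak
        if self.v >= self.v_thresh:
            self.v = 0.0
            return True
        return False

def response(neuron, order):
    """Peak membrane potential when inputs fire one per time step."""
    peak = 0.0
    for idx in order:
        spikes = np.zeros(len(neuron.w))
        spikes[idx] = 1.0
        neuron.step(spikes)
        peak = max(peak, neuron.v)
    return peak

w = [0.2] * 5
R_fwd = np.zeros((5, 5))
for i in range(1, 5):
    R_fwd[i, i - 1] = 0.5   # synapse i-1 firing potentiates synapse i
peak_fwd = response(DendriticLIF(5, w, R_fwd, v_thresh=10.0), [0, 1, 2, 3, 4])
peak_rev = response(DendriticLIF(5, w, R_fwd, v_thresh=10.0), [4, 3, 2, 1, 0])
```

Because R is asymmetric, `peak_fwd` exceeds `peak_rev`, so a downstream threshold can separate the two orderings, which a plain point neuron with the same weights cannot do.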
Quality in Healthcare: An Autism-Friendly Hospital Emergency Waiting Room
Authors: Elena Bellini, Daniele Mugnaini, Michele Boschetto
Abstract:
People with an Autistic Spectrum Disorder and an Intellectual Disability who need to attend a hospital emergency waiting room frequently present high levels of discomfort and challenging behaviors due to stress-related hyperarousal, sensory sensitivity, novelty anxiety, and communication and self-regulation difficulties. Increased agitation and acting out also disturb the diagnostic and therapeutic processes and the emergency room climate. Architectural design disciplines aimed at reducing distress in hospitals or creating autism-friendly environments are called on to find effective answers to this particular need. A growing number of researchers are considering the physical environment as an important point of intervention for people with autism. It has been shown that providing the right setting can help enhance confidence and self-esteem and can have a profound impact on their health and wellbeing. Environmental psychology has evaluated the perceived quality of care, looking at the design of hospital rooms, paths and circulation, waiting rooms, services, and devices. Furthermore, many studies have investigated the influence of the hospital environment on patients, in terms of stress reduction and the speed of therapeutic intervention, but also on health professionals and their work. Several services around the world are organizing autism-friendly hospital environments, which involve both the architecture and specific staff training. In Italy, the association Spes contra spem promoted and published, in 2013, the ‘Chart of disabled people in the hospital’, which stipulates that disabled people should have equal rights to accessible and high-quality care. There are a few Italian examples of therapeutic programmes for autistic people, such as the Dama project in Milan and the recent experience of the Children and Autism Foundation in Pordenone. Careggi's Emergency Waiting Room in Florence has been built to meet this challenge.
This research project comes from a collaboration between the technical staff of Careggi Hospital, the Center for Autism PAMAPI and architects with expertise in sensory environments. The focus group methodology involved architects, psychologists and professionals in transdisciplinary research centered on the links between spatial characteristics and the clinical state of people with ASD. The relationship between architectural space and quality of life is studied to pay maximum attention to users’ needs and to support the medical staff in their work through a specific training program. The result of this research is a set of criteria used to design the emergency waiting room, which will be illustrated. A protected room with a clear spatial design maximizes comprehension and predictability. The multisensory environment is intended to help sensory integration and relaxation. Visual communication through an iPad allows an anticipated understanding of medical procedures, and a specific technological system supports requests, choices and self-determination in order to fit sensory stimulation to personal preferences, especially for hypo- and hypersensitive people. All these characteristics should ensure better regulation of arousal and fewer behavioral problems, improving treatment accessibility, safety, and effectiveness. First results on patient-satisfaction levels will be presented.
Keywords: accessibility of care, autism-friendly architecture, personalized therapeutic process, sensory environment
Procedia PDF Downloads 265
90 The Use of Artificial Intelligence in the Context of a Space Traffic Management System: Legal Aspects
Authors: George Kyriakopoulos, Photini Pazartzis, Anthi Koskina, Crystalie Bourcha
Abstract:
The need to secure safe access to and return from outer space, as well as to ensure the viability of outer space operations, keeps alive the debate over organizing space traffic through a Space Traffic Management (STM) system. The proliferation of outer space activities in recent years, as well as the dynamic emergence of the private sector, has gradually resulted in a diverse universe of actors operating in outer space. These developments have had an increasing adverse impact on outer space sustainability, as the growing number of space debris clearly demonstrates. This landscape poses considerable threats to the outer space environment and its operators that need to be addressed by a combination of scientific-technological measures and regulatory interventions. In this context, recourse to recent technological advancements and, in particular, to Artificial Intelligence (AI) and machine learning systems could achieve exponential results in promoting space traffic management with respect to collision avoidance as well as launch and re-entry procedures/phases. New technologies can support the prospects of a successful space traffic management system at an international scale by enabling, inter alia, timely, accurate and analytical processing of large data sets and rapid decision-making, more precise space debris identification and tracking, and overall minimization of collision risks and reduction of operational costs. What is more, a significant part of space activities (i.e., the launch and/or re-entry phase) takes place in airspace rather than in outer space, hence the overall discussion also involves the highly developed, both technically and legally, international (and national) Air Traffic Management (ATM) system. Nonetheless, from a regulatory perspective, the use of AI for the purposes of space traffic management puts forward implications that merit particular attention. 
Key issues in this regard include the delimitation of AI-based activities as space activities, the designation of the applicable legal regime (international space or air law, national law), the assessment of the nature and extent of international legal obligations regarding space traffic coordination, as well as the appropriate liability regime applicable to AI-based technologies when operating for space traffic coordination, taking into particular consideration the dense regulatory developments at EU level. In addition, the prospects of institutionalizing international cooperation and promoting an international governance system, together with the challenges of establishing a comprehensive international STM regime, are revisited in the light of the intervention of AI technologies. This paper aims at examining the regulatory implications advanced by the use of AI technology in the context of space traffic management operations and its key correlating concepts (SSA, space debris mitigation), drawing in particular on international and regional considerations in the field of STM (e.g. UNCOPUOS, the International Academy of Astronautics, the European Space Agency, among other actors), the promising advancements of the EU approach to AI regulation and, last but not least, national approaches regarding the use of AI in the context of space traffic management. Acknowledgment: The present work was co-funded by the European Union and Greek national funds through the Operational Program "Human Resources Development, Education and Lifelong Learning" (NSRF 2014-2020), under the call "Supporting Researchers with an Emphasis on Young Researchers – Cycle B" (MIS: 5048145).
Keywords: artificial intelligence, space traffic management, space situational awareness, space debris
Procedia PDF Downloads 256
89 Empirical Modeling and Spatial Analysis of Heat-Related Morbidity in Maricopa County, Arizona
Authors: Chuyuan Wang, Nayan Khare, Lily Villa, Patricia Solis, Elizabeth A. Wentz
Abstract:
Maricopa County, Arizona, has a semi-arid hot desert climate and is one of the hottest regions in the United States. The exacerbated urban heat island (UHI) effect caused by rapid urbanization has made the urban area even hotter than its rural surroundings. The Phoenix metropolitan area experiences extremely high temperatures in the summer, from June to September, with daily highs that can reach 120 °F (48.9 °C). Morbidity and mortality due to environmental heat are, therefore, a significant public health issue in Maricopa County, especially because they are largely preventable. Public records from the Maricopa County Department of Public Health (MCDPH) revealed that between 2012 and 2016, there were 10,825 heat-related morbidity incidents, 267 outdoor environmental heat deaths, and 173 indoor heat-related deaths. Much research has examined heat-related death and its contributing factors around the world, but little has been done regarding heat-related morbidity, especially for regions that are naturally hot in the summer. The objective of this study is to examine the demographic, socio-economic, housing, and environmental factors that contribute to heat-related morbidity in Maricopa County. We obtained heat-related morbidity data between 2012 and 2016 at the census tract level from MCDPH. Demographic, socio-economic, and housing variables were derived using the 2012-2016 American Community Survey 5-year estimates from the U.S. Census. Remotely sensed Landsat 7 ETM+ and Landsat 8 OLI satellite images and Level-1 products were acquired for all summer months (June to September) from 2012 to 2016. The National Land Cover Database (NLCD) 2016 percent tree canopy and percent developed imperviousness data were obtained from the U.S. Geological Survey (USGS). We used ordinary least squares (OLS) regression analysis to examine the empirical relationship between all the independent variables and the heat-related morbidity rate. 
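The OLS step described above can be sketched in a few lines. The tract-level values below are purely hypothetical (the study's census and Landsat data are not reproduced here), and a single predictor, land surface temperature, stands in for the full multivariate model:

```python
# Illustrative sketch (not the study's data): fit a single-predictor OLS
# model of heat-related morbidity rate on land surface temperature and
# report R-squared, the goodness-of-fit measure cited in the abstract.

def ols_fit(x, y):
    """Closed-form simple OLS: returns (intercept, slope, r_squared)."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    sxy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    sxx = sum((xi - mean_x) ** 2 for xi in x)
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    # R^2 = 1 - SS_residual / SS_total
    ss_res = sum((yi - (intercept + slope * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - mean_y) ** 2 for yi in y)
    return intercept, slope, 1 - ss_res / ss_tot

# Hypothetical tract-level values: summer land surface temperature (deg C)
# and heat-related morbidity rate per 10,000 residents.
lst = [38.1, 39.4, 40.2, 41.0, 41.8, 42.5, 43.3, 44.1]
rate = [4.2, 4.9, 5.6, 5.9, 6.8, 7.1, 8.0, 8.4]

intercept, slope, r2 = ols_fit(lst, rate)
print(f"slope={slope:.3f} per deg C, R^2={r2:.3f}")
```

In practice the study's multivariate OLS and the MGWR extension would be fitted with a statistics package (e.g., the `mgwr` Python library for multiscale geographically weighted regression), not by hand.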
Results showed that higher morbidity rates are found in census tracts with higher values of population aged 65 and older, population under poverty, disability, no vehicle ownership, white non-Hispanic population, population with less than a high school degree, land surface temperature, and surface reflectance, but lower values of the normalized difference vegetation index (NDVI) and housing occupancy. The regression model explains up to 59.4% of the total variation in heat-related morbidity in Maricopa County. The multiscale geographically weighted regression (MGWR) technique was then used to examine the spatially varying relationships between the heat-related morbidity rate and all the significant independent variables. The R-squared value of the MGWR model increased to 0.691, a significant improvement in goodness-of-fit over the global OLS model, which means that the spatial heterogeneity of some independent variables is another important factor that influences their relationship with heat-related morbidity in Maricopa County. Among these variables, population aged 65 and older, the Hispanic population, disability, vehicle ownership, and housing occupancy have much stronger local effects than the other variables.
Keywords: census, empirical modeling, heat-related morbidity, spatial analysis
Procedia PDF Downloads 126
88 Unity in Diversity: Exploring the Psychological Processes and Mechanisms of the Sense of Community for the Chinese Nation in Ethnic Inter-embedded Communities
Authors: Jiamin Chen, Liping Yang
Abstract:
In 2007, the sociologist Putnam offered a pessimistic forecast based on the United States' "Social Capital Community Benchmark Survey," suggesting that "ethnic diversity would challenge social unity and undermine social cohesion." If this pessimistic assumption were proven true, it would indicate a risk of division in diverse societies. China, with 56 ethnic groups, is a multi-ethnic country. On May 26, 2014, General Secretary Xi Jinping proposed "building ethnically inter-embedded communities to promote deeper development in interactions, exchanges, and integration among ethnic groups." Researchers unanimously agree that ethnic inter-embedded communities can serve as practical arenas and pathways for solidifying the sense of the Chinese national community. However, there is no research providing evidence that ethnic inter-embedded communities can foster the sense of the Chinese national community, and the influencing factors remain unclear. This study adopts a constructivist grounded theory research approach. Convenience sampling and snowball sampling were used. Data were collected in three communities in Kunming City. Twelve individuals were eventually interviewed, and the transcribed interviews totaled 187,000 words. The research obtained ethical approval from the Ethics Committee of Nanjing Normal University (NNU202310030). The research analyzed the data and constructed theories, employing strategies such as coding, constant comparison, and theoretical sampling. The study found that, firstly, ethnic inter-embedded communities exhibit characteristics of diversity, including ethnic, cultural, and linguistic diversity. Diversity has positive functions, including increased opportunities for contact, the promotion of self-expansion, and increased happiness; its negative functions include highlighting ethnic differences, causing ethnic conflicts, and making ethnic boundaries salient. 
Secondly, individuals typically engage in interactions within the community using active or passive embedding strategies. Active embedding strategies include maintaining openness, focusing on similarities, and holding pro-diversity beliefs, which can increase external group identification and intergroup relational identity and promote ethnic integration. Individuals using passive embedding strategies tend to focus on ethnic stereotypes, perceive stigmatization of their own ethnic group, and adopt an authoritarian-oriented approach to interactions, leading to a perception of more identity threats and ultimately to rejecting ethnic integration. Thirdly, the commonality of the Chinese nation is reflected in the 56 ethnic groups as an "identity community" and an "interest community," and both active and passive embedding paths affect individual understanding of the commonality of the Chinese nation. Finally, community work and environment can influence the embedding process. The research constructed a social psychological process and mechanism model for solidifying the sense of the Chinese national community in ethnic inter-embedded communities. Based on this theoretical model, future research can conduct more micro-level tests of psychological mechanisms and intervention studies to enhance Chinese national cohesion.
Keywords: diversity, sense of the Chinese national community, ethnic inter-embedded communities, ethnic group
Procedia PDF Downloads 37
87 Numerical Analysis of Mandible Fracture Stabilization System
Authors: Piotr Wadolowski, Grzegorz Krzesinski, Piotr Gutowski
Abstract:
The aim of the presented work is to assess the impact of the mini-plate application approach on the stress and displacement within the stabilization devices and surrounding bones. The mini-plate osteosynthesis technique is widely used by craniofacial surgeons as an improved replacement for the wire connection approach. Many different types of metal plates and screws are used for the physical connection of fractured bones. The investigation below is based on clinical observation of a patient hospitalized with a mini-plate stabilization system. The analysis was conducted on a solid mandible geometry modeled on the basis of a computed tomography scan of the hospitalized patient. In order to achieve the most realistic behavior of the connected system, cortical and cancellous bone layers were assumed. The temporomandibular joint was simplified to an elastic element to allow physiological movement of the loaded bone. The muscles of the mastication system were reduced to three pairs, modeled as shell structures. The finite element grid was created in the ANSYS software, using hexahedral and tetrahedral variants of the SOLID185 element. A set of nonlinear contact conditions was applied to the common surfaces of the connecting devices and bone. The properties of each contact pair depend on the screw - mini-plate connection type and possible gaps between the fractured bone around the osteosynthesis region. Some of the investigated cases include prestress introduced to the mini-plate during application, which corresponds to the initial bending of the connecting device to fit the retromolar fossa region. The assumed bone fracture occurs within the mandible angle zone. Due to the significant deformation of the connecting plate in some of the assembly cases, an elastic-plastic model of the titanium alloy was assumed. The bone tissues were modeled as an orthotropic material. The load was a gauge force of 100 N applied at three different locations. 
The conducted analysis shows a significant impact of the mini-plate application methodology on the stress distribution within the mini-plate. The prestress effect introduces additional loading, which locally exceeds the yield limit of the titanium alloy. Stress in the surrounding bone increases rapidly around the screw application region, exceeding the assumed bone yield limit, which indicates local bone destruction. The approach with a doubled mini-plate shows increased stress within the connector due to an overly rigid connection, where the main load path leads through the mini-plates instead of through the plates and connected bones. Clinical observations confirm more frequent plate failure in stiffer connections. Some of these failures could be an effect of decreased low-cycle fatigue capability caused by overloading. The executed analysis proves that the mini-plate system provides sufficient support for mandible fracture treatment; however, many applicable solutions push the entire system to the allowable material limits. The results show that connector application with initial loading needs to be carefully established due to the small material capability tolerances. Comparison with clinical observations allows the entire connection to be optimized to prevent future incidents.
Keywords: mandible fracture, mini-plate connection, numerical analysis, osteosynthesis
Procedia PDF Downloads 271
86 Feasibility of Implementing Digital Healthcare Technologies to Prevent Disease: A Mixed-Methods Evaluation of a Digital Intervention Piloted in the National Health Service
Authors: Rosie Cooper, Tracey Chantler, Ellen Pringle, Sadie Bell, Emily Edmundson, Heidi Nielsen, Sheila Roberts, Michael Edelstein, Sandra Mounier Jack
Abstract:
Introduction: In line with the National Health Service’s (NHS) long-term plan, the NHS is looking to implement more digital health interventions. This study explores a case study in this area: a digital intervention used by NHS Trusts in London to consent adolescents for Human Papilloma Virus (HPV) immunisation. Methods: The electronic consent intervention was implemented in 14 secondary schools in inner-city London. These schools were statistically matched with 14 schools from the same area that were consenting using paper forms. Schools were matched on deprivation and English as an additional language. Consent form return rates and HPV vaccine uptake were compared quantitatively between intervention and matched schools. Data from observations of immunisation sessions and school feedback forms were analysed thematically. Individual and group interviews were undertaken with implementers, parents and adolescents, and a focus group with adolescents was undertaken; these were analysed thematically. Results: Twenty-eight schools (14 e-consent schools and 14 paper consent schools) comprising 3219 girls (1733 in paper consent schools and 1486 in e-consent schools) were included in the study. The proportion of pupils eligible for free school meals, the proportion with English as an additional language and the students' ethnicity profile were similar between the e-consent and paper consent schools. Return of consent forms was not increased by the implementation of the e-consent intervention. There was no difference in the proportion of pupils vaccinated at the scheduled vaccination session between the paper (n=14) and e-consent (n=14) schools (80.6% vs. 81.3%, p=0.93). The transition to using the system was not straightforward: whilst schools and staff understood the potential benefits, they found it difficult to adapt to new ways of working which removed some level of control from schools. 
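The uptake comparison reported above (80.6% vs. 81.3%, p=0.93) is a standard two-proportion test. A minimal sketch follows; the school cohort sizes are used as assumed denominators, since the abstract does not report the exact counts behind the quoted p-value, so the computed p will not exactly reproduce 0.93:

```python
import math

def two_proportion_ztest(x1, n1, x2, n2):
    """Pooled two-sided two-proportion z-test: returns (z, p_value)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal tail
    return z, p_value

# Approximate counts, assuming the cohort sizes (1733 paper, 1486 e-consent)
# as denominators; the actual session denominators may have differed.
z, p = two_proportion_ztest(round(0.806 * 1733), 1733,
                            round(0.813 * 1486), 1486)
print(f"z={z:.2f}, p={p:.2f}")
```

Either way, the difference is far from statistical significance, consistent with the abstract's conclusion of no difference in uptake.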
Part of the reason for lower consent form return in e-consent schools was that some parents found the intervention difficult to use due to limited access to the internet, difficulty opening the weblink, language barriers, and, in some cases, the system closing a few days prior to sessions. Adolescents also highlighted the potential for e-consent interventions to bypass their information needs. Discussion: We would advise caution against dismissing the e-consent intervention because it did not achieve its goal of increasing the return of consent forms. Given the problems of embedding a new service, it was encouraging that HPV vaccine uptake remained stable. Introducing change requires stakeholders to understand, buy in, and work together with others. Schools and staff understood the potential benefits of using e-consent but found that the new ways of working removed some level of control from schools, which they found hard to adapt to, suggesting that implementing digital technology will require an embedding process. Conclusion: The future direction of the NHS will require the implementation of digital technology. Obtaining electronic consent from parents could help streamline school-based adolescent immunisation programmes. Findings from this study suggest that when implementing new digital technologies, it is important to allow for a period of embedding to enable them to become incorporated into everyday practice.
Keywords: consent, digital, immunisation, prevention
Procedia PDF Downloads 145
85 EGF Serum Level in Diagnosis and Prediction of Mood Disorder in Adolescents and Young Adults
Authors: Monika Dmitrzak-Weglarz, Aleksandra Rajewska-Rager, Maria Skibinska, Natalia Lepczynska, Piotr Sibilski, Joanna Pawlak, Pawel Kapelski, Joanna Hauser
Abstract:
Epidermal growth factor (EGF) is a well-known neurotrophic factor involved in neuronal growth and synaptic plasticity. Proteomic research conducted to identify novel candidate biological markers for mood disorders has reported elevated EGF serum levels in patients during depressive episodes. However, the association of EGF with the mood disorder spectrum among adolescents and young adults has not been studied extensively. In this study, we aim to investigate the serum levels of EGF in adolescents and young adults during hypo/manic and depressive episodes and in remission, compared to a healthy control group. Our 2-year follow-up study involved 80 patients aged 12-24 years with a primary diagnosis of mood disorder spectrum, and 35 healthy volunteers matched by age and gender. Diagnoses were established according to DSM-IV-TR criteria using structured clinical interviews: K-SADS for children and adolescents, and SCID for young adults. Clinical and biological evaluations were made at baseline and in euthymic mood (at the 3rd or 6th month of treatment and after 1 and 2 years). The Young Mania Rating Scale and the Hamilton Rating Scale for Depression were used for assessment. The study protocols were approved by the relevant ethics committee. Serum protein concentration was determined by the Enzyme-Linked Immunosorbent Assay (ELISA) method, using the Human EGF (cat. no DY 236) DuoSet ELISA kit (R&D Systems). Serum EGF levels were analysed with the following variables: age, age under 18 and above 18 years old, sex, family history of affective disorders, and drug-free vs. medicated status. The Shapiro-Wilk test was used to test the normality of the data, and the homogeneity of variance was calculated with Levene’s test. EGF levels showed a non-normal distribution, and the homogeneity of variance was violated. 
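Given the non-normal distribution noted above, group comparisons fall back on rank-based tests. A minimal sketch of the Mann-Whitney U test (normal approximation, average ranks for ties) on purely hypothetical EGF values illustrates the approach; in practice a statistics package would be used:

```python
import math

def _ranks(xs):
    """Average ranks (1-based) of xs, with ties sharing their mean rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def mann_whitney_u(a, b):
    """Mann-Whitney U with a two-sided normal approximation
    (no tie or continuity correction): returns (U, p_value)."""
    n1, n2 = len(a), len(b)
    r = _ranks(list(a) + list(b))
    u1 = sum(r[:n1]) - n1 * (n1 + 1) / 2
    u = min(u1, n1 * n2 - u1)
    mu = n1 * n2 / 2
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (u - mu) / sigma
    return u, math.erfc(abs(z) / math.sqrt(2))

# Hypothetical serum EGF values (pg/mL), not the study's measurements.
patients = [612, 540, 587, 499, 655, 630, 575, 520]
controls = [410, 388, 432, 371, 450, 402, 395, 419]
u, p = mann_whitney_u(patients, controls)
print(f"U={u:.0f}, p={p:.4f}")
```

With these fully separated synthetic groups, the test reports a significant elevation, in the same direction as the baseline difference (p=0.001) described in the abstract.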
Non-parametric tests (Mann-Whitney U test, Kruskall-Wallis ANOVA, Friedman’s ANOVA, Wilcoxon signed rank test) and the Spearman correlation coefficient were therefore applied in the analyses. The statistical significance level was set at p<0.05. Elevated EGF levels at baseline (p=0.001) and at month 24 (p=0.02) were detected in study subjects compared with controls. Increased EGF levels in women at month 12 (p=0.02) compared to men in the study group were observed. Using the Wilcoxon signed rank test, differences in EGF levels were detected: a decrease from baseline to month 3 (p=0.014) and increases comparing month 3 vs. 24 (p=0.013), month 6 vs. 12 (p=0.021) and month 6 vs. 24 (p=0.008). The EGF level at baseline was negatively correlated with depression and mania occurrence at 24 months. The EGF level at 24 months was positively correlated with depression and mania occurrence at 12 months. No other correlations of EGF levels with clinical and demographic variables were detected. The findings of the present study indicate that the EGF serum level is significantly elevated in the study group of patients compared to the controls. We also observed fluctuations in EGF levels during two years of observation. EGF seems useful as an early marker for prediction of diagnosis, course of illness and treatment response in young patients during a first episode of mood disorder, which requires further investigation. The grant was funded by the National Science Center in Poland, no 2011/03/D/NZ5/06146.
Keywords: biological marker, epidermal growth factor, mood disorders, prediction
Procedia PDF Downloads 188
84 National Accreditation Board for Hospitals and Healthcare Reaccreditation, the Challenges and Advantages: A Qualitative Case Study
Authors: Narottam Puri, Gurvinder Kaur
Abstract:
Background: The National Accreditation Board for Hospitals & Healthcare Providers (NABH) is India’s apex standard-setting accrediting body in health care, which evaluates and accredits healthcare organizations. NABH requires accredited organizations to become reaccredited every three years. It is often thought that once the initial accreditation is complete, the foundation is set and reaccreditation is a much simpler process. Fortis Hospital, Shalimar Bagh, a part of the Fortis Healthcare group, is a 262-bed, multi-specialty tertiary care hospital. The hospital was successfully accredited in the year 2012. On completion of its first cycle, the hospital underwent a reaccreditation assessment in the year 2015. This paper aims to gain a better understanding of the challenges that accredited hospitals face when preparing for a renewal of their accreditation. Methods: The study was conducted using a cross-sectional mixed methods approach; semi-structured interviews were conducted with the senior leadership team and staff members, including doctors and nurses. Documents collated by the QA team while preparing for the re-assessment were reviewed to understand the challenges, including data on quality indicators (the method of collection, analysis, trending, and continual incremental improvements made over time), minutes of meetings, amendments made to existing policies and new policies drafted. Results: The senior leadership had concerns about the cost of accreditation and its impact on the quality of health care services, considering the staff effort and time it consumed. The management was, however, in favor of continuing with the accreditation since it offered a competitive advantage and strengthened community confidence, besides better pay rates from the payors. The clinicians regarded it as an increased non-clinical workload. 
Doctors felt accountable within a professional framework to themselves, the patient and family, their peers and their profession, but not to accreditation bodies, and raised concerns about how the quality indicators were measured. The departmental leaders had a positive perception of accreditation. They agreed that it ensured high standards of care and improved the management of their functional areas. However, they were reluctant to spare people for the QA activities due to staffing issues. With staff turnover, a lot of work was lost as sticky knowledge and had to be redone. Listing the continual quality improvement initiatives over the last 3 years was a challenge in itself. Conclusion: The success of any quality assurance reaccreditation program depends almost entirely on the commitment and interest of the administrators, nurses, paramedical staff, and clinicians. The leader of the quality movement is critical in propelling and building momentum. Leaders need to recognize skepticism and resistance and consider ways in which staff can become positively engaged. Involvement of all the functional owners is the starting point for building ownership and accountability for standards compliance. Creativity plays a very valuable role: communication in every form helps, including mail series, WhatsApp groups, quizzes, and events. Leaders must be able to generate interest and commitment without burdening clinical and administrative staff with an activity they neither understand nor believe in.
Keywords: NABH, reaccreditation, quality assurance, quality indicators
Procedia PDF Downloads 223
83 Long-Term Tillage, Lime Matter and Cover Crop Effects under Heavy Soil Conditions in Northern Lithuania
Authors: Aleksandras Velykis, Antanas Satkus
Abstract:
Clay loam and clay soils are typical of northern Lithuania. These soils are susceptible to physical degradation when heavy machinery is used intensively for field operations. At the same time, clayey soils, with their inherently poor physical properties, require more intensive tillage to maintain a proper physical condition for the grown crops. Therefore, not only is the choice of a suitable tillage system very important for these soils in the region, but the search for additional measures is also essential for maintaining a good soil physical state. Research objective: To evaluate the long-term effects of tillage of different intensity, as well as its combinations with supplementary agronomic practices, on the improvement of soil physical conditions and environmental sustainability. The experiment examined the influence of deep and shallow ploughing, ploughless tillage, combinations of ploughless tillage with incorporation of lime sludge and a cover crop for green manure, and application of the same cover crop for mulch without autumn tillage, under spring and winter crop growing conditions on a clay loam (27% clay, 50% silt, 23% sand) Endocalcaric Endogleyic Cambisol. Methods: The indicators characterizing the impact of the investigated measures were determined using the following methods and devices: soil dry bulk density by Eijkelkamp cylinder (100 cm3), soil water content by weighing, soil structure by Retsch sieve shaker, aggregate stability by Eijkelkamp wet sieving apparatus, and soil mineral nitrogen in 1 N KCl extract using a colorimetric method. Results: The physical state of clay loam soil (dry bulk density, structure, aggregate stability, water content) depends on the tillage system and its combination with the additional practices used. Application of cover crop winter mulch without tillage in autumn, ploughless tillage and shallow ploughing causes compaction of the bottom (15-25 cm) topsoil layer. 
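The core-sampling indicators listed in the Methods reduce to simple arithmetic. A short sketch follows, with hypothetical sample masses; only the 100 cm3 cylinder volume comes from the abstract:

```python
# Sketch of the core-sampling arithmetic behind the reported indicators:
# dry bulk density from an Eijkelkamp 100 cm3 cylinder and gravimetric
# water content by weighing. The sample masses below are hypothetical.

CYLINDER_VOLUME_CM3 = 100.0  # Eijkelkamp steel ring, as in the abstract

def dry_bulk_density(oven_dry_mass_g):
    """Dry bulk density (g/cm3) = oven-dry soil mass / core volume."""
    return oven_dry_mass_g / CYLINDER_VOLUME_CM3

def gravimetric_water_content(wet_mass_g, oven_dry_mass_g):
    """Water mass per unit dry soil mass, as a percentage."""
    return 100.0 * (wet_mass_g - oven_dry_mass_g) / oven_dry_mass_g

wet, dry = 182.4, 151.0  # hypothetical field-moist and oven-dry masses (g)
bd = dry_bulk_density(dry)
wc = gravimetric_water_content(wet, dry)
print(f"bulk density = {bd:.2f} g/cm3, water content = {wc:.1f} %")
```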
However, with ploughless tillage the soil dry bulk density in the subsoil (25-35 cm) layer is lower than with deep ploughing. Soil structure in the upper (0-15 cm) topsoil layer and in the seedbed (0-5 cm) prepared for spring crops is usually worse when applying ploughless tillage or cover crop mulch without autumn tillage. Application of lime sludge under ploughless tillage conditions helped to avoid compaction and structure worsening in the upper topsoil layer, as well as to increase aggregate stability. Application of reduced tillage increased soil water content in the upper topsoil layer directly after spring crop sowing. However, with reduced tillage the water content in the whole topsoil markedly decreased during prolonged droughty periods. Combination of reduced tillage with a cover crop for green manure and winter mulch is significant for preserving the environment: such application of cover crops reduces the leaching of mineral nitrogen into the deeper soil layers and environmental pollution. This work was supported by the National Science Program ‘The effect of long-term, different-intensity management of resources on the soils of different genesis and on other components of the agro-ecosystems’ [grant number SIT-9/2015], funded by the Research Council of Lithuania.
Keywords: clay loam, Endocalcaric Endogleyic Cambisol, mineral nitrogen, physical state
Procedia PDF Downloads 225
82 Design and Fabrication of AI-Driven Kinetic Facades with Soft Robotics for Optimized Building Energy Performance
Authors: Mohammadreza Kashizadeh, Mohammadamin Hashemi
Abstract:
This paper explores a kinetic building facade designed for optimal energy capture and architectural expression. The system integrates photovoltaic panels with soft robotic actuators for precise solar tracking, resulting in enhanced electricity generation compared to static facades. The growing interest in dynamic building envelopes necessitates the exploration of such facade systems. Increased energy generation and regulation of energy flow within buildings are potential benefits offered by integrating photovoltaic (PV) panels as kinetic elements. However, incorporating these technologies into mainstream architecture presents challenges due to the complexity of coordinating multiple systems. To address this, the design leverages soft robotic actuators, known for their compliance, resilience, and ease of integration. Additionally, the project investigates the potential for employing Large Language Models (LLMs) to streamline the design process. The research methodology involved design development, material selection, component fabrication, and system assembly. Grasshopper (GH) was employed within the digital design environment for parametric modeling and scripting logic, and an LLM was used experimentally to generate Python code for the creation of a random surface with user-defined parameters. Various techniques, including casting, three-dimensional (3D) printing, and laser cutting, were utilized to fabricate physical components. A modular assembly approach was adopted to facilitate installation and maintenance. A case study is presented focusing on the application of this facade system to an existing library building at the Polytechnic University of Milan. The system is divided into sub-frames to optimize solar exposure while maintaining a visually appealing aesthetic. Preliminary structural analyses were conducted using Karamba3D to assess deflection behavior and axial loads within the cable net structure. 
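The abstract mentions that an LLM generated Python code for a random surface with user-defined parameters; the original script is not given, so the following is an illustrative sketch of what such parametric surface code might look like (a seeded random sinusoidal height field over a grid):

```python
import math
import random

def random_surface(nu, nv, amplitude=1.0, waves=3, seed=0):
    """Generate a random smooth height field z(u, v) on an nu x nv grid
    by summing sinusoids with random frequencies and phases.
    The parameters are user-defined, as described in the abstract;
    this specific formulation is an illustrative assumption."""
    rng = random.Random(seed)  # seeded for reproducible geometry
    terms = [(rng.uniform(0.5, 2.0),        # frequency in u
              rng.uniform(0.5, 2.0),        # frequency in v
              rng.uniform(0, 2 * math.pi))  # phase
             for _ in range(waves)]
    grid = []
    for i in range(nu):
        row = []
        for j in range(nv):
            u, v = i / (nu - 1), j / (nv - 1)
            z = sum(math.sin(2 * math.pi * (fu * u + fv * v) + ph)
                    for fu, fv, ph in terms)
            row.append(amplitude * z / waves)  # keeps |z| <= amplitude
        grid.append(row)
    return grid

surface = random_surface(nu=10, nv=10, amplitude=0.5, waves=3, seed=42)
```

In a Grasshopper workflow, a grid like this would typically be remapped onto control points of a NURBS surface inside a GHPython component.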
Additionally, Finite Element (FE) simulations were performed in Abaqus to evaluate the mechanical response of the soft robotic actuators under pneumatic pressure. To validate the design, a physical prototype was created using a mold adapted to the limitations of a 3D printer. Sil 15 casting silicone rubber was used for its flexibility and durability. The 3D-printed mold components were assembled, filled with the silicone mixture, and cured. After demolding, nodes and cables were 3D-printed and connected to form the structure, demonstrating the feasibility of the design. This work demonstrates the potential of soft robotics and Artificial Intelligence (AI) for advancements in sustainable building design and construction. The project successfully integrates these technologies to create a dynamic facade system that optimizes energy generation and architectural expression. While limitations exist, this approach paves the way for future advancements in energy-efficient facade design. Continued research efforts will focus on cost reduction, improved system performance, and broader applicability.
Keywords: artificial intelligence, energy efficiency, kinetic photovoltaics, pneumatic control, soft robotics, sustainable building
Procedia PDF Downloads 288
81 Chronic Impact of Silver Nanoparticle on Aerobic Wastewater Biofilm
Authors: Sanaz Alizadeh, Yves Comeau, Arshath Abdul Rahim, Sunhasis Ghoshal
Abstract:
The application of silver nanoparticles (AgNPs) in personal care products and various household and industrial products has resulted in an inevitable environmental exposure to such engineered nanoparticles (ENPs). Ag ENPs, released via household and industrial wastes, reach water resource recovery facilities (WRRFs), yet the fate and transport of ENPs in WRRFs and their potential risk to biological wastewater processes are poorly understood. Accordingly, our main objective was to elucidate the impact of long-term continuous exposure to AgNPs on the biological activity of an aerobic wastewater biofilm. The fate, transport and toxicity of 10 μg.L-1 and 100 μg.L-1 PVP-stabilized AgNPs (50 nm) were evaluated in an attached growth biological treatment process, using lab-scale moving bed bioreactors (MBBRs). Two MBBR systems for organic matter removal were fed with a synthetic influent and operated at a hydraulic retention time (HRT) of 180 min and a 60% volumetric filling ratio of Anox-K5 carriers with a specific surface area of 800 m2/m3. Both reactors were operated for 85 days after reaching steady-state conditions to develop a mature biofilm. The impact of AgNPs on the biological performance of the MBBRs was characterized over a period of 64 days in terms of the filtered biodegradable COD (SCOD) removal efficiency, biofilm viability and key enzymatic activities (α-glucosidase and protease). The AgNPs were quantitatively characterized using single-particle inductively coupled plasma mass spectrometry (spICP-MS), determining simultaneously the particle size distribution, particle concentration and dissolved silver content in influent, bioreactor and effluent samples. The generation of reactive oxygen species and the resulting oxidative stress were assessed as the proposed toxicity mechanism of AgNPs.
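In spICP-MS, the size of each detected particle is derived from the silver mass measured in a single pulse, assuming a spherical particle of bulk density. A minimal sketch of that conversion follows; the function name and example values are illustrative and not taken from the study's own data-processing pipeline.

```python
import math

AG_DENSITY = 10490.0  # bulk density of silver, kg/m^3

def particle_diameter_nm(mass_kg, density=AG_DENSITY):
    """Equivalent spherical diameter (nm) for a particle of given mass."""
    volume = mass_kg / density                    # particle volume, m^3
    d = (6.0 * volume / math.pi) ** (1.0 / 3.0)   # sphere diameter, m
    return d * 1e9                                # convert m -> nm

# A ~0.69 fg silver event corresponds to roughly a 50 nm particle
print(round(particle_diameter_nm(6.87e-19)))  # → 50
```

Applying this per-pulse conversion across all detected events yields the particle size distribution reported by the technique, while the steady background signal gives the dissolved silver fraction.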
Results indicated that a low concentration of AgNPs (10 μg.L-1) did not significantly affect the SCOD removal efficiency, whereas a significant reduction in treatment efficiency (37%) was observed at 100 μg.L-1 AgNPs. Neither the viability nor the enzymatic activities of the biofilm were affected at 10 μg.L-1 AgNPs, but the higher concentration of AgNPs induced cell membrane integrity damage, resulting in a 31% loss of viability and reducing α-glucosidase and protease enzymatic activities by 31% and 29%, respectively, over the 64-day exposure period. The elevated intracellular ROS in the biofilm at the higher AgNPs concentration over time was consistent with the reduced biological performance, confirming the occurrence of nanoparticle-induced oxidative stress in the heterotrophic biofilm. The spICP-MS analysis demonstrated a decrease in nanoparticle concentration over the first 25 days, indicating a significant partitioning of AgNPs into the biofilm matrix in both reactors. After 25 days, however, the nanoparticle concentration in the effluent of both reactors increased, indicating a decreased retention capacity of AgNPs in the biofilm. The observed significant detachment of biofilm also contributed to a higher release of nanoparticles due to the cell-wall destabilizing properties of AgNPs as an antimicrobial agent. The removal efficiency of PVP-AgNPs and the biological responses of the biofilm were a function of nanoparticle concentration and exposure time. This study contributes to a better understanding of the fate and behavior of AgNPs in biological wastewater processes, providing key information that can be used to predict the environmental risks of ENPs in aquatic ecosystems.
Keywords: biofilm, silver nanoparticle, single particle ICP-MS, toxicity, wastewater
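The SCOD removal efficiency behind figures such as the 37% reduction above is the standard mass-balance ratio across the reactor. A trivial sketch, with made-up concentrations for illustration only:

```python
def removal_efficiency(influent_mg_l, effluent_mg_l):
    """Percent of soluble COD removed across the reactor."""
    return 100.0 * (influent_mg_l - effluent_mg_l) / influent_mg_l

# e.g. a synthetic influent of 250 mg/L leaving at 25 mg/L gives 90% removal
print(removal_efficiency(250.0, 25.0))  # → 90.0
```

Tracking this ratio over the 64-day exposure period is what reveals a treatment-efficiency loss relative to the unexposed baseline.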
Procedia PDF Downloads 267
80 Empowering and Educating Young People Against Cybercrime by Playing: The Rayuela Method
Authors: Jose L. Diego, Antonio Berlanga, Gregorio López, Diana López
Abstract:
The Rayuela method is a success story, as it is part of a project selected by the European Commission in response to its own challenge to achieve a better understanding of the human factors, as well as the social and organisational aspects, involved in fighting crime. Rayuela's method specifically focuses on the drivers of cyber criminality, including approaches to prevent, investigate, and mitigate cybercriminal behavior. As the internet has become an integral part of young people's lives, they are the key target of the Rayuela method because they (whether as victims or as perpetrators) are the most vulnerable link in the chain. Considering the increased time young people spend online, the limited control over their internet usage and their low level of awareness of cyber threats and their potential impact, the proliferation of incidents due to human mistakes is understandable. 51% of Europeans do not feel well informed about cyber threats, and 86% believe that the risk of becoming a victim of cybercrime is rapidly increasing. On the other hand, law enforcement has noted that more and more young people are committing cybercrimes. This is an international problem with considerable cost implications; it is estimated that crimes in cyberspace will cost the global economy $445B annually. Understanding all these phenomena points to the necessity of a shift in focus from sanctions to deterrence and prevention. As a research project, Rayuela aims to bring together law enforcement agencies (LEAs), sociologists, psychologists, anthropologists, legal experts, computer scientists, and engineers to develop novel methodologies that allow a better understanding of the factors affecting online behavior related to new forms of cyber criminality, as well as promoting the potential of these young talents for cybersecurity and technologies.
Rayuela's main goal is to better understand the drivers and human factors affecting certain relevant forms of cyber criminality, as well as to empower and educate young people in the benefits, risks, and threats intrinsically linked to the use of the Internet by playing, thus preventing and mitigating cybercriminal behavior. To reach that goal, an interdisciplinary consortium (formed by 17 international partners) carries out research and actions such as profiling and case studies of cybercriminals and victims, risk assessments, studies on the Internet of Things and its vulnerabilities, development of a serious gaming environment, training activities, data analysis and interpretation using artificial intelligence, testing and piloting, etc. To facilitate the real-world implementation of the Rayuela method as a community policing strategy, it is crucial to count on a police force with a solid background in trust-building and community policing to carry out the piloting, specifically with young people. In this sense, Valencia Local Police is a pioneering police force working with young people in conflict solving, providing police mediation and peer mediation services and advice. It is an official mediation institution, so agreements signed by its police mediators have, once signed by the parties, the value of a judicial decision.
Keywords: fight against crime and insecurity, avert and prepare young people against aggression, ICT, serious gaming and artificial intelligence against cybercrime, conflict solving and mediation with young people
Procedia PDF Downloads 127
79 Ecotoxicological Test-Battery for Efficiency Assessment of TiO2 Assisted Photodegradation of Emerging Micropolluants
Authors: Ildiko Fekete-Kertesz, Jade Chaker, Sylvain Berthelot, Viktoria Feigl, Monika Molnar, Lidia Favier
Abstract:
There has been growing concern about emerging micropollutants in recent years because of the possible environmental and health risks posed by these substances, which are released into the environment as a consequence of anthropogenic activities. Among them, pharmaceuticals are currently not considered under water quality regulations; however, their potential effects on the environment have been reported with increasing frequency in recent years. Since these compounds can be detected in natural water matrices, it can be concluded that currently applied water treatment processes are not efficient enough for their effective elimination. To date, advanced oxidation processes (AOPs) are considered highly competitive water treatment technologies for the removal of those organic micropollutants not treatable by conventional techniques due to their high chemical stability and/or low biodegradability. AOPs such as (photo)chemical oxidation and heterogeneous photocatalysis have proven their potential in degrading harmful organic compounds in aqueous matrices. However, some of these technologies generate reaction by-products, which can be even more toxic to aquatic organisms than the parent compounds. Thus, target compound removal does not necessarily result in the removal of toxicity. Therefore, to evaluate process efficiency, the determination of the toxicity and ecotoxicity of the reaction intermediates is crucial to estimate the environmental risk of such techniques. In this context, the present study investigates the effectiveness of TiO2-assisted photodegradation for the removal of emerging water contaminants. Two drugs, losartan (used in high blood pressure medication) and levetiracetam (used to treat epilepsy), were considered in this work. The photocatalytic reactions were carried out with a commercial catalyst commonly employed in photocatalysis.
Moreover, the toxicity of the by-products generated during the process was assessed with various ecotoxicological methods applying aquatic test organisms from different trophic levels. A series of experiments was performed to evaluate the toxicity of untreated and treated solutions applying the Aliivibrio fischeri bioluminescence inhibition test, the Tetrahymena pyriformis proliferation inhibition test, the Daphnia magna lethality and immobilization tests and the Lemna minor growth inhibition test. The applied ecotoxicological methodology sensitively indicated the toxic effects of the treated and untreated water samples; hence, the applied test battery is suitable for the ecotoxicological characterization of TiO2-based photocatalytic water treatment technologies and for indicating the formation of toxic by-products from the parent chemical compounds. The obtained results clearly showed that TiO2-assisted photodegradation was more efficient in the elimination of losartan than of levetiracetam. It was also observed that the treated levetiracetam solutions had a more severe effect on the applied test organisms. A possible explanation would be the production of levetiracetam by-products that are more toxic than the parent compound. The increased toxicity and the risk of formation of toxic metabolites represent one possible limitation to the implementation of photocatalytic treatment using TiO2 for the removal of losartan and levetiracetam. Our results proved that the battery of ecotoxicity tests used in this work can be a promising investigation tool for the environmental risk assessment of photocatalytic processes.
Keywords: aquatic micropollutants, ecotoxicology, nano titanium dioxide, photocatalysis, water treatment
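Heterogeneous photocatalytic degradation of trace pharmaceuticals such as losartan and levetiracetam is commonly described by pseudo-first-order kinetics, ln(C0/Ct) = k·t. The following self-contained sketch shows how an apparent rate constant could be estimated from concentration-time data; the data points are synthetic and for illustration only, as the abstract reports no kinetic constants.

```python
import math

def first_order_k(times_min, concentrations):
    """Least-squares fit of ln(C0/Ct) = k*t through the origin;
    returns the apparent pseudo-first-order rate constant k (min^-1)."""
    c0 = concentrations[0]
    y = [math.log(c0 / c) for c in concentrations]
    num = sum(t * yi for t, yi in zip(times_min, y))
    den = sum(t * t for t in times_min)
    return num / den

# Synthetic decay generated with k = 0.05 min^-1
t = [0, 10, 20, 30, 60]
c = [10.0 * math.exp(-0.05 * ti) for ti in t]
print(round(first_order_k(t, c), 3))  # → 0.05
```

Comparing such apparent rate constants between the two drugs would quantify the faster elimination of losartan, while the toxicity tests above capture what concentration measurements alone cannot: whether the intermediates formed along the way are more harmful than the parent compound.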
Procedia PDF Downloads 188