Search results for: OghmaNano software
21 The Use of Rule-Based Cellular Automata to Track and Forecast the Dispersal of Classical Biocontrol Agents at Scale, with an Application to the Fopius arisanus Fruit Fly Parasitoid
Authors: Agboka Komi Mensah, John Odindi, Elfatih M. Abdel-Rahman, Onisimo Mutanga, Henri Ez Tonnang
Abstract:
Ecosystems are networks of organisms and populations that form a community of various species interacting within their habitats. Such habitats are defined by abiotic and biotic conditions that establish the initial limits to a population's growth, development, and reproduction. The habitat's conditions define the context in which species interact to access resources such as food, water, space, shelter, and mates, allowing for feeding, dispersal, and reproduction. Dispersal is an essential life-history strategy that affects gene flow, resource competition, population dynamics, and species distributions. Despite the importance of dispersal in population dynamics and survival, understanding the mechanisms underpinning the dispersal of organisms remains challenging. For instance, when an organism moves into an ecosystem for survival and resource competition, its progression is highly influenced by intrinsic and extrinsic factors such as its physiological state, climatic variables, and its ability to evade predation. Therefore, greater spatial detail is necessary to understand organism dispersal dynamics. Organism dispersal can be studied using empirical and mechanistic modelling approaches, with the adopted approach depending on the study's purpose. Cellular automata (CA) are one such approach and have been successfully used in biological studies to analyze the dispersal of living organisms. A cellular automaton can be briefly described as a grid of cells, each of which may be occupied by an individual, that evolves according to a set of rules based on the states of neighbouring cells. However, for modelling the dispersal of individual organisms at the landscape scale, we lack user-friendly tools that do not require expertise in mathematical modelling and computing, such as a visual analytics framework for tracking and forecasting the dispersal behaviour of organisms. The term "visual analytics" (VA) describes a semiautomated approach to electronic data processing that is guided by users who can interact with data via an interface. Essentially, VA converts large amounts of quantitative or qualitative data into graphical formats that can be customized based on the operator's needs. Additionally, this approach can be used to enhance the ability of users from various backgrounds to understand data, communicate results, and disseminate information across a wide range of disciplines. To support effective analysis of the dispersal of organisms at the landscape scale, we therefore designed Pydisp, a free visual data analytics tool for spatiotemporal dispersal modeling built in Python. Its user interface allows users to perform quick and interactive spatiotemporal analyses of species dispersal using bioecological and climatic data. Pydisp enables reuse and upgrading through the use of simple principles such as fuzzy cellular automata algorithms. The potential of dispersal modeling is demonstrated in a case study predicting the dispersal of Fopius arisanus (Sonan), an endoparasitoid used to control Bactrocera dorsalis (Hendel) (Diptera: Tephritidae) in Kenya. The results obtained from our example clearly illustrate the parasitoid's dispersal process at the landscape level and confirm that dynamic processes in an agroecosystem are better understood when designed using mechanistic modelling approaches.
Furthermore, as demonstrated in the example, the software is highly effective in portraying the dispersal of organisms even when detailed data on the species' dispersal mechanisms are unavailable.
Keywords: cellular automata, fuzzy logic, landscape, spatiotemporal
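The fuzzy cellular automaton at the heart of a tool like Pydisp can be illustrated with a minimal sketch. The grid, the membership rule (fuzzy AND of neighbour pressure and habitat suitability), and the spread rate below are illustrative assumptions, not the tool's actual implementation:

```python
import numpy as np

def fuzzy_dispersal_step(occupancy, suitability, spread_rate=0.3):
    """One fuzzy-CA step: each cell's occupancy (0..1) increases with the
    fuzzy AND (minimum) of neighbour pressure and habitat suitability."""
    # Neighbour pressure: mean occupancy of the 8-cell Moore neighbourhood
    p = np.pad(occupancy, 1, mode="constant")
    neigh = (p[:-2, :-2] + p[:-2, 1:-1] + p[:-2, 2:] +
             p[1:-1, :-2] +               p[1:-1, 2:] +
             p[2:, :-2]   + p[2:, 1:-1] + p[2:, 2:]) / 8.0
    # Fuzzy rule: colonisation potential = min(neighbour pressure, suitability)
    potential = np.minimum(neigh, suitability)
    return np.clip(occupancy + spread_rate * potential, 0.0, 1.0)

# Toy example: release parasitoids at the centre of a 50 x 50 landscape
rng = np.random.default_rng(0)
suitability = rng.uniform(0.2, 1.0, (50, 50))   # stand-in for bioclimatic layers
occupancy = np.zeros((50, 50))
occupancy[25, 25] = 1.0
for _ in range(20):
    occupancy = fuzzy_dispersal_step(occupancy, suitability)
```

In a real application, the suitability layer would be derived from the bioecological and climatic data mentioned in the abstract, and the update rule would be calibrated against observed spread.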
Procedia PDF Downloads 77
20 Northern Nigeria Vaccine Direct Delivery System
Authors: Evelyn Castle, Adam Thompson
Abstract:
Background: In 2013, the Kano State Primary Health Care Management Board redesigned its routine immunization supply chain from a diffused pull model to direct delivery push. This addressed issues around stockouts and reduced the time health facility staff spent collecting vaccines and reporting on vaccine usage. The health care board sought the help of a 3PL for twice-monthly deliveries from its cold store to 484 facilities across 44 local governments. eHA's Health Delivery Systems group formed a 3PL to serve 326 of these new facilities in partnership with the State. We focused on designing and implementing a technology system throughout. Basic methodologies: GIS Mapping: Planning the delivery of vaccines to hundreds of health facilities requires detailed route planning for delivery vehicles. Mapping the road networks across Kano and Bauchi with a custom routing tool provided information for the optimization of deliveries, reducing the number of kilometers driven each round by 20% and thereby reducing cost and delivery time. Direct Delivery Information System: Vaccine direct deliveries are facilitated through pre-round planning (driven by a health facility database, extensive GIS, and inventory workflow rules), manager and driver control panels for customizing delivery routines and reporting, a progress dashboard, schedules/routes, packing lists, delivery reports, and driver data collection applications. MOVE: Last Mile Logistics Management System: MOVE has made vaccine supply information management timely, accurate, and actionable. It provides stock management workflow support, alerts management for cold chain exceptions and stockouts, and on-device analytics for health and supply chain staff. The software was built to be offline-first with a user-validated interface and experience. Deployed to hundreds of vaccine storage sites, the improved information tools help facilitate the process of system redesign and change management. Findings:
- Stock-outs reduced from 90% to 33%.
- Redesigned current health systems and managed vaccine supply for 68% of Kano's wards.
- Near real-time reporting and data availability to track stock.
- Paperwork burdens of health staff dramatically reduced.
- Medicine available when the community needs it.
- Consistent vaccination dates for children under one to prevent polio, yellow fever, and tetanus.
- Higher immunization rates = lower infection rates.
- Hundreds of millions of Naira worth of vaccines successfully transported.
- Fortnightly service to 326 facilities in 326 wards across 30 Local Government areas.
- 6,031 cumulative deliveries.
- Over 3.44 million doses transported.
- Minimum travel distance covered in a round of delivery of 2,000 km and a maximum of 6,297 km.
- 153,409 km travelled by 6 drivers.
- 500 facilities in 326 wards.
- Data captured and synchronized for the first time.
- Data-driven decision making now possible.
Conclusion: eHA's vaccine direct delivery has met challenges in Kano and Bauchi State and provided a reliable delivery service of vaccinations that ensures health facilities can run vaccination clinics for children under one. eHA uses innovative technology that delivers vaccines from Northern Nigerian zonal stores straight to healthcare facilities. It has helped healthcare workers spend less time managing supplies and more time delivering care, and will be rolled out nationally across Nigeria.
Keywords: direct delivery information system, health delivery system, GIS mapping, Northern Nigeria, vaccines
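The route-planning component described under "GIS Mapping" can be sketched, very roughly, as an ordering problem over facility coordinates. The snippet below uses a naive nearest-neighbour heuristic with hypothetical coordinates; it is a stand-in for the custom routing tool, not eHA's actual software:

```python
import math

def haversine_km(a, b):
    """Great-circle distance in km between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2 +
         math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * math.asin(math.sqrt(h))

def nearest_neighbour_route(depot, facilities):
    """Order facility visits greedily by distance from the last stop."""
    route, remaining, current = [], list(facilities), depot
    while remaining:
        nxt = min(remaining, key=lambda f: haversine_km(current, f))
        route.append(nxt)
        remaining.remove(nxt)
        current = nxt
    return route

# Hypothetical facility coordinates (lat, lon) around Kano
depot = (12.00, 8.52)
facilities = [(11.96, 8.55), (12.05, 8.60), (11.90, 8.45), (12.10, 8.50)]
print(nearest_neighbour_route(depot, facilities))
```

A production routing tool would additionally account for road-network distances, vehicle capacity, cold-chain time limits, and delivery windows.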
Procedia PDF Downloads 373
19 Human Bone Marrow Stem Cell Behavior on 3D Printed Scaffolds as Trabecular Bone Grafts
Authors: Zeynep Busra Velioglu, Deniz Pulat, Beril Demirbakan, Burak Ozcan, Ece Bayrak, Cevat Erisken
Abstract:
Bone tissue has the ability to perform a wide array of functions, including providing posture, load-bearing capacity, protection for the internal organs, initiating hematopoiesis, and maintaining the homeostasis of key electrolytes via calcium/phosphate ion storage. The most common cause of bone defects is extensive trauma and subsequent infection. Bone tissue can self-heal without scar tissue formation for the majority of injuries. However, some injuries may result in delayed union or fracture non-union. Such cases include reconstruction of large bone defects or cases of a compromised regenerative process as a result of avascular necrosis and osteoporosis. Several surgical methods exist to treat bone defects, including the Ilizarov method, the Masquelet technique, growth factor stimulation, and bone replacement. Unfortunately, these are technically demanding and come with noteworthy disadvantages such as lengthy treatment duration, adverse effects on the patient's psychology, repeated surgical procedures, and often long hospitalization times. These limitations associated with surgical techniques make bone substitutes an attractive alternative. Here, it was hypothesized that a 3D printed scaffold would mimic trabecular bone in terms of biomechanical properties and that such scaffolds would support cell attachment and survival. To test this hypothesis, this study aimed at fabricating poly(lactic acid), PLA, structures using 3D printing technology for trabecular bone defects, characterizing the scaffolds, and comparing them with bovine trabecular bone. The capacity of the scaffolds to support human bone marrow stem cell (hBMSC) attachment and survival was also evaluated. Cubes with a volume of 1 cm³ and pore sizes of 0.50, 1.00, and 1.25 mm were printed. The scaffolds/grafts were characterized in terms of porosity, contact angle, and compressive mechanical properties, as well as cell response. Porosities of the 3D printed scaffolds were calculated based on apparent densities. For contact angles, 50 µl of distilled water was dropped onto the surface of the scaffolds, and contact angles were measured using ImageJ software. Mechanical characterization under compression was performed on scaffold and native trabecular bone (bovine, 15 months) specimens using a universal testing machine at a rate of 0.5 mm/min. hBMSCs were seeded onto the 3D printed scaffolds. After 3 days of incubation with fully supplemented Dulbecco's modified Eagle's medium, the cells were fixed using a 2% formaldehyde and glutaraldehyde mixture. The specimens were then imaged under scanning electron microscopy. Cell proliferation was determined using the EZQuant dsDNA Quantitation kit. Fluorescence was measured using a Spectramax M2 microplate reader at excitation and emission wavelengths of 485 nm and 535 nm, respectively. The findings suggested that the porosity of scaffolds with pore dimensions of 0.5 mm, 1.0 mm, and 1.25 mm was not affected by pore size, while contact angle and compressive modulus decreased with increasing pore size. Biomechanical characterization of trabecular bone yielded higher modulus values than the scaffolds at all pore sizes studied. Cells attached and survived on all surfaces, demonstrating higher proliferation on scaffolds with 1.25 mm pores than on those with 1 mm pores.
Collectively, given the lower mechanical properties of the scaffolds compared to native bone and the biocompatibility of the scaffolds, the 3D printed PLA scaffolds of this study appear to be candidate substitutes for bone repair and regeneration.
Keywords: 3D printing, biomechanics, bone repair, stem cell
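The two scalar characterisations described above, porosity from apparent density and compressive modulus from the initial linear region of the stress-strain curve, reduce to very short calculations. The sketch below is illustrative only; the bulk density of PLA and the test data are assumed values, not the study's measurements:

```python
import numpy as np

RHO_PLA_BULK = 1.24  # g/cm^3, typical bulk density of PLA (assumed)

def porosity(mass_g, volume_cm3, rho_bulk=RHO_PLA_BULK):
    """Porosity from apparent density: phi = 1 - rho_apparent / rho_bulk."""
    return 1.0 - (mass_g / volume_cm3) / rho_bulk

def compressive_modulus(strain, stress_mpa):
    """Slope of the initial linear region of the stress-strain curve (MPa)."""
    slope, _intercept = np.polyfit(strain, stress_mpa, 1)
    return slope

# Hypothetical 1 cm^3 printed cube weighing 0.62 g
print(f"porosity = {porosity(0.62, 1.0):.2f}")

# Hypothetical compression data from the linear region
strain = np.linspace(0.0, 0.02, 20)
stress = 60.0 * strain                      # ideal 60 MPa modulus
print(f"modulus  = {compressive_modulus(strain, stress):.1f} MPa")
```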
Procedia PDF Downloads 171
18 Speeding Up Lenia: A Comparative Study Between Existing Implementations and CUDA C++ with OpenGL Interop
Authors: L. Diogo, A. Legrand, J. Nguyen-Cao, J. Rogeau, S. Bornhofen
Abstract:
Lenia is a system of cellular automata with continuous states, space, and time, which surprises not only with the emergence of interesting life-like structures but also with its beauty. This paper reports ongoing research on a GPU implementation of Lenia using CUDA C++ and OpenGL interoperability. We demonstrate how CUDA, as a low-level GPU programming paradigm, allows optimizing the performance and memory usage of the Lenia algorithm. A comparative analysis through experimental runs with existing implementations shows that the CUDA implementation outperforms the others by one order of magnitude or more. Cellular automata hold significant interest due to their ability to model complex phenomena in systems with simple rules and structures. They allow exploring emergent behavior such as self-organization and adaptation, and find applications in various fields, including computer science, physics, biology, and sociology. Unlike classic cellular automata, which rely on discrete cells and values, Lenia generalizes the concept of cellular automata to continuous space, time, and states, thus providing additional fluidity and richness in emerging phenomena. In the current literature, there are many implementations of Lenia utilizing various programming languages and visualization libraries. However, each implementation also presents certain drawbacks, which serve as motivation for further research and development. In particular, speed is a critical factor when studying Lenia, for several reasons. Rapid simulation allows researchers to observe the emergence of patterns and behaviors in more configurations, on bigger grids, and over longer periods without long waiting times. They can thereby explore and discover new species within the Lenia ecosystem more efficiently. Moreover, faster simulations are beneficial when we include additional time-consuming algorithms such as computer vision or machine learning to evolve and optimize specific Lenia configurations. We developed a Lenia implementation for the GPU using the C++ and CUDA programming languages, and CUDA/OpenGL interoperability for immediate rendering. The goal of our experiment is to benchmark this implementation against the existing ones in terms of speed, memory usage, configurability, and scalability. In our comparison, we focus on the most important Lenia implementations, selected for their prominence, accessibility, and widespread use in the scientific community. The implementations include MATLAB, JavaScript, ShaderToy GLSL, Jupyter, Rust, and R. The list is not exhaustive but provides a broad view of the principal current approaches and their respective strengths and weaknesses. Our comparison primarily considers computational performance and memory efficiency, as these factors are critical for large-scale simulations, but we also investigate ease of use and configurability. The experimental runs conducted so far demonstrate that the CUDA C++ implementation outperforms the other implementations by one order of magnitude or more. The benefits of using the GPU become apparent especially with larger grids and convolution kernels. However, our research is still ongoing. We are currently exploring the impact of several software design choices and optimization techniques, such as convolution with Fast Fourier Transforms (FFT), various GPU memory management scenarios, and the trade-off between speed and accuracy using single versus double precision floating point arithmetic.
The results will give valuable insights into the practice of parallel programming of the Lenia algorithm, and all conclusions will be thoroughly presented in the conference paper. The final version of our CUDA C++ implementation will be published on GitHub and made freely accessible to the ALife community for further development.
Keywords: artificial life, cellular automaton, GPU optimization, Lenia, comparative analysis
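While the CUDA C++ code itself is not reproduced in the abstract, the core Lenia update it accelerates, a kernel convolution followed by a growth mapping and a small time step, can be sketched in a few lines of NumPy. The kernel shape and growth parameters below are commonly published Lenia defaults and are given here only as an illustration:

```python
import numpy as np

def lenia_step(world, kernel_fft, dt=0.1, mu=0.15, sigma=0.017):
    """One Lenia update: convolve the state with the kernel (via FFT),
    map the potential through a Gaussian growth function, integrate, clip."""
    potential = np.real(np.fft.ifft2(np.fft.fft2(world) * kernel_fft))
    growth = 2.0 * np.exp(-((potential - mu) ** 2) / (2 * sigma ** 2)) - 1.0
    return np.clip(world + dt * growth, 0.0, 1.0)

# Ring-shaped kernel of radius R on an N x N torus, pre-transformed with FFT
N, R = 256, 13
y, x = np.ogrid[-N // 2:N // 2, -N // 2:N // 2]
r = np.sqrt(x * x + y * y) / R
kernel = np.where(r < 1, np.exp(4.0 - 1.0 / (r * (1.0 - r) + 1e-9)), 0.0)
kernel_fft = np.fft.fft2(np.fft.fftshift(kernel / kernel.sum()))

world = np.random.default_rng(0).random((N, N))
for _ in range(100):
    world = lenia_step(world, kernel_fft)
```

The FFT-based convolution shown here is exactly the optimization mentioned at the end of the abstract; a CUDA version would replace the NumPy FFT with GPU kernels and keep the state in GPU memory for direct OpenGL rendering.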
Procedia PDF Downloads 41
17 Long-Term Subcentimeter-Accuracy Landslide Monitoring Using a Cost-Effective Global Navigation Satellite System Rover Network: Case Study
Authors: Vincent Schlageter, Maroua Mestiri, Florian Denzinger, Hugo Raetzo, Michel Demierre
Abstract:
Precise landslide monitoring with differential global navigation satellite systems (GNSS) is well known, but technical or economic reasons limit its application by geotechnical companies. This study demonstrates the reliability and the usefulness of Geomon (Infrasurvey Sàrl, Switzerland), a stand-alone and cost-effective rover network. The system permits deploying up to 15 rovers, plus one reference station for differential GNSS. A dedicated radio communication link connects all the modules to a base station, where an embedded computer automatically computes all the relative positions (L1 phase, open-source RTKLib software) and populates an Internet server. Each measurement also contains information from an internal inclinometer, the battery level, and position quality indices. In contrast to standard GNSS survey systems, which suffer from a limited number of beacons that must be placed in areas with good GSM signal, Geomon offers greater flexibility and permits a real overview of the whole landslide with good spatial resolution. Each module is powered by solar panels, ensuring autonomous long-term recordings. In this study, we tested the system on several sites in the Swiss mountains, deploying up to 7 rovers per site for an 18-month survey. The aim was to assess the robustness and the accuracy of the system in different environmental conditions. In one case, we ran forced blind tests (vertical movements of a given amplitude) and compared various session parameters (durations from 10 to 90 minutes). The other cases were surveys of real landslide sites using fixed, optimized parameters. Sub-centimetre accuracy with few outliers was obtained using the best parameters (session duration of 60 minutes, baseline of 1 km or less), with the noise level on the horizontal component half that of the vertical one. The performance (percentage of aborted solutions, outliers) degraded with sessions shorter than 30 minutes. The environment also had a strong influence on the percentage of aborted solutions (ambiguity search problem), due to multiple reflections or satellites obstructed by trees and mountains. The length of the baseline (reference-rover distance, single-baseline processing) reduced the accuracy above 1 km but had no significant effect below this limit. In critical weather conditions, the system's robustness was limited: snow, avalanches, and frost covered some rovers, including the antenna and vertically oriented solar panels, leading to data interruption, and strong wind damaged a reference station. The possibility of changing the session parameters remotely was very useful. In conclusion, the rover network tested provided the expected sub-centimetre accuracy while delivering a landslide survey with dense spatial resolution. The ease of implementation and the fully automatic long-term survey were time-saving. Performance strongly depends on surrounding conditions, but short preliminary measurements should make it possible to move a rover to a better final placement. The system offers a promising hazard mitigation technique. Improvements could include data post-processing for alerts and automatic modification of the duration and number of sessions based on battery level and rover displacement velocity.
Keywords: GNSS, GSM, landslide, long-term, network, solar, spatial resolution, sub-centimeter
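The post-processing for alerts suggested in the conclusion could start from something as simple as robust outlier screening of the per-session displacement series. The sketch below uses a median-absolute-deviation rule on illustrative values; it is not part of the Geomon system:

```python
import numpy as np

def flag_outliers(displacements_mm, k=3.5):
    """Flag session solutions whose displacement departs from the median
    by more than k robust (MAD-based) standard deviations."""
    d = np.asarray(displacements_mm, dtype=float)
    med = np.median(d)
    mad = np.median(np.abs(d - med)) or 1e-6   # guard against a zero MAD
    robust_z = 0.6745 * (d - med) / mad
    return np.abs(robust_z) > k

# Hypothetical daily horizontal displacements (mm) from 60-minute sessions
series = [1.2, 0.8, 1.5, 1.1, 14.9, 1.3, 0.9, 1.6]   # one spurious fix
print(flag_outliers(series))   # only the 14.9 mm value is flagged
```

In practice, an alerting rule would have to distinguish genuine acceleration of the landslide from ambiguity-resolution failures, for example by requiring several consecutive flagged sessions before raising an alarm.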
Procedia PDF Downloads 111
16 Saving Lives from a Laptop: How to Produce a Live Virtual Media Briefing That Will Inform, Educate, and Protect Communities in Crisis
Authors: Cory B. Portner, Julie A. Grauert, Lisa M. Stromme, Shelby D. Anderson, Franji H. Mayes
Abstract:
Introduction: Washington state, in the Pacific Northwest of the United States, is internationally known for its technology industry, fisheries, agriculture, and vistas. On January 21, 2020, Washington state also became known as the first state with a confirmed COVID-19 case in the United States, thrusting the state into the international spotlight as the world came to grips with the global threat this disease presented. Tourism is Washington state's fourth-largest industry. Tourism to the state generates over 1.8 billion dollars (USD) in local and state tax revenue and employs over 180,000 people. Communicating with residents, stakeholders, and visitors on the status of disease activity, prevention measures, and response updates was vital to stopping the pandemic and increasing compliance and awareness. Significance: In order to communicate vital public health updates, guidance implementation, and safety measures to the public, the Washington State Department of Health established routine live virtual media briefings to reach audiences via social media, internet television, and broadcast television. Through close partnership with regional broadcast news stations and the state public affairs news network, the Washington State Department of Health hosted 95 media briefings from January 2020 through September 2022 and continues to regularly host live virtual media briefings to accommodate the needs of the public and the media. Methods: Our methods quickly evolved from hosting briefings in the cement closet of a military base to producing and streaming the briefings live from any home-office location. The content was tailored to the hot topic of the day and to reporters' questions and needs. Virtual media briefings hosted through inexpensive or free online platforms are extremely cost-effective: the only mandatory components are WiFi, a laptop, and a monitor. There is no longer a need for a fancy studio or expensive production software to achieve the goal of communicating credible, reliable information promptly. With minimal investment and a small learning curve, facilitators and panelists are able to host highly produced and engaging media availabilities from their living rooms. Results: The briefings quickly developed a reputation as the best source for local and national journalists to get the latest and most factually accurate information about the pandemic. At the height of the COVID-19 response, 135 unique media outlets logged on to participate in the briefing. The briefings typically featured 4-5 panelists, with as many as 9 experts in attendance to provide information and respond to media questions. Preparation was always a priority: public affairs staff for the Washington State Department of Health produced over 170 presenter remarks, including guidance on talking points for 63 expert guest panelists. Implications for practice: Information is today's most valuable currency. The ability to disseminate correct information urgently and on a wide scale is the most effective tool in crisis communication. Due to our role as the first state with a confirmed COVID-19 case, we were forced to develop the most accurate and effective way to get life-saving information to the public. The cost-effective, web-based methods we developed can be applied in any crisis to educate and protect communities under threat, ultimately saving lives from a laptop.
Keywords: crisis communications, public relations, media management, news media
Procedia PDF Downloads 183
15 Experiences and Perceptions of the Barriers and Facilitators of Continence Care Provision in Residential and Nursing Homes for Older Adults: A Systematic Evidence Synthesis and Qualitative Exploration
Authors: Jennifer Wheeldon, Nick de Viggiani, Nikki Cotterill
Abstract:
Background: Urinary and fecal incontinence affect a significant proportion of older adults aged 65 and over who permanently reside in residential and nursing home facilities. Incontinence symptoms have been linked to comorbidities, an increased risk of infection, and reduced quality of life and mental wellbeing of residents. However, continence care provision can often be poor, further compromising the health and wellbeing of this vulnerable population. Objectives: To identify experiences and perceptions of continence care provision in older adult residential care settings and to identify factors that help or hinder good continence care provision. Settings included both residential care homes and nursing homes for older adults. Methods: A qualitative evidence synthesis using systematic review methodology established the current evidence base. Data from 20 qualitative and mixed-method studies were appraised and synthesized. Following the review process, 10* qualitative interviews with staff working in older adult residential care settings were conducted across six* sites, including registered managers, registered nurses, and nursing/care assistants/aides. Purposive sampling was used to recruit individuals from across England. Both the evidence synthesis and interview data were analyzed thematically, both manually and with NVivo software. Results: The evidence synthesis revealed complex barriers and facilitators for continence care provision at three influencing levels: macro (structural and societal external influences), meso (organizational and institutional influences), and micro (day-to-day actions of individuals impacting service delivery). Macro-level barriers included negative stigmas relating to incontinence, aging, and working in the older adult social care sector, restriction of continence care resources such as containment products (i.e., pads), short staffing in care facilities, shortfalls in the professional education and training of care home staff, and the complex health and social care needs of older adult residents. Meso-level barriers included task-centered organizational cultures, ageist institutional perspectives regarding old age and incontinence symptoms, inadequate care home management, and poor communication and teamwork among care staff. Micro-level barriers included poor knowledge and negative attitudes of care home staff and residents regarding incontinence symptoms and their management and treatment. Facilitators at the micro level included proactive and inclusive leadership skills of individuals in management roles. Conclusions: The findings of the evidence synthesis study help to outline the complexities of continence care provision in older adult care home facilities. Macro-, meso-, and micro-level influences demonstrate problematic and interrelated barriers across international contexts, indicating that improving continence care in this setting is extremely challenging due to the multiple levels at which care provision and services are impacted. Both international and national older adult social care policy-makers, researchers, and service providers must recognize this complexity, and any intervention seeking to improve continence care in older adult care home settings must be planned accordingly, with appreciation of the complex and interrelated influences. It is anticipated that the findings of the qualitative interviews will shed further light on the national context of continence care provision specific to England; data collection is ongoing*.
* Sample size is envisaged to be between 20 and 30 participants from multiple sites by Spring 2023.
Keywords: continence care, residential and nursing homes, evidence synthesis, qualitative
Procedia PDF Downloads 85
14 Designing and Simulation of the Rotor and Hub of the Unmanned Helicopter
Authors: Zbigniew Czyz, Ksenia Siadkowska, Krzysztof Skiba, Karol Scislowski
Abstract:
Today's progress in rotorcraft is mostly associated with an optimization of aircraft performance achieved by active and passive modifications of main rotor assemblies and the tail propeller. The key task is to improve their performance and improve the hover quality factor of the rotors without a change in specific fuel consumption. One of the tasks in improving the helicopter is an active optimization of the main rotor across flight stages, i.e., ascent, level flight, and descent. Active interference with the airflow around the rotor blade section can significantly change the characteristics of the aerodynamic airfoil. The efficiency of actuator systems that modify aerodynamic coefficients in current solutions is relatively high and significantly affects the increase in strength. The solution for actively changing aerodynamic characteristics assumes a periodic change of the geometric features of the blades depending on the flight stage. Changing the geometric parameters of blade warping enables an optimization of main rotor performance depending on the helicopter flight stage. Structurally, an adaptation of shape memory alloys does not significantly affect rotor blade fatigue strength, which helps reduce the costs associated with adapting the system to existing blades, and the gains from better performance can easily amortize such a modification and improve the profitability of such a structure. In order to obtain quantitative and qualitative data to solve this research problem, a number of numerical analyses have been necessary. The main problem is the selection of the design parameters of the main rotor and a preliminary optimization of its performance to improve the hover quality factor of the rotor. This design concept assumes a three-bladed main rotor with a chord of 0.07 m and radius R = 1 m. The rotor speed is a calculated parameter of the optimization function. To specify the initial distribution of geometric warping, special software was created that uses a blade-element numerical method which respects dynamic design features such as the fluctuations of a blade in its joints. A number of performance analyses as a function of rotor speed, forward speed, and altitude have been performed. The calculations were carried out for the full model assembly. This approach makes it possible to observe the behavior of components and their mutual interaction resulting from the acting forces. The key elements of each rotor are the shaft, the hub, and the pins holding the joints and blade yokes. These components are exposed to the highest loads. As a result of the analysis, the safety factor was determined at the level of k > 1.5, which gives grounds to obtain certification for the strength of the structure. The jointed rotor design has numerous moving elements in its structure. Despite the high safety factor, the places with the highest stresses, where signs of wear and tear may appear, have been indicated. The numerical analysis carried out showed that the most loaded element is the pin connecting the modular bearing of the blade yoke with the element of the horizontal oscillation joint. The stresses in this element result in a safety factor of k = 1.7. The other analysed rotor components have a safety factor of more than 2, and in the case of the shaft, this factor is more than 3. However, it must be remembered that the structure is only as strong as its weakest element.
The designed rotor for unmanned aerial vehicles, adapted to work with blades incorporating intelligent materials in their structure, meets the requirements for certification testing. Acknowledgement: This work has been financed by the Polish National Centre for Research and Development under the LIDER program, Grant Agreement No. LIDER/45/0177/L-9/17/NCBR/2018.
Keywords: main rotor, rotorcraft aerodynamics, shape memory alloy, materials, unmanned helicopter
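A heavily simplified blade-element calculation, in the spirit of the custom performance software described above, is sketched below. It assumes a constant chord, a linear lift slope, a uniform pitch angle, and no induced inflow, so the result is only an upper-bound illustration and not the authors' model:

```python
import numpy as np

def hover_thrust(omega_rpm, radius=1.0, chord=0.07, n_blades=3,
                 pitch_deg=8.0, cl_alpha=5.7, rho=1.225, n_elems=50):
    """Integrate blade-element lift over the span of a hovering rotor.
    Induced velocity is neglected, so this overestimates real thrust."""
    omega = omega_rpm * 2.0 * np.pi / 60.0           # rad/s
    r = np.linspace(0.15 * radius, radius, n_elems)  # skip the root cut-out
    dr = r[1] - r[0]
    v_t = omega * r                                   # tangential speed, m/s
    cl = cl_alpha * np.radians(pitch_deg)             # section lift coefficient
    dL = 0.5 * rho * v_t**2 * chord * cl * dr         # lift per element, N
    return n_blades * dL.sum()

# Thrust estimate for the 3-bladed, R = 1 m, c = 0.07 m rotor at 1500 rpm
print(f"{hover_thrust(1500):.0f} N")
```

A full analysis, as in the study, would add induced inflow, blade flapping and lead-lag dynamics in the joints, and the periodic warping introduced by the shape memory alloy actuators.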
Procedia PDF Downloads 157
13 Machine Learning Based Digitalization of Validated Traditional Cognitive Tests and Their Integration to Multi-User Digital Support System for Alzheimer’s Patients
Authors: Ramazan Bakir, Gizem Kayar
Abstract:
Alzheimer's disease and dementia are known to be the two most common types of neurodegenerative disease, and their visibility has been increasing over the last couple of years. As populations age all over the world, researchers expect the rate of this increase to become much higher. Unfortunately, there is no known pharmacological cure for either, although some treatments help to reduce the speed of cognitive decline. This is why non-pharmacological treatment and tracking methods have received more attention over the last five years. Many researchers, including well-known associations and hospitals, lean towards using non-pharmacological methods to support cognitive function and improve the patient's quality of life. As the dementia symptoms related to the mind, learning, memory, speaking, problem-solving, social abilities, and daily activities gradually worsen over the years, many researchers know that cognitive support should start from the very beginning of the symptoms in order to slow down the decline. At this point, the life of a patient and caregiver can be improved with some daily activities and applications. These activities include, but are not limited to, basic word puzzles, daily cleaning activities, and taking notes. These activities and their results should then be observed carefully, which is currently only possible during in-person meetings between the patient/caregiver and the MD in hospitals. These meetings can be quite time-consuming, exhausting, and financially ineffective for hospitals, medical doctors, caregivers, and especially for patients. On the other hand, digital support systems are showing positive results for all stakeholders of healthcare systems. This can be observed in countries that have introduced telemedicine systems. The biggest potential of our system lies in setting up inter-user communication in the best possible way. In our project, we propose machine learning based digitalization of validated traditional cognitive tests (e.g., MOCA, Afazi, left-right hemisphere), their analysis for high-quality follow-up, and communication systems for all stakeholders. This platform has high potential not only for patient tracking but also for making all stakeholders feel safe through all stages. As the registered hospitals assign corresponding medical doctors to the system, these MDs are able to register their own patients and assign specific tasks to each patient. With our integrated machine learning support, MDs are able to track the failure and success rates of each patient and also see general averages among similarly progressed patients. In addition, our platform supports multi-player technology, which helps patients play with their caregivers so that they feel much safer at any point they are uncomfortable. By also gamifying daily household activities, patients will be able to repeat their social tasks, and we will provide non-pharmacological reminiscence therapy (RT - life review therapy). All collected data will be mined by our data scientists and analyzed meaningfully. In addition, we will add gamification modules for caregivers based on Naomi Feil's Validation Therapy. Both behaving positively toward the patient and keeping oneself mentally healthy are important for caregivers, and we aim to provide a gamification-based therapy system for them, too.
When this project accomplishes all of the tasks described above, patients will have the chance to do many tasks at home remotely, and MDs will be able to follow them up very effectively. We propose a complete platform, and the whole project is both time- and cost-effective for supporting all stakeholders.
Keywords: alzheimer’s, dementia, cognitive functionality, cognitive tests, serious games, machine learning, artificial intelligence, digitalization, non-pharmacological, data analysis, telemedicine, e-health, health-tech, gamification
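The per-patient and cohort statistics mentioned above ("failure and success rates ... general averages among similarly progressed patients") could be computed from a simple log of digitalized test results. The schema and numbers below are hypothetical, not the platform's actual data model:

```python
import pandas as pd

# Hypothetical log of digitalized cognitive test items (1 = passed, 0 = failed)
log = pd.DataFrame({
    "patient_id": [1, 1, 1, 2, 2, 3, 3, 3],
    "stage":      ["early", "early", "early", "mid", "mid", "mid", "mid", "mid"],
    "passed":     [1, 0, 1, 1, 1, 0, 1, 0],
})

# Success rate per patient
per_patient = log.groupby("patient_id")["passed"].mean()

# Average success rate among similarly progressed patients (same stage)
per_stage = log.groupby("stage")["passed"].mean()

print(per_patient)
print(per_stage)
```

On top of such aggregates, the machine learning layer described in the abstract could, for example, flag patients whose success rate falls well below the average of their stage cohort.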
Procedia PDF Downloads 136
12 A Low-Cost Disposable PDMS Microfluidic Cartridge with Reagent Storage Silicone Blisters for Isothermal DNA Amplification
Authors: L. Ereku, R. E. Mackay, A. Naveenathayalan, K. Ajayi, W. Balachandran
Abstract:
Over the past decade, the increase in sexually transmitted infections (STIs), especially in the developing world due to high costs and a lack of sufficient medical testing, has given rise to the need for a rapid, low-cost, disposable point-of-care medical diagnostic that, most significantly, reproduces results equivalent to those achieved within centralised laboratories. This paper presents the development of a disposable PDMS microfluidic cartridge incorporating blisters filled with the reagents required for isothermal DNA amplification in clinical diagnostics and point-of-care testing. To circumvent the need for complex external microfluidic pumps, on-chip pressurised fluid reservoirs are designed using finger actuation and blister storage. The fabrication of the blisters takes into consideration three factors: material characteristics, fluid volume, and structural design. Silicone rubber is the chosen material due to its good chemical stability, considerable tear resistance, and moderate tension/compression strength. Fluid capacity and structural form go hand in hand, as the reagent volume needed for the experimental analysis determines the volume of the blisters, whereas the structural form has to be designed to provide low compression stress when deformed for fluid expulsion. Furthermore, the top and bottom sections of the blisters are embedded with miniature magnets of opposite polarity at a defined parallel distance. These magnets are needed to lock or restrain the blisters when fully compressed so as to prevent unwanted backflow as a result of elasticity. The integrated chip is bonded onto a large microscope glass slide (50 mm x 75 mm). Each part is manufactured using a 3D printed mould designed in SolidWorks software. Die-casting with 3D printed moulds is employed to form the deformable blisters by forcing a proprietary liquid silicone rubber through the positive mould cavity. The set silicone rubber is removed from the cast, prefilled with liquid reagent, and then sealed with a thin (0.3 mm) burstable layer of recast silicone rubber. The main microfluidic cartridge is fabricated using classical soft lithographic techniques. The cartridge incorporates the microchannel circuitry, a mixing chamber, an inlet port, an outlet port, a reaction chamber, and a waste chamber. Polydimethylsiloxane (PDMS, QSil 216) is mixed (ratio 10:1) and degassed using a centrifuge, then poured after the prefilled blisters are correctly positioned on the negative mould. Heat treatment at about 50 °C to 60 °C in an oven for about 3 hours is needed to achieve curing. The final chip production stage involves bonding the cured PDMS to the glass slide. A plasma corona treater device, BD20-AC (Electro-Technic Products Inc., US), is used to activate the PDMS and glass slide before they are joined, adequately compressed together, and then left in the oven overnight to ensure bonding. Two blisters in total are needed for experimentation; the first will be used for a wash buffer to remove any remaining cell debris and unbound DNA, while the second will contain 100 µL of amplification reagents. This paper will present results of chemical cell lysis, extraction using a biopolymer paper membrane, and isothermal amplification on a low-cost platform using the finger-actuated blisters for reagent storage. The platform has been shown to detect 1x10⁵ copies of Chlamydia trachomatis using Recombinase Polymerase Amplification (RPA).
Keywords: finger actuation, point of care, reagent storage, silicone blisters
Procedia PDF Downloads 368
11 Exploring Factors That May Contribute to the Underdiagnosis of Hereditary Transthyretin Amyloidosis in African American Patients
Authors: Kelsi Hagerty, Ami Rosen, Aaliyah Heyward, Nadia Ali, Emily Brown, Erin Demo, Yue Guan, Modele Ogunniyi, Brianna McDaniels, Alanna Morris, Kunal Bhatt
Abstract:
Hereditary transthyretin amyloidosis (hATTR) is a progressive, multi-systemic, and life-threatening disease caused by a disruption of the TTR protein, which transports thyroxine and retinol. This disruption causes the protein to misfold into amyloid fibrils, leading to the accumulation of the amyloid fibrils in the heart, nerves, and GI tract. Over 130 variants in the TTR gene are known to cause hATTR. The Val122Ile variant is the most common in the United States and is seen almost exclusively in people of African descent. TTR variants are inherited in an autosomal dominant fashion and have incomplete penetrance and variable expressivity. Individuals with hATTR may exhibit symptoms from as early as 30 years to as late as 80 years of age. hATTR is characterized by a wide range of clinical symptoms such as cardiomyopathy, neuropathy, carpal tunnel syndrome, and GI complications. Without treatment, hATTR leads to progressive disease and can ultimately lead to heart failure. hATTR disproportionately affects individuals of African descent; the estimated prevalence of hATTR among Black individuals in the US is 3.4%. Unfortunately, hATTR is often underdiagnosed and misdiagnosed because many symptoms of the disease overlap with other cardiac conditions. Due to the progressive nature of the disease, multi-systemic manifestations that can lead to a shortened lifespan, and the availability of free genetic testing and promising FDA-approved therapies that enhance treatability, early identification of individuals with a pathogenic hATTR variant is important, as this can significantly impact medical management for patients and their relatives. Furthermore, recent literature suggests that TTR genetic testing should be performed in all patients with suspicion of TTR-related cardiomyopathy, regardless of age, and that follow-up with genetic counseling services is recommended. Relatives of patients with hATTR benefit from genetic testing because testing can identify carriers early and allow relatives to receive regular screening and management. Despite the striking prevalence of hATTR among Black individuals, hATTR remains underdiagnosed in this patient population, and germline genetic testing for hATTR in Black individuals appears to be underused, though the reasons for this have not yet been brought to light. Historically, Black patients have experienced a number of barriers to seeking healthcare that have been hypothesized to perpetuate the underdiagnosis of hATTR, such as lack of access and mistrust of healthcare professionals. Prior research has described a myriad of factors that shape an individual's decision about whether to pursue presymptomatic genetic testing for a familial pathogenic variant, such as family closeness and communication, family dynamics, and a desire to inform other family members about potential health risks. This study explores, through 10 in-depth interviews with patients with hATTR, what factors may be contributing to the underdiagnosis of hATTR in the Black population. Participants were selected from the Emory University Amyloidosis clinic based on having a molecular diagnosis of hATTR. Interviews were recorded and transcribed verbatim, then coded using MAXQDA software. Thematic analysis was completed to draw out commonalities between participants. Upon preliminary analysis, several themes have emerged.
Barriers identified include i) misdiagnosis and a prolonged diagnostic odyssey, ii) family communication and dynamics surrounding health issues, iii) perceptions of healthcare and one's own health risks, and iv) the need for closer provider-patient relationships and communication. Overall, this study gleaned valuable insight from members of the Black community about possible factors contributing to the underdiagnosis of hATTR, as well as potential solutions for resolving this issue.
Keywords: cardiac amyloidosis, heart failure, TTR, genetic testing
Procedia PDF Downloads 97
10 Developing a Cloud Intelligence-Based Energy Management Architecture Facilitated with Embedded Edge Analytics for Energy Conservation in Demand-Side Management
Authors: Yu-Hsiu Lin, Wen-Chun Lin, Yen-Chang Cheng, Chia-Ju Yeh, Yu-Chuan Chen, Tai-You Li
Abstract:
Demand-Side Management (DSM) has the potential to reduce the electricity costs and carbon emissions associated with electricity use in modern society. A home Energy Management System (EMS), commonly used by residential consumers in the downstream sector of a smart grid to monitor, control, and optimize the energy efficiency of domestic appliances, is a system of computer-aided functionalities serving as an energy audit for residential DSM. Implementing fault detection and classification for the domestic appliances being monitored, controlled, and optimized is one of the most important steps toward preventive maintenance, such as residential air conditioning and heating preventative maintenance in residential/industrial DSM. In this study, a cloud intelligence-based green EMS built on an Internet of Things (IoT) technology stack for residential DSM is developed. In the EMS, smart sockets based on Arduino MEGA Ethernet communication, which incorporate a Real-Time Clock chip to keep track of the current time as timestamps via the Network Time Protocol, are designed and implemented to capture readings of load phenomena reflected in the sensed voltage and current signals. Also, a Network-Attached Storage device providing data access to a heterogeneous group of IoT clients via Hypertext Transfer Protocol (HTTP) methods is configured as the data store for parsed sensor readings. Lastly, a desktop computer with a WAMP software bundle (the Microsoft® Windows operating system, Apache HTTP Server, the MySQL relational database management system, and the PHP programming language) serves as a data science analytics engine providing a dynamic web app/RESTful (REpresentational State Transfer) web service for the residential DSM with Artificial Intelligence (AI)/Computational Intelligence. An abstract computing machine, the Java Virtual Machine, enables the desktop computer to run Java programs, and a mash-up of Java, the R language, and Python is well suited and configured for AI in this study. Having the ability to send real-time push notifications to IoT clients, the desktop computer implements Google-maintained Firebase Cloud Messaging to engage IoT clients across Android/iOS devices and provide a mobile notification service for residential/industrial DSM. In this study, in order to realize edge intelligence, whereby edge devices avoid network latency and the dependence on Internet connectivity for the Internet of Services while supporting secure access to data stores and providing immediate analytical and real-time actionable insights at the edge of the network, we upgrade the designed and implemented smart sockets to embedded AI Arduino ones (called embedded AIduino). To realize edge analytics with the proposed embedded AIduino for data analytics, an Arduino Ethernet shield WizNet W5100 with a micro SD card connector is used. The SD library is included for reading parsed data from and writing parsed data to an SD card, and an artificial neural network library for the Arduino MEGA, ArduinoANN, is imported and used for the locally embedded AI implementation. The embedded AIduino in this study can be developed for further applications in manufacturing industry energy management and sustainable energy management, where, for example, rotating machinery diagnostics in sustainable energy management can identify energy loss from gross misalignment and unbalance of rotating machines in power plants.
Keywords: demand-side management, edge intelligence, energy management system, fault detection and classification
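The data path from a smart socket to the HTTP data store, together with a crude edge rule of the kind an embedded device could evaluate locally, can be sketched as follows. The endpoint URL, field names, and threshold are hypothetical, and the sketch is written in Python rather than the Arduino C++ used on the actual sockets:

```python
import json
import time
import urllib.request

DATA_STORE_URL = "http://192.168.1.10/readings"   # hypothetical NAS endpoint

def post_reading(socket_id, voltage_v, current_a):
    """Send one timestamped socket reading to the HTTP data store."""
    payload = {
        "socket_id": socket_id,
        "timestamp": int(time.time()),   # stand-in for the RTC/NTP timestamp
        "voltage_v": voltage_v,
        "current_a": current_a,
    }
    req = urllib.request.Request(
        DATA_STORE_URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=5) as resp:
        return resp.status

def overload_fault(current_a, rated_a=10.0):
    """Edge rule evaluated locally: flag an appliance drawing well above rating."""
    return current_a > 1.2 * rated_a

print(overload_fault(13.1))   # True: would trigger a push notification upstream
```

In the study's architecture, the classification itself is performed by a small neural network (ArduinoANN) on the socket, with the analytics engine and Firebase Cloud Messaging handling aggregation and mobile alerts.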
Procedia PDF Downloads 250
9 Settlement Prediction in Cape Flats Sands Using Shear Wave Velocity – Penetration Resistance Correlations
Authors: Nanine Fouche
Abstract:
The Cape Flats is a low-lying, sand-covered expanse of approximately 460 square kilometres situated to the southeast of the central business district of Cape Town in the Western Cape of South Africa. The aeolian sands masking this area are often loose and compressible in the upper 1 m to 1.5 m of the surface, and there is a general exceedance of the maximum allowable settlement in these sands. The settlement of shallow foundations on Cape Flats sands is commonly predicted using the results of in-situ tests such as the SPT or DPSH due to the difficulty of retrieving undisturbed samples for laboratory testing. Varying degrees of accuracy and reliability are associated with these methods. More recently, shear wave velocity (Vs) profiles obtained from seismic testing, such as continuous surface wave (CSW) tests, are being used for settlement prediction. Such predictions have the advantage of considering the non-linear stress-strain behaviour of soil and the degradation of stiffness with increasing strain. CSW tests are rarely executed in the Cape Flats, whereas SPTs are commonly performed. For this reason, and to facilitate better settlement predictions in Cape Flats sand, equations representing shear wave velocity (Vs) as a function of SPT blow count (N60) and vertical effective stress (σv') were generated by statistical regression of site investigation data. To reveal the most appropriate method of overburden correction, analyses were performed with a separate overburden term (Pa/σv') as well as using stress-corrected shear wave velocity and SPT blow counts (correcting Vs and N60 to Vs1 and (N1)60, respectively). Shear wave velocity profiles and SPT blow count data from three sites masked by Cape Flats sands were utilised to generate 80 Vs-SPT N data pairs for analysis. Investigated terrains included sites in the suburbs of Athlone, Muizenberg, and Atlantis, all underlain by windblown deposits comprising fine and medium sand with varying fines contents. Elastic settlement analysis was also undertaken for the Cape Flats sands using a non-linear stepwise method based on small-strain stiffness estimates, which were obtained from the best Vs-N60 model and compared to settlement estimates using the general elastic solution with stiffness profiles determined using Stroud's (1989) and Webb's (1969) SPT N60-E transformation models. Stroud's method considers strain level indirectly, whereas Webb's method does not take account of the variation in elastic modulus with strain. The expression of Vs in terms of N60 and Pa/σv' derived from the Atlantis data set revealed the best fit, with R² = 0.83 and a standard error of 83.5 m/s. The less accurate Vs-SPT N relations associated with the combined data set are presumably the result of the inversion routines used in the analysis of the CSW results, which showed significant variation in relative density and stiffness with depth. The regression analyses revealed that the inclusion of a separate overburden term in the regression of Vs and N60 produces improved fits, as opposed to the stress-corrected equations, in which the R² of the regression is notably lower. It is the correction of Vs and N60 to Vs1 and (N1)60 with empirical constants 'n' and 'm' prior to regression that introduces bias with respect to overburden pressure.
When comparing settlement prediction methods, both Stroud's method (considering strain level indirectly) and the small-strain stiffness method predict higher stiffnesses for medium dense and dense profiles than Webb's method, which takes no account of strain level in the determination of soil stiffness. Webb's method appears to be suitable for loose sands only. The Versak software appears to underestimate differences in settlement between square and strip footings of similar width. In conclusion, settlement analysis using small-strain stiffness data from the proposed Vs-N60 model for Cape Flats sands provides a way to take account of the non-linear stress-strain behaviour of the sands when calculating settlement.
Keywords: sands, settlement prediction, continuous surface wave test, small-strain stiffness, shear wave velocity, penetration resistance
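The regression form that performed best, Vs expressed in terms of N60 with a separate overburden term Pa/σv', can be fitted with a short least-squares script. The power-law form and the data below are illustrative assumptions, not the study's data set or published coefficients:

```python
import numpy as np

# Illustrative data pairs: SPT blow count, vertical effective stress (kPa), Vs (m/s)
N60   = np.array([8, 12, 15, 22, 30, 41], dtype=float)
sig_v = np.array([40, 60, 85, 120, 160, 210], dtype=float)
Vs    = np.array([150, 175, 190, 225, 255, 290], dtype=float)
Pa    = 100.0   # atmospheric pressure, kPa

# Fit ln(Vs) = ln(a) + b*ln(N60) + c*ln(Pa/sigma_v') by ordinary least squares
X = np.column_stack([np.ones_like(N60), np.log(N60), np.log(Pa / sig_v)])
coef, *_ = np.linalg.lstsq(X, np.log(Vs), rcond=None)
a, b, c = np.exp(coef[0]), coef[1], coef[2]
print(f"Vs ~ {a:.1f} * N60^{b:.2f} * (Pa/sigma_v')^{c:.2f}")

# Goodness of fit in log space
Vs_hat = a * N60**b * (Pa / sig_v)**c
ss_res = np.sum((np.log(Vs) - np.log(Vs_hat)) ** 2)
ss_tot = np.sum((np.log(Vs) - np.log(Vs).mean()) ** 2)
print("R^2 =", round(1 - ss_res / ss_tot, 3))
```

The alternative formulation tested in the study, regressing the stress-corrected quantities Vs1 and (N1)60 directly, differs only in applying the overburden correction with empirical exponents before the regression rather than estimating it as a free coefficient.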
Procedia PDF Downloads 174
8 Climate Change Threats to UNESCO-Designated World Heritage Sites: Empirical Evidence from Konso Cultural Landscape, Ethiopia
Authors: Yimer Mohammed Assen, Abiyot Legesse Kura, Engida Esyas Dube, Asebe Regassa Debelo, Girma Kelboro Mensuro, Lete Bekele Gure
Abstract:
Climate change has recently posed severe threats to many cultural landscapes of UNESCO world heritage sites. The UNESCO State of Conservation (SOC) reports categorize flooding, temperature increase, and drought as threats to cultural landscapes. This study aimed to examine variations and trends in rainfall and temperature extreme events and their threats to the UNESCO-designated Konso Cultural Landscape in southern Ethiopia. The study used dense merged satellite-gauge station rainfall data (1981-2020) with a spatial resolution of 4 km by 4 km and observed maximum and minimum temperature data (1987-2020). Qualitative data were also gathered from cultural leaders, local administrators, and religious leaders using structured interview checklists. The spatial patterns, coefficients of variation, standardized anomalies, trends, and magnitudes of change of rainfall and temperature extreme events at both annual and seasonal levels were computed using the Mann-Kendall trend test and Sen's slope estimator under the CDT package. The standardized precipitation index (SPI) was also used to calculate drought severity, frequency, and trend maps. The data gathered from key informant interviews and focus group discussions were coded and analyzed thematically to complement the statistical findings. Thematic areas that explain the impacts of extreme events on the cultural landscape were chosen for coding. The thematic analysis was conducted using NVivo software. The findings revealed that rainfall was highly variable and unpredictable, resulting in extreme drought and flooding. There were significant (P<0.05) increasing trends in heavy rainfall (R10mm and R20mm) and the total amount of rain on wet days (PRCPTOT), which might have resulted in flooding. The study also confirmed that the absolute temperature extreme indices (TXx, TXn, and TNx) and the percentile-based temperature extreme indices (TX90p, TN90p, TX10p, and TN10p) showed significant (P<0.05) increasing trends, which are signals of warming of the study area. The results revealed that the frequency and severity of drought were more pronounced at the 3-month (katana/hageya seasons) timescale than at the 12-month (annual) timescale. The highest number of droughts in 100 years is projected at the 3-month timescale across the study area. The findings also showed that frequent drought has led to the loss of grasses used for making traditional individual houses and multipurpose communal houses (pafta), food insecurity, migration, loss of biodiversity, and commodification of stones from the terraces. On the other hand, the increasing trends in rainfall extreme indices resulted in destruction of terraces, soil erosion, loss of life, and damage to property. The study shows that a persistent decline in farmland productivity, due to erratic and extreme rainfall and frequent drought occurrences, forced the local people to participate in non-farm activities and retreat from the daily preservation and management of their landscape. Overall, the increasing rainfall and temperature extremes, coupled with the prevalence of drought, are thought to have an impact on the sustainability of the cultural landscape by disrupting the ecosystem services and livelihoods of the community. Therefore, more localized adaptation and mitigation strategies for the changing climate are needed to maintain the sustainability of the Konso cultural landscape as a global cultural treasure and to strengthen the resilience of smallholder farmers.
Keywords: adaptation, cultural landscape, drought, extreme indices
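The two drought and trend indicators used above have standard formulations that can be reproduced in a few lines. The sketch below shows a simplified 3-month SPI (gamma fit pooled over all months rather than per calendar month) and the Mann-Kendall S statistic on synthetic rainfall; it is illustrative only and is not the CDT-based workflow of the study:

```python
import numpy as np
from scipy import stats

def spi(monthly_precip, scale=3):
    """Standardized Precipitation Index: aggregate over `scale` months,
    fit a gamma distribution, and map to standard normal quantiles."""
    totals = np.convolve(monthly_precip, np.ones(scale), mode="valid")
    shape, _loc, gscale = stats.gamma.fit(totals[totals > 0], floc=0)
    cdf = stats.gamma.cdf(totals, shape, loc=0, scale=gscale)
    return stats.norm.ppf(np.clip(cdf, 1e-6, 1 - 1e-6))

def mann_kendall_s(x):
    """Mann-Kendall S statistic: positive values indicate an increasing trend."""
    x = np.asarray(x, dtype=float)
    return int(sum(np.sign(x[j] - x[i])
                   for i in range(len(x) - 1)
                   for j in range(i + 1, len(x))))

rng = np.random.default_rng(42)
monthly_rain = rng.gamma(2.0, 40.0, size=480)     # 40 synthetic years
print(spi(monthly_rain, scale=3)[:6])             # first 3-month SPI values
print(mann_kendall_s(monthly_rain))               # sign of the rainfall trend
```

An operational analysis, as in the study, fits the gamma distribution separately for each calendar month, evaluates the significance of S against its variance, and pairs the trend test with Sen's slope to quantify the magnitude of change.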
Procedia PDF Downloads 25
7 Full Characterization of Heterogeneous Antibody Samples under Denaturing and Native Conditions on a Hybrid Quadrupole-Orbitrap Mass Spectrometer
Authors: Rowan Moore, Kai Scheffler, Eugen Damoc, Jennifer Sutton, Aaron Bailey, Stephane Houel, Simon Cubbon, Jonathan Josephs
Abstract:
Purpose: MS analysis of monoclonal antibodies (mAbs) at the protein and peptide levels is critical during the development and production of biopharmaceuticals. The compositions of current-generation therapeutic proteins are often complex due to various modifications which may affect efficacy. Intact proteins analyzed by MS are detected in higher charge states that also add complexity to the mass spectra. Protein analysis under native or native-like conditions, with zero or minimal organic solvent and neutral or weakly acidic pH, decreases the charge state values, resulting in mAb detection at higher m/z ranges with more spatial resolution. Methods: Three commercially available mAbs were used for all experiments. Intact proteins were desalted online using size exclusion chromatography (SEC) or reversed-phase chromatography coupled online with a mass spectrometer. For streamlined use of the LC-MS platform, we used a single SEC column and alternately selected specific mobile phases to perform separations under either denaturing or native-like conditions: buffer A (20% ACN, 0.1% FA) with buffer B (100 mM ammonium acetate). For peptide analysis, mAbs were proteolytically digested with and without prior reduction and alkylation. The mass spectrometer used for all experiments was a commercially available Thermo Scientific™ hybrid Quadrupole-Orbitrap™ mass spectrometer equipped with the new BioPharma option, which includes a new High Mass Range (HMR) mode that allows for improved high-mass transmission and mass detection up to 8000 m/z. Results: We have analyzed the profiles of three mAbs under denaturing and native conditions by direct infusion with offline desalting and with online desalting via size exclusion and reversed-phase columns. The presence of high salt under denaturing conditions was found to influence the observed charge state envelope and impact mass accuracy after spectral deconvolution. The significantly lower charge states observed under native conditions improve the spatial resolution of protein signals and have significant benefits for the analysis of antibody mixtures, e.g., lysine variants, degradants, or sequence variants. This type of analysis requires the detection of masses beyond the standard mass range, up to 6000 m/z, and therefore the extended capabilities available in the new HMR mode. We have compared each antibody sample analyzed individually with mixtures at various relative concentrations. For this type of analysis, we observed that apparent native structures persist and that ESI benefits from the addition of low amounts of acetonitrile and formic acid in combination with the ammonium acetate-buffered mobile phase. For analyses at the peptide level, we analyzed reduced/alkylated and non-reduced proteolytic digests of the individual antibodies separated via reversed-phase chromatography, aiming to retrieve as much information as possible regarding sequence coverage, disulfide bridges, post-translational modifications such as various glycans, sequence variants, and their relative quantification. All data acquired were submitted to a single software package for analysis, aiming to obtain a complete picture of the molecules analyzed. Here we demonstrate the capabilities of the mass spectrometer to fully characterize homogeneous and heterogeneous therapeutic proteins on a single platform.
Conclusion: Full characterization of heterogeneous intact protein mixtures by improved mass separation on a quadrupole-Orbitrap™ mass spectrometer with extended capabilities has been demonstrated.
Keywords: disulfide bond analysis, intact analysis, native analysis, mass spectrometry, monoclonal antibodies, peptide mapping, post-translational modifications, sequence variants, size exclusion chromatography, therapeutic protein analysis, UHPLC
Procedia PDF Downloads 361
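As a back-of-the-envelope illustration of why native conditions push intact-mAb signals beyond the standard mass range, the ESI relation m/z = (M + z·1.00728)/z can be evaluated for typical charge states. The sketch below is not taken from the abstract: the 148 kDa intact mass and the charge-state ranges are assumed, literature-typical values for an IgG-class mAb.

```python
# Hypothetical illustration: expected m/z windows for an intact mAb under
# denaturing vs. native ESI conditions. Mass and charge ranges are assumed
# typical values, not figures reported in the abstract.
M_MAB = 148_000.0      # Da, assumed intact mass of an IgG-class mAb
M_PROTON = 1.00728     # Da

def mz(mass_da: float, z: int) -> float:
    """m/z of the [M + zH]z+ ion."""
    return (mass_da + z * M_PROTON) / z

denaturing = range(40, 61)   # typical charge states under denaturing conditions
native = range(20, 31)       # typical charge states under native-like conditions

for label, charges in (("denaturing", denaturing), ("native", native)):
    lo, hi = mz(M_MAB, max(charges)), mz(M_MAB, min(charges))
    print(f"{label:>10}: z = {max(charges)}..{min(charges)} -> {lo:7.0f} to {hi:7.0f} m/z")
```

Under these assumptions the native charge states fall roughly between 4900 and 7400 m/z, which is the practical motivation for the extended mass range of the HMR mode described above.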
6 Development Programmes Requirements for Managing and Supporting the Ever-Dynamic Job Roles of Middle Managers in Higher Education Institutions: The Espousal Demanded from Human Resources Department; Case Studies of a New University in United Kingdom
Authors: Mohamed Sameer Mughal, Andrew D. Ross, Damian J. Fearon
Abstract:
Background: The fast-paced, changing landscape of UK Higher Education Institutions (HEIs) poses changes and challenges that affect Middle Managers (MM) in their job roles. MM contribute to the success of HEIs by maintaining organizational equilibrium and translating strategies from senior staff into operational directives for junior staff. However, the data analyzed from the semi-structured interviews in this study show that the MM job role is becoming more complex due to changes and challenges that create colossal pressures and workloads in day-to-day working. Current development programme provision by Human Resources (HR) departments in such HEIs is neither feasible nor applicable and does not match the true essence and requirements of MM, who suggest that programmes offered by HR are too generic to suit their precise needs and that tailor-made espousal is required for them to work effectively in their pertinent job roles. Methodologies: This study aims to capture the Development Needs (DN) of MM by means of a conceptual model as the conclusive part of the research, which is divided into 2 phases. Phase 1 was initiated by carrying out 2 pilot interviews with a retired Emeritus-status professor and an HR programmes development coordinator. Key themes from the pilot and the literature review fed into the formulation of a set of 22 questions (Kvale and Brinkmann) in the form of an interview questionnaire used during qualitative data collection. The data strategy and collection consisted of purposeful sampling, with 12 semi-structured interviews (n=12) lasting approximately an hour each. The MM interviewed were at faculty and departmental levels and included deans (n=2), heads of departments (n=4), subject leaders (n=2), and programme leaders (n=4). Participant recruitment was carried out via emails and a snowballing technique. The interview data were transcribed (verbatim) and managed using Computer-Assisted Qualitative Data Analysis with NVivo ver. 11 software. Data were meticulously analyzed using Miles and Huberman's inductive approach of positivistic-style grounded theory, whereby key themes and categories emerged from the rich data collected. The data were precisely coded and classified into case studies (Robert Yin), with a main case study, sub-cases (4 classes of MM) and embedded cases (12 individual MM). Major Findings: An interim conceptual model emerged from analyzing the data, with main concepts that included key performance indicators (KPIs), HEI effectiveness and outlook, practices, processes and procedures, support mechanisms, student events, rules, regulations and policies, career progression, reporting/accountability, changes and challenges, and lastly skills and attributes. Conclusion: Dynamic elements affecting MM include increases in government pressures, student numbers, irrelevant development programmes, bureaucratic structures, transparency and accountability, organization policies, skills sets… These can only be confronted by employing structured development programmes originated by HR that are not provided generically. Future Work: Phase 2 (quantitative method) of the study plans to validate the interim conceptual model externally through a fully completed online survey questionnaire (Bram Oppenheim) from external HEIs (n=150). The total sample targeted is 1500 MM. 
The author's contribution focuses on enhancing management theory and narrowing the gap between HR and MM in development programme provision.
Keywords: development needs (DN), higher education institutions (HEIs), human resources (HR), middle managers (MM)
Procedia PDF Downloads 230
5 A Spatial Repetitive Controller Applied to an Aeroelastic Model for Wind Turbines
Authors: Riccardo Fratini, Riccardo Santini, Jacopo Serafini, Massimo Gennaretti, Stefano Panzieri
Abstract:
This paper presents a nonlinear differential model of a three-bladed horizontal axis wind turbine (HAWT) suited for control applications. It is based on an 8-DOF, lumped-parameter structural dynamics model coupled with quasi-steady sectional aerodynamics. In particular, using the Euler-Lagrange equation (energetic variation approach), the authors derive, and successively validate, such a model. For the derivation of the aerodynamic model, Greenberg's theory, an extension of the theory proposed by Theodorsen to the case of thin airfoils undergoing pulsating flows, is used. Specifically, in this work, the authors restrict that theory under the hypothesis of low perturbation reduced frequency k, which causes the lift deficiency function C(k) to be real and equal to 1. Furthermore, the expressions of the aerodynamic loads are obtained using the quasi-steady strip theory (Hodges and Ormiston), as a function of the chordwise and normal components of the relative velocity between flow and airfoil, Ut and Up, their derivatives, and the section angular velocity ε˙. For the validation of the proposed model, the authors carried out open- and closed-loop simulations of a 5 MW HAWT, characterized by radius R = 61.5 m, mean chord c = 3 m, and nominal angular velocity Ωn = 1.266 rad/s. The first analysis performed is the steady-state solution, where a uniform wind Vw = 11.4 m/s is considered and a collective pitch angle θ = 0.88° is imposed. During this step, the authors noticed that the proposed model is intrinsically periodic due to the effect of the wind and of the gravitational force. In order to reject this periodic trend in the model dynamics, the authors propose a collective repetitive control algorithm coupled with a PD controller. In particular, when the reference command to be tracked and/or the disturbance to be rejected are periodic signals with a fixed period, repetitive control strategies can be applied due to their high precision, simple implementation and little performance dependency on system parameters. The functional scheme of a repetitive controller is quite simple: given a periodic reference command, it is composed of a control block Crc(s), usually added to an existing feedback control system, which contains a pure time-delay term e^(−τs) in a positive feedback loop and a low-pass filter q(s). It should be noticed that, while the time-delay term reduces the stability margin, the low-pass filter is added to ensure stability. It is worth noting that, in this work, the authors propose a phase shifting for the controller, and the delay term is modified as e^(−(T−γk)s), where T is the period of the signal and γk is a phase shift of k samples of the same periodic signal. The phase-shifting technique is particularly useful in non-minimum-phase systems, such as flexible structures, because it allows the iterative algorithm to reach convergence even at high frequencies. Notice that, in our case study, the shift of k samples depends both on the rotor angular velocity Ω and on the rotor azimuth angle Ψ: we refer to this controller as a spatial repetitive controller. The collective repetitive controller has also been coupled with a C(s) = PD(s) controller, in order to damp oscillations of the blades. The performance of the spatial repetitive controller is compared with that of an industrial PI controller. 
In particular, starting from a wind speed Vw = 11.4 m/s, the controller is asked to maintain the nominal angular velocity Ωn = 1.266 rad/s after an instantaneous increase of the wind speed (Vw = 15 m/s). Then, a purely periodic external disturbance is introduced in order to stress the capabilities of the repetitive controller. The results of the simulations show that, contrary to a simple PI controller, the spatial repetitive-PD controller has the capability to reject both external disturbances and the periodic trend in the model dynamics. Finally, the nominal value of the angular velocity is reached, in accordance with results obtained with commercial software for a turbine of the same type.
Keywords: wind turbines, aeroelasticity, repetitive control, periodic systems
Procedia PDF Downloads 248
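The repetitive-control block described in the preceding abstract can be sketched in discrete time as u_rc[k] = q·u_rc[k−N+g] + kr·e[k−N+g], i.e. a delay of one (phase-shifted) period in a positive feedback loop, added on top of a PD term. The toy example below is only a structural illustration under assumed values: the plant is a generic first-order system rather than the aeroelastic HAWT model, a constant 0 < q < 1 stands in for the low-pass filter q(s), kr is an illustrative learning gain, and the period N and shift g are fixed, whereas in the paper's spatial scheme they would follow from the rotor speed Ω and azimuth Ψ.

```python
import numpy as np

class RepetitivePD:
    """Sketch of a discrete repetitive controller with a shortened (phase-shifted)
    delay, u_rc[k] = q*u_rc[k-N+g] + kr*e[k-N+g], combined with a PD term.
    A constant 0 < q < 1 stands in for the low-pass filter q(s); kr is an
    illustrative learning gain; N is the number of samples per disturbance
    period and g the phase shift in samples (fixed here for simplicity)."""

    def __init__(self, N, g=0, q=0.99, kr=1.0, kp=2.0, kd=0.05, dt=0.01):
        self.N, self.g, self.q, self.kr = N, g, q, kr
        self.kp, self.kd, self.dt = kp, kd, dt
        self.u_buf = np.zeros(N)   # one period of repetitive-control history
        self.e_buf = np.zeros(N)   # one period of error history
        self.k, self.e_prev = 0, 0.0

    def update(self, e):
        idx = (self.k - self.N + self.g) % self.N        # sample from one (shortened) period ago
        u_rc = self.q * self.u_buf[idx] + self.kr * self.e_buf[idx]
        u_pd = self.kp * e + self.kd * (e - self.e_prev) / self.dt
        self.u_buf[self.k % self.N] = u_rc               # store for use one period later
        self.e_buf[self.k % self.N] = e
        self.e_prev, self.k = e, self.k + 1
        return u_rc + u_pd

# Toy closed loop: first-order plant dy/dt = -y + u + d with a periodic disturbance.
ctrl = RepetitivePD(N=200, g=2)
y, err = 0.0, []
for k in range(4000):
    d = 0.3 * np.sin(2 * np.pi * k / 200)    # periodic disturbance, period N samples
    e = 0.0 - y                              # regulate the output to zero
    u = ctrl.update(e)
    y += ctrl.dt * (-y + u + d)              # explicit Euler step of the toy plant
    err.append(e)

print("RMS error, first period:", np.sqrt(np.mean(np.square(err[:200]))))
print("RMS error, last period: ", np.sqrt(np.mean(np.square(err[-200:]))))
```

With these toy values the periodic component of the regulation error shrinks over successive disturbance periods, which is the qualitative behaviour the abstract reports for the spatial repetitive-PD controller.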
4 Design and Construction of a Solar Dehydration System as a Technological Strategy for Food Sustainability in Difficult-to-Access Territories
Authors: Erika T. Fajardo-Ariza, Luis A. Castillo-Sanabria, Andrea Nieto-Veloza, Carlos M. Zuluaga-Domínguez
Abstract:
The growing emphasis on sustainable food production and preservation has driven the development of innovative solutions to minimize postharvest losses and improve market access for small-scale farmers. This project focuses on designing, constructing, and selecting materials for solar dryers in certain regions of Colombia where inadequate infrastructure limits access to major commercial hubs. Postharvest losses pose a significant challenge, impacting food security and farmer income. Addressing these losses is crucial for enhancing the value of agricultural products and supporting local economies. A comprehensive survey of local farmers revealed substantial challenges, including limited market access, inefficient transportation, and significant postharvest losses. For crops such as coffee, bananas, and citrus fruits, losses range from 0% to 50%, driven by factors like labor shortages, adverse climatic conditions, and transportation difficulties. To address these issues, the project prioritized selecting effective materials for the solar dryer. Various materials (recovered acrylic, original acrylic, glass, and polystyrene) were tested for their performance. The tests showed that recovered acrylic and glass were most effective in increasing the temperature difference between the interior and the external environment. The solar dryer was designed using Fusion 360® software (Autodesk, USA) and adhered to architectural guidelines from Architectural Graphic Standards. It features up to sixteen aluminum trays, each with a maximum load capacity of 3.5 kg, arranged in two levels to optimize drying efficiency. The constructed dryer was then tested with two locally available plant materials: green plantains (Musa paradisiaca L.) and snack bananas (Musa AA Simonds). To monitor performance, thermo-hygrometers and an Arduino system recorded internal and external temperature and humidity at one-minute intervals. Despite challenges such as adverse weather conditions and delays in local government funding, the active involvement of local producers was a significant advantage, fostering ownership and understanding of the project. The solar dryer operated under conditions of 31°C dry-bulb temperature (Tbs), 55% relative humidity, and 21°C wet-bulb temperature (Tbh). The drying curves showed a consistent drying period, with critical moisture content observed between 200 and 300 minutes, followed by a sharp decrease in moisture loss, reaching an equilibrium point after 3,400 minutes. Although the solar dryer requires more time and is highly dependent on atmospheric conditions, it can approach the efficiency of an electric dryer when properly optimized. The successful design and construction of solar dryer systems in difficult-to-access areas represent a significant advancement in agricultural sustainability and postharvest loss reduction. By choosing effective materials such as recovered acrylic and implementing a carefully planned design, the project provides a valuable tool for local farmers. The initiative not only improves the quality and marketability of agricultural products but also offers broader environmental benefits, such as reduced reliance on fossil fuels and decreased waste. Additionally, it supports economic growth by enhancing the value of crops and potentially increasing farmer income. 
The successful implementation and testing of the dryer, combined with the engagement of local stakeholders, highlight its potential for replication and positive impact in similar contexts.
Keywords: drying technology, postharvest loss reduction, solar dryers, sustainable agriculture
Procedia PDF Downloads 28
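The drying curves discussed above are commonly reduced to the dimensionless moisture ratio MR = (M − Me)/(M0 − Me). The snippet below is a minimal sketch on synthetic data: the abstract does not publish the raw measurements, so the exponential curve, its time constant and the 2% equilibrium threshold are assumptions used only to show how a logged series could be turned into a drying curve and an equilibrium-time estimate.

```python
import numpy as np

def moisture_ratio(m, m_eq):
    """Dimensionless moisture ratio MR = (M - Me) / (M0 - Me)."""
    return (m - m_eq) / (m[0] - m_eq)

def time_to_equilibrium(t_min, mr, tol=0.02):
    """First time at which MR drops below `tol` (illustrative threshold)."""
    below = np.where(mr < tol)[0]
    return t_min[below[0]] if below.size else None

# Hypothetical readings: minutes since start and moisture content (dry basis, kg/kg).
t = np.arange(0, 3600, 60.0)
m = 2.8 * np.exp(-t / 700.0) + 0.15        # synthetic drying curve for demonstration
mr = moisture_ratio(m, m_eq=0.15)
print("equilibrium reached at ~", time_to_equilibrium(t, mr), "min")
```

In practice the moisture series would come from periodic tray weighings taken alongside the temperature and humidity logs recorded by the Arduino system.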
3 Open Science Philosophy, Research and Innovation
Authors: C. Ardil
Abstract:
Open Science translates the understanding and application of various theories and practices in open science philosophy, systems, paradigms and epistemology. Open Science originates with the premise that universal scientific knowledge is a product of a collective scholarly and social collaboration involving all stakeholders and that knowledge belongs to the global society. Scientific outputs generated by public research are a public good that should be available to all at no cost and without barriers or restrictions. Open Science has the potential to increase the quality, impact and benefits of science and to accelerate the advancement of knowledge by making it more reliable, more efficient and accurate, better understandable by society and responsive to societal challenges, and has the potential to enable growth and innovation through the reuse of scientific results by all stakeholders at all levels of society, and ultimately to contribute to the growth and competitiveness of global society. Open Science is a global movement to improve accessibility to and reusability of research practices and outputs. In its broadest definition, it encompasses open access to publications, open research data and methods, open source, open educational resources, open evaluation, and citizen science. The implementation of open science provides an excellent opportunity to renegotiate the social roles and responsibilities of publicly funded research and to rethink the science system as a whole. Open Science is the practice of science in such a way that others can collaborate and contribute, where research data, lab notes and other research processes are freely available, under terms that enable reuse, redistribution and reproduction of the research and its underlying data and methods. Open Science represents a novel systematic approach to the scientific process, shifting from the standard practice of publishing research results in scientific publications towards sharing and using all available knowledge at an earlier stage in the research process, based on cooperative work and the diffusion of scholarly knowledge with no barriers or restrictions. Open Science refers to efforts to make the primary outputs of publicly funded research (publications and research data) publicly accessible in digital format with no limitations. Open Science is about extending the principles of openness to the whole research cycle, fostering sharing and collaboration as early as possible, thus entailing a systemic change to the way science and research are done. Open Science is the ongoing transition in how research is carried out, disseminated, deployed, and transformed to make scholarly research more open, global, collaborative, creative and closer to society. Open Science involves various movements aiming to remove the barriers to sharing any kind of output, resources, methods or tools, at any stage of the research process. Open Science embraces open access to publications, research data, source software, collaboration, peer review, notebooks, educational resources, monographs, citizen science, or research crowdfunding. The recognition and adoption of open science practices, including open science policies that increase open access to scientific literature and encourage data and code sharing, is increasing in the open science philosophy. 
Revolutionary open science policies are motivated by ethical, moral or utilitarian arguments, such as the right to access the digital research literature for open source research or science data accumulation, research indicators, transparency in the field of academic practice, and reproducibility. Open science philosophy is adopted primarily to demonstrate the benefits of open science practices. Researchers use open science applications to their own advantage in order to obtain more offers, increase citations, and attract media attention, potential collaborators, career opportunities, donations and funding. In open science philosophy, open data findings are evidence that open science practices provide significant benefits to researchers in scientific research creation, collaboration, communication, and evaluation compared with more traditional closed science practices. Open science also raises concerns, such as the rigor of peer review, practical research factors such as financing and career development, and the sacrifice of author rights. Therefore, researchers are advised to implement open science research within the framework of existing academic evaluation and incentives. As a result, open science research issues are addressed in the areas of publishing, financing, collaboration, resource management and sharing, career development, and the discussion of open science questions and conclusions.
Keywords: Open Science, Open Science Philosophy, Open Science Research, Open Science Data
Procedia PDF Downloads 129
2 Numerical Simulation of Von Karman Swirling Bioconvection Nanofluid Flow from a Deformable Rotating Disk
Authors: Ali Kadir, S. R. Mishra, M. Shamshuddin, O. Anwar Beg
Abstract:
Motivation: Rotating disk bioreactors are fundamental to numerous medical/biochemical engineering processes, including oxygen transfer, chromatography, purification and swirl-assisted pumping. The modern upsurge in biologically enhanced engineering devices has embraced new phenomena, including bioconvection of micro-organisms (phototactic, oxytactic, gyrotactic, etc.). The proven thermal-performance superiority of nanofluids, i.e., base fluids doped with engineered nanoparticles, has also stimulated widespread adoption in biomedical designs. Motivated by these emerging applications, we present a numerical thermofluid dynamic simulation of the transport phenomena in bioconvection nanofluid rotating disk bioreactor flow. Methodology: We study analytically and computationally the time-dependent, three-dimensional viscous gyrotactic bioconvection in swirling nanofluid flow from a rotating disk configuration. The disk is also deformable, i.e., able to extend (stretch) in the radial direction. Stefan blowing is included. The Buongiorno dilute nanofluid model is adopted, wherein Brownian motion and thermophoresis are the dominant nanoscale effects. The primitive conservation equations for mass, radial, tangential and axial momentum, heat (energy), nanoparticle concentration and micro-organism density function are formulated in a cylindrical polar coordinate system with appropriate wall and free-stream boundary conditions. A mass convective condition is also incorporated at the disk surface. Forced convection is considered, i.e., buoyancy forces are neglected. This highly nonlinear, strongly coupled system of unsteady partial differential equations is normalized with the classical Von Karman and other transformations to reduce the boundary value problem (BVP) to an ordinary differential system, which is solved with the efficient Adomian decomposition method (ADM). Validation with earlier Runge-Kutta shooting computations in the literature is also conducted. Extensive computations are presented (with the aid of MATLAB symbolic software) for the radial and circumferential velocity components, temperature, nanoparticle concentration, micro-organism density number and the gradients of these functions at the disk surface (radial local skin friction, local circumferential skin friction, local Nusselt number, local Sherwood number, motile micro-organism mass transfer rate). Main Findings: Increasing the radial stretching parameter decreases the radial velocity and radial skin friction, reduces the azimuthal velocity and skin friction, and decreases the local Nusselt number and the motile micro-organism mass wall flux, whereas it increases the nanoparticle local Sherwood number. Disk deceleration accelerates the radial flow, damps the azimuthal flow, decreases temperatures and the thermal boundary layer thickness, depletes the nanoparticle concentration magnitudes (and the associated nanoparticle species boundary layer thickness), and furthermore decreases the micro-organism density number and the gyrotactic micro-organism species boundary layer thickness. Increasing Stefan blowing accelerates the radial and azimuthal (circumferential) flow, elevates the temperature of the nanofluid, and boosts the nanoparticle concentration (volume fraction) and gyrotactic micro-organism density number magnitudes, whereas suction generates the reverse effects. Increasing the suction effect reduces the radial and azimuthal skin friction, the local Nusselt number, and the motile micro-organism wall mass flux, whereas it enhances the nanoparticle species local Sherwood number. 
Conclusions: Important transport characteristics of relevance to real nanotechnological bioreactor systems, not discussed in previous works, are identified. ADM is shown to achieve very rapid convergence and highly accurate solutions and shows excellent promise in simulating swirling, multi-physical nano-bioconvection fluid dynamics problems. Furthermore, it provides an excellent complement to more general commercial computational fluid dynamics simulations.
Keywords: bio-nanofluids, rotating disk bioreactors, Von Karman swirling flow, numerical solutions
Procedia PDF Downloads 156
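For orientation, the reduced ordinary differential system produced by the Von Karman similarity transformation can be illustrated in its classical rigid-disk limit (no stretching, blowing, heat, nanoparticle or micro-organism equations), which is the skeleton onto which the abstract's additional transport equations are grafted. The sketch below solves that classical system with a collocation BVP solver rather than the Adomian decomposition or Runge-Kutta shooting methods used by the authors; the guess profiles and the truncated domain η ∈ [0, 12] are assumptions.

```python
import numpy as np
from scipy.integrate import solve_bvp

# Classical Von Karman swirling-flow similarity equations (rigid, non-stretching disk):
#   2F + H' = 0,  F'' - H F' - F^2 + G^2 = 0,  G'' - H G' - 2 F G = 0
# with F(0) = 0, G(0) = 1, H(0) = 0 and F, G -> 0 as eta -> infinity.
def rhs(eta, y):
    F, dF, G, dG, H = y
    return np.vstack([dF,
                      H * dF + F**2 - G**2,   # F''
                      dG,
                      H * dG + 2 * F * G,     # G''
                      -2 * F])                # H'

def bc(ya, yb):
    return np.array([ya[0], ya[2] - 1.0, ya[4], yb[0], yb[2]])

eta = np.linspace(0.0, 12.0, 60)
y0 = np.zeros((5, eta.size))
y0[0] = 0.2 * eta * np.exp(-eta)            # rough guess for the radial outflow F
y0[1] = 0.2 * (1.0 - eta) * np.exp(-eta)
y0[2] = np.exp(-eta)                        # G decays from 1 to 0
y0[3] = -np.exp(-eta)
y0[4] = -0.88 * (1.0 - np.exp(-eta))        # axial inflow tends to a constant

sol = solve_bvp(rhs, bc, eta, y0, tol=1e-6)
print("F'(0) =", sol.sol(0.0)[1])   # classical value ~ 0.51
print("G'(0) =", sol.sol(0.0)[3])   # classical value ~ -0.62
print("H(inf) =", sol.y[4, -1])     # classical value ~ -0.88
```

Recovering the classical wall-gradient constants is a quick sanity check of the reduced system before the extra physics (stretching, Stefan blowing, energy, nanoparticle and micro-organism equations) is layered on top.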
1 Developing VR-Based Neurorehabilitation Support Tools: A Step-by-Step Approach for Cognitive Rehabilitation and Pain Distraction during Invasive Techniques in Hospital Settings
Authors: Alba Prats-Bisbe, Jaume López-Carballo, David Leno-Colorado, Alberto García Molina, Alicia Romero Marquez, Elena Hernández Pena, Eloy Opisso Salleras, Raimon Jané Campos
Abstract:
Neurological disorders are a leading cause of disability and premature mortality worldwide. Neurorehabilitation (NRHB) is a clinical process aimed at reducing functional impairment, promoting societal participation, and improving the quality of life of affected individuals. Virtual reality (VR) technology is emerging as a promising NRHB support tool. Its immersive nature fosters a strong sense of agency and embodiment, motivating patients to engage in meaningful tasks and increasing adherence to therapy. However, the clinical benefits of VR interventions are challenging to determine due to the high heterogeneity among health applications. This study explores a stepwise development approach for creating VR-based tools to assist individuals with neurological disorders in medical practice, aiming to enhance reproducibility, facilitate comparison, and promote the generalization of findings. Building on previous research, the step-by-step methodology encompasses: Needs Identification– conducting cross-disciplinary meetings to brainstorm problems and solutions and to address barriers. Intervention Definition– defining the target population, setting goals, and conceptualizing the VR system (equipment and environments). Material Selection and Placement– choosing appropriate hardware and software, placing the device within the hospital setting, and testing the equipment. Co-design– collaboratively creating VR environments, user interfaces, and data-management strategies. Prototyping– developing VR prototypes, conducting user testing, and making iterative redesigns. Usability and Feasibility Assessment– designing protocols and conducting trials with stakeholders in the hospital setting. Efficacy Assessment– conducting clinical trials to evaluate outcomes and long-term effects. Cost-Effectiveness Validation– assessing reproducibility, sustainability, and the balance between costs and benefits. NRHB is complex due to the multifaceted needs of patients and the interdisciplinary healthcare architecture. VR has the potential to support various applications, such as motor skill training, cognitive tasks, pain management, unilateral spatial neglect (diagnosis and treatment), mirror therapy, and ecologically valid activities of daily living. Following this methodology was crucial for launching a VR-based system in a real hospital environment. Collaboration with neuropsychologists led to the development of (A) a VR-based tool for cognitive rehabilitation in patients with acquired brain injury (ABI). The system comprises a head-mounted display (HTC Vive Pro Eye) and 7 tasks targeting attention, memory, and executive functions. A desktop application facilitates session configuration, while a database records in-game variables. The VR tool's usability and feasibility were demonstrated in proof-of-concept trials with 20 patients, and effectiveness is being tested through a clinical protocol with 12 patients completing a 24-session treatment. Another case involved collaboration with nurses and paediatric physiatrists to create (B) a VR-based distraction tool for use during invasive techniques. The goal is to alleviate the pain and anxiety associated with botulinum toxin (BTX) injections, blood tests, or intravenous placements. An all-in-one headset (HTC Vive Focus 3) deploys 360º videos to improve the experience for paediatric patients and their families. This study presents a framework for developing clinically relevant and technologically feasible VR-based support tools for hospital settings. 
Despite differences in patient type, intervention purpose, and VR system, the methodology demonstrates usability, viability, reproducibility and preliminary clinical benefits. It highlights the importance of an approach centred on clinician and patient needs for any aspect of NRHB within a real hospital setting.
Keywords: neurological disorders, neurorehabilitation, stepwise development approach, virtual reality
Procedia PDF Downloads 30
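The VR entry above mentions a desktop application for session configuration and a database that records in-game variables. The following is a hypothetical sketch of that pattern only: the SQLite backend, table layout, and field names such as `reaction_time_ms` are illustrative assumptions, not details of the system built by the authors.

```python
# Hypothetical sketch: storing session configuration and time-stamped in-game
# variables for VR rehabilitation sessions. Schema and names are illustrative.
import sqlite3
from dataclasses import dataclass

@dataclass
class SessionConfig:
    patient_id: str
    task: str              # e.g. one of the cognitive tasks (attention, memory, ...)
    difficulty: int        # illustrative difficulty level
    duration_min: int      # planned session length in minutes

def init_db(path="vr_sessions.db"):
    con = sqlite3.connect(path)
    con.execute("""CREATE TABLE IF NOT EXISTS sessions (
                     id INTEGER PRIMARY KEY AUTOINCREMENT,
                     patient_id TEXT, task TEXT,
                     difficulty INTEGER, duration_min INTEGER)""")
    con.execute("""CREATE TABLE IF NOT EXISTS events (
                     session_id INTEGER, t_ms INTEGER,
                     variable TEXT, value REAL)""")
    return con

def start_session(con, cfg: SessionConfig) -> int:
    cur = con.execute(
        "INSERT INTO sessions (patient_id, task, difficulty, duration_min) VALUES (?,?,?,?)",
        (cfg.patient_id, cfg.task, cfg.difficulty, cfg.duration_min))
    con.commit()
    return cur.lastrowid

def log_event(con, session_id, t_ms, variable, value):
    con.execute("INSERT INTO events VALUES (?,?,?,?)", (session_id, t_ms, variable, value))
    con.commit()

# Usage: configure a session and record a couple of in-game variables.
con = init_db()
sid = start_session(con, SessionConfig("P001", "attention_task", difficulty=2, duration_min=20))
log_event(con, sid, 1500, "reaction_time_ms", 640.0)
log_event(con, sid, 3200, "errors", 1.0)
```

Keying time-stamped in-game variables to a session record makes it straightforward to review per-patient progress across a multi-session protocol such as the 24-session treatment described above.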