Search results for: unmanned intervention algorithm
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6093

213 A First Step towards Automatic Evolutionary for Gas Lifts Allocation Optimization

Authors: Younis Elhaddad, Alfonso Ortega

Abstract:

Oil production by means of gas lift is a standard technique in the oil production industry. Optimizing the total amount of oil produced as a function of the amount of gas injected is a key question in this domain. Different methods have been tested to propose a general methodology; many of them apply well-known numerical methods, and some have taken into account the power of evolutionary approaches. Our goal is to provide the experts of the domain with a powerful automatic search engine into which they can introduce their knowledge in a format close to the one used in their domain, and from which they can obtain solutions that are comprehensible in the same terms. These proposals introduce into the genetic engine the most expressive formal models to represent the solutions to the problem. Such algorithms have proven to be as effective as other genetic systems, but more flexible and comfortable for the researcher, although they usually require huge search spaces to justify their use because of the computational resources involved in the formal models. The first step in evaluating the viability of applying our approaches to this realm is to fully understand the domain and to select an instance of the problem (gas lift optimization) in which applying genetic approaches seems promising. After analyzing the state of the art of this topic, we have decided to choose a previous work from the literature that faces the problem by means of numerical methods. This contribution includes enough details to be reproduced and complete data to be carefully analyzed. We have designed a classical, simple genetic algorithm to try to reproduce the same results and to understand the problem in depth. We could easily incorporate the well mathematical model and the well data used by the authors, and translate their mathematical model, to be numerically optimized, into a proper fitness function. We have analyzed the 100 well curves they use in their experiment and observed similar results; in addition, our system automatically inferred an optimum total amount of injected gas for the field that is compatible with the sum of the optimum gas injected in each well as reported by them. We have identified several constraints that could be interesting to incorporate into the optimization process but that could be difficult to express numerically. It could also be interesting to automatically propose other mathematical models to fit both the individual well curves and the behaviour of the complete field. All these facts and conclusions justify continuing to explore the viability of applying the more sophisticated approaches previously proposed by our research group.
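As a rough illustration of the kind of simple genetic algorithm described above, the following sketch allocates a fixed total amount of lift gas across several wells by maximizing a summed well-performance curve. The well-response function, gas budget and GA parameters are illustrative assumptions, not the model or data of the reproduced study.

```python
# Minimal sketch (not the authors' implementation): a simple genetic algorithm
# that allocates a fixed budget of lift gas across N wells. The well-response
# curve used in the fitness is a hypothetical placeholder.
import random

N_WELLS = 10          # assumed number of wells
GAS_BUDGET = 5000.0   # assumed total gas available

def well_oil_rate(gas, a=40.0, b=0.004):
    """Hypothetical gas-lift performance curve: rises, then flattens/declines."""
    return a * gas ** 0.5 - b * gas

def fitness(alloc):
    """Total field oil rate; heavily penalize exceeding the gas budget."""
    penalty = max(0.0, sum(alloc) - GAS_BUDGET) * 10.0
    return sum(well_oil_rate(g) for g in alloc) - penalty

def random_individual():
    return [random.uniform(0, GAS_BUDGET / N_WELLS * 2) for _ in range(N_WELLS)]

def crossover(p1, p2):
    cut = random.randrange(1, N_WELLS)
    return p1[:cut] + p2[cut:]

def mutate(ind, rate=0.1):
    return [g * random.uniform(0.9, 1.1) if random.random() < rate else g for g in ind]

def run_ga(pop_size=60, generations=200):
    pop = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 5]                 # keep the best 20%
        children = []
        while len(elite) + len(children) < pop_size:
            p1, p2 = random.sample(elite, 2)
            children.append(mutate(crossover(p1, p2)))
        pop = elite + children
    return max(pop, key=fitness)

if __name__ == "__main__":
    best = run_ga()
    print("best allocation:", [round(g, 1) for g in best])
    print("field oil rate:", round(fitness(best), 1))
```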

Keywords: evolutionary automatic programming, gas lift, genetic algorithms, oil production

Procedia PDF Downloads 142
212 Predicting Career Adaptability and Optimism among University Students in Turkey: The Role of Personal Growth Initiative and Socio-Demographic Variables

Authors: Yagmur Soylu, Emir Ozeren, Erol Esen, Digdem M. Siyez, Ozlem Belkis, Ezgi Burc, Gülce Demirgurz

Abstract:

The aim of this study is to determine the predictive power of personal growth initiative and socio-demographic variables (such as sex, grade, and working condition) on the career adaptability and optimism of bachelor students at Dokuz Eylul University in Turkey. According to career construction theory, career adaptability is viewed as a psychosocial construct, which refers to an individual’s resources for dealing with current and expected tasks, transitions and traumas in their occupational roles. Career optimism is defined as the disposition to expect the best possible outcomes for one’s future career development, or to emphasize the positive aspects of it, and to feel comfortable about the career planning process. Personal Growth Initiative (PGI) is defined as being proactive about one’s personal development; additionally, personal growth is defined as active and intentional engagement in the process of personal growth. A study conducted on college students revealed that individuals with a high self-development orientation make more effort to discover the requirements of the profession and workplaces than individuals with low levels of personal development orientation. University life is a period in which social relations and academic activities gain importance, students make efforts to progress along their career paths, and the environment offers opportunities for self-realization. For these reasons, personal growth initiative is potentially an important variable with a key role during the transition from university to working life. Based on the review of the literature, it is expected that an individual’s personal growth initiative, sex, grade, and working condition will significantly predict career adaptability. In the relevant literature, there are relatively few studies available on the career adaptability and optimism of university students, and most of the existing studies have been carried out with limited respondents. In this study, the authors aim to conduct comprehensive research with a large representative sample of bachelor students at Dokuz Eylul University, Izmir, Turkey. To date, personal growth initiative and career development constructs have been predominantly discussed in western contexts where individualistic tendencies are likely to be seen. Thus, examining the same relationship within the context of Turkey, where collectivistic cultural characteristics are more readily observed, is expected to offer valuable insights and provide an important contribution to the literature. The participants in this study comprised 1,500 undergraduate students from thirteen faculties of Dokuz Eylul University. Stratified and random sampling methods were adopted for the selection of participants. The Personal Growth Initiative Scale-II and the Career Futures Inventory were used as the major measurement tools. In the data analysis stage, regression analysis, one-way ANOVA and t-tests will be conducted to reveal the relationships among the constructs under investigation.
At the end of this project, we will be able to determine the levels of career adaptability and optimism of university students, creating fertile ground for intervention techniques that contribute to the emergence of a psycho-socially healthier and more productive younger generation.

Keywords: career optimism, career adaptability, personal growth initiative, university students

Procedia PDF Downloads 389
211 Structural Monitoring of Externally Confined RC Columns with Inadequate Lap-Splices, Using Fibre-Bragg-Grating Sensors

Authors: Petros M. Chronopoulos, Evangelos Z. Astreinidis

Abstract:

A major issue of the structural assessment and rehabilitation of existing RC structures is the inadequate lap-splicing of the longitudinal reinforcement. Although prohibited by modern Design Codes, the practice of arranging lap-splices inside the critical regions of RC elements was commonly applied in the past. Today this practice is still the rule, at least for conventional new buildings. Therefore, a lot of relevant research is ongoing in many earthquake prone countries. The rehabilitation of deficient lap-splices of RC elements by means of external confinement is widely accepted as the most efficient technique. If correctly applied, this versatile technique offers a limited increase of flexural capacity and a considerable increase of local ductility and of axial and shear capacities. Moreover, this intervention does not affect the stiffness of the elements and does not affect the dynamic characteristics of the structure. This technique has been extensively discussed and researched contributing to vast accumulation of technical and scientific knowledge that has been reported in relevant books, reports and papers, and included in recent Design Codes and Guides. These references are mostly dealing with modeling and redesign, covering both the enhanced (axial and) shear capacity (due to the additional external closed hoops or jackets) and the increased ductility (due to the confining action, preventing the unzipping of lap-splices and the buckling of continuous reinforcement). An analytical and experimental program devoted to RC members with lap-splices is completed in the Lab. of RC/NTU of Athens/GR. This program aims at the proposal of a rational and safe theoretical model and the calibration of the relevant Design Codes’ provisions. Tests, on forty two (42) full scale specimens, covering mostly beams and columns (not walls), strengthened or not, with adequate or inadequate lap-splices, have been already performed and evaluated. In this paper, the results of twelve (12) specimens under fully reversed cyclic actions are presented and discussed. In eight (8) specimens the lap-splices were inadequate (splicing length of 20 or 30 bar diameters) and they were retrofitted before testing by means of additional external confinement. The two (2) most commonly applied confining materials were used in this study, namely steel and FRPs. More specifically, jackets made of CFRP wraps or light cages made of mild steel were applied. The main parameters of these tests were (i) the degree of confinement (internal and external), and (ii) the length of lap-splices, equal to 20, 30 or 45 bar diameters. These tests were thoroughly instrumented and monitored, by means of conventional (LVDTs, strain gages, etc.) and innovative (optic fibre-Bragg-grating) sensors. This allowed for a thorough investigation of the most influencing design parameter, namely the hoop-stress developed in the confining material. Based on these test results and on comparisons with the provisions of modern Design Codes, it could be argued that shorter (than the normative) lap-splices, commonly found in old structures, could still be effective and safe (at least for lengths more than an absolute minimum), depending on the required ductility, if a properly arranged and adequately detailed external confinement is applied.
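The key quantity monitored in these tests is the hoop strain, and hence the hoop stress, in the confining jacket, recovered from the Bragg wavelength shift of the optic sensors. The sketch below shows the standard conversion; the grating wavelength, photo-elastic coefficient and jacket modulus are generic assumed values, not the ones used in this test series.

```python
# Minimal sketch: convert a fibre-Bragg-grating wavelength shift into hoop
# strain and hoop stress in the confining material. All numerical values are
# generic assumptions for illustration only.
LAMBDA_0 = 1550.0e-9   # assumed unstrained Bragg wavelength [m]
P_E = 0.22             # typical photo-elastic coefficient of silica fibre
E_STEEL = 200.0e9      # assumed elastic modulus of a mild-steel jacket [Pa]

def strain_from_shift(delta_lambda_m):
    """Strain of the grating from the Bragg wavelength shift (temperature
    effects neglected): delta_lambda / lambda_0 = (1 - p_e) * strain."""
    return delta_lambda_m / (LAMBDA_0 * (1.0 - P_E))

def hoop_stress(delta_lambda_m, modulus=E_STEEL):
    """Hoop stress in an elastic confining jacket bonded to the grating."""
    return modulus * strain_from_shift(delta_lambda_m)

if __name__ == "__main__":
    shift = 0.6e-9  # example: 0.6 nm wavelength shift
    print(f"strain = {strain_from_shift(shift):.6e}")
    print(f"hoop stress = {hoop_stress(shift) / 1e6:.1f} MPa")
```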

Keywords: concrete, fibre-Bragg-grating sensors, lap-splices, retrofitting / rehabilitation

Procedia PDF Downloads 228
210 Foucault and Governmentality: International Organizations and State Power

Authors: Sara Dragisic

Abstract:

Using the theoretical analysis of the birth of biopolitics that Foucault performed through the history of liberalism and neoliberalism, in this paper we will try to show how, precisely through problematizing the role of international institutions, the model of governance differs from previous ways of objectifying body and life. Are the state and its mechanisms still a Leviathan to fight against, or can it be even the driver of resistance against the proponents of modern governance and the biopolitical power? Do paradigmatic examples of biopolitics still appear through sovereignty and (international) law, or is it precisely this sphere that shows a significant dose of incompetence and powerlessness in relation to, not only the economic sphere (Foucault’s critique of neoliberalism) but also the new politics of freedom? Have the struggle for freedom and human rights, as well as the war on terrorism, opened a new spectrum of biopolitical processes, which are manifested precisely through new international institutions and humanitarian discourse? We will try to answer these questions, in the following way. On the one hand, we will show that the views of authors such as Agamben and Hardt and Negri, in whom the state and sovereignty are seen as enemies to be defeated or overcome, fail to see how such attempts could translate into the politicization of life like it is done in many examples through the doctrine of liberal interventionism and humanitarianism. On the other hand, we will point out that it is precisely the humanitarian discourse and the defense of the right to intervention that can be the incentive and basis for the politicization of the category of life and lead to the selective application of human rights. Zizek example of the killing of United Nations workers and doctors in a village during the Vietnam War, who were targeted even before police or soldiers, because they were precisely seen as a powerful instrument of American imperialism (as they were sincerely trying to help the population), will be focus of this part of the analysis. We’ll ask the question whether such interpretation is a kind of liquidation of the extreme left of the political (Laclau) or on this basis can be explained at least in part the need to review the functioning of international organizations, ranging from those dealing with humanitarian aid (and humanitarian military interventions) to those dealing with protection and the security of the population, primarily from growing terrorism. Based on the above examples, we will also explain how the discourse of terrorism itself plays a dual role: it can appear as a tool of liberal biopolitics, although, more superficially, it mostly appears as an enemy that wants to destroy the liberal system and its values. This brings us to the basic problem that this paper will tackle: do the mechanisms of institutional struggle for human rights and freedoms, which is often seen as opposed to the security mechanisms of the state, serve the governance of citizens in such a way that the latter themselves participate in producing biopolitical governmental practices? Is the freedom today "nothing but the correlative development of apparatuses of security" (Foucault)? Or, we can continue this line of Foucault’s argumentation and enhance the interpretation with the important question of what precisely today reflects the change in the rationality of governance in which society is transformed from a passive object into a subject of its own production. 
Finally, in order to understand the skills of biopolitical governance in modern civil society, it is necessary to pay attention to the status of international organizations, which seem to have become a significant place for the implementation of global governance. In this sense, the power of sovereignty can turn out to be an insufficiently strong power of security policy, which can go hand in hand with freedom policies, through neoliberal governmental techniques.

Keywords: neoliberalism, Foucault, sovereignty, biopolitics, international organizations, NGOs, Agamben, Hardt&Negri, Zizek, security, state power

Procedia PDF Downloads 179
209 Advancements in Arthroscopic Surgery Techniques for Anterior Cruciate Ligament (ACL) Reconstruction

Authors: Islam Sherif, Ahmed Ashour, Ahmed Hassan, Hatem Osman

Abstract:

Anterior Cruciate Ligament (ACL) injuries are common among athletes and individuals participating in sports with sudden stops, pivots, and changes in direction. Arthroscopic surgery is the gold standard for ACL reconstruction, aiming to restore knee stability and function. Recent years have witnessed significant advancements in arthroscopic surgery techniques, graft materials, and technological innovations, revolutionizing the field of ACL reconstruction. This presentation delves into the latest advancements in arthroscopic surgery techniques for ACL reconstruction and their potential impact on patient outcomes. Traditionally, autografts from the patellar tendon, hamstring tendon, or quadriceps tendon have been commonly used for ACL reconstruction. However, recent studies have explored the use of allografts, synthetic scaffolds, and tissue-engineered grafts as viable alternatives. This abstract evaluates the benefits and potential drawbacks of each graft type, considering factors such as graft incorporation, strength, and risk of graft failure. Moreover, the application of augmented reality (AR) and virtual reality (VR) technologies in surgical planning and intraoperative navigation has gained traction. AR and VR platforms provide surgeons with detailed 3D anatomical reconstructions of the knee joint, enhancing preoperative visualization and aiding in graft tunnel placement during surgery. We discuss the integration of AR and VR in arthroscopic ACL reconstruction procedures, evaluating their accuracy, cost-effectiveness, and overall impact on surgical outcomes. Beyond graft selection and surgical navigation, patient-specific planning has gained attention in recent research. Advanced imaging techniques, such as MRI-based personalized planning, enable surgeons to tailor ACL reconstruction procedures to each patient's unique anatomy. By accounting for individual variations in the femoral and tibial insertion sites, this personalized approach aims to optimize graft placement and potentially improve postoperative knee kinematics and stability. Furthermore, rehabilitation and postoperative care play a crucial role in the success of ACL reconstruction. This abstract explores novel rehabilitation protocols, emphasizing early mobilization, neuromuscular training, and accelerated recovery strategies. Integrating technology, such as wearable sensors and mobile applications, into postoperative care can facilitate remote monitoring and timely intervention, contributing to enhanced rehabilitation outcomes. In conclusion, this presentation provides an overview of the cutting-edge advancements in arthroscopic surgery techniques for ACL reconstruction. By embracing innovative graft materials, augmented reality, patient-specific planning, and technology-driven rehabilitation, orthopedic surgeons and sports medicine specialists can achieve superior outcomes in ACL injury management. These developments hold great promise for improving the functional outcomes and long-term success rates of ACL reconstruction, benefitting athletes and patients alike.

Keywords: arthroscopic surgery, ACL, autograft, allograft, graft materials, ACL reconstruction, synthetic scaffolds, tissue-engineered graft, virtual reality, augmented reality, surgical planning, intra-operative navigation

Procedia PDF Downloads 64
208 Isotope Effects on Inhibitors Binding to HIV Reverse Transcriptase

Authors: Agnieszka Krzemińska, Katarzyna Świderek, Vicente Molinier, Piotr Paneth

Abstract:

In order to understand in detail the interactions between ligands and the enzyme, isotope effects were studied for clinically used drugs that bind in the active site of Human Immunodeficiency Virus Reverse Transcriptase, HIV-1 RT, as well as for a triazole-based inhibitor that binds in the allosteric pocket of this enzyme. The magnitudes and origins of the resulting binding isotope effects were analyzed. Subsequently, the binding isotope effects of the same triazole-based inhibitor bound in the active site were analyzed and compared. Together, these results show differences in the origins of binding in the two sites of the enzyme and allow the binding mode and binding site of newly synthesized inhibitors to be analyzed. A typical protocol is described below using the example of the triazole ligand in the allosteric pocket. The triazole was docked into the allosteric cavity of HIV-1 RT with Glide using the extra-precision mode as implemented in the Schroedinger software. The structure of HIV-1 RT was obtained from the Protein Data Bank (PDB ID 2RKI). The pKa of titratable amino acids was calculated using the PROPKA software, and in order to neutralize the system 15 Cl- ions were added using the tLEaP package implemented in AMBERTools ver. 1.5. N-terminals and C-terminals were also built using tLEaP. The system was placed in a 144x160x144 Å3 orthorhombic box of water molecules using the NAMD program. Missing parameters for the triazole were obtained at the AM1 level using the Antechamber software implemented in AMBERTools. The energy minimizations were carried out by means of a conjugate gradient algorithm using NAMD. The system was then heated from 0 to 300 K with a temperature increment of 0.001 K. Subsequently, a 2 ns Langevin−Verlet (NVT) MM MD simulation with the AMBER force field implemented in NAMD was carried out. Periodic boundary conditions and cut-offs for the nonbonding interactions, with a switching range from 14.5 to 16 Å, were used. After 2 ns of relaxation, 200 ps of QM/MM MD at 300 K were simulated. The triazole was treated quantum mechanically at the AM1 level, the protein was described using AMBER, and water molecules were described using TIP3P, as implemented in the fDynamo library. Molecules 20 Å apart from the triazole were kept frozen, with cut-offs established on the range from 14.5 to 16 Å. In order to describe the interactions between the triazole and RT, the free energy of binding was computed using the Free Energy Perturbation method. The change in frequencies from the ligand in solution to the ligand bound in the enzyme was used to calculate the binding isotope effects.
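For illustration only, the sketch below shows how a binding isotope effect can be estimated from harmonic vibrational frequencies via the Bigeleisen-Mayer reduced partition function ratios. The frequencies, sign convention and the absence of anharmonic or tunnelling corrections are assumptions of this toy example, not the protocol actually used in the study.

```python
# Minimal sketch (not the authors' code): harmonic Bigeleisen-Mayer estimate of a
# binding isotope effect from vibrational frequencies of the light and heavy
# isotopologues in solution and bound in the enzyme. Frequencies are placeholders.
import math

H = 6.62607015e-34   # Planck constant [J s]
KB = 1.380649e-23    # Boltzmann constant [J/K]
C = 2.99792458e10    # speed of light [cm/s]

def u(freq_cm, T=300.0):
    """Dimensionless vibrational energy h*nu/(kB*T) from a wavenumber in cm^-1."""
    return H * C * freq_cm / (KB * T)

def reduced_partition_ratio(light_freqs, heavy_freqs, T=300.0):
    """(s2/s1)f for one state, product over corresponding normal modes."""
    f = 1.0
    for nl, nh in zip(light_freqs, heavy_freqs):
        ul, uh = u(nl, T), u(nh, T)
        f *= (uh / ul) * math.exp((ul - uh) / 2.0) * (1 - math.exp(-ul)) / (1 - math.exp(-uh))
    return f

def binding_isotope_effect(sol_light, sol_heavy, bound_light, bound_heavy, T=300.0):
    """BIE = K(light)/K(heavy); > 1 means the light isotopologue binds more
    tightly (sign convention assumed here)."""
    return (reduced_partition_ratio(sol_light, sol_heavy, T)
            / reduced_partition_ratio(bound_light, bound_heavy, T))

if __name__ == "__main__":
    # Toy two-mode example with slightly different frequency shifts upon binding.
    print(binding_isotope_effect([3000.0, 1200.0], [2230.0, 1190.0],
                                 [2990.0, 1195.0], [2225.0, 1186.0]))
```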

Keywords: binding isotope effects, molecular dynamics, HIV, reverse transcriptase

Procedia PDF Downloads 407
207 The Use of Artificial Intelligence in Digital Forensics and Incident Response in a Constrained Environment

Authors: Dipo Dunsin, Mohamed C. Ghanem, Karim Ouazzane

Abstract:

Digital investigators often have a hard time spotting evidence in digital information. It has become hard to determine which source of proof relates to a specific investigation. A growing concern is that the various processes, technologies, and specific procedures used in digital investigation are not keeping up with criminal developments; criminals are therefore taking advantage of these weaknesses to commit further crimes. In digital forensics investigations, artificial intelligence (AI) is invaluable in identifying crime. It has been observed that algorithms based on AI are highly effective in detecting risks, preventing criminal activity, and forecasting illegal activity. Providing objective data and conducting an assessment is the goal of digital forensics and digital investigation, which will assist in developing a plausible theory that can be presented as evidence in court. Researchers and other authorities have used the available data as evidence in court to convict a person. This research paper aims at developing a multiagent framework for digital investigations using specific intelligent software agents (ISA). The agents communicate to address particular tasks jointly and keep the same objectives in mind during each task. The rules and knowledge contained within each agent depend on the investigation type. A criminal investigation is classified quickly and efficiently using the case-based reasoning (CBR) technique. The MADIK framework is implemented using the Java Agent Development Framework, developed in Eclipse, with a Postgres repository and a rule engine for agent reasoning. The proposed framework was tested using the Lone Wolf image files and datasets. Experiments were conducted using various sets of ISA and VMs. There was a significant reduction in the time taken for the Hash Set Agent to execute. As a result of loading the agents, 5 percent of the time was lost, as the File Path Agent prescribed deleting 1,510, while the Timeline Agent found multiple executable files. In comparison, the integrity check carried out on the Lone Wolf image file using a digital forensic toolkit took approximately 48 minutes (2,880 s), whereas the MADIK framework accomplished this in 16 minutes (960 s). The framework is integrated with Python, allowing for further integration of other digital forensic tools, such as AccessData Forensic Toolkit (FTK), Wireshark, Volatility, and Scapy.
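As a rough illustration of what one of the intelligent software agents described above might do, the sketch below shows a "hash set agent" that flags files in an evidence directory whose MD5 digests match a known-bad hash set. The agent structure, directory path and digest are illustrative assumptions, not part of the MADIK implementation (which is Java-based).

```python
# Minimal sketch (assumed design, not MADIK itself): a "hash set agent" that walks
# an evidence directory and reports files whose MD5 digest is in a known-bad set.
import hashlib
import os

KNOWN_BAD_MD5 = {
    "44d88612fea8a8f36de82e1278abb02f",  # placeholder digest for illustration
}

def md5_of_file(path, chunk_size=1 << 20):
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def hash_set_agent(evidence_dir):
    """Yield (path, digest) for every file matching the known-bad hash set."""
    for root, _dirs, files in os.walk(evidence_dir):
        for name in files:
            path = os.path.join(root, name)
            try:
                digest = md5_of_file(path)
            except OSError:
                continue  # unreadable file: skip; a real agent would log this
            if digest in KNOWN_BAD_MD5:
                yield path, digest

if __name__ == "__main__":
    for hit in hash_set_agent("./evidence"):
        print("hit:", hit)
```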

Keywords: artificial intelligence, computer science, criminal investigation, digital forensics

Procedia PDF Downloads 185
206 Automatic and High Precise Modeling for System Optimization

Authors: Stephanie Chen, Mitja Echim, Christof Büskens

Abstract:

To describe and propagate the behavior of a system, mathematical models are formulated. Parameter identification is used to adapt the coefficients of the underlying laws of science. For complex systems this approach can be incomplete, and hence imprecise, and moreover too slow to be computed efficiently. Therefore, such models might not be applicable for the numerical optimization of real systems, since these techniques require numerous evaluations of the models. Moreover, not all quantities necessary for the identification might be available, and hence the system must be adapted manually. Therefore, an approach is described that generates models which overcome the aforementioned limitations by focusing not on physical laws, but on measured (sensor) data of real systems. The approach is more general since it generates models for any system, detached from the scientific background. Additionally, this approach can be used in a more general sense, since it is able to automatically identify correlations in the data. The method can be classified as a multivariate data regression analysis. In contrast to many other data regression methods, this variant is also able to identify correlations of products of variables and not only of single variables. This enables a far more precise and better representation of causal correlations. The basis and the explanation of this method come from an analytical background: the series expansion. Another advantage of this technique is the possibility of real-time adaptation of the generated models during operation. Herewith, system changes due to aging, wear or perturbations from the environment can be taken into account, which is indispensable for realistic scenarios. Since these data-driven models can be evaluated very efficiently and with high precision, they can be used in mathematical optimization algorithms that minimize a cost function, e.g. time, energy consumption, operational costs or a mixture of them, subject to additional constraints. The proposed method has successfully been tested in several complex applications and under strong industrial requirements. The generated models were able to simulate the given systems with a precision error of less than one percent. Moreover, the automatic identification of correlations was able to discover previously unknown relationships. In summary, the above-mentioned approach is able to efficiently compute highly precise and real-time-adaptive data-based models in different fields of industry. Combined with an effective mathematical optimization algorithm like WORHP (We Optimize Really Huge Problems), several complex systems can now be represented by a high-precision model to be optimized according to the user's wishes. The proposed methods will be illustrated with different examples.
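A minimal way to picture the kind of regression described above, one that also captures products of variables in the spirit of a truncated series expansion, is an ordinary least-squares fit on polynomial/interaction features. The sketch below uses scikit-learn (version 1.0 or later) on synthetic sensor-like data; it illustrates the idea, not the authors' method.

```python
# Minimal sketch: regression on products of variables (interaction terms),
# illustrating a truncated series-expansion style data-driven model.
# Synthetic data stand in for real sensor measurements.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(500, 3))          # three "sensor" inputs
# Hidden true law contains a product term x0*x1 that a purely linear fit misses.
y = 2.0 * X[:, 0] + 0.5 * X[:, 0] * X[:, 1] - 1.5 * X[:, 2] ** 2 + 0.01 * rng.normal(size=500)

model = make_pipeline(PolynomialFeatures(degree=2, include_bias=False),
                      LinearRegression())
model.fit(X, y)

names = model.named_steps["polynomialfeatures"].get_feature_names_out(["x0", "x1", "x2"])
coefs = model.named_steps["linearregression"].coef_
for name, c in zip(names, coefs):
    if abs(c) > 0.05:                               # report only the significant terms
        print(f"{name}: {c:+.3f}")
```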

Keywords: adaptive modeling, automatic identification of correlations, data based modeling, optimization

Procedia PDF Downloads 377
205 Development of Coastal Inundation–Inland and River Flow Interface Module Based on 2D Hydrodynamic Model

Authors: Eun-Taek Sin, Hyun-Ju Jang, Chang Geun Song, Yong-Sik Han

Abstract:

Due to climate change, coastal urban areas repeatedly suffer loss of property and life from flooding. There are three main causes of inland submergence. First, when heavy rain of high intensity occurs, the water in inland areas cannot be drained into rivers because of the increase in impervious surfaces caused by land development and because of defects in pumps and storm sewers. Second, river inundation occurs when the water surface level surpasses the top of the levee. Finally, coastal inundation occurs due to rising sea water. However, previous studies ignored the complex mechanism of flooding and showed discrepancies and inadequacies due to the linear summation of each analysis result. In this study, inland flooding and river inundation were analyzed together with the HDM-2D model. The Petrov-Galerkin stabilizing method and a flux-blocking algorithm were applied to simulate the inland flooding. In addition, sink/source terms with an exponential growth rate were added to the shallow water equations to include the inland flooding analysis module. The applications of the developed model gave satisfactory results and provided accurate predictions in comprehensive flooding analysis. To consider the coastal surge, another module was developed by adding seawater to the existing inland flooding-river inundation binding module for comprehensive flooding analysis. Based on the combined modules, the coastal inundation-inland and river flow interface was simulated by inputting flow rate and depth data in an artificial flume. Accordingly, it was possible to analyze the flood patterns of coastal cities over time. This study is expected to help identify the complex causes of flooding in coastal areas where complex flooding occurs, and to assist in analyzing damage to coastal cities. Acknowledgements—This research was supported by a grant ‘Development of the Evaluation Technology for Complex Causes of Inundation Vulnerability and the Response Plans in Coastal Urban Areas for Adaptation to Climate Change’ [MPSS-NH-2015-77] from the Natural Hazard Mitigation Research Group, Ministry of Public Safety and Security of Korea.
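For reference, a hedged sketch of the kind of modification described above: the 2D shallow water continuity equation with an added source/sink term S feeding the inundation module. The exact functional form of S used by the authors is not given in the abstract; the exponential form below is only a plausible illustration.

```latex
% 2D shallow water continuity equation with an added source/sink term S,
% where h is the water depth and (u, v) are the depth-averaged velocities.
% An "exponential growth rate" source could, for illustration, take the form
% S = S_0 * exp(alpha * t) on the cells where seawater or surcharged flow enters.
\[
\frac{\partial h}{\partial t}
  + \frac{\partial (hu)}{\partial x}
  + \frac{\partial (hv)}{\partial y}
  = S(x,y,t), \qquad
S(x,y,t) = S_0 \, e^{\alpha t}.
\]
```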

Keywords: flooding analysis, river inundation, inland flooding, 2D hydrodynamic model

Procedia PDF Downloads 335
204 Skin-to-Skin Contact Simulation: Improving Health Outcomes for Medically Fragile Newborns in the Neonatal Intensive Care Unit

Authors: Gabriella Zarlenga, Martha L. Hall

Abstract:

Introduction: Premature infants are at risk for neurodevelopmental deficits and hospital readmissions, which can increase the financial burden on the health care system and families. Kangaroo care (skin-to-skin contact) is a practice that can improve preterm infant health outcomes. Preterm infants can acquire adequate body temperature, heartbeat, and breathing regulation by lying directly on the mother’s abdomen and between her breasts. For some infants, however, their condition makes kangaroo care an unfeasible intervention. The purpose of this proof-of-concept research project is to create a device that simulates skin-to-skin contact for preterm infants not eligible for kangaroo care, with the aim of promoting the baby’s health outcomes, reducing the incidence of serious neonatal and early childhood illnesses, and/or improving cognitive, social and emotional aspects of development. Methods: The study design is a proof-of-concept based on a three-phase approach: (1) an observational study and data analysis of the standard of care for two groups of preterm infants, (2) design and concept development of a novel device for preterm infants not currently eligible for standard kangaroo care, and (3) prototyping, laboratory testing, and evaluation of the novel device against current assessment parameters of kangaroo care. A single-center study will be conducted in an area hospital offering Level III neonatal intensive care. Eligible participants include newborns born prematurely (28-30 weeks gestational age) admitted to the NICU. The study design includes two groups: a control group receiving standard kangaroo care and an experimental group not eligible for kangaroo care. Based on behavioral analysis of observational video data collected in the NICU, the device will be created to simulate the mother’s body using electrical components in a thermoplastic polymer housing covered in silicone. It will be designed with a microprocessor that controls the simulated respiration, heartbeat, and body temperature of the 'simulated caregiver' by using a pneumatic lung, vibration sensors (heartbeat), pressure sensors (weight/position), and resistive film to measure temperature. A slight contour of the simulator surface may be integrated to help position the infant correctly. Control and monitoring of the skin-to-skin contact simulator would be performed locally via an integrated touchscreen. The unit would have built-in Wi-Fi connectivity as well as an optional Bluetooth connection through which the respiration and heart rate could be synced with a parent or caregiver. A camera would be integrated, allowing a video stream of the infant in the simulator to be sent to a monitoring location. Findings: Expected outcomes are stabilization of respiratory and cardiac rates and thermoregulation in those infants not eligible for skin-to-skin contact with their mothers, with real-time Bluetooth syncing of the mother’s vital signs to the device to mimic the experience in the womb. Results of this study will benefit clinical practice by creating a new standard of care for premature neonates in the NICU who are deprived of skin-to-skin contact due to various health restrictions.
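A very rough sketch of the kind of control loop such a simulator microcontroller might run, holding a target surface temperature while pacing the simulated respiration and heartbeat. All set-points, functions and timing values are hypothetical; this is not the device firmware.

```python
# Minimal, hypothetical sketch of a simulator control loop: hold a target surface
# temperature and pace simulated respiration/heartbeat. Hardware access is mocked.
import time

TARGET_TEMP_C = 36.5       # assumed skin-surface set-point
RESP_RATE_BPM = 16         # assumed simulated breaths per minute
HEART_RATE_BPM = 72        # assumed simulated heartbeats per minute

def read_surface_temp():
    """Placeholder for reading the resistive temperature film."""
    return 36.2

def set_heater_duty(duty):
    """Placeholder for driving the heating element (0.0-1.0)."""
    print(f"heater duty -> {duty:.2f}")

def pulse_pneumatic_lung():
    print("lung: inflate/deflate cycle")

def pulse_heartbeat_motor():
    print("heartbeat: vibration pulse")

def control_step(elapsed_s, kp=0.5):
    # Simple proportional control on temperature.
    error = TARGET_TEMP_C - read_surface_temp()
    set_heater_duty(max(0.0, min(1.0, kp * error)))
    # Fire respiration and heartbeat actuators on their own periods.
    if elapsed_s % (60.0 / RESP_RATE_BPM) < 0.1:
        pulse_pneumatic_lung()
    if elapsed_s % (60.0 / HEART_RATE_BPM) < 0.1:
        pulse_heartbeat_motor()

if __name__ == "__main__":
    start = time.time()
    for _ in range(50):           # run a few control ticks for illustration
        control_step(time.time() - start)
        time.sleep(0.1)
```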

Keywords: kangaroo care, wearable technology, pre-term infants, medical design

Procedia PDF Downloads 137
203 Developing a Methodology to Examine Psychophysiological Responses during Stress Exposure and Relaxation: An Experimental Paradigm

Authors: M. Velana, G. Rinkenauer

Abstract:

Nowadays, nurses are facing unprecedented amounts of pressure due to the ongoing global health demands. Work-related stress can cause a high physical and psychological workload, which can lead, in turn, to burnout. On the physiological level, stress triggers an initial activation of the sympathetic nervous and adrenomedullary systems resulting in increases in cardiac activity. Furthermore, activation of the hypothalamus-pituitary-adrenal axis provokes endocrine and immune changes leading to the release of cortisol and cytokines in an effort to re-establish body balance. Based on the current state of the literature, it has been identified that resilience and mindfulness exercises among nurses can effectively decrease stress and improve mood. However, it is still unknown what relaxation techniques would be suitable for and to what extent would be effective to decrease psychophysiological arousal deriving from either a physiological or a psychological stressor. Moreover, although cardiac activity and cortisol are promising candidates to examine the effectiveness of relaxation to reduce stress, it still remains to shed light on the role of cytokines in this process so as to thoroughly understand the body’s response to stress and to relaxation. Therefore, the main aim of the present study is to develop a comprehensive experimental paradigm and assess different relaxation techniques, namely progressive muscle relaxation and a mindfulness exercise originating from cognitive therapy by means of biofeedback, under highly controlled laboratory conditions. An experimental between-subject design will be employed, where 120 participants will be randomized either to a physiological or a psychological stress-related experiment. Particularly, the cold pressor test refers to a procedure in which the participants have to immerse their non-dominant hands into ice water (2-3 °C) for 3 min. The participants are requested to keep their hands in the water throughout the whole duration. However, they can immediately terminate the test in case it would be barely tolerable. A pre-test anticipation phase and a post-stress period of 3 min, respectively, are planned. The Trier Social Stress Test will be employed to induce psychological stress. During this laboratory stressor, the participants are instructed to give a 5-min speech in front of a committee of communication specialists. Before the main task, there is a 10-min anticipation period. Subsequently, participants are requested to perform an unexpected arithmetic task. After stress exposure, the participants will perform one of the relaxation exercises (treatment condition) or watch a neutral video (control condition). Electrocardiography, salivary samples, and self-report will be collected at different time points. The preliminary results deriving from the pilot study showed that the aforementioned paradigm could effectively induce stress reactions and that relaxation might decrease the impact of stress exposure. It is of utmost importance to assess how the human body responds under different stressors and relaxation exercises so that an evidence-based intervention could be transferred in a clinical setting to improve nurses’ general health. Based on suggestive future laboratory findings, the research group plans to conduct a pilot-level randomized study to decrease stress and promote well-being among nurses who work in the stress-riddled environment of a hospital located in Northern Germany.

Keywords: nurses, psychophysiology, relaxation, stress

Procedia PDF Downloads 88
202 Design and Implementation of Generative Models for Odor Classification Using Electronic Nose

Authors: Kumar Shashvat, Amol P. Bhondekar

Abstract:

Among the five senses, odor is the most reminiscent and the least understood. Odor testing has remained mysterious, and odor data unfamiliar, to most practitioners. The problem of recognition and classification of odor is important to solve. The ability to smell and to predict whether an artifact is of further use or has become undesirable for consumption, and the imitation of this ability in a model, is worth considering. The general industrial standard for this classification is color-based; however, odor can be a better classifier than color, and if incorporated into a machine it would be extremely useful. For the cataloging of the odor of peas, trees and cashews, various discriminative approaches have been used. Discriminative approaches offer good predictive performance and have been widely used in many applications, but they are unable to make effective use of unlabeled information. In such scenarios, generative approaches have better applicability, as they are able to handle problems such as set-ups where the variability in the range of possible input vectors is enormous. Generative models are integrated into machine learning either for modeling data directly or as an intermediate step to form a probability density function. The algorithms Linear Discriminant Analysis and the Naive Bayes classifier have been used for the classification of the odor of cashews. Linear Discriminant Analysis is a method used in data classification, pattern recognition, and machine learning to discover a linear combination of features that characterizes or separates two or more classes of objects or events. The Naive Bayes algorithm is a classification approach based on Bayes' rule and a set of conditional independence assumptions. Naive Bayes classifiers are highly scalable, requiring a number of parameters linear in the number of variables (features/predictors) in a learning problem. The main advantage of using generative models is that they make stronger assumptions about the data, specifically about the distribution of the predictors given the response variables. The electronic instrument used for artificial odor sensing and classification is the electronic nose. This device is designed to imitate the human sense of smell by providing an analysis of individual chemicals or chemical mixtures. The experimental results have been evaluated in terms of the performance measures accuracy, precision and recall. The results show that the overall performance of Linear Discriminant Analysis was better than that of the Naive Bayes classifier on the cashew dataset.
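A minimal sketch of the comparison described above, using scikit-learn on a synthetic stand-in for electronic-nose sensor readings (the real cashew dataset is not reproduced here), with accuracy, precision and recall reported for both classifiers.

```python
# Minimal sketch: compare Linear Discriminant Analysis and Gaussian Naive Bayes
# on synthetic "electronic nose" data standing in for the cashew dataset.
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics import accuracy_score, precision_score, recall_score
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

# Synthetic sensor array responses: 16 "sensors", 3 odor classes.
X, y = make_classification(n_samples=600, n_features=16, n_informative=8,
                           n_classes=3, n_clusters_per_class=1, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)

for name, clf in [("LDA", LinearDiscriminantAnalysis()), ("Naive Bayes", GaussianNB())]:
    clf.fit(X_train, y_train)
    pred = clf.predict(X_test)
    print(f"{name:12s} accuracy={accuracy_score(y_test, pred):.3f} "
          f"precision={precision_score(y_test, pred, average='macro'):.3f} "
          f"recall={recall_score(y_test, pred, average='macro'):.3f}")
```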

Keywords: odor classification, generative models, naive bayes, linear discriminant analysis

Procedia PDF Downloads 357
201 Local Binary Patterns-Based Statistical Data Analysis for Accurate Soccer Match Prediction

Authors: Mohammad Ghahramani, Fahimeh Saei Manesh

Abstract:

Winning a soccer game is based on a thorough and deep analysis of the ongoing match. On the other hand, giant gambling companies are in vital need of such analysis to reduce their losses against their customers. In this research work, we perform deep, real-time analysis of every soccer match around the world; our work is distinguished from others by focusing on particular seasons, teams and partial analytics. Our contributions are presented in the platform called “Analyst Masters.” First, we introduce various sources of information available for soccer analysis for teams around the world that helped us record live statistical data and information from more than 50,000 soccer matches a year. Our second and main contribution is to introduce our proposed in-play performance evaluation. The third contribution is developing new features from stable soccer matches. The statistics of soccer matches and their odds, both pre-match and in-play, are represented in image format versus time, including halftime. Local Binary Patterns (LBP) are then employed to extract features from the image. Our analyses reveal incredibly interesting features and rules once a soccer match has reached sufficient stability. For example, our “8-minute rule” implies that if 'Team A' scores a goal and can maintain the result for at least 8 minutes, then the match will end in their favor in a stable match. We could also make accurate pre-match predictions of whether fewer or more than 2.5 goals would be scored. We benefit from Gradient Boosting Trees (GBT) to extract highly related features. Once the features are selected from this pool of data, decision trees decide whether the match is stable. A stable match is then passed to a post-processing stage to check its properties, such as betters’ and punters’ behavior and its statistical data, in order to issue the prediction. The proposed method was trained using 140,000 soccer matches and tested on more than 100,000 samples, achieving 98% accuracy in selecting stable matches. Our database of 240,000 matches shows that one can obtain over 20% betting profit per month using Analyst Masters. Such consistent profit outperforms human experts and shows the inefficiency of the betting market. Top soccer tipsters achieve 50% accuracy and 8% monthly profit on average, and only on regional matches. Both our collected database of more than 240,000 soccer matches since 2012 and our algorithm would greatly benefit coaches and punters seeking accurate analysis.
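A compact sketch of the pipeline outlined above: match statistics rendered as a time-versus-feature image, LBP histogram features extracted from it, and a gradient-boosting classifier trained to label matches. The synthetic images and labels are placeholders for the real match data.

```python
# Minimal sketch: LBP features from "match statistics images" plus a gradient
# boosting classifier. Random images/labels stand in for real match data.
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def lbp_histogram(img, p=8, r=1.0):
    """Uniform LBP histogram of a single 2D image (odds/stats encoded vs. time)."""
    img_u8 = (img * 255).astype(np.uint8)        # LBP expects an integer image
    lbp = local_binary_pattern(img_u8, P=p, R=r, method="uniform")
    hist, _ = np.histogram(lbp, bins=p + 2, range=(0, p + 2), density=True)
    return hist

# 400 synthetic 32x90 "images" (32 statistics tracked over 90 minutes).
images = rng.random((400, 32, 90))
labels = rng.integers(0, 2, size=400)            # e.g. 1 = stable match
X = np.array([lbp_histogram(im) for im in images])

X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.25, random_state=0)
clf = GradientBoostingClassifier(n_estimators=200, max_depth=3, random_state=0)
clf.fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```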

Keywords: soccer, analytics, machine learning, database

Procedia PDF Downloads 214
200 Assessing Online Learning Paths in a Learning Management System Using a Data Mining and Machine Learning Approach

Authors: Alvaro Figueira, Bruno Cabral

Abstract:

Nowadays, students are used to being assessed through an online platform. Educators have moved on from a period in which they endured the transition from paper to digital. The use of a diversified set of question types, ranging from quizzes to open questions, is currently common in most university courses. In many courses today, the evaluation methodology also fosters the students’ online participation in forums, the download and upload of modified files, or even participation in group activities. At the same time, new pedagogical theories that promote the active participation of students in the learning process, and the systematic use of problem-based learning, are being adopted using an eLearning system for that purpose. However, although these activities can generate a lot of feedback for students, it is usually restricted to the assessment of well-defined online tasks. In this article, we propose an automatic system that informs students of abnormal deviations from a 'correct' learning path in the course. Our approach is based on the fact that obtaining this information earlier in the semester may provide students and educators with an opportunity to resolve an eventual problem regarding the student’s current online actions towards the course. Our goal is to prevent situations that have a significant probability of leading to a poor grade and, eventually, to failing. In the major learning management systems (LMS) currently available, the interaction between the students and the system itself is registered in log files in the form of records that mark the beginning of actions performed by the user. Our proposed system uses that logged information to derive new information: the time each student spends on each activity, the time and order of the resources used by the student and, finally, the online resource usage pattern. Then, using the grades assigned to the students in previous years, we build a learning dataset that is used to feed a machine learning meta-classifier. The produced classification model is then used to predict the grades a learning path is heading towards in the current year. This approach serves not only the teacher but also the student, who receives automatic feedback on her current situation with past years as a perspective. Our system can be applied to online courses that integrate the use of an online platform that stores user actions in a log file and that has access to other students’ evaluations. The system is based on a data mining process over the log files and on a self-feedback machine learning algorithm that works paired with the Moodle LMS.
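A toy sketch of the pipeline described above: parse LMS-style log records, derive per-student time-on-activity features, and train a classifier on past grades to flag learning paths heading towards a poor grade. The log format, feature set and model choice are simplified assumptions, not the Moodle-specific implementation.

```python
# Minimal sketch: derive time-per-activity features from LMS-style logs and
# predict the grade band a learning path is heading to. Data are synthetic.
from collections import defaultdict
from datetime import datetime

from sklearn.ensemble import RandomForestClassifier

# Assumed log format: (student_id, timestamp, activity). Real Moodle logs differ.
logs = [
    ("s1", "2023-03-01 10:00", "quiz1"), ("s1", "2023-03-01 10:40", "forum"),
    ("s1", "2023-03-01 11:00", "upload"), ("s2", "2023-03-01 10:00", "quiz1"),
    ("s2", "2023-03-01 10:05", "upload"),
]
past_grades = {"s1": "pass", "s2": "fail"}   # labels from previous years

def time_features(log_rows, activities=("quiz1", "forum", "upload")):
    """Minutes spent per activity per student, estimated from consecutive events."""
    per_student = defaultdict(lambda: defaultdict(float))
    rows = sorted(log_rows)
    for (sid, t, act), (sid2, t2, _) in zip(rows, rows[1:]):
        if sid == sid2:
            dt = datetime.fromisoformat(t2) - datetime.fromisoformat(t)
            per_student[sid][act] += dt.total_seconds() / 60.0
    return {sid: [per_student[sid][a] for a in activities] for sid in per_student}

features = time_features(logs)
X = [features[s] for s in past_grades]
y = [past_grades[s] for s in past_grades]

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
print(clf.predict([features["s1"]]))   # predicted grade band for s1's current path
```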

Keywords: data mining, e-learning, grade prediction, machine learning, student learning path

Procedia PDF Downloads 101
199 A Randomized, Controlled Trial To Test Behavior Change Techniques (BCTs) To Improve Low Intensity Physical Activity In Older Adults

Authors: Ciaran Friel, Jerry Suls, Patrick Robles, Frank Vicari, Joan Duer-Hefele, Karina W. Davidson

Abstract:

Physical activity guidelines focus on increasing moderate intensity activity for older adults, but adherence to recommendations remains low. This is despite the fact that scientific evidence supports that any increase in physical activity is positively correlated with health benefits. Behavior change techniques (BCTs) have demonstrated effectiveness in reducing sedentary behavior and promoting physical activity. This pilot study uses a Personalized Trials (N-of-1) design to evaluate the efficacy of using four BCTs to promote an increase in low-intensity physical activity (2,000 steps of walking per day) in adults aged 45-75 years old. The 4 BCTs tested were goal setting, action planning, feedback, and self-monitoring. BCTs were tested in random order and delivered by text message prompts requiring participant response. The study recruited health system employees in the target age range, without mobility restrictions and demonstrating interest in increasing their daily activity by a minimum of 2,000 steps per day for a minimum of five days per week. Participants were sent a Fitbit Charge 4 fitness tracker with an established study account and password. Participants were recommended to wear the Fitbit device 24/7, but were required to wear it for a minimum of ten hours per day. Baseline physical activity was measured by the Fitbit for two weeks. Participants then engaged with a clinical research coordinator to review comprehension of the text message content and required actions for each of the BCTs to be tested. Participants then selected a consistent daily time in which they would receive their text message prompt. In the 8 week intervention phase of the study, participants received each of the four BCTs, in random order, for a two week period. Text message prompts were delivered daily at a time selected by the participant. All prompts required an interactive response from participants and may have included recording their detailed plan for walking or daily step goal (action planning, goal setting). Additionally, participants may have been directed to a study dashboard to view their step counts or compare themselves with peers (self-monitoring, feedback). At the end of each two week testing interval, participants were asked to complete the Self-Efficacy for Walking Scale (SEW_Dur), a validated measure that assesses the participant’s confidence in walking incremental distances and a survey measuring their satisfaction with the individual BCT that they tested. At the end of their trial, participants received a personalized summary of their step data in response to each individual BCT. Analysis will examine the novel individual-level heterogeneity of treatment effect made possible by N-of-1 design, and pool results across participants to efficiently estimate the overall efficacy of the selected behavioral change techniques in increasing low-intensity walking by 2,000 steps, 5 days per week. Self-efficacy will be explored as the likely mechanism of action prompting behavior change. This study will inform the providers and demonstrate the feasibility of N-of-1 study design to effectively promote physical activity as a component of healthy aging.

Keywords: aging, exercise, habit, walking

Procedia PDF Downloads 108
198 Medical Workforce Knowledge of Adrenaline (Epinephrine) Administration in Anaphylaxis in Adults Considerably Improved with Training in a UK Hospital from 2010 to 2017

Authors: Jan C. Droste, Justine Burns, Nithin Narayan

Abstract:

Introduction: Life-threatening detrimental effects of inappropriate adrenaline (epinephrine) administration, e.g., by giving the wrong dose, in the context of anaphylaxis management is well documented in the medical literature. Half of the fatal anaphylactic reactions in the UK are iatrogenic, and the median time to a cardio-respiratory arrest can be as short as 5 minutes. It is therefore imperative that hospital doctors of all grades have active and accurate knowledge of the correct route, site, and dosage of administration of adrenaline. Given this time constraint and the potential fatal outcome with inappropriate management of anaphylaxis, it is alarming that surveys over the last 15 years have repeatedly shown only a minority of doctors to have accurate knowledge of adrenaline administration as recommended by the UK Resuscitation Council guidelines (2008 updated 2012). This comparison of survey results of the medical workforce over several years in a small NHS District General Hospital was conducted in order to establish the effect of the employment of multiple educational methods regarding adrenaline administration in anaphylaxis in adults. Methods: Between 2010 and 2017, several education methods and tools were used to repeatedly inform the medical workforce (doctors and advanced clinical practitioners) in a single district general hospital regarding the treatment of anaphylaxis in adults. Whilst the senior staff remained largely the same cohort, junior staff had changed fully in every survey. Examples included: (i) Formal teaching -in Grand Rounds; during the junior doctors’ induction process; advanced life support courses (ii) In-situ simulation training performed by the clinical skills simulation team –several ad hoc sessions and one 3-day event in 2017 visiting 16 separate clinical areas performing an acute anaphylaxis scenario using actors- around 100 individuals from multi-disciplinary teams were involved (iii) Hospital-wide distribution of the simulation event via the Trust’s Simulation Newsletter (iv) Laminated algorithms were attached to the 'crash trolleys' (v) A short email 'alert' was sent to all medical staff 3 weeks prior to the survey detailing the emergency treatment of anaphylaxis (vi) In addition, the performance of the surveys themselves represented a teaching opportunity when gaps in knowledge could be addressed. Face to face surveys were carried out in 2010 ('pre-intervention), 2015, and 2017, in the latter two occasions including advanced clinical practitioners (ACP). All surveys consisted of convenience samples. If verbal consent to conduct the survey was obtained, the medical practitioners' answers were recorded immediately on a data collection sheet. Results: There was a sustained improvement in the knowledge of the medical workforce from 2010 to 2017: Answers improved regarding correct drug by 11% (84%, 95%, and 95%); the correct route by 20% (76%, 90%, and 96%); correct site by 40% (43%, 83%, and 83%) and the correct dose by 45% (27%, 54%, and 72%). Overall, knowledge of all components -correct drug, route, site, and dose-improved from 13% in 2010 to 62% in 2017. Conclusion: This survey comparison shows knowledge of the medical workforce regarding adrenaline administration for treatment of anaphylaxis in adults can be considerably improved by employing a variety of educational methods.

Keywords: adrenaline, anaphylaxis, epinephrine, medical education, patient safety

Procedia PDF Downloads 108
197 Improving the Technology of Assembly by Use of Computer Calculations

Authors: Mariya V. Yanyukina, Michael A. Bolotov

Abstract:

Assembly accuracy is the degree of agreement between the actual values of the parameters obtained during assembly and the values specified in the assembly drawings and technical specifications. However, assembly accuracy depends not only on the quality of the production process but also on the correctness of the assembly process. Therefore, preliminary calculations of assembly stages are carried out to verify the correspondence of the real geometric parameters to their acceptable values. In the aviation industry, most calculations involve interacting dimensional chains, which greatly complicates the task. Solving such problems requires a special approach. The purpose of this article is to address the problem of improving the assembly technology of aviation units by the use of computer calculations. One typical example of an assembly unit containing an interacting dimensional chain is the turbine wheel of a gas turbine engine. The dimensional chain of the turbine wheel is formed by the geometric parameters of the disk and the set of blades. The interaction of the dimensional chain consists in the formation of two chains. The first chain is formed by the dimensions that determine the location of the grooves for the installation of the blades and by the dimensions of the blade roots. The second dimensional chain is formed by the dimensions of the airfoil shroud platform. The interaction of the dimensional chains of the turbine wheel is the interdependence of the first and second chains by means of force circuits formed by the middle parts of the turbine blades. The need to calculate the dimensional chain of the turbine wheel arises from the need to improve the assembly technology of this unit. The task at hand contains geometric and mathematical components; therefore, its solution can be implemented following this algorithm: 1) research and analysis of production errors of the geometric parameters; 2) development of a parametric model in a CAD system; 3) creation of a set of CAD models of parts taking into account actual or generalized distributions of the errors of the geometric parameters; 4) calculation of the model in a CAE system, loading various combinations of part models; 5) accumulation of statistics and analysis. The main task is to pre-simulate the assembly process by calculating the interacting dimensional chains. The article describes the approach to the solution from the point of view of mathematical statistics, implemented in the software package Matlab. Within the framework of the study, measurement data on the components of the turbine wheel (blades and disks) are available, from which it is expected that the assembly process of the unit will be optimized by solving the dimensional chains.
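A small illustration of the statistical pre-simulation idea in step 5 above: a Monte Carlo tolerance stack-up of a closing link computed from sampled component dimensions. The chain equation, nominals and tolerances are invented for the example and do not correspond to the turbine wheel data.

```python
# Minimal sketch: Monte Carlo tolerance analysis of a dimensional chain.
# The chain equation and tolerances are illustrative, not the turbine-wheel data.
import numpy as np

rng = np.random.default_rng(1)
N = 100_000

# Component dimensions as normal distributions (nominal, std from tolerance/3).
groove_pitch = rng.normal(12.00, 0.02 / 3, N)     # mm, assumed
blade_root   = rng.normal(11.55, 0.03 / 3, N)     # mm, assumed
shroud_gap   = rng.normal(0.40, 0.05 / 3, N)      # mm, assumed

# Closing link of the (invented) chain: clearance between adjacent shroud platforms.
closing_link = groove_pitch - blade_root - shroud_gap

lower, upper = -0.05, 0.15                        # assumed acceptable limits, mm
yield_fraction = np.mean((closing_link >= lower) & (closing_link <= upper))

print(f"mean = {closing_link.mean():.4f} mm, std = {closing_link.std():.4f} mm")
print(f"fraction of assemblies within limits: {yield_fraction:.4f}")
```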

Keywords: accuracy, assembly, interacting dimension chains, turbine

Procedia PDF Downloads 350
196 Mental Health Promotion for Children of Mentally Ill Parents in Schools. Assessment and Promotion of Teacher Mental Health Literacy in Order to Promote Child Related Mental Health (Teacher-MHL)

Authors: Dirk Bruland, Paulo Pinheiro, Ullrich Bauer

Abstract:

Introduction: Over 3 million children, about one quarter of all students, experience at least one parent with mental disorder in Germany every year. Children of mentally-ill parents are at considerably higher risk of developing serious mental health problems. The different burden patterns and coping attempts often become manifest in children's school lives. In this context, schools can have an important protective function, but can also create risk potentials. In reference to Jorm, pupil-related teachers’ mental health literacy (Teacher-MHL) includes the ability to recognize change behaviour, the knowledge of risk factors, the implementation of first aid intervention, and seeking professional help (teacher as gatekeeper). Although teachers’ knowledge and increased awareness of this topic is essential, the literature provides little information on the extent of teachers' abilities. As part of a German-wide research consortium on health literacy, this project, launched in March for 3 years, will conduct evidence-based mental health literacy research. The primary objective is to measure Teacher-MHL in the context of pupil-related psychosocial factors at primary and secondary schools (grades 5 & 6), while also focussing on children’s social living conditions. Methods: (1) A systematic literature review in different databases to identify papers with regard to Teacher-MHL (completed). (2) Based on these results, an interview guide was developed. This research step includes a qualitative pre-study to inductively survey the general profiles of teachers (n=24). The evaluation will be presented on the conference. (3) These findings will be translated into a quantitative teacher survey (n=2500) in order to assess the extent of socio-analytical skills of teachers as well as in relation to institutional and individual characteristics. (4) Based on results 1 – 3, developing a training program for teachers. Results: The review highlights a lack of information for Teacher-MHL and their skills, especially related to high-risk-groups like children of mentally ill parents. The literature is limited to a few studies only. According to these, teacher are not good at identifying burdened children and if they identify those children they do not know how to handle the situations in school. They are not sufficiently trained to deal with these children, especially there are great uncertainties in dealing with the teaching situation. Institutional means and resources are missing as well. Such a mismatch can result in insufficient support and use of opportunities for children at risk. First impressions from the interviews confirm these results and allow a greater insight in the everyday school-life according to critical life events in families. Conclusions: For the first time schools will be addressed as a setting where children are especially "accessible" for measures of health promotion. Addressing Teacher-MHL gives reason to expect high effectiveness. Targeting professionals' abilities for dealing with this high-risk-group leads to a discharge for teacher themselves to handle those situations and increases school health promotion. In view of the fact that only 10-30% of such high-risk families accept offers of therapy and assistance, this will be the first primary preventive and health-promoting approach to protect the health of a yet unaffected, but particularly burdened, high-risk group.

Keywords: children of mentally ill parents, health promotion, mental health literacy, school

Procedia PDF Downloads 515
195 Dog Chest Homogeneous Phantom for Image Optimization

Authors: Maris Eugênia Dela Rosa, Ana Luiza Menegatti Pavan, Marcela De Oliveira, Diana Rodrigues De Pina, Luis Carlos Vulcano

Abstract:

In veterinary as well as human medicine, radiological studies are essential for a safe diagnosis in clinical practice, so the quality of the radiographic image is crucial. In recent years there has been increasing substitution of screen-film image acquisition systems by computed radiography (CR) equipment without corresponding adjustment of technical charts. Furthermore, carrying out a radiographic examination on a veterinary patient requires human assistance for restraint, which can compromise image quality, increase the dose to the animal and to occupationally exposed staff, and raise costs for the institution. Image optimization procedures and the construction of radiographic technique charts are performed with the use of homogeneous phantoms. In this study, we sought to develop a homogeneous phantom of the canine chest to be applied to the optimization of these images for the CR system. To build the simulator, a database was created with retrospective chest computed tomography (CT) images from the Veterinary Hospital of the Faculty of Veterinary Medicine and Animal Science - UNESP (FMVZ / Botucatu). Images were divided into four groups according to animal weight, using the size classification proposed by Hoskins & Goldston. The thicknesses of biological tissues were quantified in 80 animals, separated into groups of 20 animals according to their weights: (S) Small - equal to or less than 9.0 kg, (M) Medium - between 9.0 and 23.0 kg, (L) Large - between 23.1 and 40.0 kg, and (G) Giant - over 40.1 kg. Mean weight for group (S) was 6.5±2.0 kg, (M) 15.0±5.0 kg, (L) 32.0±5.5 kg and (G) 50.0±12.0 kg. An algorithm was developed in Matlab in order to classify and quantify the biological tissues present in the CT images and convert them into simulator materials. To classify the tissues present, membership functions were created from the retrospective CT scans according to tissue type (adipose, muscle, trabecular or cortical bone, and lung tissue). After converting the biological tissue thicknesses into equivalent material thicknesses (acrylic simulating soft tissue, aluminum simulating bone, and air simulating lung), four homogeneous phantoms were obtained: (S) 5 cm of acrylic, 0.14 cm of aluminum and 1.8 cm of air; (M) 8.7 cm of acrylic, 0.2 cm of aluminum and 2.4 cm of air; (L) 10.6 cm of acrylic, 0.27 cm of aluminum and 3.1 cm of air; and (G) 14.8 cm of acrylic, 0.33 cm of aluminum and 3.8 cm of air. The developed canine homogeneous phantom is a practical tool that will be employed in future work to optimize veterinary X-ray procedures.
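
The abstract does not reproduce the Matlab routine, but the underlying idea of classifying CT voxels by attenuation range and converting tissue thicknesses into equivalent simulator materials can be sketched in a few lines of Python. The Hounsfield-unit ranges, voxel spacing and equivalence factors below are illustrative assumptions, not the authors' calibrated membership functions.

```python
import numpy as np

# Illustrative Hounsfield-unit ranges (assumed, not the authors' calibrated membership functions)
HU_RANGES = {
    "lung":    (-1000, -500),
    "adipose": (-150,   -30),
    "muscle":  (10,      80),
    "bone":    (300,   3000),
}

# Assumed equivalence factors: cm of simulator material per cm of biological tissue
EQUIVALENCE = {
    "adipose": ("acrylic", 0.9),
    "muscle":  ("acrylic", 1.0),
    "bone":    ("aluminum", 0.25),
    "lung":    ("air", 1.0),
}

def classify_ray(hu_values, voxel_size_cm=0.1):
    """Return the thickness (cm) of each tissue type along one ray of a CT slice."""
    thickness = {}
    for tissue, (lo, hi) in HU_RANGES.items():
        mask = (hu_values >= lo) & (hu_values <= hi)
        thickness[tissue] = mask.sum() * voxel_size_cm
    return thickness

def to_phantom_materials(tissue_thickness):
    """Convert biological tissue thicknesses into equivalent acrylic/aluminum/air thicknesses."""
    phantom = {"acrylic": 0.0, "aluminum": 0.0, "air": 0.0}
    for tissue, cm in tissue_thickness.items():
        material, factor = EQUIVALENCE[tissue]
        phantom[material] += cm * factor
    return phantom

if __name__ == "__main__":
    # A synthetic anteroposterior ray: lung, then muscle, then bone
    ray = np.concatenate([np.full(30, -800), np.full(50, 40), np.full(5, 700)])
    print(to_phantom_materials(classify_ray(ray)))
```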

Keywords: radiation protection, phantom, veterinary radiology, computed radiography

Procedia PDF Downloads 395
194 Bayesian Estimation of Hierarchical Models for Genotypic Differentiation of Arabidopsis thaliana

Authors: Gautier Viaud, Paul-Henry Cournède

Abstract:

Plant growth models have been used extensively for the prediction of the phenotypic performance of plants. However, they most often remain calibrated for a given genotype and therefore do not take into account genotype by environment interactions. One way of achieving such an objective is to consider Bayesian hierarchical models. Three levels can be identified in such models: the first level describes how a given growth model describes the phenotype of the plant as a function of individual parameters, the second level describes how these individual parameters are distributed within a plant population, and the third level corresponds to the attribution of priors on the population parameters. Thanks to the Bayesian framework, choosing appropriate priors for the population parameters makes it possible to derive analytical expressions for the full conditional distributions of these population parameters. As plant growth models are of a nonlinear nature, individual parameters cannot be sampled explicitly, and a Metropolis step must be performed. This allows for the use of a hybrid Gibbs-Metropolis sampler. A generic approach was devised for the implementation of both general state space models and estimation algorithms within a programming platform. It was designed using the Julia language, which combines an elegant syntax with metaprogramming capabilities and exhibits high efficiency. Results were obtained for Arabidopsis thaliana on both simulated and real data. An organ-scale Greenlab model for the latter is thus presented, in which the surface area of each individual leaf can be simulated. It is assumed that the error made on the measurement of leaf areas is proportional to the leaf area itself; multiplicative normal noises for the observations are therefore used. Real data were obtained via image analysis of zenithal images of Arabidopsis thaliana over a period of 21 days using a two-step segmentation and tracking algorithm which notably takes advantage of the Arabidopsis thaliana phyllotaxy. Since the model formulation is rather flexible, there is no need for the data for a single individual to be available at all times, nor for the times at which data are available to be the same for all individuals. This makes it possible to discard data from image analysis when they are not considered reliable enough, thereby providing low-biased data in large quantity for leaf areas. The proposed model precisely reproduces the dynamics of Arabidopsis thaliana's growth while accounting for the variability between genotypes. In addition to the estimation of the population parameters, the level of variability is an interesting indicator of the genotypic stability of model parameters. A promising perspective is to test whether some of the latter should be considered as fixed effects.
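
To make the structure of the hybrid Gibbs-Metropolis sampler concrete, the following stripped-down Python sketch performs one possible version of the sweep: a Metropolis step for each individual parameter through a nonlinear growth function with multiplicative noise, followed by Gibbs updates of the population mean and variance. The toy logistic growth function, priors and tuning constants are illustrative assumptions only; they are not the organ-scale Greenlab model or the authors' Julia implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def growth(theta, t):
    # Toy logistic curve standing in for the organ-scale Greenlab model (assumption)
    return 10.0 / (1.0 + np.exp(-theta * (t - 10.0)))

def log_lik(theta, t, y, sigma):
    mu = growth(theta, t)
    # Multiplicative noise: measurement error proportional to the simulated leaf area
    return np.sum(-0.5 * ((y - mu) / (sigma * mu)) ** 2 - np.log(sigma * mu))

def gibbs_metropolis(data, n_iter=2000, sigma=0.05):
    """data: list of (t, y) arrays, one entry per individual plant."""
    n = len(data)
    theta = np.full(n, 0.5)      # individual parameters
    mu_pop, tau2 = 0.5, 1.0      # population mean and variance
    for _ in range(n_iter):
        # Metropolis step for each individual parameter (model is nonlinear in theta)
        for i, (t, y) in enumerate(data):
            prop = theta[i] + rng.normal(0, 0.05)
            log_ratio = (log_lik(prop, t, y, sigma) - log_lik(theta[i], t, y, sigma)
                         - 0.5 * ((prop - mu_pop) ** 2 - (theta[i] - mu_pop) ** 2) / tau2)
            if np.log(rng.uniform()) < log_ratio:
                theta[i] = prop
        # Gibbs updates: flat prior on the population mean, inverse-gamma prior on the variance
        mu_pop = rng.normal(theta.mean(), np.sqrt(tau2 / n))
        a, b = 2.0 + n / 2, 0.5 + 0.5 * np.sum((theta - mu_pop) ** 2)
        tau2 = 1.0 / rng.gamma(a, 1.0 / b)
    return theta, mu_pop, tau2

if __name__ == "__main__":
    t = np.arange(1.0, 22.0)  # 21 days of observations
    data = [(t, growth(th, t) * (1 + 0.05 * rng.normal(size=t.size)))
            for th in rng.normal(0.6, 0.1, size=5)]
    print(gibbs_metropolis(data)[1:])
```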

Keywords: bayesian, genotypic differentiation, hierarchical models, plant growth models

Procedia PDF Downloads 281
193 Design and Construction of a Home-Based, Patient-Led, Therapeutic, Post-Stroke Recovery System Using Iterative Learning Control

Authors: Marco Frieslaar, Bing Chu, Eric Rogers

Abstract:

Stroke is a devastating illness that is the second biggest cause of death in the world (after heart disease). Where it does not kill, it leaves survivors with debilitating sensory and physical impairments that not only seriously harm their quality of life, but also cause a high incidence of severe depression. It is widely accepted that early intervention is essential for recovery, but current rehabilitation techniques largely favor hospital-based therapies, which have restricted access, require expensive specialist equipment, and tend to side-step the emotional challenges. In addition, there is insufficient funding available to provide the long-term assistance that is required. As a consequence, recovery rates are poor. The relatively unexplored solution is to develop therapies that can be harnessed in the home and are formulated from technologies that already exist in everyday life. This would empower individuals to take control of their own improvement and provide choice in terms of when and where they feel best able to undertake their own healing. This research seeks to identify how effective post-stroke rehabilitation therapy can be applied to upper limb mobility within the physical context of a home rather than a hospital. This is being achieved through the design and construction of an automation scheme, based on iterative learning control and the Riener muscle model, that has the ability to adapt to the user, react to their level of fatigue, and provide tangible physical recovery. It utilizes a SMART phone and laptop to construct an iterative learning control (ILC) system that monitors upper arm movement in three dimensions as a series of exercises are undertaken. The equipment generates functional electrical stimulation to assist in muscle activation and thus improve directional accuracy. In addition, it monitors speed, accuracy, areas of motion weakness and similar parameters to create a performance index that can be compared over time and extrapolated to establish an independent and objective assessment scheme, plus an approximate estimation of the predicted final outcome. To further extend its assessment capabilities, nerve conduction velocity readings are taken by the software between the shoulder and hand muscles. These are used to measure the speed of neuron signal transfer along the arm so that, over time, an online indication of regeneration levels can be obtained. This will show whether or not sufficient training intensity is being achieved even before perceivable movement dexterity is observed. The device also provides the option to connect to other users, via the internet, so that the patient can avoid feelings of isolation and can undertake movement exercises together with others in a similar position. This should create benefits not only for the encouragement of rehabilitation participation, but also the potential for an emotional support network. It is intended that this approach will extend the availability of stroke recovery options, enable ease of access at a low cost, reduce susceptibility to depression and, through these endeavors, enhance the overall recovery success rate.
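
The abstract does not give the control law itself, but the essence of iterative learning control, updating the stimulation input from one repetition of an exercise to the next using the previous trial's tracking error, can be illustrated with a minimal Python sketch. The first-order plant and the learning gain below are placeholders for illustration only, not the Riener muscle model or the authors' tuning.

```python
import numpy as np

def simulate_arm(u):
    """Placeholder first-order response standing in for the Riener muscle model (assumption)."""
    y = np.zeros_like(u)
    for k in range(1, len(u)):
        y[k] = 0.9 * y[k - 1] + 0.1 * u[k - 1]
    return y

def ilc(reference, trials=20, gain=0.8):
    """P-type ILC: u_{j+1}(k) = u_j(k) + L * e_j(k+1), applied over repeated exercise trials."""
    u = np.zeros_like(reference)
    for j in range(trials):
        y = simulate_arm(u)
        e = reference - y
        # Shift the error forward by one sample to account for the one-step plant delay
        u = u + gain * np.append(e[1:], 0.0)
        print(f"trial {j + 1}: RMS tracking error = {np.sqrt(np.mean(e ** 2)):.4f}")
    return u

if __name__ == "__main__":
    t = np.linspace(0, 1, 100)
    reference = np.sin(np.pi * t)   # desired reaching trajectory
    ilc(reference)
```

Because the same exercise is repeated, the error signal measured in one trial can be fed forward into the next, which is why ILC suits rehabilitation tasks built around repetition.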

Keywords: home-based therapy, iterative learning control, Riener muscle model, SMART phone, stroke rehabilitation

Procedia PDF Downloads 243
192 Traumatic Brain Injury Neurosurgical Care Continuum Delays in Mulago Hospital in Kampala Uganda

Authors: Silvia D. Vaca, Benjamin J. Kuo, Joao Ricardo Nickenig Vissoci, Catherine A. Staton, Linda W. Xu, Michael Muhumuza, Hussein Ssenyonjo, John Mukasa, Joel Kiryabwire, Henry E. Rice, Gerald A. Grant, Michael M. Haglund

Abstract:

Background: Patients with traumatic brain injury (TBI) can develop rapid neurological deterioration from swelling and intracranial hematomas, which can result in focal tissue ischemia, brain compression, and herniation. Moreover, delays in management increase the risk of secondary brain injury from hypoxemia and hypotension. Therefore, in TBI patients with subdural hematomas (SDHs) and epidural hematomas (EDHs), surgical intervention is both necessary and time sensitive. Significant delays are seen along the care continuum in low- and middle-income countries (LMICs), largely due to limited healthcare capacity to address the disproportionate rates of TBI in Sub-Saharan Africa (SSA). While many LMICs have subsidized systems to offset surgical costs, the burden on patients of securing funds for medications, supplies, and CT diagnostics poses a significant challenge to timely surgical intervention. In Kampala, Uganda, the challenge of obtaining timely CT scans is twofold: logistical and financial barriers. These bottlenecks contribute significantly to care continuum delays and are associated with poor TBI outcomes. Objective: The objectives of this study are to 1) describe the temporal delays through a modified three delays model that fits the context of neurosurgical interventions for TBI patients in Kampala and 2) investigate the association between delays and mortality. Methods: Prospective data were collected for 563 TBI patients presenting to a tertiary hospital in Kampala from 1 June to 30 November 2016. Four time intervals were constructed from five time points: injury, hospital arrival, neurosurgical evaluation, CT results, and definitive surgery. Time interval differences among mild, moderate and severe TBI and their association with mortality were analyzed. Results: The mortality rate of all TBI patients presenting to Mulago National Referral Hospital (MNRH) was 9.6%, ranging from 4.7% for mild and moderate TBI patients receiving surgery to 81.8% for severe TBI patients who failed to receive surgery. The duration from injury to surgery varied considerably across TBI severity, with the largest gap seen between mild TBI (174 hours) and severe TBI (69 hours) patients. Further analysis revealed care continuum differences for interval 3 (neurosurgical evaluation to CT result) and interval 4 (CT result to surgery) between severe TBI patients (7 hours for interval 3 and 24 hours for interval 4) and mild TBI patients (19 hours for interval 3 and 96 hours for interval 4). These post-arrival delays were associated with mortality for mild (p=0.05) and moderate TBI (p=0.03) patients. Conclusions: To our knowledge, this is the first analysis using a modified 'three delays' framework to analyze the care continuum of TBI patients in Uganda from injury to surgery. We found significant associations between delays and mortality for mild and moderate TBI patients. As it currently stands, poorer outcomes were observed for those mild and moderate TBI patients who were managed non-operatively or failed to receive surgery while surgical services were shunted to more severely ill patients. While well intentioned, high mortality rates were still observed for the severe TBI patients managed surgically. These results suggest the need for future research to optimize triage practices, understand delay contributors, and improve pre-hospital logistical referral systems.
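
As an illustration of how the four care-continuum intervals can be derived from the five recorded time points, a short pandas sketch is shown below. The column names and timestamps are hypothetical and do not reflect the study's actual data dictionary.

```python
import pandas as pd

# Hypothetical patient records with the five time points used to build the four intervals
df = pd.DataFrame({
    "severity": ["mild", "severe"],
    "injury": pd.to_datetime(["2016-06-01 08:00", "2016-06-02 22:15"]),
    "arrival": pd.to_datetime(["2016-06-01 12:30", "2016-06-02 23:05"]),
    "neurosurgical_eval": pd.to_datetime(["2016-06-02 09:00", "2016-06-03 01:00"]),
    "ct_result": pd.to_datetime(["2016-06-03 04:00", "2016-06-03 08:00"]),
    "surgery": pd.to_datetime(["2016-06-07 04:00", "2016-06-04 08:00"]),
})

points = ["injury", "arrival", "neurosurgical_eval", "ct_result", "surgery"]
for i, (start, end) in enumerate(zip(points[:-1], points[1:]), start=1):
    # Interval i in hours between consecutive time points
    df[f"interval_{i}_hours"] = (df[end] - df[start]).dt.total_seconds() / 3600

print(df.filter(like="interval_").groupby(df["severity"]).mean())
```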

Keywords: care continuum, global neurosurgery, Kampala Uganda, LMIC, Mulago, traumatic brain injury

Procedia PDF Downloads 190
191 A New Method Separating Relevant Features from Irrelevant Ones Using Fuzzy and OWA Operator Techniques

Authors: Imed Feki, Faouzi Msahli

Abstract:

Selection of relevant parameters from a high-dimensional process operation setting space is a problem frequently encountered in industrial process modelling. This paper presents a method for selecting the most relevant fabric physical parameters for each sensory quality feature. The proposed relevancy criterion has been developed using two approaches. The first utilizes a fuzzy sensitivity criterion, exploiting from experimental data the relationship between the physical parameters and all the sensory quality features for each evaluator; an OWA aggregation procedure is then applied to aggregate the ranking lists provided by the different evaluators. In the second approach, another panel of experts provides their ranking lists of physical features according to their professional knowledge. By again applying OWA and a fuzzy aggregation model, the data sensitivity-based ranking list and the knowledge-based ranking list are combined using our proposed percolation technique to determine the final ranking list. The key issue of the proposed percolation technique is to filter the relevant features automatically and objectively by creating a gap between the scores of relevant and irrelevant parameters. It generates thresholds automatically, which can effectively reduce the human subjectivity and arbitrariness involved in choosing thresholds manually. For a specific sensory descriptor, the threshold is defined systematically by iteratively aggregating (n times) the ranking lists generated by the OWA and fuzzy models, according to a specific algorithm. Having applied the percolation technique to a real example, a well-known finished textile product (stonewashed denim) whose sensory attributes are usually considered the most important quality criteria in jeans evaluation, we separated the relevant physical features from the irrelevant ones for each sensory descriptor. The originality and performance of the proposed relevant feature selection method are shown by the variability in the number of physical features in the set of selected relevant parameters. Instead of selecting identical numbers of features with a predefined threshold, the proposed method can be adapted to the specific nature of the complex relations between sensory descriptors and physical features, in order to propose lists of relevant features of different sizes for different descriptors. In order to obtain more reliable results for the selection of relevant physical features, the percolation technique has been applied to combine the fuzzy global relevancy and OWA global relevancy criteria so as to clearly distinguish the scores of the relevant physical features from those of the irrelevant ones.
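
A minimal Python sketch of the two ingredients named above, OWA aggregation of several evaluators' relevancy scores and a gap-based cut separating relevant from irrelevant features, is given below. The feature names, scores and OWA weights are invented for illustration, and the percolation step is reduced to a simple largest-gap search rather than the authors' iterative procedure.

```python
import numpy as np

def owa(scores, weights):
    """Ordered weighted averaging: sort the scores in descending order before weighting."""
    return float(np.dot(np.sort(scores)[::-1], weights))

def largest_gap_cut(aggregated):
    """Split features at the largest gap between consecutive sorted scores (simplified percolation)."""
    ranked = sorted(aggregated.items(), key=lambda kv: kv[1], reverse=True)
    values = [v for _, v in ranked]
    gaps = [values[i] - values[i + 1] for i in range(len(values) - 1)]
    cut = gaps.index(max(gaps)) + 1
    return [name for name, _ in ranked[:cut]]

if __name__ == "__main__":
    # Hypothetical relevancy scores of four fabric parameters given by three evaluators
    scores = {
        "thickness":        [0.90, 0.80, 0.85],
        "weight":           [0.70, 0.75, 0.60],
        "bending_rigidity": [0.20, 0.10, 0.30],
        "air_permeability": [0.15, 0.20, 0.10],
    }
    weights = np.array([0.5, 0.3, 0.2])   # assumed OWA weights favouring the highest scores
    aggregated = {name: owa(vals, weights) for name, vals in scores.items()}
    print("selected as relevant:", largest_gap_cut(aggregated))
```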

Keywords: data sensitivity, feature selection, fuzzy logic, OWA operators, percolation technique

Procedia PDF Downloads 576
190 Navigating Rapids And Collecting Medical Insights: A Data Collection Of Athletes Presenting To The Medical Team At The International Canoe Federation Canoe Slalom World Championships 2023

Authors: Dr Grace Scaplehorn, Mr Muhammad Adeel Akhtar, Dr Jane Gibson

Abstract:

Background: Canoe Slalom entails the skilful navigation of a carbon composite canoe or kayak through a series of 18-25 hanging gates, strategically positioned along the course, either upstream or downstream, amidst currents of whitewater rapids in natural and man-made river settings. Athletes compete individually in timed trials, racing for the fastest course time, typically around 80 to 120 seconds. In the new discipline of Kayak Cross, descents of the course are initiated by groups of four athletes freefalling simultaneously from a starting platform situated 3 m above the river. Kayak Cross athletes, in contrast to Canoe Slalom, can make physical contact with the suspended gates without incurring time penalties and are required to perform a kayak roll halfway down the course. The Canoe Slalom World Championships were held at Lee Valley Whitewater Centre, London, from 19th to 24th September 2023. The event comprised 299 international athletes competing for 10 World Championship titles in Canoe/Kayak Slalom events (Olympic debut Munich 1972) and the new Kayak Cross discipline (Olympic debut Paris 2024). The inaugural appearance of Kayak Cross at the World Championships occurred in 2017, in Pau, France. There is limited literature on Kayak Cross and the incidence of athlete injuries compared to traditional Canoe Slalom, hence it was felt important to undertake this review to address the perception that the event is dangerous. Aim: The study aimed to quantify and collate data collected from athletes presenting to the event medical centre. Methods: Athletes' details were collected at initial assessment from the start of the practice period (16th-18th September) and throughout the event. Demographics such as age, sex and nationality were recorded along with presenting complaints, treatment, medication administered and outcome. Injuries were then sub-classified into body regions. The data do not include athletes who sought medical attention from their own governing body's medical team. Results: During the 8-day period, there were 11 individual presentations to the medical centre, 3.7% of the athlete population (n=299). The mean age was 23.9 years (recorded for n=7), and 6 of the 10 athletes with sex recorded were male. The most common presentation was minor injury (n=9), with 6 being musculoskeletal and 3 comprising skin damage, followed by insect sting/allergy (n=1) and pain relief requests (n=1). Five presentations were event-related, all being musculoskeletal injuries: 2 shoulder/arm, 1 head/neck, 1 hand/wrist and 1 other (data were not recorded). Of these injuries, the only intervention was 2 cases of 400 mg Ibuprofen, given for both shoulder/arm injuries. Four of the 11 presentations were pre-existing injuries that had been exacerbated by the increased intensity of practice. Two patients were advised to return for review, with 100% compliance. There were no unplanned re-presentations and no emergency transfers to secondary care. The Kayak Cross and Canoe Slalom competitions each resulted in 1 new event-related athlete presentation. Conclusion: The event resulted in a negligible incidence of presentations at the medical centre, for both Kayak Cross and Canoe Slalom. These data hold significance in informing the risk assessments and medical protocols necessary for the organisation of canoe slalom events.

Keywords: canoe slalom, kayak cross, athlete injuries, event injuries

Procedia PDF Downloads 33
189 Invisible to Invaluable - How Social Media is Helping Tackle Stigma and Discrimination Against Informal Waste Pickers of Bengaluru

Authors: Varinder Kaur Gambhir, Neema Gupta, Sonal Tickoo Chaudhuri

Abstract:

Bengaluru, a rapidly growing metropolis in India with a population of 12.5 million citizens, generates 5,757 metric tonnes of solid waste per day. Despite their invaluable contribution to waste management, society and the economy, waste pickers face significant stigma, suspicion and contempt, and are left with a sense of shame about their work. In this context, BBC Media Action was funded by the H&M Foundation to develop a 3-year, multi-phase social media campaign to shift perceptions of waste picking and informal waste pickers amongst the Bengaluru population. Research has been used to inform project strategy and adaptation at all stages. Formative research to inform campaign strategy used mixed methods - 14 focused group discussions followed by 406 online surveys - to explore people's knowledge of, and attitudes towards, waste pickers, and to identify potential barriers and motivators to changing perceptions. Use of qualitative techniques like metaphor maps (using a bank of pictures rather than direct questions to understand mindsets) helped establish the invisibility of informal waste pickers, and the quantitative research enabled audience segmentation based on attitudes towards informal waste pickers. To pretest the campaign idea, eight I-GDs (individual interactions followed by group discussions) were conducted to allow interviewees to first freely express their feelings individually, before discussing in a group. Robert Plutchik's 'wheel of emotions' was used to understand the audience's emotional response to the content. A robust monitoring and evaluation is being conducted (baseline and first phase of monitoring already completed) using a rotating longitudinal panel of 1,800 social media users (exposed and unexposed to the campaign), recruited face to face and representative of the social media universe of Bengaluru city. In addition, qualitative in-depth interviews are being conducted after each phase to better understand change drivers. The research methodology and ethical protocols for impact evaluation have been independently reviewed by an Institutional Review Board. Formative research revealed that while waste on the streets is visible and is of concern to the public, informal waste pickers are virtually 'invisible' for most people in Bengaluru. Pretesting research revealed that the creative outputs evoked emotions like acceptance and gratitude towards waste pickers, suggesting that the content had the potential to encourage attitudinal change. After the first phase of the campaign, social media analytics show that #Invaluables content reached at least 2.6 million unique people (21% of the Bengaluru population) through Facebook and Instagram. Further, impact monitoring results show significant improvements in spontaneous awareness of different segments of informal waste pickers (such as sorters at scrap shops or dry waste collection centres - from 10% at baseline to 16% amongst those exposed, with no change amongst the unexposed), recognition that informal waste pickers help the environment (from 71% at baseline to 77% among those exposed, with no change among the unexposed), and greater discussion about informal waste pickers among those exposed (60%) as against those not exposed (49%). Using the insights from this research, the planned social media intervention is designed to increase the visibility of and appreciation for the work of waste pickers in Bengaluru, supporting a more inclusive society.

Keywords: awareness, discussion, discrimination, informal waste pickers, invisibility, social media campaign, waste management

Procedia PDF Downloads 72
188 Supporting 'vulnerable' Students to Complete Their Studies During the Economic Crisis in Greece: The Umbrella Program of International Hellenic University

Authors: Rigas Kotsakis, Nikolaos Tsigilis, Vasilis Grammatikopoulos, Evridiki Zachopoulou

Abstract:

During the last decade, Greece has faced an unprecedented financial crisis affecting various aspects and functions of Higher Education. Besides the restricted funding of academic institutions, students and their families have encountered economic difficulties that undoubtedly influence the effective completion of their studies. In this context, a fairly large number of students at the Alexander campus of International Hellenic University (IHU) delay, interrupt, or even abandon their studies, especially when they come from low-income families, belong to sensitive social or special needs groups, have different cultural origins, etc. For this reason, a European project named "Umbrella" was initiated, aiming to provide the necessary psychological support and counseling, especially to disadvantaged students, towards the completion of their studies. To this end, a network of academic members (academic staff and students) from IHU, named iMentors, was implicated in different roles. Specifically, experienced academic staff trained students to serve as intermediate links for the integration and educational support of students who fall into the aforementioned sensitive social groups and face problems in completing their studies. The main idea of the project rests on its person-centered character, which facilitates direct student-to-student communication without the intervention of teaching staff. The backbone of the iMentors network is senior students who face no problems in their academic life and volunteered for this project. It should be noted that the Umbrella structure provides substantial and ethical rewards for their engagement. In this context, a well-defined, stringent methodology was implemented for the evaluation of the extent of the problem at IHU and the detection of the profile of "candidate" disadvantaged students. The first phase included two steps: (a) data collection and (b) data cleansing/preprocessing. The first step involved collecting data from the Secretary Services of all Schools in IHU, from 1980 to 2019, which resulted in 96,418 records. The data set included the school name, the semester of studies, the student's enrollment criteria, nationality, and the graduation year or the current, up-to-date academic state (still studying, delayed, dropped out, etc.). The second step of the methodology involved data cleansing/preprocessing because of the existence of "noisy" data, missing and erroneous values, etc. Furthermore, several assumptions and grouping actions were imposed to achieve data homogeneity and an easy-to-interpret subsequent statistical analysis. Specifically, the 40-year recording period was limited to the last 15 years (2004-2019), since in 2004 the Greek Technological Institutions evolved into Higher Education Universities, leading to a stable and unified framework of studies. In addition, the data concerning active students were excluded from the analysis, since the initial processing effort focused on the detection of factors/variables that differentiate graduates from students who abandoned their studies. The final working dataset included 21,432 records with only two categories of students: those who obtained a degree and those who abandoned their studies. Findings of the first phase are presented across faculties and further discussed.
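
The preprocessing described above (restricting the 1980-2019 export to 2004-2019, removing noisy records and active students, and keeping only graduates and dropouts) can be sketched with pandas as follows. The column names and status labels are hypothetical, since the actual Secretary Services schema is not given in the abstract.

```python
import pandas as pd

def preprocess(records: pd.DataFrame) -> pd.DataFrame:
    """Reduce the raw registry export to the working dataset of graduates vs. dropouts."""
    df = records.copy()
    # Keep only the unified study framework introduced in 2004
    df = df[df["enrollment_year"].between(2004, 2019)]
    # Drop records with missing key fields ("noisy" data)
    df = df.dropna(subset=["school", "enrollment_year", "status"])
    # Exclude students who are still active; keep graduates and those who abandoned their studies
    return df[df["status"].isin(["graduated", "abandoned"])]

if __name__ == "__main__":
    raw = pd.DataFrame({
        "school": ["Health Sciences", "Engineering", None, "Engineering"],
        "enrollment_year": [1998, 2010, 2012, 2016],
        "status": ["graduated", "abandoned", "graduated", "active"],
    })
    print(preprocess(raw))
```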

Keywords: higher education, students support, economic crisis, mentoring

Procedia PDF Downloads 91
187 The Association between C-Reactive Protein and Hypertension across Different US Participant Ethnicities: Findings from the National Health and Nutrition Examination Survey 1999-2010

Authors: Ghada Abo-Zaid

Abstract:

The main objective of this study was to examine the association between elevated levels of CRP and the incidence of hypertension, before and after adjusting for age, BMI, gender, SES, smoking, diabetes, LDL cholesterol and HDL cholesterol, and to determine whether the associations differ by race. Method: Cross-sectional data for participants aged 17 to 74 years included in the National Health and Nutrition Examination Survey (NHANES) from 1999 to 2010 were analysed. CRP level was classified into three categories (> 3 mg/L, between 1 mg/L and 3 mg/L, and < 1 mg/L). Blood pressure categorization was done using the JNC 7 algorithm: hypertension was defined as either systolic blood pressure (SBP) of 140 mmHg or more, diastolic blood pressure (DBP) of 90 mmHg or greater, or a self-reported prior diagnosis by a physician. Pre-hypertension was defined as (139 > SBP > 120 or 89 > DBP > 80). A multinomial regression model was used to measure the association between CRP level and hypertension. Results: In univariable models, CRP concentrations > 3 mg/L were associated with a 73% greater risk of incident hypertension compared with CRP concentrations < 1 mg/L (hypertension: odds ratio [OR] = 1.73; 95% confidence interval [CI], 1.50-1.99). Ethnic comparisons showed that Mexican Americans had the highest risk of incident hypertension (OR = 2.39; 95% CI, 2.21-2.58). This risk was statistically insignificant, however, either after controlling for the other variables (hypertension: OR = 0.75; 95% CI, 0.52-1.08) or when categorized by race (Mexican American: OR = 1.58; 95% CI, 0.58-4.26; Other Hispanic: OR = 0.87; 95% CI, 0.19-4.42; Non-Hispanic White: OR = 0.90; 95% CI, 0.50-1.59; Non-Hispanic Black: OR = 0.44; 95% CI, 0.22-0.87). The same results were found for pre-hypertension, and Non-Hispanic Blacks showed the highest significant risk of pre-hypertension (OR = 1.60; 95% CI, 1.26-2.03). When CRP concentrations were between 1.0 and 3.0 mg/L, in unadjusted models prehypertension was associated with a higher likelihood of elevated CRP (OR = 1.37; 95% CI, 1.15-1.62). The same relationship was maintained in Non-Hispanic Whites, Non-Hispanic Blacks, and other races (Non-Hispanic White: OR = 1.24; 95% CI, 1.03-1.48; Non-Hispanic Black: OR = 1.60; 95% CI, 1.27-2.03; other race: OR = 2.50; 95% CI, 1.32-4.74), while the association was insignificant in Mexican Americans and other Hispanics. In the adjusted model, the relationship between CRP and prehypertension was no longer present. Similarly, hypertension was not independently associated with elevated CRP, and the results were the same after grouping by race or adjusting for the confounding variables. The same results were obtained when SBP or DBP were analysed as continuous measures. Conclusions: This study confirmed the existence of an association between hypertension, prehypertension and elevated levels of CRP; however, this association was no longer present after adjusting for other variables. Ethnic group differences were statistically significant in the univariable models, while they disappeared after controlling for other variables.
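
A hedged sketch of the kind of multinomial model used to relate CRP category to blood-pressure status is shown below with statsmodels. The variable names and the simulated data frame are assumptions for illustration; they are not the NHANES extraction or the study's covariate set.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500

# Hypothetical analysis file; the real study uses NHANES 1999-2010 variables
df = pd.DataFrame({
    "crp_cat": rng.choice([0, 1, 2], size=n),   # 0: <1, 1: 1-3, 2: >3 mg/L
    "age": rng.integers(17, 75, size=n),
    "bmi": rng.normal(27, 5, size=n),
})
# Simulated outcome: 0 = normotensive, 1 = pre-hypertension, 2 = hypertension
logits = 0.4 * df["crp_cat"] + 0.02 * (df["age"] - 40)
p_hyp = 1 / (1 + np.exp(-(logits - 1.0)))
df["bp_status"] = np.where(rng.uniform(size=n) < p_hyp, 2, rng.choice([0, 1], size=n))

# Dummy-code the CRP categories (reference: <1 mg/L) and add covariates plus an intercept
exog = sm.add_constant(
    pd.get_dummies(df["crp_cat"], prefix="crp", drop_first=True)
      .join(df[["age", "bmi"]])
      .astype(float)
)
model = sm.MNLogit(df["bp_status"], exog).fit(disp=False)
print(np.exp(model.params))   # odds ratios relative to the normotensive reference category
```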

Keywords: CRP, hypertension, ethnicity, NHANES, blood pressure

Procedia PDF Downloads 390
186 Investigating the Influence of Activation Functions on Image Classification Accuracy via Deep Convolutional Neural Network

Authors: Gulfam Haider, Sana Danish

Abstract:

Convolutional Neural Networks (CNNs) have emerged as powerful tools for image classification, and the choice of optimizers profoundly affects their performance. The study of optimizers and their adaptations remains a topic of significant importance in machine learning research. While numerous studies have explored and advocated for various optimizers, the efficacy of these optimization techniques is still subject to scrutiny. This work aims to address the challenges surrounding the effectiveness of optimizers by conducting a comprehensive analysis and evaluation. The primary focus of this investigation lies in examining the performance of different optimizers when employed in conjunction with the popular activation function, Rectified Linear Unit (ReLU). By incorporating ReLU, known for its favorable properties in prior research, the aim is to bolster the effectiveness of the optimizers under scrutiny. Specifically, we evaluate the adjustment of these optimizers with both the original Softmax activation function and the modified ReLU activation function, carefully assessing their impact on overall performance. To achieve this, a series of experiments are conducted using a well-established benchmark dataset for image classification tasks, namely the Canadian Institute for Advanced Research dataset (CIFAR-10). The selected optimizers for investigation encompass a range of prominent algorithms, including Adam, Root Mean Squared Propagation (RMSprop), Adaptive Learning Rate Method (Adadelta), Adaptive Gradient Algorithm (Adagrad), and Stochastic Gradient Descent (SGD). The performance analysis encompasses a comprehensive evaluation of the classification accuracy, convergence speed, and robustness of the CNN models trained with each optimizer. Through rigorous experimentation and meticulous assessment, we discern the strengths and weaknesses of the different optimization techniques, providing valuable insights into their suitability for image classification tasks. By conducting this in-depth study, we contribute to the existing body of knowledge surrounding optimizers in CNNs, shedding light on their performance characteristics for image classification. The findings gleaned from this research serve to guide researchers and practitioners in making informed decisions when selecting optimizers and activation functions, thus advancing the state-of-the-art in the field of image classification with convolutional neural networks.
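
The abstract does not specify the network architecture or hyperparameters; a minimal Keras sketch of the experimental loop, the same small ReLU CNN trained on CIFAR-10 once per candidate optimizer, might look as follows. The layer sizes, epoch count and default learning rates are illustrative assumptions, not the authors' configuration.

```python
import tensorflow as tf
from tensorflow.keras import layers, models, optimizers

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.cifar10.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

def build_cnn():
    # Small ReLU CNN; the architecture is an assumption for illustration
    return models.Sequential([
        layers.Input(shape=(32, 32, 3)),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dense(10, activation="softmax"),
    ])

candidates = {
    "adam": optimizers.Adam(),
    "rmsprop": optimizers.RMSprop(),
    "adadelta": optimizers.Adadelta(),
    "adagrad": optimizers.Adagrad(),
    "sgd": optimizers.SGD(),
}

for name, opt in candidates.items():
    model = build_cnn()
    model.compile(optimizer=opt, loss="sparse_categorical_crossentropy", metrics=["accuracy"])
    model.fit(x_train, y_train, epochs=5, batch_size=128, verbose=0)
    _, acc = model.evaluate(x_test, y_test, verbose=0)
    print(f"{name}: test accuracy = {acc:.3f}")
```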

Keywords: deep neural network, optimizers, RMSprop, ReLU, stochastic gradient descent

Procedia PDF Downloads 79
185 Impact of Interdisciplinary Therapy Allied to Online Health Education on Cardiometabolic Parameters and Inflammation Factor Rating in Obese Adolescents

Authors: Yasmin A. M. Ferreira, Ana C. K. Pelissari, Sofia De C. F. Vicente, Raquel M. Da S. Campos, Deborah C. L. Masquio, Lian Tock, Lila M. Oyama, Flavia C. Corgosinho, Valter T. Boldarine, Ana R. Dâmaso

Abstract:

The prevalence of overweight and obesity is growing around the world and is currently considered a global epidemic. Food and nutrition are essential requirements for promoting health and protecting against non-communicable chronic diseases, such as obesity and cardiovascular disease. Specific dietary components may modulate inflammation and oxidative stress in obese individuals. Few studies have investigated the dietary Inflammation Factor Rating (IFR) in obese adolescents. The IFR was developed to characterize an individual's diet on an anti- to pro-inflammatory score. This evaluation contributes to investigating the effects of an inflammatory diet on the metabolic profile in several conditions. Objectives: The present study aims to investigate the effects of a multidisciplinary weight loss therapy on the inflammation factor rating and cardiometabolic risk in obese adolescents. Methods: A total of 26 volunteers (14-19 y.o.) were recruited and submitted to 20 weeks of interdisciplinary therapy allied to the health education website Ciclo do Emagrecimento®, including clinical, nutritional and psychological counseling and exercise training. Body weight was monitored weekly by self-report and photo. The adolescents answered a test to evaluate their knowledge of the topics covered in the videos. A 24-h dietary record was applied at baseline and after 20 weeks to assess food intake and to calculate the IFR. A negative IFR suggests that the diet may have inflammatory effects, and a positive IFR indicates an anti-inflammatory effect. Statistical analysis was performed using STATISTICA version 12.5 for Windows, with a significance level of α ≤ 5%. Data normality was verified with the Kolmogorov-Smirnov test. Data were expressed as mean±SD values. To analyze the effects of the intervention, the t-test was applied, and Pearson's correlation test was performed. Results: After 20 weeks of treatment, body mass index (BMI), body weight, body fat (kg and %), and abdominal and waist circumferences decreased significantly. The mean high-density lipoprotein cholesterol (HDL-c) increased after the therapy. Moreover, an improvement in the inflammation factor rating from -427.27±322.47 to -297.15±240.01 was found, suggesting beneficial effects of the nutritional counselling. Considering the correlation analyses, a pro-inflammatory diet was associated with increases in BMI, very low-density lipoprotein cholesterol (VLDL), triglycerides, insulin and the insulin resistance index (HOMA-IR), while an anti-inflammatory diet was associated with improvements in HDL-c and the Quantitative Insulin Sensitivity Check Index (QUICKI). Conclusion: The 20-week blended multidisciplinary therapy was effective in reducing body weight and anthropometric circumferences and in improving inflammatory markers in obese adolescents. In addition, our results showed that a more inflammatory diet profile is associated with worse cardiometabolic parameters, suggesting the relevance of stimulating anti-inflammatory dietary habits as an effective strategy to treat and control obesity and related comorbidities. Financial Support: FAPESP (2017/07372-1) and CNPq (409943/2016-9)
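
A brief illustration of the statistical step described above, a paired t-test on pre/post values and a Pearson correlation between changes in the IFR and a cardiometabolic parameter, is sketched below with SciPy. The numbers are simulated for illustration and are not the study data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n = 26  # sample size matching the number of volunteers in the study

# Simulated pre/post values of the inflammation factor rating (IFR), for illustration only
ifr_pre = rng.normal(-427, 322, size=n)
ifr_post = ifr_pre + rng.normal(130, 80, size=n)
t_stat, p_value = stats.ttest_rel(ifr_pre, ifr_post)
print(f"paired t-test: t = {t_stat:.2f}, p = {p_value:.4f}")

# Simulated association between the change in IFR and the change in HOMA-IR
delta_ifr = ifr_post - ifr_pre
delta_homa = -0.002 * delta_ifr + rng.normal(0, 0.3, size=n)
r, p = stats.pearsonr(delta_ifr, delta_homa)
print(f"Pearson r = {r:.2f}, p = {p:.4f}")
```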

Keywords: cardiometabolic risk, inflammatory diet, multidisciplinary therapy, obesity

Procedia PDF Downloads 174
184 Kinematic Modelling and Task-Based Synthesis of a Passive Architecture for an Upper Limb Rehabilitation Exoskeleton

Authors: Sakshi Gupta, Anupam Agrawal, Ekta Singla

Abstract:

An exoskeleton design for rehabilitation purposes encounters many challenges, including ergonomically acceptable wearing technology, architectural design, human-motion compatibility, actuation type, human-robot interaction, etc. In this paper, a passive architecture for an upper limb exoskeleton is proposed for assisting in rehabilitation tasks. Kinematic modelling is detailed for task-based kinematic synthesis of the wearable exoskeleton for self-feeding tasks. The exoskeleton architecture possesses expansion and torsional springs which are able to store and redistribute energy over the human arm joints. The elastic characteristics of the springs have been optimized to minimize the mechanical work of the human arm joints. A hybrid combination of a 4-bar parallelogram linkage and a serial linkage was chosen, where the 4-bar parallelogram linkage with an expansion spring acts as a rigid structure providing the rotational degree of freedom (DOF) required for lowering and raising the arm, while the single linkage with a torsional spring allows the rotational DOF required for elbow movement. The focus of the paper is the kinematic modelling, analysis and task-based synthesis framework for the proposed architecture, keeping in consideration the essential tasks of self-feeding and self-exercising during the rehabilitation of a partially healthy person. Rehabilitation targets primary functional movements (activities of daily living, ADL), the routine activities that people attend to every day, such as cleaning, dressing and feeding. We focus on the feeding process to make people independent with respect to feeding tasks. The tasks are aimed at post-surgery patients under rehabilitation with less than 40% weakness. A key challenge addressed in this work is ensuring that the natural movement of the human arm is emulated. Human motion data are extracted through motion sensors for the targeted tasks of feeding and specific exercises. The task-based synthesis procedure framework is discussed for the proposed architecture. The results include simulation of the architectural concept for tracking human-arm movements, along with the kinematic and static study parameters for a standard human weight. D-H parameters are used for kinematic modelling of the hybrid mechanism, and the model is used while performing task-based optimal synthesis utilizing an evolutionary algorithm.
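
Since the abstract states that D-H parameters are used for kinematic modelling of the hybrid mechanism, a small Python sketch of forward kinematics from a D-H table is given below. The two-joint table and link lengths stand in for the actual exoskeleton geometry, which is not published in the abstract.

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Standard Denavit-Hartenberg homogeneous transform for one joint."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_kinematics(dh_rows):
    """Chain the joint transforms; returns the end-effector pose in the base frame."""
    T = np.eye(4)
    for theta, d, a, alpha in dh_rows:
        T = T @ dh_transform(theta, d, a, alpha)
    return T

if __name__ == "__main__":
    # Hypothetical 2-DOF planar arm: shoulder rotation (upper-arm link) and elbow rotation (forearm link)
    shoulder, elbow = np.deg2rad(30), np.deg2rad(45)
    dh_table = [
        (shoulder, 0.0, 0.30, 0.0),   # upper-arm link length 0.30 m (assumed)
        (elbow,    0.0, 0.25, 0.0),   # forearm link length 0.25 m (assumed)
    ]
    print(forward_kinematics(dh_table)[:3, 3])   # wrist position in the base frame
```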

Keywords: passive mechanism, task-based synthesis, emulating human-motion, exoskeleton

Procedia PDF Downloads 118