Search results for: Bruno Valverde Chahaira
Geological Structure as the Main Factor in Landslide Development in Purworejo District, Central Java Province, Indonesia
Authors: Hilman Agil Satria, Rezky Naufan Hendrawan
Abstract:
Indonesia is vulnerable to geological hazards because of its location in a subduction zone and its tropical climate. Landslides are among the most frequent geological hazards in Indonesia: according to Indonesian geospatial data, at least 194 landslides were recorded in 2013. The research location ranks as the third most landslide-affected city in Indonesia. Landslides have damaged many houses and wrecked roads. The purpose of this research is to produce a landslide zonation map that can serve as input for mitigation planning. The location is in Bruno, Purworejo District, Central Java Province, Indonesia, at 109.903–109.99 longitude and -7.59 to -7.50 latitude, covering a 10 km × 10 km area. Based on the geological mapping results, the research location consists of Late Miocene sandstone and claystone, and Pleistocene volcanic breccia and tuff. The landslides occurred in lithologies close to fault zones. The location contains many geological structures: joints, faults and folds. There are three thrust faults, one normal fault, four strike-slip faults and six folds. Movement along these geological structures is interpreted as the main factor triggering landslides in this location. This research uses field data, including samples of rock, joints, slickensides and landslide locations, combined with SRTM DEM data for geomorphological analysis. The combined data are presented as a geological map, a geological structure map and a landslide zonation map. From this research we infer a correlation between geological structures and landslide locations.
Keywords: geological structure, landslide, Purworejo, Indonesia
Procedia PDF Downloads 286

A Comprehensive CFD Model for Sugar-Cane Bagasse Heterogeneous Combustion in a Grate Boiler System
Authors: Daniel José de Oliveira Ferreira, Juan Harold Sosa-Arnao, Bruno Cássio Moreira, Leonardo Paes Rangel, Song Won Park
Abstract:
Comprehensive CFD models have been used to represent and study the heterogeneous combustion of biomass. In the present work, the operation of the global flue gas circuit in sugar-cane bagasse combustion is simulated, from the wind boxes below the primary air grate supply, through bagasse insertion in swirl burners and the boiler furnace, to the boiler bank outlet. The model uses five different meshes, one for each part of the system in sequence: wind boxes and grate, boiler furnace, swirl burners, superheaters and boiler bank. It treats turbulence with the standard k-ε model, combustion with EDM, radiation heat transfer with DTM using 16 ray directions, and bagasse particle tracking with the Schiller-Naumann model. The results showed good agreement with the behavior expected from the literature and the equipment design. The more detailed view of separate parts of the flue gas system makes it possible to observe flow behaviors that cannot be captured by the usual simplifications, such as bagasse supply under homogeneous axial and rotational vectors, as well as behaviors that can be captured with new modeling choices, such as representing the 26 thousand grate orifices by 144 rectangular inlets.
Keywords: comprehensive CFD model, sugar-cane bagasse combustion, sugar-cane bagasse grate boiler, axial
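The Schiller-Naumann model mentioned for bagasse particle tracking is a standard sphere-drag correlation; the sketch below is a generic illustration of it (the Reynolds numbers used are arbitrary examples, not values from the paper):

```python
def schiller_naumann_cd(re):
    """Schiller-Naumann drag coefficient for a sphere (valid up to Re ~ 1000)."""
    if re <= 0:
        raise ValueError("particle Reynolds number must be positive")
    if re < 1000.0:
        # Stokes drag 24/Re with the Schiller-Naumann inertial correction
        return 24.0 / re * (1.0 + 0.15 * re**0.687)
    # constant-Cd (Newton regime) fallback used by many CFD codes
    return 0.44

# drag coefficient falls steeply as the particle Reynolds number grows
print(round(schiller_naumann_cd(1.0), 2))    # 27.6
print(round(schiller_naumann_cd(100.0), 2))  # ~1.09
```

CFD solvers such as Fluent apply this coefficient per particle and per time step when integrating the discrete-phase momentum equation.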
Procedia PDF Downloads 472

Pilot Program for the Promotion of Normal Childbirth in the North, Northeast and Midwest of Brazil
Authors: Natália Bruno Chaves, Richardes Caúla, Roosevelt do Vale, Daniela Toneti, Rafaela Carvalho, Renata Silva Lopes, Antônio Carlos Júnior, Adner Nobre, Viviane Santiago, Yara Alana Caldato, Estefania Rodriguez Urrego, André Buarque Lemos, Catarina Nucci Stetner, Marcos Mauro Barreto, Stefany Moreira Lima, Mara Cavalcante, Ticiane Ribeiro
Abstract:
The Well Born (Nascer Bem, in Portuguese) program was created in the Hapvida health network with the aim of improving access to safe, high-quality prenatal care for its users. In addition to offering a prenatal care pathway, the inclusion of obstetric nursing and the decentralization of childbirth provide assurance that professionals do not choose the route of delivery for their own convenience. The introduction of the nursing consultation reinforced care for users, strengthening their bond and reception. In 2021, the program maintained an average of 40% normal births in the north, northeast and central-west regions of Brazil, above the roughly 20% average observed in the country's other private health systems. In addition, the neonatal hospitalization rate of this population remained around 5.1%, below the national average. These data support the Nascer Bem program as a safe and effective strategy for the promotion of safe normal birth.
Keywords: quality, safe, prenatal, obstetric nursing
Procedia PDF Downloads 119

Tunable Control of Therapeutics Release from the Nanochannel Delivery System (nDS)
Authors: Thomas Geninatti, Bruno Giacomo, Alessandro Grattoni
Abstract:
Nanofluidic devices have been investigated for over a decade as promising platforms for the controlled release of therapeutics. The nanochannel drug delivery system (nDS), a membrane fabricated with high-precision silicon techniques, achieves zero-order drug release by exploiting nanoscale diffusive transport arising from interactions between molecules and the nanochannel surfaces, and has demonstrated flexible, sustained release in vitro and in vivo over periods ranging from weeks to months. To provide the key features required for suitable therapeutic release, the next generation of the nDS has been created. Platinum electrodes are integrated by e-beam deposition onto both surfaces of the membrane, allowing low-voltage (<2 V) active temporal control of drug release through modulation of electrostatic potentials at the inlet and outlet of the membrane's fluidic channels. Hence, tunable drug administration is ensured by the nanochannel drug delivery system. The membrane will be incorporated into an implantable PEEK capsule, which will include a drug reservoir, control hardware and an RF system, to allow suitable therapeutic regimens in real time. This new nanotechnology therefore offers tremendous potential for managing chronic diseases such as cancer, heart disease, circadian dysfunction, pain and stress.
Keywords: nanochannel membrane, drug delivery, tunable release, personalized administration, nanoscale transport, biomems
Procedia PDF Downloads 314

A Goal-Oriented Approach for Supporting Input/Output Factor Determination in the Regulation of Brazilian Electricity Transmission
Authors: Bruno de Almeida Vilela, Heinz Ahn, Ana Lúcia Miranda Lopes, Marcelo Azevedo Costa
Abstract:
Benchmarking public utilities such as transmission system operators (TSOs) is one of the main strategies employed by regulators to fix monopolistic companies' revenues. Since 2007, the Brazilian regulator has used Data Envelopment Analysis (DEA) to benchmark TSOs. Despite the application of DEA to improve the transmission sector's efficiency, some problems can be pointed out, such as the high price of electricity in Brazil, the limitation of the benchmarking to operational expenses (OPEX) only, the absence of variables that represent the outcomes of the transmission service, and the presence of extremely low and high efficiencies. As an alternative to the benchmarking concept currently used by the Brazilian regulator, we propose a goal-oriented approach. Our proposal supports input/output selection by taking traditional organizational goals and measures as the basis for selecting factors for benchmarking purposes. Its main advantage is that it resolves the classical DEA problems of input/output selection and of undesirable and dual-role factors. We also demonstrate our goal-oriented concept with respect to service quality. As a result, most TSOs' efficiency scores in Brazil might improve when quality is treated as important in their efficiency estimation.
Keywords: decision making, goal-oriented benchmarking, input/output factor determination, TSO regulation
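For readers unfamiliar with DEA, the input-oriented CCR envelopment model the regulator's approach builds on can be sketched as a small linear program. The toy data below (two DMUs, one OPEX-like input, one service output) are purely illustrative and not taken from the Brazilian transmission sector:

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y, o):
    """Input-oriented CCR efficiency of DMU o.

    X: (n_dmus, n_inputs) input matrix; Y: (n_dmus, n_outputs) output matrix.
    Solves: min theta s.t. sum_j lam_j x_ij <= theta * x_io,
                           sum_j lam_j y_rj >= y_ro, lam_j >= 0.
    """
    n, m = X.shape
    s = Y.shape[1]
    c = np.zeros(1 + n)
    c[0] = 1.0  # minimize theta; variables are [theta, lam_1..lam_n]
    A_ub, b_ub = [], []
    for i in range(m):   # input constraints: sum lam_j x_ij - theta x_io <= 0
        A_ub.append(np.concatenate(([-X[o, i]], X[:, i])))
        b_ub.append(0.0)
    for r in range(s):   # output constraints: -sum lam_j y_rj <= -y_ro
        A_ub.append(np.concatenate(([0.0], -Y[:, r])))
        b_ub.append(-Y[o, r])
    bounds = [(0, None)] * (1 + n)
    res = linprog(c, A_ub=np.array(A_ub), b_ub=b_ub, bounds=bounds, method="highs")
    return res.fun

X = np.array([[2.0], [4.0]])  # toy inputs per DMU
Y = np.array([[2.0], [2.0]])  # toy outputs per DMU
print([round(dea_ccr_input(X, Y, o), 3) for o in (0, 1)])  # [1.0, 0.5]
```

DMU 0 is efficient (score 1.0); DMU 1 produces the same output with twice the input, so its input-oriented efficiency is 0.5.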
Procedia PDF Downloads 196

Cosmetic Surgery on the Rise: The Impact of Remote Communication
Authors: Bruno Di Pace, Roxanne H. Padley
Abstract:
Aims: The recent increase in remote video interaction has increased the number of requests for teleconsultations with plastic surgeons in private practice (70% in the UK and 64% in the USA). This study investigated the motivations for this increase and the underlying psychological impact on patients. Method: An anonymous web-based poll of 8 questions was designed and distributed through social networks to patients seeking cosmetic surgery in Italy and the UK. The questions gathered responses regarding 1. reasons for pursuing cosmetic surgery; 2. the effects of delays caused by the SARS-CoV-2 pandemic; 3. the effects on mood; 4. the influence of video conferencing on body-image perception. Results: 85 respondents completed the online poll. Overall, 68% of respondents stated that seeing themselves more frequently online had influenced their decision to seek cosmetic surgery. The surgeries indicated were predominantly to the upper body and face (82%). Delays and reduced access to surgeons during the pandemic were perceived as negatively impacting patients' moods (95%). Body-image perception and self-esteem were lower than in the pre-pandemic period, particularly during lockdown (72%). Patients were more inclined to undergo cosmetic surgery during the pandemic, both because of the wish to improve their "lockdown face" for video conferencing (77%) and because of the benefits of recovering at home while working remotely (58%). Conclusions: Overall, the findings suggest that video conferencing has led to a significant increase in requests for cosmetic surgery, the so-called "Zoom Boom" effect.
Keywords: cosmetic surgery, remote communication, telehealth, zoom boom
Procedia PDF Downloads 179

Analysis of the Presence of Alkylglycerols by Gas Chromatography in Ostrich Oil
Authors: Luana N. Cardozo, Debora A. S. Coutinho, Fabiola Lagher, Bruno J. G. Silva, Ivonilce Venture, Mainara Tesser, Graciela Venera
Abstract:
Ostrich oil is used as a food in Brazil and has been the subject of scientific research because it contains essential fatty acids (omega 3, 6, 7 and 9), which provide benefits to human health. Alkylglycerols are lipid ethers consisting of a saturated or unsaturated hydrocarbon chain joined by an ether-type bond to one of the glycerol hydroxyls. It is known that supplementation with alkylglycerols can act significantly on the functioning of immune system cells, both in pathological situations and in homeostasis. Objective: To analyze the presence of alkylglycerols in ostrich oil. Methods: The ostrich oil was bought from an industry that manufactures the product for sale as food, located in Mirante da Serra, northern Brazil. The samples were sent for analysis to the chemistry department of the Federal University of Paraná, where they were analyzed by gas chromatography. Results: The ostrich oil analysis showed an alkylglycerol peak area of 514,505,154. By comparison, shark liver oil shows a peak area of 26,190,196, and the difference between the two is highly significant. Conclusion: The importance of alkylglycerol supplementation for the immune system is known. The analysis made it possible to verify the presence of alkylglycerols in ostrich oil at a level five times higher than in shark liver oil, until now considered the richest food source. The present study emphasizes that ostrich oil can be considered a food source of alkylglycerols and may play a promising role in the immune system, but further studies are needed to prove its effect in the body.
Keywords: ostrich oil, nutritional composition, alkylglycerols, food
Procedia PDF Downloads 141

Local Differential Privacy-Based Data-Sharing Scheme for Smart Utilities
Authors: Veniamin Boiarkin, Bruno Bogaz Zarpelão, Muttukrishnan Rajarajan
Abstract:
The manufacturing sector is a vital component of most economies, which makes it a frequent target of cyberattacks, and disruption of operations may lead to significant economic consequences. Adversaries aim to disrupt the production processes of manufacturing companies, gain financial advantages, and steal intellectual property by getting unauthorised access to sensitive data. Access to sensitive data helps organisations enhance their production and management processes. However, the majority of existing data-sharing mechanisms are either susceptible to different cyberattacks or heavy in terms of computation overhead. In this paper, a privacy-preserving data-sharing scheme for smart utilities is proposed. First, a customer privacy adjustment mechanism is proposed to ensure that end-users have control over their privacy, as required by the latest government regulations, such as the General Data Protection Regulation. Secondly, a local differential privacy-based mechanism is proposed to ensure the privacy of the end-users by hiding real data according to end-user preferences. The proposed scheme may be applied to different industrial control systems; in this study, it is validated for energy utility use cases consisting of smart, intelligent devices. The results show that the proposed scheme may guarantee the required level of privacy with an expected relative error in utility.
Keywords: data-sharing, local differential privacy, manufacturing, privacy-preserving mechanism, smart utility
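As an illustration of the class of mechanisms local differential privacy provides, binary randomized response is the textbook example: each device perturbs its own bit before sharing, and the aggregator removes the bias statistically. This sketch is generic, not the authors' scheme; the parameter values are ours:

```python
import math
import random

def rr_perturb(bit, eps, rng):
    """Epsilon-LDP randomized response: keep the true bit with prob e^eps/(e^eps+1)."""
    p = math.exp(eps) / (math.exp(eps) + 1.0)
    return bit if rng.random() < p else 1 - bit

def rr_estimate(reports, eps):
    """Unbiased estimate of the true proportion of 1s from perturbed reports."""
    p = math.exp(eps) / (math.exp(eps) + 1.0)
    mean = sum(reports) / len(reports)
    # E[report] = (1 - p) + true * (2p - 1), so invert that affine map
    return (mean - (1.0 - p)) / (2.0 * p - 1.0)

rng = random.Random(42)
true_bits = [1 if rng.random() < 0.3 else 0 for _ in range(20000)]
reports = [rr_perturb(b, eps=1.0, rng=rng) for b in true_bits]
print(round(rr_estimate(reports, 1.0), 2))  # close to the true rate of 0.30
```

The aggregator never sees any individual's true bit, yet the population estimate converges; the "expected relative error in utility" mentioned in the abstract is exactly this estimation error, which shrinks as the number of reports grows.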
Procedia PDF Downloads 76

Haematological Responses on Amateur Cycling Stages Race
Authors: Renato André S. Silva, Nana L. F. Sampaio, Carlos J. G. Cruz, Bruno Vianna, Flávio O. Pires
Abstract:
Multiple-stage bicycle races place high physiological loads on professional cyclists. Such demands can lead to immunosuppression and health problems. However, in this type of competition, little is known about the physiological effects on amateur athletes, who generally receive less medical support. Thus, this study analyzes the hematological effects of a multiple-stage bicycle race on amateur cyclists. Seven Brazilian national amateur cyclists (34 ± 4.21 years) underwent a laboratory test to evaluate VO2max (69.89 ± 7.43 ml⋅kg-1⋅min-1). Six days later, these volunteers raced in the Tour of Goiás, riding five races in four days (435 km) of competition. Arterial blood samples were collected one day before and one day after the competition. The Kolmogorov-Smirnov test was used to evaluate the data distribution and the Wilcoxon test to compare the two data collection moments (p < 0.05). The results show: red cells ↓ 7.8% (5.1 ± 0.28 vs 4.7 ± 0.37 ×10⁶/mm³, p = 0.01); hemoglobin ↓ 7.9% (15.1 ± 0.31 vs 13.9 ± 0.27 g/dL, p = 0.01); leukocytes ↑ 9.5% (4946 ± 553 vs 5416 ± 1075 /mm³, p = 0.17); platelets ↓ 7.0% (200.2 ± 51.5 vs 186.1 ± 39.5 ×10³/mm³, p = 0.01); LDH ↑ 11% (164.4 ± 28.5 vs 182.5 ± 20.5 U/L, p = 0.17); CK ↑ 13.5% (290.7 ± 206.1 vs 330.1 ± 90.5 U/L, p = 0.39); CK-MB ↑ 2% (15.7 ± 3.9 vs 20.1 ± 2.9 U/L, p = 0.06); cortisol ↓ 13.5% (12.1 ± 2.4 vs 9.9 ± 1.9 μg/dL, p = 0.01); total testosterone ↓ 7% (453.6 ± 120.1 vs 421.7 ± 74.3 ng/dL, p = 0.12); IGF-1 ↓ 15.1% (213.8 ± 18.8 vs 181.5 ± 34.7 ng/mL, p = 0.04). This means there were significant reductions in O₂ transport capacity, in the platelet-mediated response to vascular injury, and in skeletal muscle anabolism, along with maintenance and/or slight elevation of immune function, of glucose and lipid metabolism, and of markers of myocardial damage.
Therefore, the results suggest that no abnormal health effect was identified among the athletes after participating in the Tour of Goiás.
Keywords: cycling, health effects, cycling stages races, haematology
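The paired pre/post comparisons described above use the Wilcoxon signed-rank test; a minimal sketch with hypothetical hemoglobin values for seven riders (illustrative numbers, not the study's raw data):

```python
from scipy.stats import wilcoxon

# hypothetical hemoglobin (g/dL) for 7 cyclists before and after the race
pre  = [15.1, 15.3, 14.8, 15.0, 15.4, 14.9, 15.2]
post = [14.1, 14.2, 13.6, 13.7, 14.0, 13.4, 13.6]

# two-sided exact test on the paired differences
stat, p = wilcoxon(pre, post)
print(stat, round(p, 4))  # statistic 0.0: every rider's value dropped
```

With all seven differences positive, the statistic (the smaller signed-rank sum) is 0 and the exact two-sided p-value is 2/2⁷ ≈ 0.016, below the study's 0.05 threshold.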
Procedia PDF Downloads 200

Experimental Set-up for the Thermo-Hydric Study of a Wood Chips Bed Crossed by an Air Flow
Authors: Dimitri Bigot, Bruno Malet-Damour, Jérôme Vigneron
Abstract:
Many studies have addressed the use of bio-based materials in buildings. The goal is to reduce a building's environmental footprint, assessed through life cycle analysis, which can help minimize carbon emissions or energy consumption. A previous work proposed to study numerically the feasibility of using wood chips to regulate relative humidity inside a building. It showed the capability of a wood chips bed to regulate indoor humidity and improve thermal comfort, and thus potentially reduce building energy consumption. However, it also showed that some physical parameters of the wood chips must be identified to validate the proposed model and the associated results. This paper presents an experimental set-up able to study such a wood chips bed under different solicitations. It consists of a simple duct filled with wood chips and crossed by an air flow of variable temperature and relative humidity. Its main objective is to study the thermal behaviour of the wood chips bed by controlling the temperature and relative humidity of the inlet air and by observing the same parameters at the outlet. First, the experimental set-up is described in light of previous results, with a focus on the particular properties to be characterized. Then some case studies are presented in relation to the previous results in order to identify the key physical properties. Finally, the feasibility of the proposed technology is discussed, and some model validation paths are given.
Keywords: wood chips bed, experimental set-up, bio-based material, desiccant, relative humidity, water content, thermal behaviour, air treatment
Procedia PDF Downloads 122

Normalized Enterprises Architectures: Portugal's Public Procurement System Application
Authors: Tiago Sampaio, André Vasconcelos, Bruno Fragoso
Abstract:
The Normalized Systems Theory, originally designed for software architectures, provides a set of theorems, elements and rules whose purpose is to enable evolution in information systems and to ensure that they are ready for change. This work applies the Normalized Systems Theory to the domain of enterprise architectures, using ArchiMate. The application is achieved by adapting the elements of the theory into artifacts of the modeling language. The theorems are applied through the identification of the viewpoints to be used in the architectures, as well as the transformation of the theory's encapsulation rules into architectural rules. This way, it is possible to create normalized enterprise architectures that fulfill the needs and requirements of the business. The solution was demonstrated on the Portuguese Public Procurement System. The Portuguese government aims to make this system as fair as possible, giving every organization the same business opportunities: every economic operator should have access to all public tenders, which are published on any of the 6 existing platforms, independently of where it is registered. To this end, we applied our solution to the construction of two different architectures, both able to fulfill the requirements of the Portuguese government. One of them, TO-BE A, has a message broker that performs the communication between the platforms. The other, TO-BE B, represents the scenario in which the platforms communicate with each other directly. Apart from these two architectures, we also represent the AS-IS architecture, which captures the current behavior of the Public Procurement System.
Our evaluation is based on a comparison between the AS-IS and the TO-BE architectures regarding the fulfillment of the rules and theorems of the Normalized Systems Theory and some quality metrics.
Keywords: archimate, architecture, broker, enterprise, evolvable systems, interoperability, normalized architectures, normalized systems, normalized systems theory, platforms
Procedia PDF Downloads 356

Tailoring of ECSS Standard for Space Qualification Test of CubeSat Nano-Satellite
Authors: B. Tiseo, V. Quaranta, G. Bruno, G. Sisinni
Abstract:
There is an increasing demand for nano-satellite development among universities, small companies, and emerging countries. Low cost and fast delivery are the main advantages of this class of satellites, achieved through the extensive use of commercial off-the-shelf components. On the other side, poor reliability and low success rates are limiting nano-satellites to educational and technology demonstration missions rather than commercial purposes. Standardizing nano-satellite environmental testing by tailoring the existing test standards for medium/large satellites is therefore a crucial step for their market growth, and it is fundamental to find the right trade-off between improving reliability and keeping the low-cost/fast-delivery advantages. This is even more essential for satellites of the CubeSat family. These miniaturized, standardized satellites have a 10 cm cubic form and a mass of no more than 1.33 kilograms per unit (1U). For this class of nano-satellites, the qualification process is mandatory to reduce the risk of failure during a space mission. This paper reports the description and results of the space qualification test campaign performed on EnduroSat's CubeSat nano-satellite and modules. Mechanical and environmental tests have been carried out step by step: from the testing of single subsystems up to the assembled CubeSat nano-satellite. Functional tests have been performed throughout the test campaign to verify the functionality of the systems. The test durations and levels have been selected by tailoring the European Space Agency standard ECSS-E-ST-10-03C and GEVS: GSFC-STD-7000A.
Keywords: CubeSat, nano-satellite, shock, testing, vibration
Procedia PDF Downloads 186

Co-payment Strategies for Chronic Medications: A Qualitative and Comparative Analysis at European Level
Authors: Pedro M. Abreu, Bruno R. Mendes
Abstract:
The management of pharmacotherapy and the process of dispensing medicines are becoming critical in clinical pharmacy due to the increasing incidence and prevalence of chronic diseases, the complexity and customization of therapeutic regimens, the introduction of innovative and more expensive medicines, the unbalanced relation between expenditure and revenue, and the lack of rationalization associated with medication use. For these reasons, co-payments emerged in Europe in the 1970s and have been applied in healthcare over the past few years. Co-payments lead to rationing and rationalization of users' access to healthcare services and products and, simultaneously, to a qualification and improvement of those services and products for the end-user. This analysis of hospital practices in particular, and co-payment strategies in general, covered all European regions and identified four reference countries that apply this tool repeatedly and with different approaches. The structure, content and adaptation of European co-payments were analyzed through 7 qualitative attributes and 19 performance indicators, with the results expressed in a scorecard. This allows us to conclude that the German models (total scores of 68.2% and 63.6% for the two elected co-payments) can achieve more compliance and effectiveness, the English models (total score of 50%) can be more accessible, and the French models (total score of 50%) can be more adequate to the socio-economic and legal framework. Other European models did not show the same quality and/or performance, so they were not taken as standards for the future design of co-payment strategies.
In this sense, co-payments can be seen as a strategy not only to moderate the consumption of healthcare products and services, but especially to improve them, as well as a strategy to increase the value that the end-user assigns to these services and products, such as medicines.
Keywords: clinical pharmacy, co-payments, healthcare, medicines
Procedia PDF Downloads 251

Influences of Separation of the Boundary Layer on the Reservoir Pressure in the Shock Tube
Authors: Bruno Coelho Lima, Joao F.A. Martos, Paulo G. P. Toro, Israel S. Rego
Abstract:
The shock tube is a ground facility widely used in aerospace and aeronautics science and technology for studies on gas dynamics and chemical-physical processes in gases at high temperature, on explosions, and for the dynamic calibration of pressure sensors. A shock tube in its simplest form comprises two tubes of equal cross-section separated by a diaphragm, whose function is to keep the two reservoirs at different pressures. The high-pressure reservoir is called the Driver; the low-pressure reservoir is called the Driven section. When the diaphragm is broken by the pressure difference, a normal, non-stationary shock wave (the incident shock wave) forms at the diaphragm position and propagates toward the closed end of the Driven section. When this shock wave reaches the closed end of the Driven section, it is completely reflected. The reflected shock wave then interacts with the boundary layer created by the flow induced by the incident shock wave's passage, and this interaction forces the separation of the boundary layer. The aim of this paper is to analyze the influence of boundary layer separation on the reservoir pressure in the shock tube. A comparison among CFD (Computational Fluid Dynamics), experimental tests and analytical analysis was performed. For the analytical analysis, routines were created in Python; the numerical simulations used Ansys Fluent; and the experimental tests used the T1 shock tube located at IEAv (Institute of Advanced Studies).
Keywords: boundary layer separation, moving shock wave, shock tube, transient simulation
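Analytical shock tube routines of the kind mentioned typically implement the ideal (inviscid) shock tube relations; the sketch below is our own rendering of those standard equations, not the authors' code:

```python
def shock_tube_pressures(Ms, g1=1.4, g4=1.4, a_ratio=1.0):
    """Ideal shock tube relations for a given incident shock Mach number Ms.

    g1, g4: specific heat ratios of the Driven and Driver gases.
    a_ratio: a1/a4, ratio of speeds of sound in Driven and Driver gases.
    Returns (p2/p1, p4/p1): pressure ratio across the incident shock, and
    the diaphragm (Driver/Driven) pressure ratio needed to produce it.
    """
    # normal-shock pressure jump in the Driven gas
    p2_p1 = 1.0 + 2.0 * g1 / (g1 + 1.0) * (Ms**2 - 1.0)
    # classical shock tube equation relating p4/p1 to the shock strength
    term = 1.0 - (g4 - 1.0) * a_ratio / (g1 + 1.0) * (Ms - 1.0 / Ms)
    p4_p1 = p2_p1 * term ** (-2.0 * g4 / (g4 - 1.0))
    return p2_p1, p4_p1

# same diatomic gas at the same temperature in both sections, Ms = 2
p2, p4 = shock_tube_pressures(2.0)
print(round(p2, 2), round(p4, 1))  # 4.5 33.7
```

The steep growth of p4/p1 with Ms is why strong shocks usually require a light, hot Driver gas (small a1/a4) rather than simply more Driver pressure.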
Procedia PDF Downloads 315

Enhancing Solar Fuel Production by CO₂ Photoreduction Using Transition Metal Oxide Catalysts in Reactors Prepared by Additive Manufacturing
Authors: Renata De Toledo Cintra, Bruno Ramos, Douglas Gouvêa
Abstract:
There is great global concern about greenhouse gas emissions, the resulting environmental problems, and the increase in the planet's average temperature, caused mainly by fossil fuels, of which petroleum derivatives represent a large share. One of the main greenhouse gases, in terms of volume, is CO₂. Recovering part of this product through chemical reactions that use sunlight as an energy source, and even producing renewable fuels (such as ethane, methane and ethanol, among others), is a great opportunity. Artificial photosynthesis, the conversion of CO₂ and H₂O into organic products and oxygen using a metal oxide catalyst under sunlight, is one of the promising solutions, which makes this research highly relevant. For this reaction to take place efficiently, an optimized reactor was developed through simulation and prior analysis, so that the geometry of the internal channel provides an efficient route and allows the reaction to happen in a controlled and optimized way, in continuous flow and with the least possible resistance. The prototype of this reactor can be made of different materials, such as polymers, ceramics and metals, and manufactured through different processes, such as additive manufacturing (3D printing) and CNC machining, among others. To carry out the photocatalysis in the reactors, different types of catalysts will be used, such as ZnO deposited by spray pyrolysis on the illumination window, modified ZnO, TiO₂ and modified TiO₂, among others, aiming to increase the production of organic molecules with the lowest possible energy input.
Keywords: artificial photosynthesis, CO₂ reduction, photocatalysis, photoreactor design, 3D printed reactors, solar fuels
Procedia PDF Downloads 86

In-Silico Fusion of Bacillus Licheniformis Chitin Deacetylase with Chitin Binding Domains from Chitinases
Authors: Keyur Raval, Steffen Krohn, Bruno Moerschbacher
Abstract:
Chitin, the biopolymer of N-acetylglucosamine, is the most abundant biopolymer on the planet after cellulose. Industrially, chitin is isolated and purified from the shell residues of shrimps. A deacetylated derivative of chitin, chitosan, has higher market value and more applications owing to its solubility and overall cationic charge compared to the parent polymer. On an industrial scale, this deacetylation is performed chemically using alkalis like sodium hydroxide, a process that is hazardous to the environment owing to its negative impact on the marine ecosystem. A greener option is the enzymatic process: in nature, native chitin is converted to chitosan by chitin deacetylase (CDA). On the industrial scale, however, this enzymatic conversion is hampered by the crystallinity of chitin: the enzymatic action requires the substrate, chitin, to be soluble, which is technically difficult and energy-consuming. In this project we wanted to address this shortcoming of CDA. To this end, we have modeled a fusion protein combining CDA with an auxiliary protein, the main interest being to increase the accessibility of the enzyme to crystalline chitin. Similar fusion work with chitinases had improved their catalytic ability towards insoluble chitin. In the first step, suitable fusion partners were sought in the Protein Data Bank (PDB) based on their domain architectures. The next step was to create models of the fused product using various in silico techniques. The models were created with MODELLER and evaluated for properties such as energy and possible impairment of the binding sites. A fusion PCR has been designed based on the linker sequences generated by MODELLER and will be tested for activity towards insoluble chitin.
Keywords: chitin deacetylase, modeling, chitin binding domain, chitinases
Procedia PDF Downloads 242

New Photosensitizers Encapsulated within Arene-Ruthenium Complexes Active in Photodynamic Therapy: Intracellular Signaling and Evaluation in Colorectal Cancer Models
Authors: Suzan Ghaddar, Aline Pinon, Manuel Gallardo-Villagran, Mona Diab-Assaf, Bruno Therrien, Bertrand Liagre
Abstract:
Colorectal cancer (CRC) is the third most common cancer and exhibits a consistently rising incidence worldwide. Despite notable advancements in CRC treatment, frequent side effects and the development of therapy resistance persistently challenge current approaches. Innovations in focal therapies therefore remain imperative to enhance patients' overall quality of life. Photodynamic therapy (PDT) emerges as a promising treatment modality, clinically used for various cancer types. It relies on photosensitive molecules called photosensitizers (PS), which are photoactivated after accumulating in cancer cells, inducing the production of reactive oxygen species (ROS) that cause cancer cell death. Among the metal-based drugs commonly used in cancer therapy, ruthenium (Ru) possesses favorable attributes that confer selectivity towards cancer cells and make it suitable for anti-cancer drug design. In vitro studies using distinct arene-Ru complexes encapsulating a porphin PS are conducted on the human HCT116 and HT-29 colorectal cancer cell lines. These studies encompass the evaluation of the antiproliferative effect, ROS production, apoptosis, cell cycle progression, molecular localization, and protein expression. Preliminary results indicated that these complexes exert significant photocytotoxicity on the studied colorectal cancer cell lines, making them promising candidates as anti-cancer agents.
Keywords: colorectal cancer, photodynamic therapy, photosensitizers, arene-ruthenium complexes, apoptosis
Procedia PDF Downloads 9952 Experimental Simulation Set-Up for Validating Out-Of-The-Loop Mitigation when Monitoring High Levels of Automation in Air Traffic Control
Authors: Oliver Ohneiser, Francesca De Crescenzio, Gianluca Di Flumeri, Jan Kraemer, Bruno Berberian, Sara Bagassi, Nicolina Sciaraffa, Pietro Aricò, Gianluca Borghini, Fabio Babiloni
Abstract:
An increasing degree of automation in air traffic will also change the role of the air traffic controller (ATCO). ATCOs will fulfill significantly more monitoring tasks compared to today. However, this rather passive role may lead to Out-Of-The-Loop (OOTL) effects comprising vigilance decrement and less situation awareness. The project MINIMA (Mitigating Negative Impacts of Monitoring high levels of Automation) has conceived a system to control and mitigate such OOTL phenomena. In order to demonstrate the MINIMA concept, an experimental simulation set-up has been designed. This set-up consists of two parts: 1) a Task Environment (TE) comprising a Terminal Maneuvering Area (TMA) simulator as well as 2) a Vigilance and Attention Controller (VAC) based on neurophysiological data recording such as electroencephalography (EEG) and eye-tracking devices. The current vigilance level and the attention focus of the controller are measured during the ATCO’s active work in front of the human machine interface (HMI). The derived vigilance level and attention trigger adaptive automation functionalities in the TE to avoid OOTL effects. This paper describes the full-scale experimental set-up and the component development work towards it. Hence, it encompasses a pre-test whose results influenced the development of the VAC as well as the functionalities of the final TE and the two VAC’s sub-components.Keywords: automation, human factors, air traffic controller, MINIMA, OOTL (Out-Of-The-Loop), EEG (Electroencephalography), HMI (Human Machine Interface)
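The closed loop described above, in which neurophysiological indices drive adaptive automation, can be illustrated as a simple threshold rule. The sketch below is editorial and illustrative only: the index scale, threshold value, and action names are assumptions, not the MINIMA implementation.

```python
def adapt_automation(vigilance: float, attention_on_task: bool,
                     vigilance_threshold: float = 0.4) -> str:
    """Illustrative trigger for adaptive automation in the Task Environment.

    vigilance: normalized 0..1 index derived from EEG (assumed scale).
    attention_on_task: whether eye tracking shows gaze on the relevant display.
    Returns the automation action sent to the Task Environment.
    """
    if vigilance < vigilance_threshold or not attention_on_task:
        # Re-engage the operator: hand a task back or raise an alert
        return "reduce_automation"
    # Operator is in the loop: keep the high level of automation
    return "keep_automation"

print(adapt_automation(0.3, True))   # low vigilance: re-engage the ATCO
print(adapt_automation(0.8, True))   # vigilant and attentive: keep automation
```

In practice, the Vigilance and Attention Controller would update such a decision continuously from streaming EEG and eye-tracking data rather than from single values.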
Procedia PDF Downloads 38351 Uptake of Copper by Dead Biomass of Burkholderia cenocepacia Isolated from a Metal Mine in Pará, Brazil
Authors: Ingrid R. Avanzi, Marcela dos P. G. Baltazar, Louise H. Gracioso, Luciana J. Gimenes, Bruno Karolski, Elen A. Perpetuo, Claudio Auguto Oller do Nascimento
Abstract:
In this study, a natural process using a biological system was developed for the uptake of copper, and its possible removal from wastewater, by dead biomass of the strain Burkholderia cenocepacia. Dead and live biomass of Burkholderia cenocepacia was used to analyze the equilibrium and kinetics of copper biosorption by this strain as a function of pH. Living biomass exhibited the highest biosorption capacity for copper, 50 mg g−1, which was achieved within 5 hours of contact, at pH 7.0, a temperature of 30°C, and an agitation speed of 150 rpm. The dead biomass of Burkholderia cenocepacia may be considered an efficient bioprocess, being a fast and low-cost means of copper recovery and also a probable nano-adsorbent of this metal ion in wastewater bioremediation processes.Keywords: biosorption, dead biomass, biotechnology, copper recovery
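The reported kinetics (50 mg g−1 reached within 5 hours) are consistent with a saturating uptake curve. As an illustration only — the abstract does not state which kinetic model fits the data — a pseudo-first-order uptake sketch, with an assumed rate constant:

```python
import math

def pseudo_first_order(t_h: float, qe: float = 50.0, k: float = 0.9) -> float:
    """Pseudo-first-order uptake q(t) = qe * (1 - exp(-k * t)).

    qe: equilibrium capacity in mg/g (50 mg/g from the abstract);
    k: rate constant in 1/h (assumed value, chosen so that uptake
    is essentially complete within ~5 h of contact).
    """
    return qe * (1.0 - math.exp(-k * t_h))

for t in (1, 3, 5):
    print(f"q({t} h) = {pseudo_first_order(t):.1f} mg/g")
```

Fitting such a curve (or a pseudo-second-order alternative) to the measured time series would be the usual way to report the biosorption kinetics quantitatively.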
Procedia PDF Downloads 33750 Optimization of Quercus cerris Bark Liquefaction
Authors: Luísa P. Cruz-Lopes, Hugo Costa e Silva, Idalina Domingos, José Ferreira, Luís Teixeira de Lemos, Bruno Esteves
Abstract:
The liquefaction of cork-based tree barks has attracted increasing interest due to its potential for innovation in the lumber and wood industries. In this particular study, the bark of Quercus cerris (Turkish oak) is used due to its appreciable amount of cork tissue, although of inferior quality compared to the cork provided by other Quercus trees. This study aims to optimize alkaline catalysis liquefaction conditions with regard to several parameters. To better comprehend the chemical characteristics of the bark of Quercus cerris, a complete chemical analysis was performed. The liquefaction process was performed in a double-jacket reactor heated with oil, using glycerol and a mixture of glycerol/ethylene glycol as solvents and potassium hydroxide as a catalyst, and varying the temperature, liquefaction time and granulometry. Due to the low liquefaction efficiency of the first experimental procedures, different washing techniques after the filtration process were studied, using methanol and methanol/water. The chemical analysis showed that the bark of Quercus cerris is mostly composed of suberin (ca. 30%) and lignin (ca. 24%), as well as hemicelluloses insoluble in hot water (ca. 23%). At the liquefaction stage, the results that led to higher yields were obtained using a mixture of methanol/ethylene glycol as reagents and a time and temperature of 120 minutes and 200 ºC, respectively. It is concluded that using a granulometry of <80 mesh leads to better results, even though this parameter barely influences the liquefaction efficiency. Regarding the filtration stage, washing the residue with methanol and then distilled water leads to a considerable increase in final liquefaction percentages, which proves that this procedure is effective at liquefying the suberin content and the lignocellulosic fraction.Keywords: liquefaction, Quercus cerris, polyalcohol liquefaction, temperature
Procedia PDF Downloads 33249 Developing an Edutainment Game for Children with ADHD Based on SAwD and VCIA Model
Authors: Bruno Gontijo Batista
Abstract:
This paper analyzes how the Socially Aware Design (SAwD) and the Value-oriented and Culturally Informed Approach (VCIA) design model can be used to develop an edutainment game for children with Attention Deficit Hyperactivity Disorder (ADHD). The SAwD approach seeks a design that considers new dimensions in human-computer interaction, such as culture, aesthetics, and the emotional and social aspects of the user's everyday experience. From this perspective, the game development was VCIA model-based, including the users in the design process through participatory methodologies and considering their behavioral patterns, culture, and values. This is because values, beliefs, and behavioral patterns influence how technology is understood and used and the way it impacts people's lives. This model can be applied at different stages of design, from explaining the problem and organizing the requirements to the evaluation of the prototype and the final solution. Thus, this paper aims to understand how this model can be used in the development of an edutainment game for children with ADHD. In the area of education and learning, children with ADHD have difficulties both in behavior and in school performance, as they are easily distracted, which is reflected both in classes and on tests. Therefore, they should perform tasks that are exciting or interesting to them: once the pleasure center in the brain is activated, it reinforces the attention center, leaving the child more relaxed and focused. In this context, serious games have been used as part of the treatment of ADHD in children, aiming to improve focus and attention, stimulate concentration, and serve as a tool for improving learning in areas such as math and reading, combining education and entertainment (edutainment). 
As a result of the research, an edutainment game prototype for a mobile platform, aimed at children between 8 and 12 years old, was developed in a participatory way, applying the VCIA model.Keywords: ADHD, edutainment, SAwD, VCIA
Procedia PDF Downloads 18948 Predicting Stem Borer Density in Maize Using RapidEye Data and Generalized Linear Models
Authors: Elfatih M. Abdel-Rahman, Tobias Landmann, Richard Kyalo, George Ong’amo, Bruno Le Ru
Abstract:
Maize (Zea mays L.) is a major staple food crop in Africa, particularly in the eastern region of the continent. The maize growing area in Africa spans over 25 million ha, and 84% of rural households in Africa cultivate maize, mainly as a means to generate food and income. Average maize yields in Sub-Saharan Africa are 1.4 t/ha, as compared to a global average of 2.5–3.9 t/ha, due to biotic and abiotic constraints. Amongst the biotic production constraints in Africa, stem borers are the most injurious. In East Africa, yield losses due to stem borers are currently estimated at between 12% and 40% of the total production. The objective of the present study was therefore to predict stem borer larvae density in maize fields using RapidEye reflectance data and generalized linear models (GLMs). RapidEye images were captured for a test site in Kenya (Machakos) in January and in February 2015. Stem borer larva numbers were modeled using GLMs assuming Poisson (Po) and negative binomial (NB) error distributions with a log link. Root mean square error (RMSE) and ratio of prediction to deviation (RPD) statistics were employed to assess model performance using a leave-one-out cross-validation approach. Results showed that NB models outperformed Po ones in all study sites. RMSE and RPD ranged between 0.95 and 2.70, and between 2.39 and 6.81, respectively. Overall, all models performed similarly when using the January and the February image data. We conclude that reflectance data from RapidEye imagery can be used to estimate stem borer larvae density. The developed models could help improve decision making regarding the control of maize stem borers using various integrated pest management (IPM) protocols.Keywords: maize, stem borers, density, RapidEye, GLM
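The validation statistics used in studies like the one above can be reproduced in a few lines. The sketch below implements RMSE, RPD (taken here as the standard deviation of the observations divided by the RMSE), and a leave-one-out loop; the ordinary least-squares predictor and the synthetic data are placeholders for the actual Poisson/negative-binomial GLMs and RapidEye reflectance bands, which are not specified in the abstract.

```python
import numpy as np

def rmse(obs, pred):
    """Root mean square error between observed and predicted values."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    return float(np.sqrt(np.mean((obs - pred) ** 2)))

def rpd(obs, pred):
    """Ratio of prediction to deviation: SD of observations / RMSE."""
    return float(np.std(np.asarray(obs, float), ddof=1) / rmse(obs, pred))

def loo_predictions(X, y):
    """Leave-one-out cross-validation with an ordinary least-squares
    placeholder for the GLMs of the study."""
    X, y = np.asarray(X, float), np.asarray(y, float)
    n = len(y)
    preds = np.empty(n)
    for i in range(n):
        keep = np.arange(n) != i
        A = np.column_stack([np.ones(keep.sum()), X[keep]])
        coef, *_ = np.linalg.lstsq(A, y[keep], rcond=None)
        preds[i] = coef[0] + coef[1] * X[i]
    return preds

rng = np.random.default_rng(0)
reflectance = rng.uniform(0.1, 0.5, 30)             # synthetic band values
larvae = 10 * reflectance + rng.normal(0, 0.3, 30)  # synthetic densities
pred = loo_predictions(reflectance, larvae)
print(f"RMSE = {rmse(larvae, pred):.2f}, RPD = {rpd(larvae, pred):.2f}")
```

An RPD above roughly 2 is conventionally read as a model useful for quantitative prediction, which is how the reported range of 2.39–6.81 would be interpreted.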
Procedia PDF Downloads 49647 Greenhouse Controlled with Graphical Plotting in Matlab
Authors: Bruno R. A. Oliveira, Italo V. V. Braga, Jonas P. Reges, Luiz P. O. Santos, Sidney C. Duarte, Emilson R. R. Melo, Auzuir R. Alexandria
Abstract:
This project aims to build a controlled greenhouse, that is, a structure in which a previously defined range of temperature values (°C) can be maintained using the radiation emitted by an incandescent light, characterizing a kind of on-off control. Its differential is the plotting of temperature versus time graphs in MATLAB via serial communication, so that the stove can be connected to a computer and its parameters monitored. The control was implemented with a PIC 16F877A microcontroller, which converts analog signals to digital, performs serial communication through the MAX232 IC, and drives signal transistors. The language used for the PIC's management is Basic. There is also a cooling system comprising two 12 V coolers mounted in the lateral structure, one used for ventilation and the other for air exhaust. An LM35DZ sensor is used to measure the temperature inside. Another mechanism used in the greenhouse construction comprises a reed switch and a magnet, whose function is to recognize the door position: a signal is sent to a buzzer when the door is open. There are also LEDs that help identify the operating mode of the stove. To facilitate human-machine communication, an LCD display shows the real-time temperature and other information. Taking into account the limitations of the construction material and of the structure's electrical current conduction, the average operating range of the design, without any major problems, is approximately 65 to 70 °C. The project is efficient under these conditions, that is, when information is desired from a given material tested at temperatures that are not very high. The implemented greenhouse automation facilitates temperature control and provides a structure that offers the correct environment for the most diverse applications.Keywords: greenhouse, microcontroller, temperature, control, MATLAB
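The on-off control described above can be sketched in a few lines. This simulation is illustrative only: the thermal constants, hysteresis band, and set point are assumptions rather than measurements from the actual stove, but it shows the bang-bang logic the PIC implements.

```python
def on_off_step(temp_c, lamp_on, setpoint=67.5, hysteresis=2.5):
    """Bang-bang controller with hysteresis: switch the incandescent
    lamp off above setpoint + hysteresis, on below setpoint - hysteresis."""
    if temp_c >= setpoint + hysteresis:
        return False
    if temp_c <= setpoint - hysteresis:
        return True
    return lamp_on  # inside the dead band: keep the current lamp state

def simulate(steps=600, dt=1.0, ambient=25.0):
    """First-order thermal model: constant heating while the lamp is on,
    exponential cooling towards ambient temperature when it is off."""
    temp, lamp, history = ambient, True, []
    for _ in range(steps):
        lamp = on_off_step(temp, lamp)
        heat = 0.8 if lamp else 0.0          # °C/s lamp heating rate (assumed)
        temp += dt * (heat - 0.01 * (temp - ambient))
        history.append(temp)
    return history

trace = simulate()
print(f"final temperature: {trace[-1]:.1f} °C")
```

With these assumed constants the temperature settles into an oscillation around the 65–70 °C band reported for the real stove, which is the characteristic behavior of on-off control with hysteresis.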
Procedia PDF Downloads 40246 Simultaneous Interpreting and Meditation: An Experimental Study on the Effects of Qigong Meditation on Simultaneous Interpreting Performance
Authors: Lara Bruno, Ilaria Tipà, Franco Delogu
Abstract:
Simultaneous interpreting (SI) is a demanding language task which requires the simultaneous activation of different cognitive processes. This complex activity requires interpreters not only to be proficient in their working languages but also to have a great ability to focus attention and control anxiety during their performance. Qigong meditation techniques have a positive impact on several cognitive functions, including attention and anxiety control. This study aims at exploring the influence of Qigong meditation on the quality of simultaneous interpreting. Twenty interpreting students, divided into two groups, were trained for 8 days in Qigong meditation practice. Before and after training, a brief simultaneous interpreting task was performed. The language combinations of group A and group B were English-Italian and Chinese-Italian, respectively. Students' performances were recorded and rated by independent evaluators. Assessments were based on 12 different parameters, divided into 4 macro-categories: content, form, delivery and anxiety control. To determine whether there was any significant variation between the pre-training and post-training SI performance, ANOVA analyses were conducted on the ratings provided by the independent evaluators. Main results indicate a significant improvement of the interpreting performance after the meditation training intervention for both groups. However, group A registered a higher improvement compared to group B. Nonetheless, positive effects of meditation were found in all the observed macro-categories. Meditation was beneficial not only for speech delivery and anxiety control but also for cognitive and attention abilities. From a cognitive and pedagogical point of view, the present results open new paths of research on the practice of meditation as a tool to improve SI performances.Keywords: cognitive science, interpreting studies, Qigong meditation, simultaneous interpreting, training
Procedia PDF Downloads 16045 The Impact of Study Abroad Experience on Interpreting Performance
Authors: Ruiyuan Wang, Jing Han, Bruno Di Biase, Mark Antoniou
Abstract:
The purpose of this study is to explore the relationship between working memory (WM) capacity and Chinese-English consecutive interpreting (CI) performance in interpreting learners with different study abroad experience (SAE). This relationship is not well understood. The study also examines whether Chinese interpreting learners with SAE in English-speaking countries demonstrate better performance in inflectional morphology and agreement, notoriously unstable in Chinese speakers of English L2, in their interpreting output than learners without SAE. Fifty Chinese university students, majoring in Chinese-English interpreting, were recruited in Australia (n=25) and China (n=25). The two groups were matched in age, language proficiency, and interpreting training period. The study abroad (SA) group had been studying in an English-speaking country (Australia) for over 12 months, whereas none of the students recruited in China (the no-study-abroad, NSA, group) had ever studied or lived in an English-speaking country. Data on language proficiency and training background were collected via a questionnaire. Lexical retrieval performance and working memory (WM) capacity data were collected experimentally, and finally, interpreting data were elicited via a direct CI task. The main results of the study show that WM significantly correlated with participants' CI performance independently of learning context. Moreover, SA learners outperformed NSA learners in terms of subject-verb number agreement. Apart from that, WM capacity was also found to correlate significantly with morphosyntactic accuracy. This paper sheds some light on the relationship between study abroad, WM capacity, and CI performance. 
Exploring the effect of study abroad on interpreting trainees and how various important factors correlate may help interpreting educators bring forward more targeted teaching paradigms for participants with different learning experiences.Keywords: study abroad experience, consecutive interpreting, working memory, inflectional agreement
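The correlation analyses reported above (WM capacity against CI performance and morphosyntactic accuracy) amount to computing Pearson's r across participants. A minimal sketch follows; the scores are invented for illustration and are not the study's data.

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between two score vectors."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return float(np.corrcoef(x, y)[0, 1])

# Invented example scores for 10 participants (not the study's data):
# WM span scores and consecutive-interpreting ratings on arbitrary scales.
wm_span = [3.5, 4.0, 4.2, 4.8, 5.0, 5.1, 5.5, 6.0, 6.2, 6.8]
ci_score = [55, 58, 62, 60, 66, 70, 69, 74, 78, 80]
r = pearson_r(wm_span, ci_score)
print(f"r = {r:.2f}")
```

A significant positive r of this kind is what the abstract summarizes as WM correlating with CI performance "independently of learning context".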
Procedia PDF Downloads 10044 Experimental Analysis for the Inlet of the Brazilian Aerospace Vehicle 14-X B
Authors: João F. A. Martos, Felipe J. Costa, Sergio N. P. Laiton, Bruno C. Lima, Israel S. Rêgo, Paulo P. G. Toro
Abstract:
Nowadays, the scramjet is a topic that has attracted the attention of several scientific communities (USA, Australia, Germany, France, Japan, India, China, Russia), which have invested in this type of propulsion system due to its potential to facilitate access to space and to reach hypersonic speeds. The Brazilian hypersonic scramjet aerospace vehicle 14-X B is a technological demonstrator of a hypersonic airbreathing propulsion system based on supersonic combustion (scramjet), intended to be tested in flight in the Earth's atmosphere at 30 km altitude and Mach number 7. The 14-X B has been designed at the Prof. Henry T. Nagamatsu Laboratory of Aerothermodynamics and Hypersonics of the Institute for Advanced Studies (IEAv) in Brazil. The IEAv Hypersonic Shock Tunnel, named T3, is a ground-test facility able to reproduce flight conditions such as the Mach number, as well as the pressure and temperature in the test section, close to those encountered during the test flight of the vehicle 14-X B at design conditions. A 1-m-long stainless steel 14-X B model was experimentally investigated in the T3 Hypersonic Shock Tunnel at freestream Mach number 7. Static pressure measurements along the lower surface of the 14-X B model, along with high-speed schlieren photographs taken of the 5.5° leading edge and the 14.5° deflection compression ramp, provided experimental data that were compared to analytical-theoretical solutions and computational fluid dynamics (CFD) simulations. The results show good qualitative agreement, consequently demonstrating the importance of these methods in the design of the 14-X B hypersonic aerospace vehicle.Keywords: 14-X, CFD, hypersonic, hypersonic shock tunnel, scramjet
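The analytical-theoretical solutions for the 5.5° leading edge and the compression ramp follow from the classical θ–β–Mach relation for an oblique shock, tan θ = 2 cot β (M² sin²β − 1) / (M²(γ + cos 2β) + 2). The sketch below solves the weak-shock branch by bisection; γ = 1.4 is assumed, and this is an illustration of the standard relation, not the authors' code.

```python
import math

def theta_from_beta(beta, mach, gamma=1.4):
    """Flow deflection angle theta (rad) for a given shock angle beta (rad)."""
    num = mach**2 * math.sin(beta)**2 - 1.0
    den = mach**2 * (gamma + math.cos(2.0 * beta)) + 2.0
    return math.atan(2.0 / math.tan(beta) * num / den)

def beta_weak(theta_deg, mach, gamma=1.4):
    """Weak-shock angle (deg) via bisection between the Mach angle
    (where theta -> 0) and an angle below the strong-shock branch."""
    theta = math.radians(theta_deg)
    lo = math.asin(1.0 / mach) + 1e-6
    hi = math.radians(64.0)
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if theta_from_beta(mid, mach, gamma) < theta:
            lo = mid
        else:
            hi = mid
    return math.degrees(0.5 * (lo + hi))

# Shock angle for the 5.5 deg leading edge at freestream Mach 7
print(f"beta = {beta_weak(5.5, 7.0):.1f} deg")
```

The resulting shock angle, together with the oblique-shock pressure ratio, gives the analytical static pressures against which the lower-surface measurements would be compared.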
Procedia PDF Downloads 35943 Physicochemical Stability of Pulse Spreads during Storage after Sous Vide Treatment and High Pressure Processing
Authors: Asnate Kirse, Daina Karklina, Sandra Muizniece-Brasava, Ruta Galoburda
Abstract:
Pulses are high in plant protein and dietary fiber and contain slowly digestible starches. Innovative products from pulses could increase their consumption and benefit consumer health. This study was conducted to evaluate the physicochemical stability of processed cowpea (Vigna unguiculata (L.) Walp. cv. Fradel) and maple pea (Pisum sativum var. arvense L. cv. Bruno) spreads at 5 °C during 62-day storage. The physicochemical stability of the pulse spreads was compared after sous vide treatment (80 °C/15 min) and high pressure processing (700 MPa/10 min/20 °C). Pulse spreads were made by homogenizing cooked pulses in a food processor together with salt, citric acid, oil, and bruschetta seasoning. A total of four different pulse spreads were studied: cowpea spread without and with seasoning, and maple pea spread without and with seasoning. Transparent PA/PE and light-proof PET/ALU/PA/PP film pouches were used for packaging of the pulse spreads under vacuum. The parameters investigated were pH, water activity and mass losses. Pulse spreads were tested on days 0, 15, 29, 42, 50, 57 and 62. The results showed that sous vide treatment and high pressure processing had an insignificant influence on pH, water activity and mass losses after processing, irrespective of packaging material (p>0.1). The pH and water activity of the sous vide treated and high pressure processed pulse spreads in the different packaging materials proved to be stable throughout the storage. Mass losses during storage amounted to 0.1%. The chosen sous vide treatment and high pressure processing regimes and packaging materials are suitable to maintain consistent physicochemical quality of the new products during 62-day storage.Keywords: cowpea, flexible packaging, maple pea, water activity
Procedia PDF Downloads 27942 Benefits of a Topical Emollient Product in the Management of Canine Nasal Hyperkeratosis
Authors: Christelle Navarro, Sébastien Viaud, Carole Gard, Bruno Jahier
Abstract:
Background: Idiopathic or familial nasal hyperkeratosis (NHK) may be considered a cosmetic issue in its uncomplicated form. Nevertheless, prevention of secondary lesions such as fissures or infections could be advised through proper management. The objective of this open-field study is to evaluate the benefits of a moisturizing balm in privately owned dogs with NHK, using an original validation grid for both investigator and owner assessments. Methods: Dogs with idiopathic or familial NHK received a vegetable-based ointment (Sensiderm® Balm, MP Labo, France) BID for 60 days. A global dermatological score (GDS) was defined as the sum of 4 criteria ("dryness", "lichenification", "crusts", and "affected area"), each on a 0 (none) to 3 (severe or > 2/3 extension) scale. Evaluation of this GDS (0-12) on D0, D30, and D60 by owners and investigators was the main outcome. The score's percentage decrease versus D0, the evolution of each individual score, the correlation between observers, and the evaluation of clinical improvement and animal discomfort on a VAS (0-10) during follow-up were analysed. Results: The global dermatological score decreased significantly over time (p<0.0001) for all observers. The decrease reached 44.9% and 54.3% at D30 and 54.5% and 62.3% at D60 for investigators and owners, respectively. The "dryness", "lichenification" and "affected area" scores decreased significantly and steadily over time compared to D0 for both investigators and owners (p < 0.001, and p = 0.001 for the investigator assessment of dryness). All scores but one (lichenification) were correlated between observers at all times (only at D60 for crusts). Regardless of the observer, clinical improvement was always above 7. At D30 and until D60, "animal discomfort" was more than halved. Owner satisfaction was high as soon as D30 (8.1/10). No adverse effects were reported. 
Conclusion and clinical importance: The positive results confirm the benefits and safety of a moisturizing balm when used in dogs with uncomplicated NHK.Keywords: hyperkeratosis, nose, dog, moisturizer
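The global dermatological score and its percentage decrease versus D0 are straightforward to compute from the four criteria. A sketch follows; the example scores are invented for illustration.

```python
def gds(dryness: int, lichenification: int, crusts: int, affected_area: int) -> int:
    """Global dermatological score: sum of four criteria, each scored 0-3."""
    scores = (dryness, lichenification, crusts, affected_area)
    if any(not 0 <= s <= 3 for s in scores):
        raise ValueError("each criterion must be scored 0 to 3")
    return sum(scores)  # ranges from 0 (no lesion) to 12 (severe)

def percent_decrease(score_d0: float, score_dx: float) -> float:
    """Percentage decrease of the GDS at a follow-up day versus day 0."""
    return 100.0 * (score_d0 - score_dx) / score_d0

# Invented example dog: D0 vs D60 assessments
d0, d60 = gds(3, 2, 1, 3), gds(1, 1, 0, 2)
print(f"GDS {d0} -> {d60}: {percent_decrease(d0, d60):.1f}% decrease")
```

The study-level figures (e.g., 54.5% and 62.3% at D60) would be such percentage decreases averaged over the enrolled dogs per observer group.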
Procedia PDF Downloads 12941 Capturing Public Voices: The Role of Social Media in Heritage Management
Authors: Mahda Foroughi, Bruno de Anderade, Ana Pereira Roders
Abstract:
Social media platforms have been increasingly used by locals and tourists to express their opinions about buildings, cities, and built heritage in particular. Most recently, scholars have been using social media to conduct innovative research on built heritage and heritage management. Still, the application of artificial intelligence (AI) methods to analyze social media data for heritage management is seldom explored. This paper investigates the potential of short texts (sentences and hashtags) shared through social media as a data source, and of artificial intelligence methods for data analysis, for revealing the cultural significance (values and attributes) of built heritage. The city of Yazd, Iran, was taken as a case study, with a particular focus on windcatchers, key attributes conveying outstanding universal value, as inscribed on the UNESCO World Heritage List. This paper has three subsequent phases: 1) state of the art on the intersection of public participation in heritage management and social media research; 2) methodology of data collection and data analysis related to coding people's voices from Instagram and Twitter into values of windcatchers over the last ten years; 3) preliminary findings on the comparison between opinions of locals and tourists, sentiment analysis, and its association with the values and attributes of windcatchers. Results indicate that the age value is recognized as the most important value by all interest groups, while the political value is the least acknowledged. Moreover, negative sentiments (e.g., critiques) are scarcely reflected in social media. The results confirm the potential of social media for heritage management in terms of (de)coding and measuring the cultural significance of built heritage for windcatchers in Yazd. 
The methodology developed in this paper can be applied to other attributes in Yazd and also to other case studies.Keywords: social media, artificial intelligence, public participation, cultural significance, heritage, sentiment analysis
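As an illustration of the kind of short-text analysis described above, a minimal lexicon-based sentiment tally is sketched below. The word lists and example posts are invented, and an actual study would use trained AI models rather than this toy lexicon.

```python
# Toy sentiment lexicon (invented for illustration)
POSITIVE = {"beautiful", "amazing", "love", "ancient", "impressive"}
NEGATIVE = {"ruined", "crowded", "ugly", "neglected"}

def sentiment(post: str) -> str:
    """Toy lexicon-based polarity for a short social-media text."""
    words = {w.strip("#.,!?").lower() for w in post.split()}
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

posts = [
    "The ancient windcatchers of Yazd are beautiful! #heritage",
    "Sadly the site felt neglected and crowded today",
]
for p in posts:
    print(sentiment(p), "-", p)
```

Aggregating such polarities per interest group (locals versus tourists) and per value category is what yields comparisons like those reported in the preliminary findings.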
Procedia PDF Downloads 11340 Validity of a Timing System in the Alpine Ski Field: A Magnet-Based Timing System Using the Magnetometer Built into an Inertial Measurement Units
Authors: Carla Pérez-Chirinos Buxadé, Bruno Fernández-Valdés, Mónica Morral-Yepes, Sílvia Tuyà Viñas, Josep Maria Padullés Riu, Gerard Moras Feliu
Abstract:
There is still a long way to go in exploring all the possible applications of inertial measurement units (IMUs) in the sports field. The aim of this study was to evaluate the validity of a new application of these wearable sensors; specifically, to evaluate a magnet-based timing system (M-BTS) for timing gate-to-gate in an alpine ski slalom using the magnetometer embedded in an IMU. This was a validation study. The criterion validity of the time measured by the M-BTS was assessed using the 95% error range against the actual time obtained from photocells. The experiment was carried out with first- and second-year junior skiers performing a ski slalom on a ski training slope. Eight alpine skiers (17.4 ± 0.8 years, 176.4 ± 4.9 cm, 67.7 ± 2.0 kg, 128.8 ± 26.6 slalom FIS points) participated in the study. An IMU device was attached to the skier's lower back. Skiers performed a 40-gate slalom, from which four gates were assessed. The M-BTS consisted of placing four bar magnets buried in the snow surface on the inner side of each gate's turning pole; the magnetometer built into the IMU detected the peak-shaped magnetic field when passing near the magnets at a certain speed. Four magnetic peaks were detected. The time elapsed between peaks was calculated. Three inter-gate times were obtained for each system: photocells and M-BTS. The total time was defined as the sum of the inter-gate times. The 95% error interval for the total time was 0.050 s for the ski slalom. The M-BTS is valid for timing gate-to-gate in an alpine ski slalom. Inter-gate times can provide additional data for analyzing a skier's performance, such as asymmetries between the left and right foot.Keywords: gate crossing time, inertial measurement unit, timing system, wearable sensor
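The timing computation described above — detect four magnetic peaks in the magnetometer trace, then difference their timestamps — can be sketched with a simple threshold-based peak picker. The synthetic signal below stands in for the real magnetometer data; the sampling rate, peak shapes, and gate times are assumptions.

```python
import numpy as np

def detect_peaks(signal, threshold):
    """Indices of local maxima above threshold (simple 3-point test)."""
    s = np.asarray(signal, float)
    return [i for i in range(1, len(s) - 1)
            if s[i] > threshold and s[i] >= s[i - 1] and s[i] > s[i + 1]]

fs = 100.0                                   # assumed sampling rate, Hz
t = np.arange(0, 10, 1 / fs)
gate_times = [1.0, 3.2, 5.1, 7.4]            # synthetic gate passages, s
# Peak-shaped magnetic field at each gate passage (Gaussian stand-ins)
signal = sum(np.exp(-((t - g) ** 2) / (2 * 0.05 ** 2)) for g in gate_times)

peaks = detect_peaks(signal, threshold=0.5)
peak_times = [t[i] for i in peaks]
inter_gate = np.diff(peak_times)             # three inter-gate times
total_time = float(peak_times[-1] - peak_times[0])
print(f"inter-gate times: {np.round(inter_gate, 2)} s, total: {total_time:.2f} s")
```

Comparing such M-BTS totals against photocell totals across runs is what produces the 95% error interval reported in the validation.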
Procedia PDF Downloads 184