Search results for: Jorge Arana
43 Simulation of Elastic Bodies through Discrete Element Method, Coupled with a Nested Overlapping Grid Fluid Flow Solver
Authors: Paolo Sassi, Jorge Freiria, Gabriel Usera
Abstract:
In this work, a finite volume fluid flow solver is coupled with a discrete element method module to simulate the dynamics of free and elastic bodies interacting with the fluid and with each other. The open-source fluid flow solver, caffa3d.MBRi, can work with nested overlapping grids, making it easy to refine the grid in the region where the bodies are moving. To do so, it is necessary to implement a recognition function able to identify the specific mesh block in which each body is moving. The set of overlapping finer grids can be displaced along with the set of bodies being simulated. The interaction between the bodies and the fluid is computed through a two-way coupling. The velocity field of the fluid is first interpolated to determine the drag force on each object. After solving the objects' displacements, subject to the elastic bonding among them, the force is applied back onto the fluid through a Gaussian smoothing over the cells near the position of each object. The fishnet is represented as lumped masses connected by elastic lines. The internal forces are derived from the elasticity of these lines, and the external forces are due to drag, gravity, buoyancy and the load acting on each element of the system. When solving the system of ordinary differential equations that represents the motion of the elastic and flexible bodies, the fourth-order Runge-Kutta solver was found to be the best tool in terms of performance, but it requires a finer grid than the fluid solver for the system to converge, which demands greater computing power. The coupled solver is demonstrated by simulating the interaction between the fluid, an elastic fishnet and a set of free bodies captured by the net as they are dragged by the fluid. The deformation of the net, as well as the wake produced in the fluid stream, are well captured by the method, without requiring the fluid solver mesh to adapt to the evolving geometry.
Application of the same strategy to the simulation of elastic structures subject to the action of wind is also possible with the method presented, and one such application is currently under development. Keywords: computational fluid dynamics, discrete element method, fishnets, nested overlapping grids
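The coupling loop this abstract describes — interpolate the fluid velocity to obtain a drag force on each lumped mass, integrate the elastic-body ODEs with fourth-order Runge-Kutta, then feed the force back to the fluid — can be sketched in miniature. The snippet below is an illustrative 1-D toy with two lumped masses on one elastic line, not the caffa3d.MBRi implementation; all constants are invented for the example.

```python
import numpy as np

def rk4_step(f, y, t, dt):
    """One classical fourth-order Runge-Kutta step for y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + dt / 2, y + dt / 2 * k1)
    k3 = f(t + dt / 2, y + dt / 2 * k2)
    k4 = f(t + dt, y + dt * k3)
    return y + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

# Two lumped masses joined by one elastic line (1-D toy problem).
# State y = [x1, v1, x2, v2]; all constants are illustrative only.
k, rest_len, m = 50.0, 1.0, 0.1   # spring stiffness, rest length, lumped mass
c_drag, u_fluid = 0.8, 2.0        # drag coefficient, local fluid velocity

def rhs(t, y):
    x1, v1, x2, v2 = y
    f_el = k * (x2 - x1 - rest_len)   # elastic bond force along the line
    f_d1 = c_drag * (u_fluid - v1)    # drag from the interpolated fluid velocity
    f_d2 = c_drag * (u_fluid - v2)
    return np.array([v1, (f_el + f_d1) / m, v2, (-f_el + f_d2) / m])

y = np.array([0.0, 0.0, 1.0, 0.0])    # start at rest, bond at its rest length
t, dt = 0.0, 1e-3
for _ in range(1000):                 # integrate one second of motion
    y = rk4_step(rhs, y, t, dt)
    t += dt
# After 1 s the masses have been dragged to (nearly) the fluid velocity,
# while the elastic bond keeps their separation at the rest length.
```

In the real solver the same RK4 update runs per lumped mass in 3-D, with the drag force also spread back onto the fluid cells via Gaussian smoothing.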
Procedia PDF Downloads 416
42 Mortar Positioning Effects on Uniaxial Compression Behavior in Hollow Concrete Block Masonry
Authors: José Álvarez Pérez, Ramón García Cedeño, Gerardo Fajardo-San Miguel, Jorge H. Chávez Gómez, Franco A. Carpio Santamaría, Milena Mesa Lavista
Abstract:
The uniaxial compressive strength and the modulus of elasticity of hollow concrete block masonry (HCBM) are key mechanical properties for structural design. These properties are obtained through experimental tests conducted on prisms or wallettes and depend on various factors, with the hollow concrete block (HCB) contributing significantly to overall strength. One influential factor in the compressive behaviour of masonry is the thickness and method of mortar placement. Mexican regulations stipulate mortar placement over the entire net area (full-shell), with strength computed on the gross area. However, in professional practice there is a growing trend to place mortar solely on the lateral faces. The United States standard, in contrast, dictates mortar placement and computation over the net area of the HCB. The Canadian standard specifies mortar placement solely on the lateral faces (face-shell bedding), where computation requires the effective load area, corresponding to the mortar placement area. This research aims to evaluate the influence of different mortar placement methods on the axial compression behaviour of HCBM. To achieve this, an experimental campaign was conducted, including: (1) 10 HCB specimens with mortar on the entire net area; (2) 10 HCB specimens with mortar placed on the lateral faces; (3) 10 prisms of 2-course HCB under axial compression with full-shell mortar; (4) 10 prisms of 2-course HCB under axial compression with face-shell bedding; (5) 10 prisms of 3-course HCB with full-shell mortar; (6) 10 prisms of 3-course HCB with face-shell bedding; (7) 10 prisms of 4-course HCB with full-shell mortar; and (8) 10 prisms of 4-course HCB with face-shell bedding.
A combination of sulphur and fly ash in a 2:1 ratio was used as the capping material, meeting the average compressive strength requirement of over 35 MPa as per the NMX-C-036 standard. Additionally, a mortar with a strength of over 17 MPa was used for the prisms. The results indicate that prisms with mortar placed over the full shell exhibit higher strength than those with face-shell bedding. However, the elastic modulus was lower for prisms with full-shell mortar placement than for face-shell bedding. Keywords: masonry, hollow concrete blocks, mortar placement, prism tests
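As a rough illustration of why the choice of reference area matters in the standards compared above, the sketch below converts a single hypothetical failure load into a compressive stress under the three conventions mentioned (gross area, net area, face-shell bedded area). All dimensions, ratios, and the load are invented for the example, not taken from this study.

```python
# Hypothetical hollow-block geometry (mm): all values are illustrative only.
length, width = 390.0, 190.0     # block plan dimensions
face_shell_t = 32.0              # thickness of each lateral face shell
net_area_ratio = 0.55            # net/gross area ratio, typical for hollow units

gross_area = length * width                      # Mexican convention (gross area)
net_area = net_area_ratio * gross_area           # US convention (net area)
face_shell_area = 2 * length * face_shell_t      # Canadian convention (face-shell bedding)

failure_load_kN = 400.0          # hypothetical prism failure load
for name, area in [("gross", gross_area), ("net", net_area),
                   ("face-shell", face_shell_area)]:
    stress_MPa = failure_load_kN * 1e3 / area    # N / mm^2 = MPa
    print(f"{name}: {stress_MPa:.2f} MPa")
```

The same failure load yields very different nominal strengths (here roughly 5.4, 9.8, and 16.0 MPa), which is why results must always be reported together with the area convention used.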
Procedia PDF Downloads 61
41 Positive Interactions among Plants in Pinegroves over Quartzitic Sands
Authors: Enrique González Pendás, Vidal Pérez Hernández, Jorge Ferro Díaz, Nelson Careaga Pendás
Abstract:
The investigation was carried out in the San Ubaldo Protected Area, in the interior of an open pinegrove with palm trees on a dry plain of quartzitic sands, belonging to the Floristic Managed Reserve San Ubaldo-Sabanalamar, Guane, Pinar del Río, Cuba. This area is characterized by drastic seasonal variations, high temperatures and water evaporation, strong solar radiation, and sandy soils of almost pure quartz, which are very acidic and poor in nutrients. The objective of the present work is to determine evidence of facilitation and its relationship with the structure and composition of plant communities in these peculiar ecosystems. For this study, six parallel linear transects of 100 m were traced, along which a general recording of the flora was carried out. To establish which plants act as nurses, a height over 1 m, a canopy over 1.5 m, and the occurrence of several species underneath were taken into account. Cover was recorded using the line-intercept method; the mean values of species richness for the taxa under nurses were compared with those located in the open spaces among them. It was then determined which plants are the best recruiters of other species (the best nurses). An experiment was conducted to measure and compare some parameters of pine seedlings under the canopy of Byrsonima crassifolia (L.) Kunth and in open spaces; the number of individuals was also counted by species to calculate frequency and total abundance in the study area. As a result, an up-to-date floristic list and a phylogenetic tree of the plant community showing high phylodiversity are offered, and it is demonstrated that the mean values of species richness and abundance of species under the nurses are significantly higher than those in open spaces. Furthermore, the phylogenetic trees show that the species which cohabit under the nurses are not phylogenetically related.
These results are cited as evidence of facilitation among plants, and they once more demonstrate the importance of the nurse effect in preserving plant diversity in extreme environments. Keywords: facilitation, nurse plants, positive interactions, quartzitic sands
Procedia PDF Downloads 341
40 Direct Laser Fabrication and Characterization of Cu-Al-Ni Shape Memory Alloy for Seismic Damping Applications
Authors: Gonzalo Reyes, Magdalena Walczak, Esteban Ramos-Moore, Jorge Ramos-Grez
Abstract:
Metal additive manufacturing technologies have gained strong support and acceptance as a promising alternative for manufacturing high-performance, complex-geometry products. The main purpose of the present work is to study the microstructure and phase transformation temperatures of Cu-Al-Ni shape memory alloys fabricated by a direct laser additive process using metallic powders as precursors. The potential application is to manufacture self-centering seismic dampers for the earthquake protection of buildings out of a copper-based alloy by an additive process. In this process, the Cu-Al-Ni alloy is melted inside a high-temperature vacuum chamber with the aid of a high-power fiber laser under an inert atmosphere. The laser provides the energy to melt the alloy powder layer. The process allows fabricating fully dense, oxygen-free Cu-Al-Ni specimens using different laser power levels, laser-powder interaction times, furnace ambient temperatures, and cooling rates, as well as modifying the concentration of the alloying elements. Two sets of specimens were fabricated with nominal compositions of Cu-13Al-3Ni and Cu-13Al-4Ni in wt.%; however, semi-quantitative chemical analysis by EDX showed that the resulting compositions were closer to Cu-12Al-5Ni and Cu-11Al-8Ni, respectively. In spite of that, it is expected that the specimens should still exhibit shape memory behavior. To confirm this hypothesis, phase transformation temperatures will be measured by DSC, looking for martensitic and austenitic phase transformations around 150°C. So far, metallographic analysis of the specimens has shown well-defined martensitic microstructures. Moreover, XRD revealed diffraction peaks corresponding to the (0 0 18) and (1 2 8) planes, which are also associated with the presence of the martensitic phase.
We conclude that it should be possible to obtain fully dense Cu-Al-Ni alloys exhibiting the shape memory effect by direct laser fabrication, and to advance toward the fabrication of self-centering seismic dampers by a controllable metal additive manufacturing process. Keywords: Cu-Al-Ni alloys, direct laser fabrication, shape memory alloy, self-centering seismic dampers
Procedia PDF Downloads 516
39 Thermoelectric Blanket for Aiding the Treatment of Cerebral Hypoxia and Other Related Conditions
Authors: Sarayu Vanga, Jorge Galeano-Cabral, Kaya Wei
Abstract:
Cerebral hypoxia refers to a condition in which the oxygen supply to the brain is reduced. Patients suffering from this condition experience a decrease in body temperature. While there is currently no cure for cerebral hypoxia, certain procedures are used to aid in its treatment, and regulating body temperature is one of them. Hypoxia is well known to reduce the body temperature of mammals, although the neural origins of this response remain uncertain. To speed recovery from this condition, it is necessary to maintain a stable body temperature. In this study, we present an approach to regulating body temperature for patients who suffer from cerebral hypoxia or similar conditions. After a thorough literature study, we propose the use of thermoelectric blankets: temperature-controlled thermal blankets based on thermoelectric devices. These blankets are capable of both heating and cooling the patient to stabilize body temperature. This is possible through the reversible effect that thermoelectric devices offer while also behaving as thermal sensors, and it is an effective way to stabilize temperature. Thermoelectricity is the direct conversion of thermal energy to electrical energy and vice versa; this effect is known as the Seebeck effect and is characterized by the Seebeck coefficient. In such a configuration, the device has cooling and heating sides whose roles can be interchanged by simply switching the direction of the current input to the system. The design integrates various aspects, including a humidifier, a ventilation machine, IV-administered medication, air conditioning, a circulation device, and a body temperature regulation system. The proposed design includes thermocouples that trigger the blanket to increase or decrease the set temperature through a medical temperature sensor.
Additionally, the proposed design offers an efficient way to control fluctuations in body temperature while remaining cost-friendly, with an expected cost of 150 dollars. We are currently developing a prototype of the design to collect thermal and electrical data under different conditions, and we also intend to perform an optimization analysis to improve the design further. While this proposal was developed for treating cerebral hypoxia, it can also aid in the treatment of other related conditions, as fluctuations in body temperature are a common symptom of many illnesses. Keywords: body temperature regulation, cerebral hypoxia, thermoelectric, blanket design
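The reversible heating/cooling behaviour described above amounts to choosing the direction of the drive current from a temperature reading. A minimal sketch of such a decision rule follows; the setpoint and deadband are assumed illustrative values, not clinical guidance, and the real design would add the sensor interface and safety interlocks.

```python
def thermoelectric_control(body_temp_c, setpoint_c=37.0, deadband_c=0.5):
    """Toy controller: pick the Peltier current direction from a sensor reading.

    Positive return value -> drive current so the heating side faces the patient;
    negative -> reversed current (cooling); zero -> no drive.
    Setpoint and deadband are illustrative assumptions, not clinical values.
    """
    if body_temp_c < setpoint_c - deadband_c:
        return +1.0   # too cold: heat
    if body_temp_c > setpoint_c + deadband_c:
        return -1.0   # too hot: reverse the current (the effect is reversible)
    return 0.0        # inside the deadband: hold

print(thermoelectric_control(35.8))  # hypothermic reading -> 1.0 (heat)
```

The deadband prevents rapid toggling of the current direction around the setpoint, which matters for both comfort and device lifetime.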
Procedia PDF Downloads 156
38 Digital Phase Shifting Holography in a Non-Linear Interferometer using Undetected Photons
Authors: Sebastian Töpfer, Marta Gilaberte Basset, Jorge Fuenzalida, Fabian Steinlechner, Juan P. Torres, Markus Gräfe
Abstract:
This work introduces a combination of digital phase-shifting holography with a non-linear interferometer using undetected photons. Non-linear interferometers can be used in combination with a measurement scheme called quantum imaging with undetected photons, which allows the wavelength used for sampling an object to be separated from the wavelength detected at the imaging sensor. This method has recently attracted increasing attention, as it allows the use of exotic wavelengths (e.g., mid-infrared, ultraviolet) for object interaction while keeping the detection in spectral regions with highly developed, comparably low-cost imaging sensors. The object information, including its transmission and phase influence, is recorded in the form of an interferometric pattern. To collect these patterns, this work combines quantum imaging with undetected photons with digital phase-shifting holography using a minimal sampling of the interference. This extends the measurement capabilities of the quantum imaging scheme and brings it one step closer to application. Quantum imaging with undetected photons uses correlated photons generated by spontaneous parametric down-conversion in a non-linear interferometer to create indistinguishable photon pairs, which leads to an effect called induced coherence without induced emission. Placing an object inside the interferometer changes the interferometric pattern depending on the object's properties. Digital phase-shifting holography records multiple images of the interference at known phase shifts to reconstruct the complete interference shape, which can afterwards be used to analyze the changes introduced by the object and infer its properties. An extensive characterization of this method was carried out using a proof-of-principle setup. The measured spatial resolution, phase accuracy, and transmission accuracy are compared for different combinations of camera exposure times and numbers of interference sampling steps.
The current limits of the method are identified, indicating room for further improvement. To summarize, this work presents an alternative holographic measurement method using non-linear interferometers in combination with quantum imaging, enabling new ways of measuring and motivating continued research. Keywords: digital holography, quantum imaging, quantum holography, quantum metrology
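The phase-shifting reconstruction this abstract relies on can be illustrated with the standard four-step formula, in which the object phase is recovered from four interferograms shifted by π/2 each. This is a generic textbook sketch on synthetic data, not the authors' processing code.

```python
import numpy as np

def four_step_phase(i0, i90, i180, i270):
    """Recover the object phase from four interferograms shifted by pi/2.

    For I_s = a + b*cos(phi + s): I0 - I180 = 2b*cos(phi) and
    I270 - I90 = 2b*sin(phi), so arctan2 returns phi directly.
    """
    return np.arctan2(i270 - i90, i0 - i180)

# Synthetic check: build the four frames from a known phase map.
phi = np.linspace(-1.2, 1.2, 5)     # "object" phase values
a, b = 2.0, 1.0                      # background level and modulation depth
frames = [a + b * np.cos(phi + s) for s in (0, np.pi / 2, np.pi, 3 * np.pi / 2)]
phi_rec = four_step_phase(*frames)   # matches phi to numerical precision
```

Note the background `a` and modulation `b` cancel in the differences, which is why the scheme tolerates uneven illumination; the paper's "minimal sampling" refers to reducing the number of such phase steps further.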
Procedia PDF Downloads 92
37 The Influence of Alvar Aalto on the Early Work of Álvaro Siza
Authors: Eduardo Jorge Cabral dos Santos Fernandes
Abstract:
The expression ‘Porto School’, usually associated with an educational institution, the School of Fine Arts of Porto, was first applied in the sense of an architectural trend by Nuno Portas, in a text published in 1983. The expression is used to characterize a set of works by Porto architects in which common elements are found, namely the desire to reuse the languages and forms of the German and Dutch rationalism of the twenties, using the work of Alvar Aalto as a mediation for the reinterpretation of these models. In the same year, in a text published in Jornal de Letras, Artes e Ideias, Álvaro Siza described the Finnish architect as an agent of miscegenation, one who transforms experienced models and introduces them into different realities. The influence of foreign models and their adaptation to the context has been a recurrent theme in Portuguese architecture, one that finds important contributions in the writings of Alexandre Alves Costa from this period. However, the identification of these characteristics in Siza’s work is not limited to Portuguese theoretical production: it is the recognition of this attitude towards the context that leads Kenneth Frampton to include Siza in the restricted group of architects who embody Critical Regionalism (in his book Modern Architecture: A Critical History). For Frampton, Siza’s work focuses on the territory and on the consequences of intervention in the context, viewing architecture as a tectonic fact rather than a series of scenographic episodes and emphasizing site-specific aspects (topography, light, climate).
Therefore, the theme of this paper is the dichotomy between foreign influences and adaptation to the context in the early work of Álvaro Siza (designed in the sixties), in which the influence (theoretical, methodological, and formal) of Alvar Aalto manifests itself in form and language: the pool at Quinta da Conceição, the Seaside Pools and the Tea House (three works in Leça da Palmeira), and the Lordelo Cooperative (in Porto). This work is part of a more comprehensive project that considers several case studies throughout the Portuguese architect's vast career, built in Portugal and abroad, in order to obtain a holistic view. Keywords: Alvar Aalto, Álvaro Siza, foreign influences, adaptation to the context
Procedia PDF Downloads 30
36 Anecic and Epigeic Earthworms as Potential Biocontrol Agents of Fusarium graminearum, Causal Agent of Fusarium Head Blight on Wheat
Authors: Gabriella Jorge, Carlos A. Pérez, Hanna Friberg, Sara Söderlund, Jan Lagerlöf
Abstract:
Fusarium Head Blight (FHB) is one of the most important Fusarium-caused diseases, affecting cereals with serious detrimental effects on yield and grain quality worldwide. Earthworms have been suggested as an alternative means of controlling this disease, which requires a combination of preventive methods to reduce the level of damage, although their effect has been shown to be species-dependent. Our objective was to evaluate the effect of the earthworms Aporrectodea longa and Lumbricus rubellus on the inoculum of Fusarium graminearum on wheat straw. To test this, we kept earthworms under controlled conditions for 6 weeks in vessels with soil and F. graminearum-inoculated straw covering the surface. Two factors were evaluated in a complete factorial design: earthworms (three levels: no earthworms, A. longa, and L. rubellus) and straw (two levels: inoculated with the pathogen, and sterile). The presence of L. rubellus significantly (P<0.05) reduced the amount of inoculated straw at the soil surface by 31% after 6 weeks, while the presence of A. longa, mostly found in quiescence, had no significant effect on the amount of straw compared to the control. After incubation, F. graminearum was detected by qPCR only in the surface straw of treatments inoculated with the pathogen but without earthworms. None of the treatments showed the presence of Fusarium in the buried straw, soil, or earthworm casts. Both earthworm species decreased in body weight during incubation, most likely due to the decrease in soil water content during the experiment, from 25% to 20%, and/or an inadequate food supply, since no other source of food was added. However, this weight reduction occurred independently of the presence or absence of Fusarium (P<0.05). This indicates that both species, from different ecological groups (anecic and epigeic), can reduce the F. graminearum inoculum present in wheat straw, while their growth is not negatively affected by this pathogen.
These promising results position A. longa and L. rubellus as potential biocontrol agents of this fungal plant pathogen responsible for Fusarium Head Blight in wheat, although further experiments, currently ongoing, are needed to confirm the repeatability of these results. Keywords: Aporrectodea longa, biological control, fungal plant pathogen, Lumbricus rubellus, qPCR, wheat straw
Procedia PDF Downloads 273
35 Evaluation of Natural Waste Materials for Ammonia Removal in Biofilters
Authors: R. F. Vieira, D. Lopes, I. Baptista, S. A. Figueiredo, V. F. Domingues, R. Jorge, C. Delerue-Matos, O. M. Freitas
Abstract:
Odours are generated in municipal solid waste management plants as a result of the decomposition of organic matter, especially when anaerobic degradation occurs. Information was collected about the substances present, and their respective concentrations, in the surrounding atmosphere of several management plants. The main components associated with these unpleasant odours were identified: ammonia, hydrogen sulfide, and mercaptans. Ammonia is the most common and the one that presents the highest concentrations, reaching values of 700 mg/m3. Biofiltration, which simultaneously involves biodegradation, absorption, and adsorption processes, is a sustainable technology for the treatment of these odour emissions when a natural packing material is used. The packing material should ideally be cheap and durable, and allow maximum microbiological activity and adsorption/absorption. The presence of nutrients and water is required for the biodegradation processes, while adsorption and absorption are enhanced by high specific surface area, high porosity, and low density. The main purpose of this work is the exploitation of locally available natural waste materials as packing media: heather (Erica lusitanica), chestnut bur (from Castanea sativa), peach pits (from Prunus persica), and eucalyptus bark (from Eucalyptus globulus). Preliminary batch tests of ammonia removal were performed in order to select the most interesting materials for biofiltration, which were then characterized. The following physical and chemical parameters were evaluated: density, moisture, pH, buffer capacity, and water retention capacity. Equilibrium isotherms were also determined and fitted to the Langmuir and Freundlich models; both models can fit the experimental results. Based both on the material's performance as an adsorbent and on its physical and chemical characteristics, eucalyptus bark was considered the best material, presenting a maximum adsorption capacity of 0.78±0.45 mol/kg for ammonia.
The results from its characterization are: density of 121 kg/m3, 9.8% moisture, pH of 5.7, buffer capacity of 0.370 mmol H+/kg of dry matter, and water retention capacity of 1.4 g H2O/g of dry matter. The application of locally available natural materials, with little processing, in biofiltration is an economical and sustainable alternative that should be explored. Keywords: ammonia removal, biofiltration, natural materials, odour control
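The Langmuir fit mentioned above can be sketched with the common linearized form, Ce/q = Ce/q_max + 1/(q_max·K_L), so that a straight-line fit in Ce yields both parameters. The data below are synthetic, generated from the reported q_max of 0.78 mol/kg and an assumed K_L; the concentration values and units are invented for the example.

```python
import numpy as np

def langmuir(c_eq, q_max, k_l):
    """Langmuir isotherm: q = q_max * K_L * Ce / (1 + K_L * Ce)."""
    return q_max * k_l * c_eq / (1 + k_l * c_eq)

# Synthetic equilibrium data from q_max = 0.78 mol/kg (reported) and
# an assumed K_L = 1.5 (illustrative); Ce values are invented.
c = np.array([0.05, 0.1, 0.2, 0.5, 1.0, 2.0, 4.0])
q = langmuir(c, 0.78, 1.5)

# Linearized form: Ce/q = Ce/q_max + 1/(q_max*K_L) -> a line in Ce.
slope, intercept = np.polyfit(c, c / q, 1)
q_max_fit = 1 / slope            # recovers 0.78
k_l_fit = slope / intercept      # recovers 1.5
```

With real batch data the same straight-line fit gives q_max from the slope, which is how a maximum adsorption capacity such as 0.78 mol/kg is typically extracted.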
Procedia PDF Downloads 368
34 Synthesis and Characterization of pH-Responsive Nanocarriers Based on POEOMA-b-PDPA Block Copolymers for RNA Delivery
Authors: Bruno Baptista, Andreia S. R. Oliveira, Patricia V. Mendonca, Jorge F. J. Coelho, Fani Sousa
Abstract:
Drug delivery systems are designed to provide adequate protection and controlled delivery of drugs to specific locations. These systems aim to reduce side effects and control the biodistribution profile of drugs, thus improving therapeutic efficacy. This study involved the synthesis of polymeric nanoparticles based on amphiphilic diblock copolymers comprising a biocompatible hydrophilic segment, poly(oligo(ethylene oxide) methyl ether methacrylate) (POEOMA), and a pH-sensitive block, poly(2-(diisopropylamino)ethyl methacrylate) (PDPA). The objective of this work was to develop polymeric pH-responsive nanoparticles to encapsulate and carry small RNAs as a model, with a view to developing non-coding RNA delivery systems with therapeutic value. The pH-responsiveness of PDPA allows the electrostatic interaction of these copolymers with nucleic acids at acidic pH, as a result of the protonation of the polymer's tertiary amine groups at pH values below its pKa (around 6.2). Initially, the molecular weight parameters and chemical structure of the block copolymers were determined by size exclusion chromatography (SEC) and nuclear magnetic resonance (1H-NMR) spectroscopy, respectively. Complexation with small RNAs was then verified, generating polyplexes with sizes ranging from 300 to 600 nm and encapsulation efficiencies around 80%, depending on the molecular weight of the polymers, their composition, and the concentration used. The effect of pH on the morphology of the nanoparticles was evaluated by scanning electron microscopy (SEM); at higher pH values, the particles tended to lose their spherical shape. Since this work aims to develop systems for the delivery of non-coding RNAs, studies of RNA protection (contact with RNase, FBS, and trypsin) and cell viability were also carried out. It was found that the nanoparticles confer some protection against constituents of the cellular environment and have no cellular toxicity.
In summary, this research contributes to the development of pH-sensitive polymers capable of protecting and encapsulating RNA in a relatively simple and efficient manner, to be further applied in drug delivery to specific sites where pH may play a critical role, as occurs in several cancer environments. Keywords: drug delivery systems, pH-responsive polymers, POEOMA-b-PDPA, small RNAs
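The pH response described above follows directly from the Henderson-Hasselbalch relation: below the pKa of ~6.2 the PDPA tertiary amines are mostly protonated (cationic, able to complex RNA electrostatically), and above it mostly neutral. A small numerical sketch:

```python
def fraction_protonated(ph, pka=6.2):
    """Henderson-Hasselbalch estimate of the protonated fraction of a base.

    pKa ~6.2 is the value quoted for PDPA in the abstract; the relation
    itself is the generic acid-base equilibrium, not polymer-specific.
    """
    return 1.0 / (1.0 + 10 ** (ph - pka))

for ph in (5.0, 6.2, 7.4):
    # ~94%, 50%, and ~6% protonated at pH 5.0, 6.2, and 7.4 respectively
    print(f"pH {ph}: {fraction_protonated(ph):.0%} protonated")
```

This steep swing between endosomal/tumour-like pH (~5-6.5) and physiological pH (7.4) is what makes PDPA useful both for loading RNA at acidic pH and for pH-triggered release.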
Procedia PDF Downloads 259
33 Effects of Soaking of Maize on the Viscosity of Masa and Tortilla Physical Properties at Different Nixtamalization Times
Authors: Jorge Martínez-Rodríguez, Esther Pérez-Carrillo, Diana Laura Anchondo Álvarez, Julia Lucía Leal Villarreal, Mariana Juárez Dominguez, Luisa Fernanda Torres Hernández, Daniela Salinas Morales, Erick Heredia-Olea
Abstract:
Maize tortillas are a staple food in Mexico, mostly made by nixtamalization, which includes the cooking and steeping of maize kernels under alkaline conditions. The cooking step of nixtamalization demands a great deal of energy and also generates nejayote, a water pollutant, at the end of the process. The aim of this study was to reduce the cooking time by adding a maize soaking step before nixtamalization while maintaining the quality properties of the masa and tortillas. Maize kernels were soaked for 36 h to increase their moisture up to 36%. Then, the effect of different cooking times (0, 5, 10, 15, 20, 25, 30, 35, 45-control, and 50 minutes) on the viscosity profile (RVA) of the masa was evaluated in order to select the treatments with a profile similar to the control. All treatments were left steeping overnight and milled under the same conditions. The treatments selected were the 20- and 25-minute cooking times, which had values of pasting temperature (79.23°C and 80.23°C), maximum viscosity (105.88 cP and 96.25 cP), and final viscosity (188.5 cP and 174 cP) similar to those of the 45-minute control (77.65°C, 110.08 cP, and 186.70 cP, respectively). Afterwards, tortillas were produced with the chosen treatments (20 and 25 min) and the control, and analyzed for texture, damaged starch, colorimetry, thickness, and average diameter. Colorimetric analysis of the tortillas only showed significant differences in the yellow/blue coordinate (b* parameter) at 20 min (0.885), unlike the 25-minute treatment (1.122). Luminosity (L*) and the red/green coordinate (a*) showed no significant differences between the treatments and the control (69.912 and 1.072, respectively); however, the 25-minute treatment was closer in both parameters (73.390 and 1.122) than the 20-minute one (74.08 and 0.884). For the colour difference (ΔE), the 25-minute value (3.84) was the most similar to the control.
However, for tortilla thickness and diameter, the 20-minute treatment, at 1.57 mm and 13.12 cm respectively, was closer to the control (1.69 mm and 13.86 cm), although smaller. The 25-minute treatment tortilla, in turn, was smaller than both the 20-minute treatment and the control, with a thickness of 1.51 mm and a diameter of 13.59 cm. According to the texture analyses, there was no difference in stretchability (8.803-10.308 gf) or distance to break (95.70-126.46 mm) among the treatments. However, for the breaking point, both treatments (317.1 gf and 276.5 gf for the 25- and 20-minute treatments, respectively) were significantly different from the control tortilla (392.2 gf). The results suggest that, by adding a soaking step and reducing the cooking time by up to 25 minutes, the masa and tortillas obtained have functional and textural properties similar to those of the traditional nixtamalization process. Keywords: tortilla, nixtamalization, corn, lime cooking, RVA, colorimetry, texture, masa rheology
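The colour difference quoted in this entry is the CIE76 ΔE computed from L*a*b* coordinates. The sketch below uses the L* and a* values given in the abstract; the control b* is not reported, so an illustrative value is assumed, which is why the result differs slightly from the reported 3.84.

```python
import math

def delta_e_ab(lab1, lab2):
    """CIE76 colour difference between two (L*, a*, b*) triples."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(lab1, lab2)))

# L* and a* from the abstract; the control b* is NOT reported there,
# so 1.0 is an assumed placeholder value for illustration only.
control = (69.912, 1.072, 1.0)
t25 = (73.390, 1.122, 1.122)   # 25-minute treatment
print(round(delta_e_ab(control, t25), 2))  # -> 3.48
```

As the computation shows, ΔE is dominated here by the luminosity difference; a ΔE below roughly 3-5 units is commonly treated as barely perceptible, consistent with the 25-minute treatment matching the control.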
Procedia PDF Downloads 176
32 The Effect of Emotional Stimuli Related to Body Imbalance in Postural Control and the Phenomenological Experience of Young Healthy Adults
Authors: David Martinez-Pernia, Alvaro Rivera-Rei, Alejandro Troncoso, Gonzalo Forno, Andrea Slachevsky, David Huepe, Victoria Silva-Mack, Jorge Calderon, Mayte Vergara, Valentina Carrera
Abstract:
Background: Recent theories in the field of emotions have taken the relevance of motor control beyond a system related to personal autonomy (walking, running, grooming) and integrated it into the emotional dimension. However, to the best of our knowledge, no studies have specifically investigated how emotional stimuli related to motor control modify emotional states in terms of postural control and phenomenological experience. Objective: The main aim of this work is to investigate the emotions produced by stimuli of bodily imbalance (neutral, pleasant, and unpleasant) in the postural control and phenomenological experience of young, healthy adults. Methodology: 46 healthy young people are shown emotional videos (neutral, pleasant, motor unpleasant, and non-motor unpleasant) related to body imbalance. During the stimulation period of each video (60 seconds), the participant stands on a force platform that collects temporal and spatial postural control data. In addition, the electrophysiological activity of the heart and the electrodermal activity are recorded. For the two unpleasant conditions (motor versus non-motor), a phenomenological interview is carried out to collect the subjective experience of emotion and body perception. Results: The pleasant and unpleasant emotional videos produced significant changes with respect to the neutral condition in terms of greater area, higher mean velocity, and greater mean frequency power on the anterior-posterior axis. Regarding the electrodermal response, the pleasant and unpleasant conditions produced a significant increase in the phasic component with respect to the neutral condition. Regarding the electrophysiology of the heart, no significant change was found in any condition. The phenomenological experiences in the two unpleasant conditions differed in body perception and in the emotional meaning of the experience.
Conclusion: Emotional stimuli related to bodily imbalance produce changes in postural control, electrodermal activity, and phenomenological experience. This experimental setting could be relevant for people with motor disorders (Parkinson's disease, stroke, TBI), to understand how emotions affect motor control. Keywords: body imbalance stimuli, emotion, phenomenological experience, postural control
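The postural-control measures reported in this entry (e.g., mean velocity of the centre-of-pressure trace) are simple functions of the force-platform samples. A generic sketch on a synthetic 60-second trace follows; the sampling rate, amplitude, and sway frequency are invented for the example, not the study's data.

```python
import numpy as np

def sway_metrics(cop_xy, fs=100.0):
    """Mean velocity and path length of a centre-of-pressure (CoP) trace.

    cop_xy: (N, 2) array of CoP samples in cm; fs: sampling rate in Hz.
    Mean velocity = total path length / recording duration.
    """
    steps = np.diff(cop_xy, axis=0)
    path = np.sum(np.hypot(steps[:, 0], steps[:, 1]))
    duration = (len(cop_xy) - 1) / fs
    return path / duration, path

# Synthetic 60 s trace: a small anterior-posterior oscillation
# (0.5 cm amplitude at 0.4 Hz), sampled at 100 Hz — illustrative values.
t = np.arange(0, 60, 1 / 100.0)
cop = np.column_stack([0.5 * np.sin(2 * np.pi * 0.4 * t), np.zeros_like(t)])
mean_vel, path_len = sway_metrics(cop)   # ~0.8 cm/s, ~48 cm
```

Sway area and the anterior-posterior frequency power reported in the abstract are computed from the same trace, via the convex hull (or confidence ellipse) of the samples and the power spectrum of the anterior-posterior component, respectively.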
Procedia PDF Downloads 173
31 Electroforming of 3D Digital Light Processing Printed Sculptures Used as a Low Cost Option for Microcasting
Authors: Cecile Meier, Drago Diaz Aleman, Itahisa Perez Conesa, Jose Luis Saorin Perez, Jorge De La Torre Cantero
Abstract:
In this work, two ways of creating small-sized metal sculptures are proposed: the first by means of microcasting and the second by electroforming, from models printed in 3D using an FDM (Fused Deposition Modeling) printer or a DLP (Digital Light Processing) printer. It is viable to replace wax in artistic foundry processes with 3D-printed objects. In this technique, the digital models are manufactured using a low-cost FDM 3D printer in polylactic acid (PLA). This material is used because its properties make it a viable substitute for wax in lost-wax artistic casting with the ceramic shell technique. This technique consists of covering a sculpture of wax, or in this case PLA, with several layers of thermo-resistant material. This material is heated to melt the PLA, obtaining an empty mold that is later filled with the molten metal. It is verified that the PLA models reduce cost and time compared with hand modeling in wax. In addition, one can manufacture parts with 3D printing that are not possible to create with manual techniques. However, the sculptures created with this technique have a size limit. The problem is that when pieces printed in PLA are very small, they lose detail, and the laminar texture hides the shape of the piece. A DLP-type printer allows obtaining more detailed and smaller pieces than the FDM printer. Such small models are quite difficult and complex to melt out using the ceramic shell lost-wax technique. As alternatives, there are microcasting and electroforming, which specialize in creating small metal pieces such as jewelry. Microcasting is a variant of lost-wax casting that consists of placing the model in a cylinder into which the refractory material is also poured. The molds are heated in an oven to melt out the model and cure the refractory material.
Finally, the metal is poured into the still-hot cylinders, which rotate in a machine at high speed to distribute the metal properly. Because microcasting requires expensive materials and machinery to melt metal, electroforming is an alternative to this process. Electroforming can use models in different materials; for this study, 3D-printed micro-sculptures are used. These are subjected to an electroforming bath that covers the pieces with a very thin layer of metal. This work investigates the recommended sizes for 3D printing, both with PLA and resin, and first tests are being done to validate the use of electroforming on micro-sculptures printed in resin using a DLP printer.
Keywords: sculptures, DLP 3D printer, microcasting, electroforming, fused deposition modeling
Procedia PDF Downloads 135
30 The Effect of Physical Guidance on Learning a Tracking Task in Children with Cerebral Palsy
Authors: Elham Azimzadeh, Hamidollah Hassanlouei, Hadi Nobari, Georgian Badicu, Jorge Pérez-Gómez, Luca Paolo Ardigò
Abstract:
Children with cerebral palsy (CP) have weak physical abilities, and their limitations may affect the performance of everyday motor activities. One of the most important and common debilitating factors in CP is the malfunction of the upper extremities in performing motor skills, and there is strong evidence that task-specific training may improve general upper-limb function in this population. Moreover, augmented feedback enhances the acquisition and learning of a motor task. Practice conditions may alter the difficulty; e.g., a reduced frequency of physical guidance (PG) could make learning a motor task more challenging for this population. Thus, the purpose of this study was to investigate the effect of PG on learning a tracking task in children with CP. Twenty-five independently ambulant children with spastic hemiplegic CP aged 7-15 years were randomly assigned to five groups. After the pre-test, the experimental groups participated in an intervention for eight sessions, with 12 trials during each session. The 0% PG group received no PG; the 25% PG group received PG for three trials; the 50% PG group received PG for six trials; the 75% PG group received PG for nine trials; and the 100% PG group received PG for all 12 trials. PG consisted of placing the experimenter's hand around the child's hand, guiding them to stay on track and complete the task. Learning was inferred from acquisition and delayed retention tests. In the tests, all participants performed two blocks of 12 trials of the tracking task without any PG. They were asked to make the movement as accurate as possible (i.e., with fewer errors), and the total number of touches (errors) over the 24 trials was taken as the test score. The results showed that a higher frequency of PG led to more accurate performance during the practice phase. However, the group that received 75% PG performed significantly better than the other groups in the retention phase.
It is concluded that the frequency of PG played a critical role in learning a tracking task in children with CP, and that this population may benefit from an optimal level of PG providing an appropriate amount of information, in line with the challenge point framework (CPF), which states that too much or too little information will retard the learning of a motor skill. Therefore, an optimal level of PG may help these children identify appropriate motor-skill patterns using the extrinsic information they receive through PG, and improve learning by activating intrinsic feedback mechanisms.
Keywords: cerebral palsy, challenge point framework, motor learning, physical guidance, tracking task
Procedia PDF Downloads 70
29 Extraction and Quantification of Peramine Present in Dalaca pallens, a Pest of Grassland in Southern Chile
Authors: Leonardo Parra, Daniel Martínez, Jorge Pizarro, Fernando Ortega, Manuel Chacón-Fuentes, Andrés Quiroz
Abstract:
Control of Dalaca pallens, or blackworms, one of the most important hypogeous pests of grassland in southern Chile, is based on the use of broad-spectrum insecticides such as organophosphates and pyrethroids. However, the rapid development of insecticide resistance in field populations of this insect and public concern over the environmental impact of these insecticides have resulted in the search for other control methods. Specifically, the use of endophytic fungi for pest control has emerged as an interesting and promising strategy. Endophytes from ryegrass (Lolium perenne) establish a biotrophic relationship with the host, defined as mutualistic symbiosis. The plant-fungus association produces alkaloids, among which peramine is the main toxic substance against Listronotus bonariensis, the most important epigean pest of ryegrass. Nevertheless, to our knowledge, the effect of peramine on other pest insects, such as D. pallens, has not been studied, nor has its possible metabolization in the larval body. Therefore, we addressed the following research question: do larvae of D. pallens store peramine after consuming endophyte-infected (E+) ryegrass? For this, specimens of blackworms were fed with ryegrass plants of seven experimental lines and one endophyte-free (E-) commercial cultivar sown at the Instituto de Investigaciones Agropecuarias Carillanca (Vilcún, Chile). Once the feeding period was over, ten larvae from each treatment were examined. Individuals were dissected, and their guts were removed to exclude any influence of remaining plant material. The rest of each larva's body was dried at 60°C for 24-48 h and ground into a fine powder using a mortar. A 25 mg portion of dry powder was transferred to a microcentrifuge tube and extracted in 1 mL of a mixture of methanol:water:formic acid. The samples were then centrifuged at 16,000 rpm for 3 min, and the supernatant was collected and injected into a high-performance liquid chromatograph (HPLC).
The results confirmed the presence of peramine in the bodies of D. pallens larvae. The insects fed on the experimental lines LQE-2 and LQE-6 showed the highest peramine concentrations (0.205 and 0.199 ppm, respectively), while those fed on LQE-7 and LQE-3 showed the lowest concentrations of the alkaloid (0.047 and 0.053 ppm, respectively). Peramine was not detected in the insects when the control cultivar Jumbo (E-) was tested. These results evidence the storage and metabolism of peramine by the larvae during consumption. However, the effect of this alkaloid present in future ryegrass cultivars (LQE-2 and LQE-6) on the performance and survival of blackworms must be studied and confirmed experimentally.
Keywords: blackworms, HPLC, alkaloid, pest
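Concentrations like those reported (0.047-0.205 ppm) are typically obtained by converting HPLC peak areas through an external-standard calibration curve. The sketch below illustrates that conversion step in Python; the standard concentrations, peak areas, and function name are hypothetical, not the authors' actual calibration data.

```python
import numpy as np

# Hedged sketch (assumed workflow, not the authors' protocol): converting
# HPLC peak areas to peramine concentrations via external-standard calibration.
def quantify_ppm(sample_areas, standard_ppm, standard_areas):
    """Fit area = m*conc + b on the standards, then invert for the samples."""
    m, b = np.polyfit(standard_ppm, standard_areas, 1)
    return [(a - b) / m for a in sample_areas]

# Hypothetical standards spanning the reported range of concentrations.
std_ppm = [0.05, 0.10, 0.20, 0.25]
std_area = [50.0, 100.0, 200.0, 250.0]  # perfectly linear, for illustration
print(quantify_ppm([205.0, 47.0], std_ppm, std_area))
```

In practice, a real calibration series would include replicate injections and a check of linearity (r²) before inverting the fit.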
Procedia PDF Downloads 304
28 Knowledge Graph Development to Connect Earth Metadata and Standard English Queries
Authors: Gabriel Montague, Max Vilgalys, Catherine H. Crawford, Jorge Ortiz, Dava Newman
Abstract:
There has never been so much publicly accessible atmospheric and environmental data. The possibilities of these data are exciting, but the sheer volume of available datasets represents a new challenge for researchers. The task of identifying and working with a new dataset has become more difficult with the amount and variety of available data. Datasets are often documented in ways that differ substantially from the common English used to describe the same topics. This presents a barrier not only for new scientists, but also for researchers looking to draw comparisons across multiple datasets or specialists from other disciplines hoping to collaborate. This paper proposes a method for addressing this obstacle: creating a knowledge graph to bridge the gap between everyday English and the technical language surrounding these datasets. Knowledge graph generation is already a well-established field, although working with Earth data poses some unique challenges. One is the sheer size of the databases: it would be infeasible to replicate or analyze all the data stored by an organization like the National Aeronautics and Space Administration (NASA) or the European Space Agency. Instead, this approach identifies topics from the metadata available for datasets in NASA's Earthdata database, which can then be used to request and access the raw data directly from NASA. By starting with a single metadata standard, this paper establishes an approach that can be generalized to different databases, but leaves the challenge of metadata harmonization for future work. Topics generated from the metadata are then linked to topics from a collection of English queries through a variety of standard and custom natural language processing (NLP) methods. The results from this method are then compared to a baseline of Elasticsearch applied to the metadata.
This comparison shows the benefits of the proposed knowledge graph system over existing methods, particularly in interpreting natural language queries and interpreting topics in metadata. For the research community, this work introduces an application of NLP to the ecological and environmental sciences, expanding the possibilities of how machine learning can be applied in this discipline. But perhaps more importantly, it establishes the foundation for a platform that can enable common English to access knowledge that previously required considerable effort and experience. By making this public data accessible to the full public, this work has the potential to transform environmental understanding, engagement, and action.Keywords: earth metadata, knowledge graphs, natural language processing, question-answer systems
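One ingredient of linking English queries to metadata topics is a lexical similarity measure. The minimal sketch below uses bag-of-words cosine similarity to match a query to the closest dataset description; the dataset identifiers and descriptions are hypothetical, and this is far simpler than the knowledge-graph pipeline the abstract describes.

```python
import math
from collections import Counter

# Minimal sketch (assumed, not the authors' pipeline): match a plain-English
# query to dataset metadata by bag-of-words cosine similarity.
def cosine(a, b):
    ca, cb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(ca[t] * cb[t] for t in ca)
    na = math.sqrt(sum(v * v for v in ca.values()))
    nb = math.sqrt(sum(v * v for v in cb.values()))
    return dot / (na * nb) if na and nb else 0.0

metadata_topics = {  # hypothetical dataset ids and metadata summaries
    "MOD11A1": "land surface temperature daily global grid",
    "GPM_3IMERGDF": "precipitation daily merged satellite estimate",
}
query = "daily temperature of the land surface"
best = max(metadata_topics, key=lambda k: cosine(query, metadata_topics[k]))
print(best)  # MOD11A1 scores highest for this query
```

A knowledge-graph approach goes beyond such surface matching by linking synonymous and related topics, which is precisely where the abstract reports gains over an Elasticsearch baseline.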
Procedia PDF Downloads 147
27 Topographic Coast Monitoring Using UAV Photogrammetry: A Case Study in Port of Veracruz Expansion Project
Authors: Francisco Liaño-Carrera, Jorge Enrique Baños-Illana, Arturo Gómez-Barrero, José Isaac Ramírez-Macías, Erik Omar Paredes-Juárez, David Salas-Monreal, Mayra Lorena Riveron-Enzastiga
Abstract:
Topographical changes in coastal areas are usually assessed with airborne LIDAR and conventional photogrammetry. In recent times, unmanned aerial vehicles (UAVs) have been used in several photogrammetric applications, including coastline evolution. However, their use goes further: the associated point clouds can be used to generate beach digital elevation models (DEMs). We present a methodology for monitoring coastal topographic changes along a 50 km coastline in Veracruz, Mexico, using high-resolution images (less than 10 cm ground resolution) and dense point clouds captured with a UAV. This monitoring takes place in the context of the port of Veracruz expansion project, whose construction began in 2015, and intends to characterize coastal evolution and to prevent and mitigate project impacts on coastal environments. The monitoring began with a historical coastline reconstruction from 1979 to 2015 using aerial photography and Landsat imagery. We could define some patterns: the northern part of the study area showed accretion, while the southern part showed erosion. The study area lies off the port of Veracruz, a touristic and economically important Mexican city where coastal development structures have been built continuously since 1979, and the local beaches of the touristic area are refilled constantly. Those areas were not classified as accretion since sand-filled trucks refill the beaches in front of the hotel area every month. The marinas and the commercial port of Veracruz, both the old port and the new expansion, were built in the eroding part of the area. Northward from the city of Veracruz, the beaches were described as accretion areas, while southward they were described as erosion areas. One of the problems is the expansion of new development in the southern area of the city, using the beach view as an incentive to buy beachfront houses.
We assessed coastal changes between seasons during 2016 using high-resolution images and point clouds, and preliminary results confirm that UAVs can be used in permanent coastal monitoring programs with excellent performance and detail.
Keywords: digital elevation model, high-resolution images, topographic coast monitoring, unmanned aerial vehicle
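Once two co-registered DEM rasters exist for different seasons, erosion and accretion volumes follow from cell-by-cell differencing. The sketch below illustrates that step with NumPy; it is an assumed workflow, not the project's actual software, and the tiny grids are placeholders for real rasters.

```python
import numpy as np

# Hedged sketch (assumed workflow): seasonal change from two co-registered
# DEMs. Positive differences are accretion, negative are erosion.
def dem_change(dem_t0, dem_t1, cell_area_m2):
    diff = np.asarray(dem_t1, float) - np.asarray(dem_t0, float)
    accretion = diff[diff > 0].sum() * cell_area_m2   # m^3 of sand gained
    erosion = -diff[diff < 0].sum() * cell_area_m2    # m^3 of sand lost
    return accretion, erosion

t0 = [[1.0, 1.0], [1.0, 1.0]]  # hypothetical elevations (m), season 1
t1 = [[1.5, 1.0], [0.5, 1.0]]  # hypothetical elevations (m), season 2
print(dem_change(t0, t1, cell_area_m2=0.01))  # (0.005, 0.005)
```

With sub-10 cm ground resolution, `cell_area_m2` would be on the order of 0.01 m² per cell, as in the example.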
Procedia PDF Downloads 270
26 Analyzing the Impact of Bariatric Surgery in Obesity Associated Chronic Kidney Disease: A 2-Year Observational Study
Authors: Daniela Magalhaes, Jorge Pedro, Pedro Souteiro, Joao S. Neves, Sofia Castro-Oliveira, Vanessa Guerreiro, Rita Bettencourt-Silva, Maria M. Costa, Ana Varela, Joana Queiros, Paula Freitas, Davide Carvalho
Abstract:
Introduction: Obesity is an independent risk factor for renal dysfunction. Our aims were: (1) to evaluate the impact of bariatric surgery (BS) on renal function; (2) to clarify the factors determining the postoperative evolution of the glomerular filtration rate (GFR); (3) to assess the occurrence of oxalate-mediated renal complications. Methods: We investigated a cohort of 1448 obese patients who underwent bariatric surgery. Those with a basal GFR (GFR0) < 30 mL/min or without information about the GFR 2 years post-surgery (GFR2) were excluded. Results: We included 725 patients, of whom 647 (89.2%) were women, with a median age of 41 (IQR 34-51) years, a median weight of 112.4 (IQR 103.0-125.0) kg, and a median BMI of 43.4 (IQR 40.6-46.9) kg/m2. Of these, 459 (63.3%) underwent gastric bypass (RYGB), 144 (19.9%) received an adjustable gastric band (AGB), and 122 (16.8%) underwent vertical gastrectomy (VG). At 2 years post-surgery, excess weight loss (EWL) was 60.1 (IQR 43.7-72.4)%. There was a significant improvement in metabolic and inflammatory status, as well as a significant decrease in the proportion of patients with diabetes, arterial hypertension, and dyslipidemia (p < 0.0001). At baseline, 38 (5.2%) subjects had hyperfiltration with a GFR0 ≥ 125 mL/min/1.73m2, 492 (67.9%) had a GFR0 of 90-124 mL/min/1.73m2, 178 (24.6%) had a GFR0 of 60-89 mL/min/1.73m2, and 17 (2.3%) had a GFR0 < 60 mL/min/1.73m2. GFR decreased in 63.2% of patients with hyperfiltration (ΔGFR = -2.5 ± 7.6), and increased in 96.6% (ΔGFR = 22.2 ± 12.0) and 82.4% (ΔGFR = 24.3 ± 30.0) of the subjects with GFR0 60-89 and < 60 mL/min/1.73m2, respectively (p < 0.0001). This trend was maintained when adjustment was made for the type of surgery performed. Of 321 patients, 10 (3.3%) had a urinary albumin excretion (UAE) > 300 mg/dL (A3), 44 (14.6%) had a UAE of 30-300 mg/dL (A2), and 247 (82.1%) had a UAE < 30 mg/dL (A1). Albuminuria decreased after surgery, and at the 2-year follow-up only 1 (0.3%) patient had A3, 17 (5.6%) had A2, and 283 (94%) had A1 (p < 0.0001).
In multivariate analysis, the variables independently associated with ΔGFR were BMI (positively) and fasting plasma glucose (negatively). During the 2-year follow-up, only 57 of the 725 patients had transient urinary excretion of calcium oxalate crystals. None had records of oxalate-mediated renal complications at our center. Conclusions: The evolution of GFR after BS seems to depend on initial renal function, as it decreases in subjects with hyperfiltration but tends to increase in those with renal dysfunction. Our results suggest that BS is associated with improved renal outcomes, without a significant increase in renal complications. Thus, apart from the clear benefits in metabolic and inflammatory status, obese adults with non-dialysis-dependent CKD should perhaps be referred for bariatric surgery evaluation.
Keywords: albuminuria, bariatric surgery, glomerular filtration rate, renal function
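The baseline stratification used above follows fixed GFR cut-offs (125, 90, and 60 mL/min/1.73m2). A minimal sketch of that grouping, with hypothetical (GFR0, GFR2) pairs rather than study data:

```python
# Sketch of the baseline GFR stratification described in the abstract
# (thresholds in mL/min/1.73 m2); patient tuples below are hypothetical.
def gfr_category(gfr):
    if gfr >= 125:
        return "hyperfiltration"
    if gfr >= 90:
        return "90-124"
    if gfr >= 60:
        return "60-89"
    return "<60"

for gfr0, gfr2 in [(130.0, 127.5), (95.0, 98.0), (70.0, 92.2), (55.0, 79.3)]:
    print(gfr_category(gfr0), round(gfr2 - gfr0, 1))  # category and ΔGFR
```

Grouping ΔGFR by these baseline categories is what exposes the pattern the abstract reports: decreases in the hyperfiltration group, increases in the lower categories.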
Procedia PDF Downloads 359
25 Biofiltration Odour Removal at Wastewater Treatment Plant Using Natural Materials: Pilot Scale Studies
Authors: D. Lopes, I. I. R. Baptista, R. F. Vieira, J. Vaz, H. Varela, O. M. Freitas, V. F. Domingues, R. Jorge, C. Delerue-Matos, S. A. Figueiredo
Abstract:
Deodorization is nowadays a need in wastewater treatment plants. Nitrogen and sulphur compounds, volatile fatty acids, aldehydes, and ketones are responsible for the unpleasant odours, with ammonia, hydrogen sulphide, and mercaptans being the most common pollutants. Although chemical treatments of the extracted air are efficient, they are more expensive than biological treatments, namely due to the use of chemical reagents (commonly sulphuric acid, sodium hypochlorite, and sodium hydroxide). Biofiltration offers the advantage of avoiding reagents (only in some cases are nutrients added in order to increase treatment efficiency) and can be considered a sustainable process when the packing medium used is of natural origin. In this work, the application of some locally available natural materials was studied both at laboratory and pilot scale, in a real wastewater treatment plant. The materials selected for this study were indigenous Portuguese forest materials derived from eucalyptus and pinewood, such as woodchips and bark; coconut fiber was also used for comparison purposes. Their physico-chemical characterization was performed: density, moisture, pH, buffering capacity, and water retention capacity. Laboratory studies involved batch adsorption experiments for ammonia and hydrogen sulphide removal and evaluation of microbiological activity. Four pilot-scale biofilters (1 cubic meter in volume) were installed at a local wastewater treatment plant, treating odours from the effluent receiving chamber. Each biofilter contained a different packing material consisting of mixtures of eucalyptus bark, pine woodchips, and coconut fiber, with added buffering agents and nutrients. The odour treatment efficiency was monitored over time, as well as other operating parameters. The operation at pilot scale suggested that, among the processes involved in biofiltration (adsorption, absorption, and biodegradation), the first dominates at the beginning, while the biofilm is developing.
When the biofilm is completely established and the adsorption capacity of the material is reached, biodegradation becomes the most relevant odour removal mechanism. High odour and hydrogen sulphide removal efficiencies were achieved throughout the testing period (over 6 months), confirming the suitability of the selected materials, and of the mixtures prepared from them, for biofiltration applications.
Keywords: ammonia and hydrogen sulphide removal, biofiltration, natural materials, odour control in wastewater treatment plants
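The removal efficiencies monitored in such campaigns are computed from inlet and outlet concentrations; elimination capacity additionally normalizes by bed volume and airflow. The formulas below are standard biofiltration metrics expressed in Python; the numeric values are hypothetical, not measurements from this plant.

```python
# Illustrative biofiltration metrics (hypothetical values, not plant data).
def removal_efficiency(c_in, c_out):
    """Percentage of pollutant removed across the bed."""
    return 100.0 * (c_in - c_out) / c_in

def elimination_capacity(c_in_g_m3, c_out_g_m3, flow_m3_h, bed_volume_m3):
    """Grams of pollutant removed per m3 of packing per hour."""
    return flow_m3_h * (c_in_g_m3 - c_out_g_m3) / bed_volume_m3

print(removal_efficiency(25.0, 1.5))                   # 94.0 (% H2S removed)
print(elimination_capacity(0.020, 0.001, 500.0, 1.0))  # g/(m3·h) for a 1 m3 bed
```

Tracking both metrics over time is what distinguishes the initial adsorption-dominated phase from the later biodegradation-dominated one.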
Procedia PDF Downloads 302
24 De novo Transcriptome Assembly of Lumpfish (Cyclopterus lumpus L.) Brain Towards Understanding their Social and Cognitive Behavioural Traits
Authors: Likith Reddy Pinninti, Fredrik Ribsskog Staven, Leslie Robert Noble, Jorge Manuel de Oliveira Fernandes, Deepti Manjari Patel, Torstein Kristensen
Abstract:
Understanding fish behavior is essential to improve animal welfare in aquaculture research. Behavioral traits can have a strong influence on fish health and habituation. To identify the genes and biological pathways responsible for lumpfish behavior, we performed an experiment to understand the interspecies relationship (mutualism) between lumpfish and salmon. We also tested the correlation between gene expression data and observational/physiological data to identify the essential genes that trigger stress and swimming behavior in lumpfish. After de novo assembly of the brain transcriptome, all the samples were individually mapped to the available lumpfish (Cyclopterus lumpus L.) primary genome assembly (fCycLum1.pri, GCF_009769545.1). Out of ~16,749 genes expressed in the brain samples, we found 267 to be statistically significant (P < 0.05), distributed among the odor vs. control (1), model vs. control (41), and salmon vs. control (225) comparisons. However, only eight genes had |logFC| ≥ 0.5; these are considered differentially expressed genes (DEGs). Thus, we were unable to identify differential genes related to behavioral traits from the RNA-Seq data analysis alone. We then performed a correlation analysis between the gene expression data and the observational/physiological data (serotonin (5-HT), dopamine (DA), 3,4-dihydroxyphenylacetic acid (DOPAC), 5-hydroxyindoleacetic acid (5-HIAA), and noradrenaline (NORAD)). We found 2,495 genes to be significant (P < 0.05), and among these, 1,587 genes were positively correlated with the noradrenaline (NORAD) hormone group. This suggests that noradrenaline triggers the changes in pigmentation and skin color in lumpfish. Genes related to behavioral traits (rhythmic, locomotory, feeding, visual, pigmentation, stress, response to other organisms, and taxis) and to dopamine and other neurotransmitter synthesis were also obtained from the correlation analysis.
In the KEGG pathway enrichment analysis, we found important pathways, such as the calcium signaling pathway and adrenergic signaling in cardiomyocytes, both involved in cell signaling, behavior, emotion, and stress. Calcium is an essential signaling molecule in brain cells and could affect fish behavior. Our results suggest that changes in calcium homeostasis and adrenergic receptor binding activity lead to changes in fish behavior during stress.
Keywords: behavior, De novo, lumpfish, salmon
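The key analytical step above, correlating per-gene expression with a physiological measure (e.g., noradrenaline) across samples, can be sketched as follows. This is an assumed, simplified form of such an analysis (Pearson correlation on made-up values), not the study's code or data.

```python
import numpy as np

# Sketch of the correlation step (assumed form, hypothetical values):
# per-gene expression vs. a physiological measure across samples.
def pearson_r(x, y):
    x, y = np.asarray(x, float), np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return float((xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc)))

norad = [1.0, 2.0, 3.0, 4.0]                   # hypothetical hormone levels
gene_expr = {"geneA": [2.0, 4.0, 6.0, 8.0],    # strongly correlated
             "geneB": [5.0, 1.0, 4.0, 2.0]}    # weakly related
correlated = [g for g, e in gene_expr.items() if pearson_r(e, norad) > 0.9]
print(correlated)  # ['geneA']
```

In the actual analysis, each correlation would also carry a P-value, and only genes passing the significance threshold would be retained.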
Procedia PDF Downloads 173
23 Molecular Detection of Staphylococcus aureus in the Pork Chain Supply and the Potential Anti-Staphylococcal Activity of Natural Compounds
Authors: Valeria Velasco, Ana M. Bonilla, José L. Vergara, Alcides Lofa, Jorge Campos, Pedro Rojas-García
Abstract:
Staphylococcus aureus is both a commensal bacterium and an opportunistic pathogen that can cause different diseases in humans and can rapidly develop antimicrobial resistance. Since this bacterium has the ability to colonize the nares and skin of humans and animals, there is a risk of food contamination at different steps of the supply chain. Emerging strains, such as methicillin-resistant S. aureus (MRSA), have been detected in food-producing animals and meat. The aim of this study was to determine the prevalence and oxacillin susceptibility of S. aureus in the pork chain supply in Chile and to suggest some natural antimicrobials for its control. A total of 487 samples were collected from pigs (n=332), carcasses (n=85), and retail pork meat (n=70). Presumptive S. aureus colonies were isolated by selective enrichment and culture media. Confirmation was carried out by biochemical testing (Api® Staph) and PCR (detection of the nuc and mecA genes, associated with S. aureus and methicillin resistance, respectively). Oxacillin (the β-lactam antibiotic that replaced methicillin) susceptibility was assessed by the minimum inhibitory concentration (MIC) using the Epsilometer test (Etest). A preliminary assay was carried out to test thymol, carvacrol, oregano essential oil (Origanum vulgare L.), and maqui or Chilean wineberry extract (Aristotelia chilensis (Mol.) Stuntz) as anti-staphylococcal agents using the disc diffusion method at different concentrations. The overall prevalence of S. aureus in the pork chain supply reached 33.9%. A higher prevalence of S. aureus was determined in carcasses (56.5%) than in pigs (28.3%) and pork meat (32.9%) (P ≤ 0.05). The prevalence of S. aureus in pigs sampled at farms (40.6%) was higher than in pigs sampled at slaughterhouses (23.3%) (P ≤ 0.05). The contamination of non-packaged meat with S. aureus (43.1%) was higher than that of packaged meat (5.3%) (P ≤ 0.05). The mecA gene was not detected in the S.
aureus strains isolated in this study. Two S. aureus strains exhibited oxacillin resistance (MIC ≥ 4 µg/mL). Anti-staphylococcal activity was detected in solutions of thymol, carvacrol, and oregano essential oil at all concentrations tested. No anti-staphylococcal activity was detected in the maqui extract. In conclusion, S. aureus is present in the pork chain supply in Chile. Although the mecA gene was not detected, oxacillin resistance was found in S. aureus and could be attributed to another resistance mechanism. Thymol, carvacrol, and oregano essential oil could be used as anti-staphylococcal agents at low concentrations. Research project Fondecyt No. 11140379.
Keywords: antimicrobials, mecA gene, nuc gene, oxacillin susceptibility, pork meat
Procedia PDF Downloads 228
22 The Genus Bacillus, Effect on Commercial Crops of Colombia
Authors: L. C. Sánchez, L. C. Corrales, A. G. Lancheros, E. Castañeda, Y. Ariza, L. S. Fuentes, L. Sierra, J. L. Cuervo
Abstract:
The importance of environmentally friendly alternatives in agricultural processes is the reason why the research group Ceparium, of the Colegio Mayor de Cundinamarca University, Colombia, investigated the genus Bacillus and its applicability for improving crops of economic importance in Colombia. In this investigation, we present a study in which the genus Bacillus plays a leading role as a beneficial microorganism. The objective was to identify the biochemical potential of three indigenous species of Bacillus, which were able to carry out biological control actions against pathogens and pests or to promote growth and improve crop productivity in Colombia. The procedures were performed in three phases. In the first, biomass of an indigenous strain and a reference strain was produced in culture media designed for spore and toxin production. Spores were counted in a Neubauer chamber, concentrations of Bacillus sphaericus spores were prepared, and a bioassay was performed at the Laboratory of Entomology of the University Jorge Tadeo Lozano on Plutella xylostella larvae, an insect pest of crucifers in several Colombian regions. The second phase included the extraction, from liquid-state fermentation, of a secondary metabolite with antibiosis action against fungi, called iturin B, obtained from strains of Bacillus subtilis. The molecule was identified using high-performance liquid chromatography (HPLC), and its biocontrol effect on Fusarium sp., a fungus that causes vascular wilt in economically important plant varieties, was confirmed using antagonism tests in Petri dishes. In the third phase, an initial procedure was used to recover and identify microorganisms of the genus Bacillus from the rhizosphere of two aromatic herbs, Rosmarinus officinalis and Thymus vulgaris L.
Subsequently, antagonism tests against Fusarium sp. were performed, and an assay was conducted under greenhouse conditions to observe biocontrol and growth-promoting action by comparing growth in length and dry weight. In the first experiment, native Bacillus sphaericus was lethal to 92% of Plutella xylostella larvae within 10 days after application. In the second experiment, iturin B was identified and biological control of Fusarium sp. was demonstrated. In the third study, all strains demonstrated biological control, and the B14 strain, identified as Bacillus megaterium, increased root length and the productivity of the two plants in terms of weight. It was concluded that native microorganisms of the genus Bacillus have great biochemical potential, providing beneficial interactions with plants that improve their growth and development and therefore have a greater impact on production.
Keywords: genus Bacillus, biological control, PGPRs, biochemical potential
Procedia PDF Downloads 435
21 A Bayesian Approach for Health Workforce Planning in Portugal
Authors: Diana F. Lopes, Jorge Simoes, José Martins, Eduardo Castro
Abstract:
Health professionals are the keystone of any health system, delivering health services to the population. Given the time and cost involved in training new health professionals, the planning process of the health workforce is particularly important, as it ensures a proper balance between the supply and demand of these professionals and plays a central role in the Health 2020 policy. In the past 40 years, the planning of the health workforce in Portugal has been conducted in a reactive way, lacking a prospective vision based on an integrated, comprehensive, and valid analysis. This situation may compromise not only productivity and overall socio-economic development but also the quality of the healthcare services delivered to patients. This is even more critical given the expected shortage of the health workforce in the future. Furthermore, Portugal is facing the aging of some professional classes (physicians and nurses). In 2015, 54% of physicians in Portugal were over 50 years old, and 30% were over 60 years old. This phenomenon, associated with increasing emigration of young health professionals and changes in citizens' illness profiles and expectations, must be considered when planning healthcare resources. The prospect of sudden retirement of large groups of professionals in a short time is also a major problem to address. Another challenge is health workforce imbalance: Portugal has one of the lowest nurse-to-physician ratios, 1.5, below the European Region and OECD averages (2.2 and 2.8, respectively).
Within the scope of the HEALTH 2040 project – which aims to estimate the 'Future needs of human health resources in Portugal till 2040' – the present study intends to develop a comprehensive dynamic approach to the problem by (i) estimating the needs of physicians and nurses in Portugal, by specialty and by quinquennium, until 2040; (ii) identifying the training needs of physicians and nurses, in the medium and long term, until 2040; and (iii) estimating the number of students that must be admitted into the medicine and nursing training systems each year, considering the different categories of specialties. The development of such an approach is significantly more critical in a context of limited budget resources and changing healthcare needs. This study presents the drivers of the evolution of healthcare needs (such as demographic and technological evolution and the future expectations of health system users) and proposes a Bayesian methodology, combining the best available data with expert opinion, to model such evolution. Preliminary results considering different plausible scenarios are presented. The proposed methodology will be integrated into a user-friendly decision support system so that it can be used by policymakers, with the potential to measure the impact of health policies at both the regional and national levels.
Keywords: Bayesian estimation, health economics, health workforce planning, human health resources planning
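The core idea of combining data with expert opinion can be illustrated with a toy Beta-Binomial model: a prior encoding expert belief about the annual retirement rate is updated with observed retirements, and the posterior rate drives a headcount projection. This is an assumed illustration of the Bayesian principle, not the HEALTH 2040 model, and all numbers are hypothetical.

```python
# Toy sketch of the combine-data-with-expert-opinion idea (assumed
# Beta-Binomial model, hypothetical numbers; not the HEALTH 2040 model).
def posterior_retirement_rate(prior_a, prior_b, retired, total):
    """Posterior mean of a Beta(prior_a, prior_b) prior updated with counts."""
    a = prior_a + retired
    b = prior_b + (total - retired)
    return a / (a + b)

def project_headcount(current, annual_rate, annual_intake, years):
    """Deterministic projection: attrition at annual_rate plus fixed intake."""
    h = current
    for _ in range(years):
        h = h * (1.0 - annual_rate) + annual_intake
    return h

# Expert prior ~5% retirement/year; observed: 60 retirements among 1000.
rate = posterior_retirement_rate(2.0, 38.0, retired=60, total=1000)
print(round(rate, 4), round(project_headcount(50000.0, rate, 1500.0, 5)))
```

A full model would instead propagate the whole posterior distribution (not just its mean) through the projection, yielding credible intervals for each scenario.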
Procedia PDF Downloads 252
20 Prospective Service Evaluation of Physical Healthcare in Adult Community Mental Health Services in a UK-Based Mental Health Trust
Authors: Gracie Tredget, Raymond McGrath, Karen Ang, Julie Williams, Nick Sevdalis, Fiona Gaughran, Jorge Aria de la Torre, Ioannis Bakolis, Andy Healey, Zarnie Khadjesari, Euan Sadler, Natalia Stepan
Abstract:
Background: Preventable physical health problems have been found to increase morbidity rates amongst adults living with serious mental illness (SMI). Community mental health clinicians have a role in identifying physical health problems, preventing them from worsening, and supporting primary care services to administer routine physical health checks for their patients. However, little is known about how mental health staff perceive and approach their role when providing physical healthcare to patients with SMI, or about the impact these attitudes have on routine practice. Methods: The present study is a prospective service evaluation specific to Adult Community Mental Health Services at South London and Maudsley NHS Foundation Trust (SLaM). A qualitative methodology will use semi-structured interviews, focus groups and observations to explore the attitudes, perceptions and experiences of staff, patients, and carers (n=64) towards physical healthcare, and the barriers or facilitators that impact upon it. Analysis: Data from across the qualitative tasks will be synthesised using Framework Analysis methodologies. Staff, patients, and carers will be invited to participate in the co-development of recommendations that can improve routine physical healthcare within Adult Community Mental Health Teams at SLaM.
Results: Data collection is underway at present; at the time of the conference, early findings will be available to discuss. Conclusions: An integrated approach to mind and body care is needed to reduce preventable deaths amongst people with SMI. This evaluation seeks to provide a framework that better equips staff to approach physical healthcare within a mental health setting.
Keywords: severe mental illness, physical healthcare, adult community mental health, nursing
Procedia PDF Downloads 95
19 Skull Extraction for Quantification of Brain Volume in Magnetic Resonance Imaging of Multiple Sclerosis Patients
Authors: Marcela De Oliveira, Marina P. Da Silva, Fernando C. G. Da Rocha, Jorge M. Santos, Jaime S. Cardoso, Paulo N. Lisboa-Filho
Abstract:
Multiple sclerosis (MS) is an immune-mediated disease of the central nervous system characterized by neurodegeneration, inflammation, demyelination, and axonal loss. Magnetic resonance imaging (MRI), owing to the richness of detail it provides, is the gold-standard exam for the diagnosis and follow-up of neurodegenerative diseases such as MS. Brain atrophy, the gradual loss of brain volume, is quite extensive in multiple sclerosis, at roughly 0.5-1.35% per year, far beyond the limits of normal aging. Brain volume quantification thus becomes an essential task for the subsequent analysis of atrophy. The analysis of MRI has become a tedious and complex task for clinicians, who have to extract important information manually. This manual analysis is error-prone and time-consuming due to intra- and inter-operator variability. Nowadays, computerized methods for MRI segmentation are extensively used to assist doctors in quantitative analyses for disease diagnosis and monitoring. The purpose of this work was therefore to evaluate brain volume in MRI of MS patients. We used MRI scans, with 30 slices each, of five patients diagnosed with multiple sclerosis according to the McDonald criteria. The computational analysis of the images was carried out in two steps: segmentation of the brain and quantification of brain volume. The first image processing step was brain extraction, by stripping the skull from the original image. In the skull stripper for MRI images of the brain, the algorithm registers a grayscale atlas image to the grayscale patient image. The associated brain mask is propagated using the registration transformation. This mask is then eroded and used for a refined brain extraction based on level sets (tracking the edge of the brain-skull border with dedicated expansion, curvature, and advection terms).
In the second step, brain volume was quantified by counting the voxels belonging to the segmentation mask and converting the count to cubic centimeters (cc). We observed an average brain volume of 1469.5 cc. We conclude that the automatic method applied in this work can be used for brain extraction and brain volume quantification in MRI. The development and use of computer programs can help health professionals in the diagnosis and monitoring of patients with neurodegenerative diseases. In future work, we expect to implement more automated methods for the assessment of cerebral atrophy and the quantification of brain lesions, including machine-learning approaches. Acknowledgements: This work was supported by a grant from the Brazilian agency Fundação de Amparo à Pesquisa do Estado de São Paulo (number 2019/16362-5).
Keywords: brain volume, magnetic resonance imaging, multiple sclerosis, skull stripper
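The voxel-counting step described above can be sketched in a few lines. The mask and voxel dimensions below are toy assumptions for illustration; a real pipeline would take both from the segmented MRI volume.

```python
# Minimal sketch of brain volume quantification: count the voxels flagged in
# a binary brain mask and convert to cubic centimeters using the voxel size.
# The mask and voxel dimensions here are illustrative assumptions.

def brain_volume_cc(mask, voxel_dims_mm):
    """Volume in cc of the voxels flagged 1 in a 3D binary mask.

    mask: nested lists (slices x rows x cols) of 0/1 values.
    voxel_dims_mm: (dx, dy, dz) voxel edge lengths in millimeters.
    """
    n_voxels = sum(v for sl in mask for row in sl for v in row)
    voxel_volume_mm3 = voxel_dims_mm[0] * voxel_dims_mm[1] * voxel_dims_mm[2]
    return n_voxels * voxel_volume_mm3 / 1000.0  # 1 cc = 1000 mm^3

# Toy 2x2x2 mask with 5 brain voxels, each voxel 1 x 1 x 5 mm.
mask = [[[1, 1], [1, 0]], [[1, 0], [0, 1]]]
volume = brain_volume_cc(mask, (1.0, 1.0, 5.0))  # 5 voxels * 5 mm^3 = 0.025 cc
```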
Procedia PDF Downloads 146
18 Improvement of the Traditional Techniques of Artistic Casting through the Development of Open Source 3D Printing Technologies Based on Digital Ultraviolet Light Processing
Authors: Drago Diaz Aleman, Jose Luis Saorin Perez, Cecile Meier, Itahisa Perez Conesa, Jorge De La Torre Cantero
Abstract:
Traditional manufacturing techniques used in artistic contexts compete with highly productive and efficient industrial procedures. Craft techniques and their associated business models tend to disappear under the pressure of mass-produced products that compete in all niche markets, including those traditionally reserved for the work of art. The surplus value derived from the prestige of the author, the exclusivity of the product or the mastery of the artist does not seem to be a sufficient reason to preserve this productive model. In recent years, the adoption of open source digital manufacturing technologies in small art workshops can favor their permanence, offering great advantages such as easy accessibility, low cost, and free modification, adapting to the specific needs of each workshop. It is possible to use pieces modeled by computer and made with FDM (Fused Deposition Modeling) 3D printers that use PLA (polylactic acid) in artistic casting procedures. Models printed in PLA are limited to approximate minimum sizes of 3 cm, and the optimal layer height resolution is 0.1 mm. Due to these limitations, it is not the most suitable technology for artistic casting of smaller pieces. One alternative that overcomes the size limitation is SLS (selective laser sintering) printers; another is DMLS (Direct Metal Laser Sintering), in which a laser hardens metal powder layer by layer. However, due to their high cost, these technologies are difficult to introduce in small artistic foundries. Low-cost DLP (Digital Light Processing) printers can offer high resolution for a reasonable cost (around 0.02 mm on the Z axis and 0.04 mm on the X and Y axes), and can print models with castable resins that allow subsequent direct artistic casting in precious metals, or adaptation to processes such as electroforming.
In this work, the design of a DLP 3D printer is detailed, using backlit LCD screens with ultraviolet light. Its development is fully open source, and it is proposed as a kit made up of electronic components, based on Arduino, and mechanical components that are easy to source on the market. The CAD files of its parts can be manufactured on low-cost FDM 3D printers. The result is a printer costing less than 500 Euros, with high resolution and an open design with free access that allows not only its manufacture but also its improvement. In future work, we intend to carry out comparative analyses that will allow us to accurately estimate the print quality, as well as the real cost of the artistic works made with it.
Keywords: traditional artistic techniques, DLP 3D printer, artistic casting, electroforming
Procedia PDF Downloads 142
17 Exploratory Tests on Structures Resistance during Forest Fires
Authors: Luis M. Ribeiro, Jorge Raposo, Ricardo Oliveira, David Caballero, Domingos X. Viegas
Abstract:
Under the scope of the European project WUIWATCH, a set of experimental tests on house vulnerability was performed in order to assess the resistance of selected house components during the passage of a forest fire. Among the individual elements most affected by the passage of a wildfire, windows are the ones with the greatest exposure. In this sense, a set of exploratory experimental tests was designed to assess particular aspects of the vulnerability of windows and blinds. At the same time, the importance of leaving them closed (as well as the doors inside a house) during a wildfire was explored, in order to give some scientific background to guidelines for homeowners. Three sets of tests were performed: 1. Window and blind resistance to heat. Three types of protective blinds were tested (aluminium, PVC and wood) on two types of windows (single and double pane), with the objective of assessing the resistance of the structures. 2. The influence of air flow on the transport of burning embers into a house. A room was built to scale and placed inside a wind tunnel, with one window and one door on opposite sides, in order to assess the importance of an open interior door on the probability of burning embers entering the room. 3. The influence of the dimension of openings in a window or door on the probability of ignition inside a house, assessing how different window openings relate to the amount of burning particles that can enter. The main results were: 1. The purely radiative heat source produces a heat impact of 1.5 kW/m² on the structure, while the real fire generates 10 kW/m². When protected by the blind, the single-pane window reaches 30ºC on both sides, while the double-pane window shows a differential of 10ºC between the two sides (30ºC and 40ºC). The unprotected window increases in temperature continuously until the end of the test.
Window blinds reach considerably higher temperatures; PVC loses its consistency above 150ºC and melts. 2. Leaving the inside door closed results in a positive pressure differential of +1 Pa from the outside to the inside, inhibiting the air flow. Opening the door halfway or fully reverses the pressure differential to -6 and -8 times that value respectively, favouring air flow from the outside to the inside. The number of particles entering the house follows the same tendency. 3. As the bottom opening of a window increases from 0.5 cm to 4 cm, the number of particles entering the house per second also increases greatly. From 5 cm up to 80 cm there is no substantial increase in the number of entering particles. This set of exploratory tests proved to be of added value in supporting guidelines for homeowners regarding self-protection in WUI areas.
Keywords: forest fire, wildland urban interface, house vulnerability, house protective elements
Procedia PDF Downloads 281
16 A Complex Network Approach to Structural Inequality of Educational Deprivation
Authors: Harvey Sanchez-Restrepo, Jorge Louca
Abstract:
Equity and education are a major focus of government policies around the world due to their relevance for addressing the sustainable development goals launched by UNESCO. In this research, we developed a primary analysis of a data set of more than one hundred educational and non-educational factors associated with learning, coming from a census-based large-scale assessment carried out in Ecuador for 1,038,328 students, their families, teachers, and school directors, throughout 2014-2018. Each participating student was assessed by a standardized computer-based test. Learning outcomes were calibrated through item response theory with the two-parameter logistic model to obtain raw scores, which were re-scaled and synthesized into a learning index (LI). Our objective was to develop a network for modelling educational deprivation and to analyze the structure of inequality gaps, as well as their relationship with socioeconomic status, school financing, and students' ethnicity. Results from the model show that 348,270 students did not develop the minimum skills (prevalence rate = 0.215) and that Afro-Ecuadorian, Montuvio and Indigenous students exhibited the highest prevalence, at 0.312, 0.278 and 0.226, respectively. Regarding the socioeconomic status (SES) of students, the modularity classes show clearly that the system is out of equilibrium: the first decile (the poorest) exhibits a prevalence rate of 0.386 while the rate for decile ten (the richest) is 0.080, showing an intense negative relationship between learning and SES, given by R = –0.58 (p < 0.001). Another interesting and unexpected result is the average weighted degree (426.9) for both private and public schools attended by Afro-Ecuadorian students, groups that obtained the highest PageRank (0.426), pointing out that they suffer the highest educational deprivation due to discrimination, even when belonging to the richest decile.
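The 2PL calibration mentioned above has a compact closed form: the probability of a correct response given ability theta, item discrimination a, and item difficulty b. The sketch below illustrates the item response function only; the parameter values are illustrative, not from the Ecuadorian assessment.

```python
# Sketch of the two-parameter logistic (2PL) item response function used in
# IRT calibration. Parameter values are illustrative assumptions.
import math

def p_correct_2pl(theta, a, b):
    """P(correct | theta) = 1 / (1 + exp(-a * (theta - b)))."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# A student of average ability (theta = 0) facing an item of average
# difficulty (b = 0) answers correctly with probability 0.5, regardless of
# the discrimination a; higher ability raises that probability.
p = p_correct_2pl(theta=0.0, a=1.2, b=0.0)
```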
The model also identified the factors that explain deprivation, through the highest PageRank and the greatest degree of connectivity for the first decile: financial bonus for attending school, computer access, internet access, number of children, living with at least one parent, access to books, reading books, phone access, time for homework, teachers arriving late, paid work, positive expectations about schooling, and mother's education. These results provide very accurate and clear knowledge about the variables affecting the poorest students and the inequalities they produce, from which needs profiles can be defined, as well as actions on the factors that can be influenced. Finally, these results confirm that network analysis is fundamental for educational policy, especially when linking reliable microdata with social macro-parameters, because it allows us to infer how gaps in educational achievement are driven by students' context at the time of assigning resources.
Keywords: complex network, educational deprivation, evidence-based policy, large-scale assessments, policy informatics
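The PageRank-style centrality used above to rank factors can be sketched with a plain power iteration over a toy factor network. The graph, node names and damping factor below are illustrative assumptions; the study's actual network is far larger.

```python
# Minimal PageRank sketch (power iteration) over a toy directed factor
# network. Graph and parameters are illustrative assumptions only.

def pagerank(edges, nodes, damping=0.85, iters=100):
    """Return PageRank scores for a directed graph given as (src, dst) edges."""
    n = len(nodes)
    out_links = {v: [] for v in nodes}
    for src, dst in edges:
        out_links[src].append(dst)
    rank = {v: 1.0 / n for v in nodes}
    for _ in range(iters):
        new = {v: (1.0 - damping) / n for v in nodes}
        for v in nodes:
            targets = out_links[v] or nodes  # dangling nodes spread evenly
            share = damping * rank[v] / len(targets)
            for t in targets:
                new[t] += share
        rank = new
    return rank

nodes = ["internet_access", "books_access", "mother_education", "deprivation"]
edges = [("internet_access", "deprivation"), ("books_access", "deprivation"),
         ("mother_education", "deprivation"), ("mother_education", "books_access")]
scores = pagerank(edges, nodes)
# "deprivation" receives links from every other node, so it ranks highest.
```

Nodes that many factors point to accumulate rank, which is what makes PageRank a useful proxy for identifying the most deprivation-related variables in a factor network.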
Procedia PDF Downloads 122
15 Assessment of DNA Sequence Encoding Techniques for Machine Learning Algorithms Using a Universal Bacterial Marker
Authors: Diego Santibañez Oyarce, Fernanda Bravo Cornejo, Camilo Cerda Sarabia, Belén Díaz Díaz, Esteban Gómez Terán, Hugo Osses Prado, Raúl Caulier-Cisterna, Jorge Vergara-Quezada, Ana Moya-Beltrán
Abstract:
The advent of high-throughput sequencing technologies has revolutionized genomics, generating vast amounts of genetic data that challenge traditional bioinformatics methods. Machine learning addresses these challenges by leveraging computational power to identify patterns and extract information from large datasets. However, biological sequence data, being symbolic and non-numeric, must be converted into numerical formats for machine learning algorithms to process effectively. So far, encoding methods such as one-hot encoding or k-mers have been explored. This work proposes additional approaches for encoding DNA sequences, in order to compare them with existing techniques and determine whether they provide improvements or whether current methods offer superior results. Data from the 16S rRNA gene, a universal marker, was used to analyze eight bacterial groups that are significant in the pulmonary environment and have clinical implications. The bacterial genera included in this analysis are Prevotella, Abiotrophia, Acidovorax, Streptococcus, Neisseria, Veillonella, Mycobacterium, and Megasphaera. These data were downloaded from the NCBI database in GenBank file format, followed by a syntactic analysis to selectively extract the relevant information from each file. For data encoding, a sequence normalization process was carried out as the first step. From approximately 22,000 initial data points, a subset was generated for testing purposes. Specifically, 55 sequences from each bacterial group met the length criteria, resulting in an initial sample of approximately 440 sequences. The sequences were encoded using different methods, including one-hot encoding, k-mers, the Fourier transform, and the wavelet transform. Various machine learning algorithms, such as support vector machines, random forests, and neural networks, were trained to evaluate these encoding methods.
The performance of these models was assessed using multiple metrics, including the confusion matrix, ROC curve, and F1 score, providing a comprehensive evaluation of their classification capabilities. The results show that accuracies vary between encoding methods by up to approximately 15%, with the Fourier transform obtaining the best results across the evaluated machine learning algorithms. These findings, supported by the detailed analysis using the confusion matrix, ROC curve, and F1 score, provide valuable insights into the effectiveness of different encoding methods and machine learning algorithms for genomic data analysis, potentially improving the accuracy and efficiency of bacterial classification and related genomic studies.
Keywords: DNA encoding, machine learning, Fourier transform, Fourier transformation
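Two of the baseline encodings compared above, one-hot encoding and k-mer counting, can be sketched in a few lines. The sequences and the choice of k below are illustrative; the study also evaluates Fourier and wavelet representations, which are not shown here.

```python
# Minimal sketch of one-hot and k-mer encodings for a DNA string.
# Sequences and k are illustrative assumptions.
from itertools import product

BASES = "ACGT"

def one_hot(seq):
    """Encode a DNA string as a list of 4-element 0/1 vectors (A, C, G, T)."""
    return [[1 if base == b else 0 for b in BASES] for base in seq]

def kmer_counts(seq, k=2):
    """Count occurrences of every possible k-mer over overlapping windows."""
    counts = {"".join(p): 0 for p in product(BASES, repeat=k)}
    for i in range(len(seq) - k + 1):
        counts[seq[i:i + k]] += 1
    return counts

encoded = one_hot("ACGT")            # yields an identity-like 4x4 pattern
counts = kmer_counts("ACGTAC", k=2)  # "AC" appears twice in this sequence
```

Both encodings turn a symbolic sequence into a fixed numeric representation: one-hot preserves position at the cost of length, while k-mer counts discard position but give every sequence a vector of the same dimension, which is why both are common machine-learning baselines.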
Procedia PDF Downloads 23
14 Effects of Oxidized LDL in M2 Macrophages: Implications in Atherosclerosis
Authors: Fernanda Gonçalves, Karla Alcântara, Vanessa Moura, Patrícia Nolasco, Jorge Kalil, Maristela Hernandez
Abstract:
Introduction: Atherosclerosis is a chronic disease in which two striking features are observed: retention of lipids and inflammation. Understanding the interaction between the immune cells and lipoproteins involved in atherogenesis is an urgent challenge, since cardiovascular diseases are the leading cause of death worldwide. Macrophages are critical to the development of atherosclerotic plaques and to the perpetuation of inflammation in these lesions. These cells are also directly involved in unstable plaque rupture. Recently, different populations of macrophages have been identified in atherosclerotic lesions. Although the presence of M2 macrophages (macrophages activated by the alternative pathway, e.g. by IL-4) has been identified, the function of these cells in atherosclerosis is not yet defined. M2 macrophages have a high endocytic capacity, promote tissue remodeling, and have anti-inflammatory activity. However, in atherosclerosis, and especially in unstable plaques, a severe inflammatory reaction, accumulation of cellular debris and intense tissue degradation are observed. Thus, it is possible that M2 macrophages have an altered function (phenotype) in atherosclerosis. Objective: Our aim is to evaluate whether the presence of oxidized LDL alters the phenotype and function of M2 macrophages in vitro. Methods: To this end, we will evaluate whether the addition of the lipoprotein to M2 macrophages differentiated in vitro with IL-4 induces 1) a reduction in the secretion of anti-inflammatory cytokines (CBA and ELISA), 2) secretion of inflammatory cytokines (CBA and ELISA), 3) expression of cell activation markers (flow cytometry), 4) alterations in the gene expression of adhesion molecules and extracellular matrix (real-time PCR), and 5) matrix degradation (confocal microscopy). Results: In oxLDL-stimulated M2 macrophage cultures we did not find any differences in the expression of the cell surface markers tested, including HLA-DR, CD80, CD86, CD206, CD163 and CD36.
Cultures stimulated with oxLDL also showed a phagocytic capacity similar to that of unstimulated cells. However, an increase in the secretion of the pro-inflammatory cytokine IL-8 was detected in the supernatant of these cultures. No significant changes were observed in IL-6, IL-10, IL-12 and IL-1b levels. The culture supernatant also induced massive degradation of extracellular matrix filaments (produced by mouse embryo fibroblasts). When evaluating the expression of 84 extracellular matrix and adhesion molecule genes, we observed that oxLDL stimulation of M2 macrophages decreased the expression of 47% of the genes and increased the expression of only 3%. In particular, we noted that oxLDL inhibits the expression of 60% of the genes for extracellular matrix and collagen constituents expressed by these cells, including fibronectin 1 and collagen VI. We also observed a decrease in the expression of matrix protease inhibitors, such as TIMP2. In contrast, the matricellular protein thrombospondin showed a 12-fold increase in gene expression. In the presence of native LDL, 90% of the genes showed no altered expression. Conclusion: M2 macrophages stimulated with oxLDL secrete the pro-inflammatory cytokine IL-8, show altered gene expression of extracellular matrix constituents, and promote the degradation of the extracellular matrix. M2 macrophages may contribute to the perpetuation of inflammation in atherosclerosis and to plaque rupture.
Keywords: atherosclerosis, LDL, macrophages, M2
Procedia PDF Downloads 335