Search results for: Fernando Ortega
46 Women’s Sport on the Brazilian Governmental Agenda
Authors: Giovanna X. De Moura, Fernando A. Starepravo
Abstract:
In recent years, the discussion of women in sports has been part of the political agenda in several countries. In the Brazilian context, however, women's sport has not become a social problem recognized by political actors and has therefore not entered the country's governmental agenda. This work thus aimed to analyze why sport for women is not on the Brazilian government's agenda. To this end, six women considered stakeholders in sports (that is, women who influence or are influenced by sports) were interviewed. The interviews followed a semi-structured script and were carried out in 2022. Owing to commuting difficulties and the interviewees' schedules, some interviews were conducted in person, others by video call or telephone, and others via WhatsApp. The interviews were transcribed and analyzed using Bardin's Content Analysis. From the stakeholders' perception, it was found that women's sport is not considered a political problem because both sport and politics are masculinized fields, making it difficult for women to be present in either space. Moreover, not only women's sport but sport in general is seen merely as a marketing tool and a way for companies to obtain financial returns, and it is neglected in government plans. For this reason, private institutions, corporate entities, federations, and confederations have mobilized to create policies that seek to change the current scenario. Despite this, two bills (PL 6263/2019 and PL 5297/2020) have been under consideration since 2019 but have not yet been approved, due to the failure to submit amendments within the established deadline. To change this reality, those surveyed suggested not only that different types of women be represented on the most varied fronts of sports but also that the issue of women in this field be given more visibility.
Furthermore, they mentioned the importance of creating specific plans and policies that guarantee a safe place for women and that are consolidated as State policies, as well as the need for more women in political decision-making positions. It was concluded that women's sport appears on the agenda only at a secondary level, since it is included on the legislative and political agenda but not in the executive branch. In addition, there is not enough movement and mobilization in favor of women's sport for it to become a topic of political discussion. Regarding the Multiple Streams Model, women's sport is present only in the ideas stream, as there are solutions and ideas for improvement in this field. Finally, it was pointed out that there is still a strong dependence on the State for the creation of policies that seek to improve the participation of girls and women in sport; hence, it is necessary to create multicentric policies that include non-governmental agents in the policy-making process.
Keywords: agenda, politics, stakeholders, women's sport
Procedia PDF Downloads 86
45 An Infinite Mixture Model for Modelling Stutter Ratio in Forensic Data Analysis
Authors: M. A. C. S. Sampath Fernando, James M. Curran, Renate Meyer
Abstract:
Forensic DNA analysis has received much attention over the last three decades due to its usefulness in human identification. The statistical interpretation of DNA evidence is recognised as one of the most mature fields in forensic science. Peak heights in an electropherogram (EPG) are approximately proportional to the amount of template DNA in the original sample being tested. A stutter is a minor peak in an EPG that does not correspond to an allele of a potential contributor and is considered an artefact presumed to arise from miscopying or slippage during the PCR. Stutter peaks are mostly analysed in terms of the stutter ratio, which is calculated relative to the corresponding parent allele height. The analysis of mixture profiles has always been problematic in evidence interpretation, especially in the presence of PCR artefacts like stutters. Unlike binary and semi-continuous models, continuous models assign a probability (as a continuous weight) to each possible genotype combination and make significantly better use of continuous peak height information, resulting in more efficient and reliable interpretations. Therefore, a sound methodology to distinguish between stutters and real alleles is essential for the accuracy of the interpretation, and any such method must be able to model stutter peaks. Bayesian nonparametric methods provide increased flexibility in applied statistical modelling. Mixture models are frequently employed as fundamental data analysis tools for clustering and classification, and assume unidentified heterogeneous sources for the data. In model-based clustering, each unknown source is represented by a cluster, and the clusters are modelled using parametric models. Specifying the number of components in finite mixture models, however, is practically difficult, even though the calculations are relatively simple.
Infinite mixture models, in contrast, do not require the user to specify the number of components. Instead, a Dirichlet process, an infinite-dimensional generalization of the Dirichlet distribution, is used to deal with the problem of choosing the number of components. The Chinese restaurant process (CRP), the stick-breaking process, and the Pólya urn scheme are frequently used as Dirichlet priors in Bayesian mixture models. In this study, we illustrate an infinite mixture of simple linear regression models for modelling stutter ratio and introduce some modifications to overcome weaknesses associated with the CRP.
Keywords: Chinese restaurant process, Dirichlet prior, infinite mixture model, PCR stutter
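As background for readers unfamiliar with Dirichlet process mixtures, the CRP prior mentioned in the abstract above can be sketched in a few lines of Python. This is a generic illustration of the seating process only (the function name and parameter values are illustrative), not the authors' regression model or their proposed modifications:

```python
import random

def crp_partition(n, alpha, seed=0):
    """Sample a partition of n items from a Chinese restaurant process.

    The first item opens a new table; each later item joins an existing
    table with probability proportional to its current size, or opens a
    new table with probability proportional to alpha (the concentration
    parameter). Tables play the role of mixture components.
    """
    rng = random.Random(seed)
    tables = []        # tables[k] = number of items seated at table k
    assignments = []   # assignments[i] = table index of item i
    for _ in range(n):
        # Unnormalised seating weights: existing table sizes, then alpha.
        weights = tables + [alpha]
        k = rng.choices(range(len(weights)), weights=weights)[0]
        if k == len(tables):
            tables.append(1)   # open a new table (new component)
        else:
            tables[k] += 1
        assignments.append(k)
    return assignments, tables

assignments, tables = crp_partition(n=50, alpha=1.0, seed=42)
print(len(tables))   # number of components induced by the data
print(sum(tables))   # always equals n
```

The key property on display is that the number of components is an output of the sampling, not an input, which is exactly why the Dirichlet process sidesteps the component-count problem of finite mixtures.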
Procedia PDF Downloads 331
44 Winkler Springs for Embedded Beams Subjected to S-Waves
Authors: Franco Primo Soffietti, Diego Fernando Turello, Federico Pinto
Abstract:
Shear waves that propagate through the ground impose deformations that must be taken into account in the design and assessment of buried longitudinal structures such as tunnels, pipelines, and piles. Conventional engineering approaches for seismic evaluation often rely on an Euler-Bernoulli beam model supported by a Winkler foundation. This approach, however, falls short in capturing the distortions induced when the structure is subjected to shear waves. To overcome these limitations, the present work proposes an analytical solution based on a Timoshenko beam with both transverse and rotational springs. The ground springs are derived as closed-form analytical solutions of the equations of elasticity that include the seismic wavelength, extending the applicability of previous plane-strain models. By considering variations in displacements along the longitudinal direction, the presented approach ensures that the springs do not approach zero at low frequencies. This characteristic makes them suitable for assessing pseudo-static cases, which typically govern structural forces in kinematic interaction analyses.
The results obtained, validated against the existing literature and a 3D finite element model, reveal several key insights: i) the cutoff frequency significantly influences the transverse and rotational springs; ii) neglecting displacement variations along the structure axis (i.e., assuming plane-strain deformation) results in unrealistically low transverse springs, particularly for wavelengths shorter than the structure length; iii) disregarding lateral displacement components in rotational springs and neglecting variations along the structure axis leads to inaccurately low spring values, misrepresenting interaction phenomena; iv) transverse springs exhibit a notable drop at the resonance frequency, followed by increasing damping as frequency rises; v) rotational springs show minor frequency-dependent variations, with radiation damping occurring beyond resonance frequencies, starting from negative values. This comprehensive analysis sheds light on the complex behavior of embedded longitudinal structures subjected to shear waves and provides valuable insights for seismic assessment.
Keywords: shear waves, Timoshenko beams, Winkler springs, soil-structure interaction
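For orientation, a Timoshenko beam resting on distributed transverse and rotational Winkler springs is commonly governed by equations of the following textbook form (generic symbols shown only to fix ideas; the paper's closed-form, wavelength-dependent spring values are not reproduced here):

```latex
\begin{align}
\kappa G A \left(\frac{\partial^2 w}{\partial x^2} - \frac{\partial\theta}{\partial x}\right)
  - k_t\, w + q(x,t) &= \rho A\, \frac{\partial^2 w}{\partial t^2},\\
E I\, \frac{\partial^2 \theta}{\partial x^2}
  + \kappa G A \left(\frac{\partial w}{\partial x} - \theta\right)
  - k_\theta\, \theta &= \rho I\, \frac{\partial^2 \theta}{\partial t^2},
\end{align}
```

where $w$ is the transverse displacement, $\theta$ the cross-section rotation, $\kappa G A$ the shear rigidity, $EI$ the bending rigidity, $q$ the loading imposed by the passing shear wave, and $k_t$, $k_\theta$ the transverse and rotational spring constants per unit length. Setting the rotational inertia and shear deformability to zero recovers the Euler-Bernoulli-on-Winkler model that the abstract identifies as insufficient.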
Procedia PDF Downloads 63
43 Start-Up: The Perception of Brazilian Entrepreneurs about the Start-Up Brasil Program
Authors: Fernando Nobre Cavalcante
Abstract:
In Brazil, and more recently in the city of Fortaleza, there is a new form of entrepreneurship focused on the information and communication technology service sector that draws the attention of young people, investors, governments, authors, and media companies: it is known as the start-up movement. Today, it is considered a driving force behind the creative economy. Rooted in progressive discourse, the words enterprise and innovation seduce new economic agents motivated by success stories from Silicon Valley, along with increasing commercial activity in digital goods and services. This article assesses, from a sociological point of view, this new productive wave, problematized in the light of Manuel Castells' informational capitalism. Considering both skeptical and optimistic opinions about the impact of this new entrepreneurial rearrangement, the following question is asked: how do Brazilian entrepreneurs evaluate the Brazilian Federal Government's public policy incentives for start-ups? The hypotheses raised are based on employability factors as well as cultural, economic, and political matters related to innovation and technology. This study has produced a nationwide quantitative assessment, with a special focus on the reality of firms in Ceará, as well as comparative qualitative interviews on Brazilian experiences lived by the identified agents. The article outlines the federal government's public incentive policy, the Start-up Brasil Program, from the perspective of these companies, and details the disciplinary methods of the new entrepreneurial mode born in the United States. Start-ups are very young companies headed towards the economic sustainment of the productive service sector.
These companies are sowing the seeds of a re-enchantment that will bring young people back to participation in political debate; they relieve and reheat the job market; and they democratize the entrepreneurial 'do-it-yourself' culture. They capitalize on the pivot of the Wall Street wolves and of agents taking on new roles, acting as a developmental prophylaxis in the face of dreadful innovation stagnation. However, the lack of continuity in Brazilian governmental politics, and cultural nuances related to entrepreneurship, are hindering the desired regional success of this ecosystem.
Keywords: creative economy, entrepreneurship, informationalism, innovation, startups, Start-up Brasil Program
Procedia PDF Downloads 369
42 Left Atrial Appendage Occlusion vs Oral Anticoagulants in Atrial Fibrillation and Coronary Stenting. The DESAFIO Registry
Authors: José Ramón López-Mínguez, Estrella Suárez-Corchuelo, Sergio López-Tejero, Luis Nombela-Franco, Xavier Freixa-Rofastes, Guillermo Bastos-Fernández, Xavier Millán-Álvarez, Raúl Moreno-Gómez, José Antonio Fernández-Díaz, Ignacio Amat-Santos, Tomás Benito-González, Fernando Alfonso-Manterola, Pablo Salinas-Sanguino, Pedro Cepas-Guillén, Dabit Arzamendi, Ignacio Cruz-González, Juan Manuel Nogales-Asensio
Abstract:
Background and objectives: The treatment of patients with non-valvular atrial fibrillation (NVAF) who need coronary stenting is challenging. The objective of the study was to determine whether left atrial appendage occlusion (LAAO) could be a feasible option that benefits these patients. To this end, we studied the impact of LAAO plus antiplatelet drugs vs oral anticoagulants (OAC, including direct OAC) plus antiplatelet drugs on these patients' long-term outcomes. Methods: The results of 207 consecutive patients with NVAF who underwent coronary stenting were analyzed. A total of 146 patients were treated with OAC (75 with acenocoumarol, 71 with direct OAC), while 61 underwent LAAO. The median follow-up was 35 months. Patients also received antiplatelet therapy as prescribed by their cardiologist. The study received the proper ethical oversight. Results: Age (mean 75.7 years) and past medical history of stroke were similar in both groups. However, the LAAO group had more unfavorable baseline characteristics (history of coronary artery disease, higher CHA2DS2-VASc and HAS-BLED scores, and more prior significant bleeding [BARC ≥ 2]). The occurrence of major adverse events (death, stroke/transient ischemic attack, major bleeding) and major cardiovascular events (cardiac death, stroke/transient ischemic attack, and myocardial infarction) was significantly higher in the OAC group than in the LAAO group: 19.75% vs 9.06% (HR, 2.18; P = .008) and 6.37% vs 1.91% (HR, 3.34; P = .037), respectively. Conclusions: In patients with NVAF undergoing coronary stenting, LAAO plus antiplatelet therapy produced better long-term outcomes than OAC plus antiplatelet therapy, despite the unfavorable baseline characteristics of the LAAO group.
Keywords: stents, atrial fibrillation, anticoagulants, left atrial appendage occlusion
Procedia PDF Downloads 70
41 Virtual Reality and Other Real-Time Visualization Technologies for Architecture Energy Certifications
Authors: Román Rodríguez Echegoyen, Fernando Carlos López Hernández, José Manuel López Ujaque
Abstract:
Interactive management of energy certification ratings has remained on the sidelines of the evolution of virtual reality (VR), despite related advances in architecture in areas such as BIM and real-time design tools. This research studies to what extent VR software can help stakeholders better understand energy efficiency parameters in order to obtain reliable ratings for the parts of a building. To evaluate this hypothesis, the methodology included the construction of a software prototype. Current energy certification systems do not follow an intuitive data-entry process, nor do they provide a simple or visual verification of the technical values included in the certification by manufacturers or other users. The proposed software, by means of real-time visualization and a graphical user interface, offers several improvements to current energy certification systems that ease the understanding of how the certification parameters work in a building. Furthermore, because current interfaces are not friendly or intuitive, untrained users usually get a poor idea of the grounds for certification and of how the program works. In addition, the proposed software allows users to add further information, such as financial and CO₂ savings, energy efficiency, and an explanatory analysis of results for the least efficient areas of the building, through a new visual mode. The software also helps the user to evaluate whether an investment to improve the materials of an installation is worth the cost, in terms of the different energy certification parameters. The evaluated prototype (named VEE-IS) shows promising results in representing the energy rating of the different elements of the building in a more intuitive and simple manner. Users can also personalize all the inputs necessary to create a correct certification, such as floor materials, walls, installations, or other important parameters.
Working in real time through VR allows for efficiently comparing, analyzing, and improving the rated elements, as well as the parameters that must be entered to calculate the final certification. The prototype also allows the building to be visualized in efficiency mode, letting users move through the building to analyze thermal bridges or other energy efficiency data. This research also finds that the visual representation of energy efficiency certifications makes it easy for stakeholders to examine improvements progressively, which adds value to the different phases of design and sale.
Keywords: energy certification, virtual reality, augmented reality, sustainability
Procedia PDF Downloads 188
40 Development and Structural Characterization of a Snack Food with Added Type 4 Extruded Resistant Starch
Authors: Alberto A. Escobar Puentes, G. Adriana García, Luis F. Cuevas G., Alejandro P. Zepeda, Fernando B. Martínez, Susana A. Rincón
Abstract:
Snack foods are usually classified as 'junk food' because they have little nutritional value. However, given the growing demand in the third-generation (3G) snack market and their low price and ease of preparation, they can be considered carriers of compounds with nutritional value. Resistant starch (RS) is classified as a prebiotic fiber that helps to control metabolic problems and has anti-cancer colon properties. The active compound can be produced by chemically cross-linking starch with phosphate salts to obtain type 4 resistant starch (RS4). The reaction can be achieved by extrusion, a versatile, low-cost process widely used to produce snack foods. Starch is the major ingredient in 3G snack manufacture, and the seeds of sorghum, the most drought-tolerant gluten-free cereal, contain high levels of starch (70%). The aim of this research was therefore to develop a 3G snack with RS4 from sorghum starch under previously determined optimal extrusion conditions, and to carry out its sensory, chemical, and structural characterization. A sample (200 g) of sorghum starch was conditioned with 4% sodium trimetaphosphate/sodium tripolyphosphate (99:1) and set to 28.5% moisture content. The sample was then processed in a single-screw extruder equipped with a rectangular die. The inlet, transport, and output temperatures were 60°C, 134°C, and 70°C, respectively. The resulting pellets were expanded in a microwave oven. The expansion index (EI), penetration force (PF), and sensory attributes were evaluated in the expanded pellets. The pellets were milled to obtain flour, and the RS content, degree of substitution (DS), and phosphorus percentage (%P) were measured. Fourier transform infrared (FTIR) spectroscopy, X-ray diffraction, differential scanning calorimetry (DSC), and scanning electron microscopy (SEM) analyses were performed in order to determine structural changes after the process.
The results for the 3G snack were as follows: RS, 17.14 ± 0.29%; EI, 5.66 ± 0.35; and PF, 5.73 ± 0.15 N. Phosphate groups were identified in the starch molecule by FTIR: DS, 0.024 ± 0.003 and %P, 0.35 ± 0.15 [values permitted for food additives (<4 %P)]. An increase in the gelatinization temperature after the cross-linking of the starch was detected; the loss of granular structure and the vapor bubbles formed after expansion were observed by SEM; and a loss of crystallinity after the extrusion process was observed by X-ray diffraction. In conclusion, a 3G snack with RS4 was obtained by extrusion technology, and sorghum starch proved suitable for 3G snack production.
Keywords: extrusion, resistant starch, snack (3G), sorghum
Procedia PDF Downloads 312
39 Destroying the Body for the Salvation of the Soul: A Modern Theological Approach
Authors: Angelos Mavropoulos
Abstract:
Apostle Paul repeatedly mentioned the bodily sufferings that he voluntarily went through for Christ: his body was in chains for the 'mystery of Christ' (Col 4:3), while in his flesh he gladly carried the 'thorn' and all his pains and weaknesses, which prevented him from being proud (2 Cor 12:7). In his view, God's power 'is made perfect in weakness', and when we are physically weak, that is when we are spiritually strong (2 Cor 12:9-10). In addition, we all bear the death of Jesus in our bodies so that His life can be 'revealed in our mortal body' (2 Cor 4:10-11), and if we indeed share in His sufferings, we will share in His glory as well (Rom 8:17). Based on these passages, several Christian writers presented bodily suffering, pain, death, and martyrdom in general as the means to a noble Christian life and the way to attain God. What is more, Christian tradition is full of instances of voluntary self-harm, mortification of the flesh, and bodily mutilation for the sake of the soul by pious men and women, in imitation of Christ's earthly suffering. It is a fact, therefore, that in Christianity, he or she who not only endures but even inflicts earthly pains for God is highly appreciated and will be rewarded in the afterlife. Nevertheless, more recently, Gaudium et Spes and Veritatis Splendor decisively and totally overturned the Catholic Church's view on the matter.
The former characterised practices that violate 'the integrity of the human person, such as mutilation, torments inflicted on body or mind' as 'infamies' (Gaudium et Spes, 27), while the latter, after confirming that some human acts are 'intrinsically evil', that is, always wrong regardless of 'the ulterior intentions of the one acting and the circumstances', included in this category, among others, 'whatever violates the integrity of the human person, such as mutilation, physical and mental torture and attempts to coerce the spirit.' 'All these and the like', the encyclical concludes, 'are a disgrace… and are a negation of the honour due to the Creator' (Veritatis Splendor, 80). For the Catholic Church, therefore, wilful bodily suffering and mutilation infringe human integrity and are intrinsically evil acts, while intentional harm, based on the principle that 'evil may not be done for the sake of good', is always unreasonable. On the other hand, many saints who engaged in these practices are still honoured for their ascetic and noble lives, while even today similar practices are found, such as the well-known Good Friday self-flagellation and nailing to the cross performed in San Fernando, Philippines. The viewpoint of modern theology on these practices, and the question of whether Christians should hurt their body for the salvation of their soul, is what this paper will attempt to answer.
Keywords: human body, human soul, torture, pain, salvation
Procedia PDF Downloads 92
38 Evaluation of Occupational Doses in Interventional Radiology
Authors: Fernando Antonio Bacchim Neto, Allan Felipe Fattori Alves, Maria Eugênia Dela Rosa, Regina Moura, Diana Rodrigues De Pina
Abstract:
Interventional radiology is the radiology modality that delivers the highest dose values to medical staff. Recent research shows that personal dosimeters may underestimate dose values for interventional physicians, especially in the extremities (hands and feet) and the eye lens. The aim of this work was to study the radiation exposure levels of medical staff in different interventional radiology procedures and to estimate the annual maximum number of procedures (AMN) that each physician could perform without exceeding the annual dose limits established by regulations. For this purpose, LiF:Mg,Ti (TLD-100) dosimeters were positioned on different body regions of the interventional physician (eye lens, thyroid, chest, gonads, hand, and foot), over the radiological protection garments such as the lead apron and thyroid shield. Attenuation values for the lead protection garments were based on international guidelines; on this basis, attenuation was taken as 90% for the lead vests and 60% for the protective glasses. Twenty-five procedures were evaluated: 10 diagnostic, 10 angioplasty, and 5 aneurysm treatment. The AMN for diagnostic procedures was 641 for the primary interventional radiologist and 930 for the assisting interventional radiologist. For angioplasty procedures, the AMN was 445 for the primary and 1202 for the assisting interventional radiologist. For aneurysm treatment procedures, the AMN was 113 for the primary and 215 for the assisting interventional radiologist. All AMN values were limited by the eye lens doses, already considering the use of protective glasses. In all categories evaluated, the highest dose values were found in the gonads and the lower body regions of the professionals, both for the primary and the assisting interventionist, but the eye lens dose limits are lower than those of these regions.
Additional protection, such as mobile barriers positioned between the interventionist and the patient, can decrease the exposure of the eye lens, providing greater protection for the medical staff. Rotating the professionals who perform each type of procedure can also reduce the dose values they receive over a given period. The analysis of dose profiles proposed in this work showed that personal dosimeters positioned on the chest may underestimate dose values in other body parts of the interventional physician, especially the extremities and eye lens. As each body region of the interventionist is subject to different levels of exposure, the dose distribution over each region provides a better basis for deciding what actions are necessary to ensure the radiological protection of the medical staff.
Keywords: interventional radiology, radiation protection, occupationally exposed individual, hemodynamic
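The AMN logic described above amounts to dividing an annual dose limit by the per-procedure dose received behind the protective equipment. A minimal Python sketch of this arithmetic follows; the 20 mSv eye-lens limit is the current ICRP occupational recommendation and the 60% glasses attenuation follows the abstract, but the per-procedure doses passed in at the bottom are illustrative assumptions, not the study's measured values:

```python
# Illustrative estimate of the annual maximum number (AMN) of procedures,
# limited by the eye-lens dose.

ANNUAL_EYE_LENS_LIMIT_mSv = 20.0   # ICRP occupational eye-lens limit
GLASSES_ATTENUATION = 0.60         # protective glasses assumed to stop 60%

def annual_max_procedures(eye_lens_dose_per_procedure_mSv,
                          glasses_attenuation=GLASSES_ATTENUATION,
                          annual_limit_mSv=ANNUAL_EYE_LENS_LIMIT_mSv):
    """AMN = annual limit / (per-procedure dose behind the glasses)."""
    effective_dose = eye_lens_dose_per_procedure_mSv * (1.0 - glasses_attenuation)
    return int(annual_limit_mSv // effective_dose)

# Hypothetical per-procedure eye-lens doses (mSv, at the dosimeter position):
print(annual_max_procedures(0.078))  # -> 641 (a diagnostic-like dose level)
print(annual_max_procedures(0.44))   # -> 113 (an aneurysm-treatment-like level)
```

The same function applied region by region, with each region's own dose and limit, identifies which region governs; per the abstract, the eye lens governs in every category despite the gonads receiving higher absolute doses, because its limit is so much lower.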
Procedia PDF Downloads 394
37 Psychophysiological Adaptive Automation Based on Fuzzy Controller
Authors: Liliana Villavicencio, Yohn Garcia, Pallavi Singh, Luis Fernando Cruz, Wilfrido Moreno
Abstract:
Psychophysiological adaptive automation is a concept that combines human physiological data and computer algorithms to create personalized interfaces and experiences for users. This approach aims to enhance human learning by adapting to individual needs and preferences and optimizing the interaction between humans and machines. According to neuroscience, working memory demand changes when a student is learning a new subject or topic or managing and/or fulfilling a specific task goal. A sudden increase in working memory demand modifies the student's level of attention, engagement, and cognitive load. The proposed psychophysiological adaptive automation system adapts the task requirements to optimize cognitive load, the process output variable, by monitoring the student's brain activity. Cognitive load changes according to the student's previous knowledge, the type of task, the difficulty level of the task, and the overall psychophysiological state of the student. Scaling the measured cognitive load as low, medium, or high, the system assigns a difficulty level to the next task according to the ratio between the previous task's difficulty level and the student's stress. For instance, if a student becomes stressed or overwhelmed during a particular task, the system detects this through signal measurements such as brain waves, heart rate variability, or other psychophysiological variables, and adjusts the task difficulty level accordingly. Engagement and stress are treated as internal variables of the hypermedia system, which selects among three different types of instructional material. This work assesses the feasibility of a fuzzy controller that tracks a student's physiological responses and adjusts the learning content and pace accordingly.
Using an industrial automation approach, the proposed fuzzy logic controller is based on linguistic rules that complement the instrumentation of the system to monitor and control the delivery of instructional material to the students. The test results show that the implemented fuzzy controller can satisfactorily regulate the delivery of academic content based on working memory demand without compromising students' health. This work has a potential application in the instructional design of virtual reality environments for training and education.
Keywords: fuzzy logic controller, hypermedia control system, personalized education, psychophysiological adaptive automation
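To make the linguistic-rule idea concrete, here is a minimal sketch in plain Python of a Mamdani-style fuzzy mapping from cognitive load and stress to the next task's difficulty. The membership functions, rule base, and 0..1 scaling are illustrative assumptions, not the controller actually implemented by the authors:

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 outside (a, c), peak 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def next_difficulty(cognitive_load, stress):
    """Map normalised cognitive load and stress (both 0..1) to the next
    task's difficulty (0..1) using a tiny rule base with weighted-average
    (singleton) defuzzification."""
    low_cl  = tri(cognitive_load, -0.5, 0.0, 0.5)
    med_cl  = tri(cognitive_load,  0.0, 0.5, 1.0)
    high_cl = tri(cognitive_load,  0.5, 1.0, 1.5)
    low_st  = tri(stress, -0.5, 0.0, 1.0)
    high_st = tri(stress,  0.0, 1.0, 1.5)
    # Linguistic rules: relaxed student -> raise difficulty; moderate load
    # -> hold; overloaded or stressed student -> lower difficulty.
    rules = [
        (min(low_cl, low_st),   0.9),  # IF load low  AND stress low  THEN hard
        (med_cl,                0.5),  # IF load medium               THEN hold
        (max(high_cl, high_st), 0.1),  # IF load high OR  stress high THEN easy
    ]
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.5

print(next_difficulty(0.1, 0.1))  # relaxed student: above 0.5 (harder next task)
print(next_difficulty(0.9, 0.8))  # overloaded student: below 0.5 (easier next task)
```

A production controller would add more input variables (e.g., engagement) and output actions (selecting among the three instructional material types), but the fuzzify-infer-defuzzify pipeline stays the same.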
Procedia PDF Downloads 82
36 Analysis of Ancient and Present Lightning Protection Systems of Large Heritage Stupas in Sri Lanka
Authors: J.R.S.S. Kumara, M.A.R.M. Fernando, S. Venkatesh, D.K. Jayaratne
Abstract:
The protection of heritage monuments against lightning has become extremely important as far as their historical value is concerned. When such structures are large and tall, the risk from lightning initiated both from the cloud and from the ground can be high. This paper presents a lightning risk analysis of three giant stupas of the Anuradhapura era (fourth century BC onwards) in Sri Lanka: Jethawaaramaya (269-296 AD), Abayagiriya (88-76 BC), and Ruwanweliseya (161-137 BC), the third, fifth, and seventh largest ancient structures in the world. These stupas are solid brick structures consisting of a base, a near-hemispherical dome, and a conical spire on top. The hypothesis for the original lightning protection technique was that the ancient stupas were constructed with a dielectric crystal at the top connected to the ground through a conducting material. At present, however, all three stupas are protected with Franklin-rod-type air termination systems located on top of the spire. First, a risk analysis was carried out according to IEC 62305, considering the isokeraunic level of the area and the height of the stupas. Then the standard protective angle method and the rolling sphere method were used to locate the possible touching points on the surface of the stupas. The study was extended to estimate the critical current that could strike the unprotected areas of the stupas. The equations proposed by Uman (2001) and Cooray (2007) were used to find the striking distances. A modified version of the rolling sphere method was also applied to examine the effects of upward leaders. All these studies were carried out for two scenarios: with the original (i.e., ancient) lightning protection system and with the present (i.e., new) air termination system. The field distribution on the surface of the stupa in the presence of a downward leader was obtained using the finite-element-based commercial software COMSOL Multiphysics for further investigation of lightning risks.
The obtained results were analyzed and compared with each other to evaluate the performance of the ancient and new lightning protection methods and to identify suitable methods for designing lightning protection systems for stupas. According to IEC standards, all three stupas, with either the new or the ancient lightning protection system, have Level IV protection as per the protective angle method; however, according to the rolling sphere method applied with Uman's equation, the protection level is III. The same method applied with Cooray's equation always indicates a higher risk than Uman's equation. It was found that there is a risk of lightning strikes on the dome and the square chamber of the stupa, and the corresponding critical current values differed depending on the equations used in the rolling sphere method and the modified rolling sphere method.
Keywords: stupa, heritage, lightning protection, rolling sphere method, protection level
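For context, rolling-sphere analyses of this kind rest on an electro-geometric striking-distance relation of the form r = A·I^b; IEC 62305 uses A = 10 and b = 0.65, with r in metres and I in kA. The sketch below inverts that relation to recover the critical (minimum interceptable) current for the standard protection-level sphere radii. The Uman- and Cooray-based expressions used in the paper have different coefficients and are not reproduced here:

```python
def striking_distance(peak_current_kA, A=10.0, b=0.65):
    """Striking distance r (m) for a prospective peak current I (kA),
    using the IEC 62305 electro-geometric relation r = A * I**b."""
    return A * peak_current_kA ** b

def critical_current(sphere_radius_m, A=10.0, b=0.65):
    """Invert r = A * I**b: the smallest peak current whose striking
    distance reaches a point touched by a sphere of this radius."""
    return (sphere_radius_m / A) ** (1.0 / b)

# Rolling-sphere radii for IEC 62305 protection levels I..IV (metres):
for lpl, radius in {"I": 20, "II": 30, "III": 45, "IV": 60}.items():
    print(lpl, round(critical_current(radius), 1), "kA")
```

Points on the dome or square chamber that a sphere of the chosen radius can touch are candidate strike points; the corresponding critical current indicates how small a stroke could bypass the air termination system there, which is how the abstract's per-region critical currents arise.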
Procedia PDF Downloads 255
35 NK Cells Expansion Model from PBMC Led to a Decrease of CD4+ and an Increase of CD8+ and CD25+CD127- T-Reg Lymphocytes in Patients with Ovarian Neoplasia
Authors: Rodrigo Fernandes da Silva, Daniela Maira Cardozo, Paulo Cesar Martins Alves, Sophie Françoise Derchain, Fernando Guimarães
Abstract:
T-reg lymphocytes are important for the control of peripheral tolerance. They control the adaptive immune system and prevent autoimmunity through their suppressive action on CD4+ and CD8+ lymphocytes. This suppressive action also extends to B lymphocytes, dendritic cells, and monocytes/macrophages, and recent studies have shown that T-regs are also able to inhibit NK cells; they therefore exert control of the immune response from the innate to the adaptive arm. Most tumors express self-ligands, so it is believed that T-reg cells induce tolerance of the immune system, hindering the development of successful immunotherapies. T-reg cells have been linked to the mechanisms that suppress the immune response against tumors, including ovarian cancer. The goal of this study was to characterize the sub-populations of the expanded CD3+ lymphocytes reported in previous studies, using the long-term culture model designed by Carlens et al. (2001) to generate effector cell suspensions enriched with cytotoxic CD3-CD56+ NK cells from the PBMC of ovarian neoplasia patients. Methods and Results: Blood was collected, after signed consent, from 12 patients with ovarian neoplasia: 7 benign (Bng) and 5 malignant (Mlg). Mononuclear cells were separated on a Ficoll-Paque gradient. Long-term culture was conducted as a 21-day process in SCGM CellGro medium supplemented with anti-CD3 (10 ng/ml, first 5 days), IL-2 (1000 UI/ml), and FBS (10%). After 21 days of expansion, there was an increase in the population of CD3+ lymphocytes in both the benign and malignant groups. Within the CD3+ population, there was a significant decrease in the population of CD4+ lymphocytes in the benign (median Bng D-0 = 73.68%, D-21 = 21.05%) (p < 0.05) and malignant (median Mlg D-0 = 64.00%, D-21 = 11.97%) (p < 0.01) groups.
Inversely, after 21 days of expansion, there was an increase in the population of CD8+ lymphocytes within the CD3+ population in the benign (median Bgn D-0=16.80%, D-21=38.56%) and malignant (median Mlg D-0=27.12%, D-21=72.58%) groups. However, this increase was only significant in the malignant group (p < 0.01). Within the CD3+CD4+ population, there was a significant increase (p < 0.05) in the population of T-reg lymphocytes in the benign (median Bgn D-0=9.84%, D-21=39.47%) and malignant (median Mlg D-0=3.56%, D-21=16.18%) groups. Inter-group statistical analysis was performed with the Kruskal-Wallis test and intra-group analysis with the Mann-Whitney test. Conclusion: The CD4+ and CD8+ sub-populations of CD3+ lymphocytes shift with the culturing process. This may reflect the immune system's drive to mount a cytotoxic response. At the same time, T-reg lymphocytes increased within the CD4+ population, suggesting a modulation of the immune response towards cells of the immune system. The expansion of the T-reg population can hinder an immune response against cancer. Therefore, an immunotherapy using this expansion procedure should aim to halt the expansion of T-reg cells or their immunosuppression capability.
Keywords: regulatory T cells, CD8+ T cells, CD4+ T cells, NK cell expansion
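The intra-group day-0 vs. day-21 comparisons above rely on the Mann-Whitney test. A minimal sketch of that test, implemented from its definition with a normal-approximation p-value and run on hypothetical CD4+ percentages (not the study's raw data), could look like:

```python
import math

def mann_whitney_u(x, y):
    """U statistic plus a two-sided normal-approximation p-value
    (no tie correction) -- a quick sketch, not a replacement for a
    statistics package on small samples."""
    nx, ny = len(x), len(y)
    # U counts the (xi, yj) pairs with xi > yj; ties count as half
    u = sum((xi > yj) + 0.5 * (xi == yj) for xi in x for yj in y)
    mu = nx * ny / 2
    sigma = math.sqrt(nx * ny * (nx + ny + 1) / 12)
    z = (u - mu) / sigma
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return u, p

# Hypothetical CD4+ percentages for 7 benign patients, day 0 vs. day 21
day0 = [73.7, 70.1, 75.2, 68.9, 74.3, 71.8, 72.5]
day21 = [21.1, 18.4, 25.0, 19.7, 23.2, 20.8, 22.6]
u, p = mann_whitney_u(day0, day21)
print(f"U = {u}, p = {p:.4f}")  # complete separation -> small p
```

For real analyses of samples this small, an exact-test implementation (e.g., from a statistics library) is preferable to the normal approximation.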
Procedia PDF Downloads 452
34 Assess the Risk Behaviours and Safer Sex Practices among Male Attendees in a Sexual Health Setting
Authors: B. M. M. D. Mendis, L. I. Rajapaksa, P. S. K. Gunathunga, R. C. Fernando, M. Jayalath
Abstract:
Background/introduction: During the year 2011, 8,511 males received services from the sexual health clinics island-wide. At present there is only limited information on the risk behaviours of male attendees. Information on risk behaviours related to STI/HIV transmission is helpful in planning suitable prevention interventions. Aim(s)/objectives: The objectives were to determine the sexual partners (other than the marital partner and regular partners) responsible for transmitting STI (sexually transmitted infections)/HIV and to understand the practice of safer sex. Methods: The study was a clinic-based prospective study conducted over a one-year period using an interviewer-administered questionnaire. Results: 983 attendees were interviewed. Mean age was 34.02 years. 75% of the sample had completed GCE O/L (ordinary level examination). Skilled labourers, drivers and forces/police comprised 40% of the sample. 50% admitted sex with a casual female, 12% with a casual male, and 13% with CSW (commercial sex workers), while MSW (male sex worker) exposures were minimal. Younger males had more contacts with males and regular female partners, while older males had more contacts with CSW. Anal sex among males was reported by 11.5%. 20.5% used alcohol frequently, 5.9% used drugs and 1.4% injected. Common STI were genital herpes (7.9%), non-gonococcal urethritis (6.2%) and gonorrhoea (6.2%). Among those who had contacts with FSW, 6.7% had gonorrhoea (GC), 8.2% non-gonococcal urethritis (NGU), 7.5% genital herpes and 0.7% HIV. Among non-regular partner exposures, 3.7% had gonorrhoea, 8.3% NGU, 6.6% genital herpes and 0.8% HIV. Among MSM contacts, 10.6% had GC, 4.5% NGU, 5.3% genital herpes, 5.3% secondary syphilis and 0.8% HIV. Only 9.0% used condoms correctly. Friends, doctors, newspapers, the internet, and the forces were important sources of information on condoms. Non-use of condoms was due to worry about satisfaction (24.6%) and faith in the partner (25.6%).
Discussion/conclusion: Casual partners for unsafe sex are a concern. MSM and CSW remain an important source of infection. Early syphilis and gonorrhoea infections were mostly seen among MSM exposures. The findings indicate that the male population in the sample had a satisfactory level of education. However, unsafe sexual contacts are still common. Newspapers and the internet were the more important sources of information on condoms. Low condom use remains another concern. More males contracted STI through casual partners. Therefore, the strategies used for prevention need to be revisited, with added emphasis on the general population from which casual partners are drawn. Increasing awareness of men and women through mass media and primary health care teams may be important strategies for keeping the HIV epidemic at a low level.
Keywords: STI, HIV, males, safe sex practices
Procedia PDF Downloads 338
33 The Need for Higher Education STEM Integrated into the Social Sciences
Authors: Luis Fernando Calvo Prieto, Raul Herrero Martínez, Mónica Santamarta Llorente, Sergio Paniagua Bermejo
Abstract:
The project presented here starts from a questioning of the compartmentalization of knowledge that occurs in university higher education. Several authors describe the problems associated with this reality (Rodamillans, M.), indicating a lack of integration of the knowledge acquired by students throughout the subjects taken in their university degree. Furthermore, this disintegration is accentuated by the enrollment system of some Faculties and/or Schools of Engineering, which allows the student to take subjects outside the recommended curricular path. This problem becomes especially acute when trying to find an integration between humanistic subjects and the world of experimental sciences or engineering. This abrupt separation between humanities and sciences can be observed in any study plan of Spanish degrees. Except for subjects such as economics or English, the absence of any humanistic content in the Faculties of Sciences and the Schools of Engineering is striking. At some point it was decided that the only value to take into account when designing their study plans was “usefulness”, considering the humanities systematically useless for training, and therefore banishing them from the study plans, forgetting the role they play in the capacity for both leadership and civic humanism in our professionals of tomorrow. The teaching guides for the different subjects in the branch of science or engineering do not include any competency, not even a transversal one, related to leadership capacity or to the need, in today's world, for social, civic and humanitarian knowledge on the part of the people who will offer medical, pharmaceutical, environmental, biotechnological or engineering solutions to a society that is generated through more or less complex relationships based on human relationships and the historical events that have occurred so far.
If we want professionals who know how to deal effectively and rationally with their leadership tasks and who, in addition, find and develop an ethically civic sense and a humanistic profile in their functions and scientific tasks, we must not set aside the importance, for these professionals themselves, of knowing the causes, facts and consequences of key events in the history of humanity. The words of the humanist Paul Preston are well known: “he who does not know his history is condemned to repeat the mistakes of the past.” The idea, therefore, that today there can be men of science in the way that the scientists of the Renaissance were becomes, at the very least, difficult to conceive. To think that a Leonardo da Vinci can be repeated in current times is far-fetched; and although at first it may seem that the specialization of a professional is inevitable and even beneficial, there are authors who consider (Sánchez Inarejos) that it has an extremely serious negative side effect: entrenchment behind the different postulates of each area of knowledge, disdaining everything that is foreign to it.
Keywords: STEM, higher education, social sciences, history
Procedia PDF Downloads 67
32 Microbiological Analysis on Anatomical Specimens of Cats for Use in Veterinary Surgery
Authors: Raphael C. Zero, Marita V. Cardozo, Thiago A. S. S. Rocha, Mariana T. Kihara, Fernando A. Ávila, Fabrício S. Oliveira
Abstract:
There are several fixative and preservative solutions for use on cadavers, many of them using formaldehyde as the fixative or preservative of anatomical parts. In some countries, such as Brazil, this toxic agent has been increasingly restricted. The objective of this study was to microbiologically identify and quantify the key agents in tanks containing 96GL ethanol or sodium chloride solutions, used respectively as fixatives and preservatives of cat cadavers. Eight adult cat corpses, three females and five males, with an average weight of 4.3 kg, were used. After injection via the external common carotid artery (120 ml/kg, 95% 96GL ethyl alcohol and 5% pure glycerin), the cadavers were fixed in a plastic tank with 96GL ethanol for 60 days. After fixing, they were stored in a 30% sodium chloride aqueous solution for 120 days in a similar tank. Samples were collected at the start of the experiment, before the animals were placed in the ethanol tanks, and monthly thereafter. The bacterial count was performed by the pour plate method in BHI (Brain Heart Infusion) agar, and the plates were incubated aerobically and anaerobically for 24 h at 37 ºC. MacConkey agar, SPS (Sulfite Polymyxin Sulfadiazine) agar and MYP Agar Base were used to isolate the microorganisms. There was no microbial growth in the samples prior to alcohol fixation. After 30 days of fixation in the alcohol solution, total aerobe and anaerobe counts (<1.0 x 10 CFU/ml) were found, and Pseudomonas sp., Staphylococcus sp. and Clostridium sp. were the identified agents. After 60 days in the alcohol fixation solution, total aerobes (<1.0 x 10 CFU/ml) and total anaerobes (<2.2 x 10 CFU/ml) were found, and the identified agents were the same. After 30 days of storage in the 30% sodium chloride aqueous solution, total aerobes (<5.2 x 10 CFU/ml) and total anaerobes (<3.7 x 10 CFU/ml) were found, and the agents identified were Staphylococcus sp., Clostridium sp., and fungi.
After 60 days of sodium chloride storage, total aerobes (<3.0 x 10 CFU/ml) and total anaerobes (<7.0 x 10 CFU/ml) were found, and the identified agents remained the same: Staphylococcus sp., Clostridium sp., and fungi. The microbiological count was low, and visual inspection did not reveal signs of contamination in the tanks. There was no strong odor or putrefaction, which proved the technique to be microbiologically effective in fixing and preserving the cat cadavers for the four-month period in which they were provided to undergraduate students of veterinary medicine for surgery practice. All experimental procedures were approved by the Municipal Legal Department (protocol 02.2014.000027-1). The project was funded by FAPESP (protocol 2015-08259-9).
Keywords: anatomy, fixation, microbiology, small animal, surgery
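The CFU/ml figures above come from standard pour-plate arithmetic. As a hedged reminder of how such counts are derived (the colony numbers and dilution below are illustrative, not the study's data):

```python
def cfu_per_ml(colonies, dilution, volume_ml=1.0):
    """Pour-plate estimate: colonies counted on a plate inoculated with
    `volume_ml` of a sample diluted by `dilution` (e.g. 1e-2 for 1:100)."""
    return colonies / (dilution * volume_ml)

# Illustrative example: 52 colonies on 1 ml of a 1:100 dilution
print(cfu_per_ml(52, 1e-2))  # 52 / (1e-2 * 1 ml) of original sample
```

Counts are normally taken from plates in the 25-250 colony range; outside that range the estimate is reported as a bound, as in the abstract's "<" values.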
Procedia PDF Downloads 291
31 Evaluation of the Energy Performance and Emissions of an Aircraft Engine: J69 Using Fuel Blends of Jet A1 and Biodiesel
Authors: Gabriel Fernando Talero Rojas, Vladimir Silva Leal, Camilo Bayona-Roa, Juan Pava, Mauricio Lopez Gomez
Abstract:
The substitution of conventional aviation fuels with biomass-derived alternative fuels is an emerging field of study in aviation transport, mainly due to its energy consumption, the contribution to global greenhouse gas (GHG) emissions and fossil fuel price fluctuations. Nevertheless, several challenges remain, such as the biofuel production cost and its degradative effect on fuel systems, which alters operating safety. Moreover, experiments on full-scale aeronautic turbines are expensive and complex, which has steered most research toward the testing of small-size turbojets, with a major absence of information regarding the effects on energy performance and emissions. The main purpose of the current study is to present the results of experimentation in a full-scale military turbojet engine J69-T-25A (presented in Fig. 1), with 640 kW of power rating, using blends of Jet A1 with oil palm biodiesel. The main findings are related to the thrust specific fuel consumption (TSFC), the engine global efficiency (η), the air/fuel ratio (AFR) and the volume fractions of O2, CO2, CO, and HC. Two fuels are used in the present study: a commercial Jet A1 and a Colombian palm oil biodiesel. The experimental plan covers biodiesel volume contents w_BD from 0% (B0) to 50% (B50). The engine operating regimes are set to Idle, Cruise, and Take-off conditions. The turbojet engine J69 is used by the Colombian Air Force, and it is installed in a testing bench with the instrumentation that corresponds to the technical manual of the engine. The increment of w_BD from 0% to 50% reduces η by nearly 3.3% and the thrust force by 26.6% at the Idle regime. These variations are related to the reduction of the HHV_ad of the fuel blend. The evolved CO and HC tend to be reduced in all the operating conditions when increasing w_BD. Furthermore, a reduction of the atomization angle is presented in Fig. 2, indicating poor atomization in the fuel nozzle injectors when using a higher biodiesel content, as the viscosity of the fuel blend increases. An evolution of cloudiness is also observed during the shutdown procedure, as presented in Fig. 3a, particularly above 20% of biodiesel content in the fuel blend. This promotes the contamination of some components of the combustion chamber of the J69 engine with soot and unburned matter (Fig. 3). Thus, biodiesel substitution above 20% is not recommended, in order to avoid a significant decrease of η and the thrust force. A more detailed examination of the mechanical wear of the main components of the engine is advised in further studies.
Keywords: aviation, air to fuel ratio, biodiesel, energy performance, fuel atomization, gas turbine
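The efficiency and thrust penalties above are tied to the lower heating value of biodiesel. A hedged sketch of that mechanism, computing the mass-weighted heating value of a volume blend and the constant-energy fuel-flow penalty: the densities and HHV figures are assumed literature-typical values, not measurements from this study.

```python
# Assumed literature-typical properties (NOT the study's measured values)
RHO_JETA1, RHO_BIODIESEL = 0.80, 0.875   # kg/L
HHV_JETA1, HHV_BIODIESEL = 46.2, 39.8    # MJ/kg

def blend_hhv(v_bd):
    """HHV (MJ/kg) of a blend with biodiesel volume fraction v_bd."""
    m_bd = v_bd * RHO_BIODIESEL            # kg of biodiesel per L of blend
    m_jet = (1 - v_bd) * RHO_JETA1         # kg of Jet A1 per L of blend
    return (m_bd * HHV_BIODIESEL + m_jet * HHV_JETA1) / (m_bd + m_jet)

for pct in (0, 10, 20, 30, 40, 50):
    hhv = blend_hhv(pct / 100)
    # extra fuel flow needed for the same energy input as pure Jet A1
    extra_flow = 100 * (HHV_JETA1 / hhv - 1)
    print(f"B{pct}: HHV = {hhv:.2f} MJ/kg, fuel flow +{extra_flow:.1f}%")
```

The monotonic decline in blend HHV is consistent with the rising TSFC and falling η the study reports as w_BD grows, though the real engine also sees viscosity and atomization effects this sketch ignores.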
Procedia PDF Downloads 110
30 The Debureaucratization Strategy for the Portuguese Health Service through Effective Communication
Authors: Fernando Araujo, Sandra Cardoso, Fátima Fonseca, Sandra Cavaca
Abstract:
A debureaucratization strategy for the Portuguese Health Service was assumed by the Executive Board of the SNS, in deep articulation with the Shared Services of the Ministry of Health. Two of the main dimensions focused on sick leaves (SL), which turn primary health care (PHC) into administrative institutions and limit access for patients. The self-declaration of illness (SDI) project, through the National Health Service Contact Centre (SNS24), began on May 1, 2023, and has already resulted in the issuance of more than 300,000 SDI without the need to allocate resources from the National Health Service (NHS). This political decision allows each citizen, a maximum of 2 times/year and 3 days each time, if ill, to report their health condition in a dematerialized way under their own responsibility and thereby justify their absence from work, although under Portuguese law no salary is paid for these first three days. With this digital approach, it is now feasible to obtain the justification without going to the PHC and occupying its time solely to obtain an SL. Through this measure, bureaucracy has been reduced and the system has been focused on users, improving the lives of citizens and reducing the administrative burden on PHC, which now has more consultation time for users who need it. The second initiative, which began on March 1, 2024, allows the SL to be issued in emergency departments (ED) of public hospitals and in the health institutions of the social and private sectors. This project is intended to allow a user who has suffered an acute urgent illness and has been observed in an ED of a public hospital or in a private or social entity to no longer need to go to PHC only to apply for the respective SL. Since March 1, 54,453 SLs have been issued: 242 in private or social sector institutions and 6,918 in public hospitals, of which 134 were in the ED, and 47,292 in PHC.
This approach has proven to be technically robust, allows immediate resolution of problems and differentiates the performance of doctors. However, it is important to continue to safeguard the proper functioning of the ED, preventing non-urgent users from going there only to obtain an SL. Thus, in order to make better use of existing resources, this extension of SL issuance was operationalized in a balanced way, allowing SL to be issued in hospital EDs only to critically ill patients or patients referred by INEM, SNS24, or PHC. In both cases, an intense public campaign was implemented to explain the way it works and the benefits for patients. In satisfaction surveys, more than 95% of patients and doctors were satisfied with the solutions and asked for extensions to other areas. The administrative simplification agenda of the NHS continues its effective development. For the success of this debureaucratization agenda, the key factors are effective communication and the ability to reach patients and health professionals in order to increase health literacy and the correct use of the NHS.
Keywords: debureaucratization strategy, self-declaration of illness, sick leaves, SNS24
Procedia PDF Downloads 73
29 Sustainability from Ecocity to Ecocampus: An Exploratory Study on Spanish Universities' Water Management
Authors: Leyla A. Sandoval Hamón, Fernando Casani
Abstract:
Sustainability has been integrated into cities' agendas due to the impact they generate. The dimensions of sustainability most widely taken as a reference are the economic, social and environmental ones. Thus, the management decisions of sustainable cities seek a balance between these dimensions in order to provide environment-friendly alternatives. In this context, urban models that have emerged in harmony with the environment (managing water consumption, energy consumption and waste production, among others) are known as Ecocities. A similar model, but on a smaller scale, is the ‘Ecocampus’, developed in universities (considered ‘small cities’ due to their complex structure). Sustainable practices are thus being implemented in the management of university campus activities, following different relevant lines of work. Universities have a strategic role in society, and their activities can strengthen policies, strategies, and measures of sustainability, both internal and external to the organization. Because of their mission in knowledge creation and transfer, these institutions can promote and disseminate more advanced activities in sustainability. Replicating this model also implies challenges in the sustainable management of water, energy, waste and transportation, among others, inside the campus. The challenge this paper focuses on is water management, taking into account that universities consume large amounts of this resource. The purpose of this paper is to analyze the sustainability experience, with emphasis on water management, of two different campuses belonging to two different Spanish universities - one urban campus in a historic city and the other a suburban campus on the outskirts of a large city. Both universities are in the top hundred of international rankings of sustainable universities.
The methodology adopts a qualitative method based on in-depth interviews and focus-group discussions with administrative and academic staff of the ‘Ecocampus’ offices, the organizational units for sustainability management, of the two Spanish universities. The hypotheses indicate that sustainable water-management policies work best on campuses without big green spaces and where the buildings are built or rebuilt in a modern style. The sustainability efforts of a university are independent of the kind of (urban - suburban) campus, but an important aspect to improve is the degree of awareness of the university community about water scarcity. In general, the paper suggests that higher institutions adapt their sustainability policies depending on the location and features of the campus and their engagement with water conservation. Many Spanish universities have proposed policies, good practices, and measures of sustainability; in fact, some Ecocampus offices or centers have been founded. The originality of this study lies in learning from the different experiences of universities' sustainability policies.
Keywords: ecocampus, ecocity, sustainability, water management
Procedia PDF Downloads 222
28 Real-Time Working Environment Risk Analysis with Smart Textiles
Authors: Jose A. Diaz-Olivares, Nafise Mahdavian, Farhad Abtahi, Kaj Lindecrantz, Abdelakram Hafid, Fernando Seoane
Abstract:
Despite new recommendations and guidelines for the evaluation of occupational risk assessments and their prevention, work-related musculoskeletal disorders are still one of the biggest causes of work activity disruption, productivity loss, sick leave and chronic work disability. They affect millions of workers throughout Europe, with a large-scale economic and social burden. Specific efforts have failed to produce significant results yet, probably due to the limited availability and high costs of occupational risk assessment at work, especially when the methods are complex, consume excessive resources or depend on self-evaluations and observations of poor accuracy. To overcome these limitations, a pervasive system for real-time risk assessment has been developed, with the characteristics of a systematic approach: good precision, usability and resource efficiency, essential to facilitate the prevention of musculoskeletal disorders in the long term. The system allows different wearable sensors, placed on different limbs, to be combined for data collection and evaluation by a software solution, according to the needs and requirements of each individual working environment. This is done in a non-disruptive manner for both the occupational health expert and the workers. This solution allows us to support different research activities that require, as an essential starting point, the recording of data of ergonomic value from very diverse origins, especially in real work environments. The software platform is presented here with a complementary smart clothing system for data acquisition, comprised of a T-shirt containing inertial measurement units (IMU), a vest sensorized with textile electronics, a wireless electrocardiogram (ECG) and thoracic electrical bio-impedance (TEB) recorder, and a glove sensorized with variable resistors dependent on the angular position of the wrist.
The collected data is processed in real time through a mobile application, implemented on commercially available Android-based smartphone and tablet platforms. Based on the collection and analysis of this information, real-time risk assessment and feedback about postural improvement are possible, adapted to different contexts. The result is a tool which provides added value to ergonomists and occupational health agents, as in situ analysis of postural behavior can assist in a quantitative manner in the evaluation of work techniques and the occupational environment.
Keywords: ergonomics, mobile technologies, risk assessment, smart textiles
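One building block of such postural feedback is deriving a joint or trunk angle from an IMU's gravity vector and flagging time spent beyond a risk threshold. The sketch below illustrates that idea only; the 60-degree threshold and the sample stream are illustrative assumptions, not the system's actual algorithm.

```python
import math

RISK_ANGLE_DEG = 60.0  # assumed threshold for a "risky" trunk inclination

def trunk_inclination(ax, ay, az):
    """Angle (degrees) between the sensor's z-axis and the measured
    gravity vector (ax, ay, az) -- valid while the wearer is quasi-static."""
    norm = math.sqrt(ax * ax + ay * ay + az * az)
    return math.degrees(math.acos(az / norm))

def risky_fraction(samples):
    """Fraction of accelerometer samples spent beyond the risk angle."""
    flags = [trunk_inclination(*s) > RISK_ANGLE_DEG for s in samples]
    return sum(flags) / len(flags)

# Two illustrative samples: upright (gravity on z) and fully bent forward
print(risky_fraction([(0.0, 0.0, 1.0), (1.0, 0.0, 0.0)]))
```

A production system would fuse gyroscope and accelerometer data (the quasi-static assumption fails during movement) and apply duration-weighted rules rather than a single angle cut-off.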
Procedia PDF Downloads 119
27 Analytical Study of the Structural Response to Near-Field Earthquakes
Authors: Isidro Perez, Maryam Nazari
Abstract:
Numerous earthquakes across the world have led to catastrophic damage and collapse of structures (e.g., the 1971 San Fernando, 1995 Kobe (Japan), and 2010 Chile earthquakes). Engineers are constantly studying methods to moderate the effect this phenomenon has on structures to further reduce damage and costs, and ultimately to provide life safety to occupants. However, there are regions where structures, cities, or water reservoirs are built near fault lines. When an earthquake occurs near the fault lines, it can be categorized as a near-field earthquake. In contrast, a far-field earthquake occurs when the region is further away from the seismic source. A near-field earthquake generally has a higher initial peak, resulting in a larger seismic response when compared to a far-field earthquake ground motion. These larger responses may result in serious structural damage, posing a high risk to the public's safety. Unfortunately, the response of structures subjected to near-field records is not properly reflected in current building design specifications. For example, in ASCE 7-10 the design response spectrum is mostly based on far-field design-level earthquakes. This may result in catastrophic damage to structures that are not properly designed for near-field earthquakes. This research investigates the effect that near-field earthquakes have on the response of structures. To fully examine this topic, a structure was designed following the current seismic building design specifications, e.g., ASCE 7-10 and ACI 318-14, and analytically modeled using the SAP2000 software. Next, utilizing the FEMA P695 report, several near-field and far-field earthquakes were selected, and the near-field earthquake records were scaled to represent the design-level ground motions. Upon doing this, the prototype structural model created in SAP2000 was subjected to the scaled ground motions.
A linear time history analysis and a pushover analysis were conducted in SAP2000 to evaluate the structural seismic responses. On average, the structure experienced an 8% and 1% increase in story drift and absolute acceleration, respectively, when subjected to the near-field earthquake ground motions. The pushover analysis was run to find, and to aid in properly defining, the hinge formation in the structure when conducting the nonlinear time history analysis. A near-field ground motion is characterized by a high-energy pulse, making it unique among earthquake ground motions. Therefore, pulse extraction methods were used in this research to estimate the maximum response of structures subjected to near-field motions. The results will be utilized in the generation of a design spectrum for the estimation of design forces for buildings subjected to NF ground motions.
Keywords: near-field, pulse, pushover, time-history
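The core of a linear time-history analysis can be sketched for one mode: a unit-mass single-degree-of-freedom oscillator integrated under a ground-acceleration record. The 1 s period, 5% damping, simple explicit stepping scheme and synthetic one-cycle "pulse" record below are illustrative assumptions, not the study's SAP2000 model or its scaled motions.

```python
import math

def sdof_peak_disp(ag, dt, period=1.0, zeta=0.05):
    """Peak relative displacement of a unit-mass SDOF system,
    u'' + 2*zeta*wn*u' + wn^2*u = -ag, via semi-implicit Euler stepping."""
    wn = 2 * math.pi / period
    u = v = 0.0
    peak = 0.0
    for agi in ag:
        a = -agi - 2 * zeta * wn * v - wn ** 2 * u
        v += a * dt
        u += v * dt
        peak = max(peak, abs(u))
    return peak

dt = 0.001
# one-cycle 1 Hz sine "pulse" for 1 s, then 2 s of free vibration
record = [0.3 * math.sin(2 * math.pi * i * dt) for i in range(1000)] + [0.0] * 2000
print(f"peak displacement = {sdof_peak_disp(record, dt):.4f}")
```

Because the system is linear, scaling the record scales the response proportionally, which is exactly why the study can scale NF records to design level before comparing drifts; a real analysis would use a production integrator (e.g., Newmark-beta) and the full multi-degree-of-freedom model.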
Procedia PDF Downloads 147
26 Sorghum Polyphenols Encapsulated by Spray Drying, Using Modified Starches as Wall Materials
Authors: Adriana Garcia G., Alberto A. Escobar P., Amira D. Calvo L., Gabriel Lizama U., Alejandro Zepeda P., Fernando Martínez B., Susana Rincón A.
Abstract:
Different studies have recently focused on the use of antioxidants such as polyphenols because of their anticarcinogenic capacity. However, these compounds are highly sensitive to environmental factors such as light and heat, so they lose their long-term stability; they also have an astringent and bitter taste. Nevertheless, polyphenols can be protected by microcapsule formulation. In this sense, sorghum is a rich source of polyphenols and also has a high starch content. Therefore, the aim of this work was to obtain modified starches from sorghum by extrusion, and to encapsulate sorghum polyphenols by spray drying. Polyphenols were extracted with an ethanol solution from sorghum (Pajarero/red) and determined by the Folin-Ciocalteu method, obtaining 30 mg GAE/g. Moreover, starch was extracted from sorghum (Sinaloense/white) through wet milling (32% yield). The hydrolyzed starch was modified by extrusion with three treatments: acetic anhydride (2.5 g/100 g), sodium tripolyphosphate (4 g/100 g), and sodium tripolyphosphate/acetic anhydride (2 g/1.25 g per 100 g). Extrusion processing conditions were as follows: barrel temperatures of 60, 130 and 170 °C at the feeding, transition, and high-pressure extrusion zones, respectively. Fourier transform infrared (FTIR) analysis showed bands of acetyl groups (1735 cm-1) and phosphates (1170 cm-1, 910 cm-1 and 525 cm-1), indicating the respective modification of the starch. Besides, none of the modified starches developed viscosity, a characteristic required for their use in the encapsulation of polyphenols by the spray drying technique. As a result of the starch modification, a water solubility index (WSI) of 33.8 to 44.8% and a crystallinity of 8 to 11% were obtained, indicating the destruction of the starch granule.
Afterwards, microencapsulation of the polyphenols was carried out by spray drying, with a blend of 10 g of modified starch, 60 ml of polyphenol extract and 30 ml of distilled water. Drying conditions were as follows: inlet air temperature 150 °C ± 1, outlet air temperature 80 °C ± 5. The microencapsulation gave yields of 56.8 to 77.4% and an encapsulation efficiency of 84.6 to 91.4%. FTIR analysis showed evidence of microcapsules loaded with polyphenols in the bands at 1042 cm-1, 1038 cm-1 and 1148 cm-1. Differential scanning calorimetry (DSC) analysis showed transition temperatures from 144.1 to 173.9 °C. On the other hand, scanning electron microscopy (SEM) showed rounded surfaces with concavities, a typical feature of microcapsules produced by spray drying, resulting from the rapid evaporation of water. Finally, modified starches with good characteristics for use as wall materials in spray drying were obtained by extrusion, with the phosphorylated starch being the best treatment in this work, according to the encapsulation yield, efficiency, and transition temperature.
Keywords: encapsulation, extrusion, modified starch, polyphenols, spray drying
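The yield and efficiency figures above follow the usual spray-drying bookkeeping. A hedged sketch using the common definitions (efficiency from surface vs. total polyphenols, yield from recovered vs. fed solids), with illustrative numbers rather than the study's measurements:

```python
def encapsulation_efficiency(total_mg_gae, surface_mg_gae):
    """Common definition: share of polyphenols held inside the capsule
    wall rather than exposed on the particle surface."""
    return 100.0 * (1.0 - surface_mg_gae / total_mg_gae)

def process_yield(powder_recovered_g, solids_fed_g):
    """Mass of powder collected relative to total solids fed to the dryer."""
    return 100.0 * powder_recovered_g / solids_fed_g

# Illustrative values only (mg GAE and g of solids are hypothetical)
print(encapsulation_efficiency(30.0, 2.58))  # low surface phenolics -> high EE
print(process_yield(7.74, 10.0))
```

Whether the exact protocol behind the study's 84.6-91.4% figures matches this surface/total definition is not stated in the abstract; it is the most widely used convention for spray-dried polyphenols.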
Procedia PDF Downloads 310
25 Effects of Temperature and the Use of Bacteriocins on Cross-Contamination from Animal Source Food Processing: A Mathematical Model
Authors: Benjamin Castillo, Luis Pastenes, Fernando Cerdova
Abstract:
The contamination of food by microbial agents is a common problem in the industry, especially regarding the elaboration of animal source products. Incorrect manipulation of the machinery or of the raw materials can cause a decrease in production or an epidemiological outbreak due to intoxication. In order to improve food product quality, different methods have been used to reduce or, at least, slow down the growth of pathogens, especially spoilage, infectious or toxigenic bacteria. These methods are usually carried out under low temperatures and short processing times (abiotic agents), along with the application of antibacterial substances such as bacteriocins (biotic agents), in a controlled and efficient way that fulfills the purpose of bacterial control without damaging the final product. Therefore, the objective of the present study is to design a secondary mathematical model that allows the prediction of the impact of both the biotic and abiotic factors associated with animal source food processing. In order to accomplish this objective, the authors propose a three-dimensional differential equation model whose components are: bacterial growth; release, production and artificial incorporation of bacteriocins; and changes in the pH level of the medium. These three dimensions are constantly influenced by the temperature of the medium. Secondly, this model is adapted to an idealized situation of cross-contamination in animal source food processing, with the study agents being both the animal product and the contact surface. Thirdly, the stochastic simulations and the parametric sensitivity analysis are compared with referential data. The main result obtained from the analysis and simulations of the mathematical model was that, although bacterial growth can be stopped at lower temperatures, even lower ones are needed to eradicate it.
However, this can be not only expensive but counterproductive as well in terms of the quality of the raw materials; on the other hand, higher temperatures accelerate bacterial growth. In other respects, the use of bacteriocins is an effective alternative in the short and medium terms. Moreover, a low pH is an indicator of bacterial growth, since many spoilage bacteria are lactic acid bacteria. Lastly, processing times are a secondary agent of concern when the rest of the aforementioned agents are under control. Our main conclusion is that adapting a mathematical model to the context of the industrial process can generate new tools that predict bacterial contamination, the impact of bacterial inhibition, and processing method times. In addition, the proposed mathematical modeling provides input of broad application, which can be replicated for non-meat food products, other pathogens, or even contamination by cross contact with food allergens.
Keywords: bacteriocins, cross-contamination, mathematical model, temperature
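To make the three-dimensional structure concrete, a toy system in the same spirit (bacteria N, bacteriocin B, pH P, with a temperature-dependent growth rate) can be integrated by simple Euler stepping. All equations and parameter values below are illustrative assumptions, not the authors' actual model.

```python
# Toy sketch of a temperature-modulated growth / bacteriocin / pH system:
#   dN/dt = mu(T)*N*(1 - N/K) - KILL*B*N     (logistic growth minus inhibition)
#   dB/dt = PROD*N + dose - DECAY*B          (release/production + artificial dose)
#   dP/dt = -ACID*(gross growth)             (pH drops as acids accumulate)
MU_MAX, T_MIN, T_OPT = 0.8, 4.0, 37.0        # 1/h, degC (assumed)
K, KILL, PROD, DECAY, ACID = 1e9, 0.05, 1e-12, 0.02, 5e-10

def mu(temp_c):
    """Illustrative linear growth-rate ramp above the minimum temperature."""
    return max(0.0, MU_MAX * (temp_c - T_MIN) / (T_OPT - T_MIN))

def simulate(temp_c, dose=0.0, hours=48.0, dt=0.01):
    n, b, ph = 1e3, 0.0, 6.5                 # initial load, bacteriocin, pH
    for _ in range(round(hours / dt)):
        growth = mu(temp_c) * n * (1 - n / K)
        n = max(n + (growth - KILL * b * n) * dt, 0.0)
        b = max(b + (PROD * n + dose - DECAY * b) * dt, 0.0)
        ph -= ACID * growth * dt
    return n, b, ph

n_cold, _, _ = simulate(4.0)                 # refrigeration: growth halted
n_warm, _, ph_warm = simulate(37.0)          # abuse temperature: near capacity
n_dosed, _, _ = simulate(37.0, dose=2.0)     # warm, but with dosed bacteriocin
print(f"{n_cold:.0f}  {n_warm:.2e}  {n_dosed:.2e}  pH={ph_warm:.2f}")
```

Even this toy version reproduces the abstract's qualitative findings: cold storage arrests but does not eradicate the population, a sustained bacteriocin dose collapses it at abuse temperatures, and pH falls as growth proceeds.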
Procedia PDF Downloads 145
24 Kidnapping of Migrants by Drug Cartels in Mexico as a New Trend in Contemporary Slavery
Authors: Itze Coronel Salomon
Abstract:
The rise of organized crime and violence related to drug cartels in Mexico has created serious challenges for the authorities to provide security to those who live within its borders. However, a prerequisite for achieving a significant improvement in security is absolute respect for fundamental human rights by the authorities. Irregular migrants in Mexico are at serious risk of abuse. Research by Amnesty International, as well as reports of the NHRC (National Human Rights Commission) in Mexico, has indicated the major humanitarian crisis faced by thousands of migrants traveling in the shadows. However, the true extent of the problem remains invisible to the general population. The fact that federal and state governments keep no proper record of abuse and do not publish reliable data contributes to ignorance and misinformation, often spread by media that portray migrants as the source of crime rather than as its victims. Discrimination and intolerance against irregular migrants can generate greater hostility and exclusion. According to the modus operandi that has been recorded, criminal organizations and criminal groups linked to drug trafficking structures deprive migrants of their liberty for forced labor and illegal activities related to drug trafficking; some have even been kidnapped to be trained as killers. If the victims or their families cannot pay the ransom, the kidnapped person may suffer torture, mutilation and amputation of limbs, or death. Migrant women are also victims of sexual abuse during their abduction. In 2011, at least 177 bodies were identified in the largest mass grave found in Mexico, located in the town of San Fernando, in the border state of Tamaulipas; most of the victims had been killed with blunt instruments, and most seemed to be migrants and travelers passing through the country.
Together with the dozens of small graves discovered in northern Mexico, this may suggest a change in tactics among organized crime groups toward different means of obtaining revenue and lower-profile methods of killing. Competition and conflict over territorial control of drug trafficking can provide strong incentives for organized crime groups to send signals of violence to the authorities and to rival groups. However, as some Mexican organized crime groups increasingly look to extract income from vulnerable groups, such as Central American migrants, they seem less interested in advertising their work to the authorities and others, and more interested in evading detection and confrontation. This paper aims to analyze the introduction of this new trend of kidnapping migrants for forced labor by drug cartels in Mexico into the forms of contemporary slavery, and its implications.
Keywords: international law, migration, transnational organized crime
Procedia PDF Downloads 418
23 An Approach to Determine the in Transit Vibration to Fresh Produce Using Long Range Radio (LORA) Wireless Transducers
Authors: Indika Fernando, Jiangang Fei, Roger Stanely, Hossein Enshaei
Abstract:
The ever-increasing demand for quality fresh produce by consumers has placed a manifold greater burden on post-harvest supply chains in recent years. Mechanical injury to fresh produce is a critical factor in produce wastage, especially with the expansion of supply chains physically extending over thousands of miles. The impact of vibration damage in transit was identified as a specific area of focus, as it results in the wastage of a significant portion of fresh produce, at times ranging from 10% to 40% in some countries. Several studies have concentrated on quantifying the impact of vibration on fresh produce, and collecting vibration impact data continuously has been a challenge due to limitations in the battery life or memory capacity of the devices. Therefore, study samples were limited to a stretch of the transit passage or a limited time of the journey. This may or may not give an accurate understanding of the vibration impacts encountered throughout the transit passage, which limits the accuracy of the results. Consequently, an approach that can extend the capacity and ability to record vibration signals over the whole transit passage would contribute to accurately analyzing vibration damage along the post-harvest supply chain. A mechanism was developed to address this challenge, capable of measuring in-transit vibration continuously throughout the transit passage, subject to a minimum acceleration threshold (0.1 g). A system consisting of six tri-axial vibration transducers, installed at different locations inside the cargo (produce) pallets in the truck, transmits vibration signals through LORA (Long Range Radio) technology to a central device installed inside the container. The central device processes and records the vibration signals transmitted by the portable transducers, along with the GPS location.
This approach economizes the power consumption of the portable transducers, maximizing their capability to measure vibration impacts over transit passages extending to days in the distribution process. Trial tests conducted using the approach reveal that it is a reliable method to measure and quantify in-transit vibrations along the supply chain. The GPS capability enables the identification of the locations in the supply chain where significant vibration impacts were encountered. This method contributes to determining the causes, susceptibility, and intensity of vibration impact damage to fresh produce in the post-harvest supply chain. More broadly, the approach could be used to determine vibration impacts not only for fresh produce but for products in any supply chain, which may extend from a few hours to several days in transit.
Keywords: post-harvest, supply chain, wireless transducers, LORA, fresh produce
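The abstract describes threshold-gated logging but gives no implementation details. The sketch below illustrates the core idea only: a transducer keeps samples whose resultant acceleration exceeds the 0.1 g threshold and tags each event with a GPS fix. The tri-axial samples and the GPS coordinates are fabricated for illustration.

```python
import math

# Hedged sketch of threshold-gated vibration logging (0.1 g threshold),
# in the spirit of the approach described above. Sample data and the GPS
# fix are fabricated; a real transducer would stream tri-axial readings.

G_THRESHOLD = 0.1  # minimum acceleration magnitude to record, in g

def magnitude(ax, ay, az):
    """Resultant acceleration magnitude of a tri-axial sample, in g."""
    return math.sqrt(ax * ax + ay * ay + az * az)

def log_events(samples, gps_fix):
    """Keep only samples above the threshold, tagged with the GPS fix."""
    return [
        {"mag_g": round(magnitude(*s), 3), "gps": gps_fix}
        for s in samples
        if magnitude(*s) >= G_THRESHOLD
    ]

# Fabricated (ax, ay, az) samples in g: two quiet readings, one impact.
samples = [(0.01, 0.02, 0.01), (0.3, 0.1, 0.05), (0.0, 0.01, 0.0)]
events = log_events(samples, gps_fix=(-41.45, 147.17))
print(events)  # only the impact sample is retained
```

Gating on a threshold like this is what lets the portable nodes stay within their power and memory budgets over multi-day journeys.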
Procedia PDF Downloads 267
22 Displaying Compostela: Literature, Tourism and Cultural Representation, a Cartographic Approach
Authors: Fernando Cabo Aseguinolaza, Víctor Bouzas Blanco, Alberto Martí Ezpeleta
Abstract:
Santiago de Compostela became a stable object of literary representation during the period between approximately 1840 and 1915. This study offers a partial cartographic look at this process, suggesting that the emergence of a cultural space like Compostela as an object of literary representation paralleled the first stages of its becoming a tourist destination. We use maps as a method of analysis to show the interaction between a corpus of novels and the emerging tradition of tourist guides on Compostela during the selected period. Often, the novels constitute ways of presenting a city to the outside, marking it for the gaze of others, as guidebooks do. That leads us to examine the ways of constructing and rendering communicable the local in other contexts. For that matter, we should also acknowledge that a good number of the narratives in the corpus evoke the representation of the city through the figure of one who comes from elsewhere: a traveler, a student or a professor. The guidebooks coincide in this with the emerging fiction, of which the mimesis of a city is a key characteristic. The local cannot define itself except through a process of symbolic negotiation, in which recognition and self-recognition play important roles. Cartography shows some of the forms that these processes of symbolic representation take through the treatment of space. The research uses GIS to find significant models of representation. We used the program ArcGIS for the mapping, defining the databases starting from an adapted version of the methodology applied by Barbara Piatti and Lorenz Hurni's team at the University of Zurich. First, we designed maps that emphasize the peripheral position of Compostela from a historical and institutional perspective, using elements found in the texts of our corpus (novels and tourist guides).
Second, other maps delve into the parallels between recurring techniques in the fictional texts and characteristic devices of the guidebooks (the sketching of itineraries and the selection and indexicalization of zones), such as a foreigner's visit guided by someone who knows the city or the description of one's first entrance into the city's premises. Last, we offer a cartography that demonstrates the connection between the best known of the novels in our corpus (Alejandro Pérez Lugín's 1915 novel La casa de la Troya) and the first attempt to create package tours with Galicia as a destination, a joint venture of Galician and British business owners in the years immediately preceding the Great War. Literary cartography becomes a crucial instrument for digging deeply into the methods of the cultural production of places. Through maps, the interaction between discursive forms seemingly as far removed from each other as novels and tourist guides becomes obvious, and it suggests the need to go deeper into the complex process through which a city like Compostela becomes visible on the contemporary cultural horizon.
Keywords: Compostela, literary geography, literary cartography, tourism
Procedia PDF Downloads 393
21 Skull Extraction for Quantification of Brain Volume in Magnetic Resonance Imaging of Multiple Sclerosis Patients
Authors: Marcela De Oliveira, Marina P. Da Silva, Fernando C. G. Da Rocha, Jorge M. Santos, Jaime S. Cardoso, Paulo N. Lisboa-Filho
Abstract:
Multiple Sclerosis (MS) is an immune-mediated disease of the central nervous system characterized by neurodegeneration, inflammation, demyelination, and axonal loss. Magnetic resonance imaging (MRI), due to the richness of the information it provides, is the gold standard exam for the diagnosis and follow-up of neurodegenerative diseases such as MS. Brain atrophy, the gradual loss of brain volume, is quite extensive in multiple sclerosis, nearly 0.5-1.35% per year, far beyond the limits of normal aging. Thus, brain volume quantification becomes an essential task for the analysis of the occurrence of atrophy. The analysis of MRI has become a tedious and complex task for clinicians, who have to manually extract important information. This manual analysis is prone to errors and is time-consuming due to intra- and inter-operator variability. Nowadays, computerized methods for MRI segmentation have been extensively used to assist doctors in quantitative analyses for disease diagnosis and monitoring. Thus, the purpose of this work was to evaluate the brain volume in MRI scans of MS patients. We used MRI scans with 30 slices from five patients diagnosed with multiple sclerosis according to the McDonald criteria. The computational analysis of the images was carried out in two steps: segmentation of the brain and brain volume quantification. The first image processing step was to perform brain extraction by skull stripping the original image. In the skull stripper for brain MRI images, the algorithm registers a grayscale atlas image to the grayscale patient image. The associated brain mask is propagated using the registration transformation. Then this mask is eroded and used for a refined brain extraction based on level sets (edge of the brain-skull border, with dedicated expansion, curvature, and advection terms).
In the second step, the brain volume was quantified by counting the voxels belonging to the segmentation mask and converting the count to cubic centimeters (cc). We observed an average brain volume of 1469.5 cc. We conclude that the automatic method applied in this work can be used for brain extraction and brain volume quantification in MRI. The development and use of computer programs can help health professionals in the diagnosis and monitoring of patients with neurodegenerative diseases. In future work, we expect to implement more automated methods for the assessment of cerebral atrophy and the quantification of brain lesions, including machine-learning approaches. Acknowledgements: This work was supported by a grant from the Brazilian agency Fundação de Amparo à Pesquisa do Estado de São Paulo (number 2019/16362-5).
Keywords: brain volume, magnetic resonance imaging, multiple sclerosis, skull stripper
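The quantification step reduces to counting mask voxels and multiplying by the voxel volume. A minimal sketch of that arithmetic follows, with a fabricated binary mask and an assumed 1 mm isotropic voxel size (real MRI headers carry the actual spacing):

```python
# Hedged sketch of brain volume quantification from a binary
# segmentation mask: volume = (number of brain voxels) x (voxel volume).
# The tiny mask and the 1 mm^3 voxel spacing are illustrative only.

def brain_volume_cc(mask, voxel_mm3):
    """Volume in cubic centimeters of the nonzero voxels in a 3D mask.

    `mask` is a nested list [slice][row][col] of 0/1 labels;
    `voxel_mm3` is the volume of one voxel in cubic millimeters.
    1 cc = 1000 cubic millimeters.
    """
    n_voxels = sum(v for sl in mask for row in sl for v in row)
    return n_voxels * voxel_mm3 / 1000.0

# Fabricated 2-slice, 2x3 mask: 4 brain voxels at 1 mm^3 each.
mask = [
    [[0, 1, 1], [0, 0, 1]],
    [[0, 0, 0], [1, 0, 0]],
]
print(brain_volume_cc(mask, voxel_mm3=1.0))  # 0.004 cc
```

A full-scale mask from the skull-stripping step would be processed the same way, with the voxel volume taken from the scan's acquisition parameters.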
Procedia PDF Downloads 147
20 Food Safety in Wine: Removal of Ochratoxin A in Contaminated White Wine Using Commercial Fining Agents
Authors: Antònio Inês, Davide Silva, Filipa Carvalho, Luís Filipe-Riberiro, Fernando M. Nunes, Luís Abrunhosa, Fernanda Cosme
Abstract:
The presence of mycotoxins in foodstuffs is a matter of concern for food safety. Mycotoxins are toxic secondary metabolites produced by certain molds, ochratoxin A (OTA) being one of the most relevant. Wines can also be contaminated with these toxicants, and several authors have demonstrated the presence of mycotoxins in wine, especially ochratoxin A. Its chemical structure is a dihydroisocoumarin connected at the 7-carboxy group to a molecule of L-β-phenylalanine via an amide bond. As these toxicants can never be completely removed from the food chain, many countries have defined maximum levels in food in order to address health concerns. OTA contamination of wines may be a risk to consumer health, thus requiring treatments to achieve acceptable standards for human consumption. The maximum acceptable level of OTA in wine is 2.0 μg/kg according to Commission Regulation No. 1881/2006. Therefore, the aim of this work was to reduce OTA to safer levels using different fining agents and to assess their impact on the physicochemical characteristics of white wine. To evaluate their efficiency, 11 commercial fining agents (mineral, synthetic, and animal and vegetable proteins) were used to develop new approaches to OTA removal from white wine. Trials (including a control without the addition of a fining agent) were performed in white wine artificially supplemented with OTA (10 µg/L). OTA analyses were performed after wine fining. Wine was centrifuged at 4000 rpm for 10 min, and 1 mL of the supernatant was collected and mixed with an equal volume of acetonitrile/methanol/acetic acid (78:20:2 v/v/v). The solid fractions obtained after fining were also centrifuged (4000 rpm, 15 min), the resulting supernatant was discarded, and the pellet was extracted with 1 mL of the above solution and 1 mL of H2O. OTA analysis was performed by HPLC with fluorescence detection.
The most effective fining agent for removing OTA (80%) from white wine was a commercial formulation containing gelatin, bentonite, and activated carbon. Removals between 10% and 30% were obtained with potassium caseinate, yeast cell walls, and pea protein. With bentonites, carboxymethylcellulose, polyvinylpolypyrrolidone, and chitosan, no considerable OTA removal was observed. Subsequently, the effectiveness of seven commercial activated carbons was also evaluated and compared with that of the commercial formulation containing gelatin, bentonite, and activated carbon. The different activated carbons were applied at the concentrations recommended by the manufacturers in order to evaluate their efficiency in reducing OTA levels. The trials and OTA analyses were performed as explained previously. The results showed that in white wine all activated carbons except one removed 100% of the OTA, whereas the commercial formulation containing gelatin, bentonite, and activated carbon removed only 73%. These results may provide useful information for winemakers, namely for the selection of the most appropriate oenological product for OTA removal, reducing wine toxicity while simultaneously enhancing food safety and wine quality.
Keywords: wine, OTA removal, food safety, fining
Procedia PDF Downloads 542
19 Mathematical Modelling of Bacterial Growth in Products of Animal Origin in Storage and Transport: Effects of Temperature, Use of Bacteriocins and pH Level
Authors: Benjamin Castillo, Luis Pastenes, Fernando Cordova
Abstract:
Pathogen growth in animal source foods is a common problem in the food industry, causing monetary losses due to the spoiling of products or food intoxication outbreaks in the community. In this sense, the quality of the product is reflected by the population of deteriorating agents present in it, which are mainly bacteria. The factors most likely associated with freshness in animal source foods are temperature and processing, storage, and transport times. However, the level of deterioration of products depends, in turn, on the characteristics of the bacterial population causing the decomposition or spoiling, such as pH level and toxins. Knowing the growth dynamics of the agents involved in product contamination allows monitoring for more efficient processing. This means better quality and reasonable costs, along with a better estimation of the time and temperature intervals necessary for transport and storage in order to preserve product quality. The objective of this project is to design a secondary model that measures the impact of temperature on bacterial growth and on the competition involving pH adequacy and the release of bacteriocins, in order to describe this phenomenon and thus estimate the half-life of food products with the least possible risk of deterioration or spoiling. To achieve this objective, the authors propose the analysis of a three-dimensional system of ordinary differential equations which includes: logistic bacterial growth, extended by the inhibitory action of bacteriocins and including the effect of the pH of the medium; the change in the pH level of the medium, through an adaptation of the Luedeking-Piret kinetic model; and the bacteriocin concentration, modeled similarly to the pH level. These three dimensions are influenced by the temperature at all times.
This differential system is then expanded to take into consideration variable temperature and the concentration of pulsed bacteriocins, which represent characteristics inherent to the modeled setting, such as transport and storage, as well as the incorporation of substances that inhibit bacterial growth. The main results show that temperature changes in an early stage of transport increased the bacterial population significantly more than if the temperature had increased during the final stage. On the other hand, the incorporation of bacteriocins, as in other investigations, proved to be efficient in the short and medium term since, although the population of bacteria decreased, once the bacteriocins were depleted or degraded over time, the bacteria eventually returned to their regular growth rate. The efficacy of the bacteriocins decreased slightly at low temperatures, consistent with the fact that their natural degradation rate also decreased. In summary, the implementation of the mathematical model allowed the simulation of a set of possible bacteria present in animal-based products, along with their properties, in various transport and storage situations, which leads us to state that, to inhibit bacterial growth, the optimal strategy is to combine constant low temperatures with the initial use of bacteriocins.
Keywords: bacterial growth, bacteriocins, mathematical modelling, temperature
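The abstract names the three state variables (bacteria, bacteriocin, pH) but not the exact equations. The sketch below is a minimal forward-Euler toy with that structure: logistic growth inhibited by bacteriocin, first-order bacteriocin decay, and a Luedeking-Piret-style pH drop tied to growth. All functional forms and parameter values are assumptions for illustration, not the authors' model.

```python
# Toy three-dimensional model in the spirit of the abstract:
#   n: bacterial population, c: bacteriocin concentration, ph of the medium.
# Equations and parameters are illustrative assumptions, not the study's.

def simulate(n0=1e3, c0=5.0, ph0=6.5, hours=48, dt=0.01):
    mu, k_cap = 0.4, 1e8   # growth rate (1/h) and carrying capacity
    k_inh = 0.2            # bacteriocin inhibition strength
    k_dec = 0.05           # first-order bacteriocin decay (1/h)
    alpha = 1e-4           # Luedeking-Piret-style acid yield per new cell
    n, c, ph = n0, c0, ph0
    for _ in range(int(hours / dt)):
        growth = mu * n * (1 - n / k_cap) - k_inh * c * n
        n = max(n + dt * growth, 0.0)
        c = max(c - dt * k_dec * c, 0.0)
        ph = max(ph - alpha * dt * max(growth, 0.0), 3.5)  # acid floor
    return n, c, ph

n_end, c_end, ph_end = simulate()
# Qualitative behavior matching the abstract: the bacteriocin initially
# suppresses growth, then decays, the bacteria resume growing, and the
# medium acidifies as growth proceeds.
print(f"N={n_end:.3g}, C={c_end:.3g}, pH={ph_end:.2f}")
```

A pulsed-bacteriocin scenario, as in the expanded system described above, would simply reset `c` at chosen times during the loop.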
Procedia PDF Downloads 137
18 Application of Typha domingensis Pers. in Artificial Floating for Sewage Treatment
Authors: Tatiane Benvenuti, Fernando Hamerski, Alexandre Giacobbo, Andrea M. Bernardes, Marco A. S. Rodrigues
Abstract:
Population growth in urban areas has caused damage to the environment as a consequence of the uncontrolled dumping of domestic and industrial wastewater. The capacity of some plants to purify domestic and agricultural wastewater has been demonstrated by several studies. Since natural wetlands have the ability to transform, retain and remove nutrients, constructed wetlands have been used for wastewater treatment. They are widely recognized as an economical, efficient and environmentally acceptable means of treating many different types of wastewater. The species T. domingensis Pers. has shown good performance and low deployment cost in extracting, detoxifying and sequestering pollutants. Constructed Floating Wetlands (CFWs) consist of emergent vegetation established upon a buoyant structure floating on surface waters. The upper parts of the vegetation grow and remain primarily above the water level, while the roots extend down into the water column, developing an extensive underwater root system. Thus, the vegetation grows hydroponically, performing direct nutrient uptake from the water column. Biofilm attaches to the roots and rhizomes, and as physical and biochemical processes take place, the system functions as a natural filter. The aim of this study is to assess the application of macrophytes in artificial floating systems for the treatment of domestic sewage in southern Brazil. The T. domingensis Pers. plants were placed in a flotation system (a polymer structure), at full scale, in a sewage treatment plant. The sewage feed rate was 67.4 ± 8.0 m³.d⁻¹, and the hydraulic retention time was 11.5 ± 1.3 d. This CFW treats the sewage generated by 600 inhabitants, which corresponds to 12% of the population served by this municipal treatment plant.
Over 12 months, samples were collected every two weeks in order to evaluate parameters such as chemical oxygen demand (COD), biochemical oxygen demand in 5 days (BOD5), total Kjeldahl nitrogen (TKN), total phosphorus, total solids, and metals. The average removal of organic matter was around 55% for both COD and BOD5. For nutrients, TKN was reduced by 45.9%, similar to the total phosphorus removal, while for total solids the reduction was 33%. Among the metals, aluminum, copper, and cadmium, although present in low concentrations, showed the highest percentage reductions: 82.7%, 74.4% and 68.8%, respectively. Chromium, iron, and manganese removals reached values around 40-55%. The use of T. domingensis Pers. in artificial floating systems for sewage treatment is an effective and innovative alternative for Brazilian sewage treatment systems. The evaluation of additional parameters in the treatment system may give useful information for improving the removal efficiency and increasing the quality of the receiving water bodies.
Keywords: constructed wetland, floating system, sewage treatment, Typha domingensis Pers.
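The removal percentages above follow from the standard efficiency formula. A minimal sketch, with fabricated influent and effluent concentrations (the abstract reports only the resulting percentages):

```python
# Removal efficiency as commonly computed in wastewater studies:
#   removal (%) = (C_in - C_out) / C_in * 100.
# Concentrations below are fabricated for illustration.

def removal_percent(c_in, c_out):
    """Percentage removal between influent and effluent concentrations."""
    return (c_in - c_out) / c_in * 100.0

# Hypothetical COD values (mg/L) giving roughly the ~55% average reported.
print(round(removal_percent(400.0, 180.0), 1))
```

The same calculation applies to each monitored parameter (BOD5, TKN, total phosphorus, solids, metals) from its paired influent/effluent samples.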
Procedia PDF Downloads 212
17 Contribution to the Study of Automatic Epileptiform Pattern Recognition in Long Term EEG Signals
Authors: Christine F. Boos, Fernando M. Azevedo
Abstract:
The electroencephalogram (EEG) is a record of the electrical activity of the brain that has many applications, such as monitoring alertness, coma and brain death; locating damaged areas of the brain after head injury, stroke or tumor; monitoring anesthesia depth; researching physiology and sleep disorders; and researching epilepsy and localizing the seizure focus. Epilepsy is a chronic condition, or a group of diseases, of high prevalence, still poorly explained by science, and its diagnosis is still predominantly clinical. The EEG recording is considered an important test for epilepsy investigation, and its visual analysis is very often applied for clinical confirmation of an epilepsy diagnosis. Moreover, this EEG analysis can also be used to help define the type of epileptic syndrome, determine the epileptiform zone, assist in the planning of drug treatment, and provide additional information about the feasibility of surgical intervention. In the context of diagnosis confirmation, the analysis is made using long term EEG recordings, at least 24 hours long and acquired by a minimum of 24 electrodes, in which the neurophysiologists perform a thorough visual evaluation of EEG screens in search of specific electrographic patterns called epileptiform discharges. Considering that an EEG screen usually displays 10 seconds of the recording, the neurophysiologist has to evaluate 360 screens per hour of EEG, or a minimum of 8,640 screens per long term EEG recording. Analyzing thousands of EEG screens in search of patterns that have a maximum duration of 200 ms is a very time-consuming, complex and exhaustive task. Because of this, over the years several studies have proposed automated methodologies that could facilitate the neurophysiologists' task of identifying epileptiform discharges, and a large number of these methodologies used neural networks for the pattern classification.
One of the differences between all of these methodologies is the type of input stimuli presented to the networks, i.e., how the EEG signal is introduced into the network. Five types of input stimuli are commonly found in the literature: the raw EEG signal, morphological descriptors (i.e., parameters related to the signal's morphology), the Fast Fourier Transform (FFT) spectrum, Short-Time Fourier Transform (STFT) spectrograms, and Wavelet Transform features. This study evaluates the application of these five types of input stimuli and compares the classification results of neural networks implemented with each of them. The performance using the raw signal varied between 43% and 84% efficiency. The results for the FFT spectrum and STFT spectrograms were quite similar, with average efficiencies of 73% and 77%, respectively. The efficiency of Wavelet Transform features varied between 57% and 81%, while the morphological descriptors presented efficiency values between 62% and 93%. After the simulations, we observed that the best results were achieved when either morphological descriptors or Wavelet features were used as input stimuli.
Keywords: artificial neural network, electroencephalogram signal, pattern recognition, signal processing
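The abstract does not list which morphological descriptors were used. As an illustration of this class of input stimulus, the sketch below computes a few generic waveform descriptors (peak-to-peak amplitude, zero-crossing count, mean absolute slope) from a fabricated signal window; the descriptor choice and the toy samples are assumptions, not the study's feature set.

```python
# Hedged sketch of morphological descriptors for one EEG window.
# The descriptor set and the fabricated samples are illustrative;
# the study does not specify its exact features.

def morphological_descriptors(window):
    """Return simple shape features of a list of signal samples."""
    peak_to_peak = max(window) - min(window)
    zero_crossings = sum(
        1 for a, b in zip(window, window[1:]) if a * b < 0
    )
    mean_abs_slope = sum(
        abs(b - a) for a, b in zip(window, window[1:])
    ) / (len(window) - 1)
    return {
        "peak_to_peak": peak_to_peak,
        "zero_crossings": zero_crossings,
        "mean_abs_slope": mean_abs_slope,
    }

# Fabricated window: a sharp spike-like deflection between flat segments.
window = [0.0, 1.0, 5.0, -3.0, 0.5, 0.0]
print(morphological_descriptors(window))
```

A feature vector like this, computed per candidate window, would serve as one of the five input-stimulus types compared above, alongside the raw signal and the FFT, STFT, and wavelet representations.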
Procedia PDF Downloads 530