Search results for: Fernando Campa-Planas
36 The Need for Higher Education STEM Integrated into the Social Sciences
Authors: Luis Fernando Calvo Prieto, Raul Herrero Martínez, Mónica Santamarta Llorente, Sergio Paniagua Bermejo
Abstract:
The project presented here starts from a questioning of the compartmentalization of knowledge that occurs in university higher education. Several authors have described the problems associated with this reality (Rodamillans, M), indicating a lack of integration of the knowledge acquired by students throughout the subjects taken in their university degree. Furthermore, this disintegration is accentuated by the enrollment system of some Faculties and/or Schools of Engineering, which allows the student to take subjects outside the recommended curricular path. This problem becomes especially conspicuous when trying to find an integration between humanistic subjects and the world of experimental sciences or engineering. This abrupt separation between humanities and sciences can be observed in any study plan of Spanish degrees. Except for subjects such as economics or English, the absence of any humanistic content in the Faculties of Sciences and the Schools of Engineering is striking. At some point it was decided that the only value to take into account when designing their study plans was “usefulness”, considering the humanities systematically useless for this training and therefore banishing them from the study plans, forgetting the role they play in building both leadership capacity and civic humanism in our professionals of tomorrow. The teaching guides for the different subjects in the branch of science or engineering do not include any competency, not even a transversal one, related to leadership capacity or to the need, in today's world, for social, civic and humanitarian knowledge on the part of the people who will offer medical, pharmaceutical, environmental, biotechnological or engineering solutions to a society generated through more or less complex human relationships and the historical events that have occurred so far. If we want professionals who know how to deal effectively and rationally with their leadership tasks and who, in addition, find and develop an ethically civic sense and a humanistic profile in their functions and scientific tasks, we must not set aside the importance, for these professionals, of knowing the causes, facts and consequences of key events in the history of humanity. The words of the humanist Paul Preston are well known: “he who does not know his history is condemned to repeat the mistakes of the past.” The idea, therefore, that today there can be men of science in the way that the scientists of the Renaissance were becomes, at the very least, difficult to conceive. To think that a Leonardo da Vinci can be repeated in current times is far-fetched; and although at first the specialization of a professional may seem inevitable but beneficial, there are authors who consider (Sánchez Inarejos) that it has an extremely serious negative side effect: the entrenchment behind the different postulates of each area of knowledge, disdaining everything foreign to it.
Keywords: STEM, higher education, social sciences, history
Procedia PDF Downloads 66
35 Microbiological Analysis on Anatomical Specimens of Cats for Use in Veterinary Surgery
Authors: Raphael C. Zero, Marita V. Cardozo, Thiago A. S. S. Rocha, Mariana T. Kihara, Fernando A. Ávila, Fabrício S. Oliveira
Abstract:
There are several fixative and preservative solutions for use on cadavers, many of them using formaldehyde as the fixative or preservative of anatomical parts. In some countries, such as Brazil, this toxic agent has been increasingly restricted. The objective of this study was to microbiologically identify and quantify the key agents in tanks containing 96GL ethanol or sodium chloride solutions, used respectively as fixatives and preservatives of cat cadavers. Eight adult cat corpses, three females and five males, with an average weight of 4.3 kg, were used. After injection via the external common carotid artery (120 mL/kg, 95% 96GL ethyl alcohol and 5% pure glycerin), the cadavers were fixed in a plastic tank with 96GL ethanol for 60 days. After fixing, they were stored in a 30% sodium chloride aqueous solution for 120 days in a similar tank. Samples were collected at the start of the experiment, before the animals were placed in the ethanol tanks, and monthly thereafter. The bacterial count was performed by the pour plate method in BHI (Brain Heart Infusion) agar, and the plates were incubated aerobically and anaerobically for 24 h at 37 °C. MacConkey agar, SPS (Sulfite Polymyxin Sulfadiazine) agar and MYP Agar Base were used to isolate the microorganisms. There was no microbial growth in the samples prior to alcohol fixation. After 30 days of fixation in the alcohol solution, total aerobes and anaerobes (<1.0 x 10 CFU/mL) were found, and Pseudomonas sp., Staphylococcus sp. and Clostridium sp. were the identified agents. After 60 days in the alcohol fixation solution, total aerobes (<1.0 x 10 CFU/mL) and total anaerobes (<2.2 x 10 CFU/mL) were found, and the identified agents were the same. After 30 days of storage in the 30% sodium chloride aqueous solution, total aerobes (<5.2 x 10 CFU/mL) and total anaerobes (<3.7 x 10 CFU/mL) were found, and the agents identified were Staphylococcus sp., Clostridium sp., and fungi. After 60 days of sodium chloride storage, total aerobes (<3.0 x 10 CFU/mL) and total anaerobes (<7.0 x 10 CFU/mL) were found, and the identified agents remained the same: Staphylococcus sp., Clostridium sp., and fungi. The microbiological count was low, and visual inspection did not reveal signs of contamination in the tanks. There was no strong odor or putrefaction, which showed the technique to be microbiologically effective in fixing and preserving the cat cadavers for the four-month period in which they are provided to undergraduate students of the University of Veterinary Medicine for surgery practice. All experimental procedures were approved by the Municipal Legal Department (protocol 02.2014.000027-1). The project was funded by FAPESP (protocol 2015-08259-9).
Keywords: anatomy, fixation, microbiology, small animal, surgery
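Where plate counts like those above are reported in CFU/mL, the pour-plate arithmetic behind them is conventional. A minimal sketch follows; the colony count, dilution and plated volume are illustrative placeholders, not values from this study.

```python
# Conventional pour-plate arithmetic; the example values are placeholders,
# not counts from the study above.
def cfu_per_ml(colonies: int, dilution: float, volume_plated_ml: float) -> float:
    """Concentration in CFU/mL = colonies / (dilution * volume plated)."""
    return colonies / (dilution * volume_plated_ml)

# e.g. 45 colonies on the plate of the 10^-2 dilution, 1 mL plated
print(cfu_per_ml(45, 1e-2, 1.0))  # -> 4500.0 CFU/mL
```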
Procedia PDF Downloads 289
34 Evaluation of the Energy Performance and Emissions of an Aircraft Engine: J69 Using Fuel Blends of Jet A1 and Biodiesel
Authors: Gabriel Fernando Talero Rojas, Vladimir Silva Leal, Camilo Bayona-Roa, Juan Pava, Mauricio Lopez Gomez
Abstract:
The substitution of conventional aviation fuels with biomass-derived alternative fuels is an emerging field of study in aviation transport, mainly due to its energy consumption, its contribution to global greenhouse gas (GHG) emissions and fossil fuel price fluctuations. Nevertheless, several challenges remain, such as the biofuel production cost and its degradative effect on fuel systems, which alters operating safety. Moreover, experimentation on full-scale aeronautic turbines is expensive and complex, leading most of the research to the testing of small-size turbojets, with a major absence of information regarding the effects on energy performance and emissions. The main purpose of the current study is to present the results of experimentation on a full-scale military turbojet engine J69-T-25A (presented in Fig. 1) with 640 kW of power rating, using blends of Jet A1 with oil palm biodiesel. The main findings are related to the thrust specific fuel consumption (TSFC), the engine global efficiency (η), the air/fuel ratio (AFR) and the volume fractions of O2, CO2, CO, and HC. Two fuels are used in the present study: a commercial Jet A1 and a Colombian palm oil biodiesel. The experimental plan is conducted using biodiesel volume contents (w_BD) from 0% (B0) to 50% (B50). The engine operating regimes are set to Idle, Cruise, and Take-off conditions. The turbojet engine J69 is used by the Colombian Air Force, and it is installed in a testing bench with the instrumentation that corresponds to the technical manual of the engine. The increment of w_BD from 0% to 50% reduces η by nearly 3.3% and the thrust force by 26.6% at the Idle regime. These variations are related to the reduction of the higher heating value (HHV) of the fuel blend. The evolved CO and HC tend to be reduced in all the operating conditions when increasing w_BD. Furthermore, a reduction of the atomization angle is presented in Fig. 2, indicating poorer atomization in the fuel nozzle injectors when using a higher biodiesel content, as the viscosity of the fuel blend increases. An evolution of cloudiness is also observed during the shutdown procedure, as presented in Fig. 3a, particularly above 20% of biodiesel content in the fuel blend. This promotes the contamination of some components of the combustion chamber of the J69 engine with soot and unburned matter (Fig. 3). Thus, a biodiesel content above 20% is not recommended in order to avoid a significant decrease of η and the thrust force. A more detailed examination of the mechanical wear of the main components of the engine is advised in further studies.
Keywords: aviation, air to fuel ratio, biodiesel, energy performance, fuel atomization, gas turbine
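For reference, the performance ratios named above follow conventional definitions, sketched below. The function names and numeric values are ours, and the blend heating value is approximated as a simple weighted average rather than the measured HHV of the blends.

```python
# Conventional definitions of the performance metrics named in the abstract.
# All inputs below are placeholders, not measurements from the J69 campaign.
def tsfc(fuel_mass_flow_kg_s: float, thrust_n: float) -> float:
    """Thrust specific fuel consumption in kg/(N*s)."""
    return fuel_mass_flow_kg_s / thrust_n

def air_fuel_ratio(air_mass_flow_kg_s: float, fuel_mass_flow_kg_s: float) -> float:
    """AFR: mass flow of air per unit mass flow of fuel."""
    return air_mass_flow_kg_s / fuel_mass_flow_kg_s

def blend_hhv(hhv_jet_a1: float, hhv_biodiesel: float, w_bd: float) -> float:
    """Heating value of a blend, approximated as a weighted average (w_bd in [0, 1])."""
    return (1.0 - w_bd) * hhv_jet_a1 + w_bd * hhv_biodiesel

print(blend_hhv(46.2, 39.8, 0.5))  # MJ/kg, illustrative values only
```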
Procedia PDF Downloads 109
33 The Debureaucratization Strategy for the Portuguese Health Service through Effective Communication
Authors: Fernando Araujo, Sandra Cardoso, Fátima Fonseca, Sandra Cavaca
Abstract:
A debureaucratization strategy for the Portuguese Health Service was assumed by the Executive Board of the SNS, in close articulation with the Shared Services of the Ministry of Health. Two of the main dimensions focused on sick leaves (SL), which turn primary health care (PHC) units into administrative institutions, limiting access for patients. The self-declaration of illness (SDI) project, through the National Health Service Contact Centre (SNS24), began on May 1, 2023, and has already resulted in the issuance of more than 300,000 SDIs without the need to allocate resources from the National Health Service (NHS). This political decision allows each citizen, a maximum of 2 times per year and 3 days each time, if ill, to report their health condition on their own responsibility in a dematerialized way and thereby justify their absence from work, although under Portuguese law no salary is paid for these first three days. Using a digital approach, this is now feasible without the need to go to a PHC unit and occupy its consultation time only to obtain an SL. Through this measure, bureaucracy has been reduced, and the system has been focused on users, improving the lives of citizens and reducing the administrative burden on PHC, which now has more consultation time for users who need it. The second initiative, which began on March 1, 2024, allows the SL to be issued in the emergency departments (ED) of public hospitals and in health institutions of the social and private sectors. This project is intended to ensure that a user who has suffered an acute urgent illness and has been observed in the ED of a public hospital or in a private or social entity no longer needs to go to PHC only to apply for the respective SL. Since March 1, 54,453 SLs have been issued: 242 in private or social sector institutions and 6,918 in public hospitals, of which 134 were in the ED, and 47,292 in PHC. This approach has proven to be technically robust, allows immediate resolution of problems and differentiates the performance of doctors. However, it is important to continue to qualify the proper functioning of the ED, preventing non-urgent users from going there only to obtain an SL. Thus, in order to make better use of existing resources, this extension of issuance was operationalized in a balanced way, allowing SLs to be issued in hospital EDs only to critically ill patients or patients referred by INEM, SNS24, or PHC. In both cases, an intense public campaign was implemented to explain how the measures work and their benefits for patients. In satisfaction surveys, more than 95% of patients and doctors were satisfied with the solutions, asking for extensions to other areas. The administrative simplification agenda of the NHS continues its effective development. For the success of this debureaucratization agenda, the key factors are effective communication and the ability to reach patients and health professionals in order to increase health literacy and the correct use of the NHS.
Keywords: debureaucratization strategy, self-declaration of illness, sick leaves, SNS24
Procedia PDF Downloads 71
32 Sustainability from Ecocity to Ecocampus: An Exploratory Study on Spanish Universities' Water Management
Authors: Leyla A. Sandoval Hamón, Fernando Casani
Abstract:
Sustainability has been integrated into the cities’ agenda due to the impact they generate. The dimensions of sustainability most often taken as a reference are economic, social and environmental. Thus, management decisions in sustainable cities seek a balance between these dimensions in order to provide environment-friendly alternatives. In this context, urban models that have emerged in harmony with the environment (addressing water consumption, energy consumption and waste production, among others) are known as Ecocities. A similar model, but on a smaller scale, is the ‘Ecocampus’, developed in universities (considered ‘small cities’ due to their complex structure). Sustainable practices are thus being implemented in the management of university campus activities, following different relevant lines of work. Universities have a strategic role in society, and their activities can strengthen policies, strategies, and measures of sustainability, both internal and external to the organization. Because of their mission in knowledge creation and transfer, these institutions can promote and disseminate more advanced activities in sustainability. Replicating this model also implies challenges in the sustainable management of water, energy, waste and transportation, among others, inside the campus. The challenge this paper focuses on is water management, taking into account that universities consume large amounts of this resource. The purpose of this paper is to analyze the sustainability experience, with emphasis on water management, of two different campuses belonging to two different Spanish universities: one urban campus in a historic city and one suburban campus on the outskirts of a large city. Both universities are in the top hundred of international rankings of sustainable universities. The methodology adopts a qualitative method based on in-depth interviews and focus-group discussions with administrative and academic staff of the ‘Ecocampus’ offices, the organizational units for sustainability management, of the two Spanish universities. The hypotheses indicate that sustainable water management policies work best on campuses without large green spaces and where the buildings are built or rebuilt in a modern style. The sustainability efforts of the university are independent of the kind of campus (urban or suburban), but an important aspect to improve is the degree of awareness of the university community about water scarcity. In general, the paper suggests that higher education institutions adapt their sustainability policies depending on the location and features of the campus and their engagement with water conservation. Many Spanish universities have proposed policies, good practices, and measures of sustainability; in fact, some Ecocampus offices or centers have been founded. The originality of this study is to learn from the different experiences of sustainability policies of universities.
Keywords: ecocampus, ecocity, sustainability, water management
Procedia PDF Downloads 221
31 Real-Time Working Environment Risk Analysis with Smart Textiles
Authors: Jose A. Diaz-Olivares, Nafise Mahdavian, Farhad Abtahi, Kaj Lindecrantz, Abdelakram Hafid, Fernando Seoane
Abstract:
Despite new recommendations and guidelines for the evaluation and prevention of occupational risks, work-related musculoskeletal disorders are still one of the biggest causes of work activity disruption, productivity loss, sick leave and chronic work disability. They affect millions of workers throughout Europe, with a large-scale economic and social burden. Specific efforts have so far failed to produce significant results, probably due to the limited availability and high costs of occupational risk assessment at work, especially when the methods are complex, consume excessive resources or depend on self-evaluations and observations of poor accuracy. To overcome these limitations, a pervasive system of real-time risk assessment tools has been developed, with the characteristics of a systematic approach: good precision, usability and resource efficiency, essential to facilitate the prevention of musculoskeletal disorders in the long term. The system allows different wearable sensors, placed on different limbs, to be combined for data collection and evaluation by a software solution, according to the needs and requirements of each individual working environment. This is done in a non-disruptive manner for both the occupational health expert and the workers. This solution allows us to address different research activities that require, as an essential starting point, the recording of ergonomically valuable data of very diverse origin, especially in real work environments. The software platform is presented here with a complementary smart clothing system for data acquisition, comprised of a T-shirt containing inertial measurement units (IMU), a vest sensorized with textile electronics, a wireless electrocardiogram (ECG) and thoracic electrical bio-impedance (TEB) recorder, and a glove sensorized with variable resistors dependent on the angular position of the wrist. The collected data is processed in real time through a mobile application, implemented on commercially available Android-based smartphone and tablet platforms. Based on the collection of this information and its analysis, real-time risk assessment and feedback about postural improvement are possible, adapted to different contexts. The result is a tool which provides added value to ergonomists and occupational health agents, as in situ analysis of postural behavior can assist in a quantitative manner in the evaluation of work techniques and the occupational environment.
Keywords: ergonomics, mobile technologies, risk assessment, smart textiles
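As an illustration of the kind of real-time check such a glove enables, the sketch below maps a variable-resistor reading to a wrist angle and flags postures outside a limit. The ADC range, angle calibration and ergonomic threshold are assumptions for illustration; the abstract does not give the actual glove calibration.

```python
# Minimal sketch of a real-time postural check; all calibration values
# are assumptions, not the system's actual parameters.
def adc_to_wrist_angle(adc: int, adc_min: int = 0, adc_max: int = 1023,
                       angle_min: float = -60.0, angle_max: float = 60.0) -> float:
    """Linearly map a variable-resistor ADC reading to a wrist angle in degrees."""
    return angle_min + (adc - adc_min) * (angle_max - angle_min) / (adc_max - adc_min)

def posture_alert(angle_deg: float, safe_limit_deg: float = 45.0) -> bool:
    """Flag wrist postures outside an assumed ergonomic limit."""
    return abs(angle_deg) > safe_limit_deg

print(posture_alert(adc_to_wrist_angle(980)))  # -> True for this reading
```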
Procedia PDF Downloads 117
30 Analytical Study of the Structural Response to Near-Field Earthquakes
Authors: Isidro Perez, Maryam Nazari
Abstract:
Numerous earthquakes across the world have led to catastrophic damage and collapse of structures (e.g., the 1971 San Fernando, 1995 Kobe (Japan), and 2010 Chile earthquakes). Engineers are constantly studying methods to moderate the effect this phenomenon has on structures, to further reduce damage and costs and ultimately to provide life safety to occupants. However, there are regions where structures, cities, or water reservoirs are built near fault lines. When an earthquake occurs near the fault lines, it can be categorized as a near-field earthquake. In contrast, a far-field earthquake occurs when the region is farther away from the seismic source. A near-field earthquake generally has a higher initial peak, resulting in a larger seismic response when compared to a far-field earthquake ground motion. These larger responses may result in serious structural damage, posing a high risk to public safety. Unfortunately, the response of structures subjected to near-field records is not properly reflected in current building design specifications. For example, in ASCE 7-10, the design response spectrum is mostly based on far-field design-level earthquakes. This may result in catastrophic damage to structures that are not properly designed for near-field earthquakes. This research investigates the effect that near-field earthquakes have on the response of structures. To fully examine this topic, a structure was designed following the current seismic building design specifications, e.g., ASCE 7-10 and ACI 318-14, and analytically modeled using the SAP2000 software. Next, utilizing the FEMA P695 report, several near-field and far-field earthquakes were selected, and the near-field earthquake records were scaled to represent the design-level ground motions. The prototype structural model created in SAP2000 was then subjected to the scaled ground motions. A linear time-history analysis and a pushover analysis were conducted in SAP2000 to evaluate the structural seismic responses. On average, the structure experienced an 8% and 1% increase in story drift and absolute acceleration, respectively, when subjected to the near-field earthquake ground motions. The pushover analysis was run to aid in properly defining the hinge formation in the structure for the nonlinear time-history analysis. A near-field ground motion is characterized by a high-energy pulse, making it unique among earthquake ground motions. Therefore, pulse extraction methods were used in this research to estimate the maximum response of structures subjected to near-field motions. The results will be utilized in the generation of a design spectrum for the estimation of design forces for buildings subjected to NF ground motions.
Keywords: near-field, pulse, pushover, time-history
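Record scaling, one common preprocessing step before the linear time-history runs described above, can be sketched as a simple amplitude adjustment. The sketch below assumes scaling to a target peak ground acceleration; the toy record and target value are placeholders, not data from the FEMA P695 record set.

```python
import numpy as np

# Illustrative amplitude scaling of a ground-motion record so that its peak
# matches a design-level target; inputs are placeholders.
def scale_record(acc_g: np.ndarray, target_pga_g: float) -> np.ndarray:
    """Scale an acceleration history (in g) to the target peak value."""
    return acc_g * (target_pga_g / np.max(np.abs(acc_g)))

record = np.array([0.01, -0.12, 0.30, -0.25, 0.08])  # toy record, in g
print(scale_record(record, 0.4))                      # peak now 0.4 g
```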
Procedia PDF Downloads 146
29 Sorghum Polyphenols Encapsulated by Spray Drying, Using Modified Starches as Wall Materials
Authors: Adriana Garcia G., Alberto A. Escobar P., Amira D. Calvo L., Gabriel Lizama U., Alejandro Zepeda P., Fernando Martínez B., Susana Rincón A.
Abstract:
Different studies have recently focused on the use of antioxidants such as polyphenols because of their anticarcinogenic capacity. However, these compounds are highly sensitive to environmental factors such as light and heat, so they lose their long-term stability; they also have an astringent and bitter taste. Nevertheless, polyphenols can be protected by microcapsule formulation. In this sense, sorghum is a rich source of polyphenols and also presents a high starch content. For these reasons, the aim of this work was to obtain modified starches from sorghum by extrusion and to use them to encapsulate sorghum polyphenols by spray drying. Polyphenols were extracted from sorghum (Pajarero/red) with an ethanol solution and determined by the Folin-Ciocalteu method, obtaining GAE at 30 mg/g. Moreover, starch was extracted from sorghum (Sinaloense/white) through wet milling (yield 32%). The hydrolyzed starch was modified by extrusion with three treatments: acetic anhydride (2.5 g/100 g), sodium tripolyphosphate (4 g/100 g), and sodium tripolyphosphate/acetic anhydride (2 g/1.25 g per 100 g). Extrusion processing conditions were as follows: barrel temperatures of 60, 130 and 170 °C at the feeding, transition, and high-pressure extrusion zones, respectively. Fourier transform infrared (FTIR) spectroscopy showed bands of acetyl groups (1735 cm-1) and phosphates (1170 cm-1, 910 cm-1 and 525 cm-1), indicating the respective modification of the starch. Besides, none of the modified starches developed viscosity, a characteristic required for use in the encapsulation of polyphenols by the spray drying technique. As a result of the starch modification, a water solubility index (WSI) from 33.8 to 44.8% and a crystallinity from 8 to 11% were obtained, indicating the destruction of the starch granule. Afterwards, microencapsulation of the polyphenols was carried out by spray drying, with a blend of 10 g of modified starch, 60 mL of polyphenol extract and 30 mL of distilled water. Drying conditions were as follows: inlet air temperature 150 °C ± 1, outlet air temperature 80 °C ± 5. The microencapsulation gave yields of 56.8 to 77.4% and encapsulation efficiencies from 84.6 to 91.4%. The FTIR analysis showed evidence of microcapsules loaded with polyphenols in the bands at 1042 cm-1, 1038 cm-1 and 1148 cm-1. Differential scanning calorimetry (DSC) analysis showed transition temperatures from 144.1 to 173.9 °C. On the other hand, scanning electron microscopy (SEM) showed rounded surfaces with concavities, a typical feature of microcapsules produced by spray drying, as a result of the rapid evaporation of water. Finally, modified starches with good characteristics for use as wall materials in spray drying were obtained by extrusion; the phosphorylated starch was the best treatment in this work, according to encapsulation yield, efficiency, and transition temperature.
Keywords: encapsulation, extrusion, modified starch, polyphenols, spray drying
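The yield and efficiency figures above follow the working definitions common in spray-drying studies; a minimal sketch of those calculations is given below. The abstract does not state its exact formulas, and the inputs are placeholders.

```python
# Assumed working definitions of spray-drying yield and encapsulation
# efficiency; placeholder inputs, not data from this study.
def process_yield_pct(powder_recovered_g: float, solids_fed_g: float) -> float:
    """Powder recovered as a percentage of the solids fed to the dryer."""
    return 100.0 * powder_recovered_g / solids_fed_g

def encapsulation_efficiency_pct(encapsulated_gae_mg: float, total_gae_mg: float) -> float:
    """Share of polyphenols (as GAE) retained inside the microcapsules."""
    return 100.0 * encapsulated_gae_mg / total_gae_mg

print(process_yield_pct(7.2, 10.0))            # -> 72.0
print(encapsulation_efficiency_pct(265, 300))  # -> ~88.3
```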
Procedia PDF Downloads 308
28 Effects of Temperature and the Use of Bacteriocins on Cross-Contamination from Animal Source Food Processing: A Mathematical Model
Authors: Benjamin Castillo, Luis Pastenes, Fernando Cerdova
Abstract:
The contamination of food by microbial agents is a common problem in the industry, especially in the elaboration of animal source products. Incorrect manipulation of the machinery or of the raw materials can cause a decrease in production or an epidemiological outbreak due to intoxication. In order to improve food product quality, different methods have been used to reduce or at least slow down the growth of pathogens, especially deteriorating, infectious or toxigenic bacteria. These methods are usually carried out under low temperatures and short processing times (abiotic agents), along with the application of antibacterial substances, such as bacteriocins (biotic agents), in a controlled and efficient way that fulfills the purpose of bacterial control without damaging the final product. Therefore, the objective of the present study is to design a secondary mathematical model that allows the prediction of the impact of both the biotic and abiotic factors associated with animal source food processing. In order to accomplish this objective, the authors propose a three-dimensional differential equation model whose components are: bacterial growth; the release, production and artificial incorporation of bacteriocins; and changes in the pH levels of the medium. These three dimensions are constantly influenced by the temperature of the medium. Secondly, this model is adapted to an idealized situation of cross-contamination in animal source food processing, with the study agents being both the animal product and the contact surface. Thirdly, the stochastic simulations and the parametric sensitivity analysis are compared with reference data. The main finding from the analysis and simulations of the mathematical model was that, although bacterial growth can be stopped at lower temperatures, even lower ones are needed to eradicate it. However, this can be not only expensive but also counterproductive in terms of the quality of the raw materials; on the other hand, higher temperatures accelerate bacterial growth. In other respects, the use of bacteriocins is an effective alternative in the short and medium terms. Moreover, a low pH level is an indicator of bacterial growth, since many deteriorating bacteria are lactic acid bacteria. Lastly, processing times are a secondary agent of concern when the rest of the aforementioned agents are under control. Our main conclusion is that, when a mathematical model is adapted to the context of the industrial process, it can generate new tools that predict bacterial contamination, the impact of bacterial inhibition, and processing method times. In addition, the proposed mathematical modeling uses a logistic input of broad application, which can be replicated for non-meat food products, other pathogens or even contamination by cross contact with allergenic foods.
Keywords: bacteriocins, cross-contamination, mathematical model, temperature
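A minimal numerical sketch of a model of this shape follows: bacterial density N, bacteriocin concentration B and medium pH P, with a temperature-dependent growth rate. All functional forms and parameter values are assumptions for illustration; the abstract does not give the authors' actual equations.

```python
from scipy.integrate import solve_ivp

# Three-state sketch: N (bacteria), B (bacteriocin), P (pH); every
# form and value below is an assumption, not the authors' model.
def mu(temp_c: float) -> float:
    """Assumed square-root (Ratkowsky-type) growth rate, per hour."""
    return max(0.0, 0.03 * (temp_c - 4.0)) ** 2

def rhs(t, y, temp_c):
    n, b, p = y
    growth = mu(temp_c) * n * (1.0 - n / 1e9)  # logistic growth toward K = 1e9
    kill = 0.5 * b * n                          # bacteriocin inhibition
    dn = growth - kill
    db = 1e-10 * n - 0.05 * b                   # release by cells minus decay
    dp = -1e-10 * growth                        # acidification tied to growth
    return [dn, db, dp]

sol = solve_ivp(rhs, (0.0, 48.0), [1e3, 0.0, 6.5], args=(15.0,), max_step=0.1)
print(f"N after 48 h at 15 °C: {sol.y[0, -1]:.3g}")
```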
Procedia PDF Downloads 144
27 Kidnapping of Migrants by Drug Cartels in Mexico as a New Trend in Contemporary Slavery
Authors: Itze Coronel Salomon
Abstract:
The rise of organized crime and violence related to drug cartels in Mexico has created serious challenges for the authorities in providing security to those who live within its borders. However, essential to achieving a significant improvement in security is absolute respect for fundamental human rights by the authorities. Irregular migrants in Mexico are at serious risk of abuse. Research by Amnesty International, as well as reports of the NHRC (National Human Rights Commission) in Mexico, has indicated the major humanitarian crisis faced by thousands of migrants traveling in the shadows. However, the true extent of the problem remains invisible to the general population. The fact that federal and state governments keep no proper record of abuse and do not publish reliable data contributes to ignorance and misinformation, often spread by media that portray migrants as the source of crime rather than its victims. Discrimination and intolerance against irregular migrants can generate greater hostility and exclusion. According to the modus operandi that has been recorded, criminal organizations and criminal groups linked to drug trafficking structures deprive migrants of their liberty for forced labor and illegal activities related to drug trafficking; some have even been kidnapped to be trained as murderers. If the victims or their families cannot pay the ransom, the kidnapped person may suffer torture, mutilation and amputation of limbs, or death. Migrant women are also victims of sexual abuse during their abduction. In 2011, at least 177 bodies were identified in the largest mass grave found in Mexico, located in the town of San Fernando, in the border state of Tamaulipas; most of the victims were killed with blunt instruments, and most seemed to be immigrants and travelers passing through the country. With dozens of small graves discovered in northern Mexico, this may suggest a change in tactics among organized crime groups toward different means of obtaining revenue and lower-profile methods of murder. Competition and conflict over territorial control of drug trafficking can provide strong incentives for organized crime groups to send signals of violence to the authorities and rival groups. However, as some Mexican organized crime groups increasingly look to extract revenue from vulnerable groups, such as Central American migrants, they seem less interested in advertising their work to authorities and others, and more interested in evading detection and confrontation. This paper aims to analyze the place of this new trend of drug cartels kidnapping migrants for forced labor in Mexico among the forms of contemporary slavery, and its implications.
Keywords: international law, migration, transnational organized crime
Procedia PDF Downloads 416
26 An Approach to Determine the in Transit Vibration to Fresh Produce Using Long Range Radio (LORA) Wireless Transducers
Authors: Indika Fernando, Jiangang Fei, Roger Stanely, Hossein Enshaei
Abstract:
Ever-increasing consumer demand for quality fresh produce has multiplied the pressure on post-harvest supply chains in recent years. Mechanical injury to fresh produce is a critical factor in produce wastage, especially as supply chains physically extend to thousands of miles. The impact of vibration damage in transit was identified as a specific area of focus, as it results in the wastage of a significant portion of fresh produce, at times ranging from 10% to 40% in some countries. Several studies have concentrated on quantifying the impact of vibration on fresh produce, but collecting vibration impact data continuously has been a challenge due to limitations in the battery life or memory capacity of the devices. Therefore, study samples have been limited to a stretch of the transit passage or a limited time of the journey. This may or may not give an accurate understanding of the vibration impacts encountered throughout the transit passage, which limits the accuracy of the results. Consequently, an approach that can extend the capacity and ability to determine vibration signals over the whole transit passage would contribute to accurately analyzing vibration damage along the post-harvest supply chain. A mechanism was developed to address this challenge, capable of measuring in-transit vibration continuously throughout the transit passage, subject to a minimum acceleration threshold (0.1 g). A system consisting of six tri-axial vibration transducers installed at different locations inside the cargo (produce) pallets in the truck transmits vibration signals through LORA (Long Range Radio) technology to a central device installed inside the container. The central device processes and records the vibration signals transmitted by the portable transducers, along with the GPS location. This method economizes the power consumption of the portable transducers, maximizing the capability of measuring the vibration impacts over transit passages extending to days in the distribution process. The trial tests conducted using the approach reveal that it is a reliable method to measure and quantify in-transit vibrations along the supply chain. The GPS capability enables the identification of the locations in the supply chain where significant vibration impacts were encountered. This method contributes to determining the causes, susceptibility and intensity of vibration impact damage to fresh produce in the post-harvest supply chain. More broadly, the approach could be used to determine vibration impacts not only for fresh produce but for products in any supply chain that may extend from a few hours to several days in transit.
Keywords: post-harvest, supply chain, wireless transducers, LORA, fresh produce
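The 0.1 g wake-up rule described above can be sketched as a simple gate on the magnitude of each tri-axial sample. The snippet below assumes gravity-compensated (dynamic) acceleration and stubs the radio out as a hypothetical callable; none of these names come from the actual firmware.

```python
import math

# Threshold gate sketch; `send` stands in for a hypothetical radio interface.
THRESHOLD_G = 0.1

def magnitude_g(ax: float, ay: float, az: float) -> float:
    return math.sqrt(ax * ax + ay * ay + az * az)

def maybe_record(sample, send) -> None:
    """Forward a (t, ax, ay, az) sample only if it crosses the threshold."""
    t, ax, ay, az = sample
    if magnitude_g(ax, ay, az) >= THRESHOLD_G:
        send({"t": t, "g": (ax, ay, az)})

log = []
maybe_record((0.0, 0.02, 0.01, 0.15), log.append)  # above threshold: kept
maybe_record((1.0, 0.01, 0.02, 0.03), log.append)  # below threshold: dropped
print(log)
```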
Procedia PDF Downloads 265
25 Displaying Compostela: Literature, Tourism and Cultural Representation, a Cartographic Approach
Authors: Fernando Cabo Aseguinolaza, Víctor Bouzas Blanco, Alberto Martí Ezpeleta
Abstract:
Santiago de Compostela became a stable object of literary representation during the period between 1840 and 1915, approximately. This study offers a partial cartographical look at this process, suggesting that a cultural space like Compostela’s becoming an object of literary representation paralleled the first stages of its becoming a tourist destination. We use maps as a method of analysis to show the interaction between a corpus of novels and the emerging tradition of tourist guides on Compostela during the selected period. Often, the novels constitute ways to present a city to the outside, marking it for the gaze of others, as guidebooks do. That leads us to examine the ways of constructing and rendering communicable the local in other contexts. For that matter, we should also acknowledge the fact that a good number of the narratives in the corpus evoke the representation of the city through the figure of one who comes from elsewhere: a traveler, a student or a professor. The guidebooks coincide in this with the emerging fiction, of which the mimesis of a city is a key characteristic. The local cannot define itself except through a process of symbolic negotiation, in which recognition and self-recognition play important roles. Cartography shows some of the forms that these processes of symbolic representation take through the treatment of space. The research uses GIS to find significant models of representation. We used the program ArcGIS for the mapping, defining the databases starting from an adapted version of the methodology applied by Barbara Piatti and Lorenz Hurni’s team at the University of Zurich. First, we designed maps that emphasize the peripheral position of Compostela from a historical and institutional perspective using elements found in the texts of our corpus (novels and tourist guides). Second, other maps delve into the parallels between recurring techniques in the fictional texts and characteristic devices of the guidebooks (sketching itineraries and the selection of zones and indexicalization), like a foreigner’s visit guided by someone who knows the city or the description of one’s first entrance into the city’s premises. Last, we offer a cartography that demonstrates the connection between the best known of the novels in our corpus (Alejandro Pérez Lugín’s 1915 novel La casa de la Troya) and the first attempt to create package tourist tours with Galicia as a destination, in a joint venture of Galician and British business owners, in the years immediately preceding the Great War. Literary cartography becomes a crucial instrument for digging deeply into the methods of cultural production of places. Through maps, the interaction between discursive forms seemingly so far removed from each other as novels and tourist guides becomes obvious and suggests the need to go deeper into a complex process through which a city like Compostela becomes visible on the contemporary cultural horizon.
Keywords: compostela, literary geography, literary cartography, tourism
Procedia PDF Downloads 392
24 Skull Extraction for Quantification of Brain Volume in Magnetic Resonance Imaging of Multiple Sclerosis Patients
Authors: Marcela De Oliveira, Marina P. Da Silva, Fernando C. G. Da Rocha, Jorge M. Santos, Jaime S. Cardoso, Paulo N. Lisboa-Filho
Abstract:
Multiple Sclerosis (MS) is an immune-mediated disease of the central nervous system characterized by neurodegeneration, inflammation, demyelination, and axonal loss. Magnetic resonance imaging (MRI), due to the richness of the information it provides, is the gold standard exam for diagnosis and follow-up of neurodegenerative diseases such as MS. Brain atrophy, the gradual loss of brain volume, is quite extensive in multiple sclerosis, nearly 0.5-1.35% per year, far beyond the limits of normal aging. Thus, brain volume quantification becomes an essential task for future analysis of the occurrence of atrophy. The analysis of MRI has become a tedious and complex task for clinicians, who have to manually extract important information. This manual analysis is prone to errors and is time-consuming due to intra- and inter-operator variability. Nowadays, computerized methods for MRI segmentation have been extensively used to assist doctors in quantitative analyses for disease diagnosis and monitoring. Thus, the purpose of this work was to evaluate the brain volume in MRI of MS patients. We used MRI scans with 30 slices from five patients diagnosed with multiple sclerosis according to the McDonald criteria. The computational analysis of the images was carried out in two steps: segmentation of the brain and brain volume quantification. The first image processing step was to perform brain extraction by skull stripping the original image. In the skull stripper for MRI images of the brain, the algorithm registers a grayscale atlas image to the grayscale patient image. The associated brain mask is propagated using the registration transformation. Then this mask is eroded and used for a refined brain extraction based on level sets (edge of the brain-skull border with dedicated expansion, curvature, and advection terms). In the second step, brain volume quantification was performed by counting the voxels belonging to the segmentation mask and converting the count to cc. We observed an average brain volume of 1469.5 cc. We conclude that the automatic method applied in this work can be used for the brain extraction process and brain volume quantification in MRI. The development and use of computer programs can help health professionals in the diagnosis and monitoring of patients with neurodegenerative diseases. In future works, we expect to implement more automated methods for the assessment of cerebral atrophy and the quantification of brain lesions, including machine-learning approaches. Acknowledgements: This work was supported by a grant from the Brazilian agency Fundação de Amparo à Pesquisa do Estado de São Paulo (number 2019/16362-5).
Keywords: brain volume, magnetic resonance imaging, multiple sclerosis, skull stripper
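The quantification step described above (counting mask voxels and converting to cc) is compact enough to sketch. The snippet below assumes a NIfTI brain mask and uses nibabel as the I/O library; the file name is a placeholder and any NIfTI reader would serve equally well.

```python
import numpy as np
import nibabel as nib  # assumed I/O library; not named by the authors

# Count voxels inside the skull-stripping mask and convert to cubic
# centimetres using the voxel spacing stored in the image header.
img = nib.load("brain_mask.nii.gz")              # placeholder file name
mask = img.get_fdata() > 0
voxel_mm3 = np.prod(img.header.get_zooms()[:3])  # mm^3 per voxel
volume_cc = mask.sum() * voxel_mm3 / 1000.0      # 1 cc = 1000 mm^3
print(f"brain volume: {volume_cc:.1f} cc")
```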
Procedia PDF Downloads 146
23 Food Safety in Wine: Removal of Ochratoxin A in Contaminated White Wine Using Commercial Fining Agents
Authors: Antònio Inês, Davide Silva, Filipa Carvalho, Luís Filipe-Riberiro, Fernando M. Nunes, Luís Abrunhosa, Fernanda Cosme
Abstract:
The presence of mycotoxins in foodstuffs is a matter of concern for food safety. Mycotoxins are toxic secondary metabolites produced by certain molds, ochratoxin A (OTA) being one of the most relevant. Wines can also be contaminated with these toxicants, and several authors have demonstrated the presence of mycotoxins in wine, especially ochratoxin A. Its chemical structure is a dihydro-isocoumarin connected at the 7-carboxy group to a molecule of L-β-phenylalanine via an amide bond. As these toxicants can never be completely removed from the food chain, many countries have defined maximum levels in food in order to address health concerns. OTA contamination of wines might be a risk to consumer health, thus requiring treatments to achieve acceptable standards for human consumption. The maximum acceptable level of OTA in wines is 2.0 μg/kg according to Commission Regulation No. 1881/2006. Therefore, the aim of this work was to reduce OTA to safer levels using different fining agents and to assess their impact on white wine physicochemical characteristics. To evaluate their efficiency, 11 commercial fining agents (mineral, synthetic, and animal and vegetable proteins) were used to explore new approaches to OTA removal from white wine. Trials (including a control without the addition of a fining agent) were performed in white wine artificially supplemented with OTA (10 µg/L). OTA analyses were performed after wine fining. The wine was centrifuged at 4000 rpm for 10 min, and 1 mL of the supernatant was collected and mixed with an equal volume of acetonitrile/methanol/acetic acid (78:20:2 v/v/v). Also, the solid fractions obtained after fining were centrifuged (4000 rpm, 15 min), the resulting supernatant discarded, and the pellet extracted with 1 mL of the above solution and 1 mL of H2O. OTA analysis was performed by HPLC with fluorescence detection. The most effective fining agent in removing OTA (80%) from white wine was a commercial formulation containing gelatin, bentonite and activated carbon. Removals between 10% and 30% were obtained with potassium caseinate, yeast cell walls and pea protein. With bentonites, carboxymethylcellulose, polyvinylpolypyrrolidone and chitosan, no considerable OTA removal was verified. Subsequently, the effectiveness of seven commercial activated carbons was also evaluated and compared with the commercial formulation containing gelatin, bentonite and activated carbon. The different activated carbons were applied at the concentrations recommended by the manufacturers in order to evaluate their efficiency in reducing OTA levels. The trials and OTA analysis were performed as explained previously. The results showed that in white wine all activated carbons except one removed 100% of the OTA, whereas the commercial formulation containing gelatin, bentonite and activated carbon removed only 73% of the OTA concentration. These results may provide useful information for winemakers, namely for the selection of the most appropriate oenological product for OTA removal, reducing wine toxicity and simultaneously enhancing food safety and wine quality.
Keywords: wine, OTA removal, food safety, fining
Procedia PDF Downloads 538
22 Mathematical Modelling of Bacterial Growth in Products of Animal Origin in Storage and Transport: Effects of Temperature, Use of Bacteriocins and pH Level
Authors: Benjamin Castillo, Luis Pastenes, Fernando Cordova
Abstract:
Pathogen growth in animal source foods is a common problem in the food industry, causing monetary losses due to the spoiling of products or food intoxication outbreaks in the community. In this sense, the quality of the product is reflected by the population of deteriorating agents present in it, which are mainly bacteria. The factors likely associated with freshness in animal source foods are temperature and processing, storage, and transport times. However, the level of deterioration of products depends, in turn, on the characteristics of the bacterial population causing the decomposition or spoiling, such as pH level and toxins. Knowing the growth dynamics of the agents involved in product contamination allows monitoring for more efficient processing. This means better quality and reasonable costs, along with a better estimation of the time and temperature intervals necessary for transport and storage in order to preserve product quality. The objective of this project is to design a secondary model that measures the impact of temperature on bacterial growth, together with the competition for pH adequacy and the release of bacteriocins, in order to describe this phenomenon and thus estimate food product half-life with the least possible risk of deterioration or spoiling. In order to achieve this objective, the authors propose the analysis of a three-dimensional ordinary differential equation system which includes: logistic bacterial growth extended by the inhibitory action of bacteriocins, including the effect of the medium pH; change in the medium pH levels through an adaptation of the Luedeking-Piret kinetic model; and bacteriocin concentration modeled similarly to the pH levels. These three dimensions are influenced by the temperature at all times. This differential system is then expanded to take into consideration variable temperature and the concentration of pulsed bacteriocins, which represent characteristics inherent in the modeled scenario, such as transport and storage, as well as the incorporation of substances that inhibit bacterial growth. The main results show that temperature changes in an early stage of transport increased the bacterial population significantly more than if the temperature had increased during the final stage. On the other hand, the incorporation of bacteriocins, as in other investigations, proved to be efficient in the short and medium term, since, although the population of bacteria decreased, once the bacteriocins were depleted or degraded over time, the bacteria eventually returned to their regular growth rate. The efficacy of the bacteriocins decreased slightly at low temperatures, consistent with the fact that their natural degradation rate also decreased. In summary, the implementation of the mathematical model allowed the simulation of a set of possible bacteria present in animal-based products, along with their properties, in various transport and storage situations, which led us to state that the optimum for inhibiting bacterial growth is the combination of low constant temperatures and the initial use of bacteriocins.
Keywords: bacterial growth, bacteriocins, mathematical modelling, temperature
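One plausible rendering of the three-dimensional system described above is sketched below, with N the bacterial population, P the medium pH, B the bacteriocin concentration and u(t) the pulsed dosing; every coefficient may depend on the temperature T. The logistic term carries the growth, f(P) lets pH modulate the bacteriocin kill rate, and the pH equation follows the Luedeking-Piret growth-associated plus non-growth-associated form. These functional forms are our reading of the abstract, not the authors' published equations.

```latex
% Schematic reading of the model structure; not the authors' equations.
\[
\begin{aligned}
\frac{dN}{dt} &= \mu(T)\,N\left(1-\frac{N}{K}\right) - k(T)\,f(P)\,B\,N,\\
\frac{dP}{dt} &= -\left(\alpha\,\frac{dN}{dt} + \beta(T)\,N\right),\\
\frac{dB}{dt} &= \gamma(T)\,N - \delta(T)\,B + u(t).
\end{aligned}
\]
```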
Procedia PDF Downloads 135
21 Application of Typha domingensis Pers. in Artificial Floating for Sewage Treatment
Authors: Tatiane Benvenuti, Fernando Hamerski, Alexandre Giacobbo, Andrea M. Bernardes, Marco A. S. Rodrigues
Abstract:
Population growth in urban areas has caused damage to the environment, a consequence of the uncontrolled dumping of domestic and industrial wastewater. The capacity of some plants to purify domestic and agricultural wastewater has been demonstrated by several studies. Since natural wetlands have the ability to transform, retain and remove nutrients, constructed wetlands have been used for wastewater treatment. They are widely recognized as an economical, efficient and environmentally acceptable means of treating many different types of wastewater. T. domingensis Pers. has shown good performance and low deployment cost in extracting, detoxifying and sequestering pollutants. Constructed Floating Wetlands (CFWs) consist of emergent vegetation established upon a buoyant structure, floating on surface waters. The upper parts of the vegetation grow and remain primarily above the water level, while the roots extend down into the water column, developing an extensive underwater root system. Thus, the vegetation grows hydroponically, performing direct nutrient uptake from the water column. Biofilm attaches to the roots and rhizomes, and as physical and biochemical processes take place, the system functions as a natural filter. The aim of this study is to assess the application of macrophytes in artificial floating systems for the treatment of domestic sewage in southern Brazil. The T. domingensis Pers. plants were placed in a flotation system (polymer structure), at full scale, in a sewage treatment plant. The sewage feed rate was 67.4 m³.d⁻¹ ± 8.0, and the hydraulic retention time was 11.5 d ± 1.3. This CFW treats the sewage generated by 600 inhabitants, which corresponds to 12% of the population served by this municipal treatment plant. During 12 months, samples were collected every two weeks in order to evaluate parameters such as chemical oxygen demand (COD), biochemical oxygen demand in 5 days (BOD5), total Kjeldahl nitrogen (TKN), total phosphorus, total solids, and metals. The average removal of organic matter was around 55% for both COD and BOD5. For nutrients, TKN was reduced by 45.9%, similar to the total phosphorus removal, while for total solids the reduction was 33%. Among the metals, aluminum, copper, and cadmium, although present at low concentrations, showed the highest percentage reductions, 82.7%, 74.4% and 68.8%, respectively. Chromium, iron, and manganese removals reached values around 40-55%. The use of T. domingensis Pers. in artificial floating systems for sewage treatment is an effective and innovative alternative for Brazilian sewage treatment systems. The evaluation of additional parameters in the treatment system may give useful information for improving the removal efficiency and increasing the quality of the receiving water bodies.
Keywords: constructed wetland, floating system, sewage treatment, Typha domingensis Pers.
Procedia PDF Downloads 210
20 Contribution to the Study of Automatic Epileptiform Pattern Recognition in Long Term EEG Signals
Authors: Christine F. Boos, Fernando M. Azevedo
Abstract:
The electroencephalogram (EEG) is a record of the electrical activity of the brain that has many applications, such as monitoring alertness, coma and brain death; locating damaged areas of the brain after head injury, stroke or tumor; monitoring anesthesia depth; researching physiology and sleep disorders; and researching epilepsy and localizing the seizure focus. Epilepsy is a chronic condition, or a group of diseases of high prevalence, still poorly explained by science and whose diagnosis is still predominantly clinical. The EEG recording is considered an important test for epilepsy investigation, and its visual analysis is very often applied for clinical confirmation of the epilepsy diagnosis. Moreover, this EEG analysis can also be used to help define the type of epileptic syndrome, determine the epileptiform zone, assist in the planning of drug treatment and provide additional information about the feasibility of surgical intervention. In the context of diagnosis confirmation, the analysis is made using long-term EEG recordings at least 24 hours long, acquired with a minimum of 24 electrodes, in which the neurophysiologists perform a thorough visual evaluation of EEG screens in search of specific electrographic patterns called epileptiform discharges. Considering that EEG screens usually display 10 seconds of the recording, the neurophysiologist has to evaluate 360 screens per hour of EEG, or a minimum of 8,640 screens per long-term EEG recording. Analyzing thousands of EEG screens in search of patterns that have a maximum duration of 200 ms is a very time-consuming, complex and exhausting task. Because of this, over the years several studies have proposed automated methodologies that could facilitate the neurophysiologists’ task of identifying epileptiform discharges, and a large number of these methodologies used neural networks for pattern classification. One of the differences between these methodologies is the type of input stimuli presented to the networks, i.e., how the EEG signal is introduced to the network. Five types of input stimuli have commonly been found in the literature: the raw EEG signal, morphological descriptors (i.e., parameters related to the signal’s morphology), the Fast Fourier Transform (FFT) spectrum, Short-Time Fourier Transform (STFT) spectrograms and Wavelet Transform features. This study evaluates the application of these five types of input stimuli and compares the classification results of neural networks implemented using each of these inputs. The performance using the raw signal varied between 43% and 84% efficiency. The results for the FFT spectrum and the STFT spectrograms were quite similar, with average efficiencies of 73% and 77%, respectively. The efficiency of Wavelet Transform features varied between 57% and 81%, while the descriptors presented efficiency values between 62% and 93%. After the simulations, we observed that the best results were achieved when either morphological descriptors or Wavelet features were used as input stimuli.
Keywords: artificial neural network, electroencephalogram signal, pattern recognition, signal processing
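As an illustration of how two of these input representations are typically computed from a single EEG channel, a short sketch follows. The sampling rate and window length are assumptions, since the abstract does not specify them.

```python
import numpy as np
from scipy.signal import stft

FS = 256  # Hz, assumed sampling rate

def fft_spectrum(segment: np.ndarray) -> np.ndarray:
    """One-sided FFT magnitude spectrum of an EEG segment."""
    return np.abs(np.fft.rfft(segment))

def stft_spectrogram(segment: np.ndarray) -> np.ndarray:
    """Magnitude STFT with an assumed 0.5 s window (scipy's default Hann taper)."""
    _, _, z = stft(segment, fs=FS, nperseg=FS // 2)
    return np.abs(z)

segment = np.random.randn(10 * FS)  # placeholder 10 s "EEG screen"
print(fft_spectrum(segment).shape, stft_spectrogram(segment).shape)
```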
Procedia PDF Downloads 528
19 Kinetic Evaluation of Sterically Hindered Amines under Partial Oxy-Combustion Conditions
Authors: Sara Camino, Fernando Vega, Mercedes Cano, Benito Navarrete, José A. Camino
Abstract:
Carbon capture and storage (CCS) technologies should play a relevant role towards low-carbon systems in the European Union by 2030. Partial oxy-combustion emerges as a promising CCS approach to mitigate anthropogenic CO₂ emissions. Its advantages with respect to other CCS technologies rely on the production of a more CO₂-concentrated flue gas than those provided by conventional air-firing processes. The presence of more CO₂ in the flue gas increases the driving force in the separation process and hence might lead to further reductions in the energy requirements of the overall CO₂ capture process. A more CO₂-concentrated flue gas should enhance CO₂ capture by chemical absorption in terms of solvent kinetics and CO₂ cyclic capacity. These have an impact on the performance of the overall CO₂ absorption process by reducing the solvent flow-rate required for a specific CO₂ removal efficiency. Lower solvent flow-rates decrease the reboiler duty during the regeneration stage and also reduce equipment size and pumping costs. Moreover, R&D activities in this field are focused on novel solvents and blends that provide lower CO₂ absorption enthalpies and therefore lower energy penalties associated with solvent regeneration. In this respect, sterically hindered amines are considered potential solvents for CO₂ capture: they provide a low energy requirement during the regeneration process due to their molecular structure. However, their absorption kinetics are slow, and they must be promoted by blending with faster solvents such as monoethanolamine (MEA) and piperazine (PZ). In this work, the kinetic behavior of two sterically hindered amines was studied under partial oxy-combustion conditions and compared with MEA. A lab-scale semi-batch reactor was used. The CO₂ composition of the synthetic flue gas varied from 15%v/v, typical of conventional coal combustion, to 60%v/v, the maximum CO₂ concentration allowable for optimal partial oxy-combustion operation. The first solvent, 2-amino-2-methyl-1-propanol (AMP), showed a hybrid behavior with fast kinetics and a low enthalpy of CO₂ absorption. The second solvent was isophoronediamine (IF), which has a steric hindrance in one of its amino groups; its free amino group increases its cyclic capacity. In general, the presence of a higher CO₂ concentration in the flue gas accelerated the CO₂ absorption phenomena, producing higher CO₂ absorption rates. In addition, the evolution of the CO₂ loading also exhibited higher values in the experiments using the more CO₂-concentrated flue gas. The steric hindrance gives this solvent a hybrid behavior, between fast and slow kinetic solvents. The kinetic rates observed in all the experiments carried out with AMP were higher than those of MEA but lower than those of IF. The kinetic enhancement experienced by AMP at high CO₂ concentration is slightly over 60%, compared with 70-80% for IF. AMP also improved its CO₂ absorption capacity by 24.7% from 15%v/v to 60%v/v, almost double the improvement achieved by MEA. In the IF experiments, the CO₂ loading increased by around 10% from 15%v/v to 60%v/v CO₂, changing from 1.10 to 1.34 mole CO₂ per mole of solvent, an increase of more than 20%. This hybrid kinetic behavior makes AMP and IF promising solvents for partial oxy-combustion applications.
Keywords: absorption, carbon capture, partial oxy-combustion, solvent
Procedia PDF Downloads 190
18 A 500 MWₑ Coal-Fired Power Plant Operated under Partial Oxy-Combustion: Methodology and Economic Evaluation
Authors: Fernando Vega, Esmeralda Portillo, Sara Camino, Benito Navarrete, Elena Montavez
Abstract:
The European Union aims to strongly reduce its CO₂ emissions from the energy and industrial sectors by 2030. The energy sector contributes more than two-thirds of the CO₂ emission share derived from anthropogenic activities. Although efforts are mainly focused on the use of renewables in energy production, carbon capture and storage (CCS) remains a frontline option to reduce CO₂ emissions from industrial processes, particularly from fossil-fuel power plants and cement production. Among the most feasible and near-to-market CCS technologies, namely post-combustion and oxy-combustion, partial oxy-combustion is a novel concept that can potentially reduce the overall energy requirements of the CO₂ capture process. This technology consists of using a higher oxygen content in the oxidizer, which increases the CO₂ concentration of the flue gas once the fuel is burnt. The CO₂ is then separated from the flue gas downstream by means of a conventional CO₂ chemical absorption process. Producing a more CO₂-concentrated flue gas should enhance the CO₂ absorption into the solvent, leading to further improvements in the CO₂ separation performance in terms of solvent flow-rate, equipment size, and the energy penalty related to solvent regeneration. This work evaluates a portfolio of CCS technologies applied to fossil-fuel power plants. For this purpose, an economic evaluation methodology was developed in detail to determine the main economic parameters of CO₂ emission removal, such as the levelized cost of electricity (LCOE) and the CO₂ captured and avoided costs. ASPEN Plus™ software was used to simulate the main units of the power plant and solve the energy and mass balances. Capital and investment costs were determined from the purchased cost of equipment, together with engineering costs and project and process contingencies. The annual capital cost and the operating and maintenance costs were then obtained. A complete energy balance was performed to determine the net power produced in each case. The baseline case is a supercritical 500 MWₑ coal-fired power plant using anthracite as fuel, without any CO₂ capture system. Four cases were proposed: conventional post-combustion capture, oxy-combustion, and partial oxy-combustion using two levels of oxygen-enriched air (40%v/v and 75%v/v). CO₂ chemical absorption with monoethanolamine (MEA) was used as the CO₂ separation process, whereas the O₂ requirement was met using a conventional air separation unit (ASU) based on Linde's cryogenic process. Results showed a 15% reduction in the total investment cost of the CO₂ separation process when partial oxy-combustion was used. Oxygen-enriched air production also reduced the investment costs required for the ASU by almost half in comparison with the oxy-combustion cases. Partial oxy-combustion has a significant impact on the performance of both CO₂ separation and O₂ production technologies, and it can lead to further energy reductions using new developments in both CO₂ and O₂ separation processes.
Keywords: carbon capture, cost methodology, economic evaluation, partial oxy-combustion
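For reference, the cost of CO₂ avoided is conventionally derived from the LCOE and specific emissions of the capture and reference plants. The sketch below shows that standard relation in Python; the numbers are hypothetical placeholders, not the study's results:

```python
def co2_avoided_cost(lcoe_ref, lcoe_cc, e_ref, e_cc):
    """Cost of CO2 avoided (EUR/tCO2).

    lcoe_ref, lcoe_cc: levelized cost of electricity (EUR/MWh) of the
    reference plant and the capture plant; e_ref, e_cc: specific CO2
    emissions (tCO2/MWh) of the two plants.
    """
    return (lcoe_cc - lcoe_ref) / (e_ref - e_cc)

# Hypothetical figures for a 500 MWe coal plant:
print(co2_avoided_cost(lcoe_ref=60.0, lcoe_cc=95.0, e_ref=0.80, e_cc=0.10))
# -> 50.0 EUR per tonne of CO2 avoided
```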
Procedia PDF Downloads 147
17 Seasonal Variability of Picoeukaryotes Community Structure Under Coastal Environmental Disturbances
Authors: Benjamin Glasner, Carlos Henriquez, Fernando Alfaro, Nicole Trefault, Santiago Andrade, Rodrigo De La Iglesia
Abstract:
A central question in ecology concerns the relative importance of local-scale variables for community composition when compared with regional-scale variables. In coastal environments, a strong seasonal abiotic influence dominates these systems, weakening the impact of other parameters such as micronutrients. Since the industrial revolution, micronutrients such as trace metals have increased in the ocean as pollutants, with strong effects on biotic entities and biological processes in coastal regions. Coastal picoplankton communities have been characterized as a cyanobacteria-dominated fraction, but in recent years the eukaryotic component of this size fraction has gained relevance due to its strong influence on the carbon cycle, although its diversity patterns and responses to disturbances remain poorly understood. South Pacific upwelling coastal environments represent an excellent model for studying seasonal changes due to the strong seasonal differences in the availability of macro- and micronutrients. In addition, some well-constrained coastal bays of this region have been subjected to strong disturbances from trace metal inputs. In this study, we aim to compare the influence of seasonality and trace metal concentrations on the community structure of planktonic picoeukaryotes. To describe seasonal patterns in the study area, a six-year satellite time series was analyzed and in-situ measurements were performed with a traditional oceanographic approach (CTDO equipment). In addition, trace metal concentrations in the same region were determined by ICP-MS analysis. For biological data collection, field campaigns were performed in 2011-2012, and the picoplankton community was described by flow cytometry and taxonomic characterization using next-generation sequencing of ribosomal genes. The relation between the abiotic and biotic components was finally determined by multivariate statistical analysis. Our data show strong seasonal fluctuations in abiotic parameters such as photosynthetically active radiation and sea surface temperature, with a clear differentiation of seasons. Trace metal analysis, however, allows a strong differentiation within the study area, dividing it into two zones based on trace metal concentrations. Biological data indicate that there are no major changes in diversity but a significant fluctuation in evenness and community structure. These changes are related mainly to regional parameters, such as temperature, but by analyzing the influence of metals on picoplankton community structure, we identify a differential response of some plankton taxa to metal pollution. We propose that some picoeukaryotic plankton groups respond differentially to metal inputs by changing their nutritional status and/or requirements under disturbance, as a derived outcome of toxic effects and tolerance.
Keywords: picoeukaryotes, plankton communities, trace metals, seasonal patterns
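The diversity/evenness distinction drawn above can be made concrete with the usual Shannon and Pielou indices; a minimal Python sketch with hypothetical OTU counts:

```python
import numpy as np

def shannon_diversity(counts):
    """Shannon index H' = -sum(p_i * ln p_i) over taxa with nonzero counts."""
    counts = np.asarray(counts, dtype=float)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log(p))

def pielou_evenness(counts):
    """Pielou's J' = H' / ln(S), with S the number of observed taxa."""
    return shannon_diversity(counts) / np.log(np.count_nonzero(counts))

# Hypothetical OTU counts for one station in two seasons: richness is
# similar, but the second community is dominated by a single taxon.
summer = [120, 80, 40, 30, 10]
winter = [250, 15, 10, 3, 2]
print(pielou_evenness(summer), pielou_evenness(winter))  # evenness drops
```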
Procedia PDF Downloads 173
16 Isoflavonoid Dynamic Variation in Red Clover Genotypes
Authors: Andrés Quiroz, Emilio Hormazábal, Ana Mutis, Fernando Ortega, Loreto Méndez, Leonardo Parra
Abstract:
The red clover root borer, Hylastinus obscurus Marsham (Coleoptera: Curculionidae), is the main insect pest associated with red clover, Trifolium pratense L. An average of 1.5 H. obscurus per plant can cause a 5.5% reduction in forage yield in two- to three-year-old pastures. Moreover, insect attack can reach 70% to 100% of the plants. To our knowledge, there is no chemical strategy for controlling this pest; therefore, alternative strategies for controlling H. obscurus are a high priority for red clover producers. One such alternative relates to the study of secondary metabolites involved in the intrinsic chemical defenses developed by plants, such as isoflavonoids. The isoflavonoids formononetin and daidzein elicit an antifeedant and a phagostimulant effect on H. obscurus, respectively. However, the dynamics of these isoflavonoids under field conditions are unknown. The main objective of this work was therefore to evaluate the variation of the antifeedant isoflavonoid formononetin, the phagostimulant isoflavonoid daidzein, and their respective glycosides over time in different ecotypes of red clover. Fourteen red clover ecotypes (8 cultivars and 6 experimental lines) were collected at INIA-Carillanca (La Araucanía, Chile). These plants were established in October 2015 under irrigated conditions. The cultivars were distributed in a randomized complete block design with three replicates. Whole plants were sampled at four time points (15 October 2016, 12 December 2016, 27 January 2017 and 16 March 2017) with a sufficient amount of soil to avoid root damage. A polar isoflavonoid fraction was obtained from 20 mg of lyophilized root tissue extracted with 2 mL of 80% MeOH for 16 h using an orbital shaker in the dark at room temperature. An aliquot of 1.4 mL of the supernatant was then evaporated, and the residue was resuspended in 300 µL of 45% MeOH. Identification and quantification of the isoflavonoids in root extracts were performed by injecting 20 µL into a Shimadzu HPLC equipped with a C-18 column. The sample was eluted with a mobile phase composed of AcOH:H₂O (1:9 v/v) as solvent A and CH₃CN as solvent B, with detection at 260 nm. The results showed that the amount of aglycones was higher than that of the respective glycosides, consistent with the flavonoid biosynthetic pathway, in which the aglycones are formed before their glycosides. The amount of formononetin was higher than that of daidzein. In roots, where H. obscurus spends most of its life cycle, the highest formononetin contents were found in the G 27, Pawera, Sabtoron High, Redqueli-INIA and Superqueli-INIA cvs. (2.1, 1.8, 1.8, 1.6 and 1.0 mg g⁻¹, respectively), and the lowest amounts of daidzein were found in Superqueli-INIA (0.32 mg g⁻¹) and in the experimental line Sel Syn Int4 (0.24 mg g⁻¹), an ecotype that also showed a high content of formononetin (0.9 mg g⁻¹). This information, associated with cultural practices, could help farmers and breeders to reduce H. obscurus in grassland by selecting ecotypes with a high content of formononetin and a low amount of daidzein in the roots of red clover plants. Acknowledgements: FONDECYT 1141245 and 11130715.
Keywords: daidzein, formononetin, isoflavonoid glycosides, Trifolium pratense
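HPLC quantification of this kind typically rests on an external calibration line (peak area versus standard concentration). A minimal Python sketch with hypothetical calibration points illustrates the interpolation step; the values are illustrative, not the study's data:

```python
import numpy as np

# Hypothetical formononetin standards (mg/mL) vs. HPLC peak area at 260 nm.
conc = np.array([0.05, 0.10, 0.25, 0.50, 1.00])
area = np.array([1.1e4, 2.2e4, 5.4e4, 1.08e5, 2.15e5])

slope, intercept = np.polyfit(conc, area, 1)  # linear least-squares fit

def quantify(peak_area, dilution=1.0):
    """Concentration (mg/mL) of the analyte in an extract from its peak area."""
    return (peak_area - intercept) / slope * dilution

print(quantify(7.8e4))  # sample interpolated on the calibration line
```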
Procedia PDF Downloads 217
15 ‘Call Before, Save Lives’: Reducing Emergency Department Visits through Effective Communication
Authors: Sandra Cardoso, Gaspar Pais, Judite Neves, Sandra Cavaca, Fernando Araújo
Abstract:
In 2021, Portugal had 63 emergency department (ED) visits per 100 people, the highest rate in Europe. While EDs provide a critical service, such high use is indicative of inappropriate and inefficient healthcare. In Portugal, all EDs use the Manchester Triage System (MTS), a clinical risk management tool that ensures patients are seen in order of clinical priority. In 2023, more than 40% of ED visits were for non-urgent conditions (blue and green) that could be better managed in primary health care (PHC), reflecting misuse of resources and a lack of health literacy. Since 2017, the country has had a phone line, SNS24 (Contact Centre of the National Health Service), for triage, counseling, and referral, available 24 hours a day, 7 days a week. The pilot project 'Call before, save lives' was implemented in the municipalities of Póvoa de Varzim and Vila do Conde (around 150,000 residents) in May 2023 by the executive board of the Portuguese Health Service, with the support of the Shared Services of the Ministry of Health and local authorities. This geographical area has short travel times, 99% of the population has a family doctor, and the region is organized as a health local unit (HLU) integrating PHC and the local hospital. The purposes of this project included raising awareness to contact SNS24 before going to an ED and directing non-urgent conditions to a family doctor, thereby reducing ED visits. The implementation of the project involved two phases. The first phase comprised: i) development of campaigns using local influencers (a fishmonger, a model, a fireman) through local institutions and media; ii) provision of a telephone installed on site to contact SNS24; iii) establishment of open consultations in PHC; iv) promotion of the use of SNS24; v) creation of acute consultations at the hospital for complex chronic patients; and vi) direct referral for home hospitalization by PHC. The results of this phase showed an excellent level of access to SNS24 and an increase in the number of users referred to the ED, with great satisfaction among users and professionals. In the second phase, initiated in January 2024, prior referral was established as an admission rule for access to the ED, except in certain situations such as trauma patients. If the patient refuses, their registration in the ED and subsequent screening in accordance with the MTS must be ensured. Non-urgent patients are not observed in the ED, provided that, according to their clinical condition, referral to PHC or to a consultation/day hospital is guaranteed through effective scheduling of an appointment for the same or the following day. In terms of results, 8 weeks after the beginning of phase 2, we observed a decrease in self-referred patients at the ED from 59% to 15% and a reduction of around 7% in ED visits. The key to this success was an effective public campaign that increased knowledge of the right use of the health system and was capable of changing behaviors.
Keywords: contact centre of the national health service, emergency department visits, public campaign, health literacy, SNS24
Procedia PDF Downloads 67
14 Enzymatic Determination of Limonene in Red Clover Genotypes
Authors: Andrés Quiroz, Emilio Hormazabal, Ana Mutis, Fernando Ortega, Manuel Chacón-Fuentes, Leonardo Parra
Abstract:
Red clover (Trifolium pratense L.) is an important forage species in temperate regions of the world. The main limitation of this species worldwide is a lack of persistence, related to the high mortality of plants due to a complex of biotic and abiotic factors that determines a life span of two or three seasons. Because of the importance of red clover in Chile, a red clover breeding program was started at the INIA Carillanca Research Center in 1989, with the main objectives of improving plant survival, forage yield, and persistence. The main criteria for selecting new varieties have been based on agronomic parameters and biotic factors. The main biotic factor associated with red clover mortality in Chile is Hylastinus obscurus (Coleoptera: Curculionidae). Both larvae and adults feed on the roots, causing weakening and subsequent death of clover plants. Pesticides have not been successful in controlling infestations of this root borer; therefore, alternative strategies for controlling this pest are a high priority for red clover producers. The role of semiochemicals in the interaction between H. obscurus and red clover plants has been widely studied by our group. Specifically, limonene, identified from red clover foliage, elicits repellency in the root borer. Limonene is generated in the plant through two independent biosynthetic pathways: the mevalonic acid pathway and the deoxyxylulose phosphate pathway. Mevalonate pathway enzymes are localized in the cytosol, whereas the deoxyxylulose phosphate pathway enzymes are found in plastids. In summary, limonene can be determined by an enzymatic bioassay using GPP as substrate and by limonene synthase expression. Therefore, the main objective of this work was to study the genetic variation of limonene in material provided by INIA's red clover breeding program. Protein extraction was carried out by homogenizing 250 mg of leaf tissue suspended in 6 mL of extraction buffer (PEG 1500, PVP-30, 20 mM MgCl₂ and antioxidants) and stirring on ice for 20 min. After centrifugation, aliquots of 2.5 mL were desalted on PD-10 columns, resulting in a final volume of 3.5 mL. Protein determination was performed according to Bradford, with BSA as a standard. Monoterpene synthase assays were performed with 50 µL of protein extract transferred into gas-tight 2 mL crimp-seal vials after the addition of 4 µL MgCl₂ and 41 µL assay buffer. The assay was started by adding 5 µL of a GPP solution, and the mixture was incubated for 30 min at 40 °C. Biosynthesized limonene was quantified in a GC equipped with a chiral column, using synthetic R- and S-limonene standards. The enzymatic production of R- and S-limonene from different Superqueli-Carillanca genotypes is shown in this work. Preliminary results showed significant differences in limonene content among the genotypes analyzed. These results constitute an important basis for selecting genotypes with a high content of this monoterpene, which is repellent to H. obscurus.
Keywords: head space, limonene enzymatic determination, red clover, Hylastinus obscurus
Procedia PDF Downloads 266
13 Parasitological Tracking of Wild Passerines in Group for the Rehabilitation of Native Fauna and Its Habitat
Authors: Catarina Ferreira Rebelo, Luis Madeira de Carvalho, Fernando González González
Abstract:
The order Passeriformes corresponds to the richest and most abundant group of birds, with approximately 6500 species, so that roughly two out of every three bird species are passerines. They are globally distributed and exhibit remarkable morphological and ecological variability. While numerous species of parasites have been identified and described in wild birds, there has been little focus on passerines. Seventeen passerines admitted to GREFA, a wildlife rehabilitation center, during October, November and December 2022 were analyzed. The species included Aegithalos caudatus, Anthus pratensis, Carduelis chloris, Certhia brachydactyla, Erithacus rubecula, Fringilla coelebs, Parus ater, Passer domesticus, Sturnus unicolor, Sylvia atricapilla, Turdus merula and Turdus philomelos. Data regarding past history were collected, and necropsies were conducted to identify the cause of death, assess body condition and determine the presence of parasites. Additionally, samples of intestinal content were collected for direct/fecal smear, flotation and sedimentation techniques. Sixteen (94.1%) passerines were positive for parasitic forms in at least one of the techniques used, including parasites detected at necropsy. Adult specimens of both sexes and tritonymphs of Monojoubertia microhylla, as well as ectoparasites of the genus Ornithonyssus, were identified. Macroscopic adult endoparasitic forms were also found during necropsies, including Diplotriaena sp., Serratospiculum sp. and Porrocaecum sp. Parasitism by coccidia was observed, with no sporulation. Additionally, eggs of nematodes from various genera were detected, such as Diplotriaena sp., Capillaria sp., Porrocaecum sp., Syngamus sp. and Strongyloides sp., as well as eggs of trematodes, specifically of the genus Brachylecithum, and cestode oncospheres whose genera were not identified. To our knowledge, the respiratory nematode Serratospiculum sp. found in this study is reported for the first time in passerines in the Iberian Peninsula, along with the application of common coprological techniques for the identification of eggs in the intestinal content. The majority of the parasites identified utilize intermediate hosts present in the diet of the passerines sampled. Furthermore, certain parasites with a direct life cycle could exert greater influence in specific scenarios, such as within nests or during the rehabilitation process in wildlife centers. These parasites may affect intraspecific competition, increase susceptibility to predators or lead to death. However, their cost to wild birds is often not clear, as individuals can harbor various parasites without significant harm. Furthermore, wild birds serve as important sources of parasites for different animal groups, including humans and other mammals. This study provides valuable insights into the parasitic fauna of these birds, serving as a cornerstone for future epidemiological investigations and enhancing our comprehension of these avian species.
Keywords: birds, parasites, passerines, wild, Spain
Procedia PDF Downloads 40
12 Automatic Identification of Pectoral Muscle
Authors: Ana L. M. Pavan, Guilherme Giacomini, Allan F. F. Alves, Marcela De Oliveira, Fernando A. B. Neto, Maria E. D. Rosa, Andre P. Trindade, Diana R. De Pina
Abstract:
Mammography is an imaging modality used worldwide to diagnose breast cancer, even in asymptomatic women. Due to their wide availability, mammograms can be used to measure breast density and to predict cancer development. Women with increased mammographic density have a four- to six-fold increase in their risk of developing breast cancer; therefore, studies have sought to quantify mammographic breast density accurately. In clinical routine, radiologists perform image evaluations through the BIRADS (Breast Imaging Reporting and Data System) assessment. However, this method has inter- and intra-individual variability. An automatic, objective method to measure breast density could relieve the radiologist's workload by providing a first opinion. However, the pectoral muscle is a high-density tissue with characteristics similar to those of fibroglandular tissue, which makes automatic quantification of mammographic breast density difficult. Therefore, a pre-processing step is needed to segment the pectoral muscle, which may otherwise be erroneously quantified as fibroglandular tissue. The aim of this work was to develop an automatic algorithm to segment and extract the pectoral muscle in digital mammograms. The database consisted of thirty medio-lateral oblique digital mammograms from the São Paulo Medical School. This study was developed with ethical approval from the authors' institutions and national review panels under protocol number 3720-2010. An algorithm was developed on the Matlab® platform for the pre-processing of images, using image processing tools to automatically segment and extract the pectoral muscle from mammograms. Firstly, a thresholding technique was applied to remove non-biological information from the image. Then, the Hough transform was applied to find the boundary of the pectoral muscle, followed by the active contour method, whose seed was placed on the pectoral muscle boundary found by the Hough transform. An experienced radiologist also performed the pectoral muscle segmentation manually. The two methods, manual and automatic, were compared using the Jaccard index and Bland-Altman statistics. The comparison presented a Jaccard similarity coefficient greater than 90% for all analyzed images, showing the efficiency and accuracy of the proposed segmentation method. The Bland-Altman statistics compared the two methods in terms of the area (mm²) of the segmented pectoral muscle and showed data within the 95% confidence interval, supporting the accuracy of the segmentation compared with the manual method. Thus, the method proved to be accurate and robust, segmenting rapidly and free from intra- and inter-observer variability. It is concluded that the proposed method may be used reliably to segment the pectoral muscle in digital mammography in clinical routine. Segmentation of the pectoral muscle is very important for further quantification of the fibroglandular tissue volume present in the breast.
Keywords: active contour, fibroglandular tissue, Hough transform, pectoral muscle
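The pipeline described above (thresholding, Hough transform, seeded active contour, Jaccard comparison) was implemented in Matlab®; purely as a sketch of the same steps, a rough Python analogue with scikit-image might look as follows. The helper names and parameters are assumptions, not the authors' code:

```python
import numpy as np
from skimage.filters import threshold_otsu
from skimage.transform import hough_line, hough_line_peaks
from skimage.segmentation import active_contour

def segment_pectoral(mammogram):
    """Sketch of the described pipeline on an MLO view (2-D float array,
    pectoral muscle assumed near-straight and bright in one corner)."""
    # 1) Thresholding: drop non-biological background (labels, air).
    breast = mammogram > threshold_otsu(mammogram)
    body = mammogram * breast

    # 2) Hough transform on strong edges to find the straight line that
    #    approximates the pectoral muscle boundary.
    edges = body > np.percentile(body[breast], 90)
    hspace, angles, dists = hough_line(edges)
    _, peak_angles, peak_dists = hough_line_peaks(hspace, angles, dists, num_peaks=1)
    angle, dist = peak_angles[0], peak_dists[0]

    # 3) Seed an active contour along that line (assumes a non-vertical
    #    Hough angle) and let it relax onto the curved muscle boundary.
    rows = np.linspace(0, mammogram.shape[0] - 1, 200)
    cols = (dist - rows * np.sin(angle)) / np.cos(angle)
    init = np.stack([rows, np.clip(cols, 0, mammogram.shape[1] - 1)], axis=1)
    return active_contour(body, init, alpha=0.01, beta=0.5)

def jaccard(mask_a, mask_b):
    """Jaccard similarity between manual and automatic masks (>90% reported)."""
    return np.logical_and(mask_a, mask_b).sum() / np.logical_or(mask_a, mask_b).sum()
```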
Procedia PDF Downloads 350
11 Performance and Limitations of Likelihood Based Information Criteria and Leave-One-Out Cross-Validation Approximation Methods
Authors: M. A. C. S. Sampath Fernando, James M. Curran, Renate Meyer
Abstract:
Model assessment, in the Bayesian context, involves evaluating goodness-of-fit and comparing several alternative candidate models for predictive accuracy and improvements. In posterior predictive checks, data simulated under the fitted model are compared with the actual data. Predictive model accuracy is estimated using information criteria such as the Akaike information criterion (AIC), the Bayesian information criterion (BIC), the deviance information criterion (DIC), and the Watanabe-Akaike information criterion (WAIC). The goal of an information criterion is to obtain an unbiased measure of out-of-sample prediction error. Since posterior checks use the data twice, once for model estimation and once for testing, a bias correction that penalises model complexity is incorporated in these criteria. Cross-validation (CV) is another method for examining out-of-sample prediction accuracy. Leave-one-out cross-validation (LOO-CV) is the most computationally expensive CV variant, as it fits as many models as there are observations. Importance sampling (IS), truncated importance sampling (TIS) and Pareto-smoothed importance sampling (PSIS) are generally used as approximations to exact LOO-CV and utilise the existing MCMC results, avoiding expensive recomputation. The reciprocals of the predictive densities calculated over the posterior draws for each observation are treated as the raw importance weights; these are in turn used to calculate the approximate LOO-CV of the observation as a weighted average of posterior densities. In IS-LOO, the raw weights are used directly; in contrast, the larger weights are replaced by modified truncated weights in calculating TIS-LOO and PSIS-LOO. Although information criteria and LOO-CV cannot reflect goodness-of-fit in an absolute sense, their differences can be used to measure the relative performance of the models of interest. However, the use of these measures is only valid under specific circumstances. This study developed 11 models using normal, log-normal, gamma, and Student's t distributions to improve PCR stutter prediction with forensic data. These models comprise four with profile-wide variances, four with locus-specific variances, and three two-component mixture models. The mean stutter ratio in each model is modeled as a locus-specific simple linear regression against a feature of the alleles under study known as the longest uninterrupted sequence (LUS). The use of AIC, BIC, DIC, and WAIC in model comparison has some practical limitations. Even though IS-LOO, TIS-LOO, and PSIS-LOO are considered approximations of exact LOO-CV, the study observed some drastic deviations in the results. However, there are some interesting relationships among the logarithms of pointwise predictive densities (lppd) calculated under WAIC and the LOO approximation methods. The estimated overall lppd is a relative measure that reflects the overall goodness-of-fit of the model. Parallel log-likelihood profiles were observed for the models, conditional on equal posterior variances in lppds. This study illustrates the limitations of the information criteria in practical model comparison problems. In addition, the relationships among the LOO-CV approximation methods and WAIC, with their limitations, are discussed. Finally, useful recommendations that may help in practical model comparisons with these methods are provided.
Keywords: cross-validation, importance sampling, information criteria, predictive accuracy
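The quantities discussed above follow directly from a matrix of pointwise log-likelihoods over posterior draws. A minimal Python sketch of WAIC and plain IS-LOO (raw reciprocal weights, no truncation or Pareto smoothing) under those definitions:

```python
import numpy as np
from scipy.special import logsumexp

def waic(log_lik):
    """WAIC from an (S draws x N observations) log-likelihood matrix."""
    S = log_lik.shape[0]
    lppd = np.sum(logsumexp(log_lik, axis=0) - np.log(S))  # log pointwise pred. density
    p_waic = np.sum(np.var(log_lik, axis=0, ddof=1))       # effective no. of parameters
    return -2 * (lppd - p_waic)

def is_loo(log_lik):
    """Plain IS-LOO: raw weights are the reciprocals of the pointwise
    predictive densities, so elpd_loo_i = -log(mean_s exp(-log_lik[s, i]))."""
    S = log_lik.shape[0]
    return np.sum(-(logsumexp(-log_lik, axis=0) - np.log(S)))

# TIS-LOO and PSIS-LOO replace the largest raw weights with truncated or
# Pareto-smoothed ones; e.g. the ArviZ library provides this as arviz.loo().
```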
Procedia PDF Downloads 392
10 Implementation of Synthesis and Quality Control Procedures of ¹⁸F-Fluoromisonidazole Radiopharmaceutical
Authors: Natalia C. E. S. Nascimento, Mercia L. Oliveira, Fernando R. A. Lima, Leonardo T. C. do Nascimento, Marina B. Silveira, Brigida G. A. Schirmer, Andrea V. Ferreira, Carlos Malamut, Juliana B. da Silva
Abstract:
Tissue hypoxia is a common characteristic of solid tumors, leading to decreased sensitivity to radiotherapy and chemotherapy. In the clinical context, tumor hypoxia assessment employing the positron emission tomography (PET) tracer ¹⁸F-fluoromisonidazole ([¹⁸F]FMISO) helps physicians plan and adjust therapy. The aim of this work was to implement the synthesis of [¹⁸F]FMISO in a TRACERlab® MXFDG module and to establish the quality control procedure. [¹⁸F]FMISO was synthesized at the Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN/Brazil) using an automated synthesizer (TRACERlab® MXFDG, GE) adapted for the production of [¹⁸F]FMISO. The FMISO chemical standard was purchased from ABX. ¹⁸O-enriched water was acquired from the Center of Molecular Research. Reagent kits containing eluent solution, acetonitrile, ethanol, 2.0 M HCl solution, buffer solution, water for injections and the [¹⁸F]FMISO precursor (dissolved in 2 mL acetonitrile) were purchased from ABX. The [¹⁸F]FMISO samples were purified by the solid-phase extraction method. The quality requirements for [¹⁸F]FMISO are established in the European Pharmacopoeia. According to that reference, quality control of [¹⁸F]FMISO should include appearance, pH, radionuclidic identity and purity, radiochemical identity and purity, chemical purity, residual solvents, bacterial endotoxins, and sterility. The duration of the synthesis process was 53 min, with a radiochemical yield of (37.00 ± 0.01)%, and the specific activity was greater than 70 GBq/µmol. The syntheses were reproducible and showed satisfactory results. Regarding the quality control analysis, the samples were clear and colorless, at pH 6.0. The emission spectrum, measured using a high-purity germanium (HPGe) detector, presented a single peak at 511 keV, and the half-life, determined by the decay method in an activimeter, was (111.0 ± 0.5) min, indicating the absence of radioactive contaminants other than the desired radionuclide (¹⁸F). The samples showed a tetrabutylammonium (TBA) concentration < 50 µg/mL, assessed by visual comparison with a TBA standard applied on the same thin-layer chromatography plate. Radiochemical purity, determined by high-performance liquid chromatography (HPLC), was 100%. Of the residual solvents tested, ethanol and acetonitrile presented concentrations lower than 10% and 0.04%, respectively. Healthy female mice were injected with [¹⁸F]FMISO via the lateral tail vein; microPET imaging studies (15 min) were performed at 2 h post-injection (p.i.), and the biodistribution was analyzed at five time points (30, 60, 90, 120 and 180 min) after injection. Subsequently, organs/tissues were assayed for radioactivity with a gamma counter. All quality control parameters met the quality criteria, confirming that [¹⁸F]FMISO is suitable for use in non-clinical and clinical trials, following the legal requirements for the production of new radiopharmaceuticals in Brazil.
Keywords: automatic radiosynthesis, hypoxic tumors, pharmacopeia, positron emitters, quality requirements
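The half-life check mentioned above (decay method) amounts to a log-linear fit of activity versus time. A minimal Python sketch with hypothetical activimeter readings:

```python
import numpy as np

# Hypothetical activimeter readings of an [18F]FMISO sample.
t = np.array([0, 20, 40, 60, 80, 100, 120])         # time, min
A = np.array([1000, 883, 779, 688, 607, 536, 473])  # activity, MBq

# Decay law A(t) = A0 * exp(-lambda*t): fit ln(A) vs. t by least squares.
lam = -np.polyfit(t, np.log(A), 1)[0]
print(f"t1/2 = {np.log(2) / lam:.1f} min")  # ~110 min, as expected for 18F
```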
Procedia PDF Downloads 193
9 Increased Stability of Rubber-Modified Asphalt Mixtures to Swelling, Expansion and Rebound Effect during Post-Compaction
Authors: Fernando Martinez Soto, Gaetano Di Mino
Abstract:
The application of rubber in bituminous mixtures requires attention and care during mixing and compaction. Rubber modifies the properties of the mixture because it reacts within the internal structure of the bitumen at high temperatures, changing the mixture's performance (an interaction process between the solvents in the binder and the rubber aggregate). The main change is an increase in the viscosity and elasticity of the binder due to the larger rubber particle sizes used in the dry process; however, this positive effect is counteracted by short mixing times, compared with the wet technology, and by the transport processes, curing time and post-compaction of the mixtures. As a result, negative effects such as swelling of the rubber particles, rebound of the specimens and thermal changes due to differential expansion of the internal structure of the mixtures can alter the mechanical properties of the rubberized blends. Based on the dry technology, different asphalt-rubber binders using devulcanized or natural rubber (truck and bus tread rubber) served to demonstrate these effects, and how to mitigate them, in two dense gap-graded rubber-modified asphalt concrete (RUMAC) mixes, in order to enhance the stability, workability and durability of samples compacted by the Superpave gyratory compactor method. This paper describes the procedures developed in the Department of Civil Engineering of the University of Palermo from September 2016 to March 2017 for characterizing the post-compaction and mix stability of one conventional mixture (hot-mix asphalt without rubber) and two gap-graded rubberized asphalt mixes, with a granulometry for rail sub-ballast layers (nominal aggregate size Ø22.4 mm) according to the European standard. The main purpose of this laboratory research is the application of ambient ground rubber from scrap tires, processed at conventional temperature (20 ºC), inside hot bituminous mixtures (160-220 ºC) as a substitute for 1.5%, 2% and 3% by weight of the total aggregates (3.2%, 4.2% and 6.2%, respectively, by volume of the limestone aggregates, whose bulk density equals 2.81 g/cm³), considered as aggregate rather than as part of the asphalt binder. The reference bituminous mixture was designed with 4% binder and ±3% air voids, manufactured with a conventional B50/70 bitumen at 160 ºC/145 ºC mixing/compaction temperatures to guarantee the workability of the mixes. The proposed rubber proportions are #60-40% for the mixtures with 1.5% and 2% of rubber, and #20-80% for the mixture with 3% of rubber (for example, 60% of Ø0.4-2 mm and 40% of Ø2-4 mm). The temperature of the asphalt cement is between 160-180 ºC for mixing and 145-160 ºC for compaction, according to the optimal viscosity values obtained with a Brookfield viscometer and 'ring and ball' and penetration tests. The crumb rubber particles act as a rubber aggregate in the mixture, with sizes between 0.4 and 2 mm in the first fraction and 2-4 mm in the second. Ambient ground rubber with a specific gravity of 1.154 g/cm³ was used; the rubber was free of loose fabric, wire, and other contaminants. Optimal results that reduce the swelling effect were obtained in real beams and cylindrical specimens for each HMA mixture. Different factors affecting the interaction process, such as temperature, rubber particle size, and the number of compaction cycles and pressures, are explained.
Keywords: crumb-rubber, gyratory compactor, rebounding effect, superpave mix-design, swelling, sub-ballast railway
Procedia PDF Downloads 243
8 Structural Fluxionality of Luminescent Coordination Compounds with Lanthanide Ions
Authors: Juliana A. B. Silva, Caio H. T. L. Albuquerque, Leonardo L. dos Santos, Cristiane K. Oliveira, Ivani Malvestiti, Fernando Hallwass, Ricardo L. Longo
Abstract:
Complexes with lanthanide ions have been extensively studied due to their applications as luminescent, magnetic and catalytic materials, as molecular or extended crystals, thin films, glasses, polymeric matrices, ionic liquids, and in solution. NMR chemical shift data in solution have been reported and suggest fluxional structures in a wide range of coordination compounds with rare earth ions. However, the fluxional mechanisms for these compounds are still not established. This structural fluxionality may affect the photophysical, catalytic and magnetic properties in solution; thus, understanding the structural interconversion mechanisms may aid the design of coordination compounds with, for instance, improved (electro)luminescence, catalytic and magnetic behaviors. The [Eu(btfa)₃bipy] complex, where btfa = 4,4,4-trifluoro-1-phenyl-1,3-butanedionate and bipy = 2,2'-bipyridyl, has a well-defined X-ray crystallographic structure, and preliminary ¹H NMR data suggested structural fluxionality. We have therefore investigated a series of coordination compounds with lanthanide ions, [Ln(btfa)₃L], where Ln = La, Eu, Gd or Yb and L = bipy or phen (phen = 1,10-phenanthroline), using a combined theoretical-experimental approach. These complexes were synthesized and fully characterized, detailed NMR measurements were obtained, and the compounds were also studied by quantum chemical computational methods (DFT-PBE0). Variable-temperature ¹H NMR measurements of the [Eu(btfa)₃L] complexes in CD₂Cl₂ suggest that these compounds have fluxional structures, because the crystal structure has non-equivalent btfa ligands that should lead to non-equivalent hydrogen atoms, and thus to more signals in the NMR spectra than those obtained at room temperature, where all hydrogen atoms of the btfa ligands are equivalent and the phen ligand has an effective vertical symmetry plane. For the [Eu(btfa)₃bipy] complex, the broadening of the signals at -70 °C provides a lower bound for the coalescence temperature, which indicates that the energy barriers involved in the structural interconversion are quite small. These barriers and, consequently, the coalescence temperature depend upon the radius of the lanthanide ion as well as its paramagnetic effects. The PBE0-calculated structures are in very good agreement with the crystallographic data and, for the [Eu(btfa)₃bipy] complex, this method provided several distinct structures with almost the same energy. However, the energy barriers for structural interconversion via dissociative pathways were found to be quite high and could not explain the experimental observations, whereas the pseudo-rotation pathways, involving the btfa and bipy ligands, have very small activation barriers, in excellent agreement with the NMR data. The results also showed an increase in the activation barrier along the lanthanide series, due to the decrease in ionic radius and the consequent increase in steric effects. TD-DFT calculations showed a dependence of the ligand donor state energy on the structure of the [Eu(btfa)₃phen] complex, which can affect the energy transfer rates and the luminescence. The energy required to promote the structural fluxionality may also enhance luminescence quenching in solution. These results can aid in the design of more luminescent compounds and more efficient devices.
Keywords: computational chemistry, lanthanide-based compounds, NMR, structural fluxionality
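The link drawn above between coalescence temperature and a small interconversion barrier can be made quantitative with the standard two-site exchange and Eyring relations (k_c = pi*delta_nu/sqrt(2), then delta_G from Eyring). A Python sketch with hypothetical NMR values, not the study's data:

```python
import numpy as np

R, kB, h = 8.314, 1.380649e-23, 6.62607015e-34  # SI units

def exchange_barrier(delta_nu_hz, t_coal_k):
    """Free-energy barrier (kJ/mol) for two-site exchange at coalescence."""
    k_c = np.pi * delta_nu_hz / np.sqrt(2)                         # rate at T_c
    return R * t_coal_k * np.log(kB * t_coal_k / (h * k_c)) / 1e3  # Eyring

# Hypothetical: two btfa 1H signals 50 Hz apart coalescing near -70 C (203 K).
print(f"dG = {exchange_barrier(50.0, 203.0):.0f} kJ/mol")  # ~41 kJ/mol, a small barrier
```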
Procedia PDF Downloads 199
7 Sentinel-2 Based Burn Area Severity Assessment Tool in Google Earth Engine
Authors: D. Madhushanka, Y. Liu, H. C. Fernando
Abstract:
Fires are one of the foremost factors of land surface disturbance in diverse ecosystems, causing soil erosion, land-cover changes and atmospheric effects that affect people's lives and property. Generally, fire severity is calculated from the Normalized Burn Ratio (NBR) index: pre- and post-fire images are preprocessed, and the bitemporal difference of the NBR (dNBR) is calculated, a procedure usually performed manually. The burnt area is then classified as either unburnt (dNBR < 0.1) or burnt (dNBR >= 0.1). Furthermore, the Wildfire Severity Assessment (WSA) classifies burnt and unburnt areas using classification levels proposed by the USGS, comprising seven classes. This procedure generates a burn severity report for an area chosen manually by the user. This study was carried out with the objective of producing an automated tool for the above-mentioned process, namely the World Wildfire Severity Assessment Tool (WWSAT). It is implemented in Google Earth Engine (GEE), a free cloud-computing platform for satellite data processing, with several data catalogs at different resolutions (notably Landsat, Sentinel-2, and MODIS) and planetary-scale analysis capabilities. Sentinel-2 MSI was chosen to provide regular acquisitions for burnt area severity mapping using a medium spatial resolution sensor (15 m). The tool uses machine learning classification techniques to identify burnt areas using the NBR and to classify their severity over the user-selected extent and period automatically. Cloud coverage is one of the biggest concerns in fire severity mapping. In WWSAT, based on GEE, we present a fully automatic workflow to aggregate cloud-free Sentinel-2 images for both pre-fire and post-fire image compositing. The parallel processing capabilities and preloaded geospatial datasets of GEE facilitated the production of this tool, which includes a Graphical User Interface (GUI) to make it user-friendly. The advantage of this tool is the ability to obtain burn area severity over large extents and extended temporal periods. Two case studies were carried out to demonstrate its performance. The Blue Mountains National Park forest, affected by the Australian fire season between 2019 and 2020, is used to describe the workflow of WWSAT. At this site, more than 7809 km² of burnt area was detected using Sentinel-2 data, with an error below 6.5% compared with the area measured in the field. Furthermore, 86.77% of the detected area was recognized as fully burnt out, comprising high severity (17.29%), moderate-high severity (19.63%), moderate-low severity (22.35%), and low severity (27.51%). The Arapaho and Roosevelt National Forests, Colorado, USA, affected by the Cameron Peak fire in 2020, were chosen for the second case study. It was found that around 983 km² had burnt out, comprising high severity (2.73%), moderate-high severity (1.57%), moderate-low severity (1.18%), and low severity (5.45%). These spots can also be detected through the visual inspection made possible by the cloud-free images generated by WWSAT. This tool is cost-effective for calculating burnt area, since satellite images are free and the cost of field surveys is avoided.
Keywords: burnt area, burnt severity, fires, google earth engine (GEE), sentinel-2
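The core dNBR computation described above is compact in the GEE Python API; a minimal sketch follows (the area of interest, dates and the 20% cloud threshold are hypothetical placeholders, not WWSAT itself):

```python
import ee
ee.Initialize()

# Hypothetical area of interest near the Blue Mountains, 2019-20 fire season.
aoi = ee.Geometry.Point([150.3, -33.6]).buffer(50000)

def cloud_free_nbr(start, end):
    """Median cloud-screened Sentinel-2 composite -> NBR = (NIR-SWIR)/(NIR+SWIR)."""
    col = (ee.ImageCollection('COPERNICUS/S2_SR')
           .filterBounds(aoi)
           .filterDate(start, end)
           .filter(ee.Filter.lt('CLOUDY_PIXEL_PERCENTAGE', 20)))
    return col.median().normalizedDifference(['B8', 'B12']).rename('NBR')

# dNBR = pre-fire NBR - post-fire NBR; burnt where dNBR >= 0.1 (as above).
dnbr = (cloud_free_nbr('2019-09-01', '2019-10-31')
        .subtract(cloud_free_nbr('2020-02-01', '2020-03-31'))
        .rename('dNBR'))
burnt = dnbr.gte(0.1)
```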
Procedia PDF Downloads 235