Search results for: Francisco Eneldo López Monteagudo
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 424

184 The Quality of Life, Situations and Emerging Concerns of Parents of Children with Neurodevelopmental Disorders in Philippine Children's Medical Center during the Covid-19 Pandemic

Authors: Annelyn Fatima Lopez, Ermenilda Avendano, Aileen Marie Vargas, Lara Baylon, Rorilee Angeles

Abstract:

BACKGROUND: The COVID-19 pandemic resulted in a public health emergency and quarantine measures which may negatively impact psychosocial and environmental aspects of vulnerable populations. OBJECTIVES: This study intended to determine the quality of life, situations and emerging concerns of parents of children with neurodevelopmental disorders during the ongoing coronavirus pandemic. METHODOLOGY: Parents of patients seen at the PCMC Neurodevelopmental Pediatrics OPD clinic were recruited to fill out questionnaires on parent and child characteristics, a survey on situations and emerging concerns during the coronavirus pandemic, and the WHOQOL-BREF (Filipino version) for parental quality of life. RESULTS: Data from 115 respondents showed a lower score in the environmental domain. The child characteristics significantly associated with QoL scores were sex and severity of ID and ADHD, while the parent characteristics significantly associated with QoL scores were educational attainment, monthly family income, father's employment status and family structure (p < 0.05). Most respondents reported physical distancing (82.61%) and curfew (80.87%) as measures implemented due to the pandemic. Inability to access essential services (43.48-74.48%) was further compounded by limited financial resources (51.30%) and limited public transport (60%). Government responses received included a quarantine pass (90.43%), food allowance or relief package (86.09%), disinfection (60.87%), DSWD-SAP (42.61%) and cash distribution (41.74%). Concerns encountered included socio-environmental issues (i.e., no available transportation, effects on the ability to earn, inadequate food/medicine rations, disruptions in basic social services) and patient concerns (i.e., access to education, medical, developmental and behavioral services, nutrition and sleep). RECOMMENDATIONS: Programs and policies should be planned accordingly to improve the quality of life of both parents and children with neurodevelopmental disorders.
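As an illustration only (not part of the original study), the kind of group comparison reported above, testing whether a WHOQOL-BREF domain score differs by a child characteristic such as sex, could be sketched as follows; the file and column names are assumptions.

```python
# Hypothetical sketch: testing whether a WHOQOL-BREF domain score differs
# between two parent groups (e.g., by child's sex). File and column names are assumed.
import pandas as pd
from scipy.stats import mannwhitneyu

df = pd.read_csv("whoqol_responses.csv")  # assumed file, one row per respondent

male = df.loc[df["child_sex"] == "M", "environmental_domain"]
female = df.loc[df["child_sex"] == "F", "environmental_domain"]

stat, p_value = mannwhitneyu(male, female, alternative="two-sided")
print(f"Mann-Whitney U = {stat:.1f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Domain scores differ significantly between groups (p < 0.05).")
```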

Keywords: covid-19, neurodevelopmental disorder, parental quality of life, whoqol-bref

Procedia PDF Downloads 177
183 Counting Fishes in Aquaculture Ponds: Application of Imaging Sonars

Authors: Juan C. Gutierrez-Estrada, Inmaculada Pulido-Calvo, Ignacio De La Rosa, Antonio Peregrin, Fernando Gomez-Bravo, Samuel Lopez-Dominguez, Alejandro Garrocho-Cruz, Jairo Castro-Gutierrez

Abstract:

Semi-intensive aquaculture in traditional earth ponds is the main rearing system in Southern Spain. These fish rearing systems account for approximately two thirds of aquatic production in this area and have made a significant contribution to the regional economy in recent years. In this type of rearing system, a crucial aspect is the correct quantification and control of fish abundance in the ponds, because the fish farmer knows how many fish he puts in the ponds but does not know how many he will harvest at the end of the rearing period. This is a consequence of mortality induced by different causes, such as pathogenic agents (parasites, viruses and bacteria) and other factors such as predation by fish-eating birds and poaching. Tracking fish abundance in these installations is very difficult because the ponds usually take up a large area of land and the management of the water flow is not automated. Therefore, there is a very high degree of uncertainty about fish abundance, which strongly hinders the management and planning of sales. A novel and non-invasive procedure to count fish in the ponds is by means of imaging sonars, particularly fixed systems and/or systems linked to aquatic vehicles such as Remotely Operated Vehicles (ROVs). In this work, a method based on census station procedures is proposed to evaluate the accuracy of fish abundance estimation using images obtained from multibeam sonars. The results indicate that it is possible to obtain a realistic approximation of the number of fish, their sizes and therefore the biomass contained in the ponds. This research is included in the framework of the KTTSeaDrones Project ('Conocimiento y transferencia de tecnología sobre vehículos aéreos y acuáticos para el desarrollo transfronterizo de ciencias marinas y pesqueras 0622-KTTSEADRONES-5-E') financed by the European Regional Development Fund (ERDF) through the Interreg V-A Spain-Portugal Programme (POCTEP) 2014-2020.
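For illustration only, and not the authors' census-station method, a minimal way to count candidate fish echoes in a single multibeam frame is intensity thresholding followed by connected-component labelling; all thresholds and data below are invented.

```python
# Illustrative sketch (not the authors' method): counting bright echoes in one
# sonar frame by intensity thresholding and connected-component labelling.
import numpy as np
from scipy import ndimage

def count_targets(frame: np.ndarray, threshold: float, min_pixels: int = 5) -> int:
    """Count connected bright regions (candidate fish echoes) in a 2D sonar frame."""
    mask = frame > threshold                      # keep strong backscatter only
    labels, n_regions = ndimage.label(mask)       # connected components
    sizes = ndimage.sum(mask, labels, range(1, n_regions + 1))
    return int(np.sum(sizes >= min_pixels))       # discard tiny noise blobs

# Example with synthetic data standing in for a multibeam image
rng = np.random.default_rng(0)
frame = rng.normal(0.0, 1.0, size=(256, 256))
frame[100:105, 50:55] += 8.0                      # one synthetic "fish" echo
print(count_targets(frame, threshold=4.0))        # -> 1
```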

Keywords: census station procedure, fish biomass, semi-intensive aquaculture, multibeam sonars

Procedia PDF Downloads 186
182 Effect of Cooking Process on the Antioxidant Activity of Different Variants of Tomato-Based Sofrito

Authors: Ana Beltran Sanahuja, A. Valdés García, Saray Lopez De Pablo Gallego, Maria Soledad Prats Moya

Abstract:

Tomato consumption has greatly increased worldwide in the last few years, mostly due to a growing demand for products like sofrito. In this sense, regular consumption of tomato-based products has been consistently associated with a reduction in the incidence of chronic degenerative diseases. Sofrito is a homemade tomato sauce typical of the Mediterranean area whose main ingredients are tomato, onion, garlic and olive oil. There are also variations of sofrito obtained by adding other spices, which bring not only color, flavor, smell and aroma but also medicinal properties, due to their antioxidant power. This protective effect has mainly been attributed to the predominant bioactive compounds present in sofrito, such as lycopene and other carotenoids as well as more than 40 different polyphenols. Regarding the cooking process, it is known that it can modify the properties and the availability of nutrients in sofrito; however, there is not enough information on this issue. For this reason, the aim of the present work is to evaluate the effect of cooking on the antioxidant capacity of different variants of tomato-based sofrito combined with other spices, through the analysis of total phenolic content (TPC) and the free-radical 2,2-diphenyl-1-picrylhydrazyl (DPPH) assay. Based on the results obtained, it can be confirmed that the basic sofrito composed of tomato, onion, garlic and olive oil and the sofrito with 1 g of added rosemary are the ones with the highest phenolic content, presenting greater antioxidant power than an industrial sofrito and than other variants with added thyme or higher amounts of garlic. Moreover, it has been observed that the tomato-based sofrito can be cooked for up to 60 minutes, since the cooking process increases the bioavailability of the carotenoids by breaking the cell walls, which weakens the binding forces between the carotenoids and increases the levels of antioxidants present, as confirmed with both the TPC and DPPH methods. It can be concluded that the cooking process of different variants of tomato-based sofrito, including spices, can improve the antioxidant capacity. The synergistic effects of different antioxidants may have a greater protective effect, also increasing the digestibility of proteins. In addition, the antioxidants help to deactivate the free radicals involved in conditions such as atherosclerosis, aging, immune suppression, cancer, and diabetes.
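As a side note, the DPPH assay mentioned above is normally expressed as a percent radical-scavenging activity; the short sketch below shows that standard calculation with invented absorbance readings.

```python
# Sketch of the standard DPPH calculation (values are made up for illustration):
# percent inhibition = (A_control - A_sample) / A_control * 100
def dpph_inhibition(a_control: float, a_sample: float) -> float:
    """Percent DPPH radical scavenging from absorbance at ~517 nm."""
    return (a_control - a_sample) / a_control * 100.0

readings = {"basic sofrito": 0.31, "rosemary 1 g": 0.29, "industrial": 0.52}
a_control = 0.84  # hypothetical DPPH blank absorbance
for name, a in readings.items():
    print(f"{name}: {dpph_inhibition(a_control, a):.1f} % inhibition")
```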

Keywords: antioxidants, cooking process, phenols, sofrito

Procedia PDF Downloads 107
181 Breeding Performance and Egg Quality of Red Jungle Fowl (Gallus gallus L.) Mated with Native Hens (Gallus gallus domesticus) in Selected Areas of Leyte under Confinement System

Authors: Francisco F. Buctot Jr.

Abstract:

This study was conducted to assess the breeding performance and egg quality traits of Red Jungle Fowls in selected areas of Leyte mated to native hens under a confinement system. A total of six Red Jungle Fowl roosters, two native roosters and 16 native hens were randomly assigned to four treatments with eight replications, each composed of one rooster and two hens, laid out in a Randomized Complete Block Design. Results on egg weight showed a highly significant difference (p < 0.01), with the heaviest eggs (39.0 g) from Native x Native and the lightest (35.75 g) from Baybay RJF x Native matings. Comparable values were obtained for number of eggs per clutch, fertility and hatchability rates, yolk and albumen weights, shell weight, egg length and width, egg shape index and yolk color score.
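Purely as an illustration of the analysis named above (not the authors' actual script), a Randomized Complete Block Design comparison of egg weight across mating treatments could be run as follows; the data file and column names are assumptions.

```python
# Hypothetical sketch of the RCBD analysis of egg weight (column names assumed):
# treatment = mating combination, block = replication group.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

df = pd.read_csv("egg_records.csv")  # assumed columns: egg_weight, treatment, block
model = ols("egg_weight ~ C(treatment) + C(block)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))  # F-test for the treatment effect (p < 0.01 reported)
```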

Keywords: egg clutch, egg shape index, native chicken, hatchability rate

Procedia PDF Downloads 339
180 Architectural Design Studio (ADS) as an Operational Synthesis in Architectural Education

Authors: Francisco A. Ribeiro Da Costa

Abstract:

Those responsible for teaching architecture must consider various ways of participating in learning, using various pedagogical tools to streamline the creative process. The Architectural Design Studio (ADS) should become a holistic, systemic process responding to the complexity of our world. This essay corresponds to a deep reflection developed by the author on the teaching of architecture. The outcomes achieved are the corollary of experimentation, discussion and application of pedagogical methods that allowed the creativity applied by students to be consolidated. The purpose is to show the conjectures that have been considered effective in creating an intellectual environment that nurtures the subject of the Architectural Design Studio (ADS) as an operational synthesis in the final stage of the degree. These assumptions, which are part of the proposed model, display theories and teaching methodologies that try to respect the learning process based on Kolb's student learning styles, ensuring their latent specificities and formulating the structure of the ADS discipline. In addition, assessment methods are proposed which consider the Architectural Design Studio as an operational synthesis in the teaching of architecture.

Keywords: teaching-learning, architectural design studio, architecture, education

Procedia PDF Downloads 364
179 Iridium-Based Bimetallic Catalysts for Hydrogen Production through Glycerol Aqueous-Phase Reforming

Authors: Francisco Espinosa, Juan Chavarría

Abstract:

Glycerol is a byproduct of biodiesel production that can be used in aqueous-phase reforming to obtain hydrogen. Iridium is a material with high activity and hydrogen selectivity for steam reforming. Nevertheless, a drawback of using iridium in aqueous-phase reforming is its low activity in the water-gas shift reaction. Therefore, this work proposes the use of nickel and copper as a second metal in the catalyst to achieve a synergetic effect. Iridium, iridium-nickel and iridium-copper catalysts were prepared by incipient wetness impregnation and evaluated in the aqueous-phase reforming of glycerol using CeO₂ or La₂O₃ as support. The catalysts were characterized by XRD, XPS, and EDX. The reactions were carried out in a fixed bed reactor fed with a solution of 10 wt% glycerol in water at 270°C, and the reaction products were analyzed by gas chromatography. It was found that IrNi/CeO₂ reached the highest glycerol conversion and hydrogen production, slightly above 70% and 43 vol%, respectively. In terms of conversion, iridium is a promising metal, and its activity for hydrogen production can be enhanced by adding a second metal.
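The two figures of merit quoted above (glycerol conversion and H₂ volume fraction in the product gas) follow from simple definitions; the sketch below illustrates them with invented inputs, not the authors' measured data.

```python
# Minimal sketch of the two figures of merit reported above, with made-up inputs.
def glycerol_conversion(moles_in: float, moles_out: float) -> float:
    """Fractional conversion of glycerol across the reactor, in percent."""
    return (moles_in - moles_out) / moles_in * 100.0

def h2_vol_percent(gas_composition: dict) -> float:
    """H2 volume percent in the dry product gas (mol% equals vol% for an ideal gas)."""
    return gas_composition["H2"] / sum(gas_composition.values()) * 100.0

print(glycerol_conversion(1.00, 0.29))                                    # ~71 % conversion
print(h2_vol_percent({"H2": 43.0, "CO2": 38.0, "CH4": 12.0, "CO": 7.0}))  # ~43 vol%
```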

Keywords: aqueous-phase reforming, glycerol, hydrogen production, iridium

Procedia PDF Downloads 292
178 Anaerobic Co-digestion in Two-Phase TPAD System of Sewage Sludge and Fish Waste

Authors: Rocio López, Miriam Tena, Montserrat Pérez, Rosario Solera

Abstract:

Biotransformation of organic waste into biogas is considered an interesting alternative for the production of clean energy from renewable sources, reducing the volume and organic content of the waste. Anaerobic digestion is considered one of the most efficient technologies to transform waste into fertilizer and biogas in order to obtain electrical energy or biofuel within the concept of the circular economy. Currently, three types of anaerobic processes have been developed on a commercial scale: (1) single-stage processes, where sludge bioconversion is completed in a single chamber; (2) two-stage processes, where the acidogenic and methanogenic stages are separated into two chambers; and (3) temperature-phased anaerobic digestion (TPAD) processes, which combine a thermophilic pretreatment unit with a subsequent mesophilic anaerobic digestion. Two-stage processes can provide hydrogen and methane with easier control of the first- and second-stage conditions, producing higher total energy recovery and substrate degradation than single-stage processes. On the other hand, co-digestion is the simultaneous anaerobic digestion of a mixture of two or more substrates. The technology is similar to anaerobic digestion but is a more attractive option as it produces increased methane yields due to the positive synergism of the mixtures in the digestion medium, thus increasing the economic viability of biogas plants. The present study focuses on energy recovery by anaerobic co-digestion of sewage sludge and waste from the aquaculture-fishing sector. The valorization is approached through the application of a temperature-phased process, or TPAD (Temperature-Phased Anaerobic Digestion) technology. Moreover, a two-phase configuration of microorganisms is considered. Thus, the selected process allows the development of a thermophilic acidogenic phase followed by a mesophilic methanogenic phase to obtain hydrogen (H₂) in the first stage and methane (CH₄) in the second stage. The combination of these technologies makes it possible to unify all the advantages that these anaerobic digestion processes have individually. To achieve these objectives, a sequential study has been carried out in which the biochemical hydrogen potential (BHP) is tested followed by a BMP test, which allows checking the feasibility of the two-stage process. The best results obtained were high total and soluble COD removals (59.8% and 82.67%, respectively) as well as production rates of 12 L H₂/kg VS added and 28.76 L CH₄/kg VS added for TPAD.

Keywords: anaerobic co-digestion, TPAD, two-phase, BHP, BMP, sewage sludge, fish waste

Procedia PDF Downloads 121
177 Wind Comfort and Safety of People in the Vicinity of Tall Buildings

Authors: Mohan Kotamrazu

Abstract:

Tall buildings block and divert strong upper-level winds to the ground. These high-velocity winds often create adverse wind effects at ground level, which can be uncomfortable and even compromise the safety of pedestrians and people who frequent the spaces in the vicinity of tall buildings. Discomfort can be experienced around the entrances and corners of tall buildings. Activities such as strolling or sitting in a park or waiting for a bus near a tall building can become highly unpleasant. For the elderly, unpleasant conditions can also become dangerous, leading to accidents and injuries. Today there is a growing concern among architects, planners and urban designers about the wind environment in the vicinity of tall buildings. Regulating authorities insist on wind tunnel testing of tall buildings in cities such as Wellington, Auckland, Boston and San Francisco prior to granting permission for their construction. The present paper examines the different ways in which tall buildings can induce strong winds at pedestrian level and their impact on people who frequent the spaces around tall buildings.

Keywords: tall buildings, wind effects, wind comfort, wind safety

Procedia PDF Downloads 344
176 The Automatisation of Dictionary-Based Annotation in a Parallel Corpus of Old English

Authors: Ana Elvira Ojanguren Lopez, Javier Martin Arista

Abstract:

The aims of this paper are to present the automatisation procedure adopted in the implementation of a parallel corpus of Old English and to assess the progress of automatisation with respect to tagging, annotation, and lemmatisation. The corpus consists of an aligned parallel text with word-for-word Old English-English comparison that provides the Old English segment with inflectional form tagging (gloss, lemma, category, and inflection) and lemma annotation (spelling, meaning, inflectional class, paradigm, word-formation and secondary sources). This parallel corpus is intended to fill a gap in the field of Old English, in which no parallel and/or lemmatised corpora are available, while the average amount of corpus annotation is low. With this background, this presentation has two main parts. The first part, which focuses on tagging and annotation, selects the layouts and fields of lexical databases that are relevant for these tasks. Most information used for the annotation of the corpus can be retrieved from the lexical and morphological database Nerthus and the database of secondary sources Freya. These are the sources of linguistic and metalinguistic information that will be used for the annotation of the lemmas of the corpus, including morphological and semantic aspects as well as the references to the secondary sources that deal with the lemmas in question. Although substantially adapted and re-interpreted, the lemmatised part of these databases draws on the standard dictionaries of Old English, including The Student's Dictionary of Anglo-Saxon, An Anglo-Saxon Dictionary, and A Concise Anglo-Saxon Dictionary. The second part of this paper deals with lemmatisation. It presents the lemmatiser Norna, which has been implemented in FileMaker software. It is based on a concordance and an index to the Dictionary of Old English Corpus, which comprises around three thousand texts and three million words. In its present state, the lemmatiser Norna can assign lemmas to around 80% of textual forms on an automatic basis, by searching the index and the concordance for prefixes, stems and inflectional endings. The conclusions of this presentation insist on the limits of the automatisation of dictionary-based annotation in a parallel corpus. While tagging and annotation are largely automatic even at the present stage, the automatisation of alignment is pending future research. Lemmatisation and morphological tagging are expected to be fully automatic in the near future, once the database of secondary sources Freya and the lemmatiser Norna have been completed.
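For readers unfamiliar with dictionary-based lemmatisation, the simplified sketch below illustrates the general idea of assigning lemmas by stripping inflectional endings and consulting a lemma index. It is only an illustration: the actual Norna lemmatiser runs on FileMaker, and the toy index and endings here are invented.

```python
# Simplified illustration (not the Norna implementation, which runs on FileMaker):
# assign a lemma to an Old English textual form by stripping known inflectional
# endings and looking the remaining stem up in a lemma index.
LEMMA_INDEX = {"cyning": "cyning", "stan": "stan", "luf": "lufian"}  # toy index
ENDINGS = ["as", "es", "um", "en", "on", "e", "a"]                   # toy endings

def lemmatise(form: str):
    if form in LEMMA_INDEX:                      # exact match first
        return LEMMA_INDEX[form]
    for ending in ENDINGS:                       # then try stripping endings
        if form.endswith(ending) and form[: -len(ending)] in LEMMA_INDEX:
            return LEMMA_INDEX[form[: -len(ending)]]
    return None                                  # left for manual lemmatisation

print(lemmatise("cyningas"))  # -> 'cyning'
print(lemmatise("lufode"))    # -> None (form needing manual review)
```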

Keywords: corpus linguistics, historical linguistics, old English, parallel corpus

Procedia PDF Downloads 173
175 Exposing The Invisible

Authors: Kimberley Adamek

Abstract:

According to the Council on Tall Buildings, there has been a rapid increase in the construction of tall or "megatall" buildings over the past two decades. Simultaneously, the New England Journal of Medicine has reported a steady increase in climate-related natural disasters since the 1970s, the eastern expansion of the USA's infamous Tornado Alley being just one of many current issues. In the future, this could mean that tall buildings, which already guide high-speed winds down to pedestrian levels, would have to withstand stronger forces and protect pedestrians in more extreme ways. Although many projects are required to be verified in wind tunnels, and a handful of cities such as San Francisco have included wind testing within building code standards, there are still many examples where wind is only considered for basic loading. This typically results in an increase in structural expense and unwanted mitigation strategies that are proposed late in a project. When building cities, architects rarely consider how each building alters the invisible patterns of wind and how these alterations affect other areas in different ways later on. It is not until these forces move, overpower and even destroy cities that people take notice. For example, towers have caused winds to blow objects into people (Walkie-Talkie Tower, Leeds, England), caused building parts to vibrate and produce loud humming noises (Beetham Tower, Manchester), and caused wind tunnels in streets, as well as many other issues. Alternatively, there exist towers which have used their form to naturally draw in air and ventilate entire facilities in order to eliminate the need for costly HVAC systems (The Met, Thailand), and to increase wind speeds to generate electricity (Bahrain Tower, Dubai). Wind and weather exist and affect all parts of the world in areas such as science, health, war, infrastructure, catastrophes, tourism, shopping, media and materials. Working in partnership with the leading wind engineering company RWDI, a series of tests, images and animations documenting discovered interactions of different building forms with wind will be collected to emphasize to architects the possibilities of using wind. A site within San Francisco (due to its increasing tower development, consistent wind conditions and existing strict wind comfort criteria) will host a final design. Iterations of this design will be tested within wind tunnel and computational fluid dynamics systems, which will expose, utilize and manipulate wind flows to create new forms, technologies and experiences. Ultimately, this thesis aims to question the degree to which the environment is allowed to permeate building enclosures, uncover new programmatic possibilities for wind in buildings, and push the boundaries of working with the wind to ensure the development and safety of future cities. This investigation will improve and expand upon the traditional understanding of wind in order to give architects, wind engineers and the general public the ability to broaden their scope and productively utilize this living phenomenon that everyone constantly feels but cannot see.

Keywords: wind engineering, climate, visualization, architectural aerodynamics

Procedia PDF Downloads 337
174 Street Art Lenses: A Glimpse into the Street Artists’ Identity and Socio-Political Perspective in Brussels

Authors: José Francisco Urrutia Reyes, Judith Espinosa Real

Abstract:

This paper is meant to re-examine the role of street art in the contemporary world. By studying this form of art in Brussels, it can be explained how murals show the socio-political reality of a given community and influence its interactions. Through the definitions of street art, murals and street artists, and by analysing their role in Brussels, it is possible to understand how this counter-culture movement serves as an engine of social development, as it interacts with its surroundings, sending a clear message to a wider audience. Street art impacts its environment because it interacts with the people who occupy the day-to-day public space. This has proven to be effective in arousing social consciousness, to the point of being adopted by the government of Brussels to promote social movements such as the AIDS-HIV campaign along with the Plate-Forme Prévention Sida. It can be concluded that street art has evolved since its vandalic beginnings to become a form of art that has not lost its counter-official status, but now has a critical vision that can promote social awakening. Street art is now a global trend that uses visual inputs to create a positive impact.

Keywords: street art, Brussels, social impact, political perspective

Procedia PDF Downloads 326
173 Wastewater Treatment in the Abrasives Industry via Fenton and Photo-Fenton Oxidation Processes: A Case Study from Peru

Authors: Hernan Arturo Blas López, Gustavo Henndel Lopes, Antonio Carlos Silva Costa Teixeira, Carmen Elena Flores Barreda, Patricia Araujo Pantoja

Abstract:

Phenols are toxic to life and the environment and may come from many sources. Uncured phenolic monomers present in the phenolic resins used as binders in grinding wheels and emery paper can contaminate industrial wastewaters in abrasives manufacturing plants. Furthermore, vestiges of resol and novolac resins generated by the wear and tear of abrasives are also possible sources of water contamination by phenolics in these facilities. Fortunately, advanced oxidation by dark Fenton and photo-Fenton techniques is capable of oxidizing phenols and their degradation products up to their mineralization into H₂O and CO₂. The maximum allowable concentrations for phenols in Peruvian waterbodies are very low, such that insufficiently treated effluents from the abrasives industry are a potential environmental noncompliance. The current case study highlights findings obtained during the lab-scale application of Fenton's and photo-assisted Fenton's chemistries to real industrial wastewater samples from an abrasives manufacturing plant in Peru. The goal was to reduce the phenolic content and sample toxicity. For this purpose, two independent variables, reaction time and the effect of ultraviolet radiation, were studied for their impacts on the concentration of total phenols, total organic carbon (TOC), biological oxygen demand (BOD) and chemical oxygen demand (COD). In this study, diluted samples (1 L) of the industrial effluent were treated with Fenton's reagent (H₂O₂ and Fe²⁺ from FeSO₄.H₂O) for 10 min in a photochemical batch reactor (Alphatec RFS-500, Brazil) at pH 2.92. For the photo-Fenton tests, 9 W UV-A, UV-B and UV-C lamps were evaluated. All process conditions achieved 100% phenol degradation within 5 minutes. TOC, BOD and COD decreased by 49%, 52% and 86%, respectively (all processes together). However, the Fenton treatment was not capable of reducing BOD, COD and TOC below a certain value even after 10 minutes, in contrast to photo-Fenton. It was also possible to conclude that the processes studied here degrade other compounds in addition to phenols, which is an advantage. In all cases, elevated effluent dilution factors and high amounts of oxidant negatively impact the overall economy of the processes investigated here.

Keywords: fenton oxidation, wastewater treatment, phenols, abrasives industry

Procedia PDF Downloads 286
172 Overall Student Satisfaction at Tabor School of Education: An Examination of Key Factors Based on the AUSSE SEQ

Authors: Francisco Ben, Tracey Price, Chad Morrison, Victoria Warren, Willy Gollan, Robyn Dunbar, Frank Davies, Mark Sorrell

Abstract:

This paper focuses particularly on the educational aspects that contribute to the overall educational satisfaction rated by Tabor School of Education students who participated in the Australasian Survey of Student Engagement (AUSSE) conducted by the Australian Council for Educational Research (ACER) in 2010, 2012 and 2013. In all three years of participation, Tabor ranked first, notably in the area of overall student satisfaction. By applying a single-level path analysis to the AUSSE datasets collected with the Student Engagement Questionnaire (SEQ) for Tabor School of Education, seven aspects that contribute to overall student satisfaction have been identified. There appears to be a direct causal link between aspects of the Supportive Learning Environment, Work Integrated Learning, Career Readiness and Academic Challenge and overall educational satisfaction levels. A further three aspects, Student and Staff Interactions, Active Learning, and Enriching Educational Experiences, indirectly influence overall educational satisfaction levels.
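A single-level path analysis of this kind can be approximated with sequential regressions; the sketch below is only an illustration of that approach, with construct names taken from the abstract and an assumed data file, not the authors' actual model specification.

```python
# Hypothetical sketch of a single-level path analysis via sequential OLS
# regressions (construct names from the abstract; the data file is assumed).
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("ausse_seq_scores.csv")  # assumed scale scores per respondent

# Direct paths into overall satisfaction
direct = smf.ols(
    "overall_satisfaction ~ supportive_environment + work_integrated_learning"
    " + career_readiness + academic_challenge", data=df).fit()

# Indirect aspects modelled as predictors of one direct-path mediator
indirect = smf.ols(
    "supportive_environment ~ staff_interactions + active_learning"
    " + enriching_experiences", data=df).fit()

print(direct.params)    # with standardized scores these act as path coefficients
print(indirect.params)
```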

Keywords: attrition, retention, educational experience, pre-service teacher education, student satisfaction

Procedia PDF Downloads 328
171 Batch Biodrying of Pulp and Paper Secondary Sludge: Influence of Initial Moisture Content on the Process

Authors: César Huiliñir, Danilo Villanueva, Pedro Iván Alvarez, Francisco Cubillos

Abstract:

Biodrying aims at removing water from biowastes and has mostly been studied for municipal solid waste (MSW), while few studies have dealt with secondary sludge from the pulp and paper industry. The goal of this study was to investigate the effect of initial moisture content (MC) on the batch biodrying of pulp and paper secondary sludge, using rice husks as bulking agents. Three initial MCs were studied (54, 65, and 74% w.b.) in closed batch laboratory-scale reactors under adiabatic conditions and with a constant air-flow rate (0.65 l min-1 kg-1 wet solid). The initial MC of the mixture of secondary sludge and rice husks showed a significant effect on the biodrying process. Using an initial moisture content between 54 and 65% w.b., the solid moisture content was reduced to 37% w.b. in ten days, giving calorific values between 8000 and 9000 kJ kg-1. It was concluded that decreasing the initial MC improves the drying rate and decreases volatile solids consumption; therefore, the optimization of biodrying should consider this parameter.
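As a small illustration of the wet-basis bookkeeping behind the figures above (the numeric inputs below are invented, not the study's raw data):

```python
# Sketch of the wet-basis moisture definitions used above (inputs invented).
def moisture_wet_basis(water_mass: float, total_mass: float) -> float:
    """Moisture content in % wet basis."""
    return water_mass / total_mass * 100.0

def drying_rate(mc_initial: float, mc_final: float, days: float) -> float:
    """Average reduction in moisture content per day (% w.b. / day)."""
    return (mc_initial - mc_final) / days

print(moisture_wet_basis(0.65, 1.00))   # 65 % w.b. initial mixture
print(drying_rate(65.0, 37.0, 10.0))    # 2.8 percentage points per day
```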

Keywords: biodrying, secondary sludge, initial moisture content, pulp and paper industry, rice husk

Procedia PDF Downloads 468
170 Jurisdictional Federalism and Formal Federalism: Levels of Political Centralization on American and Brazilian Models

Authors: Henrique Rangel, Alexandre Fadel, Igor De Lazari, Bianca Neri, Carlos Bolonha

Abstract:

This paper promotes a comparative analysis of the American and Brazilian models of federalism, taking their levels of political centralization as the main criterion. The central problem faced herein is the Brazilian approach to a unitarian regime. Despite the hegemony of the federative form after 1989, Brazil has a historical frame of political centralization that remains under the 1988 constitutional regime. Meanwhile, the United States framed a federalism in which the states absorb significant authorities. The hypothesis holds that the number of alternative criteria of federalization (which can generate political centralization), and the way they are upheld on judicial review, are crucial to understanding the levels of political centralization achieved in each model. To test this hypothesis, the research follows a methodology temporally delimited to the 1994-2014 period. Three paradigmatic precedents of the U.S. Supreme Court were selected: United States vs. Morrison (2000), on gender-motivated violence; Gonzales vs. Raich (2005), on the medical use of marijuana; and United States vs. Lopez (1995), on firearm possession in school zones. These most relevant cases on federalism in the recent activity of the Supreme Court indicate a determinant parameter of deliberation: the commerce clause. After observing the criterion used to permit or prohibit political centralization in America, the Brazilian normative context is presented. In this sense, it is possible to identify the eventual legal treatment these controversies could receive in that country. The decision-making reveals some deliberative parameters which characterize each federative model. At the end of the research, the precedents of the Rehnquist Court promote a broad revival of the federalism debate, establishing the commerce clause as a secure criterion to uphold or reject the necessity of centralization, even with decisions considered conservative. By contrast, Brazilian federalism solves these controversies in a formalist fashion, through numerous and comprehensive (sometimes casuistic) normative devices oriented toward intense centralization. The aim of this work is to indicate how the jurisdictional federalism found in the United States can preserve a consistent model with robustly autonomous states, while Brazil gives preference to normative mechanisms designed to start from centralization.

Keywords: constitutional design, federalism, U.S. Supreme Court, legislative authority

Procedia PDF Downloads 493
169 Kant’s Conception of Human Dignity and the Importance of Singularity within Commonality

Authors: Francisco Lobo

Abstract:

Kant's well-known theory of human dignity as a common feature of all rational beings is the starting point of any intellectual endeavor to unravel the implications of this normative notion. Yet it is incomplete, as it neglects the importance of the singularity or uniqueness of the individual. In a first, deconstructive stage, this paper describes the Kantian account of human dignity as one among many conceptions of human dignity. It reads carefully into the original wording used by Kant in German and its English translations, as well as the works of modern commentators, to identify its shortcomings. In a second, constructive stage, it then draws on the theories of Aristotle, Alexis de Tocqueville, John Stuart Mill, and Hannah Arendt to try to enhance the Kantian conception, in the sense that these authors give major importance to the singularity of the individual. The Kantian theory can be perfected by including elements from the works of these authors, while at the same time being mindful of the dangers entailed in focusing too much on singularity. The conclusion of this paper is that the Kantian conception of human dignity can be enhanced if it acknowledges that not only morality has dignity, but also the irreplaceable human individual, to the extent that she is a narrative, original creature with the potential to act morally.

Keywords: commonality, dignity, Kant, singularity

Procedia PDF Downloads 247
168 Modeling the Acquisition of Expertise in a Sequential Decision-Making Task

Authors: Cristóbal Moënne-Loccoz, Rodrigo C. Vergara, Vladimir López, Domingo Mery, Diego Cosmelli

Abstract:

Our daily interaction with computational interfaces is plagued with situations in which we go from inexperienced users to experts through self-motivated exploration of the same task. In many of these interactions, we must learn to find our way through a sequence of decisions and actions before obtaining the desired result. For instance, when drawing cash from an ATM, choices are presented in a step-by-step fashion so that a specific sequence of actions must be performed in order to produce the expected outcome. But, as they become experts in the use of such interfaces, do users adopt specific search and learning strategies? Moreover, if so, can we use this information to follow the process of expertise development and, eventually, predict future actions? This would be a critical step towards building truly adaptive interfaces that can facilitate interaction at different moments of the learning curve. Furthermore, it could provide a window into potential mechanisms underlying decision-making behavior in real-world scenarios. Here we tackle this question using a simple game interface that instantiates a 4-level binary decision tree (BDT) sequential decision-making task. Participants have to explore the interface and discover an underlying concept-icon mapping in order to complete the game. We develop a Hidden Markov Model (HMM)-based approach whereby a set of stereotyped, hierarchically related search behaviors act as hidden states. Using this model, we are able to track the decision-making process as participants explore, learn and develop expertise in the use of the interface. Our results show that partitioning the problem space into such stereotyped strategies is sufficient to capture a host of exploratory and learning behaviors. Moreover, using the modular architecture of stereotyped strategies as a Mixture of Experts, we are able to simultaneously ask the experts about the user's most probable future actions. We show that, for those participants who learn the task, it becomes possible to predict their next decision, above chance, approximately halfway through the game. Our long-term goal is, on the basis of a better understanding of real-world decision-making processes, to inform the construction of interfaces that can establish dynamic conversations with their users in order to facilitate the development of expertise.
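To make the prediction step concrete, the minimal numpy sketch below shows how, given an HMM whose hidden states stand for stereotyped strategies, the forward algorithm yields a probability for the next observed decision. All transition, emission and prior values are invented for illustration; they are not the fitted model from the study.

```python
# Minimal numpy sketch of the idea: hidden states = stereotyped strategies,
# observations = decisions; the forward algorithm gives P(next decision).
# All parameters below are invented for illustration.
import numpy as np

A = np.array([[0.8, 0.2],      # transitions between 2 strategies
              [0.3, 0.7]])
B = np.array([[0.9, 0.1],      # emissions: P(decision | strategy)
              [0.4, 0.6]])
pi = np.array([0.5, 0.5])      # initial strategy distribution

def forward(obs):
    """Filtered strategy distribution after a sequence of observed decisions."""
    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
    return alpha / alpha.sum()

observed = [0, 0, 1, 0]                        # decisions made so far
state_belief = forward(observed)
next_decision_probs = (state_belief @ A) @ B   # distribution over the next decision
print(next_decision_probs)                     # argmax -> predicted next action
```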

Keywords: behavioral modeling, expertise acquisition, hidden markov models, sequential decision-making

Procedia PDF Downloads 225
167 Potential Applications of Biosurfactants from Corn Steep Liquor in Cosmetic

Authors: J. M. Cruz, X. Vecino, L. Rodríguez-López, J. M. Dominguez, A. B. Moldes

Abstract:

The cosmetic and personal care industries are the fields where biosurfactants could have the greatest possibilities of success, because in this kind of product the replacement of synthetic detergents by natural surfactants provides additional added value, while at the same time the harmful effects produced by some synthetic surfactants can be avoided or reduced. Nowadays, consumers are disposed to pay an additional cost if they obtain more natural products. In this work we provide data about the potential of biosurfactants in the cosmetic and personal care industry. Biosurfactants from corn steep liquor, which is a fermented and condensed stream, have shown good surface-active properties, substantially reducing the surface tension of water. The bacteria that usually grow in corn steep liquor comprise Lactobacillus species, generally recognized as safe. The biosurfactant extracted from CSL consists of a lipopeptide, composed of fatty acids, which can reduce the surface tension of water by more than 30 units. It is a yellow and viscous liquid with a density of 1.053 mg/mL and pH 4. Given these properties, it could be introduced into the formulation of cosmetic creams, hair conditioners or shampoos. Moreover, this biosurfactant extracted from corn steep liquor has shown a potent antimicrobial effect on different strains of Streptococcus. Some species of Streptococcus are commonly found living in the human respiratory and genitourinary systems, producing several diseases in humans, including skin diseases. For instance, Streptococcus pyogenes produces many toxins and enzymes that help to establish skin infections; biosurfactants from corn steep liquor can probably inhibit the mechanisms of the S. pyogenes enzymes. S. pyogenes is an important cause of pharyngitis, impetigo, cellulitis and necrotizing fasciitis. In this work it was observed that 50 mg/L of the biosurfactant extract obtained from corn steep liquor is able to inhibit the growth of S. pyogenes by more than 50%. Thus, cosmetic and personal care products formulated with biosurfactants from corn steep liquor could have prebiotic properties. The natural biosurfactant presented in this work, obtained from corn milling industry streams, has shown high potential to provide an interesting and sustainable alternative to the chemically synthesized antibacterial and surfactant ingredients used in cosmetic and personal care manufacture, which can cause irritation and often only show short-term effects.

Keywords: antimicrobial activity, biosurfactants, cosmetic, personal care

Procedia PDF Downloads 227
166 The Emergence of Memory at the Nanoscale

Authors: Victor Lopez-Richard, Rafael Schio Wengenroth Silva, Fabian Hartmann

Abstract:

Memcomputing is a computational paradigm that combines information processing and storage on the same physical platform. Key elements for this topic are devices with an inherent memory, such as memristors, memcapacitors, and meminductors. Despite the widespread emergence of memory effects in various solid systems, a clear understanding of the basic microscopic mechanisms that trigger them is still a puzzling task. We report basic ingredients of the theory of solid-state transport, intrinsic to a wide range of mechanisms, as sufficient conditions for a memristive response that points to the natural emergence of memory. This emergence should be discernible under an adequate set of driving inputs, as highlighted by our theoretical prediction, and general common trends can thus be listed that become the rule and not the exception, with contrasting signatures according to symmetry constraints, either built-in or induced by external factors at the microscopic level. Explicit analytical figures of merit for the memory modulation of the conductance are presented, unveiling very concise and accessible correlations between general intrinsic microscopic parameters such as relaxation times, activation energies, and efficiencies (encountered throughout various fields in physics) and external drives: voltage pulses, temperature, illumination, etc. These building blocks of memory can be extended to a vast universe of materials and devices, with combinations of parallel and independent transport channels, providing an efficient and unified physical explanation for a wide class of resistive memory devices that have emerged in recent years. Its simplicity and practicality have also allowed a direct correlation with reported experimental observations, with the potential of pointing out the optimal driving configurations. The main methodological approach combines three quantum transport descriptions, a Drude-like model, the Landauer-Buttiker formalism, and field-effect transistor emulators, with the microscopic characterization of nonequilibrium dynamics. Both qualitative and quantitative agreement with available experimental responses is provided to validate the main hypothesis. This analysis also sheds light on the basic universality of the complex natural impedances of systems out of equilibrium and might help pave the way for new trends in the area of memory formation as well as in its technological applications.
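As a toy illustration of the generic ingredient described above, not the authors' model, a conductance that relaxes toward a voltage-dependent equilibrium with a finite relaxation time already produces a pinched current-voltage hysteresis loop under a sinusoidal drive. All parameters below are invented.

```python
# Toy illustration: a conductance g that relaxes toward a voltage-dependent
# equilibrium with time constant tau. Under a sinusoidal drive this yields a
# pinched I-V hysteresis loop. Parameters are invented, not the authors' model.
import numpy as np

tau, g0, dg = 1.0, 1.0, 0.5          # relaxation time, baseline and modulation
dt, n_steps = 1e-3, 20000
t = np.arange(n_steps) * dt
V = np.sin(2 * np.pi * 0.5 * t)      # sinusoidal voltage drive

g = np.empty(n_steps)
g[0] = g0
for k in range(1, n_steps):
    g_eq = g0 + dg * np.tanh(V[k - 1])               # voltage-dependent equilibrium
    g[k] = g[k - 1] + dt * (g_eq - g[k - 1]) / tau   # Euler relaxation step

I = g * V                            # memristive response: I(t) = g(t) V(t)
print(I[:5])                         # plotting I vs V would show the pinched loop
```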

Keywords: memories, memdevices, memristors, nonequilibrium states

Procedia PDF Downloads 66
165 Terrestrial Laser Scans to Assess Aerial LiDAR Data

Authors: J. F. Reinoso-Gordo, F. J. Ariza-López, A. Mozas-Calvache, J. L. García-Balboa, S. Eddargani

Abstract:

The quality of DEMs may depend on several factors, such as the data source, the capture method, the type of processing used to derive them, or the cell size of the DEM. The two most important capture methods to produce regional-sized DEMs are photogrammetry and LiDAR; DEMs covering entire countries have been obtained with these methods. The quality of these DEMs has traditionally been evaluated by the national cartographic agencies through punctual sampling focused on the vertical component. For this type of evaluation there are standards such as the NMAS and the ASPRS Positional Accuracy Standards for Digital Geospatial Data. However, it seems more appropriate to carry out this evaluation by means of a method that takes into account the superficial nature of the DEM and therefore samples surfaces rather than points. This work is part of the research project "Functional Quality of Digital Elevation Models in Engineering", in which it is necessary to control the quality of a DEM whose data source is an experimental LiDAR flight with a density of 14 points per square meter, which we call the Point Cloud Product (PCpro). The present work describes the data capture on the ground and the postprocessing tasks required to obtain the point cloud used as reference (PCref) to evaluate the quality of the PCpro. Each PCref consists of a 50x50 m patch obtained by registering 4 different scan stations. The area studied was the Spanish region of Navarra, which covers an area of 10,391 km2; 30 homogeneously distributed patches were necessary to sample the entire surface. The patches were captured using a Leica BLK360 terrestrial laser scanner mounted on a pole that reached heights of up to 7 meters; the position of the scanner was inverted so that the characteristic shadow circle does not appear when the scanner is in the direct position. To ensure that the accuracy of the PCref is greater than that of the PCpro, the georeferencing of the PCref was carried out with real-time GNSS, and its positioning accuracy was better than 4 cm; this accuracy is much better than the altimetric mean square error estimated for the PCpro (<15 cm). The DEM of interest is the one corresponding to the bare earth, so it was necessary to apply a filter to eliminate vegetation and auxiliary elements such as poles, tripods, etc. After the postprocessing tasks, the PCref is ready to be compared with the PCpro using different techniques: cloud to cloud, or DEM to DEM after a resampling process.
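For illustration, a basic cloud-to-cloud check of the kind mentioned above can be built on nearest-neighbour distances between the product and reference clouds; the sketch below uses synthetic arrays standing in for one patch and is not the project's actual processing chain.

```python
# Sketch of a cloud-to-cloud check (arrays are assumed to be N x 3 coordinates
# of the reference patch PCref and the LiDAR product PCpro over the same area).
import numpy as np
from scipy.spatial import cKDTree

def cloud_to_cloud_stats(pc_ref: np.ndarray, pc_pro: np.ndarray):
    """Nearest-neighbour distances from each PCpro point to the reference cloud."""
    tree = cKDTree(pc_ref)
    distances, _ = tree.query(pc_pro, k=1)
    return {"mean": distances.mean(),
            "rmse": np.sqrt((distances ** 2).mean()),
            "p95": np.percentile(distances, 95)}

# Synthetic example standing in for one 50 x 50 m patch
rng = np.random.default_rng(1)
pc_ref = rng.uniform(0, 50, size=(50000, 3))
pc_pro = pc_ref[::4] + rng.normal(0, 0.05, size=(12500, 3))  # noisy subsample
print(cloud_to_cloud_stats(pc_ref, pc_pro))
```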

Keywords: data quality, DEM, LiDAR, terrestrial laser scanner, accuracy

Procedia PDF Downloads 73
164 School Autonomy in the United Kingdom: A Correlational Study Applied to English Principals

Authors: Pablo Javier Ortega-Rodriguez, Francisco Jose Pozuelos-Estrada

Abstract:

Recently, there has been renewed interest in school autonomy in the United Kingdom and its impact on students' outcomes. English principals have a pivotal role in decision-making. The aim of this paper is to explore the correlation between the type of school (public or private) and the considerable responsibilities of the English principals who participated in PISA 2015. The final sample consisted of 419 principals. Descriptive data (percentages and means) were generated for the variables related to professional autonomy. Pearson's chi-square test was used to determine whether there is an association between the type of school and principals' responsibility for relevant tasks. The statistical analysis was performed using SPSS software, version 22. Findings suggest a significant correlation between the type of school and principals' responsibility for firing teachers and formulating the school budget. This study confirms that the type of school is not associated with principals' responsibility for choosing which textbooks are used at school. The present study establishes a quantitative framework for defining four models of professional autonomy and some proposals to improve school autonomy in the United Kingdom.
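By way of illustration only, a Pearson chi-square test of the kind used in the study (the original analysis was run in SPSS) takes a contingency table of school type against responsibility; the counts below are invented.

```python
# Hypothetical sketch of the Pearson chi-square test described above:
# school type (public/private) vs. responsibility for formulating the budget.
# The counts are invented for illustration; the study itself used SPSS.
import numpy as np
from scipy.stats import chi2_contingency

#                  responsible  not responsible
table = np.array([[120,          90],    # public schools
                  [160,          49]])   # private schools

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
if p < 0.05:
    print("Association between school type and this responsibility.")
```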

Keywords: decision making, principals, professional autonomy, school autonomy

Procedia PDF Downloads 664
163 Influence of Cure Degree in GO and CNT-Epoxy Nanocomposites

Authors: Marina Borgert Moraes, Wesley Francisco, Filipe Vargas, Gilmar Patrocínio Thim

Abstract:

In recent years, carbon nanotubes (CNTs) and graphene oxide (GO), especially functionalized ones, have been added to epoxy resin in order to increase the mechanical, electrical and thermal properties of nanocomposites. However, it is still unknown how the presence of these nanoparticles influences the curing process and the final mechanical properties. In this work, the kinetic and mechanical properties of the nanocomposites were analyzed, where the kinetic process was followed by DSC and the mechanical properties by DMA. Initially, the CNTs were annealed at high temperature (1800 °C) under a vacuum atmosphere, followed by a chemical treatment using acids and ethylenediamine. GO was synthesized through a chemical route, washed clean, dried and ground to #200. The presence of functional groups on the CNT and GO surfaces was confirmed by XPS and FT-IR spectra. Then, epoxy resin, nanoparticles and acetone were mixed by sonication in order to obtain the composites. DSC analyses were performed on samples with different curing cycles (1 h at 80 °C + 2 h at 120 °C; 3 h at 80 °C + 2 h at 120 °C; 5 h at 80 °C) and on samples with different times at constant temperature (120 °C). Results showed that the kinetic process and the mechanical strength are highly dependent on the presence of graphene and functionalized CNTs in the nanocomposites.

Keywords: carbon nanotube, epoxy resin, graphene oxide, nanocomposite

Procedia PDF Downloads 287
162 Analysis in Mexico on Workers Performing Highly Repetitive Movements with Sensory Thermography in the Surface of the Wrist and Elbows

Authors: Sandra K. Enriquez, Claudia Camargo, Jesús E. Olguín, Juan A. López, German Galindo

Abstract:

Companies have currently seen an increase in cumulative trauma disorders (CTDs), which are rising significantly due to the highly repetitive movements (HRM) performed at workstations and which cause economic losses to businesses through temporary and permanent disabilities of workers. This analysis focuses on the prevention of disorders caused by repetitiveness, duration and effort, and on reducing cumulative trauma disorders as occupational diseases by using sensory thermography as a noninvasive method to evaluate the injuries workers could develop when performing repetitive motions. Objectives: The aim is to define rest periods or job rotation before a CTD is generated, using sensory thermography to analyze changes in temperature patterns on the wrists and elbows while the worker performs HRM over a period of 2 hours and 30 minutes. Information on non-work variables such as wrist and elbow injuries, weight, gender and age, among others, and work variables such as workspace temperature, repetitiveness and duration was also collected. Methodology: The analysis was conducted on 4 industrial designers, 2 men and 2 women, in normal health at a company for a period of 12 days, using the following time ranges: on the first day, for every 90 minutes of continuous work they were asked to rest 5 minutes; on the second day, for every 90 minutes of continuous work they were asked to rest 10 minutes; and the same was done for 60 and 30 minutes of continuous work. Each worker was tested with 6 different ranges at least twice. This analysis was performed at a controlled room temperature between 20 and 25 °C, with 20 minutes allowed for the temperature of the wrists and elbows to stabilize at the beginning and end of the analysis. Results: The maximum temperature (Tmax) in the wrists and elbows was registered in the range of 90 minutes of continuous work with a 5-minute rest: Tmax was 35.79 °C, with a difference of 2.79 °C between the initial and final temperatures of the left elbow, presented by individual 4 at minute 86 of the 90-minute work and 5-minute rest range. Conclusions: With sensory thermography as an alternative technology, it is possible to predict rotation or rest ranges for the prevention of CTDs in HRM work activities, thereby reducing occupational disease and penalties imposed by health agencies and increasing the quality of life of workers, making this technology cost-effective in the future.

Keywords: sensory thermography, temperature, cumulative trauma disorder (CTD), highly repetitive movement (HRM)

Procedia PDF Downloads 395
161 Dynamic Two-Way FSI Simulation for a Blade of a Small Wind Turbine

Authors: Alberto Jiménez-Vargas, Manuel de Jesús Palacios-Gallegos, Miguel Ángel Hernández-López, Rafael Campos-Amezcua, Julio Cesar Solís-Sanchez

Abstract:

An optimal wind turbine blade design must be capable of capturing as much energy as possible from the wind source available in the area of interest. Many times, an optimal design means the use of large quantities of material and complicated processes that make the wind turbine more expensive and, therefore, less cost-effective. In the construction and installation of a wind turbine, the blades may account for up to 20% of the overall price, and they become even more important because they are part of the rotor system, which is in charge of transmitting the energy from the wind to the power train and where the static and dynamic design loads for the whole wind turbine are produced. The aim of this work is the development of a blade fluid-structure interaction (FSI) simulation that allows the identification of the major damage zones during normal production, so that better design and optimization decisions can be taken. The simulation is a dynamic case, since we have a time-history wind velocity as the inlet condition instead of a constant wind velocity. The process begins with the free software NuMAD (NREL), used to model the blade and assign material properties to it; the 3D model is then exported to the ANSYS Workbench platform where, before setting up the FSI system, a modal analysis is made to identify natural frequencies and mode shapes. The FSI analysis is carried out with the two-way technique, which begins with a CFD simulation to obtain the pressure distribution on the blade surface; these results are then used as the boundary condition for the FEA simulation to obtain the deformation levels for the first time-step. For the second time-step, the CFD simulation is reconfigured automatically with the next time-step's inlet wind velocity and the deformation results from the previous time-step. The analysis continues this iterative cycle, solving time-step by time-step, until the entire load case is completed. This work is part of a set of projects managed by a national consortium called "CEMIE-Eólico" (Mexican Center in Wind Energy Research), created to strengthen technological and scientific capacities, promote the creation of specialized human resources, and link academia with the private sector in the national territory. The analysis belongs to the design of a rotor system for a 5 kW wind turbine intended to be installed at the Isthmus of Tehuantepec, Oaxaca, Mexico.
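The two-way coupling loop described above (CFD pressures fed to the FEA, deformations fed back to the CFD for the next time-step) can be summarized schematically as below. The solver functions are hypothetical stand-ins with placeholder physics; the actual work uses NuMAD and ANSYS Workbench, and only the data exchanged per time-step is meant to reflect the description.

```python
# Schematic of the two-way FSI loop described above. The solver calls are
# hypothetical stubs (the real work uses NuMAD and ANSYS Workbench); only the
# per-time-step data exchange mirrors the description.
def run_cfd(wind_velocity, blade_tip):
    """Stub standing in for the CFD solve: returns a surface pressure value."""
    return 0.5 * 1.225 * wind_velocity ** 2          # dynamic pressure placeholder

def run_fea(pressure, blade_model):
    """Stub standing in for the FEA solve: returns a tip deflection value."""
    return pressure / blade_model["stiffness"]       # linear response placeholder

def two_way_fsi(wind_series, blade_model, undeformed_tip=0.0):
    tip = undeformed_tip
    history = []
    for v_wind in wind_series:                       # time-history inlet velocity
        pressure = run_cfd(v_wind, tip)              # CFD with current geometry
        deflection = run_fea(pressure, blade_model)  # FEA loaded with CFD pressures
        tip = undeformed_tip + deflection            # feed deformation back to CFD
        history.append(deflection)
    return history

print(two_way_fsi([6.0, 8.0, 11.0], {"stiffness": 5.0e3}))
```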

Keywords: blade, dynamic, fsi, wind turbine

Procedia PDF Downloads 448
160 Optimized Renewable Energy Mix for Energy Saving in Waste Water Treatment Plants

Authors: J. D. García Espinel, Paula Pérez Sánchez, Carlos Egea Ruiz, Carlos Lardín Mifsut, Andrés López-Aranguren Oliver

Abstract:

This paper briefly describes three main actions on a Waste Water Treatment Plant (WWTP) to reduce its energy consumption: optimization of the biological reactor in the aeration stage by including new control algorithms and introducing new efficient equipment, the installation of an innovative hybrid system with zero grid injection (formed by 100 kW of PV generation and 5 kW of mini-wind generation), and an intelligent management system for controlling load consumption and energy generation in the most optimal way. This project, called RENEWAT and included in the European Commission LIFE 2013 call, has the main objective of reducing energy consumption through different actions on the processes which take place in a WWTP and of introducing renewable energies in these treatment plants, with the purpose of promoting the use of treated waste water for irrigation and decreasing CO2 emissions. Waste water treatment is always required before waste water can be reused for irrigation or discharged into water bodies. However, the energy demand of the treatment process is high enough to make the price of treated water exceed that of drinking water. This makes it very difficult for any policy to encourage the reuse of treated water, with a great impact on the water cycle, particularly in areas suffering from water stress or scarcity. The cost of treating waste water involves another climate-change-related burden: the energy necessary for the process is obtained mainly from the electricity grid, which in most cases in Europe means energy obtained from the burning of fossil fuels. The innovative part of this project is based on the implementation, adaptation and integration of solutions to this problem, together with a new concept of the integration of energy input and operational energy demand. Moreover, there is an important qualitative leap between the technologies currently in use and the technologies proposed for the project, which gives it an innovative character, due to the fact that there are no similar previous experiences of a WWTP including intelligent discrimination of energy sources, integrating renewables (PV and wind) and the grid.

Keywords: aeration system, biological reactor, CO2 emissions, energy efficiency, hybrid systems, LIFE 2013 call, process optimization, renewable energy sources, wasted water treatment plants

Procedia PDF Downloads 330
159 D6tions: A Serious Game to Learn Software Engineering Process and Design

Authors: Hector G. Perez-Gonzalez, Miriam Vazquez-Escalante, Sandra E. Nava-Muñoz, Francisco E. Martinez-Perez, Alberto S. Nunez-Varela

Abstract:

The software engineering teaching process has been the subject of many studies. To improve this process, researchers have proposed merely illustrative techniques in the classroom, such as topic presentations and dynamics between students, on the one hand, or attempts to involve students in real projects with companies and institutions to bring them to a real software development problem, on the other hand. Simulators and serious games have been used as auxiliary tools to introduce students to topics that are too abstract when presented in the traditional way. Most of these tools cover a limited area of the huge software engineering scope. To address this problem, we have developed D6tions, an educational serious game that simulates the software engineering process and is designed to let students experience the different stages a software engineer (playing the role of project leader, developer or designer) goes through while participating in a software project. We describe previous approaches to this problem, how D6tions was designed, its rules and directions, and the results we obtained when undergraduate students played the game.

Keywords: serious games, software engineering, software engineering education, software engineering teaching process

Procedia PDF Downloads 450
158 Integration Process and Analytic Interface of Different Environmental Open Data Sets with Java/Oracle and R

Authors: Pavel H. Llamocca, Victoria Lopez

Abstract:

The main objective of our work is the comparative analysis of environmental data from Open Data repositories belonging to different governments. This requires integrating data from several different sources. Nowadays, many governments intend to publish thousands of data sets for people and organizations to use, and the number of applications based on Open Data is therefore increasing. However, each government has its own procedures for publishing its data, which leads to a variety of data set formats, because there are no international standards specifying the formats of Open Data sets. Due to this variety, we must build a data integration process able to bring together all kinds of formats. Some software tools have been developed to support the integration process, e.g., Data Tamer and Data Wrangler. The problem with these tools is that they require the interaction of a data scientist as a final step of the integration process. In our case we do not want to depend on a data scientist, because environmental data are usually similar and these processes can be automated by programming. The main idea of our tool is to build Hadoop procedures adapted to each government's data sources in order to achieve automated integration. Our work focuses on environmental data such as temperature, energy consumption, air quality, solar radiation, wind speed, etc. For the last two years, the government of Madrid has been publishing its Open Data sets on environmental indicators in real time. Likewise, other governments (such as Andalucía or Bilbao) have published Open Data sets related to the environment. All of those data sets have different formats, and our solution is able to integrate all of them; furthermore, it allows the user to run and visualize analyses over the real-time data. Once the integration task is done, all the data from any government share the same format, and the analysis process can start in a computationally more efficient way. The tool presented in this work therefore has two goals: (1) the integration process and (2) a graphic and analytic interface. As a first approach, the integration process was developed using Java and Oracle, and the graphic and analytic interface with Java (JSP). However, in order to open up our software tool, as a second approach we also developed an implementation in the R language, a mature open-source technology. R is a really powerful open-source programming language that allows us to process and analyze a huge amount of data with high performance, and there are R libraries, such as Shiny, for building graphic interfaces. A performance comparison between both implementations was made and no significant differences were found. In addition, our work provides any developer with an official real-time integrated data set of environmental data in Spain so that they can build their own applications.
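
To make the integration step concrete, the following is a minimal, purely illustrative Python sketch of the kind of per-source normalization described above; the actual pipeline uses Java/Oracle and R, and the field names (fecha, temp, date, hour, temperature) are hypothetical examples of how two publishers might expose the same indicator in different formats.

# Illustrative sketch only: field names and formats are assumptions, not the paper's schemas.
from datetime import datetime

def normalize_madrid(record):
    # Hypothetical Madrid-style record: {"fecha": "2016-05-01 12:00", "temp": "21,5"}
    return {
        "timestamp": datetime.strptime(record["fecha"], "%Y-%m-%d %H:%M"),
        "indicator": "temperature",
        "value": float(record["temp"].replace(",", ".")),  # decimal comma to decimal point
        "source": "madrid",
    }

def normalize_bilbao(record):
    # Hypothetical Bilbao-style record: {"date": "01/05/2016", "hour": "12", "temperature": 21.5}
    return {
        "timestamp": datetime.strptime(record["date"] + " " + record["hour"], "%d/%m/%Y %H"),
        "indicator": "temperature",
        "value": float(record["temperature"]),
        "source": "bilbao",
    }

NORMALIZERS = {"madrid": normalize_madrid, "bilbao": normalize_bilbao}

def integrate(records_by_source):
    """Map every source-specific record into one common schema."""
    return [NORMALIZERS[src](rec) for src, recs in records_by_source.items() for rec in recs]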

Keywords: open data, R language, data integration, environmental data

Procedia PDF Downloads 286
157 Methodology for Developing an Intelligent Tutoring System Based on Marzano’s Taxonomy

Authors: Joaquin Navarro Perales, Ana Lidia Franzoni Velázquez, Francisco Cervantes Pérez

Abstract:

The Mexican educational system faces diverse challenges related to the quality and coverage of education. The development of Intelligent Tutoring Systems (ITS) may help solve some of them by helping teachers customize their classes according to the performance of the students in online courses. In this work, we propose the adaptation of a functional ITS based on Bloom's taxonomy, called Sistema de Apoyo Generalizado para la Enseñanza Individualizada (SAGE), to measure students' metacognition and their emotional response based on Marzano's taxonomy. The students and the system will share control over progress through the course, so that students can improve their metacognitive skills. The system will not allow students to advance to new subjects until the current ones have been mastered. The interaction between the system and the student will be implemented through Natural Language Processing techniques, thus avoiding the use of sensors to evaluate the student's response. The teacher will evaluate the students' knowledge utilization, which corresponds to the highest cognitive level in Marzano's taxonomy.
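
A minimal sketch of this mastery-gating idea is shown below; it is illustrative only, not the SAGE implementation, and the subjects, prerequisite graph, and 0.8 threshold are assumptions made for the example.

# Illustrative mastery gating: a subject unlocks only when its prerequisites reach the threshold.
MASTERY_THRESHOLD = 0.8  # assumed value for illustration

# Hypothetical course graph: subject -> list of prerequisite subjects
PREREQUISITES = {
    "variables": [],
    "conditionals": ["variables"],
    "loops": ["variables", "conditionals"],
}

def available_subjects(mastery):
    """Return the subjects the student may access, given mastery scores in [0, 1]."""
    return [
        subject
        for subject, prereqs in PREREQUISITES.items()
        if all(mastery.get(p, 0.0) >= MASTERY_THRESHOLD for p in prereqs)
    ]

# Example: the student has mastered "variables" but not yet "conditionals"
print(available_subjects({"variables": 0.9, "conditionals": 0.5}))
# -> ['variables', 'conditionals']  ("loops" stays locked)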

Keywords: intelligent tutoring systems, student modelling, metacognition, affective computing, natural language processing

Procedia PDF Downloads 165
156 Contribution of the Corn Milling Industry to a Global and Circular Economy

Authors: A. B. Moldes, X. Vecino, L. Rodriguez-López, J. M. Dominguez, J. M. Cruz

Abstract:

The concept of the circular economy is focused on the importance of providing goods and services sustainably. Thus, in the future it will be necessary to respond to environmental contamination and to the need for renewable substrates by moving to a more restorative economic system that drives towards the utilization and revalorization of residues to obtain valuable products. Throughout its evolution, our industrial economy has hardly moved beyond one major characteristic established in the early days of industrialization: a linear model of resource consumption. However, this industrial consumption system cannot be maintained for much longer. On the other hand, there are many industries, like the corn milling industry, that, although they do not consume large amounts of non-renewable substrates, produce valuable streams which, when treated appropriately, could provide additional economic and environmental benefits through the extraction of commercially interesting renewable products that can replace some of the substances obtained by chemical synthesis from non-renewable substrates. From this point of view, the use of streams from the corn milling industry to obtain surface-active compounds will decrease the utilization of non-renewable sources for this kind of compound, contributing to a circular and global economy. However, the success of the circular economy depends on the interest of the industrial sectors in the revalorization of their streams through the development of relevant new business models. Thus, it is necessary to invest in the research of new alternatives that reduce the consumption of non-renewable substrates. This study proposes the utilization of a corn milling industry stream to obtain an extract with surfactant capacity. Once the biosurfactant is extracted, the corn milling stream can be commercialized as a nutritional medium in biotechnological processes or as an animal feed supplement. Usually this stream is combined with other ingredients to obtain a product known as corn gluten feed, or it may be sold separately as a liquid protein source for beef and dairy feeding or as a nutritional pellet binder. Following the production scheme proposed in this work, the corn milling industry will obtain a biosurfactant extract that could be incorporated into its production process, replacing the chemical detergents used at some points of its production chain, or commercialized as a new product of corn manufacturing. The biosurfactants obtained from the corn milling industry could replace chemical surfactants in many formulations and uses, and they exemplify the potential that many industrial streams offer for obtaining valuable products when they are managed properly.

Keywords: biosurfactants, circular economy, corn, sustainability

Procedia PDF Downloads 228
155 Understanding Evolutionary Algorithms through Interactive Graphical Applications

Authors: Javier Barrachina, Piedad Garrido, Manuel Fogue, Julio A. Sanguesa, Francisco J. Martinez

Abstract:

It is very common to observe, especially in Computer Science studies, that students have difficulty correctly understanding how some mechanisms based on Artificial Intelligence work. In addition, the scope and limitations of most of these mechanisms are usually presented by professors only in a theoretical way, which does not help students understand them adequately. In this work, we focus on the problems found when teaching Evolutionary Algorithms (EAs), which imitate the principles of natural evolution, as a method to solve parameter optimization problems. Although this kind of algorithm can be very powerful for solving relatively complex problems, students often have difficulty understanding how they work and how to apply them to real cases. In this paper, we present two interactive graphical applications that have been specially designed with the aim of making Evolutionary Algorithms easy for students to understand. Specifically, we present: (i) TSPS, an application able to solve the "Traveling Salesman Problem", and (ii) FotEvol, an application able to reconstruct a given image by using Evolution Strategies. The main objective is that students learn how these techniques can be implemented, and the great possibilities they offer.
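
To give a flavour of the mechanism such applications animate, here is a minimal (1+1) evolution strategy in Python; it is purely illustrative, is not taken from the TSPS or FotEvol source code, and uses a toy quadratic objective rather than the TSP or image reconstruction.

# Minimal (1+1) evolution strategy: one parent, one mutated child per generation,
# and the better of the two survives. Here it minimizes a toy quadratic function.
import random

def fitness(x):
    return (x - 3.0) ** 2  # toy objective: the optimum is x = 3

def one_plus_one_es(generations=200, sigma=0.5):
    parent = random.uniform(-10.0, 10.0)
    for _ in range(generations):
        child = parent + random.gauss(0.0, sigma)  # mutation
        if fitness(child) <= fitness(parent):      # selection
            parent = child
    return parent

print(round(one_plus_one_es(), 2))  # typically close to 3.0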

Keywords: education, evolutionary algorithms, evolution strategies, interactive learning applications

Procedia PDF Downloads 311