Search results for: nanopore technology
915 Development of Technologies for the Treatment of Nutritional Problems in Primary Care
Authors: Marta Fernández Batalla, José María Santamaría García, Maria Lourdes Jiménez Rodríguez, Roberto Barchino Plata, Adriana Cercas Duque, Enrique Monsalvo San Macario
Abstract:
Background: Primary Care Nursing is taking on more autonomy in clinical decisions. One of the most frequent care problems to address relates to maintaining a sufficient supply of food. Nursing diagnoses related to feeding are addressed first by the family and community nurse. Objectives and interventions are set according to each patient. To improve goal setting and the treatment of these care problems, a technological tool was developed to help nurses. Objective: To evaluate the computational tool developed to support clinical decisions on feeding problems. Material and methods: A cross-sectional descriptive study was carried out at the Meco Health Center, Madrid, Spain. The study population consisted of four specialist nurses in primary care. These nurses tested the tool on 30 people with a ‘need for nutritional therapy’. Subsequently, the usability of the tool and the satisfaction of the professionals were assessed. Results: A simple and convenient computational tool was designed. It has three main input fields: age, height, and sex. The tool returns the following information: BMI (Body Mass Index) and the calories consumed by the person. The next step is the caloric calculation depending on activity. It is possible to propose a goal BMI or weight to achieve, from which the amount of calories to be consumed is proposed. After use, it was determined that the tool calculated BMI and calories correctly in 100% of clinical cases. Satisfaction with the nutritional assessment was ‘satisfactory’ or ‘very satisfactory’, linked to the speed of operations. As a point of improvement, nurses suggested adding ‘stress factor’ options linked to weekly physical activity. Conclusion: Based on the results, it is clear that computational decision-support tools are useful in the clinic. Nurses are not only consumers of computational tools, but can develop their own tools.
These technological solutions improve the effectiveness of nutrition assessment and intervention. We are currently working on improvements such as the calculation of protein percentages as a function of stress parameters.
Keywords: feeding behavior health, nutrition therapy, primary care nursing, technology assessment
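As a minimal sketch of the kind of calculation such a tool performs (the abstract does not specify the caloric formula used; the Mifflin-St Jeor equation below is an assumption for illustration):

```python
def bmi(weight_kg, height_m):
    """Body Mass Index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

def daily_calories(weight_kg, height_cm, age, sex, activity_factor=1.2):
    """Estimated daily caloric need: basal metabolic rate (here the
    Mifflin-St Jeor equation, an assumed formula) times an activity factor."""
    bmr = 10 * weight_kg + 6.25 * height_cm - 5 * age + (5 if sex == "M" else -161)
    return bmr * activity_factor

print(round(bmi(70, 1.75), 1))  # 22.9
```

A goal BMI can then be inverted to a target weight (weight = BMI × height²), from which a caloric deficit or surplus is proposed.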
Procedia PDF Downloads 229
914 Preferences of Electric Buses in Public Transport; Conclusions from Real Life Testing in Eight Swedish Municipalities
Authors: Sven Borén, Lisiana Nurhadi, Henrik Ny
Abstract:
From a theoretical perspective, electric buses can be more sustainable and cheaper than fossil-fuelled buses in city traffic. The authors have not found other studies based on actual urban public transport in a Swedish winter climate, and existing noise measurements from buses for the European market were found to be outdated. The aim of this follow-up study was therefore to test, and possibly verify in a real-life environment, how energy efficient and silent electric buses are, and then to conclude whether electric buses are preferable for use in public transport. The Ebusco 2.0 electric bus, fitted with a 311 kWh battery pack, was used, and the tests were carried out during November 2014-April 2015 in eight municipalities in the south of Sweden. Six tests took place in urban traffic and two in more of a rural traffic setting. The energy use for propulsion was measured via logging of the internal system in the bus and via an external charging meter. The average energy use turned out to be 8% less (0.96 kWh/km) than assumed in the earlier theoretical study. This rate allows for a 320 km range in public urban traffic. The interior of the bus was kept warm by a diesel heater (biodiesel will probably be used in a future operational traffic situation), which used 0.67 kWh/km in January. This verified that electric buses can be up to 25% cheaper when used in public transport in cities for about eight years. The noise was found to be lower, primarily during acceleration, than for buses with combustion engines in urban bus traffic. According to our surveys, most passengers and drivers appreciated the silent and comfortable ride and preferred electric buses over combustion engine buses. Bus operators and passenger transport executives were also positive about starting to use electric buses for public transport. The operators did, however, point out that procurement processes need to account for eventual risks regarding this new technology, along with personnel education.
The study revealed that it is possible to establish a charging infrastructure for almost all studied bus lines. However, the design of a charging infrastructure for each municipality requires further investigation, including electric grid capacity analysis, smart location of charging points, and tailored schedules to allow fast charging. In conclusion, electric buses proved to be a preferable alternative for all stakeholders involved in public bus transport in the studied municipalities. However, in order for electric buses to be a prominent support for sustainable development, they need to be charged either by stand-alone units or via an expansion of the electric grid, and the electricity should be made from new renewable sources.
Keywords: sustainability, electric, bus, noise, greencharge
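The reported range figure can be checked from the battery capacity and measured propulsion consumption (a back-of-the-envelope sketch; the diesel heater does not draw from the battery):

```python
battery_kwh = 311              # Ebusco 2.0 battery pack
propulsion_kwh_per_km = 0.96   # measured average energy use for propulsion

range_km = battery_kwh / propulsion_kwh_per_km
print(round(range_km))  # 324, consistent with the ~320 km range reported
```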
Procedia PDF Downloads 344
913 Distinct Patterns of Resilience Identified Using Smartphone Mobile Experience Sampling Method (M-ESM) and a Dual Model of Mental Health
Authors: Hussain-Abdulah Arjmand, Nikki S. Rickard
Abstract:
The response to stress can be highly heterogeneous and may be influenced by methodological factors. The integrity of data is optimized by measuring both positive and negative affective responses to an event, by measuring responses in real time as close to the stressful event as possible, and by utilizing data collection methods that do not interfere with naturalistic behaviours. The aim of the current study was to explore short-term prototypical responses to major stressor events on outcome measures encompassing both positive and negative indicators of psychological functioning. A novel mobile experience sampling methodology (m-ESM) was utilized to monitor affective responses to stressors in real time. A smartphone mental health app (‘Moodprism’), which prompts users daily to report both their positive and negative mood, as well as whether any significant event had occurred in the past 24 hours, was developed for this purpose. A sample of 142 participants was recruited as part of the promotion of this app. Participants’ daily reported experience of stressor events, levels of depressive symptoms, and positive affect were collected across a 30-day period as they used the app. For each participant, major stressor events were identified based on the subjective severity of the event as rated by the user. Depression and positive affect ratings were extracted for the three days following the event. Responses to the event were scaled relative to the participant's general reactivity across the remainder of the 30-day period. Participants were first clustered into groups based on initial reactivity and subsequent recovery following a stressor event. This revealed distinct patterns of responding along depressive symptomatology and positive affect. Participants were then grouped based on allocations to clusters in each outcome variable. A highly individualised pattern of responding to stressor events, in both symptoms of depression and levels of positive affect, was observed.
A complete description of the novel profiles identified will be presented at the conference. These findings suggest that real-time measurement of both positive and negative functioning in response to stressors yields a more complex set of responses than previously observed with retrospective reporting. The use of smartphone technology to measure individualized responding also proved to yield significant insight.
Keywords: depression, experience sampling methodology, positive functioning, resilience
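One plausible reading of "scaled relative to general reactivity" is a within-person z-score against the rest of the 30-day period; the sketch below is an illustrative assumption, not the authors' exact procedure:

```python
from statistics import mean, stdev

def scaled_response(post_event_ratings, baseline_ratings):
    """Scale the post-stressor daily mood ratings against the person's
    own baseline mean and variability (a simple within-person z-score)."""
    mu, sigma = mean(baseline_ratings), stdev(baseline_ratings)
    return [(r - mu) / sigma for r in post_event_ratings]

print(scaled_response([6, 5, 4], [2, 4, 6]))  # [1.0, 0.5, 0.0]
```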
Procedia PDF Downloads 240
912 The Use of Artificial Intelligence in Digital Forensics and Incident Response in a Constrained Environment
Authors: Dipo Dunsin, Mohamed C. Ghanem, Karim Ouazzane
Abstract:
Digital investigators often have a hard time spotting evidence in digital information. It has become hard to determine which source of proof relates to a specific investigation. A growing concern is that the various processes, technologies, and specific procedures used in digital investigation are not keeping up with criminal developments, and criminals are taking advantage of these weaknesses to commit further crimes. In digital forensics investigations, artificial intelligence (AI) is invaluable in identifying crime. It has been observed that algorithms based on AI are highly effective in detecting risks, preventing criminal activity, and forecasting illegal activity. The goal of digital forensics and digital investigation is to provide objective data and conduct an assessment that will assist in developing a plausible theory that can be presented as evidence in court. Researchers and other authorities have used the available data as evidence in court to convict persons. This research paper aims to develop a multiagent framework for digital investigations using specific intelligent software agents (ISAs). The agents communicate to address particular tasks jointly and keep the same objectives in mind during each task. The rules and knowledge contained within each agent depend on the investigation type. A criminal investigation is classified quickly and efficiently using the case-based reasoning (CBR) technique. The MADIK framework is implemented using the Java Agent Development Framework (JADE) in Eclipse, with a Postgres repository and a rule engine for agent reasoning. The proposed framework was tested using the Lone Wolf image files and datasets. Experiments were conducted using various sets of ISAs and VMs. There was a significant reduction in the time taken for the Hash Set Agent to execute.
As a result of loading the agents, 5 percent of the time was lost, as the File Path Agent prescribed deleting 1,510, while the Timeline Agent found multiple executable files. In comparison, the integrity check carried out on the Lone Wolf image file using a digital forensic toolkit took approximately 48 minutes (2,880 s), whereas the MADIK framework accomplished this in 16 minutes (960 s). The framework is integrated with Python, allowing for further integration of other digital forensic tools, such as AccessData Forensic Toolkit (FTK), Wireshark, Volatility, and Scapy.
Keywords: artificial intelligence, computer science, criminal investigation, digital forensics
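The core check a Hash Set Agent performs can be sketched as follows (an illustrative stand-in: MADIK's agents are Java-based, and the function name and hash choice here are assumptions):

```python
import hashlib

def hash_set_agent(file_paths, known_hashes):
    """Flag files whose SHA-256 digest appears in a known-hash set,
    the kind of lookup a hash set agent automates during triage."""
    hits = []
    for path in file_paths:
        with open(path, "rb") as f:
            digest = hashlib.sha256(f.read()).hexdigest()
        if digest in known_hashes:
            hits.append(path)
    return hits
```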
Procedia PDF Downloads 213
911 Development of Coastal Inundation–Inland and River Flow Interface Module Based on 2D Hydrodynamic Model
Authors: Eun-Taek Sin, Hyun-Ju Jang, Chang Geun Song, Yong-Sik Han
Abstract:
Due to climate change, coastal urban areas repeatedly suffer loss of property and life from flooding. There are three main causes of inland submergence. First, when heavy rain with high intensity occurs, inland water cannot drain into rivers because of the increase in impervious surfaces from land development and defects in pumps and storm sewers. Second, river inundation occurs when the water surface level surpasses the top of the levee. Finally, coastal inundation occurs due to rising seawater. However, previous studies ignored the complex mechanism of flooding and showed discrepancies and inadequacies due to linear summation of each analysis result. In this study, inland flooding and river inundation were analyzed together by the HDM-2D model. The Petrov-Galerkin stabilizing method and a flux-blocking algorithm were applied to simulate the inland flooding. In addition, sink/source terms with an exponential growth rate were added to the shallow water equations to include the inland flooding analysis module. The applications of the developed model gave satisfactory results and provided accurate prediction in comprehensive flooding analysis. To consider the coastal surge, another module was developed by adding seawater to the existing inland flooding-river inundation binding module for comprehensive flooding analysis. Based on the combined modules, the coastal inundation-inland and river flow interface was simulated by inputting flow rate and depth data in an artificial flume. Accordingly, it was possible to analyze the flood patterns of coastal cities over time. This study is expected to help identify the complex causes of flooding in coastal areas where complex flooding occurs and to assist in analyzing damage to coastal cities.
Acknowledgements: This research was supported by the grant ‘Development of the Evaluation Technology for Complex Causes of Inundation Vulnerability and the Response Plans in Coastal Urban Areas for Adaptation to Climate Change’ [MPSS-NH-2015-77] from the Natural Hazard Mitigation Research Group, Ministry of Public Safety and Security of Korea.
Keywords: flooding analysis, river inundation, inland flooding, 2D hydrodynamic model
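The sink/source extension of the shallow water equations mentioned above can be written, in a generic textbook form (not necessarily the paper's exact formulation), as:

```latex
\begin{aligned}
\frac{\partial h}{\partial t}
  + \frac{\partial (hu)}{\partial x}
  + \frac{\partial (hv)}{\partial y} &= q, \qquad q = q_0\, e^{\lambda t},\\
\frac{\partial (hu)}{\partial t}
  + \frac{\partial}{\partial x}\!\left(hu^2 + \tfrac{1}{2} g h^2\right)
  + \frac{\partial (huv)}{\partial y} &= g h \left(S_{0x} - S_{fx}\right),\\
\frac{\partial (hv)}{\partial t}
  + \frac{\partial (huv)}{\partial x}
  + \frac{\partial}{\partial y}\!\left(hv^2 + \tfrac{1}{2} g h^2\right) &= g h \left(S_{0y} - S_{fy}\right),
\end{aligned}
```

where h is the water depth, (u, v) the depth-averaged velocities, g gravity, S₀ and S_f the bed and friction slopes, and q the sink/source term whose exponential growth rate λ represents the inflow behaviour described in the abstract.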
Procedia PDF Downloads 363
910 Global Modeling of Drill String Dragging and Buckling in 3D Curvilinear Bore-Holes
Authors: Valery Gulyayev, Sergey Glazunov, Elena Andrusenko, Nataliya Shlyun
Abstract:
Enhancement of technologies and techniques for drilling deep directed oil and gas bore-wells is of essential industrial significance because these wells make it possible to increase productivity and output. Generally, they are used for drilling in hard and shale formations, which is why their drivage is often accompanied by emergency and failure effects. As is corroborated by practice, the principal drawback occurring in the drivage of long curvilinear bore-wells is the need to overcome essential force hindrances caused by the simultaneous action of gravity, contact, and friction forces. Primarily, these forces depend on the type of technological regime, drill string stiffness, bore-hole tortuosity, and its length. They can lead to Eulerian buckling of the drill string and its sticking. To predict and exclude these states, special mathematical models and methods of computer simulation should play a dominant role. At the same time, one might note that these mechanical phenomena are very complex, and only simplified approaches (‘soft string drag and torque models’) are used for their analysis. Taking into consideration that the cost of directed wells now increases essentially with the complication of their geometry and the enlargement of their lengths, it can be concluded that the price of mistakes in drill string behavior simulation through the use of simplified approaches can be very high, so the problem of correct software elaboration is very urgent. This paper deals with the problem of simulating the regimes of drilling deep curvilinear bore-wells with prescribed imperfect geometrical trajectories of their axial lines.
On the basis of the theory of curvilinear flexible elastic rods, methods of differential geometry, and numerical analysis methods, a 3D ‘stiff-string drag and torque model’ of drill string bending and the appropriate software are elaborated for the simulation of tripping in and out regimes and drilling operations. It is shown by computer calculations that the contact and friction forces can be calculated and regulated, providing predesigned trouble-free modes of operation. The elaborated mathematical models and software can be used for the prognostication and exclusion of emergency situations at the design and realization stages of the drilling process.
Keywords: curvilinear drilling, drill string tripping in and out, contact forces, resistance forces
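For contrast with the stiff-string model developed here, the classical soft-string drag recursion over a segment of length Δs (a Johancsik-type formulation; the symbols are standard, not taken from the paper) reads:

```latex
\begin{aligned}
N_i &= \sqrt{\left(F_i\,\Delta\varphi\,\sin\bar\theta\right)^2
           + \left(F_i\,\Delta\theta + w\,\Delta s\,\sin\bar\theta\right)^2},\\
F_{i+1} &= F_i + w\,\Delta s\,\cos\bar\theta \pm \mu\, N_i,\\
M_{i+1} &= M_i + \mu\, r\, N_i,
\end{aligned}
```

where F is the axial force, N the segment contact force, M the torque, w the buoyed weight per unit length, θ̄ the mean inclination, Δθ and Δφ the inclination and azimuth changes, μ the friction coefficient, and r the pipe radius; the ± sign distinguishes tripping out (+) from tripping in (-). A stiff-string model additionally accounts for bending stiffness and radial clearance, which this recursion neglects.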
Procedia PDF Downloads 147
909 A Low-Cost and Easy-To-Operate Remediation Technology of Heavy Metals Contaminated Agricultural Soil
Authors: Xiao-Hua Zhu, Xin Yuan, Yi-Ran Zhao
Abstract:
High cadmium (Cd) pollution in rice is a serious problem in many parts of China. Many kinds of remediation technologies have been tested and applied on farmland. Because of the productive function of farmland, most technologies are inappropriate due to the destruction they cause to the tillage soil layer. The large labour requirements and high costs of many technologies are also restrictive factors for their application. The concept of a 'Root Micro-Geochemical Barrier' was proposed to reduce Cd bioavailability and the concentration of cadmium in rice. Remediation and mitigation techniques were demonstrated on contaminated farmland downstream of a mine. According to the rules of rice growth, Cd is absorbed by the crop in every growth stage, and the absorption efficiency is almost highest in the early tillering stage. We therefore need a method to protect crops from heavy metal pollution that begins to work from the early growth stage. Many materials with remediation properties attracted our attention. Such a material creates a barrier that prevents Cd from being absorbed by the crop during the whole growing process, because the material adsorbs soil Cd and causes it to lose its migration activity. A low-cost way to introduce the materials into the crop-growing system as early as possible should also be chosen. Each rice plant has a small root zone, with roots reaching about 15 cm deep and 15 cm wide. This small root radiation area makes it possible for all the Cd approaching the roots to be adsorbed with a small amount of adsorbent. By mixing the remediation materials with the seed-raising soil and adding them to the tillage soil in the process of transplanting seedlings, we can control soil Cd activity in the range of the roots and thus reduce the amount of Cd absorbed by the crop. Of course, the mineral materials must have sufficient adsorptive capacity and cause no additional pollution.
More than 3,000 square meters of farmland have been remediated. With the application of the root micro-geochemical barrier, the Cd concentration in rice and the remediation cost have been decreased by 90% and 80%, respectively, with little extra labour for the farmers. The Cd concentrations in rice from remediated farmland have been controlled below 0.1 ppm. The remediation of one acre of contaminated cropland costs less than $100. The concept has its advantage in the remediation of paddy fields contaminated by Cd, especially fields with outside pollution sources.
Keywords: cadmium pollution, growth stage, cost, root micro-geochemistry barrier
Procedia PDF Downloads 87
908 Comics as an Intermediary for Media Literacy Education
Authors: Ryan C. Zlomek
Abstract:
The value of using comics in the literacy classroom has been explored since the 1930s. At that point in time, researchers had begun to implement comics into daily lesson plans and, in some instances, had started the development process for comics-supported curricula. In the mid-1950s, this type of research was cut short due to the work of psychiatrist Frederic Wertham, whose research seemingly discovered a correlation between comic readership and juvenile delinquency. Since Wertham’s allegations, the comics medium has had a hard time finding its way back to education. Now, over fifty years later, the definition of literacy is in mid-transition as the world has become more visually oriented and students require the ability to interpret images as often as words. Through this transition, comics have found a place in the field of literacy education research as the shift focuses from traditional print to multimodal and media literacies. Comics are now believed to be an effective resource in bridging the gap between these different types of literacies. This paper seeks to better understand what students learn from the process of reading comics and how those skills line up with the core principles of media literacy education in the United States. In the first section, comics are defined to determine the exact medium being examined. The different conventions that the medium utilizes are also discussed. In the second section, the comics reading process is explored through a dissection of the ways a reader interacts with the page, panel, gutter, and different comic conventions found within a traditional graphic narrative. The concepts of intersubjective acts and visualization are attributed to the comics reading process as readers draw on real-world knowledge to decode meaning. In the next section, the learning processes that comics encourage are explored in parallel with the core principles of media literacy education.
Each principle is explained, and the extent to which comics can act as an intermediary for this type of education is theorized. In the final section, the author examines the use of comics in his computer science and technology classroom. He lays out different theories he utilizes from Scott McCloud’s text Understanding Comics and how he uses them to break down media literacy strategies with his students. The article concludes with examples of how comics have positively impacted classrooms around the United States. It is stated that integrating comics into the classroom will not solve all issues related to literacy education but, rather, that comics can be a powerful multimodal resource for educators looking for new mediums to explore with their students.
Keywords: comics, graphic novels, mass communication, media literacy, metacognition
Procedia PDF Downloads 301
907 The Challenges of Well Integrity on Plug and Abandoned Wells for Offshore CO₂ Storage Site Containment
Authors: Siti Noor Syahirah Mohd Sabri
Abstract:
The oil and gas industry is committed to net-zero carbon emissions because the consequences of climate change could be catastrophic unless responded to very soon. One way of reducing CO₂ emissions is to inject the gas into a depleted reservoir buried underground. This greenhouse gas reduction technique significantly reduces the CO₂ released into the atmosphere. In general, depleted oil and gas reservoirs provide readily available sites for the storage of CO₂ in offshore areas, mainly because the hydrocarbons have been optimally produced, leaving voids for effective CO₂ storage; this makes them good candidates for CO₂ injection well locations. Geological storage sites are often evaluated in terms of capacity, injectivity, and containment. Leakage through the cap rock or existing wells is the main concern in depleted fields. In order to develop these fields as CO₂ storage sites, the long-term integrity of wells drilled in these oil and gas fields must be ascertained to ensure good CO₂ containment. Well integrity is often defined as the ability to contain fluids without significant leakage throughout the project lifecycle. Most plugged and abandoned (P&A) wells in Peninsular Malaysia were drilled 20-30 years ago and were not designed to withstand downhole conditions with >50 vol% CO₂ and CO₂/H₂O mixtures. In addition, Corrosion-Resistant Alloy (CRA) tubulars and CO₂-resistant cement were not used during well construction. The reservoir pressure and temperature conditions may have further degraded the material strength and elevated the corrosion rate. Understanding all the uncertainties that may have affected cement-casing bonds, such as the quality of cement behind the casing, subsidence effects, corrosion rate, etc., is the first step toward well integrity evaluation. Secondly, proper quantification of all the uncertainties involved needs to be done to ensure long-term underground storage objectives for CO₂ are achieved.
This paper will discuss challenges associated with estimating the performance of well barrier elements in existing P&A wells. Risk ranking of the existing P&A wells is to be carried out in order to ensure the integrity of the storage site is maintained for long-term CO₂ storage. High-risk existing P&A wells are to be re-entered to restore well integrity and to reduce future leakage. In addition, the requirement to design a fit-for-purpose monitoring and mitigation technology package for potential CO₂ leakage/seepage in the marine environment will be discussed. This holistic approach will ensure that integrity is maintained and CO₂ is contained underground for years to come.
Keywords: CCUS, well integrity, CO₂ storage, offshore
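A risk ranking of existing P&A wells could be sketched as a weighted scoring scheme; the factor names and weights below are illustrative assumptions, not those used in the paper:

```python
def rank_pa_wells(wells, weights):
    """Sort wells from highest to lowest weighted risk score.
    Each well is a dict of normalized risk factors in [0, 1]."""
    def score(well):
        return sum(weight * well[factor] for factor, weight in weights.items())
    return sorted(wells, key=score, reverse=True)

# Hypothetical factors: cement bond quality, well age, corrosion indicators.
weights = {"cement_bond_risk": 0.5, "age_risk": 0.3, "corrosion_risk": 0.2}
```

Wells at the top of the ranking would be the candidates for re-entry and barrier restoration.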
Procedia PDF Downloads 91
906 Regeneration of a Liquid Desiccant Using Membrane Distillation to Unlock Coastal Desert Agriculture Potential
Authors: Kimberly J. Cribbs, Ryan M. Lefers, TorOve Leiknes, Noreddine Ghaffour
Abstract:
In Gulf Cooperation Council (GCC) countries, domestic agriculture is hindered by a lack of freshwater, poor soil quality, and ambient temperatures unsuitable for cultivation, resulting in a heavy reliance on imported food. Attempts to minimize the risk of food insecurity by growing crops domestically create a significant demand on limited freshwater resources in this region. Cultivating food in a greenhouse allows some of these challenges, such as poor soil quality and temperatures unsuitable for cultivation, to be overcome. One of the most common methods for greenhouse cooling is evaporative cooling. This method cools the air by the evaporation of water and requires a large amount of water relative to that needed for plant growth, as well as air with low relative humidity. Considering that much of the population in GCC countries lives within 100 km of a coast and that seawater can be utilized for evaporative cooling, coastal agriculture could reduce the risk of food insecurity and lower the freshwater demand. Unfortunately, coastal regions tend to experience both high temperatures and high relative humidity, causing evaporative cooling by itself to be inadequate; therefore, dehumidification is needed prior to evaporative cooling. Utilizing a liquid desiccant for air dehumidification is promising, but regenerating the desiccant to restore its dehumidification potential remains a significant obstacle to the adoption of this technology. This project studied the regeneration of a magnesium chloride (MgCl₂) desiccant solution from 20 wt% to 30 wt% by direct contact membrane distillation (DCMD) and explored the possibility of using the recovered water for irrigation. Two 0.2 µm hydrophobic PTFE membranes were tested at feed temperatures of 80, 70, and 60°C with a permeate temperature of 20°C. It was observed that the permeate flux increases as the difference between the feed and coolant temperatures increases, and also as the feed concentration decreases.
At 21 wt%, the permeate flux was 34, 17, and 14 L m⁻² h⁻¹ for feed temperatures of 80, 70, and 60°C, respectively. Salt rejection decreased over time; however, it remained greater than 99.9% over an experimental time span of 10 hours. The results show that DCMD can successfully regenerate the magnesium chloride desiccant solution.
Keywords: agriculture, direct contact membrane distillation, GCC countries, liquid desiccant, water recovery
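The regeneration target implies a simple mass balance on the water the membrane must permeate (salt mass is conserved in the feed loop); a sketch:

```python
def permeate_mass(feed_kg, c_in=0.20, c_out=0.30):
    """Water mass (kg) to remove in order to concentrate a MgCl2
    desiccant solution from mass fraction c_in to c_out."""
    salt_kg = feed_kg * c_in    # salt stays in the feed loop
    final_kg = salt_kg / c_out  # solution mass at the target concentration
    return feed_kg - final_kg

print(round(permeate_mass(100), 1))  # 33.3 kg of water per 100 kg of 20 wt% feed
```

This permeated water, given the >99.9% salt rejection reported, is the stream the authors propose recovering for irrigation.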
Procedia PDF Downloads 151
905 Characterization of Soil Microbial Communities from Vineyard under a Spectrum of Drought Pressures in Sensitive Area of Mediterranean Region
Authors: Gianmaria Califano, Júlio Augusto Lucena Maciel, Olfa Zarrouk, Miguel Damasio, Jose Silvestre, Ana Margarida Fortes
Abstract:
Global warming, with rapid and sudden changes in meteorological conditions, is one of the major constraints to ensuring agricultural and crop resilience in Mediterranean regions. Several strategies are being adopted to reduce the pressure of drought stress on grapevines at regional and local scales: improvements in irrigation systems, adoption of interline cover crops, and adaptation of pruning techniques. However, more can still be achieved if the microbial compartments associated with plants are also considered in crop management. It is known that microbial communities change according to several factors such as latitude, plant variety, age, rootstock, soil composition, and agricultural management system. Considering the increasing pressure of biotic and abiotic stresses, it is essential to also evaluate the effects of drought on the microbiome associated with the grapevine, a commercially important crop worldwide. In this study, we characterize the diversity and structure of the microbial community under three long-term irrigation levels (100% ETc, 50% ETc, and rain-fed) in Syrah, a drought-tolerant grapevine cultivar grown worldwide. To avoid the limitations of culture-dependent methods, amplicon sequencing with target primers for bacteria and fungi was applied to the same soil samples. The use of the DNeasy PowerSoil (Qiagen) extraction kit required further optimization, with the use of lytic enzymes and heating steps, to systematically improve DNA yield and quality across biological treatments. Target regions (16S rRNA and ITS genes) of our samples are being sequenced with Illumina technology. With bioinformatic pipelines, it will be possible to characterize bacterial and fungal diversity, structure, and composition.
Further, the microbial communities will be assessed for their functional activity, which remains an important metric considering the strong inter-kingdom interactions existing between plants and their associated microbiome. The results of this study will lay the basis for biotechnological applications: in combination with the establishment of a bacterial library, it will be possible to explore testing synthetic microbial communities to support plant resistance to water scarcity.
Keywords: microbiome, metabarcoding, soil, grapevine, syrah, global warming, crop sustainability
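Once amplicon sequence variants are tabulated, alpha diversity is typically summarized with standard indices; below is a minimal sketch of the Shannon index (one common choice, not necessarily the metric the authors will use):

```python
from math import log

def shannon(counts):
    """Shannon diversity H' from a sample's ASV/OTU count vector."""
    total = sum(counts)
    proportions = [c / total for c in counts if c > 0]
    return -sum(p * log(p) for p in proportions)

print(round(shannon([10, 10]), 3))  # 0.693, i.e. ln(2) for two equally abundant taxa
```

Comparing such indices across the 100% ETc, 50% ETc, and rain-fed treatments is one straightforward way to test whether irrigation level shapes community diversity.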
Procedia PDF Downloads 128
904 A Discussion on Urban Planning Methods after Globalization within the Context of Anticipatory Systems
Authors: Ceylan Sozer, Ece Ceylan Baba
Abstract:
The reforms and changes that began with industrialization in cities and continued with globalization in the 1980s created many changes in urban environments. City centers that had been desolated by industrialization began to grow crowded again with globalization and became the heart of technology, commerce, and social activities. While immediate and intense alterations were planned around rigorous visions in developed countries, several urban areas where these processes were underestimated and no precautions were taken faced irrevocable situations. When the effects of globalization on cities are examined, it is seen that some cities made anticipatory system plans for future problems. Several cities, such as New York, London, and Tokyo, planned to resolve probable future problems in a systematic scheme to decrease the possible side effects of globalization. The decisions in urban planning and their applications are the main points in terms of sustainability and livability in such mega-cities. This article examines the effects of globalization on urban planning through three mega-cities and their planning applications. When the urban plans of the three mega-cities are investigated, it is seen that the plans are generated in light of past experiences and predictions of a certain future. In urban planning, the past and present experiences of a city should be examined, and future projections should then be made together with current world dynamics in a systematic way. In this study, methods used in urban planning will be discussed, the ‘Anticipatory System’ model will be explained, and its relation to global urban planning will be discussed. The concept of ‘anticipation’ is a phenomenon that means creating foresights and predictions about the future by combining past, present, and future within an action plan.
The main distinctive feature that separates anticipatory systems from other systems is that they combine past, present and future and conclude with an act. Urban plans that bring together various parameters and interactions are identified as ‘live’ and have systematic integrity. Urban planning based on an anticipatory system is itself alive and can foresee ‘side effects’ during the design process. After globalization, cities became more complex and should be designed within an anticipatory system model; such cities can be more livable and sustain their urban conditions both today and in the future. In this study, the urban planning of Istanbul is analyzed in comparison with the city plans of New York, Tokyo and London in terms of anticipatory system models, and the lack of such a system in Istanbul and its side effects are discussed. When past and present actions in urban planning are approached through an anticipatory system, more accurate and sustainable results can be achieved in the future. Keywords: globalization, urban planning, anticipatory system, New York, London, Tokyo, Istanbul
Procedia PDF Downloads 144
903 Energy Usage in Isolated Areas of Honduras
Authors: Bryan Jefry Sabillon, Arlex Molina Cedillo
Abstract:
Currently, the rise in demand for electrical energy driven by technological development and population growth, together with projections made by the International Energy Agency (‘La Agencia Internacional de la Energía’, AIE) and research institutes, reveals alarming data about the expected increase over the next few decades. Awareness must therefore be raised about the rational and efficient use of this resource. Because of the global concern with providing electrical energy to isolated areas, projects for energy generation from renewable resources are commonly carried out. From a socioeconomic and cultural point of view, a positive impact on society can be foreseen from access to this resource. This article focuses on the great potential that Honduras shows as a country looking to produce renewable energy in response to the crisis it is experiencing today. We present detailed research on the main necessities that rural communities face today, with the aim of allaying the negative effects of the scarcity of electrical energy. We also discuss which type of electrical generation method should be used according to the disposition, geography, climate and, of course, accessibility of each area. Honduras is currently developing new methods of energy generation; it is therefore relevant to discuss renewable energy, the exploitation of which is a global trend. At present the country’s main generation methods are hydroelectric, thermal, wind, biomass and photovoltaic (the latter being one of the main sources of clean electrical generation). 
The use of these resources was made possible in part by studies from organizations focused on electrical energy and its demand, such as the German development agency (‘La Cooperación Alemana’, GIZ), ‘La Secretaría de Energía y Recursos Naturales’ (SERNA) and ‘El Banco Centroamericano de Integración Económica’ (BCIE), which provided the complete guide to the protocol to be followed in the three stages of such projects: 1) licences and permits, 2) financial aspects, and 3) inscription under the Kyoto Protocol. This article aims to guide the reader through the information needed (according to the accessibility challenges each zone may present) to choose the best option for generating electricity from renewable resources in zones that are totally isolated from the grid. We conclude that the use of hybrid energy generation systems for small remote communities has a positive impact, not only by providing electrical energy but also through improvements in education, health, sustainable agriculture and livestock, and, of course, advances in energy generation, which is the main concern of this article. Keywords: energy, isolated, renewable, accessibility
Procedia PDF Downloads 232
902 Increased Efficiency during Oxygen Carrier Aided Combustion of Municipal Solid Waste in an Industrial Scaled Circulating Fluidized Bed-Boiler
Authors: Angelica Corcoran, Fredrik Lind, Pavleta Knutsson, Henrik Thunman
Abstract:
Solid waste volumes are currently deposited predominantly in landfills. Furthermore, impending climate change requires new solutions for a sustainable future energy mix. At present, solid waste is used globally only to a small extent as a fuel in combustion for heat and power production. Due to its variable composition and size, solid waste is considered difficult to combust and requires a technology with high fuel flexibility. One of the commercial technologies used for the combustion of such difficult fuels is the circulating fluidized bed (CFB). In a CFB boiler, fine particles of a solid material are used as 'bed material', which is accelerated by the incoming combustion air, causing the bed material to fluidize. The bed material has conventionally been silica sand, whose main purpose is to act as a heat carrier, transferring heat released by the combustion to the heat-transfer surfaces. However, the release of volatile compounds occurs rapidly in comparison with the lateral mixing in the combustion chamber. To ensure complete combustion, a surplus of air is introduced, which decreases the total efficiency of the boiler. In recent years, the concept of partly or entirely replacing the silica sand with an oxygen carrier as bed material has been developed. By introducing an oxygen carrier into the combustion chamber, combustion can be spread out both temporally and spatially in the boiler. Specifically, the oxygen carrier can take up oxygen from the combustion air where it is in abundance and release it to combustible gases where oxygen is in deficit. The concept is referred to as oxygen carrier aided combustion (OCAC), with the natural ore ilmenite (FeTiO₃) as the oxygen carrier. The authors have validated the oxygen buffering ability of ilmenite during combustion of biomass in the Chalmers 12-MWth CFB boiler in previous publications. 
Furthermore, the concept has been demonstrated at full industrial scale during combustion of municipal solid waste (MSW) in E.ON’s 75-MWth CFB boiler. The experimental campaigns showed increased mass transfer of oxygen inside the boiler when combusting both biomass and MSW. As a result, a higher degree of burnout is achieved inside the combustion chamber and the plant can be operated at a lower surplus of air. Moreover, the buffer of oxygen provided by the oxygen carrier makes the system less sensitive to disruptions in operation. In conclusion, combusting difficult fuels with OCAC results in higher operational stability and an increase in boiler efficiency. Keywords: OCAC, ilmenite, combustion, CFB
Procedia PDF Downloads 241
901 Influence of CO₂ on the Curing of Permeable Concrete
Authors: A. M. Merino-Lechuga, A. González-Caro, D. Suescum-Morales, E. Fernández-Ledesma, J. R. Jiménez, J. M. Fernández-Rodriguez
Abstract:
Since the mid-19th century, the economy and industry have grown exponentially. This has led to an increase in pollution due to rising greenhouse gas (GHG) emissions and the accumulation of waste, making a future scarcity of raw materials and natural resources increasingly imminent. Carbon dioxide (CO₂) is one of the primary greenhouse gases, accounting for up to 55% of GHG emissions. The manufacturing of construction materials generates approximately 73% of CO₂ emissions, with Portland cement production contributing 41% of this figure. Hence, there is scientific and social alarm regarding the carbon footprint of construction materials and their influence on climate change. Carbonation of concrete is a natural process whereby CO₂ from the environment penetrates the material, primarily through pores and microcracks. Once inside, carbon dioxide reacts with calcium hydroxide (Ca(OH)₂) and/or C-S-H, yielding calcium carbonate (CaCO₃) and silica gel. Consequently, construction materials act as carbon sinks. This research investigated the effect of accelerated carbonation on the physical, mechanical, and chemical properties of two types of non-structural vibrated concrete pavers (conventional and draining) made from natural aggregates and two types of recycled aggregates from construction and demolition waste (CDW). Natural aggregates were replaced by recycled aggregates using a volumetric substitution method, and the CO₂ capture capacity was calculated. Two curing environments were used: a carbonation chamber with 5% CO₂ and a standard climatic chamber at atmospheric CO₂ concentration. Additionally, the effect of curing times of 1, 3, 7, 14, and 28 days on concrete properties was analyzed. Accelerated carbonation increased the apparent dry density, reduced water-accessible porosity, improved compressive strength, and shortened the time needed to achieve greater mechanical strength. 
The maximum CO₂ capture ratio (52.52 kg/t) was achieved with recycled concrete aggregate in the draining paver. Accelerated carbonation conditions led to a 525% increase in carbon capture compared to curing under atmospheric conditions. Accelerated carbonation of cement-based products containing recycled aggregates from construction and demolition waste is a promising technology for CO₂ capture and utilization, offering a means to mitigate the effects of climate change and promote the new paradigm of the circular economy. Keywords: accelerated carbonation, CO₂ curing, CO₂ uptake, construction and demolition waste, circular economy
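A capture ratio of this order can be cross-checked against the stoichiometry of the carbonation reaction Ca(OH)₂ + CO₂ → CaCO₃ + H₂O (one mole of CO₂ per mole of hydroxide). The sketch below is illustrative only: the hydroxide mass fraction and the degree of carbonation are assumed values, not figures reported in the study.

```python
# Stoichiometric estimate of CO₂ uptake by carbonation of Ca(OH)₂.
M_CA_OH_2 = 74.09  # molar mass of Ca(OH)2, g/mol
M_CO2 = 44.01      # molar mass of CO2, g/mol

def co2_uptake_kg_per_tonne(hydroxide_fraction: float,
                            degree_of_carbonation: float) -> float:
    """CO2 captured (kg) per tonne of paver, given the mass fraction of
    Ca(OH)2 in the material and the fraction of it that actually
    carbonates. Both inputs are illustrative assumptions, not values
    taken from the study."""
    return 1000.0 * hydroxide_fraction * degree_of_carbonation * M_CO2 / M_CA_OH_2

# E.g. a paver with 12% Ca(OH)2 of which ~74% carbonates lands in the
# ~52.5 kg/t range reported for the draining paver.
print(round(co2_uptake_kg_per_tonne(0.12, 0.74), 1))
```

Such a back-of-the-envelope bound is useful when judging whether a measured uptake is physically plausible for a given binder content.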
Procedia PDF Downloads 66
900 An Analytical Review of Tourism Management in India with Special Reference to Maharashtra State
Authors: Anilkumar L. Rathod
Abstract:
This paper examines event tourism as a field of study and area of professional practice, updating the previous review article published in 2015. In this substantially extended review, a deeper analysis of the field's evolution and development is presented, charting the growth of the literature both chronologically and thematically. A framework for understanding and creating knowledge about events and tourism is presented, forming the basis that signposts established research themes and concepts and outlines future directions for research. In addition, the review focuses on constraining and propelling forces, ontological advances, contributions from key journals, and emerging themes and issues. It also presents a roadmap for research activity in event tourism. Published scholarly studies within this period are examined through content analysis, using such keywords as knowledge management, organizational learning, hospitality, tourism, tourist destinations, travel industry, hotels, lodging, motels, hotel industry, gaming, casino hotel and convention to search scholarly research journals. All contributions found are then screened for a hospitality and tourism theme. Researchers mostly discuss the knowledge management approach to improving information technology, marketing and strategic planning in order to gain competitive advantage. Overall, knowledge management research is still limited. Planned events in tourism are created for a purpose, and what was once the realm of individual and community initiatives has largely become the realm of professionals and entrepreneurs. The paper provides a typology of the four main categories of planned events within an event-tourism context, including the main venues associated with each. It also assesses whether differences exist between socio-demographic groupings. 
An analysis using primarily descriptive statistics indicated that both sub-samples had similar viewpoints, although Maharashtra residents tended to have higher scores pertaining to the consequences of gambling. It is suggested that the differences arise from the greater exposure of Maharashtra residents to the influences of casino development. Keywords: organizational learning, hospitality, tourism, tourist destinations, travel industry, hotels, lodging, motels, hotel industry, gaming, casino hotel, convention
Procedia PDF Downloads 240
899 Implementation of Correlation-Based Data Analysis as a Preliminary Stage for the Prediction of Geometric Dimensions Using Machine Learning in the Forming of Car Seat Rails
Authors: Housein Deli, Loui Al-Shrouf, Hammoud Al Joumaa, Mohieddine Jelali
Abstract:
When forming metallic materials, fluctuations in material properties, process conditions, and wear lead to deviations in component geometry. Several hundred features sometimes need to be measured, especially for functional and safety-relevant components. Due to the large number of features and the accuracy requirements, these can only be measured offline. The statistical evaluation of process capability and control measurements minimizes, but does not eliminate, the risk of producing components outside the tolerances. The inspection intervals are based on the acceptable risk and come at the expense of productivity, yet remain reactive and, in some cases, considerably delayed. Due to the considerable progress made in condition monitoring and measurement technology, permanently installed sensor systems combined with machine learning and artificial intelligence, in particular, offer the potential to independently derive forecasts for component geometry and thus eliminate the risk of defective products, actively and preventively. The reliability of forecasts depends on the quality, completeness, and timeliness of the data. Measuring all geometric characteristics is neither sensible nor technically possible. This paper therefore uses the example of car seat rail production to discuss the necessary first step of feature selection and reduction by correlation analysis, as otherwise it would not be possible to forecast components in real time and inline. Four different car seat rails with an average of 130 features were selected and measured using a coordinate measuring machine (CMM). The run of such measuring programs alone takes up to 20 minutes. In practice, this results in the risk of faulty production of at least 2000 components that have to be sorted or scrapped if the measurement results are negative. 
Over a period of 2 months, all measurement data (>200 measurements per variant) was collected and evaluated using correlation analysis. As part of this study, the number of characteristics to be measured for all six car seat rail variants was reduced by over 80%. Specifically, direct correlations were proven for almost 100 of an average of 125 characteristics across the four products. A further 10 features correlate via indirect relationships, so the number of features required for a prediction could be reduced to fewer than 20. A correlation factor >0.8 was assumed for all correlations. Keywords: long-term SHM, condition monitoring, machine learning, correlation analysis, component prediction, wear prediction, regression analysis
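The paper does not include its analysis code; the following is a minimal sketch of the kind of greedy correlation-based reduction described above, assuming a plain measurement matrix (rows = measured components, columns = geometric features) and the 0.8 correlation threshold stated in the abstract.

```python
import numpy as np

def reduce_features(measurements: np.ndarray, threshold: float = 0.8):
    """Greedy correlation-based reduction: keep one representative per
    group of features whose absolute Pearson correlation exceeds the
    threshold (0.8, as assumed in the study). Returns indices of the
    features that must still be measured."""
    corr = np.abs(np.corrcoef(measurements, rowvar=False))
    n = corr.shape[1]
    kept, dropped = [], set()
    for i in range(n):
        if i in dropped:
            continue
        kept.append(i)
        # every later feature strongly correlated with i is predictable from it
        for j in range(i + 1, n):
            if corr[i, j] > threshold:
                dropped.add(j)
    return kept

# Synthetic demo: feature 1 is a near-copy of feature 0, feature 2 independent.
rng = np.random.default_rng(0)
f0 = rng.normal(size=200)
data = np.column_stack([f0,
                        2 * f0 + 0.01 * rng.normal(size=200),
                        rng.normal(size=200)])
print(reduce_features(data))  # features 0 and 2 survive; feature 1 is dropped
```

In practice the dropped features would each be paired with a regression on their kept counterpart, so their values can still be predicted inline.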
Procedia PDF Downloads 50
898 The Practice and Research of Computer-Aided Language Learning in China
Authors: Huang Yajing
Abstract:
Context: Computer-aided language learning (CALL) in China has undergone significant development over the past few decades, with distinct stages marking its evolution. This paper aims to provide a comprehensive review of the practice and research in this field in China, tracing its journey from the early stages of audio-visual education to the current multimedia network integration stage. Research Aim: The study aims to analyze the historical progression of CALL in China, identify key developments in the field, and provide recommendations for enhancing CALL practices in the future. Methodology: The research employs document analysis and literature review to synthesize existing knowledge on CALL in China, drawing on a range of sources to construct a detailed overview of the evolution of CALL practices and research in the country. Findings: The review highlights the significant advancements in CALL in China, showcasing the transition from traditional audio-visual educational approaches to the current integrated multimedia network stage. The study identifies key milestones, technological advancements, and theoretical influences that have shaped CALL practices in China. Theoretical Importance: The evolution of CALL in China reflects not only technological progress but also shifts in educational paradigms and theories. The study underscores the significance of cognitive psychology as a theoretical underpinning for CALL practices, emphasizing the learner's active role in the learning process. Data Collection and Analysis Procedures: Data collection involved extensive review and analysis of documents and literature related to CALL in China. The analysis was carried out systematically to identify trends, developments, and challenges in the field. 
Questions Addressed: The study addresses the historical development of CALL in China, the impact of technological advancements on teaching practices, the role of cognitive psychology in shaping CALL methodologies, and the future outlook for CALL in the country. Conclusion: The review provides a comprehensive overview of the evolution of CALL in China, highlighting key stages of development and emerging trends. The study concludes by offering recommendations to further enhance CALL practices in the Chinese context. Keywords: English education, educational technology, computer-aided language teaching, applied linguistics
Procedia PDF Downloads 57
897 Co-Creational Model for Blended Learning in a Flipped Classroom Environment Focusing on the Combination of Coding and Drone-Building
Authors: A. Schuchter, M. Promegger
Abstract:
The outbreak of the COVID-19 pandemic has shown us that online education is much more than just a cool feature for teachers: it is an essential part of modern teaching. In online math teaching, it is common to use tools to share screens and to compute and calculate mathematical examples while the students watch the process. At the same time, flipped classroom models are on the rise, with their focus on how students can gather knowledge by watching videos and on the teacher’s use of technological tools for information transfer. This paper proposes a co-educational teaching approach for coding and engineering subjects with the help of drone-building to spark interest in technology and create a platform for knowledge transfer. The project combines aspects of mathematics (matrices, vectors, shaders, trigonometry), physics (force, pressure and rotation) and coding (computational thinking, block-based programming, JavaScript and Python) and makes use of collaborative shared 3D modeling with clara.io, through which students build mathematical know-how. The instructor follows a problem-based learning approach and encourages students to find solutions in their own time and in their own way, which helps them develop new skills intuitively and boosts logically structured thinking. The collaborative aspect of working in groups helps the students develop communication skills as well as structural and computational thinking. Students are not just listeners, as in traditional classroom settings, but play an active part in creating content together by compiling a Handbook of Knowledge (called an “open book”) with examples and solutions. Before students start calculating, they have to write down all their ideas and working steps in full sentences so other students can easily follow their train of thought. 
Therefore, students learn to formulate goals, solve problems, and create a ready-to-use product with the help of “reverse engineering”, cross-referencing and creative thinking. The work on drones gives the students the opportunity to create a real-life application with a practical purpose while going through all stages of product development. Keywords: flipped classroom, co-creational education, coding, making, drones, co-education, ARCS-model, problem-based learning
Procedia PDF Downloads 123
896 Effectiveness Assessment of a Brazilian Larvicide on Aedes Control
Authors: Josiane N. Muller, Allan K. R. Galardo, Tatiane A. Barbosa, Evan P. Ferro, Wellington M. Dos Santos, Ana Paula S. A. Correa, Edinaldo C. Rego, Jose B. P. Lima
Abstract:
The susceptibility of an insect population to any larvicide depends on several factors, including genetic constitution and environmental conditions. The mosquito Aedes aegypti is the primary vector of three important viral diseases: Zika, dengue, and chikungunya. The frequent outbreaks of these diseases in different parts of Brazil demonstrate the importance of testing the susceptibility of vectors in different environments. Since control of this mosquito leads to control of the diseases, vector-control alternatives suited to Brazil's varied environmental conditions are needed for effective action. The aim of this study was to evaluate a new commercial formulation of Bacillus thuringiensis israelensis (DengueTech, an innovative Brazilian technology) in the Brazilian Legal Amazon, taking climate conditions into account. Semi-field tests were conducted at the Institute of Scientific and Technological Research of the State of Amapa in two different environments, one in a shaded area and the other exposed to sunlight. The mosquito larvae were exposed to the larvicide concentration and a control; each group was tested in three containers of 40 liters each. To assess persistence, 50 third-instar larvae of the Aedes aegypti laboratory lineage (Rockefeller) and 50 larvae of Aedes aegypti collected in the municipality of Macapa, in Brazil's Amapa state, were added weekly, and after 24 hours mortality was assessed. In total 16 tests were performed, of which 12 were done with replacement of water (1/5 of the volume, three times per week). The effectiveness of the product was determined by mortality of ≥ 80%, as recommended by the World Health Organization. The results demonstrated that the high water temperatures (26-35 °C) in the containers influenced the residual time of the product: the maximum effect achieved was 21 days, in the shaded area, and the 60-day effectiveness expected according to the larvicide company was not reached in any of the tests. 
The tests with and without water replacement did not present significant differences in mortality rate. Given the different environments and climates, these results underline the need to test larvicides and their effectiveness in specific environmental settings in order to identify the parameters required for better results. This shows the importance of semi-field research that considers local climate conditions for the successful control of Aedes aegypti. Keywords: Aedes aegypti, bioassay, larvicide, vector control
Procedia PDF Downloads 130
895 Hansen Solubility Parameter from Surface Measurements
Authors: Neveen AlQasas, Daniel Johnson
Abstract:
Membranes for water treatment are an established technology that attracts great attention due to its simplicity and cost-effectiveness. However, membranes in operation suffer from the adverse effect of membrane fouling. Biofouling is a phenomenon that occurs at the water-membrane interface and is a dynamic process initiated by the adsorption of dissolved organic material, including biomacromolecules, on the membrane surface. After initiation, attachment of microorganisms occurs, followed by biofilm growth. The biofilm blocks the pores of the membrane and consequently reduces the water flux. Moreover, the presence of a fouling layer can have a substantial impact on the membrane's separation properties. Understanding the mechanism of the initiation phase of biofouling is key to eliminating biofouling on membrane surfaces. The adhesion and attachment of different fouling materials is affected by the surface properties of the membrane materials. Therefore, the surface properties of different polymeric materials were studied in terms of their surface energies and Hansen solubility parameters (HSP). The difference between the combined HSP parameters (the HSP distance) allows prediction of the affinity of two materials for each other. The possibility of measuring the HSP of different polymer films via surface measurements, such as contact angle, was thoroughly investigated. Knowing the HSP of a membrane material and the HSP of a specific foulant facilitates estimation of the HSP distance between the two, and therefore of the strength of attachment to the surface. Contact angle measurements using fourteen different solvents on five different polymeric films were carried out using the sessile drop method. Solvents were ranked as good or bad using different ranking methods, and the rankings were used to calculate the HSP of each polymeric film. 
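For reference, the HSP distance mentioned above follows the standard Hansen formula Ra² = 4(δd₁ − δd₂)² + (δp₁ − δp₂)² + (δh₁ − δh₂)². A small sketch (the polymer parameter values below are illustrative placeholders, not measurements from the study):

```python
def hsp_distance(a, b):
    """Hansen solubility parameter distance Ra between two materials,
    each given as (dispersion, polar, hydrogen-bonding) in MPa^0.5.
    The conventional factor 4 weights the dispersion term."""
    dd, dp, dh = (a[i] - b[i] for i in range(3))
    return (4 * dd**2 + dp**2 + dh**2) ** 0.5

# Illustrative example: water against a hypothetical polymer film.
water = (15.5, 16.0, 42.3)   # commonly tabulated HSP of water
polymer = (17.2, 12.5, 9.2)  # placeholder values for a membrane polymer
print(round(hsp_distance(water, polymer), 1))
```

A small Ra indicates high mutual affinity, which is why the Ra between a membrane polymer and a foulant is used as a proxy for attachment strength.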
The results clearly indicate the absence of a direct relation between the contact angle values of each film and the HSP distance between each polymer film and the solvents used. Therefore, estimating HSP via contact angle alone is not sufficient. However, it was found that if the surface tensions and viscosities of the solvents are taken into account in the analysis of the contact angle values, a prediction of the HSP from contact angle measurements is possible. This was carried out by training a neural network model with three inputs: contact angle value, surface tension and viscosity of the solvent used. The model is able to predict the HSP distance between the solvent and the tested polymer. The HSP distance prediction is further used to estimate the total and individual HSP parameters of each tested material. The results showed an accuracy of about 90% for all five studied films. Keywords: surface characterization, Hansen solubility parameter estimation, contact angle measurements, artificial neural network model, surface measurements
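The abstract gives no architectural details of the trained model; the sketch below is a minimal stand-in, assuming a single hidden layer and synthetic stand-in data, for a regression network with the three stated inputs (contact angle, surface tension, viscosity) predicting an HSP distance.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-in data: (contact angle [deg], surface tension [mN/m],
# viscosity [mPa.s]) triples with a made-up smooth target standing in for
# the measured HSP distance. Real inputs would come from the sessile-drop
# measurements described in the abstract.
X = rng.uniform([10.0, 20.0, 0.3], [120.0, 75.0, 20.0], size=(200, 3))
y = 0.1 * X[:, 0] + 0.2 * X[:, 1] - 0.5 * X[:, 2]

# Standardize inputs and target so plain gradient descent trains stably.
Xn = (X - X.mean(axis=0)) / X.std(axis=0)
yn = (y - y.mean()) / y.std()

# One hidden tanh layer, 16 units, trained with batch gradient descent.
W1 = rng.normal(scale=0.5, size=(3, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.5, size=(16, 1)); b2 = np.zeros(1)
lr, n = 0.05, len(yn)
for _ in range(3000):
    h = np.tanh(Xn @ W1 + b1)           # hidden activations
    pred = (h @ W2 + b2).ravel()        # predicted (standardized) HSP distance
    err = pred - yn                     # gradient of 0.5*MSE w.r.t. pred
    dh = (err[:, None] @ W2.T) * (1.0 - h**2)
    W2 -= lr * (h.T @ err[:, None]) / n
    b2 -= lr * err.mean(keepdims=True)
    W1 -= lr * (Xn.T @ dh) / n
    b1 -= lr * dh.mean(axis=0)

mse = float(np.mean((pred - yn) ** 2))
print(f"final training MSE (standardized target): {mse:.4f}")
```

With per-film HSP totals then recovered from predicted distances, as the abstract describes, the network itself stays deliberately small, since only three inputs are available per solvent.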
Procedia PDF Downloads 95
894 Systematic Mapping Study of Digitization and Analysis of Manufacturing Data
Authors: R. Clancy, M. Ahern, D. O’Sullivan, K. Bruton
Abstract:
The manufacturing industry is currently undergoing a digital transformation as part of the mega-trend Industry 4.0. As part of this phase of the industrial revolution, traditional manufacturing processes are being combined with digital technologies to achieve smarter and more efficient production. To successfully digitally transform a manufacturing facility, its processes must first be digitized: the conversion of information from an analogue format to a digital format. The objective of this study was to explore the research area of digitizing manufacturing data as part of the worldwide paradigm, Industry 4.0. The formal methodology of a systematic mapping study was used to capture a representative sample of the research area and assess its current state. Specific research questions were defined to assess the key benefits and limitations associated with the digitization of manufacturing data. Research papers were classified according to the type of research and type of contribution to the research area. Upon analyzing the 54 papers identified in this area, it was noted that 23 originated in Germany. This is an unsurprising finding, as Industry 4.0 is originally a German strategy, supported by strong policy instruments for its implementation in Germany. It was also found that the Fraunhofer Institute for Mechatronic Systems Design, in collaboration with the University of Paderborn in Germany, was the most frequent contributing institution, with three papers published. The literature suggested future research directions and highlighted one specific gap in the area: an unresolved divide between data science experts and manufacturing process experts in industry. Data analytics expertise is not useful unless the manufacturing process information is utilized. 
A legitimate understanding of the data is crucial to performing accurate analytics and gaining true, valuable insights into the manufacturing process. A gap lies between the manufacturing operations and the information technology/data analytics departments within enterprises, which was borne out by the results of many of the case studies reviewed as part of this work. To test whether this gap exists, the researcher initiated an industrial case study in which they embedded themselves between the subject matter expert of the manufacturing process and the data scientist. Of the papers resulting from the systematic mapping study, 12 contributed a framework, another 12 were based on a case study, and 11 focused on theory. However, only three papers contributed a methodology. This provides further evidence of the need for an industry-focused methodology for digitizing and analyzing manufacturing data, which will be developed in future research. Keywords: analytics, digitization, industry 4.0, manufacturing
Procedia PDF Downloads 113
893 The Risk of Prioritizing Management over Education at Japanese Universities
Authors: Masanori Kimura
Abstract:
Due to the decline of the 18-year-old population, Japanese universities tend to convert the form of employment for newly hired teachers from tenured positions to fixed-term positions. The advantage of this is that universities can be more flexible in their employment plans in case they fail to fill their enrollment quotas of prospective students or need to supplement teachers in other academic fields or research areas where new demand is expected. The most serious disadvantage, however, is that if secure positions cannot be provided to faculty members, the coherence of education and the continuity of research supported by the university may not be achieved. The question of this presentation is therefore as follows: are universities giving first priority to management, or are they prioritizing education and research over management? To answer this question, the author examined the number of job offerings for college foreign language teachers posted on the JREC-IN (Japan Research Career Information Network, run by the Japan Science and Technology Agency) website from April 2012 to October 2015. The results show that there were 1,002 job offerings for tenured positions and 1,056 for fixed-term contracts, suggesting that, overall, today’s Japanese universities tend to give first priority to management. More detailed examination of the data, however, shows that this tendency varies slightly depending on the type of university. 
National universities, which are supported by the central government, and state universities, which are supported by local governments, posted more job offerings for tenured positions than for fixed-term contracts: national universities posted 285 offerings for tenured positions and 257 for fixed-term contracts, and state universities posted 106 and 86 respectively. Yet the difference in number between the two types of employment status at national and state universities is marginal. Private universities, by contrast, posted 713 job offerings for fixed-term contracts and 616 for tenured positions. Moreover, 73% of the fixed-term contracts were offered for lower-rank positions, including associate professors, lecturers, and so forth. Generally speaking, these positions are offered to younger teachers, so this result indicates that private universities attempt to cut their budgets while expecting the same educational effect by hiring younger teachers. Although the results show some differences in personnel strategies among the three types of universities, the author argues that all three may lose the important human resources that would take a pivotal role at their universities in the future unless they urgently review their employment strategies. Keywords: higher education, management, employment status, foreign language education
Procedia PDF Downloads 135
892 Regular Laboratory Based Neonatal Simulation Program Increases Senior Clinicians’ Knowledge, Skills and Confidence Caring for Sick Neonates
Authors: Madeline Tagg, Choihoong Mui, Elizabeth Lek, Jide Menakaya
Abstract:
Introduction: Simulation technology is used by neonatal teams to learn and refresh skills and to gain the knowledge and confidence to care for sick neonates. In-situ simulation is considered superior to laboratory-based programmes as it closely mirrors real-life situations. This study reports our experience of running regular laboratory-based simulation sessions for senior clinicians and nurses and their impact on knowledge, skills and confidence. Methods: A before-and-after questionnaire survey was carried out on senior clinicians and nurses who attended a scheduled laboratory-based simulation session. Participants were asked to document their expectations before a 3-hour monthly laboratory programme started and were invited to feed back their reflections at the end of the session. The session included discussion of relevant clinical guidelines, immersion in a scenario and a video-led debrief. The results of the survey were analysed in three skills-based categories: improved, no change or a worsened experience. Results: 45 questionnaires were completed and analysed. Of these, 25 (55%) were completed by consultants, seven by nurses and six by trainee doctors, and seven respondents were unknown. 40 (88%) rated the session overall and the guideline review as good/excellent, 39 respondents (86%) rated the scenario session good/excellent and 40/45 fed back a good/excellent debrief session. 33 (73%) respondents completed the before-and-after questionnaire. 21/33 (63%) reflected improved knowledge, skill or confidence in caring for sick new-born babies, eight respondents reported no change and four fed back a worse experience after the session. Discussion: Most respondents found the laboratory-based structured simulation session beneficial for their professional development. They valued equally the whole content of the programme, such as the guideline review and equipment training, as well as the simulation and debrief sessions.
Two out of three participants stated that their knowledge of caring for sick new-born babies had been transformed positively by the session. Sessions where simulation equipment failed or relevant staff were absent contributed to a poor educational experience. Summary: A regular, structured laboratory-based simulation programme with rich content is a credible educational resource for improving the knowledge, skills and confidence of senior clinicians caring for sick new-born babies.
Keywords: knowledge, laboratory based, neonates, simulation
Procedia PDF Downloads 123
891 Delving into the Concept of Social Capital in the Smart City Research
Authors: Atefe Malekkhani, Lee Beattie, Mohsen Mohammadzadeh
Abstract:
The unprecedented growth of megacities and urban areas around the world has resulted in numerous risks, concerns, and problems across various aspects of urban life, including the environmental, social, and economic domains, such as climate change and spatial and social inequalities. In this situation, the ever-increasing progress of technology has created a hope for urban authorities that the negative effects of various socio-economic and environmental crises can potentially be mitigated with the use of information and communication technologies (ICTs). The concept of the 'smart city' represents an emerging solution to urban challenges arising from increased urbanization using ICTs. However, smart cities are often perceived primarily as technological initiatives and are implemented without considering the social and cultural contexts of cities and the needs of their residents. The implementation of smart city projects and initiatives has the potential to (un)intentionally exacerbate pre-existing social, spatial, and cultural segregation. The impact of the smart city on the social capital of the people who use smart city systems, and on governance as policymakers, is therefore worth exploring. The importance of inhabitants to the existence and development of smart cities cannot be overlooked, and this concept has gained different perspectives in smart city studies. Reviewing the literature on social capital and the smart city shows that social capital plays three different roles in smart city development. Some research indicates that social capital is a component of a smart city, embedded in its dimensions, definitions, or strategies, while other studies see it as a social outcome of smart city development and point out that the move to smart cities improves social capital; however, in most cases, this remains an unproven hypothesis.
Other studies show that social capital can enhance the functions of smart cities and that the consideration of social capital in planning smart cities should be promoted. Despite the existing theoretical and practical knowledge, there is a significant research gap in reviewing the knowledge domain of smart city studies through the lens of social capital. To shed light on this issue, this study aims to explore the domain of existing research in the field of the smart city through the lens of social capital. This research will use the 'Preferred Reporting Items for Systematic Reviews and Meta-Analyses' (PRISMA) method to review relevant literature, focusing on the key concepts of 'Smart City' and 'Social Capital'. The studies will be selected from the Web of Science Core Collection, using a selection process that involves identifying literature sources and screening and filtering studies based on titles, abstracts, and full-text reading.
Keywords: smart city, urban digitalisation, ICT, social capital
Procedia PDF Downloads 16
890 Repurposing Dairy Manure Solids as a Non-Polluting Fertilizer and the Effects on Nutrient Recovery in Tomatoes (Solanum lycopersicum)
Authors: Devon Simpson
Abstract:
Recycled Manure Solids (RMS), obtained via centrifugation from Canadian dairy farms, were synthesized into a non-polluting fertilizer by bonding micronutrients (Fe, Zn, and Mn) to cellulose fibers and then assessed for the effectiveness of nutrient recovery in tomatoes. Manure management technology is critical for improving the sustainability of agroecosystems and has the capacity to offer a truly circular economy. The ability to add value to manure byproducts offers an opportunity for economic benefits while generating tenable solutions to livestock waste. The dairy industry is under increasing pressure from new environmental protections, such as government restrictions on manure applications and limitations on herd size, as well as increased product demand from a growing population. Current systems use RMS as bedding, so there is a lack of data pertaining to the use of RMS as a fertilizer. This is because of nutrient distribution, where most nutrients are retained in the liquid effluent of the solid-liquid separation. A literature review of the physical and chemical properties of dairy manure further revealed more data for raw manure than for centrifuged solids. This research offers an innovative perspective and a new avenue of exploration in the use of RMS. Manure solids in this study were obtained directly from dairy farms in Salmon Arm and Abbotsford, British Columbia, and underwent physical, chemical, and biological characterization pre- and post-synthesis processing. Samples were sent to A&L Labs Canada for analysis. Once characterized and bonded to micronutrients, the effect of the synthesized RMS on nutrient recovery in tomatoes was studied in a greenhouse environment. The agricultural research package 'agricolae' for R was used for experimental design and data analysis. The growth trials consisted of a randomized complete block design (RCBD) that allowed for analysis of variance (ANOVA).
The primary outcome was to measure nutrient uptake, and this was done using an inductively coupled plasma mass spectrometer (ICP-MS) to analyze the micronutrient content of both the tissue and the fruit of the tomatoes. It was found that treatments containing bonded dairy manure solids had increased micronutrient concentrations. Treatments with bonded dairy manure solids also saw an increase in yield, and a Brix analysis showed higher sugar content than the untreated control and a grower standard.
Keywords: agroecosystems, dairy manure, micronutrient fertilizer, manure management, nutrient recovery, nutrient recycling, recycled manure solids, regenerative agriculture, sustainable farming
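The randomized complete block design used for the growth trials can be sketched as follows. This is a minimal illustration in Python rather than the authors' R package 'agricolae', and the treatment names are hypothetical stand-ins for the study's fertilizer arms, not taken from the abstract:

```python
import random

def rcbd_layout(treatments, n_blocks, seed=42):
    """Randomized complete block design: every treatment appears
    exactly once per block, in an independently shuffled order."""
    rng = random.Random(seed)
    layout = {}
    for block in range(1, n_blocks + 1):
        order = list(treatments)
        rng.shuffle(order)  # randomize treatment order within the block
        layout[block] = order
    return layout

# Hypothetical treatment arms; block count is illustrative.
treatments = ["control", "grower_standard", "RMS+Fe", "RMS+Zn", "RMS+Mn"]
layout = rcbd_layout(treatments, n_blocks=4)
for block, order in layout.items():
    print(block, order)
```

Because each block contains every treatment exactly once, block-to-block variation (e.g. a greenhouse bench gradient) can be separated from treatment effects in the subsequent ANOVA.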
Procedia PDF Downloads 194
889 Two-Stage Estimation of Tropical Cyclone Intensity Based on Fusion of Coarse and Fine-Grained Features from Satellite Microwave Data
Authors: Huinan Zhang, Wenjie Jiang
Abstract:
Accurate estimation of tropical cyclone intensity is of great importance for disaster prevention and mitigation. Existing techniques are largely based on satellite imagery data, and the research and utilization of the inner thermal core structure characteristics of tropical cyclones still pose challenges. This paper presents a two-stage tropical cyclone intensity estimation network based on the fusion of coarse- and fine-grained features from microwave brightness temperature data. The data used in this network are obtained from the thermal core structure of tropical cyclones through Advanced Technology Microwave Sounder (ATMS) inversion. Firstly, the thermal core information along the pressure direction is comprehensively expressed through the maximum intensity projection (MIP) method, constructing coarse-grained thermal core images that represent the tropical cyclone. These images provide a coarse-grained wind speed estimate in the first stage. Then, based on this result, fine-grained features are extracted by combining thermal core information from multiple view profiles with a distributed network and fused with the coarse-grained features from the first stage to obtain the final two-stage wind speed estimate. Furthermore, to better capture the long-tail distribution characteristics of tropical cyclones, focal loss is used in the coarse-grained loss function of the first stage, and an ordinal regression loss is adopted in the second stage to replace traditional single-value regression. The selected tropical cyclones span from 2012 to 2021 and are distributed in the North Atlantic (NA) region. The training set covers 2012 to 2017, the validation set 2018 to 2019, and the test set 2020 to 2021.
Based on the Saffir-Simpson Hurricane Wind Scale (SSHS), this paper categorizes tropical cyclones into three major categories: pre-hurricane, minor hurricane, and major hurricane, achieving a classification accuracy of 86.18% and an intensity estimation error of 4.01 m/s for the NA region. The results indicate that thermal core data can effectively represent the level and intensity of tropical cyclones, warranting further exploration of tropical cyclone attributes based on this data.
Keywords: artificial intelligence, deep learning, data mining, remote sensing
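The maximum intensity projection step that builds the coarse-grained thermal core images can be sketched as below. This is a generic MIP over the pressure axis, not the authors' implementation, and the brightness-temperature values are toy numbers:

```python
def max_intensity_projection(volume):
    """Collapse a [pressure][row][col] brightness-temperature grid into a
    2-D image by keeping, for each (row, col) pixel, the maximum value
    found along the pressure axis."""
    n_rows = len(volume[0])
    n_cols = len(volume[0][0])
    return [
        [max(level[r][c] for level in volume) for c in range(n_cols)]
        for r in range(n_rows)
    ]

# Toy grid: two pressure levels, each a 2x2 image (values illustrative).
volume = [
    [[210.0, 215.0],
     [220.0, 205.0]],
    [[212.0, 214.0],
     [218.0, 230.0]],
]
print(max_intensity_projection(volume))  # [[212.0, 215.0], [220.0, 230.0]]
```

The projection condenses the vertical thermal structure into a single image per cyclone, which is what makes a 2-D coarse-grained wind speed estimator feasible in the first stage.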
Procedia PDF Downloads 63
888 Improving Student Retention: Enhancing the First Year Experience through Group Work, Research and Presentation Workshops
Authors: Eric Bates
Abstract:
Higher education is recognised as being of critical importance in Ireland and has been identified as a vital factor in national well-being. Statistics show that Ireland has one of the highest rates of higher education participation in Europe. However, student retention and progression, especially in Institutes of Technology, is becoming an issue as rates of non-completion rise. Both within Ireland and across Europe, student retention is seen as a key performance indicator for higher education, and with these increasing rates the Irish higher education system needs to be flexible and adapt to the situation it now faces. The author is a Programme Chair on a Level 6 full-time undergraduate programme, and experience to date has shown that first year undergraduate students take some time to identify themselves as a group within the setting of a higher education institute. Despite being part of a distinct class on a specific programme, some individuals can feel isolated as they take the first step into higher education. Such feelings can contribute to students eventually dropping out. This paper reports on an ongoing initiative that aims to accelerate the bonding experience of a distinct group of first year undergraduates on a programme which has a high rate of non-completion. This research sought to engage the students in dynamic interactions with their peers to quickly evolve a group sense of coherence. Two separate modules delivered by the researcher, a Research module and a Communications module, were linked across two semesters. Students were allocated into random groups, and each group was given a topic to be researched. There were six topics, essentially the six sub-headings of the DIT Graduate Attribute Statement. The research took place in a computer lab, and students also used the library. The output from this was a document that formed part of the submission for the Research module.
In the second semester, the groups then had to make a presentation of their findings in which each student spoke for a minimum amount of time. Presentation workshops formed part of that module, and students were given the opportunity to practise their presentation skills. These presentations were video recorded to enable feedback to be given. Although this was a small-scale study, preliminary results found a strong sense of coherence among this particular cohort, and feedback from the students was very positive. Other findings indicate that spreading the initiative across two semesters may have been an inhibitor. Future challenges include spreading such initiatives college-wide and indeed sector-wide.
Keywords: first year experience, student retention, group work, presentation workshops
Procedia PDF Downloads 232
887 Photocatalytic Disintegration of Naphthalene and Naphthalene-Like Compounds in Indoor Air
Authors: Tobias Schnabel
Abstract:
Naphthalene and naphthalene-like compounds are a common problem in the indoor air of German buildings from the 1960s and 1970s. Tar-containing roofing felt was often laid under the concrete floor to prevent humidity from coming through the floor. This tar-containing roofing felt has high concentrations of PAHs (polycyclic aromatic hydrocarbons) and naphthalene. Naphthalene evaporates easily and contaminates the indoor air. Especially after renovation and energetic modernization of the buildings, the naphthalene concentration rises because no forced air exchange can take place. Because of this problem, it is often necessary to replace the floors after renovation of the buildings. The MFPA Weimar (materials research and testing facility) developed, in a cooperative project with LEJ GmbH and Reichmann Gebäudetechnik GmbH, a technical solution for the disintegration of naphthalene and naphthalene-like compounds in indoor air by photocatalytic reforming. Photocatalytic systems produce active oxygen species (hydroxyl radicals) by irradiating semiconductors with light at the wavelength of their band gap. The light energy separates the charges in the semiconductor, producing free electrons in the conduction band and holes (defect electrons). The holes can react with hydroxide ions to form hydroxyl radicals. The hydroxyl radicals produced are a strong oxidizing agent and can oxidize organic matter to carbon dioxide and water. During the research, new catalytic titanium dioxide surface coatings were developed. This coating technology allows the production of very porous titanium dioxide layers on temperature-stable carrier materials. The porosity allows the naphthalene to be easily adsorbed by the surface coating, which accelerates the heterogeneous photocatalytic reaction. The photocatalytic reaction is induced by high-power, high-efficiency UV-A (ultraviolet A) LEDs with a wavelength of 365 nm.
Various tests in emission chambers and on the reformer itself show that a reduction of naphthalene at relevant concentrations between 2 and 250 µg/m³ is possible. The disintegration rate was at least 80%. To reduce the concentration of naphthalene from 30 µg/m³ to a level below 5 µg/m³ in a typical 50 m² classroom, an energy of 6 kWh is needed. The benefit of photocatalytic indoor air treatment is that every organic compound in the air can be disintegrated and reduced. The use of new photocatalytic materials in combination with highly efficient UV LEDs makes a safe and energy-efficient reduction of organic compounds in indoor air possible. At the moment, the air cleaning systems are taking the step from the prototype stage into use in real buildings.
Keywords: naphthalene, titanium dioxide, indoor air, photocatalysis
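The classroom figures above can be put in perspective with a back-of-envelope calculation. The ceiling height is an assumption not stated in the abstract, and the result covers only a single room volume of air; since the tar layer under the floor re-emits naphthalene continuously, the mass treated over the life of the system is larger:

```python
# Back-of-envelope check of the classroom figures in the abstract.
# Assumption (not in the abstract): a 2.5 m ceiling height.
area_m2 = 50.0
height_m = 2.5            # assumed
volume_m3 = area_m2 * height_m

c_start = 30.0            # µg/m³, starting concentration
c_target = 5.0            # µg/m³, target concentration
removed_ug = (c_start - c_target) * volume_m3   # mass in one room volume
energy_kwh = 6.0          # energy stated in the abstract

print(f"room volume: {volume_m3} m³")
print(f"naphthalene removed per air volume: {removed_ug / 1000:.3f} mg")
print(f"energy per mg removed: {energy_kwh / (removed_ug / 1000):.2f} kWh/mg")
```

Under these assumptions, roughly 3.1 mg of naphthalene is destroyed per room volume of air; the 6 kWh figure thus reflects sustained operation against re-emission rather than a single pass.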
Procedia PDF Downloads 144
886 Technical and Economic Potential of Partial Electrification of Railway Lines
Authors: Rafael Martins Manzano Silva, Jean-Francois Tremong
Abstract:
Electrification of railway lines makes it possible to increase the speed, power, capacity and energy efficiency of rolling stock. However, this electrification process is complex and costly. An electrification project is not just about the design of the catenary. It also includes the installation of structures around the electrification, such as substations, electrical isolation, signalling, telecommunication and civil engineering structures. France has more than 30,000 km of railways, of which only 53% are electrified. The other 47% are operated with diesel locomotives and represent only 10% of the traffic (tonne-km). For this reason, a new type of electrification, less expensive than the usual one, is needed to enable the modernization of these railways. One solution could be the use of hybrid trains. This technology opens up new opportunities for less expensive infrastructure development, such as the partial electrification of railway lines. On a partially electrified railway, the power supply of these hybrid trains could be provided either by the catenary or by an on-board energy storage system (ESS). Thus, the on-board ESS would meet the energy needs of the train along the non-electrified zones, while in electrified zones the catenary would feed the train and recharge the on-board ESS. This paper deals with identifying the technical and economic potential of the partial electrification of railway lines. The study provides different scenarios of electrification in which the most expensive places to electrify are instead covered by the on-board ESS. The target is to reduce the cost of new electrification projects, i.e. to reduce the cost of electrification infrastructure without increasing the cost of rolling stock. In this study, scenarios are constructed as a function of the electrification cost of each structure.
The cost of electrification varies considerably because the installation of catenary supports in tunnels, bridges and viaducts is much more expensive than in other zones of the railway. These scenarios are used to describe the power supply system and to choose between the catenary and the on-board energy storage depending on the position of the train on the railway. To identify the influence of each partial electrification scenario on the sizing of the on-board ESS, a model of the railway line and of the rolling stock is developed for a real case: a railway line located in the south of France. The energy consumption and the power demanded at each point of the line for each power supply (catenary or on-board ESS) are provided at the end of the simulation. Finally, the cost of a partial electrification is obtained by adding the civil engineering costs of the zones to be electrified to the cost of the on-board ESS. The study of the technical and economic potential ends with the identification of the most economically interesting electrification scenario.
Keywords: electrification, hybrid, railway, storage
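The scenario comparison described above can be sketched as a simple cost model: leave the most expensive structures (tunnels, viaducts) unelectrified and size the on-board ESS to bridge the longest resulting gap. All segment lengths, cost rates and energy figures below are illustrative assumptions, not values from the paper:

```python
SEGMENTS = [
    # (name, length_km, catenary_cost_per_km_eur, electrify?)
    ("plain_1", 20.0, 1.0e6, True),
    ("tunnel",   3.0, 4.0e6, False),   # too expensive: ESS bridges it
    ("plain_2", 15.0, 1.0e6, True),
    ("viaduct",  2.0, 3.5e6, False),   # too expensive: ESS bridges it
]

ENERGY_PER_KM_KWH = 15.0        # assumed traction demand off-catenary
ESS_COST_PER_KWH_EUR = 500.0    # assumed storage cost

def scenario_cost(segments):
    """Catenary cost of the electrified zones plus an ESS sized to cover
    the longest single non-electrified stretch."""
    catenary = sum(l * c for _, l, c, on in segments if on)
    longest_gap = max((l for _, l, _, on in segments if not on), default=0.0)
    ess = longest_gap * ENERGY_PER_KM_KWH * ESS_COST_PER_KWH_EUR
    return catenary + ess

full = sum(l * c for _, l, c, _ in SEGMENTS)   # electrify everything
partial = scenario_cost(SEGMENTS)
print(f"full electrification: {full / 1e6:.1f} M EUR")
print(f"partial + ESS:        {partial / 1e6:.2f} M EUR")
```

Even in this toy setting, skipping the two costly structures removes most of their civil engineering cost while the ESS adds comparatively little, which is the economic intuition behind the paper's scenarios; the real study sizes the ESS from a simulated energy and power profile along the line rather than a per-kilometre constant.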
Procedia PDF Downloads 432