Search results for: evaluation capacity building
2049 The Use of Technology in Theatrical Performances as a Tool of Audience Engagement
Authors: Chrysoula Bousiouta
Abstract:
Throughout the history of theatre, technology has played an important role, both in influencing the relationship between performance and audience and in offering different kinds of experiences. The use of technology dates back to ancient times, with the introduction of artifacts such as the "deus ex machina" of ancient Greek theatre. Taking into account the key techniques and experiences used throughout history, this paper investigates how technology, through new media, influences contemporary theatre. In the context of this research, technology is defined as projections, audio environments, video projections, sensors and tele-connections, all running alongside the performance and inviting the audience's participation. The theoretical framework of the research covers, besides the history of theatre, the theory of the "experience economy" that superseded the service and goods economy. The research is based on a qualitative, comparative analysis of two case studies, Contact Theatre in Manchester (United Kingdom) and Bios in Athens (Greece). Data collection includes desk research, complemented with semi-structured interviews. Building on the results of the research, one could claim that the intended experience of modern/contemporary theatre is that of engagement, and that technology, as defined above, plays a leading role in creating it. This experience passes through and exists in the middle of the realms of entertainment, education, estheticism and escapism. Furthermore, it is observed that nowadays theatre is not only about acting but also about performing: performances are unfinished without the participation of the audience. Both case studies try to achieve the experience of engagement through practices that promote the attraction of attention, the increase of imagination, interaction, intimacy and true activity.
These practices are achieved through the script, the scenery, the language and the environment of a performance. Contact and Bios consider technology an intimate tool for accomplishing the above, and they make extended use of it. The research compiles a notable record of the technological techniques that modern theatres use. The use of technology, inside or outside the limits of film techniques, helps to rivet the attention of the audience, to make performances enjoyable, to give the sense of the "unfinished", or to stage things that take place around the spectators and push them to take action, becoming spect-actors. The advantage of technology is that it can be used as a hook for interaction at all stages of a performance. Further research in the field could involve exploring alternative ways of binding technology and theatre, or analyzing how a performance is perceived through the use of technological artifacts.
Keywords: experience of engagement, interactive theatre, modern theatre, performance, technology
Procedia PDF Downloads 250
2048 Building Resilient Communities: The Traumatic Effect of Wildfire on Mati, Greece
Authors: K. Vallianou, T. Alexopoulos, V. Plaka, M. K. Seleventi, V. Skanavis, C. Skanavis
Abstract:
The present research addresses the role of place attachment and emotions in community resiliency and recovery within the context of a disaster. Natural disasters represent a disruption in the normal functioning of a community, leading to a general feeling of disorientation. This study draws on the trauma caused by a natural hazard, in this case a forest fire; it assesses the changes in the sense of togetherness and, finally, determines how the place attachment of the inhabitants was affected during the reorientation process of the community. The case study area is Mati, a small coastal town in eastern Attica, Greece, where the fire broke out on July 23rd, 2018. A quantitative survey was conducted through questionnaires via phone interviews, one year after the disaster, to address community resiliency in the long run. The sample was composed of 159 participants from the community of Mati plus 120 from Skyros Island, which was used as a control group. Inhabitants were prompted to answer items gauging their emotions related to the event, group identification and emotional significance of their community, and place attachment before and a year after the fire took place. Importantly, community recovery and reorientation were examined within the context of a relative absence of government backing and official support. Emotions related to the event were aggregated into four clusters: activation/vigilance, distress/disorientation, indignation, and helplessness. The findings revealed a decrease in the level of place attachment in the impacted area of Mati as compared to the control group of Skyros Island. Importantly, the initial distress caused by the fire prompted the residents to identify more with their community and to report more positive feelings toward it.
Moreover, a mediation analysis indicated that the positive effect of community cohesion on place attachment one year after the disaster was mediated by the positive feelings toward the community. Finally, place attachment contributes to enhanced optimism and a more positive perspective concerning Mati's future prospects. Despite insufficient state support to the affected area, the findings suggest an important role of emotions and place attachment during the process of recovery. Implications concerning the role of emotions and social dynamics in shaping place attachment during the disaster recovery process, as well as community resiliency, are discussed.
Keywords: community resilience, natural disasters, place attachment, wildfire
Procedia PDF Downloads 103
2047 Rainfall Estimation over Northern Tunisia by Combining Meteosat Second Generation Cloud Top Temperature and Tropical Rainfall Measuring Mission Microwave Imager Rain Rates
Authors: Saoussen Dhib, Chris M. Mannaerts, Zoubeida Bargaoui, Ben H. P. Maathuis, Petra Budde
Abstract:
In this study, a new method to delineate rain areas in northern Tunisia is presented. The proposed approach is based on blending the geostationary Meteosat Second Generation (MSG) infrared (IR) channel with the low-earth-orbiting passive Tropical Rainfall Measuring Mission (TRMM) Microwave Imager (TMI). Blending these two products requires two main steps. First, the rainy pixels are identified through a classification using the MSG IR 10.8 and water vapor (WV 6.2) channels, applying a threshold of 11 Kelvin on their temperature difference, which approximates clouds with a high likelihood of precipitation. The second step consists of fitting the relation between IR cloud-top temperature and the TMI rain rates. The correlation between these two variables is negative, meaning that rainfall intensity increases with decreasing temperature. The fitted equation is then applied to the whole day of MSG images, available at 15-minute intervals, which are summed. To validate this combined product, daily extreme rainfall events that occurred during the period 2007-2009 were selected, using a threshold criterion for large rainfall depth (> 50 mm/day) occurring at least at one rainfall station. An inverse-distance interpolation method was applied to generate rainfall maps for the drier summer season (May to October) and the wet winter season (November to April). The evaluation results of the estimated rainfall combining MSG and TMI were very encouraging: all the events were detected as rainy, and the correlation coefficients were much better than those of previously evaluated products over the study area, such as the MSG MPE and PERSIANN products. The combined product showed better performance during the wet season. We also note an overestimation of the maximum estimated rain for many events.
Keywords: combination, extreme, rainfall, TMI-MSG, Tunisia
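The two-step blending described in this abstract, thresholding the IR/WV brightness-temperature difference to flag likely-raining pixels and then fitting a linear relation between IR cloud-top temperature and collocated TMI rain rates, can be sketched as follows. All values here are illustrative toy numbers, not the study's actual data:

```python
import numpy as np

# Illustrative brightness temperatures (K) for a few collocated MSG pixels.
t_ir = np.array([210.0, 225.0, 260.0, 275.0, 215.0])  # IR 10.8 channel
t_wv = np.array([205.0, 220.0, 240.0, 250.0, 208.0])  # WV channel

# Step 1: flag pixels whose IR-WV difference is below the 11 K threshold,
# approximating clouds with a high likelihood of precipitation.
rainy = (t_ir - t_wv) < 11.0

# Step 2: fit a linear relation between IR cloud-top temperature and the
# TMI rain rates (mm/h) at the rainy pixels; the correlation is negative,
# so the slope comes out negative (colder tops -> more intense rain).
tmi_rain = np.array([8.0, 5.0, 2.5])  # rain rates at the three rainy pixels
slope, intercept = np.polyfit(t_ir[rainy], tmi_rain, 1)

# Apply the fitted relation over all rainy pixels; non-rainy pixels get 0.
rain_est = np.where(rainy, slope * t_ir + intercept, 0.0)
```

In the paper's workflow this last step would be repeated for every 15-minute MSG image of the day, with the resulting fields summed into a daily estimate.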
Procedia PDF Downloads 175
2046 Study on Capability of the Octocopter Configurations in Finite Element Analysis Simulation Environment
Authors: Jeet Shende, Leonid Shpanin, Misko Abramiuk, Mattew Goodwin, Nicholas Pickett
Abstract:
Energy harvesting on board the Unmanned Aerial Vehicle (UAV) is one of the most rapidly growing emerging technologies and consists of the collection of small amounts of energy, for different applications, from unconventional sources that are incidental to the operation of the parent system or device. Different energy harvesting techniques have already been investigated in multirotor drones, where the collected energy comes from the system's surrounding ambient environment and typically involves the conversion of solar, kinetic, or thermal energy into electrical energy. Energy harvesting from the vibrating propeller, using piezoelectric components inside the propeller, has also been proven feasible. However, the impact of this technology on UAV flight performance has not been investigated. In this contribution, the impact on multirotor drone operation has been investigated for different flight control configurations that support efficient harvesting of propeller vibration energy. The industrially made MANTIS X8-PRO octocopter frame kit was used to explore octocopter operation, which was modelled using the SolidWorks 3D CAD package for simulation studies. The octocopter flight control strategy is developed through integration of the SolidWorks 3D CAD software and the MATLAB/Simulink simulation environment for evaluation of the octocopter's behaviour under different simulated flight modes and octocopter geometries. Analysis of the two modelled octocopter geometries and their flight performance is presented via graphical representation of simulated parameters. The possibility of omitting the landing gear in the octocopter geometry is demonstrated. The conducted study evaluates the octocopter's flight control technique and its impact on the energy harvesting mechanism developed on board the octocopter.
Finite Element Analysis (FEA) simulation results of the modelled octocopter in operation are presented, exploring the performance of the octocopter's flight control and structural configurations. Applications of both octocopter structures and their flight control strategy are discussed.
Keywords: energy harvesting, flight control modelling, object modeling, unmanned aerial vehicle
Procedia PDF Downloads 77
2045 Strengthening by Assessment: A Case Study of Rail Bridges
Authors: Evangelos G. Ilias, Panagiotis G. Ilias, Vasileios T. Popotas
Abstract:
The United Kingdom has one of the oldest railway networks in the world, dating back to 1825, when the world's first passenger railway was opened. The network has some 40,000 bridges of various construction types, using a wide range of materials including masonry, steel, cast iron, wrought iron, concrete and timber. It is commonly accepted that the successful operation of the network is vital for the economy of the United Kingdom; consequently, the cost-effective maintenance of the existing infrastructure is a high priority to maintain the operability of the network, prevent deterioration and extend the life of the assets. Every bridge on the railway network is required to be assessed every eighteen years, and a structured approach to assessment is adopted, with three main types of progressively more detailed assessment: Level 0 (standardized spreadsheet assessment tools), Level 1 (analytical hand calculations) and Level 2 (generally finite element analyses). There is a degree of conservatism in the first two types of assessment, dictated to some extent by the relevant standards, which can lead to some structures not achieving the required load rating. In these situations, a Level 2 Assessment is often carried out using finite element analysis to uncover 'latent strength' and improve the load rating. If successful, the more sophisticated analysis can save on costly strengthening or replacement works and avoid disruption to the operational railway. This paper presents the 'strengthening by assessment' achieved by Level 2 analyses. The use of more accurate analysis assumptions and the implementation of non-linear modelling and functions (material, geometric and support), to better understand buckling modes and the structural behaviour of historic construction details that are not specifically covered by assessment codes, are outlined.
Metallic bridges, which are susceptible to loss of section through corrosion, have the largest scope for improvement under the Level 2 Assessment methodology. Three case studies are presented, demonstrating the effectiveness of the sophisticated Level 2 Assessment methodology using finite element analysis against the conservative approaches employed for Level 0 and Level 1 Assessments. One rail overbridge and two rail underbridges that did not achieve the required load rating by means of a Level 1 Assessment, due to the inadequate restraint provided by U-frame action, are examined, and the increase in assessed capacity given by the Level 2 Assessment is outlined.
Keywords: assessment, bridges, buckling, finite element analysis, non-linear modelling, strengthening
Procedia PDF Downloads 309
2044 A Quality Improvement Approach for Reducing Stigma and Discrimination against Young Key Populations in the Delivery of Sexual Reproductive Health and Rights Services
Authors: Atucungwiire Rwebiita
Abstract:
Introduction: In Uganda, the provision of adolescent sexual reproductive health and rights (SRHR) services for key populations is still hindered by negative attitudes, stigma and discrimination (S&D) at both the community and facility levels. To address this barrier, Integrated Community Based Initiatives (ICOBI), with support from SIDA, is currently implementing an innovative quality improvement (QI) approach for strengthening the capacity of key population (KP) peer leaders and health workers to deliver friendly SRHR services without S&D. Methods: Our innovative approach involves continuous mentorship and coaching of 8 QI teams at 8 health facilities and their catchment areas. Each of the 8 teams (comprising 5 health workers and 5 KP peer leaders) is facilitated twice a month by two QI mentors in a 2-hour mentorship session over a period of 4 months. The QI mentors received a 2-week training on QI approaches for reducing S&D against young key populations in the delivery of SRHR services. The mentorship sessions are guided by a manual on which the teams base their analysis of the root causes of S&D and their development of key performance indicators (KPIs), in the first and second sessions respectively. The teams then develop action plans in the third session and review implementation progress on the KPIs at the end of subsequent sessions. The KPIs capture information on the attitudes of health workers and peer leaders, the general service delivery setting, and clients' experience. A dashboard was developed to routinely track the KPIs for S&D across all the supported health facilities and catchment areas. After 4 months, QI teams share documented QI best practices and tested change packages on S&D in a learning and exchange session involving all the teams. Findings: The implementation of this approach is showing positive results.
So far, QI teams have already identified the root causes of S&D against key populations, including poor information among health workers, fear of a perceived risk of infection, and perceived links between HIV and disreputable behaviour. Others are perceptions that HIV and STIs are divine punishment and that sex work and homosexuality are against religious and cultural values. They have also noted the perception that MSM are mentally sick and a danger to everyone. Eight QI teams have developed action plans to address the root causes of S&D. Conclusion: This approach is promising and offers a novel and scalable means to implement stigma-reduction interventions in facility and community settings.
Keywords: key populations, sexual reproductive health and rights, stigma and discrimination, quality improvement approach
Procedia PDF Downloads 173
2043 Inelastic and Elastic Taping in Plantar Pressure of Runners Pronators: Clinical Trial
Authors: Liana Gomide, Juliana Rodrigues
Abstract:
The morphology of the foot defines its mode of operation, and sound biomechanics are indispensable for a symmetrical distribution of plantar pressures that does not overload some of its components in isolation. High plantar pressures at specific points of the foot may be a causal factor in several orthopedic disorders that affect the feet, such as pain and stress fracture. With digital baropodometry equipment, one can observe the intensity of pressures along the entire foot and quantify some of the movements, such as the subtalar pronation present in the midfoot region, which is involved in microtraumas. In clinical practice, excessive movement has been limited with the use of different taping techniques applied to the plantar arch. Thus, the objective of the present study was to analyze and compare the influence of inelastic and elastic taping on the distribution of plantar pressure in pronating runners. This is a randomized, blinded crossover clinical trial. Twenty (20) male subjects, with a mean age of 33 ± 7 years, mean body mass of 71 ± 7 kg and mean height of 174 ± 6 cm, were included in the study. Data collection was carried out by a single researcher using baropodometry equipment (Tekscan, model F-Scan Mobile). The tests were performed at three different times. In the first, an initial baropodometric evaluation was performed without taping, running at a speed of 9.0 km/h. In the second and third moments, the inelastic or elastic taping was applied consecutively, according to the order defined in the randomization. As results, it was observed that both the inelastic and the elastic taping provided significant reductions in contact pressure and peak pressure values when compared to the moment without taping.
However, the elastic taping was more effective in decreasing contact pressure (no taping = 714 ± 201, elastic taping = 690 ± 210, inelastic taping = 716 ± 180) and peak pressure in the midfoot region (no taping = 1490 ± 42, elastic taping = 1273 ± 323, inelastic taping = 1487 ± 437). It is possible to conclude that the elastic taping reduced pressure in the midfoot region, thereby limiting subtalar pronation during the run.
Keywords: elastic taping, inelastic taping, running, subtalar pronation
Procedia PDF Downloads 156
2042 Experimental Assessment of the Effectiveness of Judicial Instructions and of Expert Testimony in Improving Jurors’ Evaluation of Eyewitness Evidence
Authors: Alena Skalon, Jennifer L. Beaudry
Abstract:
Eyewitness misidentifications can sometimes lead to wrongful convictions of innocent people. This occurs in part because jurors tend to believe confident eyewitnesses even when the identification took place under suggestive conditions. Empirical research has demonstrated that jurors are often unaware of the factors that can influence the reliability of eyewitness identification. The most common legal safeguards designed to educate jurors about eyewitness evidence are judicial instructions and expert testimony. To date, very few studies have assessed the effectiveness of judicial instructions, and most of them found that judicial instructions either make jurors more skeptical of eyewitness evidence or have no effect on jurors' judgments. Similar results were obtained for expert testimony. However, none of the previous studies focused on the ability of legal safeguards to improve jurors' assessment of evidence obtained from suggestive identification procedures; this is one of the gaps addressed by this paper. Furthermore, only three studies have investigated whether legal safeguards improve the ultimate accuracy of jurors' judgments, that is, whether after listening to judicial instructions or expert testimony jurors can differentiate between accurate and inaccurate eyewitnesses. This presentation includes two studies. Both studies used genuine eyewitnesses (i.e., eyewitnesses who watched the crime) and manipulated the suggestiveness of the identification procedures. The first study manipulated the presence of judicial instructions; the second study manipulated the presence of one of two types of expert testimony: traditional verbal expert testimony, or expert testimony accompanied by visual aids. All participants watched a video recording of an identification procedure and of an eyewitness testimony. The results indicated that neither judicial instructions nor expert testimony affected jurors' judgments.
However, consistent with previous findings, when the identification procedure was non-suggestive, jurors believed accurate eyewitnesses more often than inaccurate eyewitnesses. When the procedure was suggestive, jurors believed accurate and inaccurate eyewitnesses at the same rate. The paper will discuss the implications of these studies and directions for future research.
Keywords: expert testimony, eyewitness evidence, judicial instructions, jurors’ decision making, legal safeguards
Procedia PDF Downloads 177
2041 An Investigation into the Use of an Atomistic, Hermeneutic, Holistic Approach in Education Relating to the Architectural Design Process
Authors: N. Pritchard
Abstract:
Within architectural education, students arrive forearmed with their life experience, knowledge gained from subject-based learning, and their brains and, more specifically, their imaginations. The learning-by-doing that they embark on in studio-based/project-based learning calls for supervision that allows the student to proactively undertake research and experimentation with design solution possibilities. The degree to which this supervision includes direction is subject to debate and differing opinion. It can be argued that if the student is to learn by doing, then design decision-making within the design process needs to be instigated and owned by the student so that they have the ability to personally reflect on and evaluate those decisions. Within this premise lies the problem that the student's endeavours can become unstructured and unfocused as they work their way into a new and complex activity. A resultant weakness can be that the design activity is compartmentalised rather than holistic or comprehensive, and the student's reflections are consequently impoverished in terms of providing a positive, informative feedback loop. The construct proffered in this paper is that a supportive 'armature' or 'Heuristic-Framework' can be developed that facilitates a holistic approach and reflective learning. The normal explorations of architectural design comprise analysing the site and context, reviewing building precedents, and assimilating the briefing information. However, the student can still be compromised by 'not knowing what they need to know'. The long-serving triad 'Firmness, Commodity and Delight' provides a broad-brush framework of considerations to explore and integrate into good design.
If this were further atomised into subdivisions formed from the disparate aspects of architectural design that need to be considered within the design process, then the student could sieve through the facts more methodically and reflectively, considering their interrelationships, conflicts and alliances. The words 'facts' and 'sieve' hold the acronym of the aspects that form the Heuristic-Framework: Function, Aesthetics, Context, Tectonics, Spatial, Servicing, Infrastructure, Environmental, Value and Ecological issues. The heuristic could be used as a hermeneutic model, with each aspect of design being focused on and considered in abstraction, then considered in relation to the other aspects and to the design proposal as a whole. Importantly, the heuristic could be used as a method for gathering information and enhancing the design brief. The more poetic, mysterious, intuitive, unconscious processes should still be able to occur for the student. The Heuristic-Framework should not be seen as comprehensively prescriptive, formulaic, or inhibiting to the wide exploration of possibilities and solutions within the architectural design process.
Keywords: atomistic, hermeneutic, holistic approach, architectural design studio education
Procedia PDF Downloads 260
2040 The Impact of Heat Waves on Human Health: State of Art in Italy
Authors: Vito Telesca, Giuseppina A. Giorgio
Abstract:
The Earth system is subject to a wide range of human activities that have changed the ecosystem more rapidly and extensively in the last five decades. These global changes have a large impact on human health. The relationship between extreme weather events and mortality is widely documented in different studies. In particular, a number of studies have investigated the relationship between climatological variations and the cardiovascular and respiratory systems. Researchers have become interested in evaluating the effect of environmental variations on the occurrence of different diseases (such as infarction, ischemic heart disease, asthma and respiratory problems) and on mortality. Among changes in weather conditions, heat waves have been used to investigate the association between weather conditions and cardiovascular and cerebrovascular events, using thermal indices that combine air temperature, relative humidity, and wind speed. The effects of heat waves on human health are found mainly in urban areas, and they are aggravated by the presence of atmospheric pollution. The consequences of these changes for human health are of growing concern. Meteorological conditions are a particularly relevant environmental aspect because cardiovascular diseases are more common among the elderly population, and such people are more sensitive to weather changes. In addition, heat waves, or extreme heat events, are predicted to increase in frequency, intensity, and duration with climate change. In this context, the connections between public health and climate change, increasingly recognized by medical research, are very important because they might help inform the public at large. Policy experts claim that a growing awareness of the relationship between public health and climate change could be key in breaking through political logjams impeding action on mitigation and adaptation.
The aim of this study is to investigate the importance of interactions between weather variables and their effects on human health, focusing on Italy. It also highlights the need to define strategies and practical actions for monitoring, adaptation and mitigation of the phenomenon.
Keywords: climate change, illness, Italy, temperature, weather
Procedia PDF Downloads 247
2039 Time of Week Intensity Estimation from Interval Censored Data with Application to Police Patrol Planning
Authors: Jiahao Tian, Michael D. Porter
Abstract:
Law enforcement agencies are tasked with crime prevention and crime reduction under limited resources. Having an accurate temporal estimate of the crime rate would be valuable in achieving such a goal. However, estimation is usually complicated by the interval-censored nature of crime data. We cast the problem of intensity estimation as a Poisson regression, using an EM algorithm to estimate the parameters. Two special penalties are added to provide smoothness over the time of day and the day of the week. The approach presented here provides accurate intensity estimates and can also uncover day-of-week clusters that share the same intensity patterns. Anticipating where and when crimes might occur is a key element of successful policing strategies. However, this task is complicated by the presence of interval-censored data: data for which the event time is only known to lie within an interval instead of being observed exactly. This type of data is prevalent in the field of criminology because of the absence of victims for certain types of crime. Despite its importance, research on the temporal analysis of crime has lagged behind the spatial component. Inspired by the success of solving crime-related problems with a statistical approach, we propose a statistical model for the temporal intensity estimation of crime with censored data. The model is built on Poisson regression and has special penalty terms added to the likelihood. An EM algorithm was derived to obtain maximum likelihood estimates, and the resulting model shows superior performance to the competing model. Our research is in line with the Smart Policing Initiative (SPI) proposed by the Bureau of Justice Assistance (BJA) as an effort to support law enforcement agencies in building evidence-based, data-driven law enforcement tactics. The goal is to identify strategic approaches that are effective in crime prevention and reduction.
In our case, we allow agencies to deploy their resources for a relatively short period of time to achieve the maximum level of crime reduction. By analyzing a particular area within cities where data are available, our proposed approach can provide not only an accurate estimate of the intensities for the time unit considered but also a time-varying crime incidence pattern. Both will be helpful in the allocation of limited resources, either by improving the existing patrol plan in light of the discovered day-of-week clusters or by supporting the deployment of extra resources where available.
Keywords: cluster detection, EM algorithm, interval censoring, intensity estimation
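The core EM idea in this abstract, distributing each interval-censored event across its candidate time bins in proportion to the current intensity estimate, then re-estimating the Poisson rates, can be illustrated with a minimal sketch. The intervals, the single day of exposure, and the omission of the smoothness penalties are all simplifications for illustration, not the paper's actual formulation:

```python
import numpy as np

# 24 hourly bins; each crime is known only to lie within [start, end) hours.
intervals = [(0, 6), (8, 10), (8, 10), (9, 12), (20, 24), (20, 24), (22, 24)]
n_bins, exposure = 24, 1.0  # one day of observation (illustrative)

lam = np.ones(n_bins)  # initial flat intensity estimate (events per hour)
for _ in range(50):
    expected = np.zeros(n_bins)
    # E-step: allocate each interval-censored event across its bins in
    # proportion to the current intensity estimate.
    for s, e in intervals:
        expected[s:e] += lam[s:e] / lam[s:e].sum()
    # M-step: the MLE of a Poisson rate is the expected count / exposure.
    # (The paper adds penalty terms for smoothness over time of day and
    # day of week; those are omitted here for brevity.)
    lam = expected / exposure
```

Because every event's allocation weights sum to one, the total estimated intensity always matches the event count, while bins outside all reported intervals stay at zero.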
Procedia PDF Downloads 66
2038 Sustainable Hydrogen Generation via Gasification of Pig Hair Biowaste with NiO/Al₂O₃ Catalysts
Authors: Jamshid Hussain, Kuen Song Lin
Abstract:
Over one thousand tons of pig hair biowaste (PHB) are produced yearly in Taiwan. The improper disposal of PHB can have a negative impact on the environment and consequently contribute to the spread of diseases. The treatment of PHB has become a major environmental and economic challenge, and innovative treatments must be developed because of the heavy metal and sulfur content of PHB. Like most organic materials, PHB is composed of many organic volatiles that contain large amounts of hydrogen. Hydrogen gas can be effectively produced by the catalytic gasification of PHB in a laboratory-scale fixed-bed gasifier, employing a 15 wt% NiO/Al₂O₃ catalyst at 753–913 K. The derived kinetic parameters were obtained and refined using simulation calculations. FE–SEM microphotographs showed that the NiO/Al₂O₃ catalyst particles are spherical or irregularly shaped, with diameters of 10–20 nm. HR–TEM showed that the fresh Ni particles were evenly dispersed and uniform in the microstructure of the Al₂O₃ support. The sizes of the NiO nanoparticles were vital in determining catalyst activity. The pre-edge XANES spectra of the NiO/Al₂O₃ catalysts exhibited a weak absorbance for the 1s to 3d transition, which is prohibited by the selection rule for an ideal octahedral symmetry. Similarly, the populations of Ni(II) and Ni(0) on the Al₂O₃ support are proportional to the strength of the 1s to 4pxy transition. The weak shoulder at 8329–8334 eV and a strong feature at 8345–8353 eV were ascribed to the 1s to 4pxy transition, which suggested the presence of NiO species on the Al₂O₃ support during PHB catalytic gasification. As determined by the XANES analyses, Ni(II)→Ni(0) reduction was mostly observed. The oxidation of PHB on the NiO/Al₂O₃ surface may have resulted in Ni(0) and the formation of tar during the gasification process. The EXAFS spectra revealed Ni atoms with Ni–Ni/Ni–O bonds.
The Ni–O bonding proved that the produced syngas was unable to reduce NiO to Ni(0) completely. The weakness of the Ni–Ni bonds may have been caused by the highly dispersed Ni in the Al₂O₃ support. The central Ni atoms have Ni–O (2.01 Å) and Ni–Ni (2.34 Å) bond distances in the fresh NiO/Al₂O₃ catalyst. The PHB was converted into hydrogen-rich syngas (CO + H₂, >89.8% dry basis). When PHB (250 kg h⁻¹) was catalytically gasified at 753–913 K, syngas was produced with approximately 5.45 × 10⁵ kcal h⁻¹ of heat recovery at 76.5%–83.5% cold gas efficiency. The simulation of the pilot-scale PHB catalytic gasification demonstrated that the system could provide hydrogen (purity > 99.99%) and generate electricity for an internal combustion engine of 100 kW and a proton exchange membrane fuel cell (PEMFC) of 175 kW. The projected payback for a PHB catalytic gasification plant with a capacity of 10 or 20 TPD (tons per day) was around 3.2 or 2.5 years, respectively.
Keywords: pig hair biowaste, catalytic gasification, hydrogen production, PEMFC, resource recovery
Procedia PDF Downloads 13
2037 A Method to Assess Aspect of Sustainable Development: Walkability
Authors: Amna Ali Al-Saadi, Riken Homma, Kazuhisa Iki
Abstract:
Despite the fact that many places have succeeded in achieving some aspects of sustainable urban development, there are no scientific facts to convince decision makers, and each solution was developed to fulfill the needs of one specific city only. Therefore, an objective method to generate solutions from successful cases is the aim of this research. The questions were: how to learn the lesson from each case study; how to distinguish the potentially positive criteria from the negative ones; and how to quantify their effects on future development. Walkability was selected as the goal, because it has been found to be a route to a healthy lifestyle as well as to social, environmental, and economic sustainability. Moreover, it is as complex as every other aspect of sustainable development. This research rests on a quantitative, comparative methodology for assessing pedestrian-oriented development. Three analyzed areas (AAs) were selected: one site in Oman, hypothesized to represent motorized-oriented development, and two sites in Japan, where the development is pedestrian-friendly. The study used the multi-criteria evaluation method (MCEM). MCEM initially stands on the analytic hierarchy process (AHP), which was structured into a main goal (walkability), objectives (functions and layout), and attributes (the urban form criteria). Secondly, GIS was used to evaluate the attributes in multi-criteria maps. Since each criterion has a different scale of measurement, all results were standardized by z-score and used to measure the correlations among criteria. As a result, a different scenario was generated from each AA. MCEM (AHP-OWA) with GIS measured the walkability score and determined the priority of criteria for development in the non-walker-friendly environment. The comparison of criteria by z-score presented a measurably distinguished orientation of development. This result has been used to prove that Oman is a motorized environment while Japan is walkable.
Also, it identified the powerful criteria and the weak criteria regardless of the AA. This result has been used to generalize the priorities for walkable development. In conclusion, the method was found successful in generating a scientific basis for policy decisions.
Keywords: walkability, policy decisions, sustainable development, GIS
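Since the method leans on z-score standardization to make criteria measured on different scales comparable, that step can be sketched minimally as follows (the sample values are invented):

```python
import statistics

def z_scores(values):
    """Standardize one criterion's raw scores to zero mean and unit
    variance, so criteria on different measurement scales can be
    compared and correlated."""
    mean = statistics.mean(values)
    sd = statistics.pstdev(values)  # population SD; sample SD is an equally valid choice
    return [(v - mean) / sd for v in values]

zs = z_scores([1.0, 2.0, 3.0])
```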
Procedia PDF Downloads 440
2036 Comparison of Virtual Non-Contrast to True Non-Contrast Images Using Dual-Layer Spectral Computed Tomography
Authors: O’Day Luke
Abstract:
Purpose: To validate virtual non-contrast reconstructions generated from dual-layer spectral computed tomography (DL-CT) data as an alternative to the acquisition of a dedicated true non-contrast dataset during multiphase contrast studies. Material and methods: Thirty-three patients underwent a routine multiphase clinical CT examination, using dual-layer spectral CT, from March to August 2021. True non-contrast (TNC) and virtual non-contrast (VNC) datasets, generated from both portal venous and arterial phase imaging, were evaluated. For every patient, in both the true and virtual non-contrast datasets, a region of interest (ROI) was defined in the aorta, liver, fluid (i.e., gallbladder, urinary bladder), kidney, muscle, fat, and spongious bone, resulting in 693 ROIs. Differences in attenuation between VNC and TNC images were compared, both separately and combined. Consistency between VNC reconstructions obtained from the arterial and portal venous phases was evaluated. Results: Comparison of CT density (HU) on the VNC and TNC images showed a high correlation. The mean difference between TNC and VNC images (excluding bone results) was 5.5 ± 9.1 HU, and > 90% of all comparisons showed a difference of less than 15 HU. For all tissues but spongious bone, the mean absolute difference between TNC and VNC images was below 10 HU. VNC images derived from the arterial and the portal venous phases showed a good correlation in most tissue types. The aortic attenuation was, however, somewhat dependent on which dataset was used for reconstruction. Bone evaluation with VNC datasets continues to be a problem, as spectral CT algorithms are currently poor at differentiating bone and iodine. Conclusion: Given the increasing availability of DL-CT and the proven accuracy of virtual non-contrast processing, VNC is a promising tool for generating additional data during routine contrast-enhanced studies.
This study shows the utility of virtual non-contrast scans as an alternative to true non-contrast studies during multiphase CT, with potential for dose reduction without loss of diagnostic information.
Keywords: dual-layer spectral computed tomography, virtual non-contrast, true non-contrast, clinical comparison
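The agreement metrics reported above (mean HU difference, fraction of comparisons within 15 HU) come down to a few lines of arithmetic; a sketch with invented ROI values, not the study's data:

```python
def hu_agreement(tnc_hu, vnc_hu, threshold=15.0):
    """Mean TNC-VNC difference and the fraction of ROI pairs whose
    absolute difference is within `threshold` HU."""
    diffs = [t - v for t, v in zip(tnc_hu, vnc_hu)]
    mean_diff = sum(diffs) / len(diffs)
    within = sum(abs(d) <= threshold for d in diffs) / len(diffs)
    return mean_diff, within

# Four invented ROI pairs (HU): three agree within 15 HU, one does not.
mean_diff, within = hu_agreement([50.0, 30.0, 10.0, 100.0],
                                 [45.0, 28.0, 12.0, 80.0])
```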
Procedia PDF Downloads 141
2035 Evaluation of κ-Carrageenan Hydrogel Efficiency in Wound-Healing
Authors: Ali Ayatic, Emad Mozaffari, Bahareh Tanhaei, Maryam Khajenoori, Saeedeh Movaghar Khoshkho, Ali Ayati
Abstract:
The abuse of antibiotics such as tetracycline (TC) is a great global threat, and the use of topical antibiotics is a promising tactic that can help to solve this problem. Antibiotic therapy is often appropriate and necessary for acute wound infections, while topical tetracycline can be highly efficient in improving the wound healing process in diabetics. Due to the advantages of drug-loaded hydrogels as wound dressings, such as ease of handling, high moisture resistance, excellent biocompatibility, and the ability to activate immune cells to speed wound healing, they are considered an ideal wound treatment. In this work, tetracycline-loaded hydrogels combining agar (AG) and κ-carrageenan (κ-CAR) as polymer materials were prepared, into which Span 60 surfactant was introduced as a drug carrier. Field emission scanning electron microscopy (FESEM) and Fourier-transform infrared spectroscopy (FTIR) were employed to provide detailed information on the morphology, composition, and structure of the fabricated drug-loaded hydrogels; their mechanical properties and water vapor permeability were investigated as well. Two types of bacteria, gram-negative and gram-positive, were used to explore the antibacterial properties of the prepared tetracycline-containing hydrogels. Their swelling and drug release behavior was studied by varying factors such as the polysaccharide ratio (MAG/MCAR), the Span 60 surfactant concentration, the potassium chloride (KCl) concentration, and the release medium (deionized water (DW), phosphate-buffered saline (PBS), and simulated wound fluid (SWF)) at different times. Finally, the kinetics of hydrogel swelling were studied, and the experimental data of TC release into DW, PBS, and SWF were evaluated against various mathematical models, such as the Higuchi, Korsmeyer-Peppas, zero-order, and first-order models, in linear and nonlinear modes.
Keywords: drug release, hydrogel, tetracycline, wound healing
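As one illustration of the release-kinetics fitting mentioned above, a minimal sketch of the Korsmeyer-Peppas model Mt/M∞ = k·tⁿ, linearized in log-log space (the data points below are invented, not the paper's):

```python
import math

def fit_korsmeyer_peppas(times, fractions):
    """Fit log(Mt/Minf) = log(k) + n*log(t) by ordinary least squares,
    returning the release constant k and release exponent n."""
    xs = [math.log(t) for t in times]
    ys = [math.log(f) for f in fractions]
    n_pts = len(xs)
    mx = sum(xs) / n_pts
    my = sum(ys) / n_pts
    n = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    k = math.exp(my - n * mx)
    return k, n

# Invented release data lying exactly on k=0.1, n=0.5 (Fickian diffusion).
k, n = fit_korsmeyer_peppas([1.0, 4.0, 9.0], [0.1, 0.2, 0.3])
```

An exponent n ≈ 0.5 in this model is conventionally read as Fickian diffusion-controlled release, which is why the fitted n is usually reported alongside k.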
Procedia PDF Downloads 80
2034 Integration of Educational Data Mining Models to a Web-Based Support System for Predicting High School Student Performance
Authors: Sokkhey Phauk, Takeo Okazaki
Abstract:
The challenging task in educational institutions is to maximize the high performance of students and minimize the failure rate of poor-performing students. An effective way to address this task is to know students' learning patterns, with their highly influential factors, and to get an early prediction of student learning outcomes at a timely stage so that policies for improvement can be set up. Educational data mining (EDM) is an emerging disciplinary field of data mining, statistics, and machine learning concerned with extracting useful knowledge and information for the sake of improvement and development in the education environment. The aim of this work is to propose techniques in EDM and integrate them into a web-based system for predicting poor-performing students. A comparative study of prediction models is conducted, and high-performing models are subsequently developed to obtain higher performance; the hybrid random forest (Hybrid RF) produces the most successful classification. For the context of intervention and improving learning outcomes, a feature selection method, MICHI, which combines the mutual information (MI) and chi-square (CHI) algorithms based on ranked feature scores, is introduced to select a dominant feature set that improves the performance of prediction, and the obtained dominant set is used as information for intervention. Using the proposed EDM techniques, an academic performance prediction system (APPS) is subsequently developed for educational stakeholders to get an early prediction of student learning outcomes for timely intervention. Experimental outcomes and evaluation surveys report the effectiveness and usefulness of the developed system, which is used to help educational stakeholders and related individuals intervene and improve student performance.
Keywords: academic performance prediction system, educational data mining, dominant factors, feature selection method, prediction model, student performance
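The abstract does not spell out how the MI and chi-square scores are merged in MICHI; one plausible reading, combining the two per-feature rankings by summed rank, can be sketched as follows (the feature scores and the rank-sum rule are assumptions for illustration, not the authors' exact algorithm):

```python
def michi_rank(mi_scores, chi_scores, top_k):
    """Combine two per-feature score dicts (mutual information and
    chi-square) by summing each feature's rank in the two orderings,
    then return the top_k best-ranked feature names."""
    def ranks(scores):
        ordered = sorted(scores, key=scores.get, reverse=True)
        return {name: r for r, name in enumerate(ordered)}
    mi_r, chi_r = ranks(mi_scores), ranks(chi_scores)
    combined = {f: mi_r[f] + chi_r[f] for f in mi_scores}
    return sorted(combined, key=combined.get)[:top_k]

# Invented scores: 'a' wins on MI, 'b' wins on chi-square, 'c' is weak on both.
top = michi_rank({'a': 0.9, 'b': 0.5, 'c': 0.1},
                 {'a': 10.0, 'b': 20.0, 'c': 1.0}, 2)
```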
Procedia PDF Downloads 106
2033 Analysis of Storm Flood in Typical Sewer Networks in High Mountain Watersheds of Colombia Based on SWMM
Authors: J. C. Hoyos, J. Zambrano Nájera
Abstract:
Increasing urbanization has led to changes in the natural dynamics of watersheds, causing problems such as increases in runoff volumes, peak flow rates, and flow velocities, so that the risk of storm flooding increases. Sewerage networks designed 30-40 years ago do not account for these increases in flow volumes and velocities. Andean cities with steep slopes worsen the problem, because velocities become even higher, preventing the sewerage network from working properly and making cities less resilient to landscape change and climate change. In Latin America, and especially Colombia, this is a major problem: by the late twentieth century, more than 70% of the population lived in urban areas, after growth of approximately 790% over the period 1940-1990. Thus, it becomes very important to study how changes in hydrological behavior affect the hydraulic capacity of sewerage networks in Andean urban watersheds. This research aims to determine the impact of urbanization on the hydrology of high-sloped urban watersheds. To this end, the experimental Palogrande-San Luis urban watershed, located in the city of Manizales, Colombia, will be used as the study area. Manizales is a city in central western Colombia, located in the Colombian Central Mountain Range (part of the Andes) with abrupt topography (average altitude 2,153 m). The climate in Manizales is quite uniform, but due to its high altitude, it presents high precipitation (1,545 mm/year on average) with high humidity (83% on average). The behavior of the current sewerage network will be reviewed with the hydraulic model SWMM (Storm Water Management Model). Based on SWMM, the hydrological response of the selected urban watershed will be evaluated under design storms of different frequencies for the region, considering drainage performance, waterlogging, overland flow on roads, etc.
Cartographic information was obtained from GIS thematic maps of the Institute of Environmental Studies of the Universidad Nacional de Colombia and the utility Aguas de Manizales S.A. Rainfall and streamflow data are obtained from 4 rain gauges and 1 stream gauge. This information will make it possible to determine critical issues in drainage system design in urban watersheds with very high slopes, and which practices should be discarded or recommended.
Keywords: land cover changes, storm sewer system, urban hydrology, urban planning
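SWMM performs full dynamic routing, but the core effect described above, urbanization raising peak flows, is already visible in the much simpler rational method; a sketch with invented runoff coefficients, rainfall intensity, and catchment area:

```python
def rational_peak_flow(runoff_coeff, intensity_mm_h, area_ha):
    """Rational-method peak discharge in m^3/s: Q = C * i * A / 360,
    with i in mm/h and A in hectares. A crude screening formula only;
    SWMM replaces it with hydrograph generation and dynamic routing."""
    return runoff_coeff * intensity_mm_h * area_ha / 360.0

# Same invented 60 mm/h storm over a 12 ha catchment: paving the
# surface (C: 0.3 -> 0.9) triples the peak flow.
q_urban = rational_peak_flow(0.9, 60.0, 12.0)
q_green = rational_peak_flow(0.3, 60.0, 12.0)
```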
Procedia PDF Downloads 261
2032 Media Representations of Gender-Intersectional Analysis of Impact/Influence on Collective Consciousness and Perceptions of Feminism, Gender, and Gender Equality: Evidence from Cultural/Media Sources in Nigeria
Authors: Olatawura O. Ladipo-Ajayi
Abstract:
The concept of gender equality is not new, nor are the efforts and movements toward achieving it. The idea of gender equality originates from the early feminist movements of the 1880s and their subsequent waves, all fighting to promote gender rights and equality with varying focuses and groups. Nonetheless, progress toward gender equality is not advancing at similar rates across the world and across groups. This uneven progress is often due to varying social, cultural, political, and economic factors, some of which underpin intersectional identities and influence perceptions of gender and associated gender roles that create gender inequality. In assessing perceptions of gender and the assigned roles or expectations that cause inequalities, intersectionality provides a framework to interrogate how these perceptions are molded and reinforced to create marginalization. Intersectionality is increasingly becoming a lens and approach for better understanding inequalities and oppression, gender rights and equality, the challenges to their achievement, and how best to move forward in the fight for gender rights, inclusion, and equality. In light of this, this paper looks at intersectional representations of gender in the media within cultural and social contexts, particularly entertainment media, and how these influence perceptions of gender and impact progress toward achieving gender equality and advocacy. Furthermore, the paper explores how various identities and, to an extent, personal experiences play a role in perceptions and representations of gender, as well as influence the development of policies that promote gender equality in general. Finally, the paper applies qualitative and auto-ethnographic research methods, building on intersectional and social-construction frameworks, to analyze gender representation in media through a literature review of scholarly works, news items, and cultural/social sources such as Nigerian movies.
It concludes that the media influence ideas and perceptions of gender, gender equality, and rights, and that not enough is being done in global-south media to challenge hegemonic patriarchal and binary concepts of gender. As such, the growth of feminism and the attainment of gender equality are slow, and the concepts are often misunderstood. There is a need to leverage media outlets to influence perceptions and start informed conversations on gender equality and feminism, and to build collective consciousness locally to improve advocacy for equal gender rights. Changing the gender narrative in everyday media, including entertainment media, is one way to influence public perceptions of gender, promote the concept of gender equality, and advocate for policies that support equality.
Keywords: gender equality, gender roles/socialization, intersectionality, representation of gender in media
Procedia PDF Downloads 105
2031 Production Process for Diesel Fuel Components Polyoxymethylene Dimethyl Ethers from Methanol and Formaldehyde Solution
Authors: Xiangjun Li, Huaiyuan Tian, Wujie Zhang, Dianhua Liu
Abstract:
Polyoxymethylene dimethyl ethers (PODEn) as a clean diesel additive can improve the combustion efficiency and quality of diesel fuel and alleviate the problem of atmospheric pollution. Among the synthetic routes, PODE production from methanol and formaldehyde is regarded as the most economical and promising. However, the methanol used for synthesizing PODE produces water, which causes the loss of active centers of the catalyst and hydrolysis of PODEn during production. A macroporous, strongly acidic cation exchange resin catalyst was prepared, which has comparative advantages over other common solid acid catalysts in terms of stability and catalytic efficiency for synthesizing PODE. Catalytic reactions were carried out at 353 K, 1 MPa, and 3 mL·gcat⁻¹·h⁻¹ in a fixed-bed reactor. Methanol conversion and PODE3-6 selectivity reached 49.91% and 23.43%, respectively. Catalyst lifetime evaluation showed that the resin catalyst retained its catalytic activity for 20 days without significant changes, and the catalytic activity of a completely deactivated resin catalyst could essentially be restored to its previous level by simple acid regeneration. The acid exchange capacities of the original and deactivated catalysts were 2.5191 and 0.0979 mmol·g⁻¹, respectively, while the regenerated catalyst reached 2.0430 mmol·g⁻¹, indicating that the main reason for resin catalyst deactivation is that the Brønsted acid sites of the original resin catalyst were temporarily replaced by non-hydrogen-ion cations. A separation process consisting of extraction and distillation was designed to remove water and unreacted formaldehyde from the reaction mixture and to purify the PODE3-6 product, whose concentration in the final product can reach up to 97%. These results indicate that the scale-up production of PODE3-6 from methanol and formaldehyde solution is feasible.
Keywords: inactivation, polyoxymethylene dimethyl ethers, separation process, sulfonic cation exchange resin
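The reported 49.91% conversion and 23.43% selectivity are the standard catalysis ratios; a minimal sketch, in which the mole numbers are invented, chosen only to reproduce those percentages (and the molar basis is an assumption):

```python
def conversion(feed_in_mol, feed_out_mol):
    """Fraction of the feed (here methanol) consumed in the reactor."""
    return (feed_in_mol - feed_out_mol) / feed_in_mol

def selectivity(product_mol, converted_mol):
    """Fraction of the converted feed that ends up in the desired
    product (here PODE3-6), on the basis assumed for this sketch."""
    return product_mol / converted_mol

# Invented basis: 100 mol methanol in, 50.09 mol out -> 49.91% conversion.
x_meoh = conversion(100.0, 50.09)
s_pode = selectivity(11.694, 49.91)
```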
Procedia PDF Downloads 137
2030 Comparative Evaluation of Seropositivity and Patterns Distribution Rates of the Anti-Nuclear Antibodies in the Diagnosis of Four Different Autoimmune Collagen Tissue Diseases
Authors: Recep Kesli, Onur Turkyilmaz, Cengiz Demir
Abstract:
Objective: Autoimmune collagen diseases arise from immune reactions against the body's own cells or tissues, which cause inflammation and damage tissues and organs. In this study, it was aimed to compare the seropositivity rates and patterns of anti-nuclear antibodies (ANA) in the diagnosis of four different autoimmune collagen tissue diseases (rheumatoid arthritis, RA; systemic lupus erythematosus, SLE; scleroderma, SSc; and Sjögren syndrome, SS). Methods: One hundred eighty-eight patients who presented to different clinics of Afyon Kocatepe University ANS Practice and Research Hospital between 11.07.2014 and 14.07.2015 with a suspected collagen disease such as RA, SLE, SSc, or SS were included in the study retrospectively. All data obtained from the patients included in the study were evaluated according to the inclusion criteria. The patients' archived records were screened and assessed for ANA positivity. The obtained data were analyzed using descriptive statistics and the chi-squared and Fisher's exact tests. The evaluations were performed with SPSS version 20.0, and p < 0.05 was considered significant. Results: The distribution of the one hundred eighty-eight patients according to diagnosis was as follows: 82 (43.6%) RA, 38 (20.2%) SLE, 22 (11.7%) SSc, and 46 (24.5%) SS. The distribution of ANA positivity rates according to the collagen tissue diseases was as follows: 54 (65.9%) for RA, 36 (94.7%) for SLE, 18 (81.8%) for SSc, and 43 (93.5%) for SS. Rheumatoid arthritis should be evaluated and classified separately from the other three investigated autoimmune illnesses: ANA positivity rates were markedly higher (91.5%) in SLE, SSc, and SS than in RA (65.9%). The difference in ANA positivity rates between RA and the other three diseases was statistically significant (p = 0.015).
Conclusions: Systemic autoimmune illnesses show a broad spectrum. ANA positivity was found to be an important predictive marker in the diagnosis of rheumatologic illnesses, and it should be regarded as a more valuable and sensitive diagnostic predictor in the laboratory work-up of SLE, SSc, and SS than in that of RA.
Keywords: antinuclear antibody (ANA), rheumatoid arthritis, scleroderma, Sjögren syndrome, systemic lupus erythematosus
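The RA-versus-other-diseases comparison above amounts to a chi-squared test on a 2×2 table of ANA-positive/negative counts; a minimal sketch of the statistic (the example table is invented, and the paper's p = 0.015 presumably comes from its full data in SPSS):

```python
def chi2_2x2(a, b, c, d):
    """Pearson chi-squared statistic (no continuity correction) for the
    2x2 table [[a, b], [c, d]], e.g. rows = disease groups,
    columns = ANA positive / ANA negative counts."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# A perfectly balanced table gives 0; an imbalanced invented table does not.
stat_null = chi2_2x2(10, 10, 10, 10)
stat_assoc = chi2_2x2(20, 10, 10, 20)
```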
Procedia PDF Downloads 243
2029 Greenhouse Gasses’ Effect on Atmospheric Temperature Increase and the Observable Effects on Ecosystems
Authors: Alexander J. Severinsky
Abstract:
Radiative forcing by greenhouse gases (GHG) increases the temperature of the Earth's surface, more on land and less in the oceans, due to their thermal capacities. Given this inertia, the temperature increase is delayed over time. The air temperature, however, is not delayed, as the thermal capacity of air is much lower. In this study, through analysis and synthesis of multidisciplinary science and data, an estimate of the atmospheric temperature increase is made. This estimate is then used to shed light on current observations of ice and snow loss, desertification and forest fires, and increased extreme air disturbances. The reason for this inquiry is the author's skepticism that a ~1 °C rise in global average surface temperature within the last 50-60 years can explain the current changes. The only other plausible cause to explore is a rise in atmospheric temperature. The study analyzes the air temperature rise from three different scientific disciplines: thermodynamics, climate science experiments, and climatic historical studies. The results coming from these diverse disciplines are nearly the same, within ±1.6%. The direct radiative forcing of GHGs with a high level of scientific understanding is near 4.7 W/m² on average over the Earth's entire surface in 2018, compared to pre-industrial times in the mid-1700s. The additional radiative forcing of fast feedbacks from various forms of water adds approximately 15 W/m². In 2018, these radiative forcings heated the atmosphere by approximately 5.1 °C, which will create a thermal-equilibrium average ground surface temperature increase of 4.6 °C to 4.8 °C by the end of this century. After 2018, the temperature will continue to rise without any additional increase in the concentration of GHGs, primarily carbon dioxide and methane. These findings on the radiative forcing of GHGs in 2018 were applied to estimate effects on major Earth ecosystems.
This additional forcing of nearly 20 W/m² increases the rate of ice melting by over 90 cm/year, raises green-leaf temperatures by nearly 5 °C, and increases the work energy of air by approximately 40 J/mol. This explains the observed high rates of ice melting at all altitudes and latitudes, the spread of deserts and increase in forest fires, and the increased energy of tornadoes, typhoons, hurricanes, and extreme weather much more plausibly than the 1.5 °C increase in average global surface temperature over the same time interval. Planned mitigation and adaptation measures might prove to be much more effective when directed toward the reduction of existing GHGs in the atmosphere.
Keywords: greenhouse radiative force, greenhouse air temperature, greenhouse thermodynamics, greenhouse historical, greenhouse radiative force on ice, greenhouse radiative force on plants, greenhouse radiative force in air
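The step from a radiative forcing to a temperature response uses the standard linear relation ΔT = ΔF/λ. In the sketch below, the feedback parameter λ is back-solved from the abstract's own round figures (~20 W/m² producing ~5 °C), not a consensus estimate, so both numbers should be read as assumptions of this illustration only:

```python
def implied_feedback_param(delta_f_w_m2, delta_t_k):
    """Climate feedback parameter lambda implied by a quoted
    forcing/response pair: lambda = dF / dT."""
    return delta_f_w_m2 / delta_t_k

def equilibrium_warming(delta_f_w_m2, feedback_param_w_m2_k):
    """Equilibrium temperature response dT = dF / lambda."""
    return delta_f_w_m2 / feedback_param_w_m2_k

lam = implied_feedback_param(20.0, 5.0)   # from the abstract's own figures
dt = equilibrium_warming(20.0, lam)       # round trip back to ~5 K
```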
Procedia PDF Downloads 104
2028 Vehicles Analysis, Assessment and Redesign Related to Ergonomics and Human Factors
Authors: Susana Aragoneses Garrido
Abstract:
Every day, the roads are the scene of numerous accidents involving vehicles, producing thousands of deaths and serious injuries all over the world. Investigations have revealed that human factors (HF) are one of the main causes of road accidents in modern societies. Distracted driving (by external or internal aspects of the vehicle), which is considered a human factor, is a serious and emerging risk to road safety, so further analysis of this issue is essential given its significance for today's society. The objectives of this investigation are the detection and assessment of HF in order to provide solutions, including better vehicle design, that might mitigate road accidents. The methodology of the project is divided into different phases. First, a statistical analysis of public databases from Spain and the UK is provided. Second, the data are classified in order to analyze the major causes involved in road accidents. Third, a simulation of different paths and vehicles is presented, and the causes related to HF are assessed by failure mode and effects analysis (FMEA). Fourth, different car models are evaluated using the Rapid Upper Limb Assessment (RULA). Additionally, the Jack Siemens PLM tool is used with the intention of evaluating the human factor causes and producing a redesign of the vehicles. Finally, improvements in the car design are proposed with the intention of reducing the implication of HF in traffic accidents. The results from the statistical analysis, the simulations, and the evaluations confirm that accidents are an important issue in today's society, especially accidents caused by HF such as distractions. The results explore the reduction of external and internal HF through a global risk analysis of vehicle accidents.
Moreover, the evaluation of the different car models using the RULA method and the Jack Siemens PLM tool proves the importance of properly adjusting the driver's seat in order to avoid harmful postures and, therefore, distractions. For this reason, a car redesign is proposed so that the driver acquires the optimum position, consequently reducing the role of human factors in road accidents.
Keywords: vehicle analysis, assessment, ergonomics, car redesign
Procedia PDF Downloads 335
2027 An Examination of Earnings Management by Publicly Listed Targets Ahead of Mergers and Acquisitions
Authors: T. Elrazaz
Abstract:
This paper examines accrual and real earnings management by publicly listed targets around mergers and acquisitions. Prior literature shows that earnings management around mergers and acquisitions can have a significant economic impact because of the associated wealth transfers among stakeholders. More importantly, acting on behalf of their shareholders or pursuing their self-interest, managers of both targets and acquirers may be equally motivated to manipulate earnings prior to an acquisition to generate higher gains for their shareholders or themselves. Building on the grounds of information asymmetry, agency conflicts, stewardship theory, and the revelation principle, this study addresses the question of whether takeover targets employ accrual and real earnings management in the periods prior to the announcement of mergers and acquisitions (M&A). Additionally, this study examines whether acquirers are able to detect targets' earnings management and, in response, adjust the acquisition premium paid so as not to face the risk of overpayment. This study uses an aggregate accruals approach to estimate accrual earnings management, proxied by estimated abnormal accruals, while real earnings management is proxied by models widely used in the accounting and finance literature. The results indicate that takeover targets manipulate their earnings using accruals in the second year with an earnings release prior to the announcement of the M&A. Moreover, when the sample of targets is partitioned according to the method of payment used in the deal, the results are restricted to targets of stock-financed deals. These results are consistent with the argument that targets of cash-only or mixed-payment deals do not have motivations to manage their earnings as strong as those of their stock-financed counterparts, additionally supporting the findings of prior studies that the method of payment in takeovers is value-relevant.
The findings of this study also indicate that takeover targets manipulate earnings upwards by cutting discretionary expenses in the year prior to the acquisition, while they do not do so by manipulating sales or production costs. Moreover, when the sample of targets is partitioned according to the method of payment used in the deal, the results are again restricted to targets of stock-financed deals, providing further robustness to the results derived under the accrual-based models. Finally, this study finds evidence suggesting that acquirers are fully aware of the accrual-based techniques employed by takeover targets and can unveil such manipulation practices. These results are robust to alternative accrual and real earnings management proxies, as well as to controlling for the method of payment in the deal.
Keywords: accrual earnings management, acquisition premium, real earnings management, takeover targets
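Abnormal (discretionary) accruals in this literature are the residuals from a normal-accruals regression such as the modified Jones model; a deliberately simplified single-regressor sketch (the regressor and all data are invented, standing in for the model's scaled revenue-change and PPE terms):

```python
def ols_residuals(x, y):
    """Fit y = a + b*x by ordinary least squares and return the
    residuals, the stand-in here for 'abnormal' accruals left after
    removing the component explained by a normal-accruals regressor."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    return [yi - (a + b * xi) for xi, yi in zip(x, y)]

# Invented data lying exactly on a line has no 'abnormal' component.
res_line = ols_residuals([1.0, 2.0, 3.0, 4.0], [3.0, 5.0, 7.0, 9.0])
# For any data, OLS residuals with an intercept sum to zero.
res_noisy = ols_residuals([1.0, 2.0, 3.0], [1.0, 3.0, 2.0])
```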
Procedia PDF Downloads 115
2026 Development and Validation of a Carbon Dioxide TDLAS Sensor for Studies on Fermented Dairy Products
Authors: Lorenzo Cocola, Massimo Fedel, Dragiša Savić, Bojana Danilović, Luca Poletto
Abstract:
An instrument for the detection and evaluation of gaseous carbon dioxide in the headspace of closed containers has been developed in the context of the Packsensor Italian-Serbian joint project. The device is based on tunable diode laser absorption spectroscopy (TDLAS) with a wavelength modulation spectroscopy (WMS) technique, in order to accomplish a non-invasive measurement inside closed containers of fermented dairy products (yogurts and fermented cheese in cups and bottles). The purpose of this instrument is the continuous monitoring of carbon dioxide concentration during incubation and storage, over the whole shelf life of the product, in the presence of different microorganisms. The instrument's optical front end has been designed to be integrated into a thermally stabilized incubator. An embedded computer provides processing of spectral artifacts and storage of an arbitrary set of calibration data, allowing a properly calibrated measurement on many samples (cups and bottles) of the different shapes and sizes commonly found in retail distribution. A calibration protocol has been developed so that the instrument can be calibrated in the field, even on containers that are notoriously difficult to seal properly. This calibration protocol is described and evaluated against reference measurements obtained through an industry-standard (sampling) carbon dioxide metering technique. Several sets of validation measurements on different containers are reported, and two test recordings of carbon dioxide concentration evolution are shown as examples of instrument operation. The first demonstrates the ability to monitor rapid yeast growth in a contaminated sample through the increase of headspace carbon dioxide. The other shows the dissolution transient with a non-saturated liquid medium in the presence of a carbon-dioxide-rich headspace atmosphere.
Keywords: TDLAS, carbon dioxide, cups, headspace, measurement
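Absorption-based concentration measurement like TDLAS rests on the Beer-Lambert law; a minimal sketch of the direct-absorption form (the effective absorption coefficient and path length are invented, and real WMS processing uses lock-in detection of modulation harmonics rather than this simple inversion):

```python
import math

def transmission(alpha_per_cm, concentration, path_cm):
    """Beer-Lambert transmitted fraction I/I0 = exp(-alpha * C * L)."""
    return math.exp(-alpha_per_cm * concentration * path_cm)

def concentration_from_transmission(i_over_i0, alpha_per_cm, path_cm):
    """Invert Beer-Lambert to recover the gas concentration from a
    measured transmitted fraction."""
    return -math.log(i_over_i0) / (alpha_per_cm * path_cm)

# Round trip with invented values: alpha = 0.1 cm^-1 per unit
# concentration, 10 cm headspace path, CO2 fraction 0.05.
t = transmission(0.1, 0.05, 10.0)
c = concentration_from_transmission(t, 0.1, 10.0)
```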
Procedia PDF Downloads 324
2025 Integration Process and Analytic Interface of Different Environmental Open Data Sets with Java/Oracle and R
Authors: Pavel H. Llamocca, Victoria Lopez
Abstract:
The main objective of our work is the comparative analysis of environmental data from Open Data portals belonging to different governments, which means integrating data from various different sources. Nowadays, many governments intend to publish thousands of data sets for people and organizations to use, and the number of applications based on Open Data is increasing accordingly. However, each government has its own procedures for publishing its data, which leads to a variety of data set formats, because there are no international standards specifying formats for Open Data. Due to this variety, we must build a data integration process that is able to bring together all kinds of formats. Some software tools have been developed to support the integration process, e.g., Data Tamer and Data Wrangler. The problem with these tools is that they require a data scientist's interaction as a final step of the integration process. In our case, we do not want to depend on a data scientist, because environmental data are usually similar and these processes can be automated by programming. The main idea of our tool is to build Hadoop procedures adapted to the data sources of each government in order to achieve automated integration. Our work focuses on environmental data such as temperature, energy consumption, air quality, solar radiation, wind speed, etc. For the past two years, the government of Madrid has been publishing its Open Data sets on environmental indicators in real time. Likewise, other governments (such as Andalucía or Bilbao) have published Open Data sets related to the environment. All of those data sets have different formats, yet our solution is able to integrate all of them; furthermore, it allows the user to perform and visualize analyses over the real-time data.
Once the integration task is done, all the data from every government share the same format, and the analysis process can start in a computationally better way. The tool presented in this work therefore has two goals: 1. an integration process; and 2. a graphic and analytic interface. As a first approach, the integration process was developed using Java and Oracle, and the graphic and analytic interface with Java (JSP). However, in order to open up our software tool, as a second approach we also developed an implementation in the R language, a mature open-source technology. R is a powerful open-source programming language that allows us to process and analyze a huge amount of data with high performance. There are also R libraries, such as Shiny, for building graphic interfaces. A performance comparison between both implementations was made, and no significant differences were found. In addition, our work provides an official real-time integrated data set of environmental data in Spain, so that any developer can build their own applications.
Keywords: open data, R language, data integration, environmental data
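The per-source adaptation the abstract describes can be sketched as follows. The column names, the sample rows, and the `integrate` helper are illustrative assumptions, not the authors' actual Hadoop procedures; the sketch only shows the idea of mapping each government's export format onto one common schema.

```python
import csv
import io

# Hypothetical per-source adapters: each maps one government's raw row
# format onto a common schema (city, timestamp, indicator, value).
def adapt_madrid(row):
    return {"city": "Madrid", "timestamp": row["fecha"],
            "indicator": row["magnitud"], "value": float(row["valor"])}

def adapt_bilbao(row):
    return {"city": "Bilbao", "timestamp": row["date"],
            "indicator": row["measure"], "value": float(row["reading"])}

ADAPTERS = {"madrid": adapt_madrid, "bilbao": adapt_bilbao}

def integrate(source, raw_csv):
    """Parse one government's CSV export and emit rows in the common schema."""
    reader = csv.DictReader(io.StringIO(raw_csv))
    return [ADAPTERS[source](row) for row in reader]

# Illustrative exports with deliberately different headers
madrid_csv = "fecha,magnitud,valor\n2015-03-01,NO2,41.0\n"
bilbao_csv = "date,measure,reading\n2015-03-01,NO2,38.5\n"
rows = integrate("madrid", madrid_csv) + integrate("bilbao", bilbao_csv)
```

Once every source passes through its adapter, downstream analysis code only ever sees the common schema, which is the property the abstract's integration goal depends on.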
Procedia PDF Downloads 315
2024 Parametric Evaluation for the Optimization of Gastric Emptying Protocols Used in Health Care Institutions
Authors: Yakubu Adamu
Abstract:
The aim of this research was to assess the factors contributing to the need to optimise the gastric emptying protocols used in nuclear medicine and molecular imaging procedures. The objective is to establish whether optimisation is possible and to provide supporting evidence for the current imaging protocols of the gastric emptying examination used in nuclear medicine. The research used selected patient studies with 30 dynamic series for image processing in ImageJ, from which the half-time and the retention fraction were calculated for the 60 × 1-minute, 5-minute, and 10-minute protocols and other sampling intervals. The study IDs were classified by gastric emptying clearance half-time into normal, abnormally fast, and abnormally slow categories. In the normal category, which represents 50% of the gastric emptying image IDs processed, the clearance half-time was within the range of 49.5 to 86.6 minutes of the mean counts. In the abnormally fast category, representing 30% of the image IDs processed, the clearance half-time fell between 21 and 43.3 minutes of the mean counts, and the abnormally slow category, representing 20%, had a clearance half-time of 138.6 minutes of the mean counts. The results indicated that the retention fraction values calculated from the 1-, 5-, and 10-minute sampling curves, together with the measured retention fractions, showed a normal retention fraction of <60% that decreased exponentially with time, evidenced by retention fraction ratios of <10% after 4 hours. The diagnostic category did not change across sampling intervals, suggesting that these calculated values could feasibly be used instead of acquiring the full set of images. The findings suggest that the current gastric emptying protocol can be optimized by acquiring fewer images.
The study recommends that gastric emptying studies be performed with imaging at a minimum of 0, 1, 2, and 4 hours after meal ingestion.
Keywords: gastric emptying, retention fraction, clearance half-time, optimisation, protocol
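The clearance half-time and retention-fraction arithmetic behind the abstract can be made concrete with a minimal sketch, assuming the standard mono-exponential emptying model R(t) = e^(-kt); the abstract does not state which emptying model was fitted, so the model choice and the sample numbers are assumptions.

```python
import math

def clearance_half_time(t1, r1, t2, r2):
    """Half-time (minutes) under a mono-exponential model R(t) = exp(-k t),
    given retention fractions r1, r2 (0-1) measured at times t1, t2 (minutes)."""
    k = (math.log(r1) - math.log(r2)) / (t2 - t1)  # decay constant from two points
    return math.log(2) / k

def retention(t, half_time):
    """Retention fraction remaining at time t for a given half-time."""
    return 0.5 ** (t / half_time)

# Illustrative: retention falling from 0.8 at 30 min to 0.4 at 90 min
t_half = clearance_half_time(30, 0.8, 90, 0.4)  # 60 minutes
late_retention = retention(240, t_half)          # retention at 4 hours
```

With a 60-minute half-time (inside the abstract's normal range of 49.5 to 86.6 minutes), the model predicts a retention fraction of about 6% at 4 hours, consistent with the reported <10% ratios after 4 hours.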
Procedia PDF Downloads 4
2023 Faculty Use of Geospatial Tools for Deep Learning in Science and Engineering Courses
Authors: Laura Rodriguez Amaya
Abstract:
Advances in science, technology, engineering, and mathematics (STEM) are viewed as important to countries' national economies and their capacity to compete in the global economy. However, many countries experience low numbers of students entering these disciplines. To strengthen the professional STEM pipeline, it is important that students are retained in these disciplines at university. Scholars agree that to retain students in universities' STEM degrees, STEM course content must show the relevance of these academic fields to students' daily lives. By increasing students' understanding of the importance of these degrees and careers, students' motivation to remain in these academic programs can also increase. An effective way to make STEM content relevant to students' lives is the use of geospatial technologies and geovisualization in the classroom. The Geospatial Revolution, and the science and technology associated with it, has provided scientists and engineers with an enormous amount of data about Earth and Earth systems. These data can be used in the classroom to support instruction and make content relevant to all students. The purpose of this study was to find out the prevalence of geospatial technologies and geovisualization as teaching practices at a university in the USA. The Teaching Practices Inventory survey, a modified version of the Carl Wieman Science Education Initiative Teaching Practices Inventory, was selected for the study. Faculty in the STEM disciplines who participated in a summer learning institute at a 4-year university in the USA constituted the population selected for the study. One of the summer learning institute's main purposes was to improve the teaching of STEM courses, particularly the gateway courses taken by many STEM majors. The sample for the study comprises 97.5 percent of the total number of summer learning institute participants.
Basic descriptive statistics computed with the Statistical Package for the Social Sciences (SPSS) were used to find out: 1) the percentage of faculty using geospatial technologies and geovisualization; 2) whether faculty members' departmental affiliation affected their use of geospatial tools; and 3) whether the number of years in a teaching capacity affected their use of geospatial tools. Findings indicate that only 10 percent of respondents had used geospatial technologies, and 18 percent had used geospatial visualization. In addition, the use of geovisualization was spread across faculty of more disciplines than the use of geospatial technologies, which was concentrated in the engineering departments. The data seem to indicate a lack of incorporation of geospatial tools in STEM education. The use of geospatial tools is an effective way to engage students in deep STEM learning. Future research should examine the effect on student learning and retention in science and engineering programs when geospatial tools are used.
Keywords: engineering education, geospatial technology, geovisualization, STEM
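The first descriptive statistic above, the percentage of faculty using each tool, amounts to a simple proportion over the survey records. A minimal sketch follows; the survey records and the 10%/20% figures it produces are illustrative, not the study's actual data or results.

```python
# Hypothetical survey records: (department, uses_geospatial_tech, uses_geovisualization).
responses = [
    ("Engineering", True,  True),
    ("Engineering", False, False),
    ("Biology",     False, True),
    ("Biology",     False, False),
    ("Chemistry",   False, False),
    ("Chemistry",   False, False),
    ("Mathematics", False, False),
    ("Mathematics", False, False),
    ("Physics",     False, False),
    ("Physics",     False, False),
]

def percent_using(records, field):
    """Percentage of respondents whose tuple entry at index `field` is True."""
    return 100.0 * sum(1 for r in records if r[field]) / len(records)

tech_pct = percent_using(responses, 1)  # share using geospatial technologies
viz_pct = percent_using(responses, 2)   # share using geovisualization
```

Grouping the same records by the department column would answer the study's second question (departmental affiliation versus tool use) in the same spirit as an SPSS crosstab.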
Procedia PDF Downloads 252
2022 Evaluation of Neonicotinoids Against Sucking Insect Pests of Cotton in Laboratory and Field Conditions
Authors: Muhammad Sufyan, Muhammad D. Gogi, Muhammad Arshad, Ahmad Nawaz, Muhammad Usman
Abstract:
Cotton (Gossypium hirsutum), universally known as silver fiber, is one of the most important cash crops of Pakistan. A wide array of pests constrains cotton production, among which sucking insect pests cause serious losses. New-chemistry insecticides are mostly used to control a wide variety of insect pests, including sucking pests. In the present study, the efficacy of different neonicotinoids against sucking insect pests of cotton was evaluated in the field and, for the red and dusky cotton bugs, in the laboratory. The experiment was conducted at the Entomology Research Station, University of Agriculture Faisalabad, in a Randomized Complete Block Design (RCBD). A field trial evaluated the efficacy of Confidence Ultra (imidacloprid) 70% SL, Confidor (imidacloprid) 20% SL, Kendo (lambda-cyhalothrin) 24.7 SC, Actara (thiamethoxam) 25% WG, Forcast (tebufenozide + emamectin benzoate) 8.8 EW, and Timer (emamectin benzoate) 1.9 EC at their recommended doses. Pre-treatment data were collected on a per-leaf basis for thrips, aphid, jassid, and whitefly 24 hours before spraying. Post-treatment data were recorded after 24, 48, and 72 hours. Fresh, non-infested, untreated cotton leaves were collected from the field and brought to the laboratory to assess the efficacy of the neonicotinoids against the red and dusky cotton bugs. After data analysis, all the insecticides were found effective against sucking pests. Confidence Ultra was highly effective against the aphid, jassid, and whitefly and gave maximum mortality, while showing non-significant results against thrips. For aphids, the plot treated with Kendo 24.7 SC showed significant mortality 72 hours after pesticide application.
Similar trends were found under laboratory conditions, where different concentrations of these treatments had a significant impact on dusky cotton bug and red cotton bug populations 24, 48, and 72 hours after application.
Keywords: cotton, laboratory and field conditions, neonicotinoids, sucking insect pests
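The abstract does not say how mortality was computed, but in insecticide bioassays of this kind, observed treatment mortality is commonly corrected for natural mortality in the untreated control using Abbott's formula. The sketch below shows that standard correction as an assumption, with illustrative percentages, not the study's data.

```python
def abbott_corrected_mortality(treated_pct, control_pct):
    """Abbott's formula: treatment mortality corrected for natural (control)
    mortality, both expressed in percent (0-100)."""
    return 100.0 * (treated_pct - control_pct) / (100.0 - control_pct)

# Illustrative: 80% mortality in a treated plot, 20% in the untreated control
corrected = abbott_corrected_mortality(80.0, 20.0)  # 75.0% corrected mortality
```

The correction matters when comparing treatments at 24, 48, and 72 hours, since natural mortality in the control also accumulates over time.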
Procedia PDF Downloads 242
2021 The Integrated Methodological Development of Reliability, Risk and Condition-Based Maintenance in the Improvement of the Thermal Power Plant Availability
Authors: Henry Pariaman, Iwa Garniwa, Isti Surjandari, Bambang Sugiarto
Abstract:
The availability of a complex system such as a thermal power plant is strongly influenced by the reliability of spare parts and by maintenance management policies. The reliability-centered maintenance (RCM) technique is an established method of analysis and the main reference for maintenance planning. This method considers the consequences of failure, but it does not address the further risk of downtime associated with failures, loss of production, or high maintenance costs. The risk-based maintenance (RBM) technique provides strategies to minimize the risks posed by failure when deriving maintenance tasks, with cost-effectiveness in mind. Meanwhile, condition-based maintenance (CBM) focuses on condition monitoring, which allows maintenance or other action to be planned and scheduled to avoid the risk of failure before time-based maintenance is due. In thermal power plants, RCM, RBM, and CBM are applied either individually or in combinations of two techniques (RCM with RBM, or RCM with CBM). Implementing the three techniques in an integrated maintenance approach will increase the availability of thermal power plants compared with using the techniques individually or in pairs. This study uses reliability-, risk-, and condition-based maintenance in an integrated manner to increase the availability of thermal power plants. The method generates an MPI (Priority Maintenance Index), computed as the RPN (Risk Priority Number) multiplied by the RI (Risk Index), and an FDT (Failure Defense Task), which can generate condition monitoring and assessment tasks in addition to maintenance tasks. Both the MPI and the FDT, obtained from the development of a functional tree, failure mode and effects analysis, fault-tree analysis, and risk analysis (risk assessment and risk evaluation), were then used to develop and implement maintenance plans and schedules, condition monitoring and assessment, and, ultimately, an availability analysis.
The results of this study indicate that reliability-, risk-, and condition-based maintenance methods, applied in an integrated manner, can increase the availability of thermal power plants.
Keywords: integrated maintenance techniques, availability, thermal power plant, MPI, FDT
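The priority computation the abstract defines, MPI = RPN × RI, can be sketched as follows. The abstract does not spell out how the RPN is scored, so the sketch assumes the conventional FMEA scoring (severity × occurrence × detection on 1-10 scales); the failure modes and scores are hypothetical.

```python
def rpn(severity, occurrence, detection):
    """Risk Priority Number, assuming conventional FMEA 1-10 scales."""
    return severity * occurrence * detection

def mpi(severity, occurrence, detection, risk_index):
    """Priority Maintenance Index as defined in the study: MPI = RPN * RI."""
    return rpn(severity, occurrence, detection) * risk_index

# Hypothetical failure modes ranked by MPI, highest priority first
modes = {
    "boiler tube leak": mpi(9, 4, 3, 0.8),  # 108 * 0.8 = 86.4
    "pump seal wear":   mpi(5, 6, 2, 0.4),  # 60  * 0.4 = 24.0
    "sensor drift":     mpi(3, 7, 5, 0.2),  # 105 * 0.2 = 21.0
}
ranked = sorted(modes, key=modes.get, reverse=True)
```

The ranking is the point of the index: multiplying the RPN by the risk index reorders failure modes so that maintenance, monitoring, and assessment tasks are scheduled for the highest-MPI modes first.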
Procedia PDF Downloads 794
2020 Examination of the South African Fire Legislative Framework
Authors: Mokgadi Julia Ngoepe-Ntsoane
Abstract:
The article aims to make a case for a legislative framework for the fire sector in South Africa. A robust legislative framework is essential for empowering those with an obligatory mandate within the sector. This article contributes to the body of knowledge in the field of policy reviews, particularly with regard to the legal framework, where scholarly contributions have over time been limited. Document analysis was the methodology selected for the investigation of the various legal frameworks existing in the country. The analysis established that no national legislation on the fire industry exists in South Africa. The documents analysed revealed that the sector is dominated by cartels who exploit new entrants to the market, particularly SMEs. These cartels are monopolising the system: having long operated within it, they have turned it into self-owned entities. Commitment to addressing the challenges faced by fire services, and to creating a framework for the evolving role that fire brigade services are expected to play in building safer and more sustainable communities, is vital. Legislation for the fire sector ought to be concluded with immediate effect. The outdated national fire legislation has enabled the monopolisation and manipulation of the system by dominant organisations, causing painful discrimination against, and exploitation of, the smaller service providers trying to enter the market. These barriers to entry have long-term negative effects on national priority areas such as employment creation and poverty reduction. The monopolisation and marginalisation practices of cartels in the sector call for urgent attention by government because, if left unattended, they will leave many people, particularly women and youth, disadvantaged and frustrated. The downcast syndrome exercised within the fire sector has wreaked havoc and is devastating.
This is caused by cartels that have been in the sector for some time and know the strengths and weaknesses of its processes, the shortcuts, and the advantages and consequences of various actions. They take advantage of new entrants to the sector, who in turn find it difficult to manoeuvre, find the market dissonant, and end up giving up their good ideas and intentions. Many pieces of legislation are industry-specific, such as those for housing, forestry, agriculture, health, security, and the environment, and are used to regulate systems within the institutions involved. Other regulations exist as by-laws guiding management within the municipalities.
Keywords: sustainable job creation, growth and development, transformation, risk management
Procedia PDF Downloads 175