Search results for: simulation tools
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8423

1193 Design and Analysis of Deep Excavations

Authors: Barham J. Nareeman, Ilham I. Mohammed

Abstract:

Excavations in urban developed areas are generally supported by deep excavation walls such as diaphragm walls, bored piles, soldier piles, and sheet piles. In some cases, these walls may be braced by internal braces or tie-back anchors. Tie-back anchors are by far the predominant method for wall support; the large working space inside the excavation provided by a tie-back anchor system is a significant construction advantage. This paper analyzes a deep excavation bracing system consisting of a contiguous pile wall braced by pre-stressed tie-back anchors, part of a large residential building project located in Gaziantep province, Turkey. The contiguous pile wall will be constructed with a length of 270 m and consists of 285 piles, each having a diameter of 80 cm and a center-to-center spacing of 95 cm. The deformation analysis was carried out with the finite element tool PLAXIS. In the analysis, the beam element method, together with an elastic-perfectly plastic soil model and the Hardening Soil model, was used to design the contiguous pile wall, the tie-back anchor system, and the soil. The two soil clusters, a limestone and a filled soil, were modelled with both the Hardening Soil and Mohr-Coulomb models. In accordance with the basic design, both soil clusters are modelled under drained conditions. The simulation results show that the maximum horizontal movement of the walls and the maximum settlement of the ground are consistent with 300 individual case histories, ranging between 1.2 mm and 2.3 mm for the walls and between 6.5 mm and 15 mm for the settlements. It was concluded that a tied-back contiguous pile wall can be satisfactorily modelled using the Hardening Soil model.

Keywords: deep excavation, finite element, pre-stressed tie back anchors, contiguous pile wall, PLAXIS, horizontal deflection, ground settlement

Procedia PDF Downloads 228
1192 Energy Efficient Autonomous Lower Limb Exoskeleton for Human Motion Enhancement

Authors: Nazim Mir-Nasiri, Hudyjaya Siswoyo Jo

Abstract:

The paper describes the conceptual design, control strategies, and partial simulation of a new, fully autonomous lower limb wearable exoskeleton system for human motion enhancement that can support its own weight and increase the wearer's strength and endurance. Various problems remain to be solved, the most important of which is the creation of a power- and cost-efficient system that allows an exoskeleton to operate for extended periods without the batteries being frequently recharged. The designed exoskeleton is able to decouple the weight/mass-carrying function of the system from the forward motion function, which reduces the power and size of the propulsion motors and thus the overall weight and cost of the system. The decoupling takes place by blocking the motion at the knee joint with a passive air cylinder placed across the joint. The cylinder is actuated when the knee angle reaches the minimum allowed bending value; the value of the minimum bending angle depends on the usual walking style of the subject. The mechanism of the exoskeleton features a seat to rest the subject's body weight on at the moment the knee joint motion is blocked. The mechanical structure of each leg has six degrees of freedom: four at the hip, one at the knee, and one at the ankle. The exoskeleton legs are attached to the subject's legs using flexible cuffs. The operation of all actuators depends on the readings of the foot pressure sensors and the knee angle sensor. The sensor readings depend on the actual posture of the subject and can be classified into three distinct cases: the subject stands on one leg, the subject stands still on both legs, or the subject stands on both legs while transferring weight from one leg to the other. This exoskeleton is power efficient because the electric motors are smaller and do not participate in supporting the weight, as they do in other existing exoskeleton designs.

Keywords: energy efficient system, exoskeleton, motion enhancement, robotics

Procedia PDF Downloads 350
1191 BSL-2/BSL-3 Laboratory for Diagnosis of Pathogens on the Colombia-Ecuador Border Region: A Post-COVID Commitment to Public Health

Authors: Anderson Rocha-Buelvas, Jaqueline Mena Huertas, Edith Burbano Rosero, Arsenio Hidalgo Troya, Mauricio Casas Cruz

Abstract:

COVID-19 has been a disruptive pandemic for the public health and economic systems of entire countries, including Colombia. Nariño Department lies in the southwest of the country, on the border with Ecuador, and constantly faces a demographic transition that affects infections between the two countries. In Nariño, early routine diagnosis of SARS-CoV-2, which can be handled at BSL-2, has shaped the transmission dynamics of COVID-19. However, new emerging and re-emerging viruses with biological flexibility, classified as Risk Group 3 agents, can take advantage of epidemiological opportunities, generating the need to increase clinical diagnostic capacity, mainly in border regions between countries. The overall objective of this project was to assure the quality of the analytical process in the diagnosis of high-biological-risk pathogens in Nariño by building a laboratory that includes biosafety level (BSL)-2 and BSL-3 containment zones. The delimitation of zones was carried out according to the Verification Tool of the National Health Institute of Colombia and following the standard requirements for the competence of testing and calibration laboratories of the International Organization for Standardization. This is achieved by harmonizing methods and equipment for effective and durable diagnosis of the large-scale spread of highly pathogenic microorganisms, employing negative-pressure containment and UV systems in accordance with a finely controlled electrical system, and PCR systems as new diagnostic tools, which increases laboratory capacity. Protection in BSL-3 zones will separate the handling of potentially infectious aerosols within the laboratory from the community and the environment. It will also allow the handling and inactivation of samples with suspected pathogens and the extraction of molecular material from them, enabling research on high-risk pathogens such as SARS-CoV-2, influenza, respiratory syncytial virus, and malaria, among others. The diagnosis of these pathogens will be articulated across the spectrum of basic, applied, and translational research, with a capacity of about 60 samples daily. It is expected that this project will be articulated with the health policies of neighboring countries to increase research capacity.

Keywords: medical laboratory science, SARS-CoV-2, public health surveillance, Colombia

Procedia PDF Downloads 65
1190 Analysis and Design of Inductive Power Transfer Systems for Automotive Battery Charging Applications

Authors: Wahab Ali Shah, Junjia He

Abstract:

Transferring electrical power without wiring has been a dream since the late 19th century. Early advances in this area were made in microwave power transmission. However, the subject has recently become very attractive due to practical systems. There are low-power applications, such as charging the batteries of contactless toothbrushes or implanted devices, and higher-power applications, such as charging the batteries of electric automobiles or buses. In the first group of applications, operating frequencies are in the microwave range, while the frequency is lower in high-power applications. In the latter, the concept is also called inductive power transfer. The aim of this paper is to give an overview of inductive power transfer for electric vehicles, with special attention to coil design and power converter simulation for static charging. Coil design is one of the most critical tasks, since it determines how efficiently and safely power is transferred. Power converters are used on both sides of the system: the converter on the primary side generates a high-frequency voltage to excite the primary coil, while the converter on the secondary side rectifies the voltage transferred from the primary to charge the battery. In this paper, an inductive power transfer system is studied. Inductive power transfer is a promising technology with several possible applications. The operating principles of these systems are explained, and the components of the system are described. Finally, a single-phase 2 kW system was simulated, and the results are presented. The work presented in this paper is an introduction to the concept. A modified compensation network based on the traditional inductor-capacitor-inductor (LCL) topology is proposed to achieve a robust response to the large coupling variations that are common in dynamic wireless charging applications. In future work, this type of compensation should be studied further, and different compensation topologies should be compared at the same power level.
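The basic compensation idea discussed above — choosing a capacitance that resonates with the coil inductance at the operating frequency, with the coupling coefficient capturing pad alignment — can be sketched as follows. The coil inductances, coupling coefficients, and 85 kHz operating frequency are illustrative assumptions, not values from the paper:

```python
import math

def series_compensation_capacitance(L, f0):
    """Capacitance that resonates inductance L at frequency f0: C = 1 / (w0^2 * L)."""
    w0 = 2.0 * math.pi * f0
    return 1.0 / (w0 ** 2 * L)

def mutual_inductance(k, L1, L2):
    """Mutual inductance from the coupling coefficient: M = k * sqrt(L1 * L2)."""
    return k * math.sqrt(L1 * L2)

# Hypothetical coil values for a static-charging pad
L1, L2, f0 = 120e-6, 120e-6, 85e3            # inductances [H], frequency [Hz]
C1 = series_compensation_capacitance(L1, f0) # primary compensation capacitor [F]
M_strong = mutual_inductance(0.30, L1, L2)   # well-aligned pads
M_weak = mutual_inductance(0.10, L1, L2)     # misaligned pads (dynamic charging)
```

The threefold spread between M_strong and M_weak is the kind of coupling variation the proposed LCL-based network is meant to tolerate.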

Keywords: coil design, contactless charging, electrical automobiles, inductive power transfer, operating frequency

Procedia PDF Downloads 225
1189 Conduction Transfer Functions for the Calculation of Heat Demands in Heavyweight Facade Systems

Authors: Mergim Gasi, Bojan Milovanovic, Sanjin Gumbarevic

Abstract:

Better energy performance of the building envelope is one of the most important aspects of energy savings if the goals set by the European Union are to be achieved. Dynamic heat transfer simulations are used for the calculation of building energy consumption because they give more realistic energy demands than stationary calculations, which do not take the building's thermal mass into account. The software used for these dynamic simulations relies on analytical models, since numerical models are impractical for longer periods. The analytical models used in this research fall into the category of conduction transfer functions (CTFs). The two methods for calculating CTFs covered by this research are the Laplace method and the state-space method. The literature review showed that the main disadvantage of these methods is that they are inadequate for heavyweight façade elements and for the shorter time steps used in the calculation. The algorithms for both the Laplace and state-space methods were implemented in Mathematica, and the results are compared to results from EnergyPlus and TRNSYS, since these programs use similar algorithms for the calculation of a building's energy demand. This research aims to check the efficiency of the Laplace and state-space methods for calculating the energy demand of heavyweight building elements with shorter sampling times, and it also provides the means for improving the algorithms used by these methods. As the reference for the boundary heat flux density, the finite difference method (FDM) is used. Even though dynamic heat transfer simulations are superior to calculations based on stationary boundary conditions, they have their limitations and will give unsatisfactory results if not properly used.
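As a rough illustration of the FDM reference mentioned above, a single homogeneous wall layer can be marched to steady state with an explicit finite-difference scheme and the boundary heat flux density read off. The concrete-like material properties, layer thickness, and boundary temperatures below are illustrative assumptions, not values from the paper:

```python
# Minimal explicit finite-difference (FTCS) reference for 1-D transient
# conduction through a homogeneous wall layer.

def simulate_wall(k=2.0, rho=2400.0, cp=880.0, thickness=0.2,
                  nodes=21, t_in=20.0, t_out=0.0, hours=48.0):
    """March T(x, t) forward in time with fixed surface temperatures."""
    alpha = k / (rho * cp)              # thermal diffusivity [m^2/s]
    dx = thickness / (nodes - 1)
    dt = 0.4 * dx * dx / alpha          # below the stability limit dx^2 / (2*alpha)
    steps = int(hours * 3600.0 / dt)
    T = [t_in] + [0.5 * (t_in + t_out)] * (nodes - 2) + [t_out]
    r = alpha * dt / (dx * dx)
    for _ in range(steps):
        T = ([t_in] +
             [T[i] + r * (T[i - 1] - 2.0 * T[i] + T[i + 1])
              for i in range(1, nodes - 1)] +
             [t_out])
    flux = k * (T[0] - T[1]) / dx       # inner-surface heat flux density [W/m^2]
    return T, flux
```

After 48 hours the profile of this heavyweight layer is essentially linear, so the flux approaches the stationary value k * (t_in - t_out) / thickness; the CTF methods are judged by how well they reproduce this flux at shorter sampling times.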

Keywords: Laplace method, state-space method, conduction transfer functions, finite difference method

Procedia PDF Downloads 106
1188 Calculating Asphaltenes Precipitation Onset Pressure by Using Cardanol as Precipitation Inhibitor: A Strategy to Increment the Oil Well Production

Authors: Camilo A. Guerrero-Martin, Erik Montes Paez, Marcia C. K. Oliveira, Jonathan Campos, Elizabete F. Lucas

Abstract:

Asphaltene precipitation is considered a formation damage problem that can reduce the oil recovery factor. It fouls piping and surface installations, causes serious flow assurance complications, and reduces oil well production. Therefore, researchers have shown an interest in chemical treatments to control this phenomenon. The aim of this paper is to assess the asphaltene precipitation onset of crude oils in the presence of cardanol by titrating the crude with n-heptane. Moreover, based on the results obtained at atmospheric pressure, the asphaltene precipitation onset pressure was calculated to predict asphaltene precipitation in the reservoir, using differential liberation and refractive index data of the oils. The influence of cardanol concentration on the asphaltene stabilization of three Brazilian crude oil samples (with similar API densities) was studied. Formulations of cardanol in toluene were prepared at 0, 3, 5, 10, and 15 m/m% and added to the crude at a 2:98 ratio. The petroleum samples were characterized by API density, elemental analysis, and differential liberation tests. The asphaltene precipitation onset (APO) was determined by titrating with n-heptane and monitoring with near-infrared (NIR) spectroscopy. UV-Vis spectroscopy experiments were also done to assess the precipitated asphaltene content. The asphaltene precipitation envelopes (APE) were also determined by numerical simulation (Multiflash). In addition, adequate artificial lift systems (ALS) for the oils were selected based on the downhole well profile and a screening methodology. Finally, the oil flow rates were modelled by NODAL production system analysis in the PIPESIM software. The results of this study show that the asphaltene precipitation onsets of the crude oils were 2.2, 2.3, and 6.0 mL of n-heptane/g of oil. The cardanol was an effective inhibitor of asphaltene precipitation for the crude oils used in this study, since it displaces the precipitation pressure of the oil to lower values. This indicates that cardanol can increase oil well productivity.

Keywords: asphaltenes, NODAL analysis production system, precipitation onset pressure, inhibitory molecule

Procedia PDF Downloads 154
1187 Implementation of an Induction Programme to Help the International Medical Graduates in the NHS

Authors: Mohammad K. Rumjaun, Sana Amjed, Muhammad A. Ghazi, Safa G. Attar, Jason Raw

Abstract:

Background: The National Health Service (NHS) in England is one of the leading healthcare systems in the world, and it relies heavily on the recruitment of overseas doctors; 30.7% of the doctors currently serving in the NHS are overseas graduates. Most of these doctors do not receive the essential induction required to work in the NHS when they first arrive, and therefore they tend to struggle to work effectively in the first few months of their new jobs compared to UK graduates. In our hospital, the clinical need for a dedicated induction programme for International Medical Graduates (IMGs) during their initial settling-in period was identified, and this programme was designed to meet it. Methods: A questionnaire was designed for the previous 7 IMGs (Group 1) in order to identify the difficulties they faced in their initial phase. Thereafter, an induction programme consisting of presentations explaining the NHS and hospital framework, communication skills practice sessions, the clinical ceiling of care, and patient simulation training was implemented for 6 new IMGs (Group 2). Another survey was then carried out and compared with the previous one. Results: After this programme, Group 2 required only 1 week to understand the complexity of the IT systems, compared to 3 weeks for Group 1. 83% of Group 2 felt well supported for their on-call duties after the programme, compared to 29% of Group 1, and 100% of Group 2 were aware of their role in the job after the induction, compared to 0%. Furthermore, Group 2 members were able to function independently and confidently in their roles after only 1 month, compared to an average of 3 months for Group 1. After running the PDSA cycles, our results show clear evidence that this programme has greatly benefitted the IMGs in settling into the NHS. The IMGs appreciated this initiative and have given positive feedback. Conclusion: Leaving your home country to begin your career in a different country is not an easy transition, and undoubtedly everyone struggles. It is important to invest in a well-structured induction programme for IMGs in the initial phase of their jobs, as this will improve not only their confidence and efficacy but also patient safety.

Keywords: induction programme, international medical graduates, NHS, overseas doctors' struggles

Procedia PDF Downloads 110
1186 Professional Development in EFL Classroom: Motivation and Reflection

Authors: Iman Jabbar

Abstract:

Within the scope of professionalism, and in order to compete in the modern world, teachers are expected to develop their teaching skills and activities in addition to their professional knowledge. At the college level, teachers should be able to face classroom challenges through engagement with the learning situation in order to understand the students and their needs. In the field of TESOL, the role of the English teacher is no longer restricted to teaching English texts; rather, teachers should endeavor to enhance students' skills such as communication and critical analysis. Within the literature on professionalism, there are certain strategies and tools that an English teacher should adopt to develop competence and performance. Reflective practice, which is an exploratory process, is one of these strategies. Another strategy contributing to classroom development is motivation. It is crucial in students' learning, as it affects the quality of learning English in the classroom and determines success or failure as well as language achievement. This is a qualitative study grounded in the interpretive perspectives of teachers and students regarding the process of professional development. The study aims at (a) understanding how teachers at the college level conceptualize reflective practice and motivation inside the EFL classroom, and (b) exploring the methods and strategies that they implement to practice reflection and motivation. The study is based on two questions: 1. How do EFL teachers perceive and view reflection and motivation in relation to their teaching and professional development? 2. How can reflective practice and motivation be developed into practical strategies and actions in EFL teachers' professional context? The study is organized into two parts, theoretical and practical. The theoretical part reviews the literature on the concepts of reflective practice and motivation in relation to professional development by providing definitions, theoretical models, and strategies. The practical part draws on the theoretical one; however, it is the core of the study, since it deals with the research design, methodology, methods of data collection, sampling, and data analysis. It ends with an overall discussion of the findings and the researcher's reflections on the investigated topic. In terms of significance, the study is intended to contribute to the field of TESOL at the academic level through the selection of the topic and its investigation from theoretical and practical perspectives. Professional development is the path that leads to enhancing the quality of teaching English as a foreign or second language in a way that suits the modern trends of globalization and advanced technology.

Keywords: professional development, motivation, reflection, learning

Procedia PDF Downloads 424
1185 Development of Innovative Nuclear Fuel Pellets Using Additive Manufacturing

Authors: Paul Lemarignier, Olivier Fiquet, Vincent Pateloup

Abstract:

In line with the strong desire of nuclear energy players to have ever more effective products in terms of safety, research programs on Enhanced Accident Tolerant Fuels (E-ATF), which are more resilient, particularly to the loss of coolant, have been launched in all countries with nuclear power plants. Among the multitude of solutions being developed internationally, the French Alternative Energies and Atomic Energy Commission (CEA) and its partners are investigating a promising option: CERMET (CERamic-METal) fuel pellets made of a matrix of fissile material, uranium dioxide (UO₂), which has a low thermal conductivity, and a metallic phase with a high thermal conductivity to improve heat evacuation. Work has focused on the development by powder metallurgy of micro-structured CERMETs, characterized by networks of metallic phase embedded in the UO₂ matrix. Other types of macro-structured CERMETs, based on concepts proposed by thermal simulation studies, have been developed with a metallic phase of specific geometry to optimize heat evacuation. This solution could not be produced using traditional processes, so additive manufacturing, which revolutionizes traditional design principles, is used to produce these innovative prototype concepts. At CEA Cadarache, work was first carried out on a non-radioactive surrogate material, alumina, in order to build up skills and develop the equipment, in particular the robocasting machine, an additive manufacturing technique selected for its simplicity and the possibility of optimizing the paste formulations. A manufacturing chain was set up, covering paste production, 3D printing of the pellets, and the associated thermal post-treatment. The work leading to the first fabrications of macro-structured alumina/molybdenum CERMETs will be presented. This work was carried out with the support of Framatome and EDF.

Keywords: additive manufacturing, alumina, CERMET, molybdenum, nuclear safety

Procedia PDF Downloads 55
1184 Linguistic Analysis of Argumentation Structures in Georgian Political Speeches

Authors: Mariam Matiashvili

Abstract:

Argumentation is an integral part of our daily communication, formal or informal. Argumentative reasoning, techniques, and language tools are used both in personal conversations and in the business environment. Verbalizing opinions requires the use of particular syntactic-pragmatic structures - arguments that add credibility to a statement. The study of argumentative structures allows us to identify the linguistic features that make a text argumentative, and knowing what elements make up an argumentative text in a particular language helps the users of that language improve their skills. Natural language processing (NLP) has also become especially relevant recently; in this context, one of the main emphases is on the computational processing of argumentative texts, which will enable the automatic recognition and analysis of large volumes of textual data. The research deals with the linguistic analysis of the argumentative structures of Georgian political speeches - in particular, the linguistic structure, characteristics, and functions of the parts of an argumentative text: claims, support statements, and attack statements. The research aims to describe the linguistic cues that give a sentence a judgmental or controversial character and that help to identify the reasoning parts of an argumentative text. The empirical data come from the Georgian Political Corpus, particularly TV debates; consequently, the texts are of a dialogical nature, representing a discussion between two or more people (most often between a journalist and a politician).
The research uses the following approaches to identify and analyze the argumentative structures. Lexical classification and analysis: identify lexical items that are relevant in the process of creating argumentative texts, and build a lexicon of argumentation (groups of words gathered from a semantic point of view). Grammatical analysis and classification: grammatically analyze the words and phrases identified on the basis of the arguing lexicon. Argumentation schemes: describe and identify the argumentation schemes most likely used in Georgian political speeches. As a final step, we analyzed the relations between the above-mentioned components. For example, if an identified argument scheme is "Argument from Analogy", the identified lexical items semantically express analogy too, and in Georgian they are most likely adverbs. As a result, we created a lexicon of the words that play a significant role in creating Georgian argumentative structures. The linguistic analysis showed that verbs play a crucial role in creating argumentative structures.
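The lexicon-matching step described above can be sketched as a toy cue matcher. The cue words here are hypothetical English stand-ins, not entries from the authors' Georgian arguing lexicon:

```python
# Toy sketch of the lexical-classification step: flag sentences that
# contain argumentative cue words from a small role-organized lexicon.

ARGUING_LEXICON = {
    "claim": {"must", "should", "clearly", "undoubtedly"},
    "support": {"because", "since", "therefore", "evidence"},
    "attack": {"however", "but", "wrong", "contrary"},
}

def label_sentence(sentence):
    """Return the set of argumentative roles whose cues appear in the sentence."""
    tokens = {t.strip(".,!?;:").lower() for t in sentence.split()}
    return {role for role, cues in ARGUING_LEXICON.items() if tokens & cues}
```

For example, label_sentence("We must act now because the evidence is clear.") flags both a claim cue and support cues; a real system would add the grammatical analysis and scheme identification stages on top of this.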

Keywords: Georgian, argumentation schemes, argumentation structures, argumentation lexicon

Procedia PDF Downloads 53
1183 Examining Reading Comprehension Skills Based on Different Reading Comprehension Frameworks and Taxonomies

Authors: Seval Kula-Kartal

Abstract:

Developing students’ reading comprehension skills is an aim that is difficult to accomplish and requires long-term, systematic teaching and assessment processes. In these processes, teachers need tools that provide guidance on what reading comprehension is and which comprehension skills they should develop. Due to the lack of clear, evidence-based frameworks defining reading comprehension skills, especially in Turkiye, teachers and students mostly follow classroom processes without a clear idea of what their comprehension goals are and what those goals mean. Since teachers and students do not have a clear view of the comprehension targets or of the strengths and weaknesses in students’ comprehension skills, formative feedback processes cannot be managed effectively. It is believed that detecting and defining influential comprehension skills may provide guidance to both teachers and students during the feedback process. Therefore, in the current study, some of the reading comprehension frameworks that define comprehension skills operationally were examined. The aim of the study is to develop a simple and clear framework that can be used by teachers and students during their teaching, learning, assessment, and feedback processes. The current study is qualitative research in which documents related to reading comprehension skills were analyzed. The study group therefore consisted of resources and frameworks that have made major contributions to the theoretical and operational definition of reading comprehension. A content analysis was conducted on the resources included in the study group. To determine the validity of the themes and sub-categories revealed by the content analysis, three educational assessment experts were asked to examine the content analysis results. The Fleiss’ kappa coefficient revealed that there is consistency among the themes and categories defined by the three experts.
The content analysis of the reading comprehension frameworks revealed that comprehension skills can be examined under four themes. The first and second themes focus on understanding information given explicitly or implicitly within a text. The third theme includes the skills readers use to make connections between their personal knowledge and the information given in the text. Lastly, the fourth theme focuses on the skills readers use to examine the text with a critical view. The results suggest that fundamental reading comprehension skills can be examined under four themes, and teachers are recommended to use these themes in their reading comprehension teaching and assessment processes. Acknowledgment: This research is supported by the Pamukkale University Scientific Research Unit within the project titled Developing a Reading Comprehension Rubric.
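A minimal sketch of the Fleiss' kappa statistic used above to check agreement among the experts; the rating matrix in the example is made up for illustration, not the study's coding data:

```python
# Fleiss' kappa for agreement among a fixed number of raters.
# ratings[i][j] = number of raters who assigned item i to category j.

def fleiss_kappa(ratings):
    n_items = len(ratings)
    n_raters = sum(ratings[0])                       # raters per item (constant)
    # Mean observed per-item agreement P_bar
    p_bar = sum(
        (sum(c * c for c in row) - n_raters) / (n_raters * (n_raters - 1))
        for row in ratings
    ) / n_items
    # Chance agreement P_e from the marginal category proportions
    totals = [sum(row[j] for row in ratings) for j in range(len(ratings[0]))]
    p_e = sum((t / (n_items * n_raters)) ** 2 for t in totals)
    return (p_bar - p_e) / (1 - p_e)

# Three raters, two categories, perfect agreement on both items
perfect = [[3, 0], [0, 3]]
```

Perfect agreement yields kappa = 1, while agreement at or below chance drives kappa toward 0 or negative values, which is the basis for interpreting the experts' consistency.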

Keywords: reading comprehension, assessing reading comprehension, comprehension taxonomies, educational assessment

Procedia PDF Downloads 61
1182 Integrated Coastal Management for the Sustainable Development of Coastal Cities: The Case of El-Mina, Tripoli, Lebanon

Authors: G. Ghamrawi, Y. Abunnasr, M. Fawaz, S. Yazigi

Abstract:

Coastal cities are constantly exposed to environmental degradation and economic regression, fueled by rapid, uncontrolled urban growth and continuous resource depletion. This is the case in the city of El-Mina in Tripoli, Lebanon, where a lack of awareness of the need to preserve social, ecological, and historical assets, coupled with increasing development pressure, is threatening the socioeconomic status of the city's residents, the quality of life, and accessibility to the coast. To address these challenges, a holistic coastal urban design and planning approach was developed to analyze the environmental, political, legal, and socioeconomic context of the city. This approach investigates the potential of balancing urban development with the protection and enhancement of cultural, ecological, and environmental assets under an integrated coastal zone management (ICZM) approach. The analysis of El-Mina's different sectors drew on several tools, including direct field observation, interviews with stakeholders, and analysis of available data, historical maps, and previously proposed projects. The findings from the analysis were mapped and graphically represented, allowing the recognition of character zones that became the design intervention units. Consequently, the thesis proposes an urban, city-scale intervention that identifies six character zones (the historical fishing port, Abdul Wahab island, the abandoned Port Said, Hammam el Makloub, the sand beach, and the new developable area) and proposes context-specific design interventions that capitalize on the main characteristics of each zone. Moreover, the intervention builds on the institutional framework of ICZM as well as on other studies previously conducted for the coast, and adopts nature-based solutions with hybrid systems to provide better environmental design solutions for developing the coast. This enables the realization of an all-inclusive, well-connected shoreline with easy, free access to the sea; a developed shoreline with an active local economy; and an improved urban environment.

Keywords: blue green infrastructure, coastal cities, hybrid solutions, integrated coastal zone management, sustainable development, urban planning

Procedia PDF Downloads 127
1181 Orbit Determination from Two Position Vectors Using Finite Difference Method

Authors: Akhilesh Kumar, Sathyanarayan G., Nirmala S.

Abstract:

An unusual approach is developed to determine the orbit of satellites/space objects. The determination of orbits is considered a boundary value problem and has been solved using the finite difference method (FDM). Only positions of the satellites/space objects are known at two end times taken as boundary conditions. The technique of finite difference has been used to calculate the orbit between end times. In this approach, the governing equation is defined as the satellite's equation of motion with a perturbed acceleration. Using the finite difference method, the governing equations and boundary conditions are discretized. The resulting system of algebraic equations is solved using Tri Diagonal Matrix Algorithm (TDMA) until convergence is achieved. This methodology test and evaluation has been done using all GPS satellite orbits from National Geospatial-Intelligence Agency (NGA) precise product for Doy 125, 2023. Towards this, two hours of twelve sets have been taken into consideration. Only positions at the end times of each twelve sets are considered boundary conditions. This algorithm is applied to all GPS satellites. Results achieved using FDM compared with the results of NGA precise orbits. The maximum RSS error for the position is 0.48 [m] and the velocity is 0.43 [mm/sec]. Also, the present algorithm is applied on the IRNSS satellites for Doy 220, 2023. The maximum RSS error for the position is 0.49 [m], and for velocity is 0.28 [mm/sec]. Next, a simulation has been done for a Highly Elliptical orbit for DOY 63, 2023, for the duration of 6 hours. The RSS of difference in position is 0.92 [m] and velocity is 1.58 [mm/sec] for the orbital speed of more than 5km/sec. Whereas the RSS of difference in position is 0.13 [m] and velocity is 0.12 [mm/sec] for the orbital speed less than 5km/sec. Results show that the newly created method is reliable and accurate. 
Further applications of the developed methodology include missile and spacecraft targeting, orbit design (mission planning), space rendezvous and interception, space debris correlation, and navigation solutions.
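After discretization, the boundary value formulation described above reduces to a tridiagonal linear system. As a minimal illustrative sketch (not the authors' code), the Tri-Diagonal Matrix Algorithm (Thomas algorithm) named in the abstract can be written as follows; the coefficient arrays here are generic placeholders, not the actual discretized equation of motion:

```python
def tdma(a, b, c, d):
    """Solve a tridiagonal system with the Thomas algorithm (TDMA).
    a: sub-diagonal (a[0] unused), b: main diagonal,
    c: super-diagonal (c[-1] unused), d: right-hand side.
    Returns the solution vector x as a list."""
    n = len(b)
    cp = [0.0] * n  # modified super-diagonal (forward sweep)
    dp = [0.0] * n  # modified right-hand side (forward sweep)
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n   # back substitution
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x
```

In the orbit problem, one such solve would be performed per state component per iteration until the converged trajectory between the two boundary positions is obtained.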

Keywords: finite difference method, grid generation, NavIC system, orbit perturbation

Procedia PDF Downloads 60
1180 Production of Rhamnolipids from Different Resources and Estimating the Kinetic Parameters for Bioreactor Design

Authors: Olfat A. Mohamed

Abstract:

Rhamnolipid biosurfactants have distinct properties that give them importance in many industrial applications, especially their promising future applications in the cosmetic and pharmaceutical industries. These applications have encouraged the search for diverse and renewable resources to control the cost of production. The experimental results were then applied to find a suitable mathematical model for obtaining the design criteria of a batch bioreactor. This research aims to produce rhamnolipids from different oily wastewater sources, such as petroleum crude oil (PO) and vegetable oil (VO), by using Pseudomonas aeruginosa ATCC 9027. Different concentrations of PO and VO were added to the broth media separately, at (0.5, 1, 1.5, 2, 2.5% v/v) and (2, 4, 6, 8, 10% v/v), respectively. The effects of the initial concentration of the oil residues, and of the addition of glycerol and palmitic acid as inducers, on rhamnolipid production and on the surface tension of the broth were investigated. It was found that 2% PO waste and 6% VO waste were the best initial substrate concentrations for rhamnolipid production (2.71 and 5.01 g rhamnolipid/l, respectively). Addition of glycerol (10-20% v glycerol/v PO) to the 2% PO fermentation broth increased rhamnolipid production about 1.8- to 2-fold. However, the addition of palmitic acid (5 and 10 g/l) to fermentation broth containing 6% VO barely enhanced the production rate. The experimental data for an initial 2% PO were used to estimate the various kinetic parameters.
The following results were obtained: maximum reaction rate (Vmax = 0.06417 g/l·hr), yield of cell weight per unit weight of substrate utilized (Yx/s = 0.324 g Cx/g Cs), maximum specific growth rate (μmax = 0.05791 hr⁻¹), yield of rhamnolipid weight per unit weight of substrate utilized (Yp/s = 0.2571 g Cp/g Cs), maintenance coefficient (Ms = 0.002419), Michaelis-Menten constant (Km = 6.1237 gmol/l), and endogenous decay coefficient (Kd = 0.002375 hr⁻¹). Predictive parameters and advanced mathematical models were applied to evaluate the batch bioreactor time. The results were 123.37, 129 and 139.3 hours with respect to microbial biomass, substrate and product concentration, respectively, compared with an experimental batch time of 120 hours in all cases. The proposed mathematical models are compatible with the laboratory results and can therefore be considered tools for expressing the actual system.
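To illustrate how such kinetic parameters translate into a batch time estimate, the sketch below integrates a deliberately simplified Monod growth model (maintenance and decay terms omitted for brevity, unlike the fuller models in the abstract). The rate and yield constants are the reported values; the initial biomass and substrate concentrations are illustrative assumptions, not data from the study:

```python
def batch_monod(mu_max, ks, yxs, x0, s0, dt=0.1, t_end=200.0):
    """Euler integration of simplified Monod batch growth:
    dX/dt = mu(S) X,  dS/dt = -mu(S) X / Yxs.
    Stops when 99% of the substrate is consumed or t_end is reached."""
    x, s, t = x0, s0, 0.0
    while t < t_end and s > 0.01 * s0:
        mu = mu_max * s / (ks + s)
        dx = mu * x * dt
        ds = -(mu * x / yxs) * dt
        x, s = x + dx, max(s + ds, 0.0)
        t += dt
    return t, x, s

# Kinetic constants from the abstract; x0 and s0 are illustrative guesses.
t_batch, x_f, s_f = batch_monod(mu_max=0.05791, ks=6.1237, yxs=0.324,
                                x0=0.1, s0=20.0)
```

With these assumed initial conditions the predicted batch time falls in the same general range as the 120-139 hour figures reported above, though the simplification means the numbers should not be read as a reproduction of the authors' model.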

Keywords: batch bioreactor design, glycerol, kinetic parameters, petroleum crude oil, Pseudomonas aeruginosa, rhamnolipids biosurfactants, vegetable oil

Procedia PDF Downloads 110
1179 Temperature-Based Detection of Initial Yielding Point in Loading of Tensile Specimens Made of Structural Steel

Authors: Aqsa Jamil, Tamura Hiroshi, Katsuchi Hiroshi, Wang Jiaqi

Abstract:

The yield point represents the upper limit of the forces which can be applied to a specimen without causing any permanent deformation. After yielding, the behavior of the specimen changes suddenly, including the possibility of cracking or buckling, so the accumulation of damage and the type of fracture depend on this condition. As it is difficult to accurately detect the yield points of the several stress concentration points in structural steel specimens, an effort has been made in this research work to develop a convenient technique using thermography (temperature-based detection) during tensile tests for the precise detection of yield point initiation. To verify the applicability of the thermography camera, tests were conducted under different loading conditions, measuring the deformation with various strain gauges and monitoring the surface temperature with a thermography camera. The yield point of the specimens was estimated with the help of the temperature dip that occurs, due to the thermoelastic effect, at the onset of plastic deformation. The scattering of the data was checked by performing a repeatability analysis. The effects of ambient temperature and of the light source were checked by carrying out tests at daytime as well as at midnight and by calculating the signal-to-noise ratio (SNR) of the noisy data from the infrared thermography camera; it can be concluded that the camera is independent of testing time and of the presence of a visible light source. Furthermore, a fully coupled thermal-stress analysis was performed using the Abaqus/Standard exact implementation technique to validate the temperature profiles obtained from the thermography camera and to check the feasibility of numerical simulation for predicting the results extracted with the thermographic technique.
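The SNR check described above can be sketched in a few lines; this is a generic power-ratio definition, not the authors' processing pipeline, and the sample values below are illustrative:

```python
import math

def snr_db(signal, noise):
    """Signal-to-noise ratio in decibels from two sample lists:
    10 * log10(P_signal / P_noise), taking power as the mean square."""
    p_sig = sum(v * v for v in signal) / len(signal)
    p_noise = sum(v * v for v in noise) / len(noise)
    return 10.0 * math.log10(p_sig / p_noise)
```

A daytime and a midnight temperature trace yielding similar SNR values would support the conclusion that the camera is insensitive to the visible light source.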

Keywords: signal to noise ratio, thermoelastic effect, thermography, yield point

Procedia PDF Downloads 80
1178 Creation of Computerized Benchmarks to Facilitate Preparedness for Biological Events

Authors: B. Adini, M. Oren

Abstract:

Introduction: Communicable diseases and pandemics pose a growing threat to the well-being of the global population. A vital component of protecting public health is the creation and sustenance of continuous preparedness for such hazards. A joint Israeli-German task force was deployed in order to develop an advanced tool for self-evaluation of emergency preparedness for various types of biological threats. Methods: Based on a comprehensive literature review and interviews with leading content experts, an evaluation tool was developed based on quantitative and qualitative parameters and indicators. A modified Delphi process was used to achieve consensus among over 225 experts from Germany and Israel concerning the items to be included in the evaluation tool. The validity and applicability of the tool for medical institutions were examined in a series of simulation and field exercises. Results: Over 115 German and Israeli experts reviewed and examined the proposed parameters as part of the modified Delphi cycles. A consensus of over 75% of the experts was attained for 183 out of 188 items. The relative importance of each parameter was rated as part of the Delphi process in order to define its impact on overall emergency preparedness. The parameters were integrated into computerized web-based software that enables the calculation of emergency preparedness scores for biological events. Conclusions: The parameters developed in the joint German-Israeli project serve as benchmarks that delineate the actions to be implemented in order to create and maintain ongoing preparedness for biological events. The computerized evaluation tool enables continuous monitoring of the level of readiness, so that strengths and gaps can be identified and corrected appropriately. Adoption of such a tool is recommended as an integral component of quality assurance of public health and safety.
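The 75% consensus threshold used in the modified Delphi process can be expressed as a simple filter over expert votes. The sketch below is an illustrative reading of that rule, not the task force's actual software; item names and vote encodings (1 = include, 0 = exclude) are assumptions:

```python
def delphi_consensus(ratings, threshold=0.75):
    """Return the items whose share of 'include' votes meets the
    consensus threshold (e.g. the 75% cutoff used in the abstract).
    ratings maps item name -> list of 0/1 expert votes."""
    reached = []
    for item, votes in ratings.items():
        agreement = sum(votes) / len(votes)
        if agreement >= threshold:
            reached.append(item)
    return reached
```

Applied to the full parameter set, such a filter would reproduce the reported outcome of 183 of 188 items reaching consensus.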

Keywords: biological events, emergency preparedness, bioterrorism, natural biological events

Procedia PDF Downloads 402
1177 Culture Dimensions of Information Systems Security in Saudi Arabia National Health Services

Authors: Saleh Alumaran, Giampaolo Bella, Feng Chen

Abstract:

The study of organisations’ information security cultures has attracted scholars as well as the healthcare services industry to research the topic and find appropriate tools and approaches to develop a positive culture. The vast majority of studies in the Saudi national health services concern the use of technology to protect and secure health services information. On the other hand, there is a lack of research on the role and impact of an organisation’s cultural dimensions on information security. This research investigated and analysed the role and impact of cultural dimensions on information security in the Saudi Arabian health service. Hypotheses were tested and two surveys were carried out in order to collect data and information from three major hospitals in Saudi Arabia (SA). The first survey identified the main cultural-dimension problems in SA health services and developed an initial information security culture framework model. The second survey evaluated and tested the developed framework model to test its usefulness, reliability and applicability. The model is based on human behaviour theory, where the individual’s attitude is the key element of the individual’s intention to behave as well as of his or her actual behaviour. The research identified six cultural dimensions: Saudi national culture, Saudi health service leadership, employees’ trust, technology, multicultural interactions and employees’ job roles. The research also identified a set of cultural sub-dimensions: working values and norms, tribe values and norms, attitudes towards women, power sharing, vision, social interaction, respect and understanding, the hospital intranet, the language(s) used by hospital employees, multi-national culture, the communication system, employees’ job satisfaction and job security.
The research found that (a) human behaviour towards medical information in SA is one of the main threats to information security and one of the main challenges to the SA health authority; (b) the current state of SA hospitals’ information security cultures falls short of protecting medical information, owing to the current values and norms towards information security; and (c) Saudi national culture and employees’ job roles are the dimensions playing the major roles in employees’ attitudes, while technology is the least important dimension influencing those attitudes.

Keywords: cultural dimension, electronic health record, information security, privacy

Procedia PDF Downloads 333
1176 Probabilistic Building Life-Cycle Planning as a Strategy for Sustainability

Authors: Rui Calejo Rodrigues

Abstract:

Building refurbishment and maintenance is a major area of knowledge that is ultimately left to user/occupant criteria. Optimizing the service life of a building requires a special background, as it is one of those concepts that needs proficiency to be implemented. ISO 15686-2, Buildings and constructed assets - Service life planning - Part 2: Service life prediction procedures, specifies a factorial method based on deterministic data for the life span of building components. A deterministic approach has major consequences: users/occupants cannot perceive when a component reaches the end of its life span, so they simply act on fixed periods, and the resulting costly, resource-consuming solutions do not meet the global targets of planet sustainability. With an estimated two thousand million conventional buildings in the world, submitting them to a probabilistic method for service life planning rather than a deterministic one would provide immense resource savings. Since 1989, the research team, now known as CEES (Center for Building in Service Studies), has developed a methodology based on the Monte Carlo method for a probabilistic approach to the life span of building components, their cost and their service-life care time spans. The research question deals with the importance of a probabilistic approach to building life planning compared with deterministic methods. The mathematical model developed for the probabilistic building lifespan approach is presented, and experimental data are obtained and compared with deterministic data. Assuming that a building's lifecycle depends largely on component replacement, this methodology allows conclusions about the global impact of fixed-replacement methodologies such as those resulting from the use of deterministic models. Major conclusions based on the conventional-building estimate are presented and evaluated from a sustainability perspective.
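The Monte Carlo idea described above can be sketched as a replacement-count simulation. This is a minimal illustration, not the CEES model: it assumes a normally distributed component service life, and the mean life, spread, cost and planning horizon below are placeholder values:

```python
import random

def monte_carlo_replacements(mean_life, sd_life, unit_cost,
                             horizon=50, n_runs=10000, seed=42):
    """Monte Carlo estimate of the expected number of replacements of one
    building component over a planning horizon (years), assuming its
    service life is drawn from N(mean_life, sd_life), clipped at 1 year."""
    random.seed(seed)
    total_repl = 0
    for _ in range(n_runs):
        t = 0.0
        while True:
            life = max(random.gauss(mean_life, sd_life), 1.0)
            t += life
            if t > horizon:
                break  # component outlives the remaining horizon
            total_repl += 1
    mean_repl = total_repl / n_runs
    return mean_repl, mean_repl * unit_cost
```

Contrasting this expectation with the deterministic count (horizon divided by a fixed life span) is precisely the kind of comparison the abstract argues yields resource savings.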

Keywords: building components life cycle, building maintenance, building sustainability, Monte Carlo simulation

Procedia PDF Downloads 188
1175 Micro-Milling Process Development of Advanced Materials

Authors: M. A. Hafiz, P. T. Matevenga

Abstract:

Micro-level machining of metals is a developing field which has been shown to be a promising approach to producing features on parts in the range of a few to a few hundred microns with acceptable machining quality. It is known that the mechanics (i.e., the material removal mechanism) of micro-machining and conventional machining differ significantly due to the scaling effects associated with tool geometry, tool material and workpiece material characteristics. Shape memory alloys (SMAs) are metal alloys that display two exceptional properties: pseudoelasticity and the shape memory effect (SME). Nickel-titanium (NiTi) alloys are one such unique class of metal alloys. NiTi alloys are known to be difficult to cut, specifically by conventional machining techniques, due to their particular properties. Their high ductility, high degree of strain hardening, and unusual stress-strain behaviour are the main properties responsible for their poor machinability in terms of tool wear and workpiece quality. The motivation of this research work was to address the challenges and issues of micro-machining combined with those of machining NiTi alloys, which can affect the desired performance level of the machining outputs. To explore the significance of a range of cutting conditions on surface roughness and tool wear, machining tests were conducted on NiTi. The influence of different cutting conditions and cutting tools on surface and sub-surface deformation in the workpiece was investigated. A design-of-experiments strategy (L9 array) was applied to determine the key process variables, and the dominant cutting parameters were determined by analysis of variance. These findings showed that feed rate was the dominant factor for surface roughness, whereas depth of cut was found to be the dominant factor for tool wear.
The lowest surface roughness was achieved at a feed rate equal to the cutting edge radius, whereas the lowest flank wear was observed at the lowest depth of cut. Repeated machining trials have yet to be carried out in order to observe tool life, sub-surface deformation and strain-induced hardening, which are also expected to be among the critical issues in the micro-machining of NiTi. The machining performance using different cutting fluids and strategies has yet to be studied.
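The first step of an L9-array analysis, averaging the response at each level of a factor, can be sketched as follows. This is a generic main-effects computation, not the authors' analysis; the level codes and response values in the usage example are invented for illustration:

```python
def main_effects(levels, responses):
    """Taguchi-style main-effect analysis for one factor: the average
    response (e.g. surface roughness) at each level of that factor.
    'levels' gives the factor level used in each experimental run."""
    means = {}
    for lv in set(levels):
        vals = [r for l, r in zip(levels, responses) if l == lv]
        means[lv] = sum(vals) / len(vals)
    return means
```

A factor whose level means differ widely (relative to the others) is the dominant one, which is the qualitative conclusion the subsequent ANOVA formalizes.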

Keywords: nickel titanium, micro-machining, surface roughness, machinability

Procedia PDF Downloads 322
1174 Energy System Analysis Using Data-Driven Modelling and Bayesian Methods

Authors: Paul Rowley, Adam Thirkill, Nick Doylend, Philip Leicester, Becky Gough

Abstract:

The dynamic performance of all energy generation technologies is affected to varying degrees by the stochastic properties of the wider system within which the generation technology is located. This stochasticity can include the varying nature of ambient renewable energy resources such as wind or solar radiation, or unpredicted changes in energy demand which impact the operational behaviour of thermal generation technologies. An understanding of these stochastic impacts is especially important in contexts such as highly distributed (or embedded) generation, where the individual or aggregated performance of large numbers of relatively small generators matters, as in ESCO projects. Probabilistic evaluation of monitored or simulated performance data is one technique which can provide insight into the dynamic performance characteristics of generating systems, both in a prognostic sense (such as the prediction of future performance at the project’s design stage) and in a diagnostic sense (such as the real-time analysis of underperforming systems). In this work, we describe the development, application and outcomes of a new approach to the acquisition of datasets suitable for use in subsequent performance and impact analysis (including the use of Bayesian approaches) for a number of distributed generation technologies. The application of the approach is illustrated using case studies involving domestic and small commercial scale photovoltaic, solar thermal and natural gas boiler installations, and the results show that the methodology offers significant advantages in plant efficiency prediction and diagnosis, along with allied environmental and social impacts such as greenhouse gas emission reduction and fuel affordability.
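One simple form a Bayesian performance analysis can take is a conjugate Beta-Binomial update over monitored pass/fail observations. This is a generic sketch of the idea, not the authors' method; the notion of a generator "meeting its performance target" on a given day is an illustrative framing:

```python
def beta_update(alpha, beta, successes, failures):
    """Conjugate Beta-Binomial update: given a Beta(alpha, beta) prior on
    the probability that a monitored generator meets its performance
    target, fold in observed on-target (successes) and off-target
    (failures) periods to get the posterior parameters."""
    return alpha + successes, beta + failures

def beta_mean(alpha, beta):
    """Posterior mean of a Beta(alpha, beta) distribution."""
    return alpha / (alpha + beta)
```

Starting from a flat Beta(1, 1) prior, each new batch of monitoring data sharpens the estimate, supporting both the prognostic and the diagnostic uses described above.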

Keywords: renewable energy, dynamic performance simulation, Bayesian analysis, distributed generation

Procedia PDF Downloads 471
1173 Hybrid Knowledge and Data-Driven Neural Networks for Diffuse Optical Tomography Reconstruction in Medical Imaging

Authors: Paola Causin, Andrea Aspri, Alessandro Benfenati

Abstract:

Diffuse Optical Tomography (DOT) is an emergent medical imaging technique which employs NIR light to estimate the spatial distribution of optical coefficients in biological tissues for diagnostic purposes, in a noninvasive and non-ionizing manner. DOT reconstruction is a severely ill-conditioned problem due to the prevalent scattering of light in the tissue. In this contribution, we present our research on adopting hybrid knowledge-driven/data-driven approaches which exploit well-assessed physical models and build neural networks upon them to integrate the available data. Namely, since in this context regularization procedures are mandatory to obtain a reasonable reconstruction [1], we explore the use of neural networks as tools to include prior information on the solution. 2. Materials and Methods: The idea underlying our approach is to leverage neural networks to solve PDE-constrained inverse problems of the form q* = argmin_q D(y, ỹ), (1) where D is a loss function which typically contains a discrepancy measure (or data fidelity) term plus other possible ad-hoc designed terms enforcing specific constraints. In the context of inverse problems like (1), one seeks the optimal set of physical parameters q given the set of observations y. Moreover, ỹ is the computable approximation of y, which may be obtained from a neural network but also in a classic way via the resolution of a PDE with given input coefficients (forward problem, Fig. 1). Due to the severe ill-conditioning of the reconstruction problem, we adopt a two-fold approach: i) we restrict the solutions (optical coefficients) to lie in a lower-dimensional subspace generated by auto-decoder type networks.
This procedure forms priors on the solution (Fig. 1); ii) we use regularization procedures of the type q̂* = argmin_q D(y, ỹ) + R(q), where R(q) is a regularization functional depending on regularization parameters which can be fixed a priori or learned via a neural network in a data-driven modality. To further improve the generalizability of the proposed framework, we also infuse physics knowledge via soft penalty constraints (Fig. 1) in the overall optimization procedure. 3. Discussion and Conclusion: DOT reconstruction is severely hindered by ill-conditioning. The combined use of data-driven and knowledge-driven elements is beneficial and allows improved results to be obtained, especially with a restricted dataset and in the presence of variable sources of noise.
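The regularized formulation argmin_q D(y, ỹ) + R(q) can be illustrated with a toy linear forward model, a quadratic data-fidelity term, and Tikhonov regularization R(q) = λ‖q‖². This is a deliberately simplified stand-in for the DOT problem (the real forward map is a PDE solve and the prior may be learned), with a plain gradient-descent solver:

```python
def tikhonov_gd(A, y, lam, iters=500, lr=0.1):
    """Gradient descent on ||A q - y||^2 + lam * ||q||^2, a toy analogue
    of the regularized reconstruction D(y, y~) + R(q). A is a dense
    matrix (list of rows) standing in for the forward operator."""
    m, n = len(A), len(A[0])
    q = [0.0] * n
    for _ in range(iters):
        # residual r = A q - y
        r = [sum(A[i][j] * q[j] for j in range(n)) - y[i] for i in range(m)]
        # gradient of the fidelity term plus the Tikhonov term
        for j in range(n):
            g = 2.0 * sum(A[i][j] * r[i] for i in range(m)) + 2.0 * lam * q[j]
            q[j] -= lr * g
    return q
```

For an identity forward operator, the closed-form minimizer is q = y / (1 + λ), which the iteration recovers; in the hybrid framework of the abstract, R(q) or the subspace containing q would instead be supplied by a network.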

Keywords: inverse problem in tomography, deep learning, diffuse optical tomography, regularization

Procedia PDF Downloads 54
1172 Chemical Kinetics and Computational Fluid-Dynamics Analysis of H2/CO/CO2/CH4 Syngas Combustion and NOx Formation in a Micro-Pilot-Ignited Supercharged Dual Fuel Engine

Authors: Ulugbek Azimov, Nearchos Stylianidis, Nobuyuki Kawahara, Eiji Tomita

Abstract:

A chemical kinetics and computational fluid-dynamics (CFD) analysis was performed to evaluate the combustion of syngas derived from biomass and coke-oven solid feedstock in a micro-pilot-ignited supercharged dual-fuel engine under lean conditions. For this analysis, a new reduced syngas chemical kinetics mechanism was constructed and validated by comparing the ignition delay and laminar flame speed data with those obtained from experiments and other detailed chemical kinetics mechanisms available in the literature. A reaction sensitivity analysis was conducted for ignition delay at elevated pressures in order to identify the important chemical reactions that govern the combustion process. The chemical kinetics of NOx formation was analyzed for H2/CO/CO2/CH4 syngas mixtures by using counterflow burner and premixed laminar flame speed reactor models. The new mechanism showed very good agreement with experimental measurements and accurately reproduced the effects of pressure, temperature and equivalence ratio on NOx formation. In order to identify the species important for NOx formation, a sensitivity analysis was conducted at pressures of 4, 10 and 16 bar and a preheat temperature of 300 K. The results show that NOx formation is driven mostly by hydrogen-based species, while other species, such as N2, CO2 and CH4, also have important effects on combustion. Finally, the new mechanism was used in a multidimensional CFD simulation to predict the combustion of syngas in a micro-pilot-ignited supercharged dual-fuel engine, and the results were compared with experiments. The mechanism gave the closest prediction of the in-cylinder pressure and the rate of heat release (ROHR).
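The temperature and pressure dependence at the heart of such a mechanism rests on Arrhenius-type rate expressions. As a minimal, generic illustration (the pre-exponential factor and activation energy below are placeholder values, not coefficients from the mechanism described in the abstract):

```python
import math

def arrhenius(A, Ea, T, R=8.314):
    """Arrhenius rate constant k = A * exp(-Ea / (R * T)),
    with Ea in J/mol, T in K, and R in J/(mol*K)."""
    return A * math.exp(-Ea / (R * T))

def temp_sensitivity(Ea, T, R=8.314):
    """Normalized temperature sensitivity d(ln k)/d(ln T) = Ea / (R * T),
    the quantity that sensitivity analyses rank reactions by."""
    return Ea / (R * T)
```

Reactions with a large Ea/(R T) dominate the ignition-delay sensitivity at a given temperature, which is the kind of ranking the abstract's sensitivity analysis produces for the elevated-pressure cases.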

Keywords: syngas, chemical kinetics mechanism, internal combustion engine, NOx formation

Procedia PDF Downloads 387
1171 Cultural Competence and Healthcare Challenges of Migrants in South Wales United Kingdom

Authors: Qirat Naz, Abasiokpon Udoakah

Abstract:

In developed countries, global migration is diversifying. The minority ethnic population includes refugees and asylum seekers who fled their home countries due to war, terrorism, oppression, or natural disasters and for whom returning home is dangerous. They need sanctuary and a peaceful environment in the host countries. They begin the process of acculturation, in which a person adopts the social mores and behavioral patterns of the dominant culture, yet they still have unique multicultural needs that the dominant society fails to address. The aim of this research is to provide a holistic understanding of the lived experiences of a minority population, particularly migrants, including asylum seekers and refugees, in the health and social care system of South Wales. The study pursues three research objectives: the multicultural health care needs of minorities; the barriers to accessing health and social care facilities; and, given the Welsh policies for promoting cultural competence in the health and social care sectors, the implications and impact of these policies on the target population. The study will be conducted using qualitative research methods, tools, and techniques, taking an inductive approach to arrive at a grounded theory. The sample will be divided into two groups, migrants and professionals providing any kind of service to migrants, each containing 30 participants. Interpretive phenomenological analysis will be utilized during the process of coding and developing the main themes of this research. The positionality of the researcher will be minimized through unloaded and open-ended questions, the researcher’s work experience in research, continuous evaluation of her positionality, daily reflection on fieldwork, and the help of male and female gatekeepers.
The research findings will be based on an emic perspective, and by documenting the emic perspective of minorities, this research will contribute knowledge to the appropriate channels, including organizations, academics, and policymakers, to discover possible solutions and coping mechanisms to deal with the challenges and meet the multicultural demands of minorities. This research will provide a more in-depth understanding of minorities and will help to promote the diversity of health and social care in South Wales.

Keywords: migration, migrants, cultural competence, cultural barriers, healthcare challenges

Procedia PDF Downloads 41
1170 Interventions for Children with Autism Using Interactive Technologies

Authors: Maria Hopkins, Sarah Koch, Fred Biasini

Abstract:

Autism is a lifelong disorder that affects one out of every 110 Americans. The deficits that accompany autism spectrum disorders (ASD), such as abnormal behaviors and social incompetence, often make it extremely difficult for these individuals to gain functional independence from caregivers. These long-term implications necessitate an immediate effort to improve social skills among children with an ASD. Any technology that could teach individuals with ASD the necessary social skills would not only be invaluable for the individuals affected but could also effect massive savings to society in treatment programs. The overall purpose of the first study was to develop, implement, and evaluate an avatar tutor for social skills training in children with ASD. FaceSay was developed as a colorful computer program that contains several different activities designed to teach children specific social skills, such as eye gaze, joint attention, and facial recognition. The children with ASD were asked to attend to FaceSay or a control painting computer game for six weeks. Children with ASD who received the training showed an increase in emotion recognition, F(1, 48) = 23.04, p < 0.001 (adjusted Ms: 8.70 and 6.79, respectively), compared to the control group. In addition, children who received the FaceSay training had higher post-test scores in facial recognition, F(1, 48) = 5.09, p < 0.05 (adjusted Ms: 38.11 and 33.37, respectively), compared to controls. The findings provide information about the benefits of computer-based training for children with ASD. Recent research also suggests the value of using socially assistive robots with children who have an ASD. Researchers investigating robots as tools for therapy in ASD have reported increased engagement, increased levels of attention, and novel social behaviors when robots are part of the social interaction.
The overall goal of the second study was to develop a social robot designed to teach children specific social skills such as emotion recognition. The robot is approachable, with both an animal-like appearance and features of a human face (i.e., eyes, eyebrows, mouth). The feasibility of the robot is being investigated in children ages 7-12 to explore whether the social robot is capable of forming different facial expressions that accurately display emotions similar to those observed in the human face. The findings of this study will be used to create a potentially effective and cost-efficient therapy for improving the cognitive-emotional skills of children with autism. Implications and study findings using the robot as an intervention tool will be discussed.

Keywords: autism, intervention, technology, emotions

Procedia PDF Downloads 357
1169 Engaging the World Bank: Good Governance and Human Rights-Based Approaches

Authors: Lottie Lane

Abstract:

It is habitually assumed and stated that the World Bank should engage and comply with international human rights standards. However, the basis for holding the Bank to such standards is unclear. Most advocates of the idea invoke aspects of international law to argue that the Bank has existing obligations to act in compliance with human rights standards. The Bank itself, however, does not appear to accept such arguments, despite having endorsed the importance of human rights for a considerable length of time. A substantial challenge is that under the current international human rights law framework, the World Bank is considered a non-state actor, and as such, has no direct human rights obligations. In the absence of clear legal duties for the Bank, it is necessary to look at the tools available beyond the international human rights framework to encourage the Bank to comply with human rights standards. This article critically examines several bases for arguing that the Bank should comply and engage with human rights through its policies and practices. Drawing on the Bank’s own ‘good governance’ approach as well as the United Nations’ ‘human rights-based-approach’ to development, a new basis is suggested. First, the relationship between the World Bank and human rights is examined. Three perspectives are considered: (1) the legal position – what the status of the World Bank is under international human rights law, and whether it can be said to have existing legal human rights obligations; (2) the Bank’s own official position – how the Bank envisages its relationship with and role in the protection of human rights; and (3) the relationship between the Bank’s policies and practices and human rights (including how its attitudes are reflected in its policies and how the Bank’s operations impact human rights enjoyment in practice). 
Here, the article focuses on two examples – the (revised) 2016 Environmental and Social Safeguard Policies and the 2012 case study regarding Gambella, Ethiopia. Both examples are widely considered missed opportunities for the Bank to actively engage with human rights. The analysis shows that however much pressure is placed on the Bank to improve its human rights footprint, it is extremely reluctant to do so explicitly, and the legal bases available are insufficient for requiring concrete, ex ante action by the Bank. Instead, the Bank’s own ‘good governance’ approach to development – which it has been advocating since the 1990s – can be relied upon. ‘Good governance’ has been used and applied by many actors in many contexts, receiving numerous different definitions. This article argues that human rights protection can now be considered a crucial component of good governance, at least in the context of development. In doing so, the article explains the relationship and interdependence between the two concepts, and provides three rationales for the Bank to take a ‘human rights-based approach’ to good governance. Ultimately, this article seeks to look beyond international human rights law and take a governance approach to provide a convincing basis upon which to argue that the World Bank should comply with human rights standards.

Keywords: World Bank, international human rights law, good governance, human rights-based approach

Procedia PDF Downloads 332
1168 Tagging a Corpus of Media Interviews with Diplomats: Challenges and Solutions

Authors: Roberta Facchinetti, Sara Corrizzato, Silvia Cavalieri

Abstract:

Increasing interconnection between data digitalization and linguistic investigation has given rise to unprecedented potentialities and challenges for corpus linguists, who need to master IT tools for data analysis and text processing, as well as to develop techniques for efficient and reliable annotation in specific mark-up languages that encode documents in a format that is both human and machine-readable. In the present paper, the challenges emerging from the compilation of a linguistic corpus will be taken into consideration, focusing on the English language in particular. To do so, the case study of the InterDiplo corpus will be illustrated. The corpus, currently under development at the University of Verona (Italy), represents a novelty in terms both of the data included and of the tag set used for its annotation. The corpus covers media interviews and debates with diplomats and international operators conversing in English with journalists who do not share the same lingua-cultural background as their interviewees. To date, this appears to be the first tagged corpus of international institutional spoken discourse and will be an important database not only for linguists interested in corpus analysis but also for experts operating in international relations. In the present paper, special attention will be dedicated to the structural mark-up, parts of speech annotation, and tagging of discursive traits, that are the innovational parts of the project being the result of a thorough study to find the best solution to suit the analytical needs of the data. Several aspects will be addressed, with special attention to the tagging of the speakers’ identity, the communicative events, and anthropophagic. Prominence will be given to the annotation of question/answer exchanges to investigate the interlocutors’ choices and how such choices impact communication. 
Indeed, the automated identification of questions, in relation to the expected answers, is functional to understanding how interviewers elicit information as well as how interviewees provide their answers to fulfill their respective communicative aims. A detailed description of the aforementioned elements will be given using the InterDiplo-Covid19 pilot corpus. Our preliminary analysis of the data will highlight the viable solutions found in the construction of the corpus in terms of XML conversion, metadata definition, tagging system, and discursive-pragmatic annotation to be implemented via the Oxygen XML editor.
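The kind of structural mark-up of question/answer exchanges described above can be sketched with Python's standard ElementTree API. The tag and attribute names below (`exchange`, `speaker`, `qtype`, `atype`) are hypothetical illustrations chosen for this sketch, not the actual InterDiplo tag set:

```python
import xml.etree.ElementTree as ET

# Hypothetical tag and attribute names, for illustration only;
# the real InterDiplo annotation scheme is defined by the project team.
exchange = ET.Element("exchange", n="1")

question = ET.SubElement(exchange, "question", speaker="journalist", qtype="polar")
question.text = "Will the delegation attend the summit?"

answer = ET.SubElement(exchange, "answer", speaker="diplomat", atype="indirect")
answer.text = "We are still reviewing the invitation."

# Serialize the annotated exchange to an XML string.
xml_string = ET.tostring(exchange, encoding="unicode")
print(xml_string)
```

Pairing each `question` with its `answer` inside one parent element is what makes the automated retrieval of question/answer patterns straightforward for later discursive-pragmatic analysis.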

Keywords: spoken corpus, diplomats’ interviews, tagging system, discursive-pragmatic annotation, English linguistics

Procedia PDF Downloads 163
1167 Barriers and Facilitators for Telehealth Use during Cervical Cancer Screening and Care: A Literature Review

Authors: Reuben Mugisha, Stella Bakibinga

Abstract:

The cervical cancer burden is a global threat, but more so in low-income settings, where more than 85% of mortality cases occur due to the lack of sufficient screening programs. There is consequently a lack of early detection of cancer and precancerous cells among women. Studies show that 3% to 35% of deaths could have been avoided through early screening, depending on prognosis, disease progression, and environmental and lifestyle factors. In this study, a systematic literature review is undertaken to understand potential barriers and facilitators, as documented in previous studies that focus on the application of telehealth in cervical cancer screening programs for the early detection of cancer and precancerous cells. The study informs future studies, especially those from low-income settings, about lessons learned from previous studies and how best to prepare when planning to implement telehealth for cervical cancer screening. It further identifies the knowledge gaps in the research area and makes recommendations. Using a specified selection criterion, 15 different articles are analyzed based on each study’s aim, theory or conceptual framework used, method applied, findings, and conclusion. Results are then tabulated and presented thematically to better inform readers about emerging facts on barriers and facilitators to telehealth implementation as documented in the reviewed articles, and how they consequently lead to evidence-informed conclusions relevant to telehealth implementation for cervical cancer screening. Preliminary findings of this study underscore that the use of low-cost mobile colposcopes is an appealing option in cervical cancer screening, particularly when coupled with onsite treatment of suspicious lesions.
These tools relay cervical images to online databases for storage and retrieval, and they permit the integration of connected devices at the point of care to rapidly collect clinical data for further analysis of the prevalence of cervical dysplasia and cervical cancer. Results, however, reveal population sensitization prior to the use of mobile colposcopy among patients, standardization of mobile colposcopy programs across screening partners, sufficient logistics and good connectivity, and experienced experts to review image cases at the point of care as important facilitators to the implementation of the mobile colposcope as a telehealth cervical cancer screening mechanism.

Keywords: cervical cancer screening, digital technology, hand-held colposcopy, knowledge-sharing

Procedia PDF Downloads 199
1166 Impact of Gold Mining on Crop Production, Livelihood and Environmental Sustainability in West Africa in the Context of Water-Energy-Food Nexus

Authors: Yusif Habib

Abstract:

The Volta River Basin (VRB) is a transboundary resource shared by six West African states. Its utilization spans irrigation, hydropower generation, domestic/household water use, transportation, and industrial processing, among others. Simultaneously, mineral resources such as gold are mined within the VRB catchment. Typically, the extraction/mining operation is earth-surface excavation, known as artisanal and small-scale mining. We developed a conceptual framework in the context of the Water-Energy-Food (WEF) nexus to delineate the trade-offs and synergies between the mineral extractive operation’s impact on agricultural systems, specifically cereal crops (e.g., maize, millet, and rice), and the environment (water and soil quality, deforestation, etc.) in the VRB. Thus, the study examined the trade-offs and synergies through the WEF nexus lens to explore the extent of an eventual overarching preference for gold exploration, with its high economic returns, over the presumably low yearly harvest and household income from food crop production, in order to inform intervention prioritization. Field surveys (household, expert, and stakeholder consultation), bibliometric analysis/literature review, and scenario and simulation models, including land-use land-cover (LULC) analyses, were conducted. The selected study areas in Ghana were locations where the mineral extractive operation is widespread and co-exists with agricultural systems. Overall, the study proposes mechanisms for a virtuous cycle through the WEF nexus instead of the presumably existing vicious cycle, to inform decision making and policy implementation.

Keywords: agriculture, environmental sustainability, gold mining, synergies, trade-off, water-energy-food nexus

Procedia PDF Downloads 135
1165 CRISPR-Mediated Genome Editing for Yield Enhancement in Tomato

Authors: Aswini M. S.

Abstract:

Tomato (Solanum lycopersicum L.) is one of the most significant vegetable crops in terms of its economic benefits; both fresh and processed tomatoes are consumed. Tomatoes have a limited genetic base, which makes breeding extremely challenging. Plant breeding has become much simpler and more effective with genome editing tools such as CRISPR and the CRISPR-associated protein 9 (CRISPR/Cas9), which address the problems of traditional breeding, chemical/physical mutagenesis, and transgenics. With the use of CRISPR/Cas9, a number of tomato traits have been functionally characterized and edited. These traits include plant architecture and flower characters (leaf, flower, male sterility, and parthenocarpy), fruit ripening, quality and nutrition (lycopene, carotenoid, GABA, TSS, and shelf life), disease resistance (late blight, TYLCV, and powdery mildew), tolerance to abiotic stress (heat, drought, and salinity), and resistance to herbicides. This study explores the potential of CRISPR/Cas9 genome editing for enhancing yield in tomato plants. CRISPR/Cas9 has been used to demonstrate the de novo domestication of elite traits from wild relatives into cultivated tomatoes and vice versa. Cas9-mediated editing of the CycB (lycopene beta-cyclase) gene increased the lycopene content in tomato, and Cas9-mediated editing of the AGL6 (Agamous-like 6) gene resulted in parthenocarpic fruit development under heat-stress conditions. The advent of CRISPR/Cas has made it possible to use digital resources for single guide RNA design and multiplexing, cloning (such as Golden Gate cloning, GoldenBraid, etc.), creating robust CRISPR/Cas constructs, and implementing effective transformation protocols such as Agrobacterium-mediated delivery and the DNA-free protoplast method for Cas9-gRNA ribonucleoprotein (RNP) complexes.
Additionally, homologous recombination (HR)-based gene knock-in via geminivirus replicons and base/prime editing (Target-AID technology) remain possible. Hence, CRISPR/Cas facilitates fast and efficient breeding for the improvement of tomatoes.
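The single guide RNA design step mentioned above rests on a simple rule for SpCas9: a 20-nt protospacer must sit immediately 5′ of an NGG PAM. A minimal sketch of that scan, on a toy sequence (not an actual tomato gene; real guide design tools also score off-targets, GC content, and secondary structure):

```python
import re

def find_sgrna_candidates(seq, guide_len=20):
    """Return (position, guide, PAM) tuples for SpCas9 NGG PAM sites on the
    forward strand of `seq`. Illustrative only: dedicated design tools also
    evaluate off-targets, GC content, and guide secondary structure."""
    seq = seq.upper()
    candidates = []
    # Zero-width lookahead so overlapping NGG sites are all found.
    for m in re.finditer(r"(?=([ACGT])GG)", seq):
        pam_start = m.start()
        if pam_start >= guide_len:  # need a full 20-nt guide upstream of the PAM
            guide = seq[pam_start - guide_len:pam_start]
            pam = seq[pam_start:pam_start + 3]
            candidates.append((pam_start - guide_len, guide, pam))
    return candidates

# Toy DNA fragment, hypothetical and for illustration only.
toy_seq = "ATGCCGTACGTTAGCATCGGATCCTTAGGCATCGATCGTACGGATTGG"
for pos, guide, pam in find_sgrna_candidates(toy_seq):
    print(pos, guide, pam)
```

In practice this forward-strand scan would be repeated on the reverse complement, and the surviving candidates ranked by the scoring criteria noted in the docstring.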

Keywords: CRISPR-Cas, biotic and abiotic stress, flower and fruit traits, genome editing, polygenic trait, tomato and trait introgression

Procedia PDF Downloads 48
1164 DFT and SCAPS Analysis of an Efficient Lead-Free Inorganic CsSnI₃ Based Perovskite Solar Cell by Modification of Hole Transporting Layer

Authors: Seyedeh Mozhgan Seyed Talebi, Chih-Hao Lee

Abstract:

With an abrupt rise in the power conversion efficiency (PCE) of perovskite solar cells (PSCs) within a short span of time, the toxicity of lead was raised as a major hurdle on the path toward their commercialization. In the present research, a systematic investigation of the electrical and optical characteristics of the all-inorganic CsSnI₃ perovskite absorber layer was performed with the Vienna Ab initio Simulation Package (VASP) using the projector-augmented wave method. The presence of an inorganic halide perovskite offers the advantages of enhancing the degradation resistance of the device, reducing the cost of cells, and minimizing the recombination of generated carriers. The standard device, simulated using the one-dimensional solar cell capacitance simulator (SCAPS-1D) version 3.3.08, comprises an FTO/n-TiO₂/CsSnI₃ perovskite absorber/Spiro-OMeTAD HTL/Au contact layer. The variation in key device design parameters, such as the thickness and defect density of the perovskite absorber, hole transport layer, and electron transport layer, as well as interfacial defects, is examined along with the impact on the photovoltaic characteristic parameters. The effect of an increase in operating temperature from 300 K to 400 K on the performance of CsSnI₃-based perovskite devices is also investigated. The optimized standard device at room temperature shows the highest PCE of 25.18% with an FF of 75.71%, a Voc of 0.96 V, and a Jsc of 34.67 mA/cm². The outcomes obtained with different inorganic Cu-based HTLs, such as CuSCN, Cu₂O, CuO, CuI, SrCu₂O₂, and CuSbS₂, point to a promising avenue for fabricating high-PCE perovskite devices made of stable, low-cost, efficient, safe, and eco-friendly all-inorganic materials such as the CsSnI₃ perovskite light absorber.
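The reported photovoltaic parameters can be cross-checked with the standard relation PCE = Jsc · Voc · FF / Pin, taking Pin = 100 mW/cm² (the usual AM1.5G assumption, not stated explicitly in the abstract):

```python
def power_conversion_efficiency(jsc_ma_cm2, voc_v, ff_pct, p_in_mw_cm2=100.0):
    """PCE (%) from Jsc (mA/cm^2), Voc (V), and FF (%), assuming an incident
    power density Pin in mW/cm^2 (100 mW/cm^2 for AM1.5G illumination)."""
    return jsc_ma_cm2 * voc_v * (ff_pct / 100.0) / p_in_mw_cm2 * 100.0

# Values reported for the optimized CsSnI3 device at room temperature.
pce = power_conversion_efficiency(jsc_ma_cm2=34.67, voc_v=0.96, ff_pct=75.71)
print(f"{pce:.2f} %")  # prints "25.20 %", consistent with the reported 25.18 %
```

The small residual difference from the quoted 25.18% is attributable to rounding of the reported FF, Voc, and Jsc values.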

Keywords: CsSnI₃, hole transporting layer (HTL), lead-free perovskite solar cell, SCAPS-1D software

Procedia PDF Downloads 58