Search results for: real time digital simulator
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 22659
17169 An Artificial Intelligence Framework to Forecast Air Quality

Authors: Richard Ren

Abstract:

Air pollution is a serious danger to international well-being and economies: it kills an estimated 7 million people every year and will cost world economies $2.6 trillion by 2060 due to sick days, healthcare costs, and reduced productivity. In the United States alone, poor air quality causes 60,000 premature deaths. There is therefore a crucial need for effective methods to forecast air quality, which can mitigate air pollution’s detrimental public health effects and associated costs by helping people plan ahead and avoid exposure. The goal of this study is to propose an artificial intelligence framework for predicting future air quality based on timing variables (e.g., season, weekday/weekend), future weather forecasts, and past pollutant and air quality measurements. The proposed framework runs multiple machine learning algorithms (logistic regression, random forest, neural network) with different specifications and averages the results of the three top-performing models to reduce the inaccuracies, weaknesses, and biases of any one individual model. Over time, the framework uses new data to self-adjust model parameters and increase prediction accuracy. To demonstrate its applicability, a prototype of this framework was created to forecast air quality in Los Angeles, California using datasets from the RP4 weather data repository and EPA pollutant measurement data. The results showed good agreement between the framework’s predictions and real-life observations, with an overall model accuracy of 92%. The combined model predicts more accurately than any of the individual models and reliably forecasts season-based variations in air quality levels. The top air quality predictor variables were identified by measuring the mean decrease in accuracy.
This study proposed and demonstrated the efficacy of a comprehensive air quality prediction framework that leverages multiple machine learning algorithms to overcome individual algorithm shortcomings. Future enhancements should focus on expanding and testing a greater variety of modeling techniques within the framework, testing the framework in different locations, and developing a platform to automatically publish future predictions as a web or mobile application. Accurate predictions from this artificial intelligence framework can in turn be used to save and improve lives by allowing individuals to protect their health and governments to implement effective pollution control measures.
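
The averaging step described above can be sketched as follows; the three toy models and the 0.5 threshold are illustrative assumptions, not the study's actual fitted models:

```python
# Sketch of the ensemble idea: each model returns the probability that
# tomorrow's air quality is "unhealthy", and the framework averages the
# three top performers to smooth out any single model's bias.

def model_logistic(features):
    # stand-in for a fitted logistic regression
    return 0.8 if features["pm25_yesterday"] > 35 else 0.2

def model_forest(features):
    # stand-in for a fitted random forest
    return 0.7 if features["season"] == "summer" else 0.3

def model_nn(features):
    # stand-in for a fitted neural network
    return 0.9 if features["wind_kph"] < 5 else 0.1

def ensemble_predict(features, threshold=0.5):
    probs = [m(features) for m in (model_logistic, model_forest, model_nn)]
    avg = sum(probs) / len(probs)
    return avg, ("unhealthy" if avg >= threshold else "healthy")

avg, label = ensemble_predict({"pm25_yesterday": 50, "season": "summer", "wind_kph": 3})
```

Averaging the three probabilities, rather than picking one model, is what lets the combined model outperform each individual one.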

Keywords: air quality prediction, air pollution, artificial intelligence, machine learning algorithms

Procedia PDF Downloads 120
17168 Seismic Assessment of Non-Structural Component Using Floor Design Spectrum

Authors: Amin Asgarian, Ghyslaine McClure

Abstract:

Experience in past earthquakes has clearly demonstrated the necessity of seismic design and assessment of Non-Structural Components (NSCs), particularly in post-disaster structures such as hospitals and power plants, which have to remain permanently functional and operational. Meeting this objective is contingent upon proper seismic performance of both structural and non-structural components. Proper seismic design, analysis, and assessment of NSCs can be attained through the generation of a Floor Design Spectrum (FDS), in a similar fashion to the target spectrum for structural components. This paper presents a methodology developed to generate the FDS directly from the corresponding Uniform Hazard Spectrum (UHS) (i.e., the design spectrum for structural components). The methodology is based on experimental and numerical analysis of a database of 27 real Reinforced Concrete (RC) buildings located in Montreal, Canada. The buildings were tested by Ambient Vibration Measurements (AVM), and their dynamic properties were extracted and used as part of the approach. The database comprises 12 low-rise, 10 medium-rise, and 5 high-rise buildings, mostly designated as post-disaster/emergency shelters by the city of Montreal. The buildings are subjected to 20 seismic records compatible with the UHS of Montreal, and Floor Response Spectra (FRS) are developed for every floor in two horizontal directions, considering four different damping ratios of NSCs (2, 5, 10, and 20% viscous damping). The generated FRS (approximately 132,000 curves) are studied statistically, and the methodology is proposed to generate the FDS directly from the corresponding UHS. The approach can generate the FDS for any selected floor level and NSC damping ratio, and it captures the effects of dynamic interaction between the primary (structural) and secondary (NSC) systems as well as the higher and torsional modes of the primary structure.
These are important improvements of this approach compared to conventional methods and code recommendations. The application of the proposed approach is illustrated here through two real case-study buildings: one low-rise and one medium-rise. The proposed approach can be used as a practical and robust tool for the seismic assessment and design of NSCs, especially in existing post-disaster structures.
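
One simple way to distill a family of floor response spectra into a design spectrum, in the spirit of the statistical study above, is a percentile envelope over all curves at each period. This sketch is illustrative only and is not the paper's actual fitted methodology:

```python
# Percentile-envelope sketch: collapse many FRS curves into one design
# spectrum by taking a chosen percentile of spectral acceleration at each
# period across all curves (84th percentile, i.e. mean + 1 sigma for a
# normal distribution, is a common engineering choice).

def percentile(values, p):
    # linear-interpolation percentile over a small list
    s = sorted(values)
    k = (len(s) - 1) * p / 100.0
    lo, hi = int(k), min(int(k) + 1, len(s) - 1)
    return s[lo] + (s[hi] - s[lo]) * (k - lo)

def design_spectrum(frs_curves, p=84):
    # frs_curves: list of curves, one spectral ordinate per period each
    n_periods = len(frs_curves[0])
    return [percentile([c[i] for c in frs_curves], p) for i in range(n_periods)]

curves = [[0.2, 0.5, 0.3], [0.4, 0.7, 0.2], [0.3, 0.6, 0.4]]
fds = design_spectrum(curves, p=50)  # median envelope for this toy example
```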

Keywords: earthquake engineering, operational and functional components, operational modal analysis, seismic assessment and design

Procedia PDF Downloads 209
17167 GBKMeans: A Genetic Based K-Means Applied to the Capacitated Planning of Reading Units

Authors: Anderson S. Fonseca, Italo F. S. Da Silva, Robert D. A. Santos, Mayara G. Da Silva, Pedro H. C. Vieira, Antonio M. S. Sobrinho, Victor H. B. Lemos, Petterson S. Diniz, Anselmo C. Paiva, Eliana M. G. Monteiro

Abstract:

In Brazil, the National Electric Energy Agency (ANEEL) establishes that electrical energy companies are responsible for measuring and billing their customers. Among these regulations, a company must bill its customers within 27-33 days; if a relocation or a change of period is required, the consumer must be notified in writing in advance of the billing period. To organize a workday’s measurements, these companies create a reading plan. These plans group customers into reading groups, which are visited by an employee responsible for measuring consumption and billing. Creating such a plan efficiently and optimally is a capacitated clustering problem with constraints related to homogeneity and compactness, that is, the employee’s working load and the geographical position of the consumer units. Today this planning is done manually by several experts experienced in the geographic formation of the region; it takes many days to complete, and, being a human activity, there is no guarantee of finding the best plan. In this paper, GBKMeans is presented: a technique based on K-Means and genetic algorithms for creating capacitated clusters that respect the established constraints in an efficient and balanced manner, minimizing the cost of relocating consumer units and the time required to create the final plan. The results obtained by the presented method are compared with the current planning of a real city, showing an improvement of 54.71% in the standard deviation of working load and 11.97% in the compactness of the groups.
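
A minimal sketch of the capacitated-assignment step such a method must solve (the greedy nearest-centroid rule and the toy data are illustrative assumptions, not the paper's genetic operators):

```python
# Capacitated assignment sketch: each consumer unit joins the nearest
# reading-group centroid that still has workload capacity left, which
# enforces homogeneity (balanced load) while favoring compactness.

def assign_capacitated(points, centroids, capacity):
    loads = [0] * len(centroids)
    assignment = []
    for px, py in points:
        # rank centroids by squared Euclidean distance
        order = sorted(range(len(centroids)),
                       key=lambda c: (px - centroids[c][0]) ** 2 +
                                     (py - centroids[c][1]) ** 2)
        for c in order:
            if loads[c] < capacity:      # respect the employee's workload limit
                loads[c] += 1
                assignment.append(c)
                break
    return assignment, loads

points = [(0, 0), (0, 1), (0, 2), (10, 0)]
centroids = [(0, 0), (10, 0)]
assignment, loads = assign_capacitated(points, centroids, capacity=2)
```

In a genetic-algorithm wrapper, the centroid positions would form the chromosome, and fitness would combine the load balance and compactness measures reported above.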

Keywords: capacitated clustering, k-means, genetic algorithm, districting problems

Procedia PDF Downloads 191
17166 Comparison of Cognitive Load in Virtual Reality and Conventional Simulation-Based Training: A Randomized Controlled Trial

Authors: Michael Wagner, Philipp Steinbauer, Andrea Katharina Lietz, Alexander Hoffelner, Johannes Fessler

Abstract:

Background: Cardiopulmonary resuscitations are stressful situations in which vital decisions must be made within seconds. Lack of routine due to the infrequency of pediatric emergencies can lead to serious medical and communication errors. Virtual reality (VR) can fundamentally change the way simulation training is conducted in the future, and it appears to be a useful learning tool for both technical and non-technical skills. It is important to investigate whether VR can provide a strong sense of presence within simulations. Methods: In this randomized study, we will enroll doctors and medical students from the Medical University of Vienna, who will receive learning material on the resuscitation of a one-year-old child. The study will be conducted in three phases. In the first phase, 20 physicians and 20 medical students will perform simulation-based training with a standardized scenario of a critically ill child in hypovolemic shock. The main goal of this phase is to establish a baseline for the following two phases and to generate comparative values for cognitive load and stress. In phases 2 and 3, the same participants will perform the same scenario in a VR setting. In both settings, one of three predefined events is triggered at each of three set points of progression; for each event, three stress levels (easy, medium, difficult) are defined. Stress and cognitive load will be analyzed using the NASA Task Load Index, eye-tracking parameters, and heart rate. These values will then be compared between VR training and traditional simulation-based training. Hypothesis: We hypothesize that the VR training and traditional training groups will not differ in physiological response (cognitive load, heart rate, and heart rate variability). We further assume that virtual reality training can serve as cost-efficient additional training.
Objectives: The aim of this study is to measure cognitive load and stress during real-life simulation training and compare them with VR training, in order to show that VR training evokes the same physiological response and cognitive load as real-life simulation training.
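
The NASA Task Load Index scoring mentioned above can be sketched as follows; the ratings and pairwise weights are invented toy values, and the study's actual scoring procedure may differ:

```python
# NASA-TLX sketch: the "raw TLX" is the mean of the six subscale ratings
# (usual 0-100 scale); the weighted version uses the 15 pairwise
# comparisons, whose weights sum to 15.

SUBSCALES = ["mental", "physical", "temporal", "performance", "effort", "frustration"]

def raw_tlx(ratings):
    return sum(ratings[s] for s in SUBSCALES) / len(SUBSCALES)

def weighted_tlx(ratings, weights):
    assert sum(weights.values()) == 15, "pairwise weights must sum to 15"
    return sum(ratings[s] * weights[s] for s in SUBSCALES) / 15

ratings = {"mental": 70, "physical": 30, "temporal": 60,
           "performance": 40, "effort": 65, "frustration": 55}
weights = {"mental": 5, "physical": 0, "temporal": 3,
           "performance": 2, "effort": 4, "frustration": 1}

raw = raw_tlx(ratings)
weighted = weighted_tlx(ratings, weights)
```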

Keywords: virtual reality, cognitive load, simulation, adaptive virtual reality training

Procedia PDF Downloads 109
17165 Fundamentals of Theorizing Power in International Relations

Authors: Djehich Mohamed Yousri

Abstract:

The field of political science is marked by considerable controversy, given its multiplicity of schools, trends, and goals. This overlap and complexity in interpreting political phenomena has linked political science to other associated disciplines, among them the science of international relations, with its large body of theories that gained wide scope and a decisive position after the national tide in the history of Western political thought, especially after the Peace of Westphalia in 1648, which approved the new foundations of international politics, the most important of which is respect for state sovereignty. Historical events then coincided with the scientific, intellectual, and economic developments that followed the Industrial Revolution, and later with technological revolutions in all their forms, leading to the establishment of a comprehensive political system that is more complex and interdependent than it was during the First and Second World Wars. The international situation has come to depend on the digital revolution and its implications for the comprehensive transformation of international political relations after the Cold War.

Keywords: theorizing, international relations, approaches to international relations, political science, the political system

Procedia PDF Downloads 99
17164 The Pricing-Out Phenomenon in the U.S. Housing Market

Authors: Francesco Berald, Yunhui Zhao

Abstract:

The COVID-19 pandemic further extended the multi-year housing boom in advanced economies and emerging markets alike, amid massive monetary easing. In this paper, we analyze the pricing-out phenomenon in the U.S. residential housing market due to the higher house prices associated with monetary easing. We first set up a stylized general equilibrium model and show that although monetary easing decreases the mortgage payment burden, it raises house prices, lowers housing affordability for first-time homebuyers (through the initial housing wealth channel and the liquidity constraint channel, which increase repeat buyers’ housing demand), and increases housing wealth inequality between first-time and repeat homebuyers. We then use U.S. household-level data to quantify the effect of the house price change on housing affordability relative to that of the interest rate change. We find evidence of a pricing-out effect for all homebuyers; moreover, the effect is stronger for first-time homebuyers than for repeat homebuyers. The paper highlights the importance of accounting for the general equilibrium effects and distributional implications of monetary policy when assessing housing affordability. It also calls for complementing monetary easing with well-targeted policy measures that can boost housing affordability, particularly for first-time and lower-income households. Such measures are also needed during aggressive monetary tightening, given that the fall in house prices may be insufficient or too slow to fully offset the immediate adverse impact of higher rates on housing affordability.
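
The payment-versus-price trade-off at the heart of the pricing-out effect can be illustrated with the standard fixed-rate annuity formula (the prices, rates, and down-payment share below are illustrative, not numbers from the paper):

```python
# Annuity-payment sketch: a lower rate cuts the monthly payment for a
# given price, but if easing also pushes the price up enough, the
# payment burden for a first-time buyer can still rise.

def monthly_payment(price, annual_rate, years=30, down_payment_share=0.2):
    principal = price * (1 - down_payment_share)
    r = annual_rate / 12
    n = years * 12
    if r == 0:
        return principal / n
    return principal * r / (1 - (1 + r) ** -n)

before = monthly_payment(400_000, 0.05)    # higher rate, lower price
after = monthly_payment(480_000, 0.035)    # easing: lower rate, 20% higher price
```

In this toy case the 20% price rise slightly outweighs the 1.5-point rate cut, so the monthly burden goes up despite easing.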

Keywords: pricing-out, U.S. housing market, housing affordability, distributional effects, monetary policy

Procedia PDF Downloads 30
17163 The Existential in a Practical Phenomenology Research: A Study on the Political Participation of Young Women

Authors: Amanda Aliende da Matta, Maria del Pilar Fogueiras Bertomeu, Valeria de Ormaechea Otalora, Maria Paz Sandin Esteban, Miriam Comet Donoso

Abstract:

This communication presents questions about the existentials in research on the political participation of young women. The study follows a qualitative methodology, in particular the applied hermeneutic phenomenological (AHP) method, and the general objective of the research is to give an account of the experience of political participation as a young woman. The study participants are women aged 18 to 35 who have experience in political participation. The data collection techniques are the descriptive story and the phenomenological interview. Hermeneutic phenomenology as a research approach is based on phenomenological philosophy and applied hermeneutics. Its ultimate objective is to gain access to the meaning structures of lived experience by appropriating them, clarifying them, and reflectively making them explicit. Human experiences are always lived through existentials: fundamental themes that are useful in exploring meaningful aspects of our life worlds. Everyone experiences the world through the existentials of lived relationships, the lived body, lived space, lived time, and lived things. Phenomenological research, then, also tacitly asks about the existentials. Existentials are universal themes useful for exploring significant aspects of our life world and of the particular phenomena under study. Four main existentials prove especially helpful as guides for reflection in the research process: relationship, body, space, and time. In our case, for example, we may ask how the existentials of relationship, body, space, and time can guide us in exploring the structures of meaning in the lived experience of political participation as a woman and a young person. The study is not yet finished, as we are currently conducting a phenomenological thematic analysis of the collected stories of lived experience.
We have, however, already identified some fragments of text that show the existentials in the participants’ experiences, which we transcribe below. 1) Relationality, the experienced I-Other: how relationships are experienced in our narratives about political participation as young women. One example: “As we had known each other for a long time, we understood each other with our eyes; we were all a little bit on the same page, thinking the same thing.” 2) Corporeality, the lived body: how the lived body is experienced in activities of political participation as a young woman. Examples: “My blood was boiling, but it was not the time to throw anything in their face; we had to look for solutions.” and “I had a lump in my throat and I wanted to cry.” 3) Spatiality, the lived space: how one experiences lived space in political participation activities as a young woman. One example: “And the feeling I got when I saw [it], it’s like watching everybody going into a mousetrap.” 4) Temporality, lived time: how one experiences lived time in political participation activities as a young woman. One example: “Then, there were also meetings that went on forever…”

Keywords: applied hermeneutic phenomenology, existentials, hermeneutics, phenomenology, political participation

Procedia PDF Downloads 77
17162 A Low-Cost of Foot Plantar Shoes for Gait Analysis

Authors: Zulkifli Ahmad, Mohd Razlan Azizan, Nasrul Hadi Johari

Abstract:

This paper presents the development and testing of a wearable sensor system for gait analysis. For validation, plantar surface measurements with a force plate were used as a reference. In conventional gait analysis, a force plate typically captures barefoot measurements of single, complete steps and does not allow analysis of repeated steps in normal walking and running. Such measurements do not represent the full daily plantar pressures inside the shoe insole and capture only the ground reaction force. Force plate measurement is usually limited to a few steps, is performed indoors, and does not easily provide coupled information from both feet during walking. To measure pressure over a large number of steps and obtain the pressure in each part of the insole, sensors can instead be placed within the insole itself. This provides a method to determine plantar pressures while the shoe-wearing subject is standing, walking, or running, and the placement points of the sensors identify the critical regions under the insole. In this project, the device consists of left and right shoe insoles with ten force-sensitive resistors (FSRs). An Arduino Mega was used as the microcontroller to read the analog inputs from the FSRs; the readings were transmitted via Bluetooth so that the force data could be viewed in real time on a smartphone. Blueterm, an Android application, served as the interface to read the FSR values from the shoe-wearing subject. The subjects were two healthy men of different ages and weights, tested while standing, walking (1.5 m/s), jogging (5 m/s), and running (9 m/s) on a treadmill. The data obtained were saved on the Android device for analysis and comparison graphs.
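
The phone-side conversion from ADC counts back to physical quantities can be sketched as follows; the supply voltage and pull-down resistor value are assumptions for illustration, as the paper does not specify the circuit values:

```python
# Voltage-divider sketch: the Arduino Mega reports each FSR as a 10-bit
# ADC count (0-1023), which can be converted back to a voltage and then
# to the FSR's resistance via the divider relation
#   v = VCC * R_FIXED / (R_FSR + R_FIXED)

VCC = 5.0          # assumed supply voltage of the divider
R_FIXED = 10_000   # assumed fixed pull-down resistor, in ohms

def adc_to_volts(count, vcc=VCC, bits=10):
    return count * vcc / (2 ** bits - 1)

def fsr_resistance(count):
    v = adc_to_volts(count)
    if v <= 0:
        return float("inf")     # no pressure: FSR is effectively open
    return R_FIXED * (VCC - v) / v

volts = adc_to_volts(512)       # roughly mid-scale, ~2.5 V
resistance = fsr_resistance(512)
```

A calibration curve (force versus resistance) for the specific FSR model would then map this resistance to an applied force.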

Keywords: gait analysis, plantar pressure, force plate, wearable sensor

Procedia PDF Downloads 446
17161 Applying Intelligent Material in Food Packaging

Authors: Kasra Ghaemi, Syeda Tasnim, Shohel Mahmud

Abstract:

One of the main issues affecting the quality and shelf life of food products is temperature fluctuation during transportation and storage. Packaging plays an important role in protecting food from environmental conditions, especially thermal variations. In this study, the performance of microencapsulated Phase Change Material (PCM) as a promising thermal buffer layer in smart food packaging is investigated. The buffer layer is evaluated for different thicknesses and levels of heat absorbed from the environment, and the results are presented in terms of the melting time of the PCM, i.e., the thermal protection period provided.
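
The reported protection period can be estimated from a simple latent-heat energy balance; the PCM mass, latent heat, and heat influx below are illustrative values, not the study's:

```python
# Energy-balance sketch: the buffer holds the package near the melting
# point until the PCM's latent heat is exhausted by the heat leaking in
# from the environment.

def protection_time_hours(mass_kg, latent_heat_j_per_kg, heat_influx_w):
    # time for the incoming heat to fully melt the PCM layer
    return mass_kg * latent_heat_j_per_kg / heat_influx_w / 3600

# e.g. 0.2 kg of a paraffin-like PCM (~200 kJ/kg) against a 2 W heat leak
hours = protection_time_hours(0.2, 200_000, 2.0)
```

A thicker layer raises the mass term and thus the protection time roughly linearly, which matches the thickness study described above.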

Keywords: food packaging, phase change material, thermal buffer, protection time

Procedia PDF Downloads 87
17160 The Synthesis and Analysis of Two Long Lasting Phosphorescent Compounds: SrAl2O4: Eu2+, Dy3+

Authors: Ghayah Alsaleem

Abstract:

This research project focused on specific compounds, while a literature review was completed on the broader subject of long-lasting phosphorescence. For the review and subsequent laboratory work, long-lasting phosphorescent compounds were defined as materials with an afterglow decay time greater than a few minutes. The decay time is defined as the time between the end of excitation and the moment the light intensity drops below 0.32 mcd/m²; this definition is widely used in industry and in most research studies. The experimental work focused on a known long-lasting phosphorescent compound, strontium aluminate (SrAl2O4: Eu2+, Dy3+). Initial preparation followed methods from the literature; temperature, dopant levels, and mixing methods were then varied in order to expose their effects on long-lasting phosphorescence. The investigation of temperature for SrAl2O4: Eu2+, Dy3+ led to the finding that 1350°C was the only temperature to which the compound could be heated in differential scanning calorimetry (DSC) to achieve any phosphorescence; however, no temperatures above 1350°C were investigated. Varying the mixing method and co-dopant level in the strontium aluminate compounds showed that the dry mixing method using a Turbula mixer resulted in the longest afterglow. It was also found that increasing the europium content from 1 mol% to 2 mol% increased the brightness of the phosphorescence. Because this batch was mixed using sonication, the phosphorescence time was actually reduced, yielding green long-lasting phosphorescence for up to 20 minutes following 30 minutes of excitation, and 50 minutes when the europium content was doubled and mixed using sonication.
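
The decay-time definition above can be made concrete with a single-exponential afterglow model; real afterglow decay is typically multi-exponential, so the form and numbers here are only an illustrative sketch:

```python
# Afterglow-time sketch: the decay time is when the luminance first
# falls below the 0.32 mcd/m^2 industry threshold, found by solving
#   i0 * exp(-t / tau) = threshold  for t.

import math

def afterglow_minutes(i0_mcd, tau_minutes, threshold=0.32):
    if i0_mcd <= threshold:
        return 0.0          # already below threshold at excitation cut-off
    return tau_minutes * math.log(i0_mcd / threshold)

# toy values chosen so the result lands near the ~20 min reported above
minutes = afterglow_minutes(i0_mcd=50.0, tau_minutes=4.0)
```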

Keywords: long lasting, phosphorescence, excitation, europium

Procedia PDF Downloads 178
17159 Data Mining Approach for Commercial Data Classification and Migration in Hybrid Storage Systems

Authors: Mais Haj Qasem, Maen M. Al Assaf, Ali Rodan

Abstract:

Parallel hybrid storage systems consist of a hierarchy of storage devices that vary in data-reading speed: as we ascend the hierarchy, reading becomes faster. Migrating the application’s important data that will be accessed in the near future to the uppermost level therefore reduces the application’s I/O waiting time and hence its execution elapsed time. In this research, we implement a trace-driven, two-level parallel hybrid storage system prototype consisting of HDDs and SSDs. The prototype uses data mining techniques to classify the application’s data and determine its near-future accesses, in parallel with serving its on-demand requests. The important data (i.e., the data that the application will access in the near future) are continuously migrated to the uppermost level of the hierarchy. Our simulation results show that this data migration approach, integrated with data mining techniques, reduces the application’s execution elapsed time by at least 22% across a variety of traces.
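
A minimal sketch of the migration policy described above, with a simple frequency heuristic standing in for the paper's data mining classifiers (recurrent neural network, support vector machine):

```python
# Migration sketch: blocks predicted to be accessed in the near future
# are promoted from the slow tier (HDD) to the fast tier (SSD), subject
# to the fast tier's capacity.

from collections import Counter

def predict_hot_blocks(trace, top_k=2):
    # stand-in classifier: the most frequently accessed blocks in the
    # recent trace are predicted to be accessed again soon
    return {block for block, _ in Counter(trace).most_common(top_k)}

def migrate(ssd, hdd, trace, ssd_capacity=2):
    hot = predict_hot_blocks(trace, top_k=ssd_capacity)
    for block in hot:
        if block in hdd and len(ssd) < ssd_capacity:
            hdd.remove(block)
            ssd.add(block)       # promote to the fast tier
    return ssd, hdd

ssd, hdd = set(), {"a", "b", "c", "d"}
ssd, hdd = migrate(ssd, hdd, trace=["a", "b", "a", "c", "a", "b"])
```

In the actual prototype, the prediction step would run in parallel with on-demand requests rather than after the fact as shown here.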

Keywords: hybrid storage system, data mining, recurrent neural network, support vector machine

Procedia PDF Downloads 303
17158 Analysis of Brake System for Vehicle Off-Road

Authors: Elmo Thiago Lins Cöuras Ford, Valentina Alessandra Carvalho do Vale, José Ubiragi de Lima Mendes

Abstract:

Over the years, the automobile industry has developed ever more modern vehicles; every year, the vehicles coming off the assembly lines quickly render models produced only a short time earlier obsolete. These innovations have not bypassed vehicle safety, and it is in this context that braking systems have been equipped with increasingly sophisticated features. Against this background, this research designed a brake system for an off-road vehicle and analyzed its performance in terms of braking efficiency (stopping distance and time), concluding with possible improvements to the system.
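
The braking-efficiency measures named above (stopping distance and time) follow, to first order, from uniform deceleration; the friction coefficient below is an illustrative off-road value, not a result from the paper:

```python
# Uniform-deceleration sketch: with deceleration a = mu * g,
#   stopping distance d = v^2 / (2 * mu * g)
#   stopping time     t = v / (mu * g)

G = 9.81  # gravitational acceleration, m/s^2

def stopping_distance_m(speed_kmh, mu):
    v = speed_kmh / 3.6
    return v ** 2 / (2 * mu * G)

def stopping_time_s(speed_kmh, mu):
    v = speed_kmh / 3.6
    return v / (mu * G)

# e.g. 60 km/h on loose gravel (mu ~ 0.4, an assumed value)
d = stopping_distance_m(60, 0.4)
t = stopping_time_s(60, 0.4)
```

Off-road surfaces have much lower friction coefficients than dry asphalt, which is exactly why brake-system analysis for off-road vehicles differs from the on-road case.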

Keywords: brake system, off-road, vehicle performance, automotive and mechanical engineering

Procedia PDF Downloads 480
17157 The Safety Transfer in Acute Critical Patient by Telemedicine (START) Program at Udonthani General Hospital

Authors: Wisit Wichitkosoom

Abstract:

Objective: The majority of high-risk patients (ST-elevation myocardial infarction (STEMI), acute cerebrovascular accident, sepsis, acute trauma) are admitted to district or local hospitals (on average 1 to 1.5 hours from Udonthani General Hospital, Northeastern Province, Thailand) without proper facilities. The referral system supported early care and early management at the pre-hospital stage and prepared patient data for the higher-level hospital. This study assessed the reduction in treatment delay achieved by pre-hospital diagnosis and direct referral to Udonthani General Hospital. Methods and results: Four district or local hospitals without proper facilities for treating very high-risk patients served the study region. Pre-hospital diagnoses were established with simple technologies such as LINE, SMS, telephone, and fax, following the LEAN process concept; telemedicine, with ambulance monitoring (ECG, SpO2, BT, BP) in both real-time and snapshot modes, was then administered during the transfer period to support the safety transfer concept (inter-hospital stage). Standard treatment for patients with STEMI, intracranial injury, and acute cerebrovascular accident was provided. From 1 October 2012 to 30 September 2013, 892 high-risk patients transported by ambulance and transferred to Udonthani General Hospital were registered, including patients with STEMI diagnosed pre-hospitally and referred directly to the hospital under close telemedicine monitoring (n=248). The mortality rate decreased from 11.69% in 2011 to 6.92% in 2012. Thirty-four patients went into cardiac arrest en route; the rate of successful CPR during transfer with telemedicine consultation was 79.41%. Conclusion: Appropriate innovation can be applied to the health care system. Very high-risk patients must have close monitoring with two-way communication during the “safety transfer period”, and the approach can be adapted to other high-risk groups as well.

Keywords: safety transfer, telemedicine, critical patients, medical and health sciences

Procedia PDF Downloads 302
17156 Detect Circles in Image: Using Statistical Image Analysis

Authors: Fathi M. O. Hamed, Salma F. Elkofhaifee

Abstract:

The aim of this work is to detect geometrical objects in an image; in this paper, the object is assumed to be circular in shape. Identification requires finding three characteristics: the number, size, and location of the objects. To achieve this goal, the paper presents an algorithm that combines statistical approaches with image analysis techniques. The algorithm was first evaluated on simulated data, where it yielded good results, and was then applied to real data.
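The abstract does not give the algorithm's details; as a rough illustration of the pipeline its keywords suggest (median filter, threshold, segmentation), a minimal sketch might look like the following. All function and parameter names here are our own, not the authors':

```python
import numpy as np
from scipy import ndimage

def detect_blobs(image, threshold=0.5):
    """Count bright blob-like objects and report their sizes and centroids."""
    smoothed = ndimage.median_filter(image, size=3)   # suppress noise
    binary = smoothed > threshold                     # segment by threshold
    labels, count = ndimage.label(binary)             # connected components
    idx = list(range(1, count + 1))
    sizes = ndimage.sum(binary, labels, idx)          # pixel area per object
    centroids = ndimage.center_of_mass(binary, labels, idx)
    return count, sizes, centroids

# Simulated data: one disk of radius 5 centred at row 20, column 30
yy, xx = np.mgrid[0:64, 0:64]
img = ((yy - 20) ** 2 + (xx - 30) ** 2 <= 25).astype(float)
count, sizes, centroids = detect_blobs(img)
```

On the simulated disk this recovers the object count, its approximate pixel area, and its centre, mirroring the number/size/location characteristics the abstract targets.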

Keywords: image processing, median filter, projection, scale-space, segmentation, threshold

Procedia PDF Downloads 427
17155 The Use of TV and the Internet in the Social Context

Authors: Khulood Miliany

Abstract:

This study examines the media habits of young people in Saudi Arabia, in particular their use of the Internet and television in the domestic sphere, and how use of the Internet impacts upon other activities. In order to address the research questions, focus group interviews were conducted with Saudi university students. The study found that television has become a central part of social life within the household, where it represents a main source of family time, particularly in Ramadan, while the Internet is a solitary activity used in more private spaces. Furthermore, Saudi females were more likely to have their Internet access monitored and circumscribed by family members, with parents controlling the location and the amount of time spent using the Internet.

Keywords: domestication of technology, internet, social context, television, young people

Procedia PDF Downloads 292
17154 Integrating Life Skills Education for Mental Health and Academic Benefits of Adolescents in Schools

Authors: Sarwat Sultan, Muhammad Saleem, Frasat Kanwal

Abstract:

Adolescence is a transitional period of life that brings physical and psychological changes and presents several challenges for an adolescent. An adolescent must learn life skills for a healthy transition from adolescence to adulthood. Therefore, this study was planned to examine the effects of life skills education on adolescents' mental health and academic benefits. A random sample of 720 school students aged between 13 and 17 years was divided into two groups: experimental (n=360) and control (n=360). Life skills education was given to the students of the intervention group, with repeated assessments of mental health and academic benefits at pre-intervention (T1) and post-intervention (T2) for both groups. The groups were compared on mental health and academic benefit scores across T1 and T2 using a mixed between-within-subjects analysis of variance. Findings showed a main effect of time, indicating the largest changes in mental health and academic benefits over time. Interaction effects between time and group were also significant, indicating the largest changes across time between the two groups. Between-group comparisons showed significant values for Wilks’ Lambda and partial eta squared: students of the intervention group scored higher on mental health and academic benefits after receiving life skills training than students of the control group. The results of the present study determine the efficacy of life skills education and have implications for both teachers and psychotherapists in improving students’ mental health and academic performance.

Keywords: academic benefits, life skills, mental health, adolescents

Procedia PDF Downloads 52
17153 Identification Strategies for Unknown Victims from Mass Disasters and Unknown Perpetrators from Violent Crime or Terrorist Attacks

Authors: Michael Josef Schwerer

Abstract:

Background: The identification of unknown victims from mass disasters, violent crimes, or terrorist attacks is frequently facilitated through information from missing persons lists, portrait photos, old or recent pictures showing unique characteristics of a person such as scars or tattoos, or simply reference samples from blood relatives for DNA analysis. In contrast, the identification or at least the characterization of an unknown perpetrator from criminal or terrorist actions remains challenging, particularly in the absence of material or data for comparison, such as fingerprints, which had been previously stored in criminal records. In scenarios that result in high levels of destruction of the perpetrator’s corpse, for instance, blast or fire events, the chance for a positive identification using standard techniques is further impaired. Objectives: This study shows the forensic genetic procedures in the Legal Medicine Service of the German Air Force for the identification of unknown individuals, including such cases in which reference samples are not available. Scenarios requiring such efforts predominantly involve aircraft crash investigations, which are routinely carried out by the German Air Force Centre of Aerospace Medicine as one of the Institution’s essential missions. Further, casework by military police or military intelligence is supported based on administrative cooperation. In the talk, data from study projects, as well as examples from real casework, will be demonstrated and discussed with the audience. Methods: Forensic genetic identification in our laboratories involves the analysis of Short Tandem Repeats and Single Nucleotide Polymorphisms in nuclear DNA along with mitochondrial DNA haplotyping. Extended DNA analysis involves phenotypic markers for skin, hair, and eye color together with the investigation of a person’s biogeographic ancestry. 
Assessment of the biological age of an individual employs CpG-island methylation analysis using bisulfite-converted DNA. Forensic Investigative Genealogy assessment allows the detection of an unknown person’s blood relatives in reference databases. Technically, end-point PCR, real-time PCR, capillary electrophoresis, and pyrosequencing, as well as next-generation sequencing using flow-cell-based and chip-based systems, are used. Results and Discussion: Optimization of DNA extraction from various sources, including difficult matrices like formalin-fixed, paraffin-embedded tissues, degraded specimens from decomposed bodies, or specimens from decedents exposed to blast or fire events, provides the basis for successful PCR amplification and subsequent genetic profiling. For cases with extremely low yields of extracted DNA, whole-genome preamplification protocols are successfully used, particularly regarding genetic phenotyping. Improved primer design for CpG-methylation analysis, together with validated sampling strategies for the analyzed substrates from, e.g., lymphocyte-rich organs, allows successful biological age estimation even in bodies with highly degraded tissue material. Conclusions: Successful identification of unknown individuals, or at least their phenotypic characterization using pigmentation markers together with age-informative methylation profiles, possibly supplemented by family tree search employing Forensic Investigative Genealogy, can be provided in specialized laboratories. However, standard laboratory procedures must be adapted to work with difficult and highly degraded sample materials.

Keywords: identification, forensic genetics, phenotypic markers, CpG methylation, biological age estimation, forensic investigative genealogy

Procedia PDF Downloads 43
17152 Elaboration and Characterization of Tin Sulfide Thin Films Prepared by Spray Ultrasonic

Authors: A. Attaf, I. Bouhaf Kharkhachi

Abstract:

Hexagonal tin disulfide (SnS2) films were deposited by the ultrasonic spray technique on glass substrates under different experimental conditions. The effect of deposition time (2, 4, 6, and 7 min) on the properties of the SnS2 thin films was investigated by X-ray diffraction (XRD) and UV-visible spectroscopy. The XRD study showed preferential growth of the SnS2 phase along the (001) plane, which increased with deposition time. The UV-visible results showed that films deposited for 4 min have high transmittance, up to 60%, in the visible region.

Keywords: structural and optical properties, tin sulfide, thin films, ultrasonic spray

Procedia PDF Downloads 464
17151 Investigation of Electrochemical, Morphological, Rheological and Mechanical Properties of Nano-Layered Graphene/Zinc Nanoparticles Incorporated Cold Galvanizing Compound at Reduced Pigment Volume Concentration

Authors: Muhammad Abid

Abstract:

The ultimate goal of this research was to produce a cold galvanizing compound (CGC) at reduced pigment volume concentration (PVC) to protect metallic structures from corrosion. The influence of the partial replacement of Zn dust by nano-layered graphene (NGr) and Zn metal nanoparticles on the electrochemical, morphological, rheological, and mechanical properties of CGC was investigated. EIS was used to explore the electrochemical nature of the coatings. The EIS results revealed that the partial replacement of Zn by NGr and Zn nanoparticles enhanced the cathodic protection at reduced PVC (4:1) by improving the electrical contact between the Zn particles and the metal substrate. A Tafel scan was conducted to support the cathodic behaviour of the coatings. The sample formulated solely with Zn at PVC 4:1 was found to be dominated by physical barrier characteristics rather than cathodic protection. By increasing the concentration of NGr in the formulation, the corrosion potential shifted towards the more negative side. The coating with 1.5% NGr showed the highest galvanic action at reduced PVC. FE-SEM confirmed the interconnected network of conducting particles. The coating without NGr and Zn nanoparticles at PVC 4:1 showed significant gaps between the Zn dust particles. The novelty was evidenced when micrographs showed the consistent distribution of NGr and Zn nanoparticles over the entire surface, which acted as a bridge between spherical Zn particles and provided cathodic protection at a reduced PVC. The layered structure of graphene also improved the physical shielding effect of the coatings, limiting the diffusion of electrolytes and corrosion products (oxides/hydroxides) into the coatings, as reflected by the salt spray test. The rheological properties of the coatings showed good fluid behaviour. All the coatings showed excellent adhesion but differed in strength values.
A real-time scratch resistance assessment showed all the coatings had good scratch resistance.

Keywords: protective coatings, anti-corrosion, galvanization, graphene, nanomaterials, polymers

Procedia PDF Downloads 90
17150 Variables, Annotation, and Metadata Schemas for Early Modern Greek

Authors: Eleni Karantzola, Athanasios Karasimos, Vasiliki Makri, Ioanna Skouvara

Abstract:

Historical linguistics unveils the historical depth of languages and traces variation and change by analyzing linguistic variables over time. This field of linguistics usually deals with a closed data set that can only be expanded by the (re)discovery of previously unknown manuscripts or editions. In some cases, it is possible to use (almost) the entire closed corpus of a language for research, as is the case with the Thesaurus Linguae Graecae digital library for Ancient Greek, which contains most of the extant ancient Greek literature. However, concerning ‘dynamic’ periods when the production and circulation of texts in printed as well as manuscript form have not been fully mapped, representative samples and corpora of texts are needed. Such material and tools are utterly lacking for Early Modern Greek (16th-18th c.). In this study, the principles of the creation of EMoGReC, a pilot representative corpus of Early Modern Greek (16th-18th c.), are presented. Its design follows the fundamental principles of historical corpora. The selection of texts aims to create a representative and balanced corpus that gives insight into diachronic, diatopic, and diaphasic variation. The pilot sample includes data derived from fully machine-readable vernacular texts, which belong to 4-5 different textual genres and come from different geographical areas. We develop a hierarchical linguistic annotation scheme, further customized to fit the characteristics of our text corpus. Regarding variables and their variants, we use as a point of departure the bundle of twenty-four features (or categories of features) for prose demotic texts of the 16th c. Tags are introduced bearing the variants [+old/archaic] or [+novel/vernacular]. In addition, further phenomena of change in progress (cf. The Cambridge Grammar of Medieval and Early Modern Greek) are selected for tagging.
The annotated texts are enriched with metalinguistic and sociolinguistic metadata to provide a testbed for the development of the first comprehensive set of tools for the Greek language of that period. Based on a relational management system with interconnection of data, annotations, and their metadata, the EMoGReC database aspires to join a state-of-the-art technological ecosystem for the research of observed language variation and change using advanced computational approaches.

Keywords: early modern Greek, variation and change, representative corpus, diachronic variables

Procedia PDF Downloads 60
17149 Attendance Management System Implementation Using Face Recognition

Authors: Zainab S. Abdullahi, Zakariyya H. Abdullahi, Sahnun Dahiru

Abstract:

Student attendance is a very important aspect of school management records. In recent years, security systems have become among the most demanded systems in schools. Every institute has its own method of taking attendance; many schools in Nigeria use the old-fashioned approach of writing students' names and registration numbers on paper and submitting the sheet to the lecturer at the end of the lecture, which is time-consuming and insecure, because some students can sign for their friends without the lecturer's knowledge. In this paper, we propose a system that takes attendance using face recognition. Other automatic methods are available for this purpose, such as fingerprint-based biometric attendance, but they also waste time because students must queue to place their thumbs on a scanner. In the proposed system, attendance is recorded by a camera mounted at the front of the classroom that captures student images; faces are detected in the image, compared with the database, and attendance is marked. Principal component analysis (PCA) was used to recognize the detected faces with a high accuracy rate. The paper reviews related work in the field of attendance systems, then describes the system architecture, software algorithm, and results.
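The paper's own implementation is not reproduced in the abstract; the eigenface-style PCA matching it describes can be sketched roughly as follows. Synthetic random vectors stand in for flattened face images, and all names are illustrative:

```python
import numpy as np

def pca_fit(train, k):
    """Learn a k-dimensional eigenface subspace from flattened images (one per row)."""
    mean = train.mean(axis=0)
    # Right singular vectors of the centred data are the principal components
    _, _, vt = np.linalg.svd(train - mean, full_matrices=False)
    return mean, vt[:k]

def pca_identify(probe, train, labels, mean, components):
    """Project the probe into the subspace and return the nearest training label."""
    project = lambda x: (x - mean) @ components.T
    dists = np.linalg.norm(project(train) - project(probe), axis=1)
    return labels[int(np.argmin(dists))]

# Synthetic "faces": two identities with small within-person variation
rng = np.random.default_rng(0)
a, b = rng.normal(size=64), rng.normal(size=64)
train = np.stack([a, a + 0.01 * rng.normal(size=64),
                  b, b + 0.01 * rng.normal(size=64)])
labels = ["student_a", "student_a", "student_b", "student_b"]
mean, components = pca_fit(train, k=2)
match = pca_identify(a + 0.05 * rng.normal(size=64), train, labels, mean, components)
```

A real attendance system would precede this with face detection and crop normalization; this sketch covers only the PCA recognition step named in the abstract.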

Keywords: attendance system, face detection, face recognition, PCA

Procedia PDF Downloads 356
17148 Factors That Influence Willingness to Pay for Theatre Performances: The Case of Lithuanian National Drama Theatre

Authors: Rusne Kregzdaite

Abstract:

The value of the cultural sector stems from the symbolic exploration that differentiates cultural organisations from other product or service organisations. As a result, the cultural sector has a dual impact on the socio-economic system: the economic value (expressed in terms of market relations) created influences the dynamics of the country's financial indicators, while the cultural (non-market) value indirectly contributes to the welfare of the state through changes in societal values, creativity transformations, and the cultural needs of the country. Measurement of indirect (cultural value) impacts is difficult, but in the case of the cultural sector (especially when it comes to economically inefficient state-funded culture), it helps to reveal the essential characteristics of the sector. The study aims to analyze the value of cultural organisations that is invisible in market processes and to ground it in quantified calculations. This was done by analyzing consumer utility, incorporating not only the price paid but also the social and cultural decision-making factors that determine the spectator's choice (time dedicated to a visit, additional costs, content, previous experiences, corporate image). This may reflect the consumer's real choice to consume (all the costs incurred may be considered the financial equivalent of the experience with the cultural establishment). The research methodology was tested by analyzing the performing arts sector and applying the methods to the case of the Lithuanian National Drama Theatre. The empirical research consisted of a survey (more than 800 participants) of Lithuanian National Drama Theatre visitors to different performances. The willingness-to-pay and travel-cost methods were used. Analysis of different performances makes it possible to identify the factors that increase willingness to pay for a performance and affect theatre attendance.
The research stresses the importance of cultural value and social perspective of the cultural sector and relates it to the discussions of public funding of culture.
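The travel-cost idea described above reduces to a simple accounting identity: the full price of a visit is the ticket plus travel, time, and incidental costs. The sketch below is purely illustrative; its names and figures are invented, not taken from the study:

```python
def full_visit_cost(ticket, travel, hours, hourly_value, extras=0.0):
    """Travel-cost proxy for the full price of a theatre visit:
    ticket price + travel cost + opportunity cost of time + incidental spending."""
    return ticket + travel + hours * hourly_value + extras

# Hypothetical visitor: 20 EUR ticket, 5 EUR travel,
# 3 hours valued at 10 EUR/h, and 2 EUR of extras
cost = full_visit_cost(20.0, 5.0, 3.0, 10.0, extras=2.0)
```

Summing such costs over surveyed visitors is what lets the method treat non-ticket expenditure as a lower bound on the experience's financial equivalent.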

Keywords: cultural economics, performing arts, willingness to pay, travel cost analysis, performing arts management

Procedia PDF Downloads 84
17147 Dynamic Balance and Functional Performance in Total Hip Arthroplasty

Authors: Mahmoud Ghazy, Ahmed R. Z. Baghdadi

Abstract:

Background: With the perceived pain and poor function experienced following total hip arthroplasty (THA), patients usually feel unsatisfied. Methods: Thirty patients with THA (group I) and thirty patients indicated for arthroplasty but not yet operated on (group II) participated in the study. The mean age was 54.53±3.44 and 55.33±2.32 years and BMI 35.7±3.03 and 35.73±1.03 kg/m2 for groups I and II, respectively. The Berg Balance Scale (BBS), Timed Up-and-Go (TUG), and Stair-Climbing (SC) tests were used for assessment. Assessments were conducted four weeks pre- and post-operatively and three months post-operatively, with the control group assessed at the same time intervals. The post-operative rehabilitation involved hospitalization (1st week), home-based (2nd-4th weeks), and outpatient clinic (5th-12th weeks) programs. Results: Group I had significantly lower TUG and SC times compared with group II at four weeks and three months post-operatively. Moreover, in group I the BBS scores increased significantly and the pain scores and TUG and SC times decreased significantly at four weeks and three months post-operatively compared with four weeks pre-operatively. However, there were no significant differences in BBS scores between groups I and II at four weeks and three months post-operatively. Interpretation/Conclusion: Patients with THA still have deficits in proprioception, so they need greater emphasis on proprioceptive training.

Keywords: dynamic balance, functional performance, hip arthroplasty, total

Procedia PDF Downloads 367
17146 Time to Second Line Treatment Initiation Among Drug-Resistant Tuberculosis Patients in Nepal

Authors: Shraddha Acharya, Sharad Kumar Sharma, Ratna Bhattarai, Bhagwan Maharjan, Deepak Dahal, Serpahine Kaminsa

Abstract:

Background: Drug-resistant (DR) tuberculosis (TB) continues to be a threat in Nepal, with an estimated 2,800 new cases every year. The treatment of DR-TB with second-line TB drugs is complex and takes longer, with a comparatively lower treatment success rate than drug-susceptible TB. Delay in treatment initiation for DR-TB patients might further result in unfavorable treatment outcomes and increased transmission. This study thus aims to determine the median time taken to initiate second-line treatment among patients diagnosed with Rifampicin-Resistant (RR) TB and to assess the proportion of treatment delays among various types of DR-TB cases. Method: A retrospective cohort study was done using national routine electronic data (DR-TB and TB Laboratory Patient Tracking System, DHIS2) on drug-resistant tuberculosis patients between January 2020 and December 2022. The time to treatment initiation was computed as the number of days from first diagnosis as RR TB through the Xpert MTB/Rif test to enrollment on second-line treatment. Treatment delay (>7 days after diagnosis) was calculated. Results: Among the total RR TB cases (N=954) diagnosed via Xpert nationwide, 61.4% were enrolled under the shorter treatment regimen (STR), 33.0% under the longer treatment regimen (LTR), 5.1% for pre-extensively drug-resistant TB (Pre-XDR), and 0.4% for extensively drug-resistant TB (XDR) treatment. Among these cases, the median time from diagnosis to treatment initiation was 6 days (IQR: 2-15.8). The median time was 5 days (IQR: 2.0-13.3) among STR, 6 days (IQR: 3.0-15.0) among LTR, 30 days (IQR: 5.5-66.8) among Pre-XDR, and 4 days (IQR: 2.5-9.0) among XDR TB cases. Overall treatment delay (>7 days after diagnosis) was observed in 42.4% of the patients; cases enrolled under Pre-XDR contributed substantially to treatment delay (72.0%), followed by LTR (43.6%), STR (39.1%), and XDR (33.3%).
Conclusion: Timely diagnosis and prompt treatment initiation remain a fundamental focus of the National TB Program. The findings of the study, however, suggest gaps in the timeliness of treatment initiation for drug-resistant TB patients, which could bring adverse treatment outcomes. Moreover, there is an alarming delay in second-line treatment initiation for Pre-XDR TB patients. This study therefore generates evidence on existing gaps in treatment initiation. It highlights the need for specific policies and interventions to create an effective linkage between RR TB diagnosis and enrollment on second-line TB treatment, with intensified efforts from health providers on follow-up and the expansion of more decentralized, adequate, and accessible diagnostic and treatment services for DR-TB, especially for Pre-XDR TB cases, given the long treatment delays observed.
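As an illustration of the timing computation described in the Method section, the median, IQR, and share of delays beyond seven days can be derived from diagnosis and enrollment dates roughly as follows. The dates below are fabricated for the example, not study data:

```python
from datetime import date
from statistics import median, quantiles

def delay_stats(records, threshold=7):
    """records: (diagnosis_date, enrollment_date) pairs.
    Returns median days to treatment, (Q1, Q3), and % delayed past threshold."""
    days = sorted((enrol - diag).days for diag, enrol in records)
    q1, _, q3 = quantiles(days, n=4)                 # quartiles of the delays
    pct_delayed = 100.0 * sum(d > threshold for d in days) / len(days)
    return median(days), (q1, q3), pct_delayed

# Five invented patients diagnosed the same day, enrolled 2-20 days later
diag = date(2021, 3, 1)
records = [(diag, date(2021, 3, 1 + offset)) for offset in (2, 5, 6, 10, 20)]
med, (q1, q3), pct = delay_stats(records)
```

Note that statistics.quantiles uses the exclusive method by default; a different quantile convention (as statistical packages vary here) would give slightly different IQR bounds.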

Keywords: drug-resistant, tuberculosis, treatment initiation, Nepal, treatment delay

Procedia PDF Downloads 76
17145 User Requirements Analysis for the Development of Assistive Navigation Mobile Apps for Blind and Visually Impaired People

Authors: Paraskevi Theodorou, Apostolos Meliones

Abstract:

In the context of the development process of two assistive navigation mobile apps for blind and visually impaired (BVI) people, an extensive qualitative analysis of the requirements of potential users has been conducted. The analysis was based on interviews with BVI users and aimed to elicit not only their needs with respect to autonomous navigation but also their preferences on specific features of the apps under development. The elicited requirements were structured into four main categories, namely requirements concerning the capabilities, functionality, and usability of the apps, as well as compatibility requirements with respect to other apps and services. The main categories were then further divided into nine sub-categories. This classification, along with its content, aims to become a useful tool for researchers and developers involved in the development of digital services for BVI people.

Keywords: accessibility, assistive mobile apps, blind and visually impaired people, user requirements analysis

Procedia PDF Downloads 118
17144 A Review of Different Studies on Hidden Markov Models for Multi-Temporal Satellite Images: Stationarity and Non-Stationarity Issues

Authors: Ali Ben Abbes, Imed Riadh Farah

Abstract:

Due to considerable advances in Multi-Temporal Satellite Images (MTSI), remote sensing applications have become more accurate. Recently, many approaches to modeling MTSI have been developed. The purpose of this article is to present an overview of studies using the Hidden Markov Model (HMM). First of all, we provide background on HMMs and their applications in this context. A comparison of the different works is discussed, and possible areas and challenges are highlighted. Secondly, we discuss applications to vegetation monitoring as well as urban growth. Nevertheless, most research efforts have used only stationary data. From another point of view, in this paper we describe a new non-stationary HMM that is defined over a set of components of the time series, e.g., seasonal, trend, and random. This approach gives more accurate results and improves the applicability of the HMM in modeling non-stationary data series. In order to assess the performance of the HMM, different experiments are carried out using Moderate Resolution Imaging Spectroradiometer (MODIS) NDVI time series of the northwestern region of Tunisia and Landsat time series of Tres Cantos, Madrid, in Spain.
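The abstract's decomposition of a series into seasonal, trend, and random parts (the inputs to the non-stationary HMM) can be sketched in a simple additive form. This is our own minimal illustration on a synthetic NDVI-like series, not the authors' implementation, and it omits the HMM itself:

```python
import numpy as np

def decompose(series, period):
    """Additive decomposition: series = trend + seasonal + residual."""
    n = len(series)
    # Trend: moving average over one full period
    trend = np.convolve(series, np.ones(period) / period, mode="same")
    detrended = series - trend
    # Seasonal: mean of each position within the cycle, tiled to full length
    cycle = np.array([detrended[i::period].mean() for i in range(period)])
    seasonal = np.tile(cycle, n // period + 1)[:n]
    residual = series - trend - seasonal
    return trend, seasonal, residual

# Synthetic series: linear trend plus an annual cycle (period 12, e.g. monthly NDVI)
t = np.arange(120)
series = 0.01 * t + 0.3 * np.sin(2 * np.pi * t / 12)
trend, seasonal, residual = decompose(series, period=12)
```

In a non-stationary HMM setup, each component (or a state indexed by them) would then drive its own emission distribution; that machinery is beyond this sketch.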

Keywords: multi-temporal satellite image, HMM, non-stationarity, vegetation, urban

Procedia PDF Downloads 349
17143 Residual Lifetime Estimation for Weibull Distribution by Fusing Expert Judgements and Censored Data

Authors: Xiang Jia, Zhijun Cheng

Abstract:

The residual lifetime of a product is the operating time between the current time and the time point at which failure happens. Residual lifetime estimation is rather important in reliability analysis. To predict the residual lifetime, it is necessary to assume or verify a particular distribution that the lifetime of the product follows, and the two-parameter Weibull distribution is frequently adopted to describe lifetimes in reliability engineering. Due to time constraints and cost reduction, a life testing experiment is usually terminated before all the units have failed, so censored data are usually collected. In addition, other information can also be obtained for reliability analysis. Expert judgements are considered, as it is common for experts to provide useful information concerning reliability. Therefore, in this paper the residual lifetime is estimated for the Weibull distribution by fusing censored data and expert judgements. First, closed forms for both the point estimate and the confidence interval of the residual lifetime under the Weibull distribution are presented. Next, the expert judgements are regarded as prior information, and a method to determine the prior distribution of the Weibull parameters is developed. For completeness, both the case of a single expert judgement and the case of multiple expert judgements are considered. Further, the posterior distribution of the Weibull parameters is derived. Since it is difficult to derive the posterior distribution of the residual lifetime directly, a sample-based method is proposed to generate posterior samples of the Weibull parameters based on the Markov Chain Monte Carlo (MCMC) method, and these samples are used to obtain the Bayes estimate and credible interval for the residual lifetime. Finally, an illustrative example is discussed to show the application. It demonstrates that the proposed method is rather simple, satisfactory, and robust.
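The paper's own closed forms are not reproduced in the abstract, but the conditional (residual) reliability of a two-parameter Weibull lifetime, and the median residual life it implies, follow from the standard Weibull survival function; the symbols below are the conventional ones, not taken from the paper:

```python
import math

def residual_reliability(k, lam, t0, t):
    """P(T > t0 + t | T > t0) for Weibull(shape=k, scale=lam):
    R(t0 + t) / R(t0) = exp((t0/lam)**k - ((t0 + t)/lam)**k)."""
    return math.exp((t0 / lam) ** k - ((t0 + t) / lam) ** k)

def median_residual_life(k, lam, t0):
    """Solve residual_reliability(k, lam, t0, t) = 0.5 for t in closed form."""
    return lam * ((t0 / lam) ** k + math.log(2)) ** (1 / k) - t0

# Example: shape 2, scale 100, unit has already survived 50 hours
m = median_residual_life(2.0, 100.0, 50.0)
```

A sanity check: for k = 1 the Weibull reduces to the memoryless exponential, so the median residual life is lam * ln 2 regardless of t0.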

Keywords: expert judgements, information fusion, residual lifetime, Weibull distribution

Procedia PDF Downloads 137
17142 The Effect of "Trait" Variance of Personality on Depression: Application of the Trait-State-Occasion Modeling

Authors: Pei-Chen Wu

Abstract:

Both preexisting cross-sectional and longitudinal studies of the personality-depression relationship have suffered from one main limitation: they ignored that the stability of the constructs of interest (e.g., personality and depression) can be expected to influence the estimate of the association between personality and depression. To address this limitation, Trait-State-Occasion (TSO) modeling was adopted to analyze the sources of variance of the focal constructs. TSO modeling operates by partitioning a state variance into time-invariant (trait) and time-variant (occasion) components. Within a TSO framework, it is possible to predict change in the part of a construct that really changes (i.e., the time-variant variance) while controlling for the trait variances. 750 high school students were followed for 4 waves over six-month intervals. The baseline data (T1) were collected from senior high school students (aged 14 to 15 years). Participants were given the Beck Depression Inventory and Big Five Inventory at each assessment. TSO modeling revealed that 70~78% of the variance in personality (five constructs) was stable over the follow-up period, whereas 57~61% of the variance in depression was stable. For the personality constructs, 7.6% to 8.4% of the total variance came from the autoregressive occasion factors; for the depression construct, 15.2% to 18.1% of the total variance came from the autoregressive occasion factors. Additionally, results showed that when controlling for initial symptom severity, the time-invariant components of all five dimensions of personality were predictive of change in depression (Extraversion: B = .32, Openness: B = -.21, Agreeableness: B = -.27, Conscientious: B = -.36, Neuroticism: B = .39). Because the five dimensions of personality share some variance, models in which all five dimensions simultaneously predicted change in depression were also investigated.
The time-invariant components of five dimensions were still significant predictors for change in depression (Extraversion: B = .30, Openness: B = -.24, Agreeableness: B = -.28, Conscientious: B = -.35, Neuroticism: B = .42). In sum, the majority of the variability of personality was stable over 2 years. Individuals with the greater tendency of Extraversion and Neuroticism have higher degrees of depression; individuals with the greater tendency of Openness, Agreeableness and Conscientious have lower degrees of depression.
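The trait/occasion variance partition at the heart of TSO modeling can be illustrated with a toy simulation. The variance ratios below are chosen by us so the stable share lands near the 70-78% the study reports, and the occasion part is simplified to white noise rather than the autoregressive occasion factor of a full TSO model:

```python
import numpy as np

rng = np.random.default_rng(1)
n_persons, n_waves = 5000, 4

trait = rng.normal(0.0, 1.0, size=(n_persons, 1))            # time-invariant part
occasion = rng.normal(0.0, 0.6, size=(n_persons, n_waves))   # time-varying part
scores = trait + occasion                                    # observed state per wave

# Share of observed variance at each wave that is stable (trait) variance:
# var(trait) / (var(trait) + var(occasion)) = 1 / (1 + 0.36) ~ 0.735
trait_share = trait.var() / scores.var(axis=0)
```

Fitting an actual TSO model would instead estimate these components from the repeated measures via a structural equation model; the simulation only shows what the reported percentages mean.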

Keywords: assessment, depression, personality, trait-state-occasion model

Procedia PDF Downloads 172
17141 Magnitude of Infection and Associated Factors in Open Tibial Fractures Treated Operatively at Addis Ababa Burn Emergency and Trauma Center, April 2023

Authors: Tuji Mohammed Sani

Abstract:

Background: An open tibial fracture is an injury in which the fractured bone directly communicates with the outside environment. Due to the specific anatomical features of the tibia (limited soft tissue coverage), more than a quarter of its fractures are classified as open, representing the most common open long-bone injuries. Open tibial fractures frequently cause significant bone comminution, periosteal stripping, soft tissue loss, and contamination, and are prone to bacterial entry with biofilm formation, which increases the risk of deep bone infection. Objective: The main objective of the study was to determine the prevalence of infection and its associated factors in surgically treated open tibial fractures at the Addis Ababa Burn Emergency and Trauma (AaBET) center. Method: A facility-based retrospective cross-sectional study was conducted among patients treated for open tibial fracture at the AaBET center from September 2018 to September 2021. The data were collected from patients' charts using a structured data collection form, and data were entered and analyzed using SPSS version 26. Bivariable and multiple binary logistic regressions were fitted. Multicollinearity was checked among candidate variables using the variance inflation factor and tolerance, which were less than 5 and greater than 0.2, respectively. Model adequacy was tested using the Hosmer-Lemeshow goodness-of-fit test (P=0.711). AORs at 95% CI were reported, and P-value < 0.05 was considered statistically significant. Result: This study found that 33.9% of the study participants had an infection. Time to initial IV antibiotics (AOR=2.924, 95% CI: 1.160-7.370), time from injury to wound closure (AOR=3.524, 95% CI: 1.798-6.908), injury-to-admission time (AOR=2.895, 95% CI: 1.402-5.977), and definitive fixation method (AOR=0.244, 95% CI: 0.113-0.4508) were the factors found to have a statistically significant association with the occurrence of infection.
Conclusion: The infection rate in open tibial fractures indicates a need to improve the management of open tibial fractures treated at the AaBET center. Time from injury to admission, time from injury to first debridement, time to wound closure, and time from injury to initial intravenous antibiotics are important factors that can readily be addressed to reduce the infection rate. Whether the wound was closed within seven days was a particularly important factor associated with the occurrence of infection.
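The multivariable analysis described above can be sketched in a few lines. This is a minimal illustration, not the study's SPSS workflow: it fits a binary logistic regression by gradient descent on simulated data with two hypothetical binary predictors (delayed antibiotics, delayed wound closure) and reports exponentiated coefficients as adjusted odds ratios.

```python
import math
import random

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Fit logistic regression by gradient descent.
    Returns coefficients, intercept first."""
    n, p = len(X), len(X[0])
    beta = [0.0] * (p + 1)
    for _ in range(epochs):
        grad = [0.0] * (p + 1)
        for xi, yi in zip(X, y):
            z = beta[0] + sum(b * x for b, x in zip(beta[1:], xi))
            pred = 1.0 / (1.0 + math.exp(-z))
            err = yi - pred
            grad[0] += err
            for j, x in enumerate(xi):
                grad[j + 1] += err * x
        beta = [b + lr * g / n for b, g in zip(beta, grad)]
    return beta

# Hypothetical data mirroring two of the study's candidate factors;
# the infection probabilities below are illustrative only.
random.seed(0)
X, y = [], []
for _ in range(400):
    abx_delay = random.random() < 0.5      # delayed IV antibiotics
    late_closure = random.random() < 0.5   # delayed wound closure
    p_inf = 0.15 + 0.20 * abx_delay + 0.25 * late_closure
    X.append([float(abx_delay), float(late_closure)])
    y.append(1.0 if random.random() < p_inf else 0.0)

beta = fit_logistic(X, y)
# exp(coefficient) gives the adjusted odds ratio for each predictor.
odds_ratios = [math.exp(b) for b in beta[1:]]
print("Adjusted odds ratios:", [round(o, 2) for o in odds_ratios])
```

Both simulated delays raise infection risk, so their odds ratios come out above 1, matching the direction of the AORs reported for the timing variables in the abstract.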

Keywords: infection, open tibia, fracture, magnitude

Procedia PDF Downloads 74
17140 Elasto-Viscoplastic Constitutive Modelling of Slow-Moving Landslides

Authors: Deepak Raj Bhat, Kazushige Hayashi, Yorihiro Tanaka, Shigeru Ogita, Akihiko Wakai

Abstract:

Slow-moving landslides are among the major natural disasters in mountainous regions, so studying their creep displacement behaviour and the associated geological and geotechnical issues is important. This study evaluated the slow-moving behaviour of a landslide using a 2D-FEM-based elasto-viscoplastic constitutive model. To the best of our knowledge, two new control constitutive parameters were incorporated into the numerical model for the first time to better capture this behaviour. First, the predicted time histories of the landslide's horizontal displacement are presented and discussed, which may be useful for future landslide displacement prediction. The simulated deformation and shear strain patterns are then presented, and the possible failure mechanism along the slip surface of such landslides is discussed on the basis of the simulation results. It is believed that this study will aid understanding of slow-moving landslide behaviour and, at the same time, ease long-term monitoring and management of landslide disasters.
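The abstract does not give the model's equations, but the creep mechanism it describes can be illustrated with a simple Perzyna-type viscoplastic law (an assumption here, not the authors' formulation): shear strain accumulates only while the driving stress exceeds a yield stress, producing slow, steady displacement. All parameter values below are hypothetical.

```python
def creep_displacement(tau, tau_y, eta, n, thickness, dt, steps):
    """Integrate a Perzyna-type viscoplastic shear-strain rate,
    gamma_dot = <(tau - tau_y)/tau_y>**n / eta,
    where <.> is the Macaulay bracket (creep only when tau > tau_y).
    Returns the displacement history of the sliding mass in metres."""
    gamma, history = 0.0, []
    for _ in range(steps):
        over = max((tau - tau_y) / tau_y, 0.0)
        gamma += dt * (over ** n) / eta
        history.append(gamma * thickness)
    return history

# Illustrative parameters (not from the paper): driving shear stress
# slightly above yield, as when a groundwater rise reduces effective
# strength; daily time steps over one year.
disp = creep_displacement(tau=105e3, tau_y=100e3, eta=1e7, n=1.5,
                          thickness=10.0, dt=86400.0, steps=365)
print(f"Displacement after 1 year: {disp[-1]:.3f} m")
```

With these values the mass creeps roughly a third of a metre per year, the order of magnitude typical of slow-moving landslides; dropping tau below tau_y stops the motion entirely.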

Keywords: numerical simulation, groundwater fluctuations, elasto-viscoplastic model, slow-moving behaviour

Procedia PDF Downloads 64