Search results for: hazard maps
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1316

1256 Analysis of Rockfall Hazard along Himalayan Road Cut Slopes

Authors: Sarada Prasad Pradhan, Vikram Vishal, Tariq Siddique

Abstract:

With a vast area of India comprising hilly terrain and road cut slopes, landslides and rockfalls are a common phenomenon. However, while landslide studies have received much attention in India in the past, very little literature and analysis are available on the rockfall hazard of many rockfall-prone areas, specifically in the Uttarakhand Himalaya, India. The resulting lack of knowledge and understanding of the rockfall phenomenon, as well as frequent incidences of rockfall-related fatalities, urge the necessity of conducting site-specific rockfall studies, both to highlight the importance of addressing this issue and to provide data for the safe design of preventive structures. The present study was conducted across 10 rockfall-prone road cut slopes over a distance of 15 km starting from Devprayag, India along National Highway 58 (NH-58). In order to make a qualitative assessment of the rockfall hazard posed by these slopes, Rockfall Hazard Rating using standards for Indian rockmass was conducted at 10 locations under different slope conditions. Moreover, to accurately predict the characteristics of possible rockfall events, numerical simulation was carried out to calculate the maximum bounce heights, total kinetic energies, translational velocities and trajectories of the falling rockmass blocks when simulated on each of these slopes under real-life conditions. As it was observed that varying slope geometry had a greater impact on rockfall hazard than the size of the rock masses, several optimizations have been suggested for each slope regarding the location of barriers and the modification of slope geometries in order to minimize damage by falling rocks. This study can be extremely useful in emphasizing the significance of rockfall studies and the construction of mitigative barriers and structures along NH-58 around Devprayag.
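The abstract does not name the simulation code used; as a rough illustration of the quantities it reports (bounce heights and kinetic energies per impact), a minimal lumped-mass rockfall sketch with an assumed normal coefficient of restitution might look like the following. The slope angle, block mass, drop height, and restitution value are illustrative assumptions, not values from the study.

```python
import math

def simulate_rockfall(h0, slope_deg, rn=0.35, mass=100.0, g=9.81, n_bounces=3):
    """Lumped-mass sketch: a block falls h0 metres onto a planar slope and
    rebounds with an assumed normal restitution rn; returns per-impact kinetic
    energies (J) and rebound heights measured normal to the slope (m)."""
    energies, bounce_heights = [], []
    v = math.sqrt(2 * g * h0)  # impact speed after free fall
    for _ in range(n_bounces):
        energies.append(0.5 * mass * v ** 2)  # kinetic energy at impact
        # normal component of rebound velocity, reduced by restitution
        v_normal = v * math.cos(math.radians(slope_deg)) * rn
        bounce_heights.append(v_normal ** 2 / (2 * g))
        v = v * rn  # crude overall energy loss per bounce
    return energies, bounce_heights
```

Real rockfall codes track full trajectories over digitized slope profiles with separate normal/tangential restitution; this sketch only shows why energies and bounce heights decay between impacts.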

Keywords: rockfall, slope stability, rockmass, hazard

Procedia PDF Downloads 184
1255 Bayesian Estimation Using Markov Chain Monte Carlo and Lindley's Approximation Based on Type-I Censored Data

Authors: Al Omari Moahmmed Ahmed

Abstract:

This paper describes the Bayesian estimator using Markov Chain Monte Carlo and Lindley’s approximation, and the maximum likelihood estimation, of the Weibull distribution with Type-I censored data. The maximum likelihood method cannot estimate the shape parameter in closed form, although it can be solved by numerical methods. Moreover, the Bayesian estimates of the parameters and of the survival and hazard functions cannot be solved analytically. Hence the Markov Chain Monte Carlo method and Lindley’s approximation are used: the full conditional distributions for the parameters of the Weibull distribution are obtained via Gibbs sampling and the Metropolis-Hastings (MH) algorithm, followed by estimation of the survival and hazard functions. The methods are compared to their maximum likelihood counterparts with respect to the mean square error (MSE) and absolute bias to determine the better method for the scale and shape parameters and the survival and hazard functions.
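As a sketch of the sampling step described above, the following implements a random-walk Metropolis-Hastings sampler for the Weibull parameters under Type-I censoring. Flat priors on the log-parameters, the proposal step size, and the burn-in fraction are assumptions made for illustration; the paper's Gibbs and Lindley details are not reproduced.

```python
import math, random

def weibull_loglik(times, n_cens, c, lam, beta):
    """Log-likelihood of Weibull(scale=lam, shape=beta) under Type-I
    censoring: failures observed at `times` (< c), n_cens units censored at c."""
    ll = sum(math.log(beta / lam) + (beta - 1) * math.log(t / lam)
             - (t / lam) ** beta for t in times)
    ll += n_cens * (-(c / lam) ** beta)  # log-survival of censored units
    return ll

def mh_sampler(times, n_cens, c, n_iter=4000, seed=1):
    """Random-walk MH on (log lam, log beta), flat priors on the logs
    (a common vague choice; an assumption here). Returns post-burn-in draws."""
    random.seed(seed)
    x = [math.log(sum(times) / len(times)), 0.0]  # crude start: lam=mean, beta=1
    ll = weibull_loglik(times, n_cens, c, math.exp(x[0]), math.exp(x[1]))
    draws = []
    for _ in range(n_iter):
        xp = [x[0] + random.gauss(0, 0.1), x[1] + random.gauss(0, 0.1)]
        llp = weibull_loglik(times, n_cens, c, math.exp(xp[0]), math.exp(xp[1]))
        if math.log(random.random()) < llp - ll:  # symmetric proposal
            x, ll = xp, llp
        draws.append((math.exp(x[0]), math.exp(x[1])))
    return draws[n_iter // 2:]  # discard first half as burn-in

def posterior_survival(draws, t0):
    """Posterior mean of S(t0) = exp(-(t0/lam)^beta) over the MCMC draws."""
    return sum(math.exp(-(t0 / l) ** b) for l, b in draws) / len(draws)
```

The posterior mean of the hazard h(t0) = (beta/lam)(t0/lam)^(beta-1) follows from the same draws in the same way.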

Keywords: weibull distribution, bayesian method, markov chain monte carlo, survival and hazard functions

Procedia PDF Downloads 449
1254 Modeling Geogenic Groundwater Contamination Risk with the Groundwater Assessment Platform (GAP)

Authors: Joel Podgorski, Manouchehr Amini, Annette Johnson, Michael Berg

Abstract:

One-third of the world’s population relies on groundwater for its drinking water. Natural geogenic arsenic and fluoride contaminate ~10% of wells. Prolonged exposure to high levels of arsenic can result in various internal cancers, while high levels of fluoride are responsible for the development of dental and crippling skeletal fluorosis. In poor urban and rural settings, the provision of drinking water free of geogenic contamination can be a major challenge. In order to efficiently apply limited resources in the testing of wells, water resource managers need to know where geogenically contaminated groundwater is likely to occur. The Groundwater Assessment Platform (GAP) fulfills this need by providing state-of-the-art global arsenic and fluoride contamination hazard maps as well as enabling users to create their own groundwater quality models. The global risk models were produced by logistic regression of arsenic and fluoride measurements using predictor variables of various soil, geological and climate parameters. The maps display the probability of encountering concentrations of arsenic or fluoride exceeding the World Health Organization’s (WHO) stipulated concentration limits of 10 µg/L or 1.5 mg/L, respectively. In addition to a reconsideration of the relevant geochemical settings, these second-generation maps represent a great improvement over the previous risk maps due to a significant increase in data quantity and resolution. For example, there is a 10-fold increase in the number of measured data points, and the resolution of predictor variables is generally 60 times greater. These same predictor variable datasets are available on the GAP platform for visualization as well as for use with a modeling tool. The latter requires that users upload their own concentration measurements and select the predictor variables that they wish to incorporate in their models. 
In addition, users can upload additional predictor variable datasets either as features or coverages. Such models can represent an improvement over the global models already supplied, since (a) users may be able to use their own, more detailed datasets of measured concentrations and (b) the various processes leading to arsenic and fluoride groundwater contamination can be isolated more effectively on a smaller scale, thereby resulting in a more accurate model. All maps, including user-created risk models, can be downloaded as PDFs. Users can also share data and collaborate in a secure environment through the creation of communities. In summary, GAP provides users with the means to reliably and efficiently produce models specific to their region of interest by making available the latest datasets of predictor variables along with the necessary modeling infrastructure.
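The global models are built by logistic regression of binary exceedance outcomes (e.g., arsenic above 10 µg/L) on environmental predictors. A minimal single-predictor sketch, using plain gradient descent on synthetic data rather than GAP's actual pipeline or predictor set, illustrates the idea:

```python
import math, random

def fit_logistic(xs, ys, lr=0.1, epochs=2000):
    """Plain gradient-descent logistic regression (one predictor + intercept).
    Returns (b0, b1) so that P(exceed) = 1 / (1 + exp(-(b0 + b1 * x)))."""
    b0 = b1 = 0.0
    n = len(xs)
    for _ in range(epochs):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += (p - y)          # gradient w.r.t. intercept
            g1 += (p - y) * x      # gradient w.r.t. slope
        b0 -= lr * g0 / n
        b1 -= lr * g1 / n
    return b0, b1

def exceedance_probability(b0, b1, x):
    """Probability of exceeding the concentration limit at predictor value x."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
```

In practice one would use many predictors (soil, geology, climate rasters) and a library fitter; the mapped hazard is then this probability evaluated cell by cell.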

Keywords: arsenic, fluoride, groundwater contamination, logistic regression

Procedia PDF Downloads 314
1253 Investigations on the Application of Avalanche Simulations: A Survey Conducted among Avalanche Experts

Authors: Korbinian Schmidtner, Rudolf Sailer, Perry Bartelt, Wolfgang Fellin, Jan-Thomas Fischer, Matthias Granig

Abstract:

This study focuses on the evaluation of snow avalanche simulations, based on a survey carried out among avalanche experts. In recent decades, the application of avalanche simulation tools has gained recognition within the realm of hazard management. Traditionally, avalanche runout models were used to predict extreme avalanche runout and prepare avalanche maps. This has changed rather dramatically with the application of numerical models. For safety applications such as road management, simulation tools are now being coupled with real-time meteorological measurements to predict frequent avalanche hazard. That places new demands on model accuracy and requires the simulation of physical processes that previously could be ignored. These simulation tools are based on a deterministic description of the avalanche movement, allowing the prediction of certain quantities (e.g. pressure, velocities, flow heights, runout lengths) of the avalanche flow. Because of the highly variable regimes of flowing snow, no uniform rheological law describing the motion of an avalanche is known. Therefore, analogies to the fluid-dynamical laws of other materials are drawn. To transfer these constitutive laws to snow flows, certain assumptions and adjustments have to be imposed. Besides these limitations, there are high uncertainties regarding the initial and boundary conditions. Further challenges arise when implementing the underlying flow model equations into an algorithm executable by a computer. This implementation is constrained by the choice of adequate numerical methods and their computational feasibility. Hence, model development is compelled to introduce further simplifications and the related uncertainties. In light of these issues, many questions arise about avalanche simulations: their assets and drawbacks, the potential for improvement, and their application in practice.
To address these questions, a survey among experts in the field of avalanche science (e.g. researchers, practitioners, engineers) from various countries was conducted. In the questionnaire, special attention is drawn to the experts’ opinions regarding the influence of certain variables on the simulation result, their uncertainty, and the reliability of the results. Furthermore, it was tested to what degree a simulation result influences decision making for a hazard assessment. A discrepancy was found between the large uncertainty of the simulation input parameters and the comparatively high reliability attributed to the results. This contradiction can be explained by considering how the experts employ the simulations. The credibility of the simulations results from a rather thorough simulation study, in which different assumptions are tested and the results of different flow models are compared, along with the use of supplemental data such as chronicles, field observations and silent witnesses, among others, which are regarded as essential for the hazard assessment and for sanctioning simulation results. As the importance of avalanche simulations within hazard management grows along with their further development, studies focusing on the manner of modeling could contribute to a better understanding of how knowledge of the avalanche process can be gained by running simulations.

Keywords: expert interview, hazard management, modeling, simulation, snow avalanche

Procedia PDF Downloads 295
1252 Introduction to Various Innovative Techniques Suggested for Seismic Hazard Assessment

Authors: Deepshikha Shukla, C. H. Solanki, Mayank K. Desai

Abstract:

Amongst all natural hazards, earthquakes have the potential to cause the greatest damage. Since earthquake forces are random in nature and unpredictable, quantification of the hazard becomes important in order to assess it. The time and place of a future earthquake are both uncertain. Since earthquakes can neither be prevented nor predicted, engineers have to design and construct in such a way that the damage to life and property is minimized. Seismic hazard analysis plays an important role in the design of earthquake-resistant structures by providing rational values of the input parameters. In this paper, both mathematical and computational methods adopted by researchers globally in the past five years will be discussed. Some mathematical approaches involving the concepts of Poisson’s ratio, convex set theory, the empirical Green’s function, Bayesian probability estimation applied to seismic hazard, and the FOSM (first-order second-moment) algorithm will be discussed. Computational approaches and the numerical model SSIFiBo, developed in MATLAB to study the dynamic soil-structure interaction problem, are also discussed. The GIS-based tools predominantly used in the assessment of seismic hazards will be discussed as well.
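One building block shared by most probabilistic seismic hazard methods is the Poisson occurrence model, which links a ground-motion return period to an exceedance probability over a design life. A small sketch of these standard relations (not specific to any method in the paper):

```python
import math

def exceedance_probability(return_period_yr, exposure_yr):
    """Poisson model: probability of at least one exceedance of a ground-motion
    level with mean return period T_R during an exposure time t:
    P = 1 - exp(-t / T_R)."""
    rate = 1.0 / return_period_yr
    return 1.0 - math.exp(-rate * exposure_yr)

def return_period(prob, exposure_yr):
    """Inverse relation: return period giving probability `prob` of at least
    one exceedance in `exposure_yr` years."""
    return -exposure_yr / math.log(1.0 - prob)
```

For example, the common design target of a 10% exceedance probability in a 50-year exposure corresponds to a return period of about 475 years.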

Keywords: computational methods, MATLAB, seismic hazard, seismic measurements

Procedia PDF Downloads 307
1251 Different Data-Driven Bivariate Statistical Approaches to Landslide Susceptibility Mapping (Uzundere, Erzurum, Turkey)

Authors: Azimollah Aleshzadeh, Enver Vural Yavuz

Abstract:

The main goal of this study is to produce landslide susceptibility maps using different data-driven bivariate statistical approaches, namely the entropy weight method (EWM), evidence belief function (EBF), and information content model (ICM), in Uzundere county, Erzurum province, in the north-eastern part of Turkey. Past landslide occurrences were identified and mapped from an interpretation of high-resolution satellite images and earlier reports, as well as by carrying out field surveys. In total, 42 landslide incidence polygons were mapped using ArcGIS 10.4.1 software and randomly split into a construction dataset of 70% (30 landslide incidences) for building the EWM, EBF, and ICM models; the remaining 30% (12 landslide incidences) were used for verification purposes. Twelve layers of landslide-predisposing parameters were prepared, including total surface radiation, maximum relief, soil groups, standard curvature, distance to stream/river sites, distance to the road network, surface roughness, land use pattern, engineering geological rock group, topographical elevation, orientation of slope, and terrain slope gradient. The relationships between the landslide-predisposing parameters and the landslide inventory map were determined using the different statistical models (EWM, EBF, and ICM). The model results were validated with landslide incidences that were not used during model construction. In addition, receiver operating characteristic curves were applied, and the area under the curve (AUC) was determined for the different susceptibility maps using the success (construction data) and prediction (verification data) rate curves. The results revealed that the AUC success rates are 0.7055, 0.7221, and 0.7368, while the prediction rates are 0.6811, 0.6997, and 0.7105 for the EWM, EBF, and ICM models, respectively.
Consequently, the landslide susceptibility maps were classified into five susceptibility classes: very low, low, moderate, high, and very high. Additionally, the portion of construction and verification landslide incidences in the high and very high landslide susceptibility classes of each map was determined. The results showed that the EWM, EBF, and ICM models produced satisfactory accuracy. The obtained landslide susceptibility maps may be useful for future natural hazard mitigation studies and planning purposes for environmental protection.
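The AUC values reported above come from success and prediction rate curves. As a sketch, the area under the ROC curve can be computed directly from susceptibility scores at landslide and non-landslide cells via the Mann-Whitney statistic (an equivalent formulation, not necessarily the authors' exact procedure):

```python
def auc(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the fraction of (landslide, non-landslide) cell pairs in which the
    landslide cell receives the higher susceptibility score (ties count half)."""
    wins = ties = 0
    for sp in scores_pos:
        for sn in scores_neg:
            if sp > sn:
                wins += 1
            elif sp == sn:
                ties += 1
    return (wins + 0.5 * ties) / (len(scores_pos) * len(scores_neg))
```

An AUC of 0.5 means the map ranks cells no better than chance; 1.0 means perfect separation of landslide from non-landslide cells.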

Keywords: entropy weight method, evidence belief function, information content model, landslide susceptibility mapping

Procedia PDF Downloads 104
1250 [Keynote Talk]: Applying p-Balanced Energy Technique to Solve Liouville-Type Problems in Calculus

Authors: Lina Wu, Ye Li, Jia Liu

Abstract:

We are interested in solving Liouville-type problems to explore constancy properties for maps or differential forms on Riemannian manifolds. Geometric structures on manifolds, the existence of constancy properties for maps or differential forms, and energy growth for maps or differential forms are intertwined. In this article, we concentrate on the discovery of solutions to Liouville-type problems where the manifolds are Euclidean spaces (i.e. flat Riemannian manifolds) and the maps become real-valued functions. Liouville-type results of vanishing properties for functions are obtained. The original contribution of our research is to extend the q-energy for a function from finite in Lq space to infinite in non-Lq space by applying the p-balanced technique where q = p = 2. Calculation tools such as Hölder's inequality and tests for series have been used to evaluate limits and integrations for the function energy. The calculation ideas and computational techniques for solving Liouville-type problems shown in this article, which are utilized in Euclidean spaces, can be generalized into an algorithm that works for both maps and differential forms on Riemannian manifolds. This algorithm has a far-reaching impact on the work of solving Liouville-type problems in general settings involving infinite energy. The p-balanced technique in this algorithm provides a clue to success on the road to q-energy extension from finite to infinite.
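For reference, the two ingredients named above can be written out. This is the standard statement of Hölder's inequality and a common definition of the q-energy of a function; the precise manifold setting of the paper is not reproduced here.

```latex
% Hölder's inequality for conjugate exponents p, p' with 1/p + 1/p' = 1:
\int_{\mathbb{R}^n} |f g| \, dx \;\le\;
  \left( \int_{\mathbb{R}^n} |f|^{p} \, dx \right)^{1/p}
  \left( \int_{\mathbb{R}^n} |g|^{p'} \, dx \right)^{1/p'},
  \qquad \frac{1}{p} + \frac{1}{p'} = 1 .

% q-energy of a real-valued function u; q = p = 2 gives the Dirichlet energy:
E_q(u) = \int_{\mathbb{R}^n} |\nabla u|^{q} \, dx ,
  \qquad
E_2(u) = \int_{\mathbb{R}^n} |\nabla u|^{2} \, dx .
```

A Liouville-type result then asserts that a function (or map) with suitably controlled growth of this energy must be constant.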

Keywords: differential forms, Hölder inequality, Liouville-type problems, p-balanced growth, p-harmonic maps, q-energy growth, tests for series

Procedia PDF Downloads 208
1249 Process Safety Evaluation of a Nuclear Power Plant through Virtual Process Hazard Analysis Using Hazard and Operability Technique

Authors: Elysa V. Largo, Lormaine Anne A. Branzuela, Julie Marisol D. Pagalilauan, Neil C. Concibido, Monet Concepcion M. Detras

Abstract:

The energy demand in the country is increasing; thus, nuclear energy has recently been mandated as an addition to the energy mix. The Philippines has the Bataan Nuclear Power Plant (BNPP), which could be a source of nuclear energy; however, it has never been operated since the completion of its construction. Thus, evaluating the safety of the BNPP is vital. This study explored the possible deviations that may occur in the operation of a nuclear power plant with a pressurized water reactor, similar to the BNPP, through a virtual process hazard analysis (PHA) using the hazard and operability (HAZOP) technique. Temperature, pressure, and flow were used as parameters. A total of 86 causes of various deviations were identified, with the primary system and the line from the reactor coolant pump to the reactor vessel being the most critical system and node, respectively. A total of 348 scenarios were determined. The critical events are radioactive leaks due to nuclear meltdown and sump overflow, which could lead to multiple worker fatalities, one or more public fatalities, and environmental remediation. Existing safeguards were identified; however, further recommendations were provided for additional and supplemental barriers to reduce the risk.
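The deviation-generation step of a HAZOP study pairs guide words with the process parameters named above (temperature, pressure, flow). A small sketch of that enumeration follows; the guide-word list and the screening of physically meaningless combinations are illustrative conventions, not the study's actual worksheet.

```python
from itertools import product

PARAMETERS = ["temperature", "pressure", "flow"]
GUIDE_WORDS = ["no", "more", "less", "reverse", "as well as", "other than"]

def hazop_deviations(parameters=PARAMETERS, guide_words=GUIDE_WORDS):
    """Enumerate candidate HAZOP deviations (guide word + parameter) for one
    study node; the team then screens each for credible causes/consequences."""
    devs = []
    for gw, param in product(guide_words, parameters):
        if gw == "no" and param == "temperature":
            continue  # 'no temperature' is not a meaningful deviation
        if gw == "reverse" and param != "flow":
            continue  # 'reverse' conventionally applies to flow only
        devs.append(f"{gw} {param}")
    return devs
```

Each surviving deviation (e.g. "more pressure", "reverse flow") becomes one row of the HAZOP worksheet, to which causes, consequences, safeguards, and recommendations are attached.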

Keywords: PSM, PHA, HAZOP, nuclear power plant

Procedia PDF Downloads 114
1248 Verification of the Effect of the Hazard-Perception Training Tool for Drivers Ported from a Tablet Device to a Smartphone

Authors: K. Shimazaki, M. Mishina, A. Fujii

Abstract:

In a previous study, we developed a hazard-perception training tool for drivers using a tablet device and verified its effectiveness. Accident movies recorded by drive recorders were separated into scenes before and after the collision. The scene before the collision is presented to the driver, who then touches the screen to point out where he/she senses danger. After the screen is touched, the tool presents the collision scene and tells the driver whether what he/she pointed out was correct. Various effects were observed: for example, the tool increased the discovery rate of collision targets and reduced reaction time. In this study, we optimized this tool for the smartphone and verified its effectiveness. Verification in the same manner as the previous study on tablet devices clarified that the same effect can be obtained on the smartphone screen.

Keywords: hazard perception, smartphone, tablet devices, driver education

Procedia PDF Downloads 199
1247 Cognitive Characteristics of Industrial Workers in Fuzzy Risk Assessment

Authors: Hyeon-Kyo Lim, Sang-Hun Byun

Abstract:

Risk assessment is carried out in most industrial plants for accident prevention, but the data available for statistical decision making are insufficient. It is commonly said that risk can be expressed as a product of the consequence and the likelihood of a corresponding hazard factor. Risk assessment therefore ultimately involves human decision making, which cannot be objective per se. This study was carried out to comprehend the perceptive characteristics of human beings in industrial plants. Subjects were shown a set of illustrations depicting scenes in industrial plants and were asked to assess the risk of each scene, with both linguistic variables and numeric scores, in terms of consequence and likelihood. Their responses were then formulated as fuzzy membership functions and compared with those of university students who had no experience of industrial work. The results showed that the risk levels given by industrial workers were lower than those of the other groups, implying that workers may generally have a tendency to overlook hazard factors in their work fields.
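As a sketch of how such responses can be formulated, the following defines triangular membership functions for hypothetical linguistic terms and the common product form risk = consequence × likelihood. The term boundaries and the 0-10 scale are illustrative assumptions, not the study's elicited functions.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b on support [a, c];
    degenerate edges (a == b or b == c) give left/right shoulders."""
    if x == b:
        return 1.0
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Hypothetical linguistic terms on a 0-10 severity scale (assumed for illustration)
TERMS = {"low": (0, 0, 5), "medium": (2, 5, 8), "high": (5, 10, 10)}

def fuzzify(x, terms=TERMS):
    """Membership grade of a numeric rating in each linguistic term."""
    return {name: tri(x, *abc) for name, abc in terms.items()}

def crisp_risk(consequence, likelihood):
    """Crisp risk score as the common product of consequence and likelihood."""
    return consequence * likelihood
```

Comparing groups then amounts to comparing the membership functions (or the defuzzified scores) fitted to each group's responses.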

Keywords: fuzzy, hazard, linguistic variable, risk assessment

Procedia PDF Downloads 226
1246 Quantum Algebra from Generalized Q-Algebra

Authors: Muna Tabuni

Abstract:

The paper contains an investigation of the notion of Q-algebras. A brief introduction to quantum mechanics is given; in such systems, the state is defined by a vector in a complex vector space H with a Hermitian inner product. H may be finite- or infinite-dimensional. In quantum mechanics, operators must be Hermitian. These facts are preserved by Lie algebra operators but not by those of quantum algebras. A Hilbert space H consists of a set of vectors and a set of scalars. A Lie group is a differentiable topological space with group laws given by differentiable maps. A Lie algebra is introduced, and Q-algebra is defined. A brief introduction to BCI-algebra is given, and a BCI subalgebra is introduced. A brief introduction to BCK- and BCH-algebras is given; every BCI-algebra is a BCH-algebra. The meaning of homomorphism maps is introduced, and homomorphism maps between two BCK-algebras are defined. The mathematical formulations of quantum mechanics can be expressed using the theory of unitary group representations. A generalization of Q-algebras is introduced, and its properties are considered. The Q-quantum algebra is studied, and various examples are given.

Keywords: Q-algebras, BCI, BCK, BCH-algebra, quantum mechanics

Procedia PDF Downloads 171
1245 Coupled Analysis for Hazard Modelling of Debris Flow Due to Extreme Rainfall

Authors: N. V. Nikhil, S. R. Lee, Do Won Park

Abstract:

The Korean peninsula receives about two-thirds of its annual rainfall during the summer season. The extreme rainfall pattern due to typhoons and heavy rainfall results in severe mountain disasters, of which 55% are debris flows, a major natural hazard especially when occurring around major settlement areas. The basic mechanism underlying this kind of failure is unsaturated shallow slope failure through the reduction of matric suction due to the infiltration of water, and liquefaction of the failed mass due to the generation of positive pore water pressure, leading to an abrupt loss of strength and the commencement of flow. An empirical model alone, however, cannot simulate this complex mechanism. Hence, we have employed an empirical-physical based approach for hazard analysis of debris flow using TRIGRS, a debris flow initiation criterion, and DAN-3D at Mt. Woonmyun, South Korea. A debris flow initiation criterion is required to discern the potential landslides which can transform into debris flows. DAN-3D, being a new model, does not have calibrated values of the rheology parameters for Korean conditions. Thus, in our analysis we have used the recent 2011 debris flow event at Mt. Woonmyun for calibration of both the TRIGRS model and DAN-3D, thereafter identifying and predicting the debris flow initiation points, path, runout velocity, and area of spreading for future extreme-rainfall-based scenarios.

Keywords: debris flow, DAN-3D, extreme rainfall, hazard analysis

Procedia PDF Downloads 214
1244 Using Self Organizing Feature Maps for Classification in RGB Images

Authors: Hassan Masoumi, Ahad Salimi, Nazanin Barhemmat, Babak Gholami

Abstract:

Artificial neural networks have gained a lot of interest as empirical models for their powerful representational capacity and multi-input and output mapping characteristics. In fact, most feed-forward networks with nonlinear nodal functions have been proved to be universal approximators. In this paper, we propose a new supervised method for color image classification based on self-organizing feature maps (SOFM). This algorithm is based on competitive learning. The method partitions the input space using self-organizing feature maps to introduce the concept of local neighborhoods. Our image classification system takes RGB images as input. Experiments with simulated data showed that the separability of classes increased with increasing training time. In addition, the results show the proposed algorithm is effective for color image classification.
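The competitive-learning core of a SOFM can be sketched as follows. This is a generic unsupervised map for 3-channel (RGB) vectors with an assumed grid size and decay schedule, not the authors' supervised variant; class labels would be attached to the trained nodes in a second step.

```python
import math, random

class MiniSOM:
    """Tiny self-organizing feature map for RGB vectors on a square grid,
    trained by competitive learning with a shrinking Gaussian neighbourhood."""

    def __init__(self, grid=4, dim=3, seed=0):
        rng = random.Random(seed)
        self.grid = grid
        self.w = [[[rng.random() for _ in range(dim)] for _ in range(grid)]
                  for _ in range(grid)]

    def bmu(self, x):
        """Best-matching unit: grid cell whose weight vector is closest to x."""
        best, best_d = (0, 0), float("inf")
        for i in range(self.grid):
            for j in range(self.grid):
                d = sum((wi - xi) ** 2 for wi, xi in zip(self.w[i][j], x))
                if d < best_d:
                    best, best_d = (i, j), d
        return best

    def train(self, data, epochs=20, lr0=0.5, sigma0=2.0):
        rng = random.Random(1)
        n_steps, t = epochs * len(data), 0
        for _ in range(epochs):
            for x in rng.sample(data, len(data)):  # shuffled pass over the data
                lr = lr0 * (1 - t / n_steps)              # decaying learning rate
                sigma = sigma0 * (1 - t / n_steps) + 0.5  # shrinking neighbourhood
                bi, bj = self.bmu(x)
                for i in range(self.grid):
                    for j in range(self.grid):
                        h = math.exp(-((i - bi) ** 2 + (j - bj) ** 2)
                                     / (2 * sigma ** 2))
                        self.w[i][j] = [w + lr * h * (xi - w)
                                        for w, xi in zip(self.w[i][j], x)]
                t += 1
```

After training, nearby grid nodes respond to similar colors, which is the "local neighborhood" structure the abstract refers to.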

Keywords: classification, SOFM algorithm, neural network, neighborhood, RGB image

Procedia PDF Downloads 444
1243 Evaluation of Groundwater Suitability for Irrigation Purposes: A Case Study for an Arid Region

Authors: Mustafa M. Bob, Norhan Rahman, Abdalla Elamin, Saud Taher

Abstract:

The objective of this study was to assess the suitability of Madinah city groundwater for irrigation purposes. Of the twenty-three wells that were drilled in different locations in the city for the purposes of this study, twenty wells were sampled for water quality analyses. The United States Department of Agriculture (USDA) classification of irrigation water, which is based on sodium hazard (SAR) and salinity hazard, was used for the suitability assessment. In addition, the residual sodium carbonate (RSC) was calculated for all samples and also used for the irrigation suitability assessment. Results showed that all groundwater samples are in the acceptable quality range for irrigation based on RSC values. When SAR and salinity hazard were assessed, results showed that while all groundwater samples (except one) fell in the acceptable range of SAR, they were in either the high or the very high salinity zone, which indicates that care should be taken regarding the type of soil and crops in the study area.
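The two indices used above have standard definitions, with all ion concentrations in meq/L. A minimal sketch; the RSC class limits noted in the comment are the commonly cited values, not thresholds stated in the abstract:

```python
import math

def sar(na, ca, mg):
    """Sodium adsorption ratio: SAR = Na / sqrt((Ca + Mg) / 2),
    all concentrations in meq/L."""
    return na / math.sqrt((ca + mg) / 2.0)

def rsc(co3, hco3, ca, mg):
    """Residual sodium carbonate: RSC = (CO3 + HCO3) - (Ca + Mg), in meq/L.
    Commonly classed as safe (< 1.25), marginal (1.25-2.5), unsuitable (> 2.5)."""
    return (co3 + hco3) - (ca + mg)
```

The USDA diagram then plots SAR against electrical conductivity (the salinity hazard axis) to place each sample in a suitability class.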

Keywords: irrigation suitability, TDS, salinity, SAR

Procedia PDF Downloads 346
1242 Ground Response Analysis at the Rukni Irrigation Project Site Located in Assam, India

Authors: Tauhidur Rahman, Kasturi Bhuyan

Abstract:

In the present paper, ground response analysis at the Rukni irrigation project site has been thoroughly investigated. Surface-level seismic hazard is mainly used by practising engineers for designing important structures, and it can be obtained by accounting for the soil factor. Structures on soft soil will experience more ground shaking than structures located on hard soil; the surface-level ground motion depends on the type of soil. Density and shear wave velocity differ between soil types, and the intensity of soil amplification depends on the density and shear wave velocity of the soil. The Rukni irrigation project is located in the north-eastern region of India, near the Dauki fault (550 km in length), which has produced an earthquake of magnitude Mw 8.5 in the past, and there is a probability of a similar earthquake occurring in the future. Several other faults are also located around the project site. There are 765 recorded strong ground motion time histories available for the region. These data are used to determine the soil amplification factor by incorporating the engineering properties of the soil. With this in view, three soil boreholes have been studied at the project site to a depth of 30 m. It has been observed that in borehole 1 the shear wave velocity varies from 99.44 m/s to 239.28 m/s, while for boreholes 2 and 3 it varies from 93.24 m/s to 241.39 m/s and from 93.24 m/s to 243.01 m/s, respectively. In the present work, the surface-level seismic hazard at the project site has been calculated based on the probabilistic seismic hazard approach, accounting for the soil factor.
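Borehole shear wave velocity profiles like those above are commonly condensed into the time-averaged velocity over the top 30 m (Vs30), which drives site classification in many design codes. A sketch with a simplified NEHRP-style classification (the abstract does not state that Vs30 or NEHRP classes were used; this is standard practice shown for illustration):

```python
def vs30(layers):
    """Time-averaged shear wave velocity over the top 30 m:
    Vs30 = 30 / sum(h_i / Vs_i). layers: (thickness_m, vs_m_per_s) tuples
    from the borehole log, top down."""
    depth, travel_time = 0.0, 0.0
    for h, vs in layers:
        use = min(h, 30.0 - depth)  # clip the last layer at 30 m
        travel_time += use / vs
        depth += use
        if depth >= 30.0:
            break
    return depth / travel_time

def site_class(v):
    """Simplified NEHRP-style site class by Vs30 (m/s); class A (> 1500 m/s)
    omitted for brevity."""
    if v > 760:
        return "B"
    if v > 360:
        return "C"
    if v > 180:
        return "D"
    return "E"
```

With measured velocities of roughly 93-243 m/s, the site would fall in the soft-to-stiff soil classes, which is consistent with the amplification concern raised in the abstract.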

Keywords: Ground Response Analysis, shear wave velocity, soil amplification, surface level seismic hazard

Procedia PDF Downloads 531
1241 Slope Stability and Landslides Hazard Analysis, Limitations of Existing Approaches, and a New Direction

Authors: Alisawi Alaa T., Collins P. E. F.

Abstract:

The analysis and evaluation of slope stability and landslide hazards are critically important in civil engineering projects and broader considerations of safety. The level of slope stability risk should be identified due to its significant and direct financial and safety effects. Slope stability hazard analysis is performed considering static and/or dynamic loading circumstances. To reduce and/or prevent the failure hazard caused by landslides, a sophisticated and practical hazard analysis method using advanced constitutive modeling should be developed and linked to an effective solution that corresponds to the specific type of slope stability and landslide failure risk. Previous studies on slope stability analysis methods identify the failure mechanism and its corresponding solution. The commonly used approaches include limit equilibrium methods, empirical approaches for rock slopes (e.g., slope mass rating and Q-slope), finite element or finite difference methods, and distinct element codes. This study presents an overview and evaluation of these analysis techniques. Contemporary source materials are used to examine these various methods on the basis of hypotheses, factor of safety estimation, soil types, load conditions, and analysis conditions and limitations. Limit equilibrium methods play a key role in assessing the level of slope stability hazard. The slope stability safety level can be defined by identifying the equilibrium of the shear stress and shear strength. The slope is considered stable when the movement resistance forces are greater than those that drive the movement, with a factor of safety (the ratio of the resisting to the driving forces) greater than 1.00.
However, popular and practical methods, including limit equilibrium approaches, are not effective when the slope experiences complex failure mechanisms, such as progressive failure, liquefaction, internal deformation, or creep. The present study represents the first episode of an ongoing project that involves the identification of the types of landslide hazards; assessment of the level of slope stability hazard; development of a sophisticated and practical hazard analysis method; linkage of the failure type of specific landslide conditions to the appropriate solution; and application of an advanced computational method for mapping slope stability properties in the United Kingdom and elsewhere through a geographical information system (GIS) and the inverse distance weighted (IDW) spatial interpolation technique. This study investigates and assesses the different analysis and solution techniques to enhance knowledge of the mechanisms of slope stability and landslide hazard analysis and to determine the available solutions for each potential landslide failure risk.
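The limit-equilibrium balance of resisting and driving forces has a closed form for the infinite-slope idealization, one of the simplest limit equilibrium models, sketched here for illustration (the parameter values in the test are hypothetical):

```python
import math

def infinite_slope_fos(c_kpa, phi_deg, gamma_kn_m3, depth_m, slope_deg, u_kpa=0.0):
    """Infinite-slope limit-equilibrium factor of safety: ratio of the
    Mohr-Coulomb resisting shear stress to the driving shear stress on a
    plane parallel to the surface at the given depth; u is pore pressure."""
    beta = math.radians(slope_deg)
    phi = math.radians(phi_deg)
    # normal and shear stresses on the candidate failure plane
    normal = gamma_kn_m3 * depth_m * math.cos(beta) ** 2
    driving = gamma_kn_m3 * depth_m * math.sin(beta) * math.cos(beta)
    resisting = c_kpa + (normal - u_kpa) * math.tan(phi)
    return resisting / driving
```

For a dry, cohesionless slope this reduces to FoS = tan(phi)/tan(beta), and rising pore pressure lowers the factor of safety, which is the classical rainfall-triggering mechanism the more advanced methods above seek to capture.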

Keywords: slope stability, finite element analysis, hazard analysis, landslides hazard

Procedia PDF Downloads 71
1240 Calculation of Instrumental Results of the Tohoku Earthquake, Japan (Mw 9.0) on March 11, 2011 and Other Destructive Earthquakes during Seismic Hazard Assessment

Authors: J. K. Karapetyan

Abstract:

In this paper, a seismological-statistical analysis of actual instrumental data on the main shock of the Great Japan earthquake of 11.03.2011 is implemented to find the dependence between the maximum values of peak ground acceleration (PGA) and epicentral distance. A number of peculiarities in the manifestation of maximum acceleration values at long epicentral distances are revealed which do not correspond with current scales of seismic intensity.

Keywords: earthquakes, instrumental records, seismic hazard, Japan

Procedia PDF Downloads 341
1239 Hearing Threshold Levels among Steel Industry Workers in Samut Prakan Province, Thailand

Authors: Petcharat Kerdonfag, Surasak Taneepanichskul, Winai Wadwongtham

Abstract:

Industrial noise is usually considered a main environmental health and safety concern because exposure to it can cause permanent, serious hearing damage. Despite strict hearing protection standards and extensive campaigns encouraging public health awareness among industrial workers in Thailand, noise-induced hearing loss has remained a massive obstacle to workers’ health. The aims of the study were to explore and specify the hearing threshold levels among steel industry workers assigned to zones with higher noise levels, and to examine the relationships between hearing loss and workers’ age and length of employment, in Samut Prakan province, Thailand. A cross-sectional study design was used. Ninety-three steel industry workers from two factories, working in the designated zone of higher noise (> 85 dBA) with more than one year of employment, were selected by simple random sampling among those available to participate and were assessed by audiometric screening at the regional Samut Prakan hospital. Screening data were collected from October to December 2016 by an occupational medicine physician and a qualified occupational nurse; all participants were examined by the same examiners for validity. Audiometric testing was performed at least 14 hours after the last noise exposure in the workplace. Workers’ age and length of employment were gathered with a purpose-developed occupational record form. Results: Workers’ ages ranged from 23 to 59 years (mean = 41.67, SD = 9.69), and length of employment ranged from 1 to 39 years (mean = 13.99, SD = 9.88). Fifty-three (60.0%) of the participants had been exposed to hazardous noise in the workplace for more than 10 years, twenty-three (24.7%) for 5 years or less, and seventeen (18.3%) for 5 to 10 years.
Using the cut point of less than or equal to 25 dBA of hearing thresholds, the average means of hearing thresholds for participants at 4, 6, and 8 kHz were 31.34, 29.62, and 25.64 dB, respectively for the right ear and 40.15, 32.20, and 25.48 dB for the left ear, respectively. The more developing age of workers in the work zone with hazard of noise, the more the hearing thresholds would be increasing at frequencies of 4, 6, and 8 kHz (p =.012, p =.026, p =.024) for the right ear, respectively and for the left ear only at the frequency 4 kHz (p =.009). Conclusion: The participants’ age in the hazard of noise work zone was significantly associated with the hearing loss in different levels while the length of participants’ employment was not significantly associated with the hearing loss. Thus hearing threshold levels among industrial workers would be regularly assessed and needed to be protected at the beginning of working.
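The age-threshold association reported in the abstract can be illustrated with a simple Pearson correlation; the sketch below uses hypothetical age and 4 kHz threshold values, not the study's raw data.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical workers: age vs. right-ear threshold at 4 kHz (dB)
ages = [25, 31, 38, 44, 50, 57]
thresholds = [18, 22, 27, 33, 38, 45]

r = pearson_r(ages, thresholds)
print(f"r = {r:.3f}")  # strongly positive for this toy sample
```

A significance test (as in the study's p-values) would additionally compare r against its sampling distribution for the given n.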

Keywords: hearing threshold levels, hazard of noise, hearing loss, audiometric testing

Procedia PDF Downloads 197
1238 A World Map of Seabed Sediment Based on 50 Years of Knowledge

Authors: T. Garlan, I. Gabelotaud, S. Lucas, E. Marchès

Abstract:

Production of a global sedimentological seabed map was initiated in 1995 to provide a necessary tool for searches for aircraft and boats lost at sea, to supply sedimentary information for nautical charts, and to provide input data for acoustic propagation modelling. This approach had already been pioneered a century earlier, when the French hydrographic service and the University of Nancy produced maps of the distribution of marine sediments along the French coasts and then sediment maps of the continental shelves of Europe and North America. The current map of ocean sediments was initiated from UNESCO's general map of the deep ocean floor. This map was adapted using a unique sediment classification to represent all types of sediments, from beaches to the deep seabed and from glacial deposits to tropical sediments. To allow good visualization and to suit the different applications, only the granularity of sediments is represented. Published seabed maps are reviewed; if they are of interest, the nature of the seabed is extracted from them, the sediment classification is transcribed, and the resulting map is integrated into the world map. Data also come from interpretations of Multibeam Echo Sounder (MES) imagery from large deep-ocean hydrographic surveys, which allow very high-quality mapping of areas that until then had been represented as homogeneous. The third and principal source of data is the integration of regional maps produced specifically for this project. These regional maps are compiled using all the bathymetric and sedimentary data of a region. This step makes it possible to produce a regional synthesis map, with generalizations applied where the data are over-precise. Eighty-six regional maps of the Atlantic Ocean, the Mediterranean Sea, and the Indian Ocean have been produced and integrated into the world sedimentary map.
This is ongoing work, and a new digital version incorporating new maps is released every two years. This article describes the choices made in terms of sediment classification, the scale of source data, and the zonation of quality variability. The map is the final step in a system comprising the Shom Sedimentary Database, enriched by more than one million point and surface data items, and four series of coastal seabed maps at 1:10,000, 1:50,000, 1:200,000, and 1:1,000,000. This step-by-step approach makes it possible to take into account the progress made in seabed characterization during the last decades. Thus, the arrival of new seafloor classification systems has improved the recent seabed maps, and compiling these new maps with those previously published allows gradual enrichment of the world sedimentary map. There is still much work to be done, however, in regions that are still based on data acquired more than half a century ago.

Keywords: marine sedimentology, seabed map, sediment classification, world ocean

Procedia PDF Downloads 202
1237 Enhancing Flood Modeling: Unveiling the Role of Hazard Parameters in Building Vulnerability

Authors: Mohammad Shoraka, Raulina Wojtkiewicz, Karthik Ramanathan

Abstract:

Following the devastating summer 2021 floods in Germany, catastrophe modelers realized that hazard parameters, such as flow velocity, flood duration, and debris flow, play a significant role in capturing the overall damage potential of such events. Accounting for the location-specific static depth as the only hazard intensity metric may lead to a substantial underestimation of the vulnerability of building stock and, eventually, the loss potential of such catastrophic events. As the flow velocity increases, the hydrodynamic forces acting on various building components are amplified. Longer flood duration leads to water permeating porous components, incurring additional cleanup costs that contribute to an overall increase in damage. Debris flow possesses the power to erode extensive sections of buildings, thus substantially augmenting the extent of losses. This paper introduces four flow velocity classes, ranging from no flow velocity to major velocity, along with two flood duration classes: short and long, in estimating the vulnerability of the building stock. Additionally, the study examines the impact of the presence of debris flow and its role in exacerbating flood damage. The paper delves into the effects of each of these parameters on building component damageability and their collective impact on the overall building vulnerability.

Keywords: catastrophe modeling, building vulnerability, hazard parameters, component damage function

Procedia PDF Downloads 36
1236 A Comparative Assessment Method For Map Alignment Techniques

Authors: Rema Daher, Theodor Chakhachiro, Daniel Asmar

Abstract:

In the era of autonomous robot mapping, assessing the goodness of the generated maps is important and is usually performed by aligning them to ground truth. Map alignment is difficult for two reasons: first, the query maps can be significantly distorted from ground truth, and second, establishing what constitutes ground truth for different settings is challenging. Most map alignment techniques to date have addressed the first problem while paying too little attention to the second. In this paper, we propose a benchmark dataset consisting of synthetically transformed maps with their corresponding displacement fields. Furthermore, we propose a new system for comparison, in which the displacement field of any map alignment technique can be computed and compared to the ground truth using statistical measures. The local information in displacement fields renders the evaluation system applicable to any alignment technique, whether linear or not. In our experiments, the proposed method was applied to different alignment methods from the literature, allowing for a comparative assessment among them.
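The displacement-field comparison described above can be sketched minimally: given an estimated and a ground-truth displacement field (one 2-D vector per map cell), per-cell endpoint errors yield summary statistics for ranking techniques. The field values and the choice of mean/RMS error here are illustrative assumptions, not the authors' benchmark protocol.

```python
import math

def endpoint_errors(est, gt):
    """Per-cell Euclidean distance between estimated and ground-truth displacements."""
    return [math.hypot(ex - gx, ey - gy)
            for (ex, ey), (gx, gy) in zip(est, gt)]

def summarize(errors):
    """Mean and RMS endpoint error over the whole field."""
    n = len(errors)
    mean = sum(errors) / n
    rms = math.sqrt(sum(e * e for e in errors) / n)
    return mean, rms

# Toy 2x2 displacement fields, flattened row-major: one (dx, dy) per cell
gt  = [(1.0, 0.0), (1.0, 0.0), (0.0, 1.0), (0.0, 1.0)]
est = [(1.0, 0.0), (0.0, 0.0), (0.0, 1.0), (0.0, 2.0)]

mean_epe, rms_epe = summarize(endpoint_errors(est, gt))
print(f"mean EPE = {mean_epe:.3f}, RMS EPE = {rms_epe:.3f}")
```

Because the error is computed per cell, the measure applies equally to linear (e.g. rigid) and non-linear alignments.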

Keywords: assessment methods, benchmark, image deformation, map alignment, robot mapping, robot motion

Procedia PDF Downloads 93
1235 Combining Diffusion Maps and Diffusion Models for Enhanced Data Analysis

Authors: Meng Su

Abstract:

High-dimensional data analysis often presents challenges in capturing the complex, nonlinear relationships and manifold structures inherent to the data. This article presents a novel approach that leverages the strengths of two powerful techniques, Diffusion Maps and Diffusion Probabilistic Models (DPMs), to address these challenges. By integrating the dimensionality reduction capability of Diffusion Maps with the data modeling ability of DPMs, the proposed method aims to provide a comprehensive solution for analyzing and generating high-dimensional data. The Diffusion Map technique preserves the nonlinear relationships and manifold structure of the data by mapping it to a lower-dimensional space using the eigenvectors of the graph Laplacian matrix. Meanwhile, DPMs capture the dependencies within the data, enabling effective modeling and generation of new data points in the low-dimensional space. The generated data points can then be mapped back to the original high-dimensional space, ensuring consistency with the underlying manifold structure. Through a detailed example implementation, the article demonstrates the potential of the proposed hybrid approach to achieve more accurate and effective modeling and generation of complex, high-dimensional data. Furthermore, it discusses possible applications in various domains, such as image synthesis, time-series forecasting, and anomaly detection, and outlines future research directions for enhancing the scalability, performance, and integration with other machine learning techniques. By combining the strengths of Diffusion Maps and DPMs, this work paves the way for more advanced and robust data analysis methods.
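As a rough illustration of the Diffusion Map half of the pipeline described above, the sketch below embeds a small point set into one diffusion coordinate via the eigenvectors of a row-normalized affinity (random-walk) matrix; the kernel bandwidth and the data are illustrative assumptions, not the article's implementation.

```python
import numpy as np

def diffusion_map(X, eps=1.0, n_components=1):
    """One-step diffusion map embedding of the rows of X."""
    # Gaussian affinity between all pairs of points
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-sq / eps)
    # Row-normalize to a Markov transition matrix P
    P = K / K.sum(axis=1, keepdims=True)
    # Eigen-decompose; the top non-trivial eigenvectors give the coordinates
    vals, vecs = np.linalg.eig(P)
    order = np.argsort(-vals.real)
    idx = order[1:1 + n_components]  # skip the trivial eigenvalue 1
    return vecs.real[:, idx] * vals.real[idx]

# Two well-separated clusters on a line
X = np.array([[0.0], [0.1], [0.2], [2.0], [2.1], [2.2]])
coords = diffusion_map(X, eps=0.5)
# Points in the same cluster land close together in diffusion space
```

A DPM would then be trained on such low-dimensional coordinates, with generated samples lifted back to the ambient space.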

Keywords: diffusion maps, diffusion probabilistic models (DPMs), manifold learning, high-dimensional data analysis

Procedia PDF Downloads 66
1234 Radium Equivalent and External Hazard Indices of Trace Elements Concentrations in Aquatic Species by Neutron Activation Analysis (NAA) and Inductively Coupled Plasma Mass Spectrometry (ICP-MS)

Authors: B. G. Muhammad, S. M. Jafar

Abstract:

Neutron Activation Analysis (NAA) and Inductively Coupled Plasma Mass Spectrometry (ICP-MS) were employed to analyze trace element concentrations in sediment samples and their bioaccumulation in aquatic species selected randomly from surface water resources in the northern peninsula of Malaysia. The NAA results for the sediment samples showed a wide range of concentrations of different elements. Fe, K, and Na were the major elements, with concentrations ranging from 61,000 ± 1,400 to 4,500 ± 100 ppm, from 20,100 ± 1,000 to 3,100 ± 600 ppm, and from 3,100 ± 600 to 200 ± 10 ppm, respectively. Traces of heavy metals of much greater health concern, such as Cr and As, were also identified in many of the samples analyzed. The average specific activities of 40K, 232Th, and 226Ra in soil, as well as the corresponding radium equivalent activity and external hazard index, were all found to be below the maximum permissible limits (370 Bq kg-1 and 1, respectively).
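The two screening quantities mentioned in the abstract follow widely used definitions: Raeq = C_Ra + 1.43 C_Th + 0.077 C_K and Hex = C_Ra/370 + C_Th/259 + C_K/4810, with activity concentrations in Bq/kg. The sketch below evaluates them for hypothetical sample values, not the study's measurements.

```python
def radium_equivalent(c_ra, c_th, c_k):
    """Radium equivalent activity (Bq/kg) from 226Ra, 232Th, 40K concentrations."""
    return c_ra + 1.43 * c_th + 0.077 * c_k

def external_hazard_index(c_ra, c_th, c_k):
    """External hazard index; values <= 1 are considered acceptable."""
    return c_ra / 370.0 + c_th / 259.0 + c_k / 4810.0

# Hypothetical soil sample (Bq/kg)
c_ra, c_th, c_k = 30.0, 40.0, 400.0
ra_eq = radium_equivalent(c_ra, c_th, c_k)   # 30 + 57.2 + 30.8 = 118.0
h_ex = external_hazard_index(c_ra, c_th, c_k)
print(f"Ra_eq = {ra_eq:.1f} Bq/kg, H_ex = {h_ex:.3f}")
# Both fall below the permissible limits of 370 Bq/kg and 1
```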

Keywords: external hazard index, neutron activation analysis, radium equivalent, trace element concentrations

Procedia PDF Downloads 396
1233 Preserving Heritage in the Face of Natural Disasters: Lessons from the Bam Experience in Iran

Authors: Mohammad Javad Seddighi, Avar Almukhtar

Abstract:

The occurrence of natural disasters, such as floods and earthquakes, can cause significant damage to heritage sites and surrounding areas. In Iran, the city of Bam was devastated by an earthquake in 2003, which had a major impact on the rivers and watercourses around the city. This study investigates the environmental design techniques and sustainable hazard mitigation strategies that can be employed to preserve heritage sites in the face of natural disasters, using the Bam experience as a case study. The research employs a mixed-methods approach, combining qualitative and quantitative data collection and analysis. The study begins with a comprehensive literature review of recent publications on environmental design techniques and sustainable hazard mitigation strategies in heritage conservation. This is followed by a field study of the rivers and watercourses around Bam, including the Adoori River (Talangoo) and other watercourses, to assess current conditions and identify potential hazards. The data collected from the field study are analysed using statistical methods and GIS mapping techniques. The findings reveal the importance of sustainable hazard mitigation strategies and environmental design techniques in preserving heritage sites during natural disasters. The study suggests that these techniques can be used to mitigate the impact of future natural disasters in Bam and the surrounding areas. Specifically, it recommends the establishment of a comprehensive early warning system, the creation of flood-resistant landscapes, and the use of eco-friendly building materials in the reconstruction of heritage sites. These findings contribute to current knowledge of sustainable hazard mitigation and environmental design in heritage conservation.

Keywords: natural disasters, heritage conservation, sustainable hazard mitigation, environmental design, landscape architecture, flood management, disaster resilience

Procedia PDF Downloads 53
1232 The Investigate Relationship between Moral Hazard and Corporate Governance with Earning Forecast Quality in the Tehran Stock Exchange

Authors: Fatemeh Rouhi, Hadi Nassiri

Abstract:

Earnings forecasts are a key element in economic decisions, but conditions such as conflicts of interest in financial reporting, complexity, and the lack of direct access to information have led to information asymmetry between individuals within the organization and external investors and creditors. This asymmetry gives rise to adverse selection and moral hazard in investors' decisions and makes it difficult for users to assess the data directly. In this regard, corporate governance disclosure plays a crucial role, encompassing controls and procedures that ensure management does not act in its own interest but rather moves in the direction of maximizing shareholder and company value. Given the importance of earnings forecasts in the capital market and the need to identify the factors influencing them, this study attempts to establish the relationship between moral hazard and corporate governance, on the one hand, and the earnings forecast quality of companies operating in the capital market, on the other. Drawing on the theoretical basis of the research, two main hypotheses and several sub-hypotheses are presented and examined using available models and the panel data method, and conclusions are drawn at the 95% confidence level according to the significance of the model and of each independent variable. In examining the models, the Chow test was first used to decide between the panel data method and the pooled method; the Hausman test was then applied to choose between random effects and fixed effects. The findings show that because most of the variables associate moral hazard positively with earnings forecast quality, as moral hazard increases, the earnings forecast quality of companies listed on the Tehran Stock Exchange also increases.
Among the corporate governance variables, board independence has a significant relationship with earnings forecast accuracy and earnings forecast bias, but the relationship between board size and earnings forecast quality is not statistically significant.

Keywords: corporate governance, earning forecast quality, moral hazard, financial sciences

Procedia PDF Downloads 288
1231 Effects of Empathy Priming on Idea Generation

Authors: Tejas Dhadphale

Abstract:

The user-centered design (UCD) approach has led to increased interest in empathy within the product development process. Designers have explored several empathy methods and tools, such as personas, empathy maps, journey maps, user needs statements, and user scenarios, to capture and visualize users' needs. The goal of these tools is not only to generate a deeper, shared understanding of user needs but also to serve as a point of reference for subsequent decision-making, brainstorming, and concept-evaluation tasks. The purpose of this study is to measure the effect of empathy priming on divergent brainstorming tasks. It compares the effects of three empathy tools, personas, empathy maps, and user needs statements, on ideation fluency and the originality of ideas generated during brainstorming. In a between-subjects experimental design, sixty product design students were randomly assigned to one of three conditions: persona, empathy map, or user needs statement. A one-way between-subjects analysis of variance (ANOVA) revealed a statistically significant effect of empathy priming on the fluency and originality of ideas. Participants in the persona group showed higher ideation fluency and generated more original ideas than the other groups, while participants in the user needs statement group generated more feasible and relevant ideas. The study also examines how the formatting and visualization of empathy tools affect divergent brainstorming: participants were interviewed to understand how the different visualizations of users' needs facilitated idea generation. Implications for design education are discussed.
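The one-way between-subjects ANOVA used above can be sketched as a direct computation of the F statistic; the three fluency-score groups below are hypothetical, not the study's data.

```python
def one_way_f(groups):
    """F statistic for a one-way between-subjects ANOVA."""
    n_total = sum(len(g) for g in groups)
    k = len(groups)
    grand_mean = sum(sum(g) for g in groups) / n_total
    # Between-groups and within-groups sums of squares
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    ms_between = ss_between / (k - 1)      # df1 = k - 1
    ms_within = ss_within / (n_total - k)  # df2 = n_total - k
    return ms_between / ms_within

# Hypothetical ideation-fluency scores per condition
persona = [12, 14, 13, 15, 14]
empathy_map = [10, 11, 9, 12, 10]
needs_stmt = [11, 10, 12, 11, 10]

f_stat = one_way_f([persona, empathy_map, needs_stmt])
# Compare f_stat against the critical F(2, 12) value to judge significance
```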

Keywords: empathy, persona, priming, design research

Procedia PDF Downloads 50
1230 Competing Risk Analyses in Survival Trials During COVID-19 Pandemic

Authors: Ping Xu, Gregory T. Golm, Guanghan (Frank) Liu

Abstract:

In the presence of competing events, traditional survival analysis may not be appropriate and can result in biased estimates, as it assumes independence between the competing events and the event of interest. Instead, competing risk analysis should be considered to correctly estimate the survival probability of the event of interest and the hazard ratio between treatment groups. The COVID-19 pandemic has provided a potential source of competing risks in clinical trials, as participants may experience COVID-related competing events, for instance death due to COVID-19, before the occurrence of the event of interest, which can affect its incidence rate. We have performed simulation studies to compare multiple competing risk analysis models, including the cumulative incidence function, the sub-distribution hazard function, and the cause-specific hazard function, with the traditional survival analysis model under various scenarios. Based on the extensive simulation results, we also provide general recommendations on conducting competing risk analysis in randomized clinical trials during the era of the COVID-19 pandemic.
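As a minimal illustration of the cumulative incidence function mentioned above, the sketch below implements a nonparametric (Aalen-Johansen style) estimator for competing event types; the event data are invented for demonstration and do not come from the simulations.

```python
def cumulative_incidence(times, events, cause):
    """Nonparametric CIF for `cause` (0 = censored; 1, 2, ... = event types).

    Returns the CIF evaluated after the last observed time.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0   # overall (all-cause) Kaplan-Meier survival
    cif = 0.0
    i = 0
    while i < len(data):
        t = data[i][0]
        d_cause = d_any = censored = 0
        # Count events and censorings tied at time t
        while i < len(data) and data[i][0] == t:
            if data[i][1] == 0:
                censored += 1
            else:
                d_any += 1
                if data[i][1] == cause:
                    d_cause += 1
            i += 1
        cif += surv * d_cause / n_at_risk
        surv *= 1.0 - d_any / n_at_risk
        n_at_risk -= d_any + censored
    return cif

# Toy data: event 1 = event of interest, event 2 = COVID-related competing event
times = [1, 2, 3, 4]
events = [1, 2, 1, 0]
print(cumulative_incidence(times, events, cause=1))  # 0.5
print(cumulative_incidence(times, events, cause=2))  # 0.25
```

Unlike one minus the naive Kaplan-Meier estimate with competing events censored, the cause-specific CIFs sum to at most one.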

Keywords: competing risk, survival analysis, simulations, randomized clinical trial, COVID-19 pandemic

Procedia PDF Downloads 161
1229 Evaluation of Hydrocarbon Prospects of 'ADE' Field, Niger Delta

Authors: Oluseun A. Sanuade, Sanlinn I. Kaka, Adesoji O. Akanji, Olukole A. Akinbiyi

Abstract:

Prospect evaluation of the ‘ADE’ field was carried out using 3D seismic data and well log data. The field is located in the offshore Niger Delta, where water depth ranges from 450 to 800 m. The objectives of this study are to explore deeper prospects and to ascertain the kinds of traps that are favorable for the accumulation of hydrocarbons in the field. Six horizons with major and minor faults were identified and mapped. Time structure maps of these horizons were generated and, using the available check-shot data, converted to top structure maps, which were used to calculate the hydrocarbon volume. The results show that regional structural highs trending northeast-southwest (NE-SW) characterize a large portion of the field. These highs were observed across all horizons, revealing regional post-depositional deformation. Three prospects were identified and evaluated to understand the different opportunities in the field; these include a stratigraphic pinch-out and bi-directional downlap. The results show that the field has potential for new opportunities that could be pursued in further studies.
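The hydrocarbon volume calculation mentioned above typically follows the standard volumetric relation: in-place volume = gross rock volume x net-to-gross x porosity x hydrocarbon saturation / formation volume factor. The sketch below evaluates it for hypothetical reservoir parameters, not values from the ‘ADE’ field.

```python
def stooip(grv_m3, ntg, porosity, water_sat, bo):
    """Stock-tank oil initially in place (m^3) from standard volumetrics."""
    return grv_m3 * ntg * porosity * (1.0 - water_sat) / bo

# Hypothetical reservoir: 2e8 m^3 gross rock volume, 70% net-to-gross,
# 25% porosity, 30% water saturation, formation volume factor 1.2
volume = stooip(grv_m3=2.0e8, ntg=0.7, porosity=0.25, water_sat=0.3, bo=1.2)
print(f"STOOIP = {volume:.3e} m^3")
```

The gross rock volume itself is derived from the depth-converted structure maps between the top horizon and the fluid contact.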

Keywords: hydrocarbon, play, prospect, stratigraphy

Procedia PDF Downloads 233
1228 Impact of Map Generalization in Spatial Analysis

Authors: Lin Li, P. G. R. N. I. Pussella

Abstract:

When representing spatial data and their attributes on different types of maps, scale plays a key role in the process of map generalization. The process consists of two main operations, selection and omission. Once data are selected, they undergo several geometric transformation processes such as elimination, simplification, smoothing, exaggeration, displacement, aggregation, and size reduction. As a result of these operations on different levels of data, the geometry of spatial features, such as length, sinuosity, orientation, perimeter, and area, is altered. This effect is worst in the preparation of small-scale maps, since the cartographer does not have enough space to represent all the features on the map. In practice, when GIS users want to analyze a set of spatial data, they retrieve a data set and perform the analysis without considering very important characteristics such as the scale, the purpose of the map, and the degree of generalization. Furthermore, GIS users use and compare maps with different degrees of generalization, and sometimes they go beyond the scale of the source map using the zoom-in facility, violating the basic cartographic rule that it is not appropriate to create a larger-scale map from a smaller-scale map. The main objective of this study is to discuss the effect of map generalization on GIS analysis. Three digital maps at different scales, 1:10,000, 1:50,000, and 1:250,000, prepared by the Survey Department of Sri Lanka, the national mapping agency of Sri Lanka, were used. Features common to all three maps were selected, and an overlay analysis was repeated with different combinations of the data. Road, river, and land use data sets were used, and a simple model for finding the best location for a wildlife park was employed to identify the effects.
The results show remarkable effects across the different degrees of generalization: different locations with different geometries were obtained as outputs of the analysis. The study suggests that reasonable methods are needed to overcome this effect; as a solution, it is recommended to bring all the data sets to a common scale before performing the analysis.
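The length distortion that generalization introduces can be demonstrated with the classic Ramer-Douglas-Peucker simplification operator; the toy river polyline and tolerance below are illustrative assumptions, not data from the Sri Lankan maps.

```python
import math

def perp_dist(p, a, b):
    """Perpendicular distance from point p to the line through a and b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    if (ax, ay) == (bx, by):
        return math.hypot(px - ax, py - ay)
    num = abs((bx - ax) * (ay - py) - (ax - px) * (by - ay))
    return num / math.hypot(bx - ax, by - ay)

def douglas_peucker(points, tol):
    """Ramer-Douglas-Peucker line simplification."""
    if len(points) < 3:
        return list(points)
    dmax, idx = 0.0, 0
    for i in range(1, len(points) - 1):
        d = perp_dist(points[i], points[0], points[-1])
        if d > dmax:
            dmax, idx = d, i
    if dmax <= tol:
        return [points[0], points[-1]]
    left = douglas_peucker(points[:idx + 1], tol)
    right = douglas_peucker(points[idx:], tol)
    return left[:-1] + right

def length(points):
    return sum(math.hypot(b[0] - a[0], b[1] - a[1])
               for a, b in zip(points, points[1:]))

river = [(0, 0), (1, 0.8), (2, -0.6), (3, 0.7), (4, 0)]
simplified = douglas_peucker(river, tol=1.0)
# The generalized river is shorter and less sinuous than the source line,
# which is exactly the kind of geometric change that distorts overlay analysis
```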

Keywords: generalization, GIS, scales, spatial analysis

Procedia PDF Downloads 305
1227 Reliability-Based Ductility Seismic Spectra of Structures with Tilting

Authors: Federico Valenzuela-Beltran, Sonia E. Ruiz, Alfredo Reyes-Salazar, Juan Bojorquez

Abstract:

A reliability-based methodology that uses structural demand hazard curves to account for the increased ductility demands of structures with tilting is proposed. The approach considers the effect of two orthogonal components of the ground motions as well as the influence of soil-structure interaction. It involves the calculation of ductility demand hazard curves for symmetric systems and, alternatively, for systems with different degrees of asymmetry. To this end, demand hazard curves corresponding to different global ductility demands of the systems are calculated. Next, Uniform Exceedance Rate Spectra (UERS) are developed for a specific mean annual rate of exceedance. Ratios between UERS corresponding to asymmetric systems and to symmetric systems located in the soft soil of the Valley of Mexico are obtained. Results indicate that the ductility demands of tilted structures may be several times higher than those of symmetric structures, depending on several factors such as the tilting angle and the vibration periods of the structure and soil.

Keywords: asymmetric yielding, seismic performance, structural reliability, tilted structures

Procedia PDF Downloads 483