Search results for: discrete hazard model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 17081

16901 Enhancing Flood Modeling: Unveiling the Role of Hazard Parameters in Building Vulnerability

Authors: Mohammad Shoraka, Raulina Wojtkiewicz, Karthik Ramanathan

Abstract:

Following the devastating summer 2021 floods in Germany, catastrophe modelers realized that hazard parameters, such as flow velocity, flood duration, and debris flow, play a significant role in capturing the overall damage potential of such events. Accounting for the location-specific static depth as the only hazard intensity metric may lead to a substantial underestimation of the vulnerability of the building stock and, eventually, of the loss potential of such catastrophic events. As the flow velocity increases, the hydrodynamic forces acting on various building components are amplified. Longer flood duration leads to water permeating porous components, incurring additional cleanup costs that contribute to an overall increase in damage. Debris flow possesses the power to erode extensive sections of buildings, thus substantially augmenting the extent of losses. This paper introduces four flow velocity classes, ranging from no flow velocity to major velocity, along with two flood duration classes, short and long, for estimating the vulnerability of the building stock. Additionally, the study examines the impact of the presence of debris flow and its role in exacerbating flood damage. The paper delves into the effects of each of these parameters on building component damageability and their collective impact on overall building vulnerability.
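
The role of the hazard modifiers described above can be sketched as a multiplicative adjustment to a depth-only damage ratio. All class labels and multiplier values below are hypothetical illustrations, not the paper's calibrated vulnerability functions:

```python
# Illustrative sketch (not the authors' model): scale a depth-only damage
# ratio by hypothetical modifiers for flow velocity, flood duration, and
# debris flow. Every label and multiplier here is an assumption.

VELOCITY_FACTOR = {"none": 1.00, "minor": 1.10, "moderate": 1.25, "major": 1.50}
DURATION_FACTOR = {"short": 1.00, "long": 1.20}   # longer soaking -> more cleanup
DEBRIS_FACTOR = 1.40                               # erosion of building sections

def damage_ratio(depth_damage, velocity="none", duration="short", debris=False):
    """Scale a static-depth damage ratio by velocity/duration/debris modifiers.

    depth_damage: damage ratio in [0, 1] from a depth-only vulnerability curve.
    Returns the modified ratio, capped at 1.0 (total loss).
    """
    r = depth_damage * VELOCITY_FACTOR[velocity] * DURATION_FACTOR[duration]
    if debris:
        r *= DEBRIS_FACTOR
    return min(r, 1.0)

base = 0.30  # depth-only estimate
worst = damage_ratio(base, velocity="major", duration="long", debris=True)
```

The point the sketch makes is the abstract's: ignoring velocity, duration, and debris (i.e., using `base` alone) materially understates the loss for high-energy, long-duration events.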

Keywords: catastrophe modeling, building vulnerability, hazard parameters, component damage function

Procedia PDF Downloads 36
16900 Improved Regression Relations Between Different Magnitude Types and the Moment Magnitude in the Western Balkan Earthquake Catalogue

Authors: Anila Xhahysa, Migena Ceyhan, Neki Kuka, Klajdi Qoshi, Damiano Koxhaj

Abstract:

The seismic event catalogue has been updated in the framework of a bilateral project supported by the Central European Investment Fund and with the extensive support of the Global Earthquake Model Foundation, in order to update Albania's national seismic hazard model. The earthquake catalogue prepared within this project covers the Western Balkan area bounded by 38.0°-48°N, 12.5°-24.5°E and includes 41,806 earthquakes that occurred in the region between 510 BC and 2022. Since the moment magnitude characterizes earthquake size accurately and the ground motion prediction equations selected for the seismic hazard assessment employ this scale, it was chosen as the uniform magnitude scale for the catalogue. Therefore, proxy values of moment magnitude had to be obtained using new conversion equations between the local and other magnitude types and this unified scale. The Global Centroid Moment Tensor Catalogue was considered the most authoritative source of moment magnitude reports for moderate to large earthquakes; hence, it was used as a reference for calibrating other sources. The best fit was observed against the reports of some regional agencies, whereas moment magnitudes reported from Italy, Greece, and Turkey showed differences in all magnitude ranges. For teleseismic magnitudes, to account for the non-linearity of the relationships, we used an exponential model to derive the regression equations. The obtained regressions for the surface-wave magnitude and the short-period body-wave magnitude show considerable differences from the Global Earthquake Model regression curves, especially in the low magnitude ranges. Moreover, a conversion relation was obtained between the local magnitude of Albania and the corresponding moment magnitude as reported by the global and regional agencies. As errors were present in both variables, Deming regression was used.
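
Deming regression, which the authors use because both magnitude scales carry measurement error, has a closed-form slope. A minimal sketch (the magnitude pairs in the usage example are hypothetical, not catalogue data):

```python
import math

def deming_fit(x, y, delta=1.0):
    """Deming regression y = a + b*x allowing for errors in both variables.

    delta is the ratio of y-error variance to x-error variance (delta = 1
    gives orthogonal regression). Returns (intercept a, slope b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x) / (n - 1)
    syy = sum((yi - my) ** 2 for yi in y) / (n - 1)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / (n - 1)
    b = (syy - delta * sxx
         + math.sqrt((syy - delta * sxx) ** 2 + 4 * delta * sxy ** 2)) / (2 * sxy)
    a = my - b * mx
    return a, b

# Hypothetical local-magnitude / moment-magnitude pairs for illustration only
ml = [3.0, 3.5, 4.0, 4.5, 5.0, 5.5]
mw = [3.2, 3.6, 4.1, 4.5, 5.1, 5.4]
a, b = deming_fit(ml, mw)
```

Unlike ordinary least squares, the fitted line is invariant to swapping the roles of the two magnitude scales (up to inversion), which is why it suits magnitude-conversion relations.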

Keywords: regression, seismic catalogue, local magnitude, tele-seismic magnitude, moment magnitude

Procedia PDF Downloads 40
16899 Inverse Scattering for a Second-Order Discrete System via Transmission Eigenvalues

Authors: Abdon Choque-Rivero

Abstract:

The Jacobi system with the Dirichlet boundary condition is considered on a half-line lattice when the coefficients are real valued. The inverse problem of recovery of the coefficients from various data sets containing the so-called transmission eigenvalues is analyzed. The Marchenko method is utilized to solve the corresponding inverse problem.

Keywords: inverse scattering, discrete system, transmission eigenvalues, Marchenko method

Procedia PDF Downloads 115
16898 An Alternative Stratified Cox Model for Correlated Variables in Infant Mortality

Authors: K. A. Adeleke

Abstract:

Often in epidemiological research, introducing a stratified Cox model can account for interactions of some inherent factors with major, readily observed factors. This work aimed at modelling correlated variables in infant mortality in the presence of inherent factors affecting the infant survival function. An alternative semiparametric stratified Cox model is proposed to accommodate multilevel factors that interact with others. The model was applied to infant mortality data from the Nigeria Demographic and Health Survey (NDHS), with multilevel factors (tetanus, polio, and breastfeeding) correlated with the main factors (sex, size, and mode of delivery). Asymptotic properties of the estimators are also studied via simulation. Fitted to the data, the model showed good fit and performed differently depending on the levels of interaction of the strata variable Z*. Evidence that the baseline hazard functions and regression coefficients differ from stratum to stratum provides a gain in information relative to the standard Cox model. Simulation results showed that the present method produced better estimates in terms of bias, standard errors, and mean square errors.
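
The stratified partial likelihood underlying such models can be sketched in a few lines. This is an illustrative Breslow-type stratified Cox log-likelihood, not the authors' alternative estimator; the data layout is an assumption:

```python
import math

def stratified_cox_loglik(beta, data):
    """Breslow partial log-likelihood for a stratified Cox model.

    data: list of (time, event, covariates, stratum) tuples. The baseline
    hazard may differ across strata, so risk sets are formed within each
    stratum only. Illustrative sketch, not the paper's proposed estimator."""
    loglik = 0.0
    for s in {row[3] for row in data}:
        rows = sorted((r for r in data if r[3] == s), key=lambda r: r[0])
        for t_i, d_i, x_i, _ in rows:
            if not d_i:            # censored subjects contribute only
                continue           # through the risk sets of others
            eta_i = sum(b * x for b, x in zip(beta, x_i))
            # risk set: subjects in the same stratum still under observation
            denom = sum(math.exp(sum(b * x for b, x in zip(beta, r[2])))
                        for r in rows if r[0] >= t_i)
            loglik += eta_i - math.log(denom)
    return loglik
```

Maximizing this function over `beta` (e.g., by Newton's method) yields stratum-adjusted coefficient estimates; the stratification appears only in how the risk sets are restricted.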

Keywords: stratified Cox, semiparametric model, infant mortality, multilevel factors, confounding variables

Procedia PDF Downloads 535
16897 Radium Equivalent and External Hazard Indices of Trace Elements Concentrations in Aquatic Species by Neutron Activation Analysis (NAA) and Inductively Coupled Plasma Mass Spectrometry (ICP-MS)

Authors: B. G. Muhammad, S. M. Jafar

Abstract:

Neutron Activation Analysis (NAA) and Inductively Coupled Plasma Mass Spectrometry (ICP-MS) were employed to analyze the levels of trace element concentrations in sediment samples and their bioaccumulation in aquatic species selected randomly from surface water resources in the northern peninsula of Malaysia. The NAA results for the sediment samples indicated a wide range of concentrations across different elements. Fe, K, and Na were found in major concentrations, with values ranging from 61,000 ± 1,400 to 4,500 ± 100 ppm, from 20,100 ± 1,000 to 3,100 ± 600 ppm, and from 3,100 ± 600 to 200 ± 10 ppm, respectively. Traces of heavy metals of much greater health concern, such as Cr and As, were also identified in many of the samples analyzed. The average specific activities of ⁴⁰K, ²³²Th, and ²²⁶Ra in soil, the corresponding radium equivalent activity, and the external hazard index were all found to be lower than the maximum permissible limits (370 Bq kg⁻¹ and 1, respectively).
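
The two indices can be computed directly from the measured specific activities. The weighting coefficients below are the standard values used for these indices; the sample activities in the usage example are hypothetical:

```python
def radium_equivalent(c_ra, c_th, c_k):
    """Radium equivalent activity (Bq/kg) from the standard weighting
    Ra_eq = C_Ra + 1.43*C_Th + 0.077*C_K; permissible limit 370 Bq/kg."""
    return c_ra + 1.43 * c_th + 0.077 * c_k

def external_hazard_index(c_ra, c_th, c_k):
    """External hazard index H_ex = C_Ra/370 + C_Th/259 + C_K/4810,
    which must not exceed unity."""
    return c_ra / 370.0 + c_th / 259.0 + c_k / 4810.0

# Hypothetical specific activities (Bq/kg) for illustration only
ra_eq = radium_equivalent(30.0, 40.0, 400.0)
h_ex = external_hazard_index(30.0, 40.0, 400.0)
```

Note that a sample exactly at the 370 Bq/kg radium-equivalent limit with no Th or K contribution gives H_ex = 1, which is how the two limits quoted in the abstract correspond.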

Keywords: external hazard index, Neutron Activation Analysis, radium equivalent, trace elements concentrations

Procedia PDF Downloads 396
16896 Infant and Child Mortality among the Low Socio-Economic Households in India

Authors: Narendra Kumar

Abstract:

This study uses data from the National Family Health Survey (NFHS-3) 2005-06 to investigate the predictors of infant and child mortality among low socio-economic households in the East and Northeast regions of India. Cross tabulation, life table survival estimates, and the Cox proportional hazards model were used to estimate the predictors of infant and child mortality. The life table survival estimates show that infant mortality is lower for female children than for male children, but for child mortality the rates are higher for females; the Cox proportional hazards model likewise shows a highly significant effect for female relative to male children. Among poor households, infant and child mortality rates are highest in the Central region, followed by the North and Northeast regions, and lowest in the South region. The respondent's education was found to be a significant characteristic in both analyses; birth interval, respondent's occupation, caste/tribe, and place of delivery also have substantial impacts on infant and child mortality among low socio-economic households in the East and Northeast regions. These findings indicate that increasing parents' education, improving health care services, and improving the socioeconomic conditions of poor households should raise infant and child survival and decrease child mortality among low socio-economic households in India.

Keywords: infant, child, mortality, socio-economic, India

Procedia PDF Downloads 285
16895 Stator Short-Circuits Fault Diagnosis in Induction Motors

Authors: K. Yahia, M. Sahraoui, A. Guettaf

Abstract:

This paper deals with the problem of stator fault diagnosis in induction motors. Using the discrete wavelet transform (DWT) for the analysis of the current Park's vector modulus (CPVM), inter-turn short-circuit faults can be diagnosed. The method is based on the decomposition of the CPVM signal, from which wavelet approximation and detail coefficients are extracted. Evaluating the energy of the detail coefficients in a known bandwidth permits the definition of a fault severity factor (FSF). The method has been tested through the simulation of an induction motor using a mathematical model based on the winding-function approach. Simulation as well as experimental results show the effectiveness of the method.
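
The decomposition-plus-energy step can be sketched as follows. This is a minimal illustration using a hand-rolled Haar transform; the wavelet family, band choice, decomposition depth, and the paper's exact FSF definition may differ:

```python
import math

def haar_dwt(signal):
    """One level of the Haar discrete wavelet transform:
    returns (approximation, detail) coefficient lists."""
    approx, detail = [], []
    for i in range(0, len(signal) - 1, 2):
        a, b = signal[i], signal[i + 1]
        approx.append((a + b) / math.sqrt(2))
        detail.append((a - b) / math.sqrt(2))
    return approx, detail

def fault_severity_factor(cpvm, levels=3):
    """Illustrative severity factor: energy of the detail band at the chosen
    level relative to total signal energy. The band choice and this exact
    definition are assumptions, not the authors' formula."""
    total = sum(s * s for s in cpvm)
    approx = list(cpvm)
    detail = []
    for _ in range(levels):
        approx, detail = haar_dwt(approx)
    return sum(d * d for d in detail) / total
```

A healthy, near-constant CPVM concentrates its energy in the approximation band, so the detail-band energy ratio rises only when fault-induced oscillations appear.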

Keywords: induction motors (IMs), inter-turn short-circuits diagnosis, discrete wavelet transform (DWT), Current Park’s Vector Modulus (CPVM)

Procedia PDF Downloads 429
16894 A Discrete Element Method-Based Simulation of Toppling Failure Considering Block Interaction

Authors: Hooman Dabirmanesh, Attila M. Zsaki

Abstract:

The toppling failure mode in a rock mass is considerably different from the more common sliding failure along an existing or induced slip plane. Block toppling is observed in a rock mass that contains both a widely spaced basal cross-joint set and a closely spaced discontinuity set dipping into the slope. In this case, failure occurs when the structure cannot bear the tensile portion of the bending stress, and the columns or blocks overturn under their own weight. This paper presents a particle-based discrete element model of rock blocks subjected to toppling failure in which geometric conditions and interaction among blocks are investigated. A series of parametric studies was conducted on particle size, particle arrangement, and the bond contacts among the particles that make up the blocks. First, the numerical model was verified on a one-block system. Afterward, a slope consisting of multiple blocks was developed to study toppling failure and the interaction forces between blocks. The results show that the formation of the blocks, especially at the interface between a block and the basal plane surface, can change the failure process. The results also demonstrate that the initial configuration of the particles used to form the blocks plays a significant role in achieving accurate simulation results; the particle size and bond contacts considerably influence the progress of toppling failure.

Keywords: block toppling failure, contact interaction, discrete element, particle size, random generation

Procedia PDF Downloads 158
16893 Classical and Bayesian Inference of the Generalized Log-Logistic Distribution with Applications to Survival Data

Authors: Abdisalam Hassan Muse, Samuel Mwalili, Oscar Ngesa

Abstract:

A generalized log-logistic distribution with variable hazard rate shapes was introduced and studied, extending the log-logistic distribution by adding an extra parameter to the classical distribution and thereby providing greater flexibility in analysing and modeling various data types. The proposed distribution contains a number of well-known lifetime distributions as special sub-models, such as the Weibull, log-logistic, exponential, and Burr XII distributions. Its basic mathematical and statistical properties were derived. The method of maximum likelihood was adopted for estimating the unknown parameters of the proposed distribution, and a Monte Carlo simulation study was carried out to assess the behavior of the estimators. The importance of this distribution lies in its ability to model both monotone (increasing and decreasing) and non-monotone (unimodal and bathtub-shaped) or reversed-bathtub-shaped hazard rate functions, which are quite common in survival and reliability data analysis. Furthermore, the flexibility and usefulness of the proposed distribution are illustrated on a real-life data set and compared to its sub-models (the Weibull, log-logistic, and Burr XII distributions) and to other three-parameter parametric survival distributions, such as the exponentiated Weibull, the three-parameter lognormal, the three-parameter gamma, the three-parameter Weibull, and the three-parameter (shifted) log-logistic distributions. The proposed distribution provided a better fit than all of the competing distributions based on goodness-of-fit tests, log-likelihood, and information criterion values. Finally, a Bayesian analysis and an assessment of the performance of Gibbs sampling for the data set were also carried out.
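
Since the paper's extra-parameter form is not reproduced here, the sketch below shows only the classical log-logistic sub-model's hazard, which already exhibits the monotone-decreasing and unimodal shapes the abstract discusses:

```python
def loglogistic_hazard(t, alpha, beta):
    """Hazard of the classical log-logistic distribution, one of the
    sub-models of the proposed family:

        h(t) = (beta/alpha) * (t/alpha)**(beta - 1) / (1 + (t/alpha)**beta)

    For beta <= 1 the hazard is monotone decreasing; for beta > 1 it is
    unimodal (rises, then falls)."""
    z = (t / alpha) ** beta
    return (beta / alpha) * (t / alpha) ** (beta - 1) / (1 + z)
```

With `alpha = 1, beta = 2` the hazard is 2t/(1 + t²), peaking at t = 1; with `beta = 0.8` it decreases throughout, illustrating why this family is attractive for non-monotone survival data.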

Keywords: hazard rate function, log-logistic distribution, maximum likelihood estimation, generalized log-logistic distribution, survival data, Monte Carlo simulation

Procedia PDF Downloads 167
16892 Comparison of Two Maintenance Policies for a Two-Unit Series System Considering General Repair

Authors: Seyedvahid Najafi, Viliam Makis

Abstract:

In recent years, maintenance optimization has attracted special attention due to the growth of industrial systems complexity. Maintenance costs are high for many systems, and preventive maintenance is effective when it increases operations' reliability and safety at a reduced cost. The novelty of this research is to consider general repair in the modeling of multi-unit series systems and solve the maintenance problem for such systems using the semi-Markov decision process (SMDP) framework. We propose an opportunistic maintenance policy for a series system composed of two main units. Unit 1, which is more expensive than unit 2, is subjected to condition monitoring, and its deterioration is modeled using a gamma process. Unit 1 hazard rate is estimated by the proportional hazards model (PHM), and two hazard rate control limits are considered as the thresholds of maintenance interventions for unit 1. Maintenance is performed on unit 2, considering an age control limit. The objective is to find the optimal control limits and minimize the long-run expected average cost per unit time. The proposed algorithm is applied to a numerical example to compare the effectiveness of the proposed policy (policy I) with policy II, which is similar to policy I, but instead of general repair, replacement is performed. Results show that policy I leads to lower average cost compared with policy II.

Keywords: condition-based maintenance, proportional hazards model, semi-Markov decision process, two-unit series systems

Procedia PDF Downloads 94
16891 Preserving Heritage in the Face of Natural Disasters: Lessons from the Bam Experience in Iran

Authors: Mohammad Javad Seddighi, Avar Almukhtar

Abstract:

The occurrence of natural disasters, such as floods and earthquakes, can cause significant damage to heritage sites and surrounding areas. In Iran, the city of Bam was devastated by an earthquake in 2003, which had a major impact on the rivers and watercourses around the city. This study investigates the environmental design techniques and sustainable hazard mitigation strategies that can be employed to preserve heritage sites in the face of natural disasters, using the Bam experience as a case study. The research employs a mixed-methods approach, combining qualitative and quantitative data collection and analysis. The study begins with a comprehensive literature review of recent publications on environmental design techniques and sustainable hazard mitigation strategies in heritage conservation. This is followed by a field study of the rivers and watercourses around Bam, including the Adoori River (Talangoo) and other watercourses, to assess current conditions and identify potential hazards. The data collected from the field study are analysed using statistical methods and GIS mapping techniques. The findings reveal the importance of sustainable hazard mitigation strategies and environmental design techniques in preserving heritage sites during natural disasters and suggest that such techniques can mitigate the impact of future disasters in Bam and the surrounding areas. Specifically, the study recommends the establishment of a comprehensive early warning system, the creation of flood-resistant landscapes, and the use of eco-friendly building materials in the reconstruction of heritage sites. These findings contribute to the current knowledge of sustainable hazard mitigation and environmental design in heritage conservation.

Keywords: natural disasters, heritage conservation, sustainable hazard mitigation, environmental design, landscape architecture, flood management, disaster resilience

Procedia PDF Downloads 55
16890 Investigation of Single Particle Breakage inside an Impact Mill

Authors: E. Ghasemi Ardi, K. J. Dong, A. B. Yu, R. Y. Yang

Abstract:

In the current work, a numerical model based on the discrete element method (DEM) was developed to provide information about particle dynamics and impact conditions inside a laboratory-scale impact mill (Fritsch). It showed that each particle mostly experiences three impacts inside the mill. While the first impact frequently happens at the front surface of the rotor's rib, the second impact most often occurs at the side surfaces of the rotor's rib. It was also shown that while the first impact happens at a small impact angle, mostly around 35º, the second impact happens at around 70º, which is close to a normal impact. An analysis of the impact energy revealed that, as the mill speed varied from 6,000 to 14,000 rpm, the ratio of the first impact's average energy to the minimum energy required to break a particle (Wₘᵢₙ) increased from 0.30 to 0.85. Moreover, the second impact imparts intense energy to the particle, which can be considered the main cause of particle splitting. Finally, information from the DEM simulations, along with data from the experiments, was used in semi-empirical equations to find the selection and breakage functions. Then, using a back-calculation approach, those parameters were used to predict the particle size distributions (PSDs) of ground particles under different impact energies. The predictions were compared with experimental results and showed reasonable accuracy.

Keywords: single particle breakage, particle dynamic, population balance model, particle size distribution, discrete element method

Procedia PDF Downloads 262
16889 Morphology Operation and Discrete Wavelet Transform for Blood Vessels Segmentation in Retina Fundus

Authors: Rita Magdalena, N. K. Caecar Pratiwi, Yunendah Nur Fuadah, Sofia Saidah, Bima Sakti

Abstract:

Vessel segmentation of the retinal fundus is important in the biomedical sciences for diagnosing ailments related to the eye, as it can assist medical experts in assessing the state of retinal fundus images. Therefore, in this study, we designed MATLAB software that segments the retinal blood vessels in retinal fundus images. There are two main steps in the segmentation process. The first step is image preprocessing, which aims to improve the quality of the image for optimal segmentation. The second step is image segmentation, which extracts the retina's blood vessels from the eye fundus image. The segmentation methods analyzed in this study are morphology operations, the discrete wavelet transform, and a combination of both. The dataset used in this project comprises 40 retinal images and 40 manually segmented images. After several testing scenarios, the average accuracy for the morphology operation method is 88.46%, while for the discrete wavelet transform it is 89.28%. By combining the two methods, the average accuracy increased to 89.53%. The result of this study is an image processing system that can segment the blood vessels in the retinal fundus with high accuracy and low computation time.
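
The morphology step can be illustrated with a minimal, library-free sketch of binary erosion, dilation, and an opening-based top-hat, one common way to keep thin vessel-like structures. The actual MATLAB pipeline and structuring elements may differ:

```python
def binary_dilate(img):
    """Binary dilation with a 3x3 square structuring element: a pixel is set
    if any pixel in its 3x3 neighbourhood is set (clipped at the border)."""
    h, w = len(img), len(img[0])
    return [[int(any(img[ny][nx]
                     for ny in range(max(0, y - 1), min(h, y + 2))
                     for nx in range(max(0, x - 1), min(w, x + 2))))
             for x in range(w)] for y in range(h)]

def binary_erode(img):
    """Binary erosion: a pixel survives only if its whole (clipped) 3x3
    neighbourhood is set."""
    h, w = len(img), len(img[0])
    return [[int(all(img[ny][nx]
                     for ny in range(max(0, y - 1), min(h, y + 2))
                     for nx in range(max(0, x - 1), min(w, x + 2))))
             for x in range(w)] for y in range(h)]

def tophat(img):
    """Image minus its morphological opening (erode then dilate): structures
    thinner than the structuring element, such as vessels, are preserved."""
    opened = binary_dilate(binary_erode(img))
    return [[img[y][x] - opened[y][x] for x in range(len(img[0]))]
            for y in range(len(img))]
```

On a binary vessel map, the opening removes thin structures, so the top-hat isolates exactly those structures; this is the intuition behind morphology-based vessel extraction.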

Keywords: discrete wavelet transform, fundus retina, morphology operation, segmentation, vessel

Procedia PDF Downloads 168
16888 Optimal Linear Quadratic Digital Tracker for the Discrete-Time Proper System with an Unknown Disturbance

Authors: Jason Sheng-Hong Tsai, Faezeh Ebrahimzadeh, Min-Ching Chung, Shu-Mei Guo, Leang-San Shieh, Tzong-Jiy Tsai, Li Wang

Abstract:

In this paper, we first construct a new state and disturbance estimator using a discrete-time proportional-plus-integral observer to estimate the system state and the unknown external disturbance for a discrete-time system with an input-to-output direct-feedthrough term. Then, the generalized optimal linear quadratic digital tracker design is applied to construct a proportional-plus-integral observer-based tracker for the system with an unknown external disturbance, so as to achieve a desired tracking performance. Finally, a numerical simulation is given to demonstrate the effectiveness of this new application of our proposed approach.
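
The linear quadratic machinery behind such a tracker can be illustrated on a scalar plant. This sketch solves the discrete-time algebraic Riccati equation by fixed-point iteration and shows the resulting stabilizing gain; the paper's PI observer, disturbance estimate, and tracking terms are not reproduced:

```python
def scalar_dlqr(a, b, q, r, iters=500):
    """Solve the scalar discrete-time algebraic Riccati equation

        p = q + a*p*a - (a*p*b)**2 / (r + b*p*b)

    by fixed-point iteration and return the optimal state-feedback gain K
    (control law u = -K*x). Minimal LQ sketch only, not the full
    observer-based tracker design."""
    p = q
    for _ in range(iters):
        p = q + a * a * p - (a * b * p) ** 2 / (r + b * b * p)
    return a * b * p / (r + b * b * p)

# Unstable plant x_{k+1} = 1.2*x_k + u_k; the LQ gain stabilizes it
K = scalar_dlqr(1.2, 1.0, 1.0, 1.0)
x = 1.0
for _ in range(50):
    x = 1.2 * x - K * x
```

The closed-loop pole `a - b*K` lands well inside the unit circle, so the regulated state decays geometrically; the tracker version adds feedforward and disturbance-rejection terms on top of this regulator.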

Keywords: non-minimum phase system, optimal linear quadratic tracker, proportional plus integral observer, state and disturbance estimator

Procedia PDF Downloads 480
16887 Multi-Layer Perceptron and Radial Basis Function Neural Network Models for Classification of Diabetic Retinopathy Disease Using Video-Oculography Signals

Authors: Ceren Kaya, Okan Erkaymaz, Orhan Ayar, Mahmut Özer

Abstract:

Diabetes mellitus (diabetes) is a disease caused by insulin hormone disorders that results in high blood glucose. Clinical findings indicate that diabetes can be diagnosed from electrophysiological signals obtained from the vital organs. Diabetic retinopathy is one of the most common eye diseases resulting from diabetes, and it is the leading cause of vision loss due to structural alteration of the retinal layer vessels. In this study, features of horizontal and vertical video-oculography (VOG) signals are used to classify non-proliferative and proliferative diabetic retinopathy. Twenty-five features are extracted by applying the discrete wavelet transform to VOG signals taken from 21 subjects. Two models, based on the multi-layer perceptron (MLP) and the radial basis function (RBF), are recommended for the diagnosis of diabetic retinopathy; the proposed models can also detect the level of the disease. We compare the classification performance of the proposed models: our results show that the RBF model (100%) achieves better classification performance than the MLP model (94%).

Keywords: diabetic retinopathy, discrete wavelet transform, multi-layer perceptron, radial basis function, video-oculography (VOG)

Procedia PDF Downloads 231
16886 Simulations in Structural Masonry Walls with Chases Horizontal Through Models in State Deformation Plan (2D)

Authors: Raquel Zydeck, Karina Azzolin, Luis Kosteski, Alisson Milani

Abstract:

This work presents numerical models in plane deformation (2D), using the Discrete Element Method formed by bars (LDEM) and the Finite Element Method (FEM), of structural masonry walls with horizontal chases of 20%, 30%, and 50% depth, located in the central part and the upper third of the wall, with centered and eccentric loading. Different combinations of boundary conditions and interactions between the methods were studied.

Keywords: chases in structural masonry walls, discrete element method formed by bars, finite element method, numerical models, boundary condition

Procedia PDF Downloads 135
16885 Landslide Hazard Zonation and Risk Studies Using Multi-Criteria Decision-Making and Slope Stability Analysis

Authors: Ankit Tyagi, Reet Kamal Tiwari, Naveen James

Abstract:

In India, landslides are the most frequently occurring disaster in the Himalayas and the Western Ghats. The steep slopes and the land use in these areas are of particular concern. In the recent past, many landslide hazard zonation (LHZ) studies have been carried out in the Himalayas. However, LHZ maps that consider temporal factors such as seismic ground shaking, seismic amplification at the surface, and rainfall remain limited. Hence, this study presents a comprehensive use of the multi-criteria decision-making (MCDM) method in landslide risk assessment. We conducted both geospatial and geotechnical analyses to minimize the danger of landslides. The geospatial analysis uses high-resolution satellite data to produce landslide causative factors, which were weighted using the MCDM method. The geotechnical analysis includes a slope stability check, carried out to identify potentially unstable slopes. The resulting landslide risk map provides useful information that helps people understand the risk of living in an area.
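
The MCDM weighting step in such studies is commonly done with the analytic hierarchy process (AHP), listed in the keywords. A minimal sketch using the geometric-mean approximation, with hypothetical pairwise judgements among three causative factors:

```python
import math

def ahp_weights(pairwise):
    """Priority weights from an AHP pairwise-comparison matrix using the
    geometric-mean (logarithmic least squares) approximation. Entry
    pairwise[i][j] states how much more important factor i is than j,
    and pairwise[j][i] should be its reciprocal."""
    n = len(pairwise)
    gm = [math.prod(row) ** (1.0 / n) for row in pairwise]  # row geometric means
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical judgements among three causative factors
# (slope, rainfall, land use) -- the values are assumptions for illustration
matrix = [[1.0, 3.0, 5.0],
          [1/3, 1.0, 3.0],
          [1/5, 1/3, 1.0]]
weights = ahp_weights(matrix)
```

The resulting weights multiply the normalized causative-factor layers in the GIS overlay to produce the hazard score; for a perfectly consistent matrix the geometric-mean weights coincide with the principal-eigenvector weights.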

Keywords: landslide hazard zonation, PHA, AHP, GIS

Procedia PDF Downloads 162
16884 The Influence of Air Temperature Controls in Estimation of Air Temperature over Homogeneous Terrain

Authors: Fariza Yunus, Jasmee Jaafar, Zamalia Mahmud, Nurul Nisa’ Khairul Azmi, Nursalleh K. Chang, Nursalleh K. Chang

Abstract:

Variation of air temperature from one place to another is caused by air temperature controls. In general, the most important control of air temperature is elevation. Another significant independent variable in estimating air temperature is the location of the meteorological stations. Distance to the coastline and land use type also contribute to significant variations in air temperature. Over homogeneous terrain, direct interpolation of discrete air temperature points works well for estimating air temperature in unsampled areas; in this process, the estimation is based solely on the discrete air temperature points. However, this study shows that air temperature controls also play a significant role in estimating air temperature over the homogeneous terrain of Peninsular Malaysia. An Inverse Distance Weighting (IDW) interpolation technique was adopted to generate continuous air temperature data. The study compared two different datasets: observed mean monthly values of T, and the estimation error T−T′, where T′ is the value estimated from a multiple regression model. The multiple regression model considered eight independent variables (elevation, latitude, longitude, distance to coastline, and four land use types: water bodies, forest, agriculture, and built-up areas) to represent the role of air temperature controls. Cross-validation analysis was conducted to assess the accuracy of the estimated values. The final results show that the T−T′ estimates produced lower errors for mean monthly air temperature over homogeneous terrain in Peninsular Malaysia.
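
The IDW interpolation step can be sketched in a few lines; the power parameter and the station coordinates in the usage example are illustrative assumptions:

```python
def idw(points, px, py, power=2.0):
    """Inverse Distance Weighting: estimate the value at (px, py) from
    (x, y, value) samples, weighting each sample by 1/distance**power.
    Returns the sample value exactly at a sample location."""
    num = den = 0.0
    for x, y, v in points:
        d2 = (x - px) ** 2 + (y - py) ** 2
        if d2 == 0.0:          # exactly at a station: return its observation
            return v
        w = d2 ** (-power / 2.0)
        num += w * v
        den += w
    return num / den

# Hypothetical stations: (x, y, mean monthly temperature in deg C)
stations = [(0.0, 0.0, 25.0), (2.0, 0.0, 27.0), (1.0, 2.0, 26.5)]
estimate = idw(stations, 1.0, 1.0)
```

The study's point is that interpolating the residual T−T′ instead of T itself removes the part of the variation explained by the temperature controls, leaving a smoother field that IDW handles with lower error.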

Keywords: air temperature control, interpolation analysis, peninsular Malaysia, regression model, air temperature

Procedia PDF Downloads 351
16883 Climate Change Adaptation in the U.S. Coastal Zone: Data, Policy, and Moving Away from Moral Hazard

Authors: Thomas Ruppert, Shana Jones, J. Scott Pippin

Abstract:

State and federal government agencies within the United States have recently invested substantial resources into studies of future flood risk conditions associated with climate change and sea-level rise. A review of numerous case studies has uncovered several key themes that speak to an overall incoherence within current flood risk assessment procedures in the U.S. context. First, there are substantial local differences in the quality of available information about basic infrastructure, particularly with regard to local stormwater features and essential facilities that are fundamental components of effective flood hazard planning and mitigation. Second, there can be substantial mismatch between regulatory Flood Insurance Rate Maps (FIRMs) as produced by the National Flood Insurance Program (NFIP) and other 'current condition' flood assessment approaches. This is of particular concern in areas where FIRMs already seem to underestimate extant flood risk, which can only be expected to become a greater concern if future FIRMs do not appropriately account for changing climate conditions. Moreover, while there are incentives within the NFIP’s Community Rating System (CRS) to develop enhanced assessments that include future flood risk projections from climate change, the incentive structures seem to have counterintuitive implications that would tend to promote moral hazard. In particular, a technical finding of higher future risk seems to make it easier for a community to qualify for flood insurance savings, with much of these prospective savings applied to individual properties that have the most physical risk of flooding. However, there is at least some case study evidence to indicate that recognition of these issues is prompting broader discussion about the need to move beyond FIRMs as a standalone local flood planning standard. The paper concludes with approaches for developing climate adaptation and flood resilience strategies in the U.S. that move away from the social welfare model being applied through NFIP and toward more of an informed risk approach that transfers much of the investment responsibility over to individual private property owners.

Keywords: climate change adaptation, flood risk, moral hazard, sea-level rise

Procedia PDF Downloads 78
16882 A Hybrid Watermarking Model Based on Frequency of Occurrence

Authors: Hamza A. A. Al-Sewadi, Adnan H. M. Al-Helali, Samaa A. K. Khamis

Abstract:

Ownership of multimedia such as text, image, audio, or video files can be proved by burying a watermark in them. This is achieved by introducing modifications into the files that are imperceptible to the human senses but easily recoverable by a computer program. These modifications may be in the time domain, the frequency domain, or both. This paper presents a watermarking procedure that mixes amplitude modulation with a frequency-transformation histogram: a specific value is used to modulate the intensity component Y of the YIQ representation of the carrier image. This scheme is referred to as the histogram embedding technique (HET). Comparison with other techniques, such as the discrete wavelet transform (DWT), discrete cosine transform (DCT), and singular value decomposition (SVD), has shown enhanced efficiency in terms of ease and performance. The method exhibits a good degree of robustness against various environmental effects such as resizing, rotation, and different kinds of noise, and would prove a very useful technique for copyright protection and ownership judgment.
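
The modulation step can be sketched as follows. This shows only the standard RGB-to-YIQ luma conversion and a ±delta amplitude modulation of Y; the histogram-based selection of where and how strongly to modulate, which defines HET, is not reproduced:

```python
def rgb_to_yiq(r, g, b):
    """NTSC RGB -> YIQ conversion; only Y (luma) is needed for embedding."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    i = 0.596 * r - 0.274 * g - 0.322 * b
    q = 0.211 * r - 0.523 * g + 0.312 * b
    return y, i, q

def embed_bit(y, bit, delta=2.0):
    """Amplitude-modulate one luma sample: +delta encodes a 1 bit, -delta a 0.
    Illustrative sketch of the modulation step only; delta is a hypothetical
    embedding strength."""
    return y + (delta if bit else -delta)

def extract_bit(y_marked, y_original):
    """Non-blind extraction: compare the marked luma with the original."""
    return 1 if y_marked > y_original else 0
```

A small delta keeps the change imperceptible while remaining machine-recoverable, which is the core trade-off every watermarking scheme, HET included, must balance.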

Keywords: authentication, copyright protection, information hiding, ownership, watermarking

Procedia PDF Downloads 541
16881 Critically Sampled Hybrid Trigonometry Generalized Discrete Fourier Transform for Multistandard Receiver Platform

Authors: Temidayo Otunniyi

Abstract:

This paper presents a low-computation channelization algorithm for multi-standard platforms using a polyphase implementation of a critically sampled hybrid trigonometry generalized discrete Fourier transform (HGDFT). The HGDFT channelization algorithm exploits the orthogonality of two trigonometric Fourier functions, together with the properties of the Quadrature Mirror Filter Bank (QMFB) and the Exponentially Modulated Filter Bank (EMFB), respectively. HGDFT shows improvement in its implementation in terms of high reconfigurability, lower filter length, parallelism, and moderate computational load. Type I and type III polyphase structures are derived for real-valued HGDFT modulation. The design specifications are critically decimated and oversampled for both single- and multi-standard receiver platforms. Evaluating the performance of oversampled single-standard receiver channels, the HGDFT algorithm achieved a 40% complexity reduction, compared to 34% and 38% reductions for the discrete Fourier transform (DFT) and tree quadrature mirror filter (TQMF) algorithms. The parallel generalized discrete Fourier transform (PGDFT) and recombined generalized discrete Fourier transform (RGDFT) had a 41% complexity reduction, while HGDFT had a 46% reduction in oversampled multi-standard mode. In the critically sampled multi-standard receiver channels, HGDFT had a complexity reduction of 70%, while both PGDFT and RGDFT had a 34% reduction.

Keywords: software defined radio, channelization, critical sample rate, over-sample rate

Procedia PDF Downloads 95
16880 An Exploratory Study of Effects of Parenting Styles on Maternal Expectation and Perception of Compliance among Adolescents

Authors: Anton James

Abstract:

This study explored the contribution of parenting styles in the Maternal Perception of Compliance Model (MPCM). This model explores maternal expectations to illustrate the formation of maternal perception of severity of noncompliance in adolescent children. The methodology consisted of three stages: in the first stage, a focus group was held, and the data was analysed to fine-tune the interview schedule. In the second stage, a single interview was held, and the interview schedule was further modified. The third and final stage consisted of interviewing six mothers who had adolescent children. They were chosen with a ‘maximum variation’ approach to represent three tiers of socioeconomic status and Asian, white and black ethnicities. The data was thematically analysed in a hybrid fashion: inductive coding and deductive assignment of codes into discrete parenting styles. The study found: a) parenting styles are not always discrete and can sometimes be mixed; b) parenting styles are influenced by culture, socioeconomic status, transgenerational knowledge, academic knowledge, observational knowledge, self-reflective knowledge, and parental anxiety; c) the parenting style functioned as a mediating mechanism that attempted to converge discrepancies between parental expectations of compliance and maternal perception of severity of noncompliance. The findings on parenting styles were discussed in relation to MPCM.

Keywords: compliance, expectation, parenting styles, perception

Procedia PDF Downloads 745
16879 Frequency of Occurrence Hybrid Watermarking Scheme

Authors: Hamza A. Ali, Adnan H. M. Al-Helali

Abstract:

Generally, a watermark is information that identifies the ownership of multimedia (text, image, audio or video files). Watermarking is achieved by introducing modifications into these files that are imperceptible to the human senses but easily recoverable by a computer program. These modifications are made according to a secret key in a descriptive model that operates in the time domain, the frequency domain, or both. This paper presents a procedure for watermarking that mixes amplitude modulation with a frequency transformation histogram; namely, a specific value is used to modulate the intensity component Y of the YIQ components of the carrier image. This scheme is referred to as the histogram embedding technique (HET). Comparison with the results of other techniques such as the discrete wavelet transform (DWT), discrete cosine transform (DCT) and singular value decomposition (SVD) has shown enhanced efficiency in terms of ease and performance. The method has manifested a good degree of robustness against various environmental effects such as resizing, rotation and different kinds of noise. It would prove a very useful technique for copyright protection and ownership judgment.

Keywords: watermarking, ownership, copyright protection, steganography, information hiding, authentication

Procedia PDF Downloads 346
16878 On the Survival of Individuals with Type 2 Diabetes Mellitus in the United Kingdom: A Retrospective Case-Control Study

Authors: Njabulo Ncube, Elena Kulinskaya, Nicholas Steel, Dmitry Pshezhetskiy

Abstract:

Life expectancy in the United Kingdom (UK) has been near constant since 2010, particularly for individuals aged 65 years and older. This trend has also been noted in several other countries. The slowdown in the increase of life expectancy was concurrent with the increase in the number of deaths caused by non-communicable diseases. Of particular concern is the worldwide exponential increase in the number of diabetes-related deaths. Previous studies have reported increased mortality hazards among diabetics compared to non-diabetics, and differing effects of antidiabetic drugs on mortality hazards. This study aimed to estimate the all-cause mortality hazards and related life expectancies among type 2 diabetes (T2DM) patients in the UK using the time-variant Gompertz-Cox model with frailty. The study also aimed to understand the major causes of the change in life expectancy growth in the last decade. A total of 221,182 individuals (30.8% T2DM, 57.6% male) aged 50 years and above, born between 1930 and 1960, inclusive, and diagnosed between 2000 and 2016, were selected from The Health Improvement Network (THIN) database of UK primary care data and followed up to 31 December 2016. About 13.4% of participants died during the follow-up period. The overall all-cause mortality hazard ratio of T2DM compared to non-diabetic controls was 1.467 (1.381-1.558) and 1.38 (1.307-1.457) when diagnosed at 50 to 59 years and 60 to 74 years of age, respectively. The estimated life expectancies among T2DM individuals without further comorbidities diagnosed at the age of 60 years were 2.43 (1930-1939 birth cohort), 2.53 (1940-1949 birth cohort) and 3.28 (1950-1960 birth cohort) years less than those of non-diabetic controls. However, the 1950-1960 birth cohort had a steeper hazard function than the 1940-1949 birth cohort for both T2DM and non-diabetic individuals. In conclusion, mortality hazards for people with T2DM continue to be higher than for non-diabetics. The steeper mortality hazard slope for the 1950-1960 birth cohort might indicate the sub-population contributing to the slowdown in the growth of life expectancy.
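
The mechanics of the Gompertz-Cox setup the abstract refers to can be sketched as follows: a Gompertz baseline hazard h(t) = a·exp(b·t) is scaled by a proportional (Cox-type) hazard ratio, and residual life expectancy is the integral of the resulting survival curve. The parameters a and b below are illustrative values for age 60, not estimates from the THIN data; only the hazard ratio (~1.45) echoes the abstract's range.

```python
import math

def gompertz_survival(t, a, b, hr=1.0):
    """S(t) = exp(-hr * (a/b) * (exp(b*t) - 1)); hr scales the baseline hazard."""
    return math.exp(-hr * (a / b) * (math.exp(b * t) - 1.0))

def residual_life_expectancy(a, b, hr=1.0, horizon=60.0, step=0.05):
    """Trapezoidal integration of the survival curve from age at entry."""
    grid = [i * step for i in range(int(horizon / step) + 1)]
    s = [gompertz_survival(t, a, b, hr) for t in grid]
    return step * (sum(s) - 0.5 * (s[0] + s[-1]))

a, b = 0.005, 0.09                                 # illustrative Gompertz parameters
e_ctrl = residual_life_expectancy(a, b, hr=1.0)    # non-diabetic controls
e_t2dm = residual_life_expectancy(a, b, hr=1.45)   # hazard ratio echoing the abstract
gap = e_ctrl - e_t2dm                              # years of life expectancy lost
```

With these illustrative parameters the gap comes out to a few years, the same order of magnitude as the 2.4-3.3 year deficits reported in the abstract; a frailty term would enter as a further multiplicative random effect on the hazard.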

Keywords: T2DM, Gompertz-Cox model with frailty, all-cause mortality, life expectancy

Procedia PDF Downloads 97
16877 Reliability-Based Ductility Seismic Spectra of Structures with Tilting

Authors: Federico Valenzuela-Beltran, Sonia E. Ruiz, Alfredo Reyes-Salazar, Juan Bojorquez

Abstract:

A reliability-based methodology that uses structural demand hazard curves to account for the increase in the ductility demands of structures with tilting is proposed. The approach considers the effect of two orthogonal components of the ground motions as well as the influence of soil-structure interaction. It involves the calculation of ductility demand hazard curves for symmetric systems and, alternatively, for systems with different degrees of asymmetry. To this end, demand hazard curves corresponding to different global ductility demands of the systems are calculated. Next, Uniform Exceedance Rate Spectra (UERS) are developed for a specific mean annual rate of exceedance. Ratios between UERS corresponding to asymmetric and to symmetric systems located in the soft soil of the Valley of Mexico are obtained. Results indicate that the ductility demands of tilted structures may be several times higher than those of symmetric structures, depending on several factors such as the tilting angle and the vibration periods of the structure and the soil.

Keywords: asymmetric yielding, seismic performance, structural reliability, tilted structures

Procedia PDF Downloads 483
16876 Future Projection of Glacial Lake Outburst Floods Hazard: A Hydrodynamic Study of the Highest Lake in the Dhauliganga Basin, Uttarakhand

Authors: Ashim Sattar, Ajanta Goswami, Anil V. Kulkarni

Abstract:

Glacial lake outburst floods (GLOFs) contribute significantly to mountain hazards in the Himalaya. Over the past decade, high altitude lakes in the Himalaya have shown notable growth in their size and number. The key reason is the rapid retreat of their glacier fronts. Hydrodynamic modeling of GLOFs using shallow water equations (SWE) helps in understanding their impact on the downstream region. The present study incorporates remote sensing based ice thickness modeling to determine the future extent of the Dhauliganga Lake, mapping the overdeepening extent around the highest lake in the Dhauliganga basin. The maximum future volume of the lake, calculated using area-volume scaling, is used to model a GLOF event. The GLOF hydrograph is routed along the channel using one-dimensional and two-dimensional models to understand the flood wave propagation until it reaches the first hydropower station located 72 km downstream of the lake. The present extent of the lake calculated using SENTINEL 2 images is 0.13 km². The maximum future extent of the lake, mapped by investigating the glacier bed, has a calculated scaled volume of 3.48 × 10⁶ m³. The GLOF modeling, releasing the future volume of the lake, resulted in a breach hydrograph with a peak flood of 4995 m³/s just downstream of the lake.
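
The area-volume scaling step can be illustrated as below. The abstract does not state which empirical relation was used, so the sketch adopts the widely cited form V = 0.104·A^1.42 (A in m², V in m³, after Huggel et al.) purely as an illustrative stand-in.

```python
def lake_volume_m3(area_km2, c=0.104, gamma=1.42):
    """Empirical area-volume scaling for glacial lakes (illustrative relation)."""
    area_m2 = area_km2 * 1.0e6   # convert km² to m²
    return c * area_m2 ** gamma

# Present extent mapped from SENTINEL 2 imagery (0.13 km² per the abstract)
v_present = lake_volume_m3(0.13)
```

Under this relation the present 0.13 km² extent yields a volume on the order of 10⁶ m³, and the future overdeepening extent would need to grow only modestly to reach the 3.48 × 10⁶ m³ the abstract reports, since volume scales superlinearly with area.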

Keywords: GLOF, glacial lake outburst floods, mountain hazard, Central Himalaya, future projection

Procedia PDF Downloads 135
16875 A Novel Integration of Berth Allocation, Quay Cranes and Trucks Scheduling Problems in Container Terminals

Authors: M. Moharami Gargari, S. Javdani Zamani, A. Mohammadnejad, S. Abuali

Abstract:

As maritime container transport is developing fast, the need arises for efficient operations at container terminals. One of the most important determinants of container handling efficiency is the productivity of quay cranes and internal transportation vehicles, which are responsible for transporting containers during the unloading and loading of container vessels. For this reason, this paper presents an integrated mathematical model formulation for discrete berths with quay cranes and internal transportation vehicles. These problems have received increasing attention in the literature, and the present paper deals with the integration of these interrelated problems. A new mixed integer linear formulation is developed for the Berth Allocation Problem (BAP), the Quay Crane Assignment and Scheduling Problem (QCASP) and Internal Transportation Scheduling (ITS), which accounts for crane and truck positioning conditions.

Keywords: discrete berths, container terminal, truck scheduling, dynamic vessel arrival

Procedia PDF Downloads 368
16874 Speech Intelligibility Improvement Using Variable Level Decomposition DWT

Authors: Samba Raju Chiluveru, Manoj Tripathy

Abstract:

Intelligibility is an essential characteristic of a speech signal that helps in understanding the information the signal carries. Background noise in the environment can deteriorate the intelligibility of recorded speech. In this paper, we present a simple variance-subtracted, variable level discrete wavelet transform that improves the intelligibility of speech. The proposed algorithm does not require an explicit estimate of the noise, i.e., prior knowledge of the noise; hence, it is easy to implement and reduces the computational burden. The algorithm decides a separate decomposition level for each frame based on signal-dominant and noise-dominant criteria. The performance of the proposed algorithm is evaluated with the short-time objective intelligibility (STOI) measure, and the results obtained are compared with universal discrete wavelet transform (DWT) thresholding and minimum mean square error (MMSE) methods. The experimental results revealed that the proposed scheme outperformed the competing methods.
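
The per-frame level selection can be sketched as below. The abstract does not spell out the exact signal-dominant/noise-dominant criteria, so the sketch uses a hypothetical rule: keep decomposing with a Haar DWT while the detail band still looks noise-like (variance below a threshold `noise_var`), and stop once it becomes signal-dominant.

```python
import statistics

def haar_step(x):
    """One level of the (unnormalized) Haar DWT: (approximation, detail)."""
    n = len(x) - len(x) % 2
    approx = [(x[i] + x[i + 1]) / 2 for i in range(0, n, 2)]
    detail = [(x[i] - x[i + 1]) / 2 for i in range(0, n, 2)]
    return approx, detail

def choose_level(frame, max_level=4, noise_var=0.01):
    """Pick a decomposition level for this frame (hypothetical criterion)."""
    level, x = 0, list(frame)
    while level < max_level and len(x) >= 4:
        x, detail = haar_step(x)
        level += 1
        if statistics.pvariance(detail) > noise_var:
            break  # detail band is signal-dominant: stop at this level
    return level
```

A nearly flat, low-energy frame is driven to the maximum level (deep decomposition where noise dominates), while a frame with strong sample-to-sample structure stops at the first level, which is the qualitative behavior the abstract describes.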

Keywords: discrete wavelet transform, speech intelligibility, STOI, standard deviation

Procedia PDF Downloads 114
16873 Discrete State Prediction Algorithm Design with Self Performance Enhancement Capacity

Authors: Smail Tigani, Mohamed Ouzzif

Abstract:

This work presents a discrete quantitative state prediction algorithm with intelligent behavior, making it able to self-improve some performance aspects. The specificity of this algorithm is its capacity for self-rectification of the prediction strategy before the final decision. The auto-rectification mechanism is based on two parallel mathematical models. On one hand, the algorithm predicts the next state based on an event transition matrix updated after each observation. On the other hand, the algorithm extracts its residue trend with a linear regression over historical residue data points in order to rectify the first decision if needed. For a normal distribution, the interaction between the two models allows the algorithm to self-optimize its performance and make better predictions. A designed key performance indicator, computed during a Monte Carlo simulation, shows the advantages of the proposed approach compared with the traditional one.
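
The two parallel models can be sketched as below: a Markov transition-count matrix produces a first prediction, and a least-squares trend over past residues (observed minus predicted) can rectify it. The rectification rule (round the extrapolated trend and add it to the first decision) is a hypothetical reading of the abstract, not the paper's exact mechanism.

```python
class SelfRectifyingPredictor:
    def __init__(self, n_states):
        self.n = n_states
        self.counts = [[0] * n_states for _ in range(n_states)]
        self.residues = []  # observed - predicted, in arrival order

    def observe(self, prev_state, state, predicted):
        """Update the transition matrix and record the prediction residue."""
        self.counts[prev_state][state] += 1
        self.residues.append(state - predicted)

    def markov_predict(self, state):
        """First decision: most frequent successor of the current state."""
        row = self.counts[state]
        return row.index(max(row)) if sum(row) else state

    def residue_trend(self):
        """Least-squares slope and intercept of residues against time."""
        m = len(self.residues)
        if m < 2:
            return 0.0, 0.0
        mx, my = (m - 1) / 2, sum(self.residues) / m
        sxx = sum((x - mx) ** 2 for x in range(m))
        sxy = sum((x - mx) * (y - my) for x, y in enumerate(self.residues))
        slope = sxy / sxx
        return slope, my - slope * mx

    def predict(self, state):
        """Final decision: Markov prediction rectified by the residue trend."""
        first = self.markov_predict(state)
        slope, intercept = self.residue_trend()
        correction = round(slope * len(self.residues) + intercept)
        return min(self.n - 1, max(0, first + correction))
```

On a deterministic cycle with unbiased past predictions, the residue trend is flat and the transition matrix alone drives the output; a systematic bias in the residues would shift the final decision.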

Keywords: discrete state, Markov Chains, linear regression, auto-adaptive systems, decision making, Monte Carlo Simulation

Procedia PDF Downloads 475
16872 Seismic Resistant Columns of Buildings against the Differential Settlement of the Foundation

Authors: Romaric Desbrousses, Lan Lin

Abstract:

The objective of this study is to determine how Canadian seismic design provisions affect the axial load resistance of columns in moment-resisting frame reinforced concrete buildings subjected to differential settlement of their foundations. To do so, two four-storey buildings are designed in accordance with the seismic design provisions of the Canadian Concrete Design Standard. One building is located in Toronto, which is situated in a moderate seismic hazard zone in Canada, and the other in Vancouver, which is in Canada’s highest seismic hazard zone. A finite element model of each building is developed using SAP 2000. A 100 mm settlement is assigned to the base of each building’s center column. The axial load resistance of the column is represented by the demand capacity ratio. The analysis results show that settlement-induced tensile axial forces have a particularly detrimental effect on the conventional settling columns of the Toronto building, which fail at a much smaller settlement than those in the Vancouver building. The results also demonstrate that particular care should be taken in the design of columns in short-span buildings.

Keywords: columns, demand, foundation differential settlement, seismic design, non-linear analysis

Procedia PDF Downloads 106