Search results for: calculating method of cracking probability
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 20355


20145 Learning a Bayesian Network for Situation-Aware Smart Home Service: A Case Study with a Robot Vacuum Cleaner

Authors: Eu Tteum Ha, Seyoung Kim, Jeongmin Kim, Kwang Ryel Ryu

Abstract:

The smart home environment, backed by IoT (internet of things) technologies, enables intelligent services based on awareness of the situation a user is currently in. One convenient sensor for recognizing situations within a home is the smart meter, which can monitor the status of each electrical appliance in real time. This paper aims at learning a Bayesian network that models the causal relationship between user situations and the status of the electrical appliances. Using such a network, we can infer the current situation from the observed status of the appliances. However, learning the conditional probability tables (CPTs) of the network requires many training examples, which cannot be obtained unless the user situations are closely monitored by some means. This paper proposes a method for learning the CPT entries of the network relying only on occasional user feedback. In our case study with a robot vacuum cleaner, feedback comes in whenever the user gives the robot an order contrary to its preprogrammed setting. Given a network with randomly initialized CPT entries, the proposed method uses this feedback to adjust the relevant CPT entries in the direction of increasing the probability of recognizing the desired situations. Simulation experiments show that our method can rapidly improve the recognition performance of the Bayesian network using a relatively small number of feedback events.
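The inference and feedback-learning loop described in the abstract can be sketched in miniature. The situations, appliances, CPT numbers and learning rate below are hypothetical stand-ins, not the paper's actual network:

```python
# Minimal sketch (hypothetical numbers): infer the current home situation
# from appliance on/off status with Bayes' rule, and nudge CPT entries
# toward a user-indicated situation when feedback arrives.

situations = ["cleaning_ok", "do_not_disturb"]
prior = {"cleaning_ok": 0.5, "do_not_disturb": 0.5}
# P(appliance is ON | situation) -- assumed conditional probability tables
cpt = {
    "tv":     {"cleaning_ok": 0.2, "do_not_disturb": 0.9},
    "washer": {"cleaning_ok": 0.7, "do_not_disturb": 0.3},
}

def posterior(observed_on):
    """P(situation | appliance status), assuming conditional independence."""
    scores = {}
    for s in situations:
        p = prior[s]
        for app, on in observed_on.items():
            p *= cpt[app][s] if on else 1.0 - cpt[app][s]
        scores[s] = p
    z = sum(scores.values())
    return {s: p / z for s, p in scores.items()}

def feedback_update(observed_on, desired, lr=0.1):
    """Move the relevant CPT entries so `desired` becomes more probable."""
    for app, on in observed_on.items():
        target = 1.0 if on else 0.0
        p = cpt[app][desired]
        cpt[app][desired] = p + lr * (target - p)

obs = {"tv": True, "washer": False}
before = posterior(obs)["do_not_disturb"]
feedback_update(obs, "do_not_disturb")   # user overrides the robot
after = posterior(obs)["do_not_disturb"]
```

Each feedback event nudges only the CPT entries of the observed appliances, mirroring the idea of adjusting relevant entries in the direction of the desired situation.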

Keywords: Bayesian network, IoT, learning, situation-awareness, smart home

Procedia PDF Downloads 523
20144 Optimal Mitigation of Slopes by Probabilistic Methods

Authors: D. De-León-Escobedo, D. J. Delgado-Hernández, S. Pérez

Abstract:

A probabilistic formulation to assess slope safety under the hazard of strong storms is presented and illustrated through a slope in Mexico. The formulation is based on the classical safety factor (SF) used in practice to appraise slope stability, but introduces the treatment of uncertainties, and the slope failure probability is calculated as the probability that SF < 1. As the main hazard is rainfall in the area, statistics of rainfall intensity and duration are considered and modeled with an exponential distribution. The expected life-cycle cost is assessed by assigning a monetary value to the consequences of slope failure. Alternative mitigation measures are simulated, and the formulation is used to identify the optimal one (minimum life-cycle cost). For the example, the optimal mitigation measure is a reduction of the slope inclination angle.
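The failure probability P(SF < 1) under exponentially distributed rainfall intensity, and the expected life-cycle cost built on it, can be sketched by Monte Carlo. The linear degradation law, rainfall mean and cost figures below are assumed for illustration and are not taken from the paper:

```python
import random

random.seed(1)

# Hypothetical model: the safety factor degrades with rainfall intensity i,
# where i ~ Exponential(mean_intensity). Failure occurs when SF < 1.
mean_intensity = 20.0   # mm/h, assumed

def safety_factor(i, base_sf=1.4, k=0.015):
    return base_sf - k * i   # assumed linear degradation with rainfall

n = 100_000
failures = sum(1 for _ in range(n)
               if safety_factor(random.expovariate(1.0 / mean_intensity)) < 1.0)
pf = failures / n   # estimate of P(SF < 1)

# Expected life-cycle cost = initial cost + failure cost weighted by Pf.
c_initial, c_failure = 1.0e6, 5.0e6   # monetary units, assumed
expected_cost = c_initial + c_failure * pf
```

Comparing `expected_cost` across candidate mitigation measures (each changing `base_sf` or `k`) is the optimisation the abstract describes.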

Keywords: expected life-cycle cost, failure probability, slopes failure, storms

Procedia PDF Downloads 160
20143 Biodiesel Synthesis Using Animal Excreta-Based Biochar and Waste Cooking Oil

Authors: Sang-Ryong Lee, Min-Woon Jung, Deugwoo Han, Kiyong Kim

Abstract:

This study emphasizes the possible use of biochar generated from pyrolysis of animal excreta to establish a green platform for producing biodiesel. To this end, the pseudo-catalytic transesterification reaction using chicken manure biochar and waste cooking oil was investigated. Compared with a commercial porous material (SiO2), chicken manure biochar generated at 350 °C showed better performance, resulting in a FAME yield of 95.6% at 350 °C. The Ca species in chicken manure biochar imparted strong catalytic capability by providing the basicity for transesterification. The identified catalytic effect also led to thermal cracking of unsaturated FAMEs, which decreased the overall FAME yield: for example, 40–60% of converted FAMEs were thermally degraded. To avoid undesirable thermal cracking arising from the high content of Ca species in chicken manure biochar, fabrication of chicken manure biochar at temperatures ≥ 350 °C is highly recommended.

Keywords: transesterification, animal excreta, FAME, biochar, chicken manure

Procedia PDF Downloads 199
20142 A Statistical Model for the Dynamics of Single Cathode Spot in Vacuum Cylindrical Cathode

Authors: Po-Wen Chen, Jin-Yu Wu, Md. Manirul Ali, Yang Peng, Chen-Te Chang, Der-Jun Jan

Abstract:

The dynamics of cathode spots has become a major part of vacuum arc discharge research, with high academic interest and wide application potential. In this article, using a three-dimensional statistical model, we simulate the distribution of the ignition probability of a new cathode spot occurring under different magnetic pressures on the old cathode spot surface and at different arcing times. This model for the ignition probability of a new cathode spot is formulated for two typical situations: one for a pure isotropic random walk in the absence of an external magnetic field, and the other for retrograde motion in an external magnetic field parallel to the cathode surface. We mainly focus on the relationship between the ignition probability density distribution of a new cathode spot and the external magnetic field.
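The two ignition scenarios, an isotropic random walk without a field and a drift-biased walk standing in for retrograde motion under a transverse field, can be sketched as follows. Step length, bias strength and walk counts are arbitrary illustrative values, not the paper's model parameters:

```python
import math
import random

random.seed(42)

def next_spot(x, y, step=1.0, bias=0.0, bias_dir=0.0):
    """One ignition jump: isotropic random walk (bias=0), or a walk biased
    toward a fixed retrograde direction `bias_dir` when a field is present."""
    theta = random.uniform(0, 2 * math.pi)
    dx = step * math.cos(theta) + bias * math.cos(bias_dir)
    dy = step * math.sin(theta) + bias * math.sin(bias_dir)
    return x + dx, y + dy

def mean_drift(n_walks, bias, n_steps=50):
    """Average final x-displacement over many simulated spot trajectories."""
    total = 0.0
    for _ in range(n_walks):
        x, y = 0.0, 0.0
        for _ in range(n_steps):
            x, y = next_spot(x, y, bias=bias, bias_dir=0.0)
        total += x
    return total / n_walks

iso = mean_drift(400, bias=0.0)    # near zero: no net motion without a field
retro = mean_drift(400, bias=0.3)  # clear net drift in the biased direction
```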

Keywords: cathode spot, vacuum arc discharge, transverse magnetic field, random walk

Procedia PDF Downloads 434
20141 The Role of Fluid Catalytic Cracking in Process Optimisation for Petroleum Refineries

Authors: Chinwendu R. Nnabalu, Gioia Falcone, Imma Bortone

Abstract:

Petroleum refining is a chemical process in which the raw material (crude oil) is converted into finished commercial products for end users. The fluid catalytic cracking (FCC) unit is a key asset in refineries, requiring optimised processes in the context of engineering design. Following the first stage of separation of crude oil in a distillation tower, an additional 40 per cent can be attained in the gasoline pool by further conversion of the downgraded product of crude oil (residue from the distillation tower) using a catalyst in the FCC process. Effective removal of sulphur oxides, nitrogen oxides, carbon and heavy metals from FCC gasoline requires greater separation efficiency and is of enormous environmental significance. The FCC unit is primarily a reactor and regeneration system which employs cyclone systems for separation. Catalyst losses in FCC cyclones lead to high particulate matter emission on the regenerator side and fines carryover into the product on the reactor side. This paper aims at demonstrating the importance of FCC unit design criteria in terms of technical performance and compliance with environmental legislation. A systematic review of state-of-the-art FCC technology was carried out, identifying its key technical challenges and sources of emissions. Case studies of petroleum refineries in Nigeria were assessed against selected global case studies. The review highlights the need for further modelling investigations to help improve FCC design to more effectively meet product specification requirements while complying with stricter environmental legislation.

Keywords: design, emission, fluid catalytic cracking, petroleum refineries

Procedia PDF Downloads 137
20140 New Concept for Real Time Selective Harmonics Elimination Based on Lagrange Interpolation Polynomials

Authors: B. Makhlouf, O. Bouchhida, M. Nibouche, K. Laidi

Abstract:

A variety of methods for selective harmonics elimination pulse width modulation have been developed, the most frequently used for real-time implementation being the look-up table method. To address real-time requirements, a method based on a modified carrier signal is proposed in the present work, with a general formulation for real-time harmonics control/elimination in switched inverters. The proposed method is first demonstrated for a single value of the modulation index. In reality, however, this parameter varies as a consequence of voltage (amplitude) variability. In this context, a simple interpolation method for calculating the modified sine carrier signal is proposed. The method allows continuous adjustment of both the amplitude and the frequency of the fundamental. To assess the performance of the proposed method, software simulations and hardware experiments have been carried out for a single-phase inverter. The results obtained are very satisfactory.
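The interpolation step can be illustrated with a plain Lagrange evaluation over a hypothetical table of precomputed values indexed by modulation index (the table entries below are made up, not the paper's data):

```python
def lagrange(points, x):
    """Evaluate the Lagrange interpolation polynomial through `points`
    (a list of (xi, yi) with distinct xi) at abscissa x."""
    total = 0.0
    for i, (xi, yi) in enumerate(points):
        term = yi
        for j, (xj, _) in enumerate(points):
            if j != i:
                term *= (x - xj) / (xi - xj)
        total += term
    return total

# Hypothetical table: precomputed carrier values for a few modulation
# indices; intermediate indices are interpolated at run time.
table = [(0.2, 0.10), (0.5, 0.31), (0.8, 0.55), (1.0, 0.72)]
value = lagrange(table, 0.65)   # carrier value at an intermediate index
```

The polynomial passes exactly through every tabulated point, so only intermediate modulation indices incur interpolation error.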

Keywords: harmonic elimination, Particle Swarm Optimisation (PSO), polynomial interpolation, pulse width modulation, real-time harmonics control, voltage inverter

Procedia PDF Downloads 503
20139 Concrete Cracking Simulation Using Vector Form Intrinsic Finite Element Method

Authors: R. Z. Wang, B. C. Lin, C. H. Huang

Abstract:

This study proposes a new method to simulate crack propagation under mode-I loading using the Vector Form Intrinsic Finite Element (VFIFE) method. A new approach combining VFIFE with the J-integral is proposed to calculate the stress intensity factor as the cracking criterion for elastic cracks. A procedure to implement cohesive crack propagation in VFIFE based on the fictitious crack model is also proposed. In VFIFE, the structural deformation is described by a number of particles instead of elements. The strain energy density and the derivatives of the displacement vector of every particle are introduced to calculate the J-integral, as the integral path is discretized by particles. The particle at the crack tip is separated into two particles once the stress at the crack tip satisfies the cracking criterion, and the crack tip then propagates to the next particle. The internal force and the cohesive force are applied to the particles.

Keywords: VFIFE, crack propagation, fictitious crack model, crack critical

Procedia PDF Downloads 335
20138 The Multiaxial Load Proportionality Effect on the Fracture Surface Topography of Forged Magnesium Alloys

Authors: Andrew Gryguć, Seyed Behzad Behravesh, Hamid Jahed, Mary Wells, Wojciech Macek, Bruce Williams

Abstract:

This extended abstract investigates the influence of the multiaxial loading on the fatigue behavior of forged magnesium through quantitative analysis of its fracture surface topography and mesoscopic cracking orientation. Fatigue tests were performed on hollow tubular sample geometries extracted from closed-die forged AZ80 Mg components, with three different multiaxial strain paths (axial/shear), proportional, 45° out of phase, and 90° out of phase. Regardless of the strain path, fatigue cracks are initiated at the outer surface of the specimen where the combined stress state is largest. Depending on the salient mode of deformation, distinctive features in the fracture surface manifested themselves with different topographic amplitudes, surface roughness, and mesoscopic cracking orientation in the vicinity of the initiation site. The dominant crack propagation path was in the circumferential direction of the hollow tubular specimen (i.e., cracking transverse to the sample axis, with little to no branching), which is congruent with previous findings of low to moderate shear strain energy density (SED) multiaxial loading. For proportional loading, the initiation zone surface morphology was largely flat and striated, whereas, at phase angles of 45° and 90°, the initiation surface became more faceted and inclined. Overall, both a qualitative and quantitative link was developed between the fracture surface morphology and the level of non-proportionality in the loading providing useful insight into the fracture mechanics of forged magnesium as a relevant focus for future study.

Keywords: fatigue, fracture, magnesium, forging, fractography, anisotropy, strain energy density, asymmetry, multiaxial fatigue

Procedia PDF Downloads 82
20137 A Hybrid Based Algorithm to Solve the Multi-objective Minimum Spanning Tree Problem

Authors: Boumesbah Asma, Chergui Mohamed El-amine

Abstract:

Since it has been shown that the multi-objective minimum spanning tree problem (MOST) is NP-hard even with two criteria, we propose in this study a hybrid NSGA-II algorithm with an exact mutation operator, used only with low probability, to find an approximation of the Pareto front of the problem. In a connected graph G, a spanning tree T of G is a connected and cycle-free subgraph; if k edges of G\T are added to T, we obtain a partial graph H of G inducing a multi-objective spanning tree problem of reduced size compared to the initial one. With a low probability for the mutation operator, an exact method for solving the reduced MOST problem on the graph H is then used to generate several mutated solutions from a spanning tree T. The selection operator of NSGA-II is then activated to obtain the Pareto front approximation. Finally, an adaptation of the VNS metaheuristic is applied for further improvement of this front. It allows finding good individuals to balance diversification and intensification during the optimization search process. Experimental comparisons with an exact method show promising results and indicate that the proposed algorithm is efficient.
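The Pareto-front bookkeeping at the heart of NSGA-II-style selection can be sketched with a simple non-dominance filter (the spanning-tree cost vectors below are invented for illustration):

```python
def dominates(a, b):
    """a dominates b (minimisation) if a is no worse in every objective
    and strictly better in at least one."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_front(solutions):
    """Keep the non-dominated cost vectors, as NSGA-II's selection does."""
    return [s for s in solutions
            if not any(dominates(t, s) for t in solutions if t != s)]

# Hypothetical two-objective cost vectors of candidate spanning trees:
trees = [(10, 7), (8, 9), (12, 4), (9, 8), (11, 5), (12, 8)]
front = pareto_front(trees)   # (12, 8) is dominated and drops out
```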

Keywords: minimum spanning tree, multiple objective linear optimization, combinatorial optimization, non-sorting genetic algorithm, variable neighborhood search

Procedia PDF Downloads 91
20136 The Falling Point of Lubricant

Authors: Arafat Husain

Abstract:

Lubricants are among the most used resources in today's world, and many economies depend on them to function. To verify that a lubricant has not been adulterated, and to identify which fluid has been added to it, an efficient test method is needed. To detect such malpractice, we propose the following method. An elastic ball is thrown with a fixed force at a probability circle submerged in the lubricant, and the distance of pitching and the point of fall are observed. The ratio of the falling distance to the pitching distance is then taken: if the measured ratio is greater than one, the fluid is less viscous, and if the ratio is less than one, the lubricant is more viscous. The falling point of the pure lubricant is first established at the fixed force, each pure lubricant having a fixed falling point. The lubricant is then adulterated and the falling point noted: if the falling point is less than the standard value, the adulterant is a solid, and if the adulterant is a liquid, the falling point will be greater than the standard value. Hence, comparison with the standard falling point gives the efficiency of the lubricant.

Keywords: falling point of lubricant, falling point ratios, probability circle, octane number

Procedia PDF Downloads 495
20135 Reliability Indices Evaluation of SEIG Rotor Core Magnetization with Minimum Capacitive Excitation for WECs

Authors: Lokesh Varshney, R. K. Saket

Abstract:

This paper presents a reliability indices evaluation of the rotor core magnetization of an induction motor operated as a self-excited induction generator (SEIG), using a probability distribution approach and Monte Carlo simulation. Parallel capacitors with a calculated minimum capacitive value across the terminals of the induction motor operating as a SEIG with unregulated shaft speed were connected during the experimental study. A three-phase, 4-pole, 50 Hz, 5.5 hp, 12.3 A, 230 V induction motor coupled with a DC shunt motor was tested in the electrical machines laboratory with variable reactive loads. Based on this experimental study, it is possible to choose a reliable induction machine to operate as a SEIG for unregulated renewable energy applications in remote areas or where the grid is not available. The failure density function, cumulative failure distribution function, survivor function, hazard model, probability of success and probability of failure for the reliability evaluation of the three-phase induction motor operating as a SEIG are presented graphically in this paper.
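The reliability functions named in the abstract have simple closed forms under a constant-failure-rate assumption, which the sketch below adopts purely for illustration, together with a Monte Carlo estimate of the probability of success over a mission time (all rates and times are assumed, not the paper's data):

```python
import math
import random

random.seed(7)

lam = 1.0 / 8000.0   # assumed constant failure rate (per hour) for the SEIG

def failure_density(t):  return lam * math.exp(-lam * t)           # f(t)
def failure_cdf(t):      return 1.0 - math.exp(-lam * t)           # F(t)
def survivor(t):         return math.exp(-lam * t)                 # R(t)
def hazard(t):           return failure_density(t) / survivor(t)   # h(t) = lam

# Monte Carlo estimate of the probability of surviving a mission time:
mission = 2000.0   # hours, assumed
n = 50_000
alive = sum(1 for _ in range(n) if random.expovariate(lam) > mission)
p_success = alive / n          # should approach R(mission) = exp(-0.25)
p_failure = 1.0 - p_success
```

For a constant rate the hazard is flat; a Weibull model would be the usual next step if the data showed wear-out.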

Keywords: residual magnetism, magnetization curve, induction motor, self excited induction generator, probability distribution, Monte Carlo simulation

Procedia PDF Downloads 558
20134 Investigation on Behaviour of Reinforced Concrete Beam-Column Joints Retrofitted with CFRP

Authors: Ehsan Mohseni

Abstract:

The aim of this thesis is to provide numerical analyses of reinforced concrete beam-column joints with and without CFRP (Carbon Fiber Reinforced Polymer) in order to achieve a better understanding of the behaviour of strengthened beam-column joints. A comprehensive literature survey prior to this study revealed that published studies are limited to a handful; the results are inconclusive and some are even contradictory. To improve on this situation, following that review, a numerical study was designed and performed, as presented in this thesis. For the numerical study, the dimensions, end supports, and characteristics of the beam and column models were the same as those chosen in a previous experimental investigation in which ten beam-column joints were tested to failure. Finite element analysis is a useful tool in cases where analytical methods cannot solve the problem due to its complexities. The cyclic behaviour of FRP-strengthened reinforced concrete beam-column joints is such a case. The interaction of steel (longitudinal bars and stirrups), concrete and FRP, yielding of steel bars and stirrups, cracking of concrete, redistribution of stresses as some elements unload due to crushing or yielding, and confinement of concrete due to the presence of FRP are some of the issues that introduce complexity into the problem. Numerical solutions, however, can provide further information about the behaviour in lieu of costly experiments or complex closed-form solutions. This thesis presents the results of a numerical study on beam-column joints subjected to cyclic loads and strengthened with CFRP wraps or strips in a variety of configurations. The analyses are performed with the Abaqus finite element program and are calibrated against the experiments.
A range of issues in beam-column joints, including the cracking load, the ultimate load, and the lateral load-displacement curves of the joints, is investigated. The numerical results for different strengthening configurations are compared. Finally, the computed numerical results are compared with those obtained from experiments. The cracking load, ultimate load, and lateral load-displacement curves obtained from the numerical analysis for all joints were in very good agreement with the corresponding experimental ones. The results obtained from the numerical analysis in most cases imply that this method is conservative and can therefore be used in design applications with confidence.

Keywords: numerical analysis, strengthening, CFRP, reinforced concrete joints

Procedia PDF Downloads 349
20133 Joint Probability Distribution of Extreme Water Level with Rainfall and Temperature: Trend Analysis of Potential Impacts of Climate Change

Authors: Ali Razmi, Saeed Golian

Abstract:

Climate change is known to have the potential to adversely impact hydrologic patterns for variables such as rainfall, maximum and minimum temperature, and sea level rise. The long-term averages of these climate variables could change over time due to climate change impacts. In this study, trend analysis was performed on rainfall, maximum and minimum temperature, and water level data for a coastal area in Manhattan, New York City (Central Park and Battery Park stations) to investigate whether there is a significant change in the data mean. The partial Mann-Kendall test was used for trend analysis. Frequency analysis was then performed on the data using common probability distribution functions such as the Generalized Extreme Value (GEV), normal, log-normal and log-Pearson distributions. Goodness-of-fit tests such as Kolmogorov-Smirnov were used to determine the most appropriate distributions. In flood frequency analysis, rainfall and water level data are often investigated separately. However, in determining flood zones, simultaneous consideration of rainfall and water level in frequency analysis could have a considerable effect on floodplain delineation (flood extent and depth). The present study aims to perform flood frequency analysis considering the joint probability distribution of rainfall and storm surge. First, the correlation between the considered variables was investigated. The joint probability distribution of extreme water level and temperature was also investigated to examine how global warming could affect sea level flooding impacts. Copula functions were fitted to the data, and the joint probability of water level with rainfall and temperature for recurrence intervals of 2, 5, 25, 50, 100, 200, 500, 600 and 1000 years was determined and compared with the severity of individual events. The trend analysis results showed an increase in the long-term average of the data that could be attributed to climate change impacts. The GEV distribution was found to be the most appropriate function to fit the extreme climate variables. The results of the joint probability distribution analysis confirmed the necessity of incorporating both rainfall and water level data in flood frequency analysis.
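The joint-exceedance idea can be sketched with a Gaussian copula; the abstract does not state which copula family was fitted, so the family and the correlation value below are illustrative assumptions:

```python
import math
import random

random.seed(3)

rho = 0.6   # assumed dependence between storm surge and rainfall (Gaussian copula)

def joint_exceedance(p_marginal, n=100_000):
    """Monte Carlo P(both variables exceed their marginal p-quantile)
    under a Gaussian copula with correlation rho."""
    hits = 0
    for _ in range(n):
        # correlated standard normals, mapped to uniforms via Phi
        z1 = random.gauss(0, 1)
        z2 = rho * z1 + math.sqrt(1 - rho ** 2) * random.gauss(0, 1)
        u1 = 0.5 * (1 + math.erf(z1 / math.sqrt(2)))
        u2 = 0.5 * (1 + math.erf(z2 / math.sqrt(2)))
        if u1 > p_marginal and u2 > p_marginal:
            hits += 1
    return hits / n

# For an annual exceedance of 0.01 (a 100-year event), independence would
# give a joint probability of 1e-4; positive dependence makes it larger.
p_joint = joint_exceedance(0.99)
p_indep = 0.01 * 0.01
```

This is the practical point of the joint analysis: treating surge and rainfall as independent can understate the frequency of compound flooding by an order of magnitude.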

Keywords: climate change, climate variables, copula, joint probability

Procedia PDF Downloads 360
20132 Analysis of a Discrete-time Geo/G/1 Queue Integrated with (s, Q) Inventory Policy at a Service Facility

Authors: Akash Verma, Sujit Kumar Samanta

Abstract:

This study examines a discrete-time Geo/G/1 queueing-inventory system with an attached (s, Q) inventory policy. Customers are assumed to arrive according to a Bernoulli process. Each customer demands a single item with arbitrarily distributed service time. The inventory is replenished by an outside supplier, and the lead time for replenishment follows a geometric distribution. There is a single server and infinite waiting space in this facility. Demands must wait in the specified waiting area during a stock-out period. Customers are served on a first-come-first-served basis. With the help of the embedded Markov chain technique, we determine the joint probability distribution of the number of customers in the system and the number of items in stock at the post-departure epoch using the matrix analytic approach. We relate the system length distributions at post-departure and outside observer's epochs to determine the joint probability distribution at the outside observer's epoch. We use probability distributions at random epochs to determine the waiting time distribution. We obtain the performance measures needed to construct the cost function. The optimum values of the order quantity and reorder point are found numerically for a variety of model parameters.
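As a companion to the exact embedded-Markov-chain analysis, the system dynamics can be sketched by direct slot-by-slot simulation. Geometric service is used here as one particular case of the general service distribution, and all parameter values are assumed for illustration:

```python
import random

random.seed(5)

p_arr = 0.3    # Bernoulli arrival probability per slot (Geo arrivals), assumed
p_srv = 0.5    # service completes each slot w.p. p_srv (geometric service), assumed
p_lead = 0.2   # outstanding order arrives each slot w.p. p_lead (geometric lead), assumed
s, Q = 3, 10   # reorder point and order quantity, assumed

queue, stock, on_order = 0, Q, False
q_area = 0
SLOTS = 100_000
for _ in range(SLOTS):
    if random.random() < p_arr:                 # Bernoulli arrival
        queue += 1
    if on_order and random.random() < p_lead:   # replenishment arrives
        stock += Q
        on_order = False
    if queue > 0 and stock > 0 and random.random() < p_srv:
        queue -= 1                              # service completion consumes one item
        stock -= 1
    if stock <= s and not on_order:             # place the (s, Q) order
        on_order = True
    q_area += queue

avg_queue = q_area / SLOTS   # time-average number of customers present
```

Sweeping `s` and `Q` over a grid and attaching holding, ordering and waiting costs to such estimates reproduces, approximately, the numerical cost optimisation the abstract performs exactly.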

Keywords: discrete-time queueing inventory model, matrix analytic method, waiting-time analysis, cost optimization

Procedia PDF Downloads 42
20131 Stress Corrosion Crack Identification with Direct Assessment Method in Pipeline Downstream from a Compressor Station

Authors: H. Gholami, M. Jalali Azizpour

Abstract:

Stress corrosion cracking (SCC) in pipelines is a type of environmentally assisted cracking (EAC). Since its discovery in 1965 as a possible cause of pipeline failure, SCC has caused, on average, one or two failures per year in the U.S. According to the NACE SCC DA standard, a pipeline segment is considered susceptible to SCC if all of the following factors are met: the operating stress exceeds 60% of the specified minimum yield strength (SMYS), the operating temperature exceeds 38 °C, the segment is less than 32 km downstream from a compressor station, the age of the pipeline is greater than 10 years, and the coating type is other than fusion bonded epoxy (FBE). In this paper, as a practical experience in NISOC, the direct assessment (DA) method is used to identify SCC defects in an unpiggable pipeline located downstream of a compressor station.
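The five screening criteria listed above can be encoded as a simple conjunctive check (a sketch of the screening logic only, not of the full NACE SCC DA procedure):

```python
def scc_susceptible(stress_pct_smys, temp_c, km_downstream, age_years, coating):
    """All five NACE SCC DA screening criteria must hold simultaneously."""
    return (stress_pct_smys > 60      # operating stress > 60% SMYS
            and temp_c > 38           # operating temperature > 38 degC
            and km_downstream < 32    # < 32 km downstream of a compressor
            and age_years > 10        # pipeline older than 10 years
            and coating.upper() != "FBE")   # coating other than FBE

# A hot, highly stressed, 25-year-old coal-tar-coated segment 20 km
# downstream is flagged; the same segment with FBE coating is screened out.
flagged = scc_susceptible(65, 45, 20, 25, "coal tar")
screened = scc_susceptible(65, 45, 20, 25, "FBE")
```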

Keywords: stress corrosion crack, direct assessment, disbondment, transgranular SCC, compressor station

Procedia PDF Downloads 386
20130 Cooperative Spectrum Sensing Using Hybrid IWO/PSO Algorithm in Cognitive Radio Networks

Authors: Deepa Das, Susmita Das

Abstract:

Cognitive radio (CR) is an emerging technology to combat spectrum scarcity. This is achieved by consistently sensing the spectrum and detecting under-utilized frequency bands without causing undue interference to the primary user (PU). In soft decision fusion (SDF) based cooperative spectrum sensing, various evolutionary algorithms have been discussed that optimize the weight coefficient vector to maximize detection performance. In this paper, we propose the hybrid invasive weed optimization and particle swarm optimization (IWO/PSO) algorithm as a fast and global optimization method, which improves the detection probability with a shorter sensing time. The efficiency of this algorithm is then compared with the standard invasive weed optimization (IWO), particle swarm optimization (PSO), genetic algorithm (GA) and other conventional SDF-based methods on the basis of convergence and detection probability.
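A plain PSO kernel, one half of the proposed hybrid, can be sketched against a stand-in objective. The quadratic surrogate below replaces the actual SDF detection-probability objective, and all PSO parameters are illustrative:

```python
import random

random.seed(9)

def detection_metric(w):
    """Hypothetical surrogate for detection probability, maximised at
    w = (0.6, 0.3, 0.1); a stand-in for the true SDF objective."""
    target = (0.6, 0.3, 0.1)
    return -sum((wi - ti) ** 2 for wi, ti in zip(w, target))

def pso(dim=3, n_particles=20, iters=300, w_in=0.7, c1=1.5, c2=1.5):
    """Standard global-best PSO over the weight coefficient vector."""
    pos = [[random.uniform(0, 1) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    gbest = max(pbest, key=detection_metric)
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w_in * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if detection_metric(pos[i]) > detection_metric(pbest[i]):
                pbest[i] = pos[i][:]
        gbest = max(pbest, key=detection_metric)
    return gbest

best = pso()
```

The IWO half of the hybrid would add seed dispersal around the fitter particles; it is omitted here for brevity.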

Keywords: cognitive radio, spectrum sensing, soft decision fusion, GA, PSO, IWO, hybrid IWO/PSO

Procedia PDF Downloads 467
20129 Graphical Modeling of High Dimension Processes with an Environmental Application

Authors: Ali S. Gargoum

Abstract:

Graphical modeling plays an important role in providing efficient probability calculations in high-dimensional problems (computational efficiency). In this paper, we address one such problem, discussing fragmenting puff models and some distributional assumptions concerning models for the instantaneous emission readings and for the fragmenting process. A graphical representation, in terms of a junction tree, of the conditional probability breakdown of puffs and puff fragments is proposed.

Keywords: graphical models, influence diagrams, junction trees, Bayesian nets

Procedia PDF Downloads 396
20128 Grid Based Traffic Vulnerability Model Using Betweenness Centrality for Urban Disaster Management Information

Authors: Okyu Kwon, Dongho Kang, Byungsik Kim, Seungkwon Jung

Abstract:

We propose a technique to measure the impact that loss of traffic function in a particular area has on surrounding areas. The proposed method is applied to the city of Seoul, the capital of South Korea, with a population of about ten million. Based on the actual road network in Seoul, we construct an abstract road network between 1 km × 1 km grid cells. The link weights of the abstract road network are re-adjusted considering traffic volumes measured at several survey points. On the modified abstract road network, we evaluate traffic vulnerability by calculating the network measure of betweenness centrality (BC) for every grid cell. This study analyzes traffic impacts caused by road dysfunction due to heavy rainfall in urban areas. The change in the BC value of all other grid cells is observed by recalculating the BC value when a specific grid cell loses its traffic function, that is, when its node disappears from the grid-based road network. The results show that it is appropriate to use the sum of the BC variation of the other cells as the traffic influence index of each grid cell. This research was supported by a grant (2017-MOIS31-004) from the Fundamental Technology Development Program for Extreme Disaster Response funded by the Korean Ministry of Interior and Safety (MOIS).
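The core computation, betweenness centrality before and after a cell loses its traffic function, can be sketched with Brandes' algorithm on a toy 3 × 3 grid. The real study uses Seoul's 1 km grid with traffic-adjusted link weights; the unweighted toy graph below is only structural:

```python
from collections import deque

def betweenness(adj):
    """Brandes' algorithm for unweighted betweenness centrality."""
    bc = {v: 0.0 for v in adj}
    for src in adj:
        stack, preds = [], {v: [] for v in adj}
        sigma = {v: 0 for v in adj}; sigma[src] = 1   # shortest-path counts
        dist = {v: -1 for v in adj}; dist[src] = 0
        q = deque([src])
        while q:                                      # BFS from src
            v = q.popleft(); stack.append(v)
            for w in adj[v]:
                if dist[w] < 0:
                    dist[w] = dist[v] + 1; q.append(w)
                if dist[w] == dist[v] + 1:
                    sigma[w] += sigma[v]; preds[w].append(v)
        delta = {v: 0.0 for v in adj}
        while stack:                                  # back-propagate dependencies
            w = stack.pop()
            for v in preds[w]:
                delta[v] += sigma[v] / sigma[w] * (1 + delta[w])
            if w != src:
                bc[w] += delta[w]
    return bc

def grid3():
    """3x3 grid of cells, 4-neighbour adjacency."""
    adj = {}
    for r in range(3):
        for c in range(3):
            adj[(r, c)] = [(r + dr, c + dc) for dr, dc in
                           ((-1, 0), (1, 0), (0, -1), (0, 1))
                           if 0 <= r + dr < 3 and 0 <= c + dc < 3]
    return adj

adj = grid3()
bc_before = betweenness(adj)
centre = (1, 1)                                   # cell losing its function
adj2 = {v: [w for w in ws if w != centre]
        for v, ws in adj.items() if v != centre}
bc_after = betweenness(adj2)
# influence index: total BC variation induced in the other cells
impact = sum(abs(bc_after[v] - bc_before[v]) for v in adj2)
```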

Keywords: vulnerability, road network, betweenness centrality, heavy rainfall, road impact

Procedia PDF Downloads 95
20127 An Approach for Detection Efficiency Determination of High Purity Germanium Detector Using Cesium-137

Authors: Abdulsalam M. Alhawsawi

Abstract:

Estimation of a radiation detector's efficiency plays a significant role in calculating the activity of radioactive samples. Detector efficiency is measured using sources that emit a variety of energies, from low- to high-energy photons along the energy spectrum. Some photon energies are hard to find in lab settings, either because check sources are hard to obtain or because the sources have short half-lives. This work aims to develop a method to determine the efficiency of a high purity germanium (HPGe) detector based on the 662 keV gamma-ray photon emitted by Cs-137. Cesium-137 is readily available in most labs with radiation detection and health physics applications and has a long half-life of ~30 years. Several photon efficiencies were calculated using the MCNP5 simulation code. The simulated efficiency of the 662 keV photon was used as a base to calculate other photon efficiencies for a point source and a Marinelli beaker geometry. For the Marinelli beaker filled with water, the efficiency of the 59 keV low-energy photons from Am-241 was estimated with a 9% error compared to the MCNP5 simulated efficiency. The 1.17 and 1.33 MeV high-energy photons emitted by Co-60 had errors of 4% and 5%, respectively. The estimated errors are considered acceptable for calculating the activity of unknown samples, as they fall within the 95% confidence level.
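The anchoring idea, scaling MCNP-simulated efficiencies to the measured 662 keV point, reduces to a ratio. All numbers below are invented placeholders, not the paper's measured or simulated values:

```python
# Hypothetical numbers: the measured full-energy-peak efficiency at 662 keV
# (Cs-137) anchors the simulated curve; any other energy is scaled by the
# MCNP-simulated efficiency ratio relative to that line.
measured_eff_662 = 0.012   # assumed measured efficiency at 662 keV
sim_eff = {59: 0.030, 662: 0.010, 1173: 0.0065, 1332: 0.0060}  # assumed MCNP5 output

def estimated_eff(energy_kev):
    """Measured anchor times the simulated efficiency ratio to 662 keV."""
    return measured_eff_662 * sim_eff[energy_kev] / sim_eff[662]

eff_59 = estimated_eff(59)       # Am-241 line
eff_1332 = estimated_eff(1332)   # Co-60 line
```

Because the scaling cancels geometry-independent simulation biases, the quoted 4-9% errors are essentially the residual energy-dependent mismatch between model and detector.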

Keywords: MCNP5, Monte Carlo simulations, efficiency calculation, absolute efficiency, activity estimation, Cs-137

Procedia PDF Downloads 116
20126 On Virtual Coordination Protocol towards 5G Interference Mitigation: Modelling and Performance Analysis

Authors: Bohli Afef

Abstract:

Fifth-generation (5G) wireless systems feature extreme densities of cell stations to meet higher future demand. Hence, interference management is a crucial challenge in 5G ultra-dense cellular networks. In contrast to the classical inter-cell interference coordination approach, which is no longer suited to the high density of cell tiers, this paper proposes a novel virtual coordination scheme based on a dynamic common cognitive monitor channel protocol to deal with the inter-cell interference issue. A tractable and flexible model for the coverage probability of a typical user is developed using stochastic geometry. The performance of the suggested protocol is illustrated both analytically and numerically in terms of coverage probability.
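The coverage probability of a typical user under a Poisson field of base stations can be sketched by Monte Carlo. This is a generic stochastic-geometry baseline (nearest-station association, Rayleigh fading, interference-limited SIR), not the proposed protocol's model, and all densities and thresholds are assumed:

```python
import math
import random

random.seed(11)

def poisson(mu):
    """Knuth's Poisson sampler; adequate for moderate mu."""
    limit, k, p = math.exp(-mu), 0, 1.0
    while p > limit:
        k += 1
        p *= random.random()
    return k - 1

def coverage_probability(lam=0.5, alpha=4.0, theta=1.0, n=2000, R=8.0):
    """Monte Carlo SIR coverage of a user at the origin: base stations are
    a Poisson point process of density lam in a disc of radius R; the user
    attaches to the nearest station; Rayleigh fading on every link."""
    covered = 0
    for _ in range(n):
        k = max(2, poisson(lam * math.pi * R * R))
        # uniform points in a disc: radius = R * sqrt(U)
        rs = sorted(R * math.sqrt(random.random()) for _ in range(k))
        signal = random.expovariate(1.0) * rs[0] ** -alpha
        interf = sum(random.expovariate(1.0) * r ** -alpha for r in rs[1:])
        if signal / interf > theta:
            covered += 1
    return covered / n

pc = coverage_probability()   # roughly 0.56 for theta = 1, alpha = 4
```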

Keywords: ultra dense heterogeneous networks, dynamic common channel protocol, cognitive radio, stochastic geometry, coverage probability

Procedia PDF Downloads 325
20125 Procedure for Monitoring the Process of Behavior of Thermal Cracking in Concrete Gravity Dams: A Case Study

Authors: Adriana de Paula Lacerda Santos, Bruna Godke, Mauro Lacerda Santos Filho

Abstract:

Several dams in the world have already collapsed, causing environmental, social and economic damage. The concern to avoid future disasters has stimulated the creation of a great number of laws and rules in many countries. In Brazil, Law 12.334/2010 was created, which establishes the National Policy on Dam Safety. Overall, this policy requires the dam owners to invest in the maintenance of their structures and to improve its monitoring systems in order to provide faster and straightforward responses in the case of an increase of risks. As monitoring tools, visual inspections has provides comprehensive assessment of the structures performance, while auscultation’s instrumentation has added specific information on operational or behavioral changes, providing an alarm when a performance indicator exceeds the acceptable limits. These limits can be set using statistical methods based on the relationship between instruments measures and other variables, such as reservoir level, time of the year or others instruments measuring. Besides the design parameters (uplift of the foundation, displacements, etc.) the dam instrumentation can also be used to monitor the behavior of defects and damage manifestations. Specifically in concrete gravity dams, one of the main causes for the appearance of cracks, are the concrete volumetric changes generated by the thermal origin phenomena, which are associated with the construction process of these structures. Based on this, the goal of this research is to propose a monitoring process of the thermal cracking behavior in concrete gravity dams, through the instrumentation data analysis and the establishment of control values. Therefore, as a case study was selected the Block B-11 of José Richa Governor Dam Power Plant, that presents a cracking process, which was identified even before filling the reservoir in August’ 1998, and where crack meters and surface thermometers were installed for its monitoring. 
Although these instruments were installed in May 2004, the research was restricted to the last 4.5 years (June 2010 to November 2014), when all the instruments were calibrated and producing reliable data. The adopted method is based on simple linear correlation procedures to understand the interactions among the instruments' time series, verifying the response times between them. Scatter plots were drafted from the best correlations, which supported the definition of the control limit values. Among the conclusions, it is shown that there is a strong or very strong correlation between ambient temperature and the crack meter and flow meter measurements. Based on the results of the statistical analysis, it was possible to develop a tool for monitoring the behavior of the case study's cracks, thus fulfilling the goal of the research: a proposal for a process of monitoring the behavior of thermal cracking in concrete gravity dams.
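A minimal sketch of such a control-limit procedure is shown below. The temperature and crack-meter series are hypothetical stand-ins (the abstract does not reproduce the dam's readings); the approach mirrors the described workflow: correlate the series, fit a line, and set limits from the residual scatter.

```python
import math

# Hypothetical daily readings (the actual dam series are not in the abstract):
# ambient temperature (deg C) and crack-meter opening (mm).
temp  = [12.0, 15.5, 18.2, 22.8, 26.1, 28.4, 25.0, 20.3, 16.7, 13.1]
crack = [0.42, 0.39, 0.35, 0.30, 0.25, 0.22, 0.27, 0.32, 0.37, 0.41]

n = len(temp)
mt, mc = sum(temp) / n, sum(crack) / n
sxx = sum((t - mt) ** 2 for t in temp)
syy = sum((c - mc) ** 2 for c in crack)
sxy = sum((t - mt) * (c - mc) for t, c in zip(temp, crack))

r = sxy / math.sqrt(sxx * syy)   # Pearson correlation coefficient
slope = sxy / sxx                # least-squares fit: crack ~ temp
intercept = mc - slope * mt

# Residual standard deviation around the fitted line
resid = [c - (slope * t + intercept) for t, c in zip(temp, crack)]
s = math.sqrt(sum(e ** 2 for e in resid) / (n - 2))

def control_limits(t):
    """Control band for the expected crack opening at temperature t
    (fitted value +/- 2 residual standard deviations, a common choice)."""
    pred = slope * t + intercept
    return pred - 2 * s, pred + 2 * s
```

A reading that falls outside `control_limits(t)` for the current temperature would then trigger closer inspection, in the spirit of the alarm limits described above.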

Keywords: concrete gravity dam, dams safety, instrumentation, simple linear correlation

Procedia PDF Downloads 292
20124 The Effect of Nanoclay on Long Term Performance of Asphalt Concrete Pavement

Authors: A. Khodadadi, Hasani, Salehi

Abstract:

The advantages of using modified asphalt binders are widely recognized: primarily, improved rutting resistance, reduced fatigue cracking, and less cold-temperature cracking. Nanoclays are known to enhance the properties of many polymers; they are used to improve the modulus, tensile strength, flame resistance, and thermal and structural properties of many materials. This paper investigates the application and development of nanotechnological concepts for bituminous materials and asphalt pavements. The effect of nanoclay on the fatigue life of asphalt pavement has not yet been thoroughly understood. In this research, two types of highway asphalt materials, dense Marshall specimens with 2% nanoclay and without nanoclay, were employed to study the fatigue behavior of the asphalt pavement. The effect of the nano additive on the performance of flexible pavements was investigated through the indirect tensile test for samples prepared with 2% nanoclay and without nanoclay at four stress levels from 200 to 500 kPa. The primary results indicated that samples with 2% nanoclay have roughly double the fatigue life, or more, at most stress levels.

Keywords: nanoclay, asphalt, fatigue life, pavement

Procedia PDF Downloads 455
20123 Assessing the Disability-Free Life Expectancy and Decomposition of Its Difference: A Gender Perspective on India over the Decade 2001-2011

Authors: Kajori Banerjee, Laxmi Kant Dwivedi

Abstract:

“Health transition” is defined as “a process through which high levels of mortality, morbidity and disability are reduced to low levels by influencing cultural, social and behavioural factors”. Life expectancy in India has been on the rise, and in parallel the burden of disease and disability has also risen noticeably. Drawing data from the Indian Census (2001, 2011), this study identifies the gender-wise burden of disability by calculating disability-free life expectancy (DFLE) and life lived with disability (LWD). Sullivan's method of calculating DFLE from the proportion of disabled persons is used for this purpose. The change in person-years lived with disability in the decade 2001-11 is further decomposed, using Arriaga's method, into mortality and disability effects (ME and DE) to check the magnitude and direction of the contribution of mortality and disability. Nationally, along with DFLE, LWD has amplified too. Despite having the highest life expectancy and DFLE, Kerala's LWD was the highest for both sexes in 2001; in 2011, however, LWD was highest among the males of Orissa and the females of Rajasthan. For the overall population, DE is positive for the prime working age groups of 20-40 years, indicating that there was an increase in the disability proportion, holding mortality constant, over 2001-2011. Females exhibit a higher positive DE, implying a greater loss of healthy years to disability than males. The findings call for immediate attention to the causes of the rising disability burden among the working population, especially females, as this might heavily affect the availability of a quality labour force and its relative economic output in the Indian labour market. It also hints at the degrading quality of the elongated life, which needs the required attention to enhance the quality of life led in the nation.
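Sullivan's method combines life-table person-years with age-specific disability prevalence. A minimal sketch with illustrative numbers (not the census figures used in the study):

```python
# Sullivan's method, minimal sketch with illustrative (non-census) inputs.
# L[x]  : person-years lived in age interval x (from an abridged life table)
# pi[x] : proportion disabled in that interval (e.g., from census data)
# l0    : life-table radix

l0 = 100_000
L  = [495_000, 490_000, 480_000, 450_000, 380_000, 250_000]  # person-years
pi = [0.01,    0.02,    0.03,    0.05,    0.10,    0.20]     # prevalence

# Life expectancy at birth, its disability-free part, and life with disability
LE   = sum(L) / l0
DFLE = sum(Lx * (1 - p) for Lx, p in zip(L, pi)) / l0
LWD  = LE - DFLE   # years of life lived with disability
```

With these illustrative inputs, LE is 25.45 years, of which roughly 1.4 years are lived with disability; the study applies the same arithmetic to census-based prevalence by sex and state.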

Keywords: disability-free life expectancy, disability effect, life expectancy, mortality effect

Procedia PDF Downloads 397
20122 Constructing the Joint Mean-Variance Regions for Univariate and Bivariate Normal Distributions: Approach Based on the Measure of Cumulative Distribution Functions

Authors: Valerii Dashuk

Abstract:

The usage of confidence intervals in economics and econometrics is widespread. To investigate a random variable more thoroughly, joint tests are applied; one such example is the joint mean-variance test. A new approach for testing such hypotheses and constructing confidence sets is introduced. Exploring both the value of the random variable and its deviation with this technique allows checking simultaneously the shift and the probability of that shift (i.e., portfolio risks). Another application is based on the normal distribution, which is fully defined by its mean and variance and therefore can be tested using the introduced approach. The method is based on the difference of probability density functions. The starting point is two sets of normal distribution parameters that should be compared (whether they may be considered identical at a given significance level). Then the absolute difference in probabilities at each 'point' of the domain of these distributions is calculated. This measure is transformed into a function of cumulative distribution functions and compared to critical values. The critical values table was designed from simulations. The approach was compared with other techniques for the univariate case; it differs qualitatively and quantitatively in ease of implementation, computation speed, and accuracy of the critical region (theoretical vs. real significance level). Stable results when working with outliers and non-normal distributions, as well as scaling possibilities, are also strong sides of the method. The main advantage of this approach is the possibility of extending it to the infinite-dimensional case, which was not possible in most previous works. At the moment, the expansion to the two-dimensional case has been completed, which allows testing up to 5 parameters jointly. The derived technique is therefore equivalent to classic tests in standard situations but gives more efficient alternatives in nonstandard problems and on large amounts of data.
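As an illustration of comparing two normal parameter sets through their distribution functions, one simple distance (an assumption for illustration here, not the paper's exact statistic) is the maximum absolute difference between the two CDFs over a grid:

```python
import math

def norm_cdf(x, mu, sigma):
    """CDF of N(mu, sigma^2) via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def cdf_distance(mu1, s1, mu2, s2, lo=-10.0, hi=10.0, n=2001):
    """Max absolute difference between the two CDFs over a uniform grid.
    This is an illustrative stand-in for the measure described above."""
    step = (hi - lo) / (n - 1)
    return max(abs(norm_cdf(lo + i * step, mu1, s1) -
                   norm_cdf(lo + i * step, mu2, s2))
               for i in range(n))

# Identical parameters -> zero distance; a shifted mean -> positive distance
d0 = cdf_distance(0.0, 1.0, 0.0, 1.0)
d1 = cdf_distance(0.0, 1.0, 1.0, 1.0)
```

In the described procedure, such a statistic would be compared against simulated critical values to decide whether the two parameter sets can be considered identical at the chosen significance level.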

Keywords: confidence set, cumulative distribution function, hypotheses testing, normal distribution, probability density function

Procedia PDF Downloads 174
20121 Leather Quality of Some Sudan Goats under Range Condition

Authors: Mohammed Alhadi Ebrahiem

Abstract:

This study was designed to investigate the effect of breed and feeding level before slaughter on the skin/leather quality of the three main breeds of Sudan goats. Thirty (30) pieces of fresh skin from the three goat breeds (average age 1-1.5 years) were chosen for the study. A Completely Randomized Design (CRD) was used for the analysis of the variations between the three breeds at the two feeding levels (poor and rich pastures). The results revealed that leather weight (kg), elongation %, tensile strength (kg/cm²), cracking load (kg), thickness (mm), tear load (kg/cm) and chrome % were significantly affected (P ≤ 0.05) by breed variation. Flexibility, moisture %, ash % and fat % were not significantly affected (P > 0.05) by breed. On the other hand, skin weight (kg), cracking load (kg), tear load (kg/cm) and ash % were significantly affected (P ≤ 0.05) by pasture quality, while leather elongation %, tensile strength (kg/cm²), thickness (mm), flexibility, moisture %, fat % and chrome % were not significantly affected (P > 0.05) by pasture quality.

Keywords: skin/leather quality, goat leather, natural pasture, Sudan

Procedia PDF Downloads 359
20120 A Study of Flow near the Leading Edge of a Flat Plate by New Idea in Analytical Methods

Authors: M. R. Akbari, S. Akbari, L. Abdollahpour

Abstract:

The present paper is concerned with calculating the two-dimensional velocity profile of a viscous, incompressible flow along the leading edge of a flat plate, using the continuity and momentum equations with a simple and innovative approach. A comparison between a numerical method and AGM has been made, and the results reveal that AGM is very accurate, easy to use, and applicable to a wide variety of nonlinear problems. Notably, most differential equations can be solved by this approach, a capability that the other approaches lack. Moreover, this method of solving differential equations offers some valuable benefits; for instance, many differential equations can be solved without any nondimensionalization procedure, that is, they are directly solvable by this method. In addition, it is not necessary to convert variables into new ones. According to the aforementioned points, which will be proved in this work, the process of solving nonlinear differential equations is very simple and convenient in contrast to the other approaches.

Keywords: leading edge, new idea, flat plate, incompressible fluid

Procedia PDF Downloads 287
20119 Bundle Block Detection Using Spectral Coherence and Levenberg Marquardt Neural Network

Authors: K. Padmavathi, K. Sri Ramakrishna

Abstract:

This study describes a procedure for the detection of Left and Right Bundle Branch Block (LBBB and RBBB) ECG patterns using the Spectral Coherence (SC) technique and a Levenberg-Marquardt neural network (LMNN). The coherence function finds common frequencies between two signals and evaluates their similarity. The QT variations of bundle branch blocks are observed in lead V1 of the ECG. The spectral coherence technique uses Welch's method for calculating the power spectral density (PSD). For the detection of normal and bundle branch block beats, the SC output values are given as input features to the LMNN classifier. The overall accuracy of the LMNN classifier is 99.5 percent. The data was collected from the MIT-BIH Arrhythmia Database.
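A hedged sketch of the coherence feature extraction, using `scipy.signal.coherence` (which implements magnitude-squared coherence via Welch's method). The two synthetic signals stand in for a pair of ECG beats; the sampling rate matches the MIT-BIH database, but everything else is illustrative:

```python
import numpy as np
from scipy.signal import coherence

fs = 360.0                        # MIT-BIH sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)

# Hypothetical stand-ins for two beats to be compared (e.g., lead V1):
# a shared 5 Hz component plus independent noise in each signal.
x = np.sin(2 * np.pi * 5 * t) + 0.1 * rng.standard_normal(t.size)
y = np.sin(2 * np.pi * 5 * t) + 0.1 * rng.standard_normal(t.size)

# Magnitude-squared coherence via Welch's method (PSD estimated per segment)
f, Cxy = coherence(x, y, fs=fs, nperseg=512)

# Coherence values at selected frequencies could serve as classifier features;
# here we read off the coherence near the shared 5 Hz component.
peak = Cxy[np.argmin(np.abs(f - 5.0))]
```

In the described pipeline, such SC values computed between beats would form the feature vector fed to the LMNN classifier.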

Keywords: bundle branch block, spectral coherence, LMNN classifier, Welch method, PSD, MIT-BIH arrhythmia database

Procedia PDF Downloads 281
20118 Machine Learning Based Approach for Measuring Promotion Effectiveness in Multiple Parallel Promotions’ Scenarios

Authors: Revoti Prasad Bora, Nikita Katyal

Abstract:

Promotion is a key element in the retail business; thus, analysis of promotions to quantify their effectiveness in terms of revenue and/or margin is an essential activity in the retail industry. However, measuring the sales/revenue uplift relies on estimation, as the actual sales/revenue without the promotion is not observed. Further, the presence of halo and cannibalization in a scenario of multiple parallel promotions complicates the problem. Calculating the baseline by considering inter-brand/competitor items, or interpreting the baseline as an item's unit sales in neighboring non-promotional weeks considered individually, may not capture the overall revenue uplift in the case of multiple parallel promotions. Hence, this paper proposes a machine-learning-based method for calculating the revenue uplift that accounts for the halo and cannibalization impacts on both the baseline and the revenue. In the first section of the proposed methodology, the baseline of an item is calculated by incorporating the impact of the promotions on its related items. In the later section, the revenue of an item is calculated by considering both halo and cannibalization impacts. Hence, this methodology enables correct calculation of the overall revenue uplift due to a given promotion.
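To make the baseline and uplift arithmetic concrete, here is a deliberately simplified sketch. The weekly figures and the halo/cannibalization adjustments are invented for illustration; the paper's actual model estimates these cross-effects with machine learning rather than fixing them as constants.

```python
# Hypothetical sketch: baseline from neighboring non-promotional weeks,
# then revenue uplift with a simple halo / cannibalization adjustment.
# All numbers below are illustrative, not the paper's data or model.

weekly_units = [100, 104, 98, 180, 102, 99]   # week index 3 was promoted
promo_weeks = {3}
price = 2.5                                    # unit price during the promo

# Baseline: mean unit sales of the item's non-promotional weeks
non_promo = [u for i, u in enumerate(weekly_units) if i not in promo_weeks]
baseline = sum(non_promo) / len(non_promo)

uplift_units = weekly_units[3] - baseline

# Illustrative cross-effects: halo adds revenue on complementary items,
# cannibalization removes revenue from substitute items (assumed values).
halo_revenue = 30.0
cannibalized_revenue = 45.0

revenue_uplift = uplift_units * price + halo_revenue - cannibalized_revenue
```

Ignoring the two cross-effect terms would overstate the uplift whenever cannibalization dominates, which is precisely the failure mode of the naive baseline the abstract criticizes.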

Keywords: halo, cannibalization, promotion, baseline, temporary price reduction, retail, elasticity, cross price elasticity, machine learning, random forest, linear regression

Procedia PDF Downloads 177
20117 Probabilistic and Stochastic Analysis of a Retaining Wall for C-Φ Soil Backfill

Authors: André Luís Brasil Cavalcante, Juan Felix Rodriguez Rebolledo, Lucas Parreira de Faria Borges

Abstract:

A methodology for the probabilistic analysis of active earth pressure on a retaining wall with c-Φ soil backfill is described in this paper. The Rosenblueth point estimate method is used to measure the failure probability of a gravity retaining wall. The basic principle of this methodology is to examine each variable in the safety analysis at two point estimates, located one standard deviation on either side of its mean value. The simplicity of this framework assures its wide applicability. The calculation requires 2ⁿ evaluations, since the system is governed by n variables. In this study, a probabilistic model based on the Rosenblueth approach for the computation of the overturning failure probability of a retaining wall is presented. The results obtained show the advantages of this kind of model in comparison with the deterministic solution. In a relatively easy way, the uncertainty of the wall and fill parameters is taken into account, and some practical results can be obtained for the design of the retaining structure.
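The 2ⁿ point-estimate scheme is straightforward to sketch. The limit-state function below (overturning factor of safety as a function of cohesion and friction angle) is a hypothetical stand-in, since the abstract does not give the wall geometry; the Rosenblueth machinery itself is general:

```python
import math
from itertools import product
from statistics import NormalDist

# Rosenblueth's 2^n point estimate method: evaluate the performance function
# at mean +/- one standard deviation of each of the n random variables,
# then take the sample mean and variance of the 2^n evaluations.
def rosenblueth(g, means, stds):
    evals = [g(*[m + s * sign for m, s, sign in zip(means, stds, signs)])
             for signs in product((-1.0, 1.0), repeat=len(means))]  # 2^n runs
    mean_g = sum(evals) / len(evals)
    var_g = sum((e - mean_g) ** 2 for e in evals) / len(evals)
    return mean_g, var_g

# Hypothetical limit-state: overturning factor of safety as a function of
# cohesion c (kPa) and friction angle phi (degrees). The coefficients are
# illustrative only; the paper's actual wall geometry is not given.
def factor_of_safety(c, phi_deg):
    return 0.05 * c + 0.04 * phi_deg * math.tan(math.radians(phi_deg))

mean_fs, var_fs = rosenblueth(factor_of_safety,
                              means=[20.0, 30.0], stds=[4.0, 3.0])

# Overturning failure probability, assuming the FS is normally distributed:
# P(FS < 1)
pf = NormalDist(mean_fs, math.sqrt(var_fs)).cdf(1.0)
```

With n = 2 variables, only four evaluations of the limit-state function are needed, which is what makes the method attractive next to full Monte Carlo simulation.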

Keywords: retaining wall, active earth pressure, backfill, probabilistic analysis

Procedia PDF Downloads 418
20116 A Packet Loss Probability Estimation Filter Using Most Recent Finite Traffic Measurements

Authors: Pyung Soo Kim, Eung Hyuk Lee, Mun Suck Jang

Abstract:

A packet loss probability (PLP) estimation filter with a finite memory structure is proposed to estimate the packet-rate mean and variance of the input traffic process in real time while removing undesired system and measurement noises. The proposed PLP estimation filter is developed under a weighted least squares criterion using only the finite traffic measurements on the most recent window. The proposed PLP estimation filter is shown to have several inherent properties, such as unbiasedness, deadbeat response, and robustness. A guideline for choosing an appropriate window length is described, since the window length can significantly affect the estimation performance. Using computer simulations, the proposed PLP estimation filter is shown to be superior to the Kalman filter for the temporarily uncertain system. One possible explanation is that the filtered estimate of the proposed PLP estimation filter converges faster as the window length M decreases.
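A minimal sliding-window estimator in the spirit of the proposed filter is sketched below. It uses an unweighted average over the most recent M measurements, a simplification of the paper's weighted least squares criterion; the point it illustrates is that stale pre-change samples are discarded after M updates, unlike an infinite-memory filter.

```python
from collections import deque

class FiniteMemoryEstimator:
    """Sliding-window estimate of the packet-rate mean and variance.

    Only the most recent M measurements are retained, so the estimate
    'forgets' old data completely after M updates (a deadbeat-like
    response to changes). The equal weighting here is a simplification
    of the weighted least squares criterion described in the abstract.
    """

    def __init__(self, window_m):
        self.window = deque(maxlen=window_m)

    def update(self, measurement):
        self.window.append(measurement)
        n = len(self.window)
        mean = sum(self.window) / n
        var = sum((x - mean) ** 2 for x in self.window) / n
        return mean, var

# After a step change in the traffic rate, the estimate fully tracks the
# new level once M new samples have arrived.
est = FiniteMemoryEstimator(window_m=4)
for _ in range(4):
    mean_before, var_before = est.update(10.0)
for _ in range(4):
    mean_after, var_after = est.update(20.0)
```

A growing-window or Kalman filter would keep weighting the pre-change samples, which is one intuition for the reported advantage on temporarily uncertain systems.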

Keywords: packet loss probability estimation, finite memory filter, infinite memory filter, Kalman filter

Procedia PDF Downloads 672