Search results for: non-IID data distribution
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 27400

27340 Predicting the Uniaxial Strength Distribution of Brittle Materials Based on a Uniaxial Test

Authors: Benjamin Sonnenreich

Abstract:

Brittle fracture failure probability is best described using a stochastic approach based on the 'weakest link' concept and the connection between the microstructural and the macroscopic fracture scales. A general theoretical and experimental framework is presented to predict the uniaxial strength distribution from independent uniaxial test data. The framework takes as input the applied stresses, the geometry, the materials, the defect distributions and the relevant random variables from uniaxial test results, and gives as output an overall failure probability that can be used to improve the reliability of practical designs. Additionally, the method facilitates comparisons of strength data from several sources, uniaxial tests, and sample geometries.

Keywords: brittle fracture, strength distribution, uniaxial, weakest link concept

Procedia PDF Downloads 296
27339 Examining the Relationship between Chi-Square Test Statistics and Skewness of Weibull Distribution: Simulation Study

Authors: Rafida M. Elobaid

Abstract:

Most of the literature on goodness-of-fit tests tries to provide a theoretical basis for studying empirical distribution functions. Examples of such goodness-of-fit tests are the Kolmogorov-Smirnov and Cramér-von Mises type tests. However, the literature has largely not focused in detail on the relationship between the values of the test statistics and skewness or kurtosis. The aim of this study is to investigate the behavior of the values of the χ² test statistic as the skewness of a right-skewed distribution varies. A simulation study is conducted to generate random numbers from the Weibull distribution. For fixed sample sizes, different levels of skewness are considered, and the corresponding values of the χ² test statistic are calculated. Using different sample sizes, the results show an inverse relationship between the value of the χ² test statistic and the level of skewness for the Weibull distribution, i.e., the value of the χ² test statistic decreases as the value of skewness increases. The results also show that with large values of skewness we are more confident that the data follow the assumed distribution. The nonparametric Kendall τ test is used to confirm these results.
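As an illustrative sketch only (not the authors' code), the following Python snippet reproduces the basic simulation idea under assumed settings: Weibull samples are drawn at several shape values (smaller shape means larger right skewness), and a χ² goodness-of-fit statistic is computed with equiprobable bins.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, n_bins, n_rep = 200, 10, 500  # assumed sample size, bin count, replications

# Weibull shape parameters: smaller shape -> larger right skewness
for k in [0.8, 1.0, 1.5, 2.0, 3.0]:
    skw = stats.weibull_min(k).stats(moments='s')
    chi2_vals = []
    for _ in range(n_rep):
        x = stats.weibull_min(k).rvs(size=n, random_state=rng)
        # equiprobable bins under the assumed Weibull, so each expected count is n / n_bins
        edges = stats.weibull_min(k).ppf(np.linspace(0, 1, n_bins + 1))
        observed, _ = np.histogram(x, bins=edges)
        expected = np.full(n_bins, n / n_bins)
        chi2_vals.append(((observed - expected) ** 2 / expected).sum())
    print(f"shape={k:.1f}  skewness={float(skw):.2f}  mean chi2={np.mean(chi2_vals):.2f}")
```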

Keywords: goodness-of-fit test, chi-square test, simulation, continuous right skewed distributions

Procedia PDF Downloads 377
27338 Assessing Functional Structure in European Marine Ecosystems Using a Vector-Autoregressive Spatio-Temporal Model

Authors: Katyana A. Vert-Pre, James T. Thorson, Thomas Trancart, Eric Feunteun

Abstract:

In marine ecosystems, spatial and temporal species structure is an important component of ecosystems' response to anthropogenic and environmental factors. Although spatial distribution patterns and temporal series of fish abundance have been studied in the past, little research has been devoted to the joint dynamic spatio-temporal functional patterns in marine ecosystems and their use in multispecies management and conservation. Each species represents a function in the ecosystem, and the distribution of these species might not be random. A heterogeneous functional distribution will lead to an ecosystem that is more resilient to external factors. Applying a Vector-Autoregressive Spatio-Temporal (VAST) model for count data, we estimate the spatio-temporal distribution, shift in time, and abundance of 140 species of the Eastern English Channel, Bay of Biscay and Mediterranean Sea. From the model outputs, we determined spatio-temporal clusters, calculating p-values for hierarchical clustering via multiscale bootstrap resampling. Then, we designed a functional map given the defined clusters. We found that the species distribution within the ecosystem was not random. Indeed, species evolved in space and time in clusters. Moreover, these clusters remained similar over time, deriving from the fact that species of the same cluster often shifted in sync, keeping the overall structure of the ecosystem similar over time. Knowing the co-existing species within these clusters could help with predicting the distribution and abundance of data-poor species. Further analysis is being performed to assess the ecological functions represented in each cluster.

Keywords: cluster distribution shift, European marine ecosystems, functional distribution, spatio-temporal model

Procedia PDF Downloads 166
27337 Development of Value Based Planning Methodology Incorporating Risk Assessment for Power Distribution Network

Authors: Asnawi Mohd Busrah, Au Mau Teng, Tan Chin Hooi, Lau Chee Chong

Abstract:

This paper describes a value based planning (VBP) methodology incorporating risk assessment as an enhanced and more practical approach to evaluating distribution network projects in Peninsular Malaysia. Assessment indicators associated with economics, performance and risks are formulated to evaluate distribution projects and quantify their benefits against investment. The developed methodology is implemented in web-based software customized to capture investment and network data, compute the assessment indicators and rank the proposed projects according to their benefits. The value based planning approach addresses economic factors in the power distribution planning assessment, so as to minimize the cost of the solution to the power utility while at the same time providing maximum benefits to customers.
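As a purely illustrative sketch, not the authors' indicator set: one simple economic indicator consistent with the keywords below is the avoided energy not served (ENS) valued at the value of lost load (VoLL), compared against the investment. The hypothetical Python snippet ranks candidate projects by such a benefit-to-cost ratio; all names and figures are invented.

```python
# Hypothetical illustration: rank candidate projects by (avoided ENS x VoLL) / investment.
# The VoLL figure, project names and numbers are invented for this sketch.
VOLL = 8.0  # assumed value of lost load, $/kWh

projects = [
    # (name, investment in $, expected annual ENS reduction in kWh)
    ("Feeder automation A", 120_000, 25_000),
    ("Recloser upgrade B", 60_000, 9_000),
    ("Cable replacement C", 300_000, 40_000),
]

ranked = sorted(projects, key=lambda p: p[2] * VOLL / p[1], reverse=True)
for name, cost, ens in ranked:
    print(f"{name}: benefit/cost = {ens * VOLL / cost:.2f}")
```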

Keywords: value based planning, distribution network, value of lost load (VoLL), energy not served (ENS)

Procedia PDF Downloads 453
27336 A Study on Solutions to Connect Distribution Power Grid up to Renewable Energy Sources at KEPCO

Authors: Seung Yoon Hyun, Hyeong Seung An, Myeong Ho Choi, Sung Hwan Bae, Yu Jong Sim

Abstract:

In 2015, the southern part of the Korean Peninsula had 8.6 million poles, 1.25 million km of power lines, 2 million transformers, etc. This massive amount of distribution equipment could cover a round-trip distance from the Earth to the Moon and 11 turns around the Earth. It is spread out like capillaries, supplying power to every corner of the Korean Peninsula. In order to manage these huge power facilities efficiently, KEPCO has used DAS (Distribution Automation System) to operate the distribution power system since 1997. DAS is an integrated system that enables remote supervision and control of breakers and switches on the distribution network. Using DAS, we can reduce outage time and power loss. KEPCO has about 160,000 switches; 50% (about 80,000) of them are automated, and 41 distribution centers monitor and control these switches 24 hours a day, 365 days a year to get the best efficiency out of the distribution networks. However, rapidly increasing renewable energy sources have become a problem for the efficient operation of the distribution power system (currently 2,400 MW from 75,000 generators operates in the distribution power system). In this paper, we suggest ways to interconnect renewable energy sources with the distribution power system.

Keywords: distribution, renewable, connect, DAS (Distribution Automation System)

Procedia PDF Downloads 584
27335 Water Distribution Uniformity of Solid-Set Sprinkler Irrigation under Low Operating Pressure

Authors: Manal Osman

Abstract:

Sprinkler irrigation systems have become more popular as a way to reduce water consumption and increase irrigation efficiency. Water distribution uniformity plays an important role in the performance of a sprinkler irrigation system. Using low operating pressure instead of high operating pressure can provide many benefits, including energy and water savings. An experimental study was performed to investigate the water distribution uniformity of a solid-set sprinkler irrigation system under low operating pressure. Different low operating pressures (62, 82, 102 and 122 kPa) were selected. The range of operating pressure was lower than that recommended in previous studies, in order to investigate the effect of low pressure on water distribution uniformity. Different nozzle diameters (4, 5, 6 and 7 mm) were used. An outdoor single-sprinkler test was performed. The water distribution of the single sprinkler, uniformity coefficients such as the coefficient of uniformity (CU), distribution uniformity of the low quarter (DUlq), distribution uniformity of the low half (DUlh) and coefficient of variation (CV), and distribution characteristics such as rotation speed, throw radius and overlapping distance are presented in this paper.
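The uniformity indices named above have standard definitions; the sketch below computes Christiansen's CU, DUlq, DUlh and CV from catch-can depths. The sample data are hypothetical, not the paper's measurements.

```python
import numpy as np

def uniformity_indices(depths):
    """Standard uniformity indices from catch-can depths (any consistent units)."""
    d = np.sort(np.asarray(depths, dtype=float))
    mean = d.mean()
    cu = 100.0 * (1.0 - np.abs(d - mean).sum() / (d.size * mean))  # Christiansen CU
    dulq = 100.0 * d[: d.size // 4].mean() / mean                  # low-quarter DU
    dulh = 100.0 * d[: d.size // 2].mean() / mean                  # low-half DU
    cv = 100.0 * d.std(ddof=1) / mean                              # coefficient of variation
    return cu, dulq, dulh, cv

# hypothetical catch-can data (mm) from a single-sprinkler grid test
catch = [4.1, 3.8, 5.0, 4.6, 3.2, 4.9, 5.3, 4.0, 3.5, 4.4, 4.8, 3.9]
print(uniformity_indices(catch))
```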

Keywords: low operating pressure, sprinkler irrigation system, water distribution uniformity

Procedia PDF Downloads 557
27334 Reviewing Privacy Preserving Distributed Data Mining

Authors: Sajjad Baghernezhad, Saeideh Baghernezhad

Abstract:

Nowadays, given the human involvement in ever-increasing data generation, methods such as data mining for extracting knowledge are unavoidable. One of the issues in data mining is the inherent distribution of the data: the parties creating or receiving such data usually belong to corporate or non-corporate persons and do not give their information freely to others. Yet there is no guarantee that someone can mine particular data without intruding on the owner's privacy. Approaches that send data and then gather them, over vertically or horizontally partitioned data, depend on the type of privacy preservation used and are executed to improve data privacy. In this study, an attempt was made to compare privacy-preserving data mining methods comprehensively; general methods such as data randomization and coding are examined, along with the strong and weak points of each.

Keywords: data mining, distributed data mining, privacy protection, privacy preserving

Procedia PDF Downloads 489
27333 On Modeling Data Sets by Means of a Modified Saddlepoint Approximation

Authors: Serge B. Provost, Yishan Zhang

Abstract:

A moment-based adjustment to the saddlepoint approximation is introduced in the context of density estimation. First applied to univariate distributions, this methodology is extended to the bivariate case. It then entails estimating the density function associated with each marginal distribution by means of the saddlepoint approximation and applying a bivariate adjustment to the product of the resulting density estimates. The connection to the distribution of empirical copulas will be pointed out. As well, a novel approach is proposed for estimating the support of a distribution. As these results rely solely on sample moments and empirical cumulant-generating functions, they are particularly well suited for modeling massive data sets. Several illustrative applications will be presented.

Keywords: empirical cumulant-generating function, endpoints identification, saddlepoint approximation, sample moments, density estimation

Procedia PDF Downloads 125
27332 Short Term Distribution Load Forecasting Using Wavelet Transform and Artificial Neural Networks

Authors: S. Neelima, P. S. Subramanyam

Abstract:

The major tool for distribution planning is load forecasting, which is the anticipation of the load in advance. Artificial neural networks have found wide application in load forecasting as a means of obtaining an efficient strategy for planning and management. In this paper, the application of neural networks to the design of short term load forecasting (STLF) systems is explored. Our work presents a pragmatic methodology for short term load forecasting using a proposed two-stage model combining the wavelet transform (WT) and an artificial neural network (ANN). It is a two-stage prediction system in which the input data are wavelet-decomposed at the first stage, and the decomposed data, together with other inputs, are used to train a separate neural network to forecast the load. The forecasted load is obtained by reconstruction of the decomposed data. The hybrid model has been trained and validated using load data from the Telangana State Electricity Board.
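A minimal sketch of a wavelet-plus-ANN forecaster in the same spirit, assuming PyWavelets, scikit-learn and a synthetic load series; it decomposes a sliding window with a db4 wavelet and feeds the coefficients to an MLP, rather than reproducing the paper's exact two-stage architecture.

```python
import numpy as np
import pywt
from sklearn.neural_network import MLPRegressor

# Synthetic hourly "load" series standing in for the utility data (assumption).
rng = np.random.default_rng(0)
load = np.sin(np.linspace(0, 40 * np.pi, 2000)) + 0.1 * rng.standard_normal(2000)

window = 96  # hours of history per sample
X, y = [], []
for t in range(window, len(load)):
    # Stage 1: wavelet decomposition of the input window (db4, 3 levels).
    coeffs = pywt.wavedec(load[t - window:t], 'db4', level=3)
    X.append(np.concatenate(coeffs))
    y.append(load[t])          # Stage 2 target: the next hour's load
X, y = np.array(X), np.array(y)

split = int(0.8 * len(X))
model = MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0)
model.fit(X[:split], y[:split])
print(model.score(X[split:], y[split:]))   # R^2 on the held-out tail
```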

Keywords: electrical distribution systems, wavelet transform (WT), short term load forecasting (STLF), artificial neural network (ANN)

Procedia PDF Downloads 398
27331 Marginalized Two-Part Joint Models for Generalized Gamma Family of Distributions

Authors: Mohadeseh Shojaei Shahrokhabadi, Ding-Geng (Din) Chen

Abstract:

Positive continuous outcomes with a substantial number of zero values and incomplete longitudinal follow-up are quite common in medical cost data. To jointly model semi-continuous longitudinal cost data and survival data and to provide marginalized covariate effect estimates, a marginalized two-part joint model (MTJM) has been developed for outcome variables with lognormal distributions. In this paper, we propose MTJM models for outcome variables from the generalized gamma (GG) family of distributions. The GG distribution constitutes a general family that includes, exactly or as limiting cases, most of the frequently used distributions such as the gamma, exponential, Weibull, and log-normal. In the proposed MTJM-GG model, the conditional mean from a conventional two-part model with a three-parameter GG distribution is parameterized to provide the marginal interpretation for the regression coefficients. In addition, MTJM-gamma and MTJM-Weibull are developed as special cases of MTJM-GG. To illustrate the applicability of the MTJM-GG, we applied the model to a set of real electronic health record data recently collected in Iran, and we provide SAS code for the application. The simulation results showed that when the outcome distribution is unknown or misspecified, which is usually the case in real data sets, the MTJM-GG consistently outperforms the other models. The GG family of distributions facilitates estimating a model with improved fit over the MTJM-gamma, standard Weibull, or log-normal specifications.
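For reference, the nesting of the GG family can be checked numerically; the sketch below uses SciPy's gengamma parameterization, which is an assumption of this illustration and need not match the three-parameter form used in the paper.

```python
import numpy as np
from scipy import stats

# SciPy's generalized gamma: pdf(x; a, c) is proportional to x**(a*c - 1) * exp(-x**c).
# c = 1 recovers the gamma distribution; a = 1 recovers the Weibull.
x = np.linspace(0.05, 5, 200)
assert np.allclose(stats.gengamma(2.0, 1.0).pdf(x), stats.gamma(2.0).pdf(x))
assert np.allclose(stats.gengamma(1.0, 1.7).pdf(x), stats.weibull_min(1.7).pdf(x))

# Fitting the GG (with location fixed at zero) to right-skewed positive data:
data = stats.gengamma(1.5, 0.8).rvs(size=1000, random_state=0)
a_hat, c_hat, loc_hat, scale_hat = stats.gengamma.fit(data, floc=0)
print(a_hat, c_hat, scale_hat)
```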

Keywords: marginalized two-part model, zero-inflated, right-skewed, semi-continuous, generalized gamma

Procedia PDF Downloads 148
27330 Fault Location Detection in Active Distribution System

Authors: R. Rezaeipour, A. R. Mehrabi

Abstract:

The recent increase of DGs and microgrids in distribution systems disturbs the traditional structure of the system. Coordination between protection devices in such a system becomes a concern for network operators. This paper presents a new method for fault location detection in active distribution networks that is independent of the fault type or its resistance. The method uses synchronized voltage and current measurements at the interconnection of DG units and is able to adapt to changes in the topology of the system. The method has been tested on a 38-bus distribution system, with very encouraging results.

Keywords: fault location detection, active distribution system, micro grids, network operators

Procedia PDF Downloads 751
27329 Joint Probability Distribution of Extreme Water Level with Rainfall and Temperature: Trend Analysis of Potential Impacts of Climate Change

Authors: Ali Razmi, Saeed Golian

Abstract:

Climate change is known to have the potential to adversely impact hydrologic patterns for variables such as rainfall, maximum and minimum temperature, and sea level rise. The long-term averages of these climate variables could possibly change over time due to climate change impacts. In this study, trend analysis was performed on rainfall, maximum and minimum temperature and water level data for a coastal area in Manhattan, New York City (Central Park and Battery Park stations), to investigate whether there is a significant change in the data mean. The partial Mann-Kendall test was used for the trend analysis. Frequency analysis was then performed on the data using common probability distribution functions such as the Generalized Extreme Value (GEV), normal, log-normal and log-Pearson distributions. Goodness-of-fit tests such as Kolmogorov-Smirnov were used to determine the most appropriate distributions. In flood frequency analysis, rainfall and water level data are often investigated separately. However, in determining flood zones, simultaneous consideration of rainfall and water level in the frequency analysis could have a considerable effect on floodplain delineation (flood extent and depth). The present study aims to perform flood frequency analysis considering the joint probability distribution of rainfall and storm surge. First, the correlation between the considered variables was investigated. The joint probability distribution of extreme water level and temperature was also investigated, to examine how global warming could affect sea level flooding impacts. Copula functions were fitted to the data, and the joint probability of water level with rainfall and temperature for recurrence intervals of 2, 5, 25, 50, 100, 200, 500, 600 and 1000 years was determined and compared with the severity of individual events. The results of the trend analysis showed an increase in the long-term average of the data that could be attributed to climate change impacts. The GEV distribution was found to be the most appropriate function to fit to the extreme climate variables. The results of the joint probability distribution analysis confirmed the necessity of incorporating both rainfall and water level data in flood frequency analysis.
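As a hedged illustration of joint-probability analysis with a copula, assuming a Gaussian copula (the paper does not state which family was fitted) and synthetic rainfall/surge data: margins are converted to pseudo-uniform ranks, the copula correlation is estimated, and the joint return period of the two marginal 100-year levels is computed by inclusion-exclusion.

```python
import numpy as np
from scipy.stats import norm, multivariate_normal, rankdata

# Synthetic annual rainfall maxima and storm-surge levels (stand-ins for the station data).
rng = np.random.default_rng(2)
rain = rng.gamma(2.0, 20.0, 60)
surge = 0.4 * rain + rng.normal(0.0, 10.0, 60)

# Pseudo-uniform margins via ranks, then the Gaussian-copula correlation.
u = rankdata(rain) / (len(rain) + 1)
v = rankdata(surge) / (len(surge) + 1)
z = norm.ppf(np.column_stack([u, v]))
rho = np.corrcoef(z.T)[0, 1]

# Joint exceedance of the two marginal 100-year levels under the fitted copula.
p = 1 - 1 / 100.0
C = multivariate_normal(mean=[0, 0], cov=[[1, rho], [rho, 1]]).cdf(norm.ppf([p, p]))
joint_exceedance = 1 - 2 * p + C          # P(U > p, V > p) by inclusion-exclusion
print("joint return period (years):", 1 / joint_exceedance)
```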

Keywords: climate change, climate variables, copula, joint probability

Procedia PDF Downloads 326
27328 Investigation on Performance of Change Point Algorithm in Time Series Dynamical Regimes and Effect of Data Characteristics

Authors: Farhad Asadi, Mohammad Javad Mollakazemi

Abstract:

In this paper, Bayesian online inference in models of data series is constructed with a change-point algorithm, which separates the observed time series into independent segments and studies the change and variation of the regime of the data through related statistical characteristics. Variation in the statistical characteristics of time series data often represents distinct phenomena in some dynamical system, such as a change of brain state reflected in measured EEG signal data, or a change in an important regime of the data in many other dynamical systems. In this paper, a prediction algorithm for locating change points in time series data is simulated. It is verified that the assumed pattern of the data distribution is an important factor for obtaining simpler and smoother fluctuations of the hazard-rate parameter and also for better identification of change-point locations. Finally, the conditions under which the time series distribution affects the factors in this approach are explained and validated with different time series databases from several dynamical systems.

Keywords: time series, fluctuation in statistical characteristics, optimal learning, change-point algorithm

Procedia PDF Downloads 399
27327 Wireless Sensor Network for Forest Fire Detection and Localization

Authors: Tarek Dandashi

Abstract:

WSNs may provide a fast and reliable solution for the early detection of environmental events like forest fires. This is crucial for alerting and calling for fire brigade intervention. Sensor nodes communicate sensor data to a host station, which enables a global analysis and the generation of a reliable decision on a potential fire and its location. A WSN built with TinyOS and nesC for capturing and transmitting a variety of sensor information, with controlled sources, data rates and durations, and for recording/displaying activity traces, is presented. We propose a similarity distance (SD) between the distribution of the currently sensed data and that of a reference. At any given time, a fire causes diverging opinions in the reported data, which alters the usual data distribution. Basically, SD consists of a metric on the cumulative distribution function (CDF). SD is designed to be invariant to day-to-day changes of temperature, changes due to the surrounding environment, and normal changes in weather, all of which preserve the locality of the data. Evaluation shows that the sensitivity of SD is quadratic in the increase of sensor node temperature, for groups of sensors of different sizes and neighborhoods. Simulation of fire spreading, with ignition placed at random locations and some wind speed, shows that SD takes a few minutes to reliably detect fires and locate them. We also discuss the cases of false negatives and false positives and their impact on decision reliability.
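One simple metric on CDFs in the spirit of the proposed SD, shown as an assumption-laden sketch with synthetic temperatures and an L1-type gap between empirical CDFs (not necessarily the authors' exact metric):

```python
import numpy as np

def cdf_distance(sample, reference, n_grid=200):
    """An L1-type gap between two empirical CDFs, as one possible metric on the CDF."""
    grid = np.linspace(min(sample.min(), reference.min()),
                       max(sample.max(), reference.max()), n_grid)
    ecdf = lambda data: np.searchsorted(np.sort(data), grid, side="right") / data.size
    return np.mean(np.abs(ecdf(sample) - ecdf(reference)))

rng = np.random.default_rng(1)
reference = rng.normal(25, 2, 500)     # reference distribution of node temperatures
normal_day = rng.normal(25.5, 2, 100)  # locality-preserving day-to-day drift
fire_event = rng.normal(40, 5, 100)    # diverging readings near a fire
print(cdf_distance(normal_day, reference), cdf_distance(fire_event, reference))
```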

Keywords: forest fire, WSN, wireless sensor network, algorithm

Procedia PDF Downloads 239
27326 Effects of Compensation on Distribution System Technical Losses

Authors: B. Kekezoglu, C. Kocatepe, O. Arikan, Y. Hacialiefendioglu, G. Ucar

Abstract:

One of the significant problems in energy systems is supplying economic and efficient energy to consumers. Therefore, studies have continued on reducing technical losses in the network. In this paper, the technical losses are analyzed for a portion of the European side of the Istanbul MV distribution network under different compensation scenarios, considering real system and load data, and the results are presented. The investigated system is modeled with the CYME Power Engineering software, and optimal capacitor placement is proposed to minimize losses.

Keywords: distribution system, optimal capacitor placement, reactive power compensation, technical losses

Procedia PDF Downloads 649
27325 Linkage between Trace Element Distribution and Growth Ring Formation in Japanese Red Coral (Paracorallium japonicum)

Authors: Luan Trong Nguyen, M. Azizur Rahman, Yusuke Tamenori, Toshihiro Yoshimura, Nozomu Iwasaki, Hiroshi Hasegawa

Abstract:

This study investigated the distribution of magnesium (Mg), phosphorus (P), sulfur (S) and strontium (Sr) using micro X-ray fluorescence (µ-XRF) along the annual growth rings in the skeleton of the Japanese red coral Paracorallium japonicum. The Mg, P and S distributions in the µ-XRF mapping images correspond to the dark and light bands along the annual growth rings observed in microscopic images of the coral skeleton. The µ-XRF mapping data showed a positive correlation (r = 0.6) between the P and S distributions in the coral skeleton. A contrasting distribution pattern of S and Mg along the axial skeleton of P. japonicum indicates a weak negative correlation (r = -0.2) between these two trace elements. The distribution patterns of S, P and Mg reveal a linkage between their distributions and the formation of dark/light bands along the annual growth rings in the axial skeleton of P. japonicum. Sulfur and P were distributed in the organic-matrix-rich dark bands, while Mg was distributed in the light bands of the annual growth rings.

Keywords: µ-XRF, trace element, precious coral, Paracorallium japonicum

Procedia PDF Downloads 416
27324 Testing the Change in Correlation Structure across Markets: High-Dimensional Data

Authors: Malay Bhattacharyya, Saparya Suresh

Abstract:

The correlation structure associated with a portfolio is subject to variation across time. Studying the structural breaks in the time-dependent correlation matrix associated with a collection of assets has been a subject of interest for a better understanding of market movements, portfolio selection, etc. The current paper proposes a methodology for testing the change in the time-dependent correlation structure of a portfolio in high dimensional data using the techniques of the generalized inverse, singular value decomposition and multivariate distribution theory, which has not been addressed so far. The asymptotic properties of the proposed test are derived. Also, the performance and validity of the method are tested on a real data set. The proposed test performs well in detecting the change in the dependence of global markets in the context of high dimensional data.

Keywords: correlation structure, high dimensional data, multivariate distribution theory, singular value decomposition

Procedia PDF Downloads 100
27323 The Generalized Pareto Distribution as a Model for Sequential Order Statistics

Authors: Mahdy Esmailian, Mahdi Doostparast, Ahmad Parsian

Abstract:

In this article, sequential order statistics (SOS) under type II censoring, coming from the generalized Pareto distribution, are considered. Maximum likelihood (ML) estimators of the unknown parameters are derived on the basis of the available multiple SOS data. Necessary conditions for the existence and uniqueness of the derived ML estimates are given. Due to the complexity of the proposed likelihood function, a useful re-parametrization is suggested. For illustrative purposes, a Monte Carlo simulation study is conducted and an illustrative example is analysed.

Keywords: bayesian estimation, generalized pareto distribution, maximum likelihood estimation, sequential order statistics

Procedia PDF Downloads 474
27322 A Comparative Analysis of Geometric and Exponential Laws in Modelling the Distribution of the Duration of Daily Precipitation

Authors: Mounia El Hafyani, Khalid El Himdi

Abstract:

Precipitation is one of the key variables in water resource planning. Modeling wet and dry durations is of crucial importance in engineering hydrology. The objective of this study is to model and analyze the distribution of wet and dry durations. For this purpose, daily rainfall data from 1967 to 2017 from the station of the Moroccan city of Kenitra are used. Three models are implemented for the distribution of wet and dry durations, namely the first-order Markov chain, the second-order Markov chain, and the truncated negative binomial law. The adherence of the data to the proposed models is evaluated using chi-square and Kolmogorov-Smirnov tests. The Akaike information criterion is applied to assess the most effective model distribution. We go further and study the law of the number of wet and dry days among k consecutive days; this law is computed through an algorithm that we have implemented based on conditional laws. We complete our work by comparing the observed moments of the numbers of wet/dry days among k consecutive days to the calculated moments of the three estimated models. The study shows the effectiveness of our approach in modeling the wet and dry durations of daily precipitation.
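Under a first-order Markov chain, dry (or wet) spell lengths follow a geometric law; the sketch below, using a synthetic wet/dry sequence (not the Kenitra data), estimates the transition probabilities and the implied dry-spell distribution.

```python
import numpy as np

# Hypothetical daily series: 1 = wet day (rain above a threshold), 0 = dry day.
rng = np.random.default_rng(7)
days = (rng.random(5000) < 0.3).astype(int)

# First-order Markov chain: estimate transition probabilities from consecutive pairs.
pairs = np.column_stack([days[:-1], days[1:]])
p_dry_to_dry = np.mean(pairs[pairs[:, 0] == 0, 1] == 0)
p_wet_to_wet = np.mean(pairs[pairs[:, 0] == 1, 1] == 1)

# Under a first-order chain, a dry spell length L follows a geometric law:
# P(L = k) = p_dd**(k-1) * (1 - p_dd), k = 1, 2, ...
k = np.arange(1, 11)
print(p_dry_to_dry ** (k - 1) * (1 - p_dry_to_dry))
```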

Keywords: Markov chain, rainfall, truncated negative binomial law, wet and dry durations

Procedia PDF Downloads 89
27321 Bayesian Analysis of Topp-Leone Generalized Exponential Distribution

Authors: Najrullah Khan, Athar Ali Khan

Abstract:

The Topp-Leone distribution was introduced by Topp and Leone in 1955. In this paper, an attempt has been made to fit the Topp-Leone generalized exponential (TPGE) distribution. A real survival data set is used for illustration. Implementation is done using R and JAGS, and appropriate illustrations are made. R and JAGS code has been provided to implement the censoring mechanism using both optimization and simulation tools. The main aim of this paper is to describe and illustrate the Bayesian modelling approach to the analysis of survival data. Emphasis is placed on the modeling of data and the interpretation of the results. Crucial to this is an understanding of the nature of the incomplete, or 'censored', data encountered. Analytic approximation and simulation tools are covered here, but most of the emphasis is on Markov chain Monte Carlo methods, including the independence Metropolis algorithm, which is currently the most popular technique. For analytic approximation, among various optimization algorithms, the trust region method is found to be the best. In this paper, the TPGE model is also used to analyze lifetime data in the Bayesian paradigm. Results are evaluated for the above-mentioned real survival data set. The analytic approximation and simulation methods are implemented using some software packages. It is clear from our findings that simulation tools provide better results than those obtained by asymptotic approximation.

Keywords: Bayesian Inference, JAGS, Laplace Approximation, LaplacesDemon, posterior, R Software, simulation

Procedia PDF Downloads 502
27320 Data-Driven Simulation Tools for DER and Battery Rich Power Grids

Authors: Ali Moradiamani, Samaneh Sadat Sajjadi, Mahdi Jalili

Abstract:

Power system analysis has been a major research topic in the generation and distribution sectors, in both industry and academia, for a long time. Several load flow and fault analysis scenarios are normally performed to study the performance of different parts of the grid in the context of, for example, voltage and frequency control. Software tools, such as PSCAD, PSSE, and PowerFactory DIgSILENT, have been developed to perform these analyses accurately. The distribution grid used to be the passive part of the grid, known as the grid of consumers. However, a significant paradigm shift has happened with the emergence of Distributed Energy Resources (DERs) at the distribution level. This means that the concept of power system analysis needs to be extended to the distribution grid, especially considering self-sufficient technologies such as microgrids. Compared to the generation and transmission levels, the distribution level includes significantly more generation/consumption nodes thanks to rooftop PV solar generation and battery energy storage systems. In addition, diverse consumption profiles are expected from household residents, resulting in a diverse set of scenarios. The emergence of electric vehicles will make the environment even more complicated, considering their charging (and possibly discharging) requirements. These complexities, as well as the large size of distribution grids, create challenges for the available power system analysis software. In this paper, we study the requirements of simulation tools for the distribution grid and how data-driven algorithms are required to increase the accuracy of the simulation results.

Keywords: smart grids, distributed energy resources, electric vehicles, battery storage systems, simulation tools

Procedia PDF Downloads 65
27319 An Evaluation Method of Accelerated Storage Life Test for Typical Mechanical and Electronic Products

Authors: Jinyong Yao, Hongzhi Li, Chao Du, Jiao Li

Abstract:

The reliability of long-term storage products is related to the availability of the whole system, and the evaluation of storage life is of great necessity. These products are usually highly reliable, and little failure information can be collected. In this paper, an analytical method based on data from an accelerated storage life test is proposed to evaluate the reliability index of long-term storage products. Firstly, singularities are eliminated by data normalization and residual analysis. Secondly, with the pre-processed data, a degradation path model is built to obtain pseudo life values. Then, under a life distribution hypothesis, we can obtain the parameter estimators at high stress levels and verify the consistency of the failure mechanisms. Finally, the life distribution under the normal stress level is extrapolated via the acceleration model, and an evaluation of the true average life becomes available. An application example with a camera stabilization device is provided to illustrate the proposed methodology.

Keywords: accelerated storage life test, failure mechanisms consistency, life distribution, reliability

Procedia PDF Downloads 364
27318 The Effect of Human Capital and Oil Revenue on Income Distribution in Real Sample

Authors: Marjan Majdi, MohammadAli Moradi, Elham Samarikhalaj

Abstract:

Income distribution is one of the major topics in macroeconomic theory. Income distribution is among the economic categories most strongly influenced by economic policies. Human capital has an impact on economic growth and a significant effect on income distribution. The results of this study confirm that the effects of oil revenue and human capital on income distribution are negative and significant, but the values of the estimated coefficients are very small in a real sample over the period 1969-2006.

Keywords: gini coefficient, human capital, income distribution, oil revenue

Procedia PDF Downloads 596
27317 Reliability Analysis in Power Distribution System

Authors: R. A. Deshpande, P. Chandhra Sekhar, V. Sankar

Abstract:

In this paper, we discuss the basic reliability evaluation techniques needed to evaluate the reliability of distribution systems, as applied in distribution system planning and operation. Basically, the reliability study can also help to predict the reliability performance of the system after quantifying the impact of adding new components to the system. The number and locations of new components needed to improve the reliability indices to certain limits are identified and studied.

Keywords: distribution system, reliability indices, urban feeder, rural feeder

Procedia PDF Downloads 744
27316 Historical Metaphors in Insurance: A Journey

Authors: Anjuman Antil, Anuj Kapoor, Neha Saini

Abstract:

Purpose: The purpose of this paper is to study the evolution of insurance in India and the world. The paper also traces the historical basis of life insurance in the world and how insurance emerged as a major sector in India's economy. The promotional strategies and distribution channels of the top three companies in the Indian insurance sector are also discussed. Design/methodology/approach: The paper examines secondary data, which include reports issued by the Insurance Regulatory Authority of India, company websites, books, and journals relevant to the study. Findings: The paper argues the role and importance of insurance in an emerging economy. The challenges and opportunities of the insurance sector are briefly outlined. The emerging areas in the insurance sector in terms of promotional strategies and distribution channels are also listed. Implications: The historical evolution can be studied by companies while formulating their strategies. It will help them analyse the insurance sector, how things have changed, and how to change with the changing times. Originality/value: This paper gives comprehensive data regarding the background of the insurance sector. Along with the historical perspective, marketing and distribution as well as current and future trends are discussed.

Keywords: insurance, evolution, life insurance, marketing, distribution channels

Procedia PDF Downloads 204
27315 Reliability Analysis of Construction Schedule Plan Based on Building Information Modelling

Authors: Lu Ren, You-Liang Fang, Yan-Gang Zhao

Abstract:

In recent years, the application of BIM (Building Information Modelling) to construction schedule planning has been the focus of more and more researchers. In order to assess the reasonableness of a BIM-based construction schedule plan, that is, whether the schedule can be completed on time, some researchers have introduced reliability theory for the evaluation. In the evaluation process, the uncertain factors affecting the construction schedule plan are regarded as random variables, and the probability distributions of the random variables are assumed to be normal, determined by two parameters estimated from the mean and standard deviation of statistical data. However, in practical engineering, most of the uncertain influencing factors are not normal random variables, so the evaluation results of the construction schedule plan will be unreasonable under the assumption that the random variables follow the normal distribution. Therefore, in order to obtain a more reasonable evaluation result, it is necessary to describe the distribution of the random variables more comprehensively. For this purpose, the cubic normal distribution is introduced in this paper to describe the distribution of arbitrary random variables; it is determined by the first four moments (mean, standard deviation, skewness and kurtosis). In this paper, the BIM model is first built according to the design information of the structure, and the construction schedule plan is made based on BIM; the cubic normal distribution is then fitted to the statistical data collected on the random factors influencing the construction schedule plan. Next, the reliability analysis of the BIM-based construction schedule plan can be carried out more reasonably. Finally, more accurate evaluation results can be given, providing a reference for the implementation of the actual construction schedule plan. In the last part of this paper, the efficiency and accuracy of the proposed methodology for the reliability analysis of the BIM-based construction schedule plan are demonstrated through a practical engineering case.
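The cubic normal distribution referred to above is determined by the first four moments; one widely used construction is a third-order polynomial (Fleishman-type) transformation of a standard normal variable. The sketch below is an illustrative assumption with synthetic activity-duration data, not the authors' formulation.

```python
import numpy as np
from scipy.optimize import fsolve
from scipy.stats import skew, kurtosis

def fleishman_coeffs(g1, g2):
    """Solve for (b, c, d) in Y = a + b*Z + c*Z**2 + d*Z**3, Z ~ N(0,1), a = -c,
    so that Y has zero mean, unit variance, skewness g1 and excess kurtosis g2."""
    def equations(p):
        b, c, d = p
        return (
            b**2 + 6*b*d + 2*c**2 + 15*d**2 - 1,
            2*c*(b**2 + 24*b*d + 105*d**2 + 2) - g1,
            24*(b*d + c**2*(1 + b**2 + 28*b*d)
                + d**2*(12 + 48*b*d + 141*c**2 + 225*d**2)) - g2,
        )
    return fsolve(equations, (1.0, 0.0, 0.0))

# Hypothetical duration data for one activity (days); in practice these would be
# the collected statistics of a random factor influencing the schedule.
data = np.random.default_rng(3).lognormal(mean=2.0, sigma=0.3, size=500)
m, s, g1, g2 = data.mean(), data.std(ddof=1), skew(data), kurtosis(data)

b, c, d = fleishman_coeffs(g1, g2)
z = np.random.default_rng(4).standard_normal(100_000)
y = m + s * (-c + b*z + c*z**2 + d*z**3)   # cubic-normal samples matching the 4 moments
print(y.mean(), y.std(), skew(y), kurtosis(y))
```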

Keywords: BIM, construction schedule plan, cubic normal distribution, reliability analysis

Procedia PDF Downloads 108
27314 Modeling the Current and Future Distribution of Anthus Pratensis under Climate Change

Authors: Zahira Belkacemi

Abstract:

One of the most important tools in conservation biology is information on the geographic distribution of species and the variables determining those patterns. In this study, we used maximum-entropy niche modeling (Maxent) to predict the current and future distribution of Anthus pratensis using climatic variables. The results showed that the species would not be highly affected by climate change in shifting its distribution; however, the results of this study should be improved by taking into account other predictors. The NATURA 2000 protected sites will be only 42% efficient in protecting the species.

Keywords: anthus pratensis, climate change, Europe, species distribution model

Procedia PDF Downloads 107
27313 Bayesian Using Markov Chain Monte Carlo and Lindley's Approximation Based on Type-I Censored Data

Authors: Al Omari Moahmmed Ahmed

Abstract:

This paper describes Bayesian estimation using Markov chain Monte Carlo and Lindley's approximation, as well as maximum likelihood estimation, for the Weibull distribution with Type-I censored data. The maximum likelihood method cannot estimate the shape parameter in closed form, although it can be solved by numerical methods. Moreover, the Bayesian estimates of the parameters and of the survival and hazard functions cannot be obtained analytically. Hence, the Markov chain Monte Carlo method and Lindley's approximation are used: the full conditional distributions for the parameters of the Weibull distribution are obtained via Gibbs sampling and the Metropolis-Hastings (MH) algorithm, followed by estimation of the survival and hazard functions. The methods are compared with their maximum likelihood counterparts, and the comparisons are made with respect to mean square error (MSE) and absolute bias to determine the better method for the scale and shape parameters and for the survival and hazard functions.
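The MCMC and Lindley components are not reproduced here; as a minimal baseline for the maximum likelihood part, the sketch below (synthetic data, assumed true parameters and censoring time) maximizes the Type-I censored Weibull log-likelihood numerically.

```python
import numpy as np
from scipy.optimize import minimize

# Type-I censoring: units are observed up to a fixed time tau; later failures are censored.
rng = np.random.default_rng(5)
shape_true, scale_true, tau = 1.8, 10.0, 12.0
t = scale_true * rng.weibull(shape_true, size=300)
observed = np.minimum(t, tau)
failed = t <= tau                      # censoring indicator

def neg_loglik(params):
    k, lam = np.exp(params)            # optimize on the log scale to keep k, lam > 0
    z = observed / lam
    ll_fail = np.log(k / lam) + (k - 1) * np.log(z[failed]) - z[failed] ** k
    ll_cens = -z[~failed] ** k         # log survival function for censored units
    return -(ll_fail.sum() + ll_cens.sum())

res = minimize(neg_loglik, x0=np.log([1.0, np.mean(observed)]), method="Nelder-Mead")
k_hat, lam_hat = np.exp(res.x)
print(k_hat, lam_hat)
```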

Keywords: weibull distribution, bayesian method, markov chain monte carlo, survival and hazard functions

Procedia PDF Downloads 449
27312 Population Size Estimation Based on the GPD

Authors: O. Anan, D. Böhning, A. Maruotti

Abstract:

The purpose of the study is to estimate the elusive target population size under a truncated count model that accounts for heterogeneity. The proposed estimator is based on the generalized Poisson distribution (GPD), which extends the Poisson distribution by adding a dispersion parameter. It thus becomes a useful model for capture-recapture data where concurrent events are not homogeneous. In addition, it can account for over-dispersion and under-dispersion. The ratios of neighboring frequency counts are used as a tool for investigating whether the generalized Poisson or the Poisson distribution is valid. Since capture-recapture approaches do not provide the zero counts, the parameters are estimated by modifying the EM-algorithm technique for the zero-truncated generalized Poisson distribution. The properties and the comparative performance of the proposed estimator were investigated through simulation studies. Furthermore, some empirical examples are presented to give insights into the behavior of the estimators.
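The ratio plot mentioned above compares r_x = (x + 1) f_{x+1} / f_x across counts x; for a homogeneous Poisson model this ratio is flat (equal to the Poisson mean), while a systematic trend points toward heterogeneity and the generalized Poisson. A small sketch with hypothetical frequency-of-frequency data:

```python
from collections import Counter

# Hypothetical capture counts per observed individual (zero counts are unobservable).
counts = [1]*62 + [2]*28 + [3]*11 + [4]*5 + [5]*2
f = Counter(counts)                      # frequency of frequencies f_x

# Ratio plot: r_x = (x + 1) * f_{x+1} / f_x is flat for a homogeneous Poisson model;
# a monotone trend suggests heterogeneity / the generalized Poisson.
for x in sorted(f):
    if x + 1 in f:
        print(x, (x + 1) * f[x + 1] / f[x])
```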

Keywords: capture-recapture methods, ratio plot, heterogeneous population, zero-truncated count

Procedia PDF Downloads 415
27311 To Ensure Maximum Voter Privacy in E-Voting Using Blockchain, Convolutional Neural Network, and Quantum Key Distribution

Authors: Bhaumik Tyagi, Mandeep Kaur, Kanika Singla

Abstract:

The advancement of blockchain has enabled scholars to remodel e-voting systems for future generations. Server-side attacks like SQL injection and DoS attacks are the most common attacks nowadays, where malicious code is injected into the system through user input fields by illicit users, which leads to data leakage in the worst scenarios. Besides, there are also quantum attacks, which manipulate transactional data. In order to deal with all the above-mentioned attacks, an integration of blockchain, a convolutional neural network (CNN), and Quantum Key Distribution is carried out in this research. The utilization of blockchain technology in e-voting applications is not a novel concept, but privacy and security issues remain in both public and private blockchains. To address this, a hybrid blockchain is used in this research. This research proposes cryptographic signatures and blockchain algorithms to validate the origin and integrity of the votes. The convolutional neural network (CNN), a regularized version of the multilayer perceptron, is also applied in the system to analyze visual descriptions upon registration, in order to enhance the privacy of voters and the e-voting system. Quantum Key Distribution is implemented in order to secure the blockchain-based e-voting system from quantum attacks using quantum algorithms. An e-voting blockchain DApp is implemented, and a solution for voter privacy in e-voting using blockchain, CNN, and Quantum Key Distribution is provided.
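As a stand-alone illustration of validating the origin and integrity of vote records with cryptographic hashes, and only that (no hybrid blockchain, CNN, or QKD components), the hypothetical sketch below chains ballot records with SHA-256 and verifies the chain; all names and fields are invented.

```python
import hashlib, json, time

def make_block(prev_hash, ballot):
    """Append-only record: each block commits to the previous block's hash."""
    block = {"prev_hash": prev_hash, "timestamp": time.time(), "ballot": ballot}
    block["hash"] = hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block

# Ballots would in practice carry a cryptographic signature and an anonymized voter token.
chain = [make_block("0" * 64, {"voter_token": "t1", "choice": "A"})]
chain.append(make_block(chain[-1]["hash"], {"voter_token": "t2", "choice": "B"}))

def verify(chain):
    """Recompute each block hash and check the links to detect any tampering."""
    prev = "0" * 64
    for b in chain:
        body = {k: v for k, v in b.items() if k != "hash"}
        recomputed = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if b["prev_hash"] != prev or recomputed != b["hash"]:
            return False
        prev = b["hash"]
    return True

print(verify(chain))
```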

Keywords: hybrid blockchain, secure e-voting system, convolutional neural networks, quantum key distribution, one-time pad

Procedia PDF Downloads 54