Search results for: Pareto distribution
1629 Analytical Slope Stability Analysis Based on the Statistical Characterization of Soil Shear Strength
Authors: Bernardo C. P. Albuquerque, Darym J. F. Campos
Abstract:
Our ability to solve complex engineering problems is directly related to the processing capacity of computers, which makes it possible to run numerical algorithms quickly and accurately. Besides the growing interest in numerical simulations, probabilistic approaches are also of great importance, and statistical tools have shown their relevance to the modelling of practical engineering problems. In general, statistical approaches to such problems assume that the random variables involved follow a normal distribution. This assumption tends to produce incorrect results when skewed data are present, since normal distributions are symmetric about their means. To visualize and quantify this aspect, nine statistical distributions (symmetric and skewed) have been considered to model a hypothetical slope stability problem. The data modelled are the friction angles of a superficial soil in Brasilia, Brazil. Despite its apparent universality, the normal distribution did not qualify as the best fit. In the present effort, data obtained from consolidated-drained triaxial tests and saturated direct shear tests have been modelled and used to analytically derive the probability density function (PDF) of the safety factor of a hypothetical slope based on the Mohr-Coulomb rupture criterion. Based on this analysis, the failure probability can be derived explicitly when the friction angle is treated as a random variable. Furthermore, the stability analysis can be compared when the friction angle is modelled as a Dagum distribution (the distribution that presented the best fit to the histogram) and as a normal distribution. This comparison reveals relevant differences when analyzed in light of risk management.
Keywords: Statistical slope stability analysis, Skew distributions, Probability of failure, Functions of random variables.
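For readers who want to reproduce the distribution-comparison step described in this abstract, the following minimal Python sketch (not the authors' code) fits several candidate distributions to a friction-angle sample and ranks them by the Kolmogorov-Smirnov statistic. The sample values are synthetic placeholders rather than the Brasilia test data, and the use of scipy's burr (Burr Type III) as a stand-in for the Dagum family is an assumption about parameterization.

```python
# Minimal sketch: fit candidate distributions to friction-angle data and rank
# them by the Kolmogorov-Smirnov statistic. Data below are synthetic placeholders.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
phi = rng.normal(loc=28.0, scale=3.0, size=40)   # hypothetical friction angles (degrees)

# scipy's 'burr' is Burr Type III, which matches the Dagum family up to parameterization.
candidates = {"normal": stats.norm, "lognormal": stats.lognorm,
              "gamma": stats.gamma, "dagum (burr III)": stats.burr}

for name, dist in candidates.items():
    params = dist.fit(phi)                            # maximum-likelihood fit
    ks = stats.kstest(phi, dist.cdf, args=params)
    print(f"{name:18s} KS statistic = {ks.statistic:.3f}")
```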
1628 Entropic Measures of a Probability Sample Space and Exponential Type (α, β) Entropy
Authors: Rajkumar Verma, Bhu Dev Sharma
Abstract:
Entropy is a key measure in studies related to information theory and its many applications. Campbell was the first to recognize that the exponential of Shannon's entropy is just the size of the sample space when the distribution is uniform. The idea here is to study exponentials of Shannon's entropy and of those other entropy generalizations that involve a logarithmic function, for a probability distribution in general. In this paper, we introduce a measure of sample space, called the 'entropic measure of a sample space', with respect to the underlying distribution. It is shown, in both the discrete and continuous cases, that this new measure depends on the parameters of the distribution on the sample space; the same sample space has different 'entropic measures' depending on the distributions defined on it. It is noted that Campbell's idea also applies to Rényi's parametric entropy of a given order. Knowing that parameters play a role in providing suitable choices and extended applications, the paper also studies parametric entropic measures of sample spaces. Exponential entropies related to Shannon's entropy and to those generalizations that involve logarithmic functions, i.e. are additive, have been studied for wider understanding and application. We propose and study exponential entropies corresponding to the non-additive entropies of type (α, β), which include the Havrda and Charvát entropy as a special case.
Keywords: Sample space, Probability distributions, Shannon's entropy, Rényi's entropy, Non-additive entropies.
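The observation attributed to Campbell in this abstract can be checked numerically: the exponential of Shannon's entropy equals the number of outcomes for a uniform distribution and acts as an "effective" sample-space size otherwise. The short sketch below is illustrative only; the helper name entropic_size is ours, not the paper's.

```python
# Small numerical check: exp(Shannon entropy) equals n for a uniform distribution
# over n outcomes, and gives an "effective" sample-space size otherwise.
import numpy as np

def entropic_size(p):
    """exp(H(p)) with H in nats; zero-probability outcomes are ignored."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(np.exp(-np.sum(p * np.log(p))))

print(entropic_size([0.25, 0.25, 0.25, 0.25]))  # -> 4.0 (uniform over 4 outcomes)
print(entropic_size([0.7, 0.1, 0.1, 0.1]))      # -> about 2.6 effective outcomes
```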
1627 Voltage Stability Investigation of Grid Connected Wind Farm
Authors: Trinh Trong Chuong
Abstract:
At present, it is very common to find renewable energy resources, especially wind power, connected to distribution systems. The impact of this wind power on voltage levels in distribution systems has been addressed in the literature. The majority of these works deal with determining the maximum active and reactive power that can be connected at a system load bus before the voltage at that bus reaches the voltage collapse point. This is done by the traditional method of PV curves reported in many references. A theoretical expression for the maximum power transfer through a grid, as limited by voltage stability, is formulated using an exact representation of the distribution line with ABCD parameters. The expression is used to plot PV curves at various power factors of a radial system, from which the limiting values of reactive power can be obtained. This paper presents a method for studying the relationship between the active power and the voltage (PV) at the load bus in order to identify the voltage stability limit. It is a foundation for building a permitted operating region that complies with the voltage stability limit at the point of common coupling (PCC) of a connected wind farm.
Keywords: Wind generator, Voltage stability, grid connected
1626 Effect of Fine-Ground Ceramic Admixture on Early Age Properties of Cement Paste
Authors: Z. Pavlík, M. Pavlíková, P. Volfová, M. Keppert, R. Černý
Abstract:
Properties of cement pastes with fine-ground ceramics used as an alternative binder replacing Portland cement by up to 20% of its mass are investigated. At first, the particle size distributions of the cement and the fine-ground ceramics are measured using a laser analyser. Then, the material properties are studied in the early hardening period, up to 28 days. The hydration process of the studied materials is monitored by electrical conductivity measurement using TDR sensors. The changes in the materials' structures during hardening are observed using pore size distribution measurements. Compressive strength measurements are performed as well. Experimental results show that the replacement of Portland cement by fine-ground ceramics in amounts of up to 20% by mass is an acceptable solution from the mechanical point of view. One can also assume physical properties of the designed materials similar to those of the reference material with Portland cement as the only binder.
Keywords: Fine-ground ceramics, cement pastes, early age properties, mechanical properties, pore size distribution, electrical conductivity measurement.
1625 Optimal Placement and Sizing of Energy Storage System in Distribution Network with Photovoltaic Based Distributed Generation Using Improved Firefly Algorithms
Authors: Ling Ai Wong, Hussain Shareef, Azah Mohamed, Ahmad Asrul Ibrahim
Abstract:
The installation of photovoltaic-based distributed generation (PVDG) in an active distribution system can lead to voltage fluctuation due to the intermittent and unpredictable PVDG output power. This paper presents a method for mitigating the voltage rise by optimally locating and sizing the battery energy storage system (BESS) in a PVDG-integrated distribution network. The improved firefly algorithm is used to perform the optimal placement and sizing. Three objective functions are presented, considering the voltage deviation and the BESS off-time, with the state of charge as the constraint. The performance of the proposed method is compared with other optimization methods, namely the original firefly algorithm and the gravitational search algorithm. Simulation results show that the proposed optimum BESS location and size improve the voltage stability.
Keywords: BESS, PVDG, firefly algorithm, voltage fluctuation.
1624 Analysis of Temperature Change under Global Warming Impact Using Empirical Mode Decomposition
Authors: Md. Khademul Islam Molla, Akimasa Sumi, M. Sayedur Rahman
Abstract:
The empirical mode decomposition (EMD) represents any time series as a finite set of basis functions. The bases are termed intrinsic mode functions (IMFs), which are mutually orthogonal and contain a minimum amount of cross-information. The EMD successively extracts the IMFs with the highest local frequencies in a recursive way, which effectively yields a set of low-pass filters based entirely on the properties exhibited by the data. In this paper, EMD is applied to explore the properties of multi-year air temperature and to observe its effects on climate change under global warming. The method decomposes the original time series into intrinsic time scales and is capable of analyzing the nonlinear, non-stationary climatic time series that cause problems for many linear statistical methods and their users. The analysis results show that the modes of the EMD present seasonal variability. Most of the IMFs have a normal distribution, and the energy density distribution of the IMFs satisfies a chi-square distribution. The IMFs are effective in isolating physical processes of various time scales and are also statistically significant. The analysis results also show that the EMD method does a good job of identifying many characteristics of inter-annual climate. The results suggest that climate fluctuations of every single element, such as temperature, are the result of variations in the global atmospheric circulation.
Keywords: Empirical mode decomposition, instantaneous frequency, Hilbert spectrum, Chi-square distribution, anthropogenic impact.
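As a rough illustration of the decomposition step described above (not the authors' implementation), the sketch below decomposes a synthetic monthly-temperature-like series into IMFs, assuming the third-party PyEMD package (distributed as EMD-signal) is available; the signal and its parameters are made up.

```python
# Minimal sketch, assuming the third-party PyEMD package ("EMD-signal") is installed.
import numpy as np
from PyEMD import EMD

t = np.arange(600)                                    # 50 years of monthly samples
signal = (15 + 8 * np.sin(2 * np.pi * t / 12)         # seasonal cycle
          + 0.002 * t                                 # slow warming trend
          + np.random.default_rng(1).normal(0, 1, t.size))

imfs = EMD().emd(signal)                              # rows: IMFs, last row ~ residual trend
print(f"number of IMFs (including residual): {imfs.shape[0]}")
```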
1623 Non-Sensitive Solutions in Multi-Objective Optimization of a Solar Photovoltaic/Thermal (PV/T) Air Collector
Authors: F. Sarhaddi, S. Farahat, M. A. Alavi, F. Sobhnamayan
Abstract:
In this paper, an attempt has been made to obtain non-sensitive solutions in the multi-objective optimization of a photovoltaic/thermal (PV/T) air collector. The selected objective functions are overall energy efficiency and exergy efficiency. Improved thermal, electrical and exergy models are used to calculate the thermal and electrical parameters, overall energy efficiency, exergy components and exergy efficiency of a typical PV/T air collector. A computer simulation program is also developed. The results of the numerical simulation are in good agreement with the experimental measurements reported in the previous literature. Finally, multi-objective optimization has been carried out under given climatic, operating and design parameters. The optimized ranges of inlet air velocity, duct depth and the objective functions on the optimal Pareto front have been obtained. Furthermore, non-sensitive solutions, from the energy or exergy point of view, among the results of the multi-objective optimization have been identified.
Keywords: Solar photovoltaic thermal (PV/T) air collector, Overall energy efficiency, Exergy efficiency, Multi-objective optimization, Sensitivity analysis.
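Because the abstract relies on a Pareto front of two efficiencies, a minimal non-dominated-set filter may help clarify what "optimal Pareto front" means in practice. The sketch below is generic, assumes both objectives are maximized, and uses random stand-in values for energy and exergy efficiency rather than the PV/T model.

```python
# Minimal sketch: extract the Pareto-optimal (non-dominated) set from candidate
# designs when all objectives are maximized. Values are random placeholders.
import numpy as np

def pareto_front(points):
    """Return a boolean mask of non-dominated rows (maximization of all columns)."""
    pts = np.asarray(points, dtype=float)
    mask = np.ones(len(pts), dtype=bool)
    for i, p in enumerate(pts):
        if mask[i]:
            dominated = np.all(pts <= p, axis=1) & np.any(pts < p, axis=1)
            mask[dominated] = False
    return mask

rng = np.random.default_rng(2)
designs = rng.random((200, 2))          # columns: [energy efficiency, exergy efficiency]
front = designs[pareto_front(designs)]
print(f"{front.shape[0]} non-dominated designs out of {designs.shape[0]}")
```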
1622 Variational EM Inference Algorithm for Gaussian Process Classification Model with Multiclass and Its Application to Human Action Classification
Authors: Wanhyun Cho, Soonja Kang, Sangkyoon Kim, Soonyoung Park
Abstract:
In this paper, we propose a variational EM inference algorithm for the multi-class Gaussian process classification model that can be used in the field of human behavior recognition. This algorithm can simultaneously derive both the posterior distribution of a latent function and estimators of the hyper-parameters in a multi-class Gaussian process classification model. Our algorithm is based on the Laplace approximation (LA) technique and the variational EM framework, and it is performed in two steps: the expectation step and the maximization step. First, in the expectation step, using Bayes' formula and the LA technique, we derive an approximation to the posterior distribution of the latent function indicating the possibility that each observation belongs to a certain class in the Gaussian process classification model. Second, in the maximization step, using the derived posterior distribution of the latent function, we compute the maximum likelihood estimator for the hyper-parameters of the covariance matrix necessary to define the prior distribution of the latent function. These two steps are repeated iteratively until a convergence condition is satisfied. Moreover, we apply the proposed algorithm to a human action classification problem using a public database, namely, the KTH human action data set. Experimental results reveal that the proposed algorithm shows good performance on this data set.
Keywords: Bayesian rule, Gaussian process classification model with multiclass, Gaussian process prior, human action classification, Laplace approximation, variational EM algorithm.
1621 Finite Element Simulation of Multi-Stage Deep Drawing Processes and Comparison with Experimental Results
Authors: A. Pourkamali Anaraki, M. Shahabizadeh, B. Babaee
Abstract:
The plastic forming of sheet metal plays an important role in metal forming. The traditional techniques of tool design for sheet forming operations used in industry are experimental and expensive. Predicting the forming results and determining the punching force, blank holder forces and thickness distribution of the sheet metal will decrease the production cost and time of the material to be formed. In this paper, a multi-stage deep drawing simulation of an industrial part is presented using the finite element method. The entire sequence of production steps, together with additional operations such as intermediate annealing and springback, has been simulated with the ABAQUS software under axisymmetric conditions. The simulation results, such as sheet thickness distribution, punch force and residual stresses, have been extracted at every stage, and the sheet thickness distribution was compared with experimental results. The comparison shows that the FE model is in close agreement with the experiment.
Keywords: Deep drawing, Finite element method, Simulation.
1620 Stochastic Estimation of Wireless Traffic Parameters
Authors: Somenath Mukherjee, Raj Kumar Samanta, Gautam Sanyal
Abstract:
Different services based on different switching techniques in wireless networks lead to drastic changes in the properties of network traffic. Because of this diversity in services, network traffic is expected to undergo qualitative and quantitative variations. Hence, assuming traffic characteristics and predicting network events become more complex for wireless networks. In this paper, the traffic characteristics have been studied by collecting traces from the mobile switching centre (MSC). The traces include initiation and termination times, originating node, home station id and foreign station id. The traffic parameters, namely call inter-arrival and holding times, were estimated statistically. The results show that the call inter-arrival times in this wireless network are heavy-tailed and follow gamma distributions, and they are asymptotically long-range dependent. It is also found that the call holding times are best fitted with a lognormal distribution. Based on these observations, an analytical model for performance estimation is also proposed.
Keywords: Wireless networks, traffic analysis, long-range dependence, heavy-tailed distribution.
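A minimal sketch of the fitting step described above, using synthetic traces rather than the MSC data: a gamma distribution is fitted to call inter-arrival times and a lognormal distribution to holding times, with a Kolmogorov-Smirnov check of each fit.

```python
# Minimal sketch: fit gamma to inter-arrival times and lognormal to holding times.
# The traces below are synthetic placeholders, not the MSC data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
interarrival = rng.gamma(shape=0.8, scale=30.0, size=1000)   # seconds, hypothetical
holding = rng.lognormal(mean=4.5, sigma=0.9, size=1000)      # seconds, hypothetical

a, loc, scale = stats.gamma.fit(interarrival, floc=0)
s, loc2, scale2 = stats.lognorm.fit(holding, floc=0)

print("gamma fit     :", stats.kstest(interarrival, stats.gamma.cdf, args=(a, loc, scale)))
print("lognormal fit :", stats.kstest(holding, stats.lognorm.cdf, args=(s, loc2, scale2)))
```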
1619 Type-2 Fuzzy Programming for Optimizing the Heat Rate of an Industrial Gas Turbine via Absorption Chiller Technology
Authors: T. Ganesan, M. S. Aris, I. Elamvazuthi, Momen Kamal Tageldeen
Abstract:
Terms set in power purchase agreements (PPA) challenge power utility companies to balance the returns from maximizing power production against securing long-term supply contracts at capped production. The production limitation set in the PPA has driven efforts to maximize profits through efficient and economic power production. In this paper, a combined industrial-scale gas turbine (GT) and absorption chiller (AC) system is considered, in which the AC cools the GT air intake to reduce the plant's heat rate (HR). This GT-AC system is optimized while considering the power output limitations imposed by the PPA. In addition, the proposed formulation accounts for uncertainties in the ambient temperature using Type-2 fuzzy programming. Using the enhanced chaotic differential evolution (CEDE), the Pareto frontier was constructed and the optimization results are analyzed in detail.
Keywords: Absorption chillers, turbine inlet air cooling, power purchase agreement, multiobjective optimization, type-2 fuzzy programming, chaotic differential evolution.
1618 Further Investigation of Elastic Scattering of 16O on 12C at Different Energies
Authors: Sh. Hamada, N. Burtebayev, N. Amangeldi, A. Amar
Abstract:
The aim of this work is to study the elastic transfer phenomenon which takes place in the elastic scattering of 16O on 12C at energies near the Coulomb barrier. There, the angular distribution decreases steadily with increasing scattering angle, and the cross section then increases at backward angles due to the α-transfer process. This reaction was also studied at different energies in order to track the nuclear rainbow phenomenon. The experimental angular distribution data at these energies were compared to the calculated predictions. Optical potential codes such as SPIVAL and the Distorted Wave Born Approximation code (DWUCK5) were used in the analysis.
Keywords: Transfer reaction, DWBA, Elastic Scattering, Optical Potential Codes.
1617 Distributional Effects of Tax and Benefit Reforms in the Czech Republic
Authors: L. Vítek
Abstract:
The Czech Republic has carried out two waves of tax and benefit reforms over the past decade. The first took place in 2005–2006 under the left-wing government and the second was carried out in 2008 by the right-wing government. Using EU-SILC data for selected types of households, the paper assesses changes in the distribution of gross incomes and the effects of the changes in taxes and benefits on the distribution of incomes after taxes and the provision of social benefits. The analysis is carried out on four types of households with and without children, using Lorenz curves and Gini coefficients. The results show that the tax system changes the distribution of incomes less significantly than benefits do. The 2006 reform reduced the differential between the Gini coefficient for gross income and the Gini coefficient after taxes and benefits for households with active parents and one child. The 2008 reform supported families with children and reduced the differential between gross income and income after taxes and benefits for different types of families.
Keywords: Czech Republic, redistribution, tax reforms.
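The Lorenz-curve/Gini machinery used in the paper can be sketched in a few lines; the incomes and the stylized "tax" below are illustrative placeholders, not the EU-SILC data or the Czech reforms.

```python
# Minimal sketch: Gini coefficient from the area under the Lorenz curve.
# Incomes and the stylized tax below are placeholders, not the EU-SILC sample.
import numpy as np

def gini(income):
    x = np.sort(np.asarray(income, dtype=float))
    lorenz = np.concatenate(([0.0], np.cumsum(x) / x.sum()))   # cumulative income share
    # trapezoidal area under the Lorenz curve on an equally spaced population axis
    area = np.sum((lorenz[1:] + lorenz[:-1]) / 2.0) / len(x)
    return 1.0 - 2.0 * area

gross = np.array([12000, 18000, 22000, 30000, 45000, 80000], dtype=float)
net = gross - 0.2 * (gross - 12000)        # stylized progressive tax, for illustration only
print(f"Gini (gross) = {gini(gross):.3f}, Gini (net) = {gini(net):.3f}")
```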
1616 Direct Measurements of Wind Data over 100 Meters above the Ground in the Site of Lendinara, Italy
Authors: A. Dal Monte, M. Raciti Castelli, G. B. Bellato, L. Stevanato, E. Benini
Abstract:
The wind resource at the Italian site of Lendinara (RO) is analyzed through a systematic anemometric campaign performed on top of the bell tower, at an altitude of over 100 m above the ground. Both the average wind speed and the Weibull distribution are computed. The resulting average wind velocity is in accordance with the numerical predictions of the Italian Wind Atlas, confirming the accuracy of the extrapolation of wind data adopted for the evaluation of the wind potential at higher altitudes with respect to the commonly placed measurement stations.
Keywords: Anemometric campaign, wind resource, Weibull distribution, wind atlas
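A minimal sketch of the Weibull characterization mentioned above, with synthetic wind speeds standing in for the Lendinara record; the two-parameter fit fixes the location at zero, which is a common but not universal choice.

```python
# Minimal sketch: two-parameter Weibull fit of a wind-speed sample (synthetic data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
wind = rng.weibull(2.0, size=8760) * 6.5          # hypothetical hourly speeds, m/s

k, loc, c = stats.weibull_min.fit(wind, floc=0)   # fix the location parameter at zero
print(f"shape k = {k:.2f}, scale c = {c:.2f} m/s, mean speed = {wind.mean():.2f} m/s")
```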
1615 A New Reliability Based Channel Allocation Model in Mobile Networks
Authors: Anujendra, Parag Kumar Guha Thakurta
Abstract:
Data transmission between mobile hosts and base stations (BSs) in mobile networks is often vulnerable to failure. Efficient link connectivity, in terms of the services of both the base stations and the communication channels of the network, is therefore required in wireless mobile networks to achieve highly reliable data transmission. In addition, it is observed that the number of blocked hosts increases due to an insufficient number of channels during heavy load in the network. Under such a scenario, the channels are allocated accordingly to offer reliable communication at any given time. Therefore, a reliability-based channel allocation model with acceptable system performance is proposed as a multi-objective optimization (MOO) problem in this paper. Two conflicting parameters, known as the resource reuse factor (RRF) and the number of blocked calls, are optimized under a reliability constraint. The solution to this MOO problem is obtained through NSGA-II (Non-dominated Sorting Genetic Algorithm II). The effectiveness of the proposed model is shown with a set of experimental results.
Keywords: Base station, channel, GA, Pareto-optimal, reliability.
1614 Evaluating Efficiency of Nina Distribution Company Using Window Data Envelopment Analysis and Malmquist Index
Authors: Hossein Taherian Far, Ali Bazaee
Abstract:
Achieving continuous, sustained economic growth and the economic development that follows is a target for every country seeking it. In this regard, the distribution industry plays an important role in the growth and development of any nation, so estimating the efficiency and productivity of this industry and identifying the factors influencing it is necessary. The objective of the present study is to measure the efficiency and productivity of seven branches of Nina Distribution Company using window data envelopment analysis and the Malmquist productivity index from spring 2013 to summer 2015. In this study, fixed assets, payroll personnel, operating costs and the duration of collection of receivables were selected as inputs, and people, net sales, gross profit and the percentage of coverage to customers were selected as outputs. Window data envelopment analysis was then performed, and productivity was measured using the Malmquist index. The results indicate that the average technical efficiency in the window Data Envelopment Analysis (DEA) model follows a sustained, fluctuating trend, whereas the average management efficiency in the window DEA model shows negative growth (a decline) of about 13%. The mean scale efficiency in all windows, except the second one, which faces an 8% decline, shows growth of 18% compared to the first window. On the other hand, the mean change in total factor productivity in all branches shows an average negative growth (decrease) of 12%, which is the result of a negative change in technology.
Keywords: Nina Distribution Company branches, window data envelopment analysis, Malmquist productivity index.
1613 Screening of Process Variables for the Production of Extracellular Lipase from Palm Oil by Trichoderma viride Using Plackett-Burman Design
Authors: R. Rajendiran, S. Gayathri devi, B.T. SureshKumar, V. Arul Priya
Abstract:
Plackett-Burman statistical screening of media constituents and operational conditions for extracellular lipase production by the isolate Trichoderma viride has been carried out in submerged fermentation. This statistical design is used in the early stages of experimentation to screen out unimportant factors from a large number of possible factors; it allows screening of up to n-1 variables in just n experiments. Regression coefficients and t-values were calculated by subjecting the experimental data to statistical analysis using Minitab version 15. The effects of nine process variables were studied in twelve experimental trials. A maximum lipase activity of 7.83 μmol/ml/min was obtained in the 6th trial. A Pareto chart illustrates the order of significance of the variables affecting lipase production. The present study concludes that the most significant variables affecting lipase production were palm oil, yeast extract, K2HPO4, MgSO4 and CaCl2.
Keywords: lipase, submerged fermentation, statistical optimization, Trichoderma viride
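The 12-run Plackett-Burman layout behind a "nine variables in twelve trials" screen can be built from the standard generator row, as in the hedged sketch below; the response values are placeholders, not the measured lipase activities, and the effect calculation is the usual (mean at +1) minus (mean at -1) contrast.

```python
# Minimal sketch: 12-run Plackett-Burman design (up to 11 two-level factors) and
# main-effect estimation. Responses below are placeholders, not lipase activities.
import numpy as np

gen = np.array([+1, +1, -1, +1, +1, +1, -1, -1, -1, +1, -1])   # standard PB-12 generator row
rows = [np.roll(gen, i) for i in range(11)]
design = np.vstack(rows + [np.full(11, -1)])                    # 12 runs x 11 factor columns

rng = np.random.default_rng(5)
response = 5 + 1.2 * design[:, 0] - 0.8 * design[:, 3] + rng.normal(0, 0.2, 12)

# Main effect of each factor: mean response at +1 minus mean response at -1.
effects = design.T @ response / 6.0
order = np.argsort(-np.abs(effects))
for j in order[:5]:
    print(f"factor {j + 1}: effect = {effects[j]:+.2f}")
```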
1612 Identification of Outliers in Flood Frequency Analysis: Comparison of Original and Multiple Grubbs-Beck Test
Authors: Ayesha S. Rahman, Khaled Haddad, Ataur Rahman
Abstract:
At-site flood frequency analysis is used to estimate flood quantiles when the at-site record length is reasonably long. In Australia, the FLIKE software has been introduced for at-site flood frequency analysis. The advantage of FLIKE is that, for a given application, the user can compare a number of the most commonly adopted probability distributions and parameter estimation methods relatively quickly using a windows interface. The new version of FLIKE incorporates the multiple Grubbs and Beck test, which can identify multiple potentially influential low flows. This paper presents a case study of six catchments in eastern Australia which compares two outlier identification tests (the original Grubbs and Beck test and the multiple Grubbs and Beck test) and two commonly applied probability distributions (Generalized Extreme Value (GEV) and Log Pearson type 3 (LP3)) using the FLIKE software. It has been found that the multiple Grubbs and Beck test, when used with the LP3 distribution, provides more accurate flood quantile estimates than the LP3 distribution with the original Grubbs and Beck test. Between these two methods, the differences in flood quantile estimates have been found to be up to 61% for the six study catchments. It has also been found that the GEV distribution (with L moments) and the LP3 distribution with the multiple Grubbs and Beck test provide quite similar results in most of the cases; however, a difference of up to 38% has been noted in the flood quantiles for an annual exceedance probability (AEP) of 1 in 100 for one catchment. This finding needs to be confirmed with a greater number of stations across other Australian states.
Keywords: Floods, FLIKE, probability distributions, flood frequency, outlier.
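For orientation, the original (single) Grubbs-Beck low-outlier screen can be sketched as below. The K_N approximation is the commonly quoted Bulletin 17B form at the 10% significance level and is an assumption here, not FLIKE's exact implementation; the flow series is synthetic.

```python
# Minimal sketch of the original Grubbs-Beck low-outlier screen on log10 flows.
# K_N approximation (10% level) is assumed, per the common Bulletin 17B form.
import numpy as np

def grubbs_beck_low_outliers(flows):
    x = np.log10(np.asarray(flows, dtype=float))
    n = x.size
    k_n = -0.9043 + 3.345 * np.sqrt(np.log10(n)) - 0.4046 * np.log10(n)
    x_crit = 10 ** (x.mean() - k_n * x.std(ddof=1))
    return x_crit, flows[flows < x_crit]

rng = np.random.default_rng(6)
flows = np.append(rng.lognormal(mean=5.0, sigma=0.6, size=48), [3.0, 5.0])  # two tiny flows
crit, low = grubbs_beck_low_outliers(flows)
print(f"critical flow = {crit:.1f}, flagged low outliers = {low}")
```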
1611 Thermal Post-buckling of Shape Memory Alloy Composite Plates under Non-uniform Temperature Distribution
Authors: Z.A. Rasid, R. Zahari, A. Ayob, D.L. Majid, A.S.M. Rafie
Abstract:
Aerospace vehicles are subjected to non-uniform thermal loading that may cause thermal buckling. A study was conducted on the thermal post-buckling of shape memory alloy composite plates subjected to a non-uniform, tent-like temperature field. Shape memory alloy wires were embedded within the laminated composite plates to add recovery stress to the plates. A non-linear finite element model that considers the recovery stress of the shape memory alloy and the temperature-dependent properties of the shape memory alloy and the composite matrix was developed, along with its source code. It was found that the post-buckling paths of the shape memory alloy composite plates subjected to various tent-like temperature fields were stable within the studied temperature range. The addition of shape memory alloy wires to the composite plates was found to significantly improve the post-buckling behavior of laminated composite plates under non-uniform temperature distribution.
Keywords: Post-buckling, shape memory alloy, temperature-dependent property, tent-like temperature distribution
1610 Decoy-pulse Protocol for Frequency-coded Quantum Key Distribution
Authors: Sudeshna Bhattacharya, Pratyush Pandey, Pradeep Kumar K
Abstract:
We propose a decoy-pulse protocol for a frequency-coded implementation of the B92 quantum key distribution protocol. A direct extension of the decoy-pulse method to the frequency-coding scheme results in a security loss, as an eavesdropper can distinguish between signal and decoy pulses by measuring the carrier photon number without affecting other statistics. We overcome this problem by optimizing the ratio of the carrier photon number of the decoy pulse to that of the signal pulse to be as close to unity as possible. In our method, the switching between signal and decoy pulses is achieved by changing the amplitude of the RF signal rather than modulating the intensity of the optical signal, thus reducing system cost. We find an improvement by a factor of approximately 100 in the key generation rate using the decoy-state protocol. We also study the effect of source fluctuation on the key rate. Our simulation results show a key generation rate of 1.5×10⁻⁴ per pulse for link lengths up to 70 km. Finally, we discuss the optimum value of the average photon number of the signal pulse for a given key rate while also optimizing the carrier ratio.
Keywords: B92, decoy-pulse, frequency-coding, quantum key distribution.
1609 The Possibility-Probability Relationship for Bloodstream Concentrations of Physiologically Active Substances
Authors: Arkady Bolotin
Abstract:
If a possibility distribution and a probability distribution describe values x of one and the same system or process x(t), can they be related to each other? Though in general the possibility and probability distributions might not be connected at all, we can assume that in some particular cases there is an association linking them. In the present paper, we consider distributions of bloodstream concentrations of physiologically active substances and propose that the probability of observing a concentration x of a substance X can be produced from the possibility of the event X = x. The proposed assumptions and the resulting theoretical distributions are tested against data obtained from various panel studies of the bloodstream concentrations of different physiologically active substances in patients as well as in healthy adults.
Keywords: Possibility distributions, possibility-probability relationship.
1608 Electric Field and Potential Distributions along Surface of Silicone Rubber Polymer Insulators Using Finite Element Method
Authors: B. Marungsri, W. Onchantuek, A. Oonsivilai
Abstract:
This paper presents simulation results for the electric field and potential distributions along the surface of silicone rubber polymer insulators. With nearly the same leakage distance, subjected to 15 kV in a 50-cycle salt fog ageing test, the alternate-shed silicone rubber polymer insulator showed better contamination performance than the straight-shed silicone rubber polymer insulator, and severe surface ageing was observed on the straight-shed insulator. The objective of this work is to elucidate that the electric field distribution along the straight-shed insulator is higher than that along the alternate-shed insulator in the salt fog ageing test. The finite element method (FEM) is adopted for this work. The simulation results confirm the experimental data as well.
Keywords: Electric field distribution, potential distribution, silicone rubber polymer insulator, finite element method.
1607 A Brief Study about Nonparametric Adherence Tests
Authors: Vinicius R. Domingues, Luan C. S. M. Ozelim
Abstract:
Statistical studies have become indispensable in various fields of knowledge. Geotechnics is no different: probabilistic and statistical methods have gained ground given their use in characterizing the uncertainties inherent in soil properties. One situation engineers constantly face is the definition of a probability distribution that significantly represents the sampled data. To be able to discard poor distributions, goodness-of-fit tests are necessary. In this paper, three non-parametric goodness-of-fit tests are applied to a computationally generated data set in order to test the fit of the data to a series of known distributions. It is shown that the use of the normal distribution does not always provide satisfactory results regarding the physical and behavioral representation of the modeled parameters.
Keywords: Kolmogorov-Smirnov, Anderson-Darling, Cramer-von Mises, Nonparametric adherence tests.
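The three adherence tests named in the keywords are available in SciPy, as in the sketch below (scipy.stats.cramervonmises requires a reasonably recent SciPy); the sample is computationally generated, in the spirit of the paper, and is deliberately non-normal.

```python
# Minimal sketch: Kolmogorov-Smirnov, Cramer-von Mises and Anderson-Darling tests
# of normality applied to a computationally generated (non-normal) sample.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
sample = rng.lognormal(mean=0.0, sigma=0.5, size=300)   # deliberately non-normal

mu, sigma = sample.mean(), sample.std(ddof=1)           # parameters estimated from the data
print("Kolmogorov-Smirnov:", stats.kstest(sample, "norm", args=(mu, sigma)))
print("Cramer-von Mises  :", stats.cramervonmises(sample, "norm", args=(mu, sigma)))
print("Anderson-Darling  :", stats.anderson(sample, dist="norm"))
```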
1606 An Efficient Passive Planar Micromixer with Fin-Shaped Baffles in the Tee Channel for Wide Reynolds Number Flow Range
Authors: C. A. Cortes-Quiroz, A. Azarbadegan, E. Moeendarbary
Abstract:
A new design of a planar passive T-micromixer with fin-shaped baffles in the mixing channel is presented. The mixing efficiency and the level of pressure loss in the channel have been investigated by numerical simulations in the Reynolds number (Re) range of 1 to 50. A mixing index (Mi) has been defined to quantify the mixing efficiency; it exceeds 85% at both ends of the Re range, which demonstrates that the micromixer can enhance mixing using the mechanisms of diffusion (lower Re) and convection (higher Re). Three geometric dimensions (baffle radius, baffle pitch and channel height) define the design parameters, and the mixing index and pressure loss are the performance parameters used to optimize the micromixer geometry with a multi-criteria optimization method. The Pareto front of designs with the optimum trade-offs, maximum mixing index with minimum pressure loss, is obtained. Experiments for qualitative and quantitative validation have been carried out.
Keywords: Computational fluid dynamics, fin-shaped baffle, mixing strategies, multi-objective optimization, passive micromixer.
1605 Modelling Extreme Temperature in Malaysia Using Generalized Extreme Value Distribution
Authors: Husna Hasan, Norfatin Salam, Mohd Bakri Adam
Abstract:
Extreme temperatures at several stations in Malaysia are modelled by fitting the monthly maxima to the Generalized Extreme Value (GEV) distribution. The Mann-Kendall (MK) test suggests a non-stationary model. Two models are considered for stations with a trend, and the likelihood ratio test is used to determine the best-fitting model. Results show that half of the stations favour a model that is linear in the location parameter. The return level, the level of events (maximum temperature) expected to be exceeded once, on average, in a given number of years, is also obtained.
Keywords: Extreme temperature, extreme value, return level.
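A minimal sketch of the stationary part of this workflow, with synthetic block maxima rather than the Malaysian station data: fit a GEV with scipy.stats.genextreme (whose shape c is the negative of the usual GEV shape) and read off a T-year return level.

```python
# Minimal sketch: stationary GEV fit and a T-block return level (synthetic maxima).
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
block_max = stats.genextreme.rvs(c=-0.1, loc=33.0, scale=1.2, size=60, random_state=rng)

c, loc, scale = stats.genextreme.fit(block_max)   # note: scipy's c = -(usual GEV shape)
T = 50                                            # return period, in blocks (e.g. years)
return_level = stats.genextreme.isf(1.0 / T, c, loc=loc, scale=scale)
print(f"shape c = {c:+.2f}, location = {loc:.1f}, scale = {scale:.2f}")
print(f"{T}-block return level ~ {return_level:.1f} degrees C")
```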
1604 Influence of Propeller Blade Lift Distribution on Whirl Flutter Stability Characteristics
Authors: J. Cecrdle
Abstract:
This paper deals with the whirl flutter of turboprop aircraft structures and focuses on the influence of the blade lift span-wise distribution on whirl flutter stability. First, the overall theoretical background of the whirl flutter phenomenon is given. After that, the propeller blade force solution and the options for modelling the blade lift are described. The problem is demonstrated on the example of a twin turboprop aircraft structure. The influences with respect to the propeller aerodynamic derivatives and, finally, the influences on the whirl flutter speed and the whirl flutter margin, respectively, are evaluated.
Keywords: Aeroelasticity, flutter, propeller blade force, whirl flutter.
1603 Numerical Simulation of Electric and Hydrodynamic Fields Distribution in a Dielectric Liquids Electrofilter Cell
Authors: Narcis C. Ostahie, Tudor Sajin
Abstract:
In this paper, a numerical simulation of the electric and hydrodynamic field distributions in a dielectric liquids electrofilter cell is made. The purpose of the simulation is to determine the trajectories of particles that move under the action of external forces in the electric and hydrodynamic fields created inside an electrofilter for dielectric liquids. The particle trajectory is analyzed for a suspension of solid particles in a dielectric liquid.
Keywords: Dielectric liquids, electrohydrodynamics, energy, high voltage, particles
1602 Artificial Accelerated Ageing Test of 22 kV XLPE Cable for Distribution System Applications in Thailand
Authors: A. Rawangpai, B. Maraungsri, N. Chomnawang
Abstract:
This paper presents experimental results of an artificial ageing test of 22 kV XLPE cable for distribution system applications in Thailand. The XLPE insulating material of the 22 kV cable was sliced to 60-70 μm in thickness and was subjected to AC high voltage at 23 °C, 60 °C and 75 °C. The test voltage was applied constantly to the specimen until breakdown. The breakdown voltage and time to breakdown were used to evaluate the lifetime of the insulating material. Furthermore, the physical model by J. P. Crine for predicting the lifetime of XLPE insulating material was adopted as the lifetime model and calculated in order to compare with the experimental results. Acceptable lifetime results were obtained from Crine's model in comparison with the experimental results. In addition, Fourier transform infrared spectroscopy (FTIR) for chemical analysis and scanning electron microscopy (SEM) for physical analysis were conducted on the tested specimens.
Keywords: Artificial accelerated ageing test, XLPE cable, distribution system, insulating material, lifetime, lifetime model
1601 Prediction of Product Size Distribution of a Vertical Stirred Mill Based on Breakage Kinetics
Authors: C. R. Danielle, S. Erik, T. Patrick, M. Hugh
Abstract:
In the last decade, there has been an increase in demand for fine grinding due to the depletion of coarse-grained orebodies and an increase in the processing of finely disseminated minerals and complex orebodies. These ores have provided new challenges in concentrator design because fine and ultra-fine grinding is required to achieve acceptable recovery rates. Therefore, the correct design of a grinding circuit is important for minimizing unit costs and increasing product quality. The use of ball mills for grinding in fine size ranges is inefficient, and vertical stirred grinding mills are therefore becoming increasingly popular in the mineral processing industry due to their well-known high energy efficiency. This work presents a hypothesis for a methodology to predict the product size distribution of a vertical stirred mill using a Bond ball mill. The Population Balance Model (PBM) was used to empirically analyze the performance of a vertical mill and a Bond ball mill. The breakage parameters obtained for both grinding mills are compared to determine the possibility of predicting the product size distribution of a vertical mill based on the results obtained from the Bond ball mill. The biggest advantage of this methodology is that most mineral processing laboratories already have a Bond ball mill to perform the tests suggested in this study. Preliminary results show the possibility of predicting the performance of a laboratory vertical stirred mill using a Bond ball mill.
Keywords: Bond ball mill, population balance model, product size distribution, vertical stirred mill.
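The batch-grinding population balance model underlying the methodology can be sketched as a small ODE system; the selection function and breakage distribution below are illustrative placeholders, not parameters fitted from the Bond or stirred-mill tests.

```python
# Minimal sketch of the batch-grinding population balance model:
#   dm_i/dt = -S_i m_i + sum_{j<i} b_ij S_j m_j
# Selection (S) and breakage (b) values are illustrative placeholders only.
import numpy as np
from scipy.integrate import solve_ivp

n = 6                                             # number of size classes (coarse -> fine)
S = np.array([0.8, 0.55, 0.35, 0.2, 0.1, 0.0])    # selection function, 1/min
b = np.zeros((n, n))                              # b[i, j]: fraction of broken j reporting to i
for j in range(n - 1):
    frac = np.ones(n - j - 1)
    b[j + 1:, j] = frac / frac.sum()              # uniform redistribution, for illustration only

def rhs(t, m):
    return -S * m + b @ (S * m)

m0 = np.array([1.0, 0, 0, 0, 0, 0])               # all mass initially in the top size class
sol = solve_ivp(rhs, (0.0, 10.0), m0, t_eval=[10.0])
print("mass fractions after 10 min:", np.round(sol.y[:, -1], 3))
```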
1600 Conflation Methodology Applied to Flood Recovery
Authors: E. L. Suarez, D. E. Meeroff, Y. Yong
Abstract:
Current flood risk modeling focuses on resilience, defined as the probability of recovery from a severe flooding event. However, the long-term damage to property and well-being caused by nuisance flooding, and its long-term effects on communities, are not typically included in risk assessments. An approach was developed that combines the probability of recovering from a severe flooding event with the probability of community performance during a nuisance event. A consolidated model, namely the conflation flooding recovery (CFR) model, evaluates risk-coping mitigation strategies for communities based on the recovery time from catastrophic events, such as hurricanes or extreme surges, and from everyday nuisance flooding events. The CFR model assesses the variation contribution of each independent input and generates a weighted output that favors the distribution with minimum variation. This approach is especially useful if the input distributions have dissimilar variances. The CFR is defined as a single distribution resulting from the product of the individual probability density functions. The resulting conflated distribution lies between the parent distributions, and it infers the recovery time required by a community to return to basic functions, such as power, utilities, transportation and civil order, after a flooding event. The CFR model is more accurate than averaging individual observations before calculating the mean and variance, or averaging the probabilities evaluated at the input values, which assigns the same weighted variation to each input distribution. The main disadvantage of these traditional methods is that the resulting measure of central tendency is exactly equal to the average of the input distributions' means, without the additional information provided by each individual distribution's variance. When dealing with exponential distributions, such as resilience from severe flooding events and from nuisance flooding events, the conflation results are equivalent to the weighted least squares method or best linear unbiased estimation. The combination of severe flooding risk with nuisance flooding improves flood risk management for highly populated coastal communities, such as those in South Florida, USA, and provides a method to estimate community flood recovery time more accurately from two different sources: severe flooding events and nuisance flooding events.
Keywords: Community resilience, conflation, flood risk, nuisance flooding.
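The conflation operation itself, the normalized product of the individual densities, is easy to sketch numerically; the two exponential recovery-time distributions below are illustrative only and are not the paper's fitted inputs.

```python
# Minimal sketch: conflation as the normalized product of two densities.
import numpy as np
from scipy import stats
from scipy.integrate import trapezoid

t = np.linspace(0.0, 60.0, 4001)                   # recovery time, days (illustrative grid)
f_severe = stats.expon.pdf(t, scale=20.0)          # recovery after a severe event
f_nuisance = stats.expon.pdf(t, scale=5.0)         # recovery after nuisance flooding

product = f_severe * f_nuisance
conflated = product / trapezoid(product, t)        # normalize to a proper density

mean_recovery = trapezoid(t * conflated, t)
print(f"mean recovery time of the conflated distribution ~ {mean_recovery:.1f} days")
```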