Search results for: estimations of probability distributions
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1900

1690 Invalidation of the Start of Lunar Calendars Based on Sighting of Crescent: A Survey of 101 Years of Data between 1938 and 2038

Authors: Rafik Ouared

Abstract:

The purpose of this paper is to invalidate decisions made by the Islamic conference held in Istanbul in 2016, which defined two basic criteria to determine the start of the lunar month: (1) all criteria are based on the sighting of the crescent, be it observed or computed with modern methods, and (2) the conference strongly recommended the adoption of the principle of 'unification of sighting', by which any occurrence of sighting anywhere would be applicable everywhere. To demonstrate the invalidation of those statements, a survey of 101 years of data, from 1938 to 2038, has been analyzed to compare the probability density functions (PDFs) of the time difference between different types of fajr and the new moon. Two groups of fajr have been considered: the 'natural fajr', which is the very first fajr following the new moon, and the 'biased fajr', which is defined by human convention, inclusive of all chosen definitions. The parametric and non-parametric statistical comparisons between the different groups have shown that all the biased PDFs are significantly different from the unbiased (natural) PDF, with probability values (p-values) less than 0.001. The significance level was fixed at 0.05. Conclusion: the ongoing reference to the sighting of the crescent induces a significant bias in the definition of the lunar calendar. Therefore, a 'natural' calendar would be more applicable, requiring a more contextualized revision of the issue in fiqh.
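As an illustration of this kind of comparison, the following minimal sketch (synthetic placeholder data, not the survey data set) applies one parametric and one non-parametric two-sample test to a 'natural' and a 'biased' sample of time differences at the 0.05 significance level.

```python
# Minimal sketch (not the authors' code): comparing a "natural" and a "biased" sample of
# time differences with one parametric and one non-parametric test. The arrays below are
# synthetic placeholders; the 1938-2038 survey data are not reproduced here.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
natural_dt = rng.normal(loc=20.0, scale=6.0, size=1250)   # hours, hypothetical
biased_dt = rng.normal(loc=33.0, scale=9.0, size=1250)    # hours, hypothetical

alpha = 0.05                                               # significance level used in the paper
t_stat, p_param = stats.ttest_ind(natural_dt, biased_dt, equal_var=False)  # parametric (Welch)
ks_stat, p_nonparam = stats.ks_2samp(natural_dt, biased_dt)                # non-parametric (KS)

for name, p in [("Welch t-test", p_param), ("Kolmogorov-Smirnov", p_nonparam)]:
    verdict = "significantly different" if p < alpha else "not significantly different"
    print(f"{name}: p = {p:.3g} -> distributions are {verdict}")
```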

Keywords: biased fajr, lunar calendar, natural fajr, probability density function, sighting of crescent, time difference between fajr and new moon

Procedia PDF Downloads 183
1689 An Analysis of the Effect of Sharia Financing and Work Relation Founding towards Non-Performing Financing in Islamic Banks in Indonesia

Authors: Muhammad Bahrul Ilmi

Abstract:

The purpose of this research is to analyze the influence of Islamic financing and work relation founding, simultaneously and partially, on non-performing financing in Islamic banks. This research was quantitative regression field research and was conducted at Muammalat Indonesia Bank and Islamic Danamon Bank over 3 months. The population of this research was 15 account officers of Muammalat Indonesia Bank and Islamic Danamon Bank in Surakarta, Indonesia. The data collection techniques used in this research were documentation, questionnaires, literary study and interviews. The regression analysis results show that Islamic financing and work relation founding simultaneously have a positive and significant effect on the non-performing financing of the two Islamic banks. This is obtained with a probability value of 0.003, which is less than 0.05, and an F value of 9.584. The regression analysis of Islamic financing on non-performing financing shows a significant effect, supported by multiple linear regression analysis with a probability value of 0.001, which is less than 0.05. The regression analysis of the effect of work relation founding on non-performing financing shows an insignificant effect, with a probability value of 0.161, which is greater than 0.05.
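A minimal sketch of this kind of test (synthetic data and hypothetical coefficients, not the authors' survey data), reporting the simultaneous F test and the partial p-values:

```python
# Minimal sketch (assumed workflow, synthetic data): a multiple linear regression of
# non-performing financing (NPF) on Islamic financing and work relation founding,
# reporting the F statistic and per-coefficient p-values as described in the abstract.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 15  # 15 account officers surveyed
df = pd.DataFrame({
    "islamic_financing": rng.normal(50, 10, n),
    "work_relation_founding": rng.normal(30, 5, n),
})
df["npf"] = 0.4 * df["islamic_financing"] + 0.05 * df["work_relation_founding"] + rng.normal(0, 3, n)

X = sm.add_constant(df[["islamic_financing", "work_relation_founding"]])
model = sm.OLS(df["npf"], X).fit()

print("F =", round(model.fvalue, 3), "p(F) =", round(model.f_pvalue, 3))  # simultaneous effect
print(model.pvalues.round(3))                                             # partial effects
```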

Keywords: Syariah financing, work relation founding, non-performing financing (NPF), Islamic Bank

Procedia PDF Downloads 404
1688 Finite Difference Based Probabilistic Analysis to Evaluate the Impact of Correlation Length on Long-Term Settlement of Soft Soils

Authors: Mehrnaz Alibeikloo, Hadi Khabbaz, Behzad Fatahi

Abstract:

Probabilistic analysis has become one of the most popular methods to quantify and manage geotechnical risks arising from the spatial variability of soil input parameters. The correlation length is one of the key factors in quantifying the spatial variability of soil parameters and is defined as the distance within which the random variables are strongly correlated. This paper aims to assess the impact of correlation length on the long-term settlement of soft soils improved with preloading. The concept of a 'worst-case' spatial correlation length was evaluated by determining the probability of failure for a real case study, the Vasby test fill. For this purpose, a finite difference code was developed based on axisymmetric consolidation equations incorporating a non-linear elastic visco-plastic model and the Karhunen-Loeve expansion method. The results show that the correlation length has a significant impact on the post-construction settlement of soft soils, in such a way that, as the correlation length increases, the probability of failure increases and approaches an asymptote.
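The random-field input to such an analysis can be sketched as below: a minimal Karhunen-Loeve sample generator for a 1D lognormal field with an assumed exponential covariance. The grid, mean and coefficient of variation are illustrative, not the Vasby values, and the finite difference consolidation solver is not reproduced.

```python
# Minimal sketch (not the authors' code): realisations of a 1D stationary lognormal random
# field via a discrete Karhunen-Loeve expansion with an exponential covariance, to
# illustrate the role of the correlation length.
import numpy as np

def kl_field(z, mean, cov_coeff, corr_length, n_terms, rng):
    """Return one realisation of a lognormal field on depths z (all inputs hypothetical)."""
    cov = cov_coeff**2 * np.exp(-np.abs(z[:, None] - z[None, :]) / corr_length)
    eigval, eigvec = np.linalg.eigh(cov)                     # discrete KL eigenpairs
    idx = np.argsort(eigval)[::-1][:n_terms]                 # keep the dominant modes
    xi = rng.standard_normal(n_terms)                        # standard normal KL variables
    gauss = eigvec[:, idx] @ (np.sqrt(eigval[idx]) * xi)     # truncated KL sum
    return mean * np.exp(gauss - 0.5 * cov_coeff**2)         # approximate lognormal transform

z = np.linspace(0.0, 10.0, 200)                              # depth grid [m], hypothetical
rng = np.random.default_rng(2)
for theta in (0.5, 2.0, 8.0):                                # candidate correlation lengths [m]
    field = kl_field(z, mean=1.0e-3, cov_coeff=0.3, corr_length=theta, n_terms=20, rng=rng)
    print(f"correlation length {theta} m -> field std {field.std():.2e}")
```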

Keywords: Karhunen-Loeve expansion, probability of failure, soft soil settlement, 'worst case' spatial correlation length

Procedia PDF Downloads 136
1687 A Knowledge-Based Development of Risk Management Approaches for Construction Projects

Authors: Masoud Ghahvechi Pour

Abstract:

Risk management is a systematic and regular process of identifying, analyzing and responding to risks throughout the project's life cycle in order to achieve the optimal level of elimination, reduction or control of risk. The purpose of project risk management is to increase the probability and effect of positive events and to reduce the probability and effect of unpleasant events on the project. Risk management is one of the most fundamental parts of project management, such that unmanaged or untransferred risks can be a primary factor of failure in a project. Effective risk management does not mean simply avoiding risk, which is apparently the cheapest option. The main problem with this option is its economic implication: what is potentially profitable is by definition risky, while what does not pose a risk is not economically interesting and does not bring tangible benefits. Therefore, in relation to the implemented project, effective risk management means finding a 'middle ground': on the one hand, protection against risk from the negative direction by means of accurate identification and classification of risk, which leads to a comprehensive analysis; on the other hand, management, using all mathematical and analytical tools, based on verifying the maximum benefit of these decisions. A detailed analysis, taking into account all aspects of the company, including stakeholder analysis, allows effective risk management to capture what will become tangible benefits for the project in the future. Project risk identification is based on determining which types of risk may affect the project, and also refers to specific parameters and the estimation of the probability of their occurrence in the project. These conditions can be divided into three groups: certainty, uncertainty, and risk, which in turn correspond to three attitudes toward investment: risk preference, risk neutrality and risk aversion, and their measurement. The result of risk identification and project analysis is a list of events that indicates the cause and probability of each event, together with a final assessment of its impact on the environment.

Keywords: risk, management, knowledge, risk management

Procedia PDF Downloads 29
1686 Congestion Control in Mobile Network by Prioritizing Handoff Calls

Authors: O. A. Lawal, O. A. Ojesanmi

Abstract:

The demand for wireless cellular services continues to increase while the radio resources remain limited. Thus, network operators have to continuously manage the scarce radio resources in order to provide an improved quality of service to mobile users. This paper proposes a way to handle the problem of congestion in the mobile network by prioritizing handoff calls, using the guard channel allocation scheme. The research uses a specific threshold value for the time of allocation of the channel in the algorithm. The scheme is simulated by generating various data for different traffic loads in the network, as would occur in real life. The results are used to determine the probability of handoff call dropping and the probability of new call blocking as a way of measuring network performance.
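A minimal sketch of the underlying cutoff-priority guard channel model follows (an assumed birth-death steady state with illustrative traffic parameters; the paper's simulator and its time-based threshold rule may differ):

```python
# Minimal sketch (assumed model, not the authors' simulator): steady-state blocking and
# dropping probabilities for a cutoff-priority guard channel scheme, where new calls are
# admitted only while fewer than C - g channels are busy and handoff calls may use all C.
import numpy as np

def guard_channel_probs(lam_new, lam_handoff, mu, channels, guard):
    threshold = channels - guard
    # unnormalised state probabilities of the birth-death chain
    p = [1.0]
    for n in range(channels):
        arrival = lam_new + lam_handoff if n < threshold else lam_handoff
        p.append(p[-1] * arrival / ((n + 1) * mu))
    p = np.array(p) / np.sum(p)
    p_block_new = p[threshold:].sum()     # new call blocked when >= threshold channels busy
    p_drop_handoff = p[channels]          # handoff dropped only when all channels busy
    return p_block_new, p_drop_handoff

# illustrative traffic: calls/s arrival rates, 120 s mean holding time, 30 channels, 3 guards
pb, pdrop = guard_channel_probs(lam_new=0.5, lam_handoff=0.2, mu=1/120, channels=30, guard=3)
print(f"new call blocking ~ {pb:.4f}, handoff dropping ~ {pdrop:.4f}")
```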

Keywords: call block, channel, handoff, mobile cellular network

Procedia PDF Downloads 366
1685 Comparison of Wind Fragility for Window System in the Simplified 10 and 15-Story Building Considering Exposure Category

Authors: Viriyavudh Sim, WooYoung Jung

Abstract:

Window systems in high-rise buildings are occasionally subjected to excessive wind intensity, particularly during typhoons. The failure of a window system does not affect the overall safety of the structural performance; however, it can endanger the safety of the residents. In this paper, fragility curves for the window systems of two residential buildings were compared. The probability of failure of each individual window was determined with the Monte Carlo simulation method. Then, a lognormal cumulative distribution function was used to represent the fragility. The results showed that windows located on the edge of the leeward wall were more susceptible to wind load and that the probability of failure of each window panel increased at higher floors.
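A minimal sketch of this workflow (assumed panel capacity and wind load models with illustrative parameters, not the study's values): Monte Carlo failure probabilities at a set of wind speeds, followed by a lognormal CDF fit to obtain the fragility curve.

```python
# Minimal sketch (hypothetical capacities and load model): Monte Carlo estimation of window
# panel failure probability versus wind speed, then a lognormal CDF fitted to the points
# to obtain a fragility curve.
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(3)
speeds = np.arange(20, 81, 5)                 # mean wind speeds [m/s], hypothetical
n_sim = 20000

def failure_probability(v):
    resistance = rng.lognormal(mean=np.log(3.0), sigma=0.2, size=n_sim)                 # capacity [kPa]
    pressure = 0.5 * 1.225 * (v * rng.normal(1.0, 0.1, n_sim))**2 * 1.2 / 1000.0        # wind load [kPa]
    return np.mean(pressure > resistance)

pf = np.array([failure_probability(v) for v in speeds])

def lognormal_cdf(v, median, beta):
    return stats.norm.cdf(np.log(v / median) / beta)

(median, beta), _ = optimize.curve_fit(lognormal_cdf, speeds, pf, p0=(60.0, 0.3))
print(f"fragility: median capacity ~ {median:.1f} m/s, dispersion beta ~ {beta:.2f}")
```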

Keywords: wind fragility, window system, high rise building, wind disaster

Procedia PDF Downloads 292
1684 Assimilating Multi-Mission Satellites Data into a Hydrological Model

Authors: Mehdi Khaki, Ehsan Forootan, Joseph Awange, Michael Kuhn

Abstract:

Terrestrial water storage, as a source of freshwater, plays an important role in human lives. Hydrological models offer important tools for simulating and predicting water storage at global and regional scales. However, their comparisons with 'reality' are imperfect, mainly due to a high level of uncertainty in input data and limitations in accounting for all complex water cycle processes, uncertainties in (unknown) empirical model parameters, as well as the absence of high resolution (both spatial and temporal) data. Data assimilation can mitigate this drawback by incorporating new sets of observations into models. In this effort, we use multi-mission satellite-derived remotely sensed observations to improve the performance of the World-Wide Water Resources Assessment (W3RA) hydrological model for estimating terrestrial water storage. For this purpose, we assimilate total water storage (TWS) data from the Gravity Recovery And Climate Experiment (GRACE) and surface soil moisture data from the Advanced Microwave Scanning Radiometer for the Earth Observing System (AMSR-E) into W3RA. This is done to (i) improve model estimates of water stored in the ground and soil moisture, and (ii) assess the impact of each satellite data set (from GRACE and AMSR-E), and of their combination, on the final terrestrial water storage estimates. These data are assimilated into W3RA using the Ensemble Square-Root Filter (EnSRF) technique over the Mississippi Basin (United States) and the Murray-Darling Basin (Australia) between 2002 and 2013. In order to evaluate the results, independent ground-based groundwater and soil moisture measurements within each basin are used.
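The assimilation step can be illustrated with a toy serial EnSRF update (a two-variable state and a single scalar observation with made-up numbers; the W3RA state, observation operators and error statistics are of course far richer):

```python
# Minimal sketch (toy state, not the W3RA configuration): a serial ensemble square-root
# filter (EnSRF) update of an ensemble of state vectors with one scalar observation,
# illustrating the assimilation of a TWS-like measurement.
import numpy as np

def ensrf_update(ens, h, y_obs, r_obs):
    """ens: (n_state, n_members); h: observation operator row vector; y_obs: scalar obs."""
    x_mean = ens.mean(axis=1, keepdims=True)
    x_pert = ens - x_mean
    hx = h @ ens                                              # ensemble of simulated observations
    hx_mean, hx_pert = hx.mean(), hx - hx.mean()
    n = ens.shape[1]
    var_hx = hx_pert @ hx_pert / (n - 1)
    cov_x_hx = x_pert @ hx_pert / (n - 1)
    k_gain = cov_x_hx[:, None] / (var_hx + r_obs)             # Kalman gain
    alpha = 1.0 / (1.0 + np.sqrt(r_obs / (var_hx + r_obs)))   # EnSRF perturbation factor
    x_mean_new = x_mean + k_gain * (y_obs - hx_mean)          # update the ensemble mean
    x_pert_new = x_pert - alpha * k_gain * hx_pert[None, :]   # deterministic square-root update
    return x_mean_new + x_pert_new

rng = np.random.default_rng(4)
ensemble = rng.normal([[100.0], [0.25]], [[15.0], [0.05]], size=(2, 30))  # [TWS mm, soil moisture]
h = np.array([1.0, 0.0])                                                  # observe TWS only
updated = ensrf_update(ensemble, h, y_obs=120.0, r_obs=25.0)
print("prior mean:", ensemble.mean(axis=1), "posterior mean:", updated.mean(axis=1))
```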

Keywords: data assimilation, GRACE, AMSR-E, hydrological model, EnSRF

Procedia PDF Downloads 253
1683 Sufficient Conditions for Exponential Stability of Stochastic Differential Equations with Non Trivial Solutions

Authors: Fakhreddin Abedi, Wah June Leong

Abstract:

Exponential stability of stochastic differential equations with non-trivial solutions is established in terms of Lyapunov functions. The main result of this paper shows that, under certain hypotheses on the dynamics f(.) and g(.), practical exponential stability in probability in a small neighborhood of the origin is equivalent to the existence of an appropriate Lyapunov function. Indeed, we establish exponential stability of the stochastic differential equation when almost all state trajectories are bounded and approach a sufficiently small neighborhood of the origin. We derive sufficient conditions for exponential stability of stochastic differential equations. Finally, we give a numerical example illustrating our results.
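For orientation, a standard form of the Lyapunov-type condition used in this setting is sketched below (the paper's exact hypotheses on f and g may differ); the generator follows from Ito's formula.

```latex
% A standard stochastic Lyapunov setting (for orientation; not the paper's exact hypotheses).
% For the SDE  dx = f(x) dt + g(x) dw,  Ito's formula gives the generator acting on V:
\[
  \mathcal{L}V(x) = \frac{\partial V}{\partial x}(x)\, f(x)
  + \frac{1}{2}\,\operatorname{tr}\!\left[ g(x)^{\top}\, \frac{\partial^{2} V}{\partial x^{2}}(x)\, g(x) \right].
\]
% Practical exponential stability in probability toward a small neighborhood of the origin is
% typically obtained from bounds of the form
\[
  k_{1}\lVert x\rVert^{p} \le V(x) \le k_{2}\lVert x\rVert^{p},
  \qquad
  \mathcal{L}V(x) \le -c\,V(x) + \varepsilon ,
\]
% with constants k_1, k_2, c > 0, p > 0 and a small epsilon >= 0.
```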

Keywords: exponential stability in probability, stochastic differential equations, Lyapunov technique, Ito's formula

Procedia PDF Downloads 21
1682 Existence of Systemic Risk in Turkish Banking Sector: An Evidence from Return Distributions

Authors: İlhami Karahanoglu, Oguz Ceylan

Abstract:

By its well-known definition, systemic risk refers to a downturn of the whole economic system, or even its collapse in very severe cases. In fact, it points to the contagion effects of defaults. Such a risk can be depicted with the famous Chinese game of falling domino stones. During and after the Bear Stearns and Lehman Brothers cases, it was well understood that there is a very strong effect of systemic risk in the financial services sector. In this study, we concentrate on the existence of systemic risk in the Turkish banking sector based upon the Halkbank case: at the end of 2013, there was political turmoil in Turkey in which close relatives of senior politicians were involved in illegal trading activities. In that operation, the CEO of Halkbank was also arrested and, in the investigation, Halkbank was considered part of such illegal actions. That operation had an impact on Halkbank's stock value, which decreased remarkably during that time interval; the distributional profile of its stock returns changed, becoming more volatile as well as more skewed. In this study, the daily returns of 5 leading banks in the Turkish banking sector were used to obtain 48 return distributions for the 5 banks (for each month, the previous 90 days of stock returns are used) for the periods 12/2011-12/2013 (pre-operation) and 12/2013-12/2015 (post-operation). When those distributions are compared over time, interestingly, the other leading banks in Turkey, public or private, also had distribution profiles that differed from the 2011-2013 period, just like Halkbank. Those big banks, whose stock values are monitored with a sub-index of the Istanbul stock exchange (BIST) as BN10, had more skewed distributions closely following the Halkbank stock return movement during the post-operation period, with lower mean values as well as higher volatility. In addition, the correlation between the stock return distributions of the leading banks increased after the Halkbank case, where the returns are more skewed to the left (measured on a monthly basis before and after the operation). The dependence between those banks was stronger when stock values were falling, compared with normal market conditions. Such a distributional effect of stock returns among the leading banks in Turkey, which holds under a falling sub-market (financial/banking sector) condition, can be evaluated as evidence for the existence of a contagion effect and systemic risk.
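A minimal sketch of this rolling-window comparison (synthetic prices and hypothetical tickers, not BIST data; the event date is used only as the split point):

```python
# Minimal sketch (synthetic data): building 90-day rolling return windows month by month and
# comparing skewness, volatility and pairwise correlation before and after an event date,
# in the spirit of the pre-/post-operation comparison described in the abstract.
import numpy as np
import pandas as pd

rng = np.random.default_rng(5)
dates = pd.bdate_range("2011-12-01", "2015-12-31")
prices = pd.DataFrame(
    100 * np.exp(np.cumsum(rng.normal(0.0003, 0.02, size=(len(dates), 5)), axis=0)),
    index=dates, columns=["HALKB", "BANK2", "BANK3", "BANK4", "BANK5"],  # tickers hypothetical
)
returns = np.log(prices).diff().dropna()

month_ends = returns.index.to_series().groupby(returns.index.to_period("M")).max()
rows = []
for end in month_ends:
    window = returns.loc[:end].tail(90)          # 90-days-back window for this month
    if len(window) < 90:
        continue
    rows.append({"month": end,
                 "skew": window["HALKB"].skew(),
                 "vol": window["HALKB"].std(),
                 "mean_corr": window.corr().values[np.triu_indices(5, 1)].mean()})
summary = pd.DataFrame(rows).set_index("month")

event = pd.Timestamp("2013-12-17")               # operation date used only as the split point
print(summary[summary.index < event].mean())     # pre-operation averages
print(summary[summary.index >= event].mean())    # post-operation averages
```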

Keywords: financial risk, systemic risk, banking sector, return distribution, dependency structure

Procedia PDF Downloads 264
1681 On the Optimality Assessment of Nano-Particle Size Spectrometry and Its Association to the Entropy Concept

Authors: A. Shaygani, R. Saifi, M. S. Saidi, M. Sani

Abstract:

Particle size distribution, the most important characteristic of aerosols, is obtained through electrical characterization techniques. The dynamics of charged nano-particles under the influence of the electric field in an electrical mobility spectrometer (EMS) reveals the size distribution of these particles. The accuracy of this measurement is influenced by the flow conditions, geometry, electric field and particle charging process, and therefore by the transfer function (transfer matrix) of the instrument. In this work, a wire-cylinder corona charger was designed and the combined field-diffusion charging process of injected poly-disperse aerosol particles was numerically simulated as a prerequisite for the study of a multi-channel EMS. The result, a cloud of particles with non-uniform charge distribution, was introduced to the EMS. The flow pattern and electric field in the EMS were simulated using computational fluid dynamics (CFD) to obtain particle trajectories in the device and therefore to calculate the signal reported by each electrometer. According to the output signals (resulting from the bombardment of particles and the transfer of their charges as currents), we proposed a modification to the size of the detecting rings (which are connected to electrometers) in order to evaluate particle size distributions more accurately. Based on the capability of the system to transfer information content about the size distribution of the injected particles, we proposed a benchmark for the assessment of the optimality of the design. This method applies the concept of von Neumann entropy and borrows the definition of entropy from information theory (Shannon entropy) to measure optimality. Entropy, in the Shannon sense, is the 'average amount of information contained in an event, sample or character extracted from a data stream'. Evaluating the responses (signals) obtained via various configurations of detecting rings, the configuration which gave the best predictions of the size distributions of injected particles was the modified configuration. It was also the one that had the maximum amount of entropy. A reasonable consistency was also observed between the accuracy of the predictions and the entropy content of each configuration. In this method, entropy is extracted from the transfer matrix of the instrument for each configuration. Ultimately, various clouds of particles were introduced to the simulations and predicted size distributions were compared to the exact size distributions.
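One plausible reading of 'entropy extracted from the transfer matrix' is a von Neumann-style entropy of the matrix spectrum, sketched below with two illustrative 3x3 matrices (the actual EMS transfer matrices and the paper's exact entropy definition may differ); a better-conditioned configuration scores higher.

```python
# Minimal sketch (illustrative matrices, one possible entropy definition): scoring two
# detecting-ring configurations by the entropy of the normalised spectrum of their transfer
# matrices, the higher-entropy configuration carrying more information about particle size.
import numpy as np

def spectral_entropy(transfer_matrix):
    s = np.linalg.svd(np.asarray(transfer_matrix, dtype=float), compute_uv=False)
    p = s**2 / np.sum(s**2)              # normalised spectrum plays the role of eigenvalues
    p = p[p > 1e-12]                     # drop numerically empty modes
    return -np.sum(p * np.log2(p))       # bits

# rows: electrometer channels, columns: particle size bins (values are hypothetical)
config_original = np.array([[8, 7, 6], [7, 8, 7], [6, 7, 8]])   # channels respond alike
config_modified = np.array([[9, 2, 1], [2, 9, 2], [1, 2, 9]])   # channels better separated

for name, m in [("original rings", config_original), ("modified rings", config_modified)]:
    print(f"{name}: H = {spectral_entropy(m):.3f} bits")
```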

Keywords: aerosol nano-particle, CFD, electrical mobility spectrometer, von Neumann entropy

Procedia PDF Downloads 312
1680 FEM Models of Glued Laminated Timber Beams Enhanced by Bayesian Updating of Elastic Moduli

Authors: L. Melzerová, T. Janda, M. Šejnoha, J. Šejnoha

Abstract:

Two finite element (FEM) models are presented in this paper to address the random nature of the response of glued timber structures made of wood segments with variable elastic moduli evaluated from 3600 indentation measurements. This database served to create the same number of ensembles as there were segments in the tested beam. Statistics of these ensembles were then assigned to the given segments of the beams, and the Latin Hypercube Sampling (LHS) method was used to perform 100 simulations, resulting in an ensemble of 100 deflections subjected to statistical evaluation. Here, a detailed geometrical arrangement of the individual segments in the laminated beam was considered in the construction of a two-dimensional FEM model subjected to four-point bending to comply with the laboratory tests. Since laboratory measurements of local elastic moduli may in general suffer from significant experimental error, it appears advantageous to exploit the full-scale measurements of the timber beams, i.e. deflections, to improve their prior distributions with the help of the Bayesian statistical method. This, however, requires an efficient computational model when simulating the laboratory tests numerically. To this end, a simplified model based on Mindlin's beam theory was established. The improved posterior distributions show that the most significant change in the Young's modulus distribution takes place in the laminae in the most strained zones, i.e. in the top and bottom layers within the beam center region. Posterior distributions of the moduli of elasticity were subsequently utilized in the 2D FEM model and compared with the original simulations.
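The updating idea can be reduced to a one-parameter sketch (a single effective modulus, a textbook four-point-bending deflection formula as the forward model, and made-up prior, load and measurement values; the paper updates lamina-wise moduli through its FEM and Mindlin-beam models):

```python
# Minimal sketch (simplified one-parameter version, not the authors' multi-lamina model):
# grid-based Bayesian updating of a Young's modulus prior using a measured mid-span
# deflection and a simple four-point-bending formula as the forward model.
import numpy as np

def deflection(E, load=10e3, a=0.9, L=2.7, I=1.8e-4):
    """Mid-span deflection of a simply supported beam under two point loads (rough model)."""
    return load * a * (3 * L**2 - 4 * a**2) / (24 * E * I)

E_grid = np.linspace(8e9, 16e9, 801)                       # candidate moduli [Pa]
prior = np.exp(-0.5 * ((E_grid - 11.5e9) / 1.5e9) ** 2)    # prior from indentation (hypothetical)

w_measured = 3.6e-3                                        # measured deflection [m], hypothetical
sigma_w = 0.2e-3                                           # measurement error std [m]
likelihood = np.exp(-0.5 * ((deflection(E_grid) - w_measured) / sigma_w) ** 2)

posterior = prior * likelihood
posterior /= posterior.sum()                               # normalise on the grid
E_map = E_grid[np.argmax(posterior)]
E_mean = np.sum(E_grid * posterior)
print(f"posterior mode {E_map/1e9:.2f} GPa, posterior mean {E_mean/1e9:.2f} GPa")
```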

Keywords: Bayesian inference, FEM, four point bending test, laminated timber, parameter estimation, prior and posterior distribution, Young’s modulus

Procedia PDF Downloads 254
1679 Investigations of Flow Field with Different Turbulence Models on NREL Phase VI Blade

Authors: T. Y. Liu, C. H. Lin, Y. M. Ferng

Abstract:

Wind energy is one of the clean renewable energy sources. However, the low-frequency (20-200 Hz) noise generated by wind turbine blades, which bothers nearby residents, has become a major problem to be addressed. Analysis of the flow field and pressure distribution on the wind turbine blades is useful for predicting the aerodynamic noise. Therefore, the main objective of this study is to use different turbulence models to analyse the flow field and pressure distributions on the blades. Three-dimensional Computational Fluid Dynamics (CFD) simulation of the flow field was used to calculate the flow phenomena for the National Renewable Energy Laboratory (NREL) Phase VI horizontal axis wind turbine rotor. Two flow cases with different wind speeds were investigated: 7 m/s at 72 rpm and 15 m/s at 72 rpm. Four kinds of RANS-based turbulence models, Standard k-ε, Realizable k-ε, SST k-ω, and v2f, were used to predict and analyse the results in the present work. The results show that the pressure distributions predicted with the SST k-ω and v2f turbulence models are in good agreement with the experimental data.

Keywords: horizontal axis wind turbine, turbulence model, noise, fluid dynamics

Procedia PDF Downloads 236
1678 Flexural Fatigue Performance of Self-Compacting Fibre Reinforced Concrete

Authors: Surinder Pal Singh, Sanjay Goel

Abstract:

The paper presents the results of an investigation conducted to study the flexural fatigue characteristics of Self-Compacting Concrete (SCC) and Self-Compacting Fibre Reinforced Concrete (SCFRC). In total, 360 flexural fatigue tests and 270 static flexural strength tests were conducted on SCC and SCFRC specimens to obtain the fatigue test data. The variability in the distribution of fatigue life of SCC and SCFRC has been analyzed and compared with that of NVC and NVFRC containing steel fibres of comparable size and shape. The experimental coefficients of the fatigue equations have been estimated to represent the relationship between stress level (S) and fatigue life (N) for SCC and SCFRC containing different fibre volume fractions. The probability of failure (Pf) has been incorporated into the S-N relationships to obtain families of S-N-Pf relationships. A good agreement between the predicted curves and those obtained from the test data has been observed. The fatigue performance of SCC and SCFRC has been evaluated in terms of the two-million-cycle fatigue strength/endurance limit. The theoretical fatigue lives were also estimated using the single-log fatigue equation for a 10% probability of failure, to quantify the extent to which the theoretical fatigue lives of SCFRC are enhanced with reference to SCC and NVC. The reduction in variability of the fatigue life, the increased endurance limit and the increased theoretical fatigue lives demonstrate an overall better fatigue performance for SCC and SCFRC.

Keywords: fatigue life, fibre, probability of failure, self-compacting concrete

Procedia PDF Downloads 329
1677 Mechanical-Reliability Coupling for a Bearing Capacity Assessment of Shallow Foundations

Authors: Amal Hentati, Mbarka Selmi, Tarek Kormi, Julien Baroth, Barthelemy Harthong

Abstract:

The impact of uncertainties on the performance assessment of shallow foundations is often significant. The need of geotechnical engineers for a more objective and rigorous description of soil variations, permitting these uncertainties to be quantified and incorporated into calculation methods, led to the development of reliability approaches. In this context, a mechanical-reliability coupling was developed in this paper, using a program coded in Matlab and the finite element software Abaqus, for the bearing capacity assessment of shallow foundations. The reliability analysis, based on the finite element method, treated both soil cohesion and friction angle as uncertain parameters characterized by normal or lognormal probability distributions. The inherent spatial variability of both soil properties was then taken into account using 1D stationary random fields. The application of the proposed methodology to a shallow foundation subjected to a centered vertical loading highlighted the interest of the proposed process. The findings showed the insufficiency of the conventional approach in predicting foundation failure, and a high sensitivity of the ultimate loads to the uncertainties in soil properties, mainly those related to the friction angle, was noted. Moreover, an asymmetry of both the displacement and velocity fields was obtained.

Keywords: mechanical-reliability coupling, finite element method, shallow foundation, random fields, spatial variability

Procedia PDF Downloads 637
1676 Wind Fragility of Window Glass in 10-Story Apartment with Two Different Window Models

Authors: Viriyavudh Sim, WooYoung Jung

Abstract:

Damage due to high wind is not limited to load-resisting components such as beams and columns. The majority of damage is due to breaches in the building envelope, such as broken roofs, windows, and doors. In this paper, the wind fragility of window glass in a residential apartment was determined to compare two window configuration models. The Monte Carlo simulation method was used to derive damage data, and analytical fragilities were constructed. The fragility of the window system showed that windows located in the leeward wall had a higher probability of failure, especially those close to the edge of the structure. Between the two window models, Model 2 had a higher probability of failure; this was due to the number of panels in this configuration.

Keywords: wind fragility, glass window, high rise building, wind disaster

Procedia PDF Downloads 234
1675 Robust Noisy Speech Identification Using Frame Classifier Derived Features

Authors: Punnoose A. K.

Abstract:

This paper presents an approach for identifying noisy speech recordings using a multi-layer perceptron (MLP) trained to predict phonemes from acoustic features. Characteristics of the MLP posteriors are explored for clean speech and noisy speech at the frame level. Appropriate density functions are used to fit the softmax probabilities of the clean and noisy speech. A function that takes into account the ratio of the softmax probability density of noisy speech to that of clean speech is formulated. This phoneme-independent scoring is weighted using phoneme-specific weights to make the scoring more robust. Simple thresholding is used to separate the noisy speech recordings from the clean speech recordings. The approach is benchmarked on standard databases, with a focus on precision.
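A minimal sketch of such a scoring function follows (hypothetical Beta densities for the softmax confidences, made-up phoneme weights and a zero threshold; the paper's fitted densities and weighting scheme are not reproduced):

```python
# Minimal sketch (hypothetical densities and weights, not the trained system): turning
# frame-level MLP softmax posteriors into a noisy/clean score via the ratio of fitted
# densities, weighting per phoneme, and thresholding the recording-level average.
import numpy as np
from scipy import stats

def recording_score(posteriors, phoneme_ids, clean_beta_params, noisy_beta_params, weights):
    """posteriors: max softmax probability per frame; phoneme_ids: predicted phoneme per frame."""
    scores = []
    for p, ph in zip(posteriors, phoneme_ids):
        f_noisy = stats.beta.pdf(p, *noisy_beta_params[ph])   # density under the noisy model
        f_clean = stats.beta.pdf(p, *clean_beta_params[ph])   # density under the clean model
        scores.append(weights[ph] * np.log((f_noisy + 1e-12) / (f_clean + 1e-12)))
    return np.mean(scores)

# toy setup: 2 phoneme classes with Beta-distributed softmax confidences (assumed shapes)
clean_params = {0: (8.0, 2.0), 1: (9.0, 2.0)}   # clean speech -> confident posteriors
noisy_params = {0: (3.0, 3.0), 1: (2.5, 3.0)}   # noisy speech -> flatter posteriors
weights = {0: 1.0, 1: 0.7}

rng = np.random.default_rng(9)
frames = rng.beta(3.0, 3.0, 200)                 # pretend these frames come from a noisy file
phones = rng.integers(0, 2, 200)
score = recording_score(frames, phones, clean_params, noisy_params, weights)
print("noisy" if score > 0.0 else "clean", f"(score = {score:.2f})")
```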

Keywords: noisy speech identification, speech pre-processing, noise robustness, feature engineering

Procedia PDF Downloads 95
1674 Optimal Scheduling for Energy Storage System Considering Reliability Constraints

Authors: Wook-Won Kim, Je-Seok Shin, Jin-O Kim

Abstract:

This paper proposes a method for the optimal scheduling of a battery energy storage system under a reliability constraint on the energy storage system. The optimal scheduling problem is solved by dynamic programming with a proposed transition matrix. The proposed optimal scheduling method guarantees the minimum fuel cost within a specified reliability constraint. To evaluate the proposed method, a time-dependent capacity outage probability table (COPT) is used, which is calculated by convolution of the probability mass functions of the individual generators. This study shows the resulting optimal schedule of the energy storage system.
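The COPT building block can be sketched as follows (hypothetical two-state generators with illustrative capacities and forced outage rates; the paper's system data and the dynamic programming layer are not reproduced):

```python
# Minimal sketch (hypothetical generator set): building a capacity outage probability table
# (COPT) by convolving two-state generator outage models, the reliability input used by the
# dynamic programming scheduler described in the abstract.
import numpy as np

def build_copt(generators):
    """generators: list of (capacity_MW, forced_outage_rate). Returns (outage_MW, probability)."""
    total_cap = sum(cap for cap, _ in generators)
    pmf = np.zeros(total_cap + 1)
    pmf[0] = 1.0                                   # start: zero capacity on outage
    for cap, forced_outage_rate in generators:
        unit = np.zeros(cap + 1)
        unit[0] = 1.0 - forced_outage_rate         # unit available
        unit[cap] = forced_outage_rate             # unit on outage -> cap MW lost
        pmf = np.convolve(pmf, unit)[: total_cap + 1]
    return np.arange(total_cap + 1), pmf

outage_mw, prob = build_copt([(200, 0.05), (150, 0.04), (100, 0.08), (100, 0.08)])
cum = np.cumsum(prob[::-1])[::-1]                  # P(outage >= X MW), the classical COPT column
for x in (0, 100, 200, 300):
    print(f"P(outage >= {x:>3d} MW) = {cum[x]:.4f}")
```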

Keywords: energy storage system (ESS), optimal scheduling, dynamic programming, reliability constraints

Procedia PDF Downloads 376
1673 Distances over Incomplete Diabetes and Breast Cancer Data Based on Bhattacharyya Distance

Authors: Loai AbdAllah, Mahmoud Kaiyal

Abstract:

Missing values in real-world datasets are a common problem. Many algorithms have been developed to deal with this problem; most of them replace the missing values with a fixed value computed from the observed values. In our work, we used a distance function based on the Bhattacharyya distance, which measures the similarity of two probability distributions, to measure the distance between objects with missing values. The proposed distance distinguishes between known and unknown values: the distance between two known values is the Mahalanobis distance, whereas, when one of them is missing, the distance is computed based on the distribution of the known values for the coordinate that contains the missing value. This method was integrated with Wikaya, a digital health company developing a platform that helps to improve the prevention of chronic diseases such as diabetes and cancer. In order for Wikaya's recommendation system to work, distances between users need to be measured. Since there are missing values in the collected data, there is a need to develop a distance function for incomplete user profiles. To evaluate the accuracy of the proposed distance function in reflecting the actual similarity between objects when some of them contain missing values, we integrated it within the framework of the k-nearest neighbors (kNN) classifier, since its computation is based only on the similarity between objects. To validate this, we ran the algorithm over the diabetes and breast cancer datasets, standard benchmark datasets from the UCI repository. Our experiments show that the kNN classifier using our proposed distance function outperforms kNN using other existing methods.
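One simplified reading of this idea is sketched below (a variance-scaled per-coordinate term when both values are known, and an expectation over the observed column distribution when one is missing, fed into kNN; the paper's exact Bhattacharyya/Mahalanobis formulation and the UCI experiments are not reproduced):

```python
# Minimal sketch (one simplified reading, not the paper's exact formulation): a per-coordinate
# distance that uses a variance-scaled (Mahalanobis-style) term when both values are known,
# and the expected distance over the observed column distribution when one value is missing,
# plugged into a kNN classifier.
import numpy as np

def incomplete_distance(a, b, X):
    """a, b: 1-D samples with np.nan for missing values; X: training matrix for column statistics."""
    d2 = 0.0
    for j in range(X.shape[1]):
        col = X[:, j][~np.isnan(X[:, j])]
        var = col.var() + 1e-9
        if not np.isnan(a[j]) and not np.isnan(b[j]):
            d2 += (a[j] - b[j]) ** 2 / var                       # both values known
        else:
            known = b[j] if np.isnan(a[j]) else a[j]
            if np.isnan(known):
                d2 += 2.0                                        # both missing: assumed prior constant
            else:
                d2 += np.mean((known - col) ** 2) / var          # expectation over the column values
    return np.sqrt(d2)

def knn_predict(x, X_train, y_train, k=3):
    d = np.array([incomplete_distance(x, row, X_train) for row in X_train])
    nearest = np.argsort(d)[:k]
    return np.bincount(y_train[nearest]).argmax()

# toy data with missing entries (the UCI diabetes/breast cancer data would replace this in practice)
X_train = np.array([[1.0, 2.0], [1.2, np.nan], [8.0, 9.0], [np.nan, 8.5]])
y_train = np.array([0, 0, 1, 1])
print(knn_predict(np.array([1.1, np.nan]), X_train, y_train, k=3))
```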

Keywords: missing values, incomplete data, distance, incomplete diabetes data

Procedia PDF Downloads 188
1672 Examining Motivational Strategies of Foreign Manufacturing Firms in Ghana

Authors: Samuel Ato Dadzie

Abstract:

The objective of this study is to examine the influence of the eclectic paradigm on the motivational strategy of foreign subsidiaries in Ghana. This study uses a binary regression model, and the analysis is based on 75 manufacturing investments made by MNEs from different countries in 1994–2008. The results indicate that perceived market size increases the probability of foreign firms undertaking a market-seeking (MS) foreign direct investment (FDI) in Ghana, while perceived cultural distance between Ghana and a foreign firm's home country decreases the probability of foreign firms undertaking a market-seeking (MS) FDI in Ghana. Furthermore, extensive international experience decreases the probability of foreign firms undertaking a market-seeking (MS) FDI in Ghana. Most of the studies done by earlier researchers were based on advanced and emerging countries and offered support for the theory, generalizing the result that multinational corporations (MNCs) normally follow the theory in their investment strategies outside their home country. Using the same theory in the context of Ghana, the results do not offer strong support for it. This means that MNCs that come to Sub-Saharan Africa cannot rely much on the eclectic paradigm for their motivational strategies, because the prevailing economic conditions in Ghana are different from those of the advanced and emerging economies where the institutional structures work.

Keywords: foreign subsidiary, motives, Ghana, foreign direct investment

Procedia PDF Downloads 402
1671 Modelling Operational Risk Using Extreme Value Theory and Skew t-Copulas via Bayesian Inference

Authors: Betty Johanna Garzon Rozo, Jonathan Crook, Fernando Moreira

Abstract:

Operational risk losses are heavy-tailed and are likely to be asymmetric and extremely dependent among business lines/event types. We propose a new methodology to assess, in a multivariate way, the asymmetry and extreme dependence between severity distributions, and to calculate the capital for operational risk. This methodology simultaneously uses (i) several parametric distributions and an alternative mixture distribution (the Lognormal for the body of losses and the Generalized Pareto Distribution for the tail) via extreme value theory using SAS®, (ii) the multivariate skew t-copula, applied for the first time to operational losses, and (iii) Bayesian theory to estimate new n-dimensional skew t-copula models via Markov chain Monte Carlo (MCMC) simulation. This paper analyses a new operational loss data set, SAS Global Operational Risk Data [SAS OpRisk], to model operational risk at international financial institutions. All the severity models are constructed in SAS® 9.2. We implement the procedures PROC SEVERITY and PROC NLMIXED. This paper focuses on describing this implementation.
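The body/tail severity split can be sketched as follows (simulated losses, an assumed 95% threshold and a single cell, with no copula or MCMC step; the SAS procedures named in the abstract are not reproduced):

```python
# Minimal sketch (simulated losses, single cell): fitting the mix severity model described in
# the abstract -- a Lognormal body below a threshold and a Generalized Pareto Distribution
# (GPD) for the tail of exceedances above it -- and reading off a high quantile.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
losses = np.concatenate([rng.lognormal(10.0, 1.2, 4800),        # body of operational losses
                         rng.pareto(1.8, 200) * 2.0e5 + 2.0e5]) # a heavier simulated tail

threshold = np.quantile(losses, 0.95)                           # body/tail split (assumed 95%)
body, tail = losses[losses <= threshold], losses[losses > threshold]

mu, sigma = np.log(body).mean(), np.log(body).std()             # Lognormal body parameters
xi, loc, beta = stats.genpareto.fit(tail - threshold, floc=0.0) # GPD for the exceedances

# spliced tail quantile: P(X > x) = (n_tail/n) * (1 - GPD_cdf(x - u)) solved at the 99.9% level
p_tail = 1.0 - 0.999
ratio = len(tail) / len(losses)
q999 = threshold + stats.genpareto.ppf(1.0 - p_tail / ratio, xi, 0.0, beta)
print(f"threshold={threshold:,.0f}  GPD shape xi={xi:.2f}  99.9% severity quantile={q999:,.0f}")
```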

Keywords: operational risk, loss distribution approach, extreme value theory, copulas

Procedia PDF Downloads 561
1670 Sniff-Camera for Imaging of Ethanol Vapor in Human Body Gases after Drinking

Authors: Toshiyuki Sato, Kenta Iitani, Koji Toma, Takahiro Arakawa, Kohji Mitsubayashi

Abstract:

A 2-dimensional imaging system (Sniff-camera) for gaseous ethanol emissions from human palm skin was constructed and demonstrated. This imaging system measures gaseous ethanol concentrations as intensities of chemiluminescence (CL) produced by the luminol reaction induced by alcohol oxidase and the luminol-hydrogen peroxide system. The conversion of ethanol distributions and concentrations to 2-dimensional CL was conducted on an enzyme-immobilized mesh substrate in a dark box containing a luminol solution. In order to visualize ethanol emissions from human palm skin, we developed a highly sensitive and selective imaging system for transpired gaseous ethanol at sub-ppm levels. High-sensitivity imaging allows us to successfully visualize the emission dynamics of transdermal gaseous ethanol. The intensity of each pixel on the palm reflects the ethanol concentration distribution resulting from the metabolism of orally administered alcohol. This imaging system is significant and useful for the assessment of ethanol emission from the palmar skin.

Keywords: sniff-camera, gas-imaging, ethanol vapor, human body gas

Procedia PDF Downloads 339
1669 Modelling of Creep in a Thick-Walled Cylindrical Vessel Subjected to Internal Pressure

Authors: Tejeet Singh, Ishvneet Singh, Vinay Gupta

Abstract:

The present study focussed on carrying out creep analysis in an isotropic thick-walled composite cylindrical pressure vessel composed of an aluminium matrix reinforced with silicon carbide in particulate form. The creep behaviour of the composite material has been described by the threshold stress based creep law. The values of the stress exponent appearing in the creep law were selected as 3, 5 and 8. The constitutive equations were developed using the well-known von Mises yield criterion. Models were developed to find the distributions of creep stresses and strain rate in thick-walled composite cylindrical pressure vessels under internal pressure. In order to obtain the stress distributions in the cylinder, the equilibrium equation of continuum mechanics and the constitutive equations are solved together. It was observed that the radial stress, tangential stress and axial stress increase with the radial distance. A cross-over was also obtained near the middle region of the cylindrical vessel for the tangential and axial stresses for different values of the stress exponent. The strain rates were found to decrease along the entire radius.

Keywords: creep, composite, cylindrical vessel, internal pressure

Procedia PDF Downloads 540
1668 Dimensioning of Circuit Switched Networks by Using Simulation Code Based On Erlang (B) Formula

Authors: Ali Mustafa Elshawesh, Mohamed Abdulali

Abstract:

The paper presents an approach to dimension circuit-switched networks and to find the relationship between the parameters of the circuit-switched network under the condition of a specific probability of call blocking. Our work creates a simulation code based on the Erlang B formula to draw graphs showing two curves each: one simulated and the other calculated. These curves represent the relationships of the average number of calls and the average call duration with the probability of call blocking. This simulation code facilitates selecting the appropriate parameters for circuit-switched networks.
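The calculated curve rests on the standard Erlang B recursion, sketched below with illustrative traffic figures (the paper's simulation code is a separate component):

```python
# Minimal sketch (standard recursion, independent of the paper's code): the Erlang B blocking
# probability computed iteratively, and the link between offered traffic (average number of
# calls x mean call duration) and blocking for a given number of circuits.
def erlang_b(traffic_erlangs, channels):
    """Blocking probability for offered traffic A (Erlangs) on N circuits."""
    b = 1.0
    for n in range(1, channels + 1):
        b = traffic_erlangs * b / (n + traffic_erlangs * b)   # E_n = A*E_{n-1} / (n + A*E_{n-1})
    return b

calls_per_hour = 120.0            # illustrative values
mean_duration_min = 3.0
offered_traffic = calls_per_hour * mean_duration_min / 60.0   # Erlangs

for circuits in (4, 6, 8, 10, 12):
    print(f"{circuits:>2d} circuits, A = {offered_traffic:.1f} E -> "
          f"blocking = {erlang_b(offered_traffic, circuits):.4f}")
```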

Keywords: Erlang B formula, call blocking, telephone system dimension, Markov model, link capacity

Procedia PDF Downloads 567
1667 Simulation Study of Multiple-Thick Gas Electron Multiplier-Based Microdosimeters for Fast Neutron Measurements

Authors: Amir Moslehi, Gholamreza Raisali

Abstract:

Microdosimetric detectors based on multiple-thick gas electron multiplier (multiple-THGEM) configurations are being used in various fields of radiation protection and dosimetry. In the present work, the microdosimetric response of these detectors to fast neutrons has been investigated by the Monte Carlo method. Three similar microdosimeters made of A-150 and rexolite as the wall materials were designed: the first based on a single THGEM, the second on a double THGEM and the third on a triple THGEM. The sensitive volume of the three microdosimeters is a right cylinder of 5 mm height and diameter, filled with a propane-based tissue-equivalent (TE) gas. The TE gas at 0.11 atm pressure and room temperature simulates 1 µm of tissue. Lineal energy distributions for several neutron energies from 10 keV to 14 MeV, including 241Am-Be neutrons, are calculated with the Geant4 simulation toolkit. Also, the mean quality factor and dose-equivalent value for each neutron energy have been determined from these distributions. The data obtained from the three microdosimeters are in agreement. Therefore, we conclude that the multiple-THGEM structures present similar microdosimetric responses to fast neutrons.

Keywords: fast neutrons, geant4, multiple-thick gas electron multiplier, microdosimeter

Procedia PDF Downloads 324
1666 Optimization of Flexible Job Shop Scheduling Problem with Sequence-Dependent Setup Times Using Genetic Algorithm Approach

Authors: Sanjay Kumar Parjapati, Ajai Jain

Abstract:

This paper presents makespan optimization for the 'n'-job, 'm'-machine flexible job shop scheduling problem with sequence-dependent setup times using a genetic algorithm (GA) approach. A restart scheme has also been applied to prevent premature convergence. Two case studies are taken into consideration. Results are obtained by considering a crossover probability of pc = 0.85 and a mutation probability of pm = 0.15. Five simulation runs are performed for each case study, and the minimum value among them is taken as the optimal makespan. Results indicate that the optimal makespan can be achieved with more than one sequence of jobs in a production order.

Keywords: flexible job shop, genetic algorithm, makespan, sequence dependent setup times

Procedia PDF Downloads 300
1665 Modeling of Steady State Creep in Thick-Walled Cylinders under Internal Pressure

Authors: Tejeet Singh, Ishavneet Singh

Abstract:

The present study focused on carrying out creep analysis in an isotropic thick-walled composite cylindrical pressure vessel composed of an aluminum matrix reinforced with silicon carbide in particulate form. The creep behavior of the composite material has been described by the threshold stress based creep law. The values of the stress exponent appearing in the creep law were selected as 3, 5 and 8. The constitutive equations were developed using the well-known von Mises yield criterion. Models were developed to find the distributions of creep stress and strain rate in thick-walled composite cylindrical pressure vessels under internal pressure. In order to obtain the stress distributions in the cylinder, the equilibrium equation of continuum mechanics and the constitutive equations are solved together. It was observed that the radial stress, tangential stress and axial stress increase with the radial distance. A cross-over was also obtained near the middle region of the cylindrical vessel for the tangential and axial stresses for different values of the stress exponent. The strain rates were found to decrease along the entire radius.

Keywords: steady state creep, composite, cylinder, pressure

Procedia PDF Downloads 389
1664 Contingency Screening Using Risk Factor Considering Transmission Line Outage

Authors: M. Marsadek, A. Mohamed

Abstract:

Power system security analysis is a very time-demanding process due to the large number of possible contingencies that need to be analyzed. In a power system, any contingency resulting in a security violation, such as line overload or low voltage, may occur for a number of reasons at any time. To efficiently rank a contingency, both the probability and the extent of the security violation must be considered, so as not to underestimate the risk associated with the contingency. This paper proposes a contingency ranking method that takes into account the probabilistic nature of the power system and the severity of each contingency by using a newly developed method based on a risk factor. The proposed technique is implemented on the IEEE 24-bus system.
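A minimal sketch of such a ranking (illustrative line outage probabilities and severity scores, with assumed equal weights; not the IEEE 24-bus results):

```python
# Minimal sketch (illustrative data): ranking transmission line outage contingencies by a
# risk factor that combines the outage probability with the severity of the resulting line
# overload and low-voltage violations.
contingencies = [
    # (line, outage probability, overload severity, low-voltage severity) -- values hypothetical
    ("L12", 0.020, 0.80, 0.10),
    ("L07", 0.005, 2.50, 0.60),
    ("L23", 0.015, 0.00, 0.00),
    ("L04", 0.010, 1.20, 0.30),
]

def risk_factor(prob, overload, low_voltage, w_overload=1.0, w_voltage=1.0):
    # risk = probability of the contingency x weighted extent of its security violations
    return prob * (w_overload * overload + w_voltage * low_voltage)

ranked = sorted(contingencies, key=lambda c: risk_factor(c[1], c[2], c[3]), reverse=True)
for line, p, ov, lv in ranked:
    print(f"{line}: risk = {risk_factor(p, ov, lv):.4f}")
```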

Keywords: line overload, low voltage, probability, risk factor, severity

Procedia PDF Downloads 520
1663 Poverty Dynamics in Thailand: Evidence from Household Panel Data

Authors: Nattabhorn Leamcharaskul

Abstract:

This study aims to examine the determining factors of the dynamics of poverty in Thailand by using panel data on 3,567 households in 2007-2017. Four estimation techniques are employed to analyze the situation of poverty across households and time periods: the multinomial logit model, the sequential logit model, the quantile regression model, and the difference-in-differences model. Households are categorized based on their experiences into 5 groups, namely chronically poor, falling into poverty, re-entering into poverty, exiting from poverty and never poor households. Estimation results emphasize the effects of demographic and socioeconomic factors as well as unexpected events on the economic status of a household. It is found that remittances have a positive impact on a household's economic status, in that they are likely to lower the probability of falling into or being trapped in poverty, while they tend to increase the probability of exiting from poverty. In addition, receiving a secondary source of household income not only raises the probability of being a never-poor household, but also significantly increases the household income per capita of chronically poor and falling-into-poverty households. Public work programs are recommended as an important tool to relieve household financial burden and uncertainty and consequently increase the chance for households to escape from poverty.
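A minimal sketch of the first of these techniques (a multinomial logit on synthetic households with hypothetical covariates; the Thai panel data and the other three estimators are not reproduced):

```python
# Minimal sketch (synthetic households, not the Thai panel): a multinomial logit of poverty
# status category on household covariates such as remittances and a secondary income source.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(8)
n = 3567
df = pd.DataFrame({
    "remittance": rng.exponential(1.0, n),
    "secondary_income": rng.integers(0, 2, n),
    "household_size": rng.integers(1, 8, n),
})
# 0=chronically poor, 1=falling into poverty, 2=re-entering, 3=exiting, 4=never poor
score = 0.8 * df["remittance"] + 0.9 * df["secondary_income"] - 0.2 * df["household_size"]
df["status"] = pd.cut(score + rng.normal(0, 1, n), bins=5, labels=False)

X = sm.add_constant(df[["remittance", "secondary_income", "household_size"]])
result = sm.MNLogit(df["status"], X).fit(disp=False)
print(result.params.round(3))     # one column of coefficients per non-base category
```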

Keywords: difference in difference, dynamic, multinomial logit model, panel data, poverty, quantile regression, remittance, sequential logit model, Thailand, transfer

Procedia PDF Downloads 83
1662 Channels Splitting Strategy for Optical Local Area Networks of Passive Star Topology

Authors: Peristera Baziana

Abstract:

In this paper, we present a network configuration for WDM LANs of passive star topology which assumes that the set of data WDM channels is split into two separate sets of channels with different access rights over them. In particular, a synchronous-transmission WDMA access algorithm is adopted in order to increase the probability of successful transmission over the data channels and, consequently, to reduce the probability of data packet transmission cancellation needed to avoid data channel collisions. Thus, a control pre-transmission access scheme is followed over a separate control channel. An analytical Markovian model is studied and the average throughput is mathematically derived. The performance is studied for several numbers of data channels and various values of the control phase duration.

Keywords: access algorithm, channels division, collisions avoidance, wavelength division multiplexing

Procedia PDF Downloads 264
1661 Human Action Recognition Using Wavelets of Derived Beta Distributions

Authors: Neziha Jaouedi, Noureddine Boujnah, Mohamed Salim Bouhlel

Abstract:

In the framework of the enhancement of human-machine interaction systems, we focus in this paper on human behavior analysis and action recognition. Human behavior is characterized by a duality of actions and reactions (movements, psychological modifications, verbal and emotional expressions). It is worth noting that much information is hidden behind gestures, sudden motions, point trajectories and speeds, and many research works have treated this as an information retrieval issue. In our work, we focus on motion extraction, tracking and action recognition using wavelet network approaches. Our contribution uses human subtraction by a Gaussian Mixture Model (GMM) and body movement analysis through trajectory models of motion constructed from a Kalman filter. These models allow noise to be removed through the extraction of the main motion features and constitute a stable base for identifying the evolution of human activity. Each modality is used to recognize a human action using the wavelets of derived beta distributions approach. The proposed approach has been validated successfully on subsets of the KTH and UCF Sports databases.

Keywords: feature extraction, human action classifier, wavelet neural network, beta wavelet

Procedia PDF Downloads 383