Search results for: Conditional Probability Distribution
1785 Liquid-Liquid Equilibrium Data for Butan-2-ol - Ethanol - Water, Pentan-1-ol - Ethanol - Water and Toluene - Acetone - Water Systems
Authors: Tinuade Jolaade Afolabi, Theresa Ibibia Edewor
Abstract:
Experimental liquid-liquid equilibria of the butan-2-ol - ethanol - water, pentan-1-ol - ethanol - water, and toluene - acetone - water ternary systems were investigated at 25 °C. The reliability of the experimental tie-line data was ascertained using Othmer-Tobias and Hand plots. The distribution coefficients (D) and separation factors (S) in the immiscibility region were evaluated for the three systems.
Keywords: Distribution coefficient, liquid-liquid equilibrium, separation factors, thermodynamic models.
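The distribution coefficients and separation factors mentioned above follow directly from tie-line compositions. A minimal sketch using the standard definitions (the mass-fraction values below are hypothetical, not the paper's data):

```python
# Distribution coefficient D and separation factor S from a tie line.
# Standard definitions: D_i = w_i(extract phase) / w_i(raffinate phase),
# S = D_solute / D_carrier. All numbers below are illustrative only.

def distribution_coefficient(w_extract, w_raffinate):
    """Mass-fraction ratio of a component between the two liquid phases."""
    return w_extract / w_raffinate

def separation_factor(d_solute, d_carrier):
    """Ratio of solute and carrier distribution coefficients."""
    return d_solute / d_carrier

# Hypothetical tie line: ethanol and water mass fractions in each phase.
D_ethanol = distribution_coefficient(0.20, 0.10)
D_water = distribution_coefficient(0.05, 0.80)
S = separation_factor(D_ethanol, D_water)
print(D_ethanol, D_water, S)
```

A separation factor well above 1 indicates the solvent extracts the solute preferentially over water, which is what the tie-line screening in such studies checks.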
PDF Downloads 3846

1784 Marketing Strategy Analysis of Thai Asia Pacific Brewery Company
Authors: Sinee Sankrusme
Abstract:
The study was a case study analysis of Thai Asia Pacific Brewery Company. The purpose was to analyze the company's marketing objective, company-level marketing strategy, and marketing mix before liquor liberalization in 2000. The study used qualitative and descriptive research methods, with the following results: (1) the marketing objective was to increase the market share of Heineken and Amstel; (2) the company's marketing strategies were a brand-building strategy and a distribution strategy. The company also pursued the following marketing mix strategies. Product strategy: the company added beer brands, namely Amstel and Tiger, to give consumers additional choice, and carried out product and marketing research and product development. Price strategy: the company took cost, competitors, the market, the economic situation, and tax into consideration. Promotion strategy: the company conducted sales promotion and advertising. Distribution strategy: the company extended its channels of distribution into food shops, pubs, and various entertainment venues. This strategy benefited interested parties and people engaged in the beer business.
Keywords: Marketing Strategy, Beer, Thai Asia Pacific Brewery Company.
1783 Thermal Analysis of a Sliding Electric Contact System Using Finite Element Method
Authors: Adrian T. Pleșca
Abstract:
In this paper, a three-dimensional thermal model of a sliding contact system is proposed for both steady-state and transient conditions. The influence of contact force, electric current, and ambient temperature on the temperature distribution has been investigated. A thermal analysis of different graphite materials for the fixed electric contact and their influence on the temperature rise of the contact system has been performed. To validate the three-dimensional thermal model, experimental tests have been carried out; there is good correlation between the experimental and simulation results.
Keywords: Sliding electric contact, temperature distribution, thermal analysis.
1782 Nonparametric Control Chart Using Density Weighted Support Vector Data Description
Authors: Myungraee Cha, Jun Seok Kim, Seung Hwan Park, Jun-Geol Baek
Abstract:
In manufacturing industries, advances in measurement have increased the number of monitored variables, bringing multivariate control to the fore. The multivariate control chart is one of the most widely used tools of statistical process control (SPC). Nevertheless, conventional SPC is of restricted applicability because it assumes that the data follow a specific distribution; in practice, process data are a mixture of several processes and are hard to model with a single distribution. As an alternative, nonparametric control charts are attractive because they require no parameter estimation. The SVDD-based control chart is one such nonparametric chart, with the advantage of a flexible control boundary. However, the basic formulation of SVDD overlooks an important data characteristic: the density distribution. We therefore propose DW-SVDD (Density-Weighted SVDD) to overcome this weakness of conventional SVDD. DW-SVDD takes the local density of the data into account by introducing the notion of a density weight. We extend the proposed SVDD to a control chart, and a simulation study on data from various distributions demonstrates the improvement in performance.
Keywords: Density estimation, multivariate control chart, one-class classification, support vector data description (SVDD).
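The density-weight idea can be sketched independently of the SVDD optimization: estimate a kernel density at each training sample and use it as a per-point weight, so dense regions count more than stray outliers. This is a minimal sketch of that notion only; the bandwidth, normalization, and how the weights enter the SVDD objective are assumptions, not the authors' exact formulation.

```python
import numpy as np

def density_weights(X, bandwidth=1.0):
    """Gaussian-kernel density estimate at each sample, scaled to (0, 1].
    A sketch of the 'density weight' notion; the paper's form may differ."""
    X = np.asarray(X, dtype=float)
    # Pairwise squared distances between all samples.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    dens = np.exp(-d2 / (2.0 * bandwidth ** 2)).mean(axis=1)
    return dens / dens.max()

rng = np.random.default_rng(0)
# A dense in-control cluster plus a few scattered outliers.
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.uniform(-4, 4, (5, 2))])
w = density_weights(X)
# Points inside the dense cluster receive larger weights than outliers,
# so the learned boundary is pulled toward the dense region.
print(w[:50].mean(), w[50:].mean())
```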
1781 Computer Aided X-Ray Diffraction Intensity Analysis for Spinels: Hands-On Computing Experience
Authors: Ashish R. Tanna, Hiren H. Joshi
Abstract:
The mineral with chemical composition MgAl2O4 is called "spinel". Ferrites that crystallize in the spinel structure are known as spinel ferrites or ferrospinels. The spinel structure has an fcc cage of oxygen ions, with the metallic cations distributed among tetrahedral (A) and octahedral (B) interstitial voids (sites). The X-ray diffraction (XRD) intensity of each Bragg plane is sensitive to the distribution of cations in the interstitial voids of the spinel lattice, which leads to a method for determining the cation distribution in spinel oxides through XRD intensity analysis. Since the cation distribution in spinel ferrites plays a very important role in their electrical and magnetic properties, determining it is essential. A computer program for XRD intensity analysis has been developed in C and tested against real experiments: the spinel ferrites Mg0.6Zn0.4AlxFe2-xO4 (x = 0.0 to 0.6) were prepared by the ceramic method and characterized by X-ray diffractometry. The authenticity of the program was checked by comparing the theoretically calculated data from the computer simulation with the experimental ones. Further, the deduced cation distributions were used to fit the magnetization data using the localized canting of spins approach, explaining the "recovery" of the collinear spin structure upon Al3+ substitution in Mg-Zn ferrites, which otherwise exhibit A-site magnetic dilution and a non-collinear spin structure.
Keywords: Spinel ferrites, Localized canting of spins, X-ray diffraction, Programming in Borland C.
1780 Approximations to the Distribution of the Sample Correlation Coefficient
Authors: John N. Haddad, Serge B. Provost
Abstract:
Given a bivariate normal sample of correlated variables, (Xi, Yi), i = 1, . . . , n, an alternative estimator of Pearson's correlation coefficient is obtained in terms of the ranges, |Xi − Yi|. An approximate confidence interval for ρX,Y is then derived, and a simulation study reveals that the resulting coverage probabilities are in close agreement with the set confidence levels. In addition, a new approximant is provided for the density function of R, the sample correlation coefficient. A mixture of the proposed approximate density of R, denoted by hR(r), and a density function determined from a known approximation due to R. A. Fisher is shown to approximate the distribution of R accurately. Finally, nearly exact density approximants are obtained by adjusting hR(r) with a 7th-degree polynomial.
Keywords: Sample correlation coefficient, density approximation, confidence intervals.
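For context, the classical benchmark such approximants are compared against is Fisher's z-transform, which gives an approximate confidence interval for the population correlation. A sketch of that standard construction (not the authors' new estimator or density mixture; the sample data are invented):

```python
import math

def pearson_r(x, y):
    """Sample Pearson correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

def fisher_ci(r, n, z=1.96):
    """Classical Fisher z-transform interval: atanh(r) is approximately
    normal with standard error 1 / sqrt(n - 3)."""
    zr = math.atanh(r)             # 0.5 * ln((1 + r) / (1 - r))
    half = z / math.sqrt(n - 3)
    return math.tanh(zr - half), math.tanh(zr + half)

# Invented sample for illustration.
x = [1, 2, 3, 4, 5, 6, 7, 8]
y = [2, 1, 4, 3, 6, 5, 8, 7]
r = pearson_r(x, y)
lo, hi = fisher_ci(r, len(x))
print(round(r, 4), round(lo, 4), round(hi, 4))
```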
1779 Study on Performance of Wigner Ville Distribution for Linear FM and Transient Signal Analysis
Authors: Azeemsha Thacham Poyil, Nasimudeen KM
Abstract:
This research paper presents methods to assess the performance of the Wigner-Ville distribution (WVD) for time-frequency representation of non-stationary signals, in comparison with other representations such as the STFT and the spectrogram. The simultaneous time-frequency resolution of the WVD is one of the important properties that make it preferable for the analysis and detection of linear FM and transient signals. Two algorithms are proposed here to assess resolution and to compare signal-detection performance. The first method is based on measuring the area under the time-frequency plot and is applied to linear FM signal analysis. The second method is based on instantaneous power calculation and is used for transient, non-stationary signals. The implementation of both methods is explained briefly with suitable diagrams. The accuracy of the measurements validates the better performance of the WVD representation in comparison with the STFT and spectrograms.
Keywords: Wigner-Ville distribution (WVD), short-time Fourier transform (STFT), Fourier transform (FT), time-frequency representation (TFR), frequency modulation (FM), linear FM (LFM) signal, joint time-frequency analysis (JTFA).
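The sharp localization of the WVD for a linear FM signal can be seen in a compact numerical sketch: the discrete WVD is the FFT over lag m of the instantaneous autocorrelation x[n+m]·conj(x[n−m]), and for a chirp its ridge moves to higher frequency bins at later times. This is a bare, unsmoothed pseudo-WVD for illustration, not the measurement procedures of the paper.

```python
import numpy as np

def wigner_ville(x):
    """Discrete pseudo Wigner-Ville distribution (no smoothing window).
    Row n is the FFT over lag m of x[n + m] * conj(x[n - m])."""
    x = np.asarray(x, dtype=complex)
    N = len(x)
    W = np.zeros((N, N))
    for n in range(N):
        lag_max = min(n, N - 1 - n)          # lags available at this time
        kernel = np.zeros(N, dtype=complex)
        for m in range(-lag_max, lag_max + 1):
            kernel[m % N] = x[n + m] * np.conj(x[n - m])
        W[n] = np.fft.fft(kernel).real       # WVD is real-valued
    return W

# Complex linear-FM (chirp): instantaneous frequency rises with time.
N = 128
t = np.arange(N)
x = np.exp(1j * 2 * np.pi * (0.05 * t + 0.15 * t ** 2 / (2 * N)))
W = wigner_ville(x)
# The ridge (argmax over frequency) sits at a higher bin later in time.
early = int(np.argmax(W[N // 4, : N // 2]))
late = int(np.argmax(W[3 * N // 4, : N // 2]))
print(early, late)
```

Note the frequency axis of the WVD is scaled by a factor of two relative to an ordinary FFT because of the two-sided lag product.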
1778 Investigating Real Ship Accidents with Descriptive Analysis in Turkey
Authors: İsmail Karaca, Ömer Söner
Abstract:
The use of advanced methods has been increasing day by day in the maritime sector, one of the sectors least affected by the COVID-19 pandemic. The aim is to minimize accidents, in particular by using advanced methods in the investigation of marine accidents. This research conducted an exploratory statistical analysis of ship accidents recorded in the database of the Transport Safety Investigation Center of Turkey; 46 ship accidents that occurred between 2010 and 2018 were selected. In addition to the availability of a reliable and comprehensive database, taking advantage of robust statistical models in the investigation is critical to improving the safety of ships. Thus, descriptive analysis was used to identify causes and conditional factors related to different types of ship accidents. The outcomes underline the fact that environmental factors and the day/night ratio have a great influence on ship safety.
Keywords: Descriptive analysis, maritime industry, maritime safety, marine accident analysis.
1777 Spatial Distribution of Local Sheep Breeds in Antalya Province
Authors: Serife Gulden Yilmaz, Suleyman Karaman
Abstract:
Sheep breeding in Turkey is important for meeting the demand for red meat, supplying industrial raw materials, and providing employment in the rural sector. Ensuring the selection and continuity of the breeds that are raised is also very important for increasing the quality and productivity of sheep products. The protection of local breeds and crossbreeds also enables the development of the sector in the region and the reduction of imports. In this study, the data were obtained from the records of the Turkish Statistical Institute and the Antalya Sheep & Goat Breeders' Association. The spatial distribution of sheep breeds in Antalya is reviewed statistically in terms of concentration at the local level for 2015. For this purpose, mapping, box plots, and linear regression are used in this study. Concentration is presented by mapping studbook data on local breeds and total sheep farms. The Pırlak breed (17.5%) and the Merinos crossbreed (16.3%) have the highest concentration in the region, followed by the Akkaraman breed (11%), the Pırlak crossbreed (8%), the Merinos breed (7.9%), the Akkaraman crossbreed (7.9%), and the Ivesi breed (7.2%).
Keywords: Antalya, sheep breeds, spatial distribution, local.
1776 Comparison of Pore Space Features by Thin Sections and X-Ray Microtomography
Authors: H. Alves, J. T. Assis, M. Geraldes, I. Lima, R. T. Lopes
Abstract:
Microtomographic images and thin-section (TS) images were analyzed and compared against parameters of geological interest such as porosity and its distribution along the samples. The results show that microtomography (CT) analysis, although limited by its resolution, provides interesting information about the distribution of porosity (homogeneous or not) and can also quantify both connected and non-connected pores, i.e., total porosity. TS analysis has no such resolution limitation but is restricted by the experimental material available, a few glass sheets for analysis, and can give information only about the connected pores, i.e., effective porosity. The two methods have their own virtues and flaws, but when paired they complement one another, making for a more reliable and complete analysis.
Keywords: Microtomography, petrographical microscopy, sediments, thin sections.
1775 SVC and DSTATCOM Comparison for Voltage Improvement in RDS Using ANFIS
Authors: U. Ramesh Babu, V. Vijaya Kumar Reddy, S. Tara Kalyani
Abstract:
This paper investigates the performance comparison of the SVC (Static VAR Compensator) and the DSTATCOM (Distribution Static Synchronous Compensator) for improving voltage stability in a Radial Distribution System (RDS). Both are efficient FACTS (Flexible AC Transmission System) devices capable of controlling the active and reactive power flows in a power system line by appropriately adjusting their parameters, here tuned using ANFIS. Simulations are carried out in the MATLAB/Simulink environment for the IEEE 4-bus system to test the ability to serve increasing load. It is found that these controllers significantly increase the load margin of the power system.
Keywords: SVC, DSTATCOM, voltage improvement, ANFIS.
1774 Data Envelopment Analysis under Uncertainty and Risk
Authors: P. Beraldi, M. E. Bruni
Abstract:
Data Envelopment Analysis (DEA) is one of the most widely used techniques for evaluating the relative efficiency of a set of homogeneous decision making units. Traditionally, it assumes that input and output variables are known in advance, ignoring the critical issue of data uncertainty. In this paper, we deal with the problem of efficiency evaluation under uncertain conditions by adopting the general framework of stochastic programming. We assume that output parameters are represented by discretely distributed random variables, and we propose two different models defined according to a risk-neutral and a risk-averse perspective. The models have been validated on a real case study concerning the technical efficiency of a sample of individual firms operating in the Italian leather manufacturing industry. Our findings show the validity of the proposed approach as an ex-ante evaluation technique, providing the decision maker with useful insights depending on his degree of risk aversion.
Keywords: DEA, stochastic programming, ex-ante evaluation technique, Conditional Value at Risk.
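The risk-averse perspective keyed above rests on Conditional Value at Risk. For a discretely distributed loss, CVaR at level alpha is the expected loss in the worst (1 − alpha) tail, splitting the probability atom at the VaR so the tail has mass exactly 1 − alpha. A minimal sketch with hypothetical scenario values (not the paper's DEA model):

```python
def var_cvar(losses, probs, alpha=0.95):
    """VaR and CVaR of a discrete loss distribution.
    VaR: smallest loss whose cumulative probability reaches alpha.
    CVaR: probability-weighted mean of the worst (1 - alpha) tail."""
    scenarios = sorted(zip(losses, probs))
    cum = 0.0
    var = scenarios[-1][0]
    for loss, p in scenarios:
        cum += p
        if cum >= alpha:
            var = loss
            break
    # Average the tail from the worst scenario down, splitting the
    # atom at VaR so exactly mass (1 - alpha) is used.
    tail = 1.0 - alpha
    cvar, remaining = 0.0, tail
    for loss, p in reversed(scenarios):
        take = min(p, remaining)
        cvar += take * loss
        remaining -= take
        if remaining <= 1e-12:
            break
    return var, cvar / tail

# Hypothetical loss scenarios and probabilities.
losses = [0, 10, 50, 100]
probs = [0.5, 0.4, 0.07, 0.03]
v, c = var_cvar(losses, probs, alpha=0.95)
print(v, c)
```

CVaR is always at least the VaR, which is why it is the conservative objective in risk-averse formulations.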
1773 Plug and Play Interferometer Configuration using Single Modulator Technique
Authors: Norshamsuri Ali, Hafizulfika, Salim Ali Al-Kathiri, Abdulla Al-Attas, Suhairi Saharudin, Mohamed Ridza Wahiddin
Abstract:
We demonstrate single-photon interference over 10 km using a plug and play system for quantum key distribution. The quality of the interferometer is measured by its visibility. The signal is phase coded, and the visibility is derived from the interference effect, which results in a photon count. The setup gives full control of polarization inside the interferometer. The quality of the interferometer is assessed from the number of counts per second, and the system produces 94% visibility in one of the detectors.
Keywords: Single photon, interferometer, quantum key distribution.
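The visibility figure quoted above is conventionally computed as V = (Cmax − Cmin) / (Cmax + Cmin) from the maximum and minimum count rates over the phase scan. A one-function sketch; the count rates below are hypothetical, chosen only to reproduce a 94% figure:

```python
def visibility(c_max, c_min):
    """Interference visibility from maximum and minimum count rates."""
    return (c_max - c_min) / (c_max + c_min)

# Hypothetical counts per second at constructive / destructive phase.
v = visibility(c_max=9700, c_min=300)
print(v)
```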
1772 Managing Iterations in Product Design and Development
Authors: K. Aravindhan, Trishit Bandyopadhyay, Mahesh Mehendale, Supriya Kumar De
Abstract:
The inherently iterative nature of product design and development poses a significant challenge to reducing product design and development (PD) time. In order to shorten the time to market, organizations have adopted concurrent development, where multiple specialized tasks and design activities are carried out in parallel. The iterative nature of the work, coupled with the overlap of activities, can result in unpredictable time to completion and significant rework. Many products have missed the time-to-market window due to unanticipated, or rather unplanned, iteration and rework. The iterative and often overlapped processes introduce greater ambiguity into design and development, where the traditional methods and tools of project management provide less value. In this context, identifying critical metrics to understand iteration probability is an open research area where a significant contribution can be made, given that iteration has been the key driver of cost and schedule risk in PD projects. Two important questions that the proposed study attempts to address are: Can we predict and identify the number of iterations in a product development flow? Can we provide managerial insights for better control over iteration? The proposal introduces the concept of decision points and uses it to develop metrics that can provide managerial insight into iteration predictability. By characterizing the product development flow as a network of decision points, the proposed research intends to delve further into iteration probability and to provide more clarity.
Keywords: Decision Points, Iteration, Product Design, Rework.
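One textbook way to see why iteration probability drives schedule risk: if a decision point sends work back for rework with probability p, independently on each pass, the number of passes is geometric with mean 1/(1 − p). This is a standard model offered only as intuition, not the metric proposed in the abstract.

```python
import random

def expected_passes(p_rework):
    """Mean passes through a task when each review triggers rework
    with probability p (geometric distribution)."""
    return 1.0 / (1.0 - p_rework)

def simulate_passes(p_rework, trials=100_000, seed=1):
    """Monte Carlo check of the closed-form mean."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        passes = 1
        while rng.random() < p_rework:
            passes += 1
        total += passes
    return total / trials

p = 0.3
print(expected_passes(p), simulate_passes(p))
```

Even a modest rework probability of 0.3 inflates expected duration by over 40%, which matches the abstract's observation that unplanned iteration dominates schedule risk.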
1771 Distributed Load Flow Analysis using Graph Theory
Authors: D. P. Sharma, A. Chaturvedi, G. Purohit, R. Shivarudraswamy
Abstract:
In today's scenario, meeting the enhanced demand imposed by domestic, commercial, and industrial consumers requires focused attention on the various operational and control activities of the Radial Distribution Network (RDN). Across RDN research sub-domains such as network reconfiguration, reactive power compensation, and economic load scheduling, network performance parameters are usually estimated by an iterative process commonly known as a load (power) flow algorithm. In this paper, a simple mechanism is presented to implement the load flow analysis (LFA) algorithm. The reported algorithm utilizes graph theory principles and is tested on a 69-bus RDN.
Keywords: Radial distribution network, graph, load flow, array.
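Iterative load flow on a radial feeder is commonly done with a backward/forward sweep over the tree: a backward sweep accumulates branch currents from the leaves to the source, then a forward sweep updates bus voltages from the source down. The sketch below is the generic textbook sweep on an invented 4-bus feeder, not the authors' graph-theoretic variant or the 69-bus system.

```python
def radial_load_flow(parent, z, s_load, v_source=1.0 + 0j, iters=30):
    """Backward/forward sweep load flow for a radial feeder (per unit).
    parent[i]: upstream bus of bus i (bus 0 is the source),
    z[i]: series impedance of the branch feeding bus i,
    s_load[i]: complex power drawn at bus i.
    Buses must be numbered so that parent[i] < i."""
    n = len(parent)
    v = [v_source] * n
    for _ in range(iters):
        # Backward sweep: accumulate branch currents from the leaves up.
        i_branch = [0j] * n
        for i in range(n - 1, 0, -1):
            i_branch[i] += (s_load[i] / v[i]).conjugate()
            i_branch[parent[i]] += i_branch[i]
        # Forward sweep: update voltages from the source down.
        for i in range(1, n):
            v[i] = v[parent[i]] - z[i] * i_branch[i]
    return v

# Invented 4-bus feeder: 0 -> 1, then 1 -> 2 and 1 -> 3 (per-unit values).
parent = [0, 0, 1, 1]
z = [0j, 0.01 + 0.02j, 0.02 + 0.04j, 0.015 + 0.03j]
s = [0j, 0.5 + 0.2j, 0.8 + 0.3j, 0.4 + 0.1j]
v = radial_load_flow(parent, z, s)
print([round(abs(vi), 4) for vi in v])
```

Because an RDN is a tree, each sweep visits every branch exactly once, which is why graph-based orderings make the method attractive for large feeders.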
1770 Numerical Study of Effects of Air Dam on the Flow Field and Pressure Distribution of a Passenger Car
Authors: Min Ye Koo, Ji Ho Ahn, Byung Il You, Gyo Woo Lee
Abstract:
Anything attached to the outside of a vehicle to improve its driving performance by changing the flow characteristics of the surrounding air, or to give it an external personality, is called a tuning part. Typical tuning components include the front or rear air dam (also known as a spoiler), the splitter, and the side air dam. In particular, the front air dam blocks the airflow entering the lower portion of the vehicle and increases the flow to the sides and front of the body, thereby reducing the lift force that raises the body and improving the steering and driving performance of the vehicle. The purpose of this study was to investigate the role of the front air dam in the flow around a sedan passenger car using computational fluid dynamics. The effects of flow velocity and the trajectories of fluid particles on the static pressure distribution and the body-surface pressure distribution were investigated by varying the flow velocity and the size of the air dam. The results confirm that the front air dam improves the flow characteristics and thereby reduces the lift force on the vehicle, helping its steering and driving characteristics.
Keywords: Numerical study, computational fluid dynamics, air dam, tuning parts, drag, lift force.
1769 Secondary Ion Mass Spectrometry of Proteins
Authors: Santanu Ray, Alexander G. Shard
Abstract:
The adsorption of bovine serum albumin (BSA), immunoglobulin G (IgG), and fibrinogen (Fgn) on fluorinated self-assembled monolayers has been studied using time-of-flight secondary ion mass spectrometry (ToF-SIMS) and spectroscopic ellipsometry (SE). The objective of the work was to establish the utility of ToF-SIMS for determining the amount of protein adsorbed on the surface. Quantification of surface-adsorbed proteins was carried out using SE, and a good correlation between the ToF-SIMS and SE results was achieved. The surface distribution of the proteins was also analysed using atomic force microscopy (AFM). We show that the surface distribution of proteins strongly affects the ToF-SIMS results.
Keywords: ToF-SIMS, Spectroscopic Ellipsometry, Protein, Atomic Force Microscopy.
1768 Statistical Modeling of Constituents in Ash Evolved From Pulverized Coal Combustion
Authors: Esam Jassim
Abstract:
Industries using conventional fossil fuels have an interest in better understanding the mechanism of particulate formation during combustion, since it is responsible for the emission of undesired inorganic elements that directly impact the atmospheric pollution level. Fine and ultrafine particulates have a tendency to escape flue-gas cleaning devices into the atmosphere. They also preferentially collect on surfaces in power systems, increasing the inclination to corrosion, degrading heat transfer in the thermal unit, and severely impacting human health. This adverseness manifests particularly in the regions of the world where coal is the dominant source of energy. This study highlights the behavior of calcium transformation, as mineral grains versus organically associated inorganic components, during pulverized coal combustion. The influence of the existing type of calcium on the coarse-, fine-, and ultrafine-mode formation mechanisms is also presented, and the impact of two sub-bituminous coals on particle size and calcium composition evolution during combustion is assessed. Three mixed blends, named Blends 1, 2, and 3, are selected according to the ratio of coal A to coal B by weight; the calcium percentage in the original coal increases from Blend 1 to Blend 3. A mathematical model and a new approach to describing constituent distribution are proposed, and the observed calcium distribution in ash is modeled using a Poisson distribution. A novel parameter, called the elemental index λ, is introduced as a measure of element distribution. Results show that calcium present in the ash that originated in the coal as mineral grains has an index of 17, whereas organically associated calcium transformed to fly ash is best described by an elemental index λ of 7. As an alkaline-earth element, calcium is considered the element chiefly responsible for boiler deficiency, since it is the major player in the mechanism of the ash slagging process.
The mechanism of particle size distribution and the mineral species of the ash particles are presented using CCSEM and size-segregated ash characteristics. Conclusions are drawn from the analysis of pulverized coal ash generated in a utility-scale boiler.
Keywords: Calcium transformation, Coal Combustion, Inorganic Element, Poisson distribution.
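When per-particle element counts are modeled as Poisson, the maximum-likelihood estimate of the parameter λ is simply the sample mean of the counts. A minimal sketch of that fit and the resulting pmf; the count data below are hypothetical, and this is the standard Poisson MLE, not necessarily the paper's elemental-index procedure.

```python
import math

def poisson_pmf(k, lam):
    """P(X = k) for a Poisson-distributed element count."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

def fit_lambda(counts):
    """Maximum-likelihood estimate of the Poisson parameter: the mean."""
    return sum(counts) / len(counts)

# Hypothetical per-particle calcium occurrence counts.
counts = [15, 18, 16, 17, 19, 17, 16, 18]
lam = fit_lambda(counts)
print(lam, round(poisson_pmf(17, lam), 4))
```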
1767 Geometric Representation of Modified Forms of Seven Important Failure Criteria
Authors: Ranajay Bhowmick
Abstract:
Elastoplastic analysis of a structural system involves defining a failure/yield criterion, flow rules, and hardening rules. The failure/yield criterion defines the limit beyond which the material flows plastically and hardens, softens, or remains perfectly plastic before ultimate collapse. The criterion is represented geometrically in three- or two-dimensional Haigh-Westergaard stress space to facilitate a better understanding of the behavior of the material. In the present study, geometric representations in three- and two-dimensional stress space of a few important failure/yield criteria are presented. The criteria presented are the modified forms obtained from the conditional solutions of the equation of stress invariants. A comparison of the failure/yield surfaces is also presented to assess the effectiveness of each, and it has been found that, for identical conditions, Rankine's criterion gives the largest values of limiting stresses.
Keywords: Deviatoric plane, failure criteria, geometric representation, hydrostatic axis, modified form.
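Of the criteria compared, Rankine's is the simplest to evaluate numerically: compute the principal stresses (eigenvalues of the stress tensor) and compare the largest magnitude against the limiting stress. A sketch of that check with an invented stress state; the limit and units are illustrative only.

```python
import numpy as np

def rankine_ok(stress_tensor, sigma_limit):
    """Rankine (maximum principal stress) check: safe while the
    largest-magnitude principal stress stays below the limit."""
    principal = np.linalg.eigvalsh(np.asarray(stress_tensor, dtype=float))
    return bool(np.max(np.abs(principal)) < sigma_limit)

# Illustrative symmetric stress state (MPa) and two limiting stresses.
sigma = [[120, 30, 0],
         [30, 80, 0],
         [0, 0, 40]]
print(rankine_ok(sigma, sigma_limit=200), rankine_ok(sigma, sigma_limit=100))
```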
1766 Radiation Dose Distribution for Workers in South Korean Nuclear Power Plants
Authors: B. I. Lee, S. I. Kim, D. H. Suh, J. I. Kim, Y. K. Lim
Abstract:
A total of 33,680 nuclear power plant (NPP) workers were monitored and their doses recorded from 1990 to 2007. According to the records, the average individual radiation dose decreased continually, from 3.20 mSv per worker in 1990 to 1.12 mSv per worker at the end of 2007. After the International Commission on Radiological Protection (ICRP) Publication 60 recommendation came into general use in South Korea, no nuclear power plant worker received more than 20 mSv, and the number of relatively highly exposed workers has been decreasing continuously. The age distribution of radiation workers in nuclear power plants was composed mainly of 20-30-year-olds (83%) for 1990-1994 and 30-40-year-olds (75%) for 2003-2007; the difference in individual average dose by age was not significant. Most (77%) NPP radiation exposures from 1990 to 2007 occurred during the refueling period. With regard to exposure type, external exposures represented 95% of the total, while internal exposures represented only 5%. The external effective dose was due mainly to gamma radiation, with an insignificant amount of neutron exposure. As for the internal effective dose, tritium (3H) in the pressurized heavy water reactor (PHWR) was the biggest cause of exposure.
Keywords: Dose distribution, external exposure, nuclear power plant, occupational radiation dose.
1765 Urban and Rural Population Pyramids in Georgia Since 1950s
Authors: Shorena Tsiklauri, Avtandil Sulaberidze, Nino Gomelauri
Abstract:
In the years that followed independence, an economic crisis and several conflicts led to the displacement of many people inside Georgia. Growing poverty, unemployment, low and unequally distributed income, and limited access to basic social services have had a clear, direct impact on Georgian population dynamics and the age-sex structure. Factors influencing the changing population age structure and urbanization include mortality, fertility, migration, and urban expansion. This paper presents the main factors changing the distribution between urban and rural areas. How different are the urban and rural age and sex structures? Has Georgia had the same age-sex structure in its urban and rural populations since the 1950s?
Keywords: Age and sex structure of population, Georgia, migration, urban-rural population.
1764 Clustering Mixed Data Using Non-normal Regression Tree for Process Monitoring
Authors: Youngji Yoo, Cheong-Sool Park, Jun Seok Kim, Young-Hak Lee, Sung-Shick Kim, Jun-Geol Baek
Abstract:
In the semiconductor manufacturing process, large amounts of data are collected from the various sensors of multiple facilities. The collected sensor data have several different characteristics due to variables such as product type, former processes, and recipes. In general, Statistical Quality Control (SQC) methods assume normality of the data in order to detect out-of-control states of processes. Because the collected data have different characteristics, using them directly as inputs to SQC will increase the variation of the data, require wide control limits, and decrease the ability to detect out-of-control states. It is therefore necessary to separate similar data groups from the mixed data for more accurate process control. In this paper, we propose a regression tree using a split algorithm based on the Pearson distribution system to handle non-normal distributions parametrically. The regression tree finds similar properties of the data across different variables. Experiments using real semiconductor manufacturing process data show improved fault-detection performance.
Keywords: Semiconductor, non-normal mixed process data, clustering, Statistical Quality Control (SQC), regression tree, Pearson distribution system.
1763 Evaluating Probable Bending of Frames for Near-Field and Far-Field Records
Authors: Majid Saaly, Shahriar Tavousi Tafreshi, Mehdi Nazari Afshar
Abstract:
Most reinforced concrete structures designed only for gravity loads have large transverse reinforcement spacings and therefore suffer severe failure after intense ground motions. The main goal of this paper is to compare the shear and axial failure of concrete bending frames typical of Tehran using Incremental Dynamic Analysis (IDA) under near- and far-field records. For this purpose, IDA of 5-, 10-, and 15-story concrete structures was carried out under seven far-fault records and five near-fault records. The results show that in two-dimensional models of short-rise, mid-rise, and high-rise reinforced concrete frames located on Type-3 soil, increasing the spacing of the transverse reinforcement can increase the maximum inter-story drift ratio by up to 37%. For the 5-, 10-, and 15-story reinforced concrete models on Type-3 soil, records with characteristics such as fling-step and directivity create larger maximum inter-story drift values than far-fault earthquakes. The results also indicate that under seismic excitation involving directivity or fling-step, the failure probabilities and their rates of increase are much smaller than the corresponding values for far-fault earthquakes; however, for near-fault records, the probability of exceedance occurs at lower seismic intensities than for far-fault records.
Keywords: Directivity, fling-step, fragility curve, IDA, inter-story drift ratio.
1762 Concept of Automation in Management of Electric Power Systems
Authors: Richard Joseph, Nerey Mvungi
Abstract:
An electric power system comprises generation, transmission, distribution, and consumer subsystems. The electrical power network in Tanzania keeps growing larger and more complex by the day, so most utilities have long wished for real-time monitoring and remote control of power system elements such as substations, intelligent devices, power lines, capacitor banks, feeder switches, fault analyzers, and other physical facilities. In this paper, the concept of automating the management of power systems, from the generation level down to the end-user level, was examined using the Power System Simulator for Engineering (PSS/E) version 30.3.2.
Keywords: Automation, Distribution subsystem, Generating subsystem, PSS/E, TANESCO, Transmission subsystem.
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 3612
1761 On the Efficiency and Robustness of Commingled Wiener and Lévy Driven Processes for the Vasicek Model
Authors: Rasaki O. Olanrewaju
Abstract:
The Wiener and Lévy driven processes are known self-standing Gaussian-Markov processes for fitting the non-linear dynamical Vasicek model. In this paper, a coincidental Gaussian density stationarity condition and the autocorrelation functions of the two driven processes were established. This led to the conflation of the Wiener and Lévy processes in order to investigate the efficiency of the estimates incorporated into the one-dimensional Vasicek model, which was estimated via the Maximum Likelihood (ML) technique. The conditional laws of the drift, diffusion, and stationary processes were ascertained for the individual Wiener and Lévy processes, as well as for the commingling of the two, for a fixed-effect and autoregressive-like Vasicek model when fitted to a financial series: the Naira-CFA Franc exchange rate. In addition, the model performance error of the commingled driven process was small compared to that of the self-standing Wiener and Lévy driven processes.
Keywords: Wiener process, Lévy process, Vasicek model, drift, diffusion, Gaussian density stationarity.
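As an illustrative sketch only (the parameter values, time step, seed, and the OLS-as-ML shortcut below are assumptions, not taken from the paper or its exchange-rate data), a Wiener-driven Vasicek-type process can be simulated and its drift parameters recovered from the AR(1) form of its discretization:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters for dr_t = a (b - r_t) dt + sigma dW_t
a, b, sigma = 2.0, 0.05, 0.02
dt, n = 1 / 252, 5000

r = np.empty(n)
r[0] = 0.03
for t in range(1, n):
    # Euler-Maruyama step driven by a Wiener (Gaussian) increment
    r[t] = r[t - 1] + a * (b - r[t - 1]) * dt + sigma * np.sqrt(dt) * rng.standard_normal()

# The discretized process is an AR(1): r_t = c + phi * r_{t-1} + eps_t,
# so OLS on successive observations is the Gaussian ML estimator of (c, phi).
phi, c = np.polyfit(r[:-1], r[1:], 1)
a_hat = (1 - phi) / dt   # mean-reversion speed (small-dt approximation)
b_hat = c / (1 - phi)    # long-run mean
```

A Lévy-driven variant would replace the Gaussian increment with a heavier-tailed jump increment, which is where the commingling studied in the paper comes in.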
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 669
1760 Medical Image Registration by Minimizing Divergence Measure Based on Tsallis Entropy
Authors: Shaoyan Sun, Liwei Zhang, Chonghui Guo
Abstract:
As the use of registration packages spreads, the number of aligned image pairs in image databases (aligned by either manual or automatic methods) increases dramatically. These image pairs can serve as a set of training data; correspondingly, the images that are to be registered serve as testing data. In this paper, a novel medical image registration method is proposed that is based on a priori knowledge of the expected joint intensity distribution estimated from pre-aligned training images. The goal of the registration is to find the optimal transformation such that the distance between the observed joint intensity distribution obtained from the testing image pair and the expected joint intensity distribution obtained from the corresponding training image pair is minimized. The distance is measured using a divergence measure based on Tsallis entropy. Experimental results show that, compared with the widely used Shannon mutual information as well as Tsallis mutual information, the proposed method is computationally more efficient without sacrificing registration accuracy.
Keywords: Multimodality images, image registration, Shannon entropy, Tsallis entropy, mutual information, Powell optimization.
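A minimal sketch of such a distance, assuming the Tsallis relative entropy as the divergence (the function name, the value of q, the smoothing constant, and the toy histograms are illustrative, not the paper's implementation):

```python
import numpy as np

def tsallis_divergence(p, r, q=1.5, eps=1e-12):
    """Tsallis relative entropy D_q(p || r); recovers the KL divergence as q -> 1."""
    p = np.asarray(p, dtype=float) + eps
    r = np.asarray(r, dtype=float) + eps
    p = p / p.sum()   # normalize to probability distributions
    r = r / r.sum()
    return (np.sum(p**q * r**(1.0 - q)) - 1.0) / (q - 1.0)

# Registration would search over candidate transformations (e.g. with Powell's
# method) for the one minimizing the divergence between the observed and
# expected joint intensity histograms; here we just evaluate it once.
observed = np.array([0.1, 0.4, 0.3, 0.2])
expected = np.array([0.25, 0.25, 0.25, 0.25])
d = tsallis_divergence(observed, expected)
```

The divergence is zero when the two distributions coincide and positive otherwise, which is what makes it usable as a registration objective.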
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1637
1759 Springback Property and Texture Distribution of Grained Pure Copper
Authors: Takashi Sakai, Hitoshi Omata, Jun-Ichi Koyama
Abstract:
To improve the material characteristics of single crystals and polycrystals of pure copper, the respective relationships between crystallographic orientations, microstructures, and the bending and mechanical properties were examined, and the texture distribution was also analyzed. A grain refinement procedure was performed to obtain a grained structure. Furthermore, analytical results related to crystal direction maps, inverse pole figures, and textures were obtained from SEM-EBSD analyses. Results showed that these grained metallic materials have peculiar springback characteristics at various bending angles.
Keywords: Pure Copper, Grain Refinement, Environmental Materials, SEM-EBSD Analysis, Texture, Microstructure
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 2172
1758 Considerations for Effectively Using Probability of Failure as a Means of Slope Design Appraisal for Homogeneous and Heterogeneous Rock Masses
Authors: Neil Bar, Andrew Heweston
Abstract:
Probability of failure (PF) often appears alongside factor of safety (FS) in design acceptance criteria for rock slope, underground excavation and open pit mine designs. However, the design acceptance criteria generally provide no guidance relating to how PF should be calculated for homogeneous and heterogeneous rock masses, or what qualifies as a 'reasonable' PF assessment for a given slope design. Observational and kinematic methods were widely used in the 1990s until advances in computing permitted the routine use of numerical modelling. In the 2000s and early 2010s, PF in numerical models was generally calculated using the point estimate method. More recently, some limit equilibrium analysis software offers statistical parameter inputs along with Monte-Carlo or Latin-Hypercube sampling methods to automatically calculate PF. Factors including rock type and density, weathering and alteration, intact rock strength, rock mass quality and shear strength, the location and orientation of geologic structure, shear strength of geologic structure, and groundwater pore pressure influence the stability of rock slopes. Significant engineering and geological judgment, interpretation and data interpolation are usually applied in determining these factors and amalgamating them into a geotechnical model which can then be analysed. Most factors are estimated 'approximately' or with allowances for some variability rather than 'exactly'. When it comes to numerical modelling, some of these factors are then treated deterministically (i.e. as exact values), while others have probabilistic inputs based on the user's discretion and understanding of the problem being analysed. This paper discusses the importance of understanding the key aspects of slope design for homogeneous and heterogeneous rock masses and how they can be translated into reasonable PF assessments where the data permits.
A case study from a large open pit gold mine in a complex geological setting in Western Australia is presented to illustrate how PF can be calculated using different methods, and how those methods can yield markedly different results. Ultimately, sound engineering judgment and logic are often required to decipher the true meaning and significance (if any) of some PF results.
Keywords: Probability of failure, point estimate method, Monte-Carlo simulations, sensitivity analysis, slope stability.
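The Monte-Carlo route to PF can be sketched as follows; the planar-slide limit equilibrium model, the input statistics, and the sample size below are hypothetical illustrations, not the case-study values:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Hypothetical planar failure model: FS = (c*A + W*cos(theta)*tan(phi)) / (W*sin(theta))
theta = np.radians(35.0)   # dip of the failure plane, treated deterministically
W, A = 1200.0, 40.0        # block weight (kN) and plane area (m^2), deterministic

# Shear strength parameters treated probabilistically (assumed statistics)
c = np.clip(rng.normal(15.0, 4.0, n), 0.0, None)   # cohesion (kPa), truncated at zero
phi = rng.normal(32.0, 3.0, n)                      # friction angle (degrees)

fs = (c * A + W * np.cos(theta) * np.tan(np.radians(phi))) / (W * np.sin(theta))
pf = np.mean(fs < 1.0)   # PF = fraction of sampled scenarios with FS below unity
```

This mirrors the point made in the abstract: which inputs are sampled versus fixed, and what distributions are assumed for them, directly drives the PF that comes out.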
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 1201
1757 High Impedance Fault Detection Using LVQ Neural Networks
Authors: Abhishek Bansal, G. N. Pillai
Abstract:
This paper presents a new method to detect high impedance faults in radial distribution systems. The magnitudes of the third and fifth harmonic components of voltages and currents are used as a feature vector for fault discrimination. The proposed methodology uses a learning vector quantization (LVQ) neural network as a classifier for identifying high impedance arc-type faults. The network learns from data obtained from simulations of a simple radial system under different fault and system conditions. Compared to a feed-forward neural network, a properly tuned LVQ network gives a quicker response.
Keywords: Fault identification, distribution networks, high impedance arc-faults, feature vector, LVQ networks.
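A minimal sketch of the LVQ1 training rule (the prototype initialization, learning rate, and epoch count are assumptions; in the paper's setting each feature vector would hold third- and fifth-harmonic magnitudes rather than arbitrary coordinates):

```python
import numpy as np

def lvq1_train(X, y, prototypes, proto_labels, lr=0.05, epochs=30, seed=1):
    """LVQ1: pull the winning prototype toward a sample with a matching label,
    push it away otherwise."""
    P = prototypes.astype(float).copy()
    rng = np.random.default_rng(seed)
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            k = np.argmin(np.linalg.norm(P - X[i], axis=1))   # nearest prototype
            sign = 1.0 if proto_labels[k] == y[i] else -1.0
            P[k] += sign * lr * (X[i] - P[k])
    return P

def lvq1_predict(X, P, proto_labels):
    d = np.linalg.norm(X[:, None, :] - P[None, :, :], axis=2)
    return proto_labels[np.argmin(d, axis=1)]
```

Because classification is a single nearest-prototype lookup, inference is cheap, which is consistent with the quicker response the abstract reports relative to a feed-forward network.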
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 2216
1756 Evidence Theory Enabled Quickest Change Detection Using Big Time-Series Data from Internet of Things
Authors: Hossein Jafari, Xiangfang Li, Lijun Qian, Alexander Aved, Timothy Kroecker
Abstract:
Traditionally in sensor networks, and recently in the Internet of Things, numerous heterogeneous sensors are deployed in a distributed manner to monitor a phenomenon that can often be modeled by an underlying stochastic process. The big time-series data collected by the sensors must be analyzed to detect a change in the stochastic process as quickly as possible while maintaining a tolerable false alarm rate. However, sensors may have different accuracies and sensitivity ranges, and they decay over time. As a result, the big time-series data collected by the sensors will contain uncertainties, and sometimes the data are conflicting. In this study, we present a framework that exploits the capabilities of Evidence Theory (a.k.a. Dempster-Shafer and Dezert-Smarandache Theories) for representing and managing uncertainty and conflict to achieve fast change detection and effectively deal with complementary hypotheses. Specifically, the Kullback-Leibler divergence is used as the similarity metric to calculate the distances between the estimated current distribution and the pre- and post-change distributions. Then mass functions are calculated, and the related combination rules are applied to combine the mass values among all sensors. Furthermore, we applied the method to estimate the minimum number of sensors needed for the combination, so that computational efficiency could be improved. A cumulative sum test is then applied to the ratio of pignistic probabilities to detect and declare the change for decision-making purposes. Simulation results using both synthetic data and real data from an experimental setup demonstrate the effectiveness of the presented schemes.
Keywords: CUSUM, evidence theory, KL divergence, quickest change detection, time series data.
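Page's CUSUM test, the classical form underlying the cumulative sum step described above, can be sketched as follows (the Gaussian pre-/post-change model, threshold, and synthetic data are illustrative; the paper applies the test to a ratio of pignistic probabilities rather than raw observations):

```python
import numpy as np

def cusum_detect(x, mu0, mu1, sigma, h=5.0):
    """Page's CUSUM on the Gaussian log-likelihood ratio: returns the first
    index at which the running statistic crosses threshold h, or -1 if never."""
    s = 0.0
    for t, xt in enumerate(x):
        # log [ N(x; mu1, sigma) / N(x; mu0, sigma) ]
        llr = ((xt - mu0) ** 2 - (xt - mu1) ** 2) / (2.0 * sigma ** 2)
        s = max(0.0, s + llr)   # reset at zero: only accumulate change evidence
        if s > h:
            return t
    return -1
```

For a mean shift from 0 to 1 occurring at sample 200, the statistic typically crosses the threshold within a few tens of samples after the change, while a larger h lowers the false alarm rate at the cost of detection delay.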
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 995