Search results for: Data cutting and sorting method
11168 A New Distribution and Application on the Lifetime Data
Authors: Gamze Ozel, Selen Cakmakyapan
Abstract:
We introduce a new model, the Marshall-Olkin Rayleigh distribution, which extends the Rayleigh distribution through the Marshall-Olkin transformation and admits both increasing and decreasing shapes of the hazard rate function. Various structural properties of the new distribution are derived, including explicit expressions for the moments, the generating and quantile functions, several entropy measures, and the order statistics. The model parameters are estimated by the method of maximum likelihood, and the observed information matrix is determined. The potential of the new model is illustrated by means of a simulation study.
Keywords: Marshall-Olkin distribution, Rayleigh distribution, estimation, maximum likelihood.
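The construction can be sketched numerically. Below is a minimal sketch assuming the usual Marshall-Olkin form of the survival function, S̄(x) = αS(x)/(1 − (1 − α)S(x)), applied to a baseline Rayleigh survival function with scale σ; the authors' exact parameterization may differ.

```python
import numpy as np

def mo_rayleigh_sf(x, alpha, sigma):
    """Survival function of a Marshall-Olkin Rayleigh distribution (assumed form)."""
    s = np.exp(-x**2 / (2.0 * sigma**2))          # baseline Rayleigh survival
    return alpha * s / (1.0 - (1.0 - alpha) * s)  # Marshall-Olkin tilt

def mo_rayleigh_pdf(x, alpha, sigma):
    """Density obtained by differentiating the Marshall-Olkin survival function."""
    s = np.exp(-x**2 / (2.0 * sigma**2))
    f = (x / sigma**2) * s                        # baseline Rayleigh density
    return alpha * f / (1.0 - (1.0 - alpha) * s)**2

def mo_rayleigh_hazard(x, alpha, sigma):
    return mo_rayleigh_pdf(x, alpha, sigma) / mo_rayleigh_sf(x, alpha, sigma)

x = np.linspace(0.01, 5.0, 200)
h = mo_rayleigh_hazard(x, alpha=0.3, sigma=1.0)  # hazard on a grid; its shape depends on alpha
print(h[:5])
```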
11167 Trimmed Mean as an Adaptive Robust Estimator of a Location Parameter for Weibull Distribution
Authors: Carolina B. Baguio
Abstract:
One of the purposes of the robust method of estimation is to reduce the influence of outliers in the data on the estimates. The outliers arise from gross errors or contamination from distributions with long tails. The trimmed mean is a robust estimate, meaning that it is not sensitive to violations of the distributional assumptions of the data. It is called an adaptive estimate when the trimming proportion is determined from the data rather than being fixed a priori. The main objective of this study is to determine the robustness properties of adaptive trimmed means in terms of efficiency, high breakdown point and influence function. Specifically, it seeks to find the magnitude of the trimming proportion of the adaptive trimmed mean that yields efficient and robust estimates of the parameter for data following a modified Weibull distribution with parameter λ = 1/2, where the trimming proportion is determined by a ratio of two trimmed means defined as the tail length. Secondly, the asymptotic properties of the tail length and the trimmed means are investigated. Finally, a comparison is made of the efficiency of the adaptive trimmed means, in terms of the standard deviation, for data-driven and a priori fixed trimming proportions. The asymptotic tail lengths, defined as ratios of two trimmed means, and the asymptotic variances were computed using the derived formulas, while the standard deviations of the tail lengths for samples of size 40 simulated from a Weibull distribution were computed over 100 iterations using a program written in Pascal. The findings of the study revealed that the tail lengths of the Weibull distribution increase in magnitude as the trimming proportions increase; that the tail-length measure and the adaptive trimmed mean are asymptotically independent as the number of observations n approaches infinity; that the tail length is asymptotically distributed as the ratio of two independent normal random variables; and that the asymptotic variances decrease as the trimming proportions increase. The simulation study showed empirically that the standard error of the adaptive trimmed mean based on the ratio of tail lengths is smaller, across the trimming proportions considered, than that obtained when the trimming proportions were fixed a priori.
Keywords: Adaptive robust estimate, asymptotic efficiency, breakdown point, influence function, L-estimates, location parameter, tail length, Weibull distribution.
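As a rough illustration of the adaptive idea, the sketch below chooses the trimming proportion from a Hogg-style tail-length ratio (a stand-in for the ratio-of-trimmed-means statistic used in the study) and then applies a trimmed mean; the cut-off values and the tail-length definition are illustrative assumptions, not the paper's.

```python
import numpy as np
from scipy import stats

def hogg_tail_length(x, beta_outer=0.05, beta_inner=0.5):
    """Hogg-style tail-length ratio: spread of the extreme 5% fractions divided by
    the spread of the extreme 50% fractions (assumed stand-in statistic)."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    k_out = max(1, int(np.ceil(beta_outer * n)))
    k_in = max(1, int(np.ceil(beta_inner * n)))
    outer = x[-k_out:].mean() - x[:k_out].mean()
    inner = x[-k_in:].mean() - x[:k_in].mean()
    return outer / inner

def adaptive_trimmed_mean(x):
    """Pick the trimming proportion from the tail length, then trim.
    The cut-offs below are illustrative, not the study's values."""
    q = hogg_tail_length(x)
    if q < 2.0:
        p = 0.05   # short tails: trim lightly
    elif q < 2.6:
        p = 0.15
    else:
        p = 0.25   # long tails: trim heavily
    return stats.trim_mean(x, proportiontocut=p), p

rng = np.random.default_rng(0)
sample = rng.weibull(0.5, size=40)   # Weibull with shape parameter 1/2, as in the study
print(adaptive_trimmed_mean(sample))
```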
11166 Efficiency of Post-Tensioning Method for Seismic Retrofitting of Pre-Cast Cylindrical Concrete Reservoirs
Authors: M.E.Karbaschi, R.Goudarzizadeh, N.Hedayat
Abstract:
Cylindrical concrete reservoirs are an appropriate choice for storing liquids such as water, oil, etc. By using pre-cast concrete reservoirs instead of in-situ constructed reservoirs, the speed and precision of construction increase considerably. In this construction method, the wall and roof panels are made in a factory with high-quality materials and precise control, and the pre-cast panels are then transported to the construction site for assembly. This method has a few weaknesses, such as the joints connecting the wall panels to each other and to the foundation, which have to resist applied loads such as seismic loads. One innovative method, successfully applied for the seismic retrofitting of numerous pre-cast cylindrical water reservoirs in New Zealand, is to place high-tensile cables around the reservoirs and post-tension them. In this paper, analytical modeling of the wall and roof panels and the post-tensioned cables is carried out with the finite element method, and the effects of the height-to-diameter ratio, the post-tensioning force, the liquid level in the reservoir, and the installation position of the tendons on the seismic response of the reservoirs are investigated.
Keywords: Seismic Retrofit, Pre-Cast, Concrete Reservoir, Post-Tensioning.
11165 ASLT Method for Beer Accelerated Shelf-Life Determination
Authors: Tatjana Rakcejeva, Valentina Skorina, Daina Karklina, Liga Skudra
Abstract:
The aim of the current research was to investigate the suitability of the ASLT method for accelerated beer shelf-life determination. The research was carried out on popular Latvian beers: light filtered and unfiltered pasteurized beer with an alcohol content of 5.2%, and dark filtered pasteurized beer with an alcohol content of 4.2%, each with a shelf-life of five months. Beer samples bottled in dark glass bottles were stored for 20 weeks at several temperature regimes: +10±1 °C, +20±1 °C, +30±1 °C and +40±1 °C. Physico-chemical and microbiological quality parameters of the samples were tested every two weeks using standard methods. Storage at +30±1 °C accelerates the shelf-life determination by a factor of 2.5 for filtered pasteurized light beer, 1.4 for unfiltered pasteurized light beer and 1.7 for filtered pasteurized dark beer. The present experiments showed that beer shelf-life can be determined rapidly using the ASLT method when the storage temperature is increased by 10±1 °C.
Keywords: Beer, shelf-life, ASLT method.
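A brief sketch of the kind of arithmetic behind ASLT: assuming a Q10-style model in which the rate of quality loss multiplies by a factor Q10 for every 10 °C rise, the accelerated test duration is the target shelf-life divided by the acceleration factor. The Q10 value below is an illustrative assumption, not a figure from the study.

```python
def aslt_accelerated_duration(shelf_life_weeks, t_storage, t_accelerated, q10=2.0):
    """Estimate how long an accelerated test must run to cover a target shelf-life.

    shelf_life_weeks : shelf-life claimed at the normal storage temperature
    t_storage        : normal storage temperature, deg C
    t_accelerated    : elevated test temperature, deg C
    q10              : assumed factor by which degradation speeds up per 10 deg C
    """
    acceleration = q10 ** ((t_accelerated - t_storage) / 10.0)
    return shelf_life_weeks / acceleration, acceleration

# Five months (~21.7 weeks) at +10 C, tested at +30 C with an assumed Q10 of 2
weeks_needed, factor = aslt_accelerated_duration(21.7, 10, 30, q10=2.0)
print(f"acceleration factor {factor:.1f}, test duration {weeks_needed:.1f} weeks")
```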
11164 Optimal Capacitor Allocation for Loss Reduction in Distribution System Using Fuzzy and Plant Growth Simulation Algorithm
Authors: R. Srinivasa Rao
Abstract:
This paper presents a new and efficient approach for capacitor placement in radial distribution systems that determines the optimal locations and sizes of capacitors with the objective of improving the voltage profile and reducing power loss. The solution methodology has two parts: in part one, loss sensitivity factors are used to select the candidate locations for capacitor placement, and in part two, a new algorithm that employs the Plant Growth Simulation Algorithm (PGSA) is used to estimate the optimal size of the capacitors at the optimal buses determined in part one. The main advantage of the proposed method is that it does not require any external control parameters. The other advantage is that it handles the objective function and the constraints separately, avoiding the trouble of determining barrier factors. The proposed method is applied to 9- and 34-bus radial distribution systems. The solutions obtained by the proposed method are compared with those of other methods. The proposed method has outperformed the other methods in terms of the quality of the solution.
Keywords: Distribution systems, Capacitor allocation, Loss reduction, Fuzzy, PGSA.
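A small sketch of the candidate-bus selection step described above, assuming the commonly used loss sensitivity factor ∂P_loss/∂Q_eff = 2·Q_eff·R/V², where Q_eff is the reactive power fed through a branch, R the branch resistance and V the receiving-end voltage; the branch data below are purely illustrative.

```python
import numpy as np

# Illustrative branch data: (receiving bus, R in ohm, Q_eff in kVAr, V in kV)
branches = [
    (2, 0.10, 450.0, 11.0),
    (3, 0.15, 300.0, 10.8),
    (4, 0.25, 220.0, 10.6),
    (5, 0.30, 180.0, 10.5),
]

def loss_sensitivity(r_ohm, q_kvar, v_kv):
    """Loss sensitivity factor dP_loss/dQ_eff = 2*Q_eff*R/V^2 for one branch."""
    return 2.0 * (q_kvar * 1e3) * r_ohm / (v_kv * 1e3) ** 2

factors = {bus: loss_sensitivity(r, q, v) for bus, r, q, v in branches}
# Candidate buses for capacitor placement: highest sensitivity first
candidates = sorted(factors, key=factors.get, reverse=True)
print(candidates, factors)
```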
11163 Risk Classification of SMEs by Early Warning Model Based on Data Mining
Authors: Nermin Ozgulbas, Ali Serhan Koyuncugil
Abstract:
One of the biggest problems of SMEs is their tendency toward financial distress because of an insufficient financial background. In this study, an Early Warning System (EWS) model based on data mining for financial risk detection is presented. The CHAID algorithm has been used for the development of the EWS. Thanks to its automated nature, the developed EWS can serve as a tailor-made financial advisor in the decision-making process of firms whose owners have an inadequate financial background. In addition, an application of the model was implemented covering 7,853 SMEs, based on Turkish Central Bank (TCB) 2007 data. Using the EWS model, 31 risk profiles, 15 risk indicators, 2 early warning signals, and 4 financial road maps have been determined for financial risk mitigation.
Keywords: Early Warning Systems, Data Mining, Financial Risk, SMEs.
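A minimal sketch of how such a data-mining early-warning classifier can be built. CHAID itself is not available in scikit-learn, so a CART decision tree is used here as a stand-in; the feature names and the synthetic data are illustrative assumptions, not the TCB variables used by the authors.

```python
import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(42)
n = 2000
# Illustrative financial ratios; the real study uses TCB 2007 balance-sheet data.
X = pd.DataFrame({
    "current_ratio": rng.lognormal(0.2, 0.5, n),
    "debt_to_equity": rng.lognormal(0.5, 0.7, n),
    "return_on_assets": rng.normal(0.05, 0.08, n),
})
# Synthetic "financial distress" label driven by the ratios plus noise.
risk_score = -X["current_ratio"] + 0.8 * X["debt_to_equity"] - 5 * X["return_on_assets"]
y = (risk_score + rng.normal(0, 0.5, n) > risk_score.median()).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=50).fit(X_tr, y_tr)
print("holdout accuracy:", tree.score(X_te, y_te))
print(export_text(tree, feature_names=list(X.columns)))  # leaves act as risk profiles
```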
11162 Introduction of the Harmfulness of the Seismic Signal in the Assessment of the Performance of Reinforced Concrete Frame Structures
Authors: Kahil Amar, Boukais Said, Kezmane Ali, Hamizi Mohand, Hannachi Naceur Eddine
Abstract:
The principle of seismic performance evaluation methods is to provide a measure of the extent to which a building, or a set of buildings, is liable to be damaged by an earthquake. The common objective of many of these methods is to supply classification criteria. The purpose of this study is to present a method for assessing the seismic performance of structures based on the pushover method; we are particularly interested in reinforced concrete frame structures, which represent a significant percentage of damaged structures after a seismic event. The work is based on the characterization of the seismic motion of the various earthquake zones in terms of PGA and PGD, obtained by means of the SIMQK_GR and PRISM software, and a correlation between the performance points and the scalars characterizing the earthquakes is developed.
Keywords: Seismic performance, Pushover method, characterization of seismic motion, harmfulness of the seismic signal
11161 Electromagnetic Assessment of Submarine Power Cable Degradation Using Finite Element Method and Sensitivity Analysis
Authors: N. Boutra, N. Ravot, J. Benoit, O. Picon
Abstract:
Submarine power cables used for the distribution and transmission of electric energy from offshore wind farms are subject to numerous threats. Some of the risks are associated with transport, installation and operation in a harsh marine environment. This paper describes the feasibility of an electromagnetic low-frequency sensing technique for submarine power cable failure prediction. The impact of the shape of a structural damage and of material variability on the induced electric field is evaluated. The analysis is performed by modeling the cable with the finite element method; sensitivity analysis is then used to identify the main damage characteristics affecting the electric field variation. Lastly, we discuss the results obtained.
Keywords: Electromagnetism, defect, finite element method, sensitivity analysis, submarine power cables.
11160 From Forbidden States to Linear Constraints
Authors: M. Zareiee, A. Dideban, P. Nazemzadeh
Abstract:
This paper deals with the problem of constructing constraints in non-safe Petri nets and then reducing the number of the constructed constraints. In a system, it is possible to assign some linear constraints to forbidden states; enforcing these constraints on the system prevents it from entering those states. However, there is no systematic method for assigning constraints to forbidden states in non-safe Petri nets. In this paper, a useful method is proposed for constructing constraints in non-safe Petri nets. When the number of these constraints is large, enforcing them on the system may complicate the Petri net model, so another method is proposed for reducing the number of constructed constraints.
Keywords: Discrete event system, supervisory control, Petri net, constraint.
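A short sketch of how a linear marking constraint can be enforced on a Petri net by adding a monitor (supervisor) place, using the standard place-invariant construction D_c = −L·D and m_c0 = b − L·m_0; the small net below is an illustrative example, not one from the paper.

```python
import numpy as np

# Incidence matrix D (places x transitions) and initial marking m0 of a small example net.
D = np.array([
    [-1,  0,  1],   # p1
    [ 1, -1,  0],   # p2
    [ 0,  1, -1],   # p3
])
m0 = np.array([2, 0, 0])

# Linear constraint L . m <= b, e.g. "p2 + p3 may never hold more than 1 token".
L = np.array([[0, 1, 1]])
b = np.array([1])

# Place-invariant construction of the monitor place enforcing the constraint.
Dc = -L @ D              # incidence row of the monitor place
mc0 = b - L @ m0         # its initial marking
assert (mc0 >= 0).all(), "constraint must hold in the initial marking"

print("monitor place incidence:", Dc)
print("monitor initial marking:", mc0)
```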
11159 Performance Analysis of a Combined Ordered Successive and Interference Cancellation Using Zero-Forcing Detection over Rayleigh Fading Channels in MIMO Systems
Authors: Jamal R. Elbergali
Abstract:
Multiple Input Multiple Output (MIMO) systems are wireless systems with multiple antenna elements at both ends of the link. Wireless communication systems demand high data rates and spectral efficiency with increased reliability. MIMO systems have become popular techniques for achieving these goals, because increased data rates are possible through spatial multiplexing and diversity. Spatial Multiplexing (SM) is used to achieve higher throughput than diversity. In this paper, we propose Zero-Forcing (ZF) detection using a combination of Ordered Successive Interference Cancellation (OSIC) and Zero-Forcing with Interference Cancellation (ZF-IC). The proposed method uses OSIC based on Signal-to-Noise Ratio (SNR) ordering to estimate the last symbol; the estimated last symbol is then used as an input to the ZF-IC stage. We analyze the Bit Error Rate (BER) performance of the proposed MIMO system over a Rayleigh fading channel, using the Binary Phase Shift Keying (BPSK) modulation scheme. The results show better performance than the previous methods.
Keywords: SNR, BER, BPSK, MIMO, modulation, Zero-Forcing (ZF), OSIC, ZF-IC, Spatial Multiplexing (SM).
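A compact simulation sketch of the OSIC building block: ZF detection with SNR-ordered successive interference cancellation for a 2×2 Rayleigh channel with BPSK. The final ZF-IC pass proposed by the author is not reproduced here, and the system parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
nt, nr, n_sym, snr_db = 2, 2, 5000, 10
sigma2 = 10 ** (-snr_db / 10)          # noise variance per receive antenna

def zf_osic_detect(H, y):
    """Zero-forcing detection with SNR-ordered successive interference cancellation."""
    H = H.copy()
    y = y.copy()
    order = list(range(H.shape[1]))
    x_hat = np.zeros(H.shape[1])
    while order:
        W = np.linalg.pinv(H)                       # ZF equalizer for remaining streams
        row_norms = np.sum(np.abs(W) ** 2, axis=1)
        k = int(np.argmin(row_norms))               # stream with highest post-ZF SNR
        s = np.sign((W[k] @ y).real)                # BPSK decision
        x_hat[order[k]] = s
        y = y - H[:, k] * s                         # cancel its contribution
        H = np.delete(H, k, axis=1)
        order.pop(k)
    return x_hat

errors = 0
for _ in range(n_sym):
    x = rng.choice([-1.0, 1.0], size=nt)
    H = (rng.standard_normal((nr, nt)) + 1j * rng.standard_normal((nr, nt))) / np.sqrt(2)
    n = np.sqrt(sigma2 / 2) * (rng.standard_normal(nr) + 1j * rng.standard_normal(nr))
    y = H @ x + n
    errors += np.sum(zf_osic_detect(H, y) != x)

print("BER:", errors / (n_sym * nt))
```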
11158 Simultaneous Determination of Reference Free-Stream Temperature and Convective Heat Transfer Coefficient
Authors: Giho Jeong, Sooin Jeong, Kuisoon Kim
Abstract:
It is very important to determine the reference temperature when measuring convective heat transfer, because it is used to calculate the temperature potential. This paper deals with the development of a new method that can determine the heat transfer coefficient and the reference free-stream temperature simultaneously, based on transient heat transfer experiments using two narrow-band thermo-tropic liquid crystals (TLCs). The method is validated through an error analysis in terms of the random uncertainties in the measured temperatures. It is shown how the uncertainties in the heat transfer coefficient and the free-stream temperature can be reduced. The general method described in this paper is applicable to many heat transfer models with unknown free-stream temperature.
Keywords: Heat transfer coefficient, Thermo-tropic Liquid Crystal (TLC), free-stream temperature.
11157 Exploration of Least Significant Bit Based Watermarking and Its Robustness against Salt and Pepper Noise
Authors: Kamaldeep Joshi, Rajkumar Yadav, Sachin Allwadhi
Abstract:
Image steganography is an important aspect of information hiding: the information is hidden within an image, and the image travels openly on the Internet. The Least Significant Bit (LSB) method is one of the most popular image steganography methods. In this method, an information bit is hidden in the LSB of an image pixel. In the one-bit LSB steganography method, the total number of pixels and the total number of message bits are equal. In this paper, the LSB method of image steganography is used for watermarking, which is an application of steganography. The watermark contains 80×88 pixels and each pixel requires 8 bits for its binary form, so the total number of bits required to hide the watermark is 80×88×8 = 56,320. The experiment was performed on standard 256×256 and 512×512 images. After the watermark insertion, a histogram analysis was performed. Salt-and-pepper noise with a density of 0.02 was added to the stego image in order to evaluate the robustness of the method. The watermark was successfully retrieved after the insertion of noise. An experiment was also performed to assess the imperceptibility of the stego image and the retrieved watermark. It is clear that the LSB watermarking scheme is robust to salt-and-pepper noise.
Keywords: LSB, watermarking, salt and pepper, PSNR.
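A minimal sketch of one-bit LSB watermark embedding and extraction, with salt-and-pepper noise of density 0.02 applied to the stego image. The array sizes mirror the abstract, but the implementation details (raster-order embedding, random cover image) are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
cover = rng.integers(0, 256, size=(256, 256), dtype=np.uint8)    # stand-in cover image
watermark = rng.integers(0, 2, size=80 * 88 * 8, dtype=np.uint8)  # 56,320 watermark bits

def embed_lsb(cover, bits):
    """Write one watermark bit into the LSB of each pixel, in raster order."""
    stego = cover.copy().ravel()
    stego[:bits.size] = (stego[:bits.size] & 0xFE) | bits
    return stego.reshape(cover.shape)

def extract_lsb(stego, n_bits):
    return (stego.ravel()[:n_bits] & 1).astype(np.uint8)

def salt_and_pepper(img, density=0.02, rng=rng):
    """Set a fraction `density` of pixels to 0 or 255."""
    noisy = img.copy()
    mask = rng.random(img.shape) < density
    noisy[mask] = rng.choice([0, 255], size=int(mask.sum()))
    return noisy

stego = embed_lsb(cover, watermark)
noisy = salt_and_pepper(stego, 0.02)
recovered = extract_lsb(noisy, watermark.size)
print("bit error rate after noise:", np.mean(recovered != watermark))
```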
11156 The Significance of Awareness about Gender Diversity for the Future of Work: A Multi-Method Study of Organizational Structures and Policies Considering Trans and Gender Diversity
Authors: Robin C. Ladwig
Abstract:
The future of work is becoming less predictable, which requires increasing adaptability of organizations to social and work changes. Society is transforming regarding gender identity, in the sense that more people come forward to identify as trans and gender diverse (TGD). Organizations lacking inclusive organizational structures are ill-equipped to provide a safe and encouraging work environment. This qualitative multi-method research about TGD inclusivity in the workplace explores the enablers of and barriers to TGD individuals engaging satisfactorily in the work environment and organizational culture. Furthermore, these TGD insights are analyzed with respect to organizational implications and awareness from a leadership and management perspective. The semi-structured online interviews with TGD individuals and the photo-elicited open-ended questionnaire addressed to leadership and management in diversity, career development, and human resources have been analyzed with a critical grounded theory approach. The findings demonstrate the significance of TGD voices, the support of leadership and management, as well as the synergy between voices and leadership. Hence, the study indicates practical implications such as the revision of exclusive language used in policies, data collection, or communication, and the reconsideration of organizational decision-making by leaders to include TGD voices.
Keywords: Future of work, occupational identity, organizational decision-making, trans and gender diverse identity.
11155 Using Data from Foursquare Web Service to Represent the Commercial Activity of a City
Authors: Taras Agryzkov, Almudena Nolasco-Cirugeda, José L. Oliver, Leticia Serrano-Estrada, Leandro Tortosa, José F. Vicent
Abstract:
This paper aims to represent the commercial activity of a city taking the social network Foursquare as its data source. The city of Murcia is selected as a case study, and the location-based social network Foursquare is the main source of information. After carrying out a reorganisation of the user-generated data extracted from Foursquare, it is possible to graphically display on a map the various city spaces and venues, especially those related to commercial, food and entertainment sector businesses. The obtained visualisation provides information about activity patterns in the city of Murcia according to people's interests and preferences and, moreover, interesting facts about certain characteristics of the town itself.
Keywords: Social networks, Foursquare, spatial analysis, data visualization, geocomputation.
11154 Exploring Inter-Relationships between Events to Identify Strategic Technological Competencies: A Combined Approach
Authors: Cláudio Santos, Madalena Araújo, Nuno Correia
Abstract:
The inherent complexity of today's business environments is forcing organizations to be attentive to dynamics on several fronts. Therefore, the management of technological innovation is continually faced with uncertainty about the future. These issues lead to the need for a systemic perspective, able to analyze the consequences of interactions between different factors. The field of technology foresight has proposed methods and tools to deal with this broader perspective. In an attempt to provide a method for analyzing the complex interactions between events in several areas, starting from the identification of the most strategic competencies, this paper presents a methodology based on the Delphi method and Quality Function Deployment. This methodology is applied to a sheet metal processing equipment manufacturer as a case study.
Keywords: Competencies, Delphi Method, Quality Function Deployment, Technology Foresight.
11153 Statistical Reliability Based Modeling of Series and Parallel Operating Systems using Extreme Value Theory
Authors: Mohamad Mahdavi, Mojtaba Mahdavi
Abstract:
This paper presents a new method for computing the reliability of a system whose components are arranged in a series or parallel configuration. In this method, the life distribution function of the whole structure is estimated using the asymptotic Type I Extreme Value (EV), or Gumbel, distribution: the EV distribution for minima is used to estimate the life distribution of the series structure, and the distribution for maxima is used for the parallel system. All parameters are estimated by the method of moments. The reliability function, the failure (hazard) rate and the p-th percentile point of each function are determined. Other important indices, such as the Mean Time To Failure (MTTF) and the Mean Time To Repair (MTTR), for non-repairable and renewable systems in both series and parallel structures, are also computed.
Keywords: Reliability, extreme value, parallel, series, life distribution.
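A brief sketch of the idea under stated assumptions: component lifetimes are simulated, the minimum (series) or maximum (parallel) lifetime per system is fitted with a Gumbel distribution by the method of moments, and the reliability function and MTTF are read off. The Weibull component model below is purely illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_sys, n_comp = 5000, 8
# Illustrative component lifetimes (Weibull-distributed); not the paper's data.
lifetimes = rng.weibull(1.5, size=(n_sys, n_comp)) * 100.0

series_life = lifetimes.min(axis=1)     # series system fails at the first component failure
parallel_life = lifetimes.max(axis=1)   # parallel system fails at the last component failure

EULER_GAMMA = 0.5772156649

def fit_gumbel_moments(sample, minima=False):
    """Method-of-moments fit of the Type I EV (Gumbel) distribution."""
    beta = np.sqrt(6.0 * sample.var()) / np.pi
    mu = sample.mean() + (EULER_GAMMA * beta if minima else -EULER_GAMMA * beta)
    return mu, beta

mu_s, beta_s = fit_gumbel_moments(series_life, minima=True)
mu_p, beta_p = fit_gumbel_moments(parallel_life, minima=False)

t = 20.0
print("series   R(t):", stats.gumbel_l.sf(t, loc=mu_s, scale=beta_s))
print("parallel R(t):", stats.gumbel_r.sf(t, loc=mu_p, scale=beta_p))
print("series MTTF:", mu_s - EULER_GAMMA * beta_s, "parallel MTTF:", mu_p + EULER_GAMMA * beta_p)
```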
11152 Fuzzy Cost Support Vector Regression
Authors: Hadi Sadoghi Yazdi, Tahereh Royani, Mehri Sadoghi Yazdi, Sohrab Effati
Abstract:
In this paper, a new version of support vector regression (SVR) is presented, namely Fuzzy Cost SVR (FCSVR). A distinctive property of FCSVR is that it operates on fuzzy data while the fuzzy cost (fuzzy margin and fuzzy penalty) is maximized. This idea allows uncertainty in the penalty and margin terms jointly. Robustness against noise is shown in the experimental results as a property of the proposed method, along with its superiority relative to conventional SVR.
Keywords: Support vector regression, Fuzzy input, Fuzzy cost.
11151 Determination of Penicillins Residues in Livestock and Marine Products by LC/MS/MS
Authors: Ji Young Song, Soo Jung Hu, Hyunjin Joo, Joung Boon Hwang, Mi Ok Kim, Shin Jung Kang, Dae Hyun Cho
Abstract:
A multi-residue analysis method for penicillins was developed and validated in bovine muscle, chicken, milk, and flatfish. Detection was based on liquid chromatography tandem mass spectrometry (LC/MS/MS). The developed method was validated for specificity, precision, recovery, and linearity. The analytes were extracted with 80% acetonitrile and cleaned up by a single reversed-phase solid-phase extraction step. Six penicillins showed recoveries higher than 76%, with the exception of amoxicillin (59.7%). Relative standard deviations (RSDs) were not more than 10%. LOQ values ranged from 0.1 to 4.5 µg/kg. The method was applied to 128 real samples. Benzylpenicillin was detected in 15 samples, cloxacillin in 7 samples, and oxacillin in 2 samples, but the detected levels were below the MRLs for penicillins in these samples.
Keywords: Penicillins, livestock product, multi-residue analysis, LC/MS/MS.
11150 Long-Range Dependence of Financial Time Series Data
Authors: Chatchai Pesee
Abstract:
This paper examines long-range dependence, or long memory, of financial time series on exchange rate data using fractional Brownian motion (fBm). The principle of the spectral density function in Section II is used to find the range of the Hurst parameter (H) of the fBm. If 0 < H < 1/2, the process has short-range dependence (SRD); it exhibits long memory, or long-range dependence (LRD), if 1/2 < H < 1. The exchange rate data can be modeled by fBm according to the estimated value of the Hurst parameter (H). Furthermore, some of the definitions of fBm, long-range dependence and self-similarity are also reviewed in Section II. Our results in Section III indicate that there exists long memory, or long-range dependence (LRD), in the exchange rate data. Long-range dependence of the exchange rate data and estimation of the Hurst parameter (H) are discussed in Section IV, while conclusions are given in Section V.
Keywords: Fractional Brownian motion, long-range dependence, memory, short-range dependence.
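A compact sketch of the spectral-density approach to estimating H: for the increments of an fBm-type process (fractional Gaussian noise) the periodogram behaves like f^(1−2H) at low frequencies, so a log-log regression of the periodogram against frequency gives an estimate of H. The synthetic series below is a stand-in for the exchange rate data.

```python
import numpy as np

def hurst_periodogram(x, frac=0.1):
    """Estimate H from the low-frequency slope of the periodogram of the increments.

    For fractional Gaussian noise (the increments of fBm) the spectral density
    behaves like f**(1 - 2H) near zero, so a log-log fit gives H = (1 - slope) / 2.
    """
    inc = np.diff(np.asarray(x, dtype=float))
    n = len(inc)
    freqs = np.fft.rfftfreq(n)[1:]                     # drop the zero frequency
    power = np.abs(np.fft.rfft(inc - inc.mean()))[1:] ** 2 / n
    m = max(8, int(frac * len(freqs)))                 # keep only the lowest frequencies
    slope, _ = np.polyfit(np.log(freqs[:m]), np.log(power[:m]), 1)
    return (1.0 - slope) / 2.0

# Stand-in series: an ordinary random walk, whose increments correspond to H = 0.5.
rng = np.random.default_rng(0)
walk = np.cumsum(rng.standard_normal(4096))
print("estimated H:", hurst_periodogram(walk))   # should come out near 0.5
```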
11149 Implementation of ADETRAN Language Using Message Passing Interface
Authors: Akiyoshi Wakatani
Abstract:
This paper describes the Message Passing Interface (MPI) implementation of the ADETRAN language and its evaluation on SX-ACE supercomputers. The ADETRAN language includes the pdo statement, which specifies the data distribution and parallel computations, and the pass statement, which specifies the redistribution of arrays. Two methods for the implementation of the pass statement are discussed, and a performance evaluation using the Splitting-Up CG method is presented. The effectiveness of the parallelization is evaluated, and the advantage of one-dimensional distribution is empirically confirmed by the experimental results.
Keywords: Iterative methods, array redistribution, translator, distributed memory.
11148 Optimal DG Placement in Distribution Systems Using Cost/Worth Analysis
Authors: M Ahmadigorji, A. Abbaspour, A Rajabi-Ghahnavieh, M. Fotuhi- Firuzabad
Abstract:
DG application has received increasing attention in recent years. The impact of DG on various aspects of distribution system operation, such as reliability and energy loss, depends highly on the DG location in the distribution feeder. Optimal DG placement is an important subject which has not yet been fully discussed. This paper presents an optimization method to determine the optimal DG placement, based on a cost/worth analysis approach. This method considers technical and economic factors such as energy loss, load point reliability indices and DG costs, and particularly the portability of DG. The proposed method is applied to a test system, and the impacts of different parameters, such as the load growth rate and the load forecast uncertainty (LFU), on the optimum DG location are studied.
Keywords: Distributed generation, optimal placement, cost/worth analysis, customer interruption cost, dynamic programming.
11147 Study Punching Shear of Steel Fiber Reinforced Self Compacting Concrete Slabs by Nonlinear Analysis
Authors: Khaled S. Ragab
Abstract:
This paper deals with the behavior and punching shear capacity of flat slabs produced from steel fiber reinforced self-compacting concrete (SFRSCC), analyzed by the nonlinear finite element method. Nonlinear finite element analysis of nine slab specimens was performed using the ANSYS software. A general description of the finite element method and the theoretical modeling of concrete and reinforcement are presented. The nonlinear finite element analysis program ANSYS is utilized owing to its capability to predict either the response of reinforced concrete slabs in the post-elastic range or the ultimate strength of flat slabs produced from SFRSCC. In order to verify the analytical model used in this research against the experimental data, the finite element analyses were performed, and then a parametric study of the effects of the flexural reinforcement ratio, the upper reinforcement ratio, and the volume fraction of steel fibers was carried out. A comparison between the experimental results and those predicted by the existing models is presented. Results and conclusions that may be useful for designers are presented.
Keywords: Nonlinear FEM, Punching shear behavior, Flat slabs and Steel fiber reinforced self compacting concrete (SFRSCC).
11146 Meta Random Forests
Authors: Praveen Boinee, Alessandro De Angelis, Gian Luca Foresti
Abstract:
Leo Breiman's Random Forests (RF) is a recent development in tree-based classifiers and has quickly proven to be one of the most important algorithms in the machine learning literature. It has shown robust and improved classification results on standard data sets. Ensemble learning algorithms such as AdaBoost and Bagging have been actively researched and have shown improved classification results for several benchmark data sets, mainly with decision trees as their base classifiers. In this paper we experiment with applying these meta-learning techniques to random forests. We study the behavior of ensembles of random forests on the standard data sets available in the UCI repository. We compare the original random forest algorithm with its ensemble counterparts and discuss the results.
Keywords: Random Forests (RF), ensembles, UCI.
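A small sketch of the idea: wrap a random forest as the base estimator of Bagging and AdaBoost ensembles and compare against the plain forest. The dataset and hyperparameters are illustrative, and the `estimator=` keyword assumes scikit-learn 1.2 or later.

```python
from sklearn.datasets import load_breast_cancer          # a UCI-style benchmark dataset
from sklearn.ensemble import (AdaBoostClassifier, BaggingClassifier,
                              RandomForestClassifier)
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

base_rf = RandomForestClassifier(n_estimators=50, random_state=0)

models = {
    "plain RF": base_rf,
    "Bagging of RFs": BaggingClassifier(estimator=base_rf, n_estimators=10, random_state=0),
    "AdaBoost of RFs": AdaBoostClassifier(estimator=base_rf, n_estimators=10, random_state=0),
}

for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name:16s} accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```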
11145 Mapping of C* Elements in Finite Element Method using Transformation Matrix
Authors: G. H. Majzoob, B. Sharifi Hamadani
Abstract:
Mapping between local and global coordinates is an important issue in the finite element method, as all calculations are performed in local coordinates. The concern arises when sub-parametric elements are used, in which the shape functions of the field variable and of the geometry of the element are not the same. This is particularly the case for C* elements, in which the extra degrees of freedom added to the nodes make the elements sub-parametric. In the present work, the transformation matrix for the C1* element (an 8-noded hexahedron element with 12 degrees of freedom at each node) is obtained using equivalent C0 elements (with the same number of degrees of freedom). The convergence rate of the 8-noded C1* element is nearly equal to that of its equivalent C0 element, while it consumes less CPU time than the C0 element. The existence of derivative degrees of freedom at the nodes of the C1* element, along with its excellent convergence, makes it superior to its equivalent C0 element.
Keywords: Mapping, finite element method, C* elements, convergence, C0 elements.
11144 Optimal Capacitor Placement in a Radial Distribution System using Plant Growth Simulation Algorithm
Authors: R. Srinivasa Rao, S. V. L. Narasimham
Abstract:
This paper presents a new and efficient approach for capacitor placement in radial distribution systems that determines the optimal locations and sizes of capacitors with the objective of improving the voltage profile and reducing power loss. The solution methodology has two parts: in part one, loss sensitivity factors are used to select the candidate locations for capacitor placement, and in part two, a new algorithm that employs the Plant Growth Simulation Algorithm (PGSA) is used to estimate the optimal size of the capacitors at the optimal buses determined in part one. The main advantage of the proposed method is that it does not require any external control parameters. The other advantage is that it handles the objective function and the constraints separately, avoiding the trouble of determining barrier factors. The proposed method is applied to 9-, 34-, and 85-bus radial distribution systems. The solutions obtained by the proposed method are compared with those of other methods. The proposed method has outperformed the other methods in terms of the quality of the solution.
Keywords: Distribution systems, Capacitor placement, loss reduction, Loss sensitivity factors, PGSA.
11143 A Multimodal Approach for Biometric Authentication with Multiple Classifiers
Authors: Sorin Soviany, Cristina Soviany, Mariana Jurian
Abstract:
The paper presents a multimodal approach to biometric authentication based on multiple classifiers. The proposed solution uses a post-classification biometric fusion method in which the outputs of the biometric data classifiers are combined in order to improve the overall biometric system performance by decreasing the classification error rates. The paper also shows that the biometric recognition task can be improved by means of careful feature selection, since not all of the feature vector components support the accuracy improvement.
Keywords: biometric fusion, multiple classifiers
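A minimal sketch of post-classification fusion: per-modality classifiers are trained separately and their outputs are combined, here by averaging predicted class probabilities. The two synthetic "modalities" and the chosen classifiers are illustrative assumptions, not the authors' setup.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Two synthetic feature sets standing in for two biometric modalities of the same subjects.
X_face, y = make_classification(n_samples=1000, n_features=20, n_informative=8, random_state=0)
rng = np.random.default_rng(1)
X_voice = X_face @ rng.standard_normal((20, 15)) + 0.5 * rng.standard_normal((1000, 15))

idx_tr, idx_te = train_test_split(np.arange(len(y)), test_size=0.3, random_state=0)

clf_face = SVC(probability=True, random_state=0).fit(X_face[idx_tr], y[idx_tr])
clf_voice = LogisticRegression(max_iter=1000).fit(X_voice[idx_tr], y[idx_tr])

# Post-classification fusion: average the per-modality class probabilities.
p_fused = (clf_face.predict_proba(X_face[idx_te]) +
           clf_voice.predict_proba(X_voice[idx_te])) / 2.0
y_fused = p_fused.argmax(axis=1)

print("face only :", clf_face.score(X_face[idx_te], y[idx_te]))
print("voice only:", clf_voice.score(X_voice[idx_te], y[idx_te]))
print("fused     :", np.mean(y_fused == y[idx_te]))
```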
11142 Local Curvelet Based Classification Using Linear Discriminant Analysis for Face Recognition
Authors: Mohammed Rziza, Mohamed El Aroussi, Mohammed El Hassouni, Sanaa Ghouzali, Driss Aboutajdine
Abstract:
In this paper, an efficient local appearance feature extraction method based on the multi-resolution Curvelet transform is proposed in order to further enhance the performance of the well-known Linear Discriminant Analysis (LDA) method when applied to face recognition. Each face is described by a subset of band-filtered images containing block-based Curvelet coefficients. These coefficients characterize the face texture, and a set of simple statistical measures allows us to form compact and meaningful feature vectors. The proposed method is compared with related feature extraction methods such as Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), and Independent Component Analysis (ICA). Two different multi-resolution transforms, Wavelet (DWT) and Contourlet, were also compared against the block-based Curvelet-LDA algorithm. Experimental results on the ORL, YALE and FERET face databases convince us that the proposed method provides a better representation of the class information and obtains much higher recognition accuracies.
Keywords: Curvelet, Linear Discriminant Analysis (LDA), Contourlet, Discrete Wavelet Transform (DWT), block-based analysis, face recognition (FR).
11141 Multiscale Blind Image Restoration with a New Method
Authors: Alireza Mallahzadeh, Hamid Dehghani, Iman Elyasi
Abstract:
A new method, based on normal shrink and a modified version of Katssagelous and Lay, is proposed for multiscale blind image restoration. The method deals with noise and blur in images. It is shown that normal shrink gives the highest S/N (signal-to-noise ratio) for the image denoising process. The multiscale blind image restoration is divided into two parts: the first part of this paper proposes normal shrink for image denoising, and the second part proposes a modified version of Katssagelous and Lay for blur estimation, together with the combination of both methods to achieve multiscale blind image restoration.
Keywords: Multiscale blind image restoration, image denoising, blur estimation.
11140 On Identity Disclosure Risk Measurement for Shared Microdata
Authors: M. N. Huda, S. Yamada, N. Sonehara
Abstract:
Probability-based identity disclosure risk measurement may give the same overall risk for different anonymization strategies of the same dataset. Some entities in the anonymous dataset may have higher identification risks than others. Individuals are more concerned about risks higher than the average and are more interested in knowing whether they may be at higher risk. The notion of overall risk in the above measurement method does not indicate whether some of the involved entities have a higher identity disclosure risk than others. In this paper, we introduce an identity disclosure risk measurement method that not only conveys the overall risk, but also indicates whether some of the members have a higher risk than others. The proposed method quantifies the overall risk based on the individual risk values, the percentage of records that have a risk value higher than the average, and how much larger the higher risk values are compared to the average. We have analyzed the disclosure risks for different disclosure control techniques applied to original microdata and present the results.
Keywords: Anonymization, microdata, disclosure risk, privacy.
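A small sketch of a per-record risk computation of this flavor: each record's identification risk is taken as the reciprocal of the size of its equivalence class on the quasi-identifiers, and the summary reports the average risk, the share of records above the average, and how much larger those higher risks are. The aggregation is a plausible reading of the abstract, not the authors' exact formula.

```python
import pandas as pd

# Illustrative anonymized microdata with three quasi-identifiers.
data = pd.DataFrame({
    "age_band": ["20-29", "20-29", "30-39", "30-39", "30-39", "40-49"],
    "zip3":     ["100",   "100",   "100",   "100",   "200",   "200"],
    "sex":      ["F",     "F",     "M",     "M",     "F",     "M"],
})
quasi_identifiers = ["age_band", "zip3", "sex"]

# Per-record risk = 1 / size of the record's equivalence class.
class_size = data.groupby(quasi_identifiers)[quasi_identifiers[0]].transform("size")
risk = 1.0 / class_size

average_risk = risk.mean()
higher = risk[risk > average_risk]
summary = {
    "average_risk": average_risk,
    "share_above_average": len(higher) / len(risk),
    "mean_excess_ratio": (higher / average_risk).mean() if len(higher) else 0.0,
}
print(risk.tolist())
print(summary)
```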
11139 Effects of Roughness Elements on Heat Transfer during Natural Convection
Abstract:
The present study focused on the investigation of the effects of roughness elements on heat transfer during natural convection in a rectangular cavity using a numerical technique. Roughness elements were introduced on the bottom hot wall with a normalized amplitude (A*/H) of 0.1. Thermal and hydrodynamic behaviors were studied using a computational method based on the Lattice Boltzmann Method (LBM). Numerical studies were performed for laminar flow in the range of Rayleigh numbers (Ra) from 10³ to 10⁶ for a rectangular cavity of aspect ratio (L/H) 2.0 with a fluid of Prandtl number (Pr) 1.0. The presence of the sinusoidal roughness elements caused a decrease in heat transfer ranging from a minimum of 7% to a maximum of 17% compared to the smooth enclosure. The results are presented in terms of the mean Nusselt number (Nu), isotherms and streamlines.
Keywords: Natural convection, Rayleigh number, surface roughness, Nusselt number, Lattice Boltzmann Method.