Search results for: Binary logistic regression

570 On-line Identification of Continuous-time Hammerstein Systems via RBF Networks and Immune Algorithm

Authors: Tomohiro Hachino, Kengo Nagatomo, Hitoshi Takata

Abstract:

This paper deals with an on-line identification method of continuous-time Hammerstein systems by using the radial basis function (RBF) networks and immune algorithm (IA). An unknown nonlinear static part to be estimated is approximately represented by the RBF network. The IA is efficiently combined with the recursive least-squares (RLS) method. The objective function for the identification is regarded as the antigen. The candidates of the RBF parameters such as the centers and widths are coded into binary bit strings as the antibodies and searched by the IA. On the other hand, the candidates of both the weighting parameters of the RBF network and the system parameters of the linear dynamic part are updated by the RLS method. Simulation results are shown to illustrate the proposed method.
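
As a rough illustration of the linear-in-the-parameters structure that makes the RLS update applicable here, the sketch below (Python; dimensions, centers and widths are hypothetical, and in the paper the centers/widths would be searched by the immune algorithm rather than fixed) evaluates a Gaussian RBF regressor and performs one recursive least-squares step.

```python
import numpy as np

def rbf_features(u, centers, widths):
    # Gaussian RBF basis evaluated at scalar input u
    return np.exp(-((u - centers) ** 2) / (2.0 * widths ** 2))

def rls_step(theta, P, phi, y, lam=0.99):
    # One recursive least-squares update with forgetting factor lam
    k = P @ phi / (lam + phi @ P @ phi)
    theta = theta + k * (y - phi @ theta)
    P = (P - np.outer(k, phi @ P)) / lam
    return theta, P

# Hypothetical set-up: 5 RBF centers/widths (searched by the IA in the paper)
centers = np.linspace(-1.0, 1.0, 5)
widths = np.full(5, 0.4)
theta = np.zeros(5)              # RBF weights updated by RLS
P = np.eye(5) * 1e3              # covariance matrix

u, y = 0.3, 0.7                  # one input/output sample
phi = rbf_features(u, centers, widths)
theta, P = rls_step(theta, P, phi, y)
```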

Keywords: Continuous-time System, Hammerstein System, On-line Identification, Immune Algorithm, RBF network.

Downloads: 1362
569 Efficient Hardware Implementation of an Elliptic Curve Cryptographic Processor Over GF(2^163)

Authors: Massoud Masoumi, Hosseyn Mahdizadeh

Abstract:

A new and highly efficient architecture for elliptic curve scalar point multiplication, optimized for a binary field recommended by NIST and well suited for elliptic curve cryptographic (ECC) applications, is presented. To achieve the maximum architectural and timing improvements, we have reorganized and reordered the critical path of the Lopez-Dahab scalar point multiplication architecture such that logic structures are implemented in parallel and operations in the critical path are diverted to noncritical paths. With G = 41, the proposed design is capable of performing a field multiplication over the extension field of degree 163 in 11.92 µs at the maximum achievable frequency of 251 MHz on a Xilinx Virtex-4 (XC4VLX200) while occupying 22% of the chip area, where G is the digit size of the underlying digit-serial finite field multiplier.
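
For intuition about the digit-serial multiplier parameters quoted above (a back-of-the-envelope sketch, not the authors' exact pipeline), the number of digit-serial iterations needed for one GF(2^163) field multiplication with digit size G = 41, and the corresponding time at the stated clock, can be computed as follows:

```python
import math

m, G, f_clk = 163, 41, 251e6                 # field degree, digit size, clock frequency
cycles_per_field_mult = math.ceil(m / G)     # 4 digit-serial iterations
latency = cycles_per_field_mult / f_clk
print(cycles_per_field_mult, f"{latency * 1e9:.1f} ns per digit-serial field multiplication")
```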

Keywords: Elliptic curve cryptography, FPGA implementation, scalar point multiplication.

Downloads: 2556
568 Interest Rate Fluctuation Effect on Commercial Bank’s Fixed Fund Deposit in Nigeria

Authors: Okolo Chimaobi Valentine

Abstract:

Commercial banks in Nigeria adopted many strategies to attract fresh deposits, including the use of high deposit rates. However, the pricing of banking services moved in favor of the banks at the expense of customers, leading customers to seek other investment alternatives rather than saving their money in the bank. Both deposit and lending rates were greatly influenced by the Central Bank of Nigeria (CBN) decision on interest rate. Therefore, commercial banks' efforts to attract deposits by manipulating their rates were greatly limited; otherwise, the banks would be giving out more than they earned. The study aimed at examining the relationship between interest rate and the fixed fund deposit of commercial banks, and how the policy-controlled interest rate affected commercial banks' fixed fund deposit. The researcher employed the ordinary least squares technique, using multiple linear regression, unrestricted vector auto-regression, a correlation matrix test, Granger causality, and impulse response graphs in the analysis. Commercial banks' interest rates affected their fixed fund deposit significantly, while the policy-controlled interest rate did not transmit significantly through the commercial banks' interest rates to affect fixed fund deposit. While commercial banks seek creative ways to expand their fixed fund deposit, policy authorities in Nigeria should better coordinate interest rate fluctuation and induce competition in the entire financial sector.
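
A minimal sketch of the kind of regression and Granger-causality testing described, using statsmodels; the file and column names are placeholders, not the study's actual dataset:

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.tsa.stattools import grangercausalitytests

# Hypothetical monthly series: fixed fund deposits, bank deposit rate, CBN policy rate
df = pd.read_csv("nigeria_rates.csv")          # placeholder file name

# Multiple linear regression of deposits on the two interest rates
X = sm.add_constant(df[["bank_deposit_rate", "policy_rate"]])
ols = sm.OLS(df["fixed_fund_deposit"], X).fit()
print(ols.summary())

# Does the policy rate Granger-cause the bank deposit rate?
grangercausalitytests(df[["bank_deposit_rate", "policy_rate"]], maxlag=4)
```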

Keywords: Commercial bank, fixed fund deposit, fluctuation effects, interest rate.

Downloads: 3601
567 Modeling Hybrid Systems with MLD Approach and Analysis of the Model Size and Complexity

Authors: H. Mahboubi, B. Moshiri, A. Khaki Seddigh

Abstract:

Recently, a great amount of interest has been shown in the field of modeling and controlling hybrid systems. One of the efficient and common methods in this area utilizes mixed logical-dynamical (MLD) systems in the modeling. In this method, the system constraints are transformed into mixed-integer inequalities by defining some logic statements. In this paper, a system containing three tanks is modeled as a nonlinear switched system by using the MLD framework. Comparing the model size of the three-tank system with that of a two-tank system, it is deduced that the number of binary variables, the size of the system, and its complexity increase tremendously with the number of tanks, which makes the control of the system more difficult. Therefore, methods should be found which result in fewer mixed-integer inequalities.
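
As an illustration of the kind of transformation involved (the standard MLD big-M construction, not the paper's specific three-tank model), associating a binary variable δ with the condition aᵀx ≤ b yields mixed-integer inequalities of the form:

```latex
% Standard MLD (big-M) translation of [\delta = 1] \leftrightarrow [a^{\top}x \le b],
% with M = \max_x (a^{\top}x - b), m = \min_x (a^{\top}x - b), and a small tolerance \epsilon > 0:
\begin{aligned}
a^{\top}x - b &\le M\,(1 - \delta),\\
a^{\top}x - b &\ge \epsilon + (m - \epsilon)\,\delta .
\end{aligned}
```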

Keywords: Hybrid systems, mixed-integer inequalities, mixed logical dynamical systems, multi-tank system.

Downloads: 1759
566 Genetic Algorithm for Solving Non-Convex Economic Dispatch Problem

Authors: Navid Javidtash, Abdolmohamad Davodi, Mojtaba Hakimzadeh, Abdolreza Roozbeh

Abstract:

Economic dispatch (ED) is considered to be one of the key functions in electric power system operation. This paper presents a new hybrid approach based on the genetic algorithm (GA) for economic dispatch problems. The GA is a commonly used optimization algorithm predicated on the principle of natural evolution. Combining a chaotic queue with the GA generates several neighborhoods of near-optimal solutions to maintain solution variation and prevent the search process from premature convergence. For chaotic queue generation, using the tent equation instead of the logistic equation improves the iterative speed. The results of the proposed approach were compared, in terms of fuel cost, with differential evolution and other methods in the literature.
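
The chaotic sequences mentioned can be generated as in the generic sketch below (initial value and sequence length are arbitrary illustrations, not the paper's settings):

```python
def tent_map(x):
    # Tent map: cheaper to iterate than the logistic map, similarly chaotic
    return 2.0 * x if x < 0.5 else 2.0 * (1.0 - x)

def logistic_map(x):
    return 4.0 * x * (1.0 - x)

def chaotic_queue(update, x0=0.37, n=20):
    seq, x = [], x0
    for _ in range(n):
        x = update(x)
        seq.append(x)
    return seq

print(chaotic_queue(tent_map)[:5])
print(chaotic_queue(logistic_map)[:5])
```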

Keywords: Economic Dispatch (ED), Optimization, Fuel Cost, Genetic Algorithm (GA).

Downloads: 2395
565 Extended Constraint Mask Based One-Bit Transform for Low-Complexity Fast Motion Estimation

Authors: Oğuzhan Urhan

Abstract:

In this paper, an improved motion estimation (ME) approach based on the weighted constrained one-bit transform is proposed for block-based ME employed in video encoders. Binary ME approaches utilize a low bit-depth representation of the original image frames together with a Boolean exclusive-OR based, hardware-efficient matching criterion to decrease the computational burden of the ME stage. The weighted constrained one-bit transform (WC-1BT) based approach improves the performance of conventional C-1BT based ME by employing a 2-bit depth constraint mask instead of a 1-bit depth mask. In this work, the range of the constraint mask is further extended to increase the ME performance of the WC-1BT approach. Experiments reveal that the proposed method provides better ME accuracy compared to similar existing ME methods in the literature.
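
A simplified sketch of a one-bit transform and its XOR-based matching criterion (a plain box filter stands in for the multi-band-pass kernel used in the 1BT literature, and the block coordinates are illustrative only):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def one_bit_transform(frame, k=17):
    # Pixel is 1 where it exceeds its local (k x k) mean, else 0
    return (frame > uniform_filter(frame.astype(float), size=k)).astype(np.uint8)

def nnmp(block_a, block_b):
    # Number of non-matching points: XOR-based matching criterion
    return int(np.count_nonzero(np.bitwise_xor(block_a, block_b)))

cur = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
ref = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
b_cur = one_bit_transform(cur)[0:16, 0:16]
b_ref = one_bit_transform(ref)[4:20, 4:20]   # candidate block at displacement (4, 4)
print(nnmp(b_cur, b_ref))
```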

Keywords: Fast motion estimation, low-complexity motion estimation, video coding.

Downloads: 854
564 Entropy Generation Analysis of Heat Recovery Vapor Generator for Ammonia-Water Mixture

Authors: Chul Ho Han, Kyoung Hoon Kim

Abstract:

This paper carries out a performance analysis based on the first and second laws of thermodynamics for a heat recovery vapor generator (HRVG) of ammonia-water mixture when the heat source is low-temperature energy in the form of sensible heat. In the analysis, the effects of the ammonia mass concentration and the mass flow ratio of the binary mixture on the system performance are investigated, including the effectiveness of heat transfer, entropy generation, and exergy efficiency. The results show that the ammonia concentration and the mass flow ratio of the mixture have significant effects on the system performance of the HRVG.
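
The second-law quantities referred to take the standard forms below for a two-stream heat exchanger (a generic statement with symbols chosen here, not the paper's notation): an entropy generation rate, and an exergy efficiency defined as the exergy gained by the cold ammonia-water stream over the exergy given up by the hot source stream.

```latex
\dot{S}_{\mathrm{gen}} = \dot{m}_{c}\,(s_{c,\mathrm{out}} - s_{c,\mathrm{in}})
                       + \dot{m}_{h}\,(s_{h,\mathrm{out}} - s_{h,\mathrm{in}}) \;\ge\; 0,
\qquad
\eta_{\mathrm{ex}} = \frac{\dot{E}x_{c,\mathrm{out}} - \dot{E}x_{c,\mathrm{in}}}
                          {\dot{E}x_{h,\mathrm{in}} - \dot{E}x_{h,\mathrm{out}}}
```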

Keywords: Entropy, exergy, ammonia-water mixture, heat exchanger.

Downloads: 2080
563 Ranking and Unranking Algorithms for k-ary Trees in Gray Code Order

Authors: Fateme Ashari-Ghomi, Najme Khorasani, Abbas Nowzari-Dalini

Abstract:

In this paper, we present two new ranking and unranking algorithms for k-ary trees represented by x-sequences in Gray code order. These algorithms are based on a Gray code generation algorithm developed by Ahrabian et al., in which a recursive backtracking generation algorithm for x-sequences corresponding to k-ary trees in Gray code order was presented. That generation algorithm is in turn based on Vajnovszki's algorithm for generating binary trees in Gray code ordering. To the best of our knowledge, no ranking and unranking algorithms have previously been given for x-sequences in this ordering. We present ranking and unranking algorithms with O(kn^2) time complexity for x-sequences in this Gray code ordering.
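
For readers unfamiliar with Gray orderings, the standard binary-reflected Gray code conversion below gives the basic idea of adjacent codewords differing in a single position; this is only background intuition, not the x-sequence algorithm of the paper.

```python
def to_gray(n: int) -> int:
    return n ^ (n >> 1)

def from_gray(g: int) -> int:
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

codes = [to_gray(i) for i in range(8)]
print([format(c, "03b") for c in codes])       # successive codes differ in one bit
assert all(from_gray(to_gray(i)) == i for i in range(256))
```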

Keywords: k-ary Tree Generation, Ranking, Unranking, Gray Code.

Downloads: 2106
562 Comparing Machine Learning Estimation of Fuel Consumption of Heavy-Duty Vehicles

Authors: Victor Bodell, Lukas Ekstrom, Somayeh Aghanavesi

Abstract:

Fuel consumption (FC) is one of the key factors in determining the expenses of operating a heavy-duty vehicle. A customer may therefore request an estimate of the FC of a desired vehicle. The modular design of heavy-duty vehicles allows their construction by specifying the building blocks, such as gear box, engine and chassis type. If the combination of building blocks is unprecedented, it is unfeasible to measure the FC, since this would first require the construction of the vehicle. This paper proposes a machine learning approach to predict FC. The study uses specifications and operational environmental conditions, such as road slopes and driver profiles, for around 40,000 vehicles. All vehicles have diesel engines and a mileage of more than 20,000 km. The data are used to investigate the accuracy of the machine learning algorithms linear regression (LR), K-nearest neighbor (KNN) and artificial neural networks (ANN) in predicting fuel consumption for heavy-duty vehicles. The performance of the algorithms is evaluated by reporting the prediction error on both simulated data and operational measurements, and the algorithms are compared using nested cross-validation and statistical hypothesis testing. The statistical evaluation procedure finds that ANNs have the lowest prediction error compared to LR and KNN in estimating fuel consumption on both simulated and operational data. The models have a mean relative prediction error of 0.3% on simulated data, and 4.2% on operational data.
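
A minimal sketch of nested cross-validation of the kind described, using scikit-learn; the synthetic data, parameter grid and scoring choice are placeholders, not the study's actual setup:

```python
from sklearn.datasets import make_regression
from sklearn.model_selection import GridSearchCV, KFold, cross_val_score
from sklearn.neighbors import KNeighborsRegressor

X, y = make_regression(n_samples=500, n_features=10, noise=5.0, random_state=0)

inner = KFold(n_splits=5, shuffle=True, random_state=1)   # tunes hyperparameters
outer = KFold(n_splits=5, shuffle=True, random_state=2)   # estimates prediction error

knn = GridSearchCV(KNeighborsRegressor(), {"n_neighbors": [3, 5, 9]}, cv=inner)
scores = cross_val_score(knn, X, y, cv=outer,
                         scoring="neg_mean_absolute_percentage_error")
print(f"mean relative error: {-scores.mean():.3f}")
```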

Keywords: Artificial neural networks, fuel consumption, machine learning, regression, statistical tests.

Downloads: 829
561 Comparative Study of the Genetic Algorithms and Hessian's Method for Minimization of the Electric Power Production Cost

Authors: L. Abdelmalek, M. Zerikat, M. Rahli

Abstract:

In this paper, we present a comparative study of the genetic algorithms and Hessian's method for the optimal determination of the active powers in an electric power network. The objective function, which is the performance index of electrical energy production, is minimized subject to equality and inequality constraints, first by Hessian's method and then by the genetic algorithms. The results found by applying GA to the minimization of electric power production costs are very encouraging. Genetic algorithms appear to be an effective technique for solving a great number of problems and are in constant evolution. Nevertheless, it should be noted that the traditional binary representation used for genetic algorithms creates optimization problems in the management of large-sized networks requiring high numerical precision.
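
The precision issue mentioned for binary-coded GAs can be made concrete with the usual bit-count rule (an illustration, not the authors' encoding): to represent a variable on [lb, ub] to d decimal places, the chromosome needs n bits with 2^n ≥ (ub − lb)·10^d + 1.

```python
import math

def bits_needed(lb: float, ub: float, decimals: int) -> int:
    return math.ceil(math.log2((ub - lb) * 10 ** decimals + 1))

# e.g. a hypothetical active power bounded between 10 and 500 MW, to 4 decimal places
print(bits_needed(10.0, 500.0, 4))   # 23 bits for a single variable
```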

Keywords: Genetic algorithm, flow of optimum load impedances, Hessian's method, optimal distribution.

Downloads: 1290
560 Optimizing and Evaluating Performance Quality Control of the Production Process of Disposable Essentials Using a Vague Goal Programming Approach

Authors: Hadi Gholizadeh, Ali Tajdin

Abstract:

To have effective production planning, it is necessary to control the quality of processes. This paper aims at improving the performance of the disposable essentials process using statistical quality control and goal programming in a vague environment, which expresses the uncertainty that arises because there is always a measurement error in the real world. Therefore, in this study, the conditions are examined in a vague environment, that is, a distance-based environment. The disposable essentials process in Kach Company was studied. Statistical control tools were used to characterize the existing process for four factor responses: the average of disposable glasses' weights, heights, crater diameters, and volumes. Goal programming was then utilized to find the combination of optimal factor settings in a vague environment, which accounts for the uncertainty of the initial information when some of the model parameters are vague; in addition, a fuzzy regression model is used to predict the responses of the four described factors. Optimization results show that the process capability index values for the disposable glasses' average weights, heights, crater diameters and volumes were improved. This increases the quality of the products and reduces waste, which will reduce the cost of the finished product and ultimately bring customer satisfaction, and this satisfaction will mean increased sales.
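
The process capability indices referred to are the standard ones; for a response with specification limits LSL and USL, mean μ and standard deviation σ:

```latex
C_p = \frac{\mathrm{USL} - \mathrm{LSL}}{6\sigma},
\qquad
C_{pk} = \min\!\left(\frac{\mathrm{USL} - \mu}{3\sigma},\; \frac{\mu - \mathrm{LSL}}{3\sigma}\right)
```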

Keywords: Goal programming, quality control, vague environment, disposable glasses’ optimization, fuzzy regression.

Downloads: 1040
559 Kinetic Study of Gluconic Acid Batch Fermentation by Aspergillus niger

Authors: Akbarningrum Fatmawati, Rudy Agustriyanto, Lindawati

Abstract:

Gluconic acid is one of the interesting chemical products in industries such as detergents, leather, photographic and textile, and especially in the food and pharmaceutical industries. Fermentation is an advantageous process for producing gluconic acid. Mathematical modeling is important in the design and operation of fermentation processes, and kinetic data must be available for modeling. The kinetic parameters of gluconic acid production by Aspergillus niger in batch culture were studied in this research at initial substrate concentrations of 150, 200 and 250 g/l. The kinetic models used were the logistic equation for growth, the Luedeking-Piret equation for gluconic acid formation, and a Luedeking-Piret-like equation for glucose consumption. The kinetic parameters in the models were obtained by non-linear least-squares curve fitting.
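
The three kinetic models named above take the standard forms below (X biomass, P gluconic acid, S glucose; parameter symbols follow common usage rather than the paper's notation):

```latex
\frac{dX}{dt} = \mu_{m}\,X\left(1 - \frac{X}{X_{m}}\right) \quad\text{(logistic growth)}
\qquad
\frac{dP}{dt} = \alpha\,\frac{dX}{dt} + \beta\,X \quad\text{(Luedeking--Piret)}
\qquad
-\frac{dS}{dt} = \gamma\,\frac{dX}{dt} + \delta\,X \quad\text{(Luedeking--Piret-like consumption)}
```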

Keywords: Aspergillus niger, fermentation, gluconic acid, kinetic.

Downloads: 2693
558 Photograph Based Pair-matching Recognition of Human Faces

Authors: Min Yao, Kota Aoki, Hiroshi Nagahashi

Abstract:

In this paper, a novel system for recognition of human faces using pairs of different color photographs is proposed. It mainly consists of face detection, normalization and recognition. A method combining Haar-like face detection, skin-color-determined segmentation and region-based histogram stretching (RHST) is proposed to achieve more accurate performance than using Haar-like detection alone. Apart from an effective angle normalization, a side-face (pose) normalization, which might be important and beneficial for the preprocessing, is introduced. Then histogram-based and photometric normalization methods are investigated, and adaptive single-scale retinex (ASR) is selected for its satisfactory illumination normalization. Finally, weighted multi-block local binary pattern (LBP) with three distance measures is applied for pair-matching recognition. Experimental results show its advantageous performance compared with PCA and multi-block LBP on a pair-matching basis.

Keywords: Face detection, pair-matching recognition, normalization, skin color segmentation.

Downloads: 1599
557 Application of the Queuing Theory in Warehouse Optimization

Authors: Jaroslav Masek, Juraj Camaj, Eva Nedeliakova

Abstract:

The aim of store management optimization is not only to design the store management system itself, including its equipment, technology and operation. We also need to consider synchronizing the technological, transport, storage and service operations throughout the whole logistic chain in such a way that a natural flow of material from provider to consumer is achieved by the shortest possible way, in the shortest possible time, in the requested quality and quantity, and with minimum costs. The paper deals with the application of queuing theory to the optimization of warehouse processes. The first part gives general information about warehousing and the use of mathematical methods for the optimization of logistics chains. The second part develops a model of a warehouse within queuing theory. The conclusion of the paper includes two examples of using queuing theory in practice.
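
The paper does not specify which queuing model is applied; as a reference point, the basic M/M/1 results that such a warehouse model would typically build on are (λ arrival rate, μ service rate, ρ = λ/μ < 1):

```latex
\rho = \frac{\lambda}{\mu}, \qquad
L = \frac{\rho}{1-\rho}, \qquad
L_q = \frac{\rho^{2}}{1-\rho}, \qquad
W = \frac{1}{\mu - \lambda}, \qquad
W_q = \frac{\rho}{\mu - \lambda}
```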

Keywords: Queuing theory, logistics system, mathematical methods, warehouse optimization.

Downloads: 6567
556 Modeling Default Probabilities of the Chosen Czech Banks in the Time of the Financial Crisis

Authors: Petr Gurný

Abstract:

One of the most important tasks in risk management is the correct determination of the probability of default (PD) of particular financial subjects. In this paper, the possibility of determining a financial institution's PD according to credit-scoring models is discussed. The paper is divided into two parts. The first part is devoted to the estimation of three different models (based on linear discriminant analysis, logit regression and probit regression) from a sample of almost three hundred US commercial banks. These models are then compared and verified on a control sample in order to choose the best one. The second part of the paper is aimed at the application of the chosen model to a portfolio of three key Czech banks to estimate their present financial stability. However, it is no less important to be able to estimate the evolution of PD in the future. For this reason, the second task in this paper is to estimate the probability distribution of the future PD for the Czech banks. To this end, the values of particular indicators are sampled randomly and the distribution of PD is estimated, under the assumption that the indicators are distributed according to a multidimensional subordinated Lévy model (the Variance Gamma model and the Normal Inverse Gaussian model, in particular). Although the obtained results show that all banks are relatively healthy, there is still a high chance that "a financial crisis" will occur, at least in terms of probability. This is indicated by the estimation of various quantiles of the estimated distributions. Finally, it should be noted that the applicability of the estimated model (with respect to the used data) is limited to the recessionary phase of the financial market.
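
A minimal sketch of the logit variant of such a credit-scoring model, using statsmodels; the file name and financial-ratio columns are placeholders, not the paper's actual indicators:

```python
import pandas as pd
import statsmodels.api as sm

banks = pd.read_csv("us_banks.csv")             # placeholder sample of bank-level ratios
X = sm.add_constant(banks[["capital_ratio", "roa", "npl_ratio"]])
y = banks["defaulted"]                           # 1 = default, 0 = survived

logit = sm.Logit(y, X).fit()
print(logit.summary())

# Probability of default for a new observation (hypothetical values)
new = pd.DataFrame({"const": [1.0], "capital_ratio": [0.08],
                    "roa": [0.01], "npl_ratio": [0.05]})
print(logit.predict(new))
```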

Keywords: Credit-scoring Models, Multidimensional Subordinated Lévy Model, Probability of Default.

Downloads: 1919
555 Statistics of Exon Lengths in Animals, Plants, Fungi, and Protists

Authors: Alexander Kaplunovsky, Vladimir Khailenko, Alexander Bolshoy, Shara Atambayeva, Anatoliy Ivashchenko

Abstract:

Eukaryotic protein-coding genes are interrupted by spliceosomal introns, which are removed from the RNA transcripts before translation into a protein. The exon-intron structures of different eukaryotic species are quite different from each other, and the evolution of such structures raises many questions. We try to address some of these questions using statistical analysis of whole genomes. We go through all the protein-coding genes in a genome and study correlations between the net length of all the exons in a gene, the number of the exons, and the average length of an exon. We also take average values of these features for each chromosome and study correlations between those averages on the chromosomal level. Our data show universal features of exon-intron structures common to animals, plants, fungi, and protists (specifically, Arabidopsis thaliana, Caenorhabditis elegans, Drosophila melanogaster, Cryptococcus neoformans, Homo sapiens, Mus musculus, Oryza sativa, and Plasmodium falciparum). We have verified a linear correlation between the number of exons in a gene and the length of the protein coded by the gene, with the protein length increasing in proportion to the number of exons. On the other hand, the average length of an exon always decreases with the number of exons. Finally, chromosome clustering based on average chromosome properties and on the parameters of the linear regression between the number of exons in a gene and the net length of those exons demonstrates that these average chromosome properties are genome-specific features.

Keywords: Comparative genomics, exon-intron structure, eukaryotic clustering, linear regression.

Downloads: 2573
554 On Pseudo-Random and Orthogonal Binary Spreading Sequences

Authors: Abhijit Mitra

Abstract:

Different pseudo-random or pseudo-noise (PN) as well as orthogonal sequences that can be used as spreading codes for code division multiple access (CDMA) cellular networks, or for encrypting speech signals to reduce the residual intelligibility, are investigated. We briefly review the theoretical background of direct-sequence CDMA systems and describe the main characteristics of the maximal length, Gold, Barker, and Kasami sequences. We also discuss variable- and fixed-length orthogonal codes such as Walsh-Hadamard codes. The equivalence of PN and orthogonal codes is also derived. Finally, a new PN sequence is proposed which is shown to have better properties, in certain respects, than the existing codes.
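
A small sketch of how a maximal-length (m-)sequence is generated with a linear feedback shift register (a degree-4 toy example with taps chosen to give a primitive feedback polynomial; practical spreading codes use much longer registers):

```python
def m_sequence(taps, state):
    """Generate one period of an m-sequence from a Fibonacci LFSR.
    taps  : 1-based register positions combined by XOR for the feedback bit
    state : non-zero initial register contents, list of bits
    """
    n = len(state)
    out = []
    for _ in range(2 ** n - 1):
        out.append(state[-1])                 # output bit
        fb = 0
        for t in taps:
            fb ^= state[t - 1]
        state = [fb] + state[:-1]             # shift, insert feedback
    return out

seq = m_sequence(taps=[4, 1], state=[1, 0, 0, 0])
print(seq, len(seq))                          # maximal period 2^4 - 1 = 15
```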

Keywords: Code division multiple access, pseudo-noise codes, maximal length, Gold, Barker, Kasami, Walsh-Hadamard, autocorrelation, crosscorrelation, figure of merit.

Downloads: 6054
553 Development of Rock Engineering System-Based Models for Tunneling Progress Analysis and Evaluation: Case Study of Tailrace Tunnel of Azad Power Plant Project

Authors: S. Golmohammadi, M. Noorian Bidgoli

Abstract:

Tunneling progress is a key parameter in the blasting method of tunneling. Taking measures to enhance the tunneling advance can limit the progress distance without a supporting system, subsequently reducing or eliminating the risk of damage. This paper focuses on modeling tunneling progress using three main groups of parameters (tunneling geometry, blasting pattern, and rock mass specifications) based on the Rock Engineering Systems (RES) methodology. In the proposed models, four main parameters affecting tunneling progress are considered as inputs (RMR, Q-system, specific charge of blasting, and area), with progress as the output. Data from 86 blasts conducted at the tailrace tunnel of the Azad Dam, western Iran, were used to evaluate the progress value for each blast. The results indicated that, for the 86 blasts, the progress estimated by the model aligns closely with the measured progress. This paper presents a method for building the interaction matrix (statistical basis) of the RES model. Additionally, a comparison was made between the results of the new RES-based model and a multi-linear regression (MLR) analysis model. In the RES-based model, the weights of the effective parameters are RMR (35.62%), Q (28.6%), q (specific charge of blasting) (20.35%), and A (15.42%), respectively, whereas for the MLR analysis the main parameters are likewise RMR, Q (system), q, and A. These findings confirm the superior performance of the RES-based model over the other proposed models.

Keywords: Rock Engineering Systems, tunneling progress, Multi Linear Regression, Specific charge of blasting.

Downloads: 141
552 Dynamic Clustering using Particle Swarm Optimization with Application in Unsupervised Image Classification

Authors: Mahamed G.H. Omran, Andries P Engelbrecht, Ayed Salman

Abstract:

A new dynamic clustering approach (DCPSO), based on particle swarm optimization, is proposed. This approach is applied to unsupervised image classification. The proposed approach automatically determines the "optimum" number of clusters and simultaneously clusters the data set with minimal user interference. The algorithm starts by partitioning the data set into a relatively large number of clusters to reduce the effects of initial conditions. Using binary particle swarm optimization, the "best" number of clusters is selected. The centers of the chosen clusters are then refined via the K-means clustering algorithm. The experiments conducted show that the proposed approach generally finds the "optimum" number of clusters on the tested images.
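
The binary PSO step used for selecting which cluster candidates to keep can be sketched roughly as below (a generic binary PSO update with illustrative parameter values, not the DCPSO settings from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

def bpso_step(x, v, pbest, gbest, w=0.72, c1=1.49, c2=1.49, v_max=4.0):
    # Velocity update as in standard PSO, applied to binary position vectors
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    v = np.clip(v, -v_max, v_max)
    # Binary position update: bit j is set with probability sigmoid(v_j)
    prob = 1.0 / (1.0 + np.exp(-v))
    x = (rng.random(x.shape) < prob).astype(float)
    return x, v

x = rng.integers(0, 2, size=10).astype(float)   # bit j = "is cluster candidate j kept?"
v = np.zeros(10)
pbest, gbest = x.copy(), rng.integers(0, 2, size=10).astype(float)
x, v = bpso_step(x, v, pbest, gbest)
```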

Keywords: Clustering Validation, Particle Swarm Optimization, Unsupervised Clustering, Unsupervised Image Classification.

Downloads: 2454
551 Indoor Air Pollution of the Flexographic Printing Environment

Authors: Jelena S. Kiurski, Vesna S. Kecić, Snežana M. Aksentijević

Abstract:

The identification and evaluation of organic and inorganic pollutants were performed in a flexographic facility in Novi Sad, Serbia. Air samples were collected and analyzed in situ, during 4-hour working time at five sampling points, by a mobile gas chromatograph and ozonometer during the printing of collagen casing. Experimental results showed that the concentrations of isopropyl alcohol, acetone, total volatile organic compounds and ozone varied during the sampling times. The highest average concentrations of 94.80 ppm and 102.57 ppm were reached at 200 minutes from the start of production for isopropyl alcohol and total volatile organic compounds, respectively. The mutual dependences between the target hazardous pollutants and microclimate parameters were confirmed using a multiple linear regression model with the software package STATISTICA 10. The obtained multiple coefficients of determination for ozone and acetone (0.507 and 0.589) with the microclimate parameters indicated a moderate correlation between the observed variables. However, a strong positive correlation with the microclimate parameters was obtained for isopropyl alcohol and total volatile organic compounds (0.760 and 0.852). Values of the parameter F higher than Fcritical for all examined dependences indicated a statistically significant relationship between the concentration levels of the target pollutants and the microclimate parameters. Given that the microclimate parameters significantly affect the emission of the investigated gases, the application of eco-friendly materials in the production process is a necessity.

Keywords: Flexographic printing, indoor air, multiple regression analysis, pollution emission.

Downloads: 1308
550 Regression Approach for Optimal Purchase of Hosts Cluster in Fixed Fund for Hadoop Big Data Platform

Authors: Haitao Yang, Jianming Lv, Fei Xu, Xintong Wang, Yilin Huang, Lanting Xia, Xuewu Zhu

Abstract:

Given a fixed fund, purchasing fewer hosts of higher capability or, inversely, more hosts of lower capability is a trade-off that must be made in practice when building a Hadoop big data platform. An exploratory study is presented for a Housing Big Data Platform project (HBDP), where typical big data computing consists of SQL queries with aggregate, join, and space-time condition selections executed upon massive data from more than 10 million housing units. In HBDP, an empirical formula was introduced to predict the performance of candidate host clusters for the intended typical big data computing, and it was shaped via a regression approach. With this empirical formula, it is easy to suggest an optimal cluster configuration. The investigation was based on a typical Hadoop computing ecosystem, HDFS+Hive+Spark. A proper metric was introduced to measure the performance of Hadoop clusters in HBDP, which was tested and compared with its predicted counterpart on executing three kinds of typical SQL query tasks. Tests were conducted with respect to factors of CPU benchmark, memory size, virtual host division, and the number of physical hosts in the cluster. The research has been applied to practical cluster procurement for housing big data computing.

Keywords: Hadoop platform planning, optimal cluster scheme at fixed-fund, performance empirical formula, typical SQL query tasks.

Downloads: 837
549 Persian Printed Numeral Characters Recognition Using Geometrical Central Moments and Fuzzy Min-Max Neural Network

Authors: Hamid Reza Boveiri

Abstract:

In this paper, a new system for Persian printed numeral character recognition, with emphasis on the representation and recognition stages, is introduced. For the first time in Persian optical character recognition, geometrical central moments are used as the character image descriptor and a fuzzy min-max neural network is used for Persian numeral character recognition. A set of different experiments on binary images of regular, translated, rotated and scaled Persian numeral characters has been carried out and a variety of results is presented. The best result was 99.16% correct recognition, demonstrating that geometrical central moments and the fuzzy min-max neural network are adequate for Persian printed numeral character recognition.
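
Geometrical central moments of a (binary or grayscale) character image follow the standard definition implemented below (a generic sketch on a toy image, not the paper's full feature set):

```python
import numpy as np

def central_moment(img, p, q):
    """mu_pq = sum_x sum_y (x - x_bar)^p (y - y_bar)^q I(x, y)."""
    img = img.astype(float)
    y_idx, x_idx = np.mgrid[0:img.shape[0], 0:img.shape[1]]
    m00 = img.sum()
    x_bar = (x_idx * img).sum() / m00
    y_bar = (y_idx * img).sum() / m00
    return (((x_idx - x_bar) ** p) * ((y_idx - y_bar) ** q) * img).sum()

digit = np.zeros((16, 16)); digit[3:13, 6:10] = 1      # toy binary "character"
features = [central_moment(digit, p, q) for p in range(3) for q in range(3)]
```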

Keywords: Fuzzy min-max neural network, geometrical central moments, optical character recognition, Persian digits recognition, Persian printed numeral characters recognition.

Downloads: 1725
548 Optimal Embedded Generation Allocation in Distribution System Employing Real Coded Genetic Algorithm Method

Authors: Mohd Herwan Sulaiman, Omar Aliman, Siti Rafidah Abdul Rahim

Abstract:

This paper proposes a new methodology for the optimal allocation and sizing of embedded generation (EG) employing a real coded genetic algorithm (RCGA) to minimize the total power losses and to improve voltage profiles in radial distribution networks. The RCGA is a method that uses continuous floating-point numbers as the representation, which differs from the conventional binary representation. The RCGA is used as the solution tool, which can determine the optimal location and size of EG in a radial system simultaneously. The method is developed in MATLAB. The effect of EG units' installation and sizing on the distribution networks is demonstrated using a 24-bus system.

Keywords: Embedded generation (EG), load flow study, optimal allocation, real coded genetic algorithm (RCGA).

Downloads: 1903
547 Level Set and Morphological Operation Techniques in Application of Dental Image Segmentation

Authors: Abdolvahab Ehsani Rad, Mohd Shafry Mohd Rahim, Alireza Norouzi

Abstract:

Medical image analysis is one of the important applications of computer image processing. There are several steps in the analysis of medical images, of which segmentation is one of the most challenging and important. In this paper, a segmentation method is proposed for dental radiograph images. A thresholding method is applied to simplify the images, and the morphological binary opening technique is performed to eliminate unnecessary regions in the images. Furthermore, horizontal and vertical integral projection techniques are used to extract each individual tooth from the radiograph images. Segmentation is then performed by applying the level set method on each extracted image. The experimental results, with 90% accuracy, demonstrate that the proposed method achieves high accuracy and promising results.
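
A compact sketch of the pre-segmentation steps described, thresholding, binary morphological opening, and integral projections, using NumPy/SciPy on a placeholder image (thresholds and structuring-element size are illustrative, not the authors' parameters):

```python
import numpy as np
from scipy.ndimage import binary_opening

radiograph = np.random.rand(256, 512)             # placeholder for a dental radiograph

# 1. Thresholding to simplify the image
binary = radiograph > radiograph.mean()

# 2. Morphological opening to remove small, unnecessary regions
opened = binary_opening(binary, structure=np.ones((5, 5)))

# 3. Horizontal and vertical integral projections to locate individual teeth
vertical_projection = opened.sum(axis=0)           # column sums: gaps between teeth
horizontal_projection = opened.sum(axis=1)         # row sums: jaw separation
tooth_gaps = np.where(vertical_projection < 0.3 * vertical_projection.mean())[0]
```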

Keywords: Integral projection, level set method, morphological operation, segmentation.

Downloads: 4232
546 Impact of Metallic Furniture on UWB Channel Statistical Characteristics by BER

Authors: Yu-Shuai Chen, Chien-Ching Chiu, Chung-Hsin Huang, Chien-Hung Chen

Abstract:

The bit error rate (BER) performance of ultra-wideband (UWB) indoor communication under the impact of metallic furniture is investigated. The impulse responses of different indoor environments for any transmitter and receiver location are computed by shooting-and-bouncing-ray/image and inverse Fourier transform techniques. Using the impulse responses of these multipath channels, the BER performance of a binary pulse amplitude modulation (BPAM) impulse radio UWB communication system is calculated. Numerical results show that the multipath effect of the metallic cabinets is an important factor for BER performance. Also, the outage probability for the UWB multipath environment with metallic cabinets is more serious (about 18%) than with wooden cabinets. Finally, it is worth noting that in these cases the present work provides not only comparative information but also quantitative information on the performance reduction.

Keywords: UWB, multipath, outage probability.

Downloads: 1431
545 Child Homicide Victimization and Community Context: A Research Note

Authors: Bohsiu Wu

Abstract:

Among serious crimes, child homicide is a rather rare event. However, the killing of children stirs up a type of emotion in society that other criminal acts do not. This study examines the relevancy of three possible community-level explanations for child homicide: social deprivation, female empowerment, and social isolation. The social deprivation hypothesis posits that child homicide results from a lack of resources in communities. The female empowerment hypothesis argues that a higher female status translates into a higher capability to prevent child homicide. Finally, the social isolation hypothesis regards child homicide as a result of a lack of social connectivity. Child homicide data, aggregated by US postal ZIP codes in California from 1990 to 1999, were analyzed with a negative binomial regression. The results of the negative binomial analysis demonstrate that social deprivation is the most salient and consistent predictor among all factors in explaining child homicide victimization at the ZIP-code level. Both social isolation and female labor force participation are weak predictors of child homicide victimization across communities. Further, results from the negative binomial regression show that it is the communities with a higher, not lower, degree of female labor force participation that are associated with a higher count of child homicide. It is possible that poor communities with a higher level of female employment have a lesser capacity to provide the necessary care and protection for children. Policies aiming at reducing social deprivation and strengthening female empowerment have the potential to reduce child homicide in the community.
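
A minimal sketch of the negative binomial count model described, using statsmodels; the file name and covariate columns are placeholders standing in for the deprivation, empowerment and isolation measures:

```python
import pandas as pd
import statsmodels.api as sm

zips = pd.read_csv("ca_zip_codes.csv")            # placeholder ZIP-code level data
X = sm.add_constant(zips[["deprivation_index", "female_lfp", "isolation_index"]])
y = zips["child_homicides"]                        # count of victims per ZIP code

nb = sm.GLM(y, X, family=sm.families.NegativeBinomial()).fit()
print(nb.summary())                                # coefficients on the log-count scale
```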

Keywords: Child homicide, deprivation, empowerment, isolation.

Downloads: 689
544 Factors Affecting Slot Machine Performance in an Electronic Gaming Machine Facility

Authors: Etienne Provencal, David L. St-Pierre

Abstract:

A facility exploiting only electronic gambling machines (EGMs) opened in 2007 in Quebec City, Canada under the name of Salons de Jeux du Québec (SdjQ). This facility is one of the first worldwide to rely on that business model. This paper models the performance of such EGMs. The interest from a managerial point of view is to identify the variables that can be controlled or influenced, so that a comprehensive model can help improve the overall performance of the business. The EGM individual performance model contains eight variables under study (Game Title, Progressive Jackpot, Bonus Round, Minimum Coin-in, Maximum Coin-in, Denomination, Slant Top and Position). Using data from Quebec City's SdjQ, a linear regression analysis explains 90.80% of the EGM performance. Moreover, results show a behavior slightly different from that of a casino. The addition of Game Title as a factor to predict EGM performance is one of the main contributions of this paper. The choice of the game (Game Title) is very important. Games with a better position do not perform significantly better than games located elsewhere on the gaming floor. Progressive jackpots have a positive and significant effect on the individual performance of EGMs. The impact of Bonus Round on the dependent variable is significant but negative. The effect of Denomination is significant but weakly negative. As expected, the language of an EGM does not impact its individual performance. This paper highlights some possible improvements by indicating which features are performing well. Recommendations are given to increase the performance of the EGMs.

Keywords: EGM, linear regression, model prediction, slot operations.

Downloads: 1563
543 Machine Learning Framework: Competitive Intelligence and Key Drivers Identification of Market Share Trends among Healthcare Facilities

Authors: A. Appe, B. Poluparthi, L. Kasivajjula, U. Mv, S. Bagadi, P. Modi, A. Singh, H. Gunupudi, S. Troiano, J. Paul, J. Stovall, J. Yamamoto

Abstract:

The necessity of data-driven decisions in healthcare strategy formulation is rapidly increasing. A reliable framework that helps identify factors impacting the market share of a healthcare provider facility or hospital (from here on termed a facility) is of key importance. This pilot study aims at developing a data-driven machine learning regression framework which aids strategists in formulating key decisions to improve the facility's market share, which in turn helps improve the quality of healthcare services. The US (United States) healthcare business is chosen for the study, and data spanning 60 key facilities in Washington State over about 3 years of history are considered. In the current analysis, market share is defined as the ratio of the facility's encounters to the total encounters among the group of potential competitor facilities. The study proposes a two-pronged approach: competitor identification, and a regression approach to evaluate and predict market share. The model-agnostic technique SHAP (SHapley Additive exPlanations) is leveraged to quantify the relative importance of features impacting market share. Typical techniques in the literature for quantifying the degree of competitiveness among facilities use an empirical method to calculate a competitive factor that reflects the severity of competition. The proposed method instead identifies a pool of competitors, develops Directed Acyclic Graphs (DAGs) and feature-level word vectors, and evaluates the key connected components at the facility level. This technique is robust since it is data-driven, which minimizes the bias from empirical techniques. The DAGs factor in partial correlations at various segregations and key demographics of facilities, along with a placeholder to factor in various business rules (e.g., quantifying patient exchanges, provider references, and sister facilities). Multiple groups of competitors among facilities are identified. Leveraging the identified competitors, a Random Forest regression model is developed and fine-tuned to predict market share. To identify key drivers of market share at an overall level, the permutation feature importance of the attributes was calculated. For relative quantification of features at the facility level, SHAP, a model-agnostic explainer, is incorporated. This helped to identify and rank the attributes at each facility which impact its market share. The approach amalgamates two popular and efficient modeling practices, viz., machine learning with graphs and tree-based regression techniques, to reduce bias. With these, we helped drive strategic business decisions.
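
A condensed sketch of the regression-and-explanation portion of such a pipeline, using scikit-learn and the shap package; the file and feature names are placeholders, and the graph-based competitor-identification step is omitted:

```python
import pandas as pd
import shap
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

data = pd.read_csv("facility_features.csv")          # placeholder facility-level data
X, y = data.drop(columns=["market_share"]), data["market_share"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)

# Overall drivers: permutation feature importance
overall = permutation_importance(rf, X_te, y_te, n_repeats=10, random_state=0)

# Facility-level drivers: SHAP attributions for each prediction
explainer = shap.TreeExplainer(rf)
shap_values = explainer.shap_values(X_te)             # one row of attributions per facility
```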

Keywords: Competition, DAGs, hospital, healthcare, machine learning, market share, random forest, SHAP.

Downloads: 284
542 Performance of Chaotic Lu System in CDMA Satellites Communications Systems

Authors: K. Kemih, M. Benslama

Abstract:

This paper investigates the problem of spreading sequence and receiver code synchronization techniques for satellite-based CDMA communications systems. The performance of a CDMA system depends on the autocorrelation and cross-correlation properties of the spreading sequences used. In this paper, we propose the use of the chaotic Lu system to generate binary sequences for spreading codes in a direct-sequence spread CDMA system. To minimize multiple access interference (MAI), we propose the use of a genetic algorithm for optimum selection of chaotic spreading sequences. To solve the problem of transmitter-receiver synchronization, we use passivity-based controls. The concept of semipassivity is defined to find simple conditions which ensure boundedness of the solutions of coupled Lu systems. Numerical results are presented to show the effectiveness of the proposed approach.

Keywords: Chaotic Lu system, synchronization, spreading sequence, genetic algorithm, passive system.

Downloads: 1747
541 Electricity Load Modeling: An Application to Italian Market

Authors: Giovanni Masala, Stefania Marica

Abstract:

Forecasting electricity load plays a crucial role in decision making and planning for economic purposes. Moreover, in the light of the recent privatization and deregulation of the power industry, the forecasting of future electricity load has turned out to be a very challenging problem. Empirical data about electricity load highlight a clear seasonal behavior (higher load during the winter season), which is partly due to climatic effects. We also emphasize the presence of load periodicity on a weekly basis (electricity load is usually lower on weekends or holidays) and on a daily basis (electricity load is clearly influenced by the hour). Finally, a long-term trend may depend on the general economic situation (for example, industrial production affects electricity load). All these features must be captured by the model. The purpose of this paper is then to build an hourly electricity load model. The deterministic component of the model requires non-linear regression and Fourier series, while we investigate the stochastic component through econometric tools. The calibration of the model's parameters is performed using data from the Italian market over a 6-year period (2007-2012). Then, we perform a Monte Carlo simulation in order to compare the simulated data with the real data (both in-sample and out-of-sample). The reliability of the model is confirmed by standard tests which highlight a good fit of the simulated values.
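
The deterministic seasonal component described can be captured with Fourier terms in a regression, roughly as in the sketch below (ordinary least squares on hypothetical hourly data with placeholder file and column names; the ARMA-GARCH residual model is omitted):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

load = pd.read_csv("italy_hourly_load.csv", parse_dates=["timestamp"])  # placeholder
t = np.arange(len(load))

def fourier_terms(t, period, harmonics):
    cols = {}
    for k in range(1, harmonics + 1):
        cols[f"sin_{period}_{k}"] = np.sin(2 * np.pi * k * t / period)
        cols[f"cos_{period}_{k}"] = np.cos(2 * np.pi * k * t / period)
    return pd.DataFrame(cols)

X = pd.concat([fourier_terms(t, 24, 3),          # daily cycle
               fourier_terms(t, 24 * 7, 3),      # weekly cycle
               fourier_terms(t, 24 * 365, 2)],   # yearly cycle
              axis=1)
X["trend"] = t
fit = sm.OLS(load["load_mw"], sm.add_constant(X)).fit()
residuals = fit.resid                             # to be modelled with ARMA-GARCH
```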

Keywords: ARMA-GARCH process, electricity load, fitting tests, Fourier series, Monte Carlo simulation, non-linear regression.

Downloads: 1486