Search results for: multivariate adaptive regression spline
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1673

833 Adaptive Skin Segmentation Using Color Distance Map

Authors: Mohammad Shoyaib, M. Abdullah-Al-Wadud, Oksam Chae

Abstract:

In this paper, an effective approach for segmenting human skin regions in images taken in different environments is proposed. The proposed method uses a color distance map that is flexible enough to reliably detect skin regions even if the illumination conditions of the image vary. Local image conditions are also taken into account, which helps the technique adaptively detect differently illuminated skin regions of an image. Moreover, the use of local information also helps the skin detection process avoid picking up noisy pixels.
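
As a rough illustration of the color-distance idea described in this abstract, below is a minimal NumPy sketch that thresholds the Euclidean distance of each pixel's chrominance from a reference skin color in YCbCr space; the reference values, the fixed global threshold, and the omission of the paper's local adaptation are all simplifying assumptions.

```python
import numpy as np

def skin_distance_map(img_ycbcr, ref_cb=120.0, ref_cr=150.0):
    """Euclidean distance of each pixel's (Cb, Cr) from a reference skin color.

    img_ycbcr: H x W x 3 array in YCbCr order. The reference chrominance
    values are illustrative assumptions, not the paper's calibrated values.
    """
    cb = img_ycbcr[..., 1].astype(float)
    cr = img_ycbcr[..., 2].astype(float)
    return np.sqrt((cb - ref_cb) ** 2 + (cr - ref_cr) ** 2)

def segment_skin(img_ycbcr, dist_thresh=20.0):
    """Binary skin mask: pixels whose color distance falls below a threshold.
    A single global threshold is used here; the paper adapts it locally."""
    return skin_distance_map(img_ycbcr) < dist_thresh
```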

Keywords: Color distance map, Reference skin color, Region growing, Skin segmentation.

832 Self Organizing Mixture Network in Mixture Discriminant Analysis: An Experimental Study

Authors: Nazif Çalış, Murat Erişoğlu, Hamza Erol, Tayfun Servi

Abstract:

In recent works related to mixture discriminant analysis (MDA), the expectation-maximization (EM) algorithm is used to estimate the parameters of Gaussian mixtures. However, the initial values of the EM algorithm affect the final parameter estimates. Also, when the EM algorithm is applied twice to the same data set, it can give different parameter estimates, which affects the classification accuracy of MDA. To overcome this problem, we use the Self Organizing Mixture Network (SOMN) algorithm to estimate the parameters of the Gaussian mixtures in MDA, since SOMN is more robust when random initial values of the parameters are used [5]. We show the effectiveness of this method on the popular simulated waveform data sets and the real glass data set.
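
For context, the sketch below shows the EM-based baseline that the abstract argues against: one scikit-learn GaussianMixture fitted per class, with classification by maximum posterior. The SOMN estimator itself is not reproduced, and the component count is an assumption.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

class MixtureDiscriminantAnalysis:
    """Fit one Gaussian mixture per class; classify by maximum posterior.
    Uses EM (the baseline discussed in the abstract), not SOMN."""

    def __init__(self, n_components=3):
        self.n_components = n_components
        self.models, self.log_priors, self.classes_ = {}, {}, None

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        for c in self.classes_:
            Xc = X[y == c]
            gm = GaussianMixture(n_components=self.n_components, random_state=0)
            self.models[c] = gm.fit(Xc)
            self.log_priors[c] = np.log(len(Xc) / len(X))
        return self

    def predict(self, X):
        scores = np.column_stack(
            [self.models[c].score_samples(X) + self.log_priors[c]
             for c in self.classes_])
        return self.classes_[np.argmax(scores, axis=1)]
```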

Keywords: Self Organizing Mixture Network, Mixture Discriminant Analysis, Waveform Datasets, Glass Identification, Mixture of Multivariate Normal Distributions

831 Affine Projection Adaptive Filter with Variable Regularization

Authors: Young-Seok Choi

Abstract:

We propose two affine projection algorithms (APA) with a variable regularization parameter. Using a gradient descent based approach, the proposed algorithms dynamically update the regularization parameter that is fixed in the conventional regularized APA (R-APA). By introducing a normalized gradient, the proposed algorithms yield an efficient and robust update scheme for the regularization parameter. Through experiments we demonstrate that the proposed algorithms outperform the conventional R-APA in terms of convergence rate and misadjustment error.
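
A minimal NumPy sketch of the conventional R-APA recursion that the abstract starts from is given below; the regularization parameter delta is kept fixed here, since the authors' normalized-gradient rule for adapting it is not spelled out in the abstract.

```python
import numpy as np

def regularized_apa(x, d, M=8, P=4, mu=0.5, delta=1e-2):
    """Regularized affine projection algorithm (R-APA) with a fixed
    regularization parameter delta (the paper adapts delta online)."""
    x, d = np.asarray(x, float), np.asarray(d, float)
    w, e_hist = np.zeros(M), np.zeros(len(x))
    for n in range(M + P - 2, len(x)):
        # U holds the last P input regressors as columns (each of length M).
        U = np.column_stack([x[n - p - M + 1:n - p + 1][::-1] for p in range(P)])
        e = np.array([d[n - p] for p in range(P)]) - U.T @ w   # a priori errors
        w += mu * U @ np.linalg.solve(U.T @ U + delta * np.eye(P), e)
        e_hist[n] = e[0]
    return w, e_hist
```

With d generated by passing x through an unknown FIR system plus noise, the returned w approximates that system, matching the system identification setting named in the keywords.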

Keywords: Affine projection, regularization, gradient descent, system identification.

830 Faults Forecasting System

Authors: Hanaa E. Sayed, Hossam A. Gabbar, Shigeji Miyazaki

Abstract:

This paper presents a Faults Forecasting System (FFS) that utilizes statistical forecasting techniques to analyze process variable data in order to forecast fault occurrences. FFS proposes a new idea in fault detection. Current fault detection techniques are based on analyzing the current status of the system variables in order to check whether the current status is faulty or not. FFS instead uses forecasting techniques to predict the future timing of faults before they happen. The proposed model applies a subset modeling strategy and a Bayesian approach in order to decrease the dimensionality of the process variables and improve fault forecasting accuracy. A practical experiment was designed and implemented at Okayama University, Japan, and the comparison shows that the proposed model achieves high forecasting accuracy ahead of time.

Keywords: Bayesian Techniques, Faults Detection, Forecasting techniques, Multivariate Analysis.

829 Monte Carlo Estimation of Heteroscedasticity and Periodicity Effects in a Panel Data Regression Model

Authors: Nureni O. Adeboye, Dawud A. Agunbiade

Abstract:

This research investigates the effects of heteroscedasticity and periodicity in a Panel Data Regression Model (PDRM) by extending previous works on balanced panel data estimation within the context of fitting a PDRM for banks' audit fees. The estimation of such a model was achieved through the derivation of a joint Lagrange Multiplier (LM) test for homoscedasticity and zero serial correlation, a conditional LM test for zero serial correlation given heteroscedasticity of varying degrees, as well as a conditional LM test for homoscedasticity given first-order positive serial correlation, via a two-way error component model. Monte Carlo simulations were carried out for 81 different variations, whose design assumed a uniform distribution under a linear heteroscedasticity function. Each variation was iterated 1,000 times, and the three estimators considered were assessed on the variance, absolute bias (ABIAS), mean square error (MSE) and root mean square error (RMSE) of the parameter estimates. Eighteen different models were fitted under different specified conditions, and the best-fitted model is that of the within estimator when heteroscedasticity is severe at either zero or positive serial correlation. The LM test results showed that the tests have good size and power, as all three tests are significant at 5% for the specified linear form of the heteroscedasticity function, which establishes that banks' operations are severely heteroscedastic in nature with little or no periodicity effects.

Keywords: Audit fee, heteroscedasticity, Lagrange multiplier test, periodicity.

828 Retaining Structural System Active Vibration Control

Authors: Ming-Hui Lee, Shou-Jen Hsu

Abstract:

This study presents an active vibration control technique to reduce the earthquake responses of a retained structural system. The proposed technique is a synthesis of the adaptive input estimation method (AIEM) and a linear quadratic Gaussian (LQG) controller. The AIEM can estimate an unknown system input online. The LQG controller provides optimal control forces to suppress the vibration of the wall-structure system. The numerical results demonstrate the robust performance of the proposed active vibration control technique.
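
The LQG part of the method can be sketched as a state-feedback gain obtained from the continuous algebraic Riccati equation with SciPy; the single-degree-of-freedom structural model and the weighting matrices below are illustrative assumptions, and the AIEM input estimation step is omitted.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Single-degree-of-freedom structure: states [displacement, velocity].
# Mass m, damping c and stiffness k are illustrative, not the paper's values.
m, c, k = 1.0, 0.05, 10.0
A = np.array([[0.0, 1.0], [-k / m, -c / m]])
B = np.array([[0.0], [1.0 / m]])

# LQR part of the LQG controller: minimize the integral of x'Qx + u'Ru.
Q = np.diag([100.0, 1.0])
R = np.array([[0.01]])
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)   # optimal state-feedback gain, u = -K x

print("LQR gain:", K)
```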

Keywords: Active vibration control, AIEM, LQG, Optimal control

827 Transformation of the Traditional Landscape of Kabul Old City: A Study for Its Conservation

Authors: Mohammad Umar Azizi, Tetsuya Ando

Abstract:

This study investigates the transformation of the traditional landscape of Kabul Old City through an examination of five case study areas. Based on physical observation, three types of houses are found: traditional, mixed and modern. Firstly, characteristics of the houses are described according to construction materials and the number of stories. Secondly, internal and external factors are considered in order to implement a conservation plan. Finally, an adaptive conservation plan is suggested to protect the traditional landscape of Kabul Old City.

Keywords: Conservation, District 1, Kabul Old City, landscape, transformation, traditional houses.

826 Adaptive Distributed Genetic Algorithms and Its VLSI Design

Authors: Kazutaka Kobayashi, Norihiko Yoshida, Shuji Narazaki

Abstract:

This paper presents a dynamic adaptation scheme for the frequency of inter-deme migration in distributed genetic algorithms (GA), and its VLSI hardware design. A distributed GA, or multi-deme GA, uses multiple populations which evolve concurrently. The purpose of the dynamic adaptation is to improve convergence performance so as to obtain better solutions. Through simulation experiments, we show that our scheme achieves better performance than fixed-frequency migration schemes.
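
A toy sketch of a multi-deme GA with an adaptive migration interval is shown below; the adaptation rule (shorten the interval while migration keeps improving the global best, lengthen it otherwise), the operators and the bit-counting objective are assumptions, not the authors' scheme or its VLSI design.

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(pop):
    return pop.sum(axis=1)          # toy objective: count of ones

def evolve(pop, n_parents=10, p_mut=0.01):
    """One generation of truncation selection plus bit-flip mutation."""
    parents = pop[np.argsort(fitness(pop))[-n_parents:]]
    children = parents[rng.integers(0, n_parents, size=len(pop))]
    flip = rng.random(children.shape) < p_mut
    return np.where(flip, 1 - children, children)

def distributed_ga(n_demes=4, pop_size=30, n_bits=64, generations=200):
    demes = [rng.integers(0, 2, size=(pop_size, n_bits)) for _ in range(n_demes)]
    interval, since, best = 10, 0, -np.inf
    for _ in range(generations):
        demes = [evolve(p) for p in demes]
        since += 1
        if since >= interval:
            # Ring migration: each deme receives the previous deme's best.
            bests = [p[np.argmax(fitness(p))] for p in demes]
            for i, p in enumerate(demes):
                p[np.argmin(fitness(p))] = bests[i - 1]
            new_best = max(fitness(p).max() for p in demes)
            # Adapt the migration frequency based on observed improvement.
            interval = max(2, interval - 2) if new_best > best else min(40, interval + 2)
            best, since = max(best, new_best), 0
    return max(fitness(p).max() for p in demes)

print(distributed_ga())
```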

Keywords: Genetic algorithms, dynamic adaptation, VLSI hardware.

825 Evidence of the Long-run Equilibrium between Money Demand Determinants in Croatia

Authors: B. Skrabic, N. Tomic-Plazibat

Abstract:

In this paper, the real money demand function is analyzed within a multivariate time-series framework. A cointegration approach (the Johansen procedure) is used, assuming interdependence between the money demand determinants, which are nonstationary variables. This helps us to understand the behavior of money demand in Croatia, revealing the significant influence between the endogenous variables in a vector autoregression (VAR) system, i.e., a vector error correction model (VECM). Exogeneity of the explanatory variables is tested. The long-run money demand function is estimated, indicating a slow speed of adjustment in removing the disequilibrium. Empirical results provide evidence that real industrial production and the exchange rate explain most of the variation in money demand in the long run, while the interest rate is significant only in the short run.
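
A minimal statsmodels sketch of the Johansen test and VECM estimation used in this kind of analysis is given below; the synthetic series and column names merely stand in for the Croatian money demand data.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import coint_johansen, VECM

# Placeholder series; in the paper these would be real money, industrial
# production, the exchange rate and the interest rate.
rng = np.random.default_rng(1)
n = 200
trend = rng.normal(size=n).cumsum()                 # shared stochastic trend
data = pd.DataFrame({
    "money": trend + rng.normal(scale=0.5, size=n),
    "production": 0.8 * trend + rng.normal(scale=0.5, size=n),
    "exch_rate": rng.normal(size=n).cumsum(),
})

# Johansen trace test for the cointegration rank.
jres = coint_johansen(data, det_order=0, k_ar_diff=1)
print("trace statistics:   ", jres.lr1)
print("95% critical values:", jres.cvt[:, 1])

# Vector error correction model with one cointegrating relation; alpha holds
# the speed-of-adjustment coefficients discussed in the abstract.
vecm = VECM(data, k_ar_diff=1, coint_rank=1, deterministic="ci").fit()
print(vecm.alpha)
```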

Keywords: Cointegration, Long-run equilibrium, Money demand function, Vector error correction model.

824 A New Composition Method of Admissible Support Vector Kernel Based on Reproducing Kernel

Authors: Wei Zhang, Xin Zhao, Yi-Fan Zhu, Xin-Jian Zhang

Abstract:

The kernel function, which allows the formulation of nonlinear variants of any algorithm that can be cast in terms of dot products, has enabled Support Vector Machines (SVM) to be successfully applied in many fields, e.g., classification and regression. The importance of the kernel has motivated many studies on its composition. It is well known that the reproducing kernel (R.K.) is a useful kernel function which possesses many properties, e.g., positive definiteness, the reproducing property, and the ability to compose complex R.K.s by simple operations. There are two popular ways to compute an R.K. in explicit form. One is to construct and solve a specific differential equation with boundary values, whose handicap is that it cannot provide a unified form of the R.K. The other is to use a piecewise integral of the Green function associated with a differential operator L. The latter benefits the computation of an R.K. with a unified explicit form and theoretical analysis, although studies of it are relatively recent and practical computations are fewer. In this paper, a new algorithm for computing an R.K. is presented. It can obtain the unified explicit form of the R.K. in a general reproducing kernel Hilbert space. It avoids constructing and solving complex differential equations manually and enables an automatic, flexible and rigorous computation for more general RKHSs. In order to validate that the R.K. computed by the algorithm can be used well in SVM, some illustrative examples and a comparison between the R.K. and the Gaussian kernel (RBF) in support vector regression are presented. The results show that the performance of the R.K. is close to or slightly superior to that of the RBF.
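
The comparison between a reproducing kernel and the RBF kernel in support vector regression can be illustrated with scikit-learn, which accepts a callable Gram-matrix kernel. The particular kernel below, k(s, t) = 1 + min(s, t) (a reproducing kernel of a first-order Sobolev-type space on [0, 1]), is a stand-in assumption rather than the R.K. produced by the authors' algorithm.

```python
import numpy as np
from sklearn.svm import SVR

def sobolev_rk(X, Y):
    """Gram matrix of k(s, t) = 1 + min(s, t) for one-dimensional inputs,
    used here as a stand-in reproducing kernel."""
    return 1.0 + np.minimum(X, Y.T)

# One-dimensional toy regression problem on [0, 1].
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(80, 1))
y = np.sin(2 * np.pi * X[:, 0]) + rng.normal(scale=0.1, size=80)

svr_rk = SVR(kernel=sobolev_rk, C=10.0).fit(X, y)
svr_rbf = SVR(kernel="rbf", C=10.0, gamma="scale").fit(X, y)
print("R.K. train R^2:", svr_rk.score(X, y))
print("RBF  train R^2:", svr_rbf.score(X, y))
```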

Keywords: admissible support vector kernel, reproducing kernel, reproducing kernel Hilbert space, Green function, support vector regression

823 Sparsity-Aware Affine Projection Algorithm for System Identification

Authors: Young-Seok Choi

Abstract:

This work presents a new type of affine projection (AP) algorithm which incorporates the sparsity condition of a system. To exploit the sparsity of the system, a weighted l1-norm regularization is imposed on the cost function of the AP algorithm. By minimizing the cost function with a subgradient calculus and choosing two distinct weightings for the l1-norm, two stochastic gradient based sparsity-regularized AP (SR-AP) algorithms are developed. Experimental results show that the SR-AP algorithms outperform their conventional AP counterparts in identifying sparse systems.
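
A minimal sketch of a zero-attracting variant of the AP recursion is shown below, where an l1-norm subgradient (sign) term is subtracted after the usual update; the unweighted l1 penalty and the step sizes are assumptions, simpler than the two weightings proposed in the paper.

```python
import numpy as np

def sparse_apa(x, d, M=16, P=2, mu=0.5, delta=1e-2, rho=5e-4):
    """Affine projection update with an l1-norm (sparsity) subgradient term.
    rho sets the zero-attracting strength; an unweighted l1 penalty is used
    here as the simplest choice, not the paper's weighted versions."""
    x, d = np.asarray(x, float), np.asarray(d, float)
    w = np.zeros(M)
    for n in range(M + P - 2, len(x)):
        U = np.column_stack([x[n - p - M + 1:n - p + 1][::-1] for p in range(P)])
        e = np.array([d[n - p] for p in range(P)]) - U.T @ w
        w += mu * U @ np.linalg.solve(U.T @ U + delta * np.eye(P), e)
        w -= rho * np.sign(w)          # zero-attracting (sparsity) step
    return w
```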

Keywords: System identification, adaptive filter, affine projection, sparsity, sparse system.

822 An Improved Transfer Logic of the Two-Path Algorithm for Acoustic Echo Cancellation

Authors: Chang Liu, Zishu He

Abstract:

Adaptive echo cancellers with the two-path algorithm are applied to avoid false adaptation during double-talk situations. In the two-path algorithm, several transfer logic solutions have been proposed to control the filter update. This paper presents an improved transfer logic solution. It improves the convergence speed of the two-path algorithm and allows a reduction in memory elements and computational complexity. Simulation results show the improved performance of the proposed solution.
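
The two-path structure can be sketched as follows: a background NLMS filter adapts continuously, and its coefficients are copied to the fixed foreground filter only when a transfer test passes. The simple smoothed-error-power test used here is an illustrative placeholder for the improved transfer logic proposed in the paper.

```python
import numpy as np

def two_path_nlms(x, d, M=64, mu=0.5, eps=1e-6, window=64, margin=0.9):
    """Two-path echo canceller: the background filter adapts with NLMS and is
    transferred to the foreground filter when its smoothed error power stays
    below `margin` times the foreground error power (simplified logic)."""
    x, d = np.asarray(x, float), np.asarray(d, float)
    w_bg, w_fg = np.zeros(M), np.zeros(M)
    p_bg = p_fg = 1e-3
    out = np.zeros(len(x))
    lam = 1.0 - 1.0 / window
    for n in range(M - 1, len(x)):
        u = x[n - M + 1:n + 1][::-1]
        e_bg, e_fg = d[n] - w_bg @ u, d[n] - w_fg @ u
        out[n] = e_fg                               # echo-cancelled output
        w_bg += mu * e_bg * u / (u @ u + eps)       # NLMS background update
        p_bg = lam * p_bg + (1 - lam) * e_bg ** 2   # smoothed error powers
        p_fg = lam * p_fg + (1 - lam) * e_fg ** 2
        if p_bg < margin * p_fg:
            w_fg = w_bg.copy()                      # transfer coefficients
    return out, w_fg
```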

Keywords: Acoustic echo cancellation, Echo return loss enhancement (ERLE), Two-path algorithm, Transfer logic

821 Evolutionary Computing Approach for the Solution of Initial value Problems in Ordinary Differential Equations

Authors: A. Junaid, M. A. Z. Raja, I. M. Qureshi

Abstract:

An evolutionary computing technique for solving initial value problems in ordinary differential equations is proposed in this paper. A neural network is used as a universal approximator, while the adaptive parameters of the neural network are optimized by a genetic algorithm. The solution is obtained on a continuous grid of time instead of a discrete one, as in other numerical techniques. A comparison is carried out with classical numerical techniques, and the solution is found with a uniform accuracy of MSE ≈ 10⁻⁹.

Keywords: Neural networks, Unsupervised learning, Evolutionary computing, Numerical methods, Fitness evaluation function.

820 Interest Rate Fluctuation Effect on Commercial Bank’s Fixed Fund Deposit in Nigeria

Authors: Okolo Chimaobi Valentine

Abstract:

Commercial banks in Nigeria adopted many strategies to attract fresh deposits, including the use of high deposit rates. However, the pricing of banking services moved in favor of the banks at the expense of customers, resulting in customers seeking other investment alternatives rather than saving their money in the bank. Both deposit and lending rates were greatly influenced by the Central Bank of Nigeria (CBN) decision on the interest rate. Therefore, commercial banks' efforts to attract deposits by manipulating their rates were greatly limited; otherwise, the banks would be giving out more than they earned. The study aimed at examining the relationship between the interest rate and the fixed fund deposits of commercial banks, and how the policy-controlled interest rate affected commercial banks' fixed fund deposits. The researcher employed the ordinary least squares technique, using multiple linear regression, unrestricted vector autoregression, a correlation matrix test, Granger causality and impulse response graphs in the analysis. Commercial banks' interest rates affected their fixed fund deposits significantly, while the policy-controlled interest rate did not transmit significantly through the commercial banks' interest rates to affect fixed fund deposits. While commercial banks seek creative ways to expand their fixed fund deposits, policy authorities in Nigeria should better coordinate interest rate fluctuations and induce competition in the entire financial sector.

Keywords: Commercial bank, fixed fund deposit, fluctuation effects, interest rate.

819 Exploiting Self-Adaptive Replication Management on Decentralized Tuple Space

Authors: Xing Jiankuan, Qin Zheng, Zhang Jinxue

Abstract:

Decentralized Tuple Space (DTS) implements the tuple space model among a series of decentralized hosts and provides a logical global shared tuple repository. Replication has been introduced to alleviate the performance problem incurred by remote tuple access. In this paper, we propose a replication approach for DTS that allows the replication policies to self-adapt. The accesses from users or other nodes are monitored and collected to contribute to the decision making. The replication policy may be changed if better performance is expected. The experiments show that this approach suitably adjusts the replication policies while introducing negligible overhead.

Keywords: Decentralization, Replication Management, Self-Adaptation, Tuple Space.

818 Kalman Filter Based Adaptive Reduction of Motion Artifact from Photoplethysmographic Signal

Authors: S. Seyedtabaii, L. Seyedtabaii

Abstract:

Artifact-free photoplethysmographic (PPG) signals are necessary for the non-invasive estimation of oxygen saturation (SpO2) in arterial blood. Movement of a patient corrupts the PPGs with motion artifacts, resulting in large errors in the computation of SpO2. This paper presents a study on using the Kalman filter in an innovative way, by modeling both the arterial blood pressure (ABP) and the unwanted signal, the additive motion artifact, to reduce motion artifacts from corrupted PPG signals. Simulation results show acceptable performance compared with LMS and variable-step LMS, thus establishing the efficacy of the proposed method.
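
A minimal scalar Kalman filter over a random-walk state model gives the flavor of the approach; the state model and the noise variances are assumptions, and the paper's joint modeling of the arterial blood pressure and the motion artifact is not reproduced.

```python
import numpy as np

def kalman_denoise(ppg, q=1e-4, r=1e-1):
    """Scalar Kalman filter with a random-walk state model.
    q: process-noise variance, r: measurement-noise variance (assumed)."""
    ppg = np.asarray(ppg, float)
    x_est, p_est = ppg[0], 1.0
    out = np.empty_like(ppg)
    for k, z in enumerate(ppg):
        x_pred, p_pred = x_est, p_est + q        # predict (random walk)
        gain = p_pred / (p_pred + r)             # Kalman gain
        x_est = x_pred + gain * (z - x_pred)     # update with measurement z
        p_est = (1.0 - gain) * p_pred
        out[k] = x_est
    return out
```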

Keywords: Kalman filter, Motion artifact, PPG, Photoplethysmography.

817 Collaborative and Experimental Cultures in Virtual Reality Journalism: From the Perspective of Content Creators

Authors: Radwa Mabrook

Abstract:

Virtual Reality (VR) content creation is a complex and an expensive process, which requires multi-disciplinary teams of content creators. Grant schemes from technology companies help media organisations to explore the VR potential in journalism and factual storytelling. Media organisations try to do as much as they can in-house, but they may outsource due to time constraints and skill availability. Journalists, game developers, sound designers and creative artists work together and bring in new cultures of work. This study explores the collaborative experimental nature of VR content creation, through tracing every actor involved in the process and examining their perceptions of the VR work. The study builds on Actor Network Theory (ANT), which decomposes phenomena into their basic elements and traces the interrelations among them. Therefore, the researcher conducted 22 semi-structured interviews with VR content creators between November 2017 and April 2018. Purposive and snowball sampling techniques allowed the researcher to recruit fact-based VR content creators from production studios and media organisations, as well as freelancers. Interviews lasted up to three hours, and they were a mix of Skype calls and in-person interviews. Participants consented for their interviews to be recorded, and for their names to be revealed in the study. The researcher coded interviews’ transcripts in Nvivo software, looking for key themes that correspond with the research questions. The study revealed that VR content creators must be adaptive to change, open to learn and comfortable with mistakes. The VR content creation process is very iterative because VR has no established work flow or visual grammar. Multi-disciplinary VR team members often speak different languages making it hard to communicate. However, adaptive content creators perceive VR work as a fun experience and an opportunity to learn. The traditional sense of competition and the strive for information exclusivity are now replaced by a strong drive for knowledge sharing. VR content creators are open to share their methods of work and their experiences. They target to build a collaborative network that aims to harness VR technology for journalism and factual storytelling. Indeed, VR is instilling collaborative and experimental cultures in journalism.

Keywords: Collaborative culture, content creation, experimental culture, virtual reality.

816 Comparing Machine Learning Estimation of Fuel Consumption of Heavy-Duty Vehicles

Authors: Victor Bodell, Lukas Ekstrom, Somayeh Aghanavesi

Abstract:

Fuel consumption (FC) is one of the key factors in determining the expenses of operating a heavy-duty vehicle. A customer may therefore request an estimate of the FC of a desired vehicle. The modular design of heavy-duty vehicles allows their construction by specifying the building blocks, such as the gear box, engine and chassis type. If the combination of building blocks is unprecedented, it is unfeasible to measure the FC, since this would first require the construction of the vehicle. This paper proposes a machine learning approach to predict FC. The study uses information on around 40,000 vehicles' specifications and operational environmental conditions, such as road slopes and driver profiles. All vehicles have diesel engines and a mileage of more than 20,000 km. The data is used to investigate the accuracy of the machine learning algorithms linear regression (LR), K-nearest neighbor (KNN) and artificial neural networks (ANN) in predicting fuel consumption for heavy-duty vehicles. The performance of the algorithms is evaluated by reporting the prediction error on both simulated data and operational measurements, and is compared using nested cross-validation and statistical hypothesis testing. The statistical evaluation procedure finds that ANNs have the lowest prediction error compared to LR and KNN in estimating fuel consumption on both simulated and operational data. The models have a mean relative prediction error of 0.3% on simulated data, and 4.2% on operational data.
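
A minimal scikit-learn sketch of the model comparison is given below on placeholder data; the feature set is an assumption, and full nested cross-validation with hypothesis testing is omitted.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neighbors import KNeighborsRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

# Placeholder features standing in for vehicle specification and operating
# conditions (e.g. engine power, gross weight, mean road slope, mileage).
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))
y = 30 + 3 * X[:, 0] + 2 * X[:, 1] - X[:, 2] + rng.normal(scale=1.0, size=500)

models = {
    "LR": LinearRegression(),
    "KNN": make_pipeline(StandardScaler(), KNeighborsRegressor(n_neighbors=7)),
    "ANN": make_pipeline(StandardScaler(),
                         MLPRegressor(hidden_layer_sizes=(32, 16),
                                      max_iter=2000, random_state=0)),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5,
                             scoring="neg_mean_absolute_percentage_error")
    print(f"{name}: mean relative prediction error = {-scores.mean():.3%}")
```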

Keywords: Artificial neural networks, fuel consumption, machine learning, regression, statistical tests.

815 An Adaptive Model for Blind Image Restoration using Bayesian Approach

Authors: S.K. Satpathy, S.K. Nayak, K. K. Nagwanshi, S. Panda, C. Ardil

Abstract:

Image restoration involves the elimination of noise. Filtering techniques have been adopted to restore images over the last five decades. In this paper, we consider the problem of restoring an image degraded by a blur function and corrupted by random noise. A method for reducing additive noise in images by explicit analysis of local image statistics is introduced and compared to other noise reduction methods. The proposed method, which makes use of an a priori noise model, has been evaluated on various types of images. Bayesian-based algorithms and image processing techniques are described and substantiated with experiments using MATLAB.

Keywords: Image Restoration, Probability Density Function (PDF), Neural Networks, Bayesian Classifier.

814 Fuzzy Controller Design for Ball and Beam System with an Improved Ant Colony Optimization

Authors: Yeong-Hwa Chang, Chia-Wen Chang, Hung-Wei Lin, C.W. Tao

Abstract:

In this paper, an improved ant colony optimization (ACO) algorithm is proposed to enhance the performance of the global optimum search. The strategy of the proposed algorithm has the capabilities of fuzzy pheromone updating, adaptive parameter tuning, and mechanism resetting. The proposed method is utilized to tune the parameters of the fuzzy controller for a real ball and beam system. Simulation and experimental results indicate that better performance can be achieved compared to conventional ACO algorithms in terms of convergence speed and accuracy.

Keywords: Ant colony algorithm, Fuzzy control, ball and beam system

813 Conceptual Method for Flexible Business Process Modeling

Authors: Adla Bentellis, Zizette Boufaïda

Abstract:

Nowadays, the pace of business change is such that, increasingly, new functionality has to be realized and reliably installed in a matter of days, or even hours. Consequently, more and more business processes are prone to continuous change. The objective of the research in progress is to use the MAP model in a conceptual modeling method for flexible and adaptive business processes. This method can be used to capture the flexibility dimensions of a business process; it takes inspiration from the modularity concept of the object-oriented paradigm to establish a hierarchical construction of the BP model. Its intent is to provide flexible modeling that allows companies to quickly adapt their business processes.

Keywords: Business Process, Business process modeling, flexibility, MAP Model.

812 Optimizing and Evaluating Performance Quality Control of the Production Process of Disposable Essentials Using Approach Vague Goal Programming

Authors: Hadi Gholizadeh, Ali Tajdin

Abstract:

To have effective production planning, it is necessary to control the quality of processes. This paper aims at improving the performance of the disposable essentials process using statistical quality control and goal programming in a vague environment, which expresses uncertainty, because there is always measurement error in the real world. Therefore, in this study, the conditions are examined in a vague, distance-based environment. The disposable essentials process in Kach Company was studied. Statistical control tools were used to characterize the existing process for four factor responses: the averages of the disposable glasses' weights, heights, crater diameters, and volumes. Goal programming was then utilized to find the optimal combination of factor settings in the vague environment, which accounts for the uncertainty of the initial information when some of the model parameters are vague; a fuzzy regression model is also used to predict the responses of the four described factors. Optimization results show that the process capability index values for the disposable glasses' average weights, heights, crater diameters and volumes were improved. This increases the quality of the products and reduces waste, which will reduce the cost of the finished product and ultimately bring customer satisfaction, which in turn will mean increased sales.

Keywords: Goal programming, quality control, vague environment, disposable glasses’ optimization, fuzzy regression.

811 An Innovational Intermittent Algorithm in Networks-On-Chip (NOC)

Authors: Ahmad M. Shafiee, Mehrdad Montazeri, Mahdi Nikdast

Abstract:

Every day, human life experiences new equipment that is more automatic and more capable, so the need for faster processors does not seem to end. Despite new architectures and higher frequencies, a single processor is not adequate for many applications. Parallel processing and networks are earlier solutions to this problem. The newer solution, putting a network of resources on a chip, is called a network-on-chip (NOC). The most common topology for a NOC is the mesh topology. There are several routing algorithms suitable for this topology, such as XY, fully adaptive, etc. In this paper we suggest a new algorithm named Intermittent X, Y (IX/Y). We have developed the new algorithm in a simulation environment to compare its delay and power consumption with those of earlier algorithms.
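
For reference, the deterministic XY routing that such mesh algorithms build on can be sketched in a few lines; the intermittent rule of IX/Y is not given in the abstract and is therefore not reproduced here.

```python
def xy_route(src, dst):
    """Deterministic XY routing on a 2-D mesh NoC: move along the X dimension
    until the destination column is reached, then along Y."""
    x, y = src
    dx, dy = dst
    hops = []
    while x != dx:
        x += 1 if dx > x else -1
        hops.append(("X", (x, y)))
    while y != dy:
        y += 1 if dy > y else -1
        hops.append(("Y", (x, y)))
    return hops

print(xy_route((0, 0), (2, 3)))   # routers visited from (0, 0) to (2, 3)
```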

Keywords: Computer architecture, parallel computing, NOC, routing algorithm.

810 The Fuel Consumption and Non Linear Model Metropolitan and Large City Transportation System

Authors: Mudjiastuti Handajani

Abstract:

National economic development affects vehicle ownership, which ultimately increases fuel consumption. The rise in vehicle ownership is dominated by the increasing number of motorcycles. This research aims to analyze and identify the characteristics of fuel consumption and the city transportation system, and to analyze the relationship and the effect of the city transportation system on fuel consumption. A multivariable analysis is used in this study; the data analysis techniques include a multivariate multivariable analysis using the R software. More than 84% of fuel on Java is consumed in metropolitan and large cities. The city transportation system variables that strongly affect fuel consumption are population, public vehicles, private vehicles and private buses. This method can be developed to control fuel consumption by considering the urban transport system and city typology. The effect can reduce subsidies on fuel consumption and increase the state economy.

Keywords: city, consumption, fuel, transportation

809 Modeling Default Probabilities of the Chosen Czech Banks in the Time of the Financial Crisis

Authors: Petr Gurný

Abstract:

One of the most important tasks in risk management is the correct determination of the probability of default (PD) of particular financial subjects. In this paper, a possibility of determining a financial institution's PD according to credit-scoring models is discussed. The paper is divided into two parts. The first part is devoted to the estimation of three different models (based on linear discriminant analysis, logit regression and probit regression) from a sample of almost three hundred US commercial banks. Afterwards, these models are compared and verified on a control sample with a view to choosing the best one. The second part of the paper is aimed at the application of the chosen model to a portfolio of three key Czech banks to estimate their present financial stability. However, it is no less important to be able to estimate the evolution of PD in the future. For this reason, the second task in this paper is to estimate the probability distribution of the future PD for the Czech banks. So, the values of particular indicators are sampled randomly and the distribution of the PDs is estimated, while it is assumed that the indicators are distributed according to a multidimensional subordinated Lévy model (the Variance Gamma model and the Normal Inverse Gaussian model, in particular). Although the obtained results show that all banks are relatively healthy, there is still a high chance that “a financial crisis” will occur, at least in terms of probability. This is indicated by the estimation of various quantiles in the estimated distributions. Finally, it should be noted that the applicability of the estimated model (with respect to the data used) is limited to the recessionary phase of the financial market.
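
The logit variant of such a credit-scoring model can be sketched with statsmodels as below; the indicators and data are placeholders, not the banks' actual financial ratios.

```python
import numpy as np
import statsmodels.api as sm

# Placeholder indicators standing in for bank financial ratios
# (e.g. capital adequacy, ROA, non-performing-loan ratio).
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 3))
p_true = 1 / (1 + np.exp(-(-2 + 1.5 * X[:, 0] - X[:, 1])))
default = (rng.random(300) < p_true).astype(int)

logit = sm.Logit(default, sm.add_constant(X)).fit(disp=False)
pd_hat = logit.predict(sm.add_constant(X))      # estimated PD per bank
print(logit.params)
print("mean estimated PD:", pd_hat.mean())
```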

Keywords: Credit-scoring Models, Multidimensional Subordinated Lévy Model, Probability of Default.

808 Gene Selection Guided by Feature Interdependence

Authors: Hung-Ming Lai, Andreas Albrecht, Kathleen Steinhöfel

Abstract:

Cancers can normally be marked by a number of differentially expressed genes which show enormous potential as biomarkers for a certain disease. In recent years, cancer classification based on the investigation of gene expression profiles derived from high-throughput microarrays has been widely used. The selection of discriminative genes is, therefore, an essential preprocessing step in carcinogenesis studies. In this paper, we propose a novel gene selector using information-theoretic measures for biological discovery. This multivariate filter is a four-stage framework based on the analyses of feature relevance, feature interdependence, feature redundancy-dependence and subset rankings, and it has been examined on the colon cancer data set. Our experimental results show that the proposed method outperforms other information-theoretic filters in all aspects of classification error and classification performance.
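
The relevance stage of such a filter can be illustrated by ranking genes by their mutual information with the class label; the interdependence and redundancy stages of the four-stage framework are omitted, and the expression matrix below is synthetic.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif

# Synthetic expression matrix: 60 samples x 500 genes, binary class labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 500))
y = rng.integers(0, 2, size=60)
X[:, :5] += y[:, None] * 2.0     # make the first five genes informative

# Stage 1 (relevance): rank genes by mutual information with the label.
mi = mutual_info_classif(X, y, random_state=0)
top_genes = np.argsort(mi)[::-1][:20]
print("top-ranked gene indices:", top_genes[:10])
```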

Keywords: Colon cancer, feature interdependence, feature subset selection, gene selection, microarray data analysis.

807 Electrocardiogram Signal Denoising Using a Hybrid Technique

Authors: R. Latif, W. Jenkal, A. Toumanari, A. Hatim

Abstract:

This paper presents an efficient method of electrocardiogram signal denoising based on a hybrid approach. Two techniques are brought together to create an efficient denoising process. The first is an Adaptive Dual Threshold Filter (ADTF) and the second is the Discrete Wavelet Transform (DWT). The presented approach is based on three denoising steps: the DWT decomposition, the ADTF step and the highest-peaks correction step. This paper presents some applications of the approach to electrocardiogram signals from the MIT-BIH database. The results of these applications are promising compared to other recently published techniques.
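
The DWT half of the hybrid can be sketched with PyWavelets as below, using a single universal soft threshold; the ADTF and the highest-peaks correction, which are the paper's contribution, are not reproduced.

```python
import numpy as np
import pywt

def dwt_denoise(ecg, wavelet="db4", level=4):
    """DWT decomposition followed by soft thresholding of detail coefficients.
    A single universal threshold is assumed here; the paper instead combines
    the DWT with an adaptive dual-threshold filter and peak correction."""
    ecg = np.asarray(ecg, float)
    coeffs = pywt.wavedec(ecg, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745      # noise level estimate
    thr = sigma * np.sqrt(2 * np.log(len(ecg)))         # universal threshold
    coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[:len(ecg)]
```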

Keywords: Hybrid technique, ADTF, DWT, thresholding, ECG signal.

806 Statistics of Exon Lengths in Animals, Plants, Fungi, and Protists

Authors: Alexander Kaplunovsky, Vladimir Khailenko, Alexander Bolshoy, Shara Atambayeva, Anatoliy Ivashchenko

Abstract:

Eukaryotic protein-coding genes are interrupted by spliceosomal introns, which are removed from the RNA transcripts before translation into a protein. The exon-intron structures of different eukaryotic species are quite different from each other, and the evolution of such structures raises many questions. We try to address some of these questions using a statistical analysis of whole genomes. We go through all the protein-coding genes in a genome and study correlations between the net length of all the exons in a gene, the number of exons, and the average length of an exon. We also take average values of these features for each chromosome and study correlations between those averages at the chromosomal level. Our data show universal features of exon-intron structures common to animals, plants, fungi, and protists (specifically, Arabidopsis thaliana, Caenorhabditis elegans, Drosophila melanogaster, Cryptococcus neoformans, Homo sapiens, Mus musculus, Oryza sativa, and Plasmodium falciparum). We have verified a linear correlation between the number of exons in a gene and the length of the protein coded by the gene: the protein length increases in proportion to the number of exons. On the other hand, the average length of an exon always decreases with the number of exons. Finally, chromosome clustering based on average chromosome properties and the parameters of the linear regression between the number of exons in a gene and the net length of those exons demonstrates that these average chromosome properties are genome-specific features.

Keywords: Comparative genomics, exon-intron structure, eukaryotic clustering, linear regression.

805 Development of Rock Engineering System-Based Models for Tunneling Progress Analysis and Evaluation: Case Study of Tailrace Tunnel of Azad Power Plant Project

Authors: S. Golmohammadi, M. Noorian Bidgoli

Abstract:

Tunneling progress is a key parameter in the blasting method of tunneling. Taking measures to enhance tunneling advance can limit the progress distance without a supporting system, subsequently reducing or eliminating the risk of damage. This paper focuses on modeling tunneling progress using three main groups of parameters (tunneling geometry, blasting pattern, and rock mass specifications) based on the Rock Engineering Systems (RES) methodology. In the proposed models, four main effective parameters on tunneling progress are considered as inputs (RMR, Q-system, Specific charge of blasting, Area), with progress as the output. Data from 86 blasts conducted at the tailrace tunnel in the Azad Dam, western Iran, were used to evaluate the progress value for each blast. The results indicated that, for the 86 blasts, the progress of the estimated model aligns mostly with the measured progress. This paper presents a method for building the interaction matrix (statistical base) of the RES model. Additionally, a comparison was made between the results of the new RES-based model and a Multi-Linear Regression (MLR) analysis model. In the RES-based model, the effective parameters are RMR (35.62%), Q (28.6%), q (specific charge of blasting) (20.35%), and A (15.42%), respectively, whereas for MLR analysis, the main parameters are RMR, Q (system), q, and A. These findings confirm the superior performance of the RES-based model over the other proposed models.

Keywords: Rock Engineering Systems, tunneling progress, Multi Linear Regression, Specific charge of blasting.

804 Indoor Air Pollution of the Flexographic Printing Environment

Authors: Jelena S. Kiurski, Vesna S. Kecić, Snežana M. Aksentijević

Abstract:

The identification and evaluation of organic and inorganic pollutants were performed in a flexographic facility in Novi Sad, Serbia. Air samples were collected and analyzed in situ during 4-hour working periods at five sampling points, using a mobile gas chromatograph and an ozonometer, during the printing of collagen casing. Experimental results showed that the concentrations of isopropyl alcohol, acetone, total volatile organic compounds and ozone varied during the sampling times. The highest average concentrations of 94.80 ppm and 102.57 ppm were reached 200 minutes after the start of production for isopropyl alcohol and total volatile organic compounds, respectively. The mutual dependences between the target hazardous pollutants and the microclimate parameters were confirmed using a multiple linear regression model in the software package STATISTICA 10. The multiple coefficients of determination obtained for ozone and acetone (0.507 and 0.589) with the microclimate parameters indicated a moderate correlation between the observed variables. However, a strong positive correlation with the microclimate parameters was obtained for isopropyl alcohol and total volatile organic compounds (0.760 and 0.852). Values of the F parameter higher than F-critical for all examined dependences indicated a statistically significant relationship between the concentration levels of the target pollutants and the microclimate parameters. Given that the microclimate parameters significantly affect the emission of the investigated gases, the application of eco-friendly materials in the production process is a necessity.
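
A minimal statsmodels sketch of the kind of multiple linear regression reported here (a pollutant concentration regressed on microclimate parameters, reading off the coefficient of determination and the F statistic) follows; the variable names and values are placeholders for the measured series.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Placeholder measurements standing in for the sampled series
# (temperature, relative humidity, isopropyl alcohol concentration).
rng = np.random.default_rng(0)
n = 48
df = pd.DataFrame({"temp": rng.normal(24, 2, n), "rh": rng.normal(45, 5, n)})
df["ipa"] = 20 + 2.5 * df["temp"] + 0.8 * df["rh"] + rng.normal(0, 3, n)

model = smf.ols("ipa ~ temp + rh", data=df).fit()
print(model.rsquared)                  # multiple coefficient of determination
print(model.fvalue, model.f_pvalue)    # F statistic and its p-value
```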

Keywords: Flexographic printing, indoor air, multiple regression analysis, pollution emission.
