Search results for: Genetic algorithm
1235 Capacity Optimization in Cooperative Cognitive Radio Networks
Authors: Mahdi Pirmoradian, Olayinka Adigun, Christos Politis
Abstract:
Cooperative spectrum sensing is a crucial challenge in cognitive radio networks. Cooperative sensing can increase the reliability of spectrum hole detection, optimize sensing time and reduce delay in cooperative networks. In this paper, an efficient central capacity optimization algorithm is proposed to minimize cooperative sensing time in a homogeneous sensor network using the OR decision rule, subject to constraints on the detection and false-alarm probabilities. The evaluation results reveal significant improvement in the sensing time and normalized capacity of the cognitive sensors.
Keywords: Cooperative networks, normalized capacity, sensing time.
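The OR fusion rule referenced in this abstract has a simple closed form: the fusion centre declares the band occupied if any sensor reports a detection. The sketch below computes the resulting cooperative detection and false-alarm probabilities for N identical sensors; it is a generic illustration of the OR rule, not the authors' capacity-optimization algorithm, and the per-sensor probabilities are placeholder values.

```python
# Cooperative sensing with the OR fusion rule: the band is declared occupied
# if at least one of the N sensors reports a detection.
def or_rule(p_d, p_f, n_sensors):
    """Return (Qd, Qf) for n identical sensors with per-sensor
    detection probability p_d and false-alarm probability p_f."""
    q_d = 1.0 - (1.0 - p_d) ** n_sensors   # P(at least one detects | signal present)
    q_f = 1.0 - (1.0 - p_f) ** n_sensors   # P(at least one false alarm | signal absent)
    return q_d, q_f

# Example: 5 cooperating sensors, each with Pd = 0.7 and Pf = 0.05 (placeholders).
print(or_rule(p_d=0.7, p_f=0.05, n_sensors=5))  # -> (0.99757, 0.22622)
```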
1234 Application of Computational Intelligence for Sensor Fault Detection and Isolation
Authors: A. Jabbari, R. Jedermann, W. Lang
Abstract:
The novelty of this research is the application of a new fault detection and isolation (FDI) technique for the supervision of sensor networks in a transportation system. In measurement systems it is necessary to detect all types of faults and failures based on a predefined algorithm. Recent advances in artificial neural networks (ANNs) have made them attractive for FDI purposes. In this paper, new probabilistic neural network features for data approximation and data classification are applied to plausibility checking in temperature measurement. For this purpose, a two-phase FDI mechanism was considered for residual generation and evaluation.
Keywords: Fault detection and Isolation, Neural network, Temperature measurement, measurement approximation and classification.
1233 Real-time ROI Acquisition for Unsupervised and Touch-less Palmprint
Authors: Yi Feng, Jingwen Li, Lei Huang, Changping Liu
Abstract:
In this paper we propose a novel method to acquire the ROI (region of interest) of an unsupervised, touch-less palmprint captured from a web camera in real time. We use the Viola-Jones approach and a skin model to get the target area in real time. Then an innovative coarse-to-fine approach to detect the key points on the hand is described. A new algorithm is used to find the candidate key points coarsely and quickly. In the fine stage, we verify the hand key points with the shape context descriptor. To make the system comfortable for the user, it can process hand images with different poses, even when the hand is closed. Experiments show promising results using the proposed method in various conditions.
Keywords: Palmprint recognition, hand detection, touch-less palmprint, ROI localization.
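The skin model used for hand localization is not specified in the abstract; the sketch below applies a classic rule-based RGB skin model (attributed to Peer et al.) as a stand-in, purely to illustrate how a skin mask can narrow the search region before key-point detection.

```python
import numpy as np

def skin_mask(rgb):
    """Boolean skin mask for an HxWx3 uint8 RGB image using a classic
    rule-based RGB skin model (Peer et al.); a stand-in for the paper's
    unspecified skin model, used here only to localize the hand region."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    return ((r > 95) & (g > 40) & (b > 20) &
            (rgb.max(axis=-1).astype(int) - rgb.min(axis=-1).astype(int) > 15) &
            (np.abs(r - g) > 15) & (r > g) & (r > b))

# The largest connected component of this mask would then be taken as the
# candidate hand area before coarse-to-fine key-point detection.
```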
1232 An Improved Switching Median Filter for Uniformly Distributed Impulse Noise Removal
Authors: Rajoo Pandey
Abstract:
The performance of an image filtering system depends on its ability to detect the presence of noisy pixels in the image. Most impulse detection schemes assume the presence of salt-and-pepper noise in the images and do not work satisfactorily in the case of uniformly distributed impulse noise. In this paper, a new algorithm is presented to improve the performance of the switching median filter in the detection of uniformly distributed impulse noise. The performance of the proposed scheme is demonstrated by results obtained from computer simulations on various images.
Keywords: Switching median filter, impulse noise, image filtering, impulse detection.
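A switching median filter first decides, pixel by pixel, whether a sample is impulsive and only then replaces it with the local median; clean pixels are left untouched. The sketch below is a minimal generic version with a fixed decision threshold, not the improved detector proposed in the paper, and the threshold and window size are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import median_filter

def switching_median(img, threshold=40, window=3):
    """Replace a pixel with the local median only when it deviates from
    that median by more than `threshold` (i.e. it is flagged as an impulse)."""
    med = median_filter(img.astype(float), size=window)
    noisy = np.abs(img.astype(float) - med) > threshold
    out = img.astype(float).copy()
    out[noisy] = med[noisy]
    return out.astype(img.dtype), noisy  # filtered image and the impulse map
```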
1231 On Improving Breast Cancer Prediction Using GRNN-CP
Authors: Kefaya Qaddoum
Abstract:
The aim of this study is to predict breast cancer and to construct a supporting model that produces more reliable predictions, a factor that is fundamental for public health. In this study, we utilize general regression neural networks (GRNN) to replace point predictions with prediction intervals in order to achieve a reasonable level of confidence. The mechanism employed here uses a machine learning framework called conformal prediction (CP) to assign consistent confidence measures to predictions, which is combined with GRNN. We apply the resulting algorithm to the problem of breast cancer diagnosis. The results show that the prediction constructed by this method is reasonable and could be useful in practice.
Keywords: Neural network, conformal prediction, cancer classification, regression.
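Conformal prediction turns point predictions into intervals at a chosen confidence level by calibrating on held-out residuals. The sketch below shows the split (inductive) conformal recipe around a scikit-learn regressor; the KNN model is only a stand-in for the GRNN (scikit-learn ships no GRNN), so this is a generic illustration rather than the paper's GRNN-CP procedure.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsRegressor  # stand-in for a GRNN

def split_conformal(X, y, alpha=0.1):
    """Split (inductive) conformal regression: fit on a training split,
    calibrate absolute residuals on a held-out split, and return a function
    mapping new inputs to intervals with ~(1 - alpha) coverage."""
    X_tr, X_cal, y_tr, y_cal = train_test_split(X, y, test_size=0.3, random_state=0)
    model = KNeighborsRegressor(n_neighbors=5).fit(X_tr, y_tr)
    residuals = np.abs(y_cal - model.predict(X_cal))
    n = len(residuals)
    level = min(1.0, np.ceil((1 - alpha) * (n + 1)) / n)   # finite-sample correction
    q = np.quantile(residuals, level)

    def predict_interval(X_new):
        center = model.predict(X_new)
        return center - q, center + q

    return predict_interval
```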
1230 Arabic Character Recognition using Artificial Neural Networks and Statistical Analysis
Authors: Ahmad M. Sarhan, Omar I. Al Helalat
Abstract:
In this paper, an Arabic letter recognition system based on Artificial Neural Networks (ANNs) and statistical analysis for feature extraction is presented. The ANN is trained using the Least Mean Squares (LMS) algorithm. In the proposed system, each typed Arabic letter is represented by a matrix of binary numbers that is used as input to a simple feature extraction system, whose output, together with the input matrix, is fed to an ANN. Simulation results are provided and show that the proposed system always produces a lower Mean Squared Error (MSE) and higher success rates than the current ANN solutions.
Keywords: ANN, backpropagation, Gaussian, LMS, MSE, neuron, standard deviation, Widrow-Hoff rule.
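The LMS (Widrow-Hoff) rule mentioned in the abstract and keywords updates each weight in proportion to the instantaneous error and the corresponding input. A minimal single-neuron version is sketched below; the learning rate, epoch count and data shapes are illustrative assumptions.

```python
import numpy as np

def lms_train(X, d, lr=0.01, epochs=50):
    """Least Mean Squares (Widrow-Hoff) training of a single linear neuron.
    X: (n_samples, n_features) inputs, d: (n_samples,) desired outputs."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for x, target in zip(X, d):
            error = target - w @ x        # instantaneous error e(n)
            w += lr * error * x           # Widrow-Hoff update: w <- w + mu * e * x
    return w
```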
1229 Design and Development of Automatic Leveling and Equalizing Hoist Device for Spacecraft
Authors: Fu Hao, Sun Gang, Tang Laiying, Cui Junfeng
Abstract:
To solve the problem of quick and accurate level adjustment during precise spacecraft mating, an automatic leveling and equalizing hoist device for spacecraft is developed. Based on lifting-point adjustment using an XY workbench, a leveling and equalizing controller built on a self-adaptive control algorithm is proposed. Simulation analysis and lifting tests with an engineering prototype verify the validity and reliability of the hoist device, which can meet the precision mating requirements of practical spacecraft applications.
Keywords: Automatic leveling and equalizing, hoist device, lifting point adjustment, self-adaptive control.
1228 A Visual Control Flow Language and Its Termination Properties
Authors: László Lengyel, Tihamér Levendovszky, Hassan Charaf
Abstract:
This paper presents the visual control flow support of Visual Modeling and Transformation System (VMTS), which facilitates composing complex model transformations out of simple transformation steps and executing them. The VMTS Visual Control Flow Language (VCFL) uses stereotyped activity diagrams to specify control flow structures and OCL constraints to choose between different control flow branches. This work discusses the termination properties of VCFL and provides an algorithm to support the termination analysis of VCFL transformations.
Keywords: Control Flow, Metamodel-Based Visual Model Transformation, OCL, Termination Properties, UML.
1227 Index t-SNE: Tracking Dynamics of High-Dimensional Datasets with Coherent Embeddings
Authors: G. Candel, D. Naccache
Abstract:
t-SNE is an embedding method that the data science community has widely adopted. It supports two main tasks: displaying results by coloring items according to their class or a feature value, and, in a forensic setting, giving a first overview of the dataset distribution. Two interesting characteristics of t-SNE are its structure preservation property and its answer to the crowding problem, where all neighbors in high-dimensional space cannot be represented correctly in low-dimensional space. t-SNE preserves the local neighborhood, and similar items are nicely spaced by adjusting to the local density. These two characteristics produce a meaningful representation, where the area of a cluster is proportional to its size in number, and relationships between clusters are materialized by closeness in the embedding. This algorithm is non-parametric: the transformation from the high- to the low-dimensional space is described but not learned, so two initializations of the algorithm lead to two different embeddings. In a forensic approach, analysts would like to compare two or more datasets using their embeddings. A naive approach would be to embed all datasets together; however, this process is costly, as the complexity of t-SNE is quadratic, and would be infeasible for too many datasets. Another approach would be to learn a parametric model over an embedding built with a subset of the data. While this approach is highly scalable, points could be mapped to the exact same position, making them indistinguishable, and this type of model would be unable to adapt to new outliers or to concept drift. This paper presents a methodology to reuse an embedding to create a new one in which cluster positions are preserved. The optimization process minimizes two costs, one relative to the embedding shape and the second relative to the match with the support embedding. The embedding-with-support process can be repeated more than once, using the newly obtained embedding as the next support. The successive embeddings can be used to study the impact of one variable on the dataset distribution or to monitor changes over time. This method has the same complexity as t-SNE per embedding, and memory requirements are only doubled. For a dataset of n elements sorted and split into k subsets, the total embedding complexity is reduced from O(n²) to O(n²/k), and the memory requirement from n² to 2(n/k)², which enables computation on recent laptops. The method showed promising results on a real-world dataset, allowing us to observe the birth, evolution and death of clusters. The proposed approach facilitates identifying significant trends and changes, which supports monitoring the dynamics of high-dimensional datasets.
Keywords: Concept drift, data visualization, dimension reduction, embedding, monitoring, reusability, t-SNE, unsupervised learning.
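The complexity figures quoted above follow from embedding the k subsets one after another: each subset of n/k points costs O((n/k)²), so the total cost is

```latex
k \cdot O\!\left(\left(\tfrac{n}{k}\right)^{2}\right) = O\!\left(\tfrac{n^{2}}{k}\right),
```

and since only the current embedding and its support need to be held in memory at any time, the memory requirement is 2(n/k)² rather than n².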
1226 Axisymmetric Nonlinear Analysis of Point Supported Shallow Spherical Shells
Authors: M. Altekin, R. F. Yükseler
Abstract:
Geometrically nonlinear axisymmetric bending of a shallow spherical shell with a point support at the apex under linearly varying axisymmetric load was investigated numerically. The edge of the shell was assumed to be either simply supported or clamped. The solution was obtained by the finite difference and Newton-Raphson methods. The thickness of the shell was considered to be uniform and the material was assumed to be homogeneous and isotropic. A sensitivity analysis was performed for two geometrical parameters. The accuracy of the algorithm was checked by comparing the deflection with the solution for point-supported circular plates, and good agreement was obtained.
Keywords: Bending, nonlinear, plate, point support, shell.
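The numerical approach described above reduces the shell equations to a nonlinear algebraic system via finite differences and solves it with Newton-Raphson iteration. The sketch below is a generic Newton-Raphson driver with a forward-difference Jacobian; the toy system is a placeholder standing in for the discretised shell equations, which are not reproduced here.

```python
import numpy as np

def newton_solve(F, u0, tol=1e-10, max_iter=50, eps=1e-7):
    """Solve the nonlinear algebraic system F(u) = 0 (e.g. a finite-difference
    discretisation) by Newton-Raphson with a forward-difference Jacobian."""
    u = np.asarray(u0, dtype=float)
    for _ in range(max_iter):
        f = F(u)
        J = np.empty((len(f), len(u)))
        for j in range(len(u)):                  # numerical Jacobian, column by column
            du = np.zeros_like(u)
            du[j] = eps
            J[:, j] = (F(u + du) - f) / eps
        step = np.linalg.solve(J, -f)
        u = u + step
        if np.linalg.norm(step) < tol:
            break
    return u

# Toy 2x2 nonlinear system as a stand-in for the discretised shell equations.
F = lambda u: np.array([u[0]**3 + u[1] - 1.0, u[1]**3 - u[0] + 1.0])
print(newton_solve(F, [0.5, 0.5]))   # converges to (1, 0)
```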
1225 Model Predictive Control of Turbocharged Diesel Engine with Exhaust Gas Recirculation
Authors: U. Yavas, M. Gokasan
Abstract:
Control of the diesel engine's air path has drawn a lot of attention due to its multi-input multi-output, closely coupled, nonlinear nature. Today, precise control of the amount of air to be combusted is a must in order to meet tight emission limits and performance targets. In this study, a passenger-car-size diesel engine is modeled in AVL Boost RT and then simulated with standard, industry-level PID controllers. Finally, a linear model predictive controller is designed and simulated. This study shows the importance of modeling and control of diesel engines with flexible algorithm development in computer-based systems.
Keywords: Predictive control, engine control, engine modeling, PID control, feedforward compensation.
1224 A High Quality Speech Coder at 600 bps
Authors: Yong Zhang, Ruimin Hu
Abstract:
This paper presents a vocoder that obtains high-quality synthetic speech at 600 bps. To reduce the bit rate, the algorithm is based on a sinusoidally excited linear prediction model which extracts few coding parameters; three consecutive frames are grouped into a superframe and joint vector quantization is used to obtain high coding efficiency. The inter-frame redundancy is exploited with distinct quantization schemes for the different unvoiced/voiced frame combinations in the superframe. Experimental results show that the quality of the proposed coder is better than that of 2.4 kbps LPC10e, approximately the same as that of 2.4 kbps MELP, and highly robust.
Keywords: Speech coding, vector quantization, linear prediction, mixed sinusoidal excitation.
1223 Dynamic Admission Control for Quality of Service in IP Networks
Authors: J. Kasigwa, V. Baryamureeba, D. Williams
Abstract:
The goal of admission control is to support the Quality of Service demands of real-time applications via resource reservation in IP networks. In this paper we introduce a novel Dynamic Admission Control (DAC) mechanism for IP networks. The DAC dynamically allocates network resources using the previous network pattern for each path and uses the dynamic admission algorithm to improve bandwidth utilization with bandwidth brokers. We evaluate the performance of the proposed mechanism through trace-driven simulation experiments in terms of blocking probability, throughput and normalized utilization.
Keywords: Bandwidth broker, dynamic admission control (DAC), IP networks, quality of service, real-time flows.
1222 Minimizing Examinee Collusion with a Latin-Square Treatment Structure
Authors: M. H. Omar
Abstract:
Cheating on standardized tests has been a major concern as it reduces measurement precision. One major way to reduce cheating by collusion is to administer multiple forms of a test. Even with this approach, the potential for collusion is still quite large. A Latin-square treatment structure for distributing multiple forms is proposed to further reduce the colluding potential. An index to measure the extent of colluding potential is also proposed. Finally, using a simple algorithm, various Latin squares were explored to find the structure that keeps the colluding potential to a minimum.
Keywords: Colluding pairs, scale for colluding potential, Latin-square structure, minimization of cheating.
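A Latin square of order k assigns k test forms to k positions across k rows so that each form appears exactly once in every row and every column, which is what limits how often two neighbours share a form. The sketch below builds a basic cyclic Latin square; the paper's search over various Latin squares and its colluding-potential index are not reproduced.

```python
def cyclic_latin_square(k):
    """Return a k x k cyclic Latin square: entry (r, c) = (r + c) mod k.
    Each of the k test forms appears once per row and once per column."""
    return [[(r + c) % k for c in range(k)] for r in range(k)]

# Example: 4 test forms distributed over a 4 x 4 seating block.
for row in cyclic_latin_square(4):
    print(row)
# [0, 1, 2, 3]
# [1, 2, 3, 0]
# [2, 3, 0, 1]
# [3, 0, 1, 2]
```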
1221 A Parallel Implementation of the Reverse Converter for the Moduli Set {2^n, 2^n-1, 2^(n-1)-1}
Authors: Mehdi Hosseinzadeh, Amir Sabbagh Molahosseini, Keivan Navi
Abstract:
In this paper, a new reverse converter for the moduli set {2^n, 2^n-1, 2^(n-1)-1} is presented. We improve a previously introduced conversion algorithm to derive an efficient hardware design for the reverse converter. The hardware architecture of the proposed converter is based on carry-save adders and regular binary adders, without requiring modular adders. The presented design is faster than the most recently introduced reverse converter for the moduli set {2^n, 2^n-1, 2^(n-1)-1}. It also performs better than the reverse converters for the recently introduced moduli set {2^(n+1)-1, 2^n, 2^n-1}.
Keywords: Residue arithmetic, Residue number system, Residue-to-Binary converter, Reverse converter
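A residue-to-binary (reverse) converter recovers the weighted number X from its residues with respect to the pairwise coprime moduli. The sketch below is a plain Chinese Remainder Theorem reference implementation for the moduli set {2^n, 2^n-1, 2^(n-1)-1}; it only checks functional correctness and says nothing about the carry-save adder architecture proposed in the paper.

```python
from math import prod

def crt_reverse(residues, moduli):
    """Chinese Remainder Theorem reconstruction: given residues x_i = X mod m_i
    for pairwise coprime moduli, return X in [0, prod(moduli))."""
    M = prod(moduli)
    X = 0
    for x_i, m_i in zip(residues, moduli):
        M_i = M // m_i
        X += x_i * M_i * pow(M_i, -1, m_i)   # pow(..., -1, m) = modular inverse
    return X % M

n = 4
moduli = [2**n, 2**n - 1, 2**(n - 1) - 1]     # {16, 15, 7}, dynamic range 1680
X = 1234
residues = [X % m for m in moduli]
assert crt_reverse(residues, moduli) == X     # round-trip check
```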
1220 Iterative Clustering Algorithm for Analyzing Temporal Patterns of Gene Expression
Authors: Seo Young Kim, Jae Won Lee, Jong Sung Bae
Abstract:
Microarray experiments are information rich; however, extensive data mining is required to identify the patterns that characterize the underlying mechanisms of action. For biologists, a key aim when analyzing microarray data is to group genes based on the temporal patterns of their expression levels. In this paper, we used an iterative clustering method to find temporal patterns of gene expression. We evaluated the performance of this method by applying it to real sporulation data and simulated data. The patterns obtained using the iterative clustering were found to be superior to those obtained using existing clustering algorithms.
Keywords: Clustering, microarray experiment, temporal pattern of gene expression data.
1219 Unscented Grid Filtering and Smoothing for Nonlinear Time Series Analysis
Authors: Nikolay Nikolaev, Evgueni Smirnov
Abstract:
This paper develops an unscented grid-based filter and a smoother for accurate nonlinear modeling and analysis of time series. The filter uses unscented deterministic sampling during both the time and measurement updating phases to approximate directly the distributions of the latent state variable. A complementary grid smoother is also derived to enable computation of the likelihood. This helps us to formulate an expectation maximisation algorithm for maximum likelihood estimation of the state noise and the observation noise. Empirical investigations show that the proposed unscented grid filter/smoother compares favourably to other similar filters on nonlinear estimation tasks.
1218 Using PFA in Feature Analysis and Selection for H.264 Adaptation
Authors: Nora A. Naguib, Ahmed E. Hussein, Hesham A. Keshk, Mohamed I. El-Adawy
Abstract:
Classification of video sequences based on their contents is a vital process for adaptation techniques. It helps decide which adaptation technique best fits the resource reduction requested by the client. In this paper we used the principal feature analysis algorithm to select a reduced subset of video features. The main idea is to select only one feature from each class based on the similarities between the features within that class. Our results showed that using this feature reduction technique the source video features can be completely omitted from future classification of video sequences.
Keywords: Adaptation, feature selection, H.264, Principal Feature Analysis (PFA)
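Principal Feature Analysis keeps the dimensionality-reduction view of PCA but returns original features instead of components: features are clustered by the similarity of their loadings on the leading principal components, and one representative per cluster is retained. The sketch below follows that generic recipe with scikit-learn; the video features and parameter choices of the paper are not reproduced.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

def principal_feature_analysis(X, n_components=3, n_features=3):
    """Select `n_features` original columns of X by clustering the rows of the
    PCA loading matrix and keeping the feature closest to each cluster centre."""
    pca = PCA(n_components=n_components).fit(X)
    loadings = pca.components_.T              # one row of loadings per feature
    km = KMeans(n_clusters=n_features, n_init=10, random_state=0).fit(loadings)
    selected = []
    for c in range(n_features):
        members = np.where(km.labels_ == c)[0]
        dists = np.linalg.norm(loadings[members] - km.cluster_centers_[c], axis=1)
        selected.append(members[np.argmin(dists)])   # representative feature index
    return sorted(selected)
```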
1217 A High Bitrate Information Hiding Algorithm for Video in Video
Authors: Wang Shou-Dao, Xiao Chuang-Bai, Lin Yu
Abstract:
In high bitrate information hiding techniques, 1 bit is embedded within each 4 x 4 Discrete Cosine Transform (DCT) coefficient block by means of vector quantization, and the hidden bit can be effectively extracted at the terminal end. In this paper, high bitrate information hiding algorithms are summarized and a video-in-video scheme is implemented. Experimental results show that a host video in which a large amount of auxiliary information is embedded suffers little visual quality decline. The luminance Peak Signal-to-Noise Ratio (PSNR-Y) of the host video degrades by only 0.22 dB on average, while the hidden information has a high survival rate and remains robust under H.264/AVC compression; the average Bit Error Rate (BER) of the hidden information is 0.015%.
Keywords: Information hiding, embedding, quantization, extraction.
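The basic mechanism in this family of techniques is to alter one transform coefficient per 4 x 4 DCT block so that a chosen property of the coefficient encodes the hidden bit. The sketch below embeds one bit in the parity of a quantized mid-frequency coefficient; it is a generic illustration of DCT-domain hiding, not the vector-quantization scheme used in the paper, and the coefficient position and quantization step are assumptions.

```python
import numpy as np
from scipy.fftpack import dct, idct

def dct2(block):  return dct(dct(block.T, norm='ortho').T, norm='ortho')
def idct2(block): return idct(idct(block.T, norm='ortho').T, norm='ortho')

def embed_bit(block, bit, pos=(2, 1), step=8.0):
    """Hide one bit in a 4x4 pixel block by forcing the parity of a
    quantized mid-frequency DCT coefficient (step size is illustrative)."""
    coeffs = dct2(block.astype(float))
    q = int(round(coeffs[pos] / step))
    if q % 2 != bit:                 # adjust the coefficient until its parity matches
        q += 1
    coeffs[pos] = q * step
    return idct2(coeffs)

def extract_bit(block, pos=(2, 1), step=8.0):
    return int(round(dct2(block.astype(float))[pos] / step)) % 2
```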
1216 Solving Machine Loading Problem in Flexible Manufacturing Systems Using Particle Swarm Optimization
Authors: S. G. Ponnambalam, Low Seng Kiat
Abstract:
In this paper, a particle swarm optimization (PSO) algorithm is proposed to solve the machine loading problem in a flexible manufacturing system (FMS), with the bi-criterion objectives of minimizing system unbalance and maximizing system throughput in the presence of technological constraints such as available machining time and tool slots. A mathematical model is used to select machines and assign operations and the required tools. The performance of the PSO is tested using 10 sample datasets and the results are compared with the heuristics reported in the literature. The results support that the proposed PSO is comparable with the algorithms reported in the literature.
Keywords: Machine loading problem, FMS, particle swarm optimization.
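Particle swarm optimization maintains a population of candidate solutions that move through the search space, each pulled toward its own best position and toward the swarm's best position. The sketch below is a textbook continuous PSO on a toy objective; the encoding of machine, operation and tool assignments into particles, as done in the paper, is not shown.

```python
import numpy as np

def pso(objective, dim, n_particles=30, iters=100, w=0.7, c1=1.5, c2=1.5, bounds=(-5, 5)):
    """Minimise `objective` over R^dim with a standard global-best PSO."""
    rng = np.random.default_rng(0)
    x = rng.uniform(*bounds, size=(n_particles, dim))       # positions
    v = np.zeros_like(x)                                     # velocities
    pbest, pbest_val = x.copy(), np.apply_along_axis(objective, 1, x)
    gbest = pbest[np.argmin(pbest_val)]
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        vals = np.apply_along_axis(objective, 1, x)
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)]
    return gbest, pbest_val.min()

# Toy use: minimise the sphere function.
print(pso(lambda z: float(np.sum(z * z)), dim=3))
```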
1215 ZBTB17 Gene rs10927875 Polymorphism in Slovak Patients with Dilated Cardiomyopathy
Authors: I. Boroňová, J. Bernasovská, J. Kmec, E. Petrejčíková
Abstract:
Dilated cardiomyopathy (DCM) is a severe cardiovascular disorder characterized by progressive systolic dysfunction due to cardiac chamber dilatation and inefficient myocardial contractility, often leading to chronic heart failure. Recently, genome-wide association studies (GWASs) on DCM have indicated that the ZBTB17 gene rs10927875 single nucleotide polymorphism is associated with DCM. The aim of the study was to identify the distribution of the ZBTB17 gene rs10927875 polymorphism in 50 Slovak patients with DCM and 80 healthy control subjects using Custom TaqMan® SNP Genotyping assays. Risk factors recorded at baseline in each group included age, sex, body mass index, smoking status, diabetes and blood pressure. The mean age of patients with DCM was 52.9±6.3 years; the mean age of individuals in the control group was 50.3±8.9 years. The distribution of the investigated genotypes of the rs10927875 polymorphism within the ZBTB17 gene in the cohort of Slovak patients with DCM was as follows: CC (38.8%), CT (55.1%), TT (6.1%); in controls: CC (43.8%), CT (51.2%), TT (5.0%). The risk allele T was more common among the patients with dilated cardiomyopathy than in normal controls (33.7% versus 30.6%). The differences in genotype and allele frequencies of the ZBTB17 gene rs10927875 polymorphism were not statistically significant (p=0.6908; p=0.6098). The results of this study suggest that the ZBTB17 gene rs10927875 polymorphism may be a risk factor for susceptibility to DCM in Slovak patients. Larger studies and additional functional investigations are needed to fully understand the roles of these genetic associations.
Keywords: Dilated cardiomyopathy, SNP polymorphism, ZBTB17 gene.
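The allele frequencies quoted above follow directly from the genotype distribution: the frequency of the T allele equals the TT frequency plus half the CT frequency. The sketch below back-calculates genotype counts from the reported percentages (49 genotyped patients and 80 controls is an assumption consistent with those percentages) and runs a chi-square test on the allele counts with SciPy; it reproduces the reported T-allele frequencies of roughly 33.7% and 30.6%, although the exact p-value depends on the assumed counts and test variant and need not match the values reported in the paper.

```python
from scipy.stats import chi2_contingency

# Genotype counts back-calculated from the reported percentages (assumption):
# patients  CC/CT/TT = 19/27/3  -> 38.8% / 55.1% / 6.1%
# controls  CC/CT/TT = 35/41/4  -> 43.8% / 51.2% / 5.0%
patients = {"CC": 19, "CT": 27, "TT": 3}
controls = {"CC": 35, "CT": 41, "TT": 4}

def allele_counts(g):
    c = 2 * g["CC"] + g["CT"]          # each CC carries two C alleles, each CT one
    t = 2 * g["TT"] + g["CT"]
    return c, t

for name, g in (("patients", patients), ("controls", controls)):
    c, t = allele_counts(g)
    print(name, "T-allele frequency: %.1f%%" % (100.0 * t / (c + t)))
# patients: 33.7%, controls: 30.6%

chi2, p, dof, _ = chi2_contingency([allele_counts(patients), allele_counts(controls)])
print("allele-based chi-square p-value: %.3f" % p)
```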
1214 A Content-Based Optimization of Data Stream Television Multiplex
Authors: Jaroslav Polec, Martin Šimek, Michal Martinovič, Elena Šikudová
Abstract:
The television multiplex has a fixed reserved capacity, and therefore only a limited number of videos can be carried in it. The composition of the multiplex has a major impact on how many videos it can carry. Therefore, in this paper a simple algorithm is designed to optimize the capacity utilization of the multiplex. The number of programs in the multiplex is also significantly affected by the type of content it is composed of. The content of a multiplex can be movies, news, sport, animated stories, documentaries, etc. These types have their own specific characteristics that affect the resulting data stream. In this paper, an analysis of the impact of the content composition of the multiplex on its capacity utilization is also carried out.
Keywords: Multiplex, content, group of pictures, frame, capacity.
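Choosing which videos to place into a multiplex of fixed capacity so that the capacity is used as fully as possible can be viewed as a knapsack-style selection over the videos' bit-rate demands. The sketch below uses a standard 0/1 knapsack dynamic program as an illustration of that view; it is not the specific algorithm designed in the paper, and the bit-rate figures are placeholders.

```python
def fill_multiplex(capacity_kbps, stream_kbps):
    """0/1 knapsack DP: pick a subset of streams whose total bit rate is as
    close to the multiplex capacity as possible without exceeding it."""
    best = {0: []}                                    # used capacity -> chosen indices
    for i, rate in enumerate(stream_kbps):
        for used, chosen in list(best.items()):
            new = used + rate
            if new <= capacity_kbps and new not in best:
                best[new] = chosen + [i]
    top = max(best)
    return top, best[top]

# Placeholder bit rates (kbps) for candidate programs of different content types.
print(fill_multiplex(19000, [5100, 4200, 3800, 2600, 2400, 1900, 1500]))
```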
1213 Elimination of Redundant Links in Web Pages – Mathematical Approach
Authors: G. Poonkuzhali, K.Thiagarajan, K.Sarukesi
Abstract:
With the enormous growth of the web, users easily get lost in its rich hyper-structure. Thus, developing user-friendly and automated tools that provide relevant information without any redundant links, catering to users' needs, is the primary task for website owners. Most of the existing web mining algorithms have concentrated on finding frequent patterns while neglecting the less frequent ones that are likely to contain outlying data such as noise and irrelevant or redundant data. This paper proposes a new algorithm for mining web content by detecting redundant links in web documents using set-theoretic (classical mathematics) operations such as subset, union and intersection. The redundant links are then removed from the original web content so that the user obtains the required information.
Keywords: Web documents, web content mining, redundant link, outliers, set theory.
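Treating the link collections of pages as sets turns redundancy detection into intersections and set differences. The sketch below flags, for one page, the links that already occur in a site-wide template (their intersection) and therefore add nothing new; it is a minimal illustration of the set-theoretic idea, with hypothetical URLs.

```python
# Hypothetical link sets extracted from two pages of the same site.
page_links = {"/home", "/news", "/contact", "/article-42", "/login"}
template_links = {"/home", "/news", "/contact", "/login"}   # links repeated on every page

redundant = page_links & template_links        # intersection: links carrying no new content
unique_content = page_links - template_links   # set difference: links worth keeping

print("redundant:", sorted(redundant))         # ['/contact', '/home', '/login', '/news']
print("kept:", sorted(unique_content))         # ['/article-42']
```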
1212 Reliability Improvement with Optimal Placement of Distributed Generation in Distribution System
Authors: N. Rugthaicharoencheep, T. Langtharthong
Abstract:
This paper presents the optimal placement and sizing of distributed generation (DG) in a distribution system. The problem addressed is the reliability improvement of a distribution system with distributed generation. The technique employed to solve the minimization problem is based on a developed Tabu search algorithm and reliability worth analysis. The developed methodology is tested on bus 2 of the Roy Billinton Test System (RBTS). It can be seen from the case study that distributed generation can reduce the customer interruption cost and therefore improve the reliability of the system. It is expected that the proposed method will be utilized effectively by distribution system operators.
Keywords: Distributed generation, optimization technique, reliability improvement, distribution system.
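Tabu search explores a discrete solution space by always moving to the best admissible neighbour while keeping recently visited solutions on a tabu list to escape local optima. The sketch below is a deliberately minimal version over a handful of candidate placement buses with placeholder costs; the paper's developed Tabu search and its reliability-worth evaluation are not reproduced.

```python
import random

def tabu_search(cost, candidates, n_iter=50, tabu_len=5):
    """Minimal tabu search over a discrete set of candidate solutions
    (e.g. DG placement buses): move to the best non-tabu candidate each
    iteration and keep recently visited solutions on a tabu list."""
    current = random.choice(candidates)
    best, tabu = current, [current]
    for _ in range(n_iter):
        neighbours = [c for c in candidates if c not in tabu]
        if not neighbours:
            break
        current = min(neighbours, key=cost)          # best admissible move
        tabu.append(current)
        tabu = tabu[-tabu_len:]                      # bounded tabu tenure
        if cost(current) < cost(best):
            best = current
    return best

# Placeholder interruption cost per candidate bus (hypothetical numbers).
bus_cost = {3: 120.0, 5: 95.5, 8: 101.2, 12: 88.7, 17: 97.3}
print(tabu_search(lambda b: bus_cost[b], list(bus_cost)))
```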
1211 Comparative Study of Scheduling Algorithms for LTE Networks
Authors: Samia Dardouri, Ridha Bouallegue
Abstract:
Scheduling is the process of dynamically allocating physical resources to User Equipment (UE) based on scheduling algorithms implemented at the LTE base station. Various algorithms have been proposed by network researchers, as the implementation of the scheduling algorithm is left open in the Long Term Evolution (LTE) standard. This paper makes an attempt to study and compare the performance of the PF, MLWDF and EXP/PF scheduling algorithms. The evaluation considers a single cell with an interference scenario for different flows, such as best effort, video and VoIP, in pedestrian and vehicular environments, using the LTE-Sim network simulator. The comparative study is conducted in terms of system throughput, fairness index, delay, packet loss ratio (PLR) and total cell spectral efficiency.
Keywords: LTE, Multimedia flows, Scheduling algorithms.
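The schedulers compared in the paper differ mainly in the per-user metric that the base station maximizes in each transmission time interval. A common formulation is sketched below: Proportional Fair (PF) uses the ratio of the instantaneous rate to the average past throughput, and M-LWDF additionally weights that ratio by the head-of-line delay and a QoS factor. The EXP/PF metric is omitted here, and the example numbers are placeholders.

```python
import math

def pf_metric(inst_rate, avg_throughput):
    """Proportional Fair: serve the user with the best rate relative to what
    it has received so far."""
    return inst_rate / max(avg_throughput, 1e-9)

def mlwdf_metric(inst_rate, avg_throughput, hol_delay, delay_target, max_loss_prob):
    """M-LWDF: PF metric weighted by head-of-line delay and the QoS factor
    a_i = -log(delta_i) / tau_i of a real-time flow."""
    a_i = -math.log(max_loss_prob) / delay_target
    return a_i * hol_delay * pf_metric(inst_rate, avg_throughput)

# Two users competing for one resource block (placeholder numbers):
users = {
    "VoIP":  mlwdf_metric(inst_rate=0.3e6, avg_throughput=0.1e6,
                          hol_delay=0.04, delay_target=0.1, max_loss_prob=0.05),
    "Video": mlwdf_metric(inst_rate=2.0e6, avg_throughput=1.5e6,
                          hol_delay=0.02, delay_target=0.15, max_loss_prob=0.05),
}
print(max(users, key=users.get), "is scheduled first")
```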
1210 An Effective Framework for Chinese Syntactic Parsing
Authors: Xing Li, Chengqing Zong
Abstract:
This paper presents an effective framework for Chinese syntactic parsing, which includes two parts. The first is a parsing framework based on an improved bottom-up chart parsing algorithm, which integrates the beam search strategy of the N-best algorithm and the heuristic function of the A* algorithm for pruning, and produces multiple parsing trees. The second is a novel evaluation model, which integrates contextual and partial lexical information into the traditional PCFG model and defines a new score function. Using this model, the tree with the highest score is selected as the best parsing tree. Finally, contrasting experimental results are given.
Keywords: syntactic parsing, PCFG, pruning, evaluation model.
1209 Database Compression for Intelligent On-board Vehicle Controllers
Authors: Ágoston Winkler, Sándor Juhász, Zoltán Benedek
Abstract:
The vehicle fleet of public transportation companies is often equipped with intelligent on-board passenger information systems. A frequently used but time- and labor-intensive way of keeping the on-board controllers up to date is manual updating using memory cards (e.g. flash cards) or portable computers. This paper describes a compression algorithm that enables data transmission over low-bandwidth wireless radio networks (e.g. GPRS) by minimizing the amount of data traffic. In typical cases it reaches a compression ratio an order of magnitude better than that of general-purpose compressors. Compressed data can also be easily expanded by the low-performance controllers.
Keywords: Data analysis, data compression, differential encoding, run-length encoding, vehicle control.
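Timetable-like on-board databases change little between versions and contain long runs of similar values, which is why differential (delta) encoding followed by run-length encoding can beat a general-purpose compressor by a wide margin. The sketch below chains the two steps on a toy sequence; the actual record layout and radio-transfer protocol of the on-board controllers are not modeled.

```python
def delta_encode(values):
    """Store the first value and then only successive differences."""
    return [values[0]] + [b - a for a, b in zip(values, values[1:])]

def rle_encode(values):
    """Collapse runs of equal values into (value, run_length) pairs."""
    runs = []
    for v in values:
        if runs and runs[-1][0] == v:
            runs[-1][1] += 1
        else:
            runs.append([v, 1])
    return runs

# Toy departure times (minutes) with a mostly regular 10-minute headway:
times = [300, 310, 320, 330, 345, 355, 365, 375]
deltas = delta_encode(times)        # [300, 10, 10, 10, 15, 10, 10, 10]
print(rle_encode(deltas))           # [[300, 1], [10, 3], [15, 1], [10, 3]]
```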
1208 Improvement of Blood Detection Accuracy using Image Processing Techniques suitable for Capsule Endoscopy
Authors: Yong-Gyu Lee, Gilwon Yoon
Abstract:
Bleeding in the digestive tract is an important diagnostic parameter for patients. Blood in the endoscopic image can be identified by investigating its color tone, which is affected by the degree of oxygenation, under- or over-illumination, food debris, secretions, etc. However, we found that the way raw images obtained from the capsule detectors are pre-processed is very important. We applied various image processing methods suitable for capsule endoscopic images in order to remove noise and unbalanced sensitivities of the image pixels. The results showed that additional pre-processing techniques considerably improved the algorithm for determining bleeding areas.
Keywords: blood detection, capsule endoscopy, image processing.
1207 Application of Adaptive Neuro-Fuzzy Inference System in Smoothing Transition Autoregressive Models
Authors: Ε. Giovanis
Abstract:
In this paper we propose and examine an Adaptive Neuro-Fuzzy Inference System (ANFIS) for Smoothing Transition Autoregressive (STAR) modeling. Because STAR models follow a fuzzy-logic approach, fuzzy rules can be incorporated in the nonlinear part, or other training or computational methods, such as the error backpropagation algorithm, can be applied instead of nonlinear least squares. Furthermore, additional fuzzy membership functions can be examined besides the logistic and exponential ones, such as the triangular, Gaussian and generalized bell functions. We examine two macroeconomic variables of the US economy: the inflation rate and the 6-month Treasury bill interest rate.
Keywords: Forecasting, neuro-fuzzy, smoothing transition, time series.
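For reference, the two-regime STAR specification with the logistic transition function mentioned in the abstract is commonly written as follows; the exponential variant replaces G with 1 - exp(-γ(s_t - c)²). This is the standard textbook form, not a formula quoted from the paper.

```latex
y_t = \phi_1' x_t \,\bigl(1 - G(s_t;\gamma,c)\bigr) + \phi_2' x_t \, G(s_t;\gamma,c) + \varepsilon_t,
\qquad
G(s_t;\gamma,c) = \frac{1}{1 + \exp\{-\gamma\,(s_t - c)\}},
```

where x_t collects the autoregressive lags, s_t is the transition variable, γ > 0 controls the smoothness of the regime switch and c is the threshold.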
1206 Code-Aided Turbo Channel Estimation for OFDM Systems with NB-LDPC Codes
Authors: Ł. Januszkiewicz, G. Bacci, H. Gierszal, M. Luise
Abstract:
In this paper, channel estimation techniques are considered as support methods for OFDM transmission systems based on non-binary LDPC (Low-Density Parity-Check) codes. Standard frequency-domain pilot-aided LS (Least Squares) and LMMSE (Linear Minimum Mean Square Error) estimators are investigated. Furthermore, an iterative algorithm is proposed as a solution exploiting the NB-LDPC channel decoder to improve the performance of the LMMSE estimator. Simulation results for signals transmitted through fading mobile channels are presented to compare the performance of the proposed channel estimators.
Keywords: LDPC codes, LMMSE, OFDM, turbo channel estimation.
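At the pilot subcarriers, the LS estimate simply divides the received samples by the known pilot symbols, and the LMMSE estimate then filters the LS estimate with the channel correlation matrix to suppress noise. The numpy sketch below shows that relationship; the exponential correlation model, the pilot values and the noise level are illustrative assumptions, and the iterative NB-LDPC-aided refinement proposed in the paper is not included.

```python
import numpy as np

rng = np.random.default_rng(1)
n_pilots, snr_db = 16, 10
noise_var = 10 ** (-snr_db / 10)

# Illustrative channel correlation between pilot subcarriers (exponential model).
idx = np.arange(n_pilots)
R_hh = 0.9 ** np.abs(idx[:, None] - idx[None, :])

# Draw one correlated channel realisation and observe it through unit-energy pilots.
h = np.linalg.cholesky(R_hh) @ (rng.standard_normal(n_pilots)
                                + 1j * rng.standard_normal(n_pilots)) / np.sqrt(2)
x = (1 + 1j) / np.sqrt(2) * np.ones(n_pilots)            # known QPSK pilot symbols
y = h * x + np.sqrt(noise_var / 2) * (rng.standard_normal(n_pilots)
                                      + 1j * rng.standard_normal(n_pilots))

h_ls = y / x                                             # least-squares estimate
h_lmmse = R_hh @ np.linalg.solve(R_hh + noise_var * np.eye(n_pilots), h_ls)

print("LS MSE    :", np.mean(np.abs(h_ls - h) ** 2))
print("LMMSE MSE :", np.mean(np.abs(h_lmmse - h) ** 2))
```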