Search results for: noise performance.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6392

5972 Cost Benefit Analysis: Evaluation among the Millimetre Wavebands and SHF Bands of Small Cell 5G Networks

Authors: Emanuel Teixeira, Anderson Ramos, Marisa Lourenço, Fernando J. Velez, Jon M. Peha

Abstract:

This article discusses the cost benefit analysis of millimetre wavebands (mmWaves) and the Super High Frequency (SHF) band. The decay of the carrier-to-noise-plus-interference ratio (CNIR) with coverage distance is assessed by considering two different path loss models: the two-slope urban micro Line-of-Sight (UMiLoS) model for the SHF band and the modified Friis propagation model for frequencies above 24 GHz. The equivalent supported throughput is estimated at the 5.62, 28, 38, 60 and 73 GHz frequency bands, and the influence of the CNIR on the radio and network optimization process is explored. Mainly owing to the attenuation behaviour of the two-slope propagation model in the SHF band, the supported throughput at this band is higher than at the millimetre wavebands only for the longest cell lengths. The cost benefit analysis of these pico-cellular networks was performed for regular cellular topologies, considering unlicensed spectrum. For the shortest distances, the revenue (in percentage terms) peaks at a cell length of R ≈ 10 m for the millimetre wavebands, while for the longest distances it peaks at R ≈ 550 m for the 5.62 GHz band. For the 5.62 GHz band, the profit is slightly lower than for the millimetre wavebands at the shortest values of R; it starts to increase for cell lengths approximately equal to the ratio between the break-point distance and the co-channel reuse factor, reaching a maximum at R ≈ 550 m.
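
As a rough illustration of the two path loss models compared above, the following Python sketch evaluates a Friis-type free-space model and a generic two-slope model; the break-point distance and the second-slope exponent are illustrative assumptions, not the authors' calibrated values.

    import numpy as np

    C = 3e8  # speed of light [m/s]

    def friis_path_loss_db(d, f):
        """Free-space (Friis) path loss [dB] at distance d [m], carrier frequency f [Hz]."""
        return 20 * np.log10(4 * np.pi * d * f / C)

    def two_slope_path_loss_db(d, f, d_break=160.0, n2=4.0):
        """Generic two-slope model: free-space slope up to the break-point
        distance, steeper exponent n2 beyond it (both parameters assumed)."""
        pl_bp = friis_path_loss_db(d_break, f)
        return np.where(d <= d_break,
                        friis_path_loss_db(d, f),
                        pl_bp + 10 * n2 * np.log10(d / d_break))

    d = np.array([10.0, 100.0, 550.0])        # cell lengths [m]
    print(friis_path_loss_db(d, 28e9))        # mmWave band, Friis-type model
    print(two_slope_path_loss_db(d, 5.62e9))  # SHF band, two-slope-type model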

Keywords: 5G, millimetre wavebands, super high-frequency band, signal-to-interference-plus-noise ratio (SINR), cost benefit analysis.

5971 Use of Fuzzy Edge Image in Block Truncation Coding for Image Compression

Authors: Amarunnishad T.M., Govindan V.K., Abraham T. Mathew

Abstract:

An image compression method has been developed using a fuzzy edge image together with the basic Block Truncation Coding (BTC) algorithm. The fuzzy edge image has been validated against classical edge detectors, using the results of the well-known Canny edge detector as the reference, prior to its use in the proposed method. The bit plane generated by the conventional BTC method is replaced with a fuzzy bit plane generated by the logical OR operation between the fuzzy edge image and the corresponding conventional BTC bit plane. The input image is encoded with the block mean, the block standard deviation and the fuzzy bit plane. The proposed method has been tested with 8 bits/pixel test images of size 512×512 and found to be superior, with better Peak Signal to Noise Ratio (PSNR), when compared to the conventional BTC and adaptive bit plane selection BTC (ABTC) methods. The raggedness, jagged appearance, and ringing artifacts at sharp edges are greatly reduced in images reconstructed by the proposed method with the fuzzy bit plane.
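
For readers unfamiliar with BTC, a minimal Python sketch of the per-block encoding follows, with an optional hook where the paper's fuzzy edge bit plane would be OR-ed in; the 4×4 block size and the random test block are assumptions for illustration.

    import numpy as np

    def btc_block(block, edge_bits=None):
        """Encode/decode one image block with Block Truncation Coding.
        If edge_bits (a boolean edge bit plane for the block) is given, OR it
        into the BTC bit plane, as the paper does with the fuzzy edge image."""
        mean, std = block.mean(), block.std()
        bits = block > mean                    # conventional BTC bit plane
        if edge_bits is not None:
            bits = bits | edge_bits            # fuzzy bit plane substitution (logical OR)
        q, m = bits.sum(), block.size
        if q in (0, m):                        # uniform block: reconstruct with the mean
            return np.full_like(block, mean, dtype=float)
        a = mean - std * np.sqrt(q / (m - q))  # reconstruction level for '0' pixels
        b = mean + std * np.sqrt((m - q) / q)  # reconstruction level for '1' pixels
        return np.where(bits, b, a)

    block = np.random.randint(0, 256, (4, 4)).astype(float)
    print(btc_block(block))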

Keywords: Image compression, Edge detection, Ground truth image, Peak signal to noise ratio

5970 Influence of Loudness Compression on Hearing with Bone Anchored Hearing Implants

Authors: Anja Kurz, Marc Flynn, Tobias Good, Marco Caversaccio, Martin Kompis

Abstract:

Bone Anchored Hearing Implants (BAHI) are routinely used in patients with conductive or mixed hearing loss, e.g. if conventional air conduction hearing aids cannot be used. New sound processors and new fitting software now allow parameters such as loudness compression ratios or maximum power output to be adjusted separately. It is currently unclear how the choice of these parameters influences aided speech understanding in BAHI users. In this prospective experimental study, the effects of varying the compression ratio and lowering the maximum power output in a BAHI were investigated. Twelve experienced adult subjects with a mixed hearing loss participated in this study. Four different compression ratios (1.0; 1.3; 1.6; 2.0) were tested along with two different maximum power output settings, resulting in a total of eight different programs. Each participant tested each program for two weeks. A blinded Latin square design was used to minimize bias. For each of the eight programs, speech understanding in quiet and in noise was assessed. For speech in quiet, the Freiburg number test and the Freiburg monosyllabic word test at 50, 65, and 80 dB SPL were used. For speech in noise, the Oldenburg sentence test was administered. Speech understanding in quiet and in noise was improved significantly in the aided condition in any program, when compared to the unaided condition. However, no significant differences were found between any of the eight programs. In contrast, on a subjective level there was a significant preference for medium compression ratios of 1.3 to 1.6 and higher maximum power output.

Keywords: Bone Anchored Hearing Implant, Compression, Maximum Power Output, Speech understanding.

5969 Modeling Metrics for Monitoring Software Project Performance Based On the GQM Model

Authors: Mariayee Doraisamy, Suhaimi Bin Ibrahim, Mohd Naz’ri Mahrin

Abstract:

There are several methods to monitor software projects, and the objective of monitoring is to ensure that software projects are developed and delivered successfully. Performance measurement is a method closely associated with monitoring, and it can be scrutinized by looking at two important attributes, efficiency and effectiveness, both important factors for the success of a software project. Consequently, successful steering is achieved by monitoring and controlling a software project via performance measurement criteria and metrics. Hence, this paper is aimed at identifying the performance measurement criteria and the metrics for monitoring the performance of a software project by using the Goal Question Metrics (GQM) approach. The GQM approach is utilized to ensure that the identified metrics are reliable and useful. These identified metrics are useful guidelines for project managers to monitor the performance of their software projects.

Keywords: Software project performance, Goal Question Metrics, Performance Measurement Criteria, Metrics.

5968 Predictive Fuzzy Logic Controller for Agile Micro-Satellite

Authors: A. Bellar, M.K. Fellah, A.M. Si Mohammed, M. Bensaada, L. Boukhris

Abstract:

This paper presents the use of a predictive fuzzy logic controller (PFLC) applied to the attitude control system of an agile micro-satellite. In order to reduce the effect of unpredictable time delays and large uncertainties, the algorithm employs predictive control to predict the attitude of the satellite. A comparison of the PFLC and a conventional fuzzy logic controller (FLC) is presented to evaluate the performance of the control system during attitude maneuvers. The two proposed models have been analyzed with the same level of noise and external disturbances. Simulation results demonstrate the feasibility and advantages of the PFLC for the attitude determination and control system (ADCS) of an agile satellite.

Keywords: Agile micro-satellite, Attitude control, fuzzy logic, predictive control

5967 A Trainable Neural Network Ensemble for ECG Beat Classification

Authors: Atena Sajedin, Shokoufeh Zakernejad, Soheil Faridi, Mehrdad Javadi, Reza Ebrahimpour

Abstract:

This paper illustrates the use of a combined neural network model for the classification of electrocardiogram (ECG) beats. We present a trainable neural network ensemble approach to develop a customized ECG beat classifier, in an effort to further improve the performance of ECG processing and to offer individualized health care. We propose a three-stage technique for the detection of premature ventricular contractions (PVC) among normal beats and other heart diseases, comprising denoising, feature extraction, and classification. First, we investigate the application of the stationary wavelet transform (SWT) for noise reduction of the ECG signals. The feature extraction module then extracts 10 ECG morphological features and one timing interval feature. Finally, a number of multilayer perceptron (MLP) neural networks with different topologies are designed. The performance of the different combination methods, as well as the efficiency of the whole system, is presented. Among them, stacked generalization, as the proposed trainable combined neural network model, achieves the highest recognition rate of around 95%. Therefore, this network proves to be a suitable candidate for ECG signal diagnosis systems. ECG samples of the different beat types were extracted from the MIT-BIH arrhythmia database for the study.
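
The first (denoising) stage might look like the following Python sketch using PyWavelets; the wavelet, decomposition level, and universal soft threshold are illustrative assumptions rather than the authors' exact settings.

    import numpy as np
    import pywt  # PyWavelets

    def swt_denoise(signal, wavelet="db4", level=3):
        """Stationary-wavelet-transform denoising: soft-threshold the detail
        coefficients and invert. The threshold is the common universal rule,
        with noise estimated from the finest detail level (an assumption)."""
        coeffs = pywt.swt(signal, wavelet, level=level)       # [(cA, cD), ...]
        sigma = np.median(np.abs(coeffs[-1][1])) / 0.6745     # noise estimate
        thr = sigma * np.sqrt(2 * np.log(len(signal)))
        denoised = [(cA, pywt.threshold(cD, thr, mode="soft")) for cA, cD in coeffs]
        return pywt.iswt(denoised, wavelet)

    # toy signal: length must be divisible by 2**level for pywt.swt
    t = np.linspace(0, 1, 1024)
    clean = np.sin(2 * np.pi * 5 * t)
    noisy = clean + 0.3 * np.random.randn(t.size)
    print(np.mean((swt_denoise(noisy) - clean) ** 2))  # should be well below 0.09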

Keywords: ECG beat classification, Combining Classifiers, Premature Ventricular Contraction (PVC), Multilayer Perceptrons, Wavelet Transform.

5966 A Cooperative Weighted Discriminator Energy Detector Technique in Fading Environment

Authors: Muhammad R. Alrabeiah, Ibrahim S. Alnomay

Abstract:

The need in cognitive radio systems for a simple, fast, and independent technique to sense spectrum occupancy has led to the energy detection approach. The energy detector is known for its dependency on the noise variance of the system, which is one of its major drawbacks. In this paper, we aim to improve its performance by utilizing weighted collaborative spectrum sensing, similar to the collaborative spectrum sensing methods introduced previously in the literature. These weighting methods yield a greater improvement for collaborative spectrum sensing compared to the unweighted case. Two methods are proposed in this paper: the first depends on the channel status between each sensor and the primary user, while the second depends on the value of the energy measured at each sensor.
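
A minimal Python sketch of soft-combining weighted energy detection follows; the weighting rule (inverse noise variance), the threshold, and the toy signals are assumptions for illustration.

    import numpy as np

    def weighted_energy_decision(samples, weights, threshold):
        """Cooperative energy detection with soft combining: each sensor
        reports its measured energy; the fusion centre forms a weighted sum
        and compares it to a threshold (weights/threshold are illustrative)."""
        energies = np.array([np.sum(np.abs(s) ** 2) / len(s) for s in samples])
        w = np.asarray(weights, dtype=float)
        w = w / w.sum()                          # normalise the weights
        statistic = float(np.dot(w, energies))
        return statistic > threshold, statistic

    # three sensors observing noise plus (for two of them) a weak primary signal
    rng = np.random.default_rng(0)
    n = 1000
    primary = 0.5 * np.sin(2 * np.pi * 0.01 * np.arange(n))
    samples = [primary + rng.normal(0, 1, n),    # good channel
               primary + rng.normal(0, 2, n),    # noisy channel
               rng.normal(0, 1, n)]              # shadowed sensor, no signal
    weights = [1.0, 0.25, 1.0]                   # e.g. inverse noise variance
    print(weighted_energy_decision(samples, weights, threshold=1.1))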

Keywords: Cognitive radio, Spectrum sensing, Collaborative sensors, Weighted Decisions.

5965 Subjective Versus Objective Assessment for Magnetic Resonance Images

Authors: Heshalini Rajagopal, Li Sze Chow, Raveendran Paramesran

Abstract:

Magnetic Resonance Imaging (MRI) is one of the most important medical imaging modalities. Subjective assessment of image quality is regarded as the gold standard for evaluating MR images. In this study, a database of 210 MR images, containing ten reference images and 200 distorted images, is presented. The reference images were distorted with four types of distortions: Rician noise, Gaussian white noise, Gaussian blur and DCT compression. The 210 images were assessed by ten subjects. The subjective scores were presented as Difference Mean Opinion Scores (DMOS). The DMOS values were compared with four FR-IQA metrics. We have used the Pearson Linear Correlation Coefficient (PLCC) and the Spearman Rank Order Correlation Coefficient (SROCC) to validate the DMOS values. The high PLCC and SROCC values show that the DMOS values are closely aligned with the objective FR-IQA metrics.
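
The two validation statistics can be computed directly with SciPy, as in the following sketch; the DMOS and metric values shown are hypothetical, not taken from the study.

    import numpy as np
    from scipy.stats import pearsonr, spearmanr

    # hypothetical DMOS values and scores from one FR-IQA metric (e.g. PSNR)
    dmos = np.array([30.1, 45.2, 52.7, 61.3, 70.8, 25.4, 48.9, 58.2])
    metric = np.array([38.5, 31.2, 28.9, 25.1, 21.7, 40.2, 30.0, 26.4])

    plcc, _ = pearsonr(dmos, metric)    # linearity of the relation
    srocc, _ = spearmanr(dmos, metric)  # monotonicity of the relation
    print(f"PLCC = {plcc:.3f}, SROCC = {srocc:.3f}")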

Keywords: Magnetic Resonance (MR) images, Difference Mean Opinion Score (DMOS), Full Reference Image Quality Assessment (FR-IQA).

5964 Architectural, Technological and Performance Issues in Enterprise Applications

Authors: Melek Oktay, Ayşe Betül Gülbağcı, Mustafa Sarıöz

Abstract:

Enterprise applications are complex systems that are hard to develop and deploy in organizations. Although software application development tools, frameworks, methodologies and patterns are developing rapidly, many projects fail, incurring large costs. There are challenging issues that programmers and designers face while working on enterprise applications. In this paper, we present three of the significant issues: architectural, technological and performance. The important subjects in each issue are pointed out and recommendations are given. Under architectural issues, the lifecycle, meta-architecture and guidelines are pointed out. The .NET and Java EE platforms are presented under technological issues. The importance of performance, measuring performance, and profilers are explained under performance issues.

Keywords: Enterprise Applications, Architecture, Technology, Performance.

5963 Relation of Optimal Pilot Offsets in the Shifted Constellation-Based Method for the Detection of Pilot Contamination Attacks

Authors: Dimitriya A. Mihaylova, Zlatka V. Valkova-Jarvis, Georgi L. Iliev

Abstract:

One possible approach for maintaining the security of communication systems relies on Physical Layer Security mechanisms. However, in wireless time division duplex systems, where uplink and downlink channels are reciprocal, the channel estimation procedure is exposed to attacks known as pilot contamination, whose aim is to have an enhanced data signal sent to the malicious user. The Shifted 2-N-PSK method involves two random legitimate pilots in the training phase, each belonging to a constellation shifted from the original N-PSK symbols by a certain number of degrees. In this paper, the legitimate pilots' offset values and their influence on the detection capabilities of the Shifted 2-N-PSK method are investigated. As the implementation of the technique depends on the relation between the shift angles rather than on their specific values, the optimal interconnection between the two legitimate constellations is investigated. The results show that no regularity exists in the relation between the pilot contamination attack (PCA) detection probability and the choice of offset values. Therefore, an adversary who aims to obtain the exact offset values can only employ a brute-force attack, but the large number of possible combinations for the shifted constellations makes such an attack difficult to mount successfully. For this reason, the number of optimal shift value pairs is also studied for both 100% and 98% probabilities of detecting pilot contamination attacks. Although the Shifted 2-N-PSK method has been broadly studied in different signal-to-noise ratio scenarios, in multi-cell systems the interference from signals in other cells should also be taken into account. Therefore, the inter-cell interference impact on the performance of the method is investigated by means of a large number of simulations. The results show that the detection probability of the Shifted 2-N-PSK method decreases with decreasing signal-to-interference-plus-noise ratio.
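
A minimal Python sketch of constructing two shifted N-PSK pilot constellations follows; the constellation size N and the two offset angles are arbitrary illustrative values.

    import numpy as np

    def shifted_npsk(n, offset_deg):
        """N-PSK constellation rotated by offset_deg from the standard symbols."""
        k = np.arange(n)
        return np.exp(1j * (2 * np.pi * k / n + np.deg2rad(offset_deg)))

    # two legitimate pilot constellations, each shifted from the original
    # N-PSK by its own offset; the pair of offsets (and N) are illustrative
    N = 8
    c1 = shifted_npsk(N, offset_deg=11.0)
    c2 = shifted_npsk(N, offset_deg=37.0)

    # random legitimate pilots for a training phase, one from each constellation
    rng = np.random.default_rng(6)
    p1, p2 = c1[rng.integers(N)], c2[rng.integers(N)]
    print(np.angle(p1, deg=True), np.angle(p2, deg=True))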

Keywords: Channel estimation, inter-cell interference, pilot contamination attacks, wireless communications.

5962 Ranking of Performance Measures of GSCM towards Sustainability: Using Analytic Hierarchy Process

Authors: Dixit Garg, S. Luthra, A. Haleem

Abstract:

During recent years, the natural environment has become a challenging topic that business organizations must consider, due to economic and ecological impacts and increasing awareness of environmental protection in society. Organizations are trying to achieve the goals of environmental improvement, low cost, high quality, flexibility and greater customer satisfaction. Performance measurement frameworks are very useful for monitoring the performance of any organization. The basic goal of this paper is to identify performance measures, and to rank these measures, for a GSCM performance measurement towards sustainability framework. Five perspectives (environmental, economic, social, operational and cost performance) and nineteen performance measures of GSCM performance towards sustainability have been identified from an extensive literature review. The Analytical Hierarchy Process (AHP) technique has been utilized for ranking these performance perspectives and measures. All pairwise comparisons in the AHP have been made on the basis of experts' opinions (selected from academia and industry). Ranking these performance perspectives and measures helps in understanding the importance of environmental, economic, social, operational and cost performances in the supply chain.
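
A minimal Python sketch of the AHP priority computation follows, using Saaty's eigenvector method on a hypothetical 3×3 comparison matrix; the judgements in the matrix are illustrative, not the experts' actual comparisons.

    import numpy as np

    def ahp_priorities(A):
        """Priority vector and consistency ratio for a pairwise comparison
        matrix A (Saaty's principal-eigenvector method)."""
        vals, vecs = np.linalg.eig(A)
        k = np.argmax(vals.real)
        w = np.abs(vecs[:, k].real)
        w = w / w.sum()                          # normalised priority weights
        n = A.shape[0]
        ci = (vals[k].real - n) / (n - 1)        # consistency index
        ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]      # Saaty's random index
        return w, ci / ri                        # weights, consistency ratio

    # hypothetical comparison of three perspectives (entries are illustrative)
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])
    w, cr = ahp_priorities(A)
    print("weights:", np.round(w, 3), "consistency ratio:", round(cr, 3))  # CR < 0.1 is acceptable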

Keywords: Analytical Hierarchy Process (AHP), Green Supply Chain Management, Performance Measures (PM), Sustainability.

5961 Lifting Wavelet Transform and Singular Values Decomposition for Secure Image Watermarking

Authors: Siraa Ben Ftima, Mourad Talbi, Tahar Ezzedine

Abstract:

In this paper, we present a technique for the secure watermarking of grayscale and color images. The technique consists in applying the Singular Value Decomposition (SVD) in the Lifting Wavelet Transform (LWT) domain in order to insert a grayscale watermark image into the host image (grayscale or color). It also uses a signature in the embedding and extraction steps. The technique is applied to a number of grayscale and color images. Its performance is demonstrated by Peak Signal-to-Noise Ratio (PSNR), Mean Square Error (MSE) and Structural Similarity (SSIM) computations.
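
The three quality measures used in the evaluation can be computed with scikit-image, as in the following sketch; the random host image and the small perturbation standing in for watermark embedding are assumptions for illustration.

    import numpy as np
    from skimage.metrics import (mean_squared_error,
                                 peak_signal_noise_ratio,
                                 structural_similarity)

    # hypothetical host image and its watermarked version (+/-2 perturbation
    # stands in for the actual LWT-SVD embedding)
    rng = np.random.default_rng(1)
    host = rng.integers(0, 256, (256, 256)).astype(np.uint8)
    watermarked = np.clip(host.astype(int) + rng.integers(-2, 3, host.shape),
                          0, 255).astype(np.uint8)

    print("MSE :", mean_squared_error(host, watermarked))
    print("PSNR:", peak_signal_noise_ratio(host, watermarked, data_range=255))
    print("SSIM:", structural_similarity(host, watermarked, data_range=255))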

Keywords: Color image, grayscale image, singular values decomposition, lifting wavelet transform, image watermarking, watermark, secure.

5960 Quadratic Pulse Inversion Ultrasonic Imaging (QPI): A Two-Step Procedure for Optimization of Contrast Sensitivity and Specificity

Authors: Mamoun F. Al-Mistarihi

Abstract:

We have previously introduced an ultrasonic imaging approach that combines harmonic-sensitive pulse sequences with a post-beamforming quadratic kernel derived from a second-order Volterra filter (SOVF). This approach is designed to produce images with high sensitivity to nonlinear oscillations from microbubble ultrasound contrast agents (UCA) while maintaining high levels of noise rejection. In this paper, a two-step algorithm for computing the coefficients of the quadratic kernel is presented, which reduces the tissue component introduced by motion, maximizes the noise rejection and increases the specificity while optimizing the sensitivity to the UCA. In the first step, quadratic kernels from individual singular modes of the PI data matrix are compared in terms of their ability to maximize the contrast-to-tissue ratio (CTR). In the second step, the quadratic kernels resulting in the highest CTR values are convolved. The imaging results indicate that a signal processing approach to this clinical challenge is feasible.

Keywords: Volterra Filter, Pulse Inversion, Ultrasonic Imaging, Contrast Agent.

5959 Angle of Arrival Estimation Using Maximum Likelihood Method

Authors: H. K. Hwang, Zekeriya Aliyazicioglu, Solomon Wu, Hung Lu, Nick Wilkins, Daniel Kerr

Abstract:

Multiple-input multiple-output (MIMO) radar has received increasing attention in recent years. MIMO radar has many advantages over conventional phased array radar, such as improved target detection, resolution enhancement, and interference suppression. In this paper, results are presented from a simulation study of MIMO uniformly-spaced linear array (ULA) antennas. The performance is investigated under varied parameters, including array size, pseudo-random (PN) sequence length, number of snapshots, and signal-to-noise ratio (SNR). The results for MIMO are compared to those of a traditional array antenna.
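
For a single source and a conventional ULA, the maximum likelihood angle estimate reduces to a grid search over the matched-beamformer output, as in the following sketch; the array size, spacing, and SNR are illustrative, and the sketch shows the conventional-array baseline rather than the MIMO waveform processing studied in the paper.

    import numpy as np

    def steering_vector(theta_deg, n_elements, d_over_lambda=0.5):
        """ULA steering vector for arrival angle theta (broadside = 0 deg)."""
        n = np.arange(n_elements)
        phase = 2 * np.pi * d_over_lambda * n * np.sin(np.deg2rad(theta_deg))
        return np.exp(1j * phase)

    def ml_aoa(x, n_elements, grid=np.arange(-90.0, 90.1, 0.1)):
        """Single-source ML angle estimate from one snapshot x: maximise the
        matched-beamformer output |a(theta)^H x|^2 over a grid of angles."""
        powers = [np.abs(steering_vector(t, n_elements).conj() @ x) ** 2 for t in grid]
        return grid[int(np.argmax(powers))]

    # one noisy snapshot from a source at +20 degrees, 8-element ULA
    rng = np.random.default_rng(2)
    n_el, snr_db = 8, 10
    x = steering_vector(20.0, n_el)
    noise = (rng.normal(size=n_el) + 1j * rng.normal(size=n_el)) / np.sqrt(2)
    x = x + noise * 10 ** (-snr_db / 20)
    print("estimated AoA:", ml_aoa(x, n_el), "deg")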

Keywords: Multiple-input multiple-output (MIMO) radar, phased array antenna, target detection, radar signal processing.

5958 Implementation of a Multimodal Biometrics Recognition System with Combined Palm Print and Iris Features

Authors: Rabab M. Ramadan, Elaraby A. Elgallad

Abstract:

With extensive application, the performance of unimodal biometric systems has to face a diversity of problems such as signal and background noise, distortion, and environmental differences. Therefore, multimodal biometric systems are proposed to solve the above problems. This paper introduces a bimodal biometric recognition system based on features extracted from the human palm print and iris. Palm print biometrics is a fairly new, evolving technology that identifies people by their palm features. The iris, together with the face and fingerprints, is a strong candidate for inclusion in multimodal recognition systems. In this research, we introduce an algorithm for combining the palm- and iris-extracted features using a texture-based descriptor, the Scale Invariant Feature Transform (SIFT). Since the feature sets are non-homogeneous, as features of different biometric modalities are used, these features are concatenated to form a single feature vector. Particle swarm optimization (PSO) is used as a feature selection technique to reduce the dimensionality of the feature vector. The proposed algorithm is applied to the Indian Institute of Technology Delhi (IITD) database, and its performance is compared with various iris recognition algorithms found in the literature.

Keywords: Iris recognition, particle swarm optimization, feature extraction, feature selection, palm print, scale invariant feature transform.

5957 Image Steganography Using Least Significant Bit Technique

Authors: Preeti Kumari, Ridhi Kapoor

Abstract:

In any communication, security is the most important issue in today's world. Steganography is the process of hiding important data inside other data, such as text, audio, video, or images. The interest in this topic is to provide availability, confidentiality, integrity, and authenticity of data. A steganographic technique embeds hidden content in unremarkable cover media so as not to arouse the suspicion of eavesdroppers, third parties, or hackers. Many methods of compression, encryption, decryption, and embedding are used for digital image steganography. Compression introduces noise into the image; to withstand this noise, the LSB insertion technique is used. The performance of the proposed embedding system with respect to providing security for the secret message, and its robustness, is discussed. We also demonstrate the maximum steganographic capacity and visual distortion.
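
A minimal Python sketch of LSB embedding and extraction on a grayscale image follows; the sequential pixel order and one-byte message are simplifying assumptions.

    import numpy as np

    def lsb_embed(cover, bits):
        """Replace the least significant bit of the first len(bits) pixels."""
        flat = cover.flatten()
        assert len(bits) <= flat.size, "message too long for this cover"
        flat[:len(bits)] = (flat[:len(bits)] & 0xFE) | bits
        return flat.reshape(cover.shape)

    def lsb_extract(stego, n_bits):
        """Read back the first n_bits least significant bits."""
        return stego.flatten()[:n_bits] & 1

    cover = np.random.randint(0, 256, (8, 8), dtype=np.uint8)
    message = np.array([1, 0, 1, 1, 0, 0, 1, 0], dtype=np.uint8)  # one byte
    stego = lsb_embed(cover, message)
    assert np.array_equal(lsb_extract(stego, len(message)), message)
    # pixel values change by at most 1, keeping the distortion imperceptible
    print("max pixel change:", np.abs(stego.astype(int) - cover.astype(int)).max())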

Keywords: Steganography, LSB, encoding, information hiding, color image.

5956 Evaluating the Performance of Offensive Lineman in the NFL

Authors: Nikhil Byanna, Abdolghani Ebrahimi, Diego Klabjan

Abstract:

In this paper we objectively measure the performance of an individual offensive lineman in the NFL. The existing literature proposes various measures that rely on subjective assessments of game film, but has yet to develop an objective methodology to evaluate performance. Using a variety of statistics related to an offensive lineman’s performance, we develop a framework to objectively analyze the overall performance of an individual offensive lineman and determine specific linemen who are overvalued or undervalued relative to their salary. We identify eight players across the 2013-2014 and 2014-2015 NFL seasons that are considered to be overvalued or undervalued and corroborate the results with existing metrics that are based on subjective evaluation. To the best of our knowledge, the techniques set forth in this work have not been utilized in previous works to evaluate the performance of NFL players at any position, including offensive linemen.

Keywords: offensive lineman, player performance, NFL, machine learning

5955 Hidden State Probabilistic Modeling for Complex Wavelet Based Image Registration

Authors: F. C. Calnegru

Abstract:

This article presents a computationally tractable probabilistic model for the relation between the complex wavelet coefficients of two images of the same scene. The two images are acquired at distinct moments in time, from distinct viewpoints, or by distinct sensors. By means of the introduced probabilistic model, we argue that the similarity between the two images is controlled not by the values of the wavelet coefficients, which can be altered by many factors, but by the nature of the wavelet coefficients, which we model with the help of hidden state variables. We integrate this probabilistic framework into the construction of a new image registration algorithm. This algorithm has sub-pixel accuracy and is robust to noise and to other variations such as local illumination changes. We present the performance of our algorithm on various image types.

Keywords: Complex wavelet transform, image registration, modeling using hidden state variables, probabilistic similarity measure.

5954 Image Contrast Enhancement Based on a Sub-Histogram Equalization Technique without Over-Equalization Noise

Authors: Hyunsup Yoon, Youngjoon Han, Hernsoo Hahn

Abstract:

In order to enhance the contrast in regions where pixels have similar intensities, this paper presents a new histogram equalization scheme. Conventional global equalization schemes over-equalize such regions, resulting in overly bright or dark pixels, while local equalization schemes produce unexpected discontinuities at block boundaries. The proposed algorithm segments the original histogram into sub-histograms with reference to brightness level and equalizes each sub-histogram within a limited extent of equalization, considering its mean and variance. The final image is determined as the weighted sum of the equalized images obtained from the sub-histogram equalizations. By limiting the maximum and minimum ranges of the equalization operations on individual sub-histograms, the over-equalization effect is eliminated. The resulting image also does not lose feature information in low-density histogram regions, since those regions are equalized separately. This paper also describes how to determine the segmentation points in the histogram. The proposed algorithm has been tested with more than 100 images of various contrasts, and the results are compared to conventional approaches to show its superiority.
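
A minimal Python sketch of the split-and-equalize idea follows; it segments at the image mean and omits the paper's mean/variance-based limits on the extent of equalization, so it is a simplified stand-in rather than the proposed algorithm.

    import numpy as np

    def sub_histogram_equalize(img, split=None):
        """Split the histogram at `split` (default: image mean) and equalize
        the two sub-histograms independently, each confined to its own
        intensity range so the two halves cannot bleed into each other."""
        split = int(img.mean()) if split is None else split
        out = np.empty_like(img)
        for lo, hi, mask in [(0, split, img <= split),
                             (split + 1, 255, img > split)]:
            vals = img[mask]
            if vals.size == 0:
                continue
            hist, _ = np.histogram(vals, bins=hi - lo + 1, range=(lo, hi + 1))
            cdf = hist.cumsum() / vals.size              # per-segment CDF
            out[mask] = lo + np.round(cdf[vals - lo] * (hi - lo))
        return out

    img = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
    eq = sub_histogram_equalize(img)
    print(eq.min(), eq.max())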

Keywords: Contrast Enhancement, Histogram Equalization, Histogram Region Equalization, Equalization Noise

5953 Self-Healing Performance of Heavyweight Concrete with Steam Curing

Authors: Hideki Igawa, Yoshinori Kitsutaka, Takashi Yokomuro, Hideo Eguchi

Abstract:

In this study, the crack self-healing performance of the heavyweight concrete used in the walls of containers and structures designed to shield radioactive materials was investigated. A steam curing temperature that preserves self-healing properties and demolding strength was identified. The presented method of simultaneously mixing the expanding material and fly ash during admixture can maximize the self-healing performance. Adding synthetic fibers to the heavyweight concrete also improved the self-healing performance.

Keywords: Expanding material, heavyweight concrete, self-healing performance, synthetic fiber.

5952 Enhancement of Stereo Video Pairs Using SDNs To Aid In 3D Reconstruction

Authors: Lewis E. Hibell, Honghai Liu, David J. Brown

Abstract:

This paper presents the results of enhancing the images from a left and right stereo pair in order to increase the resolution of a 3D representation of the scene generated from that same pair. A new neural network structure known as a Self Delaying Dynamic Network (SDN) has been used to perform the enhancement. The advantage of SDNs over existing techniques, such as bicubic interpolation, is their ability to cope with motion and noise effects. SDNs are used to generate two high-resolution images, one based on frames taken from the left view of the subject and one based on frames from the right. This new high-resolution stereo pair is then processed by a disparity map generator. The generated disparity map is compared to two other disparity maps of the same scene: one generated from an original high-resolution stereo pair, and one generated using a stereo pair enhanced by bicubic interpolation. The maps generated using the SDN-enhanced pairs match the target maps more closely. The addition of extra noise to the input images is less problematic for the SDN system, which is still able to outperform bicubic interpolation.

Keywords: Genetic Evolution, Image Enhancement, Neural Networks, Stereo Vision

5951 Low Cost Real Time Robust Identification of Impulsive Signals

Authors: R. Biondi, G. Dys, G. Ferone, T. Renard, M. Zysman

Abstract:

This paper describes an automated, implementable system for the detection and recognition of impulsive signals. The system uses a digital signal processing device for the detection and identification process, analysing the signals in real time in order to produce a specific output if needed. Detection is achieved by normalizing the inputs and comparing the read signals to a dynamic threshold, thus avoiding detections triggered by loud or fluctuating environmental noise. Identification is done through neural network algorithms. As a setup, our system can receive signals in order to “learn” certain patterns. Through this learning, the system can recognize signals faster, providing flexibility toward new patterns similar to those already known. Sound is captured through a simple jack input, which could be replaced by an enhanced recording surface such as a wide-area recorder. Furthermore, a communication module can be added to the apparatus to send alerts to another interface if needed.
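
A minimal Python sketch of dynamic-threshold impulse detection follows; the threshold factor, the exponential noise-floor tracker, and the toy signal are assumptions for illustration.

    import numpy as np

    def detect_impulses(signal, k=5.0, alpha=0.999):
        """Flag samples whose magnitude exceeds k times a slowly adapting
        noise floor. The exponential tracker makes the threshold dynamic, so
        loud but steady background noise does not trigger detections."""
        floor = np.abs(signal[0]) + 1e-9
        hits = []
        for i, s in enumerate(signal):
            if np.abs(s) > k * floor:
                hits.append(i)                                   # impulsive event
            else:
                floor = alpha * floor + (1 - alpha) * np.abs(s)  # update noise floor
        return hits

    rng = np.random.default_rng(7)
    x = 0.1 * rng.normal(size=5000)
    x[1234] += 3.0                      # one impulsive event
    print(detect_impulses(x))           # reports index 1234 (plus rare noise outliers)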

Keywords: Sound Detection, Impulsive Signal, Background Noise, Neural Network.

5950 A Sparse Representation Speech Denoising Method Based on Adapted Stopping Residue Error

Authors: Qianhua He, Weili Zhou, Aiwu Chen

Abstract:

A sparse representation speech denoising method based on an adapted stopping residue error is presented in this paper. First, the cross-correlation between the clean speech spectrum and the noise spectrum is analyzed, and an estimation method is proposed. In the denoising method, an over-complete dictionary of the clean speech power spectrum is learned with the K-singular value decomposition (K-SVD) algorithm. In the sparse representation stage, the stopping residue error is adaptively set according to the estimated cross-correlation and the adjusted noise spectrum, and the orthogonal matching pursuit (OMP) approach is applied to reconstruct the clean speech spectrum from the noisy speech. Finally, the clean speech is re-synthesised via the inverse Fourier transform using the reconstructed speech spectrum and the noisy speech phase. The experimental results show that the proposed method outperforms conventional methods in terms of both subjective and objective measures.
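
A minimal Python sketch of OMP with a stopping residue error follows; the dictionary, the sparse target, and the fixed residue value are illustrative (the paper sets the residue adaptively from the estimated cross-correlation).

    import numpy as np

    def omp(D, y, stop_residue):
        """Orthogonal matching pursuit with a stopping residue: greedily pick
        the dictionary atom most correlated with the residual, re-fit the
        support by least squares, and stop once ||residual|| <= stop_residue."""
        residual = y.copy()
        support, sol = [], np.zeros(0)
        coef = np.zeros(D.shape[1])
        while np.linalg.norm(residual) > stop_residue and len(support) < D.shape[1]:
            k = int(np.argmax(np.abs(D.T @ residual)))
            if k in support:
                break
            support.append(k)
            sol, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
            residual = y - D[:, support] @ sol
        coef[support] = sol
        return coef

    # toy over-complete dictionary and a 2-sparse target "spectrum"
    rng = np.random.default_rng(3)
    D = rng.normal(size=(64, 128))
    D /= np.linalg.norm(D, axis=0)                 # unit-norm atoms
    y = 2.0 * D[:, 5] - 1.5 * D[:, 80] + 0.01 * rng.normal(size=64)
    x = omp(D, y, stop_residue=0.1)
    print(np.nonzero(x)[0])                        # should recover atoms 5 and 80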

Keywords: Speech denoising, sparse representation, K-singular value decomposition, orthogonal matching pursuit.

5949 An Intelligent Controller Augmented with Variable Zero Lag Compensation for Antilock Braking System

Authors: Benjamin C. Agwah, Paulinus C. Eze

Abstract:

The antilock braking system (ABS) is one of the important contributions of the automobile industry, designed to ensure road safety by keeping vehicles steerable and stable during emergency braking. This paper presents a wheel slip-based intelligent controller with variable zero lag compensation for ABS. The controller is required to achieve very fast, accurate wheel slip tracking during hard braking and to eliminate chattering, with improved transient and steady state performance, while shortening the stopping distance using an effective braking torque less than the maximum allowable torque to bring the braking vehicle to a stop. The dynamics of a vehicle braking on a straight line from a velocity of 30 m/s were determined and modelled in the MATLAB/Simulink environment to represent a conventional ABS system without a controller. Simulation results indicated that the system without a controller was not able to track the desired wheel slip, and the stopping distance was 135.2 m. Hence, an intelligent controller based on a fuzzy logic controller (FLC) was designed, with a variable zero lag compensator (VZLC) added to enhance the performance of the FLC control variable by eliminating steady state error and providing improved bandwidth to suppress the effect of high-frequency noise such as chattering during braking. The simulation results showed that the FLC-VZLC provided fast tracking of the desired wheel slip, eliminated chattering, and reduced the stopping distance by 70.5% (39.92 m), 63.3% (49.59 m), 57.6% (57.35 m) and 50% (69.13 m) on dry, wet, cobblestone and snow road surface conditions, respectively. In general, the proposed system used an effective braking torque less than the maximum allowable braking torque to achieve efficient wheel slip tracking and robust overall control performance on different road surfaces.
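
The controlled variable, longitudinal wheel slip, is computed as λ = (v - ωr) / v; a minimal Python sketch follows, with the wheel radius and the sample wheel speeds as illustrative assumptions.

    def wheel_slip(v, omega, r):
        """Longitudinal wheel slip during braking: (v - omega*r) / v,
        0 when the wheel rolls freely and 1 when it locks."""
        return (v - omega * r) / max(v, 1e-6)

    # vehicle at 30 m/s; an ABS controller would modulate braking torque so
    # the slip tracks a desired value (around 0.2 is a common dry-road target)
    v, r = 30.0, 0.3                    # vehicle speed [m/s], wheel radius [m]
    for omega in (100.0, 80.0, 0.0):    # wheel angular speeds [rad/s]
        print(f"omega={omega:5.1f} rad/s  slip={wheel_slip(v, omega, r):.2f}")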

Keywords: ABS, Fuzzy Logic Controller, Variable Zero Lag Compensator, Wheel Slip Tracking.

5948 Rapid Frequency Response Measurement of Power Conversion Products with Coherence-Based Confidence Analysis

Authors: Tomi Roinila, Aki Taskinen, Matti Vilkko

Abstract:

Switched-mode converters now play a significant role in modern society. Their operation is often crucial in various electrical applications affecting everyday life. Therefore, the quality of the converters needs to be reliably verified. Recent studies have shown that converters can be fully characterized by a set of frequency responses, which can be efficiently used to validate their proper operation. Consequently, several methods have been proposed to measure the frequency responses quickly and accurately, most often based on correlation techniques. These measurement methods are highly sensitive to external errors and system nonlinearities. This fact has often been overlooked, and the necessary uncertainty analysis of the measured responses has been neglected. This paper presents a simple approach to analyzing the noise and nonlinearities in frequency-response measurements of switched-mode converters. Coherence analysis is applied to form a confidence interval characterizing the noise and nonlinearities involved in the measurements. The presented method is verified by practical measurements from a high-frequency switched-mode converter.
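
With SciPy, a coherence-weighted frequency-response estimate can be formed as in the following sketch; the PRBS-like excitation, the first-order stand-in plant, and the 0.95 coherence cut-off are assumptions for illustration.

    import numpy as np
    from scipy.signal import coherence, csd, welch, lfilter

    # hypothetical excitation/response measurement of a converter loop:
    # broadband (PRBS-like) excitation x, noisy measured response y
    rng = np.random.default_rng(4)
    fs = 10_000
    x = np.sign(rng.normal(size=100_000))         # PRBS-like binary excitation
    y = lfilter([0.05], [1.0, -0.95], x)          # stand-in first-order plant
    y = y + 0.1 * rng.normal(size=y.size)         # measurement noise

    f, Pxy = csd(x, y, fs=fs, nperseg=1024)
    _, Pxx = welch(x, fs=fs, nperseg=1024)
    H = Pxy / Pxx                                 # H1 frequency-response estimate
    _, Cxy = coherence(x, y, fs=fs, nperseg=1024) # confidence weighting in [0, 1]

    # frequencies with low coherence flag noise/nonlinearity-corrupted estimates
    trusted = Cxy > 0.95
    print(f[trusted][:5], np.abs(H[trusted][:5]))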

Keywords: Switched-mode converters, frequency analysis, coherence analysis.

5947 Multi-Objective Optimization of Electric Discharge Machining for Inconel 718

Authors: Pushpendra S. Bharti, S. Maheshwari

Abstract:

Electric discharge machining (EDM) is one of the most widely used non-conventional manufacturing processes for shaping difficult-to-cut materials. The process yield of EDM, in terms of material removal rate, surface roughness and tool wear rate, may be considerably improved by selecting the optimal combination(s) of process parameters. This paper employs the multi-response signal-to-noise (MRSN) ratio technique to find the optimal combination(s) of process parameters during EDM of Inconel 718. Three cases, viz. high cutting efficiency, high surface finish, and normal machining, have been considered, and the optimal combinations of input parameters have been obtained for each case. Analysis of variance (ANOVA) has been employed to find the dominant parameter(s) in all three cases. Experimental verification of the obtained results has also been carried out. The MRSN ratio technique is found to be a simple and effective multi-objective optimization technique.
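
The per-response Taguchi S/N ratios underlying an MRSN-style analysis can be computed as in the following sketch; the replicate measurements and case weights are hypothetical, and the paper's exact normalization of the combined ratio may differ.

    import numpy as np

    def sn_larger_better(y):
        """Taguchi S/N ratio for a larger-the-better response (e.g. MRR)."""
        y = np.asarray(y, dtype=float)
        return -10 * np.log10(np.mean(1.0 / y**2))

    def sn_smaller_better(y):
        """Taguchi S/N ratio for a smaller-the-better response (e.g. Ra, TWR)."""
        y = np.asarray(y, dtype=float)
        return -10 * np.log10(np.mean(y**2))

    # hypothetical replicated measurements for one parameter combination
    mrr = [12.1, 11.8, 12.4]     # material removal rate
    ra = [2.9, 3.1, 3.0]         # surface roughness
    twr = [0.021, 0.024, 0.022]  # tool wear rate

    sn = np.array([sn_larger_better(mrr), sn_smaller_better(ra),
                   sn_smaller_better(twr)])
    weights = np.array([0.5, 0.3, 0.2])  # illustrative case weighting
    print("weighted multi-response S/N:", np.dot(weights, sn))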

Keywords: EDM, material removal rate, multi-response signal-to-noise ratio, optimization, surface roughness.

5946 Quality Estimation of Video Transmitted over an Additive WGN Channel Based on Digital Watermarking and Wavelet Transform

Authors: Mohamed S. El-Mahallawy, Attalah Hashad, Hazem Hassan Ali, Heba Sami Zaky

Abstract:

This paper presents an evaluation of a wavelet-based digital watermarking technique used for estimating the quality of video sequences transmitted over an Additive White Gaussian Noise (AWGN) channel in terms of a classical objective metric, the Peak Signal-to-Noise Ratio (PSNR), without the need for the original video. In this method, a watermark is embedded into the Discrete Wavelet Transform (DWT) domain of the original video frames using a quantization method. The degradation of the extracted watermark can be used to estimate the video quality in terms of PSNR with good accuracy. We calculated the PSNR for video frames contaminated with AWGN and compared the values with those estimated using the watermarking-DWT based approach. The calculated and estimated quality measures of the video frames are highly correlated, suggesting that this method can provide a good quality measure for video frames transmitted over an AWGN channel without requiring the original video.

Keywords: AWGN, DWT, PSNR, Watermarking, Video Quality.

5945 A Novel Prostate Segmentation Algorithm in TRUS Images

Authors: Ali Rafiee, Ahad Salimi, Ali Reza Roosta

Abstract:

Prostate cancer is one of the most frequent cancers in men and is a major cause of mortality in most countries. Many diagnostic and treatment procedures for prostate disease require accurate detection of the prostate boundaries in transrectal ultrasound (TRUS) images. This is a challenging and difficult task due to weak prostate boundaries, speckle noise and the short range of gray levels. In this paper, a novel method for automatic prostate segmentation in TRUS images is presented, involving preprocessing (edge-preserving noise reduction and smoothing) followed by segmentation. Speckle reduction has been achieved using a stick filter, and a top-hat transform has been implemented for smoothing. A feed-forward neural network and local binary patterns are used together to find a point inside the prostate object. Finally, the boundary of the prostate is extracted from this inside point by an active contour algorithm. A number of experiments were conducted to validate the method, and the results showed that the new algorithm extracted the prostate boundary with an MSE of less than 4.6% relative to the boundary provided manually by physicians.

Keywords: Prostate segmentation, stick filter, neural network, active contour.

5944 A Nano-Scaled SRAM Guard Band Design with Gaussian Mixtures Model of Complex Long Tail RTN Distributions

Authors: Worawit Somha, Hiroyuki Yamauchi

Abstract:

This paper proposes, for the first time, how the challenges facing guard-band designs, including the margin assist-circuit scheme for screening tests, should be addressed in the coming process generations. The increased impact of screening errors is discussed based on the proposed statistical analysis models. It is shown that the yield loss caused by misjudgment in the screening test becomes five orders of magnitude larger than that of the conventional case when the amplitude of variations caused by random telegraph noise (RTN) approaches that of random dopant fluctuation. Three fitting methods for approximating the complex RTN-induced Gamma mixture distributions by a simple Gaussian mixture model (GMM) are proposed and compared. It is verified that the proposed methods can reduce the error of the fail-bit predictions by four orders of magnitude.
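
A minimal Python sketch of approximating a long-tail (Gamma-mixture) sample by a simple GMM fitted with the EM algorithm follows; the Gamma parameters, component count, and guard-band threshold are illustrative assumptions.

    import numpy as np
    from scipy.stats import norm
    from sklearn.mixture import GaussianMixture

    # stand-in for an RTN-induced long-tail distribution: a mixture of Gammas
    rng = np.random.default_rng(5)
    samples = np.concatenate([rng.gamma(2.0, 1.0, 20_000),
                              rng.gamma(6.0, 2.5, 2_000)])  # heavy-tail component

    # approximate the complex mixture with a simple Gaussian mixture model,
    # fitted by the EM algorithm; the number of components is a design choice
    gmm = GaussianMixture(n_components=3, random_state=0).fit(samples.reshape(-1, 1))
    print("weights:", np.round(gmm.weights_, 3))
    print("means  :", np.round(gmm.means_.ravel(), 3))

    # tail probability beyond a guard-band threshold, from the fitted GMM
    thr = 15.0
    tail = sum(w * norm.sf(thr, m, np.sqrt(v))
               for w, m, v in zip(gmm.weights_, gmm.means_.ravel(),
                                  gmm.covariances_.ravel()))
    print("estimated fail-bit tail probability:", tail)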

Keywords: Mixtures of Gaussians, Random telegraph noise, EM algorithm, Long-tail distribution, Fail-bit analysis, Static random access memory, Guard band design.

5943 A Study on the Effect of Design Factors of Slim Keyboard’s Tactile Feedback

Authors: Kai-Chieh Lin, Chih-Fu Wu, Hsiang Ling Hsu, Yung-Hsiang Tu, Chia-Chen Wu

Abstract:

With the rapid development of computer technology, the design of computers and keyboards is moving towards slimness. This change in mobile input devices directly influences users' behavior. Although multi-touch applications allow text entry through a virtual keyboard, the performance, feedback, and comfort of that technology are inferior to a traditional keyboard, and while manufacturers have launched mobile touch keyboards and projection keyboards, their performance has not been satisfactory. Therefore, this study examined the design factors of slim pressure-sensitive keyboards. The factors were evaluated with an objective evaluation (accuracy and speed) and a subjective evaluation (operability, recognition, feedback, and difficulty) depending on the shape (circle, rectangle, and L-shaped), thickness (flat, 3 mm, and 6 mm), and actuation force (35±10 g, 60±10 g, and 85±10 g) of the keys. Moreover, MANOVA and Taguchi methods (regarding signal-to-noise ratios) were conducted to find the optimal level of each design factor. The research participants were divided into two groups by their typing speed (30 words/minute). Considering the multitude of variables and levels, the experiments were implemented using a fractional factorial design, and a representative model of the research samples was established for input task testing. The findings of this study showed that participants with low typing speed relied primarily on vision to recognize the keys, while those with high typing speed relied on tactile feedback, which was affected by the thickness and actuation force of the keys. In the objective and subjective evaluations, a combination of keyboard design factors that might result in higher performance and satisfaction was identified (L-shaped, 3 mm, and 60±10 g) as the optimal combination. The learning curve was analyzed against a traditional standard keyboard to investigate the influence of user experience on keyboard operation. The results indicated that the optimal combination provided input performance inferior to that of a standard keyboard. These results could serve as a reference for the development of related products in industry and for comprehensive application to touch devices and input interfaces with which people interact.

Keywords: Input performance, mobile device, slim keyboard, tactile feedback.
