Search results for: signal reconstruction.
1309 A Fast and Robust Protocol for Reconstruction and Re-Enactment of Historical Sites
Authors: S. I. Abu Alasal, M. M. Esbeih, E. R. Fayyad, R. S. Gharaibeh, M. Z. Ali, A. A. Freewan, M. M. Jamhawi
Abstract:
This research proposes a novel reconstruction protocol for restoring missing surfaces and low-quality edges and shapes in photos of artifacts at historical sites. The protocol starts with the extraction of a point cloud. This extraction process is based on four subordinate algorithms, which differ in their robustness and in the amount of data they produce. Moreover, they offer different, but complementary, accuracy with respect to the related features and to the way they build a quality mesh. The performance of our proposed protocol is compared with other state-of-the-art algorithms and toolkits. The statistical analysis shows that our algorithm significantly outperforms its rivals in the quality of the resulting object files used to reconstruct the desired model.
Keywords: Meshes, Point Clouds, Surface Reconstruction Protocols, 3D Reconstruction.
1308 Anomaly Detection in a Data Center with a Reconstruction Method Using a Multi-Autoencoders Model
Authors: Victor Breux, Jérôme Boutet, Alain Goret, Viviane Cattin
Abstract:
Early detection of anomalies in data centers is important to reduce downtimes and the costs of periodic maintenance. However, there is little research on this topic and even less on the fusion of sensor data for the detection of abnormal events. The goal of this paper is to propose a method for anomaly detection in data centers by combining sensor data (temperature, humidity, power) and deep learning models. The model described in the paper uses one autoencoder per sensor to reconstruct the inputs. The autoencoders contain Long Short-Term Memory (LSTM) layers and are trained using the normal samples of the relevant sensors selected by correlation analysis. The difference signal between the input and its reconstruction is then used to classify the samples using feature extraction and a random forest classifier. The data measured by the sensors of a data center between January 2019 and May 2020 are used to train the model, while the data between June 2020 and May 2021 are used to assess it. The performance of the model is assessed a posteriori through the F1-score by comparing detected anomalies with the data center’s history. The proposed model, with an F1-score of 83.60% compared to 24.16%, outperforms the state-of-the-art reconstruction method, which uses a single autoencoder taking multivariate sequences and detects an anomaly with a threshold on the reconstruction error.
Keywords: Anomaly detection, autoencoder, data centers, deep learning.
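As an illustration of the reconstruction-based pipeline described above, the following sketch uses one LSTM autoencoder per sensor and feeds the per-sensor reconstruction errors into a random forest. Layer sizes, window length and all other names are assumptions for illustration, not the authors' exact architecture.
```python
# A minimal sketch (assumed names and hyperparameters, not the authors' exact model):
# one LSTM autoencoder per sensor, trained on normal windows; reconstruction errors of
# all sensors are stacked as features for a random forest classifier.
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, RepeatVector, TimeDistributed, Dense
from sklearn.ensemble import RandomForestClassifier

WINDOW = 32  # samples per sequence

def build_autoencoder():
    model = Sequential([
        LSTM(16, input_shape=(WINDOW, 1)),   # encoder
        RepeatVector(WINDOW),
        LSTM(16, return_sequences=True),     # decoder
        TimeDistributed(Dense(1)),
    ])
    model.compile(optimizer="adam", loss="mse")
    return model

def reconstruction_error(model, windows):
    recon = model.predict(windows, verbose=0)
    return np.mean((windows - recon) ** 2, axis=(1, 2))  # one error per window

# normal_sensors / labelled_sensors: dict of sensor name -> array (n_windows, WINDOW, 1)
def fit_pipeline(normal_sensors, labelled_sensors, labels):
    autoencoders = {}
    for name, windows in normal_sensors.items():
        ae = build_autoencoder()
        ae.fit(windows, windows, epochs=20, batch_size=64, verbose=0)
        autoencoders[name] = ae
    # stack per-sensor reconstruction errors into a feature matrix
    features = np.column_stack(
        [reconstruction_error(autoencoders[n], labelled_sensors[n]) for n in autoencoders]
    )
    clf = RandomForestClassifier(n_estimators=100).fit(features, labels)
    return autoencoders, clf
```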
1307 Partial 3D Reconstruction using Evolutionary Algorithms
Authors: Mónica Pérez-Meza, Rodrigo Montúfar-Chaveznava
Abstract:
When reconstructing a scenario, it is necessary to know the structure of the elements present in the scene in order to interpret it. In this work we link 3D scene reconstruction to evolutionary algorithms through stereo vision theory. We consider stereo vision as a method that provides the reconstruction of a scene using only a pair of images of the scene and performing some computation. Through several images of a scene, captured from different positions, stereo vision can give us an idea of the three-dimensional characteristics of the world. Stereo vision usually requires two cameras, in analogy to the mammalian vision system. In this work we employ only one camera, which is translated along a path, capturing images at regular intervals. As we cannot perform all the computations required for an exhaustive reconstruction, we employ an evolutionary algorithm to partially reconstruct the scene in real time. The algorithm employed is the fly algorithm, which uses “flies” to reconstruct the principal characteristics of the world following certain evolutionary rules.
Keywords: 3D Reconstruction, Computer Vision, Evolutionary Algorithms, Vision Stereo.
1306 Design of Medical Information Storage System – ECG Signal
Authors: A. Rubiano F, N. Olarte, D. Lara
Abstract:
This paper presents the design, implementation and results related to the storage system for medical information associated with the ECG (electrocardiography) signal. The system includes the signal acquisition modules, the preprocessing and signal processing stages, followed by a module for transmission and reception of the signal, along with the storage and web display system of the medical platform. The tests were initially performed with this signal, with the purpose of including more biosignals in the same system in the future.
Keywords: Acquisition, ECG Signal, Storage, Web Platform.
1305 Electronic System Design for Respiratory Signal Processing
Authors: C. Matiz C., N. Olarte L., A. Rubiano F.
Abstract:
This paper presents the design of the electronic system for the respiratory signal, including the processing phases, followed by the transmission and reception of this signal and finally its display. The processing of this signal is added to the ECG and temperature signals implemented last year. Under this scheme, it is proposed that the blood pressure signal will also be conditioned on the same final printed circuit in the future.
Keywords: Conditioning, Respiratory Signal, Storage, Teleconsultation.
1304 Near Perfect Reconstruction Quadrature Mirror Filter
Authors: A. Kumar, G. K. Singh, R. S. Anand
Abstract:
In this paper, various algorithms for designing quadrature mirror filters are reviewed, and a new algorithm is presented for the design of a near perfect reconstruction quadrature mirror filter bank. In the proposed algorithm, the objective function is formulated using the perfect reconstruction condition or the magnitude response condition of the prototype filter at the frequency ω = 0.5π in the ideal case. The cutoff frequency is iteratively changed to adjust the filter coefficients using an optimization algorithm. The performance of the proposed algorithm is evaluated in terms of computation time, reconstruction error and number of iterations. The design examples illustrate that the proposed algorithm is superior in terms of peak reconstruction error, computation time, and number of iterations. The proposed algorithm is simple, easy to implement, and linear in nature.
Keywords: Aliasing cancellation filter bank, Filter banks, quadrature mirror filter (QMF), subband coding.
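The iterative cutoff adjustment described in the abstract can be illustrated with a small sketch: a linear-phase FIR prototype is redesigned until its magnitude at ω = 0.5π reaches 1/√2, the ideal quadrature condition. The bisection search, filter length and SciPy routines below are assumptions for illustration, not the authors' optimization algorithm.
```python
# A minimal sketch (not the authors' algorithm): iteratively adjust the cutoff of an
# FIR prototype so that |H(0.5*pi)| = 1/sqrt(2), the ideal QMF condition at quadrature.
import numpy as np
from scipy.signal import firwin, freqz

def design_qmf_prototype(num_taps=32, tol=1e-6, max_iter=100):
    lo, hi = 0.3, 0.7          # search interval for the normalized cutoff (Nyquist = 1.0)
    target = 1.0 / np.sqrt(2)  # desired magnitude at omega = 0.5*pi
    for _ in range(max_iter):
        wc = 0.5 * (lo + hi)
        h = firwin(num_taps, wc)                # linear-phase low-pass prototype
        _, resp = freqz(h, worN=[0.5 * np.pi])  # response at the quadrature frequency
        mag = np.abs(resp[0])
        if abs(mag - target) < tol:
            break
        # the magnitude at 0.5*pi grows with the cutoff, so bisect accordingly
        if mag > target:
            hi = wc
        else:
            lo = wc
    return h

h0 = design_qmf_prototype()
h1 = h0 * (-1) ** np.arange(len(h0))  # mirror (high-pass) filter of the QMF pair
```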
1303 On the Reduction of Side Effects in Tomography
Authors: V. Masilamani, C. Vanniarajan, Kamala Krithivasan
Abstract:
As Computed Tomography (CT) normally requires hundreds of projections to reconstruct the image, patients are exposed to more X-ray energy, which may cause side effects such as cancer. Even when the variability of the particles in the object is very low, computed tomography requires many projections for good quality reconstruction. In this paper, the low variability of the particles in an object has been exploited to obtain good quality reconstruction. Though the reconstructed image and the original image have the same projections, in general they need not be identical. In addition to the projections, if a priori information about the image is known, it is possible to obtain a good quality reconstructed image. In this paper, it has been shown by experimental results why conventional algorithms fail to reconstruct from a few projections, and an efficient polynomial-time algorithm has been given to reconstruct a bi-level image from its projections along rows and columns, together with a known sub-image of the unknown image and smoothness constraints, by reducing the reconstruction problem to an integral max-flow problem. This paper also discusses the necessary and sufficient conditions for uniqueness and the extension of 2D bi-level image reconstruction to 3D bi-level image reconstruction.
Keywords: Discrete Tomography, Image Reconstruction, Projection, Computed Tomography, Integral Max Flow Problem, Smooth Binary Image.
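The max-flow reduction can be sketched for the basic case only (row and column projections, without the known sub-image or smoothness constraints handled in the paper): each pixel becomes a unit-capacity edge between its row node and column node, and an integral maximum flow yields a binary image with the prescribed projections. The networkx-based formulation and node names are illustrative assumptions.
```python
# A minimal sketch (basic case only): reconstruct a binary image from its row and
# column projections by solving an integral max-flow problem on a bipartite
# source/row/column/sink graph.
import numpy as np
import networkx as nx

def reconstruct_binary(row_sums, col_sums):
    m, n = len(row_sums), len(col_sums)
    G = nx.DiGraph()
    for i, r in enumerate(row_sums):
        G.add_edge("s", ("row", i), capacity=int(r))
        for j in range(n):
            G.add_edge(("row", i), ("col", j), capacity=1)  # pixel (i, j) is 0 or 1
    for j, c in enumerate(col_sums):
        G.add_edge(("col", j), "t", capacity=int(c))
    flow_value, flow = nx.maximum_flow(G, "s", "t")
    if flow_value != sum(row_sums):
        raise ValueError("projections are inconsistent")
    img = np.zeros((m, n), dtype=int)
    for i in range(m):
        for j in range(n):
            img[i, j] = flow[("row", i)][("col", j)]
    return img

original = np.array([[1, 0, 1], [0, 1, 1], [1, 1, 0]])
rec = reconstruct_binary(original.sum(axis=1), original.sum(axis=0))
# the reconstruction shares the projections of the original, though it need not equal it
assert (rec.sum(axis=1) == original.sum(axis=1)).all()
assert (rec.sum(axis=0) == original.sum(axis=0)).all()
```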
1302 3-D Reconstruction of Objects Using Digital Fringe Projection: Survey and Experimental Study
Authors: R. Talebi, A. Abdel-Dayem, J. Johnson
Abstract:
Three-dimensional reconstruction of small objects has been one of the most challenging problems over the last decade. Computer graphics researchers and photography professionals have been working on improving 3D reconstruction algorithms to fit the high demands of various real-life applications. Medical sciences, the animation industry, virtual reality, pattern recognition, the tourism industry, and reverse engineering are common fields where 3D reconstruction of objects plays a vital role. Both lack of accuracy and high computational cost are the major challenges facing successful 3D reconstruction. Fringe projection has emerged as a promising 3D reconstruction direction that combines low computational cost with both high precision and high resolution. It employs digital projection, structured light systems and phase analysis of fringe pictures. Research studies have shown that the system has acceptable performance and, moreover, is insensitive to ambient light. This paper presents an overview of fringe projection approaches. It also presents an experimental study and implementation of a simple fringe projection system. We tested our system using two objects with different materials and levels of detail. Experimental results have shown that, while our system is simple, it produces acceptable results.
Keywords: Digital fringe projection, 3D reconstruction, phase unwrapping, phase shifting.
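As a hedged illustration of the phase-analysis step, the sketch below recovers a wrapped phase map from a standard four-step phase-shifting sequence and applies a simple row-wise unwrapping; the actual system in the paper may use a different shifting scheme and unwrapping method.
```python
# A minimal sketch (assuming a standard four-step phase-shifting scheme): recover the
# wrapped phase from four fringe images shifted by 90 degrees, then unwrap it row-wise.
import numpy as np

def four_step_phase(I1, I2, I3, I4):
    # I_k = A + B*cos(phi + (k-1)*pi/2); the wrapped phase follows from atan2
    wrapped = np.arctan2(I4 - I2, I1 - I3)
    return np.unwrap(wrapped, axis=1)  # simple row-wise unwrapping

# synthetic fringes over a smooth surface, for illustration only
x = np.linspace(0, 4 * np.pi, 256)
phi = np.tile(x, (256, 1))                          # "true" phase map
shifts = [0, np.pi / 2, np.pi, 3 * np.pi / 2]
frames = [0.5 + 0.4 * np.cos(phi + s) for s in shifts]
recovered = four_step_phase(*frames)
```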
1301 Improved Estimation of Evolutionary Spectrum based on Short Time Fourier Transforms and Modified Magnitude Group Delay by Signal Decomposition
Authors: H K Lakshminarayana, J S Bhat, H M Mahesh
Abstract:
A new estimator for the evolutionary spectrum (ES) based on the short time Fourier transform (STFT) and the modified group delay function (MGDF) with signal decomposition (SD) is proposed. The STFT, due to its built-in averaging, suppresses the cross terms, and the MGDF preserves the frequency resolution of the rectangular window with a reduction in the Gibbs ripple. The present work overcomes the magnitude distortion observed in multi-component non-stationary signals with STFT and MGDF estimation of the ES by using SD. The SD is achieved either through the discrete cosine transform based harmonic wavelet transform (DCTHWT) or through perfect reconstruction filter banks (PRFB). The MGDF also improves the signal-to-noise ratio by removing associated noise. The performance of the present method is illustrated for cross chirp and frequency shift keying (FSK) signals, which indicates that it performs better than STFT-MGDF (STFT-GD) alone. Further, its noise immunity is better than that of the STFT. The SD-based methods, however, cannot bring out the frequency transition path from band to band clearly, as there will be a gap in the contour plot at the transition. The PRFB-based STFT-SD shows better performance than the DCTHWT decomposition method for STFT-GD.
Keywords: Evolutionary Spectrum, Modified Group Delay, Discrete Cosine Transform, Harmonic Wavelet Transform, Perfect Reconstruction Filter Banks, Short Time Fourier Transform.
1300 Stochastic Resonance in Nonlinear Signal Detection
Authors: Youguo Wang, Lenan Wu
Abstract:
Stochastic resonance (SR) is a phenomenon whereby signal transmission or signal processing through certain nonlinear systems can be improved by adding noise. This paper discusses SR in nonlinear signal detection by a simple test statistic, which can be computed from multiple noisy data in a binary decision problem based on a maximum a posteriori probability criterion. The performance of detection is assessed by the probability of detection error P_er. When the input signal is a subthreshold signal, we establish that a benefit from noise can be gained for different noises and confirm further that subthreshold SR exists in nonlinear signal detection. The efficacy of SR is significantly improved, and the minimum of P_er can approach zero dramatically as the sample number increases. These results show the robustness of SR in signal detection and extend the applicability of SR in signal processing.
Keywords: Probability of detection error, signal detection, stochastic resonance.
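A hedged numerical sketch of the effect (illustrative parameters, not the paper's exact detector): a subthreshold binary signal is passed through a hard threshold, the test statistic is the fraction of threshold crossings, and the Monte Carlo estimate of the error probability shows a minimum at a non-zero noise level.
```python
# A minimal sketch of subthreshold stochastic resonance in threshold-based detection.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
A, theta, n_samples, n_trials = 0.5, 1.0, 64, 4000   # subthreshold: A < theta

def error_probability(sigma):
    errors = 0
    # decision level: midpoint of the expected crossing rates for s = +A and s = -A
    p_plus = norm.sf((theta - A) / sigma)
    p_minus = norm.sf((theta + A) / sigma)
    tau = 0.5 * (p_plus + p_minus)
    for _ in range(n_trials):
        s = A if rng.random() < 0.5 else -A
        x = s + sigma * rng.standard_normal(n_samples)
        stat = np.mean(x > theta)          # test statistic from the threshold outputs
        decision = A if stat > tau else -A
        errors += decision != s
    return errors / n_trials

for sigma in [0.1, 0.3, 0.5, 0.8, 1.2, 2.0]:
    print(f"sigma = {sigma:.1f}  Per ~ {error_probability(sigma):.3f}")
```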
1299 Performance Analysis of Reconstruction Algorithms in Diffuse Optical Tomography
Authors: K. Uma Maheswari, S. Sathiyamoorthy, G. Lakshmi
Abstract:
Diffuse Optical Tomography (DOT) is a non-invasive imaging modality used in clinical diagnosis for early detection of carcinoma cells in brain tissue. It is a form of optical tomography which gives a reconstructed image of human soft tissue by using near-infrared light. It comprises two steps, called the forward model and the inverse model. The forward model describes the light propagation in a biological medium. The inverse model uses the scattered light to recover the optical parameters of human tissue. DOT suffers from severe ill-posedness due to its incomplete measurement data, so accurate analysis with this modality is very complicated. To overcome this problem, optical properties of the soft tissue such as the absorption coefficient, scattering coefficient and optical flux are processed by the standard regularization technique called Levenberg-Marquardt regularization. Reconstruction algorithms such as the Split Bregman and gradient projection for sparse reconstruction (GPSR) methods are used to reconstruct the image of human soft tissue for tumour detection. Among these algorithms, the Split Bregman method provides better performance than the GPSR algorithm. Parameters such as the signal-to-noise ratio (SNR), contrast-to-noise ratio (CNR), relative error (RE) and CPU time for reconstructing the images are analyzed to assess performance.
Keywords: Diffuse optical tomography, ill-posedness, Levenberg Marquardt method, Split Bregman, gradient projection for sparse reconstruction.
1298 Fast 2.5D Model Reconstruction of Assembled Parts with High Occlusion for Completeness Inspection
Authors: Matteo Munaro, Stefano Michieletto, Edmond So, Daniele Alberton, Emanuele Menegatti
Abstract:
In this work a dual laser triangulation system is presented for fast building of 2.5D textured models of objects within a production line. This scanner is designed to produce data suitable for 3D completeness inspection algorithms. For this purpose two laser projectors have been used in order to considerably reduce the problem of occlusions in the camera movement direction. Results of reconstruction of electronic boards are presented, together with a comparison with a commercial system.
Keywords: 3D quality inspection, 2.5D reconstruction, laser triangulation, occlusions.
1297 Noise-Improved Signal Detection in Nonlinear Threshold Systems
Authors: Youguo Wang, Lenan Wu
Abstract:
We discuss signal detection through nonlinear threshold systems. The detection performance is assessed by the probability of error P_er. We establish that: (1) when the signal is completely suprathreshold, noise always degrades signal detection, both in the single threshold system and in the parallel array of threshold devices; (2) when the signal is slightly subthreshold, noise degrades signal detection in the single threshold system, but in the parallel array noise can improve signal detection, i.e., stochastic resonance (SR) exists in the array; (3) when the signal is predominantly subthreshold, noise can always improve signal detection and SR always exists, not only in the single threshold system but also in the parallel array; (4) the array can improve signal detection by increasing the number of threshold devices. These results further extend the applicability of SR in signal detection.
Keywords: Probability of error, signal detection, stochastic resonance, threshold system.
1296 Wavelet Based Residual Method of Detecting GSM Signal Strength Fading
Authors: Danladi Ali, Onah Festus Iloabuchi
Abstract:
In this paper, GSM signal strength was measured in order to detect the type of signal fading phenomenon, using a one-dimensional multilevel wavelet residual method and neural network clustering to determine the average GSM signal strength received in the study area. The wavelet residual method predicted that the GSM signal experiences slow fading and is attenuated, with an MSE of 3.875 dB. The neural network clustering revealed that mostly -75 dB, -85 dB and -95 dB were received. This means that the signal strength received in the study area is weak.
Keywords: One-dimensional multilevel wavelets, path loss, GSM signal strength, propagation and urban environment.
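The residual idea can be sketched with PyWavelets on a hypothetical received-signal-strength trace: the multilevel approximation is taken as the slow-fading trend and the residual's MSE quantifies the remaining variation. The wavelet, decomposition level and data below are assumptions, not the study's measurements.
```python
# A minimal sketch of a one-dimensional multilevel wavelet residual on synthetic data.
import numpy as np
import pywt

rng = np.random.default_rng(1)
t = np.arange(1024)
rss = -80 + 5 * np.sin(2 * np.pi * t / 512) + rng.normal(0, 2, t.size)  # signal-strength trace

level = 4
coeffs = pywt.wavedec(rss, "db4", level=level)
# zero out all detail coefficients, keep only the level-4 approximation as the trend
trend_coeffs = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
trend = pywt.waverec(trend_coeffs, "db4")[: rss.size]

residual = rss - trend
print("residual MSE:", round(float(np.mean(residual ** 2)), 3))
```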
1295 Debt Reconstruction, Career Development and Farmers' Household Well-Being in Thailand
Authors: Yothin Sawangdee, Piyawat Katewongsa, Chutima Yousomboon, Kornkanok Pongpradit, Sakapas Saengchai, Phusit Khantikul
Abstract:
Debt reconstruction under moratorium projects is an important method that greatly benefits both the banks and the farmers. The method can reduce the probability of non-performing loans. This paper discusses debt reconstruction and career development training for farmers in Thailand between 2011 and 2013. The research design is a mixed method combining quantitative and qualitative surveys. The sample size for the quantitative method is 1,003 cases. Data gathering took place between October and December 2013. The main results affirm that debt reconstruction is needed, and that there are numerous benefits from the farmers' career development training. Many of the farmers who attended field school activities were able to apply the knowledge learned to their farm work. They can reduce production costs. The farmers' quality of life and their household well-being also improve. This program should be applied in any country where farmers carry high debts and a high risk of default.
Keywords: Career development, debt reconstruction, farmers' household well-being, Thailand.
1294 Comparison between Haar and Daubechies Wavelet Transformations on FPGA Technology
Authors: Fatma H. Elfouly, Mohamed I. Mahmoud, Moawad I. M. Dessouky, Salah Deyab
Abstract:
Recently, Field Programmable Gate Array (FPGA) technology has offered the potential of designing high performance systems at low cost. The discrete wavelet transform has gained a reputation as a very effective signal analysis tool for many practical applications. However, due to its computation-intensive nature, current implementations of the transform fall short of meeting the real-time processing requirements of most applications. The objective of this paper is to implement the Haar and Daubechies wavelets using FPGA technology. In addition, the Bit Error Rate (BER) between the input audio signal and the reconstructed output signal for each wavelet is calculated. From the BER, it is seen that the implementations execute the operation of the wavelet transform correctly and satisfy the perfect reconstruction conditions. The design procedure has been explained and carried out using state-of-the-art Electronic Design Automation (EDA) tools for system design on FPGA. Simulation, synthesis and implementation on the FPGA target technology have been carried out.
Keywords: Daubechies wavelet, discrete wavelet transform, Haar wavelet, Xilinx FPGA.
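Although the paper targets an FPGA implementation, the perfect-reconstruction check and BER computation can be sketched in software with PyWavelets on a hypothetical audio vector; the wavelets, decomposition level and 16-bit quantization below are assumptions.
```python
# A minimal software sketch of the perfect-reconstruction check and BER computation.
import numpy as np
import pywt

def bit_error_rate(original, reconstructed, bits=16):
    scale = 2 ** (bits - 1) - 1
    mask = 2 ** bits - 1
    a = np.clip(np.round(original * scale), -scale, scale).astype(int) & mask
    b = np.clip(np.round(reconstructed * scale), -scale, scale).astype(int) & mask
    errors = sum(bin(int(x)).count("1") for x in np.bitwise_xor(a, b))
    return errors / (a.size * bits)

rng = np.random.default_rng(0)
audio = rng.uniform(-1, 1, 4096)             # stand-in for an input audio signal

for wavelet in ("haar", "db4"):              # Haar and a Daubechies wavelet
    coeffs = pywt.wavedec(audio, wavelet, level=5)
    recon = pywt.waverec(coeffs, wavelet)[: audio.size]
    print(wavelet, "BER:", bit_error_rate(audio, recon))
```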
1293 All Optical Wavelength Conversion Based On Four Wave Mixing in Optical Fiber
Authors: Surinder Singh, Gursewak Singh Lovkesh
Abstract:
We have designed a wavelength converter based on four-wave mixing in an optical fiber at 10 Gb/s. The power of the converted signal increases with an increase in signal power. The converted signal power is investigated as a function of input signal power and pump power. On comparing the converted signal power at different values of input signal power, we observe that the best converted signal power is obtained at -2 dBm input signal power, for up-conversion as well as for down-conversion. Further, the FWM efficiency and quality factor are observed for increasing input signal power and optical fiber length.
Keywords: FWM, Optical fiber, Quality, Wavelength Converter.
1292 Preparation of Computer Model of the Aircraft for Numerical Aeroelasticity Tests – Flutter
Authors: M. Rychlik, R. Roszak, M. Morzynski, M. Nowak, H. Hausa, K. Kotecki
Abstract:
This article presents the geometry and structure reconstruction procedure of an aircraft model for flutter research (based on the I22-IRYDA aircraft). For the reconstruction, Reverse Engineering techniques and advanced surface modeling CAD tools are used. The authors discuss all stages of the data acquisition process and the computation and analysis of the measured data. For acquisition, a three-dimensional structured light scanner was used. In the following sections, details of the reconstruction process are presented. The geometry reconstruction procedure transforms the measured input data (point cloud) into a three-dimensional parametric computer model (NURBS solid model) which is compatible with CAD systems. In parallel with the geometry of the aircraft, the internal structure (structural model) is extracted and modeled. In the last chapter, the evaluation of the obtained models is discussed.
Keywords: computer modeling, numerical simulation, Reverse Engineering, structural model
1291 Applied Actuator Fault Accommodation in Flight Control Systems Using Fault Reconstruction Based FDD and SMC Reconfiguration
Authors: A. Ghodbane, M. Saad, J.-F. Boland, C. Thibeault
Abstract:
Historically, actuator redundancy was used to deal with faults occurring suddenly in flight systems. This technique is generally expensive and time consuming, and involves increased weight and space in the system. Therefore, nowadays, on-line fault diagnosis and accommodation of actuators play a major role in the design of avionic systems. These approaches, known as Fault Tolerant Flight Control systems (FTFCs), are able to adapt to such sudden faults while keeping avionics systems lighter and less expensive. In this paper, an FTFC system based on the Geometric Approach and a Reconfigurable Flight Control (RFC) is presented. The Geometric Approach is used for cosmic-ray fault reconstruction, while Sliding Mode Control (SMC) based on Lyapunov stability theory is designed for the reconfiguration of the controller in order to compensate for the fault effect. Matlab®/Simulink® simulations are performed to illustrate the effectiveness and robustness of the proposed flight control system against actuator fault signals caused by cosmic rays. The results demonstrate the successful real-time implementation of the proposed FTFC system on a non-linear 6 DOF aircraft model.
Keywords: Actuators’ faults, Fault detection and diagnosis, Fault tolerant flight control, Sliding mode control, Geometric approach for fault reconstruction, Lyapunov stability.
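A hedged sketch of the sliding-mode idea on a double-integrator surrogate plant (not the paper's 6-DOF aircraft model, and without the geometric fault-reconstruction block): the switching term absorbs a bounded additive actuator fault while the tracking error converges to the sliding surface.
```python
# A minimal sliding-mode control sketch on a double-integrator with an additive fault.
import numpy as np

dt, T = 0.001, 5.0
lam, k = 4.0, 3.0                    # surface slope and switching gain (k > |fault|)
x, dx = 0.0, 0.0                     # plant state: position and velocity

def reference(t):                    # desired trajectory and its derivatives
    return np.sin(t), np.cos(t), -np.sin(t)

for step in range(int(T / dt)):
    t = step * dt
    r, dr, ddr = reference(t)
    e, de = x - r, dx - dr
    s = de + lam * e                            # sliding surface
    u = ddr - lam * de - k * np.tanh(s / 0.05)  # smoothed sign() to limit chattering
    fault = 1.0 if t > 2.0 else 0.0             # additive actuator fault after t = 2 s
    ddx = u + fault                             # double-integrator dynamics
    dx += ddx * dt
    x += dx * dt

print("final tracking error:", abs(x - reference(T)[0]))
```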
1290 Development of Intelligent Time/Frequency Based Signal Detection Algorithm for Intrusion Detection System
Authors: Waqas Ahmed, S Sajjad Haider Zaidi
Abstract:
For the past couple of decades, weak signal detection has been of crucial importance in various engineering and scientific applications. It finds application in areas such as wireless communication, radar, aerospace engineering and control systems, among many others. Usually, weak signal detection requires a phase-sensitive detector and a demodulation module to detect and analyze the signal. This article gives a preamble to an intrusion detection system which can effectively detect a weak signal from a multiplexed signal. By carefully inspecting and analyzing the respective signal, this system can successfully indicate any peripheral intrusion. The intrusion detection system (IDS) is a comprehensive and simple approach to detecting and analyzing any signal that is weakened and garbled due to a low signal-to-noise ratio (SNR). This approach finds significant importance in applications like peripheral security systems.
Keywords: Data Acquisition, fast frequency transforms, LabVIEW software, weak signal detection.
1289 Text Mining Analysis of the Reconstruction Plans after the Great East Japan Earthquake
Authors: Minami Ito, Akihiro Iijima
Abstract:
On March 11, 2011, the Great East Japan Earthquake occurred off the coast of Sanriku, Japan. It is important to build a sustainable society through the reconstruction process rather than simply restoring the infrastructure. To compare the goals of the reconstruction plans of the quake-stricken municipalities, Japanese-language morphological analysis was performed using text mining techniques. Frequently used nouns were sorted into four main categories: “life”, “disaster prevention”, “economy”, and “harmony with environment”. Because Soma City was affected by the nuclear accident, sentences tagged as “harmony with environment” tended to be more frequent than in the other municipalities. Results from cluster analysis and principal component analysis clearly indicated that the local government reinforces its efforts to reduce risks from radiation exposure as a top priority.
Keywords: Eco-friendly reconstruction, harmony with environment, decontamination, nuclear disaster.
1288 A Comparison of Some Thresholding Selection Methods for Wavelet Regression
Authors: Alsaidi M. Altaher, Mohd T. Ismail
Abstract:
In wavelet regression, choosing the threshold value is a crucial issue. Too large a value cuts too many coefficients, resulting in oversmoothing. Conversely, too small a threshold value allows many coefficients to be included in the reconstruction, giving a wiggly estimate which results in undersmoothing. The proper choice of threshold can therefore be considered as a careful balance of these principles. This paper gives a very brief introduction to some threshold selection methods. These methods include: universal, SURE, EBayes, two-fold cross validation and level-dependent cross validation. A simulation study over a variety of sample sizes, test functions and signal-to-noise ratios is conducted to compare their numerical performance using three different noise structures. For Gaussian noise, EBayes outperforms the others in all cases for all of the functions used, while two-fold cross validation provides the best results in the case of long-tailed noise. For large signal-to-noise ratios, level-dependent cross validation works well in the correlated-noise case. As expected, increasing both the sample size and the signal-to-noise ratio increases the estimation efficiency.
Keywords: Wavelet regression, simulation, threshold.
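Of the rules reviewed, the universal (VisuShrink) threshold is the simplest to sketch; the example below applies it with soft thresholding in PyWavelets on a synthetic test function. The SURE, EBayes and cross-validation rules compared in the paper are not shown, and the wavelet and noise level are assumptions.
```python
# A minimal sketch of the universal threshold sigma*sqrt(2*log n) with soft thresholding.
import numpy as np
import pywt

rng = np.random.default_rng(0)
n = 1024
x = np.linspace(0, 1, n)
clean = np.sin(4 * np.pi * x) * (x > 0.3)          # a simple test function
noisy = clean + 0.2 * rng.standard_normal(n)

coeffs = pywt.wavedec(noisy, "sym8", level=5)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745     # noise estimate from finest details
thr = sigma * np.sqrt(2 * np.log(n))               # universal threshold
den_coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
estimate = pywt.waverec(den_coeffs, "sym8")[:n]

print("RMSE:", np.sqrt(np.mean((estimate - clean) ** 2)))
```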
1287 An Analytical Method to Analysis of Foam Drainage Problem
Authors: A. Nikkar, M. Mighani
Abstract:
In this study, a new reliable technique is used to handle the foam drainage equation. This new method results from the Variational Iteration Method (VIM) through a simple modification and is called the Reconstruction of Variational Iteration Method (RVIM). The drainage of liquid foams involves the interplay of gravity, surface tension, and viscous forces. Foaming occurs in many distillation and absorption processes. Results are compared with those of Adomian’s decomposition method (ADM). The comparisons show that the Reconstruction of Variational Iteration Method is very effective, overcomes the difficulties of traditional methods, and is quite accurate for systems of non-linear partial differential equations.
Keywords: Reconstruction of Variational Iteration Method (RVIM), Foam drainage, nonlinear partial differential equation.
1286 Review of the Software Used for 3D Volumetric Reconstruction of the Liver
Authors: P. Strakos, M. Jaros, T. Karasek, T. Kozubek, P. Vavra, T. Jonszta
Abstract:
In medical imaging, segmentation of different areas of the human body, such as bones, organs and tissues, is an important issue. Image segmentation allows isolating the object of interest for further processing, which can lead, for example, to 3D model reconstruction of whole organs. The difficulty of this procedure varies from trivial for bones to quite difficult for organs like the liver. The liver is considered one of the most difficult human organs to segment, mainly because of its complexity, shape versatility and the proximity of other organs and tissues. Due to these facts, substantial user effort usually has to be applied to obtain satisfactory image segmentation results, and the process of image segmentation then deteriorates from automatic or semi-automatic to a fairly manual one. In this paper, an overview of selected available software applications that can handle semi-automatic image segmentation with subsequent 3D volume reconstruction of the human liver is presented. The applications are evaluated based on the segmentation results for several consecutive DICOM images covering the abdominal area of the human body.
Keywords: Image segmentation, semi-automatic, software, 3D volumetric reconstruction.
1285 Algorithm of Measurement of Noise Signal Power in the Presence of Narrowband Interference
Authors: Alexey V. Klyuev, Valery P. Samarin, Viktor F. Klyuev
Abstract:
An algorithm for measuring the power of the components of an input mix of a noise signal and narrowband interference is considered, using functional transformations of the input mix in the post-detection processing channel. The efficiency of the algorithm has been analysed for different interference-to-signal ratios. The performance features of the algorithm have been explored through numerical experiments.
Keywords: Noise signal, continuous narrowband interference, signal power, spectrum width, detection.
1284 Detecting Abnormal ECG Signals Utilising Wavelet Transform and Standard Deviation
Authors: Dejan Stantic, Jun Jo
Abstract:
The ECG contains very important clinical information about the cardiac activity of the heart. Often the ECG signal needs to be captured for a long period of time in order to identify abnormalities in certain situations. Such a signal, apart from its large volume, is often characterised by low quality due to noise and other influences. In order to extract features, the ECG signal with its time-varying characteristics first needs to be preprocessed with the best parameters. It is also useful to identify specific parts of the long-lasting signal which contain abnormalities and to direct the practitioner to those parts of the signal. In this work we present a method based on the wavelet transform, the standard deviation and a variable threshold, which achieves 100% accuracy in identifying the ECG signal peaks and heartbeats as well as the standard deviation, providing a quick reference to abnormalities.
Keywords: Electrocardiogram-ECG, Arrhythmia, Signal Processing, Wavelet Transform, Standard Deviation
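A hedged sketch of such a pipeline (hypothetical parameters, not the authors' exact method): wavelet denoising, R-peak detection with a variable amplitude threshold, and a standard-deviation rule on the beat-to-beat intervals to flag abnormal beats.
```python
# A minimal sketch: wavelet-denoise the ECG, detect R-peaks with a variable threshold,
# and flag beats whose R-R interval deviates by more than k standard deviations.
import numpy as np
import pywt
from scipy.signal import find_peaks

def denoise(ecg, wavelet="db6", level=4):
    coeffs = pywt.wavedec(ecg, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thr = sigma * np.sqrt(2 * np.log(len(ecg)))
    coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(ecg)]

def abnormal_beats(ecg, fs, k=2.5):
    clean = denoise(ecg)
    height = np.mean(clean) + 2 * np.std(clean)        # variable amplitude threshold
    peaks, _ = find_peaks(clean, height=height, distance=int(0.25 * fs))
    rr = np.diff(peaks) / fs                           # beat-to-beat intervals (s)
    flags = np.abs(rr - rr.mean()) > k * rr.std()      # standard-deviation rule
    return peaks, rr, flags
```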
1283 Envelope Echo Signal of Metal Sphere in the Fresh Water
Authors: A. Mahfurdz, Sunardi, H. Ahmad
Abstract:
An envelope echo signal measurement is proposed in this paper using echo signal observations from a 200 kHz echo sounder receiver. The envelope signal without any object is compared with the envelope signal of a sphere. Two steel balls (3.1 cm and 2.2 cm in diameter) and two air-filled stainless steel balls (4.8 cm and 7.4 cm in diameter) were used in this experiment. The target was positioned about 0.5 m and 1.0 m from the transducer face using a nylon rope. From the echo observation in the time domain, it is clearly shown that the echo signal structure differs with the size, distance and type of metal sphere. The envelope amplitude voltage for the bigger sphere is higher than for the small sphere, confirming that the bigger sphere has a higher target strength than the small sphere. Although the signal structure without any object differs from the signal from the sphere, the reflected signal from the tank floor increases linearly with the sphere size. We consider that this happens because of the object's proximity to the tank floor.
Keywords: echo sounder, target strength, sphere, echo signal
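The envelope itself is commonly obtained as the magnitude of the analytic signal; the sketch below illustrates this on a synthetic echo burst rather than the 200 kHz echo sounder data used in the paper.
```python
# A minimal sketch: extract the echo envelope as the magnitude of the analytic signal.
import numpy as np
from scipy.signal import hilbert

fs, f0 = 1_000_000, 200_000                    # sample rate and carrier (Hz)
rng = np.random.default_rng(0)
t = np.arange(0, 0.002, 1 / fs)
burst = np.exp(-((t - 0.001) / 0.0001) ** 2)   # Gaussian-shaped echo envelope
echo = burst * np.sin(2 * np.pi * f0 * t) + 0.01 * rng.standard_normal(t.size)

envelope = np.abs(hilbert(echo))               # analytic-signal magnitude
print("peak envelope amplitude:", round(float(envelope.max()), 3))
```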
1282 Minimum Data of a Speech Signal as Special Indicators of Identification in Phonoscopy
Authors: Nazaket Gazieva
Abstract:
Voice biometric data, associated with physiological, psychological and other factors, are widely used in forensic phonoscopy. There are various methods for identifying and verifying a person by voice. This article explores the minimum speech signal data as individual parameters of a speech signal. Monozygotic twins are believed to be genetically identical. Using the minimum data of the speech signal, we came to the conclusion that the voice imprint of monozygotic twins is nevertheless individual. Based on the results of the experiment, we conclude that the minimum indicators of the speech signal are more stable and reliable for phonoscopic examinations.
Keywords: Biometric voice prints, fundamental frequency, phonogram, speech signal, temporal characteristics.
1281 Array Signal Processing: DOA Estimation for Missing Sensors
Authors: Lalita Gupta, R. P. Singh
Abstract:
Array signal processing involves signal enumeration and source localization. It is centered on the ability to fuse temporal and spatial information, captured by sampling signals emitted from a number of sources at the sensors of an array, in order to carry out a specific estimation task: estimation of source characteristics (mainly localization of the sources) and/or array characteristics (mainly array geometry). Array signal processing is a part of signal processing that uses sensors organized in patterns, or arrays, to detect signals and to determine information about them. Beamforming is a general signal processing technique used to control the directionality of the reception or transmission of a signal. Using beamforming, we can direct the majority of the signal energy we receive from a group of array sensors. Multiple signal classification (MUSIC) is a highly popular eigenstructure-based estimation method for the direction of arrival (DOA) with high resolution. This paper examines the effect of missing sensors on DOA estimation. The accuracy of MUSIC-based DOA estimation is degraded significantly both by the effects of missing sensors among the receiving array elements and by unequal channel gain and phase errors of the receiver.
Keywords: Array Signal Processing, Beamforming, ULA, Direction of Arrival, MUSIC
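A hedged sketch of MUSIC on a half-wavelength ULA with synthetic data: dropping rows of the snapshot matrix (and the matching sensor positions) mimics missing sensors, and re-running the estimator shows the degradation discussed in the abstract. Array geometry, source angles and noise level are illustrative assumptions.
```python
# A minimal MUSIC sketch on a uniform linear array with two sensors assumed missing.
import numpy as np
from scipy.signal import find_peaks

def steering(angles_deg, sensor_positions):
    # positions in wavelengths; angles measured from broadside
    theta = np.deg2rad(np.atleast_1d(angles_deg))
    return np.exp(2j * np.pi * np.outer(sensor_positions, np.sin(theta)))

def music_spectrum(X, sensor_positions, n_sources, scan=np.arange(-90, 90.5, 0.5)):
    R = X @ X.conj().T / X.shape[1]                 # sample covariance
    eigval, eigvec = np.linalg.eigh(R)
    En = eigvec[:, : X.shape[0] - n_sources]        # noise subspace (smallest eigenvalues)
    A = steering(scan, sensor_positions)
    p = 1.0 / np.sum(np.abs(En.conj().T @ A) ** 2, axis=0)
    return scan, p

rng = np.random.default_rng(0)
n_sensors, n_snapshots = 8, 200
positions = 0.5 * np.arange(n_sensors)              # half-wavelength ULA
doas = [-20.0, 25.0]
S = rng.standard_normal((2, n_snapshots)) + 1j * rng.standard_normal((2, n_snapshots))
X = steering(doas, positions) @ S
X += 0.1 * (rng.standard_normal(X.shape) + 1j * rng.standard_normal(X.shape))

working = [0, 1, 2, 4, 5, 7]                         # sensors 3 and 6 assumed missing
scan, p = music_spectrum(X[working], positions[working], n_sources=2)
peaks, _ = find_peaks(p)
top = peaks[np.argsort(p[peaks])[-2:]]
print("estimated DOAs:", np.sort(scan[top]))
```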
1280 Automatic 3D Reconstruction of Coronary Artery Centerlines from Monoplane X-ray Angiogram Images
Authors: Ali Zifan, Panos Liatsis, Panagiotis Kantartzis, Manolis Gavaises, Nicos Karcanias, Demosthenes Katritsis
Abstract:
We present a new method for the fully automatic 3D reconstruction of coronary artery centerlines, using two X-ray angiogram projection images from a single rotating monoplane acquisition system. During the first stage, the input images are smoothed using curve evolution techniques. Next, a simple yet efficient multiscale method based on the information in the Hessian matrix is introduced for the enhancement of the vascular structure. Hysteresis thresholding using different image quantiles is used to threshold the arteries. This stage is followed by a thinning procedure to extract the centerlines. The resulting skeleton image is then pruned using morphological and pattern recognition techniques to remove non-vessel-like structures. Finally, edge-based stereo correspondence is solved using a parallel evolutionary optimization method based on symbiosis. The detected 2D centerlines combined with disparity map information allow the reconstruction of the 3D vessel centerlines. The proposed method has been evaluated on patient data sets.
Keywords: Vessel enhancement, centerline extraction, symbiotic reconstruction.
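A hedged sketch of only the 2D enhancement and centerline steps, using scikit-image's Frangi filter as a Hessian-based vesselness measure, quantile-driven hysteresis thresholding and skeletonization; the pruning heuristics, stereo correspondence and 3D reconstruction stages of the paper are not shown, and all parameters are assumptions.
```python
# A minimal sketch of Hessian-based vessel enhancement, hysteresis thresholding at
# image quantiles, and skeleton (centerline) extraction for a 2D angiogram.
import numpy as np
from skimage.filters import frangi, apply_hysteresis_threshold
from skimage.morphology import skeletonize, remove_small_objects

def extract_centerlines(angiogram, sigmas=range(1, 6), q_low=0.90, q_high=0.98):
    vesselness = frangi(angiogram, sigmas=sigmas, black_ridges=True)  # dark vessels
    low, high = np.quantile(vesselness, [q_low, q_high])              # image quantiles
    vessels = apply_hysteresis_threshold(vesselness, low, high)
    vessels = remove_small_objects(vessels, min_size=50)              # crude pruning
    return skeletonize(vessels)
```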