Search results for: The Integral transform technique.

3567 Optimal Image Compression Based on Sign and Magnitude Coding of Wavelet Coefficients

Authors: Mbainaibeye Jérôme, Noureddine Ellouze

Abstract:

The wavelet transform is a very powerful tool for image compression. One of its advantages is that it provides both spatial and frequency localization of image energy. However, wavelet transform coefficients are defined by both a magnitude and a sign. While algorithms exist for efficiently coding the magnitude of the transform coefficients, they are not efficient for coding the sign. It is generally assumed that there is no compression gain to be obtained from coding the sign, and only recently have some authors begun to investigate the sign of wavelet coefficients in image coding. Some authors have assumed that the sign information bit of wavelet coefficients may be encoded with an estimated probability of 0.5; the same assumption is made for the refinement information bit. In this paper, we propose a new method for Separate Sign Coding (SSC) of wavelet image coefficients. The sign and the magnitude of the wavelet image coefficients are examined to obtain their online probabilities. We use scalar quantization, in which the information on whether a wavelet coefficient belongs to the lower or the upper sub-interval of the uncertainty interval is also examined. We show that the sign information and the refinement information may be encoded with a probability of approximately 0.5 only after about five bit planes. Two maps are entropy encoded separately: the sign map and the magnitude map. The refinement information indicating whether the wavelet coefficient belongs to the lower or the upper sub-interval of the uncertainty interval is also entropy encoded. An algorithm is developed and simulations are performed on three standard grey-scale images: Lena, Barbara and Cameraman. Five decomposition scales are computed using the biorthogonal 9/7 wavelet filter bank. The results are compared to the JPEG2000 standard in terms of peak signal-to-noise ratio (PSNR) for the three images and in terms of subjective (visual) quality. It is shown that the proposed method outperforms JPEG2000. The proposed method is also compared to other codecs in the literature and is shown to be very successful in terms of PSNR.
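
As a loose illustration of the separate sign statistics the method relies on (not the authors' SSC codec), the sketch below decomposes an image over five scales and estimates, per bit plane, the empirical probability that a significant coefficient is negative. The random image array and the use of PyWavelets' 'bior4.4' (its 9/7 filter bank) are assumptions.

```python
# Sketch (not the authors' codec): per-bit-plane sign statistics of
# wavelet coefficients, assuming PyWavelets' 'bior4.4' as the 9/7 bank.
import numpy as np
import pywt

img = np.random.rand(256, 256) * 255          # stand-in for a grey-scale image

# Five-scale 2-D decomposition, then pool all detail coefficients.
coeffs = pywt.wavedec2(img, 'bior4.4', level=5)
details = np.concatenate([band.ravel() for lvl in coeffs[1:] for band in lvl])

signs, mags = np.sign(details), np.abs(details)
T = 2 ** np.floor(np.log2(mags.max()))        # initial bit-plane threshold

for plane in range(6):
    significant = mags >= T                    # coefficients significant at this threshold
    if significant.any():
        p_neg = np.mean(signs[significant] < 0)
        print(f"bit plane {plane}: P(sign = -) = {p_neg:.3f}")
    T /= 2                                     # move to the next bit plane
```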

Keywords: Image compression, wavelet transform, sign coding, magnitude coding.

3566 Maya Semantic Technique: A Mathematical Technique Used to Determine Partial Semantics for Declarative Sentences

Authors: Marcia T. Mitchell

Abstract:

This research uses computational linguistics, an area of study that employs a computer to process natural language, and aims at discerning the patterns that exist in the declarative sentences used in technical texts. The approach is mathematical, and the focus is on instructional texts found on web pages. The technique developed by the author, named the MAYA Semantic Technique, is organized into four stages. In the first stage, the parts of speech in each sentence are identified. In the second stage, the subject of the sentence is determined. In the third stage, MAYA performs a frequency analysis on the remaining words to determine the verb and its object. In the fourth stage, MAYA performs a statistical analysis to determine the content of the web page. The advantage of the MAYA Semantic Technique lies in its use of mathematical principles to represent grammatical operations, which improves processing and accuracy when applied to unambiguous text. The MAYA Semantic Technique is part of a proposed architecture for an entire web-based intelligent tutoring system. On a sample set of sentences, the partial semantics derived using the MAYA Semantic Technique were approximately 80% accurate. The system currently processes technical text in one domain, namely Cµ programming, in which all the keywords and programming concepts are known and understood.
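
A hypothetical sketch of the four stages as described above, using NLTK in place of the actual (mathematical) MAYA machinery; the tagger choice and the subject/verb heuristics are illustrative assumptions only.

```python
# Hypothetical four-stage pipeline loosely following the description above;
# the heuristics here are illustrative, not the actual MAYA technique.
from collections import Counter
import nltk  # assumes 'punkt' and 'averaged_perceptron_tagger' are downloaded

def partial_semantics(sentence):
    # Stage 1: identify the parts of speech in the sentence.
    tagged = nltk.pos_tag(nltk.word_tokenize(sentence))
    # Stage 2: take the first noun/pronoun as the subject (crude heuristic).
    subject = next((w for w, t in tagged if t.startswith(('NN', 'PRP'))), None)
    # Stage 3: frequency analysis of the remaining words; the most frequent
    # verb is taken as the main verb, the last remaining noun as a crude object.
    rest = [(w, t) for w, t in tagged if w != subject]
    verbs = Counter(w for w, t in rest if t.startswith('VB'))
    verb = verbs.most_common(1)[0][0] if verbs else None
    nouns = [w for w, t in rest if t.startswith('NN')]
    obj = nouns[-1] if nouns else None
    return subject, verb, obj

# Stage 4 would aggregate these triples statistically over a whole page.
print(partial_semantics("The compiler translates the source code."))
```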

Keywords: Natural language understanding, computational linguistics, knowledge representation, linguistic theories.

3565 Scintigraphic Image Coding of Region of Interest Based On SPIHT Algorithm Using Global Thresholding and Huffman Coding

Authors: A. Seddiki, M. Djebbouri, D. Guerchi

Abstract:

Medical imaging produces pictures of the human body in digital form. Since these imaging techniques produce prohibitive amounts of data, compression is necessary for storage and communication purposes. Many current compression schemes provide a very high compression rate but with considerable loss of quality; on the other hand, in some areas of medicine it may be sufficient to maintain high image quality only in the region of interest (ROI). This paper discusses a contribution to lossless compression in the region of interest of scintigraphic images, based on the SPIHT algorithm and global transform thresholding using Huffman coding.
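
The Huffman stage can be illustrated on its own, independently of the ROI/SPIHT pipeline; a minimal construction over symbol frequencies (the symbol alphabet below is an assumption):

```python
# Minimal Huffman code construction (illustrating only the entropy-coding
# stage mentioned above, not the full ROI/SPIHT pipeline).
import heapq
from collections import Counter

def huffman_code(symbols):
    freq = Counter(symbols)
    # Heap of (weight, tie-breaker, {symbol: codeword}) entries.
    heap = [(w, i, {s: ''}) for i, (s, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)   # two lightest subtrees merge,
        w2, _, c2 = heapq.heappop(heap)   # prefixing '0' and '1' codewords
        merged = {s: '0' + c for s, c in c1.items()}
        merged.update({s: '1' + c for s, c in c2.items()})
        heapq.heappush(heap, (w1 + w2, tie, merged))
        tie += 1
    return heap[0][2]

data = [0, 0, 0, 1, 1, 2, 3, 0, 1, 0]     # e.g. quantized coefficient labels
codes = huffman_code(data)
print(codes, sum(len(codes[s]) for s in data), "bits")
```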

Keywords: Global Thresholding Transform, Huffman Coding, Region of Interest, SPIHT Coding, Scintigraphic images.

3564 An Artificial Intelligent Technique for Robust Digital Watermarking in Multiwavelet Domain

Authors: P. Kumsawat, K. Pasitwilitham, K. Attakitmongcol, A. Srikaew

Abstract:

In this paper, an artificial intelligence technique for robust digital image watermarking in the multiwavelet domain is proposed. The embedding technique is based on quantization index modulation, and the watermark extraction process does not require the original image. We have developed an optimization technique using genetic algorithms to search for optimal quantization steps in order to improve the quality of the watermarked image and the robustness of the watermark. In addition, we construct a prediction model based on image moments and a back-propagation neural network to correct an attacked image geometrically before the watermark extraction process begins. The experimental results show that the proposed watermarking algorithm yields a watermarked image with good imperceptibility and a watermark that is very robust against various image processing attacks.
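
The quantization index modulation embedding rule itself can be sketched compactly; the coefficients and the step size below are placeholders (in the paper, the step is what the genetic algorithm tunes), and the GA and neural-network stages are not reproduced.

```python
# Sketch of quantization index modulation (QIM): embed one bit per
# coefficient by quantizing onto one of two interleaved lattices.
import numpy as np

def qim_embed(x, bits, delta):
    # The lattice for bit b is offset by b*delta/2.
    return delta * np.round((x - bits * delta / 2) / delta) + bits * delta / 2

def qim_extract(y, delta):
    # Pick the lattice (bit value) whose nearest point is closer to y.
    d0 = np.abs(y - delta * np.round(y / delta))
    y1 = delta * np.round((y - delta / 2) / delta) + delta / 2
    d1 = np.abs(y - y1)
    return (d1 < d0).astype(int)

coeffs = np.random.randn(8) * 10        # stand-in multiwavelet coefficients
bits = np.random.randint(0, 2, 8)
delta = 2.0                              # placeholder quantization step
watermarked = qim_embed(coeffs, bits, delta)
assert (qim_extract(watermarked, delta) == bits).all()
```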

Keywords: Watermarking, Multiwavelet, Quantization index modulation, Genetic algorithms, Neural networks.

3563 Dynamic Economic Dispatch Using Glowworm Swarm Optimization Technique

Authors: K. C. Meher, R. K. Swain, C. K. Chanda

Abstract:

This paper gives an intuition regarding the glowworm swarm optimization (GSO) technique for solving dynamic economic dispatch (DED) problems of thermal generating units. The objective of the problem is to schedule the optimal power generation of the dedicated thermal units over a specific time band. In this study, the glowworm swarm optimization technique enables a swarm of agents to split into subgroups, exhibit simultaneous taxis towards each other, and rendezvous at multiple optima (not necessarily equal) of a given multimodal function. The feasibility of the GSO method has been tested on a ten-unit test system in which the power balance constraints, operating limits, valve-point effects, and ramp rate limits are taken into account. The results obtained by the proposed technique are compared with those of other heuristic techniques and show that GSO is capable of producing better results.
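
The core of GSO, following Krishnanand and Ghose's standard formulation rather than the authors' dispatch-specific implementation, is a luciferin update plus a probabilistic move towards brighter neighbours; a minimal sketch on a toy multimodal function (all constants are assumptions):

```python
# Minimal glowworm swarm optimization sketch (standard GSO update rules,
# not the paper's DED-specific implementation).
import numpy as np

rng = np.random.default_rng(0)
J = lambda x: np.sin(x[..., 0]) + np.cos(x[..., 1])   # toy multimodal objective

n, dim, steps = 50, 2, 200
rho, gamma, s, r_d = 0.4, 0.6, 0.03, 3.0              # decay, gain, step, range
X = rng.uniform(-3, 3, (n, dim))                       # glowworm positions
L = np.full(n, 5.0)                                    # luciferin levels

for _ in range(steps):
    L = (1 - rho) * L + gamma * J(X)                   # luciferin update
    for i in range(n):
        d = np.linalg.norm(X - X[i], axis=1)
        nbrs = np.where((d < r_d) & (L > L[i]))[0]     # brighter, within range
        if len(nbrs) == 0:
            continue
        p = (L[nbrs] - L[i]) / (L[nbrs] - L[i]).sum()  # move probabilities
        j = rng.choice(nbrs, p=p)
        X[i] += s * (X[j] - X[i]) / (d[j] + 1e-12)     # unit step towards neighbour

print("best objective found:", J(X).max())
```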

Keywords: Dynamic economic dispatch, Glowworm swarm optimization, Luciferin, Valve–point loading effect, Ramp rate limits.

3562 Extracting Tongue Shape Dynamics from Magnetic Resonance Image Sequences

Authors: María S. Avila-García, John N. Carter, Robert I. Damper

Abstract:

An important problem in speech research is the automatic extraction of information about the shape and dimensions of the vocal tract during real-time speech production. We have previously developed Southampton dynamic magnetic resonance imaging (SDMRI) as an approach to the solution of this problem. However, the SDMRI images are very noisy, so shape extraction is a major challenge. In this paper, we address the problem of tongue shape extraction, which poses difficulties because the tongue is a highly deforming, non-parametric shape. We show that combining active shape models with the dynamic Hough transform allows the tongue shape to be tracked reliably in the image sequence.

Keywords: Vocal tract imaging, speech production, active shape models, dynamic Hough transform, object tracking.

3561 An Efficient Gaussian Noise Removal Image Enhancement Technique for Gray Scale Images

Authors: V. Murugan, R. Balasubramanian

Abstract:

Image enhancement is a challenging issue in many applications, and various filters have been developed over the last two decades. This paper proposes a novel method that removes Gaussian noise from grey-scale images. The proposed technique is compared with the Enhanced Fuzzy Peer Group Filter (EFPGF) at various noise levels. Experimental results show that the proposed filter achieves a better peak signal-to-noise ratio (PSNR) than the existing techniques, with a gain of 1.736 dB over the EFPGF technique.
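
PSNR, the figure of merit quoted above, is straightforward to compute for 8-bit grey-scale images; the arrays below are placeholders:

```python
# PSNR between an original and a processed 8-bit grey-scale image.
import numpy as np

def psnr(original, processed):
    mse = np.mean((original.astype(float) - processed.astype(float)) ** 2)
    return float('inf') if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)

clean = np.random.randint(0, 256, (64, 64), dtype=np.uint8)       # placeholder
noisy = np.clip(clean + np.random.normal(0, 10, clean.shape), 0, 255)
print(f"PSNR: {psnr(clean, noisy):.2f} dB")
```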

Keywords: Gaussian noise, adaptive bilateral filter, fuzzy peer group filter, switching bilateral filter, PSNR

3560 A Normalization-based Robust Watermarking Scheme Using Zernike Moments

Authors: Say Wei Foo, Qi Dong

Abstract:

Digital watermarking has become an important technique for copyright protection, but its robustness against attacks remains a major problem. In this paper, we propose a normalization-based robust image watermarking scheme. In the proposed scheme, the original host image is first normalized to a standard form, and the Zernike transform is then applied to the normalized image to calculate Zernike moments. Dither modulation is adopted to quantize the magnitudes of the Zernike moments according to the watermark bit stream, and the watermark extraction method is blind. Security analysis and false-alarm analysis are then performed. The quality degradation of the watermarked image caused by the embedded watermark is visually transparent, and experimental results show that the proposed scheme is highly robust against various image processing operations and geometric attacks.

Keywords: Image watermarking, Image normalization, Zernike moments, Robustness.

3559 Research on the Layout of Ground Control Points in Plain area 1:10000 DLG Production Using POS Technique

Authors: Dong Ming, Chen Haipeng

Abstract:

The POS (also called DGPS/IMU) technique can obtain the exterior orientation elements of an aerial photo, so aerial triangulation and DLG production using POS can save a large number of ground control points (GCPs); this improves the production efficiency of DLG and reduces the cost of collecting GCPs. This paper mainly researches the layout of GCPs in the production of 1:10 000 scale DLG using the POS technique. We designed 23 ground control point distribution schemes and used the integrated sensor orientation method to carry out the triangulation experiments; based on the triangulation results, we produced a map at the scale of 1:10 000 and tested its accuracy. From these experiments, the paper puts forward appropriate GCP distribution schemes and lays the groundwork for the application of the POS technique to photogrammetric 4D data production.

Keywords: POS, IMU, DGPS, DLG, ground control point, triangulation

3558 Development of a Basic Robot System for Medical and Nursing Care for Patients with Glaucoma

Authors: Naoto Suzuki

Abstract:

Medical methods that can completely cure glaucoma have yet to be developed; therefore, ophthalmologists manage patients mainly to delay disease progression. Patients with glaucoma are mainly elderly individuals, and equipment in their homes that can provide medical treatment and care can relieve their families of the burden of care. To enable elderly people with glaucoma to live by themselves as much as possible, we developed a support robot with five functions: care of the elderly, ophthalmological examination, trip assistance in the neighborhood, medical treatment, and data referral to a hospital. The medical and nursing care robot should approach from within the visual field that the patient can see, at a speed suited to the patient's eyesight, because the robot would be dangerous if it approached from the part of the visual field that the patient cannot see. We experimentally developed a robot that brings a white cane to elderly people with glaucoma. The base of the robot is a carriage, a Megarover 1.1, fitted with two infrared sensors; the robot moves along a white line on the floor using these sensors and has a special arm, which does not use electricity, that can scoop up the block attached to the white cane. Next, we developed a direction detector composed of a charge-coupled device camera (SVR41ResucueHD; Sun Mechatronics), goggles (MG-277MLF; Midori Anzen Co. Ltd.), and biconvex lenses with a focal length of 25 mm (Edmund Co.). Some young people were photographed wearing the direction detector, and image processing was performed using Scilab 6.1.0 and Image Processing and Computer Vision Toolbox 4.1.2. To measure each person's line of vision, we calculated the center of gravity of the iris using five processes: reduction, trimming, binarization or gray scale, edge extraction, and the Hough transform. Comparing binarization with gray-scale processing, binarization gave the better results. For edge extraction, we compared five methods: Sobel, Prewitt, Laplacian of Gaussian, fast Fourier transform, and Canny; the Canny method was the optimal extraction method. Finally, we performed the Hough transform to search for the main coordinates from the edge of the iris and found that it could calculate the center point of the iris.
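
The iris-localization pipeline described above (reduction, trimming, binarization, Canny edges, Hough transform) can be sketched with OpenCV in place of Scilab; the file name, the trimming box and all threshold values are placeholder assumptions:

```python
# Sketch of the iris-centre pipeline described above, using OpenCV in
# place of Scilab; 'eye.png' and all parameter values are placeholders.
import cv2

gray = cv2.imread('eye.png', cv2.IMREAD_GRAYSCALE)
small = cv2.resize(gray, None, fx=0.5, fy=0.5)               # reduction
roi = small[20:120, 30:160]                                  # trimming (placeholder box)
_, binary = cv2.threshold(roi, 60, 255, cv2.THRESH_BINARY)   # binarization
edges = cv2.Canny(binary, 50, 150)                           # Canny edge extraction

# Circular Hough transform to locate the iris centre from the edge map.
circles = cv2.HoughCircles(edges, cv2.HOUGH_GRADIENT, dp=1, minDist=50,
                           param1=150, param2=15, minRadius=10, maxRadius=40)
if circles is not None:
    x, y, r = circles[0, 0]
    print(f"iris centre ({x:.0f}, {y:.0f}), radius {r:.0f}")
```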

Keywords: Glaucoma, support robot, elderly people, Hough transform, direction detector, line of vision.

3557 An Approach Based on Statistics and Multi-Resolution Representation to Classify Mammograms

Authors: Nebi Gedik

Abstract:

One of the most significant and persistent public health problems in the world is breast cancer. Early detection is very important in fighting the disease, and mammography has been one of the most common and reliable methods for detecting it in its early stages. However, reading mammograms is a difficult task, and computer-aided diagnosis (CAD) systems are needed to assist radiologists in providing an accurate and uniform evaluation of masses in mammograms. In this study, a multiresolution statistical method for classifying digitized mammograms as normal or abnormal is used to construct a CAD system. The mammogram images are represented by the wave atom transform, and this representation is partitioned into certain groups of coefficients that are treated independently. The CAD system is designed by calculating statistical features from each group of coefficients, and classification is performed using a support vector machine (SVM).

Keywords: Wave atom transform, statistical features, multi-resolution representation, mammogram.

3556 Increasing Convergence Rate of a Fractionally-Spaced Channel Equalizer

Authors: Waseem Khan

Abstract:

In this paper, a technique for increasing the convergence rate of a fractionally spaced channel equalizer is proposed. Instead of symbol-spaced updating of the equalizer filter, a mechanism has been devised to update the filter at a higher rate. This makes the equalizer filter converge faster and is therefore less time-consuming. The proposed technique has been simulated and tested for two-ray channel models with various delay spreads, including minimum-phase and non-minimum-phase channels. Simulation results suggest that the proposed technique outperforms the conventional technique of symbol-spaced updating of the equalizer filter.
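
The baseline the paper accelerates can be sketched as a conventional T/2-spaced LMS equalizer: the taps are spaced at half a symbol, but the update here still happens once per symbol; the two-ray channel, step size and decision delay are assumptions.

```python
# Conventional T/2-spaced LMS equalizer: fractionally spaced taps with
# symbol-spaced updates (the baseline the paper's technique improves on).
import numpy as np

rng = np.random.default_rng(1)
symbols = rng.choice([-1.0, 1.0], 5000)               # BPSK training symbols
x = np.zeros(2 * len(symbols)); x[::2] = symbols      # 2 samples per symbol
h = np.array([1.0, 0.0, 0.45])                        # two-ray channel model
r = np.convolve(x, h)[:len(x)] + 0.01 * rng.standard_normal(len(x))

taps, mu = np.zeros(8), 0.01
err = []
for k in range(4, len(symbols)):
    window = r[2 * k - 7:2 * k + 1][::-1]             # 8 most recent T/2 samples
    y = taps @ window                                  # symbol-rate output
    e = symbols[k - 1] - y                             # assumed 1-symbol delay
    taps += mu * e * window                            # LMS update, once per symbol
    err.append(e * e)
print("final MSE:", np.mean(err[-500:]))
```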

Keywords: Channel equalization, Fractionally-spaced equalizer

3555 New Wavelet Indices to Assess Muscle Fatigue during Dynamic Contractions

Authors: González-Izal M., Rodríguez-Carreño I., Mallor-Giménez F., Malanda A., Izquierdo M.

Abstract:

The purpose of this study was to evaluate and compare new indices based on the discrete wavelet transform with other spectral parameters proposed in the literature, such as the mean average voltage, the median frequency and ratios between spectral moments, for estimating acute exercise-induced changes in power output, i.e., for assessing peripheral muscle fatigue during a dynamic fatiguing protocol. Fifteen trained subjects performed five sets of ten leg presses, with two minutes of rest between sets. Surface electromyography (sEMG) was recorded from the vastus medialis (VM) muscle. Several sEMG parameters were compared for detecting peripheral muscle fatigue: the mean average voltage (MAV), the median spectral frequency (Fmed), the Dimitrov spectral index of muscle fatigue (FInsm5), and five further parameters obtained from the discrete wavelet transform (DWT) as ratios between different scales. The new wavelet indices achieved the best Pearson correlation coefficients with the changes in power output during acute dynamic contractions, and their regressions were significantly different from those of MAV and Fmed. They also showed the highest robustness in the presence of additive white Gaussian noise at different signal-to-noise ratios (SNRs). Therefore, peripheral impairments assessed by sEMG wavelet indices may be a relevant factor involved in the loss of power output after a dynamic high-loading fatiguing task.
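
An illustrative index of this family (a generic scale-energy ratio, not necessarily one of the five proposed): the ratio of low- to high-frequency DWT scale energies of an sEMG epoch, which grows as fatigue shifts spectral content downwards. The synthetic signal and the wavelet choice are assumptions.

```python
# Illustrative DWT scale-energy ratio for an sEMG epoch (a generic index
# of this family, not necessarily one of the paper's five).
import numpy as np
import pywt

rng = np.random.default_rng(2)
emg = rng.standard_normal(1024)                 # stand-in sEMG epoch

coeffs = pywt.wavedec(emg, 'db5', level=5)      # cA5, cD5, ..., cD1
energies = [np.sum(c ** 2) for c in coeffs]

# Low-frequency scales (cA5 + cD5) over high-frequency scales (cD1 + cD2):
# this ratio rises as muscle fatigue shifts sEMG energy downwards.
index = (energies[0] + energies[1]) / (energies[-1] + energies[-2])
print(f"low/high scale-energy ratio: {index:.3f}")
```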

Keywords: Median Frequency, EMG, wavelet transform, muscle fatigue

3554 A Novel Compression Algorithm for Electrocardiogram Signals based on Wavelet Transform and SPIHT

Authors: Sana Ktata, Kaïs Ouni, Noureddine Ellouze

Abstract:

An electrocardiogram (ECG) data compression algorithm is needed that will reduce the amount of data to be transmitted, stored and analyzed without losing the clinical information content. A wavelet ECG data codec based on the Set Partitioning In Hierarchical Trees (SPIHT) compression algorithm is proposed in this paper. The SPIHT algorithm has achieved notable success in still image coding; we modified the algorithm for the one-dimensional (1-D) case and applied it to the compression of ECG data. This compression method achieves a small percent root-mean-square difference (PRD) and a high compression ratio with low implementation complexity. Experiments on selected records from the MIT-BIH arrhythmia database revealed that the proposed codec is significantly more efficient in compression and in computation than previously proposed ECG compression schemes. Compression ratios of up to 48:1 for ECG signals lead to acceptable results under visual inspection.
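
The PRD figure quoted above can be computed as follows (a common unnormalized variant; mean-subtracted definitions also exist, and the signals below are placeholders):

```python
# Percent root-mean-square difference (PRD) between an ECG record and its
# reconstruction (unnormalized variant; mean-subtracted forms also exist).
import numpy as np

def prd(original, reconstructed):
    num = np.sum((original - reconstructed) ** 2)
    return 100.0 * np.sqrt(num / np.sum(original ** 2))

ecg = np.sin(np.linspace(0, 20 * np.pi, 2000))          # stand-in ECG record
recon = ecg + np.random.normal(0, 0.01, ecg.shape)      # stand-in decoder output
print(f"PRD: {prd(ecg, recon):.3f} %")
```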

Keywords: Discrete Wavelet Transform, ECG compression, SPIHT.

3553 A Dual Digital-Image Watermarking Technique

Authors: Maha Sharkas, Dahlia ElShafie, Nadder Hamdy

Abstract:

Image watermarking has become an important tool for intellectual property protection and authentication. In this paper, a watermarking technique is suggested that incorporates two watermarks in a host image for improved protection and robustness. A watermark in the form of a PN sequence (called the secondary watermark) is embedded in the wavelet domain of a primary watermark before the latter is embedded in the host image. The technique has been tested using the Lena image as the host and the Cameraman image as the primary watermark. The embedded PN sequence was detectable through correlation among five other sequences, and a PSNR of 44.1065 dB was measured. Furthermore, to test the robustness of the technique, the watermarked image was exposed to four types of attacks, namely compression, low-pass filtering, salt-and-pepper noise and luminance change. In all cases the secondary watermark was easy to detect even when the primary one was severely distorted.
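
The correlation detection of a PN-sequence watermark can be sketched as follows, embedding into one DWT detail sub-band with a placeholder strength; this is the generic spread-spectrum step, not the authors' exact dual-watermark scheme.

```python
# Sketch: embed a PN sequence into a DWT detail sub-band and detect it by
# correlation against candidate sequences (placeholder strength alpha; not
# the paper's exact dual-watermark scheme).
import numpy as np
import pywt

rng = np.random.default_rng(3)
image = rng.random((128, 128)) * 255               # stand-in carrier image

cA, (cH, cV, cD) = pywt.dwt2(image, 'haar')
pn = rng.choice([-1.0, 1.0], cD.size)              # the embedded PN sequence
alpha = 2.0                                         # embedding strength (placeholder)
cD_marked = cD + alpha * pn.reshape(cD.shape)

# Detection: correlate the marked sub-band with six candidate sequences;
# the embedded one (index 0) should stand out clearly.
candidates = [pn] + [rng.choice([-1.0, 1.0], pn.size) for _ in range(5)]
scores = [np.dot(cD_marked.ravel(), c) / c.size for c in candidates]
print("correlation scores:", np.round(scores, 3))
```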

Keywords: DWT, image watermarking, watermarking techniques, wavelets.

3552 Transform to Succeed: An Empirical Analysis of Digital Transformation in Firms

Authors: Sarah E. Stief, Anne Theresa Eidhoff, Markus Voeth

Abstract:

Despite all progress, firms face an increasing need to adapt and assimilate digital technologies in order to transform their business activities and pursue business development. By using new digital technologies, firms can implement major business improvements, stay competitive and foster new growth potential. The corresponding phenomenon of digital transformation has received some attention in previous literature with respect to industries such as media and publishing; nevertheless, there is a lack of understanding of the concept and of how it is organized within firms. With the help of twenty-three in-depth field interviews with German experts responsible for their companies' digital transformation, we examined what digital transformation encompasses, how it is organized, and which opportunities and challenges arise within firms. Our results indicate that digital transformation is an inevitable task for all firms, as it bears the potential to comprehensively optimize and reshape established business activities and can thus be seen as a strategy of business development.

Keywords: Business development, digitalization, digital strategies, digital transformation.

3551 Performance Analysis of Chrominance Red and Chrominance Blue in JPEG

Authors: Mamta Garg

Abstract:

While compressing text files is useful, compressing still-image files is almost a necessity. A typical image takes up much more storage than a typical text message, and without compression, images would be extremely clumsy to store and distribute; the amount of information required to store pictures on modern computers is large relative to the bandwidth commonly available to transmit them over the Internet. Image compression addresses the problem of reducing the amount of data required to represent a digital image, and the performance of any image compression method can be evaluated by measuring the root-mean-square error and the peak signal-to-noise ratio. The method analyzed in this paper is based on the lossy JPEG image compression technique, the most popular compression technique for color images. JPEG compression is able to greatly reduce file size with minimal image degradation by discarding the least "important" information. In standard JPEG, both chroma components are downsampled simultaneously; in this paper we compare the results when the compression downsamples only a single chroma component. We demonstrate that a higher compression ratio is achieved when chrominance blue is downsampled than when chrominance red is downsampled, but that the peak signal-to-noise ratio is higher when chrominance red is downsampled. In particular, we use hats.jpg as a demonstration of JPEG compression using a low-pass filter and show that the image is compressed with barely any visible difference under either method.
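
The single-chroma experiment can be reproduced in outline: convert to YCbCr, 2x2-average only Cb (or only Cr), reconstruct, and compare PSNR. The sketch below uses the BT.601 conversion, and a random array stands in for hats.jpg.

```python
# Outline of the single-chroma experiment: downsample only Cb or only Cr
# by 2x2 averaging and compare reconstruction PSNR (BT.601 conversion;
# the random array stands in for hats.jpg).
import numpy as np

def psnr(a, b):
    mse = np.mean((a - b) ** 2)
    return 10 * np.log10(255.0 ** 2 / mse)

def down_up(ch):                                   # 2x2 average, then repeat
    small = ch.reshape(ch.shape[0] // 2, 2, ch.shape[1] // 2, 2).mean(axis=(1, 3))
    return small.repeat(2, axis=0).repeat(2, axis=1)

rgb = np.random.rand(128, 128, 3) * 255
R, G, B = rgb[..., 0], rgb[..., 1], rgb[..., 2]
Y = 0.299 * R + 0.587 * G + 0.114 * B
Cb, Cr = 128 + 0.564 * (B - Y), 128 + 0.713 * (R - Y)

for name, cb, cr in [("Cb downsampled", down_up(Cb), Cr),
                     ("Cr downsampled", Cb, down_up(Cr))]:
    R2 = Y + 1.403 * (cr - 128)                    # invert the BT.601 mapping
    B2 = Y + 1.773 * (cb - 128)
    G2 = (Y - 0.299 * R2 - 0.114 * B2) / 0.587
    rec = np.clip(np.stack([R2, G2, B2], axis=-1), 0, 255)
    print(name, f"PSNR = {psnr(rgb, rec):.2f} dB")
```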

Keywords: JPEG, Discrete Cosine Transform, Quantization, Color Space Conversion, Image Compression, Peak Signal to Noise Ratio & Compression Ratio.

3550 A System of Automatic Speech Recognition based on the Technique of Temporal Retiming

Authors: Samir Abdelhamid, Noureddine Bouguechal

Abstract:

We report in this paper the procedure of a system of automatic speech recognition based on dynamic programming techniques. Temporal retiming is a technique used to synchronize two forms that are to be compared; we show how this technique is adapted to the field of automatic speech recognition. We first present the theory of the retiming function, which is used to compare and align an unknown form with a set of reference forms constituting the vocabulary of the application. We then give the various algorithms necessary for its implementation on a machine. The algorithms presented were tested on part of the Arabic-language word corpus Arabdic-10 [4] and gave full satisfaction. These algorithms are effective insofar as they are applied to small or medium vocabularies.
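
What the authors call temporal retiming is, in essence, the dynamic-programming alignment known elsewhere as dynamic time warping; a minimal sketch of the recurrence on two synthetic feature sequences:

```python
# Minimal dynamic-programming alignment ("temporal retiming", elsewhere
# known as dynamic time warping) between two feature sequences.
import numpy as np

def dtw_distance(a, b):
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])            # local distance
            D[i, j] = cost + min(D[i - 1, j],          # insertion
                                 D[i, j - 1],          # deletion
                                 D[i - 1, j - 1])      # match
    return D[n, m]

unknown = np.sin(np.linspace(0, 3, 40))                # unknown form
reference = np.sin(np.linspace(0, 3, 55))              # reference form (slower)
print(f"retiming distance: {dtw_distance(unknown, reference):.3f}")
```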

Keywords: Continuous speech recognition, temporal retiming, phonetic decoding, algorithms, vocal signal, dynamic programming.

3549 A Simple Chemical Precipitation Method of Titanium Dioxide Nanoparticles Using Polyvinyl Pyrrolidone as a Capping Agent and Their Characterization

Authors: V. P. Muhamed Shajudheen, K. Viswanathan, K. Anitha Rani, A. Uma Maheswari, S. Saravana Kumar

Abstract:

In this paper, a simple chemical precipitation route for the preparation of titanium dioxide nanoparticles, synthesized using titanium tetraisopropoxide as a precursor and polyvinyl pyrrolidone (PVP) as a capping agent, is reported. Differential scanning calorimetry (DSC) and thermogravimetric analysis (TGA) of the samples were recorded, and the temperature of the phase transformation from titanium hydroxide, Ti(OH)4, to titanium dioxide, TiO2, was investigated. The as-prepared Ti(OH)4 precipitate was annealed at 800°C to obtain TiO2 nanoparticles. The thermal, structural, morphological and textural characteristics of the TiO2 nanoparticle samples were examined by DSC-TGA, X-ray diffraction (XRD), Fourier transform infrared spectroscopy (FTIR), micro-Raman spectroscopy, UV-visible absorption spectroscopy (UV-Vis), photoluminescence spectroscopy (PL) and field emission scanning electron microscopy (FESEM). The DSC-TGA characterization of the as-prepared precipitate confirmed a mass loss of around 30%. The XRD results exhibited no diffraction peaks attributable to the anatase phase in the reaction products after solvent removal, indicating that the product is purely rutile. The vibrational frequencies of the two main absorption bands of the prepared samples are discussed on the basis of the FTIR analysis. The formation of nanospheres with diameters of the order of 10 nm was confirmed by FESEM. The optical band gap was determined from the UV-visible spectrum, and a strong emission was observed in the photoluminescence spectra. The obtained results suggest that this method provides a simple, efficient and versatile technique for preparing TiO2 nanoparticles and that it has the potential to be applied to other systems for photocatalytic activity.

Keywords: TiO2 nanoparticles, chemical precipitation route, phase transition, Fourier transform infrared spectroscopy, micro-Raman spectroscopy, UV-visible absorption spectroscopy, photoluminescence spectroscopy, field emission scanning electron microscopy.

3548 Average Current Estimation Technique for Reliability Analysis of Multiple Semiconductor Interconnects

Authors: Ki-Young Kim, Jae-Ho Lim, Deok-Min Kim, Seok-Yoon Kim

Abstract:

Average current analysis, which checks the impact of current flow, is very important for guaranteeing the reliability of semiconductor systems. As semiconductor process technologies improve, the coupling capacitances often become larger than the self-capacitances. In this paper, we propose an analytic technique for analyzing the average current on interconnects in multi-conductor structures. The proposed technique has been shown to yield acceptable errors compared to HSPICE results while providing computational efficiency.

Keywords: current moment, interconnect modeling, reliability analysis, worst-case switching

3547 Dynamic Web-Based 2D Medical Image Visualization and Processing Software

Authors: Abdelhalim. N. Mohammed, Mohammed. Y. Esmail

Abstract:

In recent decades, medical imaging has been dominated by the use of costly film media for the review and archival of medical investigations. However, owing to developments in network technologies and the common acceptance of the Digital Imaging and Communications in Medicine (DICOM) standard, another approach based on the World Wide Web has emerged. Web technologies have been used successfully in telemedicine applications, and here the combination of web technologies with DICOM is used to design a web-based, open-source DICOM viewer. The web server allows the query and retrieval of images, and the images are viewed and manipulated inside a web browser without the need to preinstall any software. The dynamic page for medical image visualization and processing was created using JavaScript and HTML5, and XAMPP (an Apache distribution) is used to create a local web server for testing and deployment of the dynamic site. The web-based viewer is connected to multiple devices through a local area network (LAN) to distribute the images inside healthcare facilities. The system offers several advantages over ordinary picture archiving and communication systems (PACS): it is easy to install and maintain, it is platform-independent, it allows images to be displayed and manipulated efficiently, and it is user-friendly and easy to integrate with existing systems that already make use of web technologies. A wavelet-based image compression technique is used, in which the 2-D discrete wavelet transform decomposes the image and the wavelet coefficients are thresholded and then transmitted with entropy encoding to decrease transmission time, storage cost and capacity. The compression performance was estimated using image quality metrics such as the mean square error (MSE), the peak signal-to-noise ratio (PSNR) and the compression ratio (CR), which reached 83.86% when the 'coif3' wavelet filter was used.

Keywords: DICOM, discrete wavelet transform, PACS, HIS, LAN.

3546 Statistical Wavelet Features, PCA, and SVM Based Approach for EEG Signals Classification

Authors: R. K. Chaurasiya, N. D. Londhe, S. Ghosh

Abstract:

The study of the electrical signals produced by the neural activity of the human brain is called electroencephalography (EEG). In this paper, we propose an automatic and efficient EEG signal classification approach that classifies an EEG signal into one of two classes: epileptic seizure or non-seizure. In the proposed approach, we start by extracting features with the discrete wavelet transform (DWT), which decomposes the EEG signals into sub-bands. These features, extracted from the detail and approximation coefficients of the DWT sub-bands, are used as input to principal component analysis (PCA). The classification is based on reducing the feature dimension using PCA and deriving the support vectors using a support vector machine (SVM). The experiments are performed on a real, standard dataset, and a very high level of classification accuracy is obtained.
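
The feature-extraction and classification chain can be sketched end to end with PyWavelets and scikit-learn; the synthetic two-class signals, the wavelet choice and the feature set below are assumptions standing in for the real EEG dataset.

```python
# Sketch of the DWT -> statistical features -> PCA -> SVM chain; synthetic
# signals stand in for the real EEG dataset.
import numpy as np
import pywt
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(4)

def features(sig):
    # Mean absolute value and standard deviation of every DWT sub-band.
    coeffs = pywt.wavedec(sig, 'db4', level=4)
    return np.array([f(np.abs(c)) for c in coeffs for f in (np.mean, np.std)])

# Two synthetic classes: background noise vs. noise plus a strong rhythm.
normal = [rng.standard_normal(512) for _ in range(100)]
seizure = [rng.standard_normal(512) + 3 * np.sin(np.linspace(0, 60, 512))
           for _ in range(100)]
X = np.array([features(s) for s in normal + seizure])
y = np.array([0] * 100 + [1] * 100)

clf = make_pipeline(StandardScaler(), PCA(n_components=5), SVC(kernel='rbf'))
clf.fit(X[::2], y[::2])                               # even rows for training
print("accuracy:", clf.score(X[1::2], y[1::2]))       # odd rows for testing
```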

Keywords: Discrete Wavelet Transform, Electroencephalogram, Pattern Recognition, Principal Component Analysis, Support Vector Machine.

3545 Hybrid Method Using Wavelets and Predictive Method for Compression of Speech Signal

Authors: Karima Siham Aoubid, Mohamed Boulemden

Abstract:

The development of signal compression algorithms is making steady progress; these algorithms are continuously improved by new tools and aim to reduce, on average, the number of bits necessary to represent the signal while minimizing the reconstruction error. This article proposes the compression of the Arabic speech signal by a hybrid method combining the wavelet transform and linear prediction. The adopted approach rests, on the one hand, on decomposing the original signal through a bank of analysis filters, followed by the compression stage, and, on the other hand, on applying linear prediction of order 5 to the compressed signal coefficients. The aim of this approach is the estimation of the prediction error, which is then coded and transmitted; the decoding operation reconstructs the original signal. An adequate choice of the filter bank used in the transform is therefore necessary to increase the compression rate while keeping the distortion imperceptible from an auditory point of view.
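
The order-5 linear prediction stage can be sketched via the autocorrelation (normal) equations; the synthetic frame below is an assumption, and the residual is what a scheme like the one above would code and transmit.

```python
# Order-5 linear prediction via the autocorrelation normal equations;
# the residual is what the hybrid scheme above would code and transmit.
import numpy as np
from scipy.linalg import solve_toeplitz

order = 5
frame = np.sin(0.1 * np.arange(400)) + 0.05 * np.random.randn(400)  # stand-in

# Autocorrelation method: solve the Toeplitz system R a = r.
r = np.correlate(frame, frame, mode='full')[len(frame) - 1:]
a = solve_toeplitz(r[:order], r[1:order + 1])

# Prediction error (residual): e[n] = x[n] - sum_k a[k] * x[n-k-1].
pred = np.zeros_like(frame)
for k in range(order):
    pred[k + 1:] += a[k] * frame[:len(frame) - k - 1]
residual = frame - pred
print("residual / signal energy:", np.sum(residual**2) / np.sum(frame**2))
```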

Keywords: Compression, linear prediction analysis, multiresolution analysis, speech signal.

3544 An Analysis of Compression Methods and Implementation of Medical Images in Wireless Network

Authors: C. Rajan, K. Geetha, S. Geetha

Abstract:

The motivation of image compression techniques is to reduce the irrelevance and redundancy of image data in order to store or transmit the data efficiently from one place to another. Several types of compression method are available. Without compression, file sizes are noticeably larger, usually several megabytes; with compression, it is possible to reduce a file to around 10% of its original size without noticeable loss in quality. Image compression can be lossless or lossy, and compression techniques can be applied to image, audio, video and text data. This research work mainly concentrates on methods of encoding, the DCT, compression methods, security, etc. Different methodologies and network simulations have been analyzed, and various compression methodologies and their performance metrics have been investigated and presented in tabular form.

Keywords: Image compression techniques, encoding, DCT, lossy compression, lossless compression, JPEG.

3543 Control Algorithm for Shunt Active Power Filter using Synchronous Reference Frame Theory

Authors: Consalva J. Msigwa, Beda J. Kundy, Bakari M. M. Mwinyiwiwa

Abstract:

This paper presents a method for obtaining the desired reference current for the voltage source converter (VSC) of a shunt active power filter (SAPF) using synchronous reference frame theory. The method relies on the performance of the proportional-integral (PI) controller to obtain the best control performance of the SAPF. To improve the performance of the PI controller, a feedback path to the integral term is introduced to compensate for the windup phenomenon caused by the integrator. Using the reference frame transformation, the reference signals are transformed from the stationary a-b-c frame to the rotating 0-d-q frame; the reference signals in the 0-d-q rotating frame are then controlled with the PI controller to obtain the desired reference signals for pulse width modulation. A phase-locked loop (PLL) with a PI filter is used for synchronization, with much emphasis on minimizing delays. The system performance is examined with a simulation model of the shunt active power filter.
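
The stationary-to-rotating transformation at the heart of the method is the Park transform; a sketch in one common amplitude-invariant convention (q-axis sign conventions vary between texts, so treat this as one variant):

```python
# abc -> 0-d-q (Park) transformation, amplitude-invariant form; q-axis
# sign conventions differ between texts, so this is one common variant.
import numpy as np

def abc_to_dq0(ia, ib, ic, theta):
    two_pi_3 = 2 * np.pi / 3
    d = (2 / 3) * (ia * np.cos(theta) + ib * np.cos(theta - two_pi_3)
                   + ic * np.cos(theta + two_pi_3))
    q = -(2 / 3) * (ia * np.sin(theta) + ib * np.sin(theta - two_pi_3)
                    + ic * np.sin(theta + two_pi_3))
    zero = (ia + ib + ic) / 3
    return d, q, zero

# A balanced three-phase set maps to constant d-q values.
t = np.linspace(0, 0.04, 200)
theta = 2 * np.pi * 50 * t                     # PLL angle at 50 Hz
ia = np.cos(theta)
ib = np.cos(theta - 2 * np.pi / 3)
ic = np.cos(theta + 2 * np.pi / 3)
d, q, z = abc_to_dq0(ia, ib, ic, theta)
print(np.allclose(d, 1), np.allclose(q, 0, atol=1e-9), np.allclose(z, 0))
```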

Keywords: Phase Locked Loop (PLL), Voltage Source Converter (VSC), Shunt Active Power Filter (SAPF), PI, Pulse Width Modulation (PWM)

3542 The Recreation Technique Model from the Perspective of Environmental Quality Elements

Authors: G. Gradinaru, S. Olteanu

Abstract:

Improvements in the quality of environmental elements can increase the recreational opportunities of a certain area (destination). The need-for-recreation technique focuses on the choice of certain destinations for recreational purposes. The basic exchange taken into consideration is the one between the satisfaction gained from staying in the area and the value, expressed in money and time, allocated to the visit. The number of tourists in the area, the duration of their stay and the money spent, including transportation, provide information on how individuals rank the place or certain aspects of it (such as the quality of the environmental elements). For the statistical analysis of the environmental benefits offered by an area through the need-for-recreation technique, the following stages are suggested: - characterization of the reference area based on the statistical variables considered; - estimation of the environmental benefit by comparing the reference area with other similar areas (having the same environmental characteristics) from the perspective of those variables. The comparison model of the recreation technique faces a series of difficulties concerning the choice of the reference area and the correct transformation of time into money.

Keywords: Comparison in recreation technique, the quality of the environmental elements, statistical analysis model.

3541 Stress Intensity Factors for Plates with Collinear and Non-Aligned Straight Cracks

Authors: Surendran M, Palani G. S, Nagesh R. Iyer

Abstract:

Multi-site damage (MSD) has been a challenge for aircraft, civil and power plant structures. In real life, components are subject to cracking at many vulnerable locations, such as bolt holes, yet analyses often do not account for the presence of multiple cracks. Unlike components with a single crack, these components are difficult to analyze: when two cracks approach one another, their stress fields influence each other and produce an enhancing or shielding effect depending on the relative position of the cracks. In the present study, numerical fracture analyses have been conducted using a code developed by the authors based on the modified virtual crack closure integral (MVCCI) technique, together with the finite element analysis (FEA) software ABAQUS, to compute the stress intensity factor (SIF) of plates with multiple cracks. Various parametric studies have been carried out, and the results have been compared with the literature wherever available and with the solutions obtained using ABAQUS. From these extensive numerical studies, expressions for the SIF have been obtained for collinear and non-aligned cracks.
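
For context, the classical closed-form result for an infinite periodic row of collinear cracks (half-length a, centre spacing 2b, remote tension sigma) is often used as a check case for such numerical studies; the sketch below implements that textbook formula, not the authors' MVCCI code, and the load values are placeholders.

```python
# Textbook SIF for an infinite periodic row of collinear cracks under
# remote tension (a check case, not the authors' MVCCI code):
#   K_I = sigma * sqrt(pi*a) * sqrt((2b / (pi*a)) * tan(pi*a / (2b)))
import numpy as np

def sif_collinear(sigma, a, b):
    arg = np.pi * a / (2 * b)
    return sigma * np.sqrt(np.pi * a) * np.sqrt(np.tan(arg) / arg)

sigma, b = 100.0, 50.0                 # remote stress (MPa), half-spacing (mm)
for a in (5.0, 20.0, 40.0):            # crack half-lengths (mm)
    k = sif_collinear(sigma, a, b)
    print(f"a = {a:4.1f} mm: K_I = {k:8.1f} MPa*sqrt(mm)")
```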

Keywords: Crack interaction, Fracture mechanics, Multiple site damage, stress intensity factor, collinear cracks, non-aligned cracks.

3540 The Automated Selective Acquisition System

Authors: Atisthan Wuttimanop, Suchada Rianmora

Abstract:

To support the design process and launch products on time, the reverse engineering (RE) process has been introduced for quickly generating a 3D CAD model from a physical object. The accuracy of the 3D CAD model depends upon the data acquisition technique selected: contact or non-contact methods. In order to reduce the time needed to acquire the surface and to eliminate noise, an automated selective acquisition system has been developed and is presented in this research as an alternative non-contact acquisition technique in which the data are selectively and locally scanned, contour by contour, without performing a data reduction process. The result is a set of organized contour points that are used directly to generate the 3D virtual model. A comparison between the proposed technique and another non-contact scanning technique is presented and discussed.

Keywords: Automated selective acquisition system, Non-contact acquisition, Reverse engineering, 3D scanners.

3539 The Preparation of Silicon and Aluminum Extracts from Tuncbilek and Orhaneli Fly Ashes by Alkali Fusion

Authors: M. Sari Yilmaz, N. Karamahmut Mermer

Abstract:

Coal fly ash is formed as a solid waste product from the combustion of coal in coal-fired power stations. Huge amounts of fly ash are produced globally every year, and the amounts are predicted to increase. Nowadays, less than half of the fly ash is used as a raw material for cement manufacturing and construction, and the rest is disposed of as waste, causing yet another environmental concern. For this reason, the recycling of this kind of residue into useful materials is quite important in economic and environmental terms. The purpose of this study is to evaluate the Orhaneli and Tuncbilek coal fly ashes for utilization in some industrial applications. Therefore, the mineralogical and chemical compositions of these fly ashes were analyzed by X-ray fluorescence (XRF) spectroscopy, Fourier transform infrared (FTIR) spectrometry, and X-ray diffraction (XRD). The silicon (Si) and aluminum (Al) in the fly ashes were activated by an alkali fusion technique with sodium hydroxide, and the obtained extracts were analyzed for Si and Al content by inductively coupled plasma optical emission spectrometry (ICP-OES).

Keywords: Extraction, Fly ash, Fusion, XRD.

3538 Carbon Nanofibers Reinforced P(VdF-HFP) Based Gel Polymer Electrolyte for Lithium-Ion Battery Application

Authors: Anjan Sil, Rajni Sharma, Subrata Ray

Abstract:

The effect of carbon nanofibers (CNFs) on the electrical properties of poly(vinylidene fluoride-hexafluoropropylene) (P(VdF-HFP))-based gel polymer electrolytes has been investigated in the present work. The lengths and diameters of the CNFs used range over 5-50 μm and 200-600 nm, respectively. The nanocomposite gel polymer electrolytes have been synthesized by a solution casting technique with varying CNF content in terms of weight percentage. Electrochemical impedance analysis demonstrates that the reinforcement with carbon nanofibers significantly enhances the ionic conductivity of the polymer electrolyte. The decrease in the crystallinity of P(VdF-HFP) due to the addition of CNFs has been confirmed by X-ray diffraction (XRD). The interaction of the CNFs with the various constituents of the nanocomposite gel polymer electrolytes has been assessed by Fourier transform infrared (FTIR) spectroscopy. Moreover, the CNF-added gel polymer electrolytes offer superior thermal stability compared with CNF-free electrolytes, as confirmed by thermogravimetric analysis (TGA).

Keywords: Polymer electrolytes, CNFs, Ionic conductivity, TGA.
