Search results for: Color Structure-Texture Image Decomposition
4016 Digital Image Forensics: Discovering the History of Digital Images
Authors: Gurinder Singh, Kulbir Singh
Abstract:
Digital multimedia contents such as image, video, and audio can be tampered with easily due to the availability of powerful editing software. Multimedia forensics is devoted to analyzing these contents by using various digital forensic techniques in order to validate their authenticity. Digital image forensics is dedicated to investigating the reliability of digital images by analyzing the integrity of data and by reconstructing the historical information of an image related to its acquisition phase. In this paper, a survey of forgery detection is carried out, considering the most recent and promising digital image forensic techniques.
Keywords: Computer Forensics, Multimedia Forensics, Image Ballistics, Camera Source Identification, Forgery Detection
Procedia PDF Downloads 247
4015 Gray Level Image Encryption
Authors: Roza Afarin, Saeed Mozaffari
Abstract:
The aim of this paper is image encryption using a Genetic Algorithm (GA). The proposed encryption method consists of two phases. In the modification phase, pixel locations are altered to reduce correlation among adjacent pixels. Then, pixel values are changed in the diffusion phase to encrypt the input image. Both phases are performed by a GA with binary chromosomes. For the modification phase, these binary patterns are generated by the Local Binary Pattern (LBP) operator, while for the diffusion phase binary chromosomes are obtained by Bit Plane Slicing (BPS). The initial population in the GA includes rows and columns of the input image. Instead of subjective selection of parents from this initial population, a random generator with a predefined key is utilized, since it is necessary to decrypt the coded image and reconstruct the initial input image. The fitness function is defined as the average number of 0-to-1 transitions in the LBP image for the modification phase and histogram uniformity for the diffusion phase. Randomness of the encrypted image is measured by entropy, correlation coefficients and histogram analysis. Experimental results show that the proposed method is fast enough and can be used effectively for image encryption.
Keywords: correlation coefficients, genetic algorithm, image encryption, image entropy
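The randomness metrics named at the end of the abstract, entropy and adjacent-pixel correlation, can be computed directly from the encrypted image. The sketch below is a minimal Python illustration assuming 8-bit gray-level images; it is not taken from the paper.

```python
import numpy as np

def shannon_entropy(img):
    """Entropy of an 8-bit image; values near 8 bits/pixel indicate a well-encrypted image."""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def adjacent_correlation(img, axis=1):
    """Correlation between horizontally (axis=1) or vertically (axis=0) adjacent pixels."""
    a = img.astype(float)
    if axis == 1:
        x, y = a[:, :-1].ravel(), a[:, 1:].ravel()
    else:
        x, y = a[:-1, :].ravel(), a[1:, :].ravel()
    return float(np.corrcoef(x, y)[0, 1])

# a well-encrypted image should look like uniform noise: high entropy, near-zero correlation
encrypted = np.random.randint(0, 256, (256, 256), dtype=np.uint8)
print(shannon_entropy(encrypted), adjacent_correlation(encrypted))
```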
Procedia PDF Downloads 330
4014 Data Hiding in Gray Image Using ASCII Value and Scanning Technique
Authors: R. K. Pateriya, Jyoti Bharti
Abstract:
This paper presents an approach to data hiding that provides secret communication between sender and receiver. The data is hidden in gray-scale images, and the boundary of the gray-scale image is used to store the mapping information. In this approach the data is in ASCII format, and a mapping is established between the ASCII value of each character of the hidden message and a pixel value of the cover image. Since pixel values of an image and ASCII values both lie in the range 0 to 255, the mapping information occupies only 1 bit per character of the hidden message as compared to 8 bits per character, thus maintaining good quality of the stego image.
Keywords: ASCII value, cover image, PSNR, pixel value, stego image, secret message
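A minimal sketch of the value-matching idea is given below, assuming that for each message character the embedder records the coordinates of a cover pixel whose intensity equals the character's ASCII code. The paper's boundary-based storage of the mapping and its scanning technique are not reproduced, and the helper names are hypothetical.

```python
import numpy as np

def build_mapping(cover, message):
    """For each character, record (row, col) of a cover pixel whose value equals its ASCII code."""
    mapping = []
    flat = cover.ravel()
    for ch in message:
        idx = np.flatnonzero(flat == ord(ch))
        if idx.size == 0:
            raise ValueError(f"no pixel with value {ord(ch)} in cover image")
        mapping.append(divmod(int(idx[0]), cover.shape[1]))
    return mapping

def recover_message(cover, mapping):
    return "".join(chr(int(cover[r, c])) for r, c in mapping)

# toy cover image guaranteed to contain every gray level; the cover itself is never modified
cover = (np.arange(64 * 64) % 256).astype(np.uint8).reshape(64, 64)
m = build_mapping(cover, "hidden")
print(recover_message(cover, m))
```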
Procedia PDF Downloads 414
4013 High-Capacity Image Steganography using Wavelet-based Fusion on Deep Convolutional Neural Networks
Authors: Amal Khalifa, Nicolas Vana Santos
Abstract:
Steganography has been known for centuries as an efficient approach for covert communication. Due to its popularity and ease of access, image steganography has attracted researchers to find secure techniques for hiding information within an innocent-looking cover image. In this research, we propose a novel deep-learning approach to digital image steganography. The proposed method, DeepWaveletFusion, uses convolutional neural networks (CNN) to hide a secret image inside a cover image of the same size. Two CNNs are trained back-to-back to merge the Discrete Wavelet Transform (DWT) of both colored images and eventually be able to blindly extract the hidden image. Based on two different image similarity metrics, a weighted gain function is used to guide the learning process and maximize the quality of the retrieved secret image while maintaining acceptable imperceptibility. Experimental results verified the high recoverability of DeepWaveletFusion, which outperformed similar deep-learning-based methods.
Keywords: deep learning, steganography, image, discrete wavelet transform, fusion
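The wavelet front end that DeepWaveletFusion operates on can be reproduced with PyWavelets; the sketch below shows a single-level 2-D DWT and its inverse applied per color channel. The Haar wavelet and the single decomposition level are assumptions, since the abstract does not specify them, and the CNN fusion itself is omitted.

```python
import numpy as np
import pywt  # PyWavelets

def dwt_channels(img, wavelet='haar'):
    """Single-level 2-D DWT of each color channel: returns (cA, cH, cV, cD) per channel."""
    bands = []
    for c in range(img.shape[2]):
        cA, (cH, cV, cD) = pywt.dwt2(img[:, :, c].astype(float), wavelet)
        bands.append((cA, cH, cV, cD))
    return bands

def idwt_channels(bands, wavelet='haar'):
    """Inverse transform back to a color image."""
    chans = [pywt.idwt2((cA, (cH, cV, cD)), wavelet) for cA, cH, cV, cD in bands]
    return np.stack(chans, axis=-1)

img = np.random.rand(128, 128, 3)
rec = idwt_channels(dwt_channels(img))
print(np.allclose(img, rec))   # perfect reconstruction without any embedding
```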
Procedia PDF Downloads 90
4012 Catalytic Decomposition of High Energy Materials Using Nanoparticles of Copper Chromite
Authors: M. Sneha Reddy, M. Arun Kumar, V. Kameswara Rao
Abstract:
Chromites are binary transition metal oxides with a general formula of ACr₂O₄, where A = Mn²⁺, Fe²⁺, Co²⁺, Ni²⁺, and Cu²⁺. Chromites have a normal-type spinel structure with interesting applications in the areas of applied physics, material sciences, and geophysics. They have attracted great consideration because of their unique physicochemical properties and tremendous technological applications in nanodevices, sensor elements, and high-temperature ceramics with useful optical properties. Copper chromite is one of the most efficient spinel oxides, having pronounced commercial application as a catalyst in various chemical reactions like oxidation, hydrogenation, alkylation, dehydrogenation, decomposition of organic compounds, and hydrogen production. Apart from its usage in chemical industries, CuCr₂O₄ finds its major application as a burn rate modifier in solid propellant processing for space launch vehicles globally. Herein we synthesized the nanoparticles of copper chromite using the co-precipitation method. The synthesized nanoparticles were characterized by XRD, TEM, SEM, BET, and TG-DTA. The synthesized nanoparticles of copper chromites were used as a catalyst for the thermal decomposition of various high-energy materials.
Keywords: copper chromite, coprecipitation method, high energy materials, catalytic thermal decomposition
Procedia PDF Downloads 77
4011 Hazardous Effects of Metal Ions on the Thermal Stability of Hydroxylammonium Nitrate
Authors: Shweta Hoyani, Charlie Oommen
Abstract:
HAN-based liquid propellants are perceived as a potential substitute for hydrazine in space propulsion. Storage stability for a long service life in orbit is one of the key concerns for HAN-based monopropellants because of their reactivity with metallic and non-metallic impurities, which could be entrained from the surfaces of fuel tanks and tubing. The end result of this reactivity directly affects the handling, performance and storability of the liquid propellant. Gaseous products resulting from the decomposition of the propellant can lead to deleterious pressure build-up in storage vessels. The partial loss of an energetic component can change the ignition and combustion behavior and alter the performance of the thruster. The effect of the most plausible metals (iron, copper, chromium, nickel, manganese, molybdenum, zinc, titanium and cadmium) on the thermal decomposition mechanism of HAN has been investigated in this context. Studies involving different concentrations of metal ions and HAN at different preheat temperatures have been carried out. The effect of metal ions on the decomposition behavior of HAN has been studied earlier in the context of the use of HAN as a gun propellant; the current investigation, however, pertains to the decomposition mechanism of HAN in the context of its use as a monopropellant for space propulsion. Decomposition onset temperature, rate of weight loss and heat of reaction were studied using DTA-TGA, and total pressure rise and rate of pressure rise during decomposition were evaluated using an in-house built constant volume batch reactor. In addition, the reaction mechanism and product profile were studied using a TGA-FTIR setup. Iron and copper displayed the maximum reactivity. Initial results indicate that iron and copper show a sensitizing effect at concentrations as low as 50 ppm with 60% HAN solution at 80°C. On the other hand, 50 ppm zinc does not display any effect on the thermal decomposition of even 90% HAN solution at 80°C.
Keywords: hydroxylammonium nitrate, monopropellant, reaction mechanism, thermal stability
Procedia PDF Downloads 422
4010 The Miseducation of Color: Examining Racialized Experiences of Students of Color at Predominantly White Institutions (PWIs)
Authors: Sonia Darshini Singh
Abstract:
Recently, the Supreme Court and the federal government made affirmative action illegal; colleges and universities are no longer allowed to consider race in admissions policies. Colleges and universities previously had the opportunity to increase racial diversity through affirmative action. Instead, a recent educational outlook has emerged in which race-conscious affirmative action is banned and elitism is prioritized, thus altering the collegiate experience of students of color. While the statute restricts the consideration of race as a facet in admissions, this prohibition should not allow the gravity of race and structural racism in the lives of marginalized students to diminish, nor should it limit further efforts to establish equitable access and outcomes for students of color. Not much is known about the racialized experiences of students of color who attend predominantly white institutions in the post-affirmative action era. The purpose of this ethnographic study will be to understand the racialized experiences of students who attend predominantly white institutions (PWIs) in New York. It also aims to examine the potential data triangulation between what students wrote about to get into college and their actual racialized experiences.
Keywords: higher education, predominantly white institution, equity, accessibility, affirmative action
Procedia PDF Downloads 40
4009 Improvement Image Summarization using Image Processing and Particle swarm optimization Algorithm
Authors: Hooman Torabifard
Abstract:
In the last few years, with the progress of technology and computers and the entry of artificial intelligence into all kinds of scientific and industrial fields, human lifestyles have changed, and in general the way humans live has seen many changes and developments. Some of these changes have occurred in the context of digital images and image processing, and the trend still continues. However, besides all the benefits, there have been disadvantages. One of these disadvantages is the multiplicity of images with high volumes of data; the focus of this paper is on improving and developing a method for summarizing these images and enhancing their productivity. The general method used for this purpose consists of a set of techniques based on data obtained from image processing and on the PSO (Particle Swarm Optimization) algorithm. In the remainder of this paper, the method used is elaborated in detail.
Keywords: image summarization, particle swarm optimization, image threshold, image processing
Procedia PDF Downloads 133
4008 Pyrolysis and Combustion Kinetics of Palm Kernel Shell Using Thermogravimetric Analysis
Authors: Kanit Manatura
Abstract:
The combustion and pyrolysis behavior of Palm Kernel Shell (PKS) was investigated in a thermogravimetric analyzer. A 10 mg sample of the biomass was heated from 30 °C to 800 °C at four heating rates (5, 10, 15 and 30 °C/min) under nitrogen and dry air flows of 20 ml/min for the pyrolysis and combustion processes, respectively. During pyrolysis, thermal decomposition occurred in three different stages, dehydration, hemicellulose-cellulose decomposition and lignin decomposition, each over its own temperature range. The TG/DTG curves showed the degradation behavior and the pyrolysis/combustion characteristics of the PKS samples, which were then used in the kinetic analysis. The kinetic factors, including the activation energy and pre-exponential factor, were determined by the Coats-Redfern method. The obtained kinetic factors are used to simulate the thermal decomposition and compare with the experimental data. Raising the heating rate shifts the mass loss towards higher temperatures.
Keywords: combustion, palm kernel shell, pyrolysis, thermogravimetric analyzer
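For a first-order reaction model, the Coats-Redfern method fits ln[-ln(1-α)/T²] against 1/T, with slope -E/R and intercept ln(AR/(βE)). The Python sketch below illustrates the fit on synthetic data generated from that same expression; the reaction model, parameter values and data are illustrative assumptions, not the PKS measurements.

```python
import numpy as np

R = 8.314  # gas constant, J/(mol*K)

def coats_redfern_first_order(T, alpha, beta):
    """Fit ln[-ln(1-alpha)/T^2] against 1/T for a first-order model.
    Slope = -E/R, intercept = ln(A*R/(beta*E)); T in K, beta in K/min."""
    y = np.log(-np.log(1.0 - alpha) / T**2)
    slope, intercept = np.polyfit(1.0 / T, y, 1)
    E = -slope * R                           # activation energy, J/mol
    A = np.exp(intercept) * beta * E / R     # pre-exponential factor, 1/min
    return E, A

# synthetic conversion curve generated from the Coats-Redfern expression itself
beta = 10.0                                  # heating rate, K/min
E_true, A_true = 120e3, 1e11
T = np.linspace(500, 570, 30)                # K
y = np.log(A_true * R / (beta * E_true)) - E_true / (R * T)
alpha = 1.0 - np.exp(-np.exp(y) * T**2)
E_fit, A_fit = coats_redfern_first_order(T, alpha, beta)
print(E_fit / 1e3, A_fit)                    # recovers ~120 kJ/mol and ~1e11 1/min
```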
Procedia PDF Downloads 227
4007 Improved Super-Resolution Using Deep Denoising Convolutional Neural Network
Authors: Pawan Kumar Mishra, Ganesh Singh Bisht
Abstract:
Super-resolution is a technique used in computer vision to construct a high-resolution image from a single low-resolution image. It is used to increase the frequency content, recover lost details, and remove the downsampling artifacts and noise introduced by the camera during the image acquisition process. High-resolution images or videos are a desired part of all image processing tasks and their analysis in most digital imaging applications. The goal of super-resolution is to combine the non-redundant information inside single or multiple low-resolution frames to generate a high-resolution image. Many methods have been proposed in which multiple images of the same scene, with different variations in transformation, are used as the low-resolution inputs; this is called multi-image super-resolution. Another family of methods, single-image super-resolution, tries to learn the redundancy present in the image and reconstruct the lost information from a single low-resolution image. Deep learning is currently one of the state-of-the-art methods for reconstructing high-resolution images. In this research, we propose Deep Denoising Super Resolution (DDSR), a deep neural network that effectively reconstructs a high-resolution image from a low-resolution image.
Keywords: resolution, deep-learning, neural network, de-blurring
Procedia PDF Downloads 517
4006 Short-Term Load Forecasting Based on Variational Mode Decomposition and Least Square Support Vector Machine
Authors: Jiangyong Liu, Xiangxiang Xu, Bote Luo, Xiaoxue Luo, Jiang Zhu, Lingzhi Yi
Abstract:
To address the problems of non-linearity and high randomness in the original power load sequence, which degrade power load forecasting accuracy, a short-term load forecasting method is proposed. The method is based on the Least Square Support Vector Machine optimized by an Improved Sparrow Search Algorithm, combined with Variational Mode Decomposition. The variational mode decomposition technique decomposes the raw power load data into a series of Intrinsic Mode Function components, which reduces the complexity and instability of the raw data while overcoming modal confounding; the proposed improved sparrow search algorithm solves the problem of the difficult selection of learning parameters in the Least Square Support Vector Machine. Finally, comparison experiments show that the method can effectively improve prediction accuracy.
Keywords: load forecasting, variational mode decomposition, improved sparrow search algorithm, least square support vector machine
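At the core of the forecaster, a least-squares SVM replaces the usual quadratic program with a single linear system in the dual variables. The sketch below is a minimal NumPy version for one decomposed component, with an RBF kernel and hand-picked hyperparameters standing in for the improved sparrow search; the VMD step and the paper's data are not included.

```python
import numpy as np

def rbf_kernel(A, B, sigma):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma**2))

class LSSVR:
    """Least-squares SVM regression: solve [[0, 1^T], [1, K + I/gamma]] [b; a] = [0; y]."""
    def __init__(self, gamma=10.0, sigma=1.0):
        self.gamma, self.sigma = gamma, sigma
    def fit(self, X, y):
        n = len(X)
        K = rbf_kernel(X, X, self.sigma)
        A = np.zeros((n + 1, n + 1))
        A[0, 1:] = 1.0
        A[1:, 0] = 1.0
        A[1:, 1:] = K + np.eye(n) / self.gamma
        sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
        self.b, self.alpha, self.X = sol[0], sol[1:], X
        return self
    def predict(self, Xq):
        return rbf_kernel(Xq, self.X, self.sigma) @ self.alpha + self.b

# toy load-like series: predict y[t] from the previous 3 values (lag features)
t = np.arange(200, dtype=float)
y = 10 + np.sin(2 * np.pi * t / 24) + 0.1 * np.random.randn(200)
X = np.stack([y[i:i + 3] for i in range(len(y) - 3)])
model = LSSVR(gamma=50.0, sigma=2.0).fit(X[:150], y[3:153])
print(np.abs(model.predict(X[150:]) - y[153:]).mean())   # mean absolute test error
```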
Procedia PDF Downloads 108
4005 Wavelet-Based Classification of Myocardial Ischemia, Arrhythmia, Congestive Heart Failure and Sleep Apnea
Authors: Santanu Chattopadhyay, Gautam Sarkar, Arabinda Das
Abstract:
This paper presents a wavelet-based classification of various heart diseases. Electrocardiogram signals of different heart patients have been studied, and the statistical nature of electrocardiogram signals for different heart diseases has been compared with that of electrocardiograms of normal persons. In this study, four heart diseases have been considered: Myocardial Ischemia (MI), Congestive Heart Failure (CHF), Arrhythmia and Sleep Apnea. The statistical nature of the electrocardiograms for each case has been characterized in terms of the kurtosis values of two types of wavelet coefficients: approximation and detail. Nine wavelet decomposition levels have been considered in each case, and the kurtosis corresponding to both approximation and detail coefficients has been computed for decomposition levels one to nine. Based on significant differences, a few decomposition levels have been chosen and then used for classification.
Keywords: arrhythmia, congestive heart failure, discrete wavelet transform, electrocardiogram, myocardial ischemia, sleep apnea
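The feature extraction step can be reproduced with PyWavelets and SciPy: decompose the ECG at levels one to nine and take the kurtosis of the approximation and detail coefficients at each level. The wavelet family and the synthetic test signal below are assumptions, since the abstract does not specify them.

```python
import numpy as np
import pywt
from scipy.stats import kurtosis

def wavelet_kurtosis_features(signal, wavelet='db4', levels=9):
    """Kurtosis of the approximation (cA_l) and detail (cD_l) coefficients for levels 1..9."""
    x = np.asarray(signal, dtype=float)
    feats = []
    for level in range(1, levels + 1):
        coeffs = pywt.wavedec(x, wavelet, level=level)
        cA, cD = coeffs[0], coeffs[1]      # approximation and detail coefficients at this level
        feats.extend([kurtosis(cA), kurtosis(cD)])
    return np.array(feats)

# synthetic ECG-like record for illustration (not a real recording)
fs = 360
t = np.arange(0, 10, 1.0 / fs)
ecg = np.sin(2 * np.pi * 1.2 * t) + 0.3 * np.random.randn(t.size)
print(wavelet_kurtosis_features(ecg).shape)   # 18 kurtosis features per record
```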
Procedia PDF Downloads 134
4004 Noise Detection Algorithm for Skin Disease Image Identification
Authors: Minakshi Mainaji Sonawane, Bharti W. Gawali, Sudhir Mendhekar, Ramesh R. Manza
Abstract:
People's lives and health are severely impacted by skin diseases. This study proposes an effective method for identifying different forms of skin disease. Image denoising is a technique for improving image quality after it has been degraded by noise. The proposed technique is based on the use of the wavelet transform. The wavelet transform is well suited to analyzing the image because of its ability to split the image into sub-bands, which are used to estimate the noise ratio in the noisy image. According to the experimental results, the proposed method gives the best values of MSE, PSNR, and entropy for the denoised images. Also, by using different types of wavelet transform filters, the proposed approach can obtain the best results of 23.13, 20.08, and 50.7 for the image denoising process.
Keywords: MSE, PSNR, entropy, Gaussian filter, DWT
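A minimal wavelet-threshold denoiser of the kind evaluated here can be written with PyWavelets, as sketched below. The soft universal threshold, the median-based noise estimate and the db2 wavelet are assumptions rather than the paper's exact configuration; PSNR is reported as the quality metric.

```python
import numpy as np
import pywt

def psnr(a, b, peak=255.0):
    m = np.mean((a.astype(float) - b.astype(float)) ** 2)
    return 10 * np.log10(peak**2 / m)

def wavelet_denoise(img, wavelet='db2', level=2):
    """Soft-threshold the detail sub-bands with the universal threshold sigma*sqrt(2*log N)."""
    coeffs = pywt.wavedec2(img.astype(float), wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745   # noise estimate from finest diagonal band
    thr = sigma * np.sqrt(2 * np.log(img.size))
    denoised = [coeffs[0]] + [
        tuple(pywt.threshold(d, thr, mode='soft') for d in detail)
        for detail in coeffs[1:]
    ]
    return np.clip(pywt.waverec2(denoised, wavelet), 0, 255)

clean = np.tile(np.linspace(0, 255, 128), (128, 1))       # smooth test image
noisy = clean + np.random.normal(0, 15, clean.shape)
print(psnr(clean, noisy), psnr(clean, wavelet_denoise(noisy)))
```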
Procedia PDF Downloads 215
4003 Parallelization by Domain Decomposition for 1-D Sugarcane Equation with Message Passing Interface
Authors: Ewedafe Simon Uzezi
Abstract:
In this paper we present a method based on Domain Decomposition (DD) for the parallelization of a 1-D sugarcane equation on a parallel platform, using master-slave parallel paradigms with the Message Passing Interface (MPI). The 1-D sugarcane equation was discretized using an explicit method, requiring evaluation of the temporal and spatial distribution of temperature. This platform gives better predictions of the effects of the temperature distribution in the sugarcane problem. This work reports parallel overheads, with overlapping of communication and computation across parallel computers, and numerical results for different block sizes with scalability. Performance improvement strategies from the DD on various mesh sizes were compared experimentally, and the parallel results show speedup and efficiency for the designed parallel algorithms.
Keywords: sugarcane, parallelization, explicit method, domain decomposition, MPI
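The sketch below illustrates the general pattern with mpi4py on a 1-D explicit diffusion-type update: the domain is split into equal blocks, each rank advances its block, and ghost cells are exchanged with neighbouring ranks every step. The coefficients, boundary values and grid sizes are placeholders, not the sugarcane model from the paper.

```python
# run with e.g.: mpirun -n 4 python heat1d_dd.py  (assumes the process count divides N_global)
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

N_global, alpha, dx, dt, steps = 400, 1.0e-3, 0.01, 0.02, 500
n_local = N_global // size
u = np.zeros(n_local + 2)            # local block with two ghost cells
if rank == 0:
    u[1] = 100.0                     # hot Dirichlet boundary at the left end

left = rank - 1 if rank > 0 else MPI.PROC_NULL
right = rank + 1 if rank < size - 1 else MPI.PROC_NULL
r = alpha * dt / dx**2               # explicit stability requires r <= 0.5

for _ in range(steps):
    # halo exchange: send first/last interior cells, receive into ghost cells
    comm.Sendrecv(u[1:2], dest=left, recvbuf=u[-1:], source=right)
    comm.Sendrecv(u[-2:-1], dest=right, recvbuf=u[0:1], source=left)
    u[1:-1] = u[1:-1] + r * (u[2:] - 2 * u[1:-1] + u[:-2])
    if rank == 0:
        u[1] = 100.0                 # re-impose left boundary (right ghost stays 0 on the last rank)

local_max = np.array([u[1:-1].max()])
global_max = np.zeros(1)
comm.Reduce(local_max, global_max, op=MPI.MAX, root=0)
if rank == 0:
    print("max temperature:", global_max[0])
```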
Procedia PDF Downloads 21
4002 Simulation of X-Ray Tissue Contrast and Dose Optimisation in Radiological Physics to Improve Medical Imaging Students’ Skills
Authors: Peter J. Riley
Abstract:
Medical Imaging students must understand the roles of Photo-electric Absorption (PE) and Compton Scatter (CS) interactions in patients to enable optimal X-ray imaging in clinical practice. A simulator has been developed that shows relative interaction probabilities, color bars for patient dose from PE, % penetration to the detector, and obscuring CS as Peak Kilovoltage (kVp) changes. Additionally, an anthropomorphic chest X-ray image shows the relative tissue contrasts and overlying CS-fog at that kVp, which determine the detectability of a lesion in the image. A series of interactive exercises with MCQs evaluates the student's understanding; the simulation has improved student perception of the need to acquire "sufficient" rather than maximal contrast to enable patient dose reduction at higher kVp.
Keywords: patient dose optimization, radiological physics, simulation, tissue contrast
Procedia PDF Downloads 95
4001 Red Green Blue Image Encryption Based on Paillier Cryptographic System
Authors: Mamadou I. Wade, Henry C. Ogworonjo, Madiha Gul, Mandoye Ndoye, Mohamed Chouikha, Wayne Patterson
Abstract:
In this paper, we present a novel application of the Paillier cryptographic system to the encryption of RGB (Red Green Blue) images. In this method, an RGB image is first separated into its constituent channel images, and the Paillier encryption function is applied to each channel's pixel intensity values. Next, the encrypted image is combined and compressed if necessary before being transmitted through an unsecured communication channel. The transmitted image is subsequently recovered by a decryption process. We performed a series of security and performance analyses on the recovered images in order to verify their robustness to security attacks. The results show that the proposed image encryption scheme produces highly secured encrypted images.
Keywords: image encryption, Paillier cryptographic system, RGB image encryption, Paillier
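The per-pixel operation is standard Paillier encryption: with n = pq and g = n + 1, a pixel value m is encrypted as c = g^m · r^n mod n² and recovered as m = L(c^λ mod n²)·λ⁻¹ mod n, where L(x) = (x − 1)/n. The sketch below uses deliberately tiny primes for illustration only; real use requires primes of hundreds of digits, and the paper's channel-separation and compression steps are omitted.

```python
import random
from math import gcd

# toy primes for illustration only (n must exceed the largest pixel value, 255)
p, q = 101, 113
n, n2 = p * q, (p * q) ** 2
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)   # lcm(p-1, q-1)
g = n + 1
mu = pow(lam, -1, n)                            # valid because g = n + 1 (Python 3.8+)

def encrypt(m):
    """Paillier encryption of one pixel intensity m in [0, 255]."""
    r = random.randrange(2, n)
    while gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (((pow(c, lam, n2) - 1) // n) * mu) % n

# encrypt each channel's pixel value independently, as the paper describes
pixel_rgb = (123, 45, 200)
cipher = tuple(encrypt(v) for v in pixel_rgb)
print(tuple(decrypt(c) for c in cipher))        # -> (123, 45, 200)
```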
Procedia PDF Downloads 238
4000 An Object-Based Image Resizing Approach
Authors: Chin-Chen Chang, I-Ta Lee, Tsung-Ta Ke, Wen-Kai Tai
Abstract:
Common methods for resizing images include scaling and cropping. However, these two approaches have some quality problems for reduced images. In this paper, we propose an image resizing algorithm that separates the main objects from the background. First, we extract two feature maps, namely an enhanced visual saliency map and an improved gradient map, from an input image. After that, we integrate these two feature maps into an importance map. Finally, we generate the target image using the importance map. The proposed approach obtains the desired results for a wide range of images.
Keywords: energy map, visual saliency, gradient map, seam carving
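The sketch below shows the map-combination step in NumPy: a gradient map and a deliberately simple saliency proxy are normalized and blended into an importance map of the kind that drives seam carving. The saliency proxy and the 50/50 weighting are assumptions; the paper's enhanced saliency and improved gradient maps are more elaborate.

```python
import numpy as np

def normalize(m):
    return (m - m.min()) / (m.max() - m.min() + 1e-12)

def gradient_map(gray):
    gy, gx = np.gradient(gray.astype(float))
    return np.hypot(gx, gy)

def saliency_map(gray):
    """Crude proxy: deviation from the global mean intensity."""
    return np.abs(gray.astype(float) - gray.mean())

def importance_map(gray, w_saliency=0.5):
    """Per-pixel importance used to decide which seams/regions may be removed."""
    return normalize(w_saliency * normalize(saliency_map(gray))
                     + (1 - w_saliency) * normalize(gradient_map(gray)))

gray = np.random.rand(120, 160) * 255
imp = importance_map(gray)
print(imp.shape, imp.min(), imp.max())
```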
Procedia PDF Downloads 476
3999 Effects of Microwave Heating Rate on the Color, Total Anthocyanin Content and Total Phenolics of Elderberry Juice during Come-up-Time
Authors: Balunkeswar Nayak, Hanjun Cao, Xinruo Zhang
Abstract:
Elderberry could protect human health from oxidative stress and reduce aging and certain cardiovascular diseases due to the presence of bioactive phytochemicals with high antioxidant capacity. However, these bioactive phytochemicals, such as anthocyanins and other phenolic acids, are susceptible to degradation during the processing of elderberries into juice, jam, and powder because of the intensity and duration of thermal exposure. The effects of the microwave heating rate during come-up-time, using a domestic 2450 MHz microwave, on the color, total anthocyanin content and total phenolics of elderberry juice were studied. With come-up-times varying from 30 sec to 15 min at different power levels (10–50% of total wattage), the temperature of the elderberry juice varied from 40.6 °C to 91.5 °C. However, the color parameters (L, A, and B), total anthocyanin content (measured using the pH differential method) and total phenolics did not vary significantly when compared to the control samples.
Keywords: elderberry, microwave, color, thermal exposure
Procedia PDF Downloads 603
3998 Reactive Dyed Superhydrophobic Cotton Fabric Production by Sol-Gel Method
Authors: Kuddis Büyükakıllı
Abstract:
The pretreated and bleached mercerized cotton fabric was dyed with the reactive Everzol Brilliant Yellow 4GR (C.I. Yellow 160) dyestuff. Superhydrophobicity was imparted to the white and reactive-dyed fabrics using a nanotechnological sol-gel method with tetraethoxysilane and fluorocarbon water repellent agents in a two-step process. The effect of the coating on the color yield, fastness and functional properties of the fabric was investigated. It was observed that water drop contact angles were higher on the colorless coated fabrics than on the colored coated fabrics, that there was no significant color change in the colored superhydrophobic fabric, and that color fastness values were high. Although there is no significant color loss in the fabrics after multiple washing and dry cleaning processes, the water drop contact angles are greatly reduced.
Keywords: fluorocarbon water repellent agent, colored cotton fabric, sol-gel, superhydrophobic
Procedia PDF Downloads 118
3997 Continuous-Time and Discrete-Time Singular Value Decomposition of an Impulse Response Function
Authors: Rogelio Luck, Yucheng Liu
Abstract:
This paper proposes the continuous-time singular value decomposition (SVD) for the impulse response function, a special kind of Green’s function e⁻⁽ᵗ⁻ᵀ⁾, in order to find a set of singular functions and singular values so that the convolutions of such a function with the set of singular functions on a specified domain are the solutions to the inhomogeneous differential equations for those singular functions. A numerical example is presented to verify the proposed method. Besides the continuous-time SVD, a discrete-time SVD is also presented for the impulse response function, which is modeled using a Toeplitz matrix in the discrete system. The proposed method has broad applications in signal processing, dynamic system analysis, acoustic analysis, thermal analysis, as well as macroeconomic modeling.
Keywords: singular value decomposition, impulse response function, Green’s function, Toeplitz matrix, Hankel matrix
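The discrete-time counterpart is easy to reproduce: sample e⁻⁽ᵗ⁻ᵀ⁾ on a grid, assemble the causal (lower-triangular) Toeplitz convolution matrix, and take its SVD. The sketch below, with an arbitrary grid and a unit-step input as a sanity check, is an illustration rather than the paper's numerical example.

```python
import numpy as np
from scipy.linalg import toeplitz

dt = 0.05
t = np.arange(0, 5, dt)
h = np.exp(-t)                                  # sampled impulse response e^{-(t - tau)}
first_col = h * dt                              # scale by dt so H @ u approximates the convolution integral
H = toeplitz(first_col, np.zeros_like(first_col))   # causal (lower-triangular) Toeplitz matrix

U, s, Vt = np.linalg.svd(H)
print(s[:5])                                    # leading singular values

# sanity check: response to a unit-step input approximates 1 - exp(-t), up to O(dt) error
u = np.ones_like(t)
y = H @ u
print(np.max(np.abs(y - (1 - np.exp(-t)))))
```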
Procedia PDF Downloads 156
3996 Each One, Reach One: Peer Mentoring Support for Faculty Women of Color
Authors: Teresa Leary Handy
Abstract:
As awareness of the importance of diversity has increased in society, higher education has also begun to recognize the importance of supporting faculty of color. In the university setting, faculty women of color specifically encounter barriers that impact their level of job satisfaction, retention rates, and pedagogical practices. These barriers and challenges not only undermine faculty diversity efforts but also hinder the ability of colleges and universities to provide a supportive environment that fosters students' academic success and sense of belonging. Faculty who are marginalized and on the periphery in higher education institutions need support so that they can feel confident in building a student’s sense of belonging, which can impact a student’s academic success and goal of earning a college degree. This study examined and sought to understand the importance of supporting faculty of color, specifically women faculty of color, and how this type of faculty support can impact student academic success and a student’s sense of belonging. The study furthered original research on strategies to move an institution forward on the equity spectrum to support belonging and inclusion as core culture elements.
Keywords: equity, inclusion, belonging, women, faculty support
Procedia PDF Downloads 67
3995 Investigation of Interlayer Shear Effects in Asphalt Overlay on Existing Rigid Airfield Pavement Using Digital Image Correlation
Authors: Yuechao Lei, Lei Zhang
Abstract:
Interlayer shear between an asphalt overlay and the existing rigid airport pavement occurs due to differences in the mechanical properties of the materials subjected to aircraft loading, and the interlayer contact directly influences the mechanical characteristics of the asphalt overlay. However, accurately obtaining the effective interlayer relative displacement with the existing displacement sensors of the loading apparatus remains challenging. This study aims to utilize digital image correlation technology to enhance the accuracy of interfacial contact parameters by obtaining effective interlayer relative displacements. Composite structure specimens were prepared, and fixtures for interlayer shear tests were designed and fabricated. Subsequently, a digital image recognition scheme for the required markers was designed and optimized. Effective interlayer relative displacement values were obtained through image recognition and calculation of surface markers on the specimens. Finite element simulations validated the mechanical response of the composite specimens under interlayer shearing. Results indicated that an optimized marking approach, using a wall mending agent for surface application together with color coding, enhanced the image recognition quality of the marking points on the specimen surface. Further image extraction provided effective interlayer relative displacement values during interlayer shear, thereby improving the accuracy of the interface contact parameters. For composite structure specimens using Styrene-Butadiene-Styrene (SBS) modified asphalt as the tack coat, the corresponding maximum interlayer shear strength was 0.6 MPa, and the fracture energy was 2917 J/m². This research provides valuable insights for investigating the impact of interlayer contact in composite pavement structures on the mechanical characteristics of asphalt overlays.
Keywords: interlayer contact, effective relative displacement, digital image correlation technology, composite pavement structure, asphalt overlay
Procedia PDF Downloads 48
3994 Automatic Classification Using Dynamic Fuzzy C Means Algorithm and Mathematical Morphology: Application in 3D MRI Image
Authors: Abdelkhalek Bakkari
Abstract:
Image segmentation is a critical step in image processing and pattern recognition. In this paper, we propose a new robust automatic image classification method based on a dynamic fuzzy c-means algorithm and mathematical morphology. The proposed segmentation algorithm (DFCM_MM) has been applied to MR perfusion images. The obtained results show the validity and robustness of the proposed approach.
Keywords: segmentation, classification, dynamic, fuzzy c-means, MR image
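For reference, the underlying fuzzy c-means iteration alternates between membership and centroid updates; a minimal NumPy sketch on 1-D intensity features is given below. The dynamic selection of the number of clusters and the morphological post-processing that distinguish DFCM_MM are not reproduced, and the toy intensities are assumptions.

```python
import numpy as np

def fuzzy_c_means(X, c=3, m=2.0, iters=100, tol=1e-5, seed=0):
    """Standard FCM on feature vectors X of shape (n_samples, n_features)."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)               # fuzzy memberships
    for _ in range(iters):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U_new = 1.0 / (d ** (2 / (m - 1)) * np.sum(d ** (-2 / (m - 1)), axis=1, keepdims=True))
        if np.abs(U_new - U).max() < tol:
            U = U_new
            break
        U = U_new
    return U, centers

# toy "MR" intensities drawn from three tissue classes
img = np.concatenate([np.random.normal(mu, 5, 500) for mu in (40, 120, 200)])
U, centers = fuzzy_c_means(img.reshape(-1, 1), c=3)
labels = U.argmax(axis=1)                           # hard segmentation from fuzzy memberships
print(np.sort(centers.ravel()))
```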
Procedia PDF Downloads 478
3993 A Survey on Types of Noises and De-Noising Techniques
Authors: Amandeep Kaur
Abstract:
Digital image processing is a fundamental tool for performing various operations on digital images, such as pattern recognition, noise removal and feature extraction. In this paper, noise removal techniques are described for various types of noise. The paper discusses the various noises that can appear in an image due to different environmental and accidental factors, as well as various de-noising approaches that utilize different wavelets and filters. By analyzing various papers on image de-noising, we conclude that wavelet-based de-noising approaches are much more effective than the others.
Keywords: de-noising techniques, edges, image, image processing
Procedia PDF Downloads 336
3992 A Simple Adaptive Atomic Decomposition Voice Activity Detector Implemented by Matching Pursuit
Authors: Thomas Bryan, Veton Kepuska, Ivica Kostanic
Abstract:
A simple adaptive voice activity detector (VAD) is implemented using Gabor and gammatone atomic decomposition of speech for high Gaussian noise environments. Matching pursuit is used for the atomic decomposition and is shown to achieve optimal speech detection capability at high data compression rates for low signal-to-noise ratios. The most active dictionary elements found by matching pursuit are used for the signal reconstruction, so that the algorithm adapts to the individual speaker's dominant time-frequency characteristics. Speech has a high peak-to-average ratio, enabling matching pursuit's greedy heuristic of selecting the highest inner products to isolate high-energy speech components in high-noise environments. Gabor and gammatone atoms are both investigated, with identical logarithmically spaced center frequencies and similar bandwidths. The algorithm performs equally well for both Gabor and gammatone atoms, with no significant statistical differences. The algorithm achieves 70% accuracy at a 0 dB SNR, 90% accuracy at a 5 dB SNR and 98% accuracy at a 20 dB SNR, using 30 dB SNR as a reference for voice activity.
Keywords: atomic decomposition, gabor, gammatone, matching pursuit, voice activity detection
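Matching pursuit itself is a short greedy loop: project the residual onto every dictionary atom, keep the atom with the largest inner product, subtract its contribution, and repeat. The sketch below builds a small real-Gabor dictionary and runs a few iterations; the linear frequency spacing, atom widths and test signal are assumptions for brevity (the paper uses logarithmically spaced center frequencies and also gammatone atoms).

```python
import numpy as np

def gabor_atom(n, f, sigma, center):
    """Real Gabor atom: Gaussian-windowed cosine, normalized to unit energy."""
    t = np.arange(n)
    g = np.exp(-0.5 * ((t - center) / sigma) ** 2) * np.cos(2 * np.pi * f * t)
    return g / (np.linalg.norm(g) + 1e-12)

def build_dictionary(n, freqs, sigmas, step):
    return np.stack([gabor_atom(n, f, s, c)
                     for f in freqs for s in sigmas for c in range(0, n, step)])

def matching_pursuit(x, D, n_atoms=20):
    """Greedy MP: at each step pick the atom with the largest absolute inner product."""
    residual, coeffs, idx = x.copy(), [], []
    for _ in range(n_atoms):
        ip = D @ residual
        k = int(np.argmax(np.abs(ip)))
        coeffs.append(ip[k]); idx.append(k)
        residual = residual - ip[k] * D[k]
    return np.array(idx), np.array(coeffs), residual

n = 512
freqs = [k / 64 for k in range(1, 9)]      # linearly spaced here for brevity
D = build_dictionary(n, freqs, sigmas=[16, 32], step=64)
x = gabor_atom(n, 3 / 64, 32, 256) + 0.3 * np.random.randn(n)   # one atom buried in noise
idx, coeffs, res = matching_pursuit(x, D, n_atoms=5)
print(idx[:3], np.linalg.norm(res) / np.linalg.norm(x))
```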
Procedia PDF Downloads 290
3991 Detect Circles in Image: Using Statistical Image Analysis
Authors: Fathi M. O. Hamed, Salma F. Elkofhaifee
Abstract:
The aim of this work is to detect geometrical shape objects in an image; here, the object is considered to be a circle. The identification requires finding three characteristics: the number, size, and location of the objects. To achieve this goal, the paper presents an algorithm that combines several statistical approaches with image analysis techniques. The algorithm has been implemented to meet the major objectives of this paper. It has been evaluated using simulated data, where it yields good results, and has then been applied to real data.
Keywords: image processing, median filter, projection, scale-space, segmentation, threshold
Procedia PDF Downloads 432
3990 A Hybrid Watermarking Model Based on Frequency of Occurrence
Authors: Hamza A. A. Al-Sewadi, Adnan H. M. Al-Helali, Samaa A. K. Khamis
Abstract:
Ownership proofs for multimedia such as text, image, audio or video files can be achieved by burying a watermark in them. This is done by introducing modifications into these files that are imperceptible to the human senses but easily recoverable by a computer program. These modifications may be in the time domain, the frequency domain, or both. This paper presents a watermarking procedure that mixes amplitude modulation with a frequency-transform histogram; namely, a specific value is used to modulate the intensity component Y of the YIQ components of the carrier image. This scheme is referred to as the histogram embedding technique (HET). Comparison of the results with those of other techniques, such as the discrete wavelet transform (DWT), discrete cosine transform (DCT) and singular value decomposition (SVD), has shown enhanced efficiency in terms of ease and performance. The method has manifested a good degree of robustness against various environmental effects such as resizing, rotation and different kinds of noise, and would prove a very useful technique for copyright protection and ownership judgment.
Keywords: authentication, copyright protection, information hiding, ownership, watermarking
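The luminance-modulation idea can be sketched as follows: convert RGB to YIQ, add a small amplitude offset to Y according to the watermark bits, and convert back. The ±strength bit layout and parameter values are assumptions; the paper's histogram-based embedding and extraction are not reproduced.

```python
import numpy as np

# NTSC RGB -> YIQ matrix
RGB2YIQ = np.array([[0.299,  0.587,  0.114],
                    [0.596, -0.274, -0.322],
                    [0.211, -0.523,  0.312]])

def embed_in_y(rgb, watermark_bits, strength=2.0):
    """Modulate the Y (luminance) component with watermark bits (+strength for 1, -strength for 0)."""
    yiq = rgb.astype(float) @ RGB2YIQ.T
    flat = yiq[:, :, 0].ravel()
    bits = np.resize(watermark_bits, flat.size)   # tile the bit pattern over all pixels
    flat += strength * (2 * bits - 1)
    yiq[:, :, 0] = flat.reshape(yiq.shape[:2])
    rgb_out = yiq @ np.linalg.inv(RGB2YIQ).T      # back to RGB
    return np.clip(rgb_out, 0, 255)

img = np.random.randint(0, 256, (64, 64, 3))
marked = embed_in_y(img, np.array([1, 0, 1, 1, 0]))
print(np.abs(marked - img).mean())                # small average distortion
```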
Procedia PDF Downloads 565
3989 Adaptive Dehazing Using Fusion Strategy
Authors: M. Ramesh Kanthan, S. Naga Nandini Sujatha
Abstract:
The goal of haze removal algorithms is to enhance and recover the details of a scene from a foggy image. For enhancement, the proposed method focuses on two main components: (i) image enhancement based on adaptive contrast histogram equalization, and (ii) an image edge-strengthened gradient model. In many circumstances, accurate haze removal algorithms are needed. The de-fog feature works through an algorithm which first determines the fog density of the scene and then analyses the obscured image before applying contrast and sharpness adjustments to the video in real time. The fusion strategy is driven by the intrinsic properties of the original image and is highly dependent on the choice of the inputs and the weights. The output haze-free image is then reconstructed using the fusion methodology; in order to increase the accuracy, an interpolation method is used in the output reconstruction. A promising retrieval performance is achieved, especially on particular examples.
Keywords: single image, fusion, dehazing, multi-scale fusion, per-pixel, weight map
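The sketch below shows a simplified per-pixel fusion of two derived inputs, a CLAHE-enhanced image and a sharpened image, using a weight map built from local gradient magnitude (OpenCV). The particular inputs, weight design and file names are assumptions standing in for the paper's adaptive scheme.

```python
import cv2
import numpy as np

def normalize(m):
    return (m - m.min()) / (m.max() - m.min() + 1e-12)

def dehaze_fusion(gray):
    """Blend a CLAHE-enhanced input and a sharpened input with a gradient-based weight map."""
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    enhanced = clahe.apply(gray)                          # adaptive contrast input
    blur = cv2.GaussianBlur(gray, (0, 0), 3)
    sharpened = cv2.addWeighted(gray, 1.5, blur, -0.5, 0) # edge-strengthened input
    gx = cv2.Sobel(gray, cv2.CV_64F, 1, 0)
    gy = cv2.Sobel(gray, cv2.CV_64F, 0, 1)
    w = normalize(np.hypot(gx, gy))                       # per-pixel weight map
    fused = w * sharpened.astype(float) + (1 - w) * enhanced.astype(float)
    return np.clip(fused, 0, 255).astype(np.uint8)

hazy = cv2.imread('hazy.png', cv2.IMREAD_GRAYSCALE)       # hypothetical input file
if hazy is not None:
    cv2.imwrite('dehazed.png', dehaze_fusion(hazy))
```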
Procedia PDF Downloads 464
3988 Co-Factors of Hypertension and Decomposition of Inequalities in Its Prevalence in India: Evidence from NFHS-4
Authors: Ayantika Biswas
Abstract:
Hypertension remains one of the most important preventable contributors to adult mortality and morbidity and a major public health challenge worldwide. Studying regional and rural-urban differences in its prevalence and assessing the contributions of different indicators are essential for determining the drivers of this condition. The 2015-16 National Family Health Survey data have been used for the study. Bivariate analysis, multinomial regression analysis, concentration indices and decomposition of concentration indices assessing the contribution of factors have been undertaken in the present study. An overall concentration index of 0.003 has been found for the hypertensive population, which shows its concentration among the richer wealth quintiles. Factors like age 45 to 49 years and 5 to 9 years of schooling are important contributors to inequality in hypertension occurrence. Studies should be conducted to find approaches to prevent or delay the onset of the condition.
Keywords: hypertension, decomposition, inequalities, India
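The headline statistic, a concentration index, is the scaled covariance between the health variable and the fractional wealth rank, CI = 2·cov(h, r)/mean(h); positive values indicate concentration among the better-off, as reported here. The sketch below computes it on synthetic data (the survey weighting and decomposition used with NFHS-4 are not reproduced).

```python
import numpy as np

def concentration_index(health, wealth):
    """CI = 2*cov(h, r)/mean(h), where r is the fractional rank of individuals ordered by wealth."""
    order = np.argsort(wealth)
    h = np.asarray(health, dtype=float)[order]
    n = len(h)
    rank = (np.arange(1, n + 1) - 0.5) / n           # fractional rank in (0, 1)
    return 2.0 * np.cov(h, rank, bias=True)[0, 1] / h.mean()

# toy example: hypertension slightly more common among richer individuals -> small positive CI
rng = np.random.default_rng(1)
wealth = rng.random(5000)
hypertensive = rng.random(5000) < (0.15 + 0.05 * wealth)
print(round(concentration_index(hypertensive.astype(float), wealth), 4))
```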
Procedia PDF Downloads 139
3987 Speaker Identification by Atomic Decomposition of Learned Features Using Computational Auditory Scene Analysis Principals in Noisy Environments
Authors: Thomas Bryan, Veton Kepuska, Ivica Kostanic
Abstract:
Speaker recognition is performed in high Additive White Gaussian Noise (AWGN) environments using principles of Computational Auditory Scene Analysis (CASA). CASA methods often classify sounds from images in the time-frequency (T-F) plane, using spectrograms or cochleagrams as the image. In this paper, atomic decomposition implemented by matching pursuit performs a transform from time-series speech signals to the T-F plane. The atomic decomposition creates a sparsely populated T-F vector in “weight space”, where each populated T-F position contains an amplitude weight. The weight-space vector, along with the atomic dictionary, represents a denoised, compressed version of the original signal, and the arrangement of the atomic indices in the T-F vector is used for classification. Unsupervised feature learning implemented by a sparse autoencoder learns a single dictionary of basis features from a collection of envelope samples from all speakers. The approach is demonstrated using pairs of speakers from the TIMIT data set. Pairs of speakers are selected randomly from a single district. Each speaker has 10 sentences; two are used for training and 8 for testing. Atomic index probabilities are created for each training sentence and also for each test sentence, and classification is performed by finding the lowest Euclidean distance between the probabilities from the training sentences and the test sentences. Training is done at a 30 dB Signal-to-Noise Ratio (SNR), and testing is performed at SNRs of 0 dB, 5 dB, 10 dB and 30 dB. The algorithm has a baseline classification accuracy of ~93%, averaged over 10 pairs of speakers from the TIMIT data set. The baseline accuracy is attributable to the short sequences of training and test data as well as the overall simplicity of the classification algorithm. The accuracy is not affected by AWGN, and the algorithm produces ~93% accuracy at 0 dB SNR.
Keywords: time-frequency plane, atomic decomposition, envelope sampling, Gabor atoms, matching pursuit, sparse dictionary learning, sparse autoencoder
Procedia PDF Downloads 289