Search results for: Image Mining
1178 The Democratization of 3D Capturing: An Application Investigating Google Tango Potentials
Authors: Carlo Bianchini, Lorenzo Catena
Abstract:
The appearance of 3D scanners and then, more recently, of image-based systems that generate point clouds directly from common digital images has deeply affected the survey process in terms of both capturing and 2D/3D modelling. In this context, low-cost and mobile systems are increasingly playing a key role and are actually paving the way to the democratization of what was in the past the realm of a few specialized technicians and expensive equipment. The application of Google Tango to the ancient church of Santa Maria delle Vigne in Pratica di Mare, Rome, presented in this paper is one such example.
Keywords: Architectural survey, augmented/mixed/virtual reality, Google Tango project, image-based 3D capturing.
1177 Efficient HAAR Wavelet Transform with Embedded Zerotrees of Wavelet Compression for Color Images
Authors: S. Piramu Kailasam
Abstract:
This study compresses true color images with compression algorithms in different color spaces in order to provide high compression rates. A high compression ratio is needed to save storage space; a further aim is to rank the compression algorithms within a suitable color space. The dataset is a sequence of true color images of size 128 x 128. The HAAR wavelet is one of the best-known wavelet transforms; it has great potential and maintains the image quality of color images. The HAAR wavelet transform with the Set Partitioning in Hierarchical Trees (SPIHT) algorithm is applied in different color spaces to compress the sequence of images at various angles. Embedded Zerotrees of Wavelet (EZW) is a powerful standard method for sequence data. Hence, the proposed compression framework of HAAR wavelet, XYZ color space, morphological gradient and EZW compression applied to the images obtained an improvement over other methods in terms of Compression Ratio, Mean Square Error, Peak Signal-to-Noise Ratio and Bits Per Pixel quality measures.
Keywords: Color Spaces, HAAR Wavelet, Morphological Gradient, Embedded Zerotrees Wavelet Compression.
1176 An Approach to Image Extraction and Accurate Skin Detection from Web Pages
Authors: Moheb R. Girgis, Tarek M. Mahmoud, Tarek Abd-El-Hafeez
Abstract:
This paper proposes a system to extract images from web pages and then detect the skin color regions of these images. As part of the proposed system, using the BandObject control, we built a toolbar named 'Filter Tool Bar (FTB)' by modifying the Pavel Zolnikov implementation. The Yahoo! team provides the Yahoo! SDK API, which also supports image search and is very useful. In the proposed system, we introduce three new methods for extracting images from web pages (after loading the web page using the proposed FTB, before loading the web page physically from the localhost, and before loading the web page from any server). These methods overcome the drawback of the regular-expression method for extracting images suggested by Ilan Assayag. The second part of the proposed system is concerned with the detection of the skin color regions of the extracted images. To this end, we studied two well-known skin color detection techniques. The first technique is based on the RGB color space and the second on the YUV and YIQ color spaces. We modified the second technique to overcome its failure in detecting complex image backgrounds by using the saturation parameter, obtaining accurate skin detection results. The performance of the proposed system in extracting images before and after loading the web page from localhost or any server, in terms of the number of extracted images, is evaluated. Finally, the results of comparing the two skin detection techniques in terms of the number of pixels detected are presented.
Keywords: Browser Helper Object, Color spaces, Image and URL extraction, Skin detection, Web Browser events.
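As an illustration of the kind of RGB-space rule the first technique refers to, the sketch below applies the widely cited explicit RGB thresholds; these particular threshold values are an assumption and are not necessarily the ones used in the paper.

```python
import numpy as np

def rgb_skin_mask(img):
    """Flag skin-colored pixels with a classical explicit RGB rule.

    `img` is an H x W x 3 uint8 array in RGB order. The thresholds are the
    commonly quoted heuristic values, not the paper's exact parameters.
    """
    r, g, b = (img[..., i].astype(np.int32) for i in range(3))
    spread = img.max(axis=2).astype(np.int32) - img.min(axis=2)  # max-min channel spread
    return ((r > 95) & (g > 40) & (b > 20) & (spread > 15)
            & (np.abs(r - g) > 15) & (r > g) & (r > b))
```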
1175 Rigid and Non-rigid Registration of Binary Objects using the Weighted Ratio Image
Authors: Panos Kotsas, Tony Dodd
Abstract:
This paper presents the application of a signal-intensity-independent similarity criterion for rigid and non-rigid body registration of binary objects. The criterion is defined as the weighted ratio image of two images. The ratio is computed on a voxel-per-voxel basis, and weighting is performed by setting the ratios between signal and background voxels to a standard high value. The mean squared value of the weighted ratio is computed over the union of the signal areas of the two images and is minimized using the Chebyshev polynomial approximation.
Keywords: Rigid and non-rigid body registration, binary objects.
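A minimal NumPy sketch of the weighted-ratio criterion as described in the abstract; the high weighting value and the small epsilon used to avoid division by zero are illustrative choices, not values taken from the paper.

```python
import numpy as np

def weighted_ratio_criterion(img_a, img_b, high_value=100.0, eps=1e-6):
    """Mean squared weighted ratio of two binary images (illustrative sketch).

    Following the abstract: the ratio is computed voxel by voxel, ratios
    between signal and background voxels are set to a fixed high value,
    and the mean square is taken over the union of the two signal areas.
    """
    a = img_a.astype(float)
    b = img_b.astype(float)
    ratio = a / (b + eps)
    signal_a = img_a > 0
    signal_b = img_b > 0
    mismatch = signal_a ^ signal_b      # signal in one image, background in the other
    ratio[mismatch] = high_value        # penalize signal/background pairs
    union = signal_a | signal_b
    return np.mean(ratio[union] ** 2)
```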
1174 A Nonlinear Parabolic Partial Differential Equation Model for Image Enhancement
Authors: Tudor Barbu
Abstract:
We present a robust nonlinear parabolic partial differential equation (PDE)-based denoising scheme in this article. Our approach is based on a second-order anisotropic diffusion model that is described first. Then, a consistent and explicit numerical approximation algorithm is constructed for this continuous model by using the finite-difference method. Finally, our restoration experiments and method comparison, which prove the effectiveness of this proposed technique, are discussed in this paper.
Keywords: Image denoising and restoration, nonlinear PDE model, anisotropic diffusion, numerical approximation scheme, finite differences.
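For orientation, the sketch below implements a standard explicit finite-difference Perona-Malik anisotropic diffusion step; it illustrates the class of second-order diffusion schemes the abstract refers to, not the authors' specific model, and the iteration count, kappa and time step are placeholder values.

```python
import numpy as np

def perona_malik(u, n_iter=20, kappa=15.0, dt=0.2):
    """Explicit finite-difference anisotropic diffusion (Perona-Malik style)."""
    u = u.astype(float).copy()
    for _ in range(n_iter):
        # differences towards the four neighbours (periodic borders for brevity)
        dn = np.roll(u, -1, axis=0) - u
        ds = np.roll(u, 1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u, 1, axis=1) - u
        # edge-stopping (diffusivity) function
        cn, cs = np.exp(-(dn / kappa) ** 2), np.exp(-(ds / kappa) ** 2)
        ce, cw = np.exp(-(de / kappa) ** 2), np.exp(-(dw / kappa) ** 2)
        u += dt * (cn * dn + cs * ds + ce * de + cw * dw)
    return u
```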
1173 A Secure Semi-Fragile Watermarking Scheme for Authentication and Recovery of Images Based On Wavelet Transform
Authors: Rafiullah Chamlawi, Asifullah Khan, Adnan Idris, Zahid Munir
Abstract:
Authentication of multimedia contents has gained much attention in recent times. In this paper, we propose a secure semi-fragile watermarking scheme with a choice of two watermarks to be embedded. This technique operates in the integer wavelet domain and makes use of semi-fragile watermarks for achieving better robustness. A self-recovering algorithm is employed that hides the image digest in some wavelet subbands to detect possible malevolent object manipulation undergone by the image (object replacing and/or deletion). The semi-fragility makes the scheme tolerant to JPEG lossy compression down to a quality of 70%, and the tampered area is located accurately. In addition, the system ensures more security because the embedded watermarks are protected with private keys. The computational complexity is reduced using a parameterized integer wavelet transform. Experimental results show that the proposed scheme guarantees the safety of the watermark, image recovery and accurate location of the tampered area.
Keywords: Integer Wavelet Transform (IWT), Discrete Cosine Transform (DCT), JPEG Compression, Authentication and Self-Recovery.
1172 Performance Comparison of ADTree and Naive Bayes Algorithms for Spam Filtering
Authors: Thanh Nguyen, Andrei Doncescu, Pierre Siegel
Abstract:
Classification is an important data mining technique and can be used for data filtering in artificial intelligence. The broad applicability of classification to all kinds of data means that it is used in nearly every field of modern life. Classification helps us to group different items according to the feature items considered interesting and useful. In this paper, we compare two classification methods, Naïve Bayes and ADTree, used to detect spam e-mail. This choice is motivated by the fact that the Naive Bayes algorithm is based on probability calculus while the ADTree algorithm is based on a decision tree. The parameter settings of the above classifiers use the maximization of the true positive rate and minimization of the false positive rate. The experimental results present classification accuracy and cost analysis in view of the optimal classifier choice for spam detection. The number of attributes is pointed out to obtain a tradeoff between their number and the classification accuracy.
Keywords: Classification, data mining, spam filtering, naive Bayes, decision tree.
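A small scikit-learn comparison in the spirit of the abstract. ADTree is a Weka algorithm with no scikit-learn counterpart, so a plain decision tree stands in for it here, and the synthetic feature matrix is a placeholder rather than the paper's e-mail corpus.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import confusion_matrix

# Placeholder data standing in for a spam feature matrix (not the paper's corpus).
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for name, clf in [("Naive Bayes", GaussianNB()),
                  ("Decision tree", DecisionTreeClassifier(random_state=0))]:
    y_pred = clf.fit(X_tr, y_tr).predict(X_te)
    tn, fp, fn, tp = confusion_matrix(y_te, y_pred).ravel()
    tpr, fpr = tp / (tp + fn), fp / (fp + tn)   # the rates the paper optimizes
    print(f"{name}: TPR={tpr:.3f}, FPR={fpr:.3f}")
```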
1171 A Modified AES Based Algorithm for Image Encryption
Authors: M. Zeghid, M. Machhout, L. Khriji, A. Baganne, R. Tourki
Abstract:
With the fast evolution of digital data exchange, information security becomes very important in data storage and transmission. Due to the increasing use of images in industrial processes, it is essential to protect confidential image data from unauthorized access. In this paper, we analyze the Advanced Encryption Standard (AES), and we add a key stream generator (A5/1, W7) to AES to improve the encryption performance, mainly for images characterised by reduced entropy. The implementation of both techniques has been realized for experimental purposes. Detailed results in terms of security analysis and implementation are given. A comparative study with traditional encryption algorithms shows the superiority of the modified algorithm.
Keywords: Cryptography, Encryption, Advanced Encryption Standard (AES), ECB mode, statistical analysis, key stream generator.
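A minimal sketch of the baseline the paper modifies: encrypting raw image bytes with standard AES in ECB mode via pycryptodome. The A5/1 / W7 key-stream extension proposed by the authors is not reproduced, and the key and image below are placeholders.

```python
import numpy as np
from Crypto.Cipher import AES  # pycryptodome

key = bytes(16)                                               # placeholder 128-bit key
img = np.random.randint(0, 256, (128, 128), dtype=np.uint8)   # placeholder image

data = img.tobytes()
data += bytes(-len(data) % 16)        # pad to the 16-byte AES block size
cipher = AES.new(key, AES.MODE_ECB)   # baseline ECB mode as listed in the keywords
encrypted = np.frombuffer(cipher.encrypt(data), dtype=np.uint8)
print(encrypted[:16])
```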
1170 High Capacity Data Hiding based on Predictor and Histogram Modification
Authors: Hui-Yu Huang, Shih-Hsu Chang
Abstract:
In this paper, we propose a high-capacity image hiding technique based on pixel prediction and the difference of the modified histogram. This approach uses pixel prediction and the difference of the modified histogram to calculate the best embedding point. It improves the predictive accuracy and increases the pixel difference to raise the hiding capacity. We also use histogram modification to prevent overflow and underflow. Experimental results demonstrate that, at the same average hiding capacity, our proposed method can still keep high image quality and low distortion.
Keywords: Data hiding, predictor.
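The sketch below shows the underlying histogram-shifting idea on raw pixel values, assuming a nearly empty bin exists above the histogram peak; the paper's actual scheme works on prediction-error differences, which is not reproduced here, and the choice of peak/zero bins follows the classical formulation rather than the authors' rule.

```python
import numpy as np

def hs_embed(img, bits):
    """Basic histogram-shifting embedding (illustrative, not the paper's method).

    Assumes the peak bin is below 255 and that a (nearly) empty bin exists
    above it, so the shift does not cause information loss.
    """
    out = img.astype(np.int32).copy()
    hist = np.bincount(out.ravel(), minlength=256)
    peak = int(hist.argmax())
    zero = peak + 1 + int(hist[peak + 1:].argmin())   # emptiest bin above the peak
    out[(out > peak) & (out < zero)] += 1             # make room next to the peak
    flat = out.ravel()
    idx = np.flatnonzero(flat == peak)[:len(bits)]
    flat[idx] += np.asarray(bits[:len(idx)], dtype=np.int32)  # embed 0/1 at peak pixels
    return flat.reshape(img.shape).astype(np.uint8), peak, zero
```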
1169 Scatterer Density in Edge and Coherence Enhancing Nonlinear Anisotropic Diffusion for Medical Ultrasound Speckle Reduction
Authors: Ahmed Badawi, J. Michael Johnson, Mohamed Mahfouz
Abstract:
This paper proposes new enhancement models for nonlinear anisotropic diffusion methods to greatly reduce speckle and preserve image features in medical ultrasound images. By incorporating local physical characteristics of the image, in this case scatterer density, in addition to the gradient, into existing tensor-based image diffusion methods, we were able to greatly improve the performance of the existing filtering methods, namely edge-enhancing (EE) and coherence-enhancing (CE) diffusion. The new enhancement methods were tested using various ultrasound images, including phantom and some clinical images, to determine the amount of speckle reduction, edge, and coherence enhancements. Scatterer density weighted nonlinear anisotropic diffusion (SDWNAD) for ultrasound images consistently outperformed its traditional tensor-based counterparts that use the gradient only to weight the diffusivity function. SDWNAD is shown to greatly reduce speckle noise while preserving image features such as edges, orientation coherence, and scatterer density. The superior performance of SDWNAD over nonlinear coherent diffusion (NCD), speckle reducing anisotropic diffusion (SRAD), adaptive weighted median filter (AWMF), wavelet shrinkage (WS), and wavelet shrinkage with contrast enhancement (WSCE) makes it an ideal preprocessing step for automatic segmentation in ultrasound imaging.
Keywords: Nonlinear anisotropic diffusion, ultrasound imaging, speckle reduction, scatterer density estimation, edge-based enhancement, coherence enhancement.
1168 Segmentation Free Nastalique Urdu OCR
Authors: Sobia T. Javed, Sarmad Hussain, Ameera Maqbool, Samia Asloob, Sehrish Jamil, Huma Moin
Abstract:
The electronically available Urdu data is in image form, which is very difficult to process; printed Urdu data is the root cause of the problem. So, for the rapid progress of the Urdu language, we need OCR systems that can help make Urdu data available to the common person. Research has been carried out for years to automate Arabic and Urdu script recognition, but the biggest hurdle in the development of Urdu OCR is the challenge of recognizing the Nastalique script, which is taken as the standard for writing the Urdu language. The Nastalique script is written diagonally with no fixed baseline, which makes the script somewhat complex. Overlap is present not only in characters but in the ligatures as well. This paper proposes a method that allows successful recognition of the Nastalique script.
Keywords: HMM, Image processing, Optical Character Recognition, Urdu OCR.
1167 Similarity Based Retrieval in Case Based Reasoning for Analysis of Medical Images
Authors: M. Das Gupta, S. Banerjee
Abstract:
Content Based Image Retrieval (CBIR) coupled with Case Based Reasoning (CBR) is a paradigm that is becoming increasingly popular in the diagnosis and therapy planning of medical ailments utilizing the digital content of medical images. This paper presents a survey of some of the promising approaches used in the detection of abnormalities in retina images, as well as in mammographic screening and the detection of regions of interest in MRI scans of the brain. We also describe our proposed algorithm to detect hard exudates in fundus images of the retina of Diabetic Retinopathy patients.
Keywords: Case based reasoning, Exudates, Retina image, Similarity based retrieval.
1166 Reuse of Huge Industrial Areas
Authors: Martina Perinkova, Lenka Kolarcikova, Marketa Twrda
Abstract:
Brownfields are one of the most important problems that today's cities must solve. The topic of this article is the description of a comprehensive transformation of the post-industrial area of the former ironworks, the national cultural heritage site Lower Vítkovice. The city of Ostrava used to be an industrial powerhouse of the Czechoslovak Republic, especially in the areas of coal mining and iron production; after industrial production and mining declined in the 1980s, many areas of former factories were left unused, generally as brownfields and backfields. Since the late 1990s, we have been observing how city officials and private entities have sought to remedy this situation. The regeneration of brownfields is a very expensive and long-term process. The area has now been rebuilt for tourists and residents of the city as an entertainment, cultural, and social center. It was necessary to reconstruct the industrial monuments. Equally important was the construction of new buildings, which helped the reuse of the entire complex. This is a unique example of the transformation of technical monuments and the completion of necessary new buildings, so that the area could start working again and reintegrate back into the urban system.
Keywords: Brownfields, conversion, historical and industrial buildings, reconstruction.
1165 Recognition of Gene Names from Gene Pathway Figures Using Siamese Network
Authors: Muhammad Azam, Micheal Olaolu Arowolo, Fei He, Mihail Popescu, Dong Xu
Abstract:
The number of biological papers is growing quickly, which means that the number of biological pathway figures in those papers is also increasing quickly. Each pathway figure shows extensive biological information, such as the names of genes and how the genes are related. However, manually annotating pathway figures takes a lot of time and work. Even though using advanced image understanding models could speed up the process of curation, these models still need to be made more accurate. To improve gene name recognition from pathway figures, we applied a Siamese network to map image segments to a library of pictures containing known genes, in a way similar to person recognition from photos in many photo applications. We used a triplet loss function and a triplet spatial pyramid pooling network, combining a triplet convolutional neural network with spatial pyramid pooling (TSPP-Net). We compared VGG19 and VGG16 as the Siamese network model. VGG16 achieved the better performance, with an accuracy of 93%, which is much higher than Optical Character Recognition (OCR) results.
Keywords: Biological pathway, image understanding, gene name recognition, object detection, Siamese network, Visual Geometry Group.
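A minimal PyTorch sketch of the triplet setup described above; the tiny linear embedding network and random tensors are placeholders standing in for the VGG16 model with spatial pyramid pooling and the real pathway-figure crops.

```python
import torch
import torch.nn as nn

embed = nn.Sequential(nn.Flatten(), nn.Linear(64 * 64, 128))   # toy embedding network
loss_fn = nn.TripletMarginLoss(margin=1.0)

anchor = torch.randn(8, 1, 64, 64)      # image-segment crops (random placeholders)
positive = torch.randn(8, 1, 64, 64)    # library image of the same gene
negative = torch.randn(8, 1, 64, 64)    # library image of a different gene

loss = loss_fn(embed(anchor), embed(positive), embed(negative))
loss.backward()                          # gradients would drive same-gene pairs closer
print(float(loss))
```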
1164 High Capacity Spread-Spectrum Watermarking for Telemedicine Applications
Authors: Basant Kumar, Animesh Anand, S.P. Singh, Anand Mohan
Abstract:
This paper presents a new spread-spectrum watermarking algorithm for digital images in the discrete wavelet transform (DWT) domain. The algorithm is applied for embedding watermarks, such as patient identification/source identification or a doctor's signature in binary image format, into a host digital radiological image for potential telemedicine applications. The performance of the algorithm is analysed by varying the gain factor, subband decomposition levels, and size of the watermark. Simulation results show that the proposed method achieves higher watermarking capacity.
Keywords: Watermarking, spread-spectrum, discrete wavelet transform, telemedicine.
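An illustrative additive embedding of a pseudo-random watermark in a DWT detail subband using PyWavelets; the gain factor, subband choice and random host image are assumptions for demonstration and may differ from the paper's exact embedding rule.

```python
import numpy as np
import pywt

host = np.random.randint(0, 256, (256, 256)).astype(float)   # placeholder radiograph
rng = np.random.default_rng(0)
watermark = rng.choice([-1.0, 1.0], size=(128, 128))          # pseudo-random spread sequence
gain = 2.0                                                    # illustrative gain factor

cA, (cH, cV, cD) = pywt.dwt2(host, 'haar')                    # one-level DWT
cH_marked = cH + gain * watermark                             # embed in a detail subband
watermarked = pywt.idwt2((cA, (cH_marked, cV, cD)), 'haar')
print(np.abs(watermarked - host).max())                       # embedding distortion
```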
1163 Denoising based on Wavelets and Deblurring via Self-Organizing Map for Synthetic Aperture Radar Images
Authors: Mario Mastriani
Abstract:
This work deals with unsupervised image deblurring. We present a new deblurring procedure for images provided by low-resolution synthetic aperture radar (SAR) or simply by multimedia sources, in the presence of multiplicative (speckle) or additive noise, respectively. The method we propose is defined as a two-step process. First, we use an original technique for noise reduction in the wavelet domain. Then, the learning of a Kohonen self-organizing map (SOM) is performed directly on the denoised image to remove the blur from it. This technique has been successfully applied to real SAR images, and the simulation results are presented to demonstrate the effectiveness of the proposed algorithms.
Keywords: Blur, Kohonen self-organizing map, noise, speckle, synthetic aperture radar.
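A simple soft-threshold wavelet shrinkage sketch of the kind of wavelet-domain noise reduction the first step refers to; the wavelet, decomposition level and universal threshold are generic choices rather than the authors' original technique, and the SOM deblurring step is not shown.

```python
import numpy as np
import pywt

def wavelet_denoise(img, sigma=10.0, wavelet='db2', level=2):
    """Soft-threshold wavelet shrinkage (generic illustration)."""
    coeffs = pywt.wavedec2(img.astype(float), wavelet, level=level)
    thresh = sigma * np.sqrt(2 * np.log(img.size))      # universal threshold
    new_coeffs = [coeffs[0]] + [
        tuple(pywt.threshold(c, thresh, mode='soft') for c in detail)
        for detail in coeffs[1:]                         # threshold detail subbands only
    ]
    return pywt.waverec2(new_coeffs, wavelet)
```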
1162 An Efficient Burst Errors Combating for Image Transmission over Mobile WPANs
Authors: Mohsen A. M. El-Bendary, Mostafa A. R. El-Tokhy
Abstract:
This paper presents an efficient burst-error spreading tool. It also studies a vital issue in wireless communications, namely the transmission of images over wireless networks. IEEE ZigBee 802.15.4 is a short-range communication standard that can be used for small-distance multimedia transmissions. In fact, the ZigBee network is a Wireless Personal Area Network (WPAN), which needs a strong interleaving mechanism for protection against error bursts. It is also a low-power technology utilized in Wireless Sensor Network (WSN) implementations. This paper presents the chaotic interleaving scheme as a data randomization tool for this purpose. This scheme depends on the chaotic Baker map. The mobility effects on the image transmission are studied at different velocities utilizing the Jakes’ model. A comparison study between the proposed chaotic interleaving scheme and the traditional block and convolutional interleaving schemes for image transmission over a correlated fading channel is presented. The simulation results show the superiority of the proposed chaotic interleaving scheme over the traditional schemes.
Keywords: WPANs, Burst Errors, Mobility, Interleaving Techniques, Fading channels.
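For comparison, the sketch below shows the traditional block interleaving principle the chaotic scheme is benchmarked against; the chaotic Baker map permutation itself is not reproduced here, and the 4 x 4 matrix is a toy size.

```python
import numpy as np

def block_interleave(bits, rows, cols):
    """Write row by row into a rows x cols matrix, read out column by column."""
    return np.asarray(bits).reshape(rows, cols).T.ravel()

def block_deinterleave(bits, rows, cols):
    """Inverse of block_interleave for the same matrix shape."""
    return np.asarray(bits).reshape(cols, rows).T.ravel()

data = np.arange(16)                 # placeholder bit/byte stream
tx = block_interleave(data, 4, 4)
rx = block_deinterleave(tx, 4, 4)
assert np.array_equal(rx, data)      # a burst in tx is spread out after deinterleaving
```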
1161 Acute Coronary Syndrome Prediction Using Data Mining Techniques - An Application
Authors: Tahseen A. Jilani, Huda Yasin, Madiha Yasin, C. Ardil
Abstract:
In this paper, we use data mining techniques to investigate factors that contribute significantly to enhancing the risk of acute coronary syndrome. We assume that the dependent variable is the diagnosis, with dichotomous values showing the presence or absence of the disease. We have applied binary regression to the factors affecting the dependent variable. The data set has been taken from two different cardiac hospitals in Karachi, Pakistan. We have a total of sixteen variables, of which one is assumed dependent and the other fifteen are independent variables. For better performance of the regression model in predicting acute coronary syndrome, data reduction techniques such as principal component analysis were applied. Based on the results of the data reduction, we have considered only fourteen of the sixteen factors.
Keywords: Acute coronary syndrome (ACS), binary logistic regression analyses, myocardial ischemia (MI), principal component analysis, unstable angina (U.A.).
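An illustrative scikit-learn pipeline matching the described approach, principal component analysis followed by binary logistic regression; the synthetic 15-feature data set is a placeholder for the hospital records used in the paper.

```python
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

# 15 placeholder predictors, binary outcome (presence/absence of disease).
X, y = make_classification(n_samples=300, n_features=15, random_state=0)
model = make_pipeline(PCA(n_components=14),                 # data reduction step
                      LogisticRegression(max_iter=1000))     # binary logistic regression
print(cross_val_score(model, X, y, cv=5).mean())
```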
1160 Data Mining Determination of Sunlight Average Input for Solar Power Plant
Authors: Fl. Loury, P. Sablonière, C. Lamoureux, G. Magnier, Th. Gutierrez
Abstract:
A method is proposed to extract faithful representative patterns from a data set of observations when they suffer from non-negligible fluctuations. Supposing the time interval between measurements to be extremely small compared to the observation time, it consists in first defining a subset of intermediate time intervals characterizing coherent behavior. Data projection on these intervals gives a set of curves, out of which an ideally “perfect” one is constructed by taking their sup limit. Comparison with the average real curve in the corresponding interval then gives an efficiency parameter expressing the degradation consecutive to the fluctuation effect. The method is applied to sunlight data collected at a specific place, where the ideal sunlight is the one resulting from direct exposure at the location latitude over the year, and efficiency results from the action of meteorological parameters, mainly cloudiness, at different periods of the year. The extracted information already gives interesting elements for decision, before being used for analysis of plant control.
Keywords: Base Input Reconstruction, Data Mining, Efficiency Factor, Information Pattern Operator.
1159 An Improved Switching Median Filter for Uniformly Distributed Impulse Noise Removal
Authors: Rajoo Pandey
Abstract:
The performance of an image filtering system depends on its ability to detect the presence of noisy pixels in the image. Most impulse detection schemes assume the presence of salt-and-pepper noise in the images and do not work satisfactorily in the case of uniformly distributed impulse noise. In this paper, a new algorithm is presented to improve the performance of the switching median filter in the detection of uniformly distributed impulse noise. The performance of the proposed scheme is demonstrated by the results obtained from computer simulations on various images.
Keywords: Switching median filter, Impulse noise, Image filtering, Impulse detection.
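A generic switching median filter sketch in which only pixels flagged as impulses are replaced by the window median; the absolute-difference detector and its threshold are placeholder choices, not the detection rule proposed in the paper.

```python
import numpy as np
from scipy.ndimage import median_filter

def switching_median(img, threshold=40):
    """Replace only detected impulse pixels with the 3x3 window median."""
    med = median_filter(img.astype(float), size=3)
    noisy = np.abs(img.astype(float) - med) > threshold   # crude impulse detector
    out = img.astype(float).copy()
    out[noisy] = med[noisy]                                # filter only flagged pixels
    return out.astype(img.dtype)
```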
1158 An 8-Bit, 100-MSPS Fully Dynamic SAR ADC for Ultra-High Speed Image Sensor
Authors: F. Rarbi, D. Dzahini, W. Uhring
Abstract:
In this paper, a dynamic and power-efficient 8-bit, 100-MSPS Successive Approximation Register (SAR) Analog-to-Digital Converter (ADC) is presented. The circuit uses a non-differential capacitive Digital-to-Analog (DAC) architecture segmented by 2. The prototype is produced in a commercial 65-nm 1P7M CMOS technology with a 1.2-V supply voltage. The size of the core ADC is 208.6 x 103.6 µm2. The post-layout noise simulation results feature an SNR of 46.9 dB at the Nyquist frequency, which corresponds to an effective number of bits (ENOB) of 7.5 b. The total power consumption of this SAR ADC is only 1.55 mW at 100 MSPS. It then achieves a figure of merit of 85.6 fJ/step.
Keywords: CMOS analog to digital converter, dynamic comparator, image sensor application, successive approximation register.
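The quoted figure of merit is consistent with the standard Walden formula FoM = P / (2^ENOB * f_s); a quick check:

```python
# Checking the quoted figure of merit with the Walden formula.
P = 1.55e-3          # total power, W
ENOB = 7.5           # effective number of bits
fs = 100e6           # sampling rate, samples/s
fom = P / (2 ** ENOB * fs)
print(f"{fom * 1e15:.1f} fJ/step")   # ~85.6 fJ/step, matching the abstract
```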
1157 The Sequestration of Heavy Metals Contaminating the Wonderfonteinspruit Catchment Area using Natural Zeolite
Authors: P.P. Diale, S.S.L. Mkhize, E. Muzenda, J. Zimba
Abstract:
For more than 120 years, gold mining formed the backbone of South Africa's economy. The consequence of mine closure was observed in large-scale land degradation and widespread pollution of surface water and groundwater. This paper investigates the feasibility of using natural zeolite to remove heavy metals contaminating the Wonderfonteinspruit Catchment Area (WCA), a water stream with high levels of heavy metal and radionuclide pollution. Batch experiments were conducted to study the adsorption behavior of natural zeolite with respect to Fe2+, Mn2+, Ni2+, and Zn2+. The data were analysed using the Langmuir and Freundlich isotherms. The Langmuir isotherm was found to correlate the adsorption of Fe2+, Mn2+, Ni2+, and Zn2+ better, with adsorption capacities of 11.9 mg/g, 1.2 mg/g, 1.3 mg/g, and 14.7 mg/g, respectively. Two kinetic models, namely pseudo-first order and pseudo-second order, were also tested to fit the data. The pseudo-second order equation was found to be the best fit for the adsorption of heavy metals by natural zeolite. Zeolite functionalization with humic acid increased its uptake ability.
Keywords: Gold mining, natural zeolites, water pollution, West Rand.
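A SciPy sketch of fitting the Langmuir isotherm, q_e = q_max * K_L * C_e / (1 + K_L * C_e), to batch data; the concentration and uptake values below are placeholders, not the measurements reported for the WCA samples.

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(Ce, qmax, KL):
    """Langmuir isotherm: uptake as a function of equilibrium concentration."""
    return qmax * KL * Ce / (1 + KL * Ce)

Ce = np.array([1.0, 2.0, 5.0, 10.0, 20.0])     # equilibrium concentration, mg/L (placeholder)
qe = np.array([0.9, 1.6, 3.0, 4.2, 5.1])       # uptake, mg/g (placeholder)
(qmax, KL), _ = curve_fit(langmuir, Ce, qe, p0=[5.0, 0.1])
print(f"q_max = {qmax:.2f} mg/g, K_L = {KL:.3f} L/mg")
```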
1156 An Intelligent Nondestructive Testing System of Ultrasonic Infrared Thermal Imaging Based on Embedded Linux
Authors: Hao Mi, Ming Yang, Tian-yue Yang
Abstract:
Ultrasonic infrared nondestructive testing is a testing method with high speed, accuracy and localization. However, there are still some problems, such as the detection requiring manual real-time field judgment and the primitive methods of result storage and viewing. An intelligent nondestructive detection system based on embedded Linux is put forward in this paper. The hardware part of the detection system is based on an ARM (Advanced Reduced Instruction Set Computer Machine) core, and an embedded Linux system is built to realize image processing and defect detection of thermal images. The CLAHE algorithm and the Butterworth filter are used to process the thermal image, and then the Boa server and CGI (Common Gateway Interface) technology are used to transmit the test results to the display terminal through the network for real-time and remote monitoring. The system also reduces manual labor and eliminates the need for manual judgment. According to the experimental results, the system provides a convenient and quick solution for industrial nondestructive testing.
Keywords: Remote monitoring, non-destructive testing, embedded Linux system, image processing.
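A minimal OpenCV sketch of the CLAHE enhancement step named in the abstract; the clip limit, tile size and random frame are placeholder choices, and the Butterworth filtering stage is not shown.

```python
import cv2
import numpy as np

thermal = np.random.randint(0, 256, (240, 320), dtype=np.uint8)   # placeholder thermal frame
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))       # contrast-limited equalization
enhanced = clahe.apply(thermal)
print(enhanced.shape, enhanced.dtype)
```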
1155 Determining Cluster Boundaries Using Particle Swarm Optimization
Authors: Anurag Sharma, Christian W. Omlin
Abstract:
Self-organizing map (SOM) is a well-known data reduction technique used in data mining. Data visualization can reveal structure in data sets that is otherwise hard to detect from raw data alone. However, interpretation through visual inspection is prone to errors and can be very tedious. There are several techniques for the automatic detection of clusters of code vectors found by SOMs, but they generally do not take into account the distribution of code vectors; this may lead to unsatisfactory clustering and poor definition of cluster boundaries, particularly where the density of data points is low. In this paper, we propose the use of a generic particle swarm optimization (PSO) algorithm for finding cluster boundaries directly from the code vectors obtained from SOMs. The application of our method to unlabeled call data for a mobile phone operator demonstrates its feasibility. The PSO algorithm utilizes the U-matrix of SOMs to determine cluster boundaries; the results of this novel automatic method correspond well to boundary detection through visual inspection of code vectors and the k-means algorithm.
Keywords: Particle swarm optimization, self-organizing maps, clustering, data mining.
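A bare-bones generic PSO minimizer of the kind the method builds on; the U-matrix-based fitness used for finding cluster boundaries is replaced here by a simple sphere objective, and all hyperparameters are illustrative.

```python
import numpy as np

def pso(objective, dim, n_particles=30, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimize `objective` over R^dim with a standard global-best PSO."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))            # particle positions
    v = np.zeros_like(x)                                   # particle velocities
    pbest, pbest_val = x.copy(), np.array([objective(p) for p in x])
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        vals = np.array([objective(p) for p in x])
        better = vals < pbest_val
        pbest[better], pbest_val[better] = x[better], vals[better]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest

print(pso(lambda p: np.sum(p ** 2), dim=2))   # placeholder objective: sphere function
```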
1154 Comparison of Adsorbents for Ammonia Removal from Mining Wastewater
Authors: Farooq A. Al-Sheikh, Carol Moralejo, Mark Pritzker, William A. Anderson, Ali Elkamel
Abstract:
Ammonia in mining wastewater is a significant problem, and treatment can be especially difficult in cold climates where biological treatment is not feasible. An adsorption process is one of the alternative processes that can be used to reduce ammonia concentrations to acceptable limits, and therefore a strongly acidic H+-form LEWATIT ion exchange resin and a Na-form Bowie Chabazite AZLB-Na zeolite were tested to assess their effectiveness. For these adsorption tests, two packed bed columns (a mini-column constructed from a 32-cm long x 1-cm diameter piece of glass tubing, and a 60-cm long x 2.5-cm diameter Ace Glass chromatography column) were used containing varying quantities of the adsorbents. A mining wastewater with an ammonia concentration of 22.7 mg/L was fed through the columns at controlled flowrates. In the experimental work, maximum capacities of the LEWATIT ion exchange resin were 0.438, 0.448, and 1.472 mg/g for 3, 6, and 9 g respectively in the mini-column and 1.739 mg/g for 141.5 g in the larger Ace column, while the capacities for the AZLB-Na zeolite were 0.424 and 0.784 mg/g for 3 and 6 g respectively in the mini-column and 1.1636 mg/g for 38.5 g in the Ace column. In the theoretical work, Thomas, Adams-Bohart, and Yoon-Nelson models were constructed to describe the breakthrough curve of the adsorption process and find the constants of the above-mentioned models. In the regeneration tests, 5% hydrochloric acid, HCl (v/v), and 10% sodium hydroxide, NaOH (w/v), were used to regenerate the LEWATIT resin and AZLB-Na zeolite with 44 and 63.8% recovery, respectively. In conclusion, continuous flow adsorption using a LEWATIT ion exchange resin and an AZLB-Na zeolite is efficient when using a co-flow technique for removal of the ammonia from wastewater. Thomas, Adams-Bohart, and Yoon-Nelson models satisfactorily fit the data, with R2 close to 1 in all cases.
Keywords: AZLB-Na zeolite, continuous adsorption, LEWATIT resin, models, regeneration.
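An illustrative SciPy fit of the Thomas breakthrough model, C_t/C_0 = 1 / (1 + exp(k_Th*q_0*m/Q - k_Th*C_0*t)); the bed mass, flow rate and synthetic breakthrough data are placeholders rather than the column measurements reported above.

```python
import numpy as np
from scipy.optimize import curve_fit

C0, m, Q = 22.7, 6.0, 5.0    # feed concentration (mg/L), bed mass (g), flow (mL/min) - placeholders

def thomas(t, kTh, q0):
    """Thomas model for the breakthrough ratio C_t/C_0 as a function of time."""
    return 1.0 / (1.0 + np.exp(kTh * q0 * m / Q - kTh * C0 * t))

t = np.linspace(1, 200, 20)                 # time, min (placeholder)
ratio = thomas(t, 0.002, 1800.0)            # synthetic breakthrough data (placeholder)
(kTh, q0), _ = curve_fit(thomas, t, ratio, p0=[0.001, 1000.0])
print(f"k_Th = {kTh:.4f} L/(mg*min), q_0 = {q0:.1f} mg/g")
```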
1153 A Simulation Software for DNA Computing Algorithms Implementation
Authors: M. S. Muhammad, S. M. W. Masra, K. Kipli, N. Zamhari
Abstract:
The capture of a gel electrophoresis image represents the output of a DNA computing algorithm. Before this image is captured, DNA computing involves parallel overlap assembly (POA) and the polymerase chain reaction (PCR), which are the core of this computing algorithm. However, the design of the DNA oligonucleotides to represent a problem is quite complicated and prone to errors. In order to reduce these errors during the design stage, before the actual in-vitro experiment is carried out, a simulation software capable of simulating the POA and PCR processes was developed. The capability of this simulation software is unlimited, in that problems of any size and complexity can be simulated, thus saving cost due to possible errors during the design process. Information regarding the DNA sequence during the computing process, as well as the computing output, can be extracted at the same time using the simulation software.
Keywords: DNA computing, PCR, POA, simulation software.
1152 New Features for Specific JPEG Steganalysis
Authors: Johann Barbier, Eric Filiol, Kichenakoumar Mayoura
Abstract:
We present in this paper a new approach to specific JPEG steganalysis and propose studying the statistics of the compressed DCT coefficients. Traditionally, steganographic algorithms try to preserve the statistics of the DCT and of the spatial domain, but they cannot preserve both and also control the alteration of the compressed data. We have noticed a deviation of the entropy of the compressed data after a first embedding. This deviation is greater when the image is a cover medium than when the image is a stego image. To observe this deviation, we pointed out new statistical features and combined them with the Multiple Embedding Method. This approach is motivated by the avalanche criterion of the JPEG lossless compression step. This criterion makes possible the design of detectors whose detection rates are independent of the payload. Finally, we designed a Fisher discriminant based classifier for the well-known steganographic algorithms Outguess, F5 and Hide and Seek. The experimental results we obtained show the efficiency of our classifier for these algorithms. Moreover, it is also designed to work with low embedding rates (< 10^-5) and, according to the avalanche criterion of the RLE and Huffman compression step, its efficiency is independent of the quantity of hidden information.
Keywords: Compressed frequency domain, Fisher discriminant, specific JPEG steganalysis.
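A minimal Fisher discriminant classifier via scikit-learn's linear discriminant analysis, in the spirit of the classifier described above; the synthetic feature vectors stand in for the compressed-domain statistics the authors extract.

```python
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split

# Synthetic cover-vs-stego feature vectors (placeholders for the real features).
X, y = make_classification(n_samples=1000, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = LinearDiscriminantAnalysis().fit(X_tr, y_tr)   # Fisher linear discriminant
print(clf.score(X_te, y_te))
```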
1151 Advanced Image Analysis Tools Development for the Early Stage Bronchial Cancer Detection
Authors: P. Bountris, E. Farantatos, N. Apostolou
Abstract:
Autofluorescence (AF) bronchoscopy is an established method to detect dysplasia and carcinoma in situ (CIS). For this reason, the “Sotiria” Hospital uses the Karl Storz D-light system. However, in early tumor stages the visualization is not that obvious. With the help of a PC, we analyzed the color images we captured by developing certain tools in Matlab®. We used statistical methods based on texture analysis, signal processing methods based on Gabor models, and conversion algorithms between device-dependent color spaces. We believe that we reduced the error made by the naked eye. The tools we implemented improve the quality of patients' lives.
Keywords: Bronchoscopy, digital image processing, lung cancer, texture analysis.
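An illustrative Gabor texture feature extraction step of the kind named in the abstract, using OpenCV; the kernel parameters, orientations and random frame are generic assumptions, not the study's settings.

```python
import cv2
import numpy as np

frame = np.random.randint(0, 256, (256, 256), dtype=np.uint8)   # placeholder AF frame
features = []
for theta in np.arange(0, np.pi, np.pi / 4):                    # 4 orientations
    kernel = cv2.getGaborKernel((21, 21), sigma=4.0, theta=theta,
                                lambd=10.0, gamma=0.5, psi=0)
    response = cv2.filter2D(frame.astype(np.float32), -1, kernel)
    features.append((response.mean(), response.std()))          # simple texture statistics
print(features)
```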
1150 Application Methodology for the Generation of 3D Thermal Models Using UAV Photogrammetry and Dual Sensors for Mining/Industrial Facilities Inspection
Authors: Javier Sedano-Cibrián, Julio Manuel de Luis-Ruiz, Rubén Pérez-Álvarez, Raúl Pereda-García, Beatriz Malagón-Picón
Abstract:
Structural inspection activities are necessary to ensure the correct functioning of infrastructures. UAV techniques have become more popular than traditional techniques. Specifically, UAV photogrammetry allows time and cost savings. The development of this technology has permitted the use of low-cost thermal sensors in UAVs. The representation of 3D thermal models with this type of equipment is in continuous evolution. The direct processing of thermal images usually leads to errors and inaccurate results. In this paper, a methodology is proposed for the generation of 3D thermal models using dual sensors, which involves the application of RGB and thermal images in parallel. Hence, the RGB images are used as the basis for the generation of the model geometry, and the thermal images are the source of the surface temperature information that is projected onto the model. The mining/industrial facility representations obtained can be used for inspection activities.
Keywords: Aerial thermography, data processing, drone, low-cost, point cloud.
1149 Determinants of Brand Equity: Offering a Model to Chocolate Industry
Authors: Emari Hossien
Abstract:
This study examined the underlying dimensions of brand equity in the chocolate industry. For this purpose, the researchers developed a model to identify which factors are influential in building brand equity. The second purpose was to assess the mediating effect of brand loyalty and brand image between brand attitude, brand personality and brand association, on the one hand, and brand equity, on the other. The study employed structural equation modeling to investigate the causal relationships between the dimensions of brand equity and brand equity itself. It specifically measured the way in which consumers' perceptions of the dimensions of brand equity affected the overall brand equity evaluations. Data were collected from a sample of consumers of the chocolate industry in Iran. The results of this empirical study indicate that brand loyalty and brand image are important components of brand equity in this industry. Moreover, the role of brand loyalty and brand image as mediating factors in the formation of brand equity is supported. The principal contribution of the present research is that it provides empirical evidence of the multidimensionality of consumer-based brand equity, supporting Aaker's and Keller's conceptualization of brand equity. The present research also enriches brand equity building by incorporating brand personality and brand image, as recommended by previous researchers. Moreover, creating the brand equity index for the chocolate industry of Iran in particular is novel.
Keywords: Brand equity, brand personality, structural equation modeling, Iran.