Search results for: Adjacent pixel intensity difference quantization (APIDQ)
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 805

565 A GPU Based Texture Mapping Technique for 3D Models Using Multi-View Images

Authors: In Lee, Kyung-Kyu Kang, Jaewoon Lee, Dongho Kim

Abstract:

Previous algorithms for generating and mapping textures for 3D models from multi-view images have issues in texture chart generation, namely self-intersection and concentration of the texture in texture space. They may also suffer from problems due to occluded areas, such as the inner parts of the thighs. In this paper we propose a texture mapping technique for 3D models using multi-view images on the GPU. We perform texture mapping directly in the GPU fragment shader, per pixel, without generating a texture map, and we resolve occluded areas using the 3D model's depth information. Our method requires more computation on the GPU than previous works, but it achieves real-time performance and the previously mentioned problems do not occur.
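
As a rough per-pixel illustration of the idea above (a sketch, not the authors' shader code), the following Python snippet projects a 3D surface point into one calibrated view and uses that view's depth map to decide whether the point is occluded before sampling its color; the intrinsics K, pose (R, t) and the depth tolerance eps are hypothetical placeholders.

    import numpy as np

    def sample_view_color(point, K, R, t, image, depth_map, eps=1e-2):
        """Project a 3D surface point into a calibrated view and return its
        color, or None if the point is occluded according to the depth map."""
        p_cam = R @ point + t                          # world -> camera coordinates
        if p_cam[2] <= 0:                              # behind the camera
            return None
        uv = K @ p_cam
        u, v = int(round(uv[0] / uv[2])), int(round(uv[1] / uv[2]))
        h, w = depth_map.shape
        if not (0 <= u < w and 0 <= v < h):            # projects outside the image
            return None
        if p_cam[2] > depth_map[v, u] + eps:           # something closer occludes it
            return None
        return image[v, u]

    # Tiny synthetic usage: one 4x4 view looking down the +z axis.
    K = np.array([[2.0, 0.0, 2.0], [0.0, 2.0, 2.0], [0.0, 0.0, 1.0]])
    R, t = np.eye(3), np.zeros(3)
    image = np.random.rand(4, 4, 3)
    depth_map = np.full((4, 4), 5.0)
    print(sample_view_color(np.array([0.0, 0.0, 4.0]), K, R, t, image, depth_map))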

Keywords: Texture Mapping, Multi-view Images, Camera Calibration, GPU Shader.

564 Identifying Blind Spots in a Stereo View for Early Decisions in SI for Fusion based DMVC

Authors: H. Ali, K. Hameed, N. Khan

Abstract:

In DMVC, more than one source is available for constructing the side information. Newer techniques make use of both sources simultaneously by constructing a bitmask that determines the source of every block or pixel of the side information. A lot of computation is done to determine each bit in the bitmask. In this paper, we have tried to define areas that can only be well predicted by temporal interpolation and not by multiview interpolation or synthesis. We posit that all areas that are not covered by two cameras cannot be appropriately predicted by multiview synthesis, and if we can identify such areas in the first place, we do not need to run the full set of computations for all the pixels that lie in those areas. Moreover, this paper also defines a technique based on KLT to mark the above-mentioned areas before any other processing is done on the side view.

Keywords: Side Information, Distributed Multiview Video Coding, Fusion, Early Decision.

563 Heat Flux Reduction Research in Hypersonic Flow with Opposing Jet

Authors: Yisheng Rong, Jian Sun, Weiqiang Liu, Renjun Zhan

Abstract:

A CFD study on heat flux reduction in hypersonic flow with an opposing jet has been conducted. Flowfield parameters, reattachment point position, surface pressure distributions and heat flux distributions are obtained and validated against experiments. The physical mechanism of heat reduction has been analyzed. When the opposing jet blows, the freestream is blocked off and flows to the edges without interacting with the surface, so aerodynamic heating is not formed there. At the same time, the jet flows back to form a cool recirculation region, which reduces the temperature difference between the surface and the nearby gas and thus reduces the heat flux. As the pressure ratio increases, the interface between the jet and the freestream is gradually pushed away from the surface. The larger the total pressure ratio, the lower the heat flux. To study the effect of the intensity of the opposing jet more reasonably, a new parameter, RPA, has been introduced by combining the flux and the total pressure ratio. The study shows that the same shock wave position and total heat load can be obtained with the same RPA but different fluxes and total pressures, which means the new parameter can stand for the intensity of the opposing jet and can be used to analyze the influence of the opposing jet on the flow field and on aerodynamic heating.

Keywords: opposing jet, aerodynamic heating, total pressure ratio, thermal protection system

562 Florida’s Groundwater and Surface Water System Reliability in Terms of Climate Change and Sea-Level Rise

Authors: Rahman Davtalab, Saba Ghotbi

Abstract:

Florida is one of the most vulnerable states to natural disasters among the 50 states of the USA. The state is exposed to tropical storms, hurricanes, storm surge, landslides, etc. Besides these natural phenomena, global warming, sea-level rise, and other anthropogenic environmental changes create a very complicated and unpredictable system for decision-makers. In this study, we highlight the effects of climate change and sea-level rise on surface water and groundwater systems for three different geographical locations in Florida: the Main Canal of Jacksonville Beach in the northeast of Florida, adjacent to the Atlantic Ocean; Grace Lake in central Florida, far from the surrounding coastline; and McDill in Florida, adjacent to Tampa Bay and the Gulf of Mexico. An integrated hydrologic and hydraulic model was developed and simulated for all three cases, including surface water, groundwater, or a combination of both. For the case of the Main Canal-Jacksonville Beach, the investigation showed that a 76 cm sea-level rise by the 2060 time horizon could increase the flow velocity of the tide cycle at the main canal's outlet and headwater. This case also revealed how sea-level rise could change the tide duration, potentially affecting the coastal ecosystem. As expected, sea-level rise can raise the groundwater level. Therefore, for the McDill case, the effect of groundwater rise on soil storage and on the performance of stormwater retention ponds is investigated. The study showed that sea-level rise increases the pond's seasonal high water level by up to 40 cm by the 2060 time horizon. The reliability of the retention pond drops from 99% under current conditions to 54% in the future. The results also show that the retention pond cannot retain and infiltrate the designed treatment volume within 72 hours, which is a significant indication of increasing pollutants in the future. The Grace Lake case study investigates the effects of climate change on groundwater recharge. Using dynamically downscaled data, this study showed that groundwater recharge can decline by up to 24% by the mid-21st century.

Keywords: groundwater, surface water, Florida, retention pond, tide, sea-level rise

561 Supercompression for Full-HD and 4k-3D (8k) Digital TV Systems

Authors: Mario Mastriani

Abstract:

In this work, we develop the concept of supercompression, i.e., compression above the compression standard used. In this context, both compression rates are multiplied. In fact, supercompression is based on super-resolution. That is to say, supercompression is a data compression technique that superposes spatial image compression on top of bit-per-pixel compression to achieve very high compression ratios. If the compression ratio is very high, then we use a convolutive mask inside the decoder that restores the edges, eliminating the blur. Finally, both the encoder and the complete decoder are implemented on General-Purpose computation on Graphics Processing Units (GPGPU) cards. Specifically, the mentioned mask is coded inside the texture memory of a GPGPU.
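
Because the spatial (super-resolution) stage and the bit-per-pixel stage act independently, their compression ratios multiply. The figures in the short sketch below are illustrative only and are not taken from the paper.

    # Illustrative only: downscaling a frame 2x in each dimension before
    # encoding gives a 4:1 spatial ratio; combined with, say, a 50:1
    # codec ratio, the overall "supercompression" ratio is their product.
    spatial_ratio = 2 * 2        # 2x downscale horizontally and vertically
    codec_ratio = 50             # hypothetical bit-per-pixel compression ratio
    overall_ratio = spatial_ratio * codec_ratio
    print(overall_ratio)         # 200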

Keywords: General-Purpose computation on Graphics Processing Units, Image Compression, Interpolation, Super-resolution.

560 Exploring the Activity Fabric of an Intelligent Environment with Hierarchical Hidden Markov Theory

Authors: Chiung-Hui Chen

Abstract:

The Internet of Things (IoT) was designed for widespread convenience. With the smart tag and the sensing network, a large quantity of dynamic information is immediately presented in the IoT. Through the internal communication and interaction, meaningful objects provide real-time services for users. Therefore, the service with appropriate decision-making has become an essential issue. Based on the science of human behavior, this study employed the environment model to record the time sequences and locations of different behaviors and adopted the probability module of the hierarchical Hidden Markov Model for the inference. The statistical analysis was conducted to achieve the following objectives: First, define user behaviors and predict the user behavior routes with the environment model to analyze user purposes. Second, construct the hierarchical Hidden Markov Model according to the logic framework, and establish the sequential intensity among behaviors to get acquainted with the use and activity fabric of the intelligent environment. Third, establish the intensity of the relation between the probability of objects’ being used and the objects. The indicator can describe the possible limitations of the mechanism. As the process is recorded in the information of the system created in this study, these data can be reused to adjust the procedure of intelligent design services.

Keywords: Behavior, big data, hierarchical Hidden Markov Model, intelligent object.

559 Epileptic Seizure Prediction by Exploiting Signal Transitions Phenomena

Authors: Mohammad Zavid Parvez, Manoranjan Paul

Abstract:

A seizure prediction method is proposed by extracting global features using phase correlation between adjacent epochs for detecting relative changes and local features using fluctuation/deviation within an epoch for determining fine changes of different EEG signals. A classifier and a regularization technique are applied for the reduction of false alarms and improvement of the overall prediction accuracy. The experiments show that the proposed method outperforms the state-of-the-art methods and provides high prediction accuracy (i.e., 97.70%) with low false alarm using EEG signals in different brain locations from a benchmark data set.
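
The phase correlation between two equal-length epochs can be computed from the normalized cross-power spectrum; the sketch below is a generic 1D version under that assumption, not the authors' exact feature pipeline.

    import numpy as np

    def phase_correlation(epoch_a, epoch_b):
        """Normalized cross-power spectrum of two equal-length epochs, mapped
        back to the sample domain.  A sharp peak means the epochs match up to
        a shift; a flat, noisy surface indicates a relative (global) change."""
        A = np.fft.fft(epoch_a)
        B = np.fft.fft(epoch_b)
        r = np.conj(A) * B
        r /= np.abs(r) + 1e-12            # keep only the phase information
        return np.real(np.fft.ifft(r))

    rng = np.random.default_rng(0)
    epoch1 = rng.standard_normal(256)
    epoch2 = np.roll(epoch1, 5)           # same activity, circularly shifted by 5
    print(np.argmax(phase_correlation(epoch1, epoch2)))   # 5: peak recovers the shift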

Keywords: Epilepsy, Seizure, Phase Correlation, Fluctuation, Deviation.

558 Automatic Real-Patient Medical Data De-Identification for Research Purposes

Authors: Petr Vcelak, Jana Kleckova

Abstract:

Our medicine-oriented research is based on a medical data set of real patients. Sharing patients' private data with people other than clinicians or hospital staff is a security problem. We have to remove personal identification information from the medical data. After a de-identification process, the medical data, stripped of private data, are available for any research purpose. In this paper, we introduce a universal, automatic, rule-based de-identification application that does all of this on heterogeneous medical data. A patient's private identification is replaced by a unique identification number, even in burned-in annotations in pixel data. The same identifier is used for all of a patient's medical data, so relationships in the data are preserved. The hospital can take advantage of research feedback based on the results.

Keywords: DASTA, De-identification, DICOM, Health Level Seven, Medical data, OCR, Personal data

557 Laser Transmission through Vegetative Material

Authors: Juliana A. Fracarolli, Adilson M. Enes, Inácio M. Dal Fabbro, Silvestre Rodrigues

Abstract:

The dynamic speckle, or biospeckle, is an interference phenomenon generated at the reflection of coherent light by an active surface or even by a particulate or living body surface. The above-mentioned phenomenon gave scientific support to a method named biospeckle, which has been employed to study seed viability, biological activity, tissue senescence, tissue water content, fruit bruising, etc. Since this method is not invasive and yields numerical values, it can be considered for possible automation associated with several processes, including selection and sorting. Based on these preliminary considerations, this research work proposes to study the interaction of a laser beam with vegetative samples by measuring the incident light intensity and the transmitted light beam intensity through vegetative slabs of varying thickness. Tests were carried out on fifteen slices of apple tissue divided into thickness groups of 4 mm, 5 mm, 18 mm and 22 mm. A 10 mW diode laser beam of 632 nm wavelength and a Samsung digital camera were employed to carry out the tests. Outgoing images were analyzed by comparing the gray gradient of a fixed image column of each image to obtain a laser penetration scale into the tissue, according to the slice thickness.

Keywords: Fruit, laser, laser transmission, vegetative tissue.

556 Numerical Investigation on Optimizing Fatigue Life in a Lap Joint Structure

Authors: P. Zamani, S. Mohajerzadeh, R. Masoudinejad, Kh. Farhangdoost

Abstract:

The riveting process is one of the important ways of fastening lap joints in aircraft structures. Failure of aircraft lap joints depends directly on the stress field in the joint. An important application of the riveting process is in the construction of aircraft fuselage structures. In this paper, a 3D finite element analysis is carried out in order to optimize the residual stress field in a riveted lap joint and to estimate its fatigue life. Next, a number of experiments are designed and analyzed using design of experiments (DOE). The Taguchi method is then used to select an optimized case among different levels of each factor, and the factor which most affects the residual stress field is investigated. The optimized case provides the maximum residual stress field. The fatigue life of the optimized joint is estimated by the Paris-Erdogan law. Stress intensity factors (SIFs) are calculated using both finite element analysis and an experimental formula. In addition, the effects of the residual stress field, geometry and secondary bending are considered in the SIF calculation. A good agreement is found between the results of these methods. Comparison between the optimized fatigue life and the fatigue life of other joints shows an improvement in the joint's life.
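
For reference, the Paris-Erdogan crack-growth law used for the life estimate has the standard form below, where a is the crack length, N the number of load cycles, ΔK the stress intensity factor range, C and m material constants, and a_0 and a_c the initial and critical crack lengths; the notation is generic and not tied to the paper's specific values.

    \frac{da}{dN} = C\,(\Delta K)^{m},
    \qquad
    N_f = \int_{a_0}^{a_c} \frac{da}{C\,\big(\Delta K(a)\big)^{m}}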

Keywords: Fatigue life, Residual stress, Riveting process, Stress intensity factor, Taguchi method.

555 A Comparative Study of Various Tone Mapping Methods

Authors: Yasir Salih, Aamir Saeed Malik, Wazirah bt. Md-Esa

Abstract:

In recent years, high dynamic range imaging has gained popularity with the advancement of digital photography. In this contribution we present a subjective evaluation of various tone production and tone mapping techniques by a number of participants. First, standard HDR images were used and the participants were asked to rate them based on a given rating scheme. After that, the participants were asked to rate HDR images generated using linear and nonlinear combination approaches from multiple exposure images. The experimental results showed that linearly generated HDR images have better visualization than the nonlinearly combined ones. In addition, the Reinhard et al. and exponential tone mapping operators showed better results compared to the logarithmic and Garrett et al. tone mapping operators.
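
As a point of reference for the operators compared, Reinhard's simple global operator and a logarithmic mapping can be sketched on a luminance channel as below; this is a generic illustration, not the exact implementations evaluated in the paper, and the key value 0.18 is the usual default assumption.

    import numpy as np

    def reinhard_global(luminance, key=0.18):
        """Reinhard's simple global operator: scale by the key value relative
        to the log-average luminance, then compress with L / (1 + L)."""
        log_avg = np.exp(np.mean(np.log(luminance + 1e-6)))
        scaled = key * luminance / log_avg
        return scaled / (1.0 + scaled)

    def log_mapping(luminance):
        """Simple logarithmic tone mapping normalized to [0, 1]."""
        return np.log1p(luminance) / np.log1p(luminance.max())

    hdr_luminance = np.random.rand(64, 64) * 1e4      # synthetic HDR luminance
    print(reinhard_global(hdr_luminance).max(), log_mapping(hdr_luminance).max())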

Keywords: tone mapping, high dynamic range, low dynamic range, bits per pixel.

554 Modeling the Effects of Type and Intensity of Selective Logging on Forests of the Amazon

Authors: Theodore N.S. Karfakis, Anna Andrade, Carolina Volkmer-Castilho, Dennis R. Valle, Eric Arets, Paul van Gardingen

Abstract:

The aim of the work presented here was to either use existing forest dynamics simulation models or calibrate a new one, both within the SYMFOR framework, with the purpose of examining changes in stand-level basal area and functional composition in response to selective logging, considering trees > 10 cm d.b.h., for two areas of undisturbed Amazonian non-flooded tropical forest in Brazil and one in Peru. Model biological realism was evaluated for forest in the undisturbed and selectively logged states, and it was concluded that forest dynamics were realistically represented. Results of the logging simulation experiments showed that, relative to undisturbed forest simulations subject to no form of harvesting intervention, there was a significant amount of change over a 90-year simulation period that was positively proportional to the intensity of logging. Areas which had, in the dynamic equilibrium of undisturbed forest, a greater proportion of a specific ecological guild of trees known as the light hardwoods (LHWs) seemed to respond more favorably, in terms of less deviation, but only within a specific range of baseline forest composition, beyond which compositional diversity became more important. These findings are partially in line with practical management experience and partially with basic systematics theory, respectively.

Keywords: Amazon basin, ecological species guild, selective logging, simulation modeling.

553 Spatial Objects Shaping with High-Pressure Abrasive Water Jet Controlled By Virtual Image Luminance

Authors: P. J. Borkowski, J. A. Borkowski

Abstract:

The paper presents a novel method for the 3D shaping of different materials using a high-pressure abrasive water jet and a flat target image. For steering the movement of the jet, a principle similar to the raster-image way of recording and readout is used: the respective colors of the pixels of such a bitmap are connected with corresponding jet feed rates, which cause erosion of the material to corresponding depths. Thanks to that innovation, one can observe spatial imaging of the object. The theoretical basis, a spatial model of material shaping, and the experimental stand, including the steering program, are presented. Methodology and some experimental erosion results are also presented, as well as a practical example of an object's bas-relief made of metal.
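
A minimal sketch of the steering principle described above, assuming a grayscale bitmap in which darker pixels are to be eroded deeper and are therefore traversed at a lower feed rate; the feed-rate bounds are hypothetical.

    import numpy as np

    def pixel_to_feed_rate(gray, v_min=50.0, v_max=800.0):
        """Map 8-bit pixel values to jet feed rates (mm/min): dark pixels
        (deep erosion) get the slowest feed, white pixels the fastest."""
        gray = gray.astype(float) / 255.0
        return v_min + gray * (v_max - v_min)

    raster_line = np.array([[0, 128, 255]], dtype=np.uint8)   # toy raster line
    print(pixel_to_feed_rate(raster_line))     # slowest, intermediate, fastest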

Keywords: High-pressure, abrasive, water jet, material shaping.

552 Fast Document Segmentation Using Contour and X-Y Cut Technique

Authors: Boontee Kruatrachue, Narongchai Moongfangklang, Kritawan Siriboon

Abstract:

This paper describes a fast and efficient method for page segmentation of documents containing non-rectangular blocks. The segmentation is based on an edge-following algorithm using a small window of 16 by 32 pixels. This segmentation is very fast since only the border pixels of each paragraph are used, without scanning the whole page. Still, the segmentation may contain errors if the space between blocks is smaller than the window used in edge following. Consequently, this paper reduces this error by first identifying the missed segmentation points using the direction information from edge following and then applying an X-Y cut at the missed segmentation points to separate the connected columns. The advantage of the proposed method is the fast identification of missed segmentation points. This methodology is faster, with less overhead, than other algorithms that need to access many more pixels of a document.
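
For context, the classical recursive X-Y cut splits a page at the widest all-background gap of its horizontal or vertical projection profile; the compact sketch below works on a binary NumPy image and uses a hypothetical minimum gap width.

    import numpy as np

    def xy_cut(binary, min_gap=10, boxes=None, y0=0, x0=0):
        """Recursively split a binary page image (1 = ink) at the widest
        interior run of empty rows or columns; collect leaf regions as boxes."""
        if boxes is None:
            boxes = []
        rows = binary.sum(axis=1)
        cols = binary.sum(axis=0)
        for axis, profile in ((0, rows), (1, cols)):
            gap = _widest_gap(profile, min_gap)
            if gap is not None:
                a, b = gap
                if axis == 0:
                    xy_cut(binary[:a, :], min_gap, boxes, y0, x0)
                    xy_cut(binary[b:, :], min_gap, boxes, y0 + b, x0)
                else:
                    xy_cut(binary[:, :a], min_gap, boxes, y0, x0)
                    xy_cut(binary[:, b:], min_gap, boxes, y0, x0 + b)
                return boxes
        boxes.append((y0, x0, y0 + binary.shape[0], x0 + binary.shape[1]))
        return boxes

    def _widest_gap(profile, min_gap):
        """Widest run of zeros that does not touch the borders, or None."""
        best, start = None, None
        for i, value in enumerate(profile):
            if value == 0 and start is None:
                start = i
            elif value != 0 and start is not None:
                if start > 0 and i - start >= min_gap and \
                        (best is None or i - start > best[1] - best[0]):
                    best = (start, i)
                start = None
        return best

    page = np.zeros((60, 60), dtype=int)
    page[5:20, 5:55] = 1      # an upper text block
    page[35:55, 5:25] = 1     # a lower-left block
    print(xy_cut(page))       # bounding boxes (y0, x0, y1, x1)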

Keywords: Contour Direction Technique, Missed Segmentation Points, Page Segmentation, Recursive X-Y Cut Technique

551 Intra Prediction using Weighted Average of Pixel Values According to Prediction Direction

Authors: Kibaek Kim, Dongjin Jung, Jinik Jang, Jechang Jeong

Abstract:

In this paper, we propose a method to reduce quantization error. In order to reduce quantization error, low-pass filtering is applied to the neighboring samples of the current block in H.264/AVC. However, this has the weakness that the low-pass filtering is performed regardless of the prediction direction. Since it does not consider the prediction direction, it may not reduce quantization error effectively. The proposed method considers the prediction direction for low-pass filtering and uses a threshold condition to reduce flag bits. We compare our experimental results with the conventional method in H.264/AVC and achieve an average bit-rate reduction of 1.534% by applying the proposed method. Bit-rate reductions between 0.580% and 3.567% are shown in the experimental results.
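
To make the idea concrete, the sketch below applies the familiar [1, 2, 1]/4 smoothing filter to a row of reference samples only when the chosen prediction direction is close to vertical; the direction test and its threshold are illustrative assumptions, not the exact H.264/AVC rule or the authors' final design.

    import numpy as np

    def smooth_reference(ref, mode_angle_deg, angle_threshold=22.5):
        """Apply a [1, 2, 1]/4 low-pass filter to reference samples only if
        the prediction direction is nearly vertical (illustrative rule)."""
        if abs(mode_angle_deg - 90.0) > angle_threshold:
            return ref.astype(float)               # leave references unfiltered
        padded = np.pad(ref.astype(float), 1, mode="edge")
        return (padded[:-2] + 2 * padded[1:-1] + padded[2:]) / 4.0

    ref_row = np.array([100, 104, 180, 176, 60, 64, 64, 60])
    print(smooth_reference(ref_row, mode_angle_deg=90))   # filtered
    print(smooth_reference(ref_row, mode_angle_deg=0))    # untouched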

Keywords: Coding efficiency, H.264/AVC, Intra prediction, Low pass filter

550 Plants Cover Effects on Overland Flow and on Soil Erosion under Simulated Rainfall Intensity

Authors: H. Madi, L. Mouzai, M. Bouhadef

Abstract:

The purpose of this article is to study the effects of plant cover on overland flow and, therefore, its influence on the amount of eroded and transported soil. In this investigation, all the experiments were conducted in the LEGHYD laboratory using a rainfall simulator and a soil tray. The experiments were conducted using an experimental plot (soil tray) 2 m long, 0.5 m wide and 0.15 m deep. The soil used is an agricultural sandy soil (62.08% coarse sand, 19.14% fine sand, 11.57% silt and 7.21% clay). Plastic rods (4 mm in diameter) were used to simulate the plants at different densities: 0 stems/m² (bare soil), 126 stems/m², 203 stems/m², 461 stems/m² and 2500 stems/m². The rainfall intensity used is 73 mm/h and the soil tray slope is fixed at 3°. The results have shown that overland flow velocities decrease with increasing stem density, and that the cover density has a great effect on sediment concentration. The Darcy-Weisbach and Manning friction coefficients of overland flow increase as the stem density increases. The Froude and Reynolds numbers decrease with increasing stem density and, consequently, the flow regime of all treatments was laminar and subcritical. From these findings, we conclude that increasing the plant cover can efficiently reduce soil loss and avoid denuding the plant roots.
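
For reference, the friction and flow-regime characterizations above follow the standard definitions below (mean velocity V, flow depth h, hydraulic radius R, slope S, kinematic viscosity ν, gravitational acceleration g); these are textbook forms, not values fitted to the experiments. Overland flow is commonly classed as laminar at small Reynolds numbers (on the order of Re < 500) and as subcritical when Fr < 1.

    f = \frac{8\,g\,R\,S}{V^{2}},
    \qquad
    Re = \frac{V\,R}{\nu},
    \qquad
    Fr = \frac{V}{\sqrt{g\,h}}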

Keywords: Soil erosion, vegetation, stems density, overland flow.

549 A High-Crosstalk Silicon Photonic Arrayed Waveguide Grating

Authors: Qing Fang, Lianxi Jia, Junfeng Song, Chao Li, Xianshu Luo, Mingbin Yu, Guoqiang Lo

Abstract:

In this paper, we demonstrated a 1 × 4 silicon photonic cascaded arrayed waveguide grating, which is fabricated on an SOI wafer with a 220 nm top Si layer and a 2 µm buried oxide layer. The measured on-chip transmission loss of this cascaded arrayed waveguide grating is ~5.6 dB, including the fiber-to-waveguide coupling loss. The adjacent crosstalk is 33.2 dB. Compared to a normal single silicon photonic arrayed waveguide grating with a crosstalk of ~12.5 dB, the crosstalk of this device has been dramatically increased.

Keywords: Silicon photonic, arrayed waveguide grating, high-crosstalk, cascaded structure.

548 Color Image Segmentation using Adaptive Spatial Gaussian Mixture Model

Authors: M. Sujaritha, S. Annadurai

Abstract:

An adaptive spatial Gaussian mixture model is proposed for clustering-based color image segmentation. A new clustering objective function which incorporates spatial information is introduced in the Bayesian framework. The weighting parameter controlling the importance of the spatial information is made adaptive to the image content to augment the smoothness towards piecewise-homogeneous regions and diminish the edge-blurring effect, hence the name adaptive spatial finite mixture model. The proposed approach is compared with the spatially variant finite mixture model for pixel labeling. The experimental results with synthetic images and the Berkeley dataset demonstrate that the proposed method is effective in improving segmentation and can be employed in different practical image content understanding applications.
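
A minimal sketch of the general idea, using scikit-learn's standard GaussianMixture on color features augmented with scaled pixel coordinates as a simple stand-in for the spatial term; the adaptive weighting described in the paper is not reproduced here.

    import numpy as np
    from sklearn.mixture import GaussianMixture

    def segment_image(rgb, n_segments=4, spatial_weight=0.2):
        """Cluster pixels with a Gaussian mixture over (color, weighted x, y)."""
        h, w, _ = rgb.shape
        ys, xs = np.mgrid[0:h, 0:w]
        coords = np.stack([ys / h, xs / w], axis=-1) * spatial_weight
        features = np.concatenate([rgb.reshape(-1, 3) / 255.0,
                                   coords.reshape(-1, 2)], axis=1)
        gmm = GaussianMixture(n_components=n_segments, covariance_type="full",
                              random_state=0).fit(features)
        return gmm.predict(features).reshape(h, w)

    image = np.random.randint(0, 256, (32, 32, 3))
    print(segment_image(image).shape)        # (32, 32) label map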

Keywords: Adaptive, Spatial, Mixture model, Segmentation, Color.

547 Computer Study of Cluster Mechanism of Anti-greenhouse Effect

Authors: A. Galashev

Abstract:

Absorption spectra of infrared (IR) radiation of a disperse water medium absorbing the most important greenhouse gases (CO2, N2O, CH4, C2H2, C2H6) have been calculated by the molecular dynamics method. Loss of absorbing ability upon the formation of clusters, due to a reduction of the number of centers interacting with IR radiation, results in an anti-greenhouse effect. Absorption of O3 molecules by the (H2O)50 cluster is investigated during its interaction with Cl- ions. The splitting of the ozone molecule into atoms near the cluster surface was observed. Interaction of the water cluster with Cl- ions causes an increase of the integrated intensity of the IR emission spectra and an essential reduction of the corresponding characteristic of the Raman spectrum. The relative integrated intensity of IR absorption for small water clusters was determined. The dependence of mass on altitude for monomer vapor, clusters, droplets, crystals, and the total moisture was determined. The anti-greenhouse effect of clusters was defined as the difference between the increases of the average global temperature of the Earth caused by absorption of IR radiation by the free water molecules forming clusters and by absorption by the clusters themselves. The greenhouse effect caused by clusters amounts to 0.53 K, and the anti-greenhouse one is equal to 1.14 K. The increase of CO2 concentration in the atmosphere does not always correlate with the amplification of the greenhouse effect.

Keywords: Greenhouse gases, infrared absorption and Raman spectra, molecular dynamics method, water clusters.

546 An Efficient Feature Extraction Algorithm for the Recognition of Handwritten Arabic Digits

Authors: Ahmad T. Al-Taani

Abstract:

In this paper, an efficient structural approach for recognizing on-line handwritten digits is proposed. After reading the digit from the user, the slope is estimated and normalized for adjacent nodes. Based on the changes of sign of the slope values, the primitives are identified and extracted. The names of these primitives are represented by strings, and then a finite state machine, which contains the grammars of the digits, is traced to identify the digit. Finally, if there is any ambiguity, it is resolved. Experiments showed that this technique is flexible and can achieve high recognition accuracy for the shapes of the digits represented in this work.

Keywords: Digits Recognition, Pattern Recognition, Feature Extraction, Structural Primitives, Document Processing, Handwritten Recognition, Primitives Selection.

545 Video Data Mining based on Information Fusion for Tamper Detection

Authors: Girija Chetty, Renuka Biswas

Abstract:

In this paper, we propose novel algorithmic models based on information fusion and feature transformation in a cross-modal subspace for different types of residue features extracted from several intra-frame and inter-frame pixel sub-blocks in video sequences, for detecting digital video tampering or forgery. An evaluation of the proposed residue features (the noise residue features and the quantization features), their transformation in the cross-modal subspace, and their multimodal fusion, for an emulated copy-move tamper scenario, shows a significant improvement in tamper detection accuracy as compared to single-mode features without transformation in the cross-modal subspace.

Keywords: image tamper detection, digital forensics, correlation features, image fusion

544 Computer-Aided Classification of Liver Lesions Using Contrasting Features Difference

Authors: Hussein Alahmer, Amr Ahmed

Abstract:

Liver cancer is one of the common diseases that cause death. Early detection is important for diagnosis and for reducing the incidence of death. Improvements in medical imaging and image processing techniques have significantly enhanced the interpretation of medical images. Computer-Aided Diagnosis (CAD) systems based on these techniques play a vital role in the early detection of liver disease and hence reduce the liver cancer death rate. This paper presents an automated CAD system consisting of three stages: first, automatic liver segmentation and lesion detection; second, feature extraction; and finally, classification of liver lesions into benign and malignant using the novel contrasting feature-difference approach. Several types of intensity and texture features are extracted from both the lesion area and its surrounding normal liver tissue. The difference between the features of the two areas is then used as the new lesion descriptor. Machine learning classifiers are then trained on the new descriptors to automatically classify liver lesions as benign or malignant. The experimental results show promising improvements. Moreover, the proposed approach can overcome the problems of varying ranges of intensity and texture between patients, demographics, and imaging devices and settings.
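
A simplified sketch of the contrasting feature-difference idea: compute the same descriptors for the lesion region and for its surrounding normal tissue, subtract them, and train a standard classifier on the differences. The feature set, classifier, and toy data below are illustrative assumptions, not the paper's exact choices.

    import numpy as np
    from sklearn.svm import SVC

    def region_features(pixels):
        """Simple intensity statistics of a set of pixel values."""
        return np.array([pixels.mean(), pixels.std(),
                         np.percentile(pixels, 10), np.percentile(pixels, 90)])

    def contrast_descriptor(lesion_pixels, surrounding_pixels):
        """Lesion descriptors minus surrounding-tissue descriptors."""
        return region_features(lesion_pixels) - region_features(surrounding_pixels)

    # Toy training set: 0 = benign, 1 = malignant (synthetic pixel samples).
    rng = np.random.default_rng(1)
    X, y = [], []
    for label in (0, 1):
        for _ in range(20):
            surround = rng.normal(100, 10, 500)
            lesion = rng.normal(100 + 30 * label, 10 + 5 * label, 300)
            X.append(contrast_descriptor(lesion, surround))
            y.append(label)
    clf = SVC(kernel="rbf").fit(np.array(X), np.array(y))
    print(clf.predict([contrast_descriptor(rng.normal(130, 15, 300),
                                            rng.normal(100, 10, 500))]))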

Keywords: CAD system, difference of feature, Fuzzy c means, Liver segmentation.

543 Hidden State Probabilistic Modeling for Complex Wavelet Based Image Registration

Authors: F. C. Calnegru

Abstract:

This article presents a computationally tractable probabilistic model for the relation between the complex wavelet coefficients of two images of the same scene. The two images are acquired at distinct moments in time, or from distinct viewpoints, or by distinct sensors. By means of the introduced probabilistic model, we argue that the similarity between the two images is controlled not by the values of the wavelet coefficients, which can be altered by many factors, but by the nature of the wavelet coefficients, which we model with the help of hidden state variables. We integrate this probabilistic framework into the construction of a new image registration algorithm. This algorithm has sub-pixel accuracy and is robust to noise and to other variations such as local illumination changes. We present the performance of our algorithm on various image types.

Keywords: Complex wavelet transform, image registration, modeling using hidden state variables, probabilistic similarity measure.

542 Color Image Segmentation Using Kekre's Algorithm for Vector Quantization

Authors: H. B. Kekre, Tanuja K. Sarode, Bhakti Raul

Abstract:

In this paper we propose a segmentation approach based on the Vector Quantization technique. Here we have used Kekre's fast codebook generation algorithm for segmenting low-altitude aerial images. This is used as a preprocessing step to form segmented homogeneous regions. Further, to merge adjacent regions, color similarity and volume difference criteria are used. Experiments performed with real aerial images of varied nature demonstrate that this approach does not result in over-segmentation or under-segmentation. Vector quantization seems to give far better results as compared to the conventional on-the-fly watershed algorithm.
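
A rough sketch of the vector-quantization step, using ordinary k-means as a stand-in for Kekre's fast codebook generation algorithm (which is not reproduced here): pixels are assigned to the nearest codevector, and the resulting label map forms the initial homogeneous regions, which would then be merged using the color-similarity and volume-difference criteria described above.

    import numpy as np
    from sklearn.cluster import KMeans

    def vq_label_image(rgb, codebook_size=8):
        """Quantize pixel colors with a small codebook and return the label
        map (k-means stands in for Kekre's fast codebook algorithm here)."""
        h, w, _ = rgb.shape
        pixels = rgb.reshape(-1, 3).astype(float)
        km = KMeans(n_clusters=codebook_size, n_init=4, random_state=0).fit(pixels)
        return km.labels_.reshape(h, w), km.cluster_centers_

    aerial = np.random.randint(0, 256, (64, 64, 3))
    labels, codevectors = vq_label_image(aerial)
    print(labels.shape, codevectors.shape)      # (64, 64) (8, 3)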

Keywords: Image Segmentation, Codebook, Codevector, data compression, Encoding

541 Topology Preservation in SOM

Authors: E. Arsuaga Uriarte, F. Díaz Martín

Abstract:

The SOM has several beneficial features which make it a useful method for data mining. One of the most important features is the ability to preserve the topology in the projection. There are several measures that can be used to quantify the goodness of the map in order to obtain the optimal projection, including the average quantization error and many topological errors. Many researchers have studied how topology preservation should be measured. One option consists of using the topographic error, which considers the ratio of data vectors for which the first and second best matching units (BMUs) are not adjacent. In this work we present a study of the behaviour of the topographic error in different kinds of maps. We have found that this error undervalues rectangular maps, and we have studied the reasons why this happens. Finally, we suggest a new topological error to improve on this deficiency of the topographic error.
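
The topographic error mentioned above can be computed directly from the trained codebook and the map lattice: for each data vector, find its two best matching units and check whether they are neighbors on the grid. The small sketch below assumes a rectangular lattice and Euclidean distances.

    import numpy as np

    def topographic_error(data, weights, grid_positions):
        """Fraction of data vectors whose best and second-best matching units
        are not adjacent on the map lattice (Chebyshev distance > 1)."""
        errors = 0
        for x in data:
            d = np.linalg.norm(weights - x, axis=1)
            bmu1, bmu2 = np.argsort(d)[:2]
            if np.abs(grid_positions[bmu1] - grid_positions[bmu2]).max() > 1:
                errors += 1
        return errors / len(data)

    # Toy 3x4 rectangular map with 2-D weight vectors and random data.
    rows, cols = 3, 4
    grid = np.array([(r, c) for r in range(rows) for c in range(cols)])
    rng = np.random.default_rng(0)
    weights = rng.random((rows * cols, 2))
    data = rng.random((100, 2))
    print(topographic_error(data, weights, grid))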

Keywords: Map lattice, Self-Organizing Map, topographic error, topology preservation.

540 Automatic Authentication of Handwritten Documents via Low Density Pixel Measurements

Authors: Abhijit Mitra, Pranab Kumar Banerjee, C. Ardil

Abstract:

We introduce an effective approach for automatic offline authentication of handwritten samples where the forgeries are skillfully done, i.e., the true and forged sample appearances are almost alike. Subtle details of the temporal information used in online verification are not available offline and are also hard to recover robustly. Thus, spatial dynamic information such as the pen-tip pressure characteristics is considered, with emphasis on the extraction of low-density pixels. These points result from the ballistic rhythm of a genuine signature, which a forgery, however skillful it may be, always lacks. Ten effective features, including these low-density points and the density ratio, are proposed to make the distinction between a true and a forged sample. An adaptive decision criterion is also derived for better verification judgements.

Keywords: Handwritten document verification, Skilled forgeries, Low density pixels, Adaptive decision boundary.

539 A new Adaptive Approach for Histogram based Mouth Segmentation

Authors: Axel Panning, Robert Niese, Ayoub Al-Hamadi, Bernd Michaelis

Abstract:

The segmentation of mouth and lips is a fundamental problem in facial image analysis. In this paper we propose a method for lip segmentation based on the rg-color histogram. Statistical analysis shows that using the rg color space is optimal for a purely color-based segmentation. Initially, a rough adaptive threshold selects a histogram region that assures that all pixels in that region are skin pixels. Based on those pixels we build a Gaussian model which represents the skin pixel distribution and is utilized to obtain a refined, optimal threshold. We do not incorporate shape or edge information. In experiments we show the performance of our lip pixel segmentation method compared to the ground truth of our dataset and to a conventional watershed algorithm.
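
For reference, the rg color space used above is the chromaticity obtained by normalizing out intensity; a minimal conversion and thresholding sketch follows, where the threshold bounds are placeholders rather than the adaptive values derived in the paper.

    import numpy as np

    def to_rg(rgb):
        """Convert an RGB image to rg chromaticity: r = R/(R+G+B), g = G/(R+G+B)."""
        rgb = rgb.astype(float)
        s = rgb.sum(axis=-1, keepdims=True) + 1e-6
        return np.concatenate([rgb[..., 0:1] / s, rgb[..., 1:2] / s], axis=-1)

    def skin_mask(rgb, r_range=(0.35, 0.55), g_range=(0.25, 0.38)):
        """Rough skin/lip separation by selecting a box in the rg histogram
        (placeholder bounds; the paper derives its thresholds adaptively)."""
        rg = to_rg(rgb)
        return ((rg[..., 0] >= r_range[0]) & (rg[..., 0] <= r_range[1]) &
                (rg[..., 1] >= g_range[0]) & (rg[..., 1] <= g_range[1]))

    face = np.random.randint(0, 256, (48, 48, 3))
    print(skin_mask(face).mean())      # fraction of pixels inside the rg box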

Keywords: Feature extraction, Segmentation, Image processing, Application

538 On Chromaticity of Wheels

Authors: Zainab Yasir Al-Rekaby, Abdul Jalil M. Khalaf

Abstract:

Coloring the vertices of a graph such that every two adjacent vertices receive different colors is a very common problem in graph theory; this is known as proper coloring of graphs. The number of different proper colorings of a graph with a given number of colors can be represented by a function called the chromatic polynomial. Two graphs G and H are said to be chromatically equivalent if they share the same chromatic polynomial. A graph G is chromatically unique if G is isomorphic to every graph H that is chromatically equivalent to G. The study of chromatic equivalence and chromatic uniqueness problems is called chromaticity. This paper shows that the wheel W12 is chromatically unique.
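
For context, the chromatic polynomial of a wheel follows from that of a cycle, since a wheel is the join of a hub vertex with a cycle: a proper k-coloring assigns any of the k colors to the hub and then properly colors the rim cycle with the remaining k - 1 colors. Writing W_{n+1} for the wheel whose rim is the cycle C_n (conventions for indexing wheels vary):

    P(C_n, k) = (k-1)^{n} + (-1)^{n}\,(k-1),
    \qquad
    P(W_{n+1}, k) = k\,P(C_n, k-1) = k\big[(k-2)^{n} + (-1)^{n}\,(k-2)\big]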

Keywords: Chromatic Polynomial, Chromatically Equivalent, Chromatically Unique, Wheel.

537 Nullity of t-Tupple Graphs

Authors: Khidir R. Sharaf, Didar A. Ali

Abstract:

The nullity η(G) of a graph is the multiplicity of zero as an eigenvalue in its spectrum. A zero-sum weighting of a graph G is a real-valued function, say f, from the vertices of G to the set of real numbers, such that for each vertex v of G the sum of the weights f(w) over all neighbors w of v is zero. A high zero-sum weighting of G is one that uses the maximum number of non-zero independent variables. If G is a graph with an end vertex, and if H is the induced subgraph of G obtained by deleting this vertex together with the vertex adjacent to it, then η(G) = η(H). In this paper, a high zero-sum weighting technique and the end-vertex procedure are applied to evaluate the nullity of t-tupple and generalized t-tupple graphs, which is derived and determined for some special types of graphs. Also, we introduce and prove some important results about the t-tupple coalescence, Cartesian and Kronecker products of nut graphs.
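
Equivalently, since the adjacency spectrum of a graph on n vertices consists of n eigenvalues, the nullity can be expressed through the rank of the adjacency matrix A(G):

    \eta(G) = n - \operatorname{rank}\big(A(G)\big)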

Keywords: Graph theory, Graph spectra, Nullity of graphs.

536 Investigation on Feature Extraction and Classification of Medical Images

Authors: P. Gnanasekar, A. Nagappan, S. Sharavanan, O. Saravanan, D. Vinodkumar, T. Elayabharathi, G. Karthik

Abstract:

In this paper we present a deep study of biomedical images and tag them with some basic extracted features (e.g., color, pixel value, etc.). The classification is done using a nearest neighbor classifier with various distance measures, as well as the automatic combination of classifier results. This process selects a subset of relevant features from a group of features of the image. It also helps to acquire a better understanding of the image by describing which features are important. The accuracy can be improved by increasing the number of features selected. Various types of classifiers have evolved for medical images, such as the Support Vector Machine (SVM), which is used for classifying bacterial types, and the Ant Colony Optimization method, which is used for optimal results and has high approximation capability and much faster convergence, as well as texture feature extraction methods based on Gabor wavelets.

Keywords: ACO (Ant Colony Optimization), Correlogram, CCM (Co-Occurrence Matrix), RTS (Rough-Set Theory)
