Search results for: spatial interpolation algorithm
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4090

4060 Real-Time Data Stream Partitioning over a Sliding Window in Real-Time Spatial Big Data

Authors: Sana Hamdi, Emna Bouazizi, Sami Faiz

Abstract:

In recent years, real-time spatial applications, like location-aware services and traffic monitoring, have become more and more important. Such applications result in dynamic environments where data as well as queries are continuously moving. As a result, a tremendous amount of real-time spatial data is generated every day, and the growth of the data volume seems to outpace the advance of our computing infrastructure. For instance, in real-time spatial Big Data, users expect to receive the results of each query within a short time period without taking into account the load of the system. But with a huge amount of real-time spatial data generated, system performance degrades rapidly, especially in overload situations. To solve this problem, we propose the use of data partitioning as an optimization technique. Traditional horizontal and vertical partitioning can increase the performance of the system and simplify data management, but they remain insufficient for real-time spatial Big Data; they cannot deal with real-time and stream queries efficiently. Thus, in this paper, we propose a novel data partitioning approach for real-time spatial Big Data named VPA-RTSBD (Vertical Partitioning Approach for Real-Time Spatial Big Data). This contribution is an implementation of the Matching algorithm for traditional vertical partitioning. First, we find the optimal attribute sequence by using the Matching algorithm. Then, we propose a new cost model for database partitioning that keeps the data amount of each partition more balanced and provides parallel execution guarantees for the most frequent queries. VPA-RTSBD aims to obtain a real-time partitioning scheme and deals with stream data. It improves the performance of query execution by maximizing the degree of parallel execution. This leads to QoS (Quality of Service) improvement in real-time spatial Big Data, especially with a huge volume of stream data. The performance of our contribution is evaluated via simulation experiments. The results show that the proposed algorithm is both efficient and scalable, and that it outperforms comparable algorithms.
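
As a rough, generic illustration of the vertical-partitioning idea behind the Matching algorithm (not the authors' VPA-RTSBD implementation), the sketch below orders attributes so that columns with similar query-usage patterns, measured by Hamming distance, end up adjacent; the usage matrix and query set are hypothetical.

```python
import numpy as np

# Hypothetical attribute usage matrix: rows = frequent queries, columns = attributes.
# A 1 means the query reads that attribute.
usage = np.array([
    [1, 1, 0, 0, 1],   # q1
    [1, 1, 0, 0, 0],   # q2
    [0, 0, 1, 1, 0],   # q3
    [0, 0, 1, 1, 1],   # q4
])

def hamming(a, b):
    """Hamming distance between two attribute usage columns."""
    return int(np.sum(a != b))

def order_attributes(usage):
    """Greedy ordering: start from attribute 0, repeatedly append the
    closest remaining attribute (smallest Hamming distance to the last one)."""
    cols = list(range(usage.shape[1]))
    order = [cols.pop(0)]
    while cols:
        last = usage[:, order[-1]]
        nxt = min(cols, key=lambda c: hamming(last, usage[:, c]))
        cols.remove(nxt)
        order.append(nxt)
    return order

print(order_attributes(usage))  # attributes with similar access patterns end up adjacent
```

Attributes that sit next to each other in the resulting sequence are natural candidates for the same vertical fragment, which is the intuition the cost model then refines.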

Keywords: Real-Time Spatial Big Data, Quality Of Service, Vertical partitioning, Horizontal partitioning, Matching algorithm, Hamming distance, Stream query.

4059 Simulation of Organic Matter Variability on a Sugarbeet Field Using the Computer Based Geostatistical Methods

Authors: M. Rüstü Karaman, Tekin Susam, Fatih Er, Servet Yaprak, Osman Karkacıer

Abstract:

Computer-based geostatistical methods can offer effective data analysis possibilities for agricultural areas by using vectorial data and their associated attribute information. These methods help to detect spatial changes across different locations of large agricultural lands, which leads to effective fertilization for optimal yield with reduced environmental pollution. In this study, topsoil (0-20 cm) and subsoil (20-40 cm) samples were taken from a sugar beet field on a 20 x 20 m grid. Plant samples were also collected from the same plots. Some physical and chemical analyses of these samples were made by routine methods. According to the derived coefficients of variation, topsoil organic matter (OM) showed more variability than subsoil OM. The highest C.V. value, 17.79%, was found for topsoil OM. The data were analyzed comparatively using kriging methods, which are widely used in geostatistics. Several interpolation methods (Ordinary, Simple and Universal kriging) and semivariogram models (Spherical, Exponential and Gaussian) were tested in order to choose the most suitable ones. Average standard deviations of values estimated by the simple kriging interpolation method were less than the average standard deviations of the measured values (topsoil OM ± 0.48, N ± 0.37, subsoil OM ± 0.18). The most suitable combination was the simple kriging method with an exponential semivariogram model for topsoil, whereas the best combination was the simple kriging method with a spherical semivariogram model for subsoil. The results also showed that these computer-based geostatistical methods should be tested and calibrated for different experimental conditions and semivariogram models.
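
A minimal sketch of the geostatistical building blocks named here: an experimental semivariogram estimator and a spherical semivariogram model, evaluated on hypothetical 20 m grid soil data (the coordinates, OM values and model parameters are invented for illustration, not taken from the study).

```python
import numpy as np

def experimental_semivariogram(coords, values, lags, tol):
    """Classical estimator: gamma(h) = mean of 0.5*(z_i - z_j)^2 over all pairs
    whose separation distance falls inside each lag bin."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sq = 0.5 * (values[:, None] - values[None, :]) ** 2
    gamma = []
    for h in lags:
        mask = (d > h - tol) & (d <= h + tol) & np.triu(np.ones_like(d, bool), 1)
        gamma.append(sq[mask].mean() if mask.any() else np.nan)
    return np.array(gamma)

def spherical(h, nugget, sill, rang):
    """Spherical semivariogram model: rises to the sill at the range."""
    h = np.asarray(h, float)
    g = nugget + (sill - nugget) * (1.5 * h / rang - 0.5 * (h / rang) ** 3)
    return np.where(h < rang, g, sill)

# Hypothetical 20 m grid soil samples: coords in metres, values = topsoil OM (%).
rng = np.random.default_rng(0)
coords = np.array([(x, y) for x in range(0, 200, 20) for y in range(0, 200, 20)], float)
values = 2.0 + 0.3 * np.sin(coords[:, 0] / 60) + rng.normal(0, 0.1, len(coords))
lags = np.arange(20, 120, 20)
print(experimental_semivariogram(coords, values, lags, tol=10.0))
print(spherical(lags, nugget=0.01, sill=0.09, rang=80.0))
```

Fitting the model parameters to the experimental points and plugging them into a kriging system is the step the study compares across Ordinary, Simple and Universal variants.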

Keywords: Geostatistics, kriging, organic matter, sugar beet.

4058 Simulation Data Summarization Based on Spatial Histograms

Authors: Jing Zhao, Yoshiharu Ishikawa, Chuan Xiao, Kento Sugiura

Abstract:

In order to analyze large-scale scientific data, research on data exploration and visualization has gained popularity. In this paper, we focus on the exploration and visualization of scientific simulation data, and define a spatial V-Optimal histogram for data summarization. We propose histogram construction algorithms based on a general binary hierarchical partitioning as well as a more specific one, the l-grid partitioning. For effective data summarization and efficient data visualization in scientific data analysis, we propose an optimal algorithm as well as a heuristic algorithm for histogram construction. To verify the effectiveness and efficiency of the proposed methods, we conduct experiments on massive evacuation simulation data.
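
For readers unfamiliar with V-optimal histograms, here is a minimal one-dimensional sketch (not the paper's spatial or l-grid construction): a dynamic program that picks bucket boundaries minimizing the total within-bucket squared error; the data values are hypothetical.

```python
import numpy as np

def v_optimal_histogram(values, n_buckets):
    """Dynamic-programming construction of a 1-D V-optimal histogram:
    choose bucket boundaries that minimise the total within-bucket
    sum of squared errors (SSE) around each bucket mean."""
    n = len(values)
    pre = np.concatenate([[0.0], np.cumsum(values)])
    pre2 = np.concatenate([[0.0], np.cumsum(np.square(values))])

    def sse(i, j):  # SSE of values[i:j]
        s, s2, m = pre[j] - pre[i], pre2[j] - pre2[i], j - i
        return s2 - s * s / m

    INF = float("inf")
    cost = np.full((n + 1, n_buckets + 1), INF)
    cut = np.zeros((n + 1, n_buckets + 1), dtype=int)
    cost[0, 0] = 0.0
    for j in range(1, n + 1):
        for b in range(1, min(j, n_buckets) + 1):
            for i in range(b - 1, j):
                c = cost[i, b - 1] + sse(i, j)
                if c < cost[j, b]:
                    cost[j, b], cut[j, b] = c, i
    # Recover the bucket boundaries by walking the cut table backwards.
    bounds, j, b = [], n, n_buckets
    while b > 0:
        i = cut[j, b]
        bounds.append((i, j))
        j, b = i, b - 1
    return list(reversed(bounds)), cost[n, n_buckets]

data = np.array([1, 1, 2, 9, 10, 11, 30, 31, 2, 2], float)
print(v_optimal_histogram(data, 4))
```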

Keywords: Simulation data, data summarization, spatial histograms, exploration and visualization.

4057 Investigating Polynomial Interpolation Functions for Zooming Low Resolution Digital Medical Images

Authors: Maninder Pal

Abstract:

Medical digital images usually have low resolution because of the nature of their acquisition. Therefore, this paper focuses on zooming these images to obtain the better level of information required for medical diagnosis. For this purpose, a strategy for selecting pixels in the zooming operation is proposed. It is based on the principle of an analog clock and utilizes a combination of point and neighborhood image processing. In this approach, the hour hand of the clock covers the portion of the image to be processed. For alignment, the center of the clock points at the middle pixel of the selected portion of the image. The minute hand is longer and is used to gain information about pixels of the surrounding area, called the neighborhood pixels region. This information is used to zoom the selected portion of the image. The proposed algorithm is implemented and its performance is evaluated for many medical images obtained from various sources such as X-ray, Computerized Tomography (CT) scan and Magnetic Resonance Imaging (MRI). However, for illustration and simplicity, the results obtained from a CT-scanned image of the head are presented. The performance of the algorithm is evaluated in comparison to various traditional algorithms in terms of peak signal-to-noise ratio (PSNR), maximum error, SSIM index, mutual information and processing time. From the results, the proposed algorithm is found to give better performance than the traditional algorithms.
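
The sketch below is not the clock-based algorithm of this paper; it only illustrates the kind of comparison reported, zooming a downsampled image back up with nearest-neighbour and cubic interpolation and scoring the results with PSNR, on a hypothetical synthetic slice.

```python
import numpy as np
from scipy import ndimage

def psnr(reference, test, peak=255.0):
    """Peak signal-to-noise ratio between two images."""
    mse = np.mean((reference.astype(float) - test.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(peak ** 2 / mse)

# Hypothetical smooth grayscale slice standing in for a medical image.
rng = np.random.default_rng(1)
original = ndimage.gaussian_filter(rng.uniform(0, 255, (128, 128)), 3)

# Simulate a low-resolution acquisition, then zoom back with two interpolators.
low_res = original[::2, ::2]
nearest = ndimage.zoom(low_res, 2, order=0)   # nearest neighbour
bicubic = ndimage.zoom(low_res, 2, order=3)   # cubic spline interpolation

print("PSNR nearest :", round(psnr(original, nearest), 2))
print("PSNR bicubic :", round(psnr(original, bicubic), 2))
```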

Keywords: Zooming, interpolation, medical images, resolution.

4056 GPU-Based Volume Rendering for Medical Imagery

Authors: Hadjira Bentoumi, Pascal Gautron, Kadi Bouatouch

Abstract:

We present a method for fast volume rendering using graphics hardware (GPU). To our knowledge, it is the first implementation of this approach on the GPU. Based on the Shear-Warp algorithm, our GPU-based method provides real-time frame rates and outperforms the CPU-based implementation. When the number of slices is not sufficient, we add in-between slices computed by interpolation, which improves the quality of the rendered images. We have also implemented the ray marching algorithm on the GPU. The results generated by the three algorithms (CPU-based and GPU-based Shear-Warp, GPU-based ray marching) for two test models show that the ray marching algorithm outperforms the Shear-Warp methods in terms of speed-up and image quality.

Keywords: Volume rendering, graphics processors

4055 Spatial thinking Issues: Towards Rural Sociological Research Agenda in the Third Millennium

Authors: Abdel-Samad M. Ali

Abstract:

Does the spatial perspective provide a common thread for rural sociology? Have rural sociologists succeeded in bringing order to their data using spatial analysis models and techniques? A trial answer to such questions, as touchstones of theoretical and applied sociological studies in rural areas, is the point at issue in the present paper. Spatial analyses have changed the way rural sociologists approach scientific problems. Rural sociology is spatial by nature because much, if not most, of its research topics have a spatial "awareness." However, such spatial awareness is not quite the same as spatial analysis, because it is not typically associated with underlying theories and hypotheses about spatial patterns that are designed to be tested for their specific spatial content. This paper presents pressing issues for future research to reintroduce mainstream rural sociology to the concept of space.

Keywords: Maps, Rural Sociology, Space, Spatial variations

4054 A Spatial Information Network Traffic Prediction Method Based on Hybrid Model

Authors: Jingling Li, Yi Zhang, Wei Liang, Tao Cui, Jun Li

Abstract:

Compared with terrestrial networks, the traffic of a spatial information network has both self-similarity and short-range correlation characteristics. By studying traffic prediction methods, the resource utilization of a spatial information network can be improved, and such methods can provide an important basis for traffic planning. In this paper, considering the accuracy and complexity of the algorithm, the spatial information network traffic is decomposed into an approximate component with long-range correlation and detail components with short-range correlation, and a time series hybrid prediction model based on wavelet decomposition is proposed to predict the spatial network traffic. First, the original traffic data are decomposed into approximate and detail components using a wavelet decomposition algorithm. According to the tailing and truncation behavior of the autocorrelation and partial autocorrelation functions of each component, the corresponding model (AR/MA/ARMA) of each detail component can be established directly, while the approximate component is modeled with an ARIMA model after smoothing. Finally, the prediction results of the multiple models are combined to obtain the prediction results for the original data. The method not only considers the self-similarity of a spatial information network, but also takes into account the short-range correlation caused by bursty network traffic. It is verified using the measured data of a backbone network released by the MAWI working group in 2018. Compared with typical time series models, the data predicted by the hybrid model are closer to the real traffic data and have a smaller relative root mean square error, which makes the model more suitable for a spatial information network.
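
A minimal sketch of the wavelet-plus-time-series idea under simplifying assumptions: each traffic series is split into per-level component signals with PyWavelets, a plain least-squares AR(p) model (standing in for the AR/MA/ARMA/ARIMA choices made in the paper) forecasts each component one step ahead, and the forecasts are summed. The wavelet name, decomposition level, AR order and synthetic traffic trace are all illustrative.

```python
import numpy as np
import pywt

def component_signals(x, wavelet="db4", level=3):
    """Split x into per-level component signals that sum back to (approximately) x,
    by reconstructing with all other wavelet coefficients zeroed."""
    coeffs = pywt.wavedec(x, wavelet, level=level)
    comps = []
    for i in range(len(coeffs)):
        sel = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
        comps.append(pywt.waverec(sel, wavelet)[: len(x)])
    return comps  # [approximation, detail_level, ..., detail_1]

def ar_forecast(x, p=4):
    """One-step-ahead forecast with an AR(p) model fitted by least squares."""
    X = np.column_stack([x[p - k - 1 : len(x) - k - 1] for k in range(p)])
    y = x[p:]
    phi, *_ = np.linalg.lstsq(X, y, rcond=None)
    return float(x[-1 : -p - 1 : -1] @ phi)

def hybrid_forecast(traffic, wavelet="db4", level=3, p=4):
    """Forecast each wavelet component separately and sum the forecasts."""
    return sum(ar_forecast(c, p) for c in component_signals(traffic, wavelet, level))

# Hypothetical traffic trace: trend + daily-like periodicity + noise.
rng = np.random.default_rng(7)
t = np.arange(512.0)
traffic = 10 + 0.01 * t + np.sin(2 * np.pi * t / 24) + rng.normal(0, 0.3, t.size)
print("one-step forecast:", round(hybrid_forecast(traffic), 3))
```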

Keywords: Spatial Information Network, Traffic prediction, Wavelet decomposition, Time series model.

4053 High Capacity Reversible Watermarking through Interpolated Error Shifting

Authors: Hae-Yeoun Lee

Abstract:

Reversible watermarking, which not only protects the copyright but also preserves the original quality of the digital content, has been intensively studied, and the demand for it has increased. In this paper, we propose a reversible watermarking scheme based on interpolation-error shifting and error pre-compensation. The intensity of a pixel is interpolated from the intensities of neighboring pixels, and the difference histogram between the interpolated and the original intensities is obtained and modified to embed the watermark message. By restoring the difference histogram, the embedded watermark is extracted and the original image is recovered by compensating for the interpolation error. Overflow and underflow are prevented by error pre-compensation. To show the performance of the method, the proposed algorithm is compared with other methods using various test images.
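
The following sketch shows the general interpolation-error (prediction-error) expansion mechanism on a 1-D signal, with embedding in the error bins 0 and -1 and shifting of the remaining bins; it is only a simplified stand-in for the scheme above and omits the error pre-compensation step (overflow/underflow handling).

```python
import numpy as np

def embed(signal, bits):
    """Reversible embedding by interpolation-error expansion: odd samples are
    predicted from their two (unchanged) neighbours; errors 0 and -1 carry one
    bit each, all other errors are shifted outward to make room."""
    s = signal.astype(int)
    it = iter(bits)
    for i in range(1, len(s) - 1, 2):
        pred = (s[i - 1] + s[i + 1]) // 2
        e = s[i] - pred
        if e == 0:
            e = e + next(it, 0)
        elif e == -1:
            e = e - next(it, 0)
        elif e > 0:
            e += 1
        else:
            e -= 1
        s[i] = pred + e
    return s

def extract(marked):
    """Recover the bits and restore the original signal exactly."""
    s = marked.astype(int)
    bits = []
    for i in range(1, len(s) - 1, 2):
        pred = (s[i - 1] + s[i + 1]) // 2
        e = s[i] - pred
        if e in (0, 1):
            bits.append(e)           # expanded from e = 0
            e = 0
        elif e in (-1, -2):
            bits.append(-1 - e)      # expanded from e = -1
            e = -1
        elif e > 1:
            e -= 1
        else:
            e += 1
        s[i] = pred + e
    return s, bits

x = np.array([50, 51, 52, 53, 54, 55, 56, 57, 58, 59])
marked = embed(x, [1, 0, 1, 1])
restored, recovered = extract(marked)
print(np.array_equal(restored, x), recovered)   # True [1, 0, 1, 1]
```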

Keywords: Reversible watermarking, High capacity, High quality, Interpolated error shifting, Error pre-compensation.

4052 Generalized Morphological 3D Shape Decomposition Grayscale Interframe Interpolation Method

Authors: Dragos Nicolae VIZIREANU

Abstract:

One of the main image representations in Mathematical Morphology is the 3D Shape Decomposition Representation, useful for image compression and representation, and pattern recognition. The 3D Morphological Shape Decomposition representation can be generalized a number of times, to extend the scope of its algebraic characteristics as much as possible. With these generalizations, the Morphological Shape Decomposition's role as an efficient image decomposition tool is extended to grayscale images. This work follows the above line and further develops it. A new evolutionary branch is added to the 3D Morphological Shape Decomposition's development, by the introduction of a 3D Multi Structuring Element Morphological Shape Decomposition, which permits 3D Morphological Shape Decomposition of 3D binary images (grayscale images) into "multiparameter" families of elements. Originally, 3D Morphological Shape Decomposition representations were based only on "1 parameter" families of elements for image decomposition. This paper addresses grayscale interframe interpolation by means of mathematical morphology. The new interframe interpolation method is based on the generalized morphological 3D Shape Decomposition. This article presents the theoretical background of the morphological interframe interpolation, deduces the new representation and shows some application examples. Computer simulations illustrate the results.

Keywords: 3D shape decomposition representation, mathematical morphology, gray scale interframe interpolation

4051 Cubic Trigonometric B-Spline Applied to Linear Two-Point Boundary Value Problems of Order Two

Authors: Nur Nadiah Abd Hamid, Ahmad Abd. Majid, Ahmad Izani Md. Ismail

Abstract:

Linear two-point boundary value problems of order two are solved using the cubic trigonometric B-spline interpolation method (CTBIM). A cubic trigonometric B-spline is a piecewise function consisting of trigonometric functions. The method is tested on some problems and the results are compared with the cubic B-spline interpolation method (CBIM) from the literature. CTBIM is found to approximate the solution slightly more accurately than CBIM if the problems are trigonometric.

Keywords: trigonometric B-spline, two-point boundary value problem, spline interpolation, cubic spline

4050 A Framework for Data Mining Based Multi-Agent: An Application to Spatial Data

Authors: H. Baazaoui Zghal, S. Faiz, H. Ben Ghezala

Abstract:

Data mining is an extraordinarily demanding field referring to the extraction of implicit knowledge and relationships, which are not explicitly stored in databases. A wide variety of data mining methods have been introduced (classification, characterization, generalization...), and each of these methods includes more than one algorithm. A data mining system involves different user categories, which means that the user's behavior must be a component of the system. The problem at this level is to know which algorithm of which method to employ for an exploratory end, which one for a decisional end, and how they can collaborate and communicate. The agent paradigm presents a new way of designing and realizing data mining systems. The purpose is to combine different data mining algorithms to prepare elements for decision-makers, benefiting from the possibilities offered by multi-agent systems. In this paper the agent framework for data mining is introduced, and its overall architecture and functionality are presented. The validation is made on spatial data, and principal results are presented.

Keywords: Databases, data mining, multi-agent, spatial datamart.

4049 Design of a 4-DOF Robot Manipulator with Optimized Algorithm for Inverse Kinematics

Authors: S. Gómez, G. Sánchez, J. Zarama, M. Castañeda Ramos, J. Escoto Alcántar, J. Torres, A. Núñez, S. Santana, F. Nájera, J. A. Lopez

Abstract:

This paper shows in detail the mathematical model of direct and inverse kinematics for a robot manipulator (welding type) with four degrees of freedom. Using the D-H parameters, screw theory, and numerical, geometric and interpolation methods, the theoretical and practical values of the position of the robot were determined using an optimized algorithm for inverse kinematics, obtaining the values of the particular joints in order to determine the virtual paths in a relatively short time.
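
As a small companion to the direct-kinematics part, here is a generic Denavit-Hartenberg forward-kinematics sketch for a 4-DOF chain; the D-H table values and joint angles are hypothetical, not the manipulator's actual parameters.

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Standard Denavit-Hartenberg homogeneous transform for one joint."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_kinematics(joint_angles, dh_table):
    """Chain the per-joint transforms; returns the end-effector pose."""
    T = np.eye(4)
    for theta, (d, a, alpha) in zip(joint_angles, dh_table):
        T = T @ dh_transform(theta, d, a, alpha)
    return T

# Hypothetical D-H table (d, a, alpha) for a 4-DOF arm; values are illustrative only.
dh_table = [(0.10, 0.00, np.pi / 2),
            (0.00, 0.25, 0.0),
            (0.00, 0.20, 0.0),
            (0.05, 0.00, 0.0)]
pose = forward_kinematics([0.1, 0.4, -0.3, 0.2], dh_table)
print(np.round(pose[:3, 3], 4))   # end-effector position
```

Inverting this chain, subject to joint limits, is what the optimized inverse-kinematics algorithm solves.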

Keywords: Kinematics, degree of freedom, optimization, robot manipulator.

4048 Density Clustering Based On Radius of Data (DCBRD)

Authors: A.M. Fahim, A. M. Salem, F. A. Torkey, M. A. Ramadan

Abstract:

Clustering algorithms are attractive for the task of class identification in spatial databases. However, the application to large spatial databases raises the following requirements for clustering algorithms: minimal requirements of domain knowledge to determine the input parameters, discovery of clusters with arbitrary shape, and good efficiency on large databases. The well-known clustering algorithms offer no solution to the combination of these requirements. In this paper, a density-based clustering algorithm (DCBRD) is presented, relying on knowledge acquired from the data by dividing the data space into overlapped regions. The proposed algorithm discovers arbitrarily shaped clusters, requires no input parameters and uses the same definitions as the DBSCAN algorithm. We performed an experimental evaluation of its effectiveness and efficiency, and compared the results with those of DBSCAN. The results of our experiments demonstrate that the proposed algorithm is significantly efficient in discovering clusters of arbitrary shape and size.
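
Since DCBRD reuses the definitions of DBSCAN, the baseline it is compared against can be sketched with scikit-learn's DBSCAN (which, unlike DCBRD, still needs eps and min_samples as inputs); the two-cluster data set below is synthetic.

```python
import numpy as np
from sklearn.cluster import DBSCAN

# Two arbitrarily shaped spatial clusters (a ring and a blob) plus background noise.
rng = np.random.default_rng(2)
theta = rng.uniform(0, 2 * np.pi, 300)
ring = np.column_stack([np.cos(theta), np.sin(theta)]) + rng.normal(0, 0.05, (300, 2))
blob = rng.normal([3.0, 0.0], 0.2, (200, 2))
noise = rng.uniform(-2, 5, (30, 2))
X = np.vstack([ring, blob, noise])

labels = DBSCAN(eps=0.3, min_samples=5).fit_predict(X)   # -1 marks noise points
print("clusters found:", len(set(labels)) - (1 if -1 in labels else 0))
print("noise points  :", int(np.sum(labels == -1)))
```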

Keywords: Clustering Algorithms, Arbitrary Shape of clusters, cluster Analysis.

4047 An Improved Quality Adaptive Rate Filtering Technique Based on the Level Crossing Sampling

Authors: Saeed Mian Qaisar, Laurent Fesquet, Marc Renaudin

Abstract:

Most systems deal with time-varying signals. Power efficiency can be achieved by adapting the system activity to the input signal variations. In this context, an adaptive rate filtering technique based on level crossing sampling is devised. It adapts the sampling frequency and the filter order by following the local variations of the input signal, thus correlating the processing activity with the signal variations. Interpolation is required in the proposed technique, and a drastic reduction in the interpolation error is achieved by exploiting symmetry during the interpolation process. The processing error of the proposed technique is calculated. The computational complexity of the proposed filtering technique is deduced and compared to the classical one. Results promise a significant gain in computational efficiency and hence in power consumption.
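
A minimal sketch of level crossing sampling itself (not the full adaptive-rate filter): samples are taken wherever the signal crosses a uniformly spaced set of levels, so faster local variations automatically produce more samples. The quantum delta and the chirp-like test signal are illustrative.

```python
import numpy as np

def level_crossing_sample(t, x, delta):
    """Record a sample each time the signal crosses one of the uniformly spaced
    levels k*delta; linear interpolation locates the crossing instant."""
    times, values = [t[0]], [x[0]]
    for i in range(1, len(x)):
        if x[i] == x[i - 1]:
            continue
        lo, hi = sorted((x[i - 1], x[i]))
        k0, k1 = int(np.ceil(lo / delta)), int(np.floor(hi / delta))
        ks = range(k0, k1 + 1) if x[i] > x[i - 1] else range(k1, k0 - 1, -1)
        for k in ks:
            level = k * delta
            frac = (level - x[i - 1]) / (x[i] - x[i - 1])
            times.append(t[i - 1] + frac * (t[i] - t[i - 1]))
            values.append(level)
    return np.array(times), np.array(values)

# Chirp-like test signal: the local variation (and hence the effective rate) grows with time.
t = np.linspace(0, 1, 4000)
x = np.sin(2 * np.pi * (2 + 18 * t) * t)
ts, xs = level_crossing_sample(t, x, delta=0.1)
print("uniform samples:", len(t), "| level-crossing samples:", len(ts))
```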

Keywords: Level Crossing Sampling, Activity Selection, Rate Filtering, Computational Complexity, Interpolation Error.

4046 Spatial Distribution of Cd, Zn and Hg in Groundwater at Rayong Province, Thailand

Authors: T. Makkasap, T. Satapanajaru

Abstract:

The objective of this study was to evaluate the distribution patterns of Cd, Zn and Hg in groundwater by geospatial interpolation. The study was performed at Rayong province in the eastern part of Thailand, an area with intensive agricultural and industrial activities. Groundwater samples were collected twice a year from 31 tube wells around this area. An Inductively Coupled Plasma-Atomic Emission Spectrometer (ICP-AES) was used to measure the concentrations of Cd, Zn, and Hg in the groundwater samples. The results demonstrated that concentrations of Cd, Zn and Hg ranged from 0.000-0.297 mg/L (mean 0.021±0.033 mg/L), 0.022-33.236 mg/L (mean 4.214±4.766 mg/L) and 0.000-0.289 mg/L (mean 0.023±0.034 mg/L), respectively. Most of the heavy metal concentrations exceeded the groundwater quality standards specified by the Ministry of Natural Resources and Environment, Thailand. The heavy metal concentrations tended to be highest in the southeastern part of the area, which is especially vulnerable to heavy metals and other contaminants.
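
The mapping step can be illustrated with a simple inverse-distance-weighted (IDW) interpolation of the well measurements onto a grid, as a lightweight stand-in for the kriging mentioned in the keywords; the well coordinates and Zn values below are synthetic, not the study's data.

```python
import numpy as np

def idw_grid(xy, z, grid_x, grid_y, power=2.0):
    """Inverse-distance-weighted interpolation of scattered well measurements
    onto a regular grid."""
    gx, gy = np.meshgrid(grid_x, grid_y)
    pts = np.column_stack([gx.ravel(), gy.ravel()])
    d = np.linalg.norm(pts[:, None, :] - xy[None, :, :], axis=-1)
    d = np.maximum(d, 1e-9)                      # avoid division by zero at the wells
    w = 1.0 / d ** power
    est = (w @ z) / w.sum(axis=1)
    return est.reshape(gx.shape)

# Hypothetical tube-well coordinates (km) and Zn concentrations (mg/L).
rng = np.random.default_rng(3)
xy = rng.uniform(0, 10, (31, 2))
zn = np.clip(rng.lognormal(mean=1.0, sigma=0.8, size=31), 0.02, 33.0)
grid = idw_grid(xy, zn, np.linspace(0, 10, 50), np.linspace(0, 10, 50))
print(grid.shape, round(float(grid.max()), 2))
```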

Keywords: Groundwater, Heavy metals, Kriging, Rayong, Spatial distribution.

4045 Spatial Econometric Approaches for Count Data: An Overview and New Directions

Authors: Paula Simões, Isabel Natário

Abstract:

This paper reviews a number of theoretical aspects for implementing an explicit spatial perspective in econometrics for modelling non-continuous data, in general, and count data, in particular. It provides an overview of the several spatial econometric approaches that are available to model data collected with reference to location in space, from the classical spatial econometrics approaches to the recent developments on spatial econometrics to model count data in a Bayesian hierarchical setting. Considerable attention is paid to the inferential framework necessary for structurally consistent spatial econometric count models incorporating spatial lag autocorrelation, to the corresponding estimation and testing procedures for different assumptions, and to the constraints and implications embedded in the various specifications in the literature. This review combines insights from the classical spatial econometrics literature as well as from hierarchical modeling and analysis of spatial data, in order to look for possible new directions in the processing of count data in a spatial hierarchical Bayesian econometric context.

Keywords: Spatial data analysis, spatial econometrics, Bayesian hierarchical models, count data.

4044 A Preliminary Study for Design of Automatic Block Reallocation Algorithm with Genetic Algorithm Method in the Land Consolidation Projects

Authors: Tayfun Çay, Yaşar İnceyol, Abdurrahman Özbeyaz

Abstract:

Land reallocation is one of the most important steps in land consolidation projects. Many different models have been proposed for land reallocation in the literature, such as fuzzy logic, block-priority-based land reallocation and Spatial Decision Support Systems. A model including four parts is considered for automatic block reallocation with the genetic algorithm method in land consolidation projects. These stages are: preparing the data tables for a project area, determining the conditions and constraints of land reallocation, designing the command steps and logical flow chart of the reallocation algorithm, and finally writing the program code of the genetic algorithm. In this study, we designed the first three steps of the considered four-step model.
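
The fourth, unimplemented step can be pictured with a generic genetic algorithm sketch for assigning parcels to blocks under capacity and preference penalties; the encoding, cost terms, parcel areas and GA parameters are all hypothetical and not the authors' design.

```python
import numpy as np

rng = np.random.default_rng(4)
n_parcels, n_blocks = 40, 5
area = rng.uniform(0.5, 3.0, n_parcels)             # parcel areas (ha), illustrative
preferred = rng.integers(0, n_blocks, n_parcels)    # landholders' preferred blocks
capacity = np.full(n_blocks, area.sum() / n_blocks * 1.1)

def cost(assign):
    """Penalise exceeding block capacity and ignoring landholder preference."""
    over = sum(max(0.0, area[assign == b].sum() - capacity[b]) for b in range(n_blocks))
    mismatch = np.sum(assign != preferred)
    return 10.0 * over + 1.0 * mismatch

def genetic_reallocation(pop_size=60, generations=200, p_mut=0.05):
    pop = rng.integers(0, n_blocks, (pop_size, n_parcels))
    for _ in range(generations):
        scores = np.array([cost(ind) for ind in pop])
        # Tournament selection between random pairs.
        a, b = rng.integers(0, pop_size, (2, pop_size))
        parents = np.where((scores[a] < scores[b])[:, None], pop[a], pop[b])
        # Uniform crossover between consecutive parents.
        mask = rng.random((pop_size, n_parcels)) < 0.5
        children = np.where(mask, parents, np.roll(parents, 1, axis=0))
        # Random-reset mutation.
        mut = rng.random(children.shape) < p_mut
        children[mut] = rng.integers(0, n_blocks, mut.sum())
        # Elitism: carry over the best individual of the previous generation.
        children[0] = pop[scores.argmin()]
        pop = children
    scores = np.array([cost(ind) for ind in pop])
    return pop[scores.argmin()], scores.min()

best, best_cost = genetic_reallocation()
print("best cost:", best_cost)
```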

Keywords: Genetic algorithm, land consolidation, landholding, land reallocation.

4043 A Distributed Algorithm for Intrinsic Cluster Detection over Large Spatial Data

Authors: Sauravjyoti Sarmah, Rosy Das, Dhruba Kr. Bhattacharyya

Abstract:

Clustering algorithms help to understand the hidden information present in datasets. A dataset may contain intrinsic and nested clusters, the detection of which is of utmost importance. This paper presents a distributed grid-based density clustering algorithm capable of identifying arbitrarily shaped embedded clusters as well as multi-density clusters over large spatial datasets. For handling massive datasets, we implemented our method using a 'shared-nothing' architecture where multiple computers are interconnected over a network. Experimental results are reported to establish the superiority of the technique in terms of scale-up, speed-up and cluster quality.

Keywords: Clustering, Density-based, Grid-based, Adaptive Grid.

4042 CT Reconstruction from a Limited Number of X-Ray Projections

Authors: Tao Quang Bang, Insu Jeon

Abstract:

X-ray computed tomography (CT) is a well-established visualization technique in medicine and nondestructive testing. However, since CT scanning requires sampling of radiographic projections from different viewing angles, common CT systems with mechanically moving parts are too slow for dynamic imaging, for instance of multiphase flows or live animals. A large number of X-ray projections are needed to reconstruct CT images, so the collection and processing of the projection data consume too much time and are harmful to the patient. To address this problem, in this study we propose a method for tomographic reconstruction of a sample from a limited number of X-ray projections using linear interpolation. In simulation, we present a reconstruction from an experimental X-ray CT scan of an aluminum phantom that follows two steps: the X-ray projections are first interpolated using the linear interpolation method, and the interpolated projections are then used for CT reconstruction based upon the Ordered Subsets Expectation Maximization (OSEM) method.
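
The first step, filling in missing projection angles by linear interpolation, can be sketched as below: each detector bin of the sparse sinogram is interpolated independently across angle with np.interp (the reconstruction step, OSEM in the paper, is not shown); the sinogram is synthetic.

```python
import numpy as np

def interpolate_sinogram(angles_sparse, sinogram_sparse, angles_dense):
    """Fill in missing projection angles by linear interpolation, independently
    for each detector bin, before handing the sinogram to a reconstructor."""
    n_bins = sinogram_sparse.shape[1]
    dense = np.empty((len(angles_dense), n_bins))
    for b in range(n_bins):
        dense[:, b] = np.interp(angles_dense, angles_sparse, sinogram_sparse[:, b])
    return dense

# Hypothetical sparse acquisition: 30 projections over 180 degrees, 64 detector bins.
angles_sparse = np.linspace(0, 180, 30, endpoint=False)
sinogram_sparse = np.sin(np.deg2rad(angles_sparse))[:, None] * np.hanning(64)[None, :]
angles_dense = np.linspace(0, 180, 180, endpoint=False)
sinogram_dense = interpolate_sinogram(angles_sparse, sinogram_sparse, angles_dense)
print(sinogram_sparse.shape, "->", sinogram_dense.shape)
```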

Keywords: CT reconstruction, X-ray projections, Interpolation technique, OSEM

4041 The Role of Velocity Map Quality in Estimation of Intravascular Pressure Distribution

Authors: Ali Pashaee, Parisa Shooshtari, Gholamreza Atae, Nasser Fatouraee

Abstract:

Phase-contrast MR imaging methods are widely used for measurement of blood flow velocity components. There are also other tools, such as CT and ultrasound, for velocity map detection in intravascular studies. These data are used in deriving flow characteristics, and some clinical applications use the pressure distribution in the diagnosis of intravascular disorders such as vascular stenosis. In this paper an approach to the problem of measuring the intravascular pressure field by using the velocity field obtained from flow images is proposed. The method presented in this paper uses an algorithm to solve the nonlinear Navier-Stokes equations, assuming blood to be an incompressible and Newtonian fluid. Flow images usually suffer from a lack of spatial resolution, and our attempt is to consider the effect of spatial resolution on the pressure distribution estimated with this method. In order to achieve this aim, the velocity map of a numerical phantom is derived at six different spatial resolutions. To determine the effects of vascular stenoses on pressure distribution, a stenotic phantom geometry is considered. A comparison between the pressure distribution obtained from the phantom and the pressure resulting from the algorithm is presented. In this regard we also compared the effects of collocated and staggered computational grids on the pressure distribution resulting from this algorithm.

Keywords: Flow imaging, pressure distribution estimation, phantom, resolution.

4040 Research on Development and Accuracy Improvement of an Explosion Proof Combustible Gas Leak Detector Using an IR Sensor

Authors: Gyoutae Park, Seungho Han, Byungduk Kim, Youngdo Jo, Yongsop Shim, Yeonjae Lee, Sangguk Ahn, Hiesik Kim, Jungil Park

Abstract:

In this paper, we present not only the development of an explosion-proof, portable combustible gas leak detector but also an algorithm to improve the accuracy of gas concentration measurements. The presented techniques apply a flame-proof enclosure and intrinsically safe explosion protection to an infrared gas leak detector, for the first time in Korea, and improve accuracy using a linearization recursion equation and Lagrange interpolation polynomials. We also tested the sensor characteristics and calibrated suitable input gases and output voltages. We then advanced the performance of combustible gas detectors by reflecting the demands of the gas safety management field. To check the performance of two companies' detectors, we carried out measurement tests with eight standard gases prepared by the Korea Gas Safety Corporation. The experimental results demonstrated that our instrument has better detection accuracy than the other detectors.
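
A minimal sketch of the Lagrange-interpolation part of the accuracy-improvement idea: a polynomial through hypothetical calibration pairs maps a measured sensor voltage to a gas concentration. The voltage and concentration values are invented for illustration.

```python
import numpy as np

def lagrange_eval(x_nodes, y_nodes, x):
    """Evaluate the Lagrange interpolation polynomial through the calibration
    points (x_nodes, y_nodes) at the query point x."""
    x_nodes, y_nodes = np.asarray(x_nodes, float), np.asarray(y_nodes, float)
    total = 0.0
    for i, (xi, yi) in enumerate(zip(x_nodes, y_nodes)):
        others = np.delete(x_nodes, i)
        total += yi * np.prod((x - others) / (xi - others))
    return total

# Hypothetical calibration pairs: sensor output voltage (V) vs. gas concentration (%LEL).
voltages = [0.40, 0.95, 1.60, 2.10, 2.60]
concentrations = [0.0, 10.0, 25.0, 40.0, 60.0]

measured_voltage = 1.83
print(round(lagrange_eval(voltages, concentrations, measured_voltage), 2))
```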

Keywords: Gas sensor, leak, detector, accuracy, interpolation.

4039 Visualization of Sediment Thickness Variation for Sea Bed Logging using Spline Interpolation

Authors: Hanita Daud, Noorhana Yahya, Vijanth Sagayan, Muizuddin Talib

Abstract:

This paper discusses the use of spline interpolation and mean square error (MSE) as tools to process data acquired from a developed simulator that replicates the sea bed logging environment. Sea bed logging (SBL) is a new technique that uses the marine controlled source electromagnetic (CSEM) sounding technique and is proven to be very successful in detecting and characterizing hydrocarbon reservoirs in deep water areas by using resistivity contrasts. It uses very low frequencies of 0.1 Hz to 10 Hz to obtain greater wavelengths. In this work the in-house-built simulator was used with predefined parameters, and the transmitted frequency was varied for sediment thicknesses of 1000 m to 4000 m for environments with and without hydrocarbon. From a series of simulations, synthetic data were generated. These data were interpolated using the spline interpolation technique (degree three) and the mean square error (MSE) was calculated between the original data and the interpolated data. Comparisons were made by studying the trends and relationship between frequency and sediment thickness based on the calculated MSE. It was found that the MSE showed an increasing trend in the setting with hydrocarbon present compared to the one without. The MSE also decreased as sediment thickness and transmitted frequency increased.
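
A minimal sketch of the processing chain described here, assuming a synthetic frequency response in place of the simulator output: a degree-three spline is fitted to a coarse subset of the data with SciPy's CubicSpline and the MSE against the full data is computed.

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Hypothetical synthetic SBL response: received field magnitude vs. frequency.
freq = np.linspace(0.1, 10.0, 200)                     # transmitted frequency (Hz)
response = np.exp(-0.4 * freq) * (1 + 0.05 * np.sin(3 * freq))

# Fit a cubic spline to a coarse subset, then compare with the full data via MSE.
coarse = slice(None, None, 10)
spline = CubicSpline(freq[coarse], response[coarse])   # piecewise cubic, degree three
interpolated = spline(freq)
mse = float(np.mean((response - interpolated) ** 2))
print("MSE between original and interpolated data:", mse)
```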

Keywords: Spline Interpolation, Mean Square Error, Sea Bed Logging, Controlled Source Electromagnetic

4038 Enhancing Multi-Frame Images Using Self-Delaying Dynamic Networks

Authors: Lewis E. Hibell, Honghai Liu, David J. Brown

Abstract:

This paper presents the use of a newly created network structure known as a Self-Delaying Dynamic Network (SDN) to create a high-resolution image from a set of time-stepped input frames. These SDNs are non-recurrent temporal neural networks which can process time-sampled data. SDNs can store input data for a lifecycle and feature dynamic logic-based connections between layers. Several low-resolution images and one high-resolution image of a scene were presented to the SDN during training by a genetic algorithm. The SDN was trained to process the input frames in order to recreate the high-resolution image. The trained SDN was then used to enhance a number of unseen noisy image sets. The quality of the high-resolution images produced by the SDN is compared to that of high-resolution images generated using bicubic interpolation. The SDN-produced images are superior in several ways to the images produced using bicubic interpolation.

Keywords: Image Enhancement, Neural Networks, Multi-Frame.

4037 Human Action Recognition Based on Ridgelet Transform and SVM

Authors: A. Ouanane, A. Serir

Abstract:

In this paper, a novel algorithm based on the Ridgelet transform and support vector machines is proposed for human action recognition. The Ridgelet transform is a directional multi-resolution transform, which makes it well suited to describing human actions by exploiting directional information to form spatial feature vectors. The dynamic transition between the spatial features is handled using both Principal Component Analysis and the k-means clustering algorithm. First, Principal Component Analysis is used to reduce the dimensionality of the obtained vectors. Then, the k-means algorithm is used to quantize the obtained vectors into a spatio-temporal pattern, called a set of labels, according to the given periodicity of the human action. Finally, a Support Vector Machine classifier is used to discriminate between the different human actions. Different tests are conducted on popular datasets, such as Weizmann and KTH. The obtained results show that the proposed method provides a higher accuracy rate and is more robust in very challenging situations such as lighting changes, scaling and dynamic environments.
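
The recognition pipeline after the Ridgelet stage can be sketched with scikit-learn: PCA for dimensionality reduction, k-means to turn frame descriptors into a set of labels, a per-video label histogram, and an SVM classifier. The frame features, labels and all parameters below are synthetic placeholders, not the paper's setup.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from sklearn.svm import SVC

# Hypothetical stand-in data: each action video gives a sequence of frame-level
# feature vectors (here random; in the paper they come from the Ridgelet transform).
rng = np.random.default_rng(5)
n_videos, n_frames, dim = 60, 30, 128
actions = rng.integers(0, 3, n_videos)                 # hypothetical action labels
videos = rng.normal(size=(n_videos, n_frames, dim)) + actions[:, None, None]

# 1) Reduce the dimensionality of all frame descriptors.
frames = videos.reshape(-1, dim)
reduced = PCA(n_components=16).fit_transform(frames)

# 2) Quantise frames into a codebook ("set of labels") with k-means.
labels = KMeans(n_clusters=20, n_init=10, random_state=0).fit_predict(reduced)
label_seq = labels.reshape(n_videos, n_frames)

# 3) Represent each video by its label histogram and classify with an SVM.
hist = np.stack([np.bincount(seq, minlength=20) for seq in label_seq]).astype(float)
clf = SVC(kernel="rbf").fit(hist[:40], actions[:40])
print("held-out accuracy:", clf.score(hist[40:], actions[40:]))
```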

Keywords: Human action, Ridgelet Transform, PCA, K-means, SVM.

4036 Real-time Tracking in Image Sequences based-on Parameters Updating with Temporal and Spatial Neighborhoods Mixture Gaussian Model

Authors: Hu Haibo, Zhao Hong

Abstract:

The Gaussian mixture background model is widely used in moving target detection in image sequences. However, the traditional Gaussian mixture background model usually considers only the temporal continuity of the pixels, and establishes the background through the statistical distribution of pixels without taking into account the pixels' spatial similarity, which causes noise, imperfections and other problems. This paper proposes a new Gaussian mixture modeling approach, which combines the color and gradient of the spatial information and integrates the spatial information of the pixel sequences to establish the Gaussian mixture background. The experimental results show that the background can be extracted accurately and efficiently, and that the algorithm is more robust and can work in real time in tracking applications.
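
For reference, the classical purely temporal per-pixel Gaussian mixture background model (the baseline this paper extends with spatial colour and gradient information) is available in OpenCV; a minimal usage sketch on synthetic frames, with illustrative parameters, follows.

```python
import numpy as np
import cv2

# Per-pixel temporal Gaussian mixture background subtractor (MOG2) from OpenCV.
subtractor = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=16,
                                                detectShadows=False)

rng = np.random.default_rng(6)
for i in range(200):
    # Hypothetical synthetic frames: static noisy background with a moving square.
    frame = (20 * rng.random((240, 320)) + 100).astype(np.uint8)
    x = 10 + i
    frame[100:140, x:x + 40] = 255
    fg_mask = subtractor.apply(frame)          # 255 = foreground, 0 = background

print("foreground pixels in last frame:", int(np.count_nonzero(fg_mask)))
```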

Keywords: Gaussian mixture model, real-time tracking, sequence image, gradient.

4035 Image Magnification Using Adaptive Interpolation by Pixel Level Data-Dependent Geometrical Shapes

Authors: Muhammad Sajjad, Naveed Khattak, Noman Jafri

Abstract:

The world has entered the 21st century. The technology of computer graphics and digital cameras is prevalent, and high-resolution displays and printers are available. Therefore, high-resolution images are needed in order to produce high-quality display images and high-quality prints. However, since high-resolution images are not usually provided, there is a need to magnify the original images. One common difficulty in previous magnification techniques is that of preserving details, i.e. edges, and at the same time smoothing the data so as not to introduce spurious artefacts; a definitive solution to this is still an open issue. In this paper an image magnification method using adaptive interpolation by pixel-level data-dependent geometrical shapes is proposed that tries to take into account information about the edges (sharp luminance variations) and the smoothness of the image. It calculates a threshold, classifies the interpolation region in the form of geometrical shapes, and then assigns suitable values inside the interpolation region to the undefined pixels while preserving the sharp luminance variations and smoothness at the same time. The results of the proposed technique have been compared qualitatively and quantitatively with five other techniques. The qualitative results show that the proposed method clearly outperforms nearest neighbour (NN), bilinear (BL) and bicubic (BC) interpolation. The quantitative results are competitive and consistent with NN, BL, BC and the others.

Keywords: Adaptive, digital image processing, image magnification, interpolation, geometrical shapes, qualitative & quantitative analysis.

4034 Single Frame Supercompression of Still Images, Video, High Definition TV and Digital Cinema

Authors: Mario Mastriani

Abstract:

Super-resolution is nowadays used to produce a high-resolution image from several low-resolution noisy frames. In this work, we consider the problem of high-quality interpolation of a single noise-free image. Such images may come from different sources, i.e., they may be frames of videos, individual pictures, etc. In the encoder we apply downsampling via bidimensional interpolation of each frame, and in the decoder we apply upsampling by which we restore the original size of the image. If the compression ratio is very high, then we use a convolutive mask that restores the edges, eliminating the blur. Finally, both the encoder and the complete decoder are implemented on General-Purpose computation on Graphics Processing Units (GPGPU) cards. In fact, the mentioned mask is coded inside the texture memory of a GPGPU.

Keywords: General-Purpose computation on Graphics Processing Units, Image Compression, Interpolation, Super-resolution.

4033 Artifacts in Spiral X-ray CT Scanners: Problems and Solutions

Authors: Mehran Yazdi, Luc Beaulieu

Abstract:

Artifacts are one of the most important factors degrading CT image quality, and they play an important role in diagnostic accuracy. In this paper, some artifacts that typically appear in spiral CT are introduced. The different factors that cause the artifacts, such as the patient, the equipment and the interpolation algorithm, are discussed, and new developments and image processing algorithms to prevent or reduce them are presented.

Keywords: CT artifacts, Spiral CT, Artifact removal.

4032 A New Muscle Architecture Model with Non-Uniform Distribution of Muscle Fiber Types

Authors: Javier Navallas, Armando Malanda, Luis Gila, Javier Rodriguez, Ignacio Rodriguez

Abstract:

According to previous studies, some muscles present a non-homogeneous spatial distribution of their muscle fiber types and motor unit types. However, available muscle models only deal with muscles with homogeneous distributions. In this paper, a new muscle architecture model is proposed that permits the construction of non-uniform distributions of muscle fibers within the muscle cross section. The idea behind it is the use of a motor unit placement algorithm that controls the spatial overlapping of the motor unit territories of each motor unit type. Results show the capability of the new algorithm to reproduce arbitrary muscle fiber type distributions.

Keywords: muscle model, muscle architecture, motor unit, EMG simulation.

4031 Educational Values of Virtual Reality: The Case of Spatial Ability

Authors: Elinda Ai-Lim Lee, Kok Wai Wong, Chun Che Fung

Abstract:

The use of Virtual Reality (VR) in schools and higher education is proliferating. Due to its interactive and animated features, it is regarded as a promising technology for increasing students' spatial ability. Spatial ability is assumed to have a prominent role in science and engineering domains. However, research concerning individual differences such as spatial ability in the context of VR is still in its infancy. Moreover, empirical studies that focus on the features of VR that improve spatial ability are to date rare. Thus, this paper explores the possible educational values of VR in relation to spatial ability, to call for more research concerning spatial ability in the context of VR based on studies in computer-based learning. It is believed that the incorporation of state-of-the-art VR technology for educational purposes should be justified by the enhanced benefits for the target learners.

Keywords: Ability-as-compensator, ability-as-enhancer, spatial ability, virtual reality.
