Search results for: Sequence retrieval.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 745

355 Lung Nodule Detection in CT Scans

Authors: M. Antonelli, G. Frosini, B. Lazzerini, F. Marcelloni

Abstract:

In this paper we describe a computer-aided diagnosis (CAD) system for automated detection of pulmonary nodules in computed-tomography (CT) images. After extracting the pulmonary parenchyma using a combination of image processing techniques, a region growing method is applied to detect nodules based on 3D geometric features. We applied the CAD system to CT scans collected in a screening program for lung cancer detection. Each scan consists of a sequence of about 300 slices stored in DICOM (Digital Imaging and Communications in Medicine) format. All malignant nodules were detected and a low false-positive detection rate was achieved.
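As an illustration of the region-growing step described above, the sketch below grows a 3D region from a seed voxel using a fixed intensity threshold and 6-connectivity; the actual CAD system's growing criteria and 3D geometric features are not given in the abstract, so the threshold rule here is an assumption.

```python
import numpy as np
from collections import deque

def region_grow_3d(volume, seed, threshold):
    """Grow a 3D region from a seed voxel, adding 6-connected neighbours whose
    intensity lies within `threshold` of the seed intensity.
    `volume` is a 3D numpy array (slices x rows x cols), `seed` a (z, y, x) tuple."""
    seed_val = float(volume[seed])
    mask = np.zeros(volume.shape, dtype=bool)
    mask[seed] = True
    queue = deque([seed])
    offsets = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    while queue:
        z, y, x = queue.popleft()
        for dz, dy, dx in offsets:
            nz, ny, nx = z + dz, y + dy, x + dx
            if (0 <= nz < volume.shape[0] and 0 <= ny < volume.shape[1]
                    and 0 <= nx < volume.shape[2] and not mask[nz, ny, nx]
                    and abs(float(volume[nz, ny, nx]) - seed_val) <= threshold):
                mask[nz, ny, nx] = True
                queue.append((nz, ny, nx))
    return mask
```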

Keywords: computer assisted diagnosis, medical image segmentation, shape recognition.

Downloads: 1828
354 SC-LSH: An Efficient Indexing Method for Approximate Similarity Search in High Dimensional Space

Authors: Sanaa Chafik, Imane Daoudi, Mounim A. El Yacoubi, Hamid El Ouardi

Abstract:

Locality Sensitive Hashing (LSH) is one of the most promising techniques for solving the nearest neighbour search problem in high dimensional space. Euclidean LSH is the most popular variation of LSH and has been successfully applied in many multimedia applications. However, Euclidean LSH presents limitations that affect structure and query performance. Its main limitation is large memory consumption: in order to achieve good accuracy, a large number of hash tables is required. In this paper, we propose a new hashing algorithm to overcome the storage space problem and improve query time, while keeping accuracy similar to that achieved by the original Euclidean LSH. Experimental results on a real large-scale dataset show that the proposed approach achieves good performance and consumes less memory than the Euclidean LSH.
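For background, the sketch below shows the standard Euclidean (p-stable) LSH hash family the abstract builds on, where h(x) = floor((a·x + b)/w); the bucket width w and number of hash functions are illustrative, and the paper's own SC-LSH construction is not reproduced here.

```python
import numpy as np

class EuclideanLSH:
    """Standard p-stable LSH for Euclidean distance: h(x) = floor((a.x + b) / w)."""
    def __init__(self, dim, n_hashes=8, w=4.0, seed=0):
        rng = np.random.default_rng(seed)
        self.a = rng.normal(size=(n_hashes, dim))    # Gaussian projection vectors
        self.b = rng.uniform(0.0, w, size=n_hashes)  # random offsets in [0, w)
        self.w = w

    def hash(self, x):
        # Each point maps to a tuple of integer bucket indices (one per hash function).
        return tuple(np.floor((self.a @ x + self.b) / self.w).astype(int))

# Usage: points whose hash tuples collide are candidate near neighbours.
lsh = EuclideanLSH(dim=128)
x = np.random.rand(128)
y = x + 0.01 * np.random.rand(128)   # a nearby point
print(lsh.hash(x) == lsh.hash(y))    # likely True for close points
```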

Keywords: Approximate Nearest Neighbor Search, Content based image retrieval (CBIR), Curse of dimensionality, Locality sensitive hashing, Multidimensional indexing, Scalability.

Downloads: 2577
353 Combined Simulated Annealing and Genetic Algorithm to Solve Optimization Problems

Authors: Younis R. Elhaddad

Abstract:

Combinatorial optimization problems arise in many scientific and practical applications. Therefore, many researchers try to find or improve methods that solve these problems with high-quality results and in less time. The Genetic Algorithm (GA) and Simulated Annealing (SA) have both been used to solve optimization problems. Both GA and SA search a solution space through a sequence of iterative states. However, there are also significant differences between them: the GA mechanism works in parallel on a set of solutions and exchanges information using the crossover operation, whereas SA works on a single solution at a time. In this work, SA and GA are combined using a new technique in order to overcome the disadvantages of both algorithms.
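The abstract does not detail the combination technique, so the sketch below shows only one common way to hybridize the two metaheuristics: offspring produced by GA crossover and mutation are accepted through an SA-style Boltzmann criterion with a cooling temperature. All operators and parameters here are placeholders.

```python
import math
import random

def hybrid_ga_sa(fitness, init_pop, crossover, mutate,
                 generations=100, temp=1.0, cooling=0.95):
    """One possible GA/SA hybrid (minimisation): a child replaces its worse parent
    if it is better, or with a Boltzmann probability that shrinks as temperature cools."""
    pop = list(init_pop)
    for _ in range(generations):
        random.shuffle(pop)
        next_pop = []
        for p1, p2 in zip(pop[::2], pop[1::2]):      # assumes an even population size
            child = mutate(crossover(p1, p2))
            parent = min(p1, p2, key=fitness)        # best of the two parents
            delta = fitness(child) - fitness(parent)
            # SA-style acceptance: always keep a better child, sometimes a worse one.
            if delta < 0 or random.random() < math.exp(-delta / temp):
                next_pop += [child, parent]
            else:
                next_pop += [p1, p2]
        pop = next_pop
        temp *= cooling                               # cooling schedule
    return min(pop, key=fitness)

# Toy usage: minimise x^2 over real-valued "chromosomes".
best = hybrid_ga_sa(fitness=lambda x: x * x,
                    init_pop=[random.uniform(-10, 10) for _ in range(20)],
                    crossover=lambda a, b: (a + b) / 2,
                    mutate=lambda x: x + random.gauss(0, 0.5))
```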

Keywords: Genetic Algorithm, Optimization problems, Simulated Annealing, Traveling Salesman Problem

Downloads: 3443
352 AGV Guidance System: An Application of Simple Active Contour for Visual Tracking

Authors: M. Asif, M. R. Arshad, P. A. Wilson

Abstract:

In this paper, a simple active contour based visual tracking algorithm is presented for an outdoor AGV application currently under development at the USM robotic research group (URRG) lab. The presented algorithm is computationally low cost, is able to track road boundaries in an image sequence, and can easily be implemented on available low-cost hardware. The proposed algorithm uses active shape modeling with a B-spline deformable template and a recursive curve fitting method to track the current orientation of the road.

Keywords: Active contour, B-spline, recursive curve fitting.

Downloads: 2121
351 Programming Language Extension Using Structured Query Language for Database Access

Authors: Chapman Eze Nnadozie

Abstract:

Relational databases constitute a vital tool for the effective management and administration of both personal and organizational data. Data access ranges from single-user database management software to more complex distributed server systems. This paper appraises the use of a programming language extension, structured query language (SQL), to establish links to a relational database (Microsoft Access 2013) from the Visual C++ 9 programming environment. The methodology involves the creation of tables to form a database using Microsoft Access 2013, which is Object Linking and Embedding (OLE) database compliant. SQL commands are used to query the tables in the database for easy extraction of the expected records inside the Visual C++ environment. The findings of this paper reveal that records can easily be accessed and manipulated to filter exactly what the user wants, such as retrieval of records meeting specified criteria, updating of records, and deletion of some or all records in a table.
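Purely to illustrate the kinds of SQL statements described (selection by criteria, update, and deletion), the sketch below uses Python's built-in sqlite3 module as a stand-in for the Visual C++/OLE link to Access; the table and column names are hypothetical.

```python
import sqlite3

# In-memory database as a stand-in for the Access database described in the paper.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE students (id INTEGER PRIMARY KEY, name TEXT, score REAL)")
cur.executemany("INSERT INTO students (name, score) VALUES (?, ?)",
                [("Ada", 78.5), ("Bayo", 62.0), ("Chidi", 91.0)])

# Retrieval of records meeting a specified criterion.
cur.execute("SELECT name, score FROM students WHERE score >= ?", (70,))
print(cur.fetchall())            # [('Ada', 78.5), ('Chidi', 91.0)]

# Updating records.
cur.execute("UPDATE students SET score = score + 5 WHERE name = ?", ("Bayo",))

# Deletion of part of the records in a table.
cur.execute("DELETE FROM students WHERE score < ?", (70,))
conn.commit()
```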

Keywords: Data access, database, database management system, OLE, programming language, records, relational database, software, SQL, table.

Downloads: 721
350 Attack Detection through Image Adaptive Self Embedding Watermarking

Authors: S. Shefali, S. M. Deshpande, S. G. Tamhankar

Abstract:

Nowadays, a significant number of commercial and governmental organisations such as museums, cultural organizations, libraries and commercial enterprises invest intensively in new technologies for image digitization, digital libraries, image archiving and retrieval. Hence, image authorization, authentication and security have become a prime need. In this paper, we present a semi-fragile watermarking scheme for color images. The method converts the host image into the YIQ color space, followed by application of the orthogonal dual domains of the DCT and DWT transforms. The DCT helps to separate relevant from irrelevant image content to generate salient image features. The DWT has excellent spatial localisation, which aids spatial tamper characterisation. Thus, an image-adaptive watermark is generated based on image features, which allows sharp detection of microscopic changes and localisation of modifications in the image. Further, the scheme utilises a multipurpose watermark consisting of a soft authenticator watermark and a chrominance watermark, which has been proved fragile to predefined processing such as intentional fabrication of the image or forgery, and robust to other incidental attacks caused in the communication channel.

Keywords: Cryptography, Discrete Cosine Transform (DCT), Discrete Wavelet Transform (DWT), Watermarking.

Downloads: 2043
349 Towards End-To-End Disease Prediction from Raw Metagenomic Data

Authors: Maxence Queyrel, Edi Prifti, Alexandre Templier, Jean-Daniel Zucker

Abstract:

Analysis of the human microbiome using metagenomic sequencing data has demonstrated high ability in discriminating various human diseases. Raw metagenomic sequencing data require multiple complex and computationally heavy bioinformatics steps prior to data analysis. Such data contain millions of short sequences read from the fragmented DNA sequences and stored as fastq files. Conventional processing pipelines consist of multiple steps, including quality control, filtering, and alignment of sequences against genomic catalogs (genes, species, taxonomic levels, functional pathways, etc.). These pipelines are complex to use, time consuming, and rely on a large number of parameters that often introduce variability and impact the estimation of microbiome elements. Training Deep Neural Networks directly from raw sequencing data is a promising approach to bypass some of the challenges associated with mainstream bioinformatics pipelines. Most of these methods use the concept of word and sentence embeddings, which creates a meaningful and numerical representation of DNA sequences while extracting features and reducing the dimensionality of the data. In this paper, we present an end-to-end approach that classifies patients into disease groups directly from raw metagenomic reads: metagenome2vec. This approach is composed of four steps: (i) generating a vocabulary of k-mers and learning their numerical embeddings; (ii) learning DNA sequence (read) embeddings; (iii) identifying the genome from which the sequence is most likely to come; and (iv) training a multiple instance learning classifier which predicts the phenotype based on the vector representation of the raw data. An attention mechanism is applied in the network so that the model can be interpreted, assigning a weight to each genome's influence on the prediction. Using two public real-life datasets as well as a simulated one, we demonstrate that this original approach reaches high performance, comparable with state-of-the-art methods applied directly to data processed through mainstream bioinformatics workflows. These results are encouraging for this proof of concept work. We believe that with further dedication, the DNN models have the potential to surpass mainstream bioinformatics workflows in disease classification tasks.
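To make step (i) concrete, the sketch below tokenizes reads into overlapping k-mers, producing the vocabulary on which numerical embeddings would then be learned; k = 4 and the example reads are illustrative, and the embedding training itself is omitted.

```python
from collections import Counter

def kmer_tokenize(read, k=4):
    """Split a DNA read into overlapping k-mers (step (i): vocabulary generation)."""
    return [read[i:i + k] for i in range(len(read) - k + 1)]

# Illustrative reads; real input would come from fastq files.
reads = ["ACGTACGGT", "TTACGTACG"]
vocab = Counter()
for r in reads:
    vocab.update(kmer_tokenize(r))

print(vocab.most_common(3))   # most frequent 4-mers across the reads
# A k-mer embedding model (e.g. a word2vec-style model) would then be trained on
# these token sequences, and read embeddings derived from them (step (ii)).
```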

Keywords: Metagenomics, phenotype prediction, deep learning, embeddings, multiple instance learning.

Downloads: 911
348 Payment for Pain: Differences between Hypothetical and Real Preferences

Authors: J. Trarbach, S. Schosser, B. Vogt

Abstract:

Decision-makers tend to prefer the first alternative over subsequent alternatives, a tendency called the primacy effect. To reliably measure this effect, we conducted an experiment with real consequences for preference statements. We elicited the preferences of subjects using a rating scale, i.e. hypothetical preferences, and willingness to pay, i.e. real preferences, for two sequences of pain. Within these sequences, both the overall intensity and the duration of pain are identical. Hence, a rational decision-maker should be indifferent, whereas the primacy effect predicts a stronger preference for the first sequence. We observe a primacy effect only for hypothetical preferences; the effect vanishes for real preferences.

Keywords: Decision making, primacy effect, real incentives, willingness to pay.

Downloads: 872
347 Soccer Video Edition Using a Multimodal Annotation

Authors: Fendri Emna, Ben-Abdallah Hanêne, Ben-Hamadou Abdelmajid

Abstract:

In this paper, we present an approach for soccer video edition using a multimodal annotation. We propose to associate with each video sequence of a soccer match a textual document to be used for further exploitation such as search, browsing and abstract edition. The textual document contains video metadata, match metadata, and match data. This document, generated automatically while the video is analyzed, segmented and classified, can be enriched semi-automatically according to the user type and/or a specialized recommendation system.

Keywords: XML, Multimodal Annotation, recommendation system.

Downloads: 1441
346 Comparison of Multi-User Detectors of DS-CDMA System

Authors: Kavita Khairnar, Shikha Nema

Abstract:

The DS-CDMA system is a well-known wireless technology. This system suffers from MAI (Multiple Access Interference) caused by direct-sequence users. Multi-user detection schemes were introduced to detect the users' data in the presence of MAI. This paper focuses on linear multi-user detection schemes used for data demodulation. Simulation results depict the performance of three detectors, viz. the conventional detector, the decorrelating detector, and the subspace MMSE (Minimum Mean Square Error) detector. It is seen that the performance of these detectors depends on the number of paths and the length of the Gold code used.
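As a pointer to what the decorrelating detector does, the sketch below applies the inverse of the code cross-correlation matrix to the matched-filter outputs; the spreading codes and noise level are synthetic stand-ins for the Gold codes and channel used in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic example: 3 users, spreading codes of length 8 (stand-ins for Gold codes).
S = rng.choice([-1.0, 1.0], size=(8, 3)) / np.sqrt(8)   # columns = normalized codes
b = np.array([1.0, -1.0, 1.0])                           # transmitted bits
r = S @ b + 0.1 * rng.normal(size=8)                     # received chip vector + noise

y = S.T @ r                 # matched-filter (conventional detector) outputs, with MAI
R = S.T @ S                 # cross-correlation matrix of the codes
z = np.linalg.solve(R, y)   # decorrelating detector: removes MAI at the cost of noise enhancement

print(np.sign(y), np.sign(z))   # detected bits from each detector
```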

Keywords: Cross Correlation Matrix, MAI, Multi-User Detection, Multipath Effect.

Downloads: 2113
345 Key Frame Based Video Summarization via Dependency Optimization

Authors: Janya Sainui

Abstract:

With the rapid growth of digital video and data communications, video summarization, which provides a shorter version of a video for fast browsing and retrieval, is necessary. Key frame extraction is one of the mechanisms used to generate a video summary. In general, the extracted key frames should both represent the entire video content and contain minimum redundancy. However, most existing approaches select key frames heuristically; hence, the selected key frames may not be the most distinct frames and/or may not cover the entire content of the video. In this paper, we propose a method of video summarization which provides reasonable objective functions for selecting key frames. In particular, we apply a statistical dependency measure called quadratic mutual information as our objective function for maximizing the coverage of the entire video content as well as minimizing the redundancy among selected key frames. The proposed key frame extraction algorithm finds key frames by solving an optimization problem. Through experiments, we demonstrate the success of the proposed video summarization approach, which produces a video summary with better coverage of the entire video content and less redundancy among key frames compared to state-of-the-art approaches.

Keywords: Video summarization, key frame extraction, dependency measure, quadratic mutual information, optimization.

Downloads: 964
344 Domain Knowledge Representation through Multiple Sub Ontologies: An Application Interoperability

Authors: Sunitha Abburu, Golla Suresh Babu

Abstract:

The issues that limit application interoperability are the lack of a common vocabulary, common structure, and application domain knowledge; ontology-based semantic technology provides solutions that resolve these interoperability issues. Ontology is broadly used in diverse applications such as artificial intelligence, bioinformatics, biomedicine, information integration, etc. Ontologies can be used to interpret the knowledge of various domains. To reuse and enrich the available ontologies and reduce the duplication of ontologies of the same domain, there is a strong need to integrate the ontologies of a particular domain. The integrated ontology gives complete knowledge about the domain, and this comprehensive domain ontology can be shared among groups. As per the literature survey, there is no well-defined methodology to represent the knowledge of a whole domain. The current research addresses a systematic methodology for knowledge representation using multiple sub-ontologies at different levels that addresses application interoperability and enables semantic information retrieval. The current method represents the complete knowledge of a domain by importing concepts from multiple sub-ontologies of the same and related domains, which reduces ontology duplication, rework and implementation cost through ontology reusability.

Keywords: Knowledge acquisition, knowledge representation, knowledge transfer, ontologies, semantics.

Downloads: 971
343 Face Detection in Color Images using Color Features of Skin

Authors: Fattah Alizadeh, Saeed Nalousi, Chiman Savari

Abstract:

Because of increasing demands for security in today's society, and because of growing attention to machine vision, biometric research, pattern recognition and data retrieval in color images, face detection has found more applications. In this article, we present a scientific approach for modeling human skin color, and we also offer an algorithm that detects faces within color images by combining skin features with a threshold determined in the model. The proposed model is based on statistical data in different color spaces. The offered algorithm, using a specified color threshold, first divides image pixels into two groups, skin pixels and non-skin pixels, and then, based on some geometric features of the face, decides which area belongs to a face. Two main results obtained from this research are as follows: first, the proposed model can be applied easily to different databases and color spaces to establish a proper threshold; second, our algorithm can adapt itself to runtime conditions, and its results demonstrate desirable progress in comparison with similar cases.
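A minimal sketch of the skin-thresholding step follows; it uses fixed Cb/Cr bounds in the YCbCr space as a commonly cited rule of thumb, whereas the paper derives its own statistical thresholds, so the specific numbers here are assumptions.

```python
import numpy as np

def skin_mask_ycbcr(rgb):
    """Classify pixels as skin / non-skin with fixed Cb/Cr bounds (illustrative values).
    `rgb` is an HxWx3 uint8 array in RGB channel order."""
    rgb = rgb.astype(np.float32)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    # RGB -> chrominance (ITU-R BT.601 YCbCr conversion).
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    # Rule-of-thumb skin cluster in the Cb/Cr plane; real thresholds should be
    # estimated from labelled data, as the paper does.
    return (77 <= cb) & (cb <= 127) & (133 <= cr) & (cr <= 173)

# A geometric post-processing stage (e.g. checking the aspect ratio and hole
# pattern of connected skin regions) would then decide which regions are faces.
```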

Keywords: face detection, skin color modeling, color, color images, face recognition.

Downloads: 2313
342 Fixed Point Theorems for Set Valued Mappings in Partially Ordered Metric Spaces

Authors: Ismat Beg, Asma Rashid Butt

Abstract:

Let (X, ⪯) be a partially ordered set and d be a metric on X such that (X, d) is a complete metric space. Assume that X satisfies: if a non-decreasing sequence xn → x in X, then xn ⪯ x for all n. Let F be a set valued mapping from X into X with nonempty closed bounded values satisfying: (i) there exists κ ∈ (0, 1) with D(F(x), F(y)) ≤ κd(x, y) for all x ⪯ y; (ii) if d(x, y) < ε < 1 for some y ∈ F(x), then x ⪯ y; (iii) there exist x0 ∈ X and some x1 ∈ F(x0) with x0 ⪯ x1 such that d(x0, x1) < 1. It is shown that F has a fixed point. Several consequences are also obtained.

Keywords: Fixed point, partially ordered set, metric space, set valued mapping.

Downloads: 2033
341 Impact of Similarity Ratings on Human Judgement

Authors: Ian A. McCulloh, Madelaine Zinser, Jesse Patsolic, Michael Ramos

Abstract:

Recommender systems are a common artificial intelligence (AI) application. For any given input, a search system will return a rank-ordered list of similar items. As users review returned items, they must decide when to halt the search and either revise search terms or conclude their requirement is novel with no similar items in the database. We present a statistically designed experiment that investigates the impact of similarity ratings on human judgement to conclude a search item is novel and halt the search. In the study, 450 participants were recruited from Amazon Mechanical Turk to render judgement across 12 decision tasks. We find the inclusion of ratings increases the human perception that items are novel. Percent similarity increases novelty discernment when compared with star-rated similarity or the absence of a rating. Ratings reduce the time to decide and improve decision confidence. This suggests that the inclusion of similarity ratings can aid human decision-makers in knowledge search tasks.

Keywords: Ratings, rankings, crowdsourcing, empirical studies, user studies, similarity measures, human-centered computing, novelty in information retrieval.

Downloads: 437
340 Constraints on IRS Control: An Alternative Approach to Tax Gap Analysis

Authors: J. T. Manhire

Abstract:

A tax authority wants to take actions it knows will foster the greatest degree of voluntary taxpayer compliance to reduce the "tax gap." This paper suggests that even if a tax authority could attain a state of complete knowledge, there are constraints on whether and to what extent such actions would reduce the macro-level tax gap. These limits are not merely a consequence of finite agency resources; they are inherent in the system itself. To show that this is one possible interpretation of the tax gap data, the paper formulates known results in a different way by analyzing tax compliance as a population with a single covariate. This leads to a standard use of the logistic map to analyze the dynamics of non-compliance growth or decay over a sequence of periods. This formulation gives the same results as the tax gap studies performed in the U.S. over the past fifty years, within the published margins of error. Limitations and recommendations for future work are discussed, along with some implications for tax policy.
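For readers unfamiliar with it, the logistic map mentioned above iterates x(n+1) = r·x(n)·(1 − x(n)); the sketch below simply runs that recurrence, with the growth parameter and initial non-compliance rate chosen for illustration rather than taken from the paper's fit.

```python
def logistic_map(x0, r, periods):
    """Iterate x_{n+1} = r * x_n * (1 - x_n), with x_n read here as a
    non-compliance rate in period n."""
    xs = [x0]
    for _ in range(periods):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

# Illustrative parameters only; the paper fits its formulation to published tax gap data.
trajectory = logistic_map(x0=0.15, r=2.8, periods=20)
print([round(x, 3) for x in trajectory[:5]])
# For 1 < r < 3 the sequence settles toward the fixed point 1 - 1/r.
```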

Keywords: Tax law, tax compliance, tax gap, income tax.

Downloads: 736
339 Judges System for Classifiers Specialization

Authors: Abdel Rodríguez, Isis Bonet, Ricardo Grau, María M. García

Abstract:

In this paper, we design and implement a new ensemble of classifiers based on a sequence of classifiers, each specialized in the regions of the training dataset where the errors of its trained homologues are concentrated. In order to separate these regions, and to determine the aptitude of each classifier to respond properly to a new case, another set of hierarchically built classifiers was used. We explored a selection-based variant to combine the base classifiers. We validated this model with different base classifiers using 37 training datasets. A statistical comparison of these models with the well-known Bagging and Boosting was carried out, obtaining significantly superior results with the hierarchical ensemble using a Multilayer Perceptron as the base classifier. Therefore, we demonstrated the efficacy of the proposed ensemble, as well as its applicability to general problems.

Keywords: classifiers, delegation, ensemble

Downloads: 1306
338 Accurate Optical Flow Based on Spatiotemporal Gradient Constancy Assumption

Authors: Adam Rabcewicz

Abstract:

Variational methods for optical flow estimation are known for their excellent performance. The method proposed by Brox et al. [5] exemplifies the strength of that framework: it combines several concepts into a single energy functional that is then minimized according to a clear numerical procedure. In this paper, we propose a modification of that algorithm starting from the spatiotemporal gradient constancy assumption. The numerical scheme allows us to establish the connection between our model and the CLG(H) method introduced in [18]. Experimental evaluation carried out on synthetic sequences shows the significant superiority of the spatial variant of the proposed method. A comparison between the methods on a real-world sequence is also included.

Keywords: optical flow, variational methods, gradient constancy assumption.

Downloads: 2183
337 Automatic Reusability Appraisal of Software Components using Neuro-fuzzy Approach

Authors: Parvinder S. Sandhu, Hardeep Singh

Abstract:

Automatic reusability appraisal could be helpful in evaluating the quality of developed or developing reusable software components and in identifying reusable components from existing legacy systems; this can save the cost of developing the software from scratch. However, the issue of how to identify reusable components from existing systems has remained relatively unexplored. In this paper, we present a two-tier approach that studies the structural attributes of a component as well as its usability or relevancy to a particular domain. Latent semantic analysis is used for the feature vector representation of various software domains. It exploits the fact that feature-vector codes can be seen as documents containing terms, namely the identifiers present in the components, so text modeling methods that capture co-occurrence information in low-dimensional spaces can be used. Further, we devised a Neuro-Fuzzy hybrid Inference System, which takes structural metric values as input and calculates the reusability of the software component. A decision tree algorithm is used to decide the initial set of fuzzy rules for the Neuro-Fuzzy system. The results obtained are convincing enough to propose the system for the economical identification and retrieval of reusable software components.
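As background on the latent semantic analysis step, the sketch below builds a small term-by-document matrix from identifier tokens and projects it into a low-dimensional space with a truncated SVD; the identifiers and the choice of two latent dimensions are illustrative only.

```python
import numpy as np

# Toy "documents": identifier tokens extracted from three hypothetical components.
docs = [["open", "file", "read", "buffer"],
        ["read", "socket", "buffer", "stream"],
        ["matrix", "invert", "solve", "vector"]]

terms = sorted({t for d in docs for t in d})
A = np.array([[d.count(t) for d in docs] for t in terms], dtype=float)  # term x doc counts

# Truncated SVD: keep k latent dimensions (the LSA step).
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
doc_vecs = (np.diag(s[:k]) @ Vt[:k]).T   # each row = a component "document" in latent space

def cos(u, v):
    """Cosine similarity between two latent-space vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12))

# Components sharing identifiers ("read", "buffer") end up closer in the reduced space.
print(cos(doc_vecs[0], doc_vecs[1]), cos(doc_vecs[0], doc_vecs[2]))
```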

Keywords: Clustering, ID3, LSA, Neuro-fuzzy System, SVD

Downloads: 1662
336 Electrodermal Activity Measurement Using Constant Current AC Source

Authors: Cristian Chacha, David Asiain, Jesús Ponce de León, José Ramón Beltrán

Abstract:

This work explores and characterizes the behavior of the AFE AD5941 in impedance measurement using an embedded algorithm that allows using a constant current AC source. The main aim of this research is to improve the exact measurement of impedance values for their application in EDA-focused wearable devices. Through comprehensive study and characterization, it has been observed that employing a measurement sequence with a constant current source produces results with increased dispersion but higher accuracy and a more linear behavior with respect to error. As a result, this approach leads to a more accurate system for impedance measurement.

Keywords: Electrodermal Activity, constant current AC source, wearable, precision, accuracy, impedance.

Downloads: 88
335 Digital Preservation in Nigeria Universities Libraries: A Comparison between University of Nigeria Nsukka and Ahmadu Bello University Zaria

Authors: Suleiman Musa, Shuaibu Sidi Safiyanu

Abstract:

This study examined digital preservation in Nigerian university libraries, comparing the University of Nigeria Nsukka (UNN) and Ahmadu Bello University Zaria (ABU, Zaria). The study utilized primary data obtained from librarians at the two selected institutions. Findings revealed varying results in terms of the skills acquired by librarians before and after digitization at the two institutions. The study reports that journal publications, textbooks, CD-ROMs, conference papers and proceedings, theses, dissertations and seminar papers are among the information resources available for digitization. The study further documents that copyright issues, power failure, and unavailability of needed materials are among the challenges facing library digitization at the institutions. On the basis of the findings, the study concluded that digitization of libraries enhances efficiency in the organization and retrieval of information services. The study therefore recommended that software should be upgraded with backups, that librarians should be trained in the digitization process, that antivirus software should be installed, and that technical collaboration between the library and MIS should be enhanced.

Keywords: Digitalization, preservation, libraries, comparison.

Downloads: 1726
334 CAD/CAM Algorithms for 3D Woven Multilayer Textile Structures

Authors: Martin A. Smith, Xiaogang Chen

Abstract:

This paper proposes new algorithms for the computer-aided design and manufacture (CAD/CAM) of 3D woven multi-layer textile structures. Existing commercial CAD/CAM systems are often restricted to the design and manufacture of 2D weaves. Those CAD/CAM systems that do support the design and manufacture of 3D multi-layer weaves are often limited to manual editing of design paper grids on the computer display and weave retrieval from stored archives. This complex design activity is time-consuming, tedious and error-prone and requires considerable experience and skill of a technical weaver. Recent research reported in the literature has addressed some of the shortcomings of commercial 3D multi-layer weave CAD/CAM systems. However, earlier research results have shown the need for further work on weave specification, weave generation, yarn path editing and layer binding. Analysis of 3D multi-layer weaves in this research has led to the design and development of efficient and robust algorithms for the CAD/CAM of 3D woven multi-layer textile structures. The resulting algorithmically generated weave designs can be used as a basis for lifting plans that can be loaded onto looms equipped with electronic shedding mechanisms for the CAM of 3D woven multi-layer textile structures.

Keywords: CAD/CAM, Multi-layer, Textile, Weave.

Downloads: 2571
333 A Simulation Software for DNA Computing Algorithms Implementation

Authors: M. S. Muhammad, S. M. W. Masra, K. Kipli, N. Zamhari

Abstract:

The capture of a gel electrophoresis image represents the output of a DNA computing algorithm. Before this image is captured, DNA computing involves parallel overlap assembly (POA) and the polymerase chain reaction (PCR), which are the core of this computing algorithm. However, the design of the DNA oligonucleotides to represent a problem is quite complicated and prone to errors. In order to reduce these errors during the design stage, before the actual in-vitro experiment is carried out, a simulation software capable of simulating the POA and PCR processes was developed. The simulation software's capability is unlimited in that a problem of any size and complexity can be simulated, thus saving cost due to possible errors during the design process. Information regarding the DNA sequence during the computing process, as well as the computing output, can be extracted at the same time using the simulation software.

Keywords: DNA computing, PCR, POA, simulation software

Downloads: 1816
332 Development of Perez-Du Mortier Calibration Algorithm for Ground-Based Aerosol Optical Depth Measurement with Validation using SMARTS Model

Authors: Jedol Dayou, Jackson Hian Wui Chang, Rubena Yusoff, Ag. Sufiyan Abd. Hamid, Fauziah Sulaiman, Justin Sentian

Abstract:

Aerosols are small particles suspended in air with widely varying spatial and temporal distributions. The concentration of aerosol in the total columnar atmosphere is normally measured using aerosol optical depth (AOD). At long-term monitoring stations, accurate AOD retrieval is often difficult due to the lack of frequent calibration. To overcome this problem, a near-sea-level Langley calibration algorithm is developed using a combination of a clear-sky detection model and a statistical filter. It attempts to produce a dataset consisting only of homogeneous and stable atmospheric conditions for the Langley calibration. In this paper, a radiance-based validation method is performed to further investigate the feasibility and consistency of the proposed algorithm at different locations, days, and times. The algorithm is validated against DNI values from the SMARTS model. The overall results confirm that the proposed calibration algorithm is feasible and consistent for measurements taken at different sites and under different weather conditions.
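For orientation, Langley calibration fits the Beer-Lambert relation ln V = ln V0 − τ·m over a range of air masses m to recover the extraterrestrial constant V0; the sketch below performs that fit by least squares on synthetic data, with τ and V0 chosen only for illustration.

```python
import numpy as np

# Synthetic Langley plot data: ln(V) = ln(V0) - tau * m  (Beer-Lambert law).
true_V0, true_tau = 1.25, 0.12           # illustrative values only
m = np.linspace(1.5, 6.0, 30)            # air mass range during a clear morning
V = true_V0 * np.exp(-true_tau * m)
V *= 1 + 0.01 * np.random.default_rng(0).normal(size=m.size)   # measurement noise

# Least-squares fit of ln(V) against m; the intercept gives ln(V0), the slope -tau.
slope, intercept = np.polyfit(m, np.log(V), 1)
V0_est, tau_est = np.exp(intercept), -slope
print(f"V0 ~ {V0_est:.3f}, tau ~ {tau_est:.3f}")

# With V0 calibrated, the optical depth of later measurements follows from
# tau = (ln(V0) - ln(V)) / m, after removing Rayleigh and gas contributions.
```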

Keywords: Aerosol optical depth, direct normal irradiance, Langley calibration, radiance-based validation, SMARTS.

Downloads: 1808
331 Secure and Efficient Transmission of Aggregated Data for Mobile Wireless Sensor Networks

Authors: A. Krishna Veni, R. Geetha

Abstract:

Wireless Sensor Networks (WSNs) are suitable for many real-world scenarios. The retrieval of data is made efficient by data aggregation techniques. Many techniques for data aggregation have been offered, and most of the existing schemes are neither energy efficient nor secure. Moreover, the existing techniques use the traditional clustering approach, in which there is a delay during packet transmission since there is no proper scheduling. The presented system uses the Velocity Energy-efficient and Link-aware Cluster-Tree (VELCT) scheme, in which a Data Collection Tree (DCT) improves the lifetime of the network. The VELCT scheme and the construction of the DCT reduce delay and traffic. The network lifetime can be increased by avoiding frequent changes in cluster topology. Secure and Efficient Transmission of Aggregated data (SETA) improves the security of data transmission via the trust values of the nodes prior to the aggregation of data. Since SETA considers only data from trustworthy nodes for aggregation, it is more secure in transmitting the data, thereby improving the accuracy of the aggregated data.

Keywords: Aggregation, lifetime, network security, wireless sensor network.

Downloads: 1217
330 A Novel Dual-Purpose Image Watermarking Technique

Authors: Maha Sharkas, Dahlia R. ElShafie, Nadder Hamdy

Abstract:

Image watermarking has proven to be quite an efficient tool for copyright protection and authentication over the last few years. In this paper, a novel image watermarking technique in the wavelet domain is suggested and tested. To achieve more security and robustness, the proposed technique relies on two nested watermarks that are embedded into the image to be watermarked. A primary watermark in the form of a PN sequence is first embedded into an image (the secondary watermark) before being embedded into the host image. The technique is implemented using Daubechies mother wavelets, where an arbitrary embedding factor α is introduced to improve the invisibility and robustness. The proposed technique has been applied to several gray scale images, where a PSNR of about 60 dB was achieved.
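A minimal sketch of PN-sequence embedding in the wavelet domain is given below, assuming the PyWavelets package; it uses a single additive, image-adaptive rule in the approximation band with an arbitrary factor α, and does not reproduce the paper's nested two-watermark scheme.

```python
import numpy as np
import pywt

def embed_pn_watermark(host, alpha=0.05, wavelet="db2", seed=7):
    """Illustrative additive embedding of a PN sequence into the DWT
    approximation band of a grayscale image; not the paper's nested scheme."""
    cA, (cH, cV, cD) = pywt.dwt2(host.astype(float), wavelet)
    rng = np.random.default_rng(seed)
    pn = rng.choice([-1.0, 1.0], size=cA.shape)       # +/-1 PN pattern
    cA_marked = cA + alpha * np.abs(cA) * pn          # image-adaptive additive rule
    watermarked = pywt.idwt2((cA_marked, (cH, cV, cD)), wavelet)
    return watermarked, pn

def detect_pn_watermark(image, pn, wavelet="db2"):
    """Correlation detector: a large positive value suggests the mark is present."""
    cA, _ = pywt.dwt2(image.astype(float), wavelet)
    return float(np.mean(cA * pn))

host = np.random.default_rng(0).uniform(0, 255, size=(128, 128))
marked, pn = embed_pn_watermark(host)
print(detect_pn_watermark(marked, pn) > detect_pn_watermark(host, pn))
```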

Keywords: Image watermarking, Multimedia Security, Wavelets, Image Processing.

Downloads: 1699
329 Blind Identification of MA Models Using Cumulants

Authors: Mohamed Boulouird, Moha M'Rabet Hassani

Abstract:

In this paper, many techniques for the blind identification of moving average (MA) processes are presented. These methods utilize third- and fourth-order cumulants of the noisy observations of the system output. The system is driven by an independent and identically distributed (i.i.d.) non-Gaussian sequence that is not observed. Two nonlinear optimization algorithms, namely the Gradient Descent and Gauss-Newton algorithms, are described. An algorithm based on the joint diagonalization of the fourth-order cumulant matrices (FOSI) is also considered, as well as an improved version of the classical C(q, 0, k) algorithm based on the choice of the best 1-D slice of fourth-order cumulants. To illustrate the effectiveness of our methods, various simulation examples are presented.

Keywords: Cumulants, Identification, MA models, Parameter estimation

Downloads: 1409
328 Enhancement Effect of Superparamagnetic Iron Oxide Nanoparticle-Based MRI Contrast Agent at Different Concentrations and Magnetic Field Strengths

Authors: Bimali Sanjeevani Weerakoon, Toshiaki Osuga, Takehisa Konishi

Abstract:

Magnetic Resonance Imaging contrast agents (MRI-CM) are significant in clinical and biological imaging as they have the ability to alter the normal tissue contrast, thereby affecting the signal intensity and enhancing the visibility and detectability of images. Superparamagnetic Iron Oxide (SPIO) nanoparticles, coated with dextran or carboxydextran, are currently available for clinical MR imaging of the liver. Most SPIO contrast agents are T2-shortening agents, and Resovist (Ferucarbotran) is a clinically tested, organ-specific SPIO agent with a low molecular weight carboxydextran coating. The enhancement effect of Resovist depends on its relaxivity, which in turn depends on factors such as magnetic field strength, concentration, nanoparticle properties, pH and temperature. Therefore, this study was conducted to investigate the impact of field strength and different contrast concentrations on the enhancement effect of Resovist. The study explored, by mathematical simulation, the MRI signal intensity of Resovist in the physiological range of plasma from a T2-weighted spin echo sequence at three magnetic field strengths, 0.47 T (r1=15, r2=101), 1.5 T (r1=7.4, r2=95), and 3 T (r1=3.3, r2=160), over a range of contrast concentrations. The relaxivities r1 and r2 (L mmol-1 s-1) were obtained from a previous study, and the selected concentrations were 0.05, 0.06, 0.07, 0.08, 0.09, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0, 2.0, and 3.0 mmol/L. T2-weighted images were simulated using a TR/TE of 2000 ms/100 ms. According to the reference literature, the r1 relaxivity tends to decrease with increasing magnetic field strength, while r2 did not show any systematic relationship with the selected field strengths. In parallel, this study's results revealed that the signal intensity of Resovist tends to be higher at lower concentrations than at higher concentrations. The highest signal intensity was observed at the low field strength of 0.47 T. The maximum signal intensities for 0.47 T, 1.5 T and 3 T were found at concentrations of 0.05, 0.06 and 0.05 mmol/L, respectively. At concentrations higher than these, the signal intensity decreased exponentially. An inverse relationship was found between field strength and T2 relaxation time: as the field strength increased, the T2 relaxation time decreased accordingly. However, the resulting T2 relaxation time was not significantly different between 0.47 T and 1.5 T in this study. Moreover, a linear correlation of the transverse relaxation rate (1/T2, s-1) with the concentration of Resovist can be observed. From these results, it can be concluded that the concentration of SPIO nanoparticle contrast agents and the field strength of the MRI are two important parameters that can affect the signal intensity of a T2-weighted SE sequence. Therefore, both parameters should be considered prudently when performing MR imaging.
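The simulation described can be sketched with the standard relaxivity-modified spin-echo signal model, S proportional to (1 − e^(−TR/T1))·e^(−TE/T2) with 1/T1 = 1/T1_0 + r1·C and 1/T2 = 1/T2_0 + r2·C; the baseline plasma relaxation times T1_0 and T2_0 below are placeholder assumptions, not the study's values.

```python
import numpy as np

def spin_echo_signal(C, r1, r2, TR=2.0, TE=0.1, T1_0=1.4, T2_0=0.25):
    """Standard spin-echo signal with relaxivity-modified relaxation rates:
    1/T1 = 1/T1_0 + r1*C, 1/T2 = 1/T2_0 + r2*C, S ~ (1 - exp(-TR/T1)) * exp(-TE/T2).
    Times in seconds (TR/TE = 2000 ms / 100 ms), C in mmol/L, r1/r2 in L mmol^-1 s^-1.
    T1_0 and T2_0 are placeholder baseline values for plasma, not the study's."""
    R1 = 1.0 / T1_0 + r1 * C
    R2 = 1.0 / T2_0 + r2 * C
    return (1.0 - np.exp(-TR * R1)) * np.exp(-TE * R2)

C = np.array([0.05, 0.1, 0.5, 1.0, 3.0])          # mmol/L, within the study's range
for B0, r1, r2 in [(0.47, 15, 101), (1.5, 7.4, 95), (3.0, 3.3, 160)]:
    print(B0, "T:", np.round(spin_echo_signal(C, r1, r2), 3))
# The signal peaks at low concentration and then decays as the r2*C term dominates.
```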

Keywords: Concentration, Resovist, Field strength, Relaxivity, Signal intensity.

Downloads: 1997
327 Visual Search Based Indoor Localization in Low Light via RGB-D Camera

Authors: Yali Zheng, Peipei Luo, Shinan Chen, Jiasheng Hao, Hong Cheng

Abstract:

Most traditional visual indoor navigation algorithms and methods only consider localization in ordinary daytime, while in this paper we focus on indoor re-localization in low light. As RGB images are degraded in low light, less discriminative infrared and depth image pairs taken by RGB-D cameras are used as the input, and the most similar candidates are retrieved as the output from a database built within the bag-of-words framework. Epipolar constraints can then be used to re-localize the query infrared and depth image sequence. We evaluate our method on two datasets captured by a Kinect2. The results demonstrate very promising re-localization performance for indoor navigation systems in low light environments.

Keywords: Indoor navigation, low light, RGB-D camera, vision based.

Downloads: 1678
326 On the Fast Convergence of DD-LMS DFE Using a Good Strategy Initialization

Authors: Y. Ben Jemaa, M. Jaidane

Abstract:

In wireless communication systems, a Decision Feedback Equalizer (DFE) is required to cancel intersymbol interference (ISI). In this paper, an exact convergence analysis of the DFE adapted by the Least Mean Square (LMS) algorithm during the training phase is derived, taking into account the finite alphabet context of data transmission. This allows us to determine the shortest training sequence that reaches a given Mean Square Error (MSE). With the intention of avoiding the problem of ill-convergence, the paper proposes an initialization strategy for the blind decision directed (DD) algorithm. This then yields a semi-blind DFE with high speed and good convergence.
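As an aside on the training phase discussed above, the sketch below runs a plain LMS-adapted DFE (feedforward taps on received samples, feedback taps on past training symbols) over a known sequence; the channel, filter lengths, and step size are illustrative, not those analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup: BPSK training symbols through a short ISI channel plus noise.
N, h = 2000, np.array([1.0, 0.5, 0.2])           # channel taps (assumed)
d = rng.choice([-1.0, 1.0], size=N)               # known training sequence
x = np.convolve(d, h)[:N] + 0.05 * rng.normal(size=N)

Lf, Lb, mu = 5, 2, 0.01                           # feedforward/feedback lengths, LMS step
wf, wb = np.zeros(Lf), np.zeros(Lb)
mse = []
for n in range(max(Lf, Lb) + 1, N):
    u = x[n - Lf + 1:n + 1][::-1]                 # recent received samples (feedforward input)
    v = d[n - Lb:n][::-1]                         # past training symbols (feedback input)
    y = wf @ u - wb @ v                           # equalizer output
    e = d[n] - y                                  # training error
    wf += mu * e * u                              # LMS updates
    wb -= mu * e * v
    mse.append(e ** 2)

print("MSE early:", np.mean(mse[:200]), "late:", np.mean(mse[-200:]))
```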

Keywords: Adaptive Decision Feedback Equalizer, Performance Analysis, Finite Alphabet Case, Ill-Convergence, Convergence speed.

Downloads: 2071