Search results for: sequence mining
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1052

542 A Bayesian Classification System for Facilitating an Institutional Risk Profile Definition

Authors: Roman Graf, Sergiu Gordea, Heather M. Ryan

Abstract:

This paper presents an approach for the easy creation and classification of institutional risk profiles supporting endangerment analysis of file formats. The main contribution of this work is the employment of data mining techniques to support the selection of the most important risk factors. Subsequently, risk profiles employ a risk-factor classifier and associated configurations to support digital preservation experts with a semi-automatic estimation of the endangerment group for file format risk profiles. Our goal is to make use of an expert knowledge base, acquired through a digital preservation survey, in order to detect preservation risks for a particular institution. Another contribution is support for the visualisation of risk factors along a required dimension of analysis. Using the naive Bayes method, the decision support system recommends to an expert the matching risk profile group for the previously selected institutional risk profile. The proposed methods improve the visibility of risk factor values and the quality of a digital preservation process. The presented approach is designed to facilitate decision making for the preservation of digital content in libraries and archives using domain expert knowledge and the values of file format risk profiles. To facilitate decision-making, the aggregated information about the risk factors is presented as a multidimensional vector. The goal is to visualise particular dimensions of this vector for analysis by an expert and to define its profile group. A sample risk profile calculation and the visualisation of some risk factor dimensions are presented in the evaluation section.
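
The abstract does not include code; purely as an illustration, the following sketch shows how a naive Bayes classifier (here scikit-learn's GaussianNB) could assign an institutional risk profile, represented as a multidimensional risk-factor vector, to an endangerment group. The risk-factor names, training vectors and group labels are assumptions, not the authors' data.

```python
# Illustrative sketch only: hypothetical risk factors and endangerment groups,
# not the authors' data or implementation.
import numpy as np
from sklearn.naive_bayes import GaussianNB

# Each profile is a vector of risk-factor scores (e.g. format obsolescence,
# software dependency, documentation availability) -- names are assumptions.
training_profiles = np.array([
    [0.9, 0.8, 0.2],   # heavily endangered
    [0.7, 0.9, 0.1],
    [0.3, 0.4, 0.7],   # moderately endangered
    [0.2, 0.3, 0.8],
    [0.1, 0.1, 0.9],   # low risk
])
endangerment_group = ["high", "high", "medium", "medium", "low"]

model = GaussianNB().fit(training_profiles, endangerment_group)

# A new institutional risk profile to be classified for the expert.
new_profile = np.array([[0.6, 0.7, 0.3]])
print(model.predict(new_profile), model.predict_proba(new_profile))
```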

Keywords: linked open data, information integration, digital libraries, data mining.

541 IT Systems of the US Federal Courts, Justice, and Governance

Authors: Joseph Zernik

Abstract:

Validity, integrity, and impacts of the IT systems of the US federal courts have been studied as part of the Human Rights Alert-NGO (HRA) submission for the 2015 Universal Periodic Review (UPR) of human rights in the United States by the Human Rights Council (HRC) of the United Nations (UN). The current report includes an overview of IT system analysis, data mining, and case studies. System analysis and data mining show: development and implementation with no lawful authority, servers of unverified identity, invalidity in the implementation of electronic signatures, authentication instruments and procedures, authorities and permissions; discrimination in access against the public and unrepresented (pro se) parties and in favor of attorneys; widespread publication of invalid judicial records and dockets, leading to their false representation and false enforcement. A series of case studies documents the impacts on individuals' human rights, on banking regulation, and on international matters. Significance is discussed in the context of various media and expert reports, which opine that there is unprecedented corruption of the US justice system today, and which question whether the US Constitution has in fact been suspended. Similar findings were previously reported in IT systems of the State of California and the State of Israel, and were incorporated, subject to professional HRC staff review, into the UN UPR reports (2010 and 2013). Solutions are proposed, based on the principles of publicity of the law and the separation of powers: reliance on US IT and legal experts accountable to the legislative branch, enhanced transparency, and ongoing vigilance by human rights and internet activists. IT experts should assume more prominent civic duties in the safeguarding of civil society in our era.

Keywords: E-justice, federal courts, United States, human rights, banking regulation.

540 Lung Nodule Detection in CT Scans

Authors: M. Antonelli, G. Frosini, B. Lazzerini, F. Marcelloni

Abstract:

In this paper we describe a computer-aided diagnosis (CAD) system for automated detection of pulmonary nodules in computed-tomography (CT) images. After extracting the pulmonary parenchyma using a combination of image processing techniques, a region growing method is applied to detect nodules based on 3D geometric features. We applied the CAD system to CT scans collected in a screening program for lung cancer detection. Each scan consists of a sequence of about 300 slices stored in DICOM (Digital Imaging and Communications in Medicine) format. All malignant nodules were detected and a low false-positive detection rate was achieved.
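
The paper's exact segmentation pipeline is not reproduced here; the following minimal sketch only illustrates the general idea of seed-based region growing on a single synthetic 2D slice, using an intensity threshold as the inclusion criterion. The seed position, threshold and test image are assumptions for illustration.

```python
# Minimal seed-based region growing on one 2D slice (illustrative only;
# the paper works on 3D CT volumes with geometric nodule features).
import numpy as np
from collections import deque

def region_grow(slice_img, seed, threshold):
    """Grow a region from `seed`, adding 4-connected neighbours whose
    intensity differs from the seed intensity by less than `threshold`."""
    h, w = slice_img.shape
    seed_val = slice_img[seed]
    region = np.zeros((h, w), dtype=bool)
    queue = deque([seed])
    while queue:
        y, x = queue.popleft()
        if region[y, x]:
            continue
        region[y, x] = True
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if (0 <= ny < h and 0 <= nx < w and not region[ny, nx]
                    and abs(float(slice_img[ny, nx]) - float(seed_val)) < threshold):
                queue.append((ny, nx))
    return region

# Synthetic slice with one bright blob standing in for a candidate nodule.
img = np.zeros((64, 64))
img[28:36, 30:38] = 1.0
mask = region_grow(img, seed=(31, 33), threshold=0.5)
print("segmented pixels:", int(mask.sum()))
```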

Keywords: computer assisted diagnosis, medical image segmentation, shape recognition.

539 Combined Simulated Annealing and Genetic Algorithm to Solve Optimization Problems

Authors: Younis R. Elhaddad

Abstract:

Combinatorial optimization problems arise in many scientific and practical applications. Therefore, many researchers try to find or improve methods to solve these problems with high-quality results and in less time. Genetic Algorithms (GA) and Simulated Annealing (SA) have both been used to solve optimization problems. Both GA and SA search a solution space through a sequence of iterative states. However, there are also significant differences between them. The GA mechanism works in parallel on a set of solutions and exchanges information using the crossover operation, whereas SA works on a single solution at a time. In this work, SA and GA are combined using a new technique in order to overcome the disadvantages of both algorithms.
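
The abstract does not detail the combination technique; the sketch below only illustrates one common way to hybridize the two ideas: a GA population with crossover and mutation, where mutated offspring are accepted or rejected by a simulated-annealing Metropolis criterion under a cooling temperature. The test function and all parameter values are assumptions, not the author's scheme.

```python
# Hedged illustration of one possible GA + SA hybrid (not the author's exact method):
# GA supplies crossover over a population, SA supplies a temperature-controlled
# acceptance rule for mutated offspring.
import math
import random

def objective(x):
    # Simple multimodal test function to minimize (an assumption for the demo).
    return sum(xi ** 2 - 10 * math.cos(2 * math.pi * xi) + 10 for xi in x)

def crossover(a, b):
    point = random.randrange(1, len(a))
    return a[:point] + b[point:]

def mutate(x, scale=0.3):
    return [xi + random.gauss(0, scale) for xi in x]

random.seed(0)
dim, pop_size, temp, cooling = 4, 20, 5.0, 0.95
population = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]

for generation in range(200):
    population.sort(key=objective)
    next_pop = population[:2]                      # elitism
    while len(next_pop) < pop_size:
        parent_a, parent_b = random.sample(population[:10], 2)
        child = mutate(crossover(parent_a, parent_b))
        # SA-style Metropolis acceptance of the child against the better parent.
        delta = objective(child) - objective(parent_a)
        if delta < 0 or random.random() < math.exp(-delta / temp):
            next_pop.append(child)
        else:
            next_pop.append(parent_a)
    population = next_pop
    temp *= cooling                                # cooling schedule

print("best value found:", round(objective(min(population, key=objective)), 4))
```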

Keywords: Genetic Algorithm, Optimization problems, Simulated Annealing, Traveling Salesman Problem

538 AGV Guidance System: An Application of Simple Active Contour for Visual Tracking

Authors: M. Asif, M. R. Arshad, P. A. Wilson

Abstract:

In this paper, a simple active contour based visual tracking algorithm is presented for an outdoor AGV application currently under development at the USM robotic research group (URRG) lab. The presented algorithm is computationally inexpensive, able to track road boundaries in an image sequence, and can easily be implemented on available low-cost hardware. The proposed algorithm uses active shape modeling based on a B-spline deformable template and a recursive curve fitting method to track the current orientation of the road.
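
As a rough illustration of the B-spline ingredient only (not the authors' deformable-template tracking loop), the sketch below fits a smoothing parametric B-spline to noisy road-boundary points with SciPy; the sample points and smoothing factor are synthetic assumptions.

```python
# Illustration of fitting a B-spline to noisy road-boundary points (synthetic data);
# the paper's recursive curve fitting and template tracking are not shown.
import numpy as np
from scipy.interpolate import splprep, splev

# Synthetic, noisy samples along a curved road boundary in image coordinates.
t = np.linspace(0, 1, 30)
x = 100 + 200 * t + 5 * np.random.randn(30)
y = 400 - 300 * t + 40 * np.sin(3 * t) + 5 * np.random.randn(30)

# Parametric smoothing B-spline (cubic by default); s controls the smoothing.
tck, _ = splprep([x, y], s=750)
x_fit, y_fit = splev(np.linspace(0, 1, 100), tck)
print("fitted boundary points:", len(x_fit))
```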

Keywords: Active contour, B-spline, recursive curve fitting.

537 Towards End-To-End Disease Prediction from Raw Metagenomic Data

Authors: Maxence Queyrel, Edi Prifti, Alexandre Templier, Jean-Daniel Zucker

Abstract:

Analysis of the human microbiome using metagenomic sequencing data has demonstrated a high ability to discriminate various human diseases. Raw metagenomic sequencing data require multiple complex and computationally heavy bioinformatics steps prior to data analysis. Such data contain millions of short sequences read from the fragmented DNA sequences and stored as fastq files. Conventional processing pipelines consist of multiple steps, including quality control, filtering, and alignment of sequences against genomic catalogs (genes, species, taxonomic levels, functional pathways, etc.). These pipelines are complex to use, time consuming, and rely on a large number of parameters that often introduce variability and impact the estimation of the microbiome elements. Training Deep Neural Networks directly from raw sequencing data is a promising approach to bypass some of the challenges associated with mainstream bioinformatics pipelines. Most of these methods use the concept of word and sentence embeddings that create a meaningful and numerical representation of DNA sequences, while extracting features and reducing the dimensionality of the data. In this paper we present an end-to-end approach that classifies patients into disease groups directly from raw metagenomic reads: metagenome2vec. This approach is composed of four steps: (i) generating a vocabulary of k-mers and learning their numerical embeddings; (ii) learning DNA sequence (read) embeddings; (iii) identifying the genome from which the sequence is most likely to come; and (iv) training a multiple instance learning classifier which predicts the phenotype based on the vector representation of the raw data. An attention mechanism is applied in the network so that the model can be interpreted, assigning a weight to the influence of each genome on the prediction. Using two public real-life data sets as well as a simulated one, we demonstrate that this original approach reaches performance comparable with the state-of-the-art methods applied directly on data processed through mainstream bioinformatics workflows. These results are encouraging for this proof-of-concept work. We believe that with further dedication, DNN models have the potential to surpass mainstream bioinformatics workflows in disease classification tasks.
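
Step (i), building a k-mer vocabulary and learning its embeddings, can be sketched roughly as below with gensim's Word2Vec; the reads, k value and embedding size are illustrative assumptions, and steps (ii)-(iv) of the pipeline are not shown.

```python
# Rough sketch of step (i): tokenize reads into overlapping k-mers and learn
# k-mer embeddings with Word2Vec. Reads and hyperparameters are assumptions.
import numpy as np
from gensim.models import Word2Vec

def kmerize(read, k=6):
    """Split a DNA read into overlapping k-mers ('words' of the vocabulary)."""
    return [read[i:i + k] for i in range(len(read) - k + 1)]

reads = [
    "ATGCGTACGTTAGCGTACGATCGT",
    "GGCTAGCTAGGATCGATCGTTAGC",
    "ATGCGTACGATCGTTAGCGGATCC",
]
corpus = [kmerize(r) for r in reads]

model = Word2Vec(corpus, vector_size=32, window=5, min_count=1, sg=1, epochs=50)

# A read embedding can then be approximated, e.g., as the mean of its k-mer vectors.
read_vec = np.mean([model.wv[kmer] for kmer in corpus[0]], axis=0)
print(read_vec.shape)
```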

Keywords: Metagenomics, phenotype prediction, deep learning, embeddings, multiple instance learning.

536 Payment for Pain: Differences between Hypothetical and Real Preferences

Authors: J. Trarbach, S. Schosser, B. Vogt

Abstract:

Decision-makers tend to prefer the first alternative over subsequent alternatives, a tendency called the primacy effect. To measure this effect reliably, we conducted an experiment with real consequences for preference statements. We elicited preferences of subjects using a rating scale, i.e. hypothetical preferences, and willingness to pay, i.e. real preferences, for two sequences of pain. Within these sequences, both the overall intensity and the duration of pain are identical. Hence, a rational decision-maker should be indifferent, whereas the primacy effect predicts a stronger preference for the first sequence. What we see is a primacy effect only for hypothetical preferences. This effect vanishes for real preferences.

Keywords: Decision making, primacy effect, real incentives, willingness to pay.

535 Soccer Video Edition Using a Multimodal Annotation

Authors: Fendri Emna, Ben-Abdallah Hanêne, Ben-Hamadou Abdelmajid

Abstract:

In this paper, we present an approach for soccer video editing using multimodal annotation. We propose to associate with each video sequence of a soccer match a textual document to be used for further exploitation such as search, browsing and abstract editing. The textual document contains video metadata, match metadata, and match data. This document, generated automatically while the video is analyzed, segmented and classified, can be enriched semi-automatically according to the user type and/or a specialized recommendation system.

Keywords: XML, Multimodal Annotation, recommendation system.

534 Comparison of Multi-User Detectors of DS-CDMA System

Authors: Kavita Khairnar, Shikha Nema

Abstract:

DS-CDMA is a well-known wireless technology. This system suffers from MAI (Multiple Access Interference) caused by Direct Sequence users. Multi-user detection schemes were introduced to detect the users' data in the presence of MAI. This paper focuses on linear multi-user detection schemes used for data demodulation. Simulation results depict the performance of three detectors, namely the conventional detector, the decorrelating detector, and the subspace MMSE (Minimum Mean Square Error) detector. It is seen that the performance of these detectors depends on the number of paths and the length of the Gold code used.
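
For intuition only, the following toy numpy sketch contrasts the conventional (matched-filter) detector with the decorrelating detector, which applies the inverse of the code cross-correlation matrix R before the sign decision. The correlation values, amplitudes and noise level are assumptions, not the paper's simulation setup.

```python
# Toy comparison of the conventional detector (sign of matched-filter output)
# and the decorrelating detector (sign of R^{-1} y). Values are illustrative.
import numpy as np

rng = np.random.default_rng(1)
R = np.array([[1.0, 0.4, 0.3],      # cross-correlation matrix of 3 users' codes
              [0.4, 1.0, 0.5],      # (assumed values)
              [0.3, 0.5, 1.0]])
A = np.diag([1.0, 0.5, 2.0])        # received amplitudes (assumed)
b = rng.choice([-1.0, 1.0], size=(3, 10000))   # transmitted bits

# Matched-filter outputs: y = R A b + n (MAI enters through off-diagonal R terms;
# white noise is a simplification of the true coloured-noise model).
n = 0.3 * rng.standard_normal(b.shape)
y = R @ A @ b + n

conventional = np.sign(y)
decorrelating = np.sign(np.linalg.solve(R, y))  # removes MAI, enhances noise

for name, est in [("conventional", conventional), ("decorrelating", decorrelating)]:
    ber = np.mean(est != b)
    print(f"{name:14s} BER ~ {ber:.4f}")
```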

Keywords: Cross Correlation Matrix, MAI, Multi-User Detection, Multipath Effect.

533 Fixed Point Theorems for Set Valued Mappings in Partially Ordered Metric Spaces

Authors: Ismat Beg, Asma Rashid Butt

Abstract:

Let (X, ⪯) be a partially ordered set and d be a metric on X such that (X, d) is a complete metric space. Assume that X satisfies: if a non-decreasing sequence xn → x in X, then xn ⪯ x for all n. Let F be a set valued mapping from X into X with nonempty closed bounded values satisfying: (i) there exists κ ∈ (0, 1) with D(F(x), F(y)) ≤ κd(x, y) for all x ⪯ y; (ii) if d(x, y) < ε < 1 for some y ∈ F(x), then x ⪯ y; (iii) there exist x0 ∈ X and x1 ∈ F(x0) with x0 ⪯ x1 such that d(x0, x1) < 1. It is shown that F has a fixed point. Several consequences are also obtained.

Keywords: Fixed point, partially ordered set, metric space, set valued mapping.

532 Constraints on IRS Control: An Alternative Approach to Tax Gap Analysis

Authors: J. T. Manhire

Abstract:

A tax authority wants to take actions it knows will foster the greatest degree of voluntary taxpayer compliance to reduce the “tax gap.” This paper suggests that even if a tax authority could attain a state of complete knowledge, there are constraints on whether and to what extent such actions would result in reducing the macro-level tax gap. These limits are not merely a consequence of finite agency resources. They are inherent in the system itself. To show that this is one possible interpretation of the tax gap data, the paper formulates known results in a different way by analyzing tax compliance as a population with a single covariate. This leads to a standard use of the logistic map to analyze the dynamics of non-compliance growth or decay over a sequence of periods. This formulation gives the same results as the tax gap studies performed over the past fifty years in the U.S. given the published margins of error. Limitations and recommendations for future work are discussed, along with some implications for tax policy.
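
The abstract refers to the logistic map; as a reminder of the dynamics involved (not the paper's calibration), the map x_{n+1} = r x_n (1 − x_n) can be iterated as below. The growth parameter r and the initial non-compliance rate are chosen purely for illustration.

```python
# Logistic-map iteration of a hypothetical non-compliance rate over periods.
# The parameter r and the starting value are illustrative, not the paper's estimates.
def logistic_trajectory(r, x0, periods):
    xs = [x0]
    for _ in range(periods):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

for r in (2.8, 3.5, 3.9):            # stable point, cycle, chaotic regime
    traj = logistic_trajectory(r, x0=0.2, periods=50)
    print(f"r={r}: last values {[round(v, 3) for v in traj[-3:]]}")
```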

Keywords: Tax law, tax compliance, tax gap, income tax.

531 Judges System for Classifiers Specialization

Authors: Abdel Rodríguez, Isis Bonet, Ricardo Grau, María M. García

Abstract:

In this paper we designed and implemented a new ensemble of classifiers, based on a sequence of classifiers each specialized in the regions of the training dataset where the errors of its previously trained counterparts are concentrated. In order to separate these regions, and to determine the aptitude of each classifier to respond properly to a new case, another set of hierarchically built classifiers was used. We also explored a selection-based variant to combine the base classifiers. We validated this model with different base classifiers using 37 training datasets. A statistical comparison of these models with the well-known Bagging and Boosting methods was carried out, obtaining significantly superior results with the hierarchical ensemble using the Multilayer Perceptron as the base classifier. Therefore, we demonstrated the efficacy of the proposed ensemble, as well as its applicability to general problems.

Keywords: classifiers, delegation, ensemble

530 Accurate Optical Flow Based on Spatiotemporal Gradient Constancy Assumption

Authors: Adam Rabcewicz

Abstract:

Variational methods for optical flow estimation are known for their excellent performance. The method proposed by Brox et al. [5] exemplifies the strength of that framework. It combines several concepts into a single energy functional that is then minimized according to a clear numerical procedure. In this paper we propose a modification of that algorithm starting from the spatiotemporal gradient constancy assumption. The numerical scheme allows us to establish the connection between our model and the CLG(H) method introduced in [18]. Experimental evaluation carried out on synthetic sequences shows the significant superiority of the spatial variant of the proposed method. A comparison between the methods on a real-world sequence is also enclosed.
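
For reference, a generic textbook form of the variational energy with brightness and gradient constancy terms, in the spirit of Brox et al., is given below; this is not necessarily the exact functional of the paper, and the spatiotemporal variant replaces the spatial gradient by its spatiotemporal counterpart.

```latex
% Generic variational optical-flow energy with brightness and gradient constancy
% (textbook form, not necessarily the paper's exact functional).
E(u,v) = \int_{\Omega} \Psi\!\Big( \big|I(\mathbf{x}+\mathbf{w}) - I(\mathbf{x})\big|^{2}
        + \gamma\,\big|\nabla I(\mathbf{x}+\mathbf{w}) - \nabla I(\mathbf{x})\big|^{2} \Big)\, d\mathbf{x}
        + \alpha \int_{\Omega} \Psi\!\big( |\nabla u|^{2} + |\nabla v|^{2} \big)\, d\mathbf{x},
\qquad \Psi(s^{2}) = \sqrt{s^{2} + \varepsilon^{2}}, \quad \mathbf{w} = (u, v, 1)^{\top}.
```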

Keywords: optical flow, variational methods, gradient constancy assumption.

529 Electrodermal Activity Measurement Using Constant Current AC Source

Authors: Cristian Chacha, David Asiain, Jesús Ponce de León, José Ramón Beltrán

Abstract:

This work explores and characterizes the behavior of the AFE AD5941 in impedance measurement using an embedded algorithm that allows the use of a constant current AC source. The main aim of this research is to improve the exact measurement of impedance values for application in EDA-focused wearable devices. Through comprehensive study and characterization, it has been observed that employing a measurement sequence with a constant current source produces results with increased dispersion but higher accuracy and a more linear behavior with respect to error. As a result, this approach leads to a more accurate system for impedance measurement.

Keywords: Electrodermal Activity, constant current AC source, wearable, precision, accuracy, impedance.

528 Finding Authoritative Researchers on Academic Web Sites

Authors: Dalibor Fiala, Karel Jezek, Francois Rousselot

Abstract:

In this paper, we present a methodology for finding authoritative researchers by analyzing academic Web sites. We show a case study in which we concentrate on the Web sites of a set of Czech computer science departments. We analyze the relations between them via hyperlinks and find the most important ones using several common ranking algorithms. We then examine the contents of the research papers present on these sites and determine the most authoritative Czech authors.
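
As a hedged illustration of the ranking step only (the paper's own link graph and algorithm choices are not reproduced), networkx can compute PageRank and HITS scores over a small hypothetical hyperlink graph between department sites:

```python
# Toy hyperlink graph between hypothetical department sites; PageRank and HITS
# are two of the common ranking algorithms that can be applied to such a graph.
import networkx as nx

G = nx.DiGraph()
G.add_edges_from([
    ("dept_A", "dept_B"), ("dept_A", "dept_C"),
    ("dept_B", "dept_C"), ("dept_C", "dept_A"),
    ("dept_D", "dept_C"), ("dept_D", "dept_A"),
])

pagerank = nx.pagerank(G, alpha=0.85)
hubs, authorities = nx.hits(G)

print("PageRank:   ", {k: round(v, 3) for k, v in pagerank.items()})
print("Authorities:", {k: round(v, 3) for k, v in authorities.items()})
```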

Keywords: Authorities, citation analysis, prestige, ranking algorithms, Web mining.

527 A Simulation Software for DNA Computing Algorithms Implementation

Authors: M. S. Muhammad, S. M. W. Masra, K. Kipli, N. Zamhari

Abstract:

The captured gel electrophoresis image represents the output of a DNA computing algorithm. Before this image is captured, DNA computing involves parallel overlap assembly (POA) and polymerase chain reaction (PCR), which are the core of this computing algorithm. However, the design of the DNA oligonucleotides to represent a problem is quite complicated and prone to errors. In order to reduce these errors during the design stage, before the actual in-vitro experiment is carried out, simulation software capable of simulating the POA and PCR processes was developed. The capability of this simulation software is not limited: a problem of any size and complexity can be simulated, thus saving the cost of possible errors during the design process. Information regarding the DNA sequence during the computing process, as well as the computing output, can be extracted at the same time using the simulation software.

Keywords: DNA computing, PCR, POA, simulation software

526 Geochemistry of Natural Radionuclides Associated with Acid Mine Drainage (AMD) in a Coal Mining Area in Southern Brazil

Authors: Juliana A. Galhardi, Daniel M. Bonotto

Abstract:

Coal is an important non-renewable energy source and can be associated with radioactive elements. In Figueira city, Paraná state, Brazil, high uranium activity was recorded near the coal mine that supplies a local thermoelectric power plant. In this context, the radon activity (Rn-222, produced by the Ra-226 decay in the U-238 natural series) was evaluated in groundwater, river water and effluents produced by the acid mine drainage in the coal reject dumps. The samples were collected in August 2013 and in February 2014 and analyzed at LABIDRO (Laboratory of Isotope and Hydrochemistry), UNESP, Rio Claro city, Brazil, using an alpha spectrometer (AlphaGuard) adjusted to evaluate the mean radon activity concentration in five cycles of 10 minutes. No radon activity concentration was found above 100 Bq.L-1, the critical value previously established by the World Health Organization. The average radon activity concentration in groundwater was higher than in surface water and in effluent samples, possibly due to the accumulation of uranium and radium in the aquifer layers that favors radon trapping. The lower value in the river waters can indicate dilution, and the intermediate value in the effluents may indicate radon absorption by the coal particles of the reject dumps. The results also indicate that the radon activities in the effluents increase with sample acidification, possibly due to the higher radium leaching and the subsequent radon transport to the drainage flow. The water samples of the Laranjinha River and the Ribeirão das Pedras stream, which, respectively, supply Figueira city and receive the mining effluent, exhibited higher pH values upstream of the mine, reflecting the acid mine drainage discharge. The radionuclide transport indicates the importance of monitoring their activity concentration in natural waters due to the risks that the radioactivity can represent to human health.

Keywords: Radon, radium, acid mine drainage, coal

525 A Novel Dual-Purpose Image Watermarking Technique

Authors: Maha Sharkas, Dahlia R. ElShafie, Nadder Hamdy

Abstract:

Image watermarking has proven to be quite an efficient tool for copyright protection and authentication over the last few years. In this paper, a novel image watermarking technique in the wavelet domain is suggested and tested. To achieve more security and robustness, the proposed technique relies on two nested watermarks that are embedded into the image to be watermarked. A primary watermark in the form of a PN sequence is first embedded into an image (the secondary watermark) before the latter is embedded into the host image. The technique is implemented using Daubechies mother wavelets, where an arbitrary embedding factor α is introduced to improve the invisibility and robustness. The proposed technique has been applied on several gray scale images, where a PSNR of about 60 dB was achieved.
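
A rough sketch of embedding a PN-sequence watermark in the wavelet domain with PyWavelets is shown below; the choice of subband, Daubechies wavelet, embedding factor and random host image are illustrative assumptions rather than the paper's exact nested-watermark scheme.

```python
# Illustrative single-level wavelet-domain embedding of a PN sequence (not the
# paper's full nested-watermark scheme); subband, wavelet and alpha are assumptions.
import numpy as np
import pywt

rng = np.random.default_rng(7)
host = rng.uniform(0, 255, size=(128, 128))          # stand-in for a gray-scale image
alpha = 2.0                                          # embedding factor

# Decompose, add a +/-1 PN sequence to the approximation subband, reconstruct.
cA, (cH, cV, cD) = pywt.dwt2(host, "db4")
pn = rng.choice([-1.0, 1.0], size=cA.shape)
cA_marked = cA + alpha * pn
watermarked = pywt.idwt2((cA_marked, (cH, cV, cD)), "db4")

# Non-blind detection: correlate the subband difference with the PN sequence.
cA_rx, _ = pywt.dwt2(watermarked, "db4")
correlation = float(np.mean((cA_rx - cA) * pn)) / alpha
print("normalized correlation with embedded PN:", round(correlation, 3))
```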

Keywords: Image watermarking, Multimedia Security, Wavelets, Image Processing.

524 Blind Identification of MA Models Using Cumulants

Authors: Mohamed Boulouird, Moha M'Rabet Hassani

Abstract:

In this paper, several techniques for the blind identification of moving average (MA) processes are presented. These methods utilize third- and fourth-order cumulants of the noisy observations of the system output. The system is driven by an independent and identically distributed (i.i.d.) non-Gaussian sequence that is not observed. Two nonlinear optimization algorithms, namely the Gradient Descent and the Gauss-Newton algorithms, are presented. An algorithm based on the joint diagonalization of the fourth-order cumulant matrices (FOSI) is also considered, as well as an improved version of the classical C(q, 0, k) algorithm based on the choice of the best 1-D slice of fourth-order cumulants. To illustrate the effectiveness of our methods, various simulation examples are presented.

Keywords: Cumulants, Identification, MA models, Parameter estimation

523 Enhancement Effect of Superparamagnetic Iron Oxide Nanoparticle-Based MRI Contrast Agent at Different Concentrations and Magnetic Field Strengths

Authors: Bimali Sanjeevani Weerakoon, Toshiaki Osuga, Takehisa Konishi

Abstract:

Magnetic Resonance Imaging Contrast Agents (MRI-CM) are significant in clinical and biological imaging as they have the ability to alter the normal tissue contrast, thereby affecting the signal intensity to enhance the visibility and detectability of images. Superparamagnetic Iron Oxide (SPIO) nanoparticles, coated with dextran or carboxydextran, are currently available for clinical MR imaging of the liver. Most SPIO contrast agents are T2 shortening agents, and Resovist (Ferucarbotran) is a clinically tested, organ-specific SPIO agent with a low-molecular-weight carboxydextran coating. The enhancement effect of Resovist depends on its relaxivity, which in turn depends on factors such as magnetic field strength, concentration, nanoparticle properties, pH and temperature. Therefore, this study was conducted to investigate the impact of field strength and different contrast concentrations on the enhancement effect of Resovist. The study explored, by mathematical simulation, the MRI signal intensity of Resovist in the physiological range of plasma from a T2-weighted spin echo sequence at three magnetic field strengths: 0.47 T (r1=15, r2=101), 1.5 T (r1=7.4, r2=95), and 3 T (r1=3.3, r2=160), over a range of contrast concentrations. The relaxivities r1 and r2 (L mmol-1 s-1) were obtained from a previous study, and the selected concentrations were 0.05, 0.06, 0.07, 0.08, 0.09, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0, 2.0, and 3.0 mmol/L. T2-weighted images were simulated using a TR/TE ratio of 2000 ms/100 ms. According to the reference literature, with increasing magnetic field strength the r1 relaxivity tends to decrease, while r2 did not show any systematic relationship with the selected field strengths. In parallel, the present results revealed that the signal intensity of Resovist at lower concentrations tends to be higher than at higher concentrations. The highest signal intensity was observed at the low field strength of 0.47 T. The maximum signal intensities for 0.47 T, 1.5 T and 3 T were found at concentration levels of 0.05, 0.06 and 0.05 mmol/L, respectively. Furthermore, at concentrations higher than these, the signal intensity decreased exponentially. An inverse relationship was found between field strength and T2 relaxation time: as the field strength increased, the T2 relaxation time decreased accordingly. However, the resulting T2 relaxation times were not significantly different between 0.47 T and 1.5 T in this study. Moreover, a linear correlation of the transverse relaxation rates (1/T2, s–1) with the concentration of Resovist can be observed. From these results, it can be concluded that the concentration of SPIO nanoparticle contrast agents and the field strength of MRI are two important parameters that can affect the signal intensity of a T2-weighted SE sequence. Therefore, in MR imaging, these two parameters should be considered prudently.
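
The study's simulation is not reproduced here; the sketch below only shows a simplified way to compute a T2-weighted spin-echo signal from the quoted relaxivities, TR/TE = 2000/100 ms, and a concentration sweep. The baseline plasma T1/T2 values and the concentration subset are assumptions for illustration.

```python
# Simplified spin-echo signal model S = (1 - exp(-TR*R1)) * exp(-TE*R2), with
# R1 = 1/T1_0 + r1*C and R2 = 1/T2_0 + r2*C. Baseline plasma T1_0/T2_0 are assumed.
import numpy as np

TR, TE = 2.0, 0.1                       # s (TR/TE = 2000 ms / 100 ms, as in the study)
T1_0, T2_0 = 1.4, 0.25                  # s, assumed baseline plasma relaxation times
relaxivities = {"0.47 T": (15.0, 101.0), "1.5 T": (7.4, 95.0), "3 T": (3.3, 160.0)}
C = np.array([0.05, 0.06, 0.07, 0.1, 0.5, 1.0, 3.0])   # mmol/L (subset of the study range)

for field, (r1, r2) in relaxivities.items():
    R1 = 1.0 / T1_0 + r1 * C            # s^-1, since r1/r2 are in L mmol^-1 s^-1
    R2 = 1.0 / T2_0 + r2 * C
    S = (1.0 - np.exp(-TR * R1)) * np.exp(-TE * R2)
    peak = C[np.argmax(S)]
    print(f"{field}: peak simulated signal at ~{peak} mmol/L")
```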

Keywords: Concentration, Resovist, Field strength, Relaxivity, Signal intensity.

522 Visual Search Based Indoor Localization in Low Light via RGB-D Camera

Authors: Yali Zheng, Peipei Luo, Shinan Chen, Jiasheng Hao, Hong Cheng

Abstract:

Most traditional visual indoor navigation algorithms and methods only consider localization in ordinary daytime, while in this paper we focus on indoor re-localization in low light. As RGB images are degraded in low light, less discriminative infrared and depth image pairs captured by RGB-D cameras are taken as the input, and the most similar candidates are retrieved as the output from a database built in the bag-of-words framework. Epipolar constraints can then be used to re-localize the query infrared and depth image sequence. We evaluate our method on two datasets captured by a Kinect2. The results demonstrate very promising re-localization performance for indoor navigation systems in low light environments.

Keywords: Indoor navigation, low light, RGB-D camera, vision based.

521 On the Fast Convergence of DD-LMS DFE Using a Good Strategy Initialization

Authors: Y. Ben Jemaa, M. Jaidane

Abstract:

In wireless communication systems, a Decision Feedback Equalizer (DFE) is required to cancel the intersymbol interference (ISI). In this paper, an exact convergence analysis of the DFE adapted by the Least Mean Square (LMS) algorithm during the training phase is derived by taking into account the finite-alphabet context of data transmission. This allows us to determine the shortest training sequence that reaches a given Mean Square Error (MSE). With the intention of avoiding the problem of ill-convergence, the paper proposes an initialization strategy for the blind decision-directed (DD) algorithm. This then yields a semi-blind DFE with high speed and good convergence.
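
As a much-simplified, hedged illustration of LMS training over a finite-alphabet (BPSK) training sequence, the sketch below adapts a purely feed-forward linear equalizer with the update w ← w + μ e x; the channel, step size and decision delay are assumptions, and the full DFE analyzed in the paper is not implemented.

```python
# Simplified LMS training of a linear equalizer on a BPSK training sequence
# (illustrative only; the paper analyzes a decision feedback equalizer).
import numpy as np

rng = np.random.default_rng(3)
channel = np.array([1.0, 0.5, 0.2])          # assumed ISI channel
n_train, n_taps, mu = 2000, 7, 0.01
symbols = rng.choice([-1.0, 1.0], size=n_train)          # finite-alphabet training data
received = np.convolve(symbols, channel)[:n_train] + 0.05 * rng.standard_normal(n_train)

w = np.zeros(n_taps)
delay = 3                                     # decision delay (assumption)
mse = []
for k in range(n_taps, n_train):
    x = received[k - n_taps:k][::-1]          # regressor (most recent sample first)
    d = symbols[k - delay]                    # desired training symbol
    e = d - w @ x                             # error
    w += mu * e * x                           # LMS update
    mse.append(e ** 2)

print("MSE over last 200 updates:", round(float(np.mean(mse[-200:])), 4))
```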

Keywords: Adaptive Decision Feedback Equalizer, Performance Analysis, Finite Alphabet Case, Ill-Convergence, Convergence speed.

520 Connectivity Characteristic of Transcription Factor

Authors: T. Mahalakshmi, Aswathi B. L., Achuthsankar S. Nair

Abstract:

Transcription factors are a group of proteins that help interpret the genetic information in DNA. Protein-protein interactions play a major role in the execution of key biological functions of a cell. These interactions are represented in the form of a graph with nodes and edges. Studies have shown that some nodes have a high degree of connectivity, and such nodes, known as hub nodes, are indispensable parts of the network. In the present paper, a method is proposed to identify hub transcription factor proteins using sequence information. On a complete data set of transcription factor proteins available from the APID database, the proposed method showed an accuracy of 77%, a sensitivity of 79% and a specificity of 76%.
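
One of the listed features, the Shannon index, can be computed from the amino-acid composition of a sequence as sketched below; the example sequence is made up, and the TFES feature used in the paper is not shown.

```python
# Shannon index (composition entropy) of a protein sequence; the sequence is a
# made-up example and the TFES feature used in the paper is not implemented.
import math
from collections import Counter

def shannon_index(sequence):
    counts = Counter(sequence)
    total = len(sequence)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

protein = "MKVLAAGISSGKSTLAQMLAKELGMPVYDADQAVVEAAGGK"   # hypothetical sequence
print(round(shannon_index(protein), 3), "bits per residue")
```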

Keywords: Transcription Factor Proteins, Hub Proteins, Shannon Index, Transfer Free Energy to Surface (TFES).

519 Surgical Theater Utilization and PACU Staffing

Authors: Abdulrahim Shamayleh

Abstract:

In this work, the surgical theater of a local hospital in KSA was analyzed using simulation. The focus was on answering questions related to how many Operating Rooms (ORs) to open, and on analyzing the performance of the surgical theater in general and of the Post Anesthesia Care Unit (PACU) in particular, to assist in making decisions regarding PACU staffing. The surgical theater consists of ten operating rooms and the PACU, which has a maximum capacity of fifteen beds. Different rules for sequencing the surgical cases were tested, and the Longest Case First (LCF) rule was superior to the others. The results of the different alternatives developed and tested can be used by the manager as a tool to plan and manage the OR and the PACU.

Keywords: Operating room, post anesthesia care unit, PACU staffing, sequencing, healthcare

518 Development of Configuration Software of Space Environment Simulator Control System Based on Linux

Authors: Zhan Haiyang, Zhang Lei, Ning Juan

Abstract:

This paper presents a configuration software solution for Linux, which is used for the control of a space environment simulator. After introducing the structure and basic principle, the development of the Qt software framework and the dynamic data exchange between the PLC and the computer are described. An OPC driver for Linux is also developed. This driver realizes many-to-many communication between hardware devices and SCADA software. Moreover, an algorithm named "Scan PRI" is put forward; this algorithm is more optimizable and efficient compared with the "Scan in sequence" approach used in Windows. This software has been used in a practical project. It has a good control effect and can achieve the expected goal.

Keywords: Linux OS, configuration software, OPC server driver, MYSQL database.

517 Neural Network Learning Based on Chaos

Authors: Truong Quang Dang Khoa, Masahiro Nakagawa

Abstract:

Chaos and fractals are novel fields of physics and mathematics that offer a new way of viewing the universe and have inspired many ideas for solving current problems. In this paper, a novel algorithm based on a chaotic sequence generator, with a strong ability to adapt and reach the global optimum, is proposed. The adaptive ability of the proposed algorithm operates in two steps: the first is a breadth-first search and the second is a depth-first search. The proposed algorithm is examined on two functions, the Camel function and the Schaffer function. Furthermore, the proposed algorithm is applied to optimize the training of multilayer neural networks.
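
The exact two-step breadth/depth procedure is not specified in the abstract; the sketch below only illustrates the basic ingredient of driving a search with a chaotic logistic-map sequence, evaluated here on the Schaffer function mentioned above. The domain mapping, iteration counts and seeds are assumptions.

```python
# Chaotic search driven by a logistic-map sequence, evaluated on the Schaffer
# function (global minimum 0 at the origin). Domain mapping and counts are assumptions.
import math

def schaffer(x, y):
    num = math.sin(math.sqrt(x * x + y * y)) ** 2 - 0.5
    den = (1.0 + 0.001 * (x * x + y * y)) ** 2
    return 0.5 + num / den

def chaotic_sequence(n, x0=0.3141, r=4.0):
    xs, x = [], x0
    for _ in range(n):
        x = r * x * (1.0 - x)          # fully chaotic logistic map
        xs.append(x)
    return xs

seq = chaotic_sequence(4000)
best, best_val = None, float("inf")
# Coarse "breadth-first" pass over the whole domain [-10, 10]^2.
for cx, cy in zip(seq[0::2], seq[1::2]):
    x, y = 20.0 * cx - 10.0, 20.0 * cy - 10.0
    val = schaffer(x, y)
    if val < best_val:
        best, best_val = (x, y), val
# Fine "depth-first" pass: chaotic perturbations around the best point so far.
for cx, cy in zip(chaotic_sequence(2000, x0=0.77), chaotic_sequence(2000, x0=0.55)):
    x, y = best[0] + (cx - 0.5), best[1] + (cy - 0.5)
    val = schaffer(x, y)
    if val < best_val:
        best, best_val = (x, y), val
print("best point:", [round(v, 3) for v in best], "value:", round(best_val, 5))
```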

Keywords: learning and evolutionary computing, Chaos Optimization Algorithm, Artificial Neural Networks, nonlinear optimization, intelligent computational technologies.

516 Performance Analysis of Selective Adaptive Multiple Access Interference Cancellation for Multicarrier DS-CDMA Systems

Authors: Maged Ahmed, Ahmed El-Mahdy

Abstract:

In this paper, Selective Adaptive Parallel Interference Cancellation (SA-PIC) technique is presented for Multicarrier Direct Sequence Code Division Multiple Access (MC DS-CDMA) scheme. The motivation of using SA-PIC is that it gives high performance and at the same time, reduces the computational complexity required to perform interference cancellation. An upper bound expression of the bit error rate (BER) for the SA-PIC under Rayleigh fading channel condition is derived. Moreover, the implementation complexities for SA-PIC and Adaptive Parallel Interference Cancellation (APIC) are discussed and compared. The performance of SA-PIC is investigated analytically and validated via computer simulations.

Keywords: Adaptive interference cancellation, communication systems, multicarrier signal processing, spread spectrum

515 Maximum Norm Analysis of a Nonmatching Grids Method for Nonlinear Elliptic Boundary Value Problem −Δu = f(u)

Authors: Abida Harbi

Abstract:

We provide a maximum norm analysis of a finite element Schwarz alternating method for a nonlinear elliptic boundary value problem of the form -Δu = f(u), on two overlapping sub domains with non matching grids. We consider a domain which is the union of two overlapping sub domains where each sub domain has its own independently generated grid. The two meshes being mutually independent on the overlap region, a triangle belonging to one triangulation does not necessarily belong to the other one. Under a Lipschitz assumption on the nonlinearity, we establish, on each sub domain, an optimal L∞ error estimate between the discrete Schwarz sequence and the exact solution of the boundary value problem.
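
The alternating iteration can be summarized, in a generic form consistent with the abstract (the paper's exact discrete formulation on nonmatching grids may differ), as follows, with each subproblem discretized by finite elements on its own grid and boundary data interpolated between the grids on the overlap:

```latex
% Generic Schwarz alternating iteration on overlapping subdomains \Omega_1, \Omega_2
% (a sketch consistent with the abstract, not the paper's exact discrete scheme).
\begin{aligned}
-\Delta u_1^{n+1} &= f\big(u_1^{n+1}\big) \ \text{in } \Omega_1, &\quad
u_1^{n+1} &= u_2^{n} \ \text{on } \partial\Omega_1 \cap \Omega_2, \\
-\Delta u_2^{n+1} &= f\big(u_2^{n+1}\big) \ \text{in } \Omega_2, &\quad
u_2^{n+1} &= u_1^{n+1} \ \text{on } \partial\Omega_2 \cap \Omega_1.
\end{aligned}
```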

Keywords: Error estimates, Finite elements, Nonlinear PDEs, Schwarz method.

514 A Note on Penalized Power-Divergence Test Statistics

Authors: Aylin Alin

Abstract:

In this paper, penalized power-divergence test statistics are defined, and their exact size properties for testing a nested sequence of log-linear models are compared with those of ordinary power-divergence test statistics for various penalization, λ, and main effect values. Since the ordinary and penalized power-divergence test statistics have the same asymptotic distribution, comparisons have only been made for small and moderate samples. Three-way contingency tables distributed according to a multinomial distribution are considered. Simulation results reveal that penalized power-divergence test statistics perform much better than their ordinary counterparts.
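
For reference, the ordinary Cressie-Read power-divergence statistic comparing observed counts O_i with expected counts E_i has the standard form below; the penalized variant studied in the paper modifies this statistic, but its exact form is not reproduced here. The limits λ → 0 and λ → 1 recover the likelihood-ratio statistic G² and Pearson's X², respectively.

```latex
% Ordinary Cressie--Read power-divergence statistic (the penalized variant in the
% paper modifies this; its exact form is not reproduced here).
2 n I^{\lambda}(O : E) \;=\; \frac{2}{\lambda(\lambda+1)}
  \sum_{i} O_i \left[ \left( \frac{O_i}{E_i} \right)^{\lambda} - 1 \right],
\qquad \lambda \neq 0, -1.
```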

Keywords: Contingency table, Log-linear models, Penalization, Power-divergence measure, Penalized power-divergence measure.

513 Motor Skill Adaptation Depends On the Level of Learning

Authors: Herbert Ugrinowitsch, Suziane Peixoto dos Santos-Naves, Michele Viviene Carbinatto, Rodolfo Novellino Benda, Go Tani

Abstract:

An experiment was conducted to examine the effect of the level of performance stabilization on human adaptability to a perceptual-motor perturbation in a complex coincident timing task. Three levels of performance stabilization were established operationally: pre-stabilization, stabilization, and super-stabilization groups. Each group practiced the task until it reached its level of stabilization in a constant sequence of movements and under a constant time constraint before exposure to the perturbation. The results clearly showed that performance stabilization is a pre-condition for adaptation. Moreover, variability before reaching stabilization is harmful to adaptation, whereas persistent variability after stabilization is beneficial. Finally, the behavior of variability is specific to each measure.

Keywords: Adaptation, motor skill, perturbation, stabilization.
