Search results for: Computer aided detection (CAD)
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2767

1657 Modeling, Simulation and Monitoring of Nuclear Reactor Using Directed Graph and Bond Graph

Authors: A. Badoud, M. Khemliche, S. Latreche

Abstract:

The main objective of this paper is to develop a graphical technique for the modeling, simulation, and diagnosis of industrial systems. This is particularly important for a complex system such as the pressurized water nuclear reactor, which exhibits multiple nonlinearities and time scales; for such systems the analytical approach is cumbersome and does not give a quick picture of the system's evolution. The Bond Graph tool enabled us to transform the analytical model into a graphical one, and the Bond Graph simulation software SYMBOLS 2000 made it possible to validate the model and reproduce the results given in the technical specifications. We introduce an analysis of the problems involved in fault localization and identification in complex industrial processes, propose a fault detection method applied to diagnosis that also assesses the severity of a detected fault, and show how new diagnosis approaches can be applied to the control of complex systems. Industrial systems have become increasingly complex, and fault diagnosis procedures for physical systems become very involved as soon as the systems considered are no longer elementary. Faced with this complexity, we adopt a Fault Detection and Isolation (FDI) method, analyze the associated control problem, and design a reliable diagnosis system capable of handling spatially distributed complex dynamic systems, applied to a standard pressurized water nuclear reactor.
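
As a rough, hypothetical illustration of the FDI principle invoked here, the sketch below evaluates a residual from an analytical redundancy relation against a threshold; the model, signals, and threshold are invented for illustration and are not the paper's reactor model.

```python
import numpy as np

def evaluate_arr_residual(measured_output, model_output, threshold=0.05):
    """Evaluate an analytical-redundancy-relation residual.

    A fault is flagged when the discrepancy between the sensor
    measurement and the model prediction exceeds a threshold; the
    residual magnitude gives a crude indication of fault severity.
    """
    residual = measured_output - model_output
    faulty = np.abs(residual) > threshold
    severity = np.abs(residual) / threshold  # > 1 means above tolerance
    return residual, faulty, severity

# Hypothetical example: a sensor drifting away from the model prediction.
t = np.linspace(0, 10, 200)
model = np.sin(t)
sensor = np.sin(t) + 0.1 * (t > 5)  # fault injected at t = 5
_, flags, sev = evaluate_arr_residual(sensor, model)
print("first faulty sample index:", np.argmax(flags))
```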

Keywords: Bond Graph, Modeling, Simulation, Monitoring, Analytical Redundancy Relations, Pressurized Water Reactor, Directed Graph.

1656 The Application of Non-quantitative Modelling in the Analysis of a Network Warfare Environment

Authors: N. Veerasamy, JPH Eloff

Abstract:

Network warfare is an emerging concept that focuses on the network- and computer-based forms through which information is attacked and defended. Various computer and network security concepts thus play a role in network warfare. Due to the intricacy of the various interacting components, a model would be beneficial to better understand the complexity of a network warfare environment. Non-quantitative modelling is a useful method to characterize the field, owing to the rich ideas that can be generated through the use of secular associations, chronological origins, linked concepts, categorizations and context specifications. This paper proposes the use of non-quantitative methods, through a morphological analysis, to better explore and define the influential conditions in a network warfare environment.

Keywords: Morphological, non-quantitative, network warfare.

1655 Investigation of I/Q Imbalance in Coherent Optical OFDM System

Authors: R. S. Fyath, Mustafa A. B. Al-Qadi

Abstract:

The in-phase/quadrature (I/Q) amplitude and phase imbalance effects are studied in coherent optical orthogonal frequency division multiplexing (CO-OFDM) systems. An analytical model for the I/Q imbalance is developed and supported by simulation results. The results indicate that I/Q imbalance degrades the BER performance considerably.
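
For readers who want a concrete picture, here is a minimal numerical sketch of a standard receiver I/Q imbalance model applied to a QPSK-loaded OFDM symbol; the gain and phase mismatch values are illustrative, not those studied in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical QPSK-loaded OFDM symbol (64 subcarriers).
N = 64
symbols = (rng.choice([-1, 1], N) + 1j * rng.choice([-1, 1], N)) / np.sqrt(2)
x = np.fft.ifft(symbols)  # time-domain OFDM symbol

# Standard receiver I/Q imbalance model: y = K1*x + K2*conj(x),
# with gain mismatch g and phase mismatch phi (illustrative values).
g, phi = 1.05, np.deg2rad(3)
K1 = (1 + g * np.exp(-1j * phi)) / 2
K2 = (1 - g * np.exp(1j * phi)) / 2
y = K1 * x + K2 * np.conj(x)

# The conjugate term maps subcarrier k onto its mirror -k, creating
# inter-carrier interference; the image rejection ratio quantifies it.
irr_db = 10 * np.log10(abs(K1) ** 2 / abs(K2) ** 2)
print(f"image rejection ratio: {irr_db:.1f} dB")
```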

Keywords: Coherent detection, I/Q imbalance, OFDM, optical communications

1654 Fault Classification of Double Circuit Transmission Line Using Artificial Neural Network

Authors: Anamika Jain, A. S. Thoke, R. N. Patel

Abstract:

This paper addresses the problems encountered by conventional distance relays when protecting double-circuit transmission lines. The problems arise principally from the mutual coupling between the two circuits under different fault conditions; this mutual coupling is highly nonlinear in nature. An adaptive protection scheme based on an artificial neural network (ANN) is proposed for such lines. An ANN can classify the nonlinear relationship between measured signals by identifying the different patterns of the associated signals. A key point of the present work is that only current signals measured at the local end are used to detect and classify faults in the double-circuit transmission line with double-end infeed. The adaptive protection scheme is tested under a specific fault type but with varying fault location, fault resistance, fault inception angle, and remote-end infeed. Once the neural network is trained adequately, it performs precisely under differing system parameters and conditions. The test results clearly show that faults are detected and classified within a quarter cycle; the proposed adaptive protection technique is thus well suited for double-circuit transmission line fault detection and classification. Performance studies show that the proposed neural-network-based module can improve on conventional fault selection algorithms.
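
As a hedged sketch of the classification stage, the snippet below trains a small multi-layer perceptron on synthetic six-channel current features; it uses scikit-learn's Adam-trained MLP rather than the backpropagation/Levenberg-Marquardt training used in the paper, and the data generation is purely illustrative.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

# Hypothetical feature vectors: fundamental-frequency magnitudes of the
# six local-end phase currents (two circuits x three phases); the label
# encodes a fault class (0 = no fault, 1-3 = illustrative fault types).
n = 600
X = rng.normal(1.0, 0.05, (n, 6))
y = rng.integers(0, 4, n)
X[np.arange(n), y % 6] += 0.8 * (y > 0)  # faulted phase draws more current

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
clf.fit(X_tr, y_tr)
print("hold-out accuracy:", clf.score(X_te, y_te))
```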

Keywords: Double circuit transmission line, Fault detection and classification, High impedance fault, Artificial Neural Network.

1653 Role of ICT and Wage Inequality in Organization

Authors: Shoji Katagiri

Abstract:

This study deals with wage inequality in organizations and shows the relationship between ICT and wages in organizations. To do so, we incorporate ICT factors into our model of the organization. The ICT factors are the efficiencies of Enterprise Resource Planning (ERP), Computer-Aided Design/Computer-Aided Manufacturing (CAD/CAM), and NETWORK. Improving these ICT factors decreases the learning cost of solving problems within the organizational hierarchy. The improvement of NETWORK increases wage inequality among workers and decreases it among managers and entrepreneurs. The improvements of CAD/CAM and ERP increase wage inequality within all agent types, and partially increase it between the agents in the hierarchy.

Keywords: Endogenous economic growth, ICT, inequality, capital accumulation, technology.

1652 A Study on Remote On-Line Diagnostic System for Vehicles by Integrating the Technology of OBD, GPS, and 3G

Authors: Jyong Lin, Shih-Chang Chen, Yu-Tsen Shih, Shi-Huang Chen

Abstract:

This paper presents a remote on-line diagnostic system for vehicles using On-Board Diagnostics (OBD), GPS, and 3G techniques. The main parts of the proposed system are the on-board computer, the vehicle monitor server, and the vehicle status browser. First, the on-board computer obtains the driver's location and the vehicle status from the GPS receiver and the OBD interface, respectively. The on-board computer then connects to the vehicle monitor server through the 3G network to transmit the real-time vehicle status. Finally, the vehicle status browser shows the remote vehicle status, including vehicle speed, engine rpm, battery voltage, engine coolant temperature, and diagnostic trouble codes. The experimental results show that the proposed system helps fleet managers and mechanics understand the remote vehicle status. The system can therefore reduce fleet-management and vehicle-repair time, since fleet managers and mechanics receive the diagnostic trouble messages in time.
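
A minimal sketch of the on-board computer's polling step, assuming the third-party python-OBD library; the server URL is hypothetical and the 3G link is abstracted as a plain HTTP POST.

```python
import obd
import requests

connection = obd.OBD()  # auto-detects the OBD-II adapter

# Poll a few of the quantities named above.
readings = {
    "speed": connection.query(obd.commands.SPEED),
    "rpm": connection.query(obd.commands.RPM),
    "coolant_temp": connection.query(obd.commands.COOLANT_TEMP),
}
status = {k: str(r.value) for k, r in readings.items() if not r.is_null()}

# Transmit the snapshot to the (hypothetical) vehicle monitor server.
requests.post("http://example.com/vehicle-status", json=status)
```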

Keywords: Diagnostic Trouble Code (DTC), Electronic Control Unit (ECU), Global Position System (GPS), On-Board Diagnostic (OBD).

1651 Analysis of Histogram Asymmetry for Waste Recognition

Authors: Janusz Bobulski, Kamila Pasternak

Abstract:

Despite many years of effort and research, the problem of waste management remains current. There is a lack of fast and effective algorithms for classifying individual waste fractions. Many programs and projects aim to improve the percentage of waste recycled every year, and in these efforts it is worth using modern computer vision techniques supported by artificial intelligence. In this article, we present a method of identifying plastic waste based on an asymmetry analysis of the histogram of the image containing the waste. The method is simple but effective (94% accuracy), which allows it to be implemented on devices with low computing power, in particular on microcomputers. Such devices can be used both at home and in waste sorting plants.
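
A minimal sketch of the idea, assuming the asymmetry measure is the skewness of the grey-level histogram; the decision threshold below is illustrative, not the one tuned in the study.

```python
import numpy as np
from scipy.stats import skew

def histogram_asymmetry(gray_image):
    """Skewness of the grey-level distribution as an asymmetry measure."""
    return skew(gray_image.ravel())

def is_plastic(gray_image, threshold=0.5):
    """Hypothetical decision rule: flag right-skewed histograms."""
    return histogram_asymmetry(gray_image) > threshold

# Right-skewed synthetic sample standing in for a waste image.
rng = np.random.default_rng(0)
img = rng.gamma(2.0, 30.0, (128, 128)).clip(0, 255)
print(histogram_asymmetry(img), is_plastic(img))
```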

Keywords: Computer vision, environmental protection, image processing, waste management.

1650 A Multi Steps Algorithm for Sperm Segmentation in Microscopic Image

Authors: Fereidoon Nowshiravan Rahatabad, Mohammad Hassan Moradi, Vahid Reza Nafisi

Abstract:

Noting that an effective cure for infertility is possible only once a precise analysis is available, a great deal of study has been done in this field, and it remains a hot research subject today. Analyzing a man's semen allows fertility and infertility to be assessed and, from this, a suitable treatment to be sought; since the analysis is a non-invasive and low-risk procedure, it is greatly welcomed. In this research, the procedure is based on several enhancement and segmentation algorithms applied to images taken through microscopes at different fertility institutions; suitable results were obtained from the computed images, which in turn help us to distinguish the sperms from the fluid and their surroundings.
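
A hedged sketch of an enhancement-plus-segmentation pipeline in the spirit of the multi-step algorithm, using scikit-image; the paper's actual steps and parameters are not reproduced.

```python
import numpy as np
from skimage import exposure, filters, morphology

def segment_sperm(gray):
    """gray: float image scaled to [0, 1]; returns a boolean object mask."""
    enhanced = exposure.equalize_adapthist(gray)               # local contrast boost
    mask = enhanced > filters.threshold_otsu(enhanced)         # global Otsu threshold
    return morphology.remove_small_objects(mask, min_size=50)  # drop debris specks

# Synthetic stand-in for a microscope field: dim background, brighter blob.
rng = np.random.default_rng(0)
img = rng.uniform(0.0, 0.3, (128, 128))
img[40:50, 40:46] = 0.9  # fake sperm head
print(segment_sperm(img).sum(), "foreground pixels")
```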

Keywords: Computer-Assisted Sperm Analysis (CASA), Sperm identification, Segmentation.

1649 Multiaxial Fatigue Analysis of a High Performance Nickel-Based Superalloy

Authors: P. Selva, B. Lorrain, J. Alexis, A. Seror, A. Longuet, C. Mary, F. Denard

Abstract:

Over the past four decades, the fatigue behavior of nickel-based alloys has been widely studied. However, in recent years, significant advances in the fabrication process leading to grain size reduction have been made in order to improve fatigue properties of aircraft turbine discs. Indeed, a change in particle size affects the initiation mode of fatigue cracks as well as the fatigue life of the material. The present study aims to investigate the fatigue behavior of a newly developed nickel-based superalloy under biaxial-planar loading. Low Cycle Fatigue (LCF) tests are performed at different stress ratios so as to study the influence of the multiaxial stress state on the fatigue life of the material. Full-field displacement and strain measurements as well as crack initiation detection are obtained using Digital Image Correlation (DIC) techniques. The aim of this presentation is first to provide an in-depth description of both the experimental set-up and protocol: the multiaxial testing machine, the specific design of the cruciform specimen and performances of the DIC code are introduced. Second, results for sixteen specimens related to different load ratios are presented. Crack detection, strain amplitude and number of cycles to crack initiation vs. triaxial stress ratio for each loading case are given. Third, from fractographic investigations by scanning electron microscopy it is found that the mechanism of fatigue crack initiation does not depend on the triaxial stress ratio and that most fatigue cracks initiate from subsurface carbides.

Keywords: Cruciform specimen, multiaxial fatigue, nickel-based superalloy.

1648 Expressive Modes and Species of Language

Authors: Richard Elling Moe

Abstract:

Computer languages are usually lumped together into broad “paradigms”, leaving us in want of a finer classification of kinds of language. Theories distinguishing between “genuine differences” in language have been called for, and we propose that such differences can be observed through a notion of expressive mode. We outline this concept, propose how it could be operationalized, and indicate a possible context for the development of a corresponding theory. Finally, we consider a possible application in connection with the evaluation of language revision. We illustrate this with a case, investigating possible revisions of the relational algebra in order to overcome weaknesses of the division operator in connection with universal queries.
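
To make the division-operator discussion concrete, here is a small sketch of relational division, the operator behind universal queries such as “which suppliers supply every part?”; the relations are hypothetical.

```python
def divide(dividend, divisor):
    """Relational division: pairs (x, y) ÷ set {y} -> {x related to all y}."""
    xs = {x for x, _ in dividend}
    return {x for x in xs
            if divisor <= {y for (x2, y) in dividend if x2 == x}}

# "Which suppliers supply every part?" (hypothetical data)
supplies = {("s1", "p1"), ("s1", "p2"), ("s2", "p1")}
parts = {"p1", "p2"}
print(divide(supplies, parts))  # {'s1'}
```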

Keywords: Expressive mode, Computer language species, Evaluation of revision, Relational algebra, Universal database queries

1647 An Intelligent Transportation System for Safety and Integrated Management of Railway Crossings

Authors: M. Magrini, D. Moroni, G. Palazzese, G. Pieri, D. Azzarelli, A. Spada, L. Fanucci, O. Salvetti

Abstract:

Railway crossings are complex entities whose optimal management cannot be addressed without the help of an intelligent transportation system integrating information on both train and vehicular flows. In this paper, we propose an integrated system named SIMPLE (Railway Safety and Infrastructure for Mobility applied at level crossings) that, while providing unparalleled safety in railway level crossings, collects data on rail and road traffic and provides value-added services to citizens and commuters. Such services include, for example, alerts via variable message signs to drivers and suggestions for alternative routes, towards a more sustainable, eco-friendly and efficient urban mobility. To achieve these goals, SIMPLE is organized as a System of Systems (SoS), with a modular architecture whose components range from specially designed radar sensors for obstacle detection to smart ETSI M2M-compliant camera networks for urban traffic monitoring. Computational units that forecast train and vehicular traffic according to adaptive models are also included. The proposed system has been tested and validated during an extensive trial held in the mid-sized Italian town of Montecatini, a paradigmatic case where the rail network is inextricably linked with the fabric of the city. Results of the tests are reported and discussed.

Keywords: Intelligent Transportation Systems (ITS), railway, railroad crossing, smart camera networks, radar obstacle detection, real-time traffic optimization, IoT, ETSI M2M, transport safety.

1646 VHL, PBRM1 and SETD2 Genes in Kidney Cancer: A Molecular Investigation

Authors: Rozhgar A. Khailany, Mehri Igci, Emine Bayraktar, Sakip Erturhan, Metin Karakok, Ahmet Arslan

Abstract:

Kidney cancer is the most lethal urological cancer, accounting for 3% of adult malignancies. VHL, a tumor-suppressor gene, is best known for its association with renal cell carcinoma (RCC); it functions as a negative regulator of hypoxia-inducible factors. Recent sequencing efforts have identified several novel, frequent mutations of histone-modifying and chromatin-remodeling genes in ccRCC (clear cell RCC), including PBRM1 and SETD2. The PBRM1 gene encodes the BAF180 protein, which is involved in the transcriptional activation and repression of selected genes. SETD2 encodes a histone methyltransferase, which may play a role in suppressing tumor development. In this study, RNAs of 30 paired tumor and normal samples, grouped according to the type of kidney cancer and the clinical characteristics of the patients (including gender and average age), were examined by RT-PCR, SSCP, and sequencing techniques. VHL, PBRM1, and SETD2 expression was relatively down-regulated; however, the difference was not statistically significant (Wilcoxon signed-rank test, p > 0.05). Interestingly, and contrary to previous studies, no mutation was observed. Understanding the molecular mechanisms involved in the pathogenesis of RCC has aided the development of molecular-targeted drugs for kidney cancer. Further analysis is required to identify responsible genes other than VHL, PBRM1, and SETD2 in kidney cancer.
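
A minimal sketch of the paired expression comparison, using SciPy's Wilcoxon signed-rank test on simulated tumor/normal expression values; the data are invented and the outcome merely mimics the “down-regulated but not significant” pattern reported.

```python
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(0)
normal = rng.normal(1.00, 0.20, 30)          # 30 matched normal samples
tumor = normal * rng.normal(0.90, 0.25, 30)  # slight relative down-regulation

stat, p = wilcoxon(tumor, normal)            # paired, non-parametric test
print(f"W = {stat:.1f}, p = {p:.3f}",
      "(not significant)" if p > 0.05 else "(significant)")
```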

Keywords: Kidney cancer, molecular biomarker, expression analysis, mutation screening.

1645 Forces Association-Based Active Contour

Authors: Aicha Baya Goumeidane, Nafaa Nacereddine

Abstract:

A welded structure must be inspected to guarantee that the weld quality meets the design requirements and to assure safety and reliability. However, X-ray image analysis and defect recognition using computer vision techniques are very complex. Most difficulties lie in finding small, irregular defects in poor-contrast images, which requires preprocessing of the image in order to extract and classify features from strong background noise. This paper addresses the issue of designing a methodology to extract defects from noisy background radiographs with image processing. Based on the use of active contours, this methodology seems to give good results.
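
A hedged sketch of defect extraction with an active contour, using scikit-image's snake implementation on a synthetic noisy “radiograph”; the snake parameters are illustrative.

```python
import numpy as np
from skimage.filters import gaussian
from skimage.segmentation import active_contour

# Synthetic noisy image with a bright blob standing in for a weld defect.
rng = np.random.default_rng(0)
img = rng.normal(0.2, 0.05, (100, 100))
rr, cc = np.mgrid[:100, :100]
img[(rr - 50) ** 2 + (cc - 50) ** 2 < 15 ** 2] += 0.6

# Initialize the snake as a circle around the suspected defect.
theta = np.linspace(0, 2 * np.pi, 100)
init = np.column_stack([50 + 25 * np.sin(theta), 50 + 25 * np.cos(theta)])

# Smooth the image first so the snake is attracted by broad gradients.
snake = active_contour(gaussian(img, 2), init,
                       alpha=0.015, beta=10, gamma=0.001)
print("contour points:", snake.shape)
```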

Keywords: Welding, Radiography, Computer vision, Active contour.

1644 Assessment-Assisted and Relationship-Based Financial Advising: Using an Empirical Assessment to Understand Personal Investor Risk Tolerance in Professional Advising Relationships

Authors: Jerry Szatko, Edan L. Jorgensen, Stacia Jorgensen

Abstract:

A crucial component to the success of any financial advising relationship is for the financial professional to understand the perceptions, preferences and thought-processes carried by the financial clients they serve. Armed with this information, financial professionals are more quickly able to understand how they can tailor their approach to best match the individual preferences and needs of each personal investor. Our research explores the use of a quantitative assessment tool in the financial services industry to assist in the identification of the personal investor’s consumer behaviors, especially in terms of financial risk tolerance, as it relates to their financial decision making. Through this process, the Unitifi Consumer Insight Tool (UCIT) was created and refined to capture and categorize personal investor financial behavioral categories and the financial personality tendencies of individuals prior to the initiation of a financial advisement relationship. This paper discusses the use of this tool to place individuals in one of four behavior-based financial risk tolerance categories. Our discoveries and research were aided through administration of a web-based survey to a group of over 1,000 individuals. Our findings indicate that it is possible to use a quantitative assessment tool to assist in predicting the behavioral tendencies of personal consumers when faced with consumer financial risk and decisions.

Keywords: Behavior based advising, behavioral finance, financial advising, financial advisor tools, financial risk tolerance.

1643 Data Compression in Ultrasonic Network Communication via Sparse Signal Processing

Authors: Beata Zima, Octavio A. Márquez Reyes, Masoud Mohammadgholiha, Jochen Moll, Luca De Marchi

Abstract:

This document presents an approach to signal encoding and information transfer within a guided wave sensor network, comprised of specially designed frequency steerable acoustic transducers (FSATs), using compressed sensing. Wave propagation in a damaged plate was simulated using the commercial FEM-based software COMSOL. Guided waves were excited by means of FSATs, characterized by the special shape of their electrodes, and modeled using PIC255 piezoelectric material. The special shape of the FSAT allows wave energy to be focused in a certain direction, according to the frequency components of its actuation signal, which makes a larger monitored area available. The process begins when an FSAT detects and records a reflection from damage in the structure; this signal is then encoded and prepared for transmission using a combined approach based on Compressed Sensing Matching Pursuit and Quadrature Amplitude Modulation (QAM). Once the signal is encoded in binary form, the information is transmitted between the nodes in the network. The message reaches the last node, where it is finally decoded and processed for damage detection and localization purposes. The main aim of the investigation is to determine the location of detected damage using the reconstructed signals. The study demonstrates that the special steerable capabilities of FSATs not only facilitate the detection of damage but also permit transmitting the damage information to a chosen area in a specific direction of the investigated structure.
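
A minimal sketch of the compression step, assuming plain matching pursuit over a random unit-norm dictionary; the paper's dictionary design and the QAM modulation stage are not reproduced.

```python
import numpy as np

def matching_pursuit(signal, dictionary, n_atoms):
    """Greedily pick the atoms most correlated with the residual; only the
    chosen indices and coefficients need to be transmitted."""
    residual = signal.copy()
    indices, coeffs = [], []
    for _ in range(n_atoms):
        correlations = dictionary.T @ residual
        k = int(np.argmax(np.abs(correlations)))
        indices.append(k)
        coeffs.append(correlations[k])
        residual = residual - correlations[k] * dictionary[:, k]
    return indices, coeffs, residual

rng = np.random.default_rng(0)
D = rng.normal(size=(256, 512))
D /= np.linalg.norm(D, axis=0)               # unit-norm atoms
x = D[:, [10, 99]] @ np.array([2.0, -1.0])   # sparse test signal
idx, c, r = matching_pursuit(x, D, 5)
print(idx[:2], np.linalg.norm(r))            # true atoms found, tiny residual
```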

Keywords: Data compression, ultrasonic communication, guided waves, FEM analysis.

1642 Multi-Layer Perceptron Neural Network Classifier with Binary Particle Swarm Optimization Based Feature Selection for Brain-Computer Interfaces

Authors: K. Akilandeswari, G. M. Nasira

Abstract:

Brain-Computer Interfaces (BCIs) measure brain signal activity, intentionally and unintentionally induced by users, and provide a communication channel that does not depend on the brain's normal output pathway of peripheral nerves and muscles. Feature Selection (FS) is a global optimization machine learning problem that reduces the number of features and removes irrelevant and noisy data while retaining acceptable recognition accuracy. It is a vital step affecting the performance of a pattern recognition system. This study presents a new Binary Particle Swarm Optimization (BPSO) based feature selection algorithm. A Multi-layer Perceptron Neural Network (MLPNN) classifier, with backpropagation and Levenberg-Marquardt training algorithms, classifies the selected features.
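
A compact, hedged sketch of BPSO-driven feature selection wrapped around an MLP fitness function; all hyperparameters, the sigmoid transfer rule, and the synthetic demonstration data are illustrative assumptions, not the study's settings.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

def bpso_select(X, y, n_particles=6, n_iter=5, seed=0):
    """Binary PSO over feature masks; fitness = CV accuracy of a small MLP."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    pos = rng.integers(0, 2, (n_particles, d))
    vel = rng.normal(0.0, 1.0, (n_particles, d))

    def fitness(mask):
        if not mask.any():
            return 0.0
        clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=300, random_state=0)
        return cross_val_score(clf, X[:, mask.astype(bool)], y, cv=3).mean()

    pbest = pos.copy()
    pbest_fit = np.array([fitness(p) for p in pos])
    gbest = pbest[np.argmax(pbest_fit)].copy()

    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, d))
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        # Sigmoid transfer: velocity sets the probability that a bit is 1.
        pos = (rng.random((n_particles, d)) < 1.0 / (1.0 + np.exp(-vel))).astype(int)
        fit = np.array([fitness(p) for p in pos])
        better = fit > pbest_fit
        pbest[better], pbest_fit[better] = pos[better], fit[better]
        gbest = pbest[np.argmax(pbest_fit)].copy()
    return gbest.astype(bool)

# Tiny demonstration on synthetic data (features 0 and 3 are informative).
rng = np.random.default_rng(1)
X = rng.normal(size=(120, 8))
y = (X[:, 0] + X[:, 3] > 0).astype(int)
print(bpso_select(X, y).astype(int))
```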

Keywords: Brain-Computer Interfaces (BCI), Feature Selection (FS), Walsh–Hadamard Transform (WHT), Binary Particle Swarm Optimization (BPSO), Multi-Layer Perceptron (MLP), Levenberg–Marquardt algorithm.

1641 Liquid Chromatography Microfluidics for Detection and Quantification of Urine Albumin Using Linear Regression Method

Authors: Patricia B. Cruz, Catrina Jean G. Valenzuela, Analyn N. Yumang

Abstract:

Nearly a hundred per million of the Filipino population are diagnosed with Chronic Kidney Disease (CKD). The early stage of CKD has no symptoms and can only be discovered once the patient undergoes urinalysis. Over the years, different methods have been used for the quantification of urinary albumin, such as immunochemical assays, most of which require large machinery with high maintenance and resource costs, and the dipstick test, which is still debated as a reliable method for detecting early-stage microalbuminuria. This research study uses the liquid chromatography concept in a microfluidic instrument with a biosensor, as means of separation and detection respectively, and linear regression to quantify human urinary albumin. The researchers' main objective was to create a miniature system that quantifies and detects patients' urinary albumin while reducing the volume used per five test samples. For this study, 30 urine samples of unknown albumin concentration were tested using the VITROS Analyzer and the microfluidic system for comparison. Based on the data from both methods, the actual vs. predicted regression showed a positive linear relationship with an R² of 0.9995 and a linear equation of y = 1.09x + 0.07, indicating that the predicted and actual values are approximately equal. Furthermore, the microfluidic instrument uses 75% less total volume (sample and reagents combined) compared to the VITROS Analyzer per five test samples.
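
A minimal sketch of the calibration comparison, fitting a least-squares line between simulated reference and microfluidic readings and reporting R²; the numbers are invented, while the study's own fit was y = 1.09x + 0.07 with R² = 0.9995.

```python
import numpy as np

rng = np.random.default_rng(0)
vitros = rng.uniform(5, 300, 30)  # reference analyzer readings (mg/L)
microfluidic = (vitros - 0.07) / 1.09 + rng.normal(0, 1.0, 30)

# Least-squares line: predicted reference value from microfluidic reading.
slope, intercept = np.polyfit(microfluidic, vitros, 1)
pred = slope * microfluidic + intercept
r2 = 1 - np.sum((vitros - pred) ** 2) / np.sum((vitros - vitros.mean()) ** 2)
print(f"y = {slope:.2f}x + {intercept:.2f}, R^2 = {r2:.4f}")
```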

Keywords: Chronic kidney disease, microfluidics, linear regression, VITROS analyzer, urinary albumin.

1640 Architecture, Implementation and Application of Tools for Experimental Analysis

Authors: Tom Dowling, Adam Duffy

Abstract:

This paper presents an architecture to assist in the development of tools to perform experimental analysis. Existing implementations of tools based on this architecture are also described in this paper. These tools are applied to the real world problem of fault attack emulation and detection in cryptographic algorithms.

Keywords: Software Architectures and Design, Software Components and Reuse, Engineering Secure Software.

1639 Retina Based Mouse Control (RBMC)

Authors: Arslan Qamar Malik, Jehanzeb Ahmad

Abstract:

The paper presents a novel idea for controlling computer mouse cursor movement with the human eyes, and describes how the product works to help people with special needs share their knowledge with the world. A number of traditional techniques, such as head and eye movement tracking systems, exist for cursor control; they make use of image processing, in which light is the primary source. Electro-oculography (EOG) is a newer technology for sensing eye signals, with which the mouse cursor can be controlled. The signals captured by the sensors are first amplified, then the noise is removed, and they are then digitized before being transferred to a PC for software interfacing.
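
A hedged sketch of the conditioning chain, assuming a low-pass Butterworth filter on the digitized EOG and a dead-zone mapping from signal level to cursor step; sampling rate, cutoff, and gains are illustrative.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 250  # Hz, hypothetical sampling rate
b, a = butter(4, 30 / (fs / 2), btype="low")  # EOG energy is low-frequency

def cursor_step(eog_window, gain=0.5, dead_zone=50e-6):
    """Map a filtered horizontal-EOG window to a horizontal cursor step."""
    clean = filtfilt(b, a, eog_window)
    amplitude = clean.mean()             # eye deflection ~ mean signal level
    if abs(amplitude) < dead_zone:       # ignore noise around gaze center
        return 0.0
    return gain * amplitude / dead_zone  # pixels per update (illustrative)

rng = np.random.default_rng(0)
window = 200e-6 + 20e-6 * rng.normal(size=fs)  # simulated rightward glance
print(cursor_step(window))
```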

Keywords: Human Computer Interaction, Real-Time System, Electro-oculography, Signal Processing.

1638 Application of Rapid Prototyping to Create Additive Prototype Using Computer System

Authors: Meftah O. Bashir, Fatma A. Karkory

Abstract:

Rapid prototyping is a new group of manufacturing processes that allows the fabrication of physical parts of any complexity using a layer-by-layer deposition technique driven directly from a computer system. The rapid prototyping process greatly reduces the time and cost necessary to bring a new product to market. Prototypes made by these systems are used in a range of industrial applications, including design evaluation, verification, testing, and as patterns for casting processes. These processes employ a variety of materials and mechanisms to build up the layers that form the part. The present work was to build an FDM prototyping machine that could control the X-Y motion and material deposition, to generate two-dimensional and three-dimensional complex shapes. The study focused on the deposition of wax material and set out to determine the properties of the waxes used, in order to enable better control of the FDM process. It looks at the integration of a computer-controlled electro-mechanical system with the traditional FDM additive prototyping process. The characteristics of the wax were also analysed in order to optimise the model production process; these included the wax phase-change temperature, wax viscosity and wax droplet shape during processing.

Keywords: Rapid prototyping, wax, manufacturing processes, additive prototyping.

1637 Implementation of an On-Line PD Measurement System Using HFCT

Authors: F. Haghjoo, M. Sarlak, S.M. Shahrtash

Abstract:

In order to perform on-line measurement and detection of partial discharge (PD) signals, a total solution comprising an HFCT, an A/D converter and a complete software package is proposed. The software package includes compensation for the HFCT contribution, filtering and noise reduction using the wavelet transform, and soft calibration routines. The results show good performance and high accuracy.
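
A minimal sketch of the wavelet denoising stage, using PyWavelets with a soft universal threshold on a simulated PD-like pulse; the wavelet, level, and threshold rule are illustrative assumptions, not the paper's tuned settings.

```python
import numpy as np
import pywt

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 1024)
pd_pulse = np.exp(-((t - 0.5) ** 2) / 1e-5) * np.sin(2 * np.pi * 400 * t)
noisy = pd_pulse + 0.2 * rng.normal(size=t.size)

coeffs = pywt.wavedec(noisy, "db4", level=5)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745   # robust noise estimate
thr = sigma * np.sqrt(2 * np.log(noisy.size))    # universal threshold
denoised = pywt.waverec(
    [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]],
    "db4")
print("residual noise std:", np.std(denoised - pd_pulse))
```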

Keywords: Partial Discharge, Measurement, On-line, HFCT

1636 Novel Adaptive Channel Equalization Algorithms by Statistical Sampling

Authors: János Levendovszky, András Oláh

Abstract:

In this paper, novel statistical sampling based equalization techniques and CNN based detection are proposed to increase the spectral efficiency of multiuser communication systems over fading channels. Multiuser communication combined with selective fading can result in interference which severely deteriorates the quality of service in wireless data transmission (e.g. CDMA in mobile communication). The paper introduces new equalization methods to combat this interference by minimizing the Bit Error Rate (BER) as a function of the equalizer coefficients. This provides higher performance than traditional Minimum Mean Square Error equalization. Since the calculation of the BER as a function of the equalizer coefficients is of exponential complexity, statistical sampling methods are proposed to approximate the gradient, which yields fast equalization and performance superior to the traditional algorithms. Efficient estimation of the gradient is achieved by using stratified sampling and the Li-Silvester bounds. A simple mechanism is derived to identify the dominant samples in real time, for the sake of efficient estimation. The equalizer weights are adapted recursively by minimizing the estimated BER. The near-optimal performance of the new algorithms is demonstrated by extensive simulations. The paper also develops a Cellular Neural Network (CNN) based approach to detection, in which fast quadratic optimization is carried out by the CNN, whereas the task of the equalizer is to ensure the required template structure (sparseness) for the CNN. The performance of this method has also been analyzed by simulations.

Keywords: Cellular Neural Network, channel equalization, communication over fading channels, multiuser communication, spectral efficiency, statistical sampling.

1635 Detection of Lard in Binary Animal Fats and Vegetable Oils Mixtures and in Some Commercial Processed Foods

Authors: H. A. Al-Kahtani, A. A. Abou Arab, M. Asif

Abstract:

Animal fats (camel, beef, sheep, goat, rabbit and chicken) and vegetable oils (corn, sunflower, palm and olive) were substituted with different proportions (1, 5, 10 and 20%) of lard. The fatty acid compositions of the triglycerides (TG) and 2-monoglycerides (2-MG) were determined using lipase hydrolysis and gas chromatography before and after adulteration. Results indicated that genuine lard had a high proportion (60.97%) of its total palmitic acid at the 2-MG position, whereas the figures were 8.70%, 16.40%, 11.38%, 10.57%, 29.97% and 8.97% for camel, beef, sheep, goat, rabbit and chicken fat, respectively. It can also be noticed that the 2-MG position is mostly occupied by unsaturated fatty acids in all the tested fats except lard. For the vegetable oils (corn, sunflower, palm and olive), the levels of palmitic acid esterified at the 2-MG position were 6.84%, 1.43%, 9.86% and 1.70%, respectively, and these oils had a higher level of unsaturated fatty acids in the same position than the animal fats under investigation. Moreover, the palmitic acid esterified at the 2-MG position and the PAEF increased gradually as the substitution level increased in all tested fat and oil samples. Statistical analysis showed that the PAEF correlated well with the lard level. The detection of lard in some commercial processed foods (5 French fries, 4 butter fat, 5 processed meat and 6 candy samples) was then carried out. The results revealed that 2 of the French fries samples and 4 of the processed meat samples contained lard, owing to their higher PAEF, while the butter fat and candy samples were free of lard.

Keywords: Lard, adulteration, PAEF, goat, triglycerides.

1634 Wasting Human and Computer Resources

Authors: Mária Csernoch, Piroska Biró

Abstract:

The legends about “user-friendly” and “easy-to-use” birotical tools (computer-related office tools) have been spreading and misleading end-users. This has led to an extremely high number of incorrect documents, causing serious financial losses in the creation, modification, and retrieval processes. Our research proved that there are at least two sources of this underachievement: (1) the lack of a definition of correctly edited, formatted documents; consequently, end-users do not know whether their methods and results are correct, and they are so unaware of their ignorance that they cannot realize their lack of knowledge; (2) the end-users' problem-solving methods: we have found that in non-traditional programming environments end-users apply, almost exclusively, surface-approach metacognitive methods to carry out their computer-related activities, which have proved less effective than deep-approach methods. Based on these findings we have developed deep-approach methods which are based on and adapted from traditional programming languages. In this study, we focus on the most popular type of birotical document, the text-based document. We provide a definition of correctly edited text and, based on this definition, adapt the debugging method known from programming. According to the method, before any real text editing, a thorough debugging of already existing texts and a categorization of the errors are carried out. With this method, in advance of real text editing, users learn the requirements of text-based documents and of correctly formatted text. The method has proved much more effective than the previously applied surface-approach methods. Its advantages are that real text handling requires far fewer human and computer resources than clicking aimlessly in the GUI (Graphical User Interface), and that data retrieval is much more effective than from error-prone documents.

Keywords: Deep approach metacognitive methods, error-prone birotical documents, financial losses, human and computer resources.

1633 Advanced Stochastic Models for Partially Developed Speckle

Authors: Jihad S. Daba (Jean-Pierre Dubois), Philip Jreije

Abstract:

Speckled images arise when coherent microwave, optical, and acoustic imaging techniques are used to image an object, surface or scene. Examples of coherent imaging systems include synthetic aperture radar, laser imaging systems, imaging sonar systems, and medical ultrasound systems. Speckle noise is a form of object or target induced noise that results when the surface of the object is Rayleigh rough compared to the wavelength of the illuminating radiation. Detection and estimation in images corrupted by speckle noise is complicated by the nature of the noise and is not as straightforward as detection and estimation in additive noise. In this work, we derive stochastic models for speckle noise, with an emphasis on speckle as it arises in medical ultrasound images. The motivation for this work is the problem of segmentation and tissue classification using ultrasound imaging. Modeling of speckle in this context involves partially developed speckle model where an underlying Poisson point process modulates a Gram-Charlier series of Laguerre weighted exponential functions, resulting in a doubly stochastic filtered Poisson point process. The statistical distribution of partially developed speckle is derived in a closed canonical form. It is observed that as the mean number of scatterers in a resolution cell is increased, the probability density function approaches an exponential distribution. This is consistent with fully developed speckle noise as demonstrated by the Central Limit theorem.
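
A small simulation of the convergence claim in the last two sentences: as the number of scatterers per resolution cell grows, the intensity statistics approach an exponential distribution, for which the standard-deviation-to-mean ratio equals 1. Purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def speckle_intensity(n_scatterers, n_cells=100_000):
    """Intensity of a coherent sum of random-phase unit scatterers."""
    phases = rng.uniform(0, 2 * np.pi, (n_cells, n_scatterers))
    field = np.exp(1j * phases).sum(axis=1) / np.sqrt(n_scatterers)
    return np.abs(field) ** 2

for n in (2, 5, 50):
    I = speckle_intensity(n)
    # For an exponential distribution, std/mean = 1 exactly.
    print(f"N = {n:3d}: std/mean = {I.std() / I.mean():.3f}")
```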

Keywords: Doubly stochastic filtered process, Poisson point process, segmentation, speckle, ultrasound

1632 A Real Time Set Up for Retrieval of Emotional States from Human Neural Responses

Authors: Rashima Mahajan, Dipali Bansal, Shweta Singh

Abstract:

Real-time non-invasive Brain Computer Interfaces have a significant and progressive role in restoring or maintaining a quality life for medically challenged people. This manuscript provides a comprehensive review of emerging research in the field of cognitive/affective computing in the context of human neural responses. The perspectives of different emotion assessment modalities, such as face expressions, speech, text, gestures, and human physiological responses, are also discussed. Focus is placed on exploring the ability of EEG (electroencephalogram) signals to portray thoughts, feelings, and unspoken words. An automated workflow-based protocol is proposed for designing an EEG-based real-time Brain Computer Interface system for the analysis and classification of human emotions elicited by external audio/visual stimuli. The front-end hardware includes a cost-effective and portable Emotiv EEG Neuroheadset unit, a personal computer, and a set of external stimulators. Primary signal analysis and processing of the real-time acquired EEG shall be performed using the MATLAB-based advanced brain mapping toolboxes EEGLab/BCILab. This shall be followed by the development of a MATLAB-based self-defined algorithm to capture and characterize temporal and spectral variations in the EEG under emotional stimulation. The extracted hybrid feature set shall be used to classify emotional states using artificial intelligence tools such as Artificial Neural Networks. The final system would result in an inexpensive, portable, and more intuitive real-time Brain Computer Interface for controlling prosthetic devices by translating different brain states into operative control signals.
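
A minimal sketch of the proposed spectral feature extraction, computing band powers of one EEG channel with Welch's method at the Emotiv headset's 128 Hz sampling rate; the signal is simulated and the band edges are conventional choices, not the study's final feature set.

```python
import numpy as np
from scipy.signal import welch

fs = 128  # Hz, Emotiv headset sampling rate
rng = np.random.default_rng(0)
t = np.arange(fs * 4) / fs
eeg = np.sin(2 * np.pi * 10 * t) + rng.normal(0.0, 1.0, t.size)  # alpha + noise

f, psd = welch(eeg, fs=fs, nperseg=fs)

def band_power(f, psd, lo, hi):
    sel = (f >= lo) & (f < hi)
    return psd[sel].sum() * (f[1] - f[0])  # rectangle-rule integral of the PSD

bands = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
features = {name: band_power(f, psd, lo, hi) for name, (lo, hi) in bands.items()}
print(features)  # alpha should dominate, matching the injected 10 Hz rhythm
```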

Keywords: Brain Computer Interface (BCI), Electroencephalogram (EEG), EEGLab, BCILab, Emotiv, Emotions, Interval features, Spectral features, Artificial Neural Network, Control applications.

1631 An Experimental Procedure for Design and Construction of Monocopter and Its Control Using Optical and GPS-Aided AHRS Sensors

Authors: A. Safaee, M. S. Mehrabani, M. B. Menhaj, V. Mousavi, S. Z. Moussavi

Abstract:

The monocopter is a single-wing rotary flying vehicle with hovering capability. This flying vehicle comprises two dynamic parts, and higher efficiency can be expected from it than from other micro UAVs due to the extended wing area relative to its fuselage. Low cost and a simple mechanism, in comparison to other vehicles such as the helicopter, are its most important characteristics. A previous paper introduced the final system; in this paper, the experimental design process of the monocopter and its control algorithm are investigated in general, the editorial bugs in the previous article are corrected, and some translational ambiguities are resolved. Initially, by constructing several prototypes and carrying out many flight tests, the main design parameters of this air vehicle were obtained by experimental measurement, and the main monocopter required for this project was then constructed. After construction of the monocopter, and in order to design, implement and test the control algorithms, a simple optical system was first used to determine the heading angle. After numerous tests on the test stand, the control algorithm was designed and the timing of the applied control inputs adjusted; the other control parameters of the system were then tuned in flight tests. Eventually the final control system was designed and implemented using the AHRS sensor, and the final operational tests were performed successfully.

Keywords: Monocopter, Flap, Heading Angle, AHRS, Cyclic, Photo Diode.

1630 Analysis of Driver Point of Regard Determinations with Eye-Gesture Templates Using Receiver Operating Characteristic

Authors: Siti Nor Hafizah binti Mohd Zaid, Mohamed Abdel-Maguid, Abdel-Hamid Soliman

Abstract:

An Advanced Driver Assistance System (ADAS) is a computer system on board a vehicle which is used to reduce the risk of vehicular accidents by monitoring factors relating to the driver, vehicle and environment and taking some action when a risk is identified. Much work has been done on assessing vehicle and environmental state, but there is still comparatively little published work that tackles the problem of driver state. Visual attention is one such driver state. In fact, some researchers claim that lack of attention is the main cause of accidents, as factors such as fatigue, alcohol or drug use, distraction and speeding all impair the driver's capacity to pay attention to the vehicle and road conditions [1]. This seems to imply that the main cause of accidents is inappropriate driver behaviour in cases where the driver is not giving full attention while driving. The work presented in this paper proposes an ADAS system which uses an image-based template matching algorithm to detect whether a driver is failing to observe particular windscreen cells. This is achieved by dividing the windscreen into 24 uniform cells (4 rows of 6 columns) and matching video images of the driver's left eye with eye-gesture templates drawn from images of the driver looking at the centre of each windscreen cell. The main contribution of this paper is to assess the accuracy of this approach using Receiver Operating Characteristic analysis. The results of our evaluation give a sensitivity value of 84.3% and a specificity value of 85.0% for the eye-gesture template approach, indicating that it may be useful for driver point of regard determinations.
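
A minimal sketch of the evaluation metrics, computing sensitivity and specificity from template-matching decisions against ground truth; the counts are invented, whereas the study reported 84.3% sensitivity and 85.0% specificity.

```python
import numpy as np

truth = np.array([1, 1, 1, 0, 0, 0, 1, 0, 1, 0])     # 1 = driver looked at cell
decision = np.array([1, 1, 0, 0, 1, 0, 1, 0, 1, 0])  # template-match output

tp = np.sum((decision == 1) & (truth == 1))
tn = np.sum((decision == 0) & (truth == 0))
fp = np.sum((decision == 1) & (truth == 0))
fn = np.sum((decision == 0) & (truth == 1))

print(f"sensitivity = {tp / (tp + fn):.1%}, specificity = {tn / (tn + fp):.1%}")
```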

Keywords: Advanced Driver Assistance Systems, Eye-Tracking, Hazard Detection.

1629 Improved Segmentation of Speckled Images Using an Arithmetic-to-Geometric Mean Ratio Kernel

Authors: J. Daba, J. Dubois

Abstract:

In this work, we improve a previously developed segmentation scheme aimed at extracting edge information from speckled images using a maximum likelihood edge detector. The scheme was based on finding a threshold for the probability density function of a new kernel defined as the arithmetic mean-to-geometric mean ratio field over a circular neighborhood set and, in a general context, is founded on a likelihood random field model (LRFM). The segmentation algorithm was applied to discriminated speckle areas obtained using simple elliptic discriminant functions based on measures of the signal-to-noise ratio with fractional order moments. A rigorous stochastic analysis was used to derive an exact expression for the cumulative density function of the probability density function of the random field. Based on this, an accurate probability of error was derived and the performance of the scheme was analysed. The improved segmentation scheme performed well for both simulated and real images and showed superior results to those previously obtained using the original LRFM scheme and standard edge detection methods. In particular, the false alarm probability was markedly lower than that of the original LRFM method with oversegmentation artifacts virtually eliminated. The importance of this work lies in the development of a stochastic-based segmentation, allowing an accurate quantification of the probability of false detection. Non visual quantification and misclassification in medical ultrasound speckled images is relatively new and is of interest to clinicians.
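
A minimal sketch of the kernel itself: the arithmetic-to-geometric mean ratio over a local window, computed image-wide with uniform filters; a square window stands in for the paper's circular neighborhood set, and the test image is synthetic.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def am_gm_ratio(image, size=7, eps=1e-12):
    img = image.astype(float) + eps                  # guard against log(0)
    am = uniform_filter(img, size)                   # local arithmetic mean
    gm = np.exp(uniform_filter(np.log(img), size))   # local geometric mean
    return am / gm                                   # >= 1, larger near edges

rng = np.random.default_rng(0)
speckled = rng.gamma(4, 25, (64, 64))  # crude speckle-like texture
speckled[:, 32:] *= 3                  # vertical edge
ratio = am_gm_ratio(speckled)
print("max ratio in the edge region:", ratio[:, 28:36].max())
```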

Keywords: Discriminant function, false alarm, segmentation, signal-to-noise ratio, skewness, speckle.

1628 Mutation Analysis of the ATP7B Gene in 43 Vietnamese Wilson’s Disease Patients

Authors: Huong M. T. Nguyen, Hoa A. P. Nguyen, Mai P. T. Nguyen, Ngoc D. Ngo, Van T. Ta, Hai T. Le, Chi V. Phan

Abstract:

Wilson’s disease (WD) is an autosomal recessive disorder of copper metabolism caused by mutations in the copper-transporting P-type ATPase gene (ATP7B). The mechanism of this disease is the failure of hepatic excretion of copper into bile, which leads to copper deposits in the liver and other organs. The ATP7B gene is located on the long arm of chromosome 13 (13q14.3). This study aimed to investigate the gene mutations in Vietnamese patients with WD and to make presymptomatic diagnoses for their family members. Forty-three WD patients and their 65 siblings were screened for ATP7B gene mutations. Genomic DNA was extracted from peripheral blood samples; the 21 exons and exon-intron boundaries of the ATP7B gene were analyzed by direct sequencing. We identified four novel mutations ([R723=; H724Tfs*34], V1042Cfs*79, D1027H, and IVS6+3A>G) among a total of 20 detected mutations, accounting for 87.2% of the total. Mutation S105* occurred at a high rate (32.6%) in this study. The hotspot regions of ATP7B were found at exons 2, 16, and 8, and intron 14, in 39.6%, 11.6%, 9.3%, and 7% of patients, respectively. Among nine homozygous/compound heterozygous siblings of the patients with WD, three individuals were determined to be asymptomatic through screening for the probands' mutations; they could begin treatment immediately after diagnosis. In conclusion, 20 different mutations were detected in 43 WD patients, four of them novel: [R723=; H724Tfs*34], V1042Cfs*79, D1027H, and IVS6+3A>G. The mutation S105* is the most prevalent and can be considered a biomarker for use in a rapid detection assay for the diagnosis of WD. Exons 2, 8, and 16, and intron 14 should be screened first in WD patients in Vietnam. Given the risk profile of WD, genetic testing of presymptomatic patients is also useful for diagnosis and treatment.

Keywords: ATP7B gene, mutation detection, presymptomatic diagnosis, Vietnamese Wilson’s disease.
