Search results for: noise source identification
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3118

2548 Improved Processing Speed for Text Watermarking Algorithm in Color Images

Authors: Hamza A. Al-Sewadi, Akram N. A. Aldakari

Abstract:

Copyright protection and ownership proof of digital multimedia are achieved nowadays by digital watermarking techniques. This paper proposes a text watermarking algorithm for protecting the property rights and ownership judgment of color images. Embedding is achieved by inserting text elements randomly into the color image as noise. The YIQ image processing model is found to be faster than other image processing methods and is therefore adopted for the embedding process. An optional step of encrypting the text watermark before embedding is also suggested (in case required by some applications), where the text is encrypted using any enciphering technique, adding further difficulty for attackers. Experiments showed an embedding speed more than double that of other considered systems (such as the least significant bit and separate color code methods), with a fairly acceptable peak signal-to-noise ratio (PSNR) and low mean square error values for watermarking purposes.
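
As an illustration of the quantities involved, the sketch below shows an RGB-to-YIQ conversion, a toy text-as-noise embedding into the luminance channel, and the MSE/PSNR evaluation mentioned in the abstract. The NTSC conversion matrix, the embedding positions and the perturbation strength are illustrative assumptions, not the authors' exact algorithm.

```python
# Hypothetical sketch: RGB->YIQ conversion and MSE/PSNR evaluation for a
# text-as-noise watermark. Positions and strength are placeholders, not the
# authors' algorithm.
import numpy as np

# Standard NTSC RGB->YIQ matrix (assumed; the abstract does not list it).
RGB2YIQ = np.array([[0.299, 0.587, 0.114],
                    [0.596, -0.274, -0.322],
                    [0.211, -0.523, 0.312]])

def rgb_to_yiq(img):                       # img: float array in [0, 1], shape (H, W, 3)
    return img @ RGB2YIQ.T

def yiq_to_rgb(img):
    return img @ np.linalg.inv(RGB2YIQ).T

def embed_text_as_noise(cover_rgb, text, strength=0.004, seed=0):
    """Scatter the text bits at pseudo-random luminance positions."""
    yiq = rgb_to_yiq(cover_rgb)
    bits = np.unpackbits(np.frombuffer(text.encode("utf-8"), dtype=np.uint8))
    rng = np.random.default_rng(seed)      # the seed plays the role of a private key
    flat = yiq[..., 0].reshape(-1)
    idx = rng.choice(flat.size, size=bits.size, replace=False)
    flat[idx] += strength * (2.0 * bits - 1.0)   # small +/- perturbation as "noise"
    yiq[..., 0] = flat.reshape(yiq.shape[:2])
    return np.clip(yiq_to_rgb(yiq), 0.0, 1.0)

def mse_psnr(original, watermarked, peak=1.0):
    mse = np.mean((original - watermarked) ** 2)
    psnr = 10.0 * np.log10(peak ** 2 / mse)
    return mse, psnr
```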

Keywords: Steganography, watermarking, private keys, time complexity measurements.

2547 Oil Debris Signal Detection Based on Integral Transform and Empirical Mode Decomposition

Authors: Chuan Li, Ming Liang

Abstract:

The oil debris signal generated by the inductive oil debris monitor (ODM) provides useful information for machine condition monitoring but is often spoiled by background noise. To improve the reliability of machine condition monitoring, a high-fidelity signal has to be recovered from the noisy raw data. Considering that the large-amplitude noise components often have higher frequencies than the oil debris signal, an integral transform is proposed to enhance the detectability of the oil debris signal. To cancel out the baseline wander resulting from the integral transform, the empirical mode decomposition (EMD) method is employed to identify the trend components. An optimal reconstruction strategy including both de-trending and de-noising is presented to detect the oil debris signal with less distortion. The proposed approach is applied to detect the oil debris signal in raw data collected from an experimental setup. The result demonstrates that this approach is able to detect the weak oil debris signal with acceptable distortion from noisy raw data.
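
A minimal sketch of the two processing steps described, assuming the PyEMD package ("pip install EMD-signal") as one possible EMD implementation; the number of IMFs treated as trend and the sampling rate are placeholders, not values from the paper.

```python
# Sketch (not the authors' implementation): integral transform followed by
# EMD-based de-trending.
import numpy as np
from PyEMD import EMD

def detect_debris(raw, fs, n_trend_imfs=1):
    """Return a de-noised, de-trended estimate of the oil-debris signal."""
    t = np.arange(raw.size) / fs

    # 1) Integral transform: running integration attenuates the zero-mean,
    #    high-frequency noise more strongly than the slower debris pulses.
    integrated = np.cumsum(raw) / fs

    # 2) EMD de-trending: the residue and the slowest IMF(s) capture the
    #    baseline wander introduced by the integration.
    emd = EMD()
    emd.emd(integrated, t)
    imfs, residue = emd.get_imfs_and_residue()
    trend = residue + imfs[-n_trend_imfs:].sum(axis=0)

    return integrated - trend
```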

Keywords: Integral transform, empirical mode decomposition, oil debris, signal processing, detection.

2546 Integration of Multi-Source Data to Monitor Coral Biodiversity

Authors: K. Jitkue, W. Srisang, C. Yaiprasert, K. Jaroensutasinee, M. Jaroensutasinee

Abstract:

This study aims at using multi-source data to monitor coral biodiversity and coral bleaching. We used the coral reef at Racha Islands, Phuket, as the study area. There were three sources of data: coral diversity data, sensor-based data and satellite data.

Keywords: Coral reefs, Remote sensing, Sea surface temperature, Satellite imagery.

2545 Real-Time Identification of Media in a Laboratory-Scaled Penetrating Process

Authors: Sheng-Hong Pong, Herng-Yu Huang, Yi-Ju Lee, Shih-Hsuan Chiu

Abstract:

In this paper, a neural network technique is applied to classify media in real time while a projectile penetrates through them. A laboratory-scaled penetration setup was built for the experiment. Features used as the network inputs were extracted from the acceleration of the penetrator. 6000 sets of features from a single penetration with known media and status were used to train the neural network. The trained system was tested on 30 different penetration experiments. The system produced an accuracy of 100% on the training data set, and its precision reached 99% on the test data from the 30 tests.
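
A minimal stand-in for the described back-propagation classifier, using scikit-learn's MLPClassifier and assuming the acceleration-derived feature vectors have already been extracted; the layer size and the train/test split are illustrative assumptions.

```python
# Minimal stand-in for the paper's back-propagation classifier.
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

def train_media_classifier(features, labels):
    """features: (n_samples, n_features) from the penetrator acceleration;
    labels: the medium each feature vector was recorded in."""
    X_train, X_test, y_train, y_test = train_test_split(
        features, labels, test_size=0.2, random_state=0, stratify=labels)
    clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
    clf.fit(X_train, y_train)
    return clf, clf.score(X_train, y_train), clf.score(X_test, y_test)
```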

Keywords: back-propagation, identification, neural network, penetration.

2544 Types of Epilepsies and EEG-LORETA Findings about Epilepsy

Authors: Leila Maleki, Ahmad Esmali Kooraneh, Hossein Taghi Derakhshi

Abstract:

Neural activity in the human brain starts from the early stages of prenatal development. This activity, or the signals generated by the brain, is electrical in nature and represents not only the brain function but also the status of the whole body. At present, three methods can record functional and physiological changes within the brain with high temporal resolution of neuronal interactions at the network level: the electroencephalogram (EEG), the magnetoencephalogram (MEG), and functional magnetic resonance imaging (fMRI); each of these has advantages and shortcomings. EEG recording with a large number of electrodes is now feasible in clinical practice. Multichannel EEG recorded from the scalp surface provides very valuable but indirect information about the source distribution, whereas deep electrode measurements yield more reliable information about the source locations. Intracranial recordings and scalp EEG are used with source imaging techniques to determine the locations and strengths of the epileptic activity. As a source localization method, Low Resolution Electromagnetic Tomography (LORETA) is solved for the realistic geometry based on two forward methods, the Boundary Element Method (BEM) and the Finite Difference Method (FDM). In this paper, we review EEG-LORETA findings about epilepsy.

Keywords: Epilepsy, EEG, EEG-LORETA, LORETA analysis.

2543 Free and Open Source Licences, Software Programmers, and the Social Norm of Reciprocity

Authors: Luke McDonagh

Abstract:

Over the past three decades, free and open source software (FOSS) programmers have developed new, innovative and legally binding licences that have in turn enabled the creation of innumerable pieces of everyday software, including Linux, Mozilla Firefox and Open Office. That FOSS has been highly successful in competing with 'closed source software' (e.g. Microsoft Office) is now undeniable, but in noting this success, it is important to examine in detail why this system of FOSS has been so successful. One key reason is the existence of networks or communities of programmers, who are bound together by a key shared social norm of 'reciprocity'. At the same time, these FOSS networks are not unitary – they are highly diverse and there are large divergences of opinion between members regarding which licences are generally preferable: some members favour the flexible ‘free’ or 'no copyleft' licences, such as BSD and MIT, while other members favour the ‘strong open’ or 'strong copyleft' licences such as GPL. This paper argues that without both the existence of the shared norm of reciprocity and the diversity of licences, it is unlikely that the innovative legal framework provided by FOSS would have succeeded to the extent that it has.

Keywords: Open source, software, licences, reciprocity, networks.

2542 Developing New Media Credibility Scale: A Multidimensional Perspective

Authors: Hanaa Farouk Saleh

Abstract:

The main purposes of this study are to develop a scale that reflects emerging theoretical understandings of new media credibility, based on the evolution of credibility studies in Western research; to identify the determinants of credibility in the media and its components by comparing traditional and new media credibility scales; and to build an accumulative scale to test new media credibility. This approach builds on Western research using conceptualizations of media credibility that focus on four principal components: source (journalist), message (article), medium (newspaper, radio, TV, web, etc.), and organization (owner of the medium), and adds user and cultural context as key components to assess new media credibility in particular. This study’s value lies in its contribution to the conceptualization and development of new media credibility through the creation of a theoretical measurement tool. Future studies should explore this scale to test new media credibility, which represents a promising new approach in the efforts to define and measure the credibility of all media types.

Keywords: Credibility scale, media credibility components, new media credibility scale, scale development.

2541 Information Fusion for Identity Verification

Authors: Girija Chetty, Monica Singh

Abstract:

In this paper, we propose a novel approach for ascertaining human identity based on the fusion of profile face and gait biometric cues. The identification approach, based on feature learning in a PCA-LDA subspace and classification using multivariate Bayesian classifiers, allows a significant improvement in recognition accuracy for low-resolution surveillance video scenarios. The experimental evaluation of the proposed identification scheme on a publicly available database [2] showed that the fusion of face and gait cues in the joint PCA-LDA space is a powerful method for capturing the inherent multimodality in walking gait patterns while discriminating the person's identity.
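
A hedged sketch of feature-level fusion in a PCA-LDA subspace with a Gaussian (Bayesian) classifier, using scikit-learn stand-ins; the subspace dimensionality and the choice of GaussianNB as the multivariate Bayesian classifier are assumptions, and the database itself is not reproduced.

```python
# Sketch of feature-level fusion of face and gait cues in a PCA-LDA subspace
# with a Gaussian classifier (scikit-learn stand-ins, not the authors' code).
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.naive_bayes import GaussianNB

def build_fusion_classifier(face_feats, gait_feats, identities, n_pca=50):
    # Feature-level fusion: concatenate the profile-face and gait vectors.
    X = np.hstack([face_feats, gait_feats])
    model = make_pipeline(
        PCA(n_components=n_pca),          # dimensionality reduction
        LinearDiscriminantAnalysis(),     # discriminative projection
        GaussianNB(),                     # multivariate Bayesian classifier
    )
    return model.fit(X, identities)
```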

Keywords: Biometrics, gait recognition, PCA, LDA, Eigenface, Fisherface, Multivariate Gaussian Classifier

2540 Appraisal of Relativistic Effects on GNSS Receiver Positioning

Authors: I. Yakubu, Y. Y. Ziggah, E. A. Gyamera

Abstract:

The Global Navigation Satellite System (GNSS) started with the launch of the United States Department of Defense Global Positioning System (GPS). GNSS has grown over the years to include GLONASS (Russia), Galileo (European Union) and BeiDou (China). Any GNSS architecture consists of three major segments: the space, control and user segments. Errors such as multipath, ionospheric and tropospheric effects, satellite clocks, receiver noise and orbit errors (relativistic effects) have significant effects on GNSS positioning. To obtain centimeter-level accuracy, the impacts of the relative motion of the satellites and the Earth need to be taken into account. This paper discusses the relevance of the theory of relativity as a source of error for GNSS receiver position fixes, based on the available relevant literature. The review reveals that, due to relativity, time dilation, gravitational frequency shift and the Sagnac effect significantly influence GNSS positioning, within an error range of ±2.5 m based on pseudo-range computation.
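
For orientation, the sketch below computes the two clock effects named in the abstract (special-relativistic time dilation and the gravitational frequency shift) for a circular GPS-like orbit, using standard constants rather than figures from the paper; the Sagnac correction is geometry-dependent and omitted here.

```python
# Back-of-the-envelope sketch of the relativistic clock effects for a
# GPS-like circular orbit (standard constants, not figures from the paper).
import math

C = 299_792_458.0         # speed of light, m/s
GM = 3.986004418e14       # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6         # mean Earth radius, m
R_ORBIT = 2.656e7         # GPS orbital radius, m
DAY = 86_400.0            # seconds per day

v = math.sqrt(GM / R_ORBIT)                        # orbital speed, ~3.87 km/s

# Special relativity: the moving satellite clock runs slow.
dilation = -v**2 / (2 * C**2)                      # fractional rate offset

# General relativity: the weaker potential at orbit makes the clock run fast.
grav_shift = GM / C**2 * (1 / R_EARTH - 1 / R_ORBIT)

net = dilation + grav_shift
print(f"time dilation      : {dilation * DAY * 1e6:+.1f} us/day")
print(f"gravitational shift: {grav_shift * DAY * 1e6:+.1f} us/day")
print(f"net clock offset   : {net * DAY * 1e6:+.1f} us/day "
      f"(~{net * DAY * C / 1000:.0f} km/day of pseudo-range if uncorrected)")
```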

Keywords: GNSS, relativistic effects, pseudo-range, accuracy.

2539 RFID Logistic Management with Cold Chain Monitoring – Cold Store Case Study

Authors: Mira Trebar

Abstract:

Logistics processes for perishable food in the supply chain include distribution activities and real-time temperature monitoring to fulfil the cold chain requirements. The paper presents the use of RFID (Radio Frequency Identification) technology as an identification tool for receiving and shipping activities in the cold store. At the same time, the use of RFID data loggers with temperature sensors is presented to observe and store the temperatures, both for analyzing the processes and for keeping history data available for traceability and efficient recall management.

Keywords: Logistics, warehouse, RFID device, cold chain.

2538 Computational Identification of MicroRNAs and their Targets in two Species of Evergreen Spruce Tree (Picea)

Authors: Muhammad Y.K. Barozai, Ifthikhar A. Baloch, M. Din

Abstract:

MicroRNAs (miRNAs) are small, non-coding, regulatory RNAs about 20 to 24 nucleotides long. Their conserved nature among various organisms makes them a good source for the discovery of new miRNAs by a comparative genomics approach. The study identified 21 miRNAs from 20 pre-miRNAs belonging to 16 families (miR156, 157, 158, 164, 165, 168, 169, 172, 319, 390, 393, 394, 395, 400, 472 and 861) in the evergreen spruce tree (Picea). The miRNA families miR157, 158, 164, 165, 168, 169, 319, 390, 393, 394, 400, 472 and 861 are reported for the first time in Picea. All 20 miRNA precursors form stable minimum free energy stem-loop structures, as their orthologues do in Arabidopsis, and the mature miRNAs reside in the stem portion of the stem-loop structure. Sixteen (16) miRNAs are from Picea glauca and five (5) belong to Picea sitchensis. Their targets consist of transcription factors, growth-related, stress-related and hypothetical proteins.

Keywords: BLAST, Comparative Genomics, Micro-RNAs, Spruce

2537 Forecasting Materials Demand from Multi-Source Ordering

Authors: Hui Hsin Huang

Abstract:

Downstream manufacturers order their materials from different upstream suppliers to maintain a certain level of demand. This paper proposes a bivariate model to portray this phenomenon of material demand. We use empirical data to estimate the parameters of the model and evaluate the root mean square deviation (RMSD) of the model calibration. The results show that the model provides a better fit.

Keywords: Farlie-Gumbel-Morgenstern family of bivariate distributions, multi-source ordering, materials demand quantity, recency, ordering time.

2536 Curbing Cybercrime by Application of Internet Users’ Identification System (IUIS) in Nigeria

Authors: K. Alese Boniface, K. Adu Michael

Abstract:

Cybercrime is now becoming a big challenge in Nigeria, alongside traditional crime. The inability to identify perpetrators is one of the reasons for the growing menace. This paper proposes a design for monitoring internet users' activities in order to curb cybercrime. It requires redefining the operations of Internet Service Providers (ISPs), which will now mandate users to be authenticated before accessing the internet. In implementing this work, which can be adapted to a larger scale, a virtual router application is developed and configured to mimic a real router device. A sign-up portal is developed to allow users to register with the ISP. The portal asks for identification information, including bio-data and government-issued identification data such as the National Identity Card number. A unique username and password are chosen by the user to enable access to the internet; these credentials are used to link the user to the Internet Protocol (IP) address of any system he uses on the internet, thereby associating him with any criminal act related to that IP address at that particular time. Questions such as “What happens when another user knows the password and uses it to commit a crime?” and other pertinent issues are addressed.

Keywords: Cybercrime, Sign-up Portal, Internet Service Provider (ISP), Internet Protocol Address (IP address).

2535 Cost Benefit Analysis: Evaluation among the Millimetre Wavebands and SHF Bands of Small Cell 5G Networks

Authors: Emanuel Teixeira, Anderson Ramos, Marisa Lourenço, Fernando J. Velez, Jon M. Peha

Abstract:

This article discusses the cost-benefit analysis of millimetre wavebands (mmWaves) and the Super High Frequency (SHF) band. The decay of the carrier-to-noise-plus-interference ratio with coverage distance is assessed by considering two different path loss models: the two-slope urban micro Line-of-Sight (UMiLoS) model for the SHF band and the modified Friis propagation model for frequencies above 24 GHz. The equivalent supported throughput is estimated at the 5.62, 28, 38, 60 and 73 GHz frequency bands, and the influence of the carrier-to-noise-plus-interference ratio on the radio and network optimization process is explored. Mostly owing to the attenuation behaviour of the two-slope propagation model for the SHF band, the supported throughput at this band is higher than at the millimetre wavebands only for the longest cell lengths. The cost-benefit analysis of these pico-cellular networks was performed for regular cellular topologies, considering the unlicensed spectrum. For the shortest distances, an optimum of the revenue in percentage terms is found at cell lengths of R ≈ 10 m for the millimetre wavebands, while for the longest distances an optimum of the revenue is observed at R ≈ 550 m for the 5.62 GHz band. For the 5.62 GHz band, the profit is slightly lower than for the millimetre wavebands at the shortest values of R, and starts to increase for cell lengths approximately equal to the ratio between the break-point distance and the co-channel reuse factor, reaching a maximum at R approximately equal to 550 m.
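
A sketch of the two propagation models the abstract compares, a modified Friis model for the bands above 24 GHz and a two-slope urban-micro LoS model for the SHF band; the path loss exponents, reference distance and break-point distance below are illustrative assumptions, not the paper's fitted values.

```python
# Sketch of the two path-loss models compared in the abstract.
import math

C = 299_792_458.0  # m/s

def friis_fspl_db(d_m, f_hz):
    """Free-space (Friis) path loss in dB at distance d_m and frequency f_hz."""
    return 20 * math.log10(4 * math.pi * d_m * f_hz / C)

def modified_friis_db(d_m, f_hz, n=2.1, d0=1.0):
    """Friis referenced to d0 with a generic path-loss exponent n (assumed)."""
    return friis_fspl_db(d0, f_hz) + 10 * n * math.log10(d_m / d0)

def two_slope_umi_los_db(d_m, f_hz, d_bp=160.0, n1=2.0, n2=4.0, d0=1.0):
    """Two-slope urban-micro LoS model: exponent n1 up to the break point
    d_bp and n2 beyond it (all parameter values assumed)."""
    if d_m <= d_bp:
        return friis_fspl_db(d0, f_hz) + 10 * n1 * math.log10(d_m / d0)
    return (friis_fspl_db(d0, f_hz) + 10 * n1 * math.log10(d_bp / d0)
            + 10 * n2 * math.log10(d_m / d_bp))

# Example: compare 5.62 GHz (SHF) against 28 GHz (mmWave) at 100 m.
print(two_slope_umi_los_db(100, 5.62e9), modified_friis_db(100, 28e9))
```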

Keywords: 5G, millimetre wavebands, super high-frequency band, SINR, signal-to-interference-plus-noise ratio, cost benefit analysis.

2534 Use of Fuzzy Edge Image in Block Truncation Coding for Image Compression

Authors: Amarunnishad T.M., Govindan V.K., Abraham T. Mathew

Abstract:

An image compression method has been developed using a fuzzy edge image with the basic Block Truncation Coding (BTC) algorithm. The fuzzy edge image was validated against classical edge detectors, using the results of the well-known Canny edge detector as a reference, prior to its use in the proposed method. The bit plane generated by the conventional BTC method is replaced with the fuzzy bit plane generated by a logical OR operation between the fuzzy edge image and the corresponding conventional BTC bit plane. The input image is encoded with the block mean, the standard deviation and the fuzzy bit plane. The proposed method has been tested with 8-bit/pixel test images of size 512×512 and found to be superior, with a better Peak Signal to Noise Ratio (PSNR), compared to the conventional BTC and adaptive bit plane selection BTC (ABTC) methods. The raggedness and jagged appearance, and the ringing artifacts at sharp edges, are greatly reduced in images reconstructed by the proposed method with the fuzzy bit plane.
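
The fuzzy edge detector itself is not reproduced here; the sketch below covers the conventional BTC step the method builds on (per-block mean, standard deviation and bit plane) and shows how an external binary edge image could be OR-ed into the bit plane, as the abstract describes, with the block size as an assumption.

```python
# Minimal sketch of conventional Block Truncation Coding plus the bit-plane
# OR substitution described in the abstract (any binary edge image can be
# OR-ed in; the fuzzy edge detector is not reproduced here).
import numpy as np

def btc_encode_block(block):
    mean, std = block.mean(), block.std()
    bitplane = block >= mean
    return mean, std, bitplane

def btc_decode_block(mean, std, bitplane):
    m = bitplane.size
    q = int(bitplane.sum())
    if q in (0, m):                            # flat block: nothing to restore
        return np.full(bitplane.shape, mean)
    low = mean - std * np.sqrt(q / (m - q))    # level for pixels below the mean
    high = mean + std * np.sqrt((m - q) / q)   # level for pixels at/above the mean
    return np.where(bitplane, high, low)

def btc_image(img, edge_bits=None, bs=4):
    """BTC round-trip; if edge_bits (a binary edge image) is given, each
    block's bit plane is OR-ed with it, as in the fuzzy-edge variant."""
    out = np.zeros_like(img, dtype=float)
    for i in range(0, img.shape[0], bs):
        for j in range(0, img.shape[1], bs):
            block = img[i:i+bs, j:j+bs].astype(float)
            mean, std, bits = btc_encode_block(block)
            if edge_bits is not None:
                bits = bits | edge_bits[i:i+bs, j:j+bs].astype(bool)
            out[i:i+bs, j:j+bs] = btc_decode_block(mean, std, bits)
    return out
```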

Keywords: Image compression, Edge detection, Ground truth image, Peak signal to noise ratio

2533 Simulation of PM10 Source Apportionment at an Urban Site in Southern Taiwan by a Gaussian Trajectory Model

Authors: Chien-Lung Chen, Jeng-Lin Tsai, Feng-Chao Chung, Su-Ching Kuo, Kuo-Hsin Tseng, Pei-Hsuan Kuo, Li-Ying Hsieh, Ying I. Tsai

Abstract:

This study applied the Gaussian trajectory transfer-coefficient model (GTx) to simulate the particulate matter concentrations and source apportionments at the Nanzih Air Quality Monitoring Station in southern Taiwan from November 2007 to February 2008. The correlation coefficient between the observed and calculated daily PM10 concentrations is 0.5, and the absolute bias of the PM10 concentrations is 24%; the simulated PM10 concentrations matched the observed data well. Although the PM10 emission rate was dominated by area sources (58%), the source apportionment results indicated that the primary sources of PM10 at Nanzih Station were point sources (42%), area sources (20%) and the upwind boundary concentration (14%). The most obvious difference in PM10 source apportionment between episode and non-episode days was the upwind boundary concentration, which contributed 20% and 11% of PM10, respectively. The gas-particle conversion of secondary aerosol and long-range transport played crucial roles in the PM10 contribution at a receptor.

Keywords: Back trajectory model, particulate matter, source apportionment.

2532 Influence of Loudness Compression on Hearing with Bone Anchored Hearing Implants

Authors: Anja Kurz, Marc Flynn, Tobias Good, Marco Caversaccio, Martin Kompis

Abstract:

Bone Anchored Hearing Implants (BAHI) are routinely used in patients with conductive or mixed hearing loss, e.g. if conventional air conduction hearing aids cannot be used. New sound processors and new fitting software now allow the adjustment of parameters such as loudness compression ratios or maximum power output separately. Today it is unclear how the choice of these parameters influences aided speech understanding in BAHI users. In this prospective experimental study, the effects of varying the compression ratio and lowering the maximum power output in a BAHI were investigated. Twelve experienced adult subjects with a mixed hearing loss participated in the study. Four different compression ratios (1.0, 1.3, 1.6, 2.0) were tested along with two different maximum power output settings, resulting in a total of eight different programs. Each participant tested each program for two weeks. A blinded Latin square design was used to minimize bias. For each of the eight programs, speech understanding in quiet and in noise was assessed. For speech in quiet, the Freiburg number test and the Freiburg monosyllabic word test at 50, 65, and 80 dB SPL were used. For speech in noise, the Oldenburg sentence test was administered. Speech understanding in quiet and in noise was improved significantly in the aided condition with every program, when compared to the unaided condition. However, no significant differences were found between any of the eight programs. In contrast, on a subjective level there was a significant preference for medium compression ratios of 1.3 to 1.6 and higher maximum power output.

Keywords: Bone Anchored Hearing Implant, Compression, Maximum Power Output, Speech understanding.

2531 Predicting Protein Function using Decision Tree

Authors: Manpreet Singh, Parminder Kaur Wadhwa, Surinder Kaur

Abstract:

The drug discovery process starts with protein identification because proteins are responsible for many functions required for the maintenance of life. Protein identification further requires determination of protein function. The proposed method develops a classifier for human protein function prediction. The model uses a decision tree for the classification process. Protein function is predicted on the basis of sequence-derived features matched to each protein function. The research work includes the development of a tool which determines sequence-derived features by analyzing different parameters. The other sequence-derived features are determined using various web-based tools.
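
A minimal sketch of the classification step, assuming the sequence-derived feature vectors for each protein have already been computed; the tree depth and cross-validation setup are illustrative assumptions.

```python
# Minimal sketch of the decision-tree classification step, assuming the
# sequence-derived feature vectors are already available per protein.
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

def train_function_predictor(features, function_labels):
    tree = DecisionTreeClassifier(max_depth=8, random_state=0)
    scores = cross_val_score(tree, features, function_labels, cv=5)
    tree.fit(features, function_labels)
    return tree, scores.mean()
```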

Keywords: Sequence Derived Features, decision tree.

2530 Characterization of Ajebo Kaolinite Clay for Production of Natural Pozzolan

Authors: Gbenga M. Ayininuola, Olasunkanmi A. Adekitan

Abstract:

Calcined kaolinite clay (CKC) is a pozzolanic material that is currently drawing research attention. This work investigates the conditions for the best performance of a CKC from a kaolinite clay source in Ajebo, Abeokuta (southwest Nigeria), known for its commercial availability. Samples from this source were subjected to X-ray diffractometry (XRD) and differential scanning calorimetry (DSC). XRD shows that kaolinite is the main mineral in the clay source; this mineral is responsible for the pozzolanic behavior of CKC. DSC indicates that the transformation from the clay to CKC occurs between 550 and 750 °C. Using this temperature range, clay samples were milled and different CKC samples were produced in an electric muffle furnace at temperatures of 550, 600, 650, 700, 750 and 800 °C for 1 hour each; this was also repeated for 2 hours. The degree of de-hydroxylation (dtg) and strength activity index (SAI) were determined for each of the CKC samples. The dtg and SAI tests were repeated two more times for each sample and averages were taken. Results showed that peak dtg occurred at the 750 °C for 1 hour calcining combination (94.27%), whereas marginal differences were recorded at some lower temperatures (90.97% for 650 °C for 2 hours; 91.05% for 700 °C for 1 hour; and 92.77% for 700 °C for 2 hours). The optimum SAI was reported at 700 °C for 1 hour (99.05%). Rating SAI as a better parameter than dtg, the 700 °C for 1 hour combination was adopted as the best calcining condition. The paper recommends the adoption of this clay source for pozzolan production under the calcining conditions established in this work.

Keywords: Calcined kaolinite clay, calcination, optimum-calcining conditions, pozzolanity.

2529 Enhancing the Peer-To-Peer Architecture with a Roaming Service and OWL

Authors: Younes Djaghloul, Zizette Boufaida

Abstract:

This paper addresses the problem of building a unified structure to describe a peer-to-peer (P2P) system. Our approach uses the well-known notations in the P2P area and provides a global architecture that separates the platform-specific characteristics from the logical ones. In order to enable the navigation of a peer across platforms, a roaming layer is added. The latter provides the capability to define a unique identification for a peer and assures the mapping between this identification and those used in each platform. The mapping task is handled by a special wrapper. In addition, an ontology is proposed to give a clear presentation of the structure of the P2P system without considering the content and the resources managed by the peer. The ontology is created according to the Semantic Web paradigm using the OWL language, so the structure of the system is considered a web resource.

Keywords: Peer-to-peer, ontology, OWL.

2528 Development Tendency of Energy: A Short Review

Authors: Rehan Jamil, Irfan Jamil, Ming Li, Zhao Jinquan

Abstract:

Energy is an important source for the development of society; it is the basic support of the national economy and the basis for human living. With the development of the economy, the abrupt increase in population and the continuous improvement of living standards, the demand for energy increases continuously. This has caused an intense scramble for energy sources in the world and drawn the attention of countries to the current status and development trends of energy.

Keywords: Energy, Energy Supply Situation, Energy Production & Consumption.

2527 Gait Biometric for Person Re-Identification

Authors: Lavanya Srinivasan

Abstract:

Biometric identification identifies unique features of a person, such as fingerprints, iris, ear and voice, which typically require the subject's permission and physical contact. The gait biometric identifies a person by the unique way they walk, by extracting moving features. The main advantage of the gait biometric is that it can identify a person at a distance, without any physical contact. In this work, the gait biometric is used for person re-identification. A person walking naturally is compared with the same person walking with a bag, coat and case, recorded using long-wave infrared, short-wave infrared, medium-wave infrared and visible cameras. The videos are recorded in rural and urban environments. The pre-processing includes human detection using You Only Look Once (YOLO), background subtraction, silhouette extraction and synthesis of the Gait Entropy Image by averaging the silhouettes. The moving features are extracted from the Gait Entropy Image. The extracted features are reduced in dimensionality by Principal Component Analysis and recognized using different classifiers. The comparative results with the different classifiers show that Linear Discriminant Analysis outperforms the other classifiers, with 95.8% for the visible camera in the rural dataset and 94.8% for long-wave infrared in the urban dataset.
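
A sketch of the Gait Entropy Image computation (per-pixel Shannon entropy of the averaged silhouettes) followed by PCA and LDA, assuming aligned binary silhouettes are already available; detection (YOLO) and background subtraction are treated as done, and the PCA dimensionality is an assumption.

```python
# Sketch of the Gait Entropy Image and the PCA + LDA recognition step,
# assuming aligned binary silhouettes have already been extracted.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def gait_entropy_image(silhouettes, eps=1e-12):
    """silhouettes: (T, H, W) binary frames covering one gait cycle."""
    p = silhouettes.astype(float).mean(axis=0)   # per-pixel average silhouette
    return -(p * np.log2(p + eps) + (1 - p) * np.log2(1 - p + eps))

def fit_recognizer(gait_cycles, identities, n_pca=50):
    """gait_cycles: list of (T, H, W) silhouette sequences, one per sample."""
    X = np.stack([gait_entropy_image(c).ravel() for c in gait_cycles])
    pca = PCA(n_components=n_pca).fit(X)
    lda = LinearDiscriminantAnalysis().fit(pca.transform(X), identities)
    return pca, lda
```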

Keywords: biometric, gait, silhouettes, You Only Look Once

2526 Blind Image Deconvolution by Neural Recursive Function Approximation

Authors: Jiann-Ming Wu, Hsiao-Chang Chen, Chun-Chang Wu, Pei-Hsun Hsu

Abstract:

This work explores blind image deconvolution by recursive function approximation based on supervised learning of neural networks, under the assumption that a degraded image is the linear convolution of an original source image with a linear shift-invariant (LSI) blurring matrix. Supervised learning of radial basis function (RBF) neural networks is employed to construct an embedded recursive function within a blurred image, extract the non-deterministic components of the original source image, and use them to estimate the hyperparameters of a linear image degradation model. Based on the estimated blurring matrix, reconstruction of the original source image from the blurred image is then resolved by an annealed Hopfield neural network. Numerical simulations show that the proposed method is effective for faithful estimation of an unknown blurring matrix and restoration of the original source image.
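
A small sketch of the linear shift-invariant degradation model the method assumes, with a Gaussian kernel standing in for the unknown blurring matrix; the RBF learning and annealed Hopfield reconstruction stages are not reproduced here.

```python
# Sketch of the assumed LSI degradation model: a degraded image is the source
# convolved with a blur kernel (here an assumed Gaussian) plus noise.
import numpy as np
from scipy.signal import convolve2d

def gaussian_kernel(size=9, sigma=2.0):
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
    return k / k.sum()

def degrade(source, kernel, noise_sigma=0.01, seed=0):
    rng = np.random.default_rng(seed)
    blurred = convolve2d(source, kernel, mode="same", boundary="symm")
    return blurred + rng.normal(0.0, noise_sigma, source.shape)
```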

Keywords: Blind image deconvolution, linear shift-invariant (LSI), linear image degradation model, radial basis functions (RBF), recursive function, annealed Hopfield neural networks.

2525 Investigation of Wood Chips as Internal Carbon Source Supporting Denitrification Process in Domestic Wastewater Treatment

Authors: Ruth Lorivi, Jianzheng Li, John J. Ambuchi, Kaiwen Deng

Abstract:

Nitrogen removal from wastewater is accomplished by nitrification and denitrification processes. Successful denitrification requires carbon; therefore, if it is placed after the biochemical oxygen demand (BOD) and nitrification processes, a carbon source has to be re-introduced into the water. To avoid adding a carbon source, denitrification is usually placed before the BOD and nitrification processes; this arrangement, however, involves recycling the nitrified effluent. In this study, wood chips were used as an internal carbon source, which enabled placement of denitrification after the BOD and nitrification processes without effluent recycling. To investigate the efficiency of a wood-packed aerobic-anaerobic baffled reactor for carbon and nutrient removal from domestic wastewater, a three-compartment baffled reactor was presented. Each of the three compartments was packed with 329 g of 1 × 1 cm wood chips acting as an internal carbon source for denitrification. The proposed mode of operation was aerobic-anoxic-anaerobic (OAA) with no effluent recycling. The operating temperature, hydraulic retention time (HRT), dissolved oxygen (DO) and pH were 24 ± 2 °C, 24 h, less than 4 mg/L and 7 ± 1, respectively. The removal efficiencies attained for chemical oxygen demand (COD), ammonia nitrogen (NH4+-N) and total nitrogen (TN) were 99, 87 and 83%, respectively. The TN removal rate was limited by nitrification, as 97% of the ammonia converted into nitrate and nitrite was denitrified. These results show that wood chips are an efficient internal carbon source for wastewater treatment processes.

Keywords: Aerobic-anaerobic baffled reactor, denitrification, nitrification, wood chip.

2524 Subjective Versus Objective Assessment for Magnetic Resonance Images

Authors: Heshalini Rajagopal, Li Sze Chow, Raveendran Paramesran

Abstract:

Magnetic Resonance Imaging (MRI) is one of the most important medical imaging modalities. Subjective assessment of image quality is regarded as the gold standard for evaluating MR images. In this study, a database of 210 MR images, containing ten reference images and 200 distorted images, is presented. The reference images were distorted with four types of distortions: Rician noise, Gaussian white noise, Gaussian blur and DCT compression. The 210 images were assessed by ten subjects. The subjective scores were expressed as Difference Mean Opinion Scores (DMOS). The DMOS values were compared with four full-reference image quality assessment (FR-IQA) metrics. We used the Pearson Linear Correlation Coefficient (PLCC) and the Spearman Rank Order Correlation Coefficient (SROCC) to validate the DMOS values. The high PLCC and SROCC values show that the DMOS values agree closely with the objective FR-IQA metrics.
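
A minimal sketch of the two correlation checks between the subjective DMOS values and an objective FR-IQA metric, using scipy.stats; the numbers in the example are made up, not the study's data.

```python
# Minimal sketch of the PLCC and SROCC checks between DMOS and a metric.
import numpy as np
from scipy.stats import pearsonr, spearmanr

def validate_metric(dmos, metric_scores):
    plcc, _ = pearsonr(dmos, metric_scores)     # Pearson linear correlation
    srocc, _ = spearmanr(dmos, metric_scores)   # Spearman rank-order correlation
    return plcc, srocc

# Example with made-up numbers (not the study's data):
print(validate_metric(np.array([30., 45., 60., 75.]),
                      np.array([0.9, 0.7, 0.5, 0.3])))
```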

Keywords: Magnetic Resonance (MR) images, Difference Mean Opinion Score (DMOS), Full Reference Image Quality Assessment (FR-IQA).

2523 Optimization Model for Identification of Assembly Alternatives of Large-Scale, Make-to-Order Products

Authors: Henrik Prinzhorn, Peter Nyhuis, Johannes Wagner, Peter Burggräf, Torben Schmitz, Christina Reuter

Abstract:

Assembling large-scale products, such as airplanes, locomotives, or wind turbines, involves frequent process interruptions induced by, e.g., delayed material deliveries or missing availability of resources. This has a negative impact on the logistical performance of a producer of XXL products. In industrial practice, in case of interruptions, the identification, evaluation and eventually the selection of an alternative order of assembly activities (an 'assembly alternative') pose an enormous challenge, especially if an optimized logistical decision is to be reached. Therefore, in this paper, an innovative optimization model for the identification of assembly alternatives that addresses the given problem is presented. It describes make-to-order, large-scale product assembly processes as a resource-constrained project scheduling (RCPS) problem that follows the restrictions given in practice. For the evaluation of the assembly alternatives, a cost-based definition of the logistical objectives (delivery reliability, inventory, make-span and workload) is presented.

Keywords: Assembly scheduling, large-scale products, make-to-order, rescheduling, optimization.

2522 A Frequency Grouping Approach for Blind Deconvolution of Fairly Motionless Sources

Authors: E. S. Gower, T. Tsalaile, E. Rakgati, M. O. J. Hawksford

Abstract:

A frequency grouping approach for multi-channel instantaneous blind source separation (I-BSS) of convolutive mixtures is proposed for a lower net residual inter-symbol interference (ISI) and inter-channel interference (ICI) than the conventional short-time Fourier transform (STFT) approach. Starting in the time domain, STFTs are taken with overlapping windows to convert the convolutive mixing problem into frequency domain instantaneous mixing. Mixture samples at the same frequency but from different STFT windows are grouped together forming unique frequency groups. The individual frequency group vectors are input to the I-BSS algorithm of choice, from which the output samples are dispersed back to their respective STFT windows. After applying the inverse STFT, the resulting time domain signals are used to construct the complete source estimates via the weighted overlap-add method (WOLA). The proposed algorithm is tested for source deconvolution given two mixtures, and simulated along with the STFT approach to illustrate its superiority for fairly motionless sources.
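
A structural sketch of the pipeline: overlapping multichannel STFTs, grouping of samples of the same frequency bin across windows, a per-group separation step, and inverse STFT (which scipy implements via weighted overlap-add). The `instantaneous_bss` function is a placeholder for the "I-BSS algorithm of choice" named in the abstract, and the window parameters are assumptions.

```python
# Structural sketch of the frequency-grouping approach. `instantaneous_bss`
# is a placeholder that simply returns its input so the pipeline runs; it
# should be replaced by the I-BSS algorithm of choice.
import numpy as np
from scipy.signal import stft, istft

def instantaneous_bss(group):
    """Placeholder: separate one (n_channels, n_group_samples) complex
    frequency group."""
    return group

def frequency_grouping_bss(x, fs, nperseg=1024, noverlap=768):
    """x: (n_channels, n_samples) time-domain convolutive mixtures."""
    # Overlapping STFTs turn convolutive mixing into per-bin instantaneous mixing.
    f, t, X = stft(x, fs=fs, nperseg=nperseg, noverlap=noverlap)  # (ch, freq, frames)

    Y = np.empty_like(X)
    for k in range(X.shape[1]):
        # One frequency group: the k-th bin taken from every STFT window.
        Y[:, k, :] = instantaneous_bss(X[:, k, :])

    # The inverse STFT reassembles the source estimates via weighted overlap-add.
    _, y = istft(Y, fs=fs, nperseg=nperseg, noverlap=noverlap)
    return y
```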

Keywords: Blind source separation, short-time Fourier transform, weighted overlap-add method.

2521 Smart Surveillance using PDA

Authors: Basem Mustafa Abd. Amer , Syed Abdul Rahman Al-Attas

Abstract:

The aim of this research is to develop a fast and reliable surveillance system based on a personal digital assistant (PDA) device. This extends to the device the moving-object detection capability already available on personal computers. A second aim is to compare the performance of the background subtraction (BS) and temporal frame differencing (TFD) techniques on the PDA platform to determine which is more suitable. In order to reduce noise and to prepare frames for the moving object detection part, each frame is first converted to a gray-scale representation and then smoothed using a Gaussian low-pass filter. Two moving object detection schemes, i.e., BS and TFD, have been analyzed. The background frame is updated by using an Infinite Impulse Response (IIR) filter so that it adapts to varying illumination conditions and geometry settings. In order to reduce the effect of noise pixels resulting from frame differencing, morphological erosion and dilation filters are applied. In this research, it has been found that the TFD technique is more suitable for motion detection than BS in terms of speed; on average, TFD is approximately 170 ms faster than the BS technique.
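
An OpenCV sketch of the two detection schemes compared, including the grayscale conversion, Gaussian smoothing, IIR background update and morphological clean-up described above; the threshold, blur size and IIR coefficient are assumed values, not the paper's settings.

```python
# OpenCV sketch of background subtraction (with IIR-updated background) vs.
# temporal frame differencing. Parameter values are assumptions.
import cv2
import numpy as np

KERNEL = np.ones((3, 3), np.uint8)

def preprocess(frame):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return cv2.GaussianBlur(gray, (5, 5), 0)

def clean(mask):
    mask = cv2.erode(mask, KERNEL)             # remove isolated noise pixels
    return cv2.dilate(mask, KERNEL)            # restore object size

def background_subtraction(frames, alpha=0.05, thresh=25):
    bg = preprocess(frames[0]).astype(np.float32)
    for frame in frames[1:]:
        gray = preprocess(frame)
        diff = cv2.absdiff(gray, cv2.convertScaleAbs(bg))
        _, mask = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)
        cv2.accumulateWeighted(gray.astype(np.float32), bg, alpha)  # IIR update
        yield clean(mask)

def temporal_frame_differencing(frames, thresh=25):
    prev = preprocess(frames[0])
    for frame in frames[1:]:
        gray = preprocess(frame)
        diff = cv2.absdiff(gray, prev)
        _, mask = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)
        prev = gray
        yield clean(mask)
```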

Keywords: Surveillance, PDA, Motion Detection, Image Processing, Background Subtraction.

2520 Software Tools for System Identification and Control using Neural Networks in Process Engineering

Authors: J. Fernandez de Canete, S. Gonzalez-Perez, P. del Saz-Orozco

Abstract:

Neural networks offer an alternative approach for both identification and control of nonlinear processes in process engineering. The lack of software tools for the design of controllers based on neural network models is particularly pronounced in this field. SIMULINK is a widely used graphical code development environment which allows system-level developers to perform rapid prototyping and testing. Such a graphics-based programming environment involves block-based code development and offers a more intuitive approach to modeling and control tasks in a great variety of engineering disciplines. In this paper, a SIMULINK-based Neural Tool has been developed for the analysis and design of multivariable neural-based control systems. This tool has been applied to the control of a high-purity distillation column including nonlinear hydrodynamic effects. The proposed control scheme offers an optimal response to both theoretical and practical challenges posed in the process control task, in particular when both the quality improvement of distillation products and the operational efficiency in economic terms are considered.

Keywords: Distillation, neural networks, software tools, identification, control.

2519 Identification of States and Events for the Static and Dynamic Simulation of Single Electron Tunneling Circuits

Authors: Sharief F. Babiker, Abdelkareem Bedri, Rania Naeem

Abstract:

The implementation of single-electron tunneling (SET) simulators based on the master-equation (ME) formalism requires the efficient and accurate identification of an exhaustive list of active states and related tunnel events. Dynamic simulations also require control of the emerging states and a guarantee of the safe elimination of decaying states. This paper describes algorithms for use in the stationary and dynamic control of the lists of active states and events. The paper presents results obtained using these algorithms with different SET structures.

Keywords: Active state, Coulomb blockade, Master Equation, Single electron devices
