Search results for: Data Estimation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8104


6514 EDULOGIC+ - Knowledge Management through Data Analysis in Education

Authors: Alok Sharma, Dr. Harvinder S. Saini, Raviteja Tiruvury

Abstract:

This paper outlines the application of Knowledge Management (KM) principles in the context of educational institutions. The paper caters to the needs of engineering institutions for imparting quality education by delineating the instruction delivery process in a highly structured, controlled and quantified manner. This is done using the software tool EDULOGIC+. The central idea is based on the engineering education pattern in Indian universities/institutions. The data, contents and results produced over successive years build the necessary ground for managing the related accumulated knowledge. The application of KM is explained using examples of data analysis and knowledge extraction.

Keywords: Education software system, information system, knowledge management.

6513 Quality Estimation of Video Transmitted over an Additive WGN Channel Based on Digital Watermarking and Wavelet Transform

Authors: Mohamed S. El-Mahallawy, Attalah Hashad, Hazem Hassan Ali, Heba Sami Zaky

Abstract:

This paper presents an evaluation of a wavelet-based digital watermarking technique used to estimate the quality of video sequences transmitted over an Additive White Gaussian Noise (AWGN) channel in terms of a classical objective metric, such as the Peak Signal-to-Noise Ratio (PSNR), without the need for the original video. In this method, a watermark is embedded into the Discrete Wavelet Transform (DWT) domain of the original video frames using a quantization method. The degradation of the extracted watermark can be used to estimate the video quality in terms of PSNR with good accuracy. We calculated the PSNR for video frames contaminated with AWGN and compared the values with those estimated using the watermarking-DWT based approach. The calculated and estimated quality measures of the video frames are found to be highly correlated, suggesting that this method can provide a good quality measure for video frames transmitted over an AWGN channel without access to the original video.

Keywords: AWGN, DWT, PSNR, Watermarking, Video Quality.
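
To make the mechanics concrete, below is a minimal Python sketch of quantization-based (QIM) watermark embedding in the DWT domain together with the PSNR metric the method estimates. The Haar wavelet, the LL sub-band, and the step size DELTA are illustrative assumptions, not details fixed by the abstract; in the paper's scheme, the degradation (bit-error rate) of the extracted watermark is then mapped to an estimated PSNR.

```python
# Sketch of QIM watermark embedding in the DWT domain, plus the PSNR metric.
# DELTA, the Haar wavelet, and the LL sub-band are illustrative assumptions.
import numpy as np
import pywt

DELTA = 16.0  # quantization step (assumed)

def embed_watermark(frame, bits):
    """Embed one bit per LL coefficient via quantization (QIM)."""
    LL, (LH, HL, HH) = pywt.dwt2(frame.astype(float), 'haar')
    c = LL.ravel()
    n = min(len(bits), c.size)
    base = DELTA * np.round(c[:n] / DELTA)
    c[:n] = base + np.where(np.asarray(bits[:n]) > 0, DELTA / 4, -DELTA / 4)
    return pywt.idwt2((c.reshape(LL.shape), (LH, HL, HH)), 'haar')

def extract_watermark(frame, n_bits):
    """Recover bits from the sign of the quantization residual."""
    LL, _ = pywt.dwt2(frame.astype(float), 'haar')
    c = LL.ravel()[:n_bits]
    residual = c - DELTA * np.round(c / DELTA)
    return (residual > 0).astype(int)

def psnr(original, received):
    mse = np.mean((original.astype(float) - received.astype(float)) ** 2)
    return 10 * np.log10(255.0 ** 2 / mse)
```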

6512 On the Development of a Homogenized Earthquake Catalogue for Northern Algeria

Authors: I. Grigoratos, R. Monteiro

Abstract:

Regions with a significant percentage of non-seismically designed buildings and reduced urban planning are particularly vulnerable to natural hazards. In this context, the project ‘Improved Tools for Disaster Risk Mitigation in Algeria’ (ITERATE) aims at seismic risk mitigation in Algeria. Past earthquakes in northern Algeria caused extensive damage, e.g., the 1980 El Asnam moment magnitude (Mw) 7.1 and 2003 Boumerdes Mw 6.8 earthquakes. This paper addresses a number of proposed developments and considerations made towards further improvement of the seismic hazard component. Specifically, an updated earthquake catalogue (up to the year 2018) is compiled, and new conversion equations to moment magnitude are introduced. Furthermore, a network-based method for estimating the spatial and temporal distribution of the minimum magnitude of completeness (Mc) is applied. We found relatively large values for Mc, due to the sparse network, and a nonlinear trend between Mw and body-wave (mb) or local (ML) magnitude, which are the most common scales reported in the region. Lastly, the resulting b-value of the Gutenberg-Richter distribution is sensitive to the declustering method.

Keywords: Conversion equation, magnitude of completeness, seismic events, seismic hazard.
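
As a pointer to the standard machinery involved, here is a short Python sketch of the Aki-Utsu maximum-likelihood b-value estimate for the Gutenberg-Richter relation log10 N(>=M) = a - b*M, applied to events above the magnitude of completeness Mc. The paper's network-based Mc estimation and its declustering analysis are separate steps not reproduced here.

```python
# Aki-Utsu maximum-likelihood b-value for events above completeness Mc.
import numpy as np

def b_value(magnitudes, mc, dm=0.0):
    """Aki (1965) ML estimator; dm is the catalogue's magnitude binning
    width (Utsu's half-bin correction), 0 for continuous magnitudes."""
    m = np.asarray(magnitudes)
    m = m[m >= mc]
    return np.log10(np.e) / (m.mean() - (mc - dm / 2.0))

# Synthetic catalogue: above Mc, magnitudes are exponential with rate b*ln(10)
rng = np.random.default_rng(0)
mags = 3.0 + rng.exponential(scale=np.log10(np.e) / 1.0, size=5000)
print(b_value(mags, mc=3.0))  # close to the true b = 1.0
```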

6511 Application of Post-Stack and Pre-Stack Seismic Inversion for Prediction of Hydrocarbon Reservoirs in a Persian Gulf Gas Field

Authors: Nastaran Moosavi, Mohammad Mokhtari

Abstract:

Seismic inversion is a technique that has been in use for years; its main goal is to estimate and model the physical characteristics of rocks and fluids. Generally, it combines seismic and well-log data. Seismic inversion can be carried out through different methods; we have conducted and compared post-stack and pre-stack seismic inversion on real data from one of the fields in the Persian Gulf. Pre-stack seismic inversion can transform seismic data into rock-physics parameters such as P-impedance, S-impedance and density, whereas post-stack seismic inversion can only estimate P-impedance. These parameters can then be used in reservoir identification. Based on the results of inverting the seismic data, a gas reservoir was detected in one of the hydrocarbon fields in the south of Iran (Persian Gulf). By comparing post-stack and pre-stack seismic inversion, it can be concluded that pre-stack seismic inversion provides more reliable and detailed information for the identification and prediction of hydrocarbon reservoirs.

Keywords: Density, P-impedance, S-impedance, post-stack seismic inversion, pre-stack seismic inversion.

6510 A Decision Support System for Predicting Hospitalization of Hemodialysis Patients

Authors: Jinn-Yi Yeh, Tai-Hsi Wu

Abstract:

Hemodialysis patients may suffer from unhealthy care behaviors or the effects of long-term dialysis treatment and ultimately need to be hospitalized. If the hospitalization rate of a hemodialysis center is high, its quality of service is considered low; decreasing the hospitalization rate is therefore a crucial problem in health care. In this study we combined temporal abstraction with data mining techniques to analyze dialysis patients' biochemical data and develop a decision support system. The mined temporal patterns help clinicians predict hospitalization of hemodialysis patients and suggest immediate treatments to avoid hospitalization.

Keywords: Hemodialysis, Temporal abstraction, Data mining, Healthcare quality.
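
As a flavor of the temporal-abstraction step, the sketch below turns a numeric biochemical time series into symbolic state intervals of the kind a mining stage would consume. The albumin variable and its reference range are hypothetical illustrations, not values from the study.

```python
# Basic temporal abstraction: numeric lab values -> symbolic state intervals.
# The albumin reference range used here is a hypothetical illustration.
def abstract_states(values, low, high):
    return ['L' if v < low else 'H' if v > high else 'N' for v in values]

def to_intervals(states):
    """Merge consecutive identical states into (state, start, end) intervals."""
    intervals, start = [], 0
    for i in range(1, len(states) + 1):
        if i == len(states) or states[i] != states[start]:
            intervals.append((states[start], start, i - 1))
            start = i
    return intervals

albumin = [3.2, 3.4, 3.9, 4.0, 4.1, 3.3]   # monthly measurements (g/dL)
print(to_intervals(abstract_states(albumin, low=3.5, high=5.0)))
# [('L', 0, 1), ('N', 2, 4), ('L', 5, 5)]
```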

6509 A Human Activity Recognition System Based On Sensory Data Related to Object Usage

Authors: M. Abdullah-Al-Wadud

Abstract:

Sensor-based activity recognition systems usually account only for which sensors have been activated while an activity is performed; the system combines the conditional probabilities of those sensors to represent different activities and makes its decision on that basis. However, information about the sensors that are not activated may also be of great help in deciding which activity has been performed. This paper proposes an approach in which sensory data related to both the usage and non-usage of objects are utilized to classify activities. Experimental results show the promising performance of the proposed method.

Keywords: Naïve Bayesian-based classification, Activity recognition, sensor data, object-usage model.
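
A minimal hand-rolled Bernoulli naive Bayes sketch makes the abstract's point concrete: in the log-likelihood below, the zero entries (objects not used) contribute through the (1 - x) * log(1 - theta) term just as the activated ones do. The object and activity names are hypothetical.

```python
# Bernoulli naive Bayes over object-usage vectors: both 1s (used) and
# 0s (not used) enter the log-likelihood, which is the abstract's point.
import numpy as np

def fit(X, y, alpha=1.0):
    classes = np.unique(y)
    priors = np.array([(y == c).mean() for c in classes])
    # P(object used | activity), Laplace-smoothed
    theta = np.array([(X[y == c].sum(0) + alpha) / ((y == c).sum() + 2 * alpha)
                      for c in classes])
    return classes, np.log(priors), theta

def predict(x, classes, log_priors, theta):
    ll = log_priors + (x * np.log(theta) + (1 - x) * np.log(1 - theta)).sum(1)
    return classes[np.argmax(ll)]

# Toy data: rows = episodes, columns = kettle, cup, toothbrush (hypothetical)
X = np.array([[1, 1, 0], [1, 1, 0], [0, 0, 1], [0, 0, 1]])
y = np.array(['make_tea', 'make_tea', 'brush_teeth', 'brush_teeth'])
model = fit(X, y)
print(predict(np.array([1, 0, 0]), *model))  # -> 'make_tea'
```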

6508 Dominating Set Algorithm and Trust Evaluation Scheme for Secured Cluster Formation and Data Transferring

Authors: Y. Harold Robinson, M. Rajaram, E. Golden Julie, S. Balaji

Abstract:

This paper describes an efficient way of choosing cluster heads based on a dominating set algorithm in a wireless sensor network (WSN). The algorithm overcomes energy deterioration problems through this cluster-head selection process. Clustering algorithms such as LEACH, EEHC and HEED enhance scalability in WSNs. The dominating set algorithm keeps the first node alive longer than the previously used protocols. As the dominating set of cluster heads is directly connected to each node, network energy is saved by eliminating intermediate nodes in the WSN. Security and trust are pivotal in network messaging. Each cluster head is secured with a unique key, and a member node can connect to the cluster head only if it is secured as well. The secured trust model provides security for data transmission in the dominated-set network with a group key. The concept can be extended by adding a mobile sink, for each cluster or for a number of clusters, to transmit data or messages between cluster heads and the base station. Data security is high and data loss can be prevented. The simulation demonstrates cluster-head selection by the dominating set algorithm and trust evaluation using DSTE.

Keywords: Wireless Sensor Networks, LEACH, EEHC, HEED, DSTE.

6507 Weighted k-Nearest-Neighbor Techniques for High Throughput Screening Data

Authors: K. Kozak, M. Kozak, K. Stapor

Abstract:

The k-nearest neighbors (kNN) algorithm is a simple but effective method of classification. In this paper we present an extended version of this technique for chemical compounds used in High Throughput Screening, in which the distances of the nearest neighbors are taken into account. Our algorithm uses kernel weight functions as guidance for the process of defining activity in screening data. The proposed kernel weight function aims to combine properties of the graphical structure and the molecular descriptors of screening compounds. We apply the modified kNN method to several experimental datasets from biological screens. The experimental results confirm the effectiveness of the proposed method.

Keywords: biological screening, kernel methods, KNN, QSAR
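
A minimal sketch of distance-weighted kNN with a Gaussian kernel, the general idea the paper builds on; the plain Euclidean distance and the bandwidth h are placeholders for the paper's combined structural/descriptor similarity.

```python
# Distance-weighted kNN: closer neighbours cast larger votes via a
# Gaussian kernel. Euclidean distance and bandwidth h are illustrative.
import numpy as np

def weighted_knn(X_train, y_train, x, k=5, h=1.0):
    d = np.linalg.norm(X_train - x, axis=1)
    idx = np.argsort(d)[:k]
    w = np.exp(-(d[idx] / h) ** 2)          # Gaussian kernel weights
    votes = {}
    for label, weight in zip(y_train[idx], w):
        votes[label] = votes.get(label, 0.0) + weight
    return max(votes, key=votes.get)

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 8))
y = (X[:, 0] > 0).astype(int)               # toy activity labels
print(weighted_knn(X, y, X[0], k=7))
```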

6506 The Use of Real Measurements and GPS Data for Noise Mapping of Riyadh City

Authors: M. A. Foda, K. A. Alsaif, M. M. ElMadany, A.S. Aguib

Abstract:

In this paper, the noise maps for the area encircled by the Second Ring Road in Riyadh city are developed based on real measured data. Sound level meters, GPS receivers to determine measurement positions, a database program to manage the measured data, and a program to develop the maps are used. A baseline noise level has been established at each short-term site so that subsequent monitoring may be conducted to describe changes in Riyadh's noise environment. Short-term sites are used to show typical daytime and nighttime noise levels at specific locations by short-duration grab sampling.

Keywords: Noise mapping, Noise measurements, GPS, noise level.

6505 Removing Ocular Artifacts from EEG Signals using Adaptive Filtering and ARMAX Modeling

Authors: Parisa Shooshtari, Gelareh Mohamadi, Behnam Molaee Ardekani, Mohammad Bagher Shamsollahi

Abstract:

The EEG signal is one of the oldest measures of brain activity and has been used widely for clinical diagnosis and biomedical research. However, EEG signals are highly contaminated with various artifacts, both from the subject and from equipment interference. Among these, ocular noise is the most important. Since many applications, such as BCI, require online and real-time processing of the EEG signal, it is ideal if the removal of artifacts is performed online. Recently, some methods for online ocular artifact removal have been proposed. One of these is ARMAX modeling of the EEG signal, which assumes that the recorded EEG is a combination of EOG artifacts and the background EEG; the background EEG is then estimated via estimation of the ARMAX parameters. The other recently proposed method is based on adaptive filtering, which uses the EOG signal as the reference input and subtracts EOG artifacts from the recorded EEG. In this paper we investigate the efficiency of each method for removing EOG artifacts and compare the two. We conclude that the adaptive filtering method gives better results than ARMAX modeling.

Keywords: Ocular Artifacts, EEG, Adaptive Filtering, ARMAX
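
A minimal sketch of the adaptive-filtering variant, assuming a standard LMS update: the recorded EOG channel serves as the reference input, and the error signal is the cleaned EEG. The filter order and step size mu are illustrative choices, not the paper's settings.

```python
# LMS adaptive filter: the EOG channel is the reference input; the
# error signal (EEG minus the filter's EOG estimate) is the clean EEG.
import numpy as np

def lms_eog_removal(eeg, eog, order=5, mu=0.01):
    w = np.zeros(order)
    clean = np.zeros(len(eeg))
    for n in range(order, len(eeg)):
        x = eog[n - order:n][::-1]       # reference tap vector
        e = eeg[n] - w @ x               # error = estimated background EEG
        w += 2 * mu * e * x              # LMS weight update
        clean[n] = e
    return clean
```

Stability of the update depends on mu relative to the reference signal power, so mu would normally be tuned to the recording.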

6504 Construction of Space-Filling Designs for Three Input Variables Computer Experiments

Authors: Kazeem A. Osuolale, Waheed B. Yahya, Babatunde L. Adeleke

Abstract:

Latin hypercube designs (LHDs) are among the space-filling designs most often applied in computer experiments in the literature. An LHD can be generated randomly, but a randomly chosen LHD may have bad properties and thus perform poorly in estimation and prediction. There is a connection between Latin squares and orthogonal arrays (OAs). A Latin square of order s is an arrangement of s symbols in s rows and s columns such that every symbol occurs once in each row and once in each column; such a square exists for every positive integer s. In this paper, a computer program was written to construct orthogonal array-based Latin hypercube designs (OA-LHDs). Orthogonal arrays were constructed from a Latin square of order s, and the OAs constructed were afterwards used to construct the desired Latin hypercube designs for three input variables for use in computer experiments. The LHDs constructed have better space-filling properties and can be used in computer experiments that involve only three input factors. The MATLAB 2012a package (www.mathworks.com/) was used to develop the program that constructs the designs.

Keywords: Computer Experiments, Latin Squares, Latin Hypercube Designs, Orthogonal Array, Space-filling Designs.
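
For reference, a plain (non-OA) random Latin hypercube takes only a few lines: each column is a jittered random permutation, so every 1/n slice of every axis is sampled exactly once. The paper's OA-based construction additionally arranges these permutations according to an orthogonal array to improve space-filling, which is not reproduced in this sketch.

```python
# Random Latin hypercube: n runs in d dimensions, each axis stratified.
import numpy as np

def latin_hypercube(n, d, rng=np.random.default_rng()):
    samples = np.empty((n, d))
    for j in range(d):
        perm = rng.permutation(n)                    # stratum per run
        samples[:, j] = (perm + rng.random(n)) / n   # jitter within stratum
    return samples

print(latin_hypercube(8, 3))  # 8 runs, three input variables in [0, 1)^3
```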

6503 Transceiver for Differential Wave Pipe-Lined Serial Interconnect with Surfing

Authors: Bhaskar M., Venkataramani B.

Abstract:

In the literature, the surfing technique has been proposed for single-ended wave-pipelined serial interconnects to increase the data transfer rate. In this paper, a novel surfing technique is proposed for differential wave-pipelined serial interconnects, which uses a 'controllable inverter pair' for surfing. To evaluate the efficiency of this technique, a transceiver with a transmitter, a receiver and a delay-locked loop (DLL), along with 40 mm metal-4 interconnects using the proposed surfing technique, is implemented in UMC 180 nm technology, and its performance is studied through post-layout simulations. From the study, it is observed that the proposed scheme permits a 1.875 times higher data transmission rate than the single-ended scheme, whose maximum data transfer rate is 1.33 GB/s. The proposed scheme is able to receive the correct data even with stuck-at faults in the complementary line.

Keywords: Controllable inverter pair, differential interconnect, serial link, surfing, wave pipelining.

6502 A Novel Digital Watermarking Technique Based on ISB (Intermediate Significant Bit)

Authors: Akram M. Zeki, Azizah A. Manaf

Abstract:

The Least Significant Bit (LSB) technique is the earliest watermarking technique and also the most simple, direct and common one. It essentially involves embedding the watermark by replacing the least significant bit of the image data with a bit of the watermark data. The disadvantage of LSB is that it is not robust against attacks. In this study the intermediate significant bit (ISB) is used in order to improve the robustness of the watermarking system. The aim of this model is to replace the watermarked image pixels with new pixels that can protect the watermark data against attacks, while keeping the new pixels very close to the original pixels in order to preserve the quality of the watermarked image. The technique is based on testing the value of the watermark pixel according to the range of each bit-plane.

Keywords: Watermarking, LSB, ISB, Robustness.
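
A sketch of plain bit-plane substitution: with k = 0 this is the classical LSB method, and moving k to an intermediate plane is the starting point of the ISB idea. The paper's additional step of adjusting each pixel within the bit-plane's range to stay close to the original is omitted here.

```python
# Bit-plane embedding: replace bit k of each pixel with a watermark bit.
# k = 0 is classical LSB; an intermediate k is the ISB starting point.
import numpy as np

def embed_bitplane(pixels, bits, k=0):
    p = pixels.astype(np.uint8).copy()
    p &= ~np.uint8(1 << k)                 # clear bit k
    p |= (bits.astype(np.uint8) << k)      # set it to the watermark bit
    return p

def extract_bitplane(pixels, k=0):
    return (pixels >> k) & 1

img = np.array([200, 201, 202, 203], dtype=np.uint8)
wm = np.array([1, 0, 1, 1])
print(extract_bitplane(embed_bitplane(img, wm, k=3), k=3))  # [1 0 1 1]
```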

6501 Using Data Mining Technique for Scholarship Disbursement

Authors: J. K. Alhassan, S. A. Lawal

Abstract:

This work concerns decision tree-based classification for the disbursement of scholarships. A tree-based data mining classification technique is used to determine the generic rules for disbursing scholarships. Based on the rules defined from the tree, the system determines the class (status) to which an applicant belongs: Granted or Not Granted. Applicants who fall into the Granted class successfully obtain a scholarship, while those in the Not Granted class are unsuccessful in the scheme. An algorithm that classifies applicants based on the rules from the tree-based classification was also developed. Tree-based classification is adopted because of its efficiency, effectiveness and ease of comprehension. The system was tested with data from the National Information Technology Development Agency (NITDA), Abuja, a parastatal of the Federal Ministry of Communication Technology mandated to develop and regulate information technology in Nigeria. The system was found to work according to the specification. It is therefore recommended for all scholarship disbursement organizations.

Keywords: Decision tree, classification, data mining, scholarship.
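
A minimal sketch of the approach with scikit-learn; the applicant features, values and depth limit are hypothetical, since the paper's rules were learned from NITDA's own records.

```python
# Decision-tree classification sketch for Granted / Not Granted decisions.
# Features and thresholds are hypothetical illustrations.
from sklearn.tree import DecisionTreeClassifier, export_text

# features: [CGPA, household_income (k), year_of_study]  (hypothetical)
X = [[4.5, 50, 2], [3.9, 120, 3], [4.8, 30, 1], [2.9, 40, 2],
     [4.2, 80, 4], [3.1, 200, 2], [4.9, 25, 3], [3.4, 60, 1]]
y = ['Granted', 'Not Granted', 'Granted', 'Not Granted',
     'Granted', 'Not Granted', 'Granted', 'Not Granted']

tree = DecisionTreeClassifier(max_depth=2).fit(X, y)
print(export_text(tree, feature_names=['cgpa', 'income', 'year']))
print(tree.predict([[4.0, 45, 2]]))   # classify a new applicant
```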

6500 Finite Element Application to Estimate In-Service Material Properties Using Miniature Specimen

Authors: G. Partheepan, D.K. Sehgal, R.K. Pandey

Abstract:

This paper presents a method for determining uniaxial tensile properties such as Young's modulus, yield strength and the flow behaviour of a material in a virtually non-destructive manner. To achieve this, a new dumb-bell shaped miniature specimen has been designed. This avoids the removal of large material samples from the in-service component for the evaluation of current material properties. The proposed miniature specimen has an advantage in finite element modelling with respect to computational time and memory space. Test fixtures have been developed to enable tension tests on the miniature specimen in a testing machine. The studies have been conducted on a chromium (H11) steel and an aluminum alloy (AR66). The output from the miniature test, viz. the load-elongation diagram, is obtained, and a finite element simulation of the test is carried out using a 2D plane stress analysis. The results are compared with the experimental results. It is observed that the results from the finite element simulation agree well with the miniature test results. The approach has the potential to predict the mechanical properties of materials, which could be used in remaining-life estimation of in-service structures.

Keywords: ABAQUS, finite element, miniature test, tensile properties

6499 Understanding Cruise Passengers’ On-board Experience throughout the Customer Decision Journey

Authors: Sabina Akter, Osiris Valdez Banda, Pentti Kujala, Jani Romanoff

Abstract:

This paper examines the relationship between on-board environmental factors and overall customer satisfaction in the context of the cruise on-board experience. The on-board environmental factors considered are ambient, layout/design, social, product/service and on-board enjoyment factors. The study presents a data-driven framework and model for the on-board cruise experience. The data were collected from 893 respondents through a self-administered online questionnaire about their cruise experience. This study reveals the cruise passengers' on-board experience through the customer decision journey based on the publicly available data. Pearson correlation and regression analysis were applied, and the results show a positive and significant relationship between the environmental factors and the on-board experience. These data help in understanding the cruise passengers' on-board experience, which will be used in the ultimate decision-making process in cruise ship design.

Keywords: Cruise behavior, on-board environmental factors, on-board experience, user or customer satisfaction.

6498 The Communication Library DIALOG for iFDAQ of the COMPASS Experiment

Authors: Y. Bai, M. Bodlak, V. Frolov, S. Huber, V. Jary, I. Konorov, D. Levit, J. Novy, D. Steffen, O. Subrt, M. Virius

Abstract:

Modern experiments in high energy physics impose great demands on the reliability, efficiency and data rate of Data Acquisition Systems (DAQ). This contribution focuses on the development and deployment of the new communication library DIALOG for the intelligent, FPGA-based Data Acquisition System (iFDAQ) of the COMPASS experiment at CERN. The iFDAQ, utilizing a hardware event builder, is designed to read out data at the maximum rate of the experiment. The DIALOG library is a communication system for both distributed and mixed environments; it provides a network-transparent inter-process communication layer. Using the high-performance and modern C++ framework Qt and its Qt Network API, the DIALOG library presents an alternative to the previously used DIM library. The DIALOG library was fully incorporated into all processes in the iFDAQ during the 2016 run. From the software point of view, it may be considered a significant improvement of the iFDAQ in comparison with the previous run. To extend the debugging possibilities, online monitoring of communication among processes via a DIALOG GUI is a desirable feature. In the paper, we present the DIALOG library from several perspectives and discuss it in detail. Moreover, an efficiency measurement and a comparison with the DIM library with respect to the iFDAQ requirements are provided.

Keywords: Data acquisition system, DIALOG library, DIM library, FPGA, Qt framework, TCP/IP.

6497 Linking OpenCourseWares and Open Education Resources: Creating an Effective Search and Recommendation System

Authors: Brett E. Shelton, Joel Duffin, Yuxuan Wang, Justin Ball

Abstract:

With a growing number of digital libraries and other open education repositories being made available throughout the world, effective search and retrieval tools are necessary to access the desired materials with an effectiveness that surpasses traditional, all-inclusive search engines. This paper discusses the design and use of Folksemantic, a platform that integrates OpenCourseWare search, Open Educational Resource recommendations, and social network functionality into a single open source project. The paper describes how the system was originally envisioned, its goals for users, and data that provide insight into how it is actually being used. Data sources include website click-through data, query logs, web server log files and user account data. Based on a descriptive analysis of its current use, modifications to the platform's design are recommended to better address the goals of the system, along with recommendations for additional phases of research.

Keywords: Digital libraries, open education, recommendation system, social networks

6496 First Studies of the Influence of Single Gene Perturbations on the Inference of Genetic Networks

Authors: Frank Emmert-Streib, Matthias Dehmer

Abstract:

Inferring the network structure from time series data is a hard problem, especially if the time series are short and noisy. DNA microarrays are a technology that allows the mRNA concentrations of thousands of genes to be monitored simultaneously and that produces data with exactly these characteristics. In this study we investigate the influence of the experimental design on the quality of the result. More precisely, we investigate the influence of two different types of random single-gene perturbations on the inference of genetic networks from time series data. To obtain an objective quality measure for this influence, we simulate gene expression values with a biologically plausible model of a known network structure. Within this framework we study the influence of single-gene knock-outs, as opposed to linearly controlled expression of single genes, on the quality of the inferred network structure.

Keywords: Dynamic Bayesian networks, microarray data, structure learning, Markov chain Monte Carlo.

6495 Performance Evaluation of a Minimum Mean Square Error-Based Physical Sidelink Shared Channel Receiver under Fading Channel

Authors: Yang Fu, Jaime Rodrigo Navarro, Jose F. Monserrat, Faiza Bouchmal, Oscar Carrasco Quilis

Abstract:

Cellular Vehicle-to-Everything (C-V2X) is considered a promising solution for future autonomous driving. From Release 16 to Release 17, the Third Generation Partnership Project (3GPP) has introduced the definitions and services for 5G New Radio (NR) V2X. Since establishing a simulator for C-V2X communications is an essential preliminary step towards reliable and stable communication links, this paper proposes a complete framework for a link-level simulator based on the 3GPP specifications for the Physical Sidelink Shared Channel (PSSCH) of the 5G NR Physical Layer (PHY). Within this framework, several receiver-side algorithms, i.e., sliding-window channel estimation and Minimum Mean Square Error (MMSE)-based equalization, are developed. Finally, the performance of the developed PSSCH receiver is validated through extensive simulations under different assumptions.

Keywords: C-V2X, 5G NR, link-level simulator, PSSCH, MMSE equalization.
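
The MMSE equalization step named in the abstract reduces, for a channel estimate H and noise variance sigma^2, to applying W = H^H (H H^H + sigma^2 I)^(-1); a small NumPy sketch under these standard assumptions:

```python
# MMSE equalization: balances noise amplification against residual
# interference (cf. zero-forcing). Channel and symbols are synthetic.
import numpy as np

def mmse_equalize(y, H, sigma2):
    """y: received vector, H: channel matrix, sigma2: noise variance."""
    Hh = H.conj().T
    W = Hh @ np.linalg.inv(H @ Hh + sigma2 * np.eye(H.shape[0]))
    return W @ y

rng = np.random.default_rng(0)
H = (rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))) / np.sqrt(2)
x = np.array([1, -1, 1, 1], dtype=complex)        # transmitted symbols
y = H @ x + 0.05 * rng.normal(size=4)             # faded + noisy
print(np.sign(mmse_equalize(y, H, sigma2=0.05**2).real))  # hard decisions
```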

6494 Welding Process Selection for Storage Tank by Integrated Data Envelopment Analysis and Fuzzy Credibility Constrained Programming Approach

Authors: Rahmad Wisnu Wardana, Eakachai Warinsiriruk, Sutep Joy-A-Ka

Abstract:

Selecting the most suitable welding process usually depends on experience or on common practice in similar companies. However, this approach generally ignores many criteria that can affect the selection of a suitable welding process. Knowledge automation through knowledge-based systems can therefore significantly improve the decision-making process. This research proposes an integrated data envelopment analysis (DEA) and fuzzy credibility constrained programming approach for identifying the best welding process for stainless steel storage tanks in the food and beverage industry. The proposed approach uses fuzzy concepts and a credibility measure to deal with uncertain data from experts' judgments. Furthermore, 12 parameters are used to determine the most appropriate among six competing welding processes.

Keywords: Welding process selection, data envelopment analysis, fuzzy credibility constrained programming, storage tank.

6493 The Application of Queuing Theory in Multi-Stage Production Lines

Authors: Hani Shafeek, Muhammed Marsudi

Abstract:

The purpose of this work is to examine a multi-product, multi-stage battery production line and to improve the performance of the assembly line by determining the efficiency of each workstation. Data were collected from every workstation: the throughput rate, the number of operators, and the number of parts that arrive and leave during processing. For the number of parts arriving and leaving, at least ten samples were collected so that the data could be analyzed with a Chi-Squared goodness-of-fit test and queuing theory. Measures from this model served as a comparison with the standard data available in the company; the task time values were validated by comparing them with the task time values in the company database. Several performance factors for the multi-product, multi-stage battery production line are reported, along with the efficiency of each workstation. The total production time for each part can be determined by adding the task times across workstations. Based on the analysis, improvements should be made to reduce queuing time and increase efficiency; one possible action is to increase the number of operators at manually operated workstations.

Keywords: Production line, manufacturing, performance measurement, queuing theory.
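
For a single workstation modelled as an M/M/1 queue, the quantities discussed here follow from a few standard formulas, sketched below with illustrative arrival and service rates (not the company's data).

```python
# Steady-state M/M/1 workstation metrics: utilization rho = lambda/mu,
# mean queue length Lq, and mean waiting time Wq via Little's law.
def mm1_metrics(lam, mu):
    rho = lam / mu                       # utilization (requires rho < 1)
    Lq = rho**2 / (1 - rho)              # mean number waiting in queue
    Wq = Lq / lam                        # mean waiting time in queue
    return rho, Lq, Wq

# e.g. a workstation receiving 9 parts/hour and serving 12 parts/hour
rho, Lq, Wq = mm1_metrics(lam=9, mu=12)
print(f"utilization={rho:.2f}, Lq={Lq:.2f} parts, Wq={Wq*60:.1f} min")
# utilization=0.75, Lq=2.25 parts, Wq=15.0 min
```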

6492 Novel Rao-Blackwellized Particle Filter for Mobile Robot SLAM Using Monocular Vision

Authors: Maohai Li, Bingrong Hong, Zesu Cai, Ronghua Luo

Abstract:

This paper presents a novel Rao-Blackwellized particle filter (RBPF) for mobile robot simultaneous localization and mapping (SLAM) using monocular vision. The particle filter is combined with an unscented Kalman filter (UKF) to extend the path posterior by sampling new poses that integrate the current observation, which drastically reduces the uncertainty about the robot pose. Landmark position estimation and updating are also implemented through the UKF. Furthermore, the number of resampling steps is determined adaptively, which greatly reduces the particle depletion problem, and evolution strategies (ES) are introduced to avoid particle impoverishment. The 3D natural point landmarks are structured from matched Scale Invariant Feature Transform (SIFT) feature pairs, and the matching of multi-dimensional SIFT features is implemented with a KD-tree at a time cost of O(log2 N). Experimental results on a real robot in our indoor environment show the advantages of our methods over previous approaches.

Keywords: Mobile robot, simultaneous localization and mapping, Rao-Blackwellised particle filter, evolution strategies, scale invariant feature transform.
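
The adaptive-resampling idea can be sketched in a few lines: resample only when the effective sample size of the normalized particle weights drops below a threshold (N/2 is a common choice; the paper's exact criterion is not specified in the abstract).

```python
# Adaptive resampling in a particle filter: trigger systematic
# resampling only when N_eff = 1 / sum(w_i^2) falls below a threshold.
import numpy as np

def systematic_resample(weights, rng=np.random.default_rng()):
    n = len(weights)
    positions = (rng.random() + np.arange(n)) / n
    return np.searchsorted(np.cumsum(weights), positions)

def maybe_resample(weights, threshold_ratio=0.5):
    w = weights / weights.sum()
    n_eff = 1.0 / np.sum(w ** 2)         # effective sample size
    if n_eff < threshold_ratio * len(w):
        return systematic_resample(w)    # indices of particles to keep
    return np.arange(len(w))             # weights still healthy: keep all

print(maybe_resample(np.array([0.7, 0.1, 0.1, 0.1])))  # degenerate -> resample
```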

6491 Robust Digital Cinema Watermarking

Authors: Sadi Vural, Hiromi Tomii, Hironori Yamauchi

Abstract:

With the advent of digital cinema and digital broadcasting, copyright protection of video data has become one of the most important issues. We present a novel method of watermarking for video image data based on hardware and digital wavelet transform techniques and name it "traceable watermarking", because the watermarked data are constructed before the transmission process and traced after being received by an authorized user. In our method, we embed the watermark into the lowest part of each image frame of the decoded video using a hardware LSI. Digital cinema is an important application for traceable watermarking, since digital cinema systems make use of watermarking technology during content encoding, encryption, transmission, decoding and all the intermediate processes. The watermark is embedded into randomly selected movie frames using hash functions. The embedded watermark information can be extracted from the decoded video data without any need to access the original movie data. Our experimental results show that the proposed traceable watermarking method for digital cinema systems is much better than conventional watermarking techniques in terms of robustness, image quality, speed, simplicity and structural robustness.

Keywords: Decoder, Digital content, JPEG2000 Frame, System-On-Chip, traceable watermark, Hash Function, CRC-32.

6490 Statistical Analysis of Interferon-γ for the Effectiveness of an Anti-Tuberculous Treatment

Authors: Shishen Xie, Yingda L. Xie

Abstract:

Tuberculosis (TB) is a potentially serious infectious disease that remains a health concern. The Interferon Gamma Release Assay (IGRA) is a blood test used to determine whether an individual is positive or negative for tuberculosis. This study applies statistical analysis to the clinical interferon-gamma levels of seventy-three subjects diagnosed with pulmonary TB and undergoing anti-tuberculous treatment. Data analysis is performed to determine whether there is a significant decline in interferon-gamma levels over a period of six months, and to infer whether the anti-tuberculous treatment is effective.

Keywords: Data analysis, interferon gamma release assay, statistical methods, tuberculosis infection.
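
A minimal sketch of the kind of test involved, assuming a paired before/after comparison (the abstract does not specify the study's exact procedure); the numbers below are synthetic.

```python
# Paired t-test sketch for a before/after decline in interferon-gamma
# levels over six months of treatment; the data are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
baseline = rng.normal(10.0, 2.0, size=73)           # IU/mL at month 0
month6 = baseline - rng.normal(1.5, 1.0, size=73)   # simulated decline

t_stat, p_value = stats.ttest_rel(baseline, month6)
print(f"t = {t_stat:.2f}, p = {p_value:.3g}")  # small p => significant decline
```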

6489 Fault Detection of Drinking Water Treatment Process Using PCA and Hotelling's T2 Chart

Authors: Joval P George, Dr. Zheng Chen, Philip Shaw

Abstract:

This paper deals with the application of Principal Component Analysis (PCA) and Hotelling's T2 chart, using data collected from a drinking water treatment process. PCA is applied primarily for dimensional reduction of the collected data, while the Hotelling's T2 control chart is used for fault detection in the process. The data were taken from a United Utilities multistage water treatment works and downloaded from an Integrated Program Management (IPM) dashboard system. The analysis of the results shows that Multivariate Statistical Process Control (MSPC) techniques such as PCA, and control charts such as Hotelling's T2, can be effectively applied for the early fault detection of continuous multivariable processes such as drinking water treatment. The software package SIMCA-P was used to develop the MSPC models and the Hotelling's T2 chart from the collected data.

Keywords: Principal component analysis, Hotelling's T2 chart, multivariate statistical process control, drinking water treatment.
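
A compact NumPy/SciPy sketch of the monitoring statistic: fit PCA on in-control data, score each sample on the retained components, and compare T2 = sum(t_i^2 / l_i) against the F-distribution control limit for new observations. The data and the choice k = 2 are illustrative.

```python
# Hotelling's T^2 monitoring from a PCA model on synthetic data.
import numpy as np
from scipy import stats

def fit_pca(X, k):
    mu = X.mean(0)
    _, s, Vt = np.linalg.svd(X - mu, full_matrices=False)
    eigvals = (s ** 2) / (len(X) - 1)         # component variances
    return mu, Vt[:k].T, eigvals[:k]

def t2_statistic(x, mu, P, eigvals):
    t = (x - mu) @ P                          # scores on retained PCs
    return np.sum(t ** 2 / eigvals)

n, k = 200, 2
X = np.random.default_rng(0).normal(size=(n, 6))
mu, P, lam = fit_pca(X, k)
# 99% control limit for new observations (T^2 ~ scaled F distribution)
limit = k * (n - 1) * (n + 1) / (n * (n - k)) * stats.f.ppf(0.99, k, n - k)
print(t2_statistic(X[0], mu, P, lam), "limit:", limit)
```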

6488 Replicating Data Objects in Large-scale Distributed Computing Systems using Extended Vickrey Auction

Authors: Samee Ullah Khan, Ishfaq Ahmad

Abstract:

This paper proposes a novel game-theoretical technique to address the problem of data object replication in large-scale distributed computing systems. The proposed technique draws inspiration from computational economic theory and employs the extended Vickrey auction. Specifically, players in a non-cooperative environment compete for scarce server-side memory space in which to replicate data objects so as to minimize the total network object transfer cost, while maintaining object concurrency. Optimization of this cost in turn leads to load balancing, fault tolerance and reduced user access time. The method is experimentally evaluated against four well-known techniques from the literature: branch and bound, greedy, bin-packing and genetic algorithms. The experimental results reveal that the proposed approach outperforms the four techniques in both execution time and solution quality.

Keywords: Auctions, data replication, pricing, static allocation.
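
The core Vickrey rule the technique extends is easy to state: the highest bidder wins but pays the second-highest bid, which makes truthful bidding a dominant strategy. A toy sketch with hypothetical server bids (the paper's extension to replica allocation adds cost and concurrency constraints on top of this):

```python
# Vickrey (second-price) allocation: highest bidder wins the scarce
# memory slot but pays the second-highest bid.
def vickrey_winner(bids):
    """bids: dict mapping server id -> bid for hosting a replica."""
    ranked = sorted(bids, key=bids.get, reverse=True)
    winner, runner_up = ranked[0], ranked[1]
    return winner, bids[runner_up]            # winner pays second price

print(vickrey_winner({'s1': 9.0, 's2': 7.5, 's3': 4.0}))  # ('s1', 7.5)
```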

6487 Fast Fourier Transform-Based Steganalysis of Covert Communications over Streaming Media

Authors: Jinghui Peng, Shanyu Tang, Jia Li

Abstract:

Steganalysis seeks to detect the presence of secret data embedded in cover objects, and there is an imminent demand to detect hidden messages in streaming media. This paper shows how a steganalysis algorithm based on the Fast Fourier Transform (FFT) can be used to detect the existence of secret data embedded in streaming media. The proposed algorithm uses machine parameter characteristics and a network sniffer to determine whether the Internet traffic contains streaming channels. The detected streaming data are then transferred from the time domain to the frequency domain through the FFT. The distributions of power spectra in the frequency domain between original VoIP streams and stego VoIP streams are compared in turn using a t-test, yielding a p-value of 7.5686E-176, far below the significance threshold. The results indicate that the proposed FFT-based steganalysis algorithm is effective in detecting secret data embedded in VoIP streaming media.

Keywords: Steganalysis, security, fast Fourier transform, streaming media.
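
A skeleton of the detection step under simplifying assumptions: transform frames with the FFT, average their power spectra, and t-test suspected-stego frames against reference frames. The synthetic signals and the exaggerated embedding strength are illustrative only.

```python
# FFT-based steganalysis skeleton: compare power spectra of reference
# and suspected-stego frames with a t-test; small p flags embedding.
import numpy as np
from scipy import stats

def power_spectrum(frames):
    return np.abs(np.fft.rfft(frames, axis=1)) ** 2

rng = np.random.default_rng(0)
cover = rng.normal(size=(400, 256))             # reference audio frames
# exaggerated embedding strength so the toy example is clearly detectable
stego = cover + 0.3 * rng.choice([-1, 1], size=cover.shape)

p_cover = power_spectrum(cover).mean(axis=1)    # mean power per frame
p_stego = power_spectrum(stego).mean(axis=1)
t_stat, p_value = stats.ttest_ind(p_cover, p_stego)
print(f"p = {p_value:.3g}")   # small p-value => embedding detected
```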

6486 Survey Based Data Security Evaluation in Pakistan Financial Institutions against Malicious Attacks

Authors: Naveed Ghani, Samreen Javed

Abstract:

In today's heterogeneous network environment, there is a growing demand for mutually distrusting clients to jointly operate a secure network and guard against malicious attacks, since the defining task of propagating malicious code is to locate new targets to attack. Residual risk always remains, no matter what solutions are implemented or which security methodologies or standards are adopted. Security is a crucial concern in computer science, and a main aim of computer security is the protection of information on a secure network. There is no mystery about what malware is trying to do: it is trying to steal money through data theft, bank transfers, stolen passwords, or swiped identities. Through our survey we examine the importance of whitelisting, antimalware programs, security patches, log files, honey pots, and other measures used in banks for financial data protection. There is also a need to implement IPv6 tunneling with cryptographic data transformation, in line with the requirements of new technology, to protect organizations from new malware attacks that craft their own messages and send them to a target. In this paper the authors propose implementing IPv6 tunneling sessions for private data transmission from financial organizations whose secrecy needs to be safeguarded.

Keywords: Network worms, malware infection propagating malicious code, virus, security, VPN.

6485 A Prediction-Based Reversible Watermarking for MRI Images

Authors: Nuha Omran Abokhdair, Azizah Bt Abdul Manaf

Abstract:

Reversible watermarking is a special branch of image watermarking that is able to recover the original image after the watermark has been extracted. In this paper, an adaptive prediction-based reversible watermarking scheme is presented in order to increase the payload capacity of MRI medical images. The scheme divides the image into two parts, the Region of Interest (ROI) and the Region of Non-Interest (RONI). Two bits are embedded in each embeddable pixel of the RONI and one bit is embedded in each embeddable pixel of the ROI. The experimental results demonstrate that the proposed scheme achieves high embedding capacity, mainly for two reasons. First, pixels that would otherwise be excluded from data embedding due to overflow/underflow are used for data embedding. Second, the large location map that would need to be added to the watermark data as overhead is eliminated, so a loss of embedding capacity is avoided. Moreover, the scheme provides good visual quality in the watermarked image.

Keywords: Medical image watermarking, reversible watermarking, Difference Expansion, Prediction-Error Expansion.
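
The prediction-error expansion named in the keywords is compact enough to show on one pixel: the prediction error e is expanded to 2e + b to carry bit b, and the inverse mapping recovers both exactly. Overflow/underflow handling and the ROI/RONI split are omitted from this sketch.

```python
# One-pixel illustration of prediction-error expansion (PEE).
def pee_embed(pixel, predicted, bit):
    e = pixel - predicted
    return predicted + 2 * e + bit        # expanded error carries the bit

def pee_extract(marked, predicted):
    e2 = marked - predicted
    bit = e2 & 1
    return predicted + (e2 >> 1), bit     # original pixel, hidden bit

marked = pee_embed(pixel=121, predicted=118, bit=1)   # e = 3 -> marked = 125
print(pee_extract(marked, predicted=118))             # (121, 1)
```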
