Search results for: steganography; LSB Matching; RSA Encryption; data segments
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7805

7265 A Network Traffic Prediction Algorithm Based On Data Mining Technique

Authors: D. Prangchumpol

Abstract:

This paper describes an approach to predicting incoming and outgoing data rates in a network system by using association rule discovery, one of the data mining techniques. The incoming and outgoing data volumes at each time and the network bandwidth are the network performance parameters that need to be addressed in the traffic problem, since congestion and data loss are important network problems. The results of this technique can predict future network traffic. In addition, this research is useful for network routing selection and network performance improvement.
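
The abstract does not give algorithmic details, so the following is only a minimal, self-contained sketch of the kind of association rule discovery it describes; the traffic labels, transactions and thresholds are illustrative assumptions, not data from the paper.

```python
from itertools import combinations
from collections import Counter

# Hypothetical discretized traffic observations: each "transaction" is the set
# of conditions seen in one time slot (illustrative labels, not from the paper).
transactions = [
    {"hour=morning", "in=high", "out=medium"},
    {"hour=morning", "in=high", "out=high"},
    {"hour=night",   "in=low",  "out=low"},
    {"hour=morning", "in=high", "out=high"},
    {"hour=night",   "in=low",  "out=medium"},
]

min_support = 0.4
n = len(transactions)

# Count support for all 1- and 2-itemsets (a tiny slice of Apriori).
counts = Counter()
for t in transactions:
    for k in (1, 2):
        for itemset in combinations(sorted(t), k):
            counts[itemset] += 1

frequent = {s: c / n for s, c in counts.items() if c / n >= min_support}

# Confidence of a candidate rule {hour=morning} -> {in=high}.
sup_ab = frequent.get(("hour=morning", "in=high"), 0.0)
sup_a = frequent.get(("hour=morning",), 0.0)
if sup_a:
    print("conf(hour=morning -> in=high) =", sup_ab / sup_a)
```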

Keywords: Traffic prediction, association rule, data mining.

7264 Screen of MicroRNA Targets in Zebrafish Using Heterogeneous Data Sources: A Case Study for Dre-miR-10 and Dre-miR-196

Authors: Yanju Zhang, Joost M. Woltering, Fons J. Verbeek

Abstract:

It has been established that microRNAs (miRNAs) play an important role in gene expression by post-transcriptional regulation of messenger RNAs (mRNAs). However, the precise relationships between microRNAs and their target genes, in terms of numbers, types and biological relevance, remain largely unclear. Dissecting the miRNA-target relationships will provide more insight into miRNA target identification and validation and therefore promote the understanding of miRNA function. In miRBase, miRanda is the key algorithm used for target prediction for Zebrafish. This algorithm is high-throughput but produces many false positives (noise). Since validating a large set of targets through laboratory experiments is very time consuming, computational methods for miRNA target validation need to be developed. In this paper, we present an integrative method to investigate several aspects of the relationships between miRNAs and their targets, with the final purpose of extracting high-confidence targets from the pool of miRanda-predicted targets. This is achieved by using techniques ranging from statistical tests to clustering and association rules. Our research focuses on Zebrafish. It was found that validated targets do not necessarily associate with the highest sequence matching. Besides, for some miRNA families, the frequency of their predicted targets is significantly higher in the genomic region near their own physical location. Finally, in a case study of dre-miR-10 and dre-miR-196, it was found that the predicted target genes hoxd13a, hoxd11a, hoxd10a and hoxc4a of dre-miR-10, as well as hoxa9a, hoxc8a and hoxa13a of dre-miR-196, have similar characteristics to validated target genes and therefore represent high-confidence target candidates.

Keywords: MicroRNA targets validation, microRNA-target relationships, dre-miR-10, dre-miR-196.

7263 Fuzzy Processing of Uncertain Data

Authors: Petr Morávek, Miloš Šeda

Abstract:

In practice, we often come across situations where it is necessary to make decisions based on incomplete or uncertain data. In control systems this may be due to an unknown exact mathematical model, or to its excessive complexity (e.g., nonlinearity), when it is necessary to simplify it or to solve it using a rule base. In the case of databases, when searching data we compare, by means of a similarity measure, the requirements of the select query with the stored data, where both the query and the data itself may contain vague terms, for example in the form of linguistic qualifiers. In this paper, we focus on the processing of uncertain data in databases and demonstrate it on an example of multi-criteria decision making in the selection of variants specified by a large number of technical parameters.
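
As an illustration of the kind of fuzzy matching the abstract describes, the following is a minimal sketch in which linguistic qualifiers are modeled by triangular membership functions and variants are ranked by a fuzzy AND over several criteria; all terms, parameters and values are illustrative assumptions.

```python
def triangular(x: float, a: float, b: float, c: float) -> float:
    """Triangular membership function for a linguistic term (e.g. 'about b')."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical linguistic terms for a query such as "price around 100, power high".
terms = {
    "price_around_100": lambda x: triangular(x, 80, 100, 120),
    "power_high":       lambda x: triangular(x, 50, 100, 150),
}

# Candidate variants described by technical parameters (illustrative values).
variants = {
    "A": {"price_around_100": 95,  "power_high": 120},
    "B": {"price_around_100": 130, "power_high": 90},
}

# Aggregate per-criterion membership degrees with min (a common fuzzy AND).
for name, params in variants.items():
    score = min(terms[t](v) for t, v in params.items())
    print(name, round(score, 3))
```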

Keywords: fuzzy logic, linguistic variable, multi-criteria decision making.

7262 A Pipelined FSBM Hardware Architecture for HDTV-H.26x

Authors: H. Loukil, A. Ben Atitallah, F. Ghozzi, M. A. Ben Ayed, N. Masmoudi

Abstract:

In the MPEG and H.26x standards, temporal redundancy is eliminated using motion estimation. Since the motion estimation stage is very complex in terms of computational effort, a hardware implementation on a reconfigurable circuit is crucial to meet the requirements of different real-time multimedia applications. In this paper, we present a hardware architecture for motion estimation based on the Full Search Block Matching (FSBM) algorithm. This architecture achieves minimum latency, maximum throughput, and full utilization of hardware resources such as embedded memory blocks by combining pipelining and parallel processing techniques. Our design is described in VHDL, verified by simulation and implemented on a Stratix II EP2S130F1020C4 FPGA. The experimental results show that the optimum operating clock frequency of the proposed design is 89 MHz, which achieves 160 Mpixels/sec.
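
As a behavioral reference for the hardware design (not the VHDL implementation itself), a software sketch of the FSBM criterion, exhaustively minimizing the sum of absolute differences (SAD) over a search window, might look like this:

```python
import numpy as np

def fsbm(ref: np.ndarray, cur_block: np.ndarray, top: int, left: int, p: int):
    """Full-search block matching: exhaustively test every displacement in a
    +/-p window around (top, left) and keep the one minimizing the SAD."""
    n = cur_block.shape[0]
    best, best_mv = None, (0, 0)
    for dy in range(-p, p + 1):
        for dx in range(-p, p + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + n > ref.shape[0] or x + n > ref.shape[1]:
                continue
            sad = np.abs(ref[y:y+n, x:x+n].astype(int) - cur_block.astype(int)).sum()
            if best is None or sad < best:
                best, best_mv = sad, (dy, dx)
    return best_mv, best

rng = np.random.default_rng(0)
ref = rng.integers(0, 256, (64, 64), dtype=np.uint8)
cur = np.roll(ref, (2, 3), axis=(0, 1))          # current frame shifted by (2, 3)
print(fsbm(ref, cur[16:32, 16:32], 16, 16, 7))   # expect motion vector (-2, -3)
```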

Keywords: SAD, FSBM, Hardware Implementation, FPGA.

7261 Combining an Optimized Closed Principal Curve-Based Method and Evolutionary Neural Network for Ultrasound Prostate Segmentation

Authors: Tao Peng, Jing Zhao, Yanqing Xu, Jing Cai

Abstract:

Due to missing or ambiguous boundaries between the prostate and neighboring structures, the presence of shadow artifacts, and the large variability in prostate shapes, ultrasound prostate segmentation is challenging. To handle these issues, this paper develops a hybrid method for ultrasound prostate segmentation by combining an optimized closed principal curve-based method with an evolutionary neural network; the former can fit curves with great curvature and generate a contour composed of line segments connected by sorted vertices, while the latter is used to express an appropriate map function (represented by the parameters of the evolutionary neural network) for generating a smooth prostate contour that matches the ground truth contour. Both qualitative and quantitative experimental results show that our proposed method achieves accurate and robust performance.

Keywords: Ultrasound prostate segmentation, optimized closed polygonal segment method, evolutionary neural network, smooth mathematical model.

7260 Performance of Dual MRC Receiver for M-ary Modulations over Correlated Nakagami-m Fading Channels with Non-identical and Arbitrary Fading Parameter

Authors: Rupaban Subadar

Abstract:

The performance of a dual maximal ratio combining (MRC) receiver is analyzed for M-ary coherent and non-coherent modulations over correlated Nakagami-m fading channels with non-identical and arbitrary fading parameters. The classical probability density function (PDF) based approach is used for the analysis. Expressions for the outage probability and the average symbol error performance for M-ary coherent and non-coherent modulations are obtained. The obtained results are verified against published results for special cases and found to match. The effect of unequal fading parameters, branch correlation and unequal input average SNR on the receiver performance is studied.
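
The paper's analysis is analytical (PDF based); as a complementary illustration, the following is a Monte Carlo sketch of dual-branch MRC outage probability over Nakagami-m fading. For simplicity, the branches here are independent, whereas the paper treats correlated branches; all parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 1_000_000
m1, m2 = 1.5, 2.5            # non-identical fading parameters
snr1, snr2 = 10.0, 5.0       # unequal average branch SNRs (linear scale)
gamma_th = 3.0               # outage threshold SNR

# Branch instantaneous SNR under Nakagami-m fading is Gamma(m, avg_snr/m).
g1 = rng.gamma(m1, snr1 / m1, N)
g2 = rng.gamma(m2, snr2 / m2, N)

# MRC output SNR is the sum of the branch SNRs.
out = (g1 + g2) < gamma_th
print("estimated outage probability:", out.mean())
```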

Keywords: MRC, correlated Nakagami-m fading, non-identical fading statistics, average symbol error rate

7259 Automated Stereophotogrammetry Data Cleansing

Authors: Stuart Henry, Philip Morrow, John Winder, Bryan Scotney

Abstract:

The stereophotogrammetry modality is gaining more widespread use in the clinical setting. Registration and visualization of these data, in conjunction with conventional 3D volumetric image modalities, provide virtual human data with textured soft tissue as well as internal anatomical and structural information. In this investigation, computed tomography (CT) and stereophotogrammetry data are acquired from four anatomical phantoms and registered using the trimmed iterative closest point (TrICP) algorithm. This paper addresses the issue of imaging artifacts around the stereophotogrammetry surface edge using the registered CT data as a reference. Several iterative algorithms are implemented to automatically identify and remove stereophotogrammetry surface edge outliers, improving the overall visualization of the combined stereophotogrammetry and CT data. This paper shows that outliers at the surface edge of stereophotogrammetry data can be successfully removed automatically.
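
A generic sketch of the edge-outlier idea, iteratively dropping surface points that lie unusually far from a reference (CT) point cloud, is shown below; it is not the paper's exact algorithms, and the point clouds and threshold are illustrative.

```python
import numpy as np
from scipy.spatial import cKDTree

def trim_edge_outliers(surface: np.ndarray, reference: np.ndarray,
                       n_sigma: float = 2.0, max_iter: int = 10) -> np.ndarray:
    """Iteratively drop surface points whose distance to the reference cloud
    is more than n_sigma standard deviations above the mean distance."""
    tree = cKDTree(reference)
    for _ in range(max_iter):
        d, _ = tree.query(surface)
        keep = d <= d.mean() + n_sigma * d.std()
        if keep.all():
            break
        surface = surface[keep]
    return surface

rng = np.random.default_rng(2)
ct = rng.normal(size=(2000, 3))                    # stand-in reference (CT) points
photo = np.vstack([ct + rng.normal(0, 0.01, ct.shape),
                   rng.normal(5, 0.1, (50, 3))])   # surface + far "edge" outliers
print(len(photo), "->", len(trim_edge_outliers(photo, ct)))
```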

Keywords: Data cleansing, stereophotogrammetry.

7258 Formal Analysis of a Public-Key Algorithm

Authors: Markus Kaiser, Johannes Buchmann

Abstract:

In this article, a formal specification and verification of the Rabin public-key scheme in a formal proof system is presented. The idea is to combine the two views of cryptographic verification: the computational approach, relying on the vocabulary of probability theory and complexity theory, and the formal approach, based on ideas and techniques from logic and programming languages. A major objective of this article is the presentation of the first computer-proved implementation of the Rabin public-key scheme in Isabelle/HOL. Moreover, we explicate a (computer-proven) formalization of correctness as well as a computer verification of security properties using a straightforward computation model in Isabelle/HOL. The analysis uses a given database to prove formal properties of our implemented functions with computer support. The main task in designing a practical formalization of correctness, as well as efficient computer proofs of security properties, is to cope with the complexity of cryptographic proving. We reduce this complexity by exploring a lightweight formalization that enables both appropriate formal definitions and efficient formal proofs. Consequently, we obtain reliable proofs with a minimal error rate that augment the underlying database, which provides a formal basis for further computer proof constructions in this area.
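
For readers unfamiliar with the scheme being verified, a toy sketch of the Rabin scheme itself follows (the paper's contribution is the Isabelle/HOL formalization, which a Python toy cannot show; the primes here are illustratively small).

```python
# Toy primes only; real deployments need large primes with p, q ≡ 3 (mod 4).
p, q = 7, 11          # private key (both ≡ 3 mod 4 so square roots are easy)
n = p * q             # public key

def encrypt(m: int) -> int:
    return (m * m) % n

def decrypt(c: int) -> list[int]:
    # Square roots modulo p and q, combined with the CRT: four candidates,
    # one of which is the original plaintext (disambiguated by redundancy).
    mp = pow(c, (p + 1) // 4, p)
    mq = pow(c, (q + 1) // 4, q)
    yp = pow(p, -1, q)    # yp*p ≡ 1 (mod q)
    yq = pow(q, -1, p)    # yq*q ≡ 1 (mod p)
    r = (yp * p * mq + yq * q * mp) % n
    s = (yp * p * mq - yq * q * mp) % n
    return [r, n - r, s, n - s]

m = 20
print(encrypt(m), decrypt(encrypt(m)))  # plaintext 20 is among the four roots
```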

Keywords: public-key encryption, Rabin public-key scheme, formal proof system, higher-order logic, formal verification.

7257 An Improved Data Mining Method Applied to the Search of Relationship between Metabolic Syndrome and Lifestyles

Authors: Yi Chao Huang, Yu Ling Liao, Chiu Shuang Lin

Abstract:

A data cutting and sorting method (DCSM) is proposed to optimize the performance of data mining. DCSM reduces the calculation time by getting rid of redundant data during the data mining process. In addition, DCSM minimizes the computational units by splitting the database and by sorting data with support counts. In the process of searching for the relationship between metabolic syndrome and lifestyles with the health examination database of an electronics manufacturing company, DCSM demonstrates higher search efficiency than the traditional Apriori algorithm in tests with different support counts.

Keywords: Data mining, Data cutting and sorting method, Apriori algorithm, Metabolic syndrome

7256 Data Mining Approach for Commercial Data Classification and Migration in Hybrid Storage Systems

Authors: Mais Haj Qasem, Maen M. Al Assaf, Ali Rodan

Abstract:

Parallel hybrid storage systems consist of a hierarchy of different storage devices that vary in terms of data reading speed. As we ascend in the hierarchy, data reading becomes faster. Thus, migrating the application's important data that will be accessed in the near future to the uppermost level will reduce the application's I/O waiting time and hence its execution elapsed time. In this research, we implement a trace-driven, two-level parallel hybrid storage system prototype that consists of HDDs and SSDs. The prototype uses data mining techniques to classify the application's data in order to determine its near-future data accesses in parallel with its on-demand requests. The important data (i.e., the data that the application will access in the near future) are continuously migrated to the uppermost level of the hierarchy. Our simulation results show that our data migration approach, integrated with data mining techniques, reduces the application execution elapsed time by at least 22% when using a variety of traces.

Keywords: Data mining, hybrid storage system, recurrent neural network, support vector machine.

7255 Extended Constraint Mask Based One-Bit Transform for Low-Complexity Fast Motion Estimation

Authors: Oğuzhan Urhan

Abstract:

In this paper, an improved motion estimation (ME) approach based on the weighted constrained one-bit transform is proposed for block-based ME employed in video encoders. Binary ME approaches utilize a low bit-depth representation of the original image frames together with a Boolean exclusive-OR based, hardware-efficient matching criterion to decrease the computational burden of the ME stage. The weighted constrained one-bit transform (WC-1BT) based approach improves the performance of conventional C-1BT based ME by employing a 2-bit-depth constraint mask instead of a 1-bit-depth mask. In this work, the range of the constraint mask is further extended to increase the ME performance of the WC-1BT approach. Experiments reveal that the proposed method provides better ME accuracy compared to similar existing ME methods in the literature.
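
A minimal sketch of the binary-ME building blocks named above, a one-bit transform followed by the XOR-based number of non-matching points (NNMP) criterion, is given below; it uses a plain mean filter instead of the classical multi-band-pass kernel and no constraint mask, so it illustrates basic 1BT-style matching only.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def one_bit_transform(frame: np.ndarray) -> np.ndarray:
    """1BT: compare each pixel against a local mean (a simplification of the
    classical multi-band-pass filter kernel)."""
    f = frame.astype(float)
    return f >= uniform_filter(f, size=17)

def nnmp(b1: np.ndarray, b2: np.ndarray) -> int:
    """Number of non-matching points: the XOR-based matching criterion."""
    return int(np.logical_xor(b1, b2).sum())

rng = np.random.default_rng(3)
prev = rng.integers(0, 256, (64, 64))
cur = np.roll(prev, (1, 2), axis=(0, 1))
bp, bc = one_bit_transform(prev), one_bit_transform(cur)
# Matching cost of the true displacement should be zero in the interior.
print(nnmp(bp[20:36, 20:36], bc[21:37, 22:38]))
```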

Keywords: Fast motion estimation, low-complexity motion estimation, video coding.

7254 The Appeal of Vocal Islamism in the West: The Case of Hizb ut-Tahrir vis-à-vis Its Competitors

Authors: Elisa Orofino

Abstract:

Islamism is a much debated topic in the West but is explored almost exclusively in its violent forms. Nevertheless, a number of "vocal radical Islamist" groups exist in the West and operate legally because of their non-violent nature. Vocal radicals continually inspire individuals and lead them towards specific goals and priorities, sometimes even towards violence. This paper uses the long-lived group Hizb ut-Tahrir (HT) to explore the elements that make the organization appealing to segments of the Muslim community in the West. This paper uses three agency variables - reflexive monitoring, the rationalization of action and the motivations for action - to analyze HT's appeal vis-à-vis two other Islamist groups, Ikhwan al-Muslimun and Jamaat-e-Islami (JeI), which have similar goals and the same high international profile. This paper concludes that HT's uniqueness lies in its method, its detailed vision of the caliphate, its consistency over time and the emphasis placed on the caliphate as the leading force of HT's unchanged motivation for action.

Keywords: Agency, Caliphate, Radicalization, Vocal Radicals.

7253 Association Rules Mining and NOSQL Oriented Document in Big Data

Authors: Sarra Senhadji, Imene Benzeguimi, Zohra Yagoub

Abstract:

Big Data refers to recent technologies for manipulating voluminous, unstructured data sets from multiple sources, and NoSQL databases have emerged to handle the problem of unstructured data. Association rules mining is one of the popular data mining techniques for extracting hidden relationships from transactional databases. The problem of finding association dependencies is well suited to MapReduce. The goal of our work is to reduce the time needed to generate frequent itemsets by using MapReduce with a document-oriented NoSQL database. A comparative study is given to evaluate the performance of our algorithm against the classical Apriori algorithm.

Keywords: Apriori, Association rules mining, Big Data, data mining, Hadoop, Map Reduce, MongoDB, NoSQL.

7252 Analysis and Design of Simultaneous Dual Band Harvesting System with Enhanced Efficiency

Authors: Zina Saheb, Ezz El-Masry, Jean-François Bousquet

Abstract:

This paper presents an enhanced-efficiency, simultaneous dual-band energy harvesting system for wireless body area networks. Bulk biasing is used to enhance the efficiency of the adapted rectifier design by reducing the Vth of the MOSFETs. The presented circuit harvests radio frequency (RF) energy from two frequency bands: 1 GHz and 2.4 GHz. It is designed in TSMC 65-nm CMOS technology with a high-quality-factor dual matching network to boost the input voltage. A full circuit analysis and model is presented. The simulation results demonstrate a harvester with an efficiency of 23% at 1 GHz and 46% at 2.4 GHz at an input power as low as -30 dBm.

Keywords: Energy harvester, simultaneous, dual band, CMOS, differential rectifier, voltage boosting, TSMC 65nm.

7251 Real Time Classification of Political Tendency of Twitter Spanish Users based on Sentiment Analysis

Authors: Marc Solé, Francesc Giné, Magda Valls, Nina Bijedic

Abstract:

What people say on social media has turned into a rich source of information for understanding social behavior. Specifically, the growing use of the Twitter social medium for political communication has created great opportunities to know the opinion of large numbers of politically active individuals in real time and to predict the global political tendencies of a specific country. This has led to an increasing body of research on the topic. The majority of these studies have focused on polarized political contexts characterized by only two alternatives. Unlike them, this paper tackles the challenge of forecasting Spanish political trends, characterized by multiple political parties, by analyzing Twitter users' political tendency. To this end, a new strategy, named the Tweets Analysis Strategy (TAS), is proposed. It is based on analyzing users' tweets by discovering their sentiment (positive, negative or neutral) and classifying them according to the political party they support. From these individual political tendencies, the global political prediction for each party is calculated. To do this, two different strategies for the sentiment analysis are proposed: one is based on Positive and Negative word Matching (PNM) and the second is based on a Neural Network Strategy (NNS). The complete TAS strategy has been executed in a Big Data environment. The experimental results presented in this paper reveal that the NNS strategy performs much better than the PNM strategy at analyzing tweet sentiment. In addition, this research analyzes the viability of the TAS strategy to obtain the global trend in a political context made up of multiple parties, with an error lower than 23%.
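
A minimal sketch of the PNM idea, matching tweets against positive and negative word lists and tallying per-party tendencies, is shown below; the word lists, tweets and parties are illustrative stand-ins, not the paper's lexicon.

```python
POSITIVE = {"great", "support", "win", "good"}
NEGATIVE = {"bad", "corrupt", "lose", "against"}

def pnm_sentiment(tweet: str) -> str:
    words = set(tweet.lower().split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

tweets = [
    ("PartyA", "great rally, full support for PartyA"),
    ("PartyA", "PartyA is corrupt and bad for the country"),
    ("PartyB", "PartyB will win, good program"),
]

# Aggregate individual tendencies into a per-party tally.
tally: dict[str, int] = {}
for party, text in tweets:
    if pnm_sentiment(text) == "positive":
        tally[party] = tally.get(party, 0) + 1
print(tally)  # {'PartyA': 1, 'PartyB': 1}
```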

Keywords: Political tendency, prediction, sentiment analysis, Twitter.

7250 Identifying Critical Success Factors for Data Quality Management through a Delphi Study

Authors: Maria Paula Santos, Ana Lucas

Abstract:

Organizations support their operations and decision making with the data they have at their disposal, so the quality of these data is remarkably important, and Data Quality (DQ) is currently a relevant issue; the literature is unanimous in pointing out that poor DQ can result in large costs for organizations. The literature review identified and described 24 Critical Success Factors (CSF) for Data Quality Management (DQM), which were presented to a panel of experts who ordered them according to their degree of importance, using the Delphi method with the Q-sort technique, based on an online questionnaire. The study shows that the five most important CSF for DQM are: definition of appropriate policies and standards, control of inputs, definition of a strategic plan for DQ, an organizational culture focused on data quality, and obtaining top management commitment and support.

Keywords: Critical success factors, data quality, data quality management, Delphi, Q-Sort.

7249 Secure Data Aggregation Using Clusters in Sensor Networks

Authors: Prakash G L, Thejaswini M, S H Manjula, K R Venugopal, L M Patnaik

Abstract:

Wireless sensor networks can be applied in both hostile and military environments. A primary goal in the design of wireless sensor networks is lifetime maximization, constrained by the energy capacity of batteries. One well-known method to reduce energy consumption in such networks is data aggregation. Providing efficient data aggregation while preserving data privacy is a challenging problem in wireless sensor networks research. In this paper, we present a privacy-preserving data aggregation scheme for additive aggregation functions. The Cluster-based Private Data Aggregation (CPDA) scheme leverages a clustering protocol and algebraic properties of polynomials. It has the advantage of incurring less communication overhead. The goal of our work is to bridge the gap between collaborative data collection by wireless sensor networks and data privacy. We present simulation results of our schemes and compare their performance to TAG, a typical data aggregation scheme in which no data privacy protection is provided. The results show the efficacy and efficiency of our schemes.
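
The flavor of cluster-level privacy can be illustrated with additive secret sharing, a simpler analogue of CPDA's polynomial-based construction: each node splits its reading into random shares so that no single value is revealed, yet the cluster sum is preserved. All readings are illustrative.

```python
import random

random.seed(4)
readings = {"n1": 17, "n2": 23, "n3": 5}   # illustrative sensor readings
nodes = list(readings)

received = {n: [] for n in nodes}
for sender, value in readings.items():
    shares = [random.randint(-100, 100) for _ in nodes[:-1]]
    shares.append(value - sum(shares))      # shares sum to the private value
    for node, share in zip(nodes, shares):
        received[node].append(share)

# Each node publishes only the sum of the shares it received; no node ever
# sees another node's raw reading, yet the totals agree.
partial_sums = {n: sum(s) for n, s in received.items()}
print(sum(partial_sums.values()), "==", sum(readings.values()))  # 45 == 45
```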

Keywords: Aggregation, Clustering, Query Processing.

7248 Matching Facial Images using Age Related Morphing Changes

Authors: Udeni Jayasinghe, Anuja Dharmaratne

Abstract:

Each year, many people are reported missing in most countries of the world for various reasons, and arrangements have to be made to find them after some time. The investigating agencies are thus compelled to identify these people using manpower. But in many cases, investigations carried out to find a person who has been missing for a long time may not be successful. In such situations it may be difficult to identify these people from their old photographs, because their facial appearance might have changed, mainly due to the natural aging process. On some occasions in forensic medicine, if a dead body is found, investigations must be held to determine whether the corpse belongs to a person who disappeared some time ago. With the passage of time the face of the person might have changed, and there should be a mechanism to reveal the person's identity. To make this process easier, we must estimate how the person would look now. To address this problem, this paper presents a way of synthesizing a facial image with aging effects.

Keywords: Cranio-facial growth model, eigenfaces, eigenvectors, Face Anthropometry.

7247 A New Protocol for Concealed Data Aggregation in Wireless Sensor Networks

Authors: M. Abbasi Dezfouli, S. Mazraeh, M. H. Yektaie

Abstract:

Wireless sensor networks (WSN) consist of many sensor nodes that are placed in unattended environments, such as military sites, in order to collect important information. Implementing a secure protocol that can prevent the forwarding of forged data and the modification of aggregated data content, while keeping the delay and the overhead of communication, computation and storage low, is very important. This paper presents a new protocol for concealed data aggregation (CDA). In this protocol, the network is divided into virtual cells, and the nodes within each cell produce a shared key to send and receive concealed data among each other. Since data aggregation in each cell is performed locally and a secure authentication mechanism is implemented, the data aggregation delay is very low and malicious nodes cannot produce false data in the network. To evaluate the performance of our proposed protocol, we present computational models that show the performance and low overhead of our protocol.

Keywords: Wireless Sensor Networks, Security, Concealed Data Aggregation.

7246 IMDC: An Image-Mapped Data Clustering Technique for Large Datasets

Authors: Faruq A. Al-Omari, Nabeel I. Al-Fayoumi

Abstract:

In this paper, we present a new algorithm for clustering data in large datasets using image processing approaches. First, the dataset is mapped onto a binary image plane. The synthesized image is then processed using efficient image processing techniques to cluster the data in the dataset. Hence, the algorithm avoids an exhaustive search to identify clusters. The algorithm considers only a small subset of the data that contains critical boundary information sufficient to identify the contained clusters. Compared to available data clustering techniques, the proposed algorithm produces results of similar quality and outperforms them in execution time and storage requirements.
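
A minimal sketch of the image-mapping idea, quantizing 2D data onto a binary grid and letting connected-component labeling act as cluster identification, follows; the grid size and data are illustrative, and the actual IMDC algorithm additionally restricts itself to boundary information.

```python
import numpy as np
from scipy.ndimage import label

rng = np.random.default_rng(5)
cluster_a = rng.normal((2, 2), 0.2, (200, 2))
cluster_b = rng.normal((7, 7), 0.2, (200, 2))
points = np.vstack([cluster_a, cluster_b])

grid = np.zeros((20, 20), dtype=bool)
idx = np.clip((points * 2).astype(int), 0, 19)   # quantize coordinates to pixels
grid[idx[:, 0], idx[:, 1]] = True

# 8-connected components of the binary image act as the clusters.
labels, n_clusters = label(grid, structure=np.ones((3, 3)))
print("clusters found:", n_clusters)             # expect 2 well-separated blobs
```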

Keywords: Data clustering, Data mining, Image-mapping, Pattern discovery, Predictive analysis.

7245 The New Method of Concealed Data Aggregation in Wireless Sensor: A Case Study

Authors: M. Abbasi Dezfouli, S. Mazraeh, M. H. Yektaie

Abstract:

Wireless sensor networks (WSN) consist of many sensor nodes that are placed in unattended environments, such as military sites, in order to collect important information. Implementing a secure protocol that can prevent the forwarding of forged data and the modification of aggregated data content, while keeping the delay and the overhead of communication, computation and storage low, is very important. This paper presents a new protocol for concealed data aggregation (CDA). In this protocol, the network is divided into virtual cells, and the nodes within each cell produce a shared key to send and receive concealed data among each other. Since data aggregation in each cell is performed locally and a secure authentication mechanism is implemented, the data aggregation delay is very low and malicious nodes cannot produce false data in the network. To evaluate the performance of our proposed protocol, we present computational models that show the performance and low overhead of our protocol.

Keywords: Wireless Sensor Networks, Security, Concealed Data Aggregation.

7244 A Comparative Study of SVM Classifiers and Artificial Neural Networks Application for Rolling Element Bearing Fault Diagnosis using Wavelet Transform Preprocessing

Authors: Commander Sunil Tyagi

Abstract:

The effectiveness of Artificial Neural Network (ANN) and Support Vector Machine (SVM) classifiers for fault diagnosis of rolling element bearings is presented in this paper. The characteristic features of vibration signals from a rotating driveline, run both in its normal condition and with introduced faults, were used as inputs to the ANN and SVM classifiers. Simple statistical features, such as the standard deviation, skewness and kurtosis of the time-domain vibration signal segments, along with the peaks of the signal and the peak of the power spectral density (PSD), are used as input features for the ANN and SVM classifiers. The effect of preprocessing the vibration signal by the Discrete Wavelet Transform (DWT) prior to feature extraction is also studied. The experimental results show that the SVM classifier performs better than the ANN in identifying the bearing condition, and that pre-processing the vibration signal by DWT enhances the effectiveness of both classifiers.
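
A minimal sketch of the described feature pipeline, statistical features of vibration-signal segments feeding an SVM, is given below with synthetic stand-in signals; the DWT preprocessing step and the ANN comparison are omitted.

```python
import numpy as np
from scipy.stats import kurtosis, skew
from sklearn.svm import SVC

rng = np.random.default_rng(6)

def make_signal(faulty: bool, n: int = 2048) -> np.ndarray:
    """Synthetic stand-in: healthy = noise, faulty = noise + periodic impacts."""
    sig = rng.normal(0, 1, n)
    if faulty:
        sig[::128] += 8.0          # periodic impulses mimic a bearing defect
    return sig

def features(sig: np.ndarray) -> list[float]:
    # Simple statistical time-domain features plus the signal peak.
    return [sig.std(), skew(sig), kurtosis(sig), np.abs(sig).max()]

X = [features(make_signal(f)) for f in [False, True] * 40]
y = [0, 1] * 40
clf = SVC(kernel="rbf").fit(X[:60], y[:60])
print("test accuracy:", clf.score(X[60:], y[60:]))
```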

Keywords: ANN, Artificial Intelligence, Fault Diagnosis, Pattern Recognition, Rolling Element Bearing, SVM, Wavelet Transform

7243 Peakwise Smoothing of Data Models using Wavelets

Authors: D Sudheer Reddy, N Gopal Reddy, P V Radhadevi, J Saibaba, Geeta Varadan

Abstract:

Smoothing or filtering of data is the first preprocessing step for noise suppression in many applications involving data analysis. The moving average is the most popular method of smoothing data, and its generalization led to the development of the Savitzky-Golay filter. Many window smoothing methods have been developed by convolving the data with different window functions for different applications; the most widely used window functions are the Gaussian and Kaiser windows. Function approximation of the data by polynomial regression, Fourier expansion or wavelet expansion also gives smoothed data. Wavelets also smooth the data to a great extent by thresholding the wavelet coefficients. Almost all smoothing methods destroy peaks and flatten them as the support of the window is increased. In certain applications it is desirable to retain peaks while smoothing the data as much as possible. In this paper we present a methodology, called peakwise smoothing, that smooths the data to any desired level without losing the major peak features.
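
The contrast that motivates peakwise smoothing can be seen in a small sketch: a wide moving-average window flattens a narrow peak far more than a Savitzky-Golay filter of the same width (the paper's own method goes further; the signal and window sizes here are illustrative).

```python
import numpy as np
from scipy.signal import savgol_filter

x = np.linspace(0, 10, 501)
signal = np.exp(-((x - 5) ** 2) / 0.02)          # a narrow peak of height 1
noisy = signal + np.random.default_rng(7).normal(0, 0.02, x.size)

window = 51
moving_avg = np.convolve(noisy, np.ones(window) / window, mode="same")
savgol = savgol_filter(noisy, window_length=window, polyorder=3)

print("true peak:       1.00")
print("moving avg peak:", round(moving_avg.max(), 2))   # strongly flattened
print("savgol peak:    ", round(savgol.max(), 2))       # better preserved
```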

Keywords: smoothing, moving average, peakwise smoothing, spatial density models, planar shape models, wavelets.

7242 A New Precautionary Method for Measurement and Improvement the Data Quality

Authors: Seyed Mohammad Hossein Moossavizadeh, Mehran Mohsenzadeh, Nasrin Arshadi

Abstract:

Data quality is a complex and unstructured concept that concerns information systems managers. The reason for this attention is the high cost of maintaining and cleaning deficient data. Beyond these costs, such data cause wrong statistics, analyses and decisions in organizations. Managers therefore aim to improve the quality of their information systems' data. One of the basic prerequisites of quality improvement is measuring it. In this paper, we present a precautionary method whose application gives the data of information systems better quality. Our method covers different dimensions of data quality and therefore has the necessary integrity. The presented method has been tested on the three dimensions of accuracy, value-added and believability, and the results confirm the improvement achieved by, and the integrity of, this method.

Keywords: Data quality, precaution, information system, measurement, improvement.

7241 Energy Benefits of Urban Platooning with Self-Driving Vehicles

Authors: Eduardo F. Mello, Peter H. Bauer

Abstract:

The primary focus of this paper is the generation of energy-optimal speed trajectories for heterogeneous electric vehicle platoons in urban driving conditions. Optimal speed trajectories are generated for individual vehicles and for an entire platoon under the assumption that they can be executed without errors, as would be the case for self-driving vehicles. It is then shown that the optimization for the “average vehicle in the platoon” generates similar transportation energy savings to optimizing speed trajectories for each vehicle individually. The introduced approach only requires the lead vehicle to run the optimization software while the remaining vehicles are only required to have adaptive cruise control capability. The achieved energy savings are typically between 30% and 50% for stop-to-stop segments in cities. The prime motivation of urban platooning comes from the fact that urban platoons efficiently utilize the available space and the minimization of transportation energy in cities is important for many reasons, i.e., for environmental, power, and range considerations.

Keywords: Electric vehicles, energy efficiency, optimization, platooning, self-driving vehicles, urban traffic.

7240 An Efficient Data Mining Approach on Compressed Transactions

Authors: Jia-Yu Dai, Don-Lin Yang, Jungpin Wu, Ming-Chuan Hung

Abstract:

In an era of knowledge explosion, the volume of data grows rapidly day by day. Since data storage is a limited resource, how to reduce the data space used in the mining process becomes a challenging issue. Data compression provides a good solution, as it can lower the required space. Data mining has found many useful applications in recent years because it can help users discover interesting knowledge in large databases. However, existing compression algorithms are not appropriate for data mining. In [1, 2], two different approaches were proposed to compress databases and then perform the data mining process. However, both lack the ability to decompress the data to their original state and to improve the data mining performance. In this research, a new approach called Mining Merged Transactions with the Quantification Table (M2TQT) is proposed to solve these problems. M2TQT uses the relationships among transactions to merge related transactions and builds a quantification table to prune the candidate itemsets that cannot possibly become frequent, in order to improve the performance of mining association rules. The experiments show that M2TQT performs better than existing approaches.

Keywords: Association rule, data mining, merged transaction, quantification table.

7239 Weigh-in-Motion Data Analysis Software for Developing Traffic Data for Mechanistic Empirical Pavement Design

Authors: M. A. Hasan, M. R. Islam, R. A. Tarefder

Abstract:

Currently, there are few user-friendly Weigh-in-Motion (WIM) data analysis tools that can produce traffic input data for the recently developed AASHTOWare pavement Mechanistic-Empirical (ME) design software. Moreover, these tools have only rudimentary Quality Control (QC) processes and therefore cannot properly deal with erroneous WIM data. As pavement performance is highly sensitive to the quality of WIM data, it is highly recommended to apply a more refined QC process to raw WIM data to obtain good results. This study develops a user-friendly software tool that can produce traffic input for the ME design software. The tool takes the raw data (class and weight data) collected from the WIM station and processes it with a sophisticated QC procedure. Traffic data such as traffic volume, traffic distribution and axle load spectra can be obtained from this tool and used directly in the ME design software.

Keywords: Weigh-in-motion, software, axle load spectra, traffic distribution, AASHTOWare.

7238 Study on the Evaluation of the Chaotic Cipher System Using the Improved Volterra Filters and the RBFN Mapping

Authors: Hirotaka Watanabe, Takaaki Kondo, Daiki Yoshida, Ariyoshi Nakayama, Taichi Sato, Shuhei Kuriyama, Hiroyuki Kamata

Abstract:

In this paper, we propose a chaotic cipher system consisting of improved Volterra filters and a mapping created from actual voice data using a Radial Basis Function Network (RBFN). To achieve a practical system, it is assumed to use a digital communication line, such as the Internet, to maintain the parameter matching between the transmitter and receiver sides. Therefore, to withstand attacks from outside, it is necessary to complicate the internal state and improve the coefficient sensitivity. In this paper, we validate the robustness of the proposed method from three perspectives: chaotic properties, randomness and coefficient sensitivity.
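
The proposed system is built from Volterra filters and an RBFN mapping; as a generic illustration of the coefficient sensitivity it relies on (not the authors' system), the following sketch shows how a tiny parameter mismatch in a chaotic map quickly decorrelates two streams.

```python
def logistic_stream(r: float, x0: float, n: int) -> list[float]:
    """Iterate the logistic map x <- r*x*(1-x), a standard chaotic toy model."""
    xs, x = [], x0
    for _ in range(n):
        x = r * x * (1 - x)
        xs.append(x)
    return xs

# Transmitter and receiver differ only by a 1e-12 parameter mismatch.
a = logistic_stream(3.99, 0.4, 60)
b = logistic_stream(3.99 + 1e-12, 0.4, 60)
for i in (10, 30, 50):
    print(i, abs(a[i] - b[i]))   # divergence grows with the iteration count
```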

Keywords: Chaos cipher, 16-bit-length fixed point arithmetic, Volterra filter, Secret communications, RBF Network

7237 The Service Failure and Recovery in the Information Technology Services

Authors: Jun Luo, Weiguo Zhang., Dabin Qin

Abstract:

It is important to retain customer satisfaction in information technology services. When a service failure occurs, companies need to take service recovery actions to restore their customers' satisfaction. Although companies cannot avoid all problems and complaints, they should try to make up for them. Therefore, service failure and service recovery have become an important and challenging issue for companies. In this paper, the literature and the problems in information technology services are reviewed. An integrated, profit-driven model of service failure and service recovery is established in view of the benefits to both customer and enterprise. Moreover, the interaction between service failure and service recovery strategy is studied, and the results verify the matching principles between the service recovery strategy and the type of service failure. In addition, the relationship between the cost of service recovery and the customer's cumulative value of the service after recovery is analyzed with the model. The results assist managers in deciding on appropriate resource allocations for recovery strategies.

Keywords: service failure, service recovery, information technology services

7236 Hybrid Authentication System Using QR Code with OTP

Authors: Salim Istyaq

Abstract:

As we know, the number of Internet users is increasing drastically. People now use different online services provided by banks, colleges/schools, hospitals, online utilities, bill payment and online shopping sites. To access online services, text-based authentication systems are in use. The text-based authentication scheme has drawbacks in usability and security that bring trouble to users. The core element of computational trust is identity. The aim of this paper is to make the system harder for impostors and more reliable for users by using a graphical authentication approach. In this paper, we use the more powerful tool of encoding the options in a graphical QR format, and an acknowledgment is sent to the user's mobile phone for final verification. The main methodology depends upon the encryption option and final verification by confirming a set of passphrases for the legitimate users; the outcome is very powerful, as the result is given only once the process has completed successfully. All processes are cross-linked serially: the output of the first process is the input of the second, and so on. The system is a combination of recognition-based and pure recall-based techniques. The presented scheme is useful for devices like PDAs, iPods, phones, etc., which are handier and more convenient to use than traditional desktop computer systems.
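
As an illustration of one building block named above, the following is a minimal sketch of an RFC 4226 HOTP one-time password of the kind such a scheme could send for final verification; the secret and counter are illustrative, and the QR encoding step (done with an image library) is omitted.

```python
import base64, hashlib, hmac, struct

def hotp(secret_b32: str, counter: int, digits: int = 6) -> str:
    """RFC 4226 HOTP: HMAC-SHA1 over a big-endian counter, then dynamic
    truncation to a short decimal code."""
    key = base64.b32decode(secret_b32)
    msg = struct.pack(">Q", counter)                 # 8-byte big-endian counter
    mac = hmac.new(key, msg, hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                          # dynamic truncation
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

secret = "JBSWY3DPEHPK3PXP"   # example base32 secret (illustrative)
print(hotp(secret, 1))        # server and user's device derive the same code
```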

Keywords: Graphical Password, OTP, QR Codes, Recognition based graphical user authentication, usability and security.
