Search results for: Multiple data sources
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9046

7156 Fast Fourier Transform-Based Steganalysis of Covert Communications over Streaming Media

Authors: Jinghui Peng, Shanyu Tang, Jia Li

Abstract:

Steganalysis seeks to detect the presence of secret data embedded in cover objects, and there is a pressing demand to detect hidden messages in streaming media. This paper shows how a steganalysis algorithm based on the Fast Fourier Transform (FFT) can be used to detect the existence of secret data embedded in streaming media. The proposed algorithm uses machine parameter characteristics and a network sniffer to determine whether the Internet traffic contains streaming channels. The detected streaming data are then transformed from the time domain to the frequency domain through the FFT. The distributions of power spectra in the frequency domain of original VoIP streams and stego VoIP streams are compared in turn using a t-test, yielding a p-value of 7.5686E-176, well below the significance threshold. The results indicate that the proposed FFT-based steganalysis algorithm is effective in detecting secret data embedded in VoIP streaming media.
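
The following is a minimal sketch (not the authors' implementation) of the core comparison step: frames are transformed to the frequency domain with the FFT, and the power-spectrum distributions of an original and a suspected stego stream are compared with a t-test. The synthetic signals, frame size, and significance level are assumptions.

```python
import numpy as np
from scipy import stats

def power_spectrum(frames):
    """FFT each frame and return its power spectrum (frequency domain)."""
    return np.abs(np.fft.rfft(frames, axis=-1)) ** 2

# Hypothetical stand-ins for captured VoIP payload frames (original vs. suspected stego).
rng = np.random.default_rng(0)
original = rng.normal(size=(500, 160))                            # 500 frames, 160 samples each
stego = original + rng.normal(scale=0.05, size=original.shape)    # slightly perturbed copy

p_orig = power_spectrum(original).mean(axis=0)
p_stego = power_spectrum(stego).mean(axis=0)

# Compare the two power-spectrum distributions with a t-test.
t_stat, p_value = stats.ttest_ind(p_orig, p_stego, equal_var=False)
print(f"t = {t_stat:.3f}, p-value = {p_value:.3e}")
print("hidden data suspected" if p_value < 0.05 else "no significant difference")
```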

Keywords: Steganalysis, security, fast Fourier transform, streaming media.

7155 How Can We Carry Out Green Incentives Most Efficiently?

Authors: Peter Yang

Abstract:

Green incentives are included in the "American Recovery and Reinvestment Act of 2009" (ARRA). It is, however, unclear how these government incentives can be carried out most effectively according to market-based principles, and whether they can serve as a catalyst for an accelerated green transformation and an ultimate solution to the current U.S. and global economic and financial crisis. The article compares the existing U.S. green economic policies with those in Germany, identifies problems, and suggests improvements to allow the green stimulus incentives to achieve the best results in the process of an accelerated green transformation. The author argues that the current U.S. green stimulus incentives can only succeed if they are carried out as part of a visionary, comprehensive, long-term, and consistent strategy of green economic transformation.

Keywords: Green incentives, financial crisis, green economy, renewable energy sources, energy efficiency.

7154 Adaptive Distributed Genetic Algorithms and Its VLSI Design

Authors: Kazutaka Kobayashi, Norihiko Yoshida, Shuji Narazaki

Abstract:

This paper presents a dynamic adaptation scheme for the frequency of inter-deme migration in distributed genetic algorithms (GA), and its VLSI hardware design. Distributed GA, or multi-deme GA, uses multiple populations which evolve concurrently. The purpose of dynamic adaptation is to improve convergence performance so as to obtain better solutions. Through simulation experiments, we show that our scheme achieves better performance than fixed-frequency migration schemes.
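
Below is an illustrative software sketch of a distributed (multi-deme) GA with an adaptive migration interval. It is not the authors' scheme or VLSI design; the adaptation rule (migrate more often when the best fitness stalls), the OneMax fitness function, and all parameter values are assumptions.

```python
import random

def evolve_deme(pop, fitness):
    """One generation of a toy GA: binary tournament selection plus bit-flip mutation."""
    new_pop = []
    for _ in pop:
        a, b = random.sample(pop, 2)
        parent = a if fitness(a) >= fitness(b) else b
        new_pop.append([g ^ (random.random() < 0.01) for g in parent])
    return new_pop

def run_distributed_ga(n_demes=4, deme_size=20, genome_len=32, generations=200):
    fitness = lambda ind: sum(ind)                      # toy OneMax problem
    demes = [[[random.randint(0, 1) for _ in range(genome_len)]
              for _ in range(deme_size)] for _ in range(n_demes)]
    interval, best_prev = 10, 0                         # initial migration interval
    for gen in range(1, generations + 1):
        demes = [evolve_deme(pop, fitness) for pop in demes]
        if gen % interval == 0:
            # Ring migration: best of each deme replaces a random member of the next deme.
            bests = [max(pop, key=fitness) for pop in demes]
            for i, pop in enumerate(demes):
                pop[random.randrange(len(pop))] = bests[i - 1]
            # Assumed adaptation rule: migrate more often when progress stalls.
            best_now = max(fitness(max(pop, key=fitness)) for pop in demes)
            interval = max(2, interval - 2) if best_now <= best_prev else min(20, interval + 2)
            best_prev = best_now
    return max(fitness(max(pop, key=fitness)) for pop in demes)

print("best fitness:", run_distributed_ga())
```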

Keywords: Genetic algorithms, dynamic adaptation, VLSI hardware.

7153 Towards an Effective Reputation Assessment Process in Peer-to-Peer Systems

Authors: Farag Azzedin, Ahmad Ridha

Abstract:

The need for reputation assessment is particularly strong in peer-to-peer (P2P) systems because the peers' personal site autonomy is amplified by the inherent technological decentralization of the environment. However, this decentralization makes the problem of designing a peer-to-peer reputation assessment substantially harder in P2P networks than in centralized settings. Existing reputation systems tackle the reputation assessment process in an ad-hoc manner; there is no systematic and coherent way to derive measures and analyze current reputation systems. In this paper, we propose a reputation assessment process and use it to classify the existing reputation systems. Simulation experiments are conducted, focusing on the different methods for selecting recommendation sources and retrieving recommendations. These two phases can contribute significantly to the overall performance due to communication cost and coverage.

Keywords: P2P Systems, Trust, Reputation, Performance.

7152 Survey Based Data Security Evaluation in Pakistan Financial Institutions against Malicious Attacks

Authors: Naveed Ghani, Samreen Javed

Abstract:

In today's heterogeneous network environment, there is a growing demand for mutually distrusting clients to jointly operate a secure network and defend against malicious attacks, since the defining task of propagating malicious code is to locate new targets to attack. Residual risk always remains, no matter what solutions are implemented or whatever security methodology or standards are adopted. Security is a crucial concern in computer science, and the main aim of computer security is to safeguard information over the network. There is little doubt about what all that malware is trying to do: it is trying to steal money through data theft, bank transfers, stolen passwords, or swiped identities. With the help of our survey, we examine the importance of whitelisting, antimalware programs, security patches, log files, honeypots, and other measures used in banks for financial data protection. There is also a need to implement IPv6 tunneling with cryptographic data transformation, in line with the requirements of new technology, to protect organizations from new malware attacks that craft their own messages and send them to the target. In this paper, the authors propose implementing IPv6 tunneling sessions for the transmission of private data from financial organizations whose secrecy must be safeguarded.

Keywords: Network worms, malware infection, propagating malicious code, virus, security, VPN.

7151 A Prediction-Based Reversible Watermarking for MRI Images

Authors: Nuha Omran Abokhdair, Azizah Bt Abdul Manaf

Abstract:

Reversible watermarking is a special branch of image watermarking that is able to recover the original image after the watermark has been extracted. In this paper, an adaptive prediction-based reversible watermarking scheme is presented in order to increase the payload capacity of MRI medical images. The scheme divides the image into two parts, the Region of Interest (ROI) and the Region of Non-Interest (RONI). Two bits are embedded in each embeddable pixel of the RONI and one bit is embedded in each embeddable pixel of the ROI. The experimental results demonstrate that the proposed scheme achieves high embedding capacity. This is mainly due to two reasons: first, pixels that would otherwise be excluded from data embedding because of overflow/underflow are used for embedding; second, the large location map that would need to be added to the watermark data as overhead is eliminated, so the corresponding loss of embedding capacity is avoided. Moreover, the scheme provides good visual quality in the watermarked image.
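
As background, here is a minimal sketch of the prediction-error expansion step named in the keywords, not the paper's adaptive ROI/RONI scheme: each pixel is predicted from already-processed neighbours and its prediction error is expanded to carry one bit. The averaging predictor, the raster scan order, and the toy image are assumptions, and extraction/restoration is omitted.

```python
import numpy as np

def predict(img, i, j):
    """Simple predictor: average of the left and upper neighbours (an assumed choice)."""
    return (int(img[i, j - 1]) + int(img[i - 1, j])) // 2

def embed_pee(img, bits):
    """Embed bits by prediction-error expansion: e' = 2e + b, skipping overflow pixels."""
    out = img.astype(np.int32)
    k = 0
    for i in range(1, out.shape[0]):
        for j in range(1, out.shape[1]):
            if k >= len(bits):
                return out.astype(np.uint8), k
            p = predict(out, i, j)            # causal neighbours, already watermarked
            e = out[i, j] - p
            new = p + 2 * e + bits[k]
            if 0 <= new <= 255:               # embeddable pixel (no overflow/underflow)
                out[i, j] = new
                k += 1
    return out.astype(np.uint8), k

img = (np.arange(64).reshape(8, 8) * 3 % 256).astype(np.uint8)   # toy 8x8 image
watermarked, n_embedded = embed_pee(img, bits=[1, 0, 1, 1, 0, 0, 1])
print("bits embedded:", n_embedded)
```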

Keywords: Medical image watermarking, reversible watermarking, Difference Expansion, Prediction-Error Expansion.

7150 Using TRACE and SNAP Codes to Establish the Model of Maanshan PWR for SBO Accident

Authors: B. R. Shen, J. R. Wang, J. H. Yang, S. W. Chen, C. Shih, Y. Chiang, Y. F. Chang, Y. H. Huang

Abstract:

In this research, the TRACE code with the interface code SNAP was used to simulate and analyze the SBO (station blackout) accident in the Maanshan PWR (pressurized water reactor) nuclear power plant (NPP). There are four main steps in this research. First, the SBO accident data of Maanshan NPP were collected. Second, the TRACE/SNAP model of Maanshan NPP was established by using these data. Third, this TRACE/SNAP model was used to perform the simulation and analysis of the SBO accident. Finally, the simulation and analysis of the SBO with mitigation equipment were performed. The analysis results of TRACE are consistent with the data of Maanshan NPP. According to the TRACE predictions, the mitigation equipment of Maanshan can maintain the safety of the plant during the SBO.

Keywords: PWR, TRACE, SBO, Maanshan.

7149 Salbutamol Sulphate-Ethylcellulose Tabletted Microcapsules: Pharmacokinetic Study using Convolution Approach

Authors: Ghulam Murtaza, Kalsoom Farzana

Abstract:

The aim of this article is to demonstrate the utility of a novel simulation approach, i.e., the convolution method, to predict the blood concentration of a drug from the dissolution data of salbutamol sulphate microparticulate formulations with different release patterns (1:1, 1:2 and 1:3, drug:polymer). USP (2007) dissolution apparatus II with 900 ml of double-distilled water stirred at 50 rpm was employed for the dissolution analysis. From the dissolution data, the blood drug concentration was determined, and in turn the predicted blood drug concentration data were used to calculate the pharmacokinetic parameters, i.e., Cmax, Tmax, and AUC. Convolution is a good biowaiver technique; however, its utility would be further improved by applying it under conditions where biorelevant dissolution media are used.
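
A minimal numerical sketch of the convolution idea, not the authors' calculation: the in vitro drug input rate is convolved with the unit impulse response of an assumed one-compartment model to predict a concentration-time profile, from which Cmax, Tmax, and AUC are read off. The dissolution profile, dose, and pharmacokinetic constants are illustrative assumptions.

```python
import numpy as np

# Hypothetical inputs: cumulative fraction dissolved (from in vitro data) at hourly points.
t = np.arange(0, 12, 1.0)                                  # hours
f_dissolved = 1 - np.exp(-0.6 * t)                         # assumed dissolution profile
dose = 4000.0                                              # assumed dose in micrograms

# Rate of drug input = increase in amount dissolved per time step.
input_rate = np.diff(dose * f_dissolved, prepend=0.0)

# Unit impulse response of an assumed one-compartment model (ke: elimination, vd: volume).
ke, vd = 0.17, 150_000.0                                   # 1/h, mL (illustrative values)
uir = np.exp(-ke * t) / vd

# Convolve the input rate with the unit impulse response to predict blood concentration.
conc = np.convolve(input_rate, uir)[: len(t)]
cmax, tmax = conc.max(), t[conc.argmax()]
auc = np.trapz(conc, t)
print(f"Cmax = {cmax:.4f} ug/mL at Tmax = {tmax:.0f} h, AUC = {auc:.4f} ug*h/mL")
```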

Keywords: Convolution, Dissolution, Pharmacokinetics, Salbutamol sulphate

7148 High Performance in Parallel Data Integration: An Empirical Evaluation of the Ratio Between Processing Time and Number of Physical Nodes

Authors: Caspar von Seckendorff, Eldar Sultanow

Abstract:

Many studies have shown that parallelization decreases efficiency [1], [2]. There are many reasons for these decrements. This paper investigates those which appear in the context of parallel data integration. Integration processes generally cannot be allocated to packages of identical size (i.e. tasks of identical complexity), because unknown, heterogeneous input data result in variable task lengths. Process delay is determined by the slowest processing node and has a detrimental effect on the total processing time. With a real-world example, this study shows that while process delay does initially increase with the introduction of more nodes, it ultimately decreases again after a certain point. The example makes use of the cloud computing platform Hadoop and is run inside Amazon's EC2 compute cloud. A stochastic model is set up which can explain this effect.
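
A toy stochastic sketch of the effect being modelled, not the paper's Hadoop/EC2 experiment: tasks of heterogeneous, a-priori unknown length are split into equally sized packages across nodes, and the process delay is the gap between the slowest node and the ideal even share. The log-normal task-length distribution and the equal-count packaging are assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

def mean_process_delay(n_nodes, n_tasks=9_600, trials=50):
    """Average delay: finishing time of the slowest node minus the ideal per-node share."""
    delays = []
    for _ in range(trials):
        # Heterogeneous, unknown task lengths (assumed log-normal distribution).
        tasks = rng.lognormal(mean=0.0, sigma=1.0, size=n_tasks)
        packages = np.array_split(tasks, n_nodes)        # packages of identical task count
        loads = np.array([p.sum() for p in packages])
        delays.append(loads.max() - tasks.sum() / n_nodes)
    return float(np.mean(delays))

for n in (1, 2, 4, 8, 16, 32, 64):
    print(f"{n:3d} nodes: mean process delay = {mean_process_delay(n):8.2f}")
```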

Keywords: Process delay, speedup, efficiency, parallel computing, data integration, E-Commerce, Amazon Elastic Compute Cloud (EC2), Hadoop, Nutch.

7147 Improvements in Navy Data Networks and Tactical Communication Systems

Authors: Laurent Enel, Franck Guillem

Abstract:

This paper considers the benefits gained by using efficient quality of service management, such as the DiffServ technique, to improve the performance of military communications. Low delay and no blockage must be achieved, especially for real-time tactical data. The traffic flows generated by different applications do not all need the same bandwidth, the same latency, or the same error ratio, and this scalable technique of packet management based on priority levels is analysed. End-to-end architectures supporting various traffic flows and including low-bandwidth, high-delay HF or SHF military links as well as unprotected Internet sub-domains are studied. A tuning of DiffServ parameters is proposed in accordance with different loads of various traffic and different operational situations.
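
A small sketch of the priority-level packet management idea, as an illustration rather than the paper's architecture: packets are marked with an assumed mapping of DiffServ classes to strict priorities, so real-time tactical traffic is dequeued before best-effort traffic.

```python
import heapq
from itertools import count

# Assumed mapping of DiffServ per-hop behaviours to strict priorities (0 = served first).
PRIORITY = {"EF": 0, "AF41": 1, "AF21": 2, "BE": 3}     # EF carries real-time tactical data

queue, seq = [], count()

def enqueue(dscp, payload):
    # Priority first, then arrival order, so each class stays FIFO internally.
    heapq.heappush(queue, (PRIORITY[dscp], next(seq), dscp, payload))

enqueue("BE", "routine e-mail")
enqueue("EF", "tactical track update")
enqueue("AF21", "logistics report")
enqueue("EF", "voice frame")

while queue:                                            # EF packets leave the link first
    _, _, dscp, payload = heapq.heappop(queue)
    print(dscp, "->", payload)
```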

Keywords: Military data networks, quality of service, tactical systems.

7146 Text Mining of Twitter Data Using a Latent Dirichlet Allocation Topic Model and Sentiment Analysis

Authors: Sidi Yang, Haiyi Zhang

Abstract:

Twitter is a microblogging platform where millions of users share their attitudes, views, and opinions daily. Using a probabilistic Latent Dirichlet Allocation (LDA) topic model to discern the most popular topics in Twitter data is an effective way to analyze a large set of tweets and find a set of topics in a computationally efficient manner. Sentiment analysis provides an effective method to reveal the emotions and sentiments found in each tweet and an efficient way to summarize the results in a manner that is clearly understood. The primary goal of this paper is to explore text mining and to extract and analyze useful information from unstructured text using two approaches, LDA topic modelling and sentiment analysis, by examining English-language Twitter plain-text data. These two methods allow people to mine data more effectively and efficiently. The LDA topic model and sentiment analysis can also be applied to provide insights in business and scientific fields.
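
A minimal sketch of the two approaches on toy data, not the paper's corpus or pipeline: scikit-learn's LDA is fit on tweet term counts, and a tiny hand-rolled lexicon stands in for a real sentiment analyzer. The example tweets, the number of topics, and the word lists are assumptions.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Hypothetical stand-ins for collected tweets.
tweets = [
    "love the new phone battery lasts all day",
    "terrible service my flight was delayed again",
    "great camera amazing photos love it",
    "worst airline experience ever so angry",
]

# LDA topic modelling on the tweet term counts.
vec = CountVectorizer(stop_words="english")
X = vec.fit_transform(tweets)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
terms = vec.get_feature_names_out()
for k, comp in enumerate(lda.components_):
    top = [terms[i] for i in comp.argsort()[-4:][::-1]]
    print(f"topic {k}: {', '.join(top)}")

# Minimal lexicon-based sentiment scoring (assumed word lists, not a published lexicon).
pos, neg = {"love", "great", "amazing"}, {"terrible", "worst", "angry", "delayed"}
for t in tweets:
    words = set(t.split())
    print(f"{len(words & pos) - len(words & neg):+d}  {t}")
```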

Keywords: Text mining, Twitter, topic model, sentiment analysis.

7145 Topological Queries on Graph-structured XML Data: Models and Implementations

Authors: Hongzhi Wang, Jianzhong Li, Jizhou Luo

Abstract:

In many applications, data has a graph structure, which can be naturally represented as graph-structured XML. Existing queries defined on tree-structured and graph-structured XML data mainly focus on subgraph matching, which cannot cover all the requirements of querying graphs. In this paper, a new kind of query, the topological query on graph-structured XML, is presented. This kind of query considers not only the structure of a subgraph but also the topological relationship between subgraphs. Building on existing subgraph query processing algorithms, efficient algorithms for topological query processing are designed. Experimental results show the efficiency of the implemented algorithms.

Keywords: XML, Graph Structure, Topological query.

7144 Validation and Selection between Machine Learning Technique and Traditional Methods to Reduce Bullwhip Effects: a Data Mining Approach

Authors: Hamid R. S. Mojaveri, Seyed S. Mousavi, Mojtaba Heydar, Ahmad Aminian

Abstract:

The aim of this paper is to present a three-step methodology to forecast supply chain demand. In the first step, various data mining techniques are applied in order to prepare data for entry into the forecasting models. In the second step, the modeling step, an artificial neural network and a support vector machine are presented, after defining the Mean Absolute Percentage Error (MAPE) index for measuring error. The structure of the artificial neural network is selected based on previous researchers' results, and in this article the accuracy of the network is increased by using sensitivity analysis. The best forecast among the classical forecasting methods (Moving Average, Exponential Smoothing, and Exponential Smoothing with Trend) is obtained from the prepared data, and this forecast is compared with the results of the support vector machine and the proposed artificial neural network. The results show that the artificial neural network forecasts more precisely than the other methods. Finally, the stability of the forecasting methods is analyzed using raw data, and the effectiveness of the clustering analysis is also measured.
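
A minimal sketch of one such comparison, not the paper's models or data: simple exponential smoothing and a support vector regressor on lagged values are scored with MAPE on a synthetic demand series. The series, the smoothing constant, the lag window, and the SVR hyperparameters are assumptions.

```python
import numpy as np
from sklearn.svm import SVR

def mape(actual, forecast):
    return float(np.mean(np.abs((actual - forecast) / actual)) * 100)

# Hypothetical weekly demand series standing in for prepared supply-chain data.
rng = np.random.default_rng(1)
demand = 100 + 10 * np.sin(np.arange(120) / 6) + rng.normal(0, 3, 120)
train, test = demand[:100], demand[100:]

# Classical benchmark: simple exponential smoothing, one-step-ahead.
alpha, level = 0.3, train[0]
ses_forecast = []
for t in range(1, len(demand)):
    level = alpha * demand[t - 1] + (1 - alpha) * level
    ses_forecast.append(level)
ses_test = np.array(ses_forecast[-len(test):])

# Machine-learning method: SVR on lagged demand values (lag window of 4 is an assumption).
lags = 4
X = np.array([demand[i - lags:i] for i in range(lags, len(demand))])
y = demand[lags:]
split = len(train) - lags
svr = SVR(C=10.0, epsilon=0.1).fit(X[:split], y[:split])
svr_test = svr.predict(X[split:])

print(f"MAPE exponential smoothing: {mape(test, ses_test):.2f}%")
print(f"MAPE SVR:                   {mape(test, svr_test):.2f}%")
```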

Keywords: Artificial Neural Networks (ANN), bullwhip effect, demand forecasting, Support Vector Machine (SVM).

7143 Discovery of Sequential Patterns Based On Constraint Patterns

Authors: Shigeaki Sakurai, Youichi Kitahata, Ryohei Orihara

Abstract:

This paper proposes a method that discovers sequential patterns corresponding to users' interests from sequential data. The method expresses the interests as constraint patterns, which can define relationships among attributes of the items composing the data. The method recursively decomposes the constraint patterns into constraint sub-patterns and evaluates these sub-patterns in order to efficiently discover sequential patterns satisfying the constraint patterns. This paper also applies the method to sequential data composed of stock price indexes and verifies its effectiveness by comparing it with a method that does not use constraint patterns.

Keywords: Sequential pattern mining, Constraint pattern, Attribute constraint, Stock price indexes

7142 Structure-Phase States of Al-Si Alloy after Electron-Beam Treatment and Multicycle Fatigue

Authors: Krestina V. Alsaraeva, Victor E. Gromov, Sergey V. Konovalov, Anna A. Atroshkina

Abstract:

Processing of an Al-19.4Si alloy by a high-intensity electron beam has been carried out, and a multiple increase in the fatigue life of the material has been revealed. Investigations of the structure and of the destruction of the surface modified layer of the Al-19.4Si alloy subjected to multicycle fatigue tests to fracture have been carried out by scanning electron microscopy. The factors responsible for the increase in fatigue life of the Al-19.4Si alloy have been revealed and analyzed.

Keywords: Al-19.4Si alloy, high-intensity electron beam, multicycle fatigue, structure.

7141 Modeling Prices of Electricity Futures at EEX

Authors: Robest Flasza, Milan Rippel, Jan Solc

Abstract:

The main aim of this paper is to develop and calibrate an econometric model for modeling the prices of long-term electricity futures contracts. The calibration of our model is performed on data from EEX AG, allowing us to capture the specific features of the German electricity market. The data sample contains several structural breaks which have to be taken into account in the modeling. We model the data with an ARIMAX model, which reveals a high correlation between the price of electricity futures contracts and the prices of long-term futures contracts of fuels (namely coal, natural gas and crude oil). In addition, a share price index of representative electricity companies traded on Xetra, the spread between 10Y and 1Y German bonds, and the EUR/USD exchange rate also appeared to have significant explanatory power for these futures contracts on EEX.
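
A minimal sketch of fitting an ARIMAX model with exogenous fuel-price regressors using statsmodels; the series below are synthetic stand-ins, and the (1,1,1) order and regressor set are assumptions rather than the paper's calibrated specification.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Hypothetical stand-ins for the monthly series used in the calibration.
rng = np.random.default_rng(7)
n = 120
coal = 60 + np.cumsum(rng.normal(0, 1, n))
gas = 20 + np.cumsum(rng.normal(0, 0.5, n))
futures = 0.4 * coal + 0.8 * gas + rng.normal(0, 1, n)   # assumed dependence on fuel prices

data = pd.DataFrame({"futures": futures, "coal": coal, "gas": gas})
exog = data[["coal", "gas"]]

# ARIMAX(1,1,1): ARIMA errors plus exogenous fuel-price regressors.
model = SARIMAX(data["futures"], exog=exog, order=(1, 1, 1)).fit(disp=False)
print(model.summary().tables[1])

# One-step-ahead forecast given assumed next-month fuel prices.
next_exog = pd.DataFrame({"coal": [coal[-1]], "gas": [gas[-1]]})
print(model.forecast(steps=1, exog=next_exog))
```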

Keywords: Electricity futures, EEX, ARIMAX, emission allowances.

7140 Performance Evaluation of MUSIC and Minimum Norm Eigenvector Algorithms in Resolving Noisy Multiexponential Signals

Authors: Abdussamad U. Jibia, Momoh-Jimoh E. Salami

Abstract:

Eigenvector methods are gaining increasing acceptance in the area of spectrum estimation. This paper presents a successful attempt at testing and evaluating the performance of two of the most popular subspace techniques in determining the parameters of multiexponential signals with real decay constants buried in noise. In particular, the MUSIC (Multiple Signal Classification) and minimum-norm techniques are examined. It is shown that these methods perform almost equally well on multiexponential signals, with MUSIC displaying better-defined peaks.

Keywords: Eigenvector, minimum norm, multiexponential, subspace.

7139 Data-driven ASIC for Multichannel Sensors

Authors: Eduard Atkin, Alexander Klyuev, Vitaly Shumikhin

Abstract:

An approach and its implementation in a 0.18 µm CMOS process of a multichannel ASIC for capacitive (up to 30 pF) sensors are described in the paper. The main design aim was to study an analog data-driven architecture. The design was done for an analog derandomizing function with a 128-to-16 structure. That means that the ASIC should provide parallel front-end readout of 128 input analog sensor signals and, after fast commutation with appropriate arbitration logic, their processing by means of 16 output chains, including analog-to-digital conversion. The principal feature of the ASIC is a low power consumption within 2 mW/channel (including a 9-bit 20 MS/s ADC) at a maximum average channel hit rate of not less than 150 kHz.

Keywords: Data-driven architecture, derandomizer, multichannel sensor readout

7138 Issues in Travel Demand Forecasting

Authors: Huey-Kuo Chen

Abstract:

Travel demand forecasting, including the four travel choices, i.e., trip generation, trip distribution, modal split and traffic assignment, constitutes the core of transportation planning. In its current application, travel demand forecasting is associated with three important issues, i.e., interface inconsistencies among the four travel choices, inefficiency of the commonly used solution algorithms, and undesirable multiple-path solutions. In this paper, each of the three issues is extensively elaborated. An ideal unified framework for the combined model consisting of the four travel choices and variable demand functions is also suggested. A few remarks are provided at the end of the paper.

Keywords: Travel choices, B algorithm, entropy maximization, dynamic traffic assignment.

7137 The Robust Clustering with Reduction Dimension

Authors: Dyah E. Herwindiati

Abstract:

Clustering is the process of identifying homogeneous groups of objects, called clusters, and is an interesting topic in data mining; objects within a group or class share similar characteristics. This paper discusses a robust clustering process for image data with two dimension-reduction approaches, the two-dimensional principal component analysis (2DPCA) and principal component analysis (PCA). A standard approach to the high dimensionality of image data is dimension reduction, which transforms high-dimensional data into a lower-dimensional space with limited loss of information; one of the most common forms of dimensionality reduction is principal component analysis (PCA). 2DPCA, often called a variant of PCA, treats the image matrices directly as 2D matrices; they do not need to be transformed into vectors, so the image covariance matrix can be constructed directly from the original image matrices. The classical covariance matrix used in the decomposition is, however, very sensitive to outlying observations. The objective of this paper is to compare the performance of the robust minimizing vector variance (MVV) approach in the two-dimensional projection (2DPCA) and in PCA for clustering arbitrary image data when outliers are hidden in the data set. Simulation aspects of robustness and an illustration of clustering images are discussed at the end of the paper.
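
For reference, here is a minimal sketch of the (non-robust) 2DPCA projection step described above: the image covariance matrix is built directly from the 2D image matrices and each image is projected onto its leading eigenvectors. The synthetic images, injected outliers, and number of components are assumptions; the robust MVV estimation studied in the paper is not shown.

```python
import numpy as np

def two_dpca(images, n_components=2):
    """2DPCA: build the image covariance matrix directly from 2D image matrices."""
    centered = images - images.mean(axis=0)
    # G = (1/M) * sum_i (A_i - mean)^T (A_i - mean), of size (cols x cols)
    G = np.einsum("nij,nik->jk", centered, centered) / len(images)
    eigvals, eigvecs = np.linalg.eigh(G)
    proj = eigvecs[:, ::-1][:, :n_components]     # leading eigenvectors as projection axes
    features = centered @ proj                    # each image -> (rows x n_components) matrix
    return features, proj

# Hypothetical image data: 100 images of 16x16 pixels with a few injected outliers.
rng = np.random.default_rng(3)
imgs = rng.normal(size=(100, 16, 16))
imgs[:5] += 20.0                                  # outlying observations hidden in the set
feats, axes = two_dpca(imgs)
print(feats.shape, axes.shape)                    # (100, 16, 2) (16, 2)
```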

Keywords: Breakdown point, Consistency, 2DPCA, PCA, Outlier, Vector Variance

7136 A Virtual Simulation Environment for a Design and Verification of a GPGPU

Authors: Kwang Y. Lee, Tae R. Park, Jae C. Kwak, Yong S. Koo

Abstract:

When a small H/W IP is designed, we can develop an appropriate verification environment by observing the simulated signal waveforms or by using serial test vectors for a fixed output. In the case of the design and verification of a massively parallel processor with multiple IPs, it is difficult to build a verification system with the existing common verification environment and to verify each partial IP. A TestDrive verification environment can build an easy and reliable verification system that produces highly intuitive results by applying ModelSim and SystemVerilog's DPI. It shows many advantages; for example, a high-level GPGPU processor design can be migrated to an FPGA board immediately.

Keywords: Virtual Simulation, Verification, IP Design, GPGPU

7135 Comparative Analysis between Corn and Ramon (Brosimum alicastrum) Starches to Be Used as Sustainable Bio-Based Plastics

Authors: C. R. Ríos-Soberanis, V. M. Moo-Huchin, R. J. Estrada-Leon, E. Perez-Pacheco

Abstract:

Polymers from renewable resources have attracted an increasing amount of attention over the last two decades, predominantly for two major reasons: firstly, environmental concerns, and secondly, the realization that our petroleum resources are finite. Finding new uses for agricultural commodities is also an important area of research. It is therefore crucial to find new sources of natural materials that can be used in different applications. The Ramon tree (Brosimum alicastrum) is a tropical plant that grows freely in the Yucatan countryside. This paper focuses on seed collection, processing, and starch extraction and characterization in order to determine its suitability as a biomaterial. The results demonstrate that it has the qualities to be used not only as a foodstuff but also as an important component in polymeric blends.

Keywords: Biomaterials, biopolymer, starch, characterization techniques.

7134 Model Based Monitoring Using Integrated Data Validation, Simulation and Parameter Estimation

Authors: Reza Hayati, Maryam Sadi, Saeid Shokri, Mehdi Ahmadi Marvast, Saeid Hassan Boroojerdi, Amin Hamzavi Abedi

Abstract:

Efficient and safe plant operation can only be achieved if the operators are able to monitor all key process parameters. Instrumentation is used to measure many process variables, like temperatures, pressures, flow rates, compositions, or other product properties. Performance monitoring is therefore a suitable tool for operators. In this paper, we integrate a rigorous simulation model, data reconciliation, and parameter estimation to monitor process equipment and determine their key performance indicators (KPIs). The method has been implemented in two case studies.
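
A minimal sketch of the data reconciliation step, as an illustration rather than the paper's plant models: measured flows around a single splitter are adjusted by weighted least squares so that the mass balance closes exactly. The flow values, standard deviations, and the single balance constraint are assumptions.

```python
import numpy as np

# Measured flow rates around a splitter: F1 (in) = F2 + F3 (out), in t/h (assumed example).
measured = np.array([100.4, 60.2, 38.1])
sigma = np.array([1.0, 0.8, 0.6])          # measurement standard deviations

# Linear balance constraint A x = 0 for the reconciled flows x.
A = np.array([[1.0, -1.0, -1.0]])

# Classical weighted least-squares data reconciliation (closed-form projection).
V = np.diag(sigma ** 2)
correction = V @ A.T @ np.linalg.solve(A @ V @ A.T, A @ measured)
reconciled = measured - correction

print("measured:  ", measured)
print("reconciled:", reconciled, " balance residual:", (A @ reconciled).item())
```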

Keywords: Data Reconciliation, Measurement, Optimization, Parameter Estimation, Performance Monitoring.

7133 Investigating Ultra Violet (UV) Strength against Different Level of Altitude using New Environmental Data Management System

Authors: M. Amir Abas, M. Dahlui

Abstract:

This paper presents the results of UV measurements at different altitudes and the development of a new portable instrument for measuring UV. The rapid growth of industrial sectors in developing countries, including Malaysia, brings not only income to the nation but also causes pollution in various forms. Air pollution is one of the significant contributors to global warming and depletes the ozone layer, which reduces the filtration of UV rays. Long exposure to high levels of UV rays has many devastating health effects on mankind, directly or indirectly, through the destruction of natural resources. This study aimed to show the correlation between UV and altitude, which can indirectly help predict ozone depletion. An instrument was designed to measure and monitor the level of UV. The instrument comprises two main blocks, namely a data logger and a Graphical User Interface (GUI). Three sensors were used in the data logger to detect changes in temperature, humidity, and ultraviolet radiation. The system underwent experimental measurements to capture data under two different conditions, an industrial area and a high-altitude area. The performance of the instrument showed consistency in the data captured, and the results of the experiment showed significantly higher UV readings at high altitudes.

Keywords: Ozone Layer, Monitoring, Global Warming, Measurement, Ultraviolet

7132 Investigation of Chaotic Behavior in DC-DC Converters

Authors: Sajid Iqbal, Masood Ahmed, Suhail Aftab Qureshi

Abstract:

DC-DC converters are widely used in regulated switched-mode power supplies and in DC motor drive applications. There are several sources of unwanted nonlinearity in practical power converters. In addition, their operation is characterized by switching, which gives birth to a variety of nonlinear dynamics. DC-DC buck and boost converters controlled by pulse-width modulation (PWM) have been simulated, and the voltage waveforms and attractors obtained from the circuit simulation have been studied. With the onset of instability, the phenomena of subharmonic oscillations, quasi-periodicity, bifurcations, and chaos have been observed. This paper is mainly motivated by the potential contributions of chaos theory to the design, analysis, and control of power converters in particular and power electronics circuits in general.
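
The following is a toy sketch of the kind of circuit simulation described: a voltage-mode PWM buck converter is integrated with a simple Euler scheme and sampled once per switching period, so that subharmonics or chaos show up as a growing number of distinct stroboscopic samples when a parameter is swept. All component values, the feedback law, and the swept range are illustrative assumptions rather than the authors' simulated circuits.

```python
import numpy as np

# Illustrative (assumed) parameters for a voltage-mode PWM buck converter.
Vref, A = 11.3, 8.4                       # reference voltage and feedback gain
L, C, R = 20e-3, 47e-6, 22.0              # inductor, capacitor, load
T, VL, VU = 400e-6, 3.8, 8.2              # switching period and ramp limits

def simulate(vin, cycles=300, steps=400):
    i, v = 0.5, 12.0
    dt = T / steps
    samples = []
    for _ in range(cycles):
        for k in range(steps):
            ramp = VL + (VU - VL) * k / steps
            s = 1.0 if A * (v - Vref) < ramp else 0.0      # PWM comparator: switch on/off
            di = (vin * s - v) / L
            dv = (i - v / R) / C
            i, v = i + di * dt, v + dv * dt
        samples.append(v)                                   # stroboscopic sample, once per period
    return samples[-50:]                                    # discard the transient

# Sweep the input voltage; more distinct sampled values suggests subharmonics or chaos.
for vin in (24.0, 28.0, 32.0):
    pts = np.round(simulate(vin), 3)
    print(f"Vin = {vin:4.1f} V -> {len(set(pts))} distinct stroboscopic samples")
```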

Keywords: Buck converter, boost converter, period-doubling, chaos, bifurcation, strange attractor.

7131 Determining the Online Purchasing Loyalty for Thai Herbal Products

Authors: Chummanond Natchaya, Rotchanakitumnuai Siriluck

Abstract:

The objective of this study is to identify the factors that influence online purchasing loyalty for Thai herbal products. Survey research is used to gather data from Thai herb online merchants in order to assess the factors that enhance loyalty. Data were collected from 300 online customers who had experience in purchasing Thai herbal products online. Prior experience consists of previous usage of online herb information, herb purchases, and Internet usage. E-Quality consists of the information quality, system quality, service quality, and product quality of Thai herbal products sold online. The results suggest that prior experience, E-Quality, attitude toward purchase, and trust in the online merchant have major impacts on loyalty. A good attitude and the E-Quality of purchasing Thai herbal products online are the most significant determinants affecting loyalty.

Keywords: e-Commerce, Thai herb, E-Quality, satisfaction, loyalty.

7130 Performance Comparison of ADTree and Naive Bayes Algorithms for Spam Filtering

Authors: Thanh Nguyen, Andrei Doncescu, Pierre Siegel

Abstract:

Classification is an important data mining technique that can be used for data filtering in artificial intelligence. The broad applicability of classification to all kinds of data means it is used in nearly every field of modern life. Classification helps us group items according to the features judged interesting and useful. In this paper, we compare two classification methods, Naïve Bayes and ADTree, used to detect spam e-mail. This choice is motivated by the fact that the Naïve Bayes algorithm is based on probability calculus while the ADTree algorithm is based on decision trees. The parameter settings of the above classifiers aim to maximize the true positive rate and minimize the false positive rate. The experimental results present the classification accuracy and a cost analysis with a view to choosing the optimal classifier for spam detection. We also point out the number of attributes needed to obtain a trade-off between the number of attributes and the classification accuracy.
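
A minimal sketch of the Naïve Bayes side of such a comparison using scikit-learn; ADTree is not available in scikit-learn and is typically run in Weka, so it is omitted here. The toy messages, labels, and train/test split are assumptions.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.metrics import confusion_matrix

# Hypothetical labelled messages (1 = spam, 0 = ham) standing in for a real corpus.
texts = ["win a free prize now", "meeting moved to friday", "cheap loans click here",
         "lunch tomorrow?", "free lottery winner claim prize", "project report attached"]
labels = [1, 0, 1, 0, 1, 0]

X = CountVectorizer().fit_transform(texts)
clf = MultinomialNB().fit(X[:4], labels[:4])          # train on the first four messages
pred = clf.predict(X[4:])                             # test on the remaining two

# True/false positive counts can be read off the confusion matrix.
print(confusion_matrix(labels[4:], pred, labels=[0, 1]))
```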

Keywords: Classification, data mining, spam filtering, naive Bayes, decision tree.

7129 File Format of Flow Chart Simulation Software - CFlow

Authors: Syahanim Mohd Salleh, Zaihosnita Hood, Hairulliza Mohd Judi, Marini Abu Bakar

Abstract:

CFlow is flow chart software that contains facilities for drawing and evaluating flow charts. Flow chart evaluation applies a simulation method to enable the presentation of the work flow in a flow chart solution. Flow chart simulation in CFlow is executed by manipulating the CFlow data file, which is saved in a graphical vector format. These text-based data are organised using a data classification technique based on a library classification scheme. This paper describes the file format of the flow chart simulation software CFlow.

Keywords: CFlow, flow chart, file format.

7128 Client Importance and Audit Quality under Civil Law versus Common Law Societies

Authors: Kelly Grani Yuen

Abstract:

Accounting scandals and auditing frauds are perceived to be driven by aggressive companies and the misrepresentation of audit reports. However, local legal systems and law enforcement may affect the services auditors provide to their ‘important’ clients. Under civil law and common law jurisdictions, the standard setters, the government, and the regulatory bodies treat cases differently. Whether different forms of legal systems and the extent of law enforcement play an important role in audit quality is therefore the question this paper explores. The paper focuses on Asia, where Hong Kong represents the common law jurisdiction, while Taiwan and China represent the civil law jurisdiction. Only ten reputable accounting firms are used in this study, owing to the differences in rankings and establishment of some of the smaller local audit firms; the data cover the years 2007-2013. Using multiple regression based on the dependent variable (audit quality) and the independent variables (client importance, law enforcement, and press freedom), six different models are established. The results demonstrate that, since different jurisdictions have different legal systems and market regulations, auditors' treatment of ‘important’ clients varies. However, with the moderators in place (law enforcement and press freedom), the relationship between client importance and audit quality may be smoothed out. With this in mind, the study informs local governments' and standard setters' consideration of legal reform and proper law enforcement in the market. Perhaps, with such modifications to the economic systems, collusion between companies and auditors can finally be put to a halt.

Keywords: Audit quality, client importance, jurisdiction, modified audit opinions.

7127 A 2D-3D Hybrid Vision System for Robotic Manipulation of Randomly Oriented Objects

Authors: Moulay A. Akhloufi

Abstract:

This paper presents a new vision technique for the robotic manipulation of randomly oriented objects in industrial applications. The proposed approach uses 2D and 3D vision to efficiently extract the 3D pose of an object in the presence of multiple randomly positioned objects. 2D vision permits a quick selection of the objects of interest for 3D processing with a new modified ICP algorithm (FaR-ICP), thus significantly reducing the processing time. The extracted 3D pose is then sent to the robot manipulator for picking. The tests show that the proposed system achieves high performance.
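
For context, here is a minimal sketch of a classical point-to-point ICP alignment, not the authors' FaR-ICP variant and without the 2D pre-selection step: nearest-neighbour correspondences and an SVD-based rigid transform are iterated to align a model cloud to a scene cloud. The synthetic clouds and the fixed iteration count are assumptions, and plain ICP may converge to a local minimum for large initial misalignments.

```python
import numpy as np
from scipy.spatial import cKDTree

def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src onto dst (Kabsch/SVD)."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    U, _, Vt = np.linalg.svd((src - cs).T @ (dst - cd))
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                       # avoid reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cd - R @ cs

def icp(source, target, iterations=30):
    """Classical point-to-point ICP aligning a source cloud to a target cloud."""
    tree = cKDTree(target)
    current = source.copy()
    for _ in range(iterations):
        _, idx = tree.query(current)               # closest target point for each source point
        R, t = best_rigid_transform(current, target[idx])
        current = current @ R.T + t
    return best_rigid_transform(source, current)   # total transform source -> aligned

# Hypothetical object point cloud and a rotated/translated copy of it.
rng = np.random.default_rng(0)
model = rng.uniform(-1, 1, size=(300, 3))
angle = np.deg2rad(20)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0],
                   [np.sin(angle),  np.cos(angle), 0],
                   [0, 0, 1]])
scene = model @ R_true.T + np.array([0.3, -0.1, 0.2])
R_est, t_est = icp(model, scene)
print("recovered translation:", np.round(t_est, 3))
```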

Keywords: 3D vision, Hand-Eye calibration, robot visual servoing, random bin picking.
