Search results for: pointing accuracy

1566 Enhanced Weighted Centroid Localization Algorithm for Indoor Environments

Authors: I. Nižetić Kosović, T. Jagušt

Abstract:

Lately, with the increasing number of location-based applications, the demand for highly accurate and reliable indoor localization has become urgent. This is a challenging problem due to measurement variance, a consequence of factors such as obstacles, equipment properties, and environmental changes in the complex nature of indoor environments. In this paper we propose a low-cost, custom-setup infrastructure solution and a localization algorithm based on the Weighted Centroid Localization (WCL) method. Localization accuracy is increased by several enhancements: calibration of the RSSI values obtained from the wireless nodes, repeated RSSI measurements to exclude deviating values from the position estimation, and consideration of the orientation of the device relative to the wireless nodes. We conducted several experiments to evaluate the proposed algorithm; a high accuracy of ~1 m was achieved.
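
For illustration, a minimal sketch of the WCL estimate described above, assuming a log-distance path-loss model; the node layout, transmit power, path-loss exponent, and RSSI readings are invented for the example, not taken from the paper's calibrated setup:

```python
import numpy as np

def rssi_to_distance(rssi, tx_power=-40.0, path_loss_exp=2.5):
    """Invert the log-distance path-loss model: RSSI = P0 - 10*n*log10(d)."""
    return 10 ** ((tx_power - rssi) / (10 * path_loss_exp))

def weighted_centroid(node_positions, rssi, g=1.5):
    """Estimate position as the centroid of nodes weighted by 1/d^g."""
    d = rssi_to_distance(np.asarray(rssi, dtype=float))
    w = 1.0 / d ** g                    # closer nodes dominate the estimate
    return (w[:, None] * np.asarray(node_positions)).sum(axis=0) / w.sum()

nodes = [(0, 0), (10, 0), (0, 10), (10, 10)]       # wall-mounted wireless nodes
readings = [[-55, -70, -68, -75],                  # repeated RSSI measurements
            [-56, -69, -90, -74],                  # (one deviating value)
            [-54, -71, -67, -76]]
rssi = np.median(readings, axis=0)                 # robust to outlier readings
print(weighted_centroid(nodes, rssi))
```

The median over repeated readings echoes the abstract's exclusion of deviating RSSI values; the exponent g controls how strongly nearby nodes dominate.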

Keywords: Indoor environment, received signal strength indicator, weighted centroid localization, wireless localization.

1565 A Matching Algorithm of Minutiae for Real Time Fingerprint Identification System

Authors: Shahram Mohammadi, Ali Frajzadeh

Abstract:

Many matching algorithms with different characteristics have been introduced in recent years. For real-time systems these algorithms are usually based on minutiae features. In this paper we introduce a novel approach to feature extraction in which the extracted features are independent of shift and rotation of the fingerprint, while the matching operation is performed more easily, with higher speed and accuracy. In this approach, a reference point and a reference orientation are first determined for each fingerprint, and the features are then converted into polar coordinates based on this information. Owing to the high speed and accuracy of this approach, the small volume of extracted features, and the simple execution of the matching operation, it is well suited to real-time applications.
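
A hedged sketch of the shift- and rotation-invariant representation described above; the reference point and reference orientation are assumed to have been found already, and the minutiae coordinates are invented:

```python
import numpy as np

def to_polar(minutiae_xy, ref_point, ref_angle):
    """Convert (x, y) minutiae to (radius, angle) relative to the reference.

    Subtracting ref_point makes the representation shift-invariant;
    subtracting ref_angle makes it rotation-invariant.
    """
    delta = np.asarray(minutiae_xy, float) - np.asarray(ref_point, float)
    radius = np.hypot(delta[:, 0], delta[:, 1])
    angle = (np.arctan2(delta[:, 1], delta[:, 0]) - ref_angle) % (2 * np.pi)
    return np.column_stack([radius, angle])

minutiae = [(120, 80), (95, 140), (160, 110)]
print(to_polar(minutiae, ref_point=(128, 128), ref_angle=np.pi / 6))
```

Matching then reduces to comparing small (radius, angle) lists, which is why the approach is fast.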

Keywords: Matching, Minutiae, Reference point, Reference orientation

1564 A MATLAB Simulink Library for Transient Flow Simulation of Gas Networks

Authors: M. Behbahani-Nejad, A. Bagheri

Abstract:

An efficient transient flow simulation for gas pipelines and networks is presented. The proposed transient flow simulation is based on transfer function models and MATLAB-Simulink. The equivalent transfer functions of the nonlinear governing equations are derived for different types of boundary conditions. Next, a MATLAB-Simulink library is developed that accommodates any boundary condition type. To verify the accuracy and the computational efficiency of the proposed simulation, the results obtained are compared with those of conventional finite difference schemes (such as TVD, the method of lines, and other implicit and explicit finite difference schemes). The effects of flow inertia and pipeline inclination are incorporated in the simulation. It is shown that the proposed simulation has sufficient accuracy and is computationally more efficient than the other methods.
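
The library itself is built in MATLAB-Simulink; purely to illustrate the transfer-function idea, here is a sketch in Python/SciPy that simulates a single pipeline segment as an assumed first-order transfer function. The model order and time constant are illustrative, not derived from the paper's governing equations:

```python
import numpy as np
from scipy import signal

tau = 30.0                                             # assumed time constant (s)
segment = signal.TransferFunction([1.0], [tau, 1.0])   # G(s) = 1 / (tau*s + 1)

t = np.linspace(0, 300, 1000)
u = np.where(t >= 10, 1.0, 0.0)                        # step change at the inlet boundary
t_out, y, _ = signal.lsim(segment, U=u, T=t)
print(f"outlet response at t = 300 s: {y[-1]:.3f}")
```

Replacing a PDE integration with a bank of such transfer-function blocks is what makes the approach cheaper than finite difference schemes.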

Keywords: Gas network, MATLAB-Simulink, transfer functions, transient flow.

1563 Multiclass Support Vector Machines for Environmental Sounds Classification Using log-Gabor Filters

Authors: S. Souli, Z. Lachiri

Abstract:

In this paper we propose a robust environmental sound classification approach based on spectrogram features derived from log-Gabor filters. This approach includes two methods. In the first method, the spectrograms are passed through an appropriate log-Gabor filter bank and the outputs are averaged and subjected to an optimal feature selection procedure based on a mutual information criterion. The second method uses the same steps, but applied only to three patches extracted from each spectrogram.

To investigate the accuracy of the proposed methods, we conduct experiments using a large database containing 10 environmental sound classes. The classification results based on multiclass Support Vector Machines show that the second method is the most efficient, with an average classification accuracy of 89.62%.
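
A minimal sketch of the first method under stated assumptions: the filter centres, bandwidth ratio, and stand-in audio are invented, and the mutual-information feature selection step is omitted:

```python
import numpy as np
from scipy.signal import spectrogram
from sklearn.svm import SVC

def log_gabor_bank(freqs, centers, sigma_ratio=0.55):
    """One log-Gabor transfer function per center frequency."""
    freqs = np.maximum(freqs, 1e-12)        # avoid log(0) at DC
    return [np.exp(-(np.log(freqs / f0) ** 2) /
                   (2 * np.log(sigma_ratio) ** 2)) for f0 in centers]

def features(sound, fs=22050, centers=(200, 500, 1200, 3000, 6000)):
    f, t, sxx = spectrogram(sound, fs=fs)
    # Mean filtered energy = one feature per log-Gabor channel
    return np.array([(g[:, None] * sxx).mean() for g in log_gabor_bank(f, centers)])

rng = np.random.default_rng(0)
sounds = [rng.standard_normal(22050) for _ in range(20)]   # stand-in audio clips
labels = [i % 10 for i in range(20)]                       # 10 sound classes
clf = SVC(kernel="rbf").fit([features(s) for s in sounds], labels)
```

scikit-learn's SVC handles the multiclass case internally (one-vs-one), matching the multiclass SVM setting of the paper.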

Keywords: Environmental sounds, Log-Gabor filters, Spectrogram, SVM Multiclass, Visual features.

1562 European Environmental Policy for Road Transport: Analysis of the Perverse Effects Generated and Proposals for a Good Practice Guide

Authors: Pedro Pablo Ramírez Sánchez, Alassane Ballé Ndiaye, Roberto Rendeiro Martín-Cejas

Abstract:

The aim of this paper is to analyse the different environmental policies adopted in Europe for car emissions, to comment on some of the possible perverse effects generated, and to point out the policies considered more efficient from an environmental perspective. This paper focuses on passenger cars, as this category is the most significant in road transport. The utility of this research lies in its being a first step towards improving and optimising current policies. The methodology applied refers to a comparative analysis, from a practical and theoretical point of view, of European environmental policies in road transport. This work gives an overview of the road transport industry in Europe, pointing out relevant aspects such as the contribution of road transport to total emissions and the vehicle fleet in Europe. Additionally, we propose a brief good practice guide combining policies in order to optimise their effect.

Keywords: Air quality, climate change, emission, environment, perverse effect, road transport, tax policy.

1561 An Optimized Multi-block Method for Turbulent Flows

Authors: M. Goodarzi, P. Lashgari

Abstract:

In many turbulent flows, a major part of the flow field involves no complicated turbulent behavior. In this research work, in order to reduce the required memory and CPU time, the flow field was decomposed into several blocks, each with its own turbulence treatment. A two-dimensional backward-facing step was considered. Four combinations of the Prandtl mixing length and standard k-ε models were implemented. Computer memory and CPU time consumption, in addition to numerical convergence and accuracy of the obtained results, were investigated. Observations showed that a suitable combination of turbulence models in different blocks led to results with the same accuracy as using the high-order turbulence model for all of the blocks, along with reductions in memory and CPU time consumption.
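
As a small illustration of the low-cost model assigned to the "simple" blocks, a sketch of Prandtl's mixing-length eddy viscosity; the wall distance range, velocity profile, and constant are assumed illustrative values:

```python
import numpy as np

kappa = 0.41                                 # von Karman constant
y = np.linspace(1e-4, 0.05, 200)             # distance from the wall (m)
u = 2.5 * np.log(y / 1e-4)                   # assumed log-law velocity profile
dudy = np.gradient(u, y)
nu_t = (kappa * y) ** 2 * np.abs(dudy)       # nu_t = (kappa*y)^2 * |du/dy|
print(f"max eddy viscosity: {nu_t.max():.2e} m^2/s")
```

An algebraic model like this needs no extra transport equations, which is the source of the memory and CPU savings relative to the two-equation k-ε blocks.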

Keywords: Computer memory, CPU time, Multi-block method, Turbulence modeling.

1560 Evaluation of the Degree of the Sufficiency of Public Green Spaces as an Indicator of Urban Density in the Chubu Metropolitan Area in Japan

Authors: Kayoko Yamamoto

Abstract:

This study uses GIS (Geographic Information Systems) to evaluate the degree of sufficiency of public green spaces, such as parks and urban green areas, as an indicator of the density of metropolitan areas, in particular the Chubu metropolitan area in Japan. To that end, it first establishes the distribution of green spaces in the three major metropolitan areas in Japan, especially the Chubu metropolitan area, using GIS digital maps. Based on this result, it conducts a GIS evaluation of the degree of sufficiency of public green spaces and arranges the results by distance belt from the central part, in order to compare and examine each distance belt away from the center of the Chubu metropolitan area. Furthermore, after pointing out the areas with insufficient public green spaces based on this result, it proposes an improvement policy that can be introduced in the Chubu metropolitan area.
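
A hedged sketch of the per-distance-belt evaluation, with shapely geometries standing in for the GIS digital-map layers; the centre, belt widths, and park shapes are invented:

```python
from shapely.geometry import Point
from shapely.ops import unary_union

center = Point(0.0, 0.0)                     # assumed metropolitan centre
parks = unary_union([Point(3, 1).buffer(0.8),
                     Point(7, -2).buffer(1.2)])   # stand-in green-space polygons

for r_in, r_out in [(0, 5), (5, 10), (10, 15)]:   # distance belts (km)
    belt = center.buffer(r_out).difference(center.buffer(r_in))
    ratio = belt.intersection(parks).area / belt.area
    print(f"belt {r_in}-{r_out} km: green-space ratio = {ratio:.3f}")
```

Comparing these ratios across belts is the kind of distance-belt arrangement the study describes.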

Keywords: Public Green Spaces, Urban Density, Metropolitan Areas, Land Use, Urbanization, GIS (Geographic Information Systems)

1559 Shadow Detection for Increased Accuracy of Privacy Enhancing Methods in Video Surveillance Edge Devices

Authors: F. Matusek, G. Pujolle, R. Reda

Abstract:

Shadow detection is still considered one of the open challenges for intelligent automated video surveillance systems. A prerequisite for reliable and accurate detection and tracking is correct shadow detection and classification. In such a landscape of conditions, privacy issues add further complexity and require reliable shadow detection. In this work the intertwining of security, accuracy, reliability and privacy is analyzed and, accordingly, a novel architecture for Privacy Enhancing Video Surveillance (PEVS) is introduced. Shadow detection and masking are handled through the combination of two different approaches applied simultaneously. This results in a unique privacy enhancement without affecting security. The methodology was subsequently employed successfully in a large-scale wireless video surveillance system; privacy-relevant information was stored and encrypted on the unit, without transferring it over an untrusted network.

Keywords: Video Surveillance, Intelligent Video Surveillance, Physical Security, WSSU, Privacy, Shadow Detection.

1558 Rough Set Based Intelligent Welding Quality Classification

Authors: L. Tao, T. J. Sun, Z. H. Li

Abstract:

The knowledge base of welding defect recognition is essentially incomplete. This characteristic means that the recognition results may not reflect the actual situation, which in turn influences the classification of welding quality. This paper studies a rough set based method to reduce this influence and improve the classification accuracy. First, a rough set model of intelligent welding quality classification is built, and both condition and decision attributes are specified. Then, groups of representative multiple compound defects are chosen from the defect library and classified correctly to form the decision table. Finally, the redundant information of the decision table is reduced and the optimal decision rules are obtained. With this method, we are able to reclassify the misclassified defects to the right quality level. Compared with ordinary methods, this method has higher accuracy and better robustness.
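
For illustration, a minimal sketch of the rough-set machinery behind such a decision table: indiscernibility classes over the condition attributes and the lower approximation of a quality level. The attribute names and defect records are invented:

```python
from collections import defaultdict

def indiscernibility(table, attrs):
    """Group objects that are identical on the chosen condition attributes."""
    classes = defaultdict(set)
    for obj, row in table.items():
        classes[tuple(row[a] for a in attrs)].add(obj)
    return list(classes.values())

def lower_approximation(classes, target):
    """Objects whose whole indiscernibility class lies inside the target set."""
    return set().union(*[c for c in classes if c <= target])

defects = {                                  # condition attributes + quality label
    1: {"depth": "high", "shape": "round", "quality": "reject"},
    2: {"depth": "high", "shape": "round", "quality": "reject"},
    3: {"depth": "low",  "shape": "long",  "quality": "accept"},
    4: {"depth": "low",  "shape": "round", "quality": "accept"},
}
classes = indiscernibility(defects, ["depth", "shape"])
reject = {o for o, r in defects.items() if r["quality"] == "reject"}
print(lower_approximation(classes, reject))   # -> {1, 2}
```

Attribute reduction then amounts to finding a smaller attribute set whose indiscernibility classes still yield the same approximations.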

Keywords: intelligent decision, rough set, welding defects, welding quality level

1557 Rigid Registration of Reduced Dimension Images using 1D Binary Projections

Authors: Panos D. Kotsas, Tony Dodd

Abstract:

The purpose of this work is to present a method for rigid registration of medical images using 1D binary projections when a part of one of the two images is missing. We use 1D binary projections and adjust the projection limits according to the reduced image in order to perform accurate registration. We use the variance of the weighted ratio as a registration function, which we have shown registers 2D and 3D images more accurately and robustly than mutual information methods. The function is computed explicitly for n=5 Chebyshev points in a [-9,+9] interval and is approximated using Chebyshev polynomials for all other points. The images used are MR scans of the head. We find that the method is able to register the two images with an average accuracy of 0.3 degrees for rotations and 0.2 pixels for translations for a y dimension of 156 with initial dimension 256. For a y dimension of 128/256 the accuracy decreases to 0.7 degrees for rotations and 0.6 pixels for translations.
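
Since the abstract does not spell out the weighting, the following is a hedged reading, assumed for illustration only: 1D binary projections with adjustable limits, scored by the variance of a weighted ratio of the two projections (lower variance meaning better alignment):

```python
import numpy as np

def binary_projection(img, threshold=128, limits=None):
    """Row-wise count of foreground pixels, restricted to the valid rows
    of a reduced-dimension image via `limits` (as the projection limits
    are adjusted in the paper)."""
    proj = (img >= threshold).sum(axis=1).astype(float)
    return proj if limits is None else proj[limits[0]:limits[1]]

def ratio_variance(proj_a, proj_b, eps=1e-6):
    """One plausible 'variance of the weighted ratio' cost (assumption)."""
    ratio = (proj_a + eps) / (proj_b + eps)
    weights = proj_b / (proj_b.sum() + eps)       # weight rows by content
    mean = (weights * ratio).sum()
    return (weights * (ratio - mean) ** 2).sum()

rng = np.random.default_rng(1)
fixed = rng.integers(0, 256, (256, 256))
moving = np.roll(fixed, 3, axis=0)                # misaligned copy
print(ratio_variance(binary_projection(fixed, limits=(50, 206)),
                     binary_projection(moving, limits=(50, 206))))
```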

Keywords: binary projections, image registration, reduced-dimension images.

1556 Coupled Galerkin-DQ Approach for the Transient Analysis of Dam-Reservoir Interaction

Authors: S. A. Eftekhari

Abstract:

In this paper, a numerical algorithm using a coupled Galerkin-Differential Quadrature (DQ) method is proposed for the solution of the dam-reservoir interaction problem. The governing differential equation of motion of the dam structure is discretized by the Galerkin method, and the DQM is used to discretize the fluid domain. The resulting systems of ordinary differential equations are then solved by the Newmark time integration scheme. The mixed scheme combines the simplicity of the Galerkin method with the high accuracy and efficiency of the DQ method. Its accuracy and efficiency are demonstrated by comparing the calculated results with those in the existing literature. It is shown that highly accurate results can be obtained using a small number of Galerkin terms and DQM sampling points. The technique presented in this investigation is general and can be used to solve various fluid-structure interaction problems.
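
As a sketch of the DQ building block (not the dam-reservoir coupling itself), here are the classic explicit weighting coefficients for the first derivative on an arbitrary grid, in Shu's formulation, verified on a polynomial:

```python
import numpy as np

def dq_weights(x):
    """First-derivative DQ weight matrix on grid points x."""
    n = len(x)
    # M[i] = product over k != i of (x_i - x_k), from the Lagrange basis
    M = np.array([np.prod([x[i] - x[k] for k in range(n) if k != i])
                  for i in range(n)])
    D = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i != j:
                D[i, j] = M[i] / ((x[i] - x[j]) * M[j])
        D[i, i] = -D[i].sum()          # rows sum to zero (derivative of a constant)
    return D

x = np.linspace(0.0, 1.0, 7)
D = dq_weights(x)
print(np.allclose(D @ x**3, 3 * x**2))   # exact for low-order polynomials -> True
```

Because the weights reproduce derivatives of polynomials exactly up to the grid order, few sampling points suffice, which is the efficiency the abstract reports.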

Keywords: Dam-reservoir system, Differential quadrature method, Fluid-structure interaction, Galerkin method, Integral quadrature method.

1555 Voice Command Recognition System Based on MFCC and VQ Algorithms

Authors: Mahdi Shaneh, Azizollah Taheri

Abstract:

The goal of this project is to design a system to recognize voice commands. Most voice recognition systems contain two main modules: feature extraction and feature matching. In this project, the MFCC algorithm is used for the feature extraction module. Using this algorithm, the cepstral coefficients are calculated on the mel frequency scale. VQ (vector quantization) is used to reduce the amount of data and decrease computation time. In the feature matching stage, the Euclidean distance is applied as the similarity criterion. Because of the high accuracy of the algorithms used, the accuracy of this voice command system is high. With at least five repetitions of each command in a single training session, and two repetitions in each testing session, a zero error rate in command recognition is achieved.
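
A minimal sketch of the described pipeline, assuming librosa for MFCC extraction and SciPy's k-means for the VQ codebook; the coefficient count and codebook size are illustrative, not the project's settings:

```python
import numpy as np
import librosa
from scipy.cluster.vq import kmeans, vq

def mfcc_frames(audio, sr):
    """Cepstral coefficients on the mel scale, one row per frame."""
    return librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=13).T

def train_codebook(train_utterances, sr, codebook_size=16):
    """VQ codebook per command: k-means over all training MFCC frames."""
    frames = np.vstack([mfcc_frames(a, sr) for a in train_utterances])
    codebook, _ = kmeans(frames.astype(float), codebook_size)
    return codebook

def match(audio, sr, codebooks):
    """Pick the command whose codebook gives the lowest mean Euclidean distortion."""
    frames = mfcc_frames(audio, sr).astype(float)
    distortions = {cmd: vq(frames, cb)[1].mean() for cmd, cb in codebooks.items()}
    return min(distortions, key=distortions.get)

# Usage (assuming labelled recordings):
#   codebooks = {cmd: train_codebook(utts, sr) for cmd, utts in training.items()}
#   predicted = match(test_audio, sr, codebooks)
```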

Keywords: MFCC, Vector quantization, Vocal tract, Voice command.

1554 Accuracy of Autonomy Navigation of Unmanned Aircraft Systems through Imagery

Authors: Sidney A. Lima, Hermann J. H. Kux, Elcio H. Shiguemori

Abstract:

Unmanned Aircraft Systems (UAS) usually navigate through the Global Navigation Satellite System (GNSS) associated with an Inertial Navigation System (INS). However, GNSS accuracy can be degraded at any time, and the GNSS signal may even be lost. In addition, there is the possibility of malicious interference, known as jamming. An image navigation system can therefore solve the autonomy problem: if the GNSS is disabled or degraded, the image navigation system continues to provide coordinate information to the INS, preserving the autonomy of the system. This work aims to evaluate the accuracy of positioning through photogrammetry concepts. The methodology uses orthophotos and Digital Surface Models (DSM) as a reference to represent the object space, and photographs obtained during the flight to represent the image space. For the calculation of the coordinates of the perspective center and camera attitudes, it is necessary to know the coordinates of homologous points in the object space (orthophoto coordinates and DSM altitude) and image space (column and line of the photograph). Thus, if the homologous points can be identified automatically in real time, the coordinates and attitudes can be calculated with their respective accuracies. With the methodology applied in this work, maximum errors in the order of 0.5 m in positioning and 0.6º in camera attitude were verified, so navigation through images can reach accuracies equal to or better than GNSS receivers without differential correction. Therefore, navigating through images is a good alternative for enabling autonomous navigation.
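
A hedged sketch of the space-resection step on synthetic data: homologous points (object-space coordinates as from an orthophoto/DSM, image-space pixels as from the in-flight photograph) are used to recover camera position and attitude. OpenCV's PnP solver stands in for the photogrammetric adjustment; the camera model and pose below are invented:

```python
import numpy as np
import cv2

# Synthetic ground truth: a camera roughly 800 m above a block of terrain points
rng = np.random.default_rng(0)
object_pts = np.column_stack([rng.uniform(-200, 200, 8),
                              rng.uniform(-200, 200, 8),
                              rng.uniform(0, 30, 8)])      # E, N, DSM height (m)
K = np.array([[3000.0, 0, 2000.0],                         # assumed pinhole camera
              [0, 3000.0, 1500.0],
              [0, 0, 1.0]])
rvec_true = np.array([[0.02], [-0.01], [0.3]])
tvec_true = np.array([[5.0], [-3.0], [800.0]])
image_pts, _ = cv2.projectPoints(object_pts, rvec_true, tvec_true, K, None)

# Space resection: recover position and attitude from the correspondences
ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, K, None)
R, _ = cv2.Rodrigues(rvec)
print("perspective centre in object space:", (-R.T @ tvec).ravel())
```

Comparing the recovered pose against a trusted reference is how positioning and attitude errors like the 0.5 m / 0.6º figures would be quantified.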

Keywords: Autonomy, navigation, security, photogrammetry, remote sensing, spatial resection, UAS.

1553 Evaluating the Tracking Abilities of Microsoft HoloLens-1 for Small-Scale Industrial Processes

Authors: Kuhelee Chandel, Julia Åhlén, Stefan Seipel

Abstract:

This study evaluates the accuracy of Microsoft HoloLens (Version 1) for small-scale industrial activities, comparing its measurements to ground truth data from a Kuka Robotics arm. Two experiments were conducted to assess its position-tracking capabilities, revealing that the HoloLens device is effective for measuring the position of dynamic objects with small dimensions. However, its precision is affected by the velocity of the trajectory and its position within the device's field of view. While the HoloLens device may be suitable for small-scale tasks, its limitations for more complex and demanding applications requiring high precision and accuracy must be considered. The findings can guide the use of HoloLens devices in industrial applications and contribute to the development of more effective and reliable position-tracking systems.

Keywords: Augmented Reality, AR, Microsoft HoloLens, object tracking, industrial processes.

1552 Development of a Multi-Factorial Instrument for Accident Analysis Based on Systemic Methods

Authors: C. V. Pietreanu, S. E. Zaharia, C. Dinu

Abstract:

The present research is built on three major pillars, commencing with some considerations on accident investigation methods and pointing out both the defining aspects of and the differences between linear and non-linear analysis. The traditional linear focus of accident analysis describes accidents as a sequence of events, while the latest systemic models outline the interdependencies between different factors and define the evolution of processes relative to a specific (normal) situation. Linear and non-linear accident analysis methods have specific limitations, so the second point of interest is the aim of discovering the drawbacks of systemic models, which becomes a starting point for developing new directions for identifying risks or data closer to the cause of incidents/accidents. Since communication represents a critical issue in the interaction of the human factor and has been shown to be at the root of problems caused by possible breakdowns in different communication procedures, from this focus point, as the third pillar, a new error-modeling instrument suitable for risk assessment/accident analysis is elaborated.

Keywords: Accident analysis, multi-factorial error modeling, risk, systemic methods.

1551 A Method to Enhance the Accuracy of Digital Forensic in the Absence of Sufficient Evidence in Saudi Arabia

Authors: Fahad Alanazi, Andrew Jones

Abstract:

Digital forensics seeks to achieve the successful investigation of digital crimes by obtaining acceptable evidence from digital devices that can be presented in a court of law. Thus, a digital forensics investigation is normally performed through a number of phases in order to achieve the required level of accuracy. Since 1984, a number of models and frameworks have been developed to support digital investigation processes. In this paper, we review a number of the investigation process models produced over the years and introduce a proposed digital forensic model based on the scope of the Saudi Arabian investigation process. The proposed model integrates existing models of the investigation process and adds a new phase to deal with situations where there is initially insufficient evidence.

Keywords: Digital forensics, Process, Metadata, Traceback, Saudi Arabia.

1550 Optimized Brain Computer Interface System for Unspoken Speech Recognition: Role of Wernicke Area

Authors: Nassib Abdallah, Pierre Chauvet, Abd El Salam Hajjar, Bassam Daya

Abstract:

In this paper, we propose an optimized brain computer interface (BCI) system for unspoken speech recognition, based on the fact that the construction of unspoken words relies strongly on the Wernicke area, situated in the temporal lobe. Our BCI system has four modules: (i) the EEG Acquisition module, based on a non-invasive headset with 14 electrodes; (ii) the Preprocessing module, which removes noise and artifacts using the Common Average Reference method; (iii) the Feature Extraction module, using the Wavelet Packet Transform (WPT); (iv) the Classification module, based on a one-hidden-layer artificial neural network. The present study compares the recognition accuracy for 5 Arabic words when using all the headset electrodes or only the 4 electrodes situated near the Wernicke area, as well as the effect of selecting the subbands produced by the WPT module. After applying the artificial neural network to the produced database, we obtain, on the test dataset, an accuracy of 83.4% with all the electrodes and all the subbands of the 8-level WPT decomposition. However, by using only the 4 electrodes near the Wernicke area and the 6 middle subbands of the WPT, we obtain a large reduction of the dataset size, to approximately 19% of the total dataset, with a 67.5% accuracy rate. This reduction appears particularly important for the design of a low-cost, simple-to-use BCI trained for several words.
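
A sketch of modules (iii) and (iv) under stated assumptions: wavelet-packet subband energies as features and a one-hidden-layer network. The electrode count, wavelet, decomposition level, and data are placeholders rather than the paper's exact configuration:

```python
import numpy as np
import pywt
from sklearn.neural_network import MLPClassifier

def wpt_energies(signal, wavelet="db4", level=4):
    """Energy of each wavelet-packet subband at the chosen level."""
    wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, maxlevel=level)
    return np.array([np.sum(node.data ** 2) for node in wp.get_level(level)])

def epoch_features(epoch):
    """Concatenate subband energies over the channels near Wernicke's area."""
    return np.concatenate([wpt_energies(ch) for ch in epoch])

rng = np.random.default_rng(0)
epochs = rng.standard_normal((40, 4, 256))   # 40 trials, 4 channels, 256 samples
labels = rng.integers(0, 5, 40)              # 5 unspoken words
X = np.array([epoch_features(e) for e in epochs])
clf = MLPClassifier(hidden_layer_sizes=(50,), max_iter=2000).fit(X, labels)
```

Selecting a subset of subbands, as the paper does with the 6 middle ones, simply means keeping a slice of each energy vector before concatenation.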

Keywords: Brain-computer interface, speech recognition, electroencephalography EEG, Wernicke area, artificial neural network.

1549 An Investigation of Direct and Indirect Geo-Referencing Techniques on the Accuracy of Points in Photogrammetry

Authors: F. Yildiz, S. Y. Oturanc

Abstract:

Advances in photogrammetric technology have replaced analog cameras with digital aerial cameras combined with airborne GPS/IMU systems. In such a system, while the position of the camera is determined with GPS, the camera rotations are determined by the IMU. All around the world, digital aerial cameras have been used for photogrammetry applications over the last ten years. In this way, photogrammetric work can use time effectively, costs can be reduced to a minimum level, and results can be obtained quickly and accurately. Geo-referencing techniques, which are the cornerstone of GPS/INS systems, bring flexibility to the photogrammetric triangulation of images required for adjustment (interior and exterior orientation). The geo-referencing process also helps to reduce the number of ground control points needed in photogrammetric applications. In this study, the effect of direct and indirect geo-referencing techniques on the accuracy of points was investigated in the production of photogrammetric mapping.

Keywords: Photogrammetry, GPS/IMU Systems, Geo-Referencing.

1548 Mechanical Quadrature Methods for Solving First Kind Boundary Integral Equations of Stationary Stokes Problem

Authors: Xin Luo, Jin Huang, Pan Cheng

Abstract:

By means of Sidi-Israeli's quadrature rules, mechanical quadrature methods (MQMs) for solving the first-kind boundary integral equations (BIEs) of the steady-state Stokes problem are presented. The convergence of the numerical solutions obtained by MQMs is proved based on Anselone's collectively compact and asymptotically compact theory, and asymptotic expansions with odd powers of the errors are provided, which implies that the approximations by MQMs possess a high accuracy order of O(h³). Finally, numerical examples show the efficiency of our methods.

Keywords: Stokes problem, boundary integral equation, mechanical quadrature methods, asymptotic expansions.

1547 Fuzzy Metric Approach for Fuzzy Time Series Forecasting based on Frequency Density Based Partitioning

Authors: Tahseen Ahmed Jilani, Syed Muhammad Aqil Burney, C. Ardil

Abstract:

In the last 15 years, a number of methods have been proposed for forecasting based on fuzzy time series. Most fuzzy time series methods are presented for forecasting enrollments at the University of Alabama. However, the forecasting accuracy rates of the existing methods are not good enough. In this paper, we compare our proposed new method of fuzzy time series forecasting with existing methods. Our method is based on frequency-density-based partitioning of the historical enrollment data. The proposed method belongs to the class of kth-order, time-variant methods. The proposed method achieves a better forecasting accuracy rate for enrollments than the existing methods.
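
A hedged illustration of frequency-density-based partitioning: interval boundaries placed at equal-frequency quantiles, so denser regions of the history receive narrower intervals. The enrollment figures and interval count below are illustrative, not the Alabama dataset:

```python
import numpy as np

def frequency_density_partition(history, n_intervals=7):
    """Interval boundaries at equal-frequency quantiles of the history."""
    qs = np.linspace(0, 1, n_intervals + 1)
    return np.quantile(np.asarray(history, float), qs)

def fuzzify(value, bounds):
    """Index of the fuzzy set (interval) a crisp observation falls into."""
    return int(np.clip(np.searchsorted(bounds, value) - 1, 0, len(bounds) - 2))

history = [13055, 13563, 13867, 14696, 15460, 15311, 15603,
           15861, 16807, 16919, 16388, 15433, 15497, 15145]
bounds = frequency_density_partition(history)
print([fuzzify(v, bounds) for v in history])      # fuzzified enrollments
```

The forecasting step then builds kth-order fuzzy logical relationships between these interval indices, which this sketch does not cover.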

Keywords: Fuzzy logical groups, fuzzified enrollments, fuzzy sets, fuzzy time series.

1546 EZW Coding System with Artificial Neural Networks

Authors: Saudagar Abdul Khader Jilani, Syed Abdul Sattar

Abstract:

Image compression plays a vital role in today's communication. The limitation in allocated bandwidth leads to slower communication; to enhance the rate of transmission within the limited bandwidth, image data must be compressed before transmission. Basically there are two types of compression: 1) lossy compression and 2) lossless compression. Lossy compression, though it gives more compression than lossless compression, offers lower accuracy in retrieval. The JPEG and JPEG2000 image compression systems use Huffman coding for image compression. The JPEG2000 coding system uses the wavelet transform, which decomposes the image into different levels, where the coefficients in each subband are uncorrelated with the coefficients of other subbands. Embedded Zerotree Wavelet (EZW) coding exploits the multi-resolution properties of the wavelet transform to give a computationally simple algorithm with better performance compared to existing wavelet transforms. For further improvement of compression applications, other coding methods have recently been suggested; an ANN-based approach is one such method. Artificial Neural Networks have been applied to many problems in image processing and have demonstrated their superiority over classical methods when dealing with noisy or incomplete data in image compression applications. A performance analysis on different images is proposed, with an analysis of the EZW coding system combined with the error backpropagation algorithm. The implementation and analysis show approximately 30% more accuracy in the retrieved image compared to the existing EZW coding system.

Keywords: Accuracy, Compression, EZW, JPEG2000, Performance.

1545 Wireless Body Area Network’s Mitigation Method Using Equalization

Authors: Savita Sindhu, Shruti Vashist

Abstract:

A wireless body area sensor network (WBASN) is composed of a central node and heterogeneous sensors that supervise the physiological signals and functions of the human body. This rapidly growing area has stimulated new research and calibration processes, especially regarding WBASN attainment and fidelity. In the era of mobility and overlapping WBASNs, system performance degrades considerably because of unstable signal integrity. Hence, it is mandatory to include mitigation techniques in the design to avoid interference. Various mitigation methods are available, e.g. diversity techniques, equalization, and Viterbi decoding. This paper presents an equalization mitigation scheme for WBASNs to improve signal integrity. Eye diagrams are also given to represent the accuracy of the signal. A maximum number of symbols is taken to validate the signal, which in turn results in accuracy and increases the overall performance of the system.
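
One of the mitigation options named above, sketched minimally: an LMS adaptive equalizer in training mode. The channel taps, step size, and filter length are assumed values, not the paper's configuration:

```python
import numpy as np

def lms_equalize(received, desired, n_taps=11, mu=0.01):
    """Adapt FIR weights to undo channel distortion (training mode)."""
    w = np.zeros(n_taps)
    out = np.zeros_like(received)
    for n in range(n_taps - 1, len(received)):
        x = received[n - n_taps + 1:n + 1][::-1]   # newest sample first
        out[n] = w @ x
        w += mu * (desired[n] - out[n]) * x        # LMS weight update
    return out, w

rng = np.random.default_rng(0)
symbols = rng.choice([-1.0, 1.0], 5000)                  # BPSK training symbols
channel = np.array([1.0, 0.4, 0.2])                      # assumed multipath channel
rx = np.convolve(symbols, channel, mode="full")[:5000]
rx += 0.05 * rng.standard_normal(5000)                   # on-body noise
eq, weights = lms_equalize(rx, symbols)
ser = np.mean(np.sign(eq[1000:]) != symbols[1000:])      # post-convergence errors
print(f"symbol error rate after equalization: {ser:.4f}")
```

An RLS variant (also listed in the keywords) converges faster at the cost of more computation per symbol.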

Keywords: Wireless body area network, equalizer, RLS, LMS.

1544 Documents Emotions Classification Model Based on TF-IDF Weighting Measure

Authors: Amr Mansour Mohsen, Hesham Ahmed Hassan, Amira M. Idrees

Abstract:

Emotion classification of text documents is applied to reveal whether a document expresses a determined emotion of its writer. As different supervised methods have previously been used for emotion classification of documents, in this research we present a novel model that supports the classification algorithms, achieving more accurate results with the support of the TF-IDF measure. Different experiments have been applied to demonstrate the applicability of the proposed model; the model succeeds in raising the accuracy percentage according to the determined metrics (precision, recall, and f-measure) based on refining the lexicon, integrating lexicons from different perspectives, and applying the TF-IDF weighting measure over the classifying features. The proposed model has also been compared with other research to prove its competence in raising the results' accuracy.
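
The paper works in the WEKA tool; an equivalent minimal pipeline in scikit-learn, with toy documents standing in for a labelled emotion corpus, looks like this:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

docs = ["I am thrilled with this wonderful news",
        "This loss leaves me sad and empty",
        "Such a delightful, happy surprise",
        "I feel gloomy and heartbroken today"]
emotions = ["joy", "sadness", "joy", "sadness"]

# TF-IDF weighting over the classifying features, then a supervised classifier
model = make_pipeline(TfidfVectorizer(), MultinomialNB())
model.fit(docs, emotions)
print(model.predict(["what a happy day"]))   # -> ['joy']
```

The lexicon refinement and integration steps described above would act on the vocabulary before vectorization.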

Keywords: Emotion detection, TF-IDF, WEKA tool, classification algorithms.

1543 On Enhancing Robustness of an Evolutionary Fuzzy Tracking Controller

Authors: H. Megherbi, A. C. Megherbi, N. Megherbi, K. Benmahamed

Abstract:

This paper presents a three-phase evolutionary search methodology to automatically design fuzzy logic controllers (FLCs) that can work in a wide range of operating conditions, including varying load, parameter variations, and unknown external disturbances. The three-phase scheme consists of an exploration phase, an exploitation phase, and a robustness phase. The first two phases search for FLCs with high accuracy, while the last phase aims at obtaining the FLC providing the best compromise between accuracy and robustness. Simulations were performed for a direct-drive two-axis robot arm. The FLC evolved with the proposed design technique was found to provide very satisfactory performance under a wide range of operating conditions and to overcome the problems associated with the coupling and nonlinearity characteristics inherent to robot arms.

Keywords: Fuzzy logic control, evolutionary algorithms, robustness, exploration/exploitation phase.

1542 Determination of Sequential Best Replies in N-player Games by Genetic Algorithms

Authors: Mattheos K. Protopapas, Elias B. Kosmatopoulos

Abstract:

An iterative algorithm is proposed and tested on Cournot game models; it is based on the convergence of sequential best responses and the use of a genetic algorithm for determining each player's best response to a given strategy profile of its opponents. An extra outer loop is used to address the problem of finite accuracy, which is inherent in genetic algorithms, since the set of feasible values in such an algorithm is finite. The algorithm is tested on five Cournot models, three of which have a convergent best-reply sequence, one with divergent sequential best replies, and one with "local NE traps" [14], where classical local search algorithms fail to identify the Nash equilibrium. After a series of simulations, we conclude that the proposed algorithm converges to the Nash equilibrium, with any level of accuracy needed, in all but the case where the sequential best replies process diverges.
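
A hedged sketch of the scheme on a linear Cournot model with invented demand and cost parameters: a tiny genetic algorithm searches each player's best response, and sequential best replies iterate toward the Nash equilibrium:

```python
import numpy as np

rng = np.random.default_rng(0)
a, b, c = 100.0, 1.0, 10.0                    # inverse demand P = a - b*Q, unit cost c

def profit(q_i, q_others):
    return (a - b * (q_i + q_others) - c) * q_i

def ga_best_response(q_others, pop=40, gens=60, sigma=2.0):
    """Evolve a population of candidate quantities toward the best reply."""
    population = rng.uniform(0, a / b, pop)
    for _ in range(gens):
        fitness = profit(population, q_others)
        parents = population[np.argsort(fitness)[-pop // 2:]]   # truncation selection
        children = parents + rng.normal(0, sigma, parents.size) # Gaussian mutation
        population = np.clip(np.concatenate([parents, children]), 0, a / b)
    return population[np.argmax(profit(population, q_others))]

q = np.array([20.0, 20.0, 20.0])              # three firms, arbitrary start
for _ in range(30):                           # sequential best replies
    for i in range(len(q)):
        q[i] = ga_best_response(q.sum() - q[i])
print(q)   # analytic Nash for 3 symmetric firms: (a - c) / (4 * b) = 22.5 each
```

An outer loop that restarts the GA with a shrinking mutation scale around the current profile would play the role of the accuracy-refinement loop described in the abstract.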

Keywords: Best response, Cournot oligopoly, genetic algorithms, Nash equilibrium.

1541 Faults Forecasting System

Authors: Hanaa E.Sayed, Hossam A. Gabbar, Shigeji Miyazaki

Abstract:

This paper presents a Faults Forecasting System (FFS) that utilizes statistical forecasting techniques to analyze process variable data in order to forecast fault occurrences. FFS proposes a new idea in fault detection. Current fault detection techniques are based on analyzing the current status of the system variables in order to check whether that status is faulty or not. FFS instead uses forecasting techniques to predict the future timing of faults before they happen. The proposed model applies a subset modeling strategy and a Bayesian approach in order to decrease the dimensionality of the process variables and improve the fault forecasting accuracy. A practical experiment was designed and implemented at Okayama University, Japan, and the comparison shows that our proposed model achieves high forecasting accuracy, ahead of time.

Keywords: Bayesian Techniques, Faults Detection, Forecasting techniques, Multivariate Analysis.

1540 A Comparison among Wolf Pack Search and Four other Optimization Algorithms

Authors: Shahla Shoghian, Maryam Kouzehgar

Abstract:

The main objective of this paper is to compare the Wolf Pack Search (WPS), a newly introduced intelligent algorithm, with several other known algorithms, including Particle Swarm Optimization (PSO), Shuffled Frog Leaping (SFL), and binary and continuous genetic algorithms. All algorithms are applied to two benchmark cost functions. The aim is to identify the best algorithm in terms of speed and accuracy in finding the solution, where speed is measured in terms of function evaluations. The simulation results show that the SFL algorithm, with fewer function evaluations, ranks first if simulation time is important, while if accuracy is the significant issue, WPS and PSO have better performance.

Keywords: Wolf Pack Search, Particle Swarm Optimization, Continuous Genetic Algorithm, Binary Genetic Algorithm, Shuffled Frog Leaping, Optimization.

1539 Accuracy of Small Field of View CBCT in Determining Endodontic Working Length

Authors: N. L. S. Ahmad, Y. L. Thong, P. Nambiar

Abstract:

An in vitro study was carried out to evaluate the feasibility of small field of view (FOV) cone beam computed tomography (CBCT) in determining endodontic working length. The objectives were to determine the accuracy of CBCT in measuring the estimated preoperative working lengths (EPWL), endodontic working lengths (EWL) and file lengths. Access cavities were prepared in 27 molars. For each root canal, the baseline electronic working length was determined using an electronic apex locator (EAL; Raypex 5). The teeth were then divided into overextended, non-modified and underextended groups and the lengths were adjusted accordingly. Imaging and measurements were made using the respective software of the RVG (Kodak RVG 6100) and CBCT units (Kodak 9000 3D). Root apices were then shaved and the apical constrictions viewed under magnification to measure the control working lengths. The paired t-test showed a statistically significant difference between the CBCT EPWL and the control length, but the difference was too small to be clinically significant. From the Bland-Altman analysis, the CBCT method had the widest 95% limits of agreement, reflecting its greater potential for error. In measuring file lengths, RVG had wider 95% limits of agreement than CBCT. Conclusions: (1) The clinically insignificant underestimation of the preoperative working length using small FOV CBCT shows that it is acceptable for estimating the preoperative working length. (2) Small FOV CBCT may be used in working length determination, but it is not as accurate as the currently practiced method of using the EAL. (3) It is also more accurate than RVG in measuring file lengths.
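
For reference, the Bland-Altman agreement computation used above reduces to the bias and the 95% limits of agreement of the paired differences; the measurement pairs below are invented for illustration:

```python
import numpy as np

def bland_altman(method, control):
    """Bias and 95% limits of agreement between two measurement methods."""
    diff = np.asarray(method, float) - np.asarray(control, float)
    bias = diff.mean()
    loa = 1.96 * diff.std(ddof=1)            # half-width of the 95% limits
    return bias, bias - loa, bias + loa

cbct = [20.1, 21.4, 19.8, 22.3, 20.6]        # working lengths in mm (illustrative)
ctrl = [20.3, 21.2, 20.1, 22.0, 20.9]
print("bias, lower, upper:", bland_altman(cbct, ctrl))
```

A wider interval between the lower and upper limits is exactly the "greater potential for error" the study reports for CBCT.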

Keywords: Accuracy, CBCT, endodontic, measurement.

1538 Fusion of Colour and Depth Information to Enhance Wound Tissue Classification

Authors: Darren Thompson, Philip Morrow, Bryan Scotney, John Winder

Abstract:

Patients with diabetes are susceptible to chronic foot wounds which may be difficult to manage and slow to heal. Diagnosis and treatment currently rely on the subjective judgement of experienced professionals. An objective method of tissue assessment is required. In this paper, a data fusion approach was taken to wound tissue classification. The supervised Maximum Likelihood and unsupervised Multi-Modal Expectation Maximisation algorithms were used to classify tissues within simulated wound models by weighting the contributions of both colour and 3D depth information. It was found that, at low weightings, depth information could show significant improvements in classification accuracy when compared to classification by colour alone, particularly when using the maximum likelihood method. However, larger weightings were found to have an entirely negative effect on accuracy.
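
A hedged sketch of the weighted-fusion idea: a Gaussian maximum-likelihood classifier over colour features with the depth channel scaled by a fusion weight, so its contribution to the class likelihood can be tuned. The features, classes, and weight are synthetic stand-ins:

```python
import numpy as np
from scipy.stats import multivariate_normal

def fit_ml(X, y):
    """Per-class mean and covariance for maximum-likelihood scoring."""
    return {c: (X[y == c].mean(axis=0), np.cov(X[y == c].T)) for c in np.unique(y)}

def classify_ml(X, params):
    scores = np.column_stack([multivariate_normal.logpdf(X, m, S)
                              for m, S in params.values()])
    return np.array(list(params))[scores.argmax(axis=1)]

rng = np.random.default_rng(0)
y = rng.integers(0, 3, 300)                               # 3 tissue types
colour = rng.normal(100, 10, (300, 3)) + 15 * y[:, None]  # class-shifted colour
depth = rng.normal(2.0, 0.5, (300, 1)) + 0.3 * y[:, None] # depth varies by tissue
w_depth = 0.3                                             # fusion weight for depth
X = np.hstack([colour, w_depth * depth])
preds = classify_ml(X, fit_ml(X, y))
print("training accuracy:", (preds == y).mean())
```

Sweeping w_depth from 0 upward is the kind of experiment that would reveal the low-weighting benefit and high-weighting penalty described above.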

Keywords: Classification, data fusion, diabetic foot, stereophotogrammetry, tissue colour.

1537 Level Set and Morphological Operation Techniques in Application of Dental Image Segmentation

Authors: Abdolvahab Ehsani Rad, Mohd Shafry Mohd Rahim, Alireza Norouzi

Abstract:

Medical image analysis is one of the major applications of computer image processing. There are several processes for analysing medical images, of which segmentation is one of the most challenging and important steps. In this paper, a segmentation method is proposed for dental radiograph images. A thresholding method is applied to simplify the images, and a morphological opening of the binary image is performed to eliminate unnecessary regions. Furthermore, horizontal and vertical integral projection techniques are used to extract each individual tooth from the radiograph images. Segmentation is then performed by applying the level set method to each extracted image. The experimental results, with 90% accuracy, demonstrate that the proposed method achieves high accuracy and promising results.
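
A minimal sketch of the pre-segmentation steps described above (thresholding, morphological opening, and integral projections); the level-set refinement itself is omitted and the radiograph is synthetic:

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
radiograph = rng.integers(0, 90, (200, 300)).astype(float)
radiograph[40:160, 60:120] += 120             # two bright "teeth"
radiograph[40:160, 170:230] += 120

binary = radiograph > 128                                   # thresholding
opened = ndimage.binary_opening(binary, structure=np.ones((5, 5)))

# Vertical integral projection: column sums locate each tooth and the gaps
vertical_profile = opened.sum(axis=0)
labels, n_teeth = ndimage.label(vertical_profile > 0)
for t in range(1, n_teeth + 1):
    cols = np.where(labels == t)[0]
    print(f"tooth {t}: columns {cols.min()}-{cols.max()}")
```

Each extracted column span (combined with the horizontal projection for row limits) would then be handed to the level set method for the final contour.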

Keywords: Integral projection, level set method, morphological operation, segmentation.
