Search results for: Data cutting and sorting method
12575 Data Mining Approach for Commercial Data Classification and Migration in Hybrid Storage Systems
Authors: Mais Haj Qasem, Maen M. Al Assaf, Ali Rodan
Abstract:
Parallel hybrid storage systems consist of a hierarchy of different storage devices that vary in data reading speed. As we ascend the hierarchy, data reading becomes faster. Thus, migrating the application's important data that will be accessed in the near future to the uppermost level will reduce the application's I/O waiting time and hence its execution elapsed time. In this research, we implement a trace-driven two-level parallel hybrid storage system prototype that consists of HDDs and SSDs. The prototype uses data mining techniques to classify the application's data in order to determine its near-future data accesses, in parallel with its on-demand requests. The important data (i.e. the data that the application will access in the near future) are continuously migrated to the uppermost level of the hierarchy. Our simulation results show that our data migration approach, integrated with data mining techniques, reduces the application execution elapsed time by at least 22% across a variety of traces.
Keywords: Data mining, hybrid storage system, recurrent neural network, support vector machine.
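As a rough illustration of the classification step only, the sketch below trains an SVM on hypothetical recency/frequency features of storage blocks and flags likely near-future accesses for migration to the fast tier; the features, thresholds and synthetic labels are assumptions, and this is not the authors' trace-driven prototype.

```python
# Minimal sketch: classify blocks as "hot" (candidates for SSD migration) or "cold"
# with an SVM, using hypothetical recency/frequency features of an access trace.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_blocks = 2000
recency = rng.exponential(100.0, n_blocks)     # time since last access (synthetic)
frequency = rng.poisson(3.0, n_blocks)         # accesses in a past window (synthetic)
X = np.column_stack([recency, frequency])
# Hypothetical ground truth: recently and frequently used blocks are accessed again soon.
y = ((frequency > 3) & (recency < 80)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = SVC(kernel="rbf", gamma="scale").fit(X_tr, y_tr)
hot = clf.predict(X_te)                        # 1 -> candidate for migration to the SSD tier
print("predicted hot blocks:", hot.sum(), "accuracy:", clf.score(X_te, y_te))
```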
12574 Applying Spanning Tree Graph Theory for Automatic Database Normalization
Authors: Chetneti Srisa-an
Abstract:
In the field of knowledge and data engineering, the relational database is the principal repository for storing real-world data. It has been in use around the world for more than eight decades. Normalization is the most important process in the analysis and design of relational databases. It aims at creating a set of relational tables with minimum data redundancy that preserve consistency and facilitate correct insertion, deletion, and modification. Despite its importance, very few algorithms have been developed for use in commercial automatic normalization tools, and it is still rare for normalization to be done automatically rather than manually; moreover, for today's large and complex databases, doing it manually is even harder. This paper presents a new, fully automated relational database normalization method. It first produces the directed graph and its spanning tree, and then proceeds to generate the 2NF, 3NF and BCNF normal forms. The benefit of this new algorithm is that it can cope with a large set of complex functional dependencies.
Keywords: Relational Database, Functional Dependency, Automatic Normalization, Primary Key, Spanning tree.
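For orientation, a minimal sketch of the attribute-closure computation that underlies functional-dependency-based normalization (it decides candidate keys and which FDs violate 2NF/3NF/BCNF); the relation and FDs below are an illustrative example, not the paper's spanning-tree algorithm.

```python
# Attribute closure under a set of functional dependencies.
def closure(attrs, fds):
    """attrs: set of attributes; fds: list of (lhs_set, rhs_set) pairs."""
    result = set(attrs)
    changed = True
    while changed:
        changed = False
        for lhs, rhs in fds:
            if lhs <= result and not rhs <= result:   # FD applies and adds something new
                result |= rhs
                changed = True
    return result

# Example relation R(A, B, C, D) with FDs A -> B and B -> CD.
fds = [({"A"}, {"B"}), ({"B"}, {"C", "D"})]
print(closure({"A"}, fds))   # {'A', 'B', 'C', 'D'} -> A is a candidate key
print(closure({"B"}, fds))   # {'B', 'C', 'D'}      -> B -> CD violates BCNF (B is not a key)
```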
12573 Obtain the Stress Intensity Factor (SIF) in a Medium Containing a Penny-Shaped Crack by the Ritz Method
Authors: A. Tavangari, N. Salehzadeh
Abstract:
In crack growth analysis, the Stress Intensity Factor (SIF) is a fundamental prerequisite. In the present study, the mode I stress intensity factor of a three-dimensional penny-shaped crack is obtained in an isotropic elastic cylindrical medium of arbitrary dimensions, under arbitrary loading at the top of the cylinder, by a semi-analytical method based on the Rayleigh-Ritz method. This method, which is based on minimizing the total potential energy of the system, gives results very close to those of previous studies. The basis of this research is to define the displacements (elastic fields) by hypothetical functions in a defined coordinate system, so the appropriate terms must be found to create the singularity conditions at the tip of the crack.
Keywords: Penny-shaped crack, Stress intensity factor, Fracture mechanics, Ritz method.
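For reference, the classical infinite-body benchmark against which such Ritz solutions are usually checked (Sneddon's result for a penny-shaped crack of radius a under remote tension), together with the Ritz stationarity condition; the paper itself treats a finite cylinder, so this is only a reference point.

```latex
% Mode I SIF of a penny-shaped crack of radius a in an infinite body
% under remote tension \sigma normal to the crack plane (Sneddon):
\[
  K_I = \frac{2}{\pi}\,\sigma\sqrt{\pi a} \;=\; 2\sigma\sqrt{\tfrac{a}{\pi}} .
\]
% Rayleigh-Ritz: with the displacement field approximated as a series
% u(x) \approx \sum_i c_i\,\phi_i(x), the coefficients follow from making the
% total potential energy \Pi of the system stationary:
\[
  \frac{\partial \Pi(c_1,\dots,c_n)}{\partial c_i} = 0, \qquad i = 1,\dots,n .
\]
```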
12572 Application of Differential Transformation Method for Solving Dynamical Transmission of Lassa Fever Model
Authors: M. A. Omoloye, M. I. Yusuff, O. K. S. Emiola
Abstract:
The use of mathematical models for solving biological problems varies from simple to complex analyses, depending on the nature of the research problem and the applicability of the models, and is increasingly common. Many complex models become impractical to treat analytically; however, alternative approaches such as numerical methods can be employed. The Differential Transformation Method (DTM), which is based on the Taylor series, is well suited to solving both linear and non-linear model equations. Hence this study investigates the application of DTM to solve a model of the dynamic transmission of Lassa fever in a population. The mathematical model was formulated using first-order differential equations. First, existence and uniqueness of the solution were established to show that the model is mathematically well posed for the application of DTM. Numerical simulations were then conducted to compare the results obtained by DTM with those of the fourth-order Runge-Kutta method. As shown, DTM is very effective in predicting the solution of the Lassa fever epidemic model.
Keywords: Differential Transform Method, Existence and uniqueness, Lassa fever, Runge-Kutta Method.
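A minimal sketch of the DTM recursion on a single logistic equation dx/dt = r·x·(1−x), used here as a stand-in for one compartment of an epidemic model (not the paper's full Lassa fever system); it uses the standard transform rules T[dx/dt](k) = (k+1)X(k+1) and T[x·y](k) = Σ X(l)Y(k−l) and checks the truncated series against the exact solution.

```python
import numpy as np

def dtm_logistic(x0, r, order):
    """Differential transform coefficients X(k) of the logistic equation."""
    X = np.zeros(order + 1)
    X[0] = x0
    for k in range(order):
        conv = sum(X[l] * X[k - l] for l in range(k + 1))   # transform of x*x
        X[k + 1] = r * (X[k] - conv) / (k + 1)              # from (k+1)X(k+1) = r(X(k) - conv)
    return X

def evaluate(X, t):
    return sum(c * t**k for k, c in enumerate(X))           # truncated Taylor series at t0 = 0

X = dtm_logistic(x0=0.1, r=0.5, order=12)
t = 1.0
exact = 0.1 * np.exp(0.5 * t) / (1 - 0.1 + 0.1 * np.exp(0.5 * t))
print(evaluate(X, t), exact)   # the two values should agree closely for moderate t
```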
12571 Permanent Magnet Machine Can Be a Vibration Sensor for Itself
Authors: M. Barański
Abstract:
This article presents a new vibration diagnostic method designed for machines with permanent magnets (PM machines). Such devices are commonly used in small wind and water power systems and in vehicle drives. The author's method is innovative and unique: it exploits a specific structural property of PM machines, namely the electromotive force (EMF) generated due to vibrations. A number of publications describing vibration diagnostic methods and tests of electrical PM machines were analysed, and no method was found that determines the technical condition of such a machine based on its own signals. This article discusses the genesis of the method, the similarity of permanent magnet machines to vibration sensors, and the results of simulation and laboratory tests. The method of determining the technical condition of an electrical machine with permanent magnets from its own signals is the subject of a patent application and is the main thesis of the author's doctoral dissertation.
Keywords: Electrical vehicle, generator, permanent magnet, traction drive, vibrations.
12570 Integrating Fast Karnough Map and Modular Neural Networks for Simplification and Realization of Complex Boolean Functions
Authors: Hazem M. El-Bakry
Abstract:
In this paper a new fast simplification method is presented. The method realizes Karnough maps with a large number of variables. In order to accelerate the operation of the proposed method, a new approach for fast detection of groups of ones is presented. This approach is implemented in the frequency domain: the search operation relies on performing cross-correlation in the frequency domain rather than in the time domain. It is proved mathematically and practically that the number of computation steps required by the presented method is less than that needed by conventional cross-correlation. Simulation results using MATLAB confirm the theoretical computations. Furthermore, a powerful solution for the realization of complex functions is given. The simplified functions are implemented using a new design of neural networks. Neural networks are used because they are fault tolerant and as a result can recognize signals even with noise or distortion, which is very useful for logic functions used in data and computer communications. Moreover, the implemented functions are realized with a minimum amount of components. This is done by using modular neural nets (MNNs) that divide the input space into several homogenous regions. The approach is applied to implement the XOR function, 16 logic functions on the one-bit level, and a 2-bit digital multiplier. Compared to previous non-modular designs, a clear reduction in the order of computations and hardware requirements is achieved.
Keywords: Boolean Functions, Simplification, Karnough Map, Implementation of Logic Functions, Modular Neural Networks.
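A minimal sketch of the frequency-domain speed-up idea: cross-correlation computed via the FFT (correlation theorem) to locate a "group of ones" pattern in a binary sequence; the data and pattern below are illustrative, not the paper's design.

```python
import numpy as np

def xcorr_fft(data, pattern):
    """Linear cross-correlation via zero-padded FFTs: c[k] = sum_n data[n+k] * pattern[n]."""
    n = len(data) + len(pattern) - 1
    nfft = 1 << (n - 1).bit_length()                    # next power of two
    D = np.fft.rfft(data, nfft)
    P = np.fft.rfft(pattern, nfft)
    return np.fft.irfft(D * np.conj(P), nfft)[:n]       # correlation = IFFT(D * conj(P))

data = np.array([0, 1, 1, 1, 0, 0, 1, 1, 0, 1, 1, 1, 1, 0], dtype=float)
pattern = np.ones(3)                                     # a "group of ones" of length 3
c = xcorr_fft(data, pattern)
print(np.where(np.isclose(c[:len(data)], 3))[0])         # start indices of full matches: [1 9 10]
```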
12569 Meteorological Data Study and Forecasting Using Particle Swarm Optimization Algorithm
Authors: S. Esfandeh, M. Sedighizadeh
Abstract:
Weather systems use enormously complex combinations of numerical tools for study and forecasting. Unfortunately, due to phenomena in the world climate, such as the greenhouse effect, classical models may become insufficient, mostly because they lack adaptation. Therefore, the weather forecast problem is well matched to heuristic approaches, such as evolutionary algorithms. Experimentation with heuristic methods like the Particle Swarm Optimization (PSO) algorithm can lead to the development of new insights or promising models that can be fine-tuned with more focused techniques. This paper describes a PSO approach for the analysis and prediction of data and provides experimental results of the aforementioned method on real-world meteorological time series.
Keywords: Weather, Climate, PSO, Prediction, Meteorological
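A minimal global-best PSO sketch minimizing a toy objective; in a forecasting setting the objective would instead be a prediction error on the meteorological series, and the inertia/acceleration constants below are common textbook values, not the paper's settings.

```python
import numpy as np

def pso(f, dim, n_particles=30, iters=200, w=0.72, c1=1.49, c2=1.49, bounds=(-5, 5)):
    rng = np.random.default_rng(1)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))           # positions
    v = np.zeros_like(x)                                   # velocities
    pbest, pbest_val = x.copy(), np.apply_along_axis(f, 1, x)
    g = pbest[pbest_val.argmin()].copy()                   # global best
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        val = np.apply_along_axis(f, 1, x)
        improved = val < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], val[improved]
        g = pbest[pbest_val.argmin()].copy()
    return g, pbest_val.min()

sphere = lambda p: float(np.sum(p**2))
best, best_val = pso(sphere, dim=3)
print(best, best_val)    # should be close to the origin with value near 0
```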
12568 Application of GM (1, 1) Model Group Based on Recursive Solution in China's Energy Demand Forecasting
Authors: Yeqing Guan, Fen Yang
Abstract:
To learn about China's future energy demand, this paper first proposes a GM(1,1) model group based on recursive solutions for parameter estimation and sets up a general solving algorithm for the model group. This method avoids the remodeling, loss of information and heavy computation that occurred in past research. The paper establishes, respectively, an all-data GM(1,1), a metabolic GM(1,1) and a new-information GM(1,1) model from the historical data of energy consumption in China for the years 2005-2010 and the added data of 2011; after modeling, simulation and comparison of accuracies, the optimal models were obtained and used for prediction. Results show that the total energy demand of China will be 37.2221 billion tons of coal equivalent in 2012 and 39.7973 billion tons of coal equivalent in 2013, which is consistent with the overall planning of energy demand in the 12th Five-Year Plan.
Keywords: energy demands, GM(1, 1) model group, least square estimation, prediction
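A minimal GM(1,1) grey-model sketch: fit (a, b) by least squares on the 1-AGO series and forecast ahead; the series below is a placeholder, not the paper's Chinese energy-consumption figures, and the model-group (metabolic / new-information) variants are not reproduced.

```python
import numpy as np

def gm11(x0, steps_ahead=2):
    x0 = np.asarray(x0, dtype=float)
    n = len(x0)
    x1 = np.cumsum(x0)                                   # 1-AGO (accumulated) series
    z1 = 0.5 * (x1[1:] + x1[:-1])                        # background values
    B = np.column_stack([-z1, np.ones(n - 1)])
    Y = x0[1:]
    a, b = np.linalg.lstsq(B, Y, rcond=None)[0]          # least-squares estimate of (a, b)
    k = np.arange(n + steps_ahead)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a    # time response function
    return np.diff(x1_hat, prepend=0.0)                  # fitted values followed by forecasts

series = [26.6, 28.1, 29.3, 30.6, 32.5, 34.8]            # hypothetical demand data
print(gm11(series, steps_ahead=2))
```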
12567 Hybrid Structure Learning Approach for Assessing the Phosphate Laundries Impact
Authors: Emna Benmohamed, Hela Ltifi, Mounir Ben Ayed
Abstract:
The Bayesian Network (BN) is one of the most efficient classification methods and is widely used in several fields (e.g., medical diagnostics, risk analysis, bioinformatics research). A BN is a probabilistic graphical model that provides a formalism for reasoning under uncertainty. This classification method has a high performance rate in the extraction of new knowledge from data. The construction of the model consists of two phases: structure learning and parameter learning. For the structure learning problem, the K2 algorithm is one of the representative data-driven algorithms, based on a score-and-search approach. In addition, integrating expert knowledge in the structure learning process allows higher accuracy to be obtained. In this paper, we propose a hybrid approach combining an improvement of the K2 algorithm, called the K2 algorithm for Parents and Children search (K2PC), with an expert-driven method for learning the structure of the BN. The evaluation of the experimental results, using well-known benchmarks, shows that our K2PC algorithm has better performance in terms of correct structure detection. The real application of our model shows its efficiency in analysing the impact of phosphate laundry effluents on the watershed in the Gafsa area (southwestern Tunisia).
Keywords: Classification, Bayesian network, structure learning, K2 algorithm, expert knowledge, surface water analysis.
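A compact sketch of the score-and-search idea behind K2: a greedy parent search using the Cooper-Herskovits (K2) score in log form on a toy two-variable dataset; this illustrates plain K2 under an assumed variable ordering, not the K2PC variant or the expert-driven step proposed in the paper.

```python
from math import lgamma
from itertools import product
import numpy as np

def k2_log_score(data, child, parents, arity):
    """Log Cooper-Herskovits score of `child` given `parents` on integer-coded data."""
    r = arity[child]
    score = 0.0
    for j in product(*[range(arity[p]) for p in parents]):   # each parent configuration
        mask = np.ones(len(data), dtype=bool)
        for p, s in zip(parents, j):
            mask &= data[:, p] == s
        counts = np.bincount(data[mask, child], minlength=r)
        score += lgamma(r) - lgamma(counts.sum() + r) + sum(lgamma(c + 1) for c in counts)
    return score

def k2(data, order, arity, max_parents=2):
    parents = {v: [] for v in order}
    for i, child in enumerate(order):
        old = k2_log_score(data, child, parents[child], arity)
        candidates = set(order[:i])                           # only predecessors in the order
        while len(parents[child]) < max_parents and candidates:
            best_p, best_s = max(((p, k2_log_score(data, child, parents[child] + [p], arity))
                                  for p in candidates), key=lambda t: t[1])
            if best_s <= old:
                break
            parents[child].append(best_p); candidates.remove(best_p); old = best_s
    return parents

rng = np.random.default_rng(0)
a = rng.integers(0, 2, 500)
b = np.where(rng.random(500) < 0.1, 1 - a, a)                 # noisy copy of a
data = np.column_stack([a, b])
print(k2(data, order=[0, 1], arity=[2, 2]))                   # expect variable 1 to pick parent 0
```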
12566 An Optimized Method for Calculating the Linear and Nonlinear Response of SDOF System Subjected to an Arbitrary Base Excitation
Authors: Hossein Kabir, Mojtaba Sadeghi
Abstract:
Finding the linear and nonlinear responses of a typical single-degree-of-freedom (SDOF) system is generally regarded as a time-consuming process. This study provides modifications to the renowned Newmark method in order to make it more time-efficient, and more accurate by modifying the system in its own nonlinear state. The efficacy of the presented method is demonstrated by applying three base excitations, namely the Tabas 1978, El Centro 1940, and Mexico City/SCT 1985 earthquakes, to an SDOF system and computing the strength reduction factor, yield pseudo-acceleration, and ductility factor.
Keywords: Single-degree-of-freedom system, linear acceleration method, nonlinear excited system, equivalent displacement method.
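For context, a minimal Newmark time-stepping sketch for a linear SDOF oscillator under base acceleration (linear-acceleration variant, gamma = 1/2, beta = 1/6); the synthetic pulse stands in for a recorded ground motion, and the paper's time-saving and nonlinear modifications are not reproduced.

```python
import numpy as np

def newmark_sdof(m, c, k, ag, dt, gamma=0.5, beta=1/6):
    """Relative displacement/velocity/acceleration of m*u'' + c*u' + k*u = -m*ag(t)."""
    n = len(ag)
    u = np.zeros(n); v = np.zeros(n); a = np.zeros(n)
    p = -m * ag                                   # effective force for base excitation
    a[0] = (p[0] - c * v[0] - k * u[0]) / m
    k_hat = m / (beta * dt**2) + gamma * c / (beta * dt) + k
    for i in range(n - 1):
        rhs = (p[i + 1]
               + m * (u[i] / (beta * dt**2) + v[i] / (beta * dt) + (1 / (2 * beta) - 1) * a[i])
               + c * (gamma * u[i] / (beta * dt) + (gamma / beta - 1) * v[i]
                      + dt * (gamma / (2 * beta) - 1) * a[i]))
        u[i + 1] = rhs / k_hat
        a[i + 1] = ((u[i + 1] - u[i]) / (beta * dt**2) - v[i] / (beta * dt)
                    - (1 / (2 * beta) - 1) * a[i])
        v[i + 1] = v[i] + dt * ((1 - gamma) * a[i] + gamma * a[i + 1])
    return u, v, a

dt = 0.01
t = np.arange(0, 10, dt)
ag = 0.3 * 9.81 * np.sin(2 * np.pi * t) * (t < 2)    # hypothetical 2 s base-acceleration pulse
u, v, a = newmark_sdof(m=1.0, c=0.63, k=39.48, ag=ag, dt=dt)   # T = 1 s, ~5% damping
print("peak relative displacement:", np.abs(u).max())
```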
12565 Fuzzy Voting in Internal Elections of Educational and Party Organizations
Authors: R. Hosseingholizadeh
Abstract:
This article presents a method, founded on fuzzy logic, for elections among the members of a group. Linguistic variables are the objects of decision on the election cards, and deduction is based on t-norms and s-norms. In this election method the election cards are questionnaires, each comprising several questions with several choices, where the choices are words from natural language. The presented method uses centre of gravity (COG) defuzzification and is implemented as a MATLAB computer program. Finally, the method is illustrated by solving two examples: choosing a head from a research group's members and a representative for students.
Keywords: fuzzy election, fuzzy electoral card, fuzzy inference, questionnaire.
12564 Association Rules Mining and NOSQL Oriented Document in Big Data
Authors: Sarra Senhadji, Imene Benzeguimi, Zohra Yagoub
Abstract:
Big Data refers to recent technology for manipulating voluminous and unstructured data sets from multiple sources, and NoSQL databases have emerged to handle the problem of unstructured data. Association rule mining is one of the popular data mining techniques for extracting hidden relationships from transactional databases, and the algorithm for finding association dependencies is well suited to MapReduce. The goal of our work is to reduce the time needed to generate frequent itemsets by using MapReduce and a document-oriented NoSQL database. A comparative study is given to evaluate the performance of our algorithm against the classical Apriori algorithm.
Keywords: Apriori, Association rules mining, Big Data, data mining, Hadoop, Map Reduce, MongoDB, NoSQL.
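For reference, a minimal single-machine Apriori sketch (candidate generation plus support pruning); the paper distributes the counting step with MapReduce over a MongoDB document store, which is not reproduced here.

```python
from itertools import combinations

def apriori(transactions, min_support):
    transactions = [frozenset(t) for t in transactions]
    items = {i for t in transactions for i in t}

    def support(itemset):
        return sum(itemset <= t for t in transactions) / len(transactions)

    frequent = {frozenset([i]) for i in items if support(frozenset([i])) >= min_support}
    all_frequent = {s: support(s) for s in frequent}
    k = 2
    while frequent:
        # join step: merge frequent (k-1)-itemsets whose union has exactly k items
        candidates = {a | b for a, b in combinations(frequent, 2) if len(a | b) == k}
        frequent = {c for c in candidates if support(c) >= min_support}
        all_frequent.update({s: support(s) for s in frequent})
        k += 1
    return all_frequent

tx = [{"bread", "milk"}, {"bread", "butter"}, {"milk", "butter", "bread"}, {"milk"}]
for itemset, s in sorted(apriori(tx, 0.5).items(), key=lambda kv: -kv[1]):
    print(set(itemset), round(s, 2))
```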
12562 High Capacity Data Hiding based on Predictor and Histogram Modification
Authors: Hui-Yu Huang, Shih-Hsu Chang
Abstract:
In this paper, we propose a high-capacity image hiding technique based on pixel prediction and the difference of the modified histogram. The approach uses pixel prediction and the modified-histogram difference to determine the best embedding point; it improves the prediction accuracy and increases the pixel difference to raise the hiding capacity. We also use histogram modification to prevent overflow and underflow. Experimental results demonstrate that, at the same average hiding capacity, the proposed method still maintains high image quality and low distortion.
Keywords: data hiding, predictor
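For background, a minimal sketch of the classical peak/zero-bin histogram-shifting embedding on which such schemes build; the paper's method additionally uses pixel prediction to enlarge capacity, which is not reproduced here, and the cover image below is synthetic.

```python
import numpy as np

def hs_embed(img, bits):
    """Embed a bit list into the histogram peak bin of an 8-bit image (shift-and-embed)."""
    img = img.astype(np.int32).copy()
    hist = np.bincount(img.ravel(), minlength=256)
    peak = int(hist.argmax())                          # most frequent gray level
    zero = int(hist[peak + 1:].argmin()) + peak + 1    # emptiest bin to the right of the peak
    img[(img > peak) & (img < zero)] += 1              # shift to free the bin next to the peak
    flat = img.ravel()
    idx = np.flatnonzero(flat == peak)[:len(bits)]     # capacity = count of the peak bin
    flat[idx] += np.asarray(bits[:len(idx)], dtype=np.int32)   # bit 1 -> peak+1, bit 0 -> peak
    return flat.reshape(img.shape).astype(np.uint8), peak, zero

rng = np.random.default_rng(0)
cover = rng.normal(120, 10, (64, 64)).clip(0, 255).astype(np.uint8)
stego, peak, zero = hs_embed(cover, bits=[1, 0, 1, 1, 0, 1])
print("peak:", peak, "zero:", zero,
      "max pixel change:", np.abs(stego.astype(int) - cover.astype(int)).max())
```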
12561 Comparative Study between Classical P-Q Method and Modern Fuzzy Controller Method to Improve the Power Quality of an Electrical Network
Authors: A. Morsli, A. Tlemçani, N. Ould Cherchali, M. S. Boucherit
Abstract:
This article presents two methods for compensating the harmonics generated by a nonlinear load. The first is the classical P-Q method; the second is a controller based on a modern artificial intelligence method, namely fuzzy logic. Both methods are applied to a shunt Active Power Filter (sAPF) based on a three-phase five-level voltage converter with NPC topology. The reference harmonic currents are calculated with the P-Q algorithm, pulse generation uses intersective PWM, and fuzzy logic is used for flexibility and dynamics. The results clearly show that the Total Harmonic Distortion obtained with the fuzzy logic controller is lower than with the P-Q method.
Keywords: Fuzzy logic controller, P-Q method, Pulse Width Modulation (PWM), shunt Active Power Filter (sAPF), Total Harmonic Distortion (THD).
12560 Optimizing Data Evaluation Metrics for Fraud Detection Using Machine Learning
Authors: Jennifer Leach, Umashanger Thayasivam
Abstract:
The use of technology has benefited society in more ways than one ever thought possible. Unfortunately, as society's knowledge of technology has advanced, so has its knowledge of ways to use technology to manipulate others, leading to a simultaneous advancement in the world of fraud. Machine learning techniques can offer a possible solution to help counter these advancements. This research explores how various machine learning techniques can aid in detecting fraudulent activity across two different types of fraud datasets; the accuracy, precision, recall, and F1 score were recorded for each method. Each machine learning model was also tested across five different training and testing splits in order to discover which split and technique lead to the most optimal results.
Keywords: Data science, fraud detection, machine learning, supervised learning.
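A minimal sketch of recording accuracy, precision, recall and F1 for a classifier across several train/test splits, in the spirit of the comparison described above; the data are synthetic and the classifier is an arbitrary choice, not the paper's models or datasets.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

# Imbalanced synthetic data as a stand-in for a fraud dataset (5% positives).
X, y = make_classification(n_samples=3000, n_features=20, weights=[0.95, 0.05], random_state=0)

for test_size in (0.1, 0.2, 0.3, 0.4, 0.5):               # five different splits
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=test_size,
                                              stratify=y, random_state=0)
    pred = RandomForestClassifier(random_state=0).fit(X_tr, y_tr).predict(X_te)
    print(f"test={test_size:.1f} "
          f"acc={accuracy_score(y_te, pred):.3f} "
          f"prec={precision_score(y_te, pred):.3f} "
          f"rec={recall_score(y_te, pred):.3f} "
          f"f1={f1_score(y_te, pred):.3f}")
```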
12559 Developing the Color Temperature Histogram Method for Improving the Content-Based Image Retrieval
Authors: P. Phokharatkul, S. Chaisriya, S. Somkuarnpanit, S. Phaiboon, C. Kimpan
Abstract:
This paper proposes a new method for image search and image indexing in databases using a color temperature histogram. The color temperature histogram can be used to improve the performance of content-based image retrieval by combining color temperature with a histogram. It can be represented by a range of 46 colors, which is more than the color histogram and the dominant color temperature provide. Moreover, with our method, colors that have the same color temperature can be separated, whereas with the dominant color temperature they cannot. The results showed that the color temperature histogram retrieved an accurate image more often than the dominant color temperature method or the color histogram method, and also took less time, so the color temperature can be used for indexing and searching for images.
Keywords: Color temperature histogram, color temperature, image retrieval and content-based image retrieval.
12558 Improvement of the Shortest Path Problem with Geodesic-Like Method
Authors: Wen-Haw Chen
Abstract:
This paper proposes a method to improve the shortest path problem on NURBS (non-uniform rational basis spline) surfaces. It comes from an application of the theory of classical differential geometry on surfaces and can improve the distance problem not only on surfaces but also in the Euclidean 3-space R3.
Keywords: shortest paths, geodesic-like method, NURBS surfaces.
12557 Automatic Extraction of Water Bodies Using Whole-R Method
Authors: Nikhat Nawaz, S. Srinivasulu, P. Kesava Rao
Abstract:
Feature extraction plays an important role in many remote sensing applications. Automatic extraction of water bodies is of great significance in applications such as change detection and image retrieval. This paper presents a procedure for the automatic extraction of water information from remote sensing images. The algorithm uses the relative location of the R color component in the chromaticity diagram, and this is then integrated with the spatial scale transformation of the whole method, which is based on a water index fitted from a spectral library. Experimental results demonstrate the improved accuracy and effectiveness of the integrated method for automatic extraction of water bodies.
Keywords: Chromaticity, Feature Extraction, Remote Sensing, Spectral library, Water Index.
12556 Design and Simulation of Low Speed Axial Flux Permanent Magnet (AFPM) Machine
Authors: Ahmad Darabi, Hassan Moradi, Hossein Azarinfar
Abstract:
This paper presents the initial design of a low-speed Axial Flux Permanent Magnet (AFPM) machine with a non-slotted TORUS topology, obtained using a specific design algorithm (see Appendix). The design algorithm is validated by means of selected data from an initial prototype machine. Analytical design calculations are carried out using the algorithm, and the results obtained are compared with those of the Finite Element Method (FEM).
Keywords: Axial Flux Permanent Magnet (AFPM) Machine, Design Algorithm, Finite Element Method (FEM), TORUS
12555 Optimization of the Characteristic Straight Line Method by a “Best Estimate” of Observed, Normal Orthometric Elevation Differences
Authors: Mahmoud M. S. Albattah
Abstract:
In this paper, to optimize the “Characteristic Straight Line Method” used in soil displacement analysis, a “best estimate” of the geodetic leveling observations has been achieved by taking into account the concept of height systems. This concept, and consequently the concept of “height”, is discussed in detail. In landslide dynamic analysis, the soil is considered as a mosaic of rigid blocks. The soil displacement has been monitored and analyzed using the “Characteristic Straight Line Method”, whose characteristic components have been defined and constructed from a “best estimate” of the topometric observations. In the measurement of elevation differences, we have used the most modern leveling equipment available, and observational procedures have been designed to provide the most effective way of acquiring data. In addition, systematic errors which cannot be sufficiently controlled by instrumentation or observational techniques are minimized by applying appropriate corrections to the observed data: the level collimation correction minimizes the error caused by non-horizontality of the leveling instrument's line of sight for unequal sight lengths; the refraction correction is modeled to minimize the refraction error caused by temperature (density) variation of air strata; the rod temperature correction accounts for variation in the length of the leveling rod's Invar/LO-VAR® strip resulting from temperature changes; the rod scale correction ensures a uniform scale conforming to the international length standard; and the concept of height systems is introduced, in which all types of height (orthometric, dynamic, normal, gravity correction, and equipotential surface) have been investigated. The “Characteristic Straight Line Method” is slightly more convenient than the “Characteristic Circle Method”: it permits the evaluation of a displacement of very small magnitude, even when the displacement is an infinitesimal quantity. The inclination of the landslide is given by the inverse of the distance from reference point O to the “Characteristic Straight Line”, and its direction is given by the bearing of the normal directed from point O to the Characteristic Straight Line (Fig. 6). A “best estimate” of the topometric observations was used to measure the elevation of carefully selected points before and after the deformation. Gross errors have been eliminated by statistical analyses and by comparing the heights within local neighborhoods. The results of a test using an area where very interesting land surface deformation occurs are reported. Monitoring with different options and a qualitative comparison of results based on a sufficient number of check points are presented.
Keywords: Characteristic straight line method, dynamic height, landslides, orthometric height, systematic errors.
12554 Dynamic Modeling and Simulation of Industrial Naphta Reforming Reactor
Authors: Gholamreza Zahedi, M. Tarin, M. Biglari
Abstract:
This work investigates the steady-state and dynamic simulation of a fixed-bed industrial naphtha reforming reactor. The performance of the reactor was investigated using a heterogeneous model. For process simulation, the differential equations are solved using the fourth-order Runge-Kutta method. The model was validated against measured process data from an existing naphtha reforming plant. The simulation results, in terms of component yields and outlet temperature, were in good agreement with the empirical data. The simple model provides a useful tool for dynamic simulation, optimization and control of naphtha reforming.
Keywords: Dynamic simulation, fixed bed reactor, modeling, reforming
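For reference, a generic classical fourth-order Runge-Kutta sketch of the kind used to integrate such an ODE system; the right-hand side below is a placeholder first-order reaction, not the heterogeneous naphtha-reforming kinetics.

```python
import numpy as np

def rk4(f, y0, t0, t_end, h):
    """Integrate y' = f(t, y) with the classical RK4 scheme and fixed step h."""
    t, y = t0, np.asarray(y0, dtype=float)
    ts, ys = [t], [y.copy()]
    while t < t_end - 1e-12:
        k1 = f(t, y)
        k2 = f(t + h / 2, y + h / 2 * k1)
        k3 = f(t + h / 2, y + h / 2 * k2)
        k4 = f(t + h, y + h * k3)
        y = y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        t += h
        ts.append(t); ys.append(y.copy())
    return np.array(ts), np.array(ys)

# Placeholder two-species system A -> B with first-order kinetics.
def rhs(t, y):
    k = 0.8
    return np.array([-k * y[0], k * y[0]])

ts, ys = rk4(rhs, y0=[1.0, 0.0], t0=0.0, t_end=5.0, h=0.05)
print(ys[-1])   # close to [exp(-0.8*5), 1 - exp(-0.8*5)]
```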
12553 A Heuristics Approach for Fast Detecting Suspicious Money Laundering Cases in an Investment Bank
Authors: Nhien-An Le-Khac, Sammer Markos, M-Tahar Kechadi
Abstract:
Today, money laundering (ML) poses a serious threat not only to financial institutions but also to the nation. This criminal activity is becoming more and more sophisticated and seems to have moved from the cliché of drug trafficking to financing terrorism and surely not forgetting personal gain. Most international financial institutions have been implementing anti-money laundering solutions (AML) to fight investment fraud. However, traditional investigative techniques consume numerous man-hours. Recently, data mining approaches have been developed and are considered as well-suited techniques for detecting ML activities. Within the scope of a collaboration project for the purpose of developing a new solution for the AML Units in an international investment bank, we proposed a data mining-based solution for AML. In this paper, we present a heuristics approach to improve the performance of this solution. We also show some preliminary results associated with this method on analysing transaction datasets.
Keywords: data mining, anti money laundering, clustering, heuristics.
12552 Evaluation of Urban Development Proposals: An ANP Approach
Authors: T. Gómez-Navarro, M. García-Melón, D. Díaz-Martín, S. Acuna-Dutra
Abstract:
In this paper a new approach to prioritizing urban planning projects in an efficient and reliable way is presented. It is based on environmental pressure indices and multicriteria decision methods. The paper introduces a rigorous method, of acceptable complexity, for rank ordering urban development proposals according to their environmental pressure. The technique combines the use of Environmental Pressure Indicators, the aggregation of the indicators into an Environmental Pressure Index by means of the Analytic Network Process (ANP) method, and the interpretation of the information obtained from the experts during the decision-making process. The ANP method allows the experts' judgments on each of the indicators to be aggregated into one Environmental Pressure Index. In addition, ANP is based on utility ratio functions, which are the most appropriate for the analysis of uncertain data such as experts' estimations. Finally, unlike other multicriteria techniques, ANP allows the decision problem to be modelled using the relationships among dependent criteria. The method has been applied to the proposal for urban development of La Carlota airport in Caracas (Venezuela). The Venezuelan Government would like to see a recreational project developed on the abandoned area that would mean a significant improvement for the capital. There are currently three options on the table under evaluation: a health club, a residential area and a theme park. The participating experts agreed that the method proposed in this paper is useful and an improvement over traditional techniques such as environmental impact studies, life-cycle analysis, etc. They found the results obtained coherent, the process sufficiently rigorous and precise, and the use of resources significantly lower than in other methods.
Keywords: Environmental pressure indicators, multicriteria decision analysis, analytic network process.
12551 Estimating Word Translation Probabilities for Thai – English Machine Translation using EM Algorithm
Authors: Chutchada Nusai, Yoshimi Suzuki, Haruaki Yamazaki
Abstract:
Selecting, from a set of target language words, the word translation that conveys the correct sense of the source word and produces more fluent target language output is one of the core problems in machine translation. In this paper we compare three methods of estimating word translation probabilities for selecting the translation word in Thai – English machine translation: (1) a method based on the frequency of word translations, (2) a method based on the collocation of word translations, and (3) a method based on the Expectation Maximization (EM) algorithm. For evaluation we used Thai – English parallel sentences generated by NECTEC. The method based on the EM algorithm performs best in comparison to the other methods and gives satisfying results.
Keywords: Machine translation, EM algorithm.
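For illustration, a minimal IBM Model 1 style EM sketch that estimates word translation probabilities t(e|f) on a tiny toy parallel corpus; the sentence pairs and romanized Thai tokens are invented placeholders, and this is only the generic EM estimation idea, not the authors' full system.

```python
from collections import defaultdict

pairs = [("I eat rice".split(),    "chan kin khao".split()),
         ("I drink water".split(), "chan duem nam".split()),
         ("you eat rice".split(),  "khun kin khao".split()),
         ("you eat fish".split(),  "khun kin pla".split())]

eng = {e for e_sent, _ in pairs for e in e_sent}
t = defaultdict(lambda: 1.0 / len(eng))                  # uniform initialisation of t(e|f)

for _ in range(20):                                      # EM iterations
    count = defaultdict(float); total = defaultdict(float)
    for e_sent, f_sent in pairs:
        for e in e_sent:                                 # E-step: expected alignment counts
            z = sum(t[(e, f)] for f in f_sent)
            for f in f_sent:
                c = t[(e, f)] / z
                count[(e, f)] += c; total[f] += c
    for (e, f) in count:                                 # M-step: renormalise per source word
        t[(e, f)] = count[(e, f)] / total[f]

print(max(eng, key=lambda e: t[(e, "kin")]))             # expect "eat" for the toy word "kin"
```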
12550 Analysis of Image Segmentation Techniques for Diagnosis of Dental Caries in X-ray Images
Authors: V. Geetha, K. S. Aprameya
Abstract:
Early diagnosis of dental caries is essential for maintaining dental health. In this paper, a method for the diagnosis of dental caries is proposed using a Laplacian filter, adaptive thresholding, texture analysis and a Support Vector Machine (SVM) classifier. The proposed method is compared with Otsu thresholding, watershed segmentation and active contour methods. Adaptive thresholding has comparatively better performance, with 96.9% accuracy and 96.1% precision. The results are validated using a statistical method, two-way ANOVA, at a significance level of 5%, which shows that the interaction of the proposed method with the performance parameter measures is significant. Hence the proposed technique could be used for the detection of dental caries in an automated computer-assisted diagnosis system.
Keywords: Computer assisted diagnosis, dental caries, dental radiography, image segmentation.
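A minimal sketch of the Laplacian-plus-adaptive-thresholding front end of such a pipeline using OpenCV; the file name, block size and offset are assumptions, and the texture analysis and SVM classification steps are not reproduced.

```python
import cv2
import numpy as np

gray = cv2.imread("xray.png", cv2.IMREAD_GRAYSCALE)       # hypothetical dental radiograph
if gray is None:                                          # fall back to a synthetic image
    gray = np.random.default_rng(0).normal(128, 30, (256, 256)).clip(0, 255).astype(np.uint8)

lap = cv2.Laplacian(gray, cv2.CV_16S, ksize=3)            # edge emphasis
sharpened = np.clip(gray.astype(np.int16) - lap, 0, 255).astype(np.uint8)

# Local Gaussian-weighted threshold (blockSize=35, C=5 are illustrative choices).
binary = cv2.adaptiveThreshold(sharpened, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
                               cv2.THRESH_BINARY, 35, 5)
print("foreground fraction:", (binary == 255).mean())
```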
12549 A Decision Tree Approach to Estimate Permanent Residents Using Remote Sensing Data in Lebanese Municipalities
Authors: K. Allaw, J. Adjizian Gerard, M. Chehayeb, A. Raad, W. Fahs, A. Badran, A. Fakherdin, H. Madi, N. Badaro Saliba
Abstract:
Population estimation using Geographic Information Systems (GIS) and remote sensing faces many obstacles, such as the determination of permanent residents. A permanent resident is an individual who stays and works during all four seasons in his village, so all those who move to other cities or villages are excluded from this category. The aim of this study is to identify the factors affecting the percentage of permanent residents in a village and to determine the weight attributed to each factor. To do so, six factors have been chosen (slope, precipitation, temperature, number of services, time to the Central Business District (CBD) and proximity to conflict zones), and each factor has been evaluated using one of the following data sources: the 50 m contour-line map, the precipitation map, four temperature maps, and data collected through surveys. The weighting procedure has been carried out using the decision tree method. As a result of this procedure, temperature (50.8%) and precipitation (46.5%) are the most influential factors.
Keywords: Remote sensing and GIS, permanent residence, decision tree, Lebanon.
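A minimal sketch of deriving factor weights from a decision tree, analogous to the weighting step above: fit a tree on the six factors and read the normalised feature importances; the data, labels and column names below are synthetic placeholders, not the study's survey and map data.

```python
import numpy as np
import pandas as pd
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(42)
n = 300
X = pd.DataFrame({
    "slope": rng.uniform(0, 40, n),
    "precipitation": rng.uniform(400, 1400, n),
    "temperature": rng.uniform(5, 28, n),
    "services": rng.integers(0, 15, n),
    "time_to_CBD": rng.uniform(5, 120, n),
    "conflict_proximity": rng.uniform(0, 50, n),
})
# Hypothetical label: "high share of permanent residents", driven mainly by climate here.
y = ((X["temperature"] > 15) & (X["precipitation"] > 800)).astype(int)

tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X, y)
weights = pd.Series(tree.feature_importances_, index=X.columns).sort_values(ascending=False)
print((100 * weights).round(1))   # importance of each factor, in percent
```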
12548 Fault Detection and Diagnosis of Broken Bar Problem in Induction Motors Base Wavelet Analysis and EMD Method: Case Study of Mobarakeh Steel Company in Iran
Authors: M. Ahmadi, M. Kafil, H. Ebrahimi
Abstract:
Nowadays, induction motors play a significant role in industry. Condition monitoring (CM) of this equipment has gained remarkable importance in recent years due to huge production losses, substantial imposed costs, and increases in vulnerability, risk, and uncertainty levels. Motor current signature analysis (MCSA) is one of the most important techniques in CM and can be used for broken rotor bar detection. Signal processing methods such as the Fast Fourier Transform (FFT), wavelet transform and Empirical Mode Decomposition (EMD) are used for analyzing the MCSA output data. In this study, these signal processing methods are used to detect broken bar problems in the induction motors of the Mobarakeh Steel Company. Based on the wavelet transform method, an index for fault detection, CF, is introduced, defined as the ratio of the maximum to the mean of the wavelet transform coefficients. We find that, in the broken bar condition, the CF factor is greater than in the healthy condition. Based on the EMD method, the energy of the intrinsic mode functions (IMFs) is calculated, and it is found that when rotor bars become broken the energy of the IMFs increases.
Keywords: Broken bar, condition monitoring, diagnostics, empirical mode decomposition, Fourier transform, wavelet transform.
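A minimal sketch of a wavelet-based CF index of the kind described above (ratio of the maximum to the mean of the wavelet coefficients) on a synthetic stator-current signal; the wavelet family, decomposition level, slip value and sideband amplitudes are assumptions, and the EMD branch is omitted.

```python
import numpy as np
import pywt   # PyWavelets

fs = 5000
t = np.arange(0, 1, 1 / fs)
healthy = np.sin(2 * np.pi * 50 * t) + 0.01 * np.random.default_rng(0).normal(size=t.size)
# A broken bar adds sidebands at f*(1 +/- 2s) around the supply frequency (here s = 0.03).
faulty = healthy + 0.05 * np.sin(2 * np.pi * 47 * t) + 0.05 * np.sin(2 * np.pi * 53 * t)

def cf_index(signal, wavelet="db4", level=5):
    details = np.concatenate(pywt.wavedec(signal, wavelet, level=level)[1:])  # detail coeffs
    details = np.abs(details)
    return details.max() / details.mean()

print("CF healthy:", round(cf_index(healthy), 2), "CF faulty:", round(cf_index(faulty), 2))
```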
12547 Program Camouflage: A Systematic Instruction Hiding Method for Protecting Secrets
Authors: Yuichiro Kanzaki, Akito Monden, Masahide Nakamura, Ken-ichi Matsumoto
Abstract:
This paper proposes an easy-to-use instruction hiding method to protect software from malicious reverse engineering attacks. Given a source program (the original) to be protected, the proposed method (1) takes a modified version of it (the fake) as input, (2) analyzes the differences in assembly code instructions between the original and the fake, and (3) introduces self-modification routines so that the fake instructions become correct (i.e., the original instructions) before they are executed and revert to fake ones after they are executed. The proposed method can add a certain amount of security to a program, since the fake instructions in the resultant program confuse attackers, and significant effort is required to discover and remove all the fake instructions and self-modification routines. The method is also easy to use, because all a user has to do is prepare a fake source code by modifying the original source code.
Keywords: Copyright protection, program encryption, program obfuscation, self-modification, software protection.
12546 Design, Modeling and Fabrication of a Tactile Sensor and Display System for Application in Laparoscopic Surgery
Authors: M. Ramezanifard, J. Dargahi, S. Najarian, N. Narayanan
Abstract:
One of the major disadvantages of minimally invasive surgery (MIS) is the lack of tactile feedback to the surgeon. In order to identify and avoid damage to the grasped complex tissue by endoscopic graspers, it is important to measure the local softness of tissue during MIS. One way to display the measured softness to the surgeon is a graphical method. In this paper, a new tactile sensor is reported. The tactile sensor consists of an array of four softness sensors, which are integrated into the jaws of a modified commercial endoscopic grasper. Each individual softness sensor consists of two piezoelectric polymer polyvinylidene fluoride (PVDF) films, positioned below a rigid and a compliant cylinder. The compliant cylinder is fabricated using a micro-molding technique. The combination of the output voltages from the PVDF films is used to determine the softness of the grasped object. A theoretical analysis of the sensor is also presented. A method has been developed with the aim of reproducing the tactile softness for the surgeon using a graphical approach: the proposed system, including the interfacing and the data acquisition card, receives signals from the array of softness sensors, and after the signals are processed, the tactile information is displayed by means of a color-coding method. It is shown that the degrees of softness of the grasped objects/tissues can be visually differentiated and displayed on a monitor.
Keywords: Minimally invasive surgery, Robotic surgery, Sensor, Softness, Tactile.