Search results for: Ensemble Algorithm
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3748

1738 Development of a Model Based on Wavelets and Matrices for the Treatment of Weakly Singular Partial Integro-Differential Equations

Authors: Somveer Singh, Vineet Kumar Singh

Abstract:

We present a new model based on viscoelasticity for non-Newtonian fluids. We use a matrix-formulated algorithm to approximate solutions of a class of partial integro-differential equations with given initial and boundary conditions. Some numerical results are presented to simplify the application of the operational matrix formulation and reduce the computational cost. Convergence analysis, error estimation, and numerical stability of the method are also investigated. Finally, some test examples are given to demonstrate the accuracy and efficiency of the proposed method.

Keywords: Legendre Wavelets, operational matrices, partial integro-differential equation, viscoelasticity

Procedia PDF Downloads 339
1737 Lake of Neuchatel: Effect of Increasing Storm Events on Littoral Transport and Coastal Structures

Authors: Charlotte Dreger, Erik Bollaert

Abstract:

This paper presents two environmentally-friendly coastal structures realized on the Lake of Neuchâtel. Both structures reflect current environmental concerns on the lake and have been strongly affected by extreme meteorological conditions between their design period and their actual operational period. The Lake of Neuchâtel is one of the largest Swiss lakes, measuring around 38 km in length and 8.2 km in width, with a maximum water depth of 152 m. Its particular topographical alignment, situated between the Swiss Plateau and the Jura mountains, combines strong winds and large fetch values, resulting in significant wave heights during storm events at both the north-east and south-west lake extremities. In addition, due to flooding concerns, lake levels were historically lowered by several meters during the Jura correction works in the 19th and 20th centuries. Hence, during storm events, continuous erosion of the vulnerable molasse shorelines and sand banks generates frequent and abundant littoral transport from the center of the lake to its extremities. This phenomenon not only disturbs the ecosystem, but also generates numerous problems at natural or man-made infrastructures located along the shorelines, such as reed plants, harbor entrances, canals, etc. A first example is provided at the southwestern extremity, near the city of Yverdon, where an ensemble of 11 small islands, the Iles des Vernes, has been artificially created to enhance biological conditions and food availability for bird species during their migration, replacing at the same time two larger islands that were affected by a lack of morphodynamics and general vegetalization of their surfaces. The article will present the concept and dimensioning of these islands based on 2D numerical modelling, as well as the realization and follow-up campaigns. In particular, the influence of several major storm events that occurred immediately after the works will be pointed out. Second, a sediment retention dike is discussed at the northeastern extremity, at the entrance of the Canal de la Broye into the lake. This canal is heavily used for navigation and suffers from frequent and significant sedimentation at its outlet. The new coastal structure has been designed to minimize sediment deposits around the outlet of the canal into the lake by retaining the littoral transport during storm events. The article will describe the basic assumptions used to design the dike, as well as the construction works and follow-up campaigns. In particular, the huge influence of changing meteorological conditions on the littoral transport of the Lake of Neuchâtel since the project design ten years ago will be pointed out. Not only are the intensity and frequency of storm events increasing, but the main wind directions are also shifting, thereby affecting the efficiency of the coastal structure in retaining the sediments.

Keywords: meteorological evolution, sediment transport, lake of Neuchatel, numerical modelling, environmental measures

Procedia PDF Downloads 87
1736 Closed Loop Controlled Current Nerve Locator

Authors: H. A. Alzomor, B. K. Ouda, A. M. Eldeib

Abstract:

Successful regional anesthesia depends upon precise location of the peripheral nerve or nerve plexus. Nerve stimulation is the preferred technique for locating peripheral nerves. In order to generate a nerve impulse by electrical means, a minimum threshold stimulus current, the “rheobase”, must be applied to the nerve. The technique depends on stimulating muscular twitching at a close distance to the nerve without actually touching it. The success rate of this operation depends on the accuracy of the current intensity pulses used for stimulation. In this paper, we discuss a circuit and an algorithm for closed-loop control of the current, present a theoretical analysis and test results, and compare them with previous techniques.
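
As a rough illustration of the closed-loop idea only (not the authors' circuit or tuning), the sketch below runs a discrete PI loop that drives a hypothetical stimulator output toward a 1 mA setpoint; the gains, sample time, and first-order plant model are assumptions.

```python
# Minimal sketch of a discrete PI loop regulating stimulation current toward a
# setpoint (illustrative only; gains, sample time and plant model are hypothetical).
def pi_current_controller(setpoint_ma, measured_ma, state, kp=0.8, ki=50.0, dt=1e-3):
    """One PI update step; 'state' carries the integral term between calls."""
    error = setpoint_ma - measured_ma
    state["integral"] += error * dt
    command = kp * error + ki * state["integral"]
    return max(0.0, command), state    # clamp: the stimulator cannot sink current

state = {"integral": 0.0}
measured = 0.0
for _ in range(500):                   # crude first-order "tissue + output stage" model
    command, state = pi_current_controller(1.0, measured, state)
    measured += 0.2 * (command - measured)
print(f"steady-state current ~ {measured:.3f} mA")
```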

Keywords: Closed Loop Control (CLC), constant current, nerve locator, rheobase

Procedia PDF Downloads 257
1735 A Two-Phase VNS Algorithm for the Combined Production Routing Problem

Authors: Nejah Ben Mabrouk, Bassem Jarboui, Habib Chabchoub

Abstract:

Production and distribution planning is the most important part of supply chain management. In this paper, an NP-hard production-distribution problem for one product over a multi-period horizon is investigated. The aim is to minimize the sum of the costs of three items: production setups, inventories, and distribution, while determining, for each period, the amount produced, the inventory levels, and the delivery trips. To solve this difficult problem, we propose a two-phase approach based on Variable Neighbourhood Search (VNS). This heuristic is tested on 90 randomly generated instances from the literature, with 20 periods and 50, 100, or 200 customers. Computational results show that our approach outperforms existing solution procedures available in the literature.
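
The following is a minimal, generic VNS skeleton in Python, not the paper's two-phase variant and not its production-routing neighbourhoods; the toy cost function and perturbation moves are assumptions used only to make the loop runnable.

```python
import random

def vns(initial, neighborhoods, cost, max_iters=1000):
    """Generic Variable Neighbourhood Search skeleton (not the paper's exact method).
    'neighborhoods' is a list of functions, each returning a random neighbour."""
    best, best_cost = initial, cost(initial)
    for _ in range(max_iters):
        k = 0
        while k < len(neighborhoods):
            candidate = neighborhoods[k](best)   # shaking in neighbourhood k
            c = cost(candidate)                  # (a local-search step would normally follow)
            if c < best_cost:
                best, best_cost = candidate, c
                k = 0                            # improvement: restart from first neighbourhood
            else:
                k += 1                           # no improvement: try a larger neighbourhood
    return best, best_cost

# Toy usage: minimise a quadratic over integer vectors with small/large perturbation moves.
cost = lambda x: sum((xi - 3) ** 2 for xi in x)
small = lambda x: [xi + random.choice([-1, 1]) if random.random() < 0.2 else xi for xi in x]
large = lambda x: [xi + random.choice([-2, -1, 1, 2]) if random.random() < 0.5 else xi for xi in x]
print(vns([0, 0, 0], [small, large], cost))
```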

Keywords: logistics, production, distribution, variable neighbourhood search

Procedia PDF Downloads 340
1734 Electrical Fault Detection of a Photovoltaic System: A Short-Circuit Fault Case

Authors: Moustapha H. Ibrahim, Dahir Abdourahman

Abstract:

This paper presents a short-circuit fault detection process for a photovoltaic (PV) system. The proposed method is developed in MATLAB/Simulink. It determines the number of short-circuited modules whatever the size of the installation. The proposed algorithm indicates the presence or absence of an abnormality in the power output of the PV system from measurements of hourly global irradiation, power output, and ambient temperature. If a fault is detected, it displays the number of modules in short circuit. This fault detection method has been successfully tested on two different PV installations.
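
A hedged sketch of the underlying idea, comparing measured power against a simple expected-power model built from irradiance and ambient temperature, is shown below; the per-module coefficients and rounding rule are assumptions, not the calibration or Simulink model used in the paper.

```python
# Hedged sketch: compare measured PV power with a simple expected-power model and
# estimate how many modules appear short-circuited. Coefficients are hypothetical.
def expected_power(irradiance_wm2, ambient_temp_c, n_modules,
                   p_stc_w=250.0, gamma=-0.004):
    """Very rough per-module model: STC power scaled by irradiance and a
    temperature derating term."""
    cell_temp = ambient_temp_c + 0.03 * irradiance_wm2     # crude NOCT-style estimate
    derate = 1.0 + gamma * (cell_temp - 25.0)
    return n_modules * p_stc_w * (irradiance_wm2 / 1000.0) * derate

def diagnose(measured_w, irradiance_wm2, ambient_temp_c, n_modules):
    p_exp = expected_power(irradiance_wm2, ambient_temp_c, n_modules)
    deficit = max(0.0, p_exp - measured_w)
    per_module = p_exp / n_modules
    n_faulty = round(deficit / per_module)
    return ("fault", n_faulty) if n_faulty > 0 else ("normal", 0)

print(diagnose(measured_w=3200, irradiance_wm2=900, ambient_temp_c=30, n_modules=20))
```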

Keywords: PV system, short-circuit, fault detection, modelling, MATLAB-Simulink

Procedia PDF Downloads 234
1733 Anisotropic Approach for Discontinuity Preserving in Optical Flow Estimation

Authors: Pushpendra Kumar, Sanjeev Kumar, R. Balasubramanian

Abstract:

Estimating optical flow from a sequence of images using variational methods is one of the most successful approaches. Discontinuity between different motions is one of the challenging problems in flow estimation. In this paper, we design a new anisotropic diffusion operator, which is able to provide smooth flow over a region while efficiently preserving discontinuities in the optical flow. This operator is designed on the basis of the intensity differences of the pixels and an isotropic operator using an exponential function, and their combination is used to control the propagation of the flow. Experimental results on different datasets verify the robustness and accuracy of the algorithm and also validate the effect of the anisotropic operator in preserving discontinuities.
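
As a hedged illustration of an exponential, intensity-difference-based diffusivity (a Perona-Malik-style weight, not the authors' exact operator), consider:

```python
import numpy as np

def anisotropic_weights(image, k=10.0):
    """Perona-Malik-style diffusivity from intensity differences: weights near 1 in
    smooth regions and near 0 across strong edges, so flow smoothing is suppressed
    at motion/intensity boundaries. Not the paper's exact operator."""
    gy, gx = np.gradient(image.astype(float))
    grad_mag = np.hypot(gx, gy)
    return np.exp(-(grad_mag / k) ** 2)

# Toy image: two constant regions separated by a vertical edge.
img = np.zeros((6, 6)); img[:, 3:] = 100.0
w = anisotropic_weights(img, k=10.0)
print(np.round(w, 3))   # weights drop near column 3, stay ~1 elsewhere
```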

Keywords: optical flow, variational methods, computer vision, anisotropic operator

Procedia PDF Downloads 876
1732 Impact of Climate Change on Irrigation and Hydropower Potential: A Case of Upper Blue Nile Basin in Western Ethiopia

Authors: Elias Jemal Abdella

Abstract:

The Blue Nile River is an important shared resource of Ethiopia, Sudan and, because it is the major contributor of water to the main Nile River, Egypt. Despite the potential benefits of regional cooperation and integrated joint basin management, all three countries continue to pursue unilateral development plans. Besides, there is great uncertainty about the likely impacts of climate change on water availability for existing as well as proposed irrigation and hydropower projects in the Blue Nile Basin. The main objective of this study is to quantitatively assess the impact of climate change on the hydrological regime of the upper Blue Nile basin, western Ethiopia. Three models were combined. A dynamic Coordinated Regional Climate Downscaling Experiment (CORDEX) regional climate model (RCM) was used to determine climate projections for the Upper Blue Nile basin under the Representative Concentration Pathways (RCP) 4.5 and 8.5 greenhouse gas emission scenarios for the period 2021-2050. The outputs generated from a multimodel ensemble of four (4) CORDEX RCMs (i.e., rainfall and temperature) were used as input to a Soil and Water Assessment Tool (SWAT) hydrological model, which was set up, calibrated, and validated with observed climate and hydrological data. The outputs from the SWAT model (i.e., projections of river flow) were used as input to a Water Evaluation and Planning (WEAP) water resources model, which was used to determine the water resources implications of the changes in climate. The WEAP model was set up to simulate three development scenarios: the Current Development scenario represented the existing water resource development situation, the Medium-term Development scenario included planned water resource developments expected to be commissioned before 2025, and the Long-term Full Development scenario included all planned water resource developments likely to be commissioned before 2050. The projected change in mean annual temperature for the period 2021-2050 in most of the basin is 1 to 1.4 °C warmer than the baseline (1982-2005) average, implying an increase in evapotranspiration losses. Subbasins already distressed by drought may continue to face even greater challenges in the future. Projected mean annual precipitation varies from subbasin to subbasin; in the eastern, north-eastern and south-western highlands of the basin, mean annual precipitation is likely to increase by up to 7%, whereas in the western lowland part of the basin it is projected to decrease by 3%. The water use simulation indicates that the current irrigation demand in the basin is 1.29 billion m³ per year for 122,765 ha of irrigated area. By 2025, with new schemes being developed, irrigation demand is estimated to increase to 2.5 billion m³ per year for 277,779 ha. By 2050, irrigation demand in the basin is estimated to increase to 3.4 billion m³ per year for 372,779 ha. The hydropower generation simulation indicates that 98% of the hydroelectricity potential could be produced if all planned dams are constructed.

Keywords: Blue Nile River, climate change, hydropower, SWAT, WEAP

Procedia PDF Downloads 357
1731 Crater Detection Using PCA from Captured CMOS Camera Data

Authors: Tatsuya Takino, Izuru Nomura, Yuji Kageyama, Shin Nagata, Hiroyuki Kamata

Abstract:

We propose a method of detecting craters from images of the lunar surface. This proposal assumes application within the SLIM (Smart Lander for Investigating Moon) working group, which aims at pinpoint landing on the lunar surface for scientific investigation. It is difficult to equip and use high-performance computers on a small space probe, so it is necessary to use a small computer with dedicated hardware such as an FPGA. We have studied crater detection using principal component analysis (PCA). In this paper, we implement the detection algorithm on the FPGA, and detection is performed on data captured from the CMOS camera.
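
A minimal PCA-by-SVD sketch on flattened image patches, the kind of dimensionality reduction a crater detector might apply before classification, is given below; the synthetic patches and patch size are assumptions, not SLIM camera data or the FPGA implementation.

```python
import numpy as np

# Hedged sketch: PCA of small image patches via SVD.
rng = np.random.default_rng(0)
patches = rng.normal(size=(500, 64))           # 500 flattened 8x8 patches (synthetic)

mean = patches.mean(axis=0)
centered = patches - mean
_, s, vt = np.linalg.svd(centered, full_matrices=False)
components = vt[:8]                            # keep the 8 leading principal directions

features = centered @ components.T             # project each patch onto the PCA basis
explained = (s[:8] ** 2) / (s ** 2).sum()
print(features.shape, round(float(explained.sum()), 3))
```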

Keywords: crater detection, PCA, FPGA, image processing

Procedia PDF Downloads 552
1730 Robust Control of a Dynamic Model of an F-16 Aircraft with Improved Damping through Linear Matrix Inequalities

Authors: J. P. P. Andrade, V. A. F. Campos

Abstract:

This work presents an application of Linear Matrix Inequalities (LMI) for the robust control of an F-16 aircraft through an algorithm that ensures a damping factor for the closed-loop system. The results show that the zero and gain settings are sufficient to ensure robust performance and stability with respect to various operating points. The technique used is pole placement, which aims to place the closed-loop poles of the system in a specific region of the complex plane. Test results using a dynamic model of the F-16 aircraft are presented and discussed.

Keywords: F-16 aircraft, linear matrix inequalities, pole placement, robust control

Procedia PDF Downloads 309
1729 On the Analysis of Pseudorandom Partial Quotient Sequences Generated from Continued Fractions

Authors: T. Padma, Jayashree S. Pillai

Abstract:

Random entities are an essential component in any cryptographic application. The suitability of a number-theory-based novel pseudorandom sequence, called the Pseudorandom Partial Quotient Sequence (PPQS), generated from the continued fraction expansion of irrational numbers, for cryptographic applications is analyzed in this paper. An approach of building the algorithm around a hard mathematical problem has been considered. The PPQS is tested for randomness, and its suitability as a cryptographic key is established by performing randomness analysis, key sensitivity and key space analysis, precision analysis, and an evaluation of its correlation properties.
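
For readers unfamiliar with partial quotients, the sketch below computes them exactly for a quadratic irrational sqrt(n) using the standard integer recurrence; it illustrates only the source of the sequence, not the PPQS post-processing or key derivation described in the paper.

```python
from math import isqrt

def partial_quotients_sqrt(n, count):
    """Exact partial quotients of the continued fraction of sqrt(n) (n not a perfect
    square), using the standard integer recurrence; no floating point involved."""
    a0 = isqrt(n)
    m, d, a = 0, 1, a0
    quotients = [a0]
    for _ in range(count - 1):
        m = d * a - m
        d = (n - m * m) // d
        a = (a0 + m) // d
        quotients.append(a)
    return quotients

# sqrt(2) = [1; 2, 2, 2, ...], sqrt(7) = [2; 1, 1, 1, 4, ...]
print(partial_quotients_sqrt(2, 8))
print(partial_quotients_sqrt(7, 8))
```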

Keywords: pseudorandom sequences, key sensitivity, correlation, security analysis, randomness analysis, sensitivity analysis

Procedia PDF Downloads 595
1728 Signs-Only Compressed Row Storage Format for Exact Diagonalization Study of Quantum Fermionic Models

Authors: Michael Danilov, Sergei Iskakov, Vladimir Mazurenko

Abstract:

The present paper describes a high-performance parallel realization of an exact diagonalization solver for quantum-electron models on a shared-memory computing system. The proposed algorithm contains a storage format for efficiently computing the eigenvalues and eigenvectors of a quantum-electron Hamiltonian matrix. The results of test calculations carried out for a 15-site Hubbard model demonstrate a reduction in the required memory and good multiprocessor scalability, while maintaining performance of the same order as compressed row storage.
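
As background, a minimal sketch of plain compressed row storage (CSR) and its matrix-vector product, the baseline against which a signs-only format would be compared, is shown below; the small matrix is made up.

```python
import numpy as np

# Hedged illustration of plain CSR: store only nonzero values, their column indices,
# and per-row offsets, then multiply by a vector.
dense = np.array([[4.0, 0.0, -1.0],
                  [0.0, 0.0,  2.0],
                  [-1.0, 2.0, 0.0]])

values, col_idx, row_ptr = [], [], [0]
for row in dense:
    for j, v in enumerate(row):
        if v != 0.0:
            values.append(v)
            col_idx.append(j)
    row_ptr.append(len(values))

def csr_matvec(values, col_idx, row_ptr, x):
    y = np.zeros(len(row_ptr) - 1)
    for i in range(len(y)):
        for k in range(row_ptr[i], row_ptr[i + 1]):
            y[i] += values[k] * x[col_idx[k]]
    return y

x = np.array([1.0, 2.0, 3.0])
print(csr_matvec(values, col_idx, row_ptr, x), dense @ x)   # the two results should match
```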

Keywords: sparse matrix, compressed format, Hubbard model, Anderson model

Procedia PDF Downloads 407
1727 Optical Flow Direction Determination for Railway Crossing Occupancy Monitoring

Authors: Zdenek Silar, Martin Dobrovolny

Abstract:

This article deals with obstacle detection on a railway crossing (clearance detection). Detection is based on optical flow estimation and classification of the flow vectors by the K-means clustering algorithm. Optical flow direction determination is used for the classification of passing vehicles. The optical flow estimation is based on a modified Lucas-Kanade method.
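
A hedged sketch of the pipeline's two main ingredients, sparse Lucas-Kanade flow followed by K-means clustering of the flow directions, is given below; the synthetic frames and all parameters are assumptions, not the paper's modified Lucas-Kanade or its camera setup.

```python
import cv2
import numpy as np
from sklearn.cluster import KMeans

# Synthesize two frames: a dotted scene that shifts 5 px to the right.
rng = np.random.default_rng(1)
prev = np.zeros((240, 320), np.uint8)
for y, x in rng.integers(20, 200, size=(80, 2)):
    cv2.circle(prev, (int(x), int(y)), 3, 255, -1)
curr = np.roll(prev, shift=(0, 5), axis=(0, 1))

# Sparse Lucas-Kanade flow on detected corners.
p0 = cv2.goodFeaturesToTrack(prev, maxCorners=60, qualityLevel=0.1, minDistance=5)
p1, status, _ = cv2.calcOpticalFlowPyrLK(prev, curr, p0, None)
good0, good1 = p0[status.flatten() == 1], p1[status.flatten() == 1]
flow = (good1 - good0).reshape(-1, 2)

# Cluster the flow directions (angles) to separate motion groups.
angles = np.arctan2(flow[:, 1], flow[:, 0]).reshape(-1, 1)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(angles)
print("median direction (deg):", float(np.degrees(np.median(angles))),
      "cluster sizes:", np.bincount(labels))
```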

Keywords: background estimation, direction of optical flow, K-means clustering, objects detection, railway crossing monitoring, velocity vectors

Procedia PDF Downloads 520
1726 Index t-SNE: Tracking Dynamics of High-Dimensional Datasets with Coherent Embeddings

Authors: Gaelle Candel, David Naccache

Abstract:

t-SNE is an embedding method that the data science community has widely used. It serves two main tasks: displaying results by coloring items according to their class or a feature value, and, in forensics, giving a first overview of the dataset distribution. Two interesting characteristics of t-SNE are its structure preservation property and its answer to the crowding problem, where all neighbors in high-dimensional space cannot be represented correctly in low-dimensional space. t-SNE preserves the local neighborhood, and similar items are nicely spaced by adjusting to the local density. These two characteristics produce a meaningful representation, where the area of a cluster is proportional to its size in number, and relationships between clusters are materialized by closeness on the embedding. This algorithm is non-parametric: the transformation from a high- to a low-dimensional space is described but not learned, so two initializations of the algorithm lead to two different embeddings. In a forensic approach, analysts would like to compare two or more datasets using their embeddings. A naive approach would be to embed all datasets together. However, this process is costly, as the complexity of t-SNE is quadratic, and would be infeasible for too many datasets. Another approach would be to learn a parametric model over an embedding built with a subset of data. While this approach is highly scalable, points could be mapped to the exact same position, making them indistinguishable. This type of model would be unable to adapt to new outliers or to concept drift. This paper presents a methodology to reuse an embedding to create a new one, where cluster positions are preserved. The optimization process minimizes two costs, one relative to the embedding shape and the second relative to the match with the support embedding. The embedding-with-support process can be repeated more than once, using the newly obtained embedding as the next support. The successive embeddings can be used to study the impact of one variable on the dataset distribution or to monitor changes over time. This method has the same complexity as t-SNE per embedding, and memory requirements are only doubled. For a dataset of n elements sorted and split into k subsets, the total embedding complexity is reduced from O(n²) to O(n²/k), and the memory requirement from n² to 2(n/k)², which enables computation on recent laptops. The method showed promising results on a real-world dataset, allowing observation of the birth, evolution, and death of clusters. The proposed approach facilitates identifying significant trends and changes, which empowers the monitoring of the dynamics of high-dimensional datasets.
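
A loose approximation of the reuse idea can be sketched with scikit-learn by seeding the t-SNE of a new batch with the coordinates of each point's nearest neighbour in a reference (support) embedding; this is not the authors' two-cost optimization, only an illustration of initializing from a previous embedding.

```python
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.manifold import TSNE
from sklearn.neighbors import NearestNeighbors

# Reference batch and a new batch (synthetic stand-ins for successive datasets).
X_ref, _ = make_blobs(n_samples=300, centers=3, random_state=0)
X_new, _ = make_blobs(n_samples=300, centers=3, random_state=1)

# Support embedding of the reference batch.
ref_emb = TSNE(n_components=2, init="pca", random_state=0).fit_transform(X_ref)

# Seed each new point at its nearest reference point's embedded position (plus tiny noise).
nn = NearestNeighbors(n_neighbors=1).fit(X_ref)
_, idx = nn.kneighbors(X_new)
seed = ref_emb[idx[:, 0]] + np.random.default_rng(0).normal(scale=1e-3, size=(len(X_new), 2))

new_emb = TSNE(n_components=2, init=seed, random_state=0).fit_transform(X_new)
print(ref_emb.shape, new_emb.shape)
```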

Keywords: concept drift, data visualization, dimension reduction, embedding, monitoring, reusability, t-SNE, unsupervised learning

Procedia PDF Downloads 145
1725 Secure Image Encryption via Enhanced Fractional Order Chaotic Map

Authors: Ismail Haddad, Djamel Herbadji, Aissa Belmeguenai, Selma Boumerdassi

Abstract:

In this paper, we provide a novel approach for image encryption that employs the Fibonacci matrix and an enhanced fractional-order chaotic map. The enhanced map overcomes the drawbacks of the classical map, especially the limited chaotic range and non-uniform distribution of chaotic sequences, resulting in a larger encryption key space. As a result, this strategy improves the security of the encryption system. Our experimental results demonstrate that the proposed algorithm effectively encrypts grayscale images with exceptional efficiency. Furthermore, our technique is resistant to a wide range of potential attacks, including statistical and entropy attacks.
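
As a hedged sketch of chaotic-map image encryption in general, the example below uses the classical logistic map (not the enhanced fractional-order map or the Fibonacci matrix step of the paper) to build a keystream that is XORed with a grayscale image.

```python
import numpy as np

def logistic_keystream(length, x0=0.613, r=3.99, burn_in=1000):
    """Keystream from the classical logistic map x <- r*x*(1-x); (x0, r) act as the key."""
    x, stream = x0, np.empty(length, dtype=np.uint8)
    for _ in range(burn_in):            # discard the transient
        x = r * x * (1.0 - x)
    for i in range(length):
        x = r * x * (1.0 - x)
        stream[i] = int(x * 256) % 256
    return stream

img = np.random.default_rng(0).integers(0, 256, size=(64, 64), dtype=np.uint8)
ks = logistic_keystream(img.size).reshape(img.shape)
cipher = img ^ ks                        # encryption
restored = cipher ^ ks                   # decryption with the same key
print(np.array_equal(img, restored))     # True
```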

Keywords: image encryption, logistic map, Fibonacci matrix, grayscale images

Procedia PDF Downloads 320
1724 Asynchronous Sequential Machines with Fault Detectors

Authors: Seong Woo Kwak, Jung-Min Yang

Abstract:

A strategy of fault diagnosis and tolerance for asynchronous sequential machines is discussed in this paper. With no synchronizing clock, it is difficult to diagnose an occurrence of permanent or stuck-in faults in the operation of asynchronous machines. In this paper, we present a fault detector comprised of a timer and a set of static functions to determine the occurrence of faults. In order to realize immediate fault tolerance, corrective control theory is applied to designing a dynamic feedback controller. Existence conditions for an appropriate controller and its construction algorithm are presented in terms of reachability of the machine and the feature of fault occurrences.

Keywords: asynchronous sequential machines, corrective control, fault diagnosis and tolerance, fault detector

Procedia PDF Downloads 353
1723 Automatic Segmentation of 3D Tomographic Image Contours for Radiotherapy Planning in a Low-Cost Solution

Authors: D. F. Carvalho, A. O. Uscamayta, J. C. Guerrero, H. F. Oliveira, P. M. Azevedo-Marques

Abstract:

The creation of vector contour slices (ROIs) on body silhouettes of oncologic patients is an important step during radiotherapy planning in clinics and hospitals to ensure the accuracy of oncologic treatment. The radiotherapy planning of patients is performed by complex software packages focused on the analysis of tumor regions, protection of organs at risk (OARs) and calculation of radiation doses for anomalies (tumors). These packages are supplied by a few manufacturers and run on sophisticated workstations with vector processing, at a cost of approximately twenty thousand dollars. The Brazilian project SIPRAD (Radiotherapy Planning System) presents a proposal adapted to the reality of emerging countries, which generally do not have the monetary conditions to acquire such radiotherapy planning workstations, resulting in waiting queues for new patients' treatment. The SIPRAD project is composed of a set of integrated and interoperable software modules that are able to execute all stages of radiotherapy planning on simple personal computers (PCs) in place of the workstations. The goal of this work is to present a computationally feasible image processing technique that is able to perform automatic contour delineation of patient body silhouettes (SIPRAD-Body). The SIPRAD-Body technique operates on grayscale tomography slices and extends their use with a greedy algorithm in three dimensions. SIPRAD-Body creates an irregular polyhedron with an adapted Canny edge algorithm, without the use of preprocessing filters such as contrast and brightness adjustment. In addition, comparing the SIPRAD-Body technique with existing solutions, a contour similarity of at least 78% is reached. Four criteria are used for this comparison: contour area, contour length, difference between the mass centers, and the Jaccard index. SIPRAD-Body was tested on a set of oncologic exams provided by the Clinical Hospital of the University of Sao Paulo (HCRP-USP). The exams came from patients with different ethnicities, ages, tumor severities and body regions. Even in the case of services that already have workstations, it is possible to have SIPRAD working alongside them on PCs, because of the interoperability of communication between both systems through the DICOM protocol, which improves workflow. Therefore, the conclusion is that the SIPRAD-Body technique is feasible, because of its degree of similarity, for both new and existing radiotherapy planning services.
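
A hedged sketch of the contour-extraction idea, Canny edges followed by selection of the outer silhouette contour, is shown below on a synthetic slice; the thresholds and the fake "body" ellipse are assumptions, not the SIPRAD-Body pipeline.

```python
import cv2
import numpy as np

# Synthetic stand-in for a tomography slice: a filled ellipse as the body cross-section.
slice_img = np.zeros((256, 256), np.uint8)
cv2.ellipse(slice_img, (128, 128), (90, 60), 0, 0, 360, 180, -1)

edges = cv2.Canny(slice_img, 50, 150)
contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
body = max(contours, key=cv2.contourArea)                # keep the outer silhouette
print("contour points:", len(body), "area:", cv2.contourArea(body))
```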

Keywords: radiotherapy, image processing, DICOM RT, Treatment Planning System (TPS)

Procedia PDF Downloads 298
1722 The Artificial Intelligence Technologies Used in PhotoMath Application

Authors: Tala Toonsi, Marah Alagha, Lina Alnowaiser, Hala Rajab

Abstract:

This report is about the Photomath app, an AI application that uses image recognition technology, specifically optical character recognition (OCR) algorithms. The OCR algorithm translates the image into a mathematical equation, and the app automatically provides a step-by-step solution. The application supports decimals, basic arithmetic, fractions, linear equations, and multiple functions such as logarithms. Testing was conducted to examine the usage of this app, and results were collected by surveying ten participants; the results were then analyzed. This paper seeks to answer the question of how accurate the artificial intelligence features in this app are and how fast it processes input. It is hoped this study will inform users about the efficiency of AI in Photomath.

Keywords: photomath, image recognition, app, OCR, artificial intelligence, mathematical equations

Procedia PDF Downloads 174
1721 Engineering Optimization of Flexible Energy Absorbers

Authors: Reza Hedayati, Meysam Jahanbakhshi

Abstract:

Elastic energy absorbers, which consist of a ring-like plate and springs, can be a good choice for increasing the impact duration during an accident. In the current project, an energy absorber system is optimized using four optimization methods: Kuhn-Tucker, Sequential Linear Programming (SLP), Concurrent Subspace Design (CSD), and Pshenichny-Lim-Belegundu-Arora (PLBA). Solution time, convergence, programming length, and accuracy of the results were considered to find the best solution algorithm. Results showed the superiority of PLBA over the other algorithms.

Keywords: Concurrent Subspace Design (CSD), Kuhn-Tucker, Pshenichny-Lim-Belegundu-Arora (PLBA), Sequential Linear Programming (SLP)

Procedia PDF Downloads 400
1720 The Possibility of Solving a 3x3 Rubik’s Cube under 3 Seconds

Authors: Chung To Kong, Siu Ming Yiu

Abstract:

The Rubik's cube was invented in 1974. Since then, speedcubers all over the world have tried their best to break the world record again and again. The newest record is 3.47 seconds. There are many factors that affect the timing, including turns per second (TPS), the algorithm, finger tricks, and the hardware of the cube. In this paper, the lower bound of the cube-solving time is discussed using convex optimization. Extended analysis of the world records is used to understand how to improve the timing. With an understanding of each part of the solving step, the paper suggests a list of speed improvement techniques. Based on the analysis of the world record, there is a high possibility that the 3-second mark will be broken soon.

Keywords: Rubik's Cube, speed, finger trick, optimization

Procedia PDF Downloads 210
1719 Data Stream Association Rule Mining with Cloud Computing

Authors: B. Suraj Aravind, M. H. M. Krishna Prasad

Abstract:

There exist emerging applications of data streams that require association rule mining, such as network traffic monitoring, web click-stream analysis, sensor data, and data from satellites. Data streams typically arrive continuously at high speed, in huge amounts, and with changing data distributions. This raises new issues that need to be considered when developing association rule mining techniques for stream data. This paper proposes an improved data stream association rule mining algorithm that eliminates resource limitations by using the concept of cloud computing. This inclusion may lead to additional, as yet unknown problems that need further research.

Keywords: data stream, association rule mining, cloud computing, frequent itemsets

Procedia PDF Downloads 505
1718 Comparative Study of Scheduling Algorithms for LTE Networks

Authors: Samia Dardouri, Ridha Bouallegue

Abstract:

Scheduling is the process of dynamically allocating physical resources to User Equipment (UE) based on scheduling algorithms implemented at the LTE base station. Various algorithms have been proposed by network researchers, as the implementation of the scheduling algorithm represents an open issue in the Long Term Evolution (LTE) standard. This paper makes an attempt to study and compare the performance of the PF, MLWDF, and EXP/PF scheduling algorithms. The evaluation considers a single cell with an interference scenario for different flows, such as best effort, video, and VoIP, in pedestrian and vehicular environments using the LTE-Sim network simulator. The comparative study is conducted in terms of system throughput, fairness index, delay, packet loss ratio (PLR), and total cell spectral efficiency.
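
For reference, the Proportional Fair metric alone (MLWDF and EXP/PF add delay-dependent terms) can be sketched as follows; the random channel rates and the averaging constant are assumptions, not LTE-Sim outputs.

```python
import random

def pf_schedule(n_users=4, n_tti=1000, t_c=100.0, seed=0):
    """Each TTI, serve the user maximising r_i / R_i and update averages with an EWMA."""
    rng = random.Random(seed)
    avg = [1e-6] * n_users                     # average throughput R_i (avoid divide-by-zero)
    served = [0] * n_users
    for _ in range(n_tti):
        inst = [rng.uniform(0.1, 1.0) for _ in range(n_users)]     # achievable rates r_i
        i = max(range(n_users), key=lambda u: inst[u] / avg[u])    # PF metric r_i / R_i
        served[i] += 1
        for u in range(n_users):                                   # EWMA update of R_u
            avg[u] = (1 - 1 / t_c) * avg[u] + (1 / t_c) * (inst[u] if u == i else 0.0)
    return served

print(pf_schedule())    # allocations stay roughly balanced across users
```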

Keywords: LTE, multimedia flows, scheduling algorithms, mobile computing

Procedia PDF Downloads 386
1717 Simulation of Obstacle Avoidance for Multiple Autonomous Vehicles in a Dynamic Environment Using Q-Learning

Authors: Andreas D. Jansson

Abstract:

The availability of inexpensive yet competent hardware allows for an increased level of automation and self-optimization in the context of Industry 4.0. However, such agents require high-quality information about their surroundings, along with a robust strategy for collision avoidance, as they may otherwise cause expensive damage to equipment or other agents. Manually defining a strategy to cover all possibilities is both time-consuming and counter-productive given the capabilities of modern hardware. This paper explores the idea of a model-free, self-optimizing obstacle avoidance strategy for multiple autonomous agents in a simulated dynamic environment using the Q-learning algorithm.
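
A minimal tabular Q-learning sketch for a single agent on a small grid with one blocked cell is shown below; the environment, rewards, and hyperparameters are illustrative assumptions, not the paper's multi-agent dynamic setting.

```python
import random

SIZE, OBSTACLE, GOAL = 4, (1, 1), (3, 3)
ACTIONS = [(-1, 0), (1, 0), (0, -1), (0, 1)]
Q = {((r, c), a): 0.0 for r in range(SIZE) for c in range(SIZE) for a in ACTIONS}
alpha, gamma, eps = 0.2, 0.95, 0.2
rng = random.Random(0)

def step(state, action):
    r = min(max(state[0] + action[0], 0), SIZE - 1)
    c = min(max(state[1] + action[1], 0), SIZE - 1)
    nxt = (r, c)
    if nxt == OBSTACLE:
        return state, -10.0            # bumping the obstacle: stay put, big penalty
    return nxt, (10.0 if nxt == GOAL else -1.0)

for _ in range(3000):                  # training episodes
    s = (0, 0)
    while s != GOAL:
        a = rng.choice(ACTIONS) if rng.random() < eps else max(ACTIONS, key=lambda x: Q[(s, x)])
        s2, reward = step(s, a)
        # Q-learning update: Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))
        Q[(s, a)] += alpha * (reward + gamma * max(Q[(s2, x)] for x in ACTIONS) - Q[(s, a)])
        s = s2

print(max(ACTIONS, key=lambda a: Q[((0, 0), a)]))   # best first move learned from the start cell
```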

Keywords: autonomous vehicles, industry 4.0, multi-agent system, obstacle avoidance, Q-learning, simulation

Procedia PDF Downloads 140
1716 Parallel Fuzzy Rough Support Vector Machine for Data Classification in Cloud Environment

Authors: Arindam Chaudhuri

Abstract:

Classification of data has been actively used as one of the most effective and efficient means of conveying knowledge and information to users. The primary focus has always been on techniques for extracting useful knowledge from data such that returns are maximized. With the emergence of huge datasets, existing classification techniques often fail to produce desirable results. The challenge lies in analyzing and understanding the characteristics of massive data sets by retrieving useful geometric and statistical patterns. We propose a supervised parallel fuzzy rough support vector machine (PFRSVM) for data classification in a cloud environment. The classification is performed by PFRSVM using a hyperbolic tangent kernel. The fuzzy rough set model takes care of the sensitiveness of noisy samples and handles impreciseness in training samples, bringing robustness to the results. The membership function is a function of the center and radius of each class in feature space and is represented with the kernel. It plays an important role in sampling the decision surface. The success of PFRSVM is governed by choosing appropriate parameter values. The training samples are either linearly or nonlinearly separable. The different input points make unique contributions to the decision surface. The algorithm is parallelized with a view to reducing training times. The system is built on a support vector machine library using the Hadoop implementation of MapReduce. The algorithm is tested on large data sets to check its feasibility and convergence. The performance of the classifier is also assessed in terms of the number of support vectors. The challenges encountered in implementing big data classification in machine learning frameworks are also discussed. The experiments are done on the cloud environment available at the University of Technology and Management, India. The results are illustrated for Gaussian RBF and Bayesian kernels. The effect of variability in prediction and generalization of PFRSVM is examined with respect to values of the parameter C. It effectively resolves outlier effects, imbalance, and overlapping class problems, generalizes to unseen data, and relaxes the dependency between features and labels. The average classification accuracy of PFRSVM is better than that of other classifiers for both Gaussian RBF and Bayesian kernels. The experimental results on both synthetic and real data sets clearly demonstrate the superiority of the proposed technique.
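
As a hedged sketch of the kernel choice only, the example below trains an ordinary SVM with the hyperbolic tangent (sigmoid) kernel on synthetic data; the fuzzy rough memberships and the Hadoop/MapReduce parallelisation of PFRSVM are not reproduced.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = make_classification(n_samples=2000, n_features=20, n_informative=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

scaler = StandardScaler().fit(X_tr)                       # scaling matters for the tanh kernel
clf = SVC(kernel="sigmoid", C=1.0, gamma="scale").fit(scaler.transform(X_tr), y_tr)
print("test accuracy:", round(clf.score(scaler.transform(X_te), y_te), 3),
      "support vectors:", clf.n_support_.sum())
```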

Keywords: FRSVM, Hadoop, MapReduce, PFRSVM

Procedia PDF Downloads 491
1715 EDM for Prediction of Academic Trends and Patterns

Authors: Trupti Diwan

Abstract:

Predicting student failure at school has become a difficult challenge due to both the large number of factors that can affect the reduced performance of students and the imbalanced nature of these kinds of data sets. This paper surveys the two elements needed to make predictions on students' academic performance, namely parameters and methods. It also proposes a framework for predicting the performance of engineering students. Genetic programming can be used to predict student failure/success. A ranking algorithm is used to rank students according to their credit points. The framework can be used as a basis for system implementation and prediction of students' academic performance in higher learning institutes.

Keywords: classification, educational data mining, student failure, grammar-based genetic programming

Procedia PDF Downloads 423
1714 An Approximation Algorithm for the Non Orthogonal Cutting Problem

Authors: R. Ouafi, F. Ouafi

Abstract:

We study the problem of cutting a rectangular material entity into smaller sub-entities of trapezoidal form with minimum waste of the material. This problem will be denoted TCP (Trapezoidal Cutting Problem). The TCP has many applications in the manufacturing processes of various industries: pipeline design (petrochemistry), airfoil design (aeronautics), or the cutting of components of textile products. We introduce an orthogonal build to provide the optimal horizontal and vertical homogeneous strips. In this paper, we develop a general heuristic search based upon the orthogonal build. By solving two one-dimensional knapsack problems, we combine the horizontal and vertical homogeneous strips to give a non-orthogonal cutting pattern.
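
The one-dimensional knapsack subproblem the heuristic relies on can be sketched as a classic 0/1 dynamic program; the strip widths and values below are made up.

```python
def knapsack(capacity, widths, values):
    """Classic 0/1 knapsack DP: best[c] = best total value achievable within width c."""
    best = [0] * (capacity + 1)
    for w, v in zip(widths, values):
        for c in range(capacity, w - 1, -1):    # reverse order => each strip used at most once
            best[c] = max(best[c], best[c - w] + v)
    return best[capacity]

# Choose strips (width, value) to fill a 10-unit wide plate with maximum covered value.
print(knapsack(10, widths=[3, 4, 5, 6], values=[4, 5, 8, 9]))   # -> 14 (widths 4 and 6)
```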

Keywords: combinatorial optimization, cutting problem, heuristic

Procedia PDF Downloads 542
1713 Glushkov's Construction for Functional Subsequential Transducers

Authors: Aleksander Mendoza

Abstract:

Glushkov's construction has many interesting properties, and they become even more evident when applied to transducers. This article strives to show the vast range of possible extensions and optimisations of this algorithm. A special flavour of regular expressions is introduced, which can be efficiently converted to ε-free functional subsequential weighted finite-state transducers. The produced automata are very compact, as they contain only one state for each symbol (from the input alphabet) of the original expression and only one transition for each range of symbols, no matter how large. Such compactified ranges of transitions allow for efficient binary-search lookup during automaton evaluation. All the methods and algorithms presented here were used to implement an open-source compiler of regular expressions for multitape transducers.

Keywords: weighted automata, transducers, Glushkov, follow automata, regular expressions

Procedia PDF Downloads 164
1712 Development of a New Wearable Device for Automatic Guidance Service

Authors: Dawei Cai

Abstract:

In this paper, we present a new wearable device that provides an automatic guidance service for visitors. By combining the position information from NFC and the orientation information from a 6-axis acceleration and terrestrial magnetism sensor, the direction of the user's head can be calculated. We developed an algorithm to calculate the device orientation based on the data from the acceleration and terrestrial magnetism sensor. If a visitor wants an explanation of an exhibit in front of him, all he has to do is lift up his mobile device. The identification program automatically identifies the status based on the information from NFC and the MEMS sensor and starts playing the explanation content for him. This service may be convenient for elderly people, people with disabilities, or children.
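
One common tilt-compensated heading formula from accelerometer and magnetometer readings is sketched below; axis conventions and signs vary between MEMS parts, so this is an assumption-laden illustration rather than the device's actual algorithm.

```python
from math import atan2, cos, sin, degrees

def heading_deg(ax, ay, az, mx, my, mz):
    """Tilt-compensated compass heading (one common convention; signs may need
    adjusting for a real sensor)."""
    roll = atan2(ay, az)
    pitch = atan2(-ax, ay * sin(roll) + az * cos(roll))
    # Rotate the magnetic field vector back to the horizontal plane.
    bx = mx * cos(pitch) + my * sin(pitch) * sin(roll) + mz * sin(pitch) * cos(roll)
    by = my * cos(roll) - mz * sin(roll)
    return (degrees(atan2(-by, bx)) + 360.0) % 360.0

# Device lying flat (gravity on +z), magnetic field pointing north with some dip.
print(round(heading_deg(ax=0.0, ay=0.0, az=1.0, mx=0.3, my=0.0, mz=0.5), 1))  # ~0 deg (north)
```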

Keywords: wearable device, ubiquitous computing, guide system, MEMS sensor, NFC

Procedia PDF Downloads 426
1711 Global Optimization Techniques for Optimal Placement of HF Antennas on a Shipboard

Authors: Mustafa Ural, Can Bayseferogulari

Abstract:

In this work, the radio frequency (RF) coupling between two HF antennas on a shipboard platform is minimized by determining an optimal antenna placement. Unlike other works, the coupling is minimized not only at a single frequency but over the whole frequency band of operation. Genetic Algorithm Optimization (GAO) and Particle Swarm Optimization (PSO) are used to determine the optimal antenna placement. Throughout this work, the outputs of the two optimization techniques are compared with each other in terms of antenna placements and coupling results. At the end of the work, the far-field radiation pattern performances of the antennas at their optimal places are analyzed in terms of directivity and coverage.

Keywords: electromagnetic compatibility, antenna placement, optimization, genetic algorithm optimization, particle swarm optimization

Procedia PDF Downloads 240
1710 Efficient Alias-Free Level Crossing Sampling

Authors: Negar Riazifar, Nigel G. Stocks

Abstract:

This paper proposes strategies for level crossing (LC) sampling and reconstruction that provide alias-free, high-fidelity signal reconstruction for speech signals without the sample number increasing exponentially with increasing bit-depth. We introduce methods of LC sampling that reduce the sampling rate to close to the Nyquist frequency even for large bit-depths. The results indicate that a larger variation in the sampling intervals leads to an alias-free sampling scheme; this is achieved by either reducing the bit-depth or adding jitter to the system for high bit-depths. In conjunction with windowing, the signal is reconstructed from the LC samples using an efficient Toeplitz reconstruction algorithm.
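
A hedged sketch of plain level-crossing sampling, taking a (time, level) pair whenever the signal crosses one of a set of uniformly spaced amplitude levels, is given below; the jitter, windowing, and Toeplitz reconstruction of the paper are not shown.

```python
import numpy as np

def level_crossing_sample(t, x, n_levels=8):
    """Record a (time, level) sample at every crossing of a uniformly spaced level grid,
    using linear interpolation for the crossing instant."""
    levels = np.linspace(x.min(), x.max(), n_levels)
    samples = []
    for i in range(1, len(x)):
        for lv in levels:
            if (x[i - 1] - lv) * (x[i] - lv) < 0:            # sign change => crossing
                frac = (lv - x[i - 1]) / (x[i] - x[i - 1])
                samples.append((t[i - 1] + frac * (t[i] - t[i - 1]), lv))
    return samples

t = np.linspace(0, 1, 2000)
x = np.sin(2 * np.pi * 5 * t) + 0.3 * np.sin(2 * np.pi * 40 * t)
s = level_crossing_sample(t, x)
print(len(s), "non-uniform samples instead of", len(t), "uniform ones")
```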

Keywords: alias-free, level crossing sampling, spectrum, trigonometric polynomial

Procedia PDF Downloads 214
1709 Reactive Power Cost Evaluation with FACTS Devices in Restructured Power System

Authors: A. S. Walkey, N. P. Patidar

Abstract:

It is not always economical to provide reactive power using synchronous alternators. The cost of reactive power can be minimized by the optimal placement of FACTS devices in power systems. In this paper, a Particle Swarm Optimization-Sequential Quadratic Programming (PSO-SQP) algorithm is applied to minimize the cost of reactive power generation along with real power generation to alleviate bus voltage violations. The effectiveness of the proposed approach is tested on the IEEE 14-bus system. In addition to synchronous generators, FACTS devices are also proposed to procure the reactive power demands in the power system.

Keywords: reactive power, reactive power cost, voltage security margins, capability curve, FACTS devices

Procedia PDF Downloads 508