Search results for: biologically inspired algorithm
2743 Quadrature Mirror Filter Bank Design Using Population Based Stochastic Optimization
Authors: Ju-Hong Lee, Ding-Chen Chung
Abstract:
The paper deals with the optimal design of two-channel linear-phase (LP) quadrature mirror filter (QMF) banks using a metaheuristic-based optimization technique. Based on the theory of two-channel QMF banks using two recursive digital all-pass filters (DAFs), the design problem is appropriately formulated to result in an objective function which is a weighted sum of the group delay error of the designed QMF bank and the magnitude response error of the designed low-pass analysis filter. Through a frequency sampling and a weighted least squares approach, the optimization problem of the objective function can be solved by utilizing a particle swarm optimization algorithm. The resulting two-channel QMF banks can possess approximately LP response without magnitude distortion. Simulation results are presented for illustration and comparison.
Keywords: quadrature mirror filter bank, digital all-pass filter, weighted least squares algorithm, particle swarm optimization
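As a rough illustration of the optimization step described above, the sketch below uses a generic particle swarm optimizer to minimize a weighted sum of two error terms. The placeholder objective, weighting factor, swarm parameters, and bounds are assumptions for illustration only, not the filter-bank formulation used by the authors.

```python
import numpy as np

def pso_minimize(objective, dim, bounds, n_particles=30, iters=200,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimizer (global-best topology)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, size=(n_particles, dim))      # positions
    v = np.zeros_like(x)                                   # velocities
    pbest, pbest_f = x.copy(), np.array([objective(p) for p in x])
    g = pbest[np.argmin(pbest_f)].copy()                   # global best
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([objective(p) for p in x])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        g = pbest[np.argmin(pbest_f)].copy()
    return g, pbest_f.min()

# Placeholder weighted-sum objective: alpha * group-delay error + (1 - alpha) * magnitude error
def weighted_objective(params, alpha=0.5):
    group_delay_error = np.sum((params - 0.3) ** 2)   # stand-in error term
    magnitude_error = np.sum((params + 0.1) ** 2)     # stand-in error term
    return alpha * group_delay_error + (1 - alpha) * magnitude_error

best, best_cost = pso_minimize(weighted_objective, dim=8, bounds=(-1.0, 1.0))
print(best, best_cost)
```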
Procedia PDF Downloads 521
2742 Development of an Automatic Computational Machine Learning Pipeline to Process Confocal Fluorescence Images for Virtual Cell Generation
Authors: Miguel Contreras, David Long, Will Bachman
Abstract:
Background: Microscopy plays a central role in cell and developmental biology. In particular, fluorescence microscopy can be used to visualize specific cellular components and subsequently quantify their morphology through the development of virtual-cell models for studying the effects of mechanical forces on cells. However, there are challenges with these imaging experiments, which can make it difficult to quantify cell morphology: inconsistent results, time-consuming and potentially costly protocols, and a limitation on the number of labels due to spectral overlap. To address these challenges, the objective of this project is to develop an automatic computational machine learning pipeline to predict cellular component morphology for virtual-cell generation based on fluorescence cell membrane confocal z-stacks. Methods: Registered confocal z-stacks of nuclei and cell membranes of endothelial cells, consisting of 20 images each, were obtained from fluorescence confocal microscopy and normalized through a software pipeline so that each image has a mean pixel intensity value of 0.5. An open-source machine learning algorithm, originally developed to predict fluorescence labels on unlabeled transmitted light microscopy cell images, was trained using this set of normalized z-stacks on a single-CPU machine. Through transfer learning, the algorithm used knowledge acquired from its previous training sessions to learn the new task. Once trained, the algorithm was used to predict the morphology of nuclei using normalized cell membrane fluorescence images as input. Predictions were compared to the ground-truth fluorescence nuclei images. Results: After one week of training, using one cell membrane z-stack (20 images) and the corresponding nuclei label, results showed qualitatively good predictions on the training set. The algorithm was able to accurately predict nuclei locations as well as shape when fed only fluorescence membrane images. Similar training sessions with improved membrane image quality, including clear lining and shape of the membrane that clearly showed the boundaries of each cell, proportionally improved nuclei predictions, reducing errors relative to the ground truth. Discussion: These results show the potential of pre-trained machine learning algorithms to predict cell morphology using relatively small amounts of data and training time, eliminating the need for multiple labels in immunofluorescence experiments. With further training, the algorithm is expected to predict different labels (e.g., focal-adhesion sites, cytoskeleton), which can be added to the automatic machine learning pipeline for direct input into Principal Component Analysis (PCA) for the generation of virtual-cell mechanical models.
Keywords: cell morphology prediction, computational machine learning, fluorescence microscopy, virtual-cell models
Procedia PDF Downloads 205
2741 Supervised-Component-Based Generalised Linear Regression with Multiple Explanatory Blocks: THEME-SCGLR
Authors: Bry X., Trottier C., Mortier F., Cornu G., Verron T.
Abstract:
We address component-based regularization of a Multivariate Generalized Linear Model (MGLM). A set of random responses Y is assumed to depend, through a GLM, on a set X of explanatory variables, as well as on a set T of additional covariates. X is partitioned into R conceptually homogeneous blocks X1, ..., XR, viewed as explanatory themes. Variables in each Xr are assumed many and redundant. Thus, Generalised Linear Regression (GLR) demands regularization with respect to each Xr. By contrast, variables in T are assumed selected so as to demand no regularization. Regularization is performed by searching each Xr for an appropriate number of orthogonal components that both contribute to modelling Y and capture relevant structural information in Xr. We propose a very general criterion to measure the structural relevance (SR) of a component in a block, and show how to take SR into account within a Fisher-scoring-type algorithm in order to estimate the model. We show how to deal with mixed-type explanatory variables. The method, named THEME-SCGLR, is tested on simulated data.
Keywords: Component-Model, Fisher Scoring Algorithm, GLM, PLS Regression, SCGLR, SEER, THEME
Procedia PDF Downloads 396
2740 Dynamic Background Updating for Lightweight Moving Object Detection
Authors: Kelemewerk Destalem, Joongjae Cho, Jaeseong Lee, Ju H. Park, Joonhyuk Yoo
Abstract:
Background subtraction and temporal difference are often used for moving object detection in video. Both approaches are computationally simple and easy to deploy in real-time image processing. However, while background subtraction is highly sensitive to dynamic background and illumination changes, the temporal difference approach is poor at extracting the relevant pixels of a moving object and at detecting stopped or slowly moving objects in the scene. In this paper, we propose a moving object detection scheme based on adaptive background subtraction and temporal difference exploiting dynamic background updates. The proposed technique consists of histogram equalization and a linear combination of background and temporal difference, followed by novel frame-based and pixel-based background updating techniques. Finally, morphological operations are applied to the output images. Experimental results show that the proposed algorithm can overcome the drawbacks of both the background subtraction and temporal difference methods and can provide better performance than either method alone.
Keywords: background subtraction, background updating, real time, light weight algorithm, temporal difference
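A rough OpenCV sketch of this kind of pipeline is given below: histogram equalization, a weighted combination of background subtraction and temporal (frame) difference, thresholding, and morphological post-processing. The blending weight, thresholds, and the simple running-average background update are illustrative assumptions, not the authors' frame-based and pixel-based updating rules.

```python
import cv2
import numpy as np

ALPHA = 0.6      # assumed weight of background subtraction vs. temporal difference
THRESH = 25      # assumed binarization threshold
BG_RATE = 0.02   # assumed learning rate of a simple running-average background update

cap = cv2.VideoCapture("input.avi")          # placeholder video file
ok, frame = cap.read()
prev = cv2.equalizeHist(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
background = prev.astype(np.float32)
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.equalizeHist(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))

    bg_diff = cv2.absdiff(gray, cv2.convertScaleAbs(background))  # background subtraction
    tmp_diff = cv2.absdiff(gray, prev)                            # temporal difference
    combined = cv2.addWeighted(bg_diff, ALPHA, tmp_diff, 1 - ALPHA, 0)

    _, mask = cv2.threshold(combined, THRESH, 255, cv2.THRESH_BINARY)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)   # remove speckle noise
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)  # fill small holes

    cv2.accumulateWeighted(gray, background, BG_RATE)        # naive background update
    prev = gray
    cv2.imshow("moving objects", mask)
    if cv2.waitKey(1) == 27:
        break
```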
Procedia PDF Downloads 342
2739 Experimental Analysis of Control in Electric Vehicle Charging Station Based Grid Tied Photovoltaic-Battery System
Authors: A. Hassoune, M. Khafallah, A. Mesbahi, T. Bouragba
Abstract:
This work presents an improved control strategy for charging a lithium-ion battery in an electric vehicle charging station using two charger topologies, i.e., a single-ended primary inductor converter (SEPIC) and a forward converter. The power system consists of a topology/control scheme designed to overcome performance constraints, for instance power instability and battery overloading, in terms of rapidity and accuracy, and to ensure that the energy conversion blocks react efficiently to any kind of perturbation. Simulation results show the effectiveness of the proposed topologies operated with a power management algorithm based on voltage/peak current mode controls. In order to provide credible findings, a low-power prototype is developed to test the control strategy via experimental evaluations of the converter topology and its controls.
Keywords: battery storage buffer, charging station, electric vehicle, experimental analysis, management algorithm, switches control
Procedia PDF Downloads 165
2738 The Role of Inflammasomes for Aβ Microglia Phagocytosis in Alzheimer Disease
Authors: Francesca La Rosa, Marina Saresella, Mario Clerici, Michael Heneka
Abstract:
Neuroinflammation plays a key role in the modulation of the pathogenesis of neurodegenerative disorders such as Alzheimer's Disease (AD). Microglia, the main immune effectors of the brain, are able to migrate to sites of Amyloid-beta (Aβ) deposition and to eliminate Aβ by phagocytosis upon activation by multiple receptors: Toll-like receptors and scavenger receptors. The issue of whether microglia are able to eliminate pathological lesions such as neurofibrillary tangles or senile plaques from the AD brain still remains a matter of controversy. Recent data suggest that the NOD-like receptor 3 (NLRP3) multiprotein inflammasome complex plays a role in AD, as it is activated in microglia by Aβ. IL-1β is produced as a biologically inactive pro-form and requires caspase-1 for activation and secretion. Caspase-1 activity is controlled by inflammasomes. We investigate the importance of the inflammasome complex in Aβ phagocytosis and its degradation. Preliminary results from phagocytosis assays and immunofluorescence experiments on primary microglial cells exposed to lipopolysaccharide (LPS) and Aβ show that prior treatment with LPS reduces Aβ phagocytosis. Different results were obtained in wild-type, NLRP3-knockout and ASC-knockout primary microglia, suggesting a genuine inflammasome involvement in Alzheimer's pathology. Inflammasome inactivation reduces the production of inflammatory cytokines, prolonging the protective activity of microglia and Aβ clearance, featuring a typical microglia phenotype of the early stage of AD.
Keywords: Alzheimer disease, innate immunity, neuroinflammation, NLRP3
Procedia PDF Downloads 456
2737 Estimation of Synchronous Machine Synchronizing and Damping Torque Coefficients
Authors: Khaled M. EL-Naggar
Abstract:
Synchronizing and damping torque coefficients of a synchronous machine can give a quite clear picture of machine behavior during transients. These coefficients are used as a power system transient stability measurement. In this paper, a crow search optimization algorithm is presented and implemented to study power system stability during transients. The algorithm makes use of the machine responses to perform the stability study in the time domain. The problem is formulated as a dynamic estimation problem. An objective function that minimizes the squared error in the estimated coefficients is designed. The method is tested using a practical system with different study cases. Results are reported and a thorough discussion is presented. The study illustrates that the proposed method can estimate the stability coefficients for critical stable cases where other methods may fail. The tests proved that the proposed tool is an accurate and reliable tool for estimating the machine coefficients for assessment of power system stability.
Keywords: optimization, estimation, synchronous, machine, crow search
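For context, the sketch below shows a bare-bones crow search metaheuristic applied to a least-squares parameter-estimation objective. The stand-in objective (fitting coefficients of a simple model to sampled responses), flight length, awareness probability, and bounds are illustrative assumptions, not the formulation used in the paper.

```python
import numpy as np

def crow_search(objective, dim, bounds, n_crows=25, iters=300,
                flight_length=2.0, awareness_prob=0.1, seed=1):
    """Minimal crow search algorithm (CSA) for continuous minimization."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, size=(n_crows, dim))     # current positions
    mem = x.copy()                                    # each crow's best-known hiding place
    mem_f = np.array([objective(p) for p in mem])
    for _ in range(iters):
        for i in range(n_crows):
            j = rng.integers(n_crows)                 # crow i follows a random crow j
            if rng.random() >= awareness_prob:        # crow j unaware: move toward its memory
                x[i] = x[i] + rng.random() * flight_length * (mem[j] - x[i])
            else:                                     # crow j aware: move to a random position
                x[i] = rng.uniform(lo, hi, size=dim)
            x[i] = np.clip(x[i], lo, hi)
            f = objective(x[i])
            if f < mem_f[i]:                          # update memory if improved
                mem[i], mem_f[i] = x[i].copy(), f
    best = np.argmin(mem_f)
    return mem[best], mem_f[best]

# Placeholder estimation objective: squared error between a model and "measured" responses
t = np.linspace(0, 1, 200)
measured = 1.2 * np.sin(8 * t) + 0.4 * t              # stand-in machine response
def sse(coeffs):
    model = coeffs[0] * np.sin(8 * t) + coeffs[1] * t
    return np.sum((model - measured) ** 2)

coeffs, err = crow_search(sse, dim=2, bounds=(-5.0, 5.0))
print(coeffs, err)
```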
Procedia PDF Downloads 140
2736 Finding Related Scientific Documents Using Formal Concept Analysis
Authors: Nadeem Akhtar, Hira Javed
Abstract:
An important aspect of research is the literature survey. The availability of a large amount of literature across different domains triggers the need for optimized systems that provide relevant literature to researchers. We propose a keyword-based search system for text documents. This experimental approach provides a hierarchical structure for the document corpus. The documents are labelled with keywords using KEA (Keyword Extraction Algorithm) and are automatically organized in a lattice structure using Formal Concept Analysis (FCA). This groups the semantically related documents together. The hierarchical structure, based on keywords, returns only those documents which precisely contain them. This approach opens doors for multi-domain research: documents across multiple domains which are indexed by similar keywords are grouped together, and a hierarchical relationship between keywords is obtained. To demonstrate the effectiveness of the approach, we have carried out experiments and evaluation on the SemEval-2010 dataset. Results show that the presented method is considerably successful in indexing scientific papers.
Keywords: formal concept analysis, keyword extraction algorithm, scientific documents, lattice
Procedia PDF Downloads 332
2735 Design and Development of Fleet Management System for Multi-Agent Autonomous Surface Vessel
Authors: Zulkifli Zainal Abidin, Ahmad Shahril Mohd Ghani
Abstract:
Agent-based systems technology has been addressed as a new paradigm for conceptualizing, designing, and implementing software systems. Agents are sophisticated systems that act autonomously across open and distributed environments in solving problems. Nevertheless, it is impractical to rely on a single agent to do all the computing processes required to solve complex problems. An increasing number of applications lately require multiple agents to work together. A multi-agent system (MAS) is a loosely coupled network of agents that interact to solve problems that are beyond the individual capacities or knowledge of each problem solver. However, a network of MAS still requires a main system to govern or oversee the operation of the agents in order to achieve a unified goal. We developed a fleet management system (FMS) to manage the fleet of agents, plan routes for the agents, perform real-time data processing and analysis, and issue sets of general and specific instructions to the agents. This FMS should be able to perform real-time data processing, communicate with the autonomous surface vehicle (ASV) agents and generate a bathymetric map according to the data received from each ASV unit. The first algorithm is developed to communicate with the ASV via radio communication using standard National Marine Electronics Association (NMEA) protocol sentences. The second algorithm takes care of path planning, formation and pattern generation, and is tested using various sample data. Lastly, the bathymetry map generation algorithm makes use of the data collected by the agents to create a bathymetry map in real time. The outcome of this research is expected to be applicable to various other multi-agent systems.
Keywords: autonomous surface vehicle, fleet management system, multi agent system, bathymetry
Procedia PDF Downloads 271
2734 Estimation of the Temperatures in an Asynchronous Machine Using Extended Kalman Filter
Authors: Yi Huang, Clemens Guehmann
Abstract:
In order to monitor the thermal behavior of an asynchronous machine with a squirrel cage rotor, a 9th-order extended Kalman filter (EKF) algorithm is implemented to estimate the temperatures of the stator windings, the rotor cage and the stator core. The state-space equations of the EKF are established based on the electrical, mechanical and simplified thermal models of an asynchronous machine. The asynchronous machine with the simplified thermal model in Dymola is compiled as a DymolaBlock, a physical model in MATLAB/Simulink. The coolant air temperature, three-phase voltages and currents are exported from the physical model and are processed by the EKF estimator as inputs. Compared to the temperatures exported from the physical model of the machine, the three sets of temperatures can be estimated quite accurately by the EKF estimator. The online EKF estimator is independent of the machine control algorithm and can work under any speed and load condition as long as the stator current system is nonzero.
Keywords: asynchronous machine, extended Kalman filter, resistance, simulation, temperature estimation, thermal model
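For readers unfamiliar with the estimator, the sketch below shows the generic predict/update cycle of a discrete-time extended Kalman filter. The state-transition and measurement functions, their Jacobians, and the noise covariances are left as placeholders; the paper's 9th-order electro-thermal state-space model is not reproduced here.

```python
import numpy as np

class ExtendedKalmanFilter:
    """Generic discrete-time EKF: x_k = f(x_{k-1}, u_k) + w,  z_k = h(x_k) + v."""

    def __init__(self, f, h, F_jac, H_jac, Q, R, x0, P0):
        self.f, self.h = f, h                   # nonlinear state-transition and measurement functions
        self.F_jac, self.H_jac = F_jac, H_jac   # their Jacobians
        self.Q, self.R = Q, R                   # process and measurement noise covariances
        self.x, self.P = x0, P0                 # state estimate and its covariance

    def step(self, u, z):
        # Predict
        F = self.F_jac(self.x, u)
        self.x = self.f(self.x, u)
        self.P = F @ self.P @ F.T + self.Q
        # Update
        H = self.H_jac(self.x)
        y = z - self.h(self.x)                  # innovation
        S = H @ self.P @ H.T + self.R           # innovation covariance
        K = self.P @ H.T @ np.linalg.inv(S)     # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(len(self.x)) - K @ H) @ self.P
        return self.x
```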
Procedia PDF Downloads 285
2733 Phantom and Clinical Evaluation of Block Sequential Regularized Expectation Maximization Reconstruction Algorithm in Ga-PSMA PET/CT Studies Using Various Relative Difference Penalties and Acquisition Durations
Authors: Fatemeh Sadeghi, Peyman Sheikhzadeh
Abstract:
Introduction: The Block Sequential Regularized Expectation Maximization (BSREM) reconstruction algorithm was recently developed to suppress excessive noise by applying a relative difference penalty. The aim of this study was to investigate the effect of various strengths of the noise penalization factor in the BSREM algorithm under different acquisition durations and lesion sizes in order to determine an optimum penalty factor by considering both quantitative and qualitative image evaluation parameters in clinical use. Materials and Methods: The NEMA IQ phantom and 15 clinical whole-body patients with prostate cancer were evaluated. The phantom and patients were injected with Gallium-68 Prostate-Specific Membrane Antigen (68Ga-PSMA) and scanned on a non-time-of-flight Discovery IQ Positron Emission Tomography/Computed Tomography (PET/CT) scanner with BGO crystals. The data were reconstructed using BSREM with β-values of 100-500 in steps of 100. These reconstructions were compared to OSEM as a widely used reconstruction algorithm. Following the standard NEMA measurement procedure, background variability (BV), recovery coefficient (RC), contrast recovery (CR) and residual lung error (LE) were measured from the phantom data, and signal-to-noise ratio (SNR), signal-to-background ratio (SBR) and tumor SUV were measured from the clinical data. Qualitative features of the clinical images were visually ranked by one nuclear medicine expert. Results: The β-value acts as a noise suppression factor, so BSREM showed decreasing image noise with an increasing β-value. BSREM with a β-value of 400 at a decreased acquisition duration (2 min/bp) yielded an approximately equal noise level to OSEM at an increased acquisition duration (5 min/bp). For the β-value of 400 at a 2 min/bp duration, SNR increased by 43.7% and LE decreased by 62% compared with OSEM at a 5 min/bp duration. In both phantom and clinical data, an increase in the β-value translates into a decrease in SUV. The lowest levels of SUV and noise were reached with the highest β-value (β=500), resulting in the highest SNR and lowest SBR, due to the greater reduction of noise than of SUV at the highest β-value. In comparing BSREM reconstructions with different β-values, the relative difference in the quantitative parameters was generally larger for smaller lesions. As the β-value decreased from 500 to 100, the increase in CR was 160.2% for the smallest sphere (10 mm) and 12.6% for the largest sphere (37 mm), and the trend was similar for SNR (-58.4% and -20.5%, respectively). BSREM was visually ranked above OSEM in all qualitative features. Conclusions: The BSREM algorithm using more iterations leads to more quantitative accuracy without excessive noise, which translates into higher overall image quality and lesion detectability. This improvement can be used to shorten the acquisition time.
Keywords: BSREM reconstruction, PET/CT imaging, noise penalization, quantification accuracy
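For reference, a commonly used form of the relative difference penalty that BSREM-type reconstructions add to the Poisson log-likelihood is sketched below; the exact parameterization implemented on a given scanner may differ.

```latex
% Penalized objective maximized by BSREM-type algorithms (sketch):
%   \Phi(\lambda) = L(\lambda) - \beta \, U(\lambda)
% with the relative difference penalty summed over neighbouring voxels j, k:
\[
U(\lambda) \;=\; \sum_{j} \sum_{k \in N_j} w_{jk}\,
\frac{(\lambda_j - \lambda_k)^2}{(\lambda_j + \lambda_k) + \gamma\,\lvert \lambda_j - \lambda_k \rvert}
\]
% \lambda_j: activity in voxel j, N_j: neighbourhood of voxel j, w_{jk}: neighbour weights,
% \beta: global penalty strength (the "β-value" varied in this study),
% \gamma: edge-preservation parameter of the relative difference penalty.
```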
Procedia PDF Downloads 97
2732 Building Scalable and Accurate Hybrid Kernel Mapping Recommender
Authors: Hina Iqbal, Mustansar Ali Ghazanfar, Sandor Szedmak
Abstract:
Recommender systems use artificial intelligence techniques to filter obscure information and can predict whether a user will like a specified item. Kernel Mapping Recommender (KMR) systems have been proposed that are accurate, state-of-the-art algorithms and address recommender system design objectives such as the long tail, cold start, and sparsity. The aim of this research is to propose a hybrid framework that can efficiently integrate different versions of the KMR algorithm, namely item-based and user-based KMR. We have proposed various heuristic algorithms that integrate the different versions of KMR into a unified framework, resulting in improved accuracy and elimination of the problems associated with conventional recommender systems. We have tested our system on a publicly available movies dataset and benchmarked it against KMR. The results (in terms of accuracy, precision, recall, F1 measure and ROC metrics) reveal that the proposed algorithm is quite accurate, especially under cold-start and sparse scenarios.
Keywords: Kernel Mapping Recommender Systems, hybrid recommender systems, cold start, sparsity, long tail
Procedia PDF Downloads 338
2731 Hospitality Genealogy: Tracing the Ethics and Ontologies of Hospitality-Making on the Silk-Routes
Authors: Neil Michael Walsh, Angelique Lombarts
Abstract:
The authors propose that hospitality is 'made' (constituted and performed) in the encounters on the Silk Routes. Inspired by an initial Derridean perspective on hospitality (the conditional/unconditional) and methodologically underpinned by a Deleuzian relational-rhizomatic approach, the authors contend that hospitality is (re)produced in the encounters of self/other and east/west (among others). Thus, in the spirit of performativity and using the temporal-spatial conduit of the Silk Routes (the sites of ethical, cultural, economic, and material interaction of such exchange), the authors conclude that hospitality is produced at the moment in which it is performed. Key themes engaged as units of analysis are welcome, reception, hostility (and so on), which the authors examine, as they unfold, in the narratives, accounts and material legacies of those who travelled the Silk Routes between the 2nd and 18th centuries. The preliminary results suggest that these earlier performative moments in hospitality-making on the Silk Routes continue to resonate and 'form' the hospitalities of today. Indeed, these acts of hospitality continue to be reconstituted and are never a final state of affairs.
Keywords: hospitality-genealogy, interactions, hospitality-making, Silk-Routes, rhizome, relationality
Procedia PDF Downloads 134
2730 A Review on Applications of Evolutionary Algorithms to Reservoir Operation for Hydropower Production
Authors: Nkechi Neboh, Josiah Adeyemo, Abimbola Enitan, Oludayo Olugbara
Abstract:
Evolutionary algorithms are techniques extensively used in the planning and management of water resources and systems. They are useful in finding optimal solutions to water resources problems, considering the complexities involved in the analysis. River basin management is an essential area that involves the management of upstream flow, river inflow and outflow, including downstream aspects of a reservoir. Water, as a scarce resource, is needed by humans and the environment for survival, and its management involves many complexities. Management of this scarce resource is necessary for proper distribution to competing users in a river basin, which presents many constraints and conflicting objectives. Evolutionary algorithms are very useful in solving these kinds of complex problems with ease; they are easy to use, fast and robust, with many other advantages. Many applications of evolutionary algorithms, which are population-based search algorithms, are discussed. Different methodologies involved in the modeling and simulation of water management problems in river basins are explained. It was found from this work that different evolutionary algorithms are suitable for different problems. Therefore, appropriate algorithms are suggested for different methodologies and applications based on the results of previous studies reviewed. It is concluded that evolutionary algorithms, with wide applications in water resources management, are viable and easy algorithms for most of the applications. The results suggest that evolutionary algorithms, applied in the right application areas, can provide superior solutions for river basin management, especially in reservoir operations, irrigation planning and management, stream flow forecasting and real-time applications. Future directions for this work are suggested. This study will assist decision makers and stakeholders in choosing the best evolutionary algorithm for varied optimization issues in water resources management.
Keywords: evolutionary algorithm, multi-objective, reservoir operation, river basin management
Procedia PDF Downloads 491
2729 Automatic Classification of Periodic Heart Sounds Using Convolutional Neural Network
Authors: Jia Xin Low, Keng Wah Choo
Abstract:
This paper presents an automatic normal and abnormal heart sound classification model developed based on a deep learning algorithm. MITHSDB heart sound datasets obtained from the 2016 PhysioNet/Computing in Cardiology Challenge database were used in this research, with the assumption that the electrocardiograms (ECG) were recorded simultaneously with the heart sounds (phonocardiogram, PCG). The PCG time series are segmented per heartbeat, and each sub-segment is converted to form a square intensity matrix and classified using convolutional neural network (CNN) models. This approach removes the need to provide classification features for the supervised machine learning algorithm; instead, the features are determined automatically through training, from the time series provided. The results prove that the prediction model is able to provide reasonable and comparable classification accuracy despite its simple implementation. This approach can be used for real-time classification of heart sounds in the Internet of Medical Things (IoMT), e.g., remote monitoring applications of the PCG signal.
Keywords: convolutional neural network, discrete wavelet transform, deep learning, heart sound classification
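A minimal sketch of the approach described above, reshaping each per-beat PCG segment into a square intensity matrix and classifying it with a small CNN, is given below using Keras. The segment length, matrix size, network depth, and the random placeholder data are assumptions for illustration, not the authors' architecture.

```python
import numpy as np
import tensorflow as tf

SEG_LEN = 4096        # assumed samples per heart-beat segment
SIDE = 64             # 64 x 64 = 4096: square intensity matrix

def segment_to_matrix(segment):
    """Normalize a per-beat PCG segment and reshape it into a square intensity matrix."""
    seg = (segment - segment.min()) / (segment.ptp() + 1e-8)
    return seg.reshape(SIDE, SIDE, 1)

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(16, 3, activation="relu", input_shape=(SIDE, SIDE, 1)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),   # normal vs. abnormal
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# x: per-beat segments, y: 0 = normal, 1 = abnormal (random placeholder data)
x = np.stack([segment_to_matrix(s) for s in np.random.randn(32, SEG_LEN)])
y = np.random.randint(0, 2, size=32)
model.fit(x, y, epochs=2, batch_size=8)
```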
Procedia PDF Downloads 348
2728 A Novel Combined Finger Counting and Finite State Machine Technique for ASL Translation Using Kinect
Authors: Rania Ahmed Kadry Abdel Gawad Birry, Mohamed El-Habrouk
Abstract:
This paper presents a brief survey of the techniques used for sign language recognition along with the types of sensors used to perform the task. It presents a modified method for identification of an isolated sign language gesture using Microsoft Kinect with the OpenNI framework. It describes how to extract robust features from the depth image provided by Microsoft Kinect and the OpenNI interface and how to use them in creating a robust and accurate gesture recognition system for the purpose of ASL translation. PrimeSense's Natural Interaction Technology for End-user (NITE™) was also used in the C++ implementation of the system. The algorithm presents a simple finger counting algorithm for static signs as well as a directional Finite State Machine (FSM) description of the hand motion in order to help in translating a sign language gesture. This includes both letters and numbers performed by a user, which in turn may be used as an input for voice pronunciation systems.
Keywords: American sign language, finger counting, hand tracking, Microsoft Kinect
Procedia PDF Downloads 296
2727 Registration of Multi-Temporal Unmanned Aerial Vehicle Images for Facility Monitoring
Authors: Dongyeob Han, Jungwon Huh, Quang Huy Tran, Choonghyun Kang
Abstract:
Unmanned Aerial Vehicles (UAVs) have been used for surveillance, monitoring, inspection, and mapping. In this paper, we present a systematic approach for the automatic registration of UAV images for monitoring facilities such as buildings, greenhouses, and civil structures. A two-step process is applied: 1) an image matching technique based on SURF (Speeded Up Robust Features) and RANSAC (Random Sample Consensus), and 2) bundle adjustment of the multi-temporal images. Image matching to find corresponding points is one of the most important steps for the precise registration of multi-temporal images. We used the SURF algorithm to find matching points quickly and effectively. The RANSAC algorithm was used both in the process of finding matching points between images and in the bundle adjustment process. Experimental results from UAV images showed that our approach is sufficiently accurate to be applied to facility change detection.
Keywords: building, image matching, temperature, unmanned aerial vehicle
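The matching step described above can be sketched with OpenCV as follows: detect SURF keypoints in two temporally separated images, match descriptors, and estimate a homography with RANSAC to reject outliers. Note that SURF requires an opencv-contrib build with non-free modules enabled (ORB can be substituted otherwise), the thresholds and file names are placeholders, and the subsequent bundle adjustment of the multi-temporal block is not shown.

```python
import cv2
import numpy as np

img1 = cv2.imread("epoch1.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("epoch2.jpg", cv2.IMREAD_GRAYSCALE)

# SURF keypoints and descriptors (needs opencv-contrib with non-free modules)
surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)
kp1, des1 = surf.detectAndCompute(img1, None)
kp2, des2 = surf.detectAndCompute(img2, None)

# Match descriptors and keep the best candidates with Lowe's ratio test
matcher = cv2.BFMatcher(cv2.NORM_L2)
matches = matcher.knnMatch(des1, des2, k=2)
good = [m for m, n in matches if m.distance < 0.7 * n.distance]

# RANSAC-based homography estimation rejects mismatched points
src = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
dst = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
H, inlier_mask = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)

registered = cv2.warpPerspective(img1, H, (img2.shape[1], img2.shape[0]))
print("inliers:", int(inlier_mask.sum()), "of", len(good))
```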
Procedia PDF Downloads 292
2726 Optimal Design of Tuned Inerter Damper-Based System for the Control of Wind-Induced Vibration in Tall Buildings through Cultural Algorithm
Authors: Luis Lara-Valencia, Mateo Ramirez-Acevedo, Daniel Caicedo, Jose Brito, Yosef Farbiarz
Abstract:
Controlling wind-induced vibrations as well as aerodynamic forces is an essential part of the structural design of tall buildings in order to guarantee the serviceability limit state of the structure. This paper presents a numerical investigation of the optimal design parameters of a Tuned Inerter Damper (TID) based system for the control of wind-induced vibration in tall buildings. The control system is based on the conventional TID, with the main difference that its location is changed from the ground level to the last two story levels of the structural system. The TID tuning procedure is based on an evolutionary cultural algorithm in which the optimum design variables, defined as the frequency and damping ratios, were searched according to the optimization criterion of minimizing the root mean square (RMS) displacement response at the nth story of the structure. A Monte Carlo simulation was used to represent the dynamic action of the wind in the time domain, in which a time series derived from the Davenport spectrum using eleven harmonic functions with randomly chosen phase angles was reproduced. The above-mentioned methodology was applied to a case study of a 37-story, 144 m tall prestressed concrete building in which the wind action governs over the seismic action. The results showed that the optimally tuned TID is effective in reducing the RMS displacement response by up to 25%, which demonstrates the feasibility of the system for the control of wind-induced vibrations in tall buildings.
Keywords: evolutionary cultural algorithm, Monte Carlo simulation, tuned inerter damper, wind-induced vibrations
Procedia PDF Downloads 135
2725 Optimization of Dez Dam Reservoir Operation Using Genetic Algorithm
Authors: Alireza Nikbakht Shahbazi, Emadeddin Shirali
Abstract:
Since optimization issues of water resources are complicated due to the variety of decision-making criteria and objective functions, it is sometimes impossible to resolve them through regular optimization methods, or doing so is time- and money-consuming. Therefore, the use of modern tools and methods is inevitable in resolving such problems. An accurate and essential utilization policy has to be determined in order to use natural resources such as water reservoirs optimally. Water reservoir programming studies aim to determine the final cultivated land area based on predefined agricultural models and water requirements; the dam utilization rule curve is also provided in such studies. The basic information applied in water reservoir programming studies generally includes meteorological, hydrological, agricultural and water reservoir related data, and the geometric characteristics of the reservoir. The system of Dez dam water resources was simulated using this basic information in order to determine the capability of its reservoir to meet the objectives of the performed plan. As a metaheuristic method, a genetic algorithm was applied in order to provide utilization rule curves (intersecting the reservoir volume). MATLAB software was used to solve the aforesaid model. Rule curves were first obtained through the genetic algorithm. Then the significance of using rule curves and the decrease in decision-making variables in the system were determined through system simulation and by comparing the results with the optimization results (Standard Operating Procedure). One of the most essential issues in the optimization of a complicated water resource system is the increasing number of variables; therefore a lot of time is required to find an optimum answer, and in some cases no desirable result is obtained. In this research, intersecting the reservoir volume has been applied as a modern model in order to reduce the number of variables. Water reservoir programming studies have been performed based on the basic information, general hypotheses and standards, applying a monthly simulation technique for a statistical period of 30 years. Results indicated that application of the rule curve prevents extreme shortages and decreases the monthly shortages.
Keywords: optimization, rule curve, genetic algorithm method, Dez dam reservoir
Procedia PDF Downloads 265
2724 Maximization of Lifetime for Wireless Sensor Networks Based on Energy Efficient Clustering Algorithm
Authors: Frodouard Minani
Abstract:
Over the last decade, wireless sensor networks (WSNs) have been used in many areas such as health care, agriculture, defense, military, and disaster-hit areas. Wireless Sensor Networks consist of a Base Station (BS) and a number of wireless sensors that monitor temperature, pressure, and motion under different environmental conditions. The key parameter that plays a major role in designing a protocol for Wireless Sensor Networks is energy efficiency, since energy is the scarcest resource of sensor nodes and determines their lifetime. Maximizing sensor node lifetime is an important issue in the design of applications and protocols for Wireless Sensor Networks, and clustering sensor nodes is an effective topology-control approach for helping to achieve this goal. In this paper, the researcher presents an energy-efficient protocol to prolong the network lifetime based on an energy-efficient clustering algorithm. The Low Energy Adaptive Clustering Hierarchy (LEACH) is a cluster-based routing protocol used to lower energy consumption and also to improve the lifetime of Wireless Sensor Networks. Minimizing energy dissipation and maximizing network lifetime are important matters in the design of applications and protocols for wireless sensor networks. The proposed system maximizes the lifetime of the Wireless Sensor Network by choosing the farthest cluster head (CH) instead of the closest CH and by forming the clusters considering parameter metrics such as node density, residual energy and distance between clusters (inter-cluster distance). Comparisons between the proposed protocol and comparative protocols in different scenarios have been carried out, and the simulation results showed that the proposed protocol performs well over the other comparative protocols in various scenarios.
Keywords: base station, clustering algorithm, energy efficient, sensors, wireless sensor networks
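A simplified sketch of the cluster-head selection idea is shown below: candidate nodes are scored from residual energy, local node density, and distance, and the remaining nodes are then associated with a head (here simply the nearest one). The scoring weights, radii, and the association rule are illustrative assumptions and do not reproduce the paper's exact metric or its farthest-CH rule.

```python
import numpy as np

rng = np.random.default_rng(42)
N_NODES, N_CLUSTERS = 100, 5
W_ENERGY, W_DENSITY, W_DIST = 0.5, 0.3, 0.2   # assumed weights of the selection metric

pos = rng.uniform(0, 100, size=(N_NODES, 2))         # node coordinates (m)
energy = rng.uniform(0.5, 2.0, size=N_NODES)         # residual energy (J)
base_station = np.array([50.0, 150.0])

def density(i, radius=15.0):
    """Number of neighbours within a fixed radius of node i."""
    return np.sum(np.linalg.norm(pos - pos[i], axis=1) < radius) - 1

heads = []
for _ in range(N_CLUSTERS):
    dist_bs = np.linalg.norm(pos - base_station, axis=1)
    # Spread heads out: prefer high residual energy, high density, and distance
    # from the base station and from already chosen heads.
    dist_heads = (np.min([np.linalg.norm(pos - pos[h], axis=1) for h in heads], axis=0)
                  if heads else np.zeros(N_NODES))
    score = (W_ENERGY * energy / energy.max()
             + W_DENSITY * np.array([density(i) for i in range(N_NODES)]) / N_NODES
             + W_DIST * (dist_bs + dist_heads) / (dist_bs + dist_heads).max())
    score[heads] = -np.inf                            # do not re-select a head
    heads.append(int(np.argmax(score)))

# Each remaining node joins a cluster head (nearest-head association for simplicity)
membership = np.argmin(
    np.stack([np.linalg.norm(pos - pos[h], axis=1) for h in heads]), axis=0)
print("cluster heads:", heads)
```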
Procedia PDF Downloads 144
2723 A Novel Heuristic for Analysis of Large Datasets by Selecting Wrapper-Based Features
Authors: Bushra Zafar, Usman Qamar
Abstract:
Large data sample sizes and high dimensionality reduce the effectiveness of conventional data mining methodologies. Data mining techniques are important tools for the collection of knowledgeable information from a variety of databases; they provide supervised learning in the form of classification to design models that describe vital data classes, where the structure of the classifier is based on the class attribute. Classification efficiency and accuracy are often influenced to a great extent by noisy and undesirable features in real application data sets. The inherent nature of a data set greatly complicates its quality analysis and leaves us with few practical approaches to use. To our knowledge, we present for the first time a new approach for the investigation of the structure and quality of datasets by providing a targeted analysis of the localization of noisy and irrelevant features in data sets. Feature selection is a key pre-processing step in machine learning, which allows a few features to be selected from the full feature set as a subset by reducing the space according to a certain evaluation criterion. The primary objective of this study is to trim down the scope of the given data sample by searching for a small set of important features that may result in good classification performance. For this purpose, a heuristic for wrapper-based feature selection using a genetic algorithm is used, with an external classifier for discriminative feature selection. Features are selected based on their number of occurrences in the chosen chromosomes. Sample datasets have been used to demonstrate the proposed idea effectively. The proposed method improved the average accuracy on different datasets to about 95%. Experimental results illustrate that the proposed algorithm increases the accuracy of prediction of different diseases.
Keywords: data mining, genetic algorithm, KNN algorithms, wrapper based feature selection
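A compact sketch of wrapper-based feature selection with a genetic algorithm is given below: chromosomes are binary feature masks, fitness is the cross-validated accuracy of a KNN classifier on the selected subset, and features are finally ranked by how often they occur in the best chromosomes. The population size, rates, selection scheme, and the toy dataset are placeholder choices, not the authors' configuration.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = load_breast_cancer(return_X_y=True)
rng = np.random.default_rng(0)
N_FEATURES, POP, GENERATIONS, MUT_RATE = X.shape[1], 20, 15, 0.05

def fitness(mask):
    """Wrapper fitness: cross-validated accuracy of KNN on the selected features."""
    if mask.sum() == 0:
        return 0.0
    clf = KNeighborsClassifier(n_neighbors=5)        # external wrapper classifier
    return cross_val_score(clf, X[:, mask.astype(bool)], y, cv=3).mean()

pop = rng.integers(0, 2, size=(POP, N_FEATURES))      # binary feature masks
for _ in range(GENERATIONS):
    scores = np.array([fitness(ind) for ind in pop])
    order = np.argsort(scores)[::-1]
    parents = pop[order[:POP // 2]]                    # truncation selection
    children = []
    for _ in range(POP - len(parents)):
        a, b = parents[rng.integers(len(parents), size=2)]
        cut = rng.integers(1, N_FEATURES)              # single-point crossover
        child = np.concatenate([a[:cut], b[cut:]])
        flip = rng.random(N_FEATURES) < MUT_RATE       # bit-flip mutation
        child[flip] ^= 1
        children.append(child)
    pop = np.vstack([parents, children])

scores = np.array([fitness(ind) for ind in pop])
best = pop[np.argsort(scores)[::-1][:5]]
occurrence = best.sum(axis=0)                          # feature occurrence in best chromosomes
print("selected features:", np.where(occurrence >= 3)[0], "best CV accuracy:", scores.max())
```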
Procedia PDF Downloads 316
2722 A Spiral Dynamic Optimised Hybrid Fuzzy Logic Controller for a Unicycle Mobile Robot on Irregular Terrains
Authors: Abdullah M. Almeshal, Mohammad R. Alenezi, Talal H. Alzanki
Abstract:
This paper presents a hybrid fuzzy logic control strategy for a unicycle trajectory-following robot on irregular terrains. In the literature, researchers have presented the design of path-tracking controllers for mobile robots on frictionless surfaces. In this work, the robot is simulated driving on irregular terrains with the contrasting frictional profiles of peat and rough gravel. A hybrid fuzzy logic controller is utilised to stabilise the robot and drive it precisely along the predefined trajectory while overcoming the frictional impact. The controller gains and scaling factors were optimised using a spiral dynamics optimisation algorithm to minimise the mean square error of the linear and angular velocities of the unicycle robot. The robot was simulated on various frictional surfaces and terrains, and the controller was able to stabilise the robot with a superior performance that is shown via simulation results.
Keywords: fuzzy logic control, mobile robot, trajectory tracking, spiral dynamic algorithm
Procedia PDF Downloads 495
2721 High Level Synthesis of Canny Edge Detection Algorithm on Zynq Platform
Authors: Hanaa M. Abdelgawad, Mona Safar, Ayman M. Wahba
Abstract:
Real-time image and video processing is in demand in many computer vision applications, e.g., video surveillance, traffic management and medical imaging. The processing of these video applications requires high computational power; therefore, the optimal solution is the collaboration of a CPU with hardware accelerators. In this paper, a Canny edge detection hardware accelerator is proposed. Canny edge detection is one of the common blocks in the pre-processing phase of image and video processing pipelines. Our presented approach targets offloading the Canny edge detection algorithm from the processing system (PS) to the programmable logic (PL), taking advantage of the High Level Synthesis (HLS) tool flow to accelerate the implementation on the Zynq platform. The resulting implementation enables up to a 100x performance improvement through hardware acceleration. The CPU utilization drops and the frame rate reaches 60 fps for a 1080p full HD input video stream.
Keywords: high level synthesis, canny edge detection, hardware accelerators, computer vision
Procedia PDF Downloads 478
2720 Resistivity Tomography Optimization Based on Parallel Electrode Linear Back Projection Algorithm
Authors: Yiwei Huang, Chunyu Zhao, Jingjing Ding
Abstract:
Electrical Resistivity Tomography has been widely used in medicine and geology, for example in imaging lung impedance and analysing soil impedance. Linear Back Projection is the core algorithm of Electrical Resistivity Tomography, but traditional Linear Back Projection cannot make full use of the information in the electric field. In this paper, a Parallel Electrode Linear Back Projection imaging method for Electrical Resistivity Tomography is proposed which, by changing the connection mode of the electrodes, generates an electric field distribution that is not linearly related to that of traditional Linear Back Projection, captures new information and improves the imaging accuracy without increasing the number of electrodes. The simulation results show that the accuracy of the image reconstructed by the inverse operation of Parallel Electrode Linear Back Projection can be improved by about 20%.
Keywords: electrical resistivity tomography, finite element simulation, image optimization, parallel electrode linear back projection
Procedia PDF Downloads 153
2719 An Excellent Adventure: The Stories of National Tertiary Teaching Excellence Award Winners
Authors: Claire Goode
Abstract:
This paper reports on a doctoral research project using narrative inquiry to investigate the stories of twelve national Tertiary Teaching Excellence Award winners in New Zealand. Preliminary findings highlight awardees’ views on their identity, their professional practice, and on what they consider to be excellence in tertiary teaching. The research also reports on common themes in the personal qualities that awardees describe, and on what these nationally recognised educators would like to see in place around Tertiary Teacher Development. Educators, mentors, trainers, and curriculum designers can gain a deeper understanding of what teaching excellence looks like, and of how teachers perceive their own practice and their impact on others. This may enable different interventions to develop best practice from staff, and to raise standards. It is hoped too that, by reflecting on the stories of teachers who have been recognised for ‘excellence’, educators will relate to and recognise elements of their own practice, and will feel motivated and inspired to share these with their peers and the wider academic community.
Keywords: academic identity, narrative inquiry, teacher development, teaching excellence
Procedia PDF Downloads 122
2718 Facility Anomaly Detection with Gaussian Mixture Model
Authors: Sunghoon Park, Hank Kim, Jinwon An, Sungzoon Cho
Abstract:
The Internet of Things allows one to collect data from facilities, which are then used to monitor them and even predict malfunctions in advance. Conventional quality control methods focus on setting a normal range on a sensor value, defined between a lower control limit and an upper control limit, and declaring as an anomaly anything falling outside it. However, interactions among sensor values are ignored, thus leading to suboptimal performance. We propose a multivariate approach which takes into account many sensor values at the same time. In particular, a Gaussian Mixture Model is used, which is trained to maximize the likelihood value using the Expectation-Maximization algorithm. The number of Gaussian component distributions is determined by the Bayesian Information Criterion. The negative log-likelihood value is used as an anomaly score. The actual usage scenario goes as follows: for each instance of sensor values from a facility, an anomaly score is computed; if it is larger than a threshold, an alarm goes off and a human expert intervenes and checks the system. Real-world data from a building energy system were used to test the model.
Keywords: facility anomaly detection, gaussian mixture model, anomaly score, expectation maximization algorithm
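A small sketch of the modelling procedure described above, using scikit-learn: Gaussian mixtures are fitted with EM for several component counts, the one with the lowest BIC is kept, and instances whose negative log-likelihood exceeds a threshold are flagged. The synthetic data and the threshold rule (a high percentile of training scores) are illustrative assumptions.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
train = rng.normal(0, 1, size=(2000, 5))          # normal-operation sensor vectors (placeholder)
new = np.vstack([rng.normal(0, 1, size=(50, 5)),
                 rng.normal(6, 1, size=(5, 5))])  # a few anomalous instances

# Choose the number of components with the Bayesian Information Criterion
models = [GaussianMixture(n_components=k, covariance_type="full", random_state=0).fit(train)
          for k in range(1, 8)]
gmm = min(models, key=lambda m: m.bic(train))

# Anomaly score = negative log-likelihood; threshold from training data (assumed 99.5th percentile)
train_scores = -gmm.score_samples(train)
threshold = np.percentile(train_scores, 99.5)

scores = -gmm.score_samples(new)
alarms = np.where(scores > threshold)[0]
print(f"components: {gmm.n_components}, threshold: {threshold:.2f}, alarms at rows: {alarms}")
```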
Procedia PDF Downloads 272
2717 A Near-Optimal Domain Independent Approach for Detecting Approximate Duplicates
Authors: Abdelaziz Fellah, Allaoua Maamir
Abstract:
We propose a domain-independent merging-cluster filter approach, complemented with a set of algorithms, for identifying approximate duplicate entities efficiently and accurately within a single data source and across multiple data sources. The near-optimal merging-cluster filter (MCF) approach is based on the well-tuned Monge-Elkan algorithm and extended with an affine variant of the Smith-Waterman similarity measure. We then present constant, variable, and function threshold algorithms that work conceptually in a divide-merge filtering fashion for detecting near duplicates as hierarchical clusters along with their corresponding representatives. The algorithms take recursive refinement approaches in the spirit of filtering, merging, and updating cluster representatives to detect approximate duplicates at each level of the cluster tree. Experiments show the high effectiveness and accuracy of the MCF approach in detecting approximate duplicates, outperforming the seminal Monge-Elkan algorithm on several real-world benchmarks and generated datasets.
Keywords: data mining, data cleaning, approximate duplicates, near-duplicates detection, data mining applications and discovery
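As background, the sketch below shows the basic Monge-Elkan scheme that the MCF approach builds on: a hybrid similarity that averages, over the tokens of one record, the best match against the tokens of the other. A normalized edit-distance similarity stands in for the affine Smith-Waterman variant used by the authors.

```python
from difflib import SequenceMatcher

def token_sim(a: str, b: str) -> float:
    """Secondary token similarity (stand-in for an affine Smith-Waterman measure)."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def monge_elkan(record_a: str, record_b: str) -> float:
    """Average best-match similarity of each token of A against the tokens of B."""
    tokens_a, tokens_b = record_a.split(), record_b.split()
    if not tokens_a or not tokens_b:
        return 0.0
    return sum(max(token_sim(ta, tb) for tb in tokens_b) for ta in tokens_a) / len(tokens_a)

def symmetric_monge_elkan(a: str, b: str) -> float:
    """Symmetrized score, useful when records may be compared in either direction."""
    return (monge_elkan(a, b) + monge_elkan(b, a)) / 2

print(symmetric_monge_elkan("Jon A. Smith", "Smith, John"))      # near-duplicate, high score
print(symmetric_monge_elkan("Jon A. Smith", "Maria Gonzalez"))   # unrelated, low score
```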
Procedia PDF Downloads 387
2716 Effect of Thickness and Solidity on the Performance of Straight Type Vertical Axis Wind Turbine
Authors: Jianyang Zhu, Lin Jiang, Tixian Tian
Abstract:
Inspired by the increasing interest in wind power for the production of clean electric power, a numerical experiment is performed to investigate the aerodynamic performance of a straight-type vertical axis wind turbine with different thickness and solidity, in which the incompressible Navier-Stokes (N-S) equations coupled with a dynamic mesh technique are solved. By analyzing the flow field, as well as the energy coefficient of turbines with different thickness and solidity, it is found that the thickness and solidity can significantly influence the performance of a vertical axis wind turbine. For the turbine at low tip speed, the mean energy coefficient increases with increasing thickness and solidity, which may improve the self-starting performance of the turbine. However, for the turbine at high tip speed, a turbine with appropriate thickness and smaller solidity possesses better performance. In addition, delayed stall and no interaction between the blade and the previously separated vortex are observed for turbines with appropriate thickness and solidity, which therefore leads to better performance characteristics.
Keywords: vertical axis wind turbine, N-S equations, dynamic mesh technique, thickness, solidity
Procedia PDF Downloads 265
2715 Compass Bar: A Visualization Technique for Out-of-View-Objects in Head-Mounted Displays
Authors: Alessandro Evangelista, Vito M. Manghisi, Michele Gattullo, Enricoandrea Laviola
Abstract:
In this work, we propose a custom visualization technique for out-of-view objects in Virtual and Augmented Reality applications using Head-Mounted Displays. In the last two decades, Augmented Reality (AR) and Virtual Reality (VR) technologies experienced a remarkable growth of applications for navigation, interaction, and collaboration in different types of environments, real or virtual. Both environments can be potentially very complex, as they can include many virtual objects located in different places. Given the natural limitation of the human Field of View (about 210° horizontal and 150° vertical), humans cannot perceive objects outside this angular range. Moreover, despite recent technological advances in AR and VR Head-Mounted Displays (HMDs), these devices still suffer from a limited Field of View, especially Optical See-Through displays, thus greatly amplifying the challenge of visualizing out-of-view objects. This problem is not negligible when the user needs to be aware of the number and the position of the out-of-view objects in the environment, for instance during a maintenance operation on a construction site where virtual objects serve to improve awareness of dangers. Providing such information can enhance the comprehension of the scene, enable fast navigation and focused search, and improve users' safety. In our research, we investigated how to represent out-of-view objects in HMD User Interfaces (UI). Inspired by commercial video games such as Call of Duty Modern Warfare, we designed a customized compass. Exploiting the Unity 3D graphics engine, we implemented our custom solution, which can be used both in AR and VR environments. The Compass Bar consists of a graduated bar (in degrees) at the top center of the UI. The values of the bar range from -180 (far left) to +180 (far right), with zero placed directly in front of the user. Two vertical lines on the bar show the amplitude of the user's field of view. Every virtual object within the scene is represented on the compass bar as a specific color-coded proxy icon (a circular ring with a colored dot at its center). To provide the user with information about the distance, we implemented a specific algorithm that increases the size of the inner dot as the user approaches the virtual object (i.e., when the user reaches the object, the dot fills the ring). This visualization technique for out-of-view objects has some advantages. It allows users to be quickly aware of the number and the position of the virtual objects in the environment. For instance, if the compass bar displays the proxy icon at about +90, users will immediately know that the virtual object is to their right, and so on. Furthermore, by having qualitative information about the distance, users can optimize their speed, thus gaining effectiveness in their work. Given the small size and position of the Compass Bar, our solution also helps lessen the occlusion problem, thus increasing user acceptance and engagement. As soon as the lockdown measures allow, we will carry out user tests comparing this solution with other state-of-the-art solutions such as 3D Radar, SidebARs and EyeSee360.
Keywords: augmented reality, situation awareness, virtual reality, visualization design
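The geometric core of the technique, mapping each object to a signed horizontal bearing in [-180, +180] relative to the user's gaze and sizing the proxy-icon dot by distance, can be sketched as below. This is plain vector math under assumed ranges and sizes; the Unity and UI specifics are omitted.

```python
import math

def compass_position(user_pos, user_forward, obj_pos):
    """Signed horizontal bearing (degrees) of an object relative to the gaze direction.
    Negative = to the user's left, positive = to the right, 0 = straight ahead."""
    dx, dz = obj_pos[0] - user_pos[0], obj_pos[1] - user_pos[1]
    obj_angle = math.degrees(math.atan2(dx, dz))
    fwd_angle = math.degrees(math.atan2(user_forward[0], user_forward[1]))
    return (obj_angle - fwd_angle + 180.0) % 360.0 - 180.0   # wrap to [-180, 180]

def dot_radius(distance, max_distance=20.0, ring_radius=10.0):
    """Inner dot grows as the user approaches; it fills the ring at distance 0 (assumed sizes)."""
    closeness = max(0.0, 1.0 - min(distance, max_distance) / max_distance)
    return ring_radius * closeness

# Example: object behind and to the right of a user looking along +z
user, forward, target = (0.0, 0.0), (0.0, 1.0), (3.0, -3.0)
bearing = compass_position(user, forward, target)
dist = math.dist(user, target)
print(f"bearing {bearing:.1f} deg, dot radius {dot_radius(dist):.1f} px")
```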
Procedia PDF Downloads 127
2714 Removal and/or Recovery of Phosphates by Precipitation as Ferric Phosphate from the Effluent of a Municipal Wastewater Treatment Plant
Authors: Kyriaki Kalaitzidou, Athanasia Tolkou, Christina Raptopoulou, Manassis Mitrakas, Anastasios Zouboulis
Abstract:
Phosphate rock is the main source of phosphorus (P) in fertilizers and is essential for high crop yields in agriculture; currently, it is considered a critical element facing scarcity. Chemical precipitation, which is a commonly used method of phosphorus removal from wastewaters, finds its significance in that phosphates may be precipitated in appropriate chemical forms that can be recovered and reused. Most often, phosphorus is removed from wastewaters in the form of insoluble phosphate salts by using salts (coagulants) of multivalent metal ions, most frequently iron, aluminum, calcium, or magnesium. The removal degree is affected by various factors, such as pH, chemical agent dose, temperature, etc. In this study, phosphate precipitation from the secondary (biologically treated) effluent of a municipal wastewater treatment plant is examined. Using ferric chlorosulfate (FeClSO4), removal and/or recovery of PO43- was attempted. Results showed that the use of Fe3+ can achieve residual PO43- concentrations lower than the commonly applied legislation limit (i.e., 3 mg PO43-/L) by adding 7.5 mg/L Fe3+ to the secondary effluent with an initial concentration of about 10 mg PO43-/L, over a pH range between 6 and 9. In addition, the formed sediment has a PO43- content of almost 24%. Therefore, simultaneous removal and recovery of PO43- as ferric phosphate can be achieved, making it possible for the ferric phosphate to be re-used as a possible (secondary) fertilizer source.
Keywords: ferric phosphate, phosphorus recovery, phosphorus removal, wastewater treatment
Procedia PDF Downloads 484