Search results for: dispersed region growing algorithm (DRGA)
9942 Monocular Visual Odometry for Three Different View Angles by Intel Realsense T265 with the Measurement of Remote
Authors: Heru Syah Putra, Aji Tri Pamungkas Nurcahyo, Chuang-Jan Chang
Abstract:
The MOIL-SDK method refers to the spatial angle that forms a view with a different perspective from the fisheye image. Visual odometry is a trusted technique for extending such projects by tracking the camera through image sequences. This work presents a real-time, precise, and persistent approach that supports dataset acquisition and generates ground truth as a reference for the per-image estimates: keypoints are found with the FAST algorithm and evaluated during tracking with the 5-point algorithm and RANSAC, producing accurate estimates of the camera trajectory for rotational and translational movement along the X, Y, and Z axes.
Keywords: MOIL-SDK, Intel RealSense T265, fisheye image, monocular visual odometry
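To make the pipeline described in the abstract concrete (FAST keypoints, 5-point essential-matrix estimation with RANSAC, pose accumulation), a minimal monocular visual odometry sketch in Python with OpenCV is given below. The camera intrinsics, tracking parameters, and scale handling are assumptions for illustration, not details taken from the paper.

# Minimal monocular VO sketch (assumed pipeline, not the authors' code).
import cv2
import numpy as np

K = np.array([[286.0, 0.0, 424.0],      # assumed pinhole intrinsics after fisheye rectification
              [0.0, 286.0, 400.0],
              [0.0, 0.0, 1.0]])

fast = cv2.FastFeatureDetector_create(threshold=25)
lk_params = dict(winSize=(21, 21), maxLevel=3,
                 criteria=(cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 30, 0.01))

def track(prev_img, cur_img, prev_pts):
    """Track FAST keypoints from the previous frame into the current frame with pyramidal LK flow."""
    cur_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_img, cur_img, prev_pts, None, **lk_params)
    status = status.reshape(-1).astype(bool)
    return prev_pts[status], cur_pts[status]

def vo_step(prev_img, cur_img, R_total, t_total):
    """Estimate the relative pose with the 5-point algorithm + RANSAC and accumulate it."""
    kps = fast.detect(prev_img, None)
    prev_pts = np.float32([kp.pt for kp in kps]).reshape(-1, 1, 2)
    p0, p1 = track(prev_img, cur_img, prev_pts)
    E, mask = cv2.findEssentialMat(p1, p0, K, method=cv2.RANSAC, prob=0.999, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, p1, p0, K, mask=mask)
    # Monocular VO recovers translation only up to scale; ground truth or another sensor fixes it.
    t_total = t_total + R_total @ t
    R_total = R @ R_total
    return R_total, t_total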
Procedia PDF Downloads 134
9941 Fault Detection and Isolation in Sensors and Actuators of Wind Turbines
Authors: Shahrokh Barati, Reza Ramezani
Abstract:
Due to countries' growing attention to renewable energy production, demand for energy from renewable sources has risen; among these sources, wind energy has shown the fastest growth in recent years. In this regard, a Fault Detection and Isolation (FDI) system is necessary to increase the availability of wind turbines. Wind turbines are subject to various faults, such as sensor faults, actuator faults, network connection faults, mechanical faults, and faults in the generator subsystem. Although sensors and actuators account for a large share of wind turbine faults, they have been discussed less in the literature. Therefore, in this work we focus on designing a sensor and actuator fault detection and isolation algorithm and a fault-tolerant control system (FTCS) for wind turbines. The aim of this research is to propose a comprehensive, data-driven fault detection and isolation system for wind turbine sensors and actuators. To achieve this goal, features of measurable signals from a real wind turbine are extracted under all operating conditions. The next step is feature selection among the extracted features: the features that lead to maximum class separation are retained, classifiers are implemented in parallel, and their results are fused together. In order to maximize the reliability of the fault decision, the property of fault repeatability is used.
Keywords: FDI, wind turbines, sensors and actuators faults, renewable energy
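As a loose illustration of the pipeline sketched above (feature selection followed by parallel classifiers whose outputs are fused), a minimal scikit-learn sketch is shown below. The particular feature scorer, classifiers, and soft-voting fusion are assumptions, not the authors' choices.

# Hedged sketch of the data-driven FDI idea: feature selection, then parallel classifiers fused by voting.
import numpy as np
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def build_fdi_model(n_features=20):
    """Select the most class-separating features, then fuse several classifiers."""
    fused = VotingClassifier(
        estimators=[("svm", SVC(probability=True)),
                    ("rf", RandomForestClassifier(n_estimators=200)),
                    ("knn", KNeighborsClassifier(n_neighbors=5))],
        voting="soft")                                   # soft voting = fusing classifier outputs
    return make_pipeline(StandardScaler(),
                         SelectKBest(f_classif, k=n_features),   # keep features with maximum separation
                         fused)

# X: features extracted from turbine sensor/actuator signals, y: fault labels (assumed data).
# model = build_fdi_model().fit(X_train, y_train); y_pred = model.predict(X_test)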
Procedia PDF Downloads 400
9940 Distribution Frequency, Ecology, and Economic Utility of Coprophilous Mushrooms (Agaricales, Basidiomycota) in Punjab, India
Authors: Amandeep Kaur, N. S. Atri, Munruchi Kaur
Abstract:
Herbivorous dung is a special substrate for the growth of fungi. Fungi growing thereon are known as coprophilous. These fungi are amongst the most abundant taxa in the ecosystem, which regulate the decomposition of dung organic matter, nutrient dynamics and maintenance of ecological balance on the earth. The coprophilous fungi represent a diverse group of saprobes, including taxa from most major fungal groups belonging to Zygomycota, Ascomycota and Basidiomycota. The present work, however, has been focused on the basidiomycetous coprophilous mushrooms belonging to the order Agaricales. The research work includes the results of eco-taxonomic studies of coprophilous mushrooms in Punjab, India, on the basis of a survey of dung localities of the state. The mushrooms were collected growing as saprobes on dung of various domesticated and wild herbivorous animals in pastures, grasslands, zoos, and on dung heaps in villages, etc. The present study observed the frequency of distribution of coprophilous mushrooms in different taxonomic categories in different regions of the state in various seasons on different dung types along with their growing habit. The paper also discusses their economic utility as edible, inedible, poisonous, medicinal and hallucinogenic species. The study has shown that animal dung is a good niche for the growth of mushrooms. However, the natural habitats with dung deposits are getting destroyed because of different developmental activities. Livestock in agriculture-based societies like Punjab state in India should be managed in a manner that favors their grazing in the wild places and thereby the growth of coprophilous mushrooms so that a significant role in ecological balance on the earth is established.Keywords: herbivorous dung, psychoactive, seasonal availability, taxo-ecology
Procedia PDF Downloads 98
9939 A Multifactorial Algorithm to Automate Screening of Drug-Induced Liver Injury Cases in Clinical and Post-Marketing Settings
Authors: Osman Turkoglu, Alvin Estilo, Ritu Gupta, Liliam Pineda-Salgado, Rajesh Pandey
Abstract:
Background: Hepatotoxicity can be linked to a variety of clinical symptoms and histopathological signs, posing a great challenge in the surveillance of suspected drug-induced liver injury (DILI) cases in the safety database. Additionally, the majority of such cases are rare, idiosyncratic, highly unpredictable, and tend to demonstrate unique individual susceptibility; these qualities, in turn, lend to a pharmacovigilance monitoring process that is often tedious and time-consuming. Objective: Develop a multifactorial algorithm to assist pharmacovigilance physicians in identifying high-risk hepatotoxicity cases associated with DILI from the sponsor’s safety database (Argus). Methods: Multifactorial selection criteria were established using Structured Query Language (SQL) and the TIBCO Spotfire® visualization tool, via a combination of word fragments, wildcard strings, and mathematical constructs, based on Hy’s law criteria and pattern of injury (R-value). These criteria excluded non-eligible cases from monthly line listings mined from the Argus safety database. The capabilities and limitations of these criteria were verified by comparing a manual review of all monthly cases with system-generated monthly listings over six months. Results: On an average, over a period of six months, the algorithm accurately identified 92% of DILI cases meeting established criteria. The automated process easily compared liver enzyme elevations with baseline values, reducing the screening time to under 15 minutes as opposed to multiple hours exhausted using a cognitively laborious, manual process. Limitations of the algorithm include its inability to identify cases associated with non-standard laboratory tests, naming conventions, and/or incomplete/incorrectly entered laboratory values. Conclusions: The newly developed multifactorial algorithm proved to be extremely useful in detecting potential DILI cases, while heightening the vigilance of the drug safety department. Additionally, the application of this algorithm may be useful in identifying a potential signal for DILI in drugs not yet known to cause liver injury (e.g., drugs in the initial phases of development). This algorithm also carries the potential for universal application, due to its product-agnostic data and keyword mining features. Plans for the tool include improving it into a fully automated application, thereby completely eliminating a manual screening process.Keywords: automation, drug-induced liver injury, pharmacovigilance, post-marketing
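The abstract bases its selection criteria on Hy's law and the pattern-of-injury R-value. The sketch below uses the commonly published definitions of those quantities as a stand-in for the sponsor's actual SQL/Spotfire criteria, which are not given; the thresholds and example values are therefore illustrative, not the authors' exact implementation.

# Illustrative DILI screening rule based on publicly known Hy's law criteria and the R-value.
def r_value(alt, alt_uln, alp, alp_uln):
    """Pattern-of-injury ratio: (ALT/ULN) / (ALP/ULN)."""
    return (alt / alt_uln) / (alp / alp_uln)

def flag_case(alt, alt_uln, alp, alp_uln, bili, bili_uln):
    """Return a coarse screening label for one set of liver laboratory values."""
    r = r_value(alt, alt_uln, alp, alp_uln)
    hys_law = (alt >= 3 * alt_uln) and (bili >= 2 * bili_uln) and (alp < 2 * alp_uln)
    if hys_law:
        return "potential Hy's law case"
    if r >= 5:
        return "hepatocellular pattern"
    if r <= 2:
        return "cholestatic pattern"
    return "mixed pattern"

# Example with hypothetical values:
# flag_case(alt=180, alt_uln=40, alp=110, alp_uln=120, bili=3.1, bili_uln=1.2)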
Procedia PDF Downloads 152
9938 Compressed Sensing of Fetal Electrocardiogram Signals Based on Joint Block Multi-Orthogonal Least Squares Algorithm
Authors: Xiang Jianhong, Wang Cong, Wang Linyu
Abstract:
With the rise of medical IoT technologies, Wireless body area networks (WBANs) can collect fetal electrocardiogram (FECG) signals to support telemedicine analysis. The compressed sensing (CS)-based WBANs system can avoid the sampling of a large amount of redundant information and reduce the complexity and computing time of data processing, but the existing algorithms have poor signal compression and reconstruction performance. In this paper, a Joint block multi-orthogonal least squares (JBMOLS) algorithm is proposed. We apply the FECG signal to the Joint block sparse model (JBSM), and a comparative study of sparse transformation and measurement matrices is carried out. A FECG signal compression transmission mode based on Rbio5.5 wavelet, Bernoulli measurement matrix, and JBMOLS algorithm is proposed to improve the compression and reconstruction performance of FECG signal by CS-based WBANs. Experimental results show that the compression ratio (CR) required for accurate reconstruction of this transmission mode is increased by nearly 10%, and the runtime is saved by about 30%.Keywords: telemedicine, fetal ECG, compressed sensing, joint sparse reconstruction, block sparse signal
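A toy illustration of the compressed-sensing model the abstract builds on is given below: y = Φx with a Bernoulli (±1) measurement matrix. Standard orthogonal matching pursuit stands in for the proposed JBMOLS recovery (which is not public), and a synthetic sparse vector stands in for the wavelet-domain FECG coefficients; both substitutions are assumptions for illustration only.

# Toy CS measurement and recovery sketch; JBMOLS is replaced by plain OMP here.
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(0)
n, m, k = 256, 96, 10                                     # signal length, measurements, sparsity (assumed)
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.normal(size=k)   # k-sparse "wavelet coefficients"

phi = rng.choice([-1.0, 1.0], size=(m, n)) / np.sqrt(m)   # Bernoulli measurement matrix
y = phi @ x                                               # compressed measurements sent by the WBAN node

omp = OrthogonalMatchingPursuit(n_nonzero_coefs=k, fit_intercept=False).fit(phi, y)
x_hat = omp.coef_
print("reconstruction SNR (dB):",
      10 * np.log10(np.sum(x**2) / np.sum((x - x_hat)**2)))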
Procedia PDF Downloads 128
9937 Design and Implementation of DC-DC Converter with Inc-Cond Algorithm
Authors: Mustafa Engin Başoğlu, Bekir Çakır
Abstract:
The most important component affecting the efficiency of photovoltaic power systems is the solar panel. The efficiency of these systems is significantly limited by the low efficiency of the solar panel itself. Therefore, solar panels should be operated at their maximum power point through a power converter. In this study, a boost converter with maximum power point tracking (MPPT) operation has been designed and implemented using the Incremental Conductance (Inc-Cond) algorithm with direct duty control. Furthermore, it is shown that the performance of the boost converter with MPPT operation fails when a low load resistance is connected.
Keywords: boost converter, incremental conductance (Inc-Cond), MPPT, solar panel
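The incremental-conductance decision rule with direct duty control can be written in a few lines. The sketch below is a minimal version only: the step size and duty limits are assumptions, and the sign convention assumes that increasing the duty cycle lowers the panel voltage, as in a boost converter fed by the PV panel.

# Minimal Inc-Cond MPPT step with direct duty-cycle control (illustrative, not the paper's hardware values).
def inc_cond_step(v, i, v_prev, i_prev, duty, step=0.005, d_min=0.1, d_max=0.9):
    """Update the boost converter duty cycle from one PV voltage/current sample."""
    dv, di = v - v_prev, i - i_prev
    if dv == 0:
        if di > 0:            # irradiance increased: raise the operating voltage
            duty -= step
        elif di < 0:
            duty += step
    else:
        g_inc, g = di / dv, i / v
        if g_inc > -g:        # left of the MPP on the P-V curve: raise voltage
            duty -= step
        elif g_inc < -g:      # right of the MPP: lower voltage
            duty += step
        # g_inc == -g: at the MPP, keep the duty cycle unchanged
    return min(max(duty, d_min), d_max)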
Procedia PDF Downloads 1046
9936 Stochastic Simulation of Random Numbers Using Linear Congruential Method
Authors: Melvin Ballera, Aldrich Olivar, Mary Soriano
Abstract:
Digital computers nowadays must provide a utility capable of generating random numbers. Usually, computer-generated random numbers are not truly random, since predefined values such as the starting point and end points make the sequence almost predictable. Random numbers have many applications in business simulation, manufacturing, the services domain, the entertainment sector, and other areas, which makes it worthwhile to design a method that yields unpredictable random numbers. Applying stochastic simulation with the linear congruential algorithm shows that as the seed and the range increase, the numbers produced or selected by the computer become more unique. If this is implemented in an environment where random numbers are heavily needed, the reliability of the random numbers is guaranteed.
Keywords: stochastic simulation, random numbers, linear congruential algorithm, pseudorandomness
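The linear congruential recurrence is x(n+1) = (a·x(n) + c) mod m, after which the raw output is scaled into the requested range. The constants used below are the classic Numerical Recipes values, chosen only as an example; the abstract does not state which constants the authors used.

# Linear congruential generator sketch with illustrative constants.
def lcg(seed, a=1664525, c=1013904223, m=2**32):
    """Yield an endless stream of pseudorandom integers in [0, m)."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x

def randint_stream(seed, low, high):
    """Map the raw LCG output into the user-supplied range [low, high]."""
    for x in lcg(seed):
        yield low + x % (high - low + 1)

gen = randint_stream(seed=12345, low=1, high=100)
print([next(gen) for _ in range(10)])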
Procedia PDF Downloads 316
9935 Analysis of Energy Planning and Optimization with Microgrid System in Dawei Region
Authors: Hninn Thiri Naing
Abstract:
In Myanmar, many regions are far away from the national grid. For these areas, isolated regional microgrids are one of the solutions, and the study area in this paper already operates in such a way. The main difficulty in such regions is the high cost of electrical energy. This paper therefore approaches cost optimization through energy planning with renewable energy resources and natural gas. A microgrid is set up for the Dawei region, since it is an economic zone in lower Myanmar and far from the national grid. The required meteorological and geographical data have been collected. Currently, the electricity unit rate in the region is higher than in grid-connected areas. For microgrid planning and optimization, the HOMER Pro software is employed in this research.
Keywords: energy planning, renewable energy, homer pro, cost of energy
Procedia PDF Downloads 129
9934 Three Dimensional Computational Fluid Dynamics Simulation of Wall Condensation inside Inclined Tubes
Authors: Amirhosein Moonesi Shabestary, Eckhard Krepper, Dirk Lucas
Abstract:
The current PhD project comprises CFD modeling and simulation of condensation and heat transfer inside horizontal pipes. Condensation plays an important role in the emergency cooling systems of reactors. The emergency cooling system consists of inclined horizontal pipes immersed in a tank of subcooled water. In the case of an accident, the water level in the core decreases, steam enters the emergency pipes, and, due to the subcooled water around the pipe, this steam starts to condense. These horizontal pipes act as a strong heat sink responsible for a quick depressurization of the reactor core when an accident happens. This project was defined in order to model all the processes happening in the emergency cooling systems. The main focus of the project is on the detection of different morphologies such as annular flow, stratified flow, slug flow, and plug flow. The project is ongoing and was started one year ago in the Fluid Dynamics department of Helmholtz-Zentrum Dresden-Rossendorf (HZDR). At HZDR, mostly in cooperation with ANSYS, different models have been developed for modeling multiphase flows. The inhomogeneous MUSIG model considers the bubble size distribution and is used for modeling the small-scaled dispersed gas phase. The AIAD model (Algebraic Interfacial Area Density model) was developed for detecting the local morphology and the corresponding switch between morphologies. The most recent model, GENTOP, combines both concepts: it is able to simulate co-existing large-scaled (continuous) and small-scaled (polydispersed) structures. All these models have been validated for adiabatic cases without any phase change. Therefore, the starting point of the current PhD project is to use the available models and integrate phase transition and wall condensation models into them. In order to simplify the problem of condensation inside horizontal tubes, three steps have been defined. The first step is the investigation of condensation inside a horizontal tube considering only direct contact condensation (DCC) and neglecting wall condensation; the inlet of the pipe is therefore assumed to be annular flow, and the AIAD model is used to detect the interface. The second step is the extension of the model to also consider wall condensation, which is closer to reality; in this step the inlet is pure steam and, due to wall condensation, a liquid film forms near the wall, leading to annular flow. The last step will be the modeling of the different morphologies occurring inside the tube during condensation using the GENTOP model, which allows the dispersed phase to be considered and simulated. Finally, the simulation results will be validated against experimental data that will also become available at HZDR.
Keywords: wall condensation, direct contact condensation, AIAD model, morphology detection
Procedia PDF Downloads 305
9933 Hardware Implementation on Field Programmable Gate Array of Two-Stage Algorithm for Rough Set Reduct Generation
Authors: Tomasz Grzes, Maciej Kopczynski, Jaroslaw Stepaniuk
Abstract:
The rough sets theory developed by Prof. Z. Pawlak is one of the tools that can be used in intelligent systems for data analysis and processing. Banking, medicine, image recognition, and security are among the possible fields of application. In all these fields the amount of collected data is increasing quickly, and with this increase the computation speed becomes the critical factor. Data reduction is one of the solutions to this problem. Removing the redundancy in rough sets can be achieved with the reduct. A lot of algorithms for generating the reduct have been developed, but most of them are only software implementations and therefore have many limitations. A microprocessor uses a fixed word length and consumes a lot of time for both fetching and processing instructions and data; consequently, software-based implementations are relatively slow. Hardware systems do not have these limitations and can process the data faster than software. A reduct is a subset of the condition attributes that provides the discernibility of the objects. For a given decision table there can be more than one reduct. The core is the set of all indispensable condition attributes: none of its elements can be removed without affecting the classification power of all condition attributes, and every reduct contains all the attributes from the core. In this paper, the hardware implementation of a two-stage greedy algorithm to find one reduct is presented. The decision table is used as the input. The output of the algorithm is the superreduct, which is the reduct with some additional removable attributes. The first stage of the algorithm calculates the core using the discernibility matrix. The second stage generates the superreduct by enriching the core with the most common attributes, i.e., attributes that are more frequent in the decision table. The algorithm described above has two disadvantages: i) it generates the superreduct instead of the reduct, and ii) the additional first stage may be unnecessary if the core is empty. But for systems focused on fast computation of the reduct, the first disadvantage is not the key problem. The core calculation can be achieved with a combinational logic block and thus adds relatively little time to the whole process. The algorithm presented in this paper was implemented in a Field Programmable Gate Array (FPGA) as a digital device consisting of blocks that process the data in a single step. Calculating the core is done by comparators connected to a block called a 'singleton detector', which detects whether the input word contains only a single 'one'. Calculating the number of occurrences of an attribute is performed in a combinational block made up of a cascade of adders. The superreduct generation process is iterative and thus needs a sequential circuit to control the calculations. For research purposes, the algorithm was also implemented in the C language and run on a PC. The execution times of the reduct calculation in hardware and in software were compared, and the results show an increase in the speed of data processing.
Keywords: data reduction, digital systems design, field programmable gate array (FPGA), reduct, rough set
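A software sketch of the two-stage idea (core from the discernibility matrix, then greedy enrichment by attribute frequency) is given below. It mirrors the logic, not the FPGA blocks; the decision-table encoding is an assumption, and the greedy criterion here counts attribute occurrences in the still-unresolved discernibility entries, a common variant of the frequency count described above.

# Two-stage superreduct sketch: core via singleton discernibility entries, then greedy enrichment.
from collections import Counter

def discernibility(objects, decision):
    """For every pair of objects with different decisions, collect the condition attributes that discern them."""
    entries = []
    for i in range(len(objects)):
        for j in range(i + 1, len(objects)):
            if decision[i] != decision[j]:
                diff = {a for a, (u, v) in enumerate(zip(objects[i], objects[j])) if u != v}
                if diff:
                    entries.append(diff)
    return entries

def two_stage_superreduct(objects, decision):
    entries = discernibility(objects, decision)
    core = {next(iter(e)) for e in entries if len(e) == 1}   # software counterpart of the singleton detector
    red = set(core)
    unresolved = [e for e in entries if not (e & red)]
    while unresolved:                                        # greedy second stage
        freq = Counter(a for e in unresolved for a in e)     # counterpart of the cascade of adders
        red.add(freq.most_common(1)[0][0])
        unresolved = [e for e in unresolved if not (e & red)]
    return red                                               # superreduct: may still contain removable attributes

# Example decision table (rows = objects, columns = condition attributes), assumed data:
# objs = [[1, 0, 1], [1, 1, 0], [0, 1, 0]]; dec = [0, 1, 1]; print(two_stage_superreduct(objs, dec))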
Procedia PDF Downloads 219
9932 Teaching Tools for Web Processing Services
Authors: Rashid Javed, Hardy Lehmkuehler, Franz Josef-Behr
Abstract:
Web Processing Services (WPS) have attracted growing attention in geoinformation research. However, teaching about them is difficult because of the generally complex circumstances of their use, which limit the possibilities for hands-on exercises with Web Processing Services. To support understanding, a Training Tools Collection was initiated at the University of Applied Sciences Stuttgart (HFT). It is limited to the scope of geostatistical interpolation of sample point data, where different algorithms such as IDW or Nearest Neighbor can be used. The Tools Collection aims to support understanding of the scope, definition, and deployment of Web Processing Services. For example, it is necessary to characterize the input of an interpolation by the data set, the parameters for the algorithm, and the interpolation result (here a grid of interpolated values is assumed). This paper reports on first experiences with a pilot installation, which was intended to find suitable software interfaces for later full implementations and to draw conclusions on potential user interface characteristics. Experience was gained with the Deegree software, one of several service suites (collections). Being written entirely in Java, Deegree offers several OGC-compliant service implementations that also promise to be of benefit for the project. The mentioned parameters for a WPS were formalized following the paradigm that any meaningful component will be defined in terms of suitable standards; e.g., the data output can be defined as a GML file. However, the choice of meaningful information pieces and user interactions is not free but is partially determined by the selected WPS processing suite.
Keywords: deegree, interpolation, IDW, web processing service (WPS)
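The interpolation process such a teaching tool exposes can be illustrated with a minimal inverse distance weighting (IDW) sketch: sample points in, an interpolated grid out, with weights 1/d^p. The power parameter and grid size below are assumptions; a real WPS would wrap this computation behind an Execute request and return, for example, GML or a raster file.

# Minimal IDW interpolation of scattered sample points onto a regular grid.
import numpy as np

def idw_grid(xy, values, grid_x, grid_y, power=2.0, eps=1e-12):
    """Interpolate scattered sample values onto a grid with weights 1/d^power."""
    gx, gy = np.meshgrid(grid_x, grid_y)
    out = np.empty(gx.shape)
    for r in range(gx.shape[0]):
        for c in range(gx.shape[1]):
            d = np.hypot(xy[:, 0] - gx[r, c], xy[:, 1] - gy[r, c]) + eps
            w = 1.0 / d**power
            out[r, c] = np.sum(w * values) / np.sum(w)
    return out

# Hypothetical usage: five sample points interpolated onto a 50 x 50 grid.
pts = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0], [5.0, 5.0]])
vals = np.array([1.0, 2.0, 3.0, 4.0, 2.5])
grid = idw_grid(pts, vals, np.linspace(0, 10, 50), np.linspace(0, 10, 50))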
Procedia PDF Downloads 355
9931 Troubleshooting Petroleum Equipment Based on Wireless Sensors Based on Bayesian Algorithm
Authors: Vahid Bayrami Rad
Abstract:
In this research, common methods and techniques have been investigated with a focus on intelligent fault finding and monitoring systems in the oil industry. Remote and intelligent control methods are considered a necessity for implementing various operations in the oil industry, and benefiting from the knowledge extracted, with the help of data mining algorithms, from the vast amounts of data generated is an unavoidable way to speed up monitoring and troubleshooting operations in today's big oil companies. Therefore, by comparing data mining algorithms and examining their efficiency, structure, and behavior under different conditions, the proposed Bayesian algorithm, using data clustering and analysis together with data evaluation based on a colored Petri net, provides an applicable and dynamic model from the point of view of reliability and response time. By using this method, it is possible to achieve a dynamic and consistent model of the remote control system, prevent the occurrence of leakage in oil pipelines and refineries, and reduce costs as well as human and financial errors. The data obtained from the evaluation process show an increase in reliability, availability, and speed for the proposed method compared to previous methods.
Keywords: wireless sensors, petroleum equipment troubleshooting, Bayesian algorithm, colored Petri net, rapid miner, data mining-reliability
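The following is a loose illustration only: one concrete reading of a "Bayesian algorithm using data clustering" is a naive Bayes classifier over clustered wireless-sensor features. The feature names, the synthetic data, and the way clustering is combined with the classifier are all assumptions; the colored Petri net evaluation stage of the paper is not reproduced here.

# Illustrative clustering + Bayesian classification over synthetic sensor readings.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.naive_bayes import GaussianNB
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 4))                       # e.g. pressure, flow, vibration, temperature (synthetic)
y = (X[:, 0] + 0.5 * X[:, 2] > 1.0).astype(int)     # synthetic "leak / no leak" label

# Cluster the raw readings and append cluster-distance features before the Bayesian classifier.
km = KMeans(n_clusters=3, n_init=10).fit(X)
X_aug = np.hstack([X, km.transform(X)])

clf = make_pipeline(StandardScaler(), GaussianNB()).fit(X_aug, y)
print("training accuracy:", clf.score(X_aug, y))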
Procedia PDF Downloads 66
9930 Detecting Manipulated Media Using Deep Capsule Network
Authors: Joseph Uzuazomaro Oju
Abstract:
The ease with which manipulated media can be created and the increasing difficulty of identifying fake media make it a great threat. Most of the applications used for the creation of these high-quality fake videos and images are built with deep learning. Hence, the use of deep learning in creating a detection mechanism cannot be overemphasized. Any fake media detected before it reaches the populace will save people from doubting whether a piece of content is genuine or fake and will help ensure the credibility of videos and images. The methodology introduced in this paper approaches the manipulated media detection challenge using a combination of VGG-19 and a deep capsule network. In the case of videos, they are converted into frames, which, in turn, are resized and cropped to the face region. These preprocessed images/videos are fed to the VGG-19 network to extract latent features. The extracted latent features are input to a deep capsule network enhanced with a 3D-convolution dynamic routing agreement. The 3D-convolution dynamic routing agreement algorithm helps to reduce the linkages between capsule networks, thereby limiting the poor-learning shortcoming of multiple capsule network layers. The resultant output from the deep capsule network indicates whether a piece of media is genuine or fake.
Keywords: deep capsule network, dynamic routing, fake media detection, manipulated media
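A sketch of the first stage described above, extracting latent features from face crops with a pretrained VGG-19, is shown below using torchvision. The capsule network with 3D-convolution dynamic routing is specific to the paper and is represented here only by a placeholder classification head; the input sizes and frozen-backbone choice are assumptions.

# VGG-19 latent-feature extraction with a placeholder head standing in for the capsule network.
import torch
import torch.nn as nn
from torchvision import models

vgg = models.vgg19(weights=models.VGG19_Weights.DEFAULT)
feature_extractor = nn.Sequential(vgg.features, nn.AdaptiveAvgPool2d((7, 7)), nn.Flatten())
for p in feature_extractor.parameters():
    p.requires_grad = False                     # VGG-19 is used as a frozen latent-feature extractor

head = nn.Sequential(                           # stand-in for the deep capsule network
    nn.Linear(512 * 7 * 7, 256), nn.ReLU(),
    nn.Linear(256, 2))                          # two outputs: genuine vs. manipulated

frames = torch.randn(8, 3, 224, 224)            # a batch of preprocessed face crops (synthetic here)
logits = head(feature_extractor(frames))
print(logits.shape)                             # torch.Size([8, 2])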
Procedia PDF Downloads 134
9929 Neural Network Based Fluctuation Frequency Control in PV-Diesel Hybrid Power System
Authors: Heri Suryoatmojo, Adi Kurniawan, Feby A. Pamuji, Nursalim, Syaffaruddin, Herbert Innah
Abstract:
Photovoltaic (PV) systems hybridized with diesel systems are widely utilized for electrification in remote areas. PV output power fluctuates due to uncertain temperature and solar irradiance conditions. When the penetration of PV power is large, the reliability of the power utility is disturbed, seriously impacting the frequency stability of the system. Therefore, designing a robust frequency controller for a PV-diesel hybrid power system is very important. This paper proposes a new frequency control method for a hybrid PV-diesel system based on an artificial neural network (ANN). This method can minimize the frequency deviation without smoothing the PV output power, which is controlled by a maximum power point tracking (MPPT) method. The neural network controller considers the average irradiance, the change of irradiance, and the frequency deviation. To show the effectiveness of the proposed algorithm, the addition of a battery as an energy storage system is also presented. To validate the proposed method, the results of the proposed system are compared with those of a similar system using MPPT only. The simulation results show that the proposed method is able to suppress the frequency deviation to a smaller value than the system using MPPT only.
Keywords: energy storage system, frequency deviation, hybrid power generation, neural network algorithm
Procedia PDF Downloads 503
9928 Optimal Design of Submersible Permanent Magnet Linear Synchronous Motor Based Design of Experiment and Genetic Algorithm
Authors: Xiao Zhang, Wensheng Xiao, Junguo Cui, Hongmin Wang
Abstract:
Submersible permanent magnet linear synchronous motors (SPMLSMs) are electromagnetic devices that can directly drive a plunger pump to extract crude oil. These motors have gradually been applied in oil fields due to their high thrust force density and high efficiency. Since the force performance closely depends on the concrete structural parameters, seven different structural parameters are investigated in detail. This paper presents an optimum design of an SPMLSM that minimizes the detent force and maximizes the thrust by using design of experiment (DOE) and a genetic algorithm (GA). The three significant structural parameters (air-gap length, slot width, pole-arc coefficient) are screened out using a 2⁷ 1/16 fractional factorial design (FFD) that investigates the effect of the seven parameters used in this research on the force performance. Response surface methodology (RSM) is then used to build analytical models of thrust and detent force in terms of the significant parameters, enabling the objective functions to be created easily. The GA is employed as a search tool for the Pareto-optimal solutions. Finite element analysis shows that the proposed PMLSM has merits in improving the thrust and dramatically reducing the detent force.
Keywords: optimization, force performance, design of experiment (DOE), genetic algorithm (GA)
Procedia PDF Downloads 290
9927 Environmental Law and Payment for Environmental Services: Perceptions of the Family Farmers of the Federal District, Brazil
Authors: Kever Bruno Paradelo Gomes, Rosana Carvalho Cristo Martins
Abstract:
Payment for Environmental Services (PSA) has been a strategy used since the late 1990s by Latin American countries to finance environmental conservation. Payment for Environmental Services has been absorbing a growing amount of time in the discussions around environmentally sustainable development strategies in the world. In Brazil, this theme has permeated the discussions since the publication of the new Forest Code. The objective of this work was to verify the perception of the resident farmers in the region of Ponte Alta, Gama, Federal District, Brazil, on environmental legislation and Payments for Environmental Services. The work was carried out in 99 rural properties of the family farmers of the Rural Nucleus Ponte Alta, Administrative Region of Gama, in the city of Brasília, Federal District, Brazil. The present research is characterized methodologically as a quantitative, exploratory, and descriptive nature. The data treatment was performed through descriptive statistical analysis and hypothesis testing. The perceptions about environmental legislation in the rural area of Ponte Alta, Gama, DF respondents were positive. Although most of the family farmers interviewed have some knowledge about environmental legislation, it is perceived that in practice, the environmental adequacy of property is ineffective given the current situation of sustainable rural development; there is an abyss between what is envisaged by legislation and reality in the field. Thus, as in the reports of other researchers, it is verified that the majority of respondents are not aware of PSA (62.62%). Among those interviewed who were aware of the subject, two learned through the course, three through the university, two through TV and five through other people. The planting of native forest species on the rural property was the most informed practice by farmers if they received some Environmental Service Payment (PSA). Reflections on the environment allow us to infer that the effectiveness and fulfillment of the incentives and rewards in the scope of public policies to encourage the maintenance of environmental services, already existing in all spheres of government, are of great relevance to the process of environmental sustainability of rural properties. The relevance of the present research is an important tool to promote the discussion and formulation of public policies focused on sustainable rural development, especially on payments for environmental services; it is a space of great interest for the strengthening of the social group dedicated to production. Public policies that are efficient and accessible to the small rural producers become decisive elements for the promotion of changes in behavior in the field, be it economic, social, or environmental.Keywords: forest code, public policy, rural development, sustainable agriculture
Procedia PDF Downloads 153
9926 Identification of Flood Prone Areas in Adigrat Town Using Boolean Logic with GIS and Remote Sensing Technique
Authors: Fikre Belay Tekulu
Abstract:
Adigrat town lies in the Tigray region of Ethiopia. This region is mountainous and experiences a semiarid climate. Most of the rainfall occurs in four months of the year, June to September. During this season, flooding is a common natural disaster, especially in urban areas. In this paper, an attempt is made to identify flood-prone areas in Adigrat town using Boolean logic with GIS and remote sensing techniques. Three parameters were incorporated: land use type, elevation, and slope. The Boolean criteria were land use equal to built-up land, elevation less than 2430 m, and slope less than 5 degrees. As a result, 0.575 km² was identified as severely affected by floods during the rainy season.
Keywords: flood, GIS, hydrology, Adigrat
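The Boolean overlay described in the abstract can be expressed directly on co-registered raster arrays: a cell is flood-prone when it is built-up land AND below 2430 m AND flatter than 5 degrees. The raster codes, cell size, and example values below are assumptions for illustration.

# Boolean flood-prone overlay on three co-registered rasters.
import numpy as np

def flood_prone_mask(land_use, elevation, slope, builtup_code=1):
    """Return a Boolean raster of flood-prone cells."""
    return (land_use == builtup_code) & (elevation < 2430.0) & (slope < 5.0)

# Hypothetical 3 x 3 rasters (land-use codes, elevation in metres, slope in degrees):
lu = np.array([[1, 1, 2], [1, 3, 2], [1, 1, 1]])
dem = np.array([[2400, 2450, 2380], [2410, 2425, 2500], [2390, 2405, 2420]])
slp = np.array([[2.0, 1.0, 8.0], [4.0, 3.0, 1.0], [6.0, 2.5, 4.5]])
mask = flood_prone_mask(lu, dem, slp)
area_km2 = mask.sum() * (30 * 30) / 1e6      # assuming 30 m cells, convert cell count to km^2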
Procedia PDF Downloads 142
9925 Filtering Intrusion Detection Alarms Using Ant Clustering Approach
Authors: Ghodhbani Salah, Jemili Farah
Abstract:
With the growth of cyber attacks, information safety has become an important issue all over the world. Many firms rely on security technologies such as intrusion detection systems (IDSs) to manage information technology security risks. IDSs are considered to be the last line of defense to secure a network and play a very important role in detecting large number of attacks. However the main problem with today’s most popular commercial IDSs is generating high volume of alerts and huge number of false positives. This drawback has become the main motivation for many research papers in IDS area. Hence, in this paper we present a data mining technique to assist network administrators to analyze and reduce false positive alarms that are produced by an IDS and increase detection accuracy. Our data mining technique is unsupervised clustering method based on hybrid ANT algorithm. This algorithm discovers clusters of intruders’ behavior without prior knowledge of a possible number of classes, then we apply K-means algorithm to improve the convergence of the ANT clustering. Experimental results on real dataset show that our proposed approach is efficient with high detection rate and low false alarm rate.Keywords: intrusion detection system, alarm filtering, ANT class, ant clustering, intruders’ behaviors, false alarms
Procedia PDF Downloads 404
9924 Acoustic Echo Cancellation Using Different Adaptive Algorithms
Authors: Hamid Sharif, Nazish Saleem Abbas, Muhammad Haris Jamil
Abstract:
An adaptive filter is a filter that self-adjusts its transfer function according to an optimization algorithm driven by an error signal. Because of the complexity of the optimization algorithms, most adaptive filters are digital filters. Adaptive filtering constitutes one of the core technologies in digital signal processing and finds numerous application areas in science as well as in industry. Adaptive filtering techniques are used in a wide range of applications, including adaptive noise cancellation and echo cancellation. Acoustic echo cancellation is a common occurrence in today’s telecommunication systems. The signal interference caused by acoustic echo is distracting to both users and causes a reduction in the quality of the communication. In this paper, we review different techniques of adaptive filtering to reduce this unwanted echo. In this paper, we see the behavior of techniques and algorithms of adaptive filtering like Least Mean Square (LMS), Normalized Least Mean Square (NLMS), Variable Step-Size Least Mean Square (VSLMS), Variable Step-Size Normalized Least Mean Square (VSNLMS), New Varying Step Size LMS Algorithm (NVSSLMS) and Recursive Least Square (RLS) algorithms to reduce this unwanted echo, to increase communication quality.Keywords: adaptive acoustic, echo cancellation, LMS algorithm, adaptive filter, normalized least mean square (NLMS), variable step-size least mean square (VSLMS)
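The NLMS update at the heart of such an echo canceller fits in a few lines: the far-end (loudspeaker) signal drives an adaptive filter whose output estimates the echo, and the estimate is subtracted from the microphone signal so the error approximates the near-end speech. The filter length and step size below are assumptions; a plain LMS update would simply drop the normalization term in the weight update.

# Minimal NLMS acoustic echo cancellation sketch.
import numpy as np

def nlms_echo_canceller(x, d, n_taps=128, mu=0.5, eps=1e-8):
    """Run an NLMS adaptive filter and return the error (echo-reduced) signal.
    x: far-end signal, d: microphone signal containing the echo of x."""
    w = np.zeros(n_taps)
    x_pad = np.concatenate([np.zeros(n_taps - 1), x])
    e = np.zeros(len(d))
    for n in range(len(d)):
        u = x_pad[n + n_taps - 1::-1][:n_taps]        # most recent n_taps far-end samples
        y = w @ u                                     # echo estimate
        e[n] = d[n] - y                               # error = microphone minus estimated echo
        w += (mu / (eps + u @ u)) * e[n] * u          # normalized LMS weight update
    return e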
Procedia PDF Downloads 80
9923 A Review of Common Tropical Culture Trees
Authors: Victoria Tobi Dada, Emmanuel Dada
Abstract:
Culture trees are a notable agricultural system in the tropical region of the world because of their great contribution to the economy of this region. Plantation crops such as oil palm, cocoa, cashew, and rubber are the dominant agricultural trees in tropical countries with a mean annual rainfall of at least 1500 mm and a temperature of about 28°C. The study reviews the developmental trend of the common tropical culture trees. It shows that the global area of land occupied by rubber plantations increased from 9,464,276 hectares to 11,739,333 hectares between 2010 and 2017, while the oil palm cultivated land area increased from 1,851,278 hectares in 2010 to 2,042,718 hectares in 2013 across 35 countries. Global cashew cultivation is dominated by West Africa with 44.8%, South-Eastern Asia with 32.9%, and Southern Asia with 13.8%, while the remaining 8.5% of the cultivated land area is distributed among six other tropical countries. Cocoa cultivation and production globally are dominated by five West African countries, Indonesia, and Brazil. The study revealed that these notable tropical culture trees have not been studied together to determine their spatial distribution.
Keywords: culture trees, tropical region, cultivated area, spatial distribution
Procedia PDF Downloads 103
9922 High Resolution Image Generation Algorithm for Archaeology Drawings
Authors: Xiaolin Zeng, Lei Cheng, Zhirong Li, Xueping Liu
Abstract:
Aiming at the problem of low accuracy and susceptibility to cultural relic diseases in the generation of high-resolution archaeology drawings by current image generation algorithms, an archaeology drawings generation algorithm based on a conditional generative adversarial network is proposed. An attention mechanism is added into the high-resolution image generation network as the backbone network, which enhances the line feature extraction capability and improves the accuracy of line drawing generation. A dual-branch parallel architecture consisting of two backbone networks is implemented, where the semantic translation branch extracts semantic features from orthophotographs of cultural relics, and the gradient screening branch extracts effective gradient features. Finally, the fusion fine-tuning module combines these two types of features to achieve the generation of high-quality and high-resolution archaeology drawings. Experimental results on the self-constructed archaeology drawings dataset of grotto temple statues show that the proposed algorithm outperforms current mainstream image generation algorithms in terms of pixel accuracy (PA), structural similarity (SSIM), and peak signal-to-noise ratio (PSNR) and can be used to assist in drawing archaeology drawings.Keywords: archaeology drawings, digital heritage, image generation, deep learning
Procedia PDF Downloads 59
9921 Brain Tumor Segmentation Based on Minimum Spanning Tree
Authors: Simeon Mayala, Ida Herdlevær, Jonas Bull Haugsøen, Shamundeeswari Anandan, Sonia Gavasso, Morten Brun
Abstract:
In this paper, we propose a minimum spanning tree-based method for segmenting brain tumors. The proposed method performs interactive segmentation based on the minimum spanning tree without tuning parameters. The steps involve preprocessing, making a graph, constructing a minimum spanning tree, and a newly implemented way of interactively segmenting the region of interest. In the preprocessing step, a Gaussian filter is applied to 2D images to remove the noise. Then, the pixel neighbor graph is weighted by intensity differences and the corresponding minimum spanning tree is constructed. The image is loaded in an interactive window for segmenting the tumor. The region of interest and the background are selected by clicking to split the minimum spanning tree into two trees. One of these trees represents the region of interest and the other represents the background. Finally, the segmentation given by the two trees is visualized. The proposed method was tested by segmenting two different 2D brain T1-weighted magnetic resonance image data sets. The comparison between our results and the standard gold segmentation confirmed the validity of the minimum spanning tree approach. The proposed method is simple to implement and the results indicate that it is accurate and efficient.Keywords: brain tumor, brain tumor segmentation, minimum spanning tree, segmentation, image processing
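The interactive split described above can be sketched with SciPy: smooth the image, build a 4-connected pixel graph weighted by intensity differences, take its minimum spanning tree, and cut the heaviest edge on the tree path between the two clicked seeds, which separates the MST into a region-of-interest tree and a background tree. The seed handling and smoothing parameter below are assumptions, not the authors' exact implementation.

# MST-based two-seed segmentation sketch.
import numpy as np
from scipy.ndimage import gaussian_filter
from scipy.sparse import coo_matrix
from scipy.sparse.csgraph import connected_components, minimum_spanning_tree, shortest_path

def pixel_graph(img):
    """4-connected pixel graph weighted by absolute intensity differences."""
    h, w = img.shape
    idx = np.arange(h * w).reshape(h, w)
    flat = img.ravel()
    rows, cols, wts = [], [], []
    for a, b in [(idx[:, :-1], idx[:, 1:]), (idx[:-1, :], idx[1:, :])]:  # right and down neighbours
        rows.append(a.ravel())
        cols.append(b.ravel())
        wts.append(np.abs(flat[a.ravel()] - flat[b.ravel()]) + 1e-6)     # small offset keeps zero-difference edges
    return coo_matrix((np.concatenate(wts), (np.concatenate(rows), np.concatenate(cols))),
                      shape=(h * w, h * w))

def mst_segment(img, fg_seed, bg_seed, sigma=1.0):
    smooth = gaussian_filter(img.astype(float), sigma)    # Gaussian preprocessing, as in the paper
    h, w = img.shape
    tree = minimum_spanning_tree(pixel_graph(smooth))
    sym = (tree + tree.T).tocsr()                         # undirected view of the MST
    s, t = fg_seed[0] * w + fg_seed[1], bg_seed[0] * w + bg_seed[1]
    _, pred = shortest_path(sym, indices=s, return_predecessors=True)
    path = [t]
    while path[-1] != s:                                  # unique tree path from background seed to ROI seed
        path.append(pred[path[-1]])
    u, v = max(zip(path[:-1], path[1:]), key=lambda e: sym[e[0], e[1]])
    cut = sym.tolil()
    cut[u, v] = 0
    cut[v, u] = 0                                         # cutting the heaviest path edge splits the MST in two
    cut = cut.tocsr()
    cut.eliminate_zeros()
    _, labels = connected_components(cut, directed=False)
    return (labels.reshape(h, w) == labels[s]).astype(np.uint8)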
Procedia PDF Downloads 122
9920 Long Term Monitoring and Assessment of Atmospheric Aerosols in Indo-Gangetic Region of India
Authors: Ningombam Linthoingambi Devi, Amrendra Kumar
Abstract:
Long-term sampling at one of the most populated cities in the Indo-Gangetic region shows high mass concentrations of atmospheric aerosol (PM₂.₅) during the spring season (144.70 µg/m³), summer season (91.96 µg/m³), autumn season (266.48 µg/m³), and winter season (367.09 µg/m³), respectively. The concentration of PM₂.₅ in Patna throughout the year is much higher than the national ambient air quality limit fixed by the Central Pollution Control Board of India (CPCB, India) and by the World Health Organization (WHO). Different water-soluble cation (Na⁺, K⁺, Ca²⁺, NH₄⁺, and Mg²⁺) and anion (Cl⁻, NO₃⁻, and SO₄²⁻) species were detected in PM₂.₅. Results show a significantly higher loading of water-soluble ions during the winter and spring seasons. The acidity of the atmosphere was calculated using selected major cations (K⁺, Ca²⁺, and NH₄⁺) and anions (SO₄²⁻ and NO₃⁻), and a regression analysis was performed to check the linkage between the acidic and alkaline ions. The winter season (r² = 0.79) and spring season (r² = 0.64) show good correlation between the cations and anions. The NO₃⁻/SO₄²⁻ ratio indicates that the sources of secondary pollutants were mainly industrial and vehicular emissions, although SO₄²⁻ was mostly emitted from industries during the winter season.
Keywords: aerosols, inorganic species, source apportionment, Indo-Gangetic region
Procedia PDF Downloads 132
9919 Energy Separation Mechanism in Uni-Flow Vortex Tube Using Compressible Vortex Flow
Authors: Hiroshi Katanoda, Mohd Hazwan bin Yusof
Abstract:
A theoretical investigation from the viewpoint of gas-dynamics and thermodynamics was carried out, in order to clarify the energy separation mechanism in a viscous compressible vortex, as a primary flow element in a uni-flow vortex tube. The mathematical solutions of tangential velocity, density and temperature in a viscous compressible vortical flow were used in this study. It is clear that a total temperature in the vortex core falls well below that distant from the vortex core in the radial direction, causing a region with higher total temperature, compared to the distant region, peripheral to the vortex core.Keywords: energy separation mechanism, theoretical analysis, vortex tube, vortical flow
Procedia PDF Downloads 399
9918 Haplotypes of the Human Leukocyte Antigen-G Different HIV-1 Groups from the Netherlands
Authors: A. Alyami, S. Christmas, K. Neeltje, G. Pollakis, B. Paxton, Z. Al-Bayati
Abstract:
The human leukocyte antigen-G (HLA-G) molecule plays an important role in immunomodulation. To date, 16 untranslated region (UTR) HLA-G haplotypes have been defined by sequenced SNPs in the coding region. Of these, UTR-1, UTR-2, UTR-3, UTR-4, UTR-5, UTR-6, and UTR-7 are the most frequent 3'UTR haplotypes at the global level. UTR-1 is associated with higher levels of soluble HLA-G and HLA-G expression, whereas UTR-5 and UTR-7 are linked with low levels of soluble HLA-G and HLA-G expression. Human immunodeficiency virus type 1 (HIV-1) infection results in the progressive loss of immune function in infected individuals. The virus escape mechanism typically involves evading T lymphocyte and NK cell recognition and lysis through down-regulation of classical HLA-A and HLA-B, which has been associated with up-regulation of the non-classical HLA-G molecule. We evaluated the HLA-G 3' untranslated region haplotype frequencies observed in three HIV-1 groups from the Netherlands and their susceptibility to infection. The three groups are made up mainly of men who have sex with men (MSM), injection drug users (IDU), and a high-risk seronegative (HRSN) group. DNA samples were amplified with published primers prior to sequencing. According to our results, the frequencies of low-expresser haplotypes are higher in the HRSN group than in the other groups, indicating that 3'UTR polymorphisms may serve as potential prognostic biomarkers of susceptibility to HIV.
Keywords: human leukocyte antigen-G (HLA-G), men who have sex with men (MSM), injection drug users (IDU), high-risk-seronegative (HRSN) group, untranslated region (UTR)
Procedia PDF Downloads 153
9917 Automatic Threshold Search for Heat Map Based Feature Selection: A Cancer Dataset Analysis
Authors: Carlos Huertas, Reyes Juarez-Ramirez
Abstract:
Public health is one of the most critical issues today; therefore, there is great interest in improving technologies in the area of disease detection. With machine learning and feature selection, it has been possible to aid the diagnosis of several diseases such as cancer. In this work, we present an extension to the Heat Map Based Feature Selection algorithm; this modification allows automatic threshold parameter selection, which helps to improve the generalization performance on high-dimensional data such as mass spectrometry. We have performed a comparison analysis using multiple cancer datasets against the well-known Recursive Feature Elimination algorithm and our original proposal; the results show improved classification performance that is very competitive with current techniques.
Keywords: biomarker discovery, cancer, feature selection, mass spectrometry
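The general idea of an automatic threshold search can be sketched as follows: score every feature, then scan a grid of score thresholds and keep the one with the best cross-validated accuracy. The scoring function, classifier, and threshold grid below are assumptions standing in for the authors' Heat Map Based Feature Selection scores, which are not reproduced here.

# Generic automatic-threshold feature selection sketch.
import numpy as np
from sklearn.feature_selection import f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def auto_threshold_select(X, y, n_candidates=20):
    """Score every feature, then keep the score threshold with the best cross-validated accuracy."""
    scores, _ = f_classif(X, y)                        # stand-in relevance scores ("heat map" values)
    best_thr, best_acc = None, -np.inf
    for thr in np.quantile(scores, np.linspace(0.50, 0.99, n_candidates)):
        keep = scores >= thr
        if not keep.any():
            continue
        acc = cross_val_score(LogisticRegression(max_iter=1000), X[:, keep], y, cv=5).mean()
        if acc > best_acc:
            best_thr, best_acc = thr, acc
    return best_thr, scores >= best_thr

# Hypothetical usage: X holds mass-spectrometry intensities, y the cancer/control labels.
# threshold, mask = auto_threshold_select(X, y); X_reduced = X[:, mask]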
Procedia PDF Downloads 338
9916 Developing an Online Application for Mental Skills Training and Development
Authors: Arjun Goutham, Chaitanya Sridhar, Sunita Maheshwari, Robin Uthappa, Prasanna Gopinath
Abstract:
In alignment with the growth in the sporting industry, a number of people playing and competing in sports are growing exponentially across the globe. However, the number of sports psychology experts are not growing at a similar rate, especially in the Asian and more so, Indian context. Hence, the access to actionable mental training solutions specific to individual athletes is limited. Also, the time constraint an athlete faces due to their intense training schedule makes one-on-one sessions difficult. One of the means to bridge that gap is through technology. Technology makes individualization possible. It allows for easy access to specific-qualitative content/information and provides a medium to place individualized assessments, analysis, solutions directly into an athlete's hands. This enables mental training awareness, education, and real-time actionable solutions possible for athletes in-spite of the limitation of available sports psychology experts in their region. Furthermore, many athletes are hesitant to seek support due to the stigma of appearing weak. Such individuals would prefer a more discreet way. Athletes who have strong mental performance tend to produce better results. The mobile application helps to equip athletes with assessing and developing their mental strategies directed towards improving performance on an ongoing basis. When an athlete understands their strengths and limitations in their mental application, they can focus specifically on applying the strategies that work and improve on zones of limitation. With reports, coaches get to understand the unique inner workings of an athlete and can utilize the data & analysis to coach them with better precision and use coaching styles & communication that suits better. Systematically capturing data and supporting athletes(with individual-specific solutions) or teams with assessment, planning, instructional content, actionable tools & strategies, reviewing mental performance and the achievement of objectives & goals facilitate for a consistent mental skills development at all levels of sporting stages of an athlete's career. The mobile application will help athletes recognize and align with their stable attributes such as their personalities, learning & execution modalities, challenges & requirements of their sport, etc and help develop dynamic attributes like states, beliefs, motivation levels, focus etc. with practice and training. It will provide measurable analysis on a regular basis and help them stay aligned to their objectives & goals. The solutions are based on researched areas of influence on sporting performance individually or in teams.Keywords: athletes, mental training, mobile application, performance, sports
Procedia PDF Downloads 268
9915 Task Scheduling on Parallel System Using Genetic Algorithm
Authors: Jasbir Singh Gill, Baljit Singh
Abstract:
Scheduling and mapping an application task graph onto a multiprocessor parallel system is considered one of the most crucial and critical NP-complete problems. Many genetic algorithms have been proposed to solve such problems. In this paper, two genetic-approach-based algorithms have been designed and developed, with and without task duplication. The proposed algorithms work on two fitness functions. The first, the task fitness, is used to minimize the total finish time of the schedule (schedule length), while the second, the process fitness, is concerned with allocating tasks to the most efficient processor from the list of available processors (load balance). The proposed genetic-based algorithms have been experimentally implemented and evaluated against other state-of-the-art, popular, and widely used algorithms.
Keywords: parallel computing, task scheduling, task duplication, genetic algorithm
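A small GA sketch for the underlying mapping problem is given below: a chromosome assigns each task to a processor, and the fitness to minimize is the schedule length (makespan). Precedence constraints and the task duplication studied in the paper are omitted for brevity, and the task costs and GA parameters are assumptions.

# Toy GA for task-to-processor mapping minimizing makespan.
import random

TASK_COST = [4, 2, 3, 5, 1, 6, 2, 4]     # execution time of each task (assumed)
N_PROC = 3

def makespan(chrom):
    loads = [0] * N_PROC
    for task, proc in enumerate(chrom):
        loads[proc] += TASK_COST[task]
    return max(loads)

def ga_schedule(pop_size=40, generations=200, p_mut=0.1):
    pop = [[random.randrange(N_PROC) for _ in TASK_COST] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=makespan)                          # shorter schedule = fitter
        survivors = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, len(TASK_COST))   # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < p_mut:                 # mutation: move one task to another processor
                child[random.randrange(len(child))] = random.randrange(N_PROC)
            children.append(child)
        pop = survivors + children
    return min(pop, key=makespan)

best = ga_schedule()
print(best, "makespan:", makespan(best))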
Procedia PDF Downloads 349
9914 Study on Novel Reburning Process for NOx Reduction by Oscillating Injection of Reburn Fuel
Authors: Changyeop Lee, Sewon Kim, Jongho Lee
Abstract:
Reburning technology has been developed to adopt various commercial combustion systems. Fuel lean reburning is an advanced reburning method to reduce NOx economically without using burnout air, however it is not easy to get high NOx reduction efficiency. In the fuel lean reburning system, the localized fuel rich eddies are used to establish partial fuel rich regions so that the NOx can react with hydrocarbon radical restrictively. In this paper, a new advanced reburning method which supplies reburn fuel with oscillatory motion is introduced to increase NOx reduction rate effectively. To clarify whether forced oscillating injection of reburn fuel can effectively reduce NOx emission, experimental tests were conducted in vertical combustion furnace. Experiments were performed in flames stabilized by a gas burner, which was mounted at the bottom of the furnace. The natural gas is used as both main and reburn fuel and total thermal input is about 40kW. The forced oscillating injection of reburn fuel is realized by electronic solenoid valve, so that fuel rich region and fuel lean region is established alternately. In the fuel rich region, NOx is converted to N2 by reburning reaction, however unburned hydrocarbon and CO is oxidized in fuel lean zone and mixing zone at downstream where slightly fuel lean region is formed by mixing of two regions. This paper reports data on flue gas emissions and temperature distribution in the furnace for a wide range of experimental conditions. All experimental data has been measured at steady state. The NOx reduction rate increases up to 41% by forced oscillating reburn motion. The CO emissions were shown to be kept at very low level. And this paper makes clear that in order to decrease NOx concentration in the exhaust when oscillating reburn fuel injection system is adopted, the control of factors such as frequency and duty ratio is very important.Keywords: NOx, CO, reburning, pollutant
Procedia PDF Downloads 288
9913 Improvement Image Summarization using Image Processing and Particle swarm optimization Algorithm
Authors: Hooman Torabifard
Abstract:
In the last few years, with the progress of technology and computers and the entry of artificial intelligence into all kinds of scientific and industrial fields, human lifestyles and, in general, the way humans live on earth have undergone many changes and developments. Some of these changes have occurred, and continue to occur, in the context of digital images and image processing. However, besides all the benefits, there have been disadvantages. One of these disadvantages is the multiplicity of images with high volumes of data; the focus of this paper is on improving and developing a method for summarizing these images and enhancing their usefulness. The general method used for this purpose consists of a set of techniques based on data obtained from image processing combined with the particle swarm optimization (PSO) algorithm. In the remainder of this paper, the method used is elaborated in detail.
Keywords: image summarization, particle swarm optimization, image threshold, image processing
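One illustrative use of PSO for the image-thresholding step named in the keywords is shown below: particles search for the grayscale threshold that maximizes Otsu's between-class variance. The PSO coefficients and the use of Otsu's criterion are assumptions, not the paper's exact setup.

# PSO search for a grayscale threshold maximizing Otsu's between-class variance.
import numpy as np

def between_class_variance(hist, t):
    p = hist / hist.sum()
    w0, w1 = p[:t].sum(), p[t:].sum()
    if w0 == 0 or w1 == 0:
        return 0.0
    mu0 = (np.arange(t) * p[:t]).sum() / w0
    mu1 = (np.arange(t, 256) * p[t:]).sum() / w1
    return w0 * w1 * (mu0 - mu1) ** 2

def pso_threshold(hist, n_particles=20, iters=50, w=0.7, c1=1.5, c2=1.5):
    rng = np.random.default_rng(0)
    x = rng.uniform(1, 255, n_particles)               # particle positions = candidate thresholds
    v = np.zeros(n_particles)
    pbest = x.copy()
    pbest_val = np.array([between_class_variance(hist, int(t)) for t in x])
    gbest = pbest[pbest_val.argmax()]
    for _ in range(iters):
        r1, r2 = rng.random(n_particles), rng.random(n_particles)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, 1, 255)
        val = np.array([between_class_variance(hist, int(t)) for t in x])
        improved = val > pbest_val
        pbest[improved], pbest_val[improved] = x[improved], val[improved]
        gbest = pbest[pbest_val.argmax()]
    return int(gbest)

# hist = np.bincount(gray_image.ravel(), minlength=256)   # histogram of an 8-bit image (assumed input)
# t = pso_threshold(hist)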
Procedia PDF Downloads 133