Search results for: automatic web search
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2653

2563 Discrete Swarm with Passive Congregation for Cost Minimization of the Multiple Vehicle Routing Problem

Authors: Tarek Aboueldahab, Hanan Farag

Abstract:

Cost minimization of the Multiple Vehicle Routing Problem has become a critical issue in the field of transportation because it is an NP-hard optimization problem with a complex search space. Many studies use hybridized artificial intelligence (AI) models to solve this problem; however, they cannot guarantee reaching the best solution because of the difficulty of searching the whole search space. To overcome this problem, we introduce a hybrid model of Discrete Particle Swarm Optimization (DPSO) with passive congregation, which enables searching the whole search space and strikes a compromise between local and global search. Practical experiments show that our model clearly outperforms other hybrid models in cost minimization.
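
As a rough illustration of the idea only (not the authors' implementation), the sketch below applies a particle swarm update with an extra passive-congregation term and uses a random-keys encoding so that each continuous position decodes to a discrete customer ordering; the distance data, cost function, and parameter values are invented.

import numpy as np

rng = np.random.default_rng(0)
n_customers, n_particles, iters = 10, 20, 200
dist = rng.uniform(1, 10, (n_customers + 1, n_customers + 1))  # toy distance matrix, node 0 = depot

def route_cost(order):
    # single-vehicle tour cost for simplicity; a real MVRP would split the order among vehicles
    path = [0] + list(order) + [0]
    return sum(dist[path[i], path[i + 1]] for i in range(len(path) - 1))

def decode(x):
    # random-keys decoding: visit customers in the order of their continuous keys
    return np.argsort(x) + 1

X = rng.uniform(0, 1, (n_particles, n_customers))
V = np.zeros_like(X)
P = X.copy()                                     # personal bests
p_cost = np.array([route_cost(decode(x)) for x in X])
g = P[p_cost.argmin()].copy()                    # global best

w, c1, c2, c3 = 0.7, 1.5, 1.5, 0.6               # c3 weights the passive-congregation term
for _ in range(iters):
    for i in range(n_particles):
        r1, r2, r3 = rng.uniform(size=3)
        k = rng.integers(n_particles)            # randomly chosen neighbour (passive congregation)
        V[i] = (w * V[i] + c1 * r1 * (P[i] - X[i])
                + c2 * r2 * (g - X[i]) + c3 * r3 * (X[k] - X[i]))
        X[i] += V[i]
        c = route_cost(decode(X[i]))
        if c < p_cost[i]:
            p_cost[i], P[i] = c, X[i].copy()
    g = P[p_cost.argmin()].copy()

print("best route:", decode(g), "cost:", round(p_cost.min(), 2))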

Keywords: cost minimization, multi-vehicle routing problem, passive congregation, discrete swarm

Procedia PDF Downloads 66
2562 Performance Analysis of Search Medical Imaging Service on Cloud Storage Using Decision Trees

Authors: González A. Julio, Ramírez L. Leonardo, Puerta A. Gabriel

Abstract:

Telemedicine services use a large amount of data, most of which are diagnostic images in Digital Imaging and Communications in Medicine (DICOM) and Health Level Seven (HL7) formats. Metadata is generated from each related image to support its identification. This study presents the use of decision trees to optimize the search for diagnostic images hosted on a cloud server. To analyze server performance, the following quality of service (QoS) metrics are evaluated: delay, bandwidth, jitter, latency, and throughput, in five test scenarios for a total of 26 experiments during the uploading and downloading of DICOM images hosted by the telemedicine group server of the Universidad Militar Nueva Granada, Bogotá, Colombia. By applying decision trees as a data mining technique and comparing them with sequential search, it was possible to evaluate the search times for diagnostic images on the server. The results show that by using the metadata in decision trees, search times are substantially improved, computational resources are optimized, and request management for the telemedicine image service is improved. Based on the experiments carried out, search efficiency increased by 45% relative to sequential search, since false positives are avoided in the management and acquisition of the information when a diagnostic image is downloaded. It is concluded that, for diagnostic image services in telemedicine, the decision tree technique guarantees accessibility and robustness in the acquisition and manipulation of medical images, improving diagnoses and medical procedures for patients.

Keywords: cloud storage, decision trees, diagnostic image, search, telemedicine

Procedia PDF Downloads 180
2561 Estimation of Fuel Cost Function Characteristics Using Cuckoo Search

Authors: M. R. Al-Rashidi, K. M. El-Naggar, M. F. Al-Hajri

Abstract:

The fuel cost function describes the relationship between electric power generation and cost in thermal plants; hence, it sheds light on economic aspects of the power industry. Different models have been proposed to describe this relationship, with the quadratic function model being the most popular. In this paper, the parameters of a second-order fuel cost function are estimated using the cuckoo search algorithm. It is a new population-based meta-heuristic optimization technique that is used in this study primarily as an accurate estimation tool. Its main features are flexibility, simplicity, and effectiveness compared to other estimation techniques. The parameter estimation problem is formulated as an optimization problem whose goal is to minimize the error associated with the estimated parameters. A case study is considered to illustrate cuckoo search's promising potential as a valuable estimation and optimization technique.
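
For orientation, a minimal cuckoo-search sketch that fits the quadratic fuel cost model F(P) = a + b*P + c*P^2 by minimizing the sum of squared errors between measured and estimated costs; the generation/cost data and algorithm settings below are invented for the demonstration.

import numpy as np

rng = np.random.default_rng(1)

# synthetic "measured" generation/cost data for a, b, c = 100, 7.0, 0.05 (assumed for the demo)
P = np.linspace(50, 300, 30)
F_meas = 100 + 7.0 * P + 0.05 * P**2 + rng.normal(0, 5, P.size)

def sse(theta):                       # estimation error to be minimized
    a, b, c = theta
    return np.sum((F_meas - (a + b * P + c * P**2))**2)

def levy(size, beta=1.5):             # Mantegna's algorithm for Levy-distributed steps
    from math import gamma, sin, pi
    sigma = (gamma(1 + beta) * sin(pi * beta / 2) /
             (gamma((1 + beta) / 2) * beta * 2**((beta - 1) / 2)))**(1 / beta)
    u = rng.normal(0, sigma, size)
    v = rng.normal(0, 1, size)
    return u / np.abs(v)**(1 / beta)

n_nests, pa, iters = 25, 0.25, 500
lo, hi = np.array([0.0, 0.0, 0.0]), np.array([500.0, 20.0, 1.0])
nests = rng.uniform(lo, hi, (n_nests, 3))
fit = np.array([sse(x) for x in nests])

for _ in range(iters):
    best = nests[fit.argmin()]
    # generate new solutions by Levy flights around the current nests
    new = np.clip(nests + 0.01 * levy((n_nests, 3)) * (nests - best), lo, hi)
    new_fit = np.array([sse(x) for x in new])
    better = new_fit < fit
    nests[better], fit[better] = new[better], new_fit[better]
    # abandon a fraction pa of the nests and build new ones at random
    abandon = rng.uniform(size=n_nests) < pa
    nests[abandon] = rng.uniform(lo, hi, (abandon.sum(), 3))
    fit[abandon] = [sse(x) for x in nests[abandon]]

a, b, c = nests[fit.argmin()]
print(f"estimated a={a:.1f}, b={b:.2f}, c={c:.4f}")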

Keywords: cuckoo search, parameters estimation, fuel cost function, economic dispatch

Procedia PDF Downloads 548
2560 Web Search Engine Based Naming Procedure for Independent Topic

Authors: Takahiro Nishigaki, Takashi Onoda

Abstract:

In recent years, the amount of document data has been increasing with the spread of the Internet, and many methods have been studied for extracting topics from large document collections. We previously proposed Independent Topic Analysis (ITA), which uses Independent Component Analysis to extract topics that are independent of each other from large document collections such as newspaper archives. Each topic extracted by ITA is represented by a set of words, but this set can be quite different from the topics users have in mind. For example, the top five words of one topic, ranked by independence, are Topic1 = {"scor", "game", "lead", "quarter", "rebound"}. This topic can be interpreted as "SPORTS", yet the name "SPORTS" has to be attached by the user; ITA itself cannot name topics. In this research, we therefore propose a method that uses a web search engine to obtain names that are easy for people to understand for the topics given by the word sets produced by Independent Topic Analysis. Specifically, we issue a topic's word set as a search query and take the title of the top-ranked result's homepage as the topic name. We also apply the proposed method to several datasets and verify its effectiveness.
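
A schematic sketch of the naming step (the search helper below is a hypothetical stand-in, since the paper does not specify an API): join a topic's most independent words into a query, issue it to a web search engine, and take the title of the top result's homepage as the topic name.

from typing import List

def search_titles(query: str) -> List[str]:
    """Hypothetical stand-in for a web search engine call.
    In practice this would query a real search API and return the result page titles."""
    canned = {"scor game lead quarter rebound": ["NBA Basketball Scores - Sports News"]}
    return canned.get(query, [])

def name_topic(topic_words: List[str]) -> str:
    query = " ".join(topic_words)          # e.g. the top-5 most independent words of the topic
    titles = search_titles(query)
    return titles[0] if titles else query  # fall back to the raw words if nothing is returned

topic1 = ["scor", "game", "lead", "quarter", "rebound"]
print(name_topic(topic1))                  # -> a human-readable name for the topic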

Keywords: independent topic analysis, topic extraction, topic naming, web search engine

Procedia PDF Downloads 96
2559 Concept for Determining the Focus of Technology Monitoring Activities

Authors: Guenther Schuh, Christina Koenig, Nico Schoen, Markus Wellensiek

Abstract:

Identification and selection of appropriate product and manufacturing technologies are key factors for the competitiveness and market success of technology-based companies. Therefore, many companies perform technology intelligence (TI) activities to ensure that evolving technologies are identified at the right time. Technology monitoring is one of the three base activities of TI, besides scanning and scouting. As technological progress accelerates, more and more technologies are being developed, and given limited resources, it is necessary to focus TI activities. In this paper, we propose a concept for defining appropriate search fields for technology monitoring. This limitation of the search space leads to more concentrated monitoring activities. The concept is introduced and demonstrated through an anonymized case study conducted within an industry project at the Fraunhofer Institute for Production Technology. The described concept provides a customized monitoring approach suitable for technology-oriented companies, especially those that have not yet defined an explicit technology strategy. It is shown that defining search fields and search tasks is a suitable way to delimit topics of interest and thus to direct monitoring activities. Current and planned product, production, and material technologies, as well as existing skills, capabilities, and resources, form the basis for the described derivation of relevant search areas. To further improve the concept of technology monitoring, the proposed approach should be extended in future research, e.g., by defining relevant monitoring parameters.

Keywords: monitoring radar, search field, technology intelligence, technology monitoring

Procedia PDF Downloads 442
2558 Synthetic Method of Contextual Knowledge Extraction

Authors: Olga Kononova, Sergey Lyapin

Abstract:

The global information society requires transparency and reliability of data, as well as the ability to manage information resources independently, in particular to search, analyze, and evaluate information and thereby obtain new expertise. Moreover, it is the satisfaction of society's information needs that increases the efficiency of enterprise management and public administration. The study of structurally organized thematic and semantic contexts of different types, automatically extracted from unstructured data, is one of the important tasks for the application of information technologies in education, science, culture, governance, and business. The objectives of this study are the typologization of contextual knowledge and the selection or creation of effective tools for extracting and analyzing it. Explicating the various kinds and forms of contextual knowledge involves the development and use of full-text search information systems. For implementation purposes, the authors use the services of the e-library 'Humanitariana', such as contextual search, different types of queries (paragraph-oriented queries, frequency-ranked queries), and automatic extraction of knowledge from scientific texts. The multifunctional e-library 'Humanitariana' is realized in an Internet architecture in a WWS configuration (Web browser / Web server / SQL server). An advantage of using 'Humanitariana' is the possibility of combining the resources of several organizations: scholars and research groups may work in local network mode or in distributed IT environments, with the ability to access resources on any participating organization's servers. The paper discusses some specific cases of contextual knowledge explication using the e-library services and focuses on the possibilities of new types of contextual knowledge. The experimental research base consists of scientific texts about 'e-government' and 'computer games'. An analysis of trends in these subject-themed texts allowed us to propose a content analysis methodology that combines full-text search with automatic construction of a 'terminogramma' and expert analysis of the selected contexts. A 'terminogramma' is a table that contains a column with a frequency-ranked list of words (nouns), as well as columns indicating the absolute frequency (count) and the relative frequency of occurrence of each word (in percent). The analysis of the 'e-government' materials showed that the state takes a dominant position in the processes of electronic interaction between the authorities and society in modern Russia. The media credited the main role in these processes to the government, which provides public services through specialized portals. Factor analysis revealed two factors statistically describing the terms used: human interaction (the user) and the state (the government as process organizer); and interaction management (the public officer as process performer) and technology (infrastructure). Isolation of these factors will lead to changes in the model of electronic interaction between government and society. The study also identified the dominant social problems and the prevalence of different categories of computer gaming subjects in scientific papers from 2005 to 2015.
Overall, several types of contextual knowledge are identified: micro context; macro context; dynamic context; thematic collections of queries (interactive contextual knowledge expanding the composition of e-library information resources); and multimodal context (functional integration of iconographic and full-text resources through a hybrid quasi-semantic search algorithm). Further studies can both expand the resource base on which they are conducted and develop the appropriate tools.
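
As an indicative sketch of how a 'terminogramma' might be assembled (the tokenization and noun filtering here are simplified assumptions), one can count word occurrences, rank them by frequency, and report absolute and relative frequencies:

from collections import Counter
import re

def terminogramma(texts, nouns=None):
    """Build a frequency-ranked word table: (word, absolute frequency, relative frequency in %).
    `nouns` is an optional set of words to keep; real noun filtering would need a POS tagger."""
    tokens = []
    for text in texts:
        words = re.findall(r"[a-zA-Z\u0400-\u04FF']+", text.lower())
        tokens.extend(w for w in words if nouns is None or w in nouns)
    counts = Counter(tokens)
    total = sum(counts.values())
    return [(w, n, 100.0 * n / total) for w, n in counts.most_common()]

corpus = ["The government provides public services through specialized portals.",
          "Users interact with the government via e-government portals."]
for word, absolute, relative in terminogramma(corpus)[:5]:
    print(f"{word:<12} {absolute:>3} {relative:6.2f}%")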

Keywords: contextual knowledge, contextual search, e-library services, frequency-ranked query, paragraph-oriented query, technologies of the contextual knowledge extraction

Procedia PDF Downloads 325
2557 An Open Source Advertisement System

Authors: Pushkar Umaranikar, Chris Pollett

Abstract:

An online advertisement system and its implementation for the Yioop open source search engine are presented. This system supports both selling advertisements and displaying them within search results. Advertisements are sold by auctioning off daily impressions for keyword searches in an open, ascending-price auction in which all accepted bids receive a fraction of the auctioned day's impressions. New bids in our system are required to be at least one half of the sum of all previous bids, which ensures that the number of accepted bids is logarithmic in the total ad spend on a keyword for a day. The mechanics of creating an advertisement, attaching keywords to it, and adding it to an advertisement inventory are described. The algorithm used to go from the accepted bids for a keyword to the ads displayed at search time is also presented. We discuss properties of our system and compare it to existing auction systems and systems for selling online advertisements.
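
A small worked illustration of the bid-acceptance rule (bid values invented): because every accepted bid must be at least half the sum of all earlier accepted bids, the running total grows by a factor of at least 1.5 per accepted bid, so the number of accepted bids is O(log(total spend / first bid)).

import math

def accept(bid, previous_bids):
    """A new bid is accepted only if it is at least half the sum of all previously accepted bids."""
    return bid >= 0.5 * sum(previous_bids)

accepted = []
for bid in [10, 4, 5, 8, 40, 20, 100]:       # invented bid stream, in dollars
    if accept(bid, accepted):
        accepted.append(bid)

total = sum(accepted)
print("accepted bids:", accepted, "total spend:", total)
# Each accepted bid multiplies the running total by at least 1.5, so the count is logarithmic:
print("upper bound on accepted bids:", 1 + math.ceil(math.log(total / accepted[0], 1.5)))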

Keywords: online markets, online ad system, online auctions, search engines

Procedia PDF Downloads 293
2556 Automatic Staging and Subtype Determination for Non-Small Cell Lung Carcinoma Using PET Image Texture Analysis

Authors: Seyhan Karaçavuş, Bülent Yılmaz, Ömer Kayaaltı, Semra İçer, Arzu Taşdemir, Oğuzhan Ayyıldız, Kübra Eset, Eser Kaya

Abstract:

In this study, our goal was to perform tumor staging and subtype determination automatically using different texture analysis approaches for a very common cancer type, non-small cell lung carcinoma (NSCLC). In particular, we introduce a texture analysis approach, Laws' texture filters, to be used in this context for the first time. The 18F-FDG PET images of 42 patients with NSCLC were evaluated; the number of patients in each tumor stage, i.e., I-II, III, or IV, was 14. About 45% of the patients had adenocarcinoma (ADC) and about 55% had squamous cell carcinoma (SqCC). The MATLAB technical computing language was employed to extract 51 features using first-order statistics (FOS), the gray-level co-occurrence matrix (GLCM), the gray-level run-length matrix (GLRLM), and Laws' texture filters. The feature selection method employed was sequential forward selection (SFS). The selected textural features were used for automatic classification with k-nearest neighbors (k-NN) and support vector machines (SVM). In the automatic classification of tumor stage, the accuracy was approximately 59.5% with the k-NN classifier (k=3) and 69% with SVM (one-versus-one paradigm), using 5 features. In the automatic classification of tumor subtype, the accuracy was around 92.7% with one-versus-one SVM. Texture analysis of FDG-PET images might be used, in addition to metabolic parameters, as an objective tool to assess tumor histopathological characteristics and for automatic classification of tumor stage and subtype.
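
The paper extracts 51 features from FDG-PET images; the snippet below is a much-reduced sketch of the same pipeline shape, computing a few GLCM texture descriptors, applying sequential forward selection, and scoring k-NN and SVM classifiers on synthetic patches (the data and library choices are assumptions, not the study's setup).

import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)

def glcm_features(patch):
    """A handful of GLCM texture descriptors for one 8-bit image patch."""
    glcm = graycomatrix(patch, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    props = ["contrast", "homogeneity", "energy", "correlation", "dissimilarity"]
    return np.array([graycoprops(glcm, p).mean() for p in props])

# synthetic stand-in for tumour ROIs: class 0 = smooth texture, class 1 = noisy texture
X, y = [], []
for label in (0, 1):
    for _ in range(30):
        base = rng.integers(80, 120, (32, 32)) if label == 0 else rng.integers(0, 256, (32, 32))
        X.append(glcm_features(base.astype(np.uint8)))
        y.append(label)
X, y = np.array(X), np.array(y)

knn = KNeighborsClassifier(n_neighbors=3)
sfs = SequentialFeatureSelector(knn, n_features_to_select=3, direction="forward", cv=5)
X_sel = sfs.fit_transform(X, y)            # sequential forward selection of texture features

print("k-NN accuracy:", cross_val_score(knn, X_sel, y, cv=5).mean())
print("SVM  accuracy:", cross_val_score(SVC(decision_function_shape="ovo"), X_sel, y, cv=5).mean())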

Keywords: cancer stage, cancer cell type, non-small cell lung carcinoma, PET, texture analysis

Procedia PDF Downloads 298
2555 The Automatic Transliteration Model of Images of the Book Hamong Tani Using Statistical Approach

Authors: Agustinus Rudatyo Himamunanto, Anastasia Rita Widiarti

Abstract:

Transliteration of Javanese manuscripts is one way to preserve and pass on the literary wealth of the past to the present generation in Indonesia. The manual transliteration process commonly requires philologists and takes a relatively long time, so an automatic transliteration process is expected to shorten this time and support the work of philologists. The preprocessing and segmentation stage, performed first, prepares the document images and yields script image units that are free from noise and similar in thickness, size, and slope. The next stage, feature extraction, finds unique characteristics that distinguish each Javanese script image. One of the features used in this research is the number of black pixels in each image unit. Each Javanese script image in the training data undergoes the same process as the input characters. The system was tested with data from the book Hamong Tani, which was selected because its content, age, and number of pages were considered sufficient as experimental input for the model. Based on automatic transliteration tests on randomly selected pages, the maximum accuracy obtained was 81.53%, achieved with a 32x32 pixel input image size and a 5x5 image window. With regard to these results, it can be concluded that the proposed automatic transliteration model is relatively good.
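
A toy sketch of the black-pixel-count feature with a nearest-count match against training glyphs (the images and labels are invented; the paper's statistical model is richer than this):

import numpy as np

def black_pixel_count(unit, threshold=128):
    """Number of black (ink) pixels in one segmented script-image unit (grayscale, 0 = black)."""
    return int((unit < threshold).sum())

rng = np.random.default_rng(3)
# invented training data: a few 32x32 glyph images per Javanese character class
training = {label: [rng.integers(0, 256, (32, 32)) for _ in range(5)]
            for label in ["ha", "na", "ca", "ra", "ka"]}
profile = {label: np.mean([black_pixel_count(img) for img in imgs])
           for label, imgs in training.items()}

def transliterate_unit(unit):
    # assign the class whose average black-pixel count is closest to the input unit's count
    count = black_pixel_count(unit)
    return min(profile, key=lambda label: abs(profile[label] - count))

print(transliterate_unit(rng.integers(0, 256, (32, 32))))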

Keywords: Javanese script, character recognition, statistical, automatic transliteration

Procedia PDF Downloads 316
2554 Use of Interpretable Evolved Search Query Classifiers for Sinhala Documents

Authors: Prasanna Haddela

Abstract:

Document analysis is a well matured yet still active research field, partly as a result of the intricate nature of building computational tools but also due to the inherent problems arising from the variety and complexity of human languages. Breaking down language barriers is vital in enabling access to a number of recent technologies. This paper investigates the application of document classification methods to new Sinhalese datasets. This language is geographically isolated and rich with many of its own unique features. We will examine the interpretability of the classification models with a particular focus on the use of evolved Lucene search queries generated using a Genetic Algorithm (GA) as a method of document classification. We will compare the accuracy and interpretability of these search queries with other popular classifiers. The results are promising and are roughly in line with previous work on English language datasets.

Keywords: evolved search queries, Sinhala document classification, Lucene Sinhala analyzer, interpretable text classification, genetic algorithm

Procedia PDF Downloads 89
2553 Earphone Style Wearable Device for Automatic Guidance Service with Position Sensing

Authors: Dawei Cai

Abstract:

This paper describes the design of an earphone-style wearable device that provides an automatic guidance service for visitors. With position information and orientation information obtained from NFC and a terrestrial magnetism sensor, a high-level automatic guidance service can be realized. To realize the service, we developed an algorithm for position detection using the packets from NFC tags, and an algorithm that calculates the device orientation from the data of the acceleration and terrestrial magnetism (MEMS) sensors. If a visitor wants an explanation of the exhibit in front of them, all they have to do is move to the object and stand still for a moment. The identification program automatically recognizes this status from the NFC and MEMS information and starts playing the explanation content for the exhibit. This service should be useful for improving the understanding of the exhibited items and for providing a more satisfying visiting experience with less burden.
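
One common way to derive device orientation from MEMS acceleration and magnetic-field readings is a tilt-compensated compass; the sketch below uses a frequently cited axis convention (x forward, y right, z down), which is an assumption, since the paper does not give its formulas.

import math

def heading_deg(ax, ay, az, mx, my, mz):
    """Tilt-compensated magnetic heading from accelerometer (g) and magnetometer (uT) readings.
    Assumed axis convention: x forward, y right, z down."""
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    # rotate the magnetic vector back into the horizontal plane
    mx_h = (mx * math.cos(pitch) + my * math.sin(roll) * math.sin(pitch)
            + mz * math.cos(roll) * math.sin(pitch))
    my_h = my * math.cos(roll) - mz * math.sin(roll)
    return math.degrees(math.atan2(-my_h, mx_h)) % 360.0

# device held level and pointing roughly toward magnetic north (invented sample values)
print(round(heading_deg(0.0, 0.0, 1.0, 20.0, 0.0, 45.0), 1))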

Keywords: wearable device, MEMS sensor, ubiquitous computing, NFC

Procedia PDF Downloads 218
2552 Efficient Subsurface Mapping: Automatic Integration of Ground Penetrating Radar with Geographic Information Systems

Authors: Rauf R. Hussein, Devon M. Ramey

Abstract:

Integrating Ground Penetrating Radar (GPR) with Geographic Information Systems (GIS) can provide valuable insights for various applications, such as archaeology, transportation, and utility locating. Although there has been progress toward automating the integration of GPR data with GIS, fully automatic integration has not yet been achieved, and manually integrating GPR data with GIS can be a time-consuming and error-prone process. In this study, actual, real-world GPR applications are presented, and a software package named GPR-GIS 10 is created to interactively extract subsurface targets from GPR radargrams and automatically integrate them into GIS. With this software, it is possible to quickly and reliably integrate the two techniques to create informative subsurface maps. The results indicate that automatic integration of GPR with GIS can be an efficient tool for mapping and viewing any subsurface target in its appropriate location in 3D space with the needed precision. The findings of this study could help GPR-GIS integrators save time and reduce errors in many GPR-GIS applications.

Keywords: GPR, GIS, GPR-GIS 10, drone technology, automation

Procedia PDF Downloads 56
2551 Optimal Placement of Phasor Measurement Units Using Gravitational Search Method

Authors: Satyendra Pratap Singh, S. P. Singh

Abstract:

This paper presents a methodology using the Gravitational Search Algorithm for optimal placement of Phasor Measurement Units (PMUs) in order to achieve complete observability of the power system. The objective of the proposed algorithm is to minimize the total number of PMUs at the power system buses, which in turn minimizes the installation cost of the PMUs. In this algorithm, the searcher agents are a collection of masses that interact with each other according to Newton's laws of gravity and motion. This Gravitational Search Algorithm based method has been applied to the IEEE 14-bus, IEEE 30-bus, and IEEE 118-bus test systems. Case studies reveal that the proposed method finds the optimal number of PMUs with better observability.
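
A compact sketch of the gravitational search mechanics applied to a binary PMU-placement fitness; the small bus system, penalty weight, and parameter values are invented, and the paper's exact formulation may differ. Agent masses are derived from fitness, attraction follows the gravity law with a decaying constant, and each continuous position is thresholded into a 0/1 placement vector.

import numpy as np

rng = np.random.default_rng(4)

# invented 7-bus test network: A[i, j] = 1 if buses i and j are connected (diagonal included,
# since a PMU observes its own bus as well as its neighbours)
A = np.array([[1,1,0,0,1,0,0],
              [1,1,1,0,0,0,0],
              [0,1,1,1,0,0,0],
              [0,0,1,1,0,1,0],
              [1,0,0,0,1,1,1],
              [0,0,0,1,1,1,0],
              [0,0,0,0,1,0,1]])

def fitness(x):
    place = (x > 0.5).astype(int)          # threshold the continuous agent into a 0/1 placement
    unobserved = np.sum(A @ place < 1)     # buses not seen by any PMU
    return place.sum() + 100 * unobserved  # minimise PMU count, heavily penalise lost observability

n_agents, dim, iters, G0, alpha = 20, A.shape[0], 200, 100.0, 20.0
X = rng.uniform(0, 1, (n_agents, dim))
V = np.zeros_like(X)

for t in range(iters):
    f = np.array([fitness(x) for x in X])
    best, worst = f.min(), f.max()
    m = (worst - f) / (worst - best + 1e-12)       # better (lower) fitness -> larger mass
    M = m / (m.sum() + 1e-12)
    G = G0 * np.exp(-alpha * t / iters)            # gravitational "constant" decays over time
    acc = np.zeros_like(X)
    for i in range(n_agents):
        for j in range(n_agents):
            if i == j:
                continue
            R = np.linalg.norm(X[i] - X[j])
            acc[i] += rng.uniform() * G * M[j] * (X[j] - X[i]) / (R + 1e-12)
    V = rng.uniform(size=X.shape) * V + acc
    X = np.clip(X + V, 0, 1)

f = np.array([fitness(x) for x in X])
print("PMU buses:", np.where(X[f.argmin()] > 0.5)[0] + 1, "fitness:", f.min())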

Keywords: gravitational search algorithm (GSA), law of motion, law of gravity, observability, phasor measurement unit

Procedia PDF Downloads 476
2550 Applying Spanning Tree Graph Theory for Automatic Database Normalization

Authors: Chetneti Srisa-an

Abstract:

In the field of knowledge and data engineering, the relational database is the best repository for storing real-world data and has been used around the world for more than eight decades. Normalization is the most important process in the analysis and design of relational databases. It aims at creating a set of relational tables with minimum data redundancy that preserve consistency and facilitate correct insertion, deletion, and modification. Despite its importance, very few algorithms have been developed for use in commercial automatic normalization tools, and normalization is rarely performed automatically rather than manually. Moreover, for today's large and complex databases, manual normalization is even harder. This paper presents a new, fully automated relational database normalization method. It first produces a directed graph and a spanning tree, and then proceeds to generate the 2NF, 3NF, and BCNF normal forms. The benefit of this new algorithm is that it can cope with a large set of complex functional dependencies.
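
As an indicative sketch of the graph view of functional dependencies (the schema and dependencies are invented), the dependencies can be stored as directed edges, and the standard attribute-closure computation then identifies candidate keys, which is the starting point for deriving the 2NF/3NF/BCNF decompositions:

from itertools import combinations

# invented relation R(A, B, C, D, E) with functional dependencies drawn as directed edges LHS -> RHS
attributes = {"A", "B", "C", "D", "E"}
fds = [({"A", "B"}, {"C"}), ({"C"}, {"D"}), ({"D"}, {"E"})]

def closure(attrs, fds):
    """Attribute closure: every attribute reachable from `attrs` by following FD edges."""
    result = set(attrs)
    changed = True
    while changed:
        changed = False
        for lhs, rhs in fds:
            if lhs <= result and not rhs <= result:
                result |= rhs
                changed = True
    return result

def candidate_keys(attributes, fds):
    """Smallest attribute sets whose closure covers the whole relation."""
    for size in range(1, len(attributes) + 1):
        keys = [set(c) for c in combinations(sorted(attributes), size)
                if closure(set(c), fds) == attributes]
        if keys:
            return keys
    return []

print("candidate keys:", candidate_keys(attributes, fds))   # -> [{'A', 'B'}]
# With the key known, an FD such as C -> D (a non-key attribute determining another) flags a 3NF/BCNF violation.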

Keywords: relational database, functional dependency, automatic normalization, primary key, spanning tree

Procedia PDF Downloads 328
2549 Bit Error Rate Monitoring for Automatic Bias Control of Quadrature Amplitude Modulators

Authors: Naji Ali Albakay, Abdulrahman Alothaim, Isa Barshushi

Abstract:

The most common quadrature amplitude modulator (QAM) applies two Mach-Zehnder Modulators (MZMs) and one phase shifter to generate high-order modulation formats. The bias of an MZM changes over time due to temperature, vibration, and aging, and this change distorts the generated QAM signal, which leads to a deterioration of bit error rate (BER) performance. Therefore, it is critical to be able to lock the MZM's Q point to the required operating point for good performance. We propose a technique for automatic bias control (ABC) of a QAM transmitter using BER measurements and a gradient descent optimization algorithm. The proposed technique is attractive because it uses the pertinent metric, BER, and compensates for bias drift independently of other system variations such as the laser source output power. The performance and operating principles of the proposed scheme are simulated using the OptiSystem simulation software for 4-QAM and 16-QAM transmitters.

Keywords: automatic bias control, optical fiber communication, optical modulation, optical devices

Procedia PDF Downloads 160
2548 Penguins Search Optimization Algorithm for Chaotic Synchronization System

Authors: Sofiane Bououden, Ilyes Boulkaibet

Abstract:

In terms of the security of the information signal, the meta-heuristic Penguins Search Optimization Algorithm (PeSOA) is applied to synchronize chaotic encryption communications in the case of sensitive dependence on initial conditions in the chaotic generator oscillator. The objective of this paper is to use the PeSOA algorithm to explore the search space with random and iterative processes in order to synchronize the symmetric keys at both transmission and reception. Simulation results show the effectiveness of the PeSOA algorithm in generating the symmetric keys of the encryption process and in synchronizing them.

Keywords: meta-heuristic, PeSOA, chaotic systems, encryption, synchronization optimization

Procedia PDF Downloads 155
2547 The Hyperbolic Smoothing Approach for Automatic Calibration of Rainfall-Runoff Models

Authors: Adilson Elias Xavier, Otto Corrêa Rotunno Filho, Paulo Canedo De Magalhães

Abstract:

This paper addresses the issue of automatic parameter estimation in conceptual rainfall-runoff (CRR) models. Due to threshold structures commonly occurring in CRR models, the associated mathematical optimization problems have the significant characteristic of being strongly non-differentiable. In order to face this enormous task, the resolution method proposed adopts a smoothing strategy using a special C∞ differentiable class function. The final estimation solution is obtained by solving a sequence of differentiable subproblems which gradually approach the original conceptual problem. The use of this technique, called Hyperbolic Smoothing Method (HSM), makes possible the application of the most powerful minimization algorithms, and also allows for the main difficulties presented by the original CRR problem to be overcome. A set of computational experiments is presented for the purpose of illustrating both the reliability and the efficiency of the proposed approach.
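
One widely used form of the hyperbolic smoothing idea, shown here as a generic sketch rather than the paper's exact formulation, replaces the non-differentiable threshold max(y, 0) by the C-infinity function phi(y, tau) = (y + sqrt(y^2 + tau^2)) / 2 and solves a sequence of smooth subproblems while driving tau toward zero:

import numpy as np
from scipy.optimize import minimize

def phi(y, tau):
    """C-infinity approximation of max(y, 0); converges to it as tau -> 0."""
    return 0.5 * (y + np.sqrt(y * y + tau * tau))

# toy non-differentiable objective with a threshold term, standing in for a CRR model's storage threshold
def rough_objective(x):
    return (x - 2.0) ** 2 + 3.0 * max(x - 1.0, 0.0)

def smooth_objective(x, tau):
    return (x[0] - 2.0) ** 2 + 3.0 * phi(x[0] - 1.0, tau)

x0, tau = np.array([5.0]), 1.0
for _ in range(8):                       # solve a sequence of differentiable subproblems
    res = minimize(smooth_objective, x0, args=(tau,), method="BFGS")
    x0, tau = res.x, tau * 0.1           # warm-start the next subproblem with a smaller tau

print("smoothed minimizer:", round(float(x0[0]), 4),
      "original objective:", round(rough_objective(float(x0[0])), 4))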

Keywords: rainfall-runoff models, automatic calibration, hyperbolic smoothing method

Procedia PDF Downloads 104
2546 A New Family of Globally Convergent Conjugate Gradient Methods

Authors: B. Sellami, Y. Laskri, M. Belloufi

Abstract:

Conjugate gradient methods are an important class of methods for unconstrained optimization, especially for large-scale problems, and they have recently been studied extensively. In this paper, a new family of conjugate gradient methods is proposed for unconstrained optimization. This family includes two existing practical nonlinear conjugate gradient methods, produces a descent search direction at every iteration, and converges globally provided that the line search satisfies the Wolfe conditions. Numerical experiments are carried out to test the efficiency of the new method, and the results indicate that it is promising. In addition, the methods related to this family are discussed in a unified way.
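
A compact sketch of a nonlinear conjugate gradient iteration with a Wolfe line search, using the Polak-Ribiere-plus beta as a stand-in; the paper's new family uses its own beta formula, which is not reproduced here.

import numpy as np
from scipy.optimize import line_search

def rosenbrock(x):
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1 - x[0]) ** 2

def rosenbrock_grad(x):
    return np.array([-400.0 * x[0] * (x[1] - x[0] ** 2) - 2 * (1 - x[0]),
                     200.0 * (x[1] - x[0] ** 2)])

def nonlinear_cg(f, grad, x0, max_iter=500, tol=1e-8):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                               # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = line_search(f, grad, x, d, gfk=g)[0]     # step length satisfying the Wolfe conditions
        if alpha is None:                                # line search failed: restart with steepest descent
            d, alpha = -g, 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))   # Polak-Ribiere+ (illustrative choice)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

print(nonlinear_cg(rosenbrock, rosenbrock_grad, [-1.2, 1.0]))   # typically converges toward [1, 1]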

Keywords: conjugate gradient method, global convergence, line search, unconstrained optimization

Procedia PDF Downloads 380
2545 Highly Accurate Target Motion Compensation Using Entropy Function Minimization

Authors: Amin Aghatabar Roodbary, Mohammad Hassan Bastani

Abstract:

One of the defects of stepped frequency radar systems is their sensitivity to target motion. In such systems, target motion causes range cell shift, false peaks, Signal to Noise Ratio (SNR) reduction, and range profile spreading, because the power spectrum of each range cell interferes with adjacent range cells; this distorts the High Resolution Range Profile (HRRP) and disrupts the target recognition process. Therefore, compensation of the effects of the Target Motion Parameters (TMPs) should be employed. In this paper, a method for estimating the TMPs (velocity and acceleration) and consequently eliminating or suppressing their unwanted effects on the HRRP, based on entropy minimization, is proposed. The method is carried out in two major steps: in the first step, a discrete search is performed over the whole acceleration-velocity lattice in a specific interval, seeking a coarse (less accurate) minimum of the entropy function. In the second step, a 1-D search over velocity is done along several constant-acceleration lines in the neighborhood of that minimum, in order to enhance the accuracy of the minimum found in the first step. The provided simulation results demonstrate the effectiveness of the proposed method.
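
A structural sketch of the two-step search; the entropy evaluation below is a hypothetical placeholder for "motion-compensate the stepped-frequency returns with candidate (v, a), form the HRRP, and measure its entropy", since the radar signal model is not reproduced here.

import numpy as np

def profile_entropy(v, a):
    """Hypothetical stand-in for the real entropy evaluation. Here a smooth bowl with a known
    minimum at (v, a) = (12.0, 3.0) is used so the two-step search can be traced."""
    return np.log1p((v - 12.0) ** 2 + 4.0 * (a - 3.0) ** 2)

# Step 1: coarse discrete search over the whole acceleration-velocity lattice
velocities = np.linspace(0.0, 30.0, 31)
accelerations = np.linspace(0.0, 10.0, 11)
grid = [(v, a) for v in velocities for a in accelerations]
v0, a0 = min(grid, key=lambda p: profile_entropy(*p))

# Step 2: refined 1-D search over velocity along a few constant-acceleration lines near the coarse minimum
best = (np.inf, v0, a0)
for a in np.linspace(a0 - 1.0, a0 + 1.0, 5):
    for v in np.linspace(v0 - 1.0, v0 + 1.0, 201):
        e = profile_entropy(v, a)
        if e < best[0]:
            best = (e, v, a)

print(f"estimated TMPs: v = {best[1]:.2f}, a = {best[2]:.2f}")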

Keywords: automatic target recognition (ATR), high resolution range profile (HRRP), motion compensation, stepped frequency waveform technique (SFW), target motion parameters (TMPs)

Procedia PDF Downloads 126
2544 Book Recommendation Using Query Expansion and Information Retrieval Methods

Authors: Ritesh Kumar, Rajendra Pamula

Abstract:

In this paper, we present our contribution to book recommendation. In our experiments, we combine the results of the Sequential Dependence Model (SDM) with the exploitation of book information such as reviews, tags, and ratings; this social information is assigned by users. For this, we used the CLEF-2016 Social Book Search Track Suggestion task. Our proposed method was extensively evaluated on the CLEF-2015 Social Book Search datasets and achieves better performance (nDCG@10) than other state-of-the-art systems. We also recently obtained good performance in CLEF-2016.

Keywords: sequential dependence model, social information, social book search, query expansion

Procedia PDF Downloads 264
2543 Modeling Search-And-Rescue Operations by Autonomous Mobile Robots at Sea

Authors: B. Kriheli, E. Levner, T. C. E. Cheng, C. T. Ng

Abstract:

During the last decades, research interest in planning, scheduling, and control of emergency response operations, especially the rescue and evacuation of people from the dangerous zones of marine accidents, has increased dramatically. Until the survivors (called 'targets') are found and saved, losses or damage may occur, with an extent that depends on the location of the targets and the search duration. The problem is to efficiently search for and detect/rescue the targets as soon as possible with the help of intelligent mobile robots, so as to maximize the number of saved people and/or minimize the search cost under restrictions on the number of saved people within the allowable response time. We consider a special situation in which the autonomous mobile robots (AMRs), e.g., unmanned aerial vehicles and remote-controlled robo-ships, have no operator on board, as they are guided and completely controlled by on-board sensors and computer programs. We construct a mathematical model for the search process in an uncertain environment and provide a new fast algorithm for scheduling the activities of the autonomous robots during search-and-rescue missions after an accident at sea. We presume that in unknown environments, the AMR's search-and-rescue activity is subject to two types of error: (i) a 'false-negative' detection error, in which a target object is not discovered ('overlooked') by the AMR's sensors even though the AMR is in its close neighborhood, and (ii) a 'false-positive' detection error, also known as a 'false alarm', in which a clean place or area is wrongly classified by the AMR's sensors as a correct target. As the general resource-constrained discrete search problem is NP-hard, we restrict our study to finding local-optimal strategies. A specificity of the considered operational research problem, in comparison with the traditional Kadane-De Groot-Stone search models, is that in our model the probability of a successful search outcome depends not only on the cost/time/probability parameters assigned to each individual location but also on parameters characterizing the entire history of (unsuccessful) search before selecting the next location. We provide a fast approximation algorithm for finding the AMR route that adopts a greedy search strategy: in each step, the on-board computer computes a current search effectiveness value for each location in the zone and searches the location with the highest search effectiveness value. Extensive experiments with random and real-life data provide strong evidence in favor of the suggested operations research model and the corresponding algorithm.
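
A small sketch of the greedy step under false-negative ('overlook') errors, using the standard Bayesian posterior update from search theory as a stand-in for the paper's history-dependent effectiveness measure; the cell probabilities, detection probabilities, and costs are invented.

import numpy as np

rng = np.random.default_rng(5)
n_cells = 8
p = rng.dirichlet(np.ones(n_cells))          # prior probability that the target is in each cell
detect = rng.uniform(0.6, 0.95, n_cells)     # 1 - false-negative (overlook) probability per cell
cost = rng.uniform(1.0, 3.0, n_cells)        # time/fuel cost of searching each cell once

target = rng.choice(n_cells, p=p)
total_cost, found = 0.0, False

for step in range(30):
    effectiveness = p * detect / cost        # expected detection per unit cost, given the search history
    i = int(np.argmax(effectiveness))        # greedy choice of the next cell to search
    total_cost += cost[i]
    if target == i and rng.uniform() < detect[i]:
        found = True
        print(f"target found in cell {i} after {step + 1} looks, cost {total_cost:.1f}")
        break
    # unsuccessful look: Bayesian update of all cell probabilities (false-negative case)
    miss = 1.0 - p[i] * detect[i]
    p[i] *= (1.0 - detect[i])
    p /= miss

if not found:
    print(f"search budget exhausted, cost {total_cost:.1f}")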

Keywords: disaster management, intelligent robots, scheduling algorithm, search-and-rescue at sea

Procedia PDF Downloads 145
2542 Image Based Landing Solutions for Large Passenger Aircraft

Authors: Thierry Sammour Sawaya, Heikki Deschacht

Abstract:

In commercial aircraft operations, almost half of the accidents happen during approach or landing phases. Automatic guidance and automatic landings have proven to bring significant safety value added for this challenging landing phase. This is why Airbus and ScioTeq have decided to work together to explore the capability of image-based landing solutions as additional landing aids to further expand the possibility to perform automatic approach and landing to runways where the current guiding systems are either not fitted or not optimum. Current systems for automated landing often depend on radio signals provided by airport ground infrastructure on the airport or satellite coverage. In addition, these radio signals may not always be available with the integrity and performance required for safe automatic landing. Being independent from these radio signals would widen the operations possibilities and increase the number of automated landings. Airbus and ScioTeq are joining their expertise in the field of Computer Vision in the European Program called Clean Sky 2 Large Passenger Aircraft, in which they are leading the IMBALS (IMage BAsed Landing Solutions) project. The ultimate goal of this project is to demonstrate, develop, validate and verify a certifiable automatic landing system guiding an airplane during the approach and landing phases based on an onboard camera system capturing images, enabling automatic landing independent from radio signals and without precision instrument for landing. In the frame of this project, ScioTeq is responsible for the development of the Image Processing Platform (IPP), while Airbus is responsible for defining the functional and system requirements as well as the testing and integration of the developed equipment in a Large Passenger Aircraft representative environment. The aim of this paper will be to describe the system as well as the associated methods and tools developed for validation and verification.

Keywords: aircraft landing system, aircraft safety, autoland, avionic system, computer vision, image processing

Procedia PDF Downloads 63
2541 Cross Site Scripting (XSS) Attack and Automatic Detection Technology Research

Authors: Tao Feng, Wei-Wei Zhang, Chang-Ming Ding

Abstract:

Cross-site scripting (XSS) is currently one of the most popular web attack methods, and also one of the riskiest. With the popularity of JavaScript, the scope of cross-site scripting attacks has gradually expanded. However, web application developers tend to focus only on functional testing and lack awareness of XSS, so online web projects contain many XSS vulnerabilities. In this paper, various XSS attack techniques are analyzed, and a method to detect them automatically is proposed. The results of vulnerability detection are easy to check when the method is run as a plug-in.

Keywords: XSS, no target attack platform, automatic detection, XSS detection

Procedia PDF Downloads 373
2540 Harmony Search-Based K-Coverage Enhancement in Wireless Sensor Networks

Authors: Shaimaa M. Mohamed, Haitham S. Hamza, Imane A. Saroit

Abstract:

Many wireless sensor network applications require K-coverage of the monitored area. In this paper, we propose a harmony search based algorithm that is scalable in terms of execution time, the K-Coverage Enhancement Algorithm (KCEA), which attempts to enhance the initial coverage and achieve the required K-coverage degree for a specific application efficiently. Simulation results show that the proposed algorithm achieves a coverage improvement of 5.34% compared to K-Coverage Rate Deployment (K-CRD), which achieves 1.31%, when deploying one additional sensor. Moreover, the proposed algorithm is more time efficient.
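
For illustration, a minimal harmony search sketch for placing one additional sensor so that the number of grid points still below coverage degree K is minimized; the deployment, sensing radius, and harmony search settings are invented, and the KCEA details may differ.

import numpy as np

rng = np.random.default_rng(8)
K, radius = 2, 0.35
existing = rng.uniform(0, 1, (12, 2))                 # invented initial sensor deployment in a unit square
gx, gy = np.meshgrid(np.linspace(0, 1, 25), np.linspace(0, 1, 25))
grid = np.column_stack([gx.ravel(), gy.ravel()])      # monitored points

def deficiency(new_sensor):
    """Number of grid points whose coverage degree stays below K after adding one sensor."""
    sensors = np.vstack([existing, new_sensor])
    dists = np.linalg.norm(grid[:, None, :] - sensors[None, :, :], axis=2)
    degree = (dists <= radius).sum(axis=1)
    return int((degree < K).sum())

# harmony search over the (x, y) position of the additional sensor
hms, hmcr, par, bw, iters = 10, 0.9, 0.3, 0.05, 300
memory = rng.uniform(0, 1, (hms, 2))
fitness = np.array([deficiency(h) for h in memory])

for _ in range(iters):
    new = np.empty(2)
    for d in range(2):
        if rng.uniform() < hmcr:                      # pick a value from harmony memory...
            new[d] = memory[rng.integers(hms), d]
            if rng.uniform() < par:                   # ...and possibly adjust its pitch
                new[d] += rng.uniform(-bw, bw)
        else:                                         # or improvise a random value
            new[d] = rng.uniform(0, 1)
    new = np.clip(new, 0, 1)
    f = deficiency(new)
    worst = fitness.argmax()
    if f < fitness[worst]:                            # replace the worst harmony if improved
        memory[worst], fitness[worst] = new, f

best = memory[fitness.argmin()]
print(f"additional sensor at ({best[0]:.2f}, {best[1]:.2f}), uncovered points: {fitness.min()}")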

Keywords: Wireless Sensor Networks (WSN), harmony search algorithms, K-Coverage, Mobile WSN

Procedia PDF Downloads 495
2539 Automatic Censoring in K-Distribution for Multiple Targets Situations

Authors: Naime Boudemagh, Zoheir Hammoudi

Abstract:

Parameter estimation for the K-distribution is an essential part of radar detection. The presence of interfering targets in the reference cells causes a decrease in detection performance: in such situations, the estimates of the shape and scale parameters are far from their actual values. To avoid interfering targets, we propose an Automatic Censoring (AC) algorithm for radar interfering targets under the K-distribution. The censoring technique used in this work offers good discrimination between homogeneous and non-homogeneous environments. The homogeneous population is then used to estimate the unknown parameters by the classical Method of Moments (MOM). The AC algorithm needs no prior information about the clutter parameters, nor does it require the number or the positions of the interfering targets. The accuracy of the parameter estimates obtained by this algorithm is validated and compared against various actual values of the shape parameter using Monte Carlo simulations; the latter show that the probability of censoring in multiple-target situations is in good agreement.
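
For orientation only, and assuming one common parameterization of the K-distributed amplitude, the method-of-moments shape estimate follows from the second- and fourth-order sample moments, since E[A^4]/E[A^2]^2 = 2(1 + 1/v) for shape parameter v:

import numpy as np

def k_amplitude_samples(nu, b, size, rng):
    """K-distributed amplitude built as a gamma-modulated Rayleigh (one common construction)."""
    texture = rng.gamma(shape=nu, scale=1.0 / nu, size=size)     # mean-1 gamma texture
    speckle = rng.rayleigh(scale=np.sqrt(0.5), size=size)        # unit-power Rayleigh speckle
    return (2.0 / b) * np.sqrt(nu * texture) * speckle

def mom_shape_estimate(a):
    """Shape estimate from the moment ratio E[A^4] / E[A^2]^2 = 2(1 + 1/nu)."""
    m2, m4 = np.mean(a ** 2), np.mean(a ** 4)
    return 1.0 / (m4 / (2.0 * m2 ** 2) - 1.0)

rng = np.random.default_rng(6)
samples = k_amplitude_samples(nu=2.0, b=1.0, size=200_000, rng=rng)
print("estimated shape:", round(mom_shape_estimate(samples), 2))  # should be close to the true value 2.0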

Keywords: parameters estimation, method of moments, automatic censoring, K distribution

Procedia PDF Downloads 349
2538 Automatic Flood Prediction Using Rainfall Runoff Model in Moravian-Silesian Region

Authors: B. Sir, M. Podhoranyi, S. Kuchar, T. Kocyan

Abstract:

Rainfall-runoff models play an important role in hydrological predictions. However, the model is only one part of the process of creating a flood prediction. The aim of this paper is to show the process behind a successful prediction for a flood event (May 15-18, 2014). The prediction was performed with the rainfall-runoff model HEC-HMS, one of the models computed within the Floreon+ system. The paper briefly evaluates the results of the automatic hydrologic prediction for the river Olše catchment and its gages Český Těšín and Věřňovice.

Keywords: flood, HEC-HMS, prediction, rainfall, runoff

Procedia PDF Downloads 363
2537 Determination of Neighbor Node in Consideration of the Imaging Range of Cameras in Automatic Human Tracking System

Authors: Kozo Tanigawa, Tappei Yotsumoto, Kenichi Takahashi, Takao Kawamura, Kazunori Sugahara

Abstract:

An automatic human tracking system using mobile agent technology is realized, in which a mobile agent moves in accordance with the migration of a target person. In this paper, we propose a method for determining the neighbor node in consideration of the imaging range of the cameras.

Keywords: human tracking, mobile agent, Pan/Tilt/Zoom, neighbor relation

Procedia PDF Downloads 475
2536 Elitist Self-Adaptive Step-Size Search in Optimum Sizing of Steel Structures

Authors: Oğuzhan Hasançebi, Saeid Kazemzadeh Azad

Abstract:

This paper covers the application of an elitist self-adaptive step-size search (ESASS) to the optimum design of steel skeletal structures. In the ESASS, two approaches are considered for improving the convergence accuracy as well as the computational efficiency of the original technique, namely the so-called self-adaptive step-size search (SASS). Firstly, an additional randomness is incorporated into the sampling step of the technique to preserve the exploration capability of the algorithm during the optimization. Moreover, an adaptive sampling scheme is introduced to improve the quality of final solutions. Secondly, the computational efficiency of the technique is accelerated by avoiding unnecessary analyses during the optimization process using an upper bound strategy. The numerical results demonstrate the usefulness of the ESASS in sizing optimization problems of steel truss and frame structures.

Keywords: structural design optimization, optimal sizing, metaheuristics, self-adaptive step-size search, steel trusses, steel frames

Procedia PDF Downloads 340
2535 Automatic Early Breast Cancer Segmentation Enhancement by Image Analysis and Hough Transform

Authors: David Jurado, Carlos Ávila

Abstract:

Detection of early signs of breast cancer development is crucial for quickly diagnosing the disease and defining adequate treatment to increase the patient's survival probability. Computer Aided Detection systems (CADs), along with modern data techniques such as Machine Learning (ML) and Neural Networks (NN), have shown an overall improvement in digital mammography cancer diagnosis, reducing the false positive and false negative rates and becoming important tools for the diagnostic evaluations performed by specialized radiologists. However, ML and NN-based algorithms rely on datasets that might bring issues to the segmentation tasks. In the present work, an automatic segmentation and detection algorithm is described. This algorithm uses image processing techniques along with the Hough transform to automatically identify microcalcifications that are highly correlated with breast cancer development in the early stages. Along with image processing, automatic segmentation of high-contrast objects is done using edge extraction and the circle Hough transform. This provides the geometrical features needed for an automatic mask design, which extracts statistical features of the regions of interest. The results of this study show the potential of this tool for further diagnostics and classification of mammographic images, owing to its low sensitivity to noisy images and low-contrast mammographies.
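
An indicative sketch of the edge-extraction plus circle Hough transform step using scikit-image on a synthetic image; a real pipeline would run on mammograms and add the statistical-feature stage described in the abstract.

import numpy as np
from skimage.draw import disk
from skimage.feature import canny
from skimage.transform import hough_circle, hough_circle_peaks

# synthetic 200x200 image with two bright, roughly circular high-contrast spots
image = np.zeros((200, 200))
for center, radius in [((60, 70), 6), ((140, 120), 9)]:
    rr, cc = disk(center, radius, shape=image.shape)
    image[rr, cc] = 1.0
image += 0.05 * np.random.default_rng(7).normal(size=image.shape)   # mild background noise

edges = canny(image, sigma=2.0, low_threshold=0.05, high_threshold=0.1)  # edge extraction
radii = np.arange(4, 12)                             # candidate object radii (pixels)
hspaces = hough_circle(edges, radii)                 # circle Hough transform accumulator
accums, cx, cy, found_radii = hough_circle_peaks(hspaces, radii, total_num_peaks=2)

for x, y, r in zip(cx, cy, found_radii):
    print(f"candidate region: center=({y}, {x}), radius={r}")  # geometry for the automatic mask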

Keywords: breast cancer, segmentation, X-ray imaging, hough transform, image analysis

Procedia PDF Downloads 44
2534 Automatic Vowel and Consonant's Target Formant Frequency Detection

Authors: Othmane Bouferroum, Malika Boudraa

Abstract:

In this study, a dual exponential model for CV formant transitions is derived from the locus theory of speech perception. Then, an algorithm for automatic detection of vowel and consonant target formant frequencies is developed and tested on real speech. The results show that vowels and consonants are detected through transitions rather than through their small stable portions. Vowel reduction is also clearly observed in our data. These results are confirmed by observations made in perceptual experiments in the literature.

Keywords: acoustic invariance, coarticulation, formant transition, locus equation

Procedia PDF Downloads 236