Search results for: sponsored search
1873 A Hybrid Heuristic for the Team Orienteering Problem
Authors: Adel Bouchakhchoukha, Hakim Akeb
Abstract:
In this work, we propose a hybrid heuristic to solve the Team Orienteering Problem (TOP). Given a set of points (or customers), each with an associated score (profit or benefit), and a team with a fixed number of members, the problem is to visit a subset of points so as to maximize the total collected score. Each member performs a tour that starts at the start point, visits distinct customers, and terminates at the arrival point. In addition, each point is visited at most once, and the total time of each tour cannot exceed a given value. The proposed heuristic combines beam search with a local optimization strategy. The algorithm was tested on several sets of instances, and encouraging results were obtained.
Keywords: team orienteering problem, vehicle routing, beam search, local search
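As a point of reference for the problem structure, the sketch below builds feasible tours with a simple greedy insertion rule; it is a minimal baseline under assumed Euclidean travel times, not the authors' beam-search hybrid, and the coordinates, scores, and time budget are illustrative.

```python
import math

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def greedy_top(points, scores, start, end, members, t_max):
    """Greedy constructive heuristic for the TOP: each tour repeatedly adds
    the unvisited point with the best score-per-extra-time ratio, as long as
    the tour can still reach the arrival point within the time budget."""
    unvisited = set(range(len(points)))
    tours = []
    for _ in range(members):
        tour, pos, used = [], start, 0.0
        while True:
            best, best_ratio = None, 0.0
            for i in unvisited:
                extra = dist(pos, points[i])
                if used + extra + dist(points[i], end) <= t_max:
                    ratio = scores[i] / (extra + 1e-9)
                    if ratio > best_ratio:
                        best, best_ratio = i, ratio
            if best is None:
                break
            used += dist(pos, points[best])
            pos = points[best]
            tour.append(best)
            unvisited.remove(best)
        tours.append(tour)
    return tours

# illustrative instance: 6 customers, 2 team members, time budget 12
points = [(1, 1), (2, 3), (4, 1), (5, 4), (6, 2), (3, 5)]
scores = [10, 25, 15, 30, 20, 18]
print(greedy_top(points, scores, start=(0, 0), end=(7, 3), members=2, t_max=12.0))
```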
Procedia PDF Downloads 418
1872 Improvements in Double Q-Learning for Anomalous Radiation Source Searching
Authors: Bo-Bin Xiao, Chia-Yi Liu
Abstract:
In the task of searching for anomalous radiation sources, personnel carrying radiation detectors may be exposed to unnecessary radiation risk, so automated search by machines becomes a necessary development. This research uses several deep reinforcement learning algorithms, namely double Q-learning, dueling networks, and NoisyNet, to search for radiation sources. The simulation environment is a 10×10 grid with one shielding wall placed in it, and the AI models are developed by training for 1 million episodes. In each training episode, the radiation source position, the radiation source intensity, the agent position, the shielding wall position, and the shielding wall length are all set randomly. The three algorithms are applied to train AI models in four environments in which the shielding wall is a full-shielding wall, a lead wall, a concrete wall, or a lead or concrete wall appearing randomly. The 12 best-performing AI models are selected by observing the reward value during training and are evaluated by comparing them with a gradient search algorithm. The results show that the performance of the AI models, regardless of algorithm, is far better than that of the gradient search algorithm. In addition, as the simulation environment becomes more complex, the AI model that combines Double DQN with the Dueling and NoisyNet algorithms performs better.
Keywords: double Q learning, dueling network, NoisyNet, source searching
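To make the core update concrete, here is a tabular sketch of the double Q-learning rule that Double DQN builds on; the deep network, dueling architecture, and NoisyNet exploration used in the paper are not reproduced, and the grid transition shown is illustrative.

```python
import random
from collections import defaultdict

def double_q_update(qa, qb, s, a, r, s_next, actions, alpha=0.1, gamma=0.95):
    """One tabular double Q-learning update: one table chooses the greedy
    next action while the other evaluates it, reducing the overestimation
    bias of standard Q-learning."""
    if random.random() < 0.5:
        best = max(actions, key=lambda a2: qa[(s_next, a2)])
        qa[(s, a)] += alpha * (r + gamma * qb[(s_next, best)] - qa[(s, a)])
    else:
        best = max(actions, key=lambda a2: qb[(s_next, a2)])
        qb[(s, a)] += alpha * (r + gamma * qa[(s_next, best)] - qb[(s, a)])

qa, qb = defaultdict(float), defaultdict(float)
actions = ["up", "down", "left", "right"]
# e.g. one transition on a 10x10 grid, state = (row, col), step cost -1
double_q_update(qa, qb, (0, 0), "right", -1.0, (0, 1), actions)
```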
Procedia PDF Downloads 114
1871 A New Conjugate Gradient Method with Guaranteed Descent
Authors: B. Sellami, M. Belloufi
Abstract:
Conjugate gradient methods are an important class of methods for unconstrained optimization, especially for large-scale problems, and they have been studied extensively in recent years. In this paper, we propose a new two-parameter family of conjugate gradient methods for unconstrained optimization. The two-parameter family not only includes three existing practical nonlinear conjugate gradient methods but also contains other families of conjugate gradient methods as subfamilies. The two-parameter family combined with the Wolfe line search is shown to ensure the descent property of each search direction. Some general convergence results are also established for the two-parameter family. The numerical results show that the method is efficient on the given test problems. In addition, the methods related to this family are discussed in a unified way.
Keywords: unconstrained optimization, conjugate gradient method, line search, global convergence
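For readers unfamiliar with the general scheme, the sketch below implements a generic nonlinear conjugate gradient iteration; it uses a Polak-Ribière+ update and Armijo backtracking as stand-ins, since the paper's two-parameter direction formula and Wolfe line search are not reproduced here.

```python
import numpy as np

def nonlinear_cg(f, grad, x0, iters=500, tol=1e-6):
    """Generic nonlinear conjugate gradient with a Polak-Ribiere+ beta and
    Armijo backtracking (a simplified stand-in for a Wolfe line search)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        if g.dot(d) >= 0:          # safeguard: restart with steepest descent
            d = -g
        alpha = 1.0
        for _ in range(50):        # Armijo backtracking
            if f(x + alpha * d) <= f(x) + 1e-4 * alpha * g.dot(d):
                break
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = max(0.0, g_new.dot(g_new - g) / g.dot(g))   # PR+ formula
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# example: minimize the Rosenbrock function
f = lambda v: (1 - v[0])**2 + 100 * (v[1] - v[0]**2)**2
grad = lambda v: np.array([-2 * (1 - v[0]) - 400 * v[0] * (v[1] - v[0]**2),
                           200 * (v[1] - v[0]**2)])
print(nonlinear_cg(f, grad, [-1.2, 1.0]))
```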
Procedia PDF Downloads 454
1870 The MoEDAL-MAPP* Experiment - Expanding the Discovery Horizon of the Large Hadron Collider
Authors: James Pinfold
Abstract:
The MoEDAL (Monopole and Exotics Detector at the LHC) experiment, deployed at IP8 on the Large Hadron Collider ring, was the first dedicated search experiment to take data at the Large Hadron Collider (LHC) in 2010. It was designed to search for Highly Ionizing Particle (HIP) avatars of new physics such as magnetic monopoles, dyons, Q-balls, multiply charged particles, massive slowly moving charged particles, and long-lived massive charged SUSY particles. We shall report on our search at the LHC's Run-2 for magnetic monopoles and dyons produced in p-p collisions and photon fusion. In more detail, we will report our most recent result in this arena: the search for magnetic monopoles produced via the Schwinger mechanism in Pb-Pb collisions. The MoEDAL detector, originally the first dedicated search detector at the LHC, is being reinstalled for the LHC's Run-3 to continue the search for electrically and magnetically charged HIPs with enhanced instantaneous luminosity, improved detector efficiency, and a factor of ten lower thresholds for HIPs. As part of this effort, we will search for massive, long-lived, singly and multiply charged particles from various scenarios for which MoEDAL has a competitive sensitivity. An upgrade to MoEDAL, the MoEDAL Apparatus for Penetrating Particles (MAPP), is now the LHC's newest detector. The MAPP detector, positioned in UA83, expands the physics reach of MoEDAL to include sensitivity to feebly charged particles with charge, or effective charge, as low as 10⁻³ e (where e is the electron charge). Also, in conjunction with MoEDAL's trapping detector, the MAPP detector gives us a unique sensitivity to extremely long-lived charged particles. MAPP also has some sensitivity to long-lived neutral particles. The addition of an Outrigger detector for MAPP-1, to increase its acceptance for more massive milli-charged particles, is currently at the Technical Proposal stage. Additionally, we will briefly report on the plans for the MAPP-2 upgrade to the MoEDAL-MAPP experiment for the High Luminosity LHC (HL-LHC). This phase of the experiment is designed to maximize MoEDAL-MAPP's sensitivity to very long-lived neutral messengers of physics beyond the Standard Model. We envisage this detector being deployed in the UGC1 gallery near IP8.
Keywords: LHC, beyond the standard model, dedicated search experiment, highly ionizing particles, long-lived particles, milli-charged particles
Procedia PDF Downloads 68
1869 Searching for Health-Related Information on the Internet: A Case Study on Young Adults
Authors: Dana Weimann Saks
Abstract:
This study aimed to examine the use of the internet as a source of health-related information (HRI), as well as the change in attitudes following an online search for HRI. The study sample included 88 participants, randomly divided into two experimental groups. One group was given the name of an unfamiliar disease and told to search for information about it using various search engines, and the second was given a text about the disease from a credible scientific source. The study findings show that a large percentage of participants used the internet as a source of HRI. Likewise, no differences were found in the extent to which the internet was used as a source of HRI when demographics were compared. Those who searched for the HRI on the internet had more negative opinions and believed the symptoms of the disease to be worse than the average opinion among those who obtained the information about the disease from a credible scientific source. The internet clearly influences the participants' beliefs, regardless of demographic differences.
Keywords: health-related information, internet, young adults, HRI
Procedia PDF Downloads 130
1868 Hyperspectral Image Classification Using Tree Search Algorithm
Authors: Shreya Pare, Parvin Akhter
Abstract:
Remote sensing image classification becomes a very challenging task owing to the high dimensionality of hyperspectral images. Pixel-wise classification methods fail to take into account the spatial structure information of an image. Therefore, to improve classification performance, spatial information can be integrated into the classification process. In this paper, a multilevel thresholding algorithm based on a modified fuzzy entropy (MFE) function is used to perform the segmentation of hyperspectral images. The fuzzy parameters of the MFE function are optimized by using a new meta-heuristic based on the tree-search algorithm. The segmented image is classified by a large distribution machine (LDM) classifier. Experimental results are shown on a hyperspectral image dataset. The experimental outputs indicate that the proposed technique (MFE-TSA-LDM) achieves much higher classification accuracy for hyperspectral images when compared to state-of-the-art classification techniques. The proposed algorithm provides accurate segmentation and classification maps, thus becoming more suitable for image classification with large spatial structures.
Keywords: classification, hyperspectral images, large distribution margin, modified fuzzy entropy function, multilevel thresholding, tree search algorithm, hyperspectral image classification using tree search algorithm
Procedia PDF Downloads 180
1867 Improved Multi-Objective Particle Swarm Optimization Applied to Design Problem
Authors: Kapse Swapnil, K. Shankar
Abstract:
Aiming at optimizing the weight and deflection of a cantilever beam subject to maximum stress and maximum deflection constraints, Multi-objective Particle Swarm Optimization (MOPSO) with a utopia-point-based local search is implemented. The utopia point is used to guide the search towards the Pareto-optimal set. The elite candidates obtained during the iterations are stored in an archive according to non-dominated sorting, and the archive is truncated based on least crowding distance. Local search is also performed on the elite candidates, and the most diverse particle is selected as the global best. The method is implemented on standard test functions, and it is observed that the improved algorithm gives better convergence and diversity than NSGA-II in fewer iterations. Implementation on a practical structural problem shows that, within 5 to 6 iterations, the improved algorithm converges with better diversity, as evidenced by average improvements for the cantilever beam of 0.78% in weight and 9.28% in deflection compared to NSGA-II.
Keywords: Utopia point, multi-objective particle swarm optimization, local search, cantilever beam
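The archive truncation step mentioned above relies on the standard crowding-distance measure; a minimal sketch of that measure, with an illustrative two-objective archive, is given below (the MOPSO velocity update and utopia-point local search are not shown).

```python
def crowding_distance(front):
    """Crowding distance over a list of objective vectors; points with the
    smallest distance are the first candidates for archive truncation."""
    n = len(front)
    if n == 0:
        return []
    m = len(front[0])
    dist = [0.0] * n
    for k in range(m):
        order = sorted(range(n), key=lambda i: front[i][k])
        lo, hi = front[order[0]][k], front[order[-1]][k]
        dist[order[0]] = dist[order[-1]] = float("inf")
        span = (hi - lo) or 1.0
        for j in range(1, n - 1):
            dist[order[j]] += (front[order[j + 1]][k] - front[order[j - 1]][k]) / span
    return dist

# archive truncation: drop the most crowded member (smallest distance)
archive = [(1.0, 9.0), (2.0, 7.0), (2.1, 6.9), (4.0, 3.0), (9.0, 1.0)]
d = crowding_distance(archive)
archive.pop(d.index(min(d)))
```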
Procedia PDF Downloads 520
1866 Satellite Imagery Classification Based on Deep Convolution Network
Authors: Zhong Ma, Zhuping Wang, Congxin Liu, Xiangzeng Liu
Abstract:
Satellite imagery classification is a challenging problem with many practical applications. In this paper, we designed a deep convolution neural network (DCNN) to classify satellite imagery. The contributions of this paper are twofold. First, to cope with the large-scale variance in satellite images, we introduced the inception module, which has multiple filters of different sizes at the same level, as the building block of our DCNN model. Second, we proposed a genetic algorithm-based method to efficiently search for the best hyper-parameters of the DCNN in a large search space. The proposed method is evaluated on a benchmark database. The results of the proposed hyper-parameter search method show that it guides the search towards better regions of the parameter space. Based on the hyper-parameters found, we built our DCNN models and evaluated their performance on satellite imagery classification; the results show that the classification accuracy of the proposed models outperforms the state-of-the-art method.
Keywords: satellite imagery classification, deep convolution network, genetic algorithm, hyper-parameter optimization
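The hyper-parameter search loop can be sketched in a few lines; below is a hedged illustration in which the search space and the fitness function are placeholders (a real run would train the inception-based DCNN on the benchmark and return its validation accuracy).

```python
import random

SEARCH_SPACE = {                     # illustrative hyper-parameters, not the paper's
    "learning_rate": [1e-4, 3e-4, 1e-3, 3e-3],
    "batch_size": [16, 32, 64, 128],
    "inception_blocks": [2, 3, 4, 5],
}

def evaluate(cfg):
    """Placeholder fitness; in the paper this would be the validation
    accuracy of a DCNN trained with the configuration `cfg`."""
    return random.random()

def crossover(a, b):
    return {k: random.choice([a[k], b[k]]) for k in a}

def mutate(cfg, rate=0.2):
    return {k: (random.choice(v) if random.random() < rate else cfg[k])
            for k, v in SEARCH_SPACE.items()}

def ga_search(pop_size=10, generations=5):
    pop = [{k: random.choice(v) for k, v in SEARCH_SPACE.items()}
           for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=evaluate, reverse=True)
        parents = scored[: pop_size // 2]          # truncation selection
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=evaluate)

print(ga_search())
```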
Procedia PDF Downloads 302
1865 Searchable Encryption in Cloud Storage
Authors: Ren Junn Hwang, Chung-Chien Lu, Jain-Shing Wu
Abstract:
Cloud outsourced storage is one of the important services in cloud computing. Cloud users upload data to cloud servers to reduce the cost of managing data and maintaining hardware and software. To ensure data confidentiality, users can encrypt their files before uploading them to a cloud system. However, retrieving exactly the target file from among the encrypted files is difficult for the cloud server. This study proposes a protocol for performing multi-keyword searches over encrypted cloud data by applying k-nearest neighbor technology. The protocol ranks the relevance scores of encrypted files and keywords and prevents cloud servers from learning the search keywords submitted by a cloud user. To reduce file transfer communication costs, the cloud server returns encrypted files in order of relevance. Moreover, when a cloud user inputs a misspelled keyword and the number of wrong letters does not exceed a given threshold, the user can still retrieve the target files from the cloud server. In addition, the proposed scheme satisfies the security requirements for outsourced data storage.
Keywords: fault-tolerance search, multi-keywords search, outsource storage, ranked search, searchable encryption
Procedia PDF Downloads 383
1864 MIOM: A Mixed-Initiative Operational Model for Robots in Urban Search and Rescue
Authors: Mario Gianni, Federico Nardi, Federico Ferri, Filippo Cantucci, Manuel A. Ruiz Garcia, Karthik Pushparaj, Fiora Pirri
Abstract:
In this paper, we describe a Mixed-Initiative Operational Model (MIOM) which directly intervenes on the state of the functionalities embedded into a robot for Urban Search & Rescue (USAR) domain applications. MIOM extends the reasoning capabilities of the vehicle, i.e., mapping, path planning, visual perception and trajectory tracking, with operator knowledge. Especially in USAR scenarios, this coupled initiative has the main advantage of enhancing the overall performance of a rescue mission. In-field experiments with rescue responders have been carried out to evaluate the effectiveness of this operational model.
Keywords: mixed-initiative planning and control, operator control interfaces for rescue robotics, situation awareness, urban search, rescue robotics
Procedia PDF Downloads 376
1863 An Enhanced Particle Swarm Optimization Algorithm for Multiobjective Problems
Authors: Houda Abadlia, Nadia Smairi, Khaled Ghedira
Abstract:
Multiobjective Particle Swarm Optimization (MOPSO) has shown effective performance in solving test functions and real-world optimization problems. However, this method suffers from premature convergence, which may lead to a lack of diversity. In order to improve its performance, this paper presents a hybrid approach that embeds MOPSO into an island model and integrates a local search technique, Variable Neighborhood Search, to enhance the diversity of the swarm. Experiments on two series of test functions have shown the effectiveness of the proposed approach. A comparison with other evolutionary algorithms shows that the proposed approach performs well in solving multiobjective optimization problems.
Keywords: particle swarm optimization, migration, variable neighborhood search, multiobjective optimization
Procedia PDF Downloads 168
1862 Companies and Transplant Tourists to China
Authors: Pavel Porubiak, Lukas Kudlacek
Abstract:
Introduction: Transplant tourism is a controversial method of obtaining an organ, all the more so for a country such as China, where sources of evidence point to the possibility of organs being harvested illegally. This research aimed to list the individual countries these tourists come from and to identify which medical companies sell transplant-related products there, using China as an example.
Materials and methods: The methodology of a scoping study was used for both parts of the research. The countries from which transplant tourists come to China were identified by a search through existing medical studies in the NCBI PubMed database, listed under the keyword 'transplantation in China'. The search was not limited by any other criteria, but only the studies available for free, directly on PubMed or from a linked source, were used. Other research studies on this topic were considered as well. The companies were identified through multiple methods. The first was an online search focused on medical companies and their products. The Bloomberg service, used by stock brokers worldwide, was then used to identify the revenue of these companies in individual countries, where data were available, as well as their business presence in China. A search through the U.S. Securities and Exchange Commission was done in the same way. A search on the Chinese internet was also carried out, and to obtain more results, a second online search was done as well.
Results and discussion: The extensive search identified 14 countries with transplant tourists to China. The search for similar studies or reports identified an additional six countries. The companies identified by our research amounted to 20. Eight of them supply China with organ preservation products, of which one is just trying to enter the Chinese market; six supply immunosuppressive drugs, four transplant diagnostics, one medical robots that Chinese doctors also use for transplantation, and another one is trying to enter the Chinese market with a consumable-type product also related to transplantation.
Conclusion: The question of the ethicality of transplant tourism may be very pressing since, as the research shows, the sheer number of countries sourcing transplant tourists to another country amounts to 20. The identified companies face risks due to the nature of the transplantation business in China, as officially executed prisoners are used as sources, and widely cited pieces of evidence point to illegal organ harvesting. Similar risks and ethical questions are also relevant to the countries sourcing transplant tourists to China.
Keywords: China, illegal organ harvesting, transplant tourism, organ harvesting technology
Procedia PDF Downloads 134
1861 A Constrained Neural Network Based Variable Neighborhood Search for the Multi-Objective Dynamic Flexible Job Shop Scheduling Problems
Authors: Aydin Teymourifar, Gurkan Ozturk, Ozan Bahadir
Abstract:
In this paper, a new neural network based variable neighborhood search is proposed for multi-objective dynamic flexible job shop scheduling problems. The neural network handles the problem's constraints to prevent infeasible solutions, while the Variable Neighborhood Search (VNS) applies moves based on the critical block concept to improve the solutions. Two approaches are used for managing the constraints: in the first approach, infeasible solutions are modified according to the constraints after the moves are applied, while in the second one, infeasible moves are prevented. Several neighborhood structures from the literature, with some modifications, as well as new structures, are used in the VNS. The suggested neighborhoods are more systematically defined and easier to implement. The comparison is based on a multi-objective flexible job shop scheduling problem that is dynamic because of the jobs' different release times and machine breakdowns. The results show that the presented method performs better than the VNS variants selected from the literature for comparison.
Keywords: constrained optimization, neural network, variable neighborhood search, flexible job shop scheduling, dynamic multi-objective optimization
Procedia PDF Downloads 347
1860 Global Convergence of a Modified Three-Term Conjugate Gradient Algorithms
Authors: Belloufi Mohammed, Sellami Badreddine
Abstract:
This paper deals with a new nonlinear modified three-term conjugate gradient algorithm for solving large-scale unconstrained optimization problems. The search direction of the algorithms in this class has three terms and is computed as a modification of the classical conjugate gradient algorithms so as to satisfy both the descent and the conjugacy conditions. An example of a three-term conjugate gradient algorithm from this class, obtained as a modification of the classical and well-known Hestenes-Stiefel algorithm or of the CG_DESCENT algorithm of Hager and Zhang, and satisfying both the descent and the conjugacy conditions, is presented. Under mild conditions, we prove that the modified three-term conjugate gradient algorithm with a Wolfe-type line search is globally convergent. Preliminary numerical results show that the proposed method is very promising.
Keywords: unconstrained optimization, three-term conjugate gradient, sufficient descent property, line search
Procedia PDF Downloads 375
1859 Synthesis of Dispersion-Compensating Triangular Lattice Index-Guiding Photonic Crystal Fibers Using the Directed Tabu Search Method
Authors: F. Karim
Abstract:
In this paper, triangular lattice index-guiding photonic crystal fibers (PCFs) are synthesized to compensate for the chromatic dispersion of a single-mode fiber (SMF-28) over an 80 km optical link operating at 1.55 µm, using the directed tabu search algorithm. The hole-to-hole distance, circular air-hole diameter, solid-core diameter, ring number, and PCF length parameters are optimized for this purpose. Three synthesized PCFs with different physical parameters are compared in terms of their objective function values, residual dispersions, and compensation ratios.
Keywords: triangular lattice index-guiding photonic crystal fiber, dispersion compensation, directed tabu search, synthesis
Procedia PDF Downloads 432
1858 Particle Filter State Estimation Algorithm Based on Improved Artificial Bee Colony Algorithm
Authors: Guangyuan Zhao, Nan Huang, Xuesong Han, Xu Huang
Abstract:
In order to solve the problem of sample dilution in the traditional particle filter algorithm and achieve accurate state estimation in a nonlinear system, a particle filter method based on an improved artificial bee colony (ABC) algorithm is proposed. The algorithm simulates the foraging and optimization process of bees and moves particles toward the high-likelihood region of the posterior probability, improving the rationality of the particle distribution. An opposition-based learning (OBL) strategy is introduced to optimize the initial population of the artificial bee colony algorithm. A convergence factor is introduced into the neighborhood search strategy to limit the search range and improve the convergence speed. Finally, the crossover and mutation operations of the genetic algorithm are introduced into the search mechanism of the onlooker bees, which allows the algorithm to escape local extrema quickly and continue searching for the global extremum, improving its optimization ability. The simulation results show that the improved method can improve the estimation accuracy of particle filters, ensure the diversity of particles, and improve the rationality of the particle distribution.
Keywords: particle filter, impoverishment, state estimation, artificial bee colony algorithm
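For context, the sketch below is the plain bootstrap (SIR) particle filter that such improvements target; the ABC-based particle relocation, OBL initialization, and genetic operators from the paper are not included, and the 1-D random-walk model is an illustrative assumption.

```python
import numpy as np

def bootstrap_particle_filter(y, n_particles=500, q=0.1, r=0.5):
    """Standard SIR (bootstrap) particle filter for a 1-D random-walk state
    observed in Gaussian noise; the baseline whose resampling step the
    ABC-based method in the paper is designed to improve."""
    rng = np.random.default_rng(0)
    particles = rng.normal(0.0, 1.0, n_particles)
    estimates = []
    for yt in y:
        particles += rng.normal(0.0, q, n_particles)          # propagate
        w = np.exp(-0.5 * ((yt - particles) / r) ** 2)          # likelihood
        w /= w.sum()
        estimates.append(np.dot(w, particles))                  # MMSE estimate
        idx = rng.choice(n_particles, n_particles, p=w)         # resample
        particles = particles[idx]                              # (impoverishment risk)
    return np.array(estimates)

# synthetic data: noisy observations of a slowly drifting state
rng = np.random.default_rng(1)
truth = np.cumsum(rng.normal(0, 0.1, 100))
obs = truth + rng.normal(0, 0.5, 100)
print(bootstrap_particle_filter(obs)[-5:])
```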
Procedia PDF Downloads 152
1857 The Optimal Indirect Vector Controller Design via an Adaptive Tabu Search Algorithm
Authors: P. Sawatnatee, S. Udomsuk, K-N. Areerak, K-L. Areerak, A. Srikaew
Abstract:
This paper presents how to design the indirect vector control of three-phase induction motor drive systems using an artificial intelligence technique called adaptive tabu search. The results from the simulation and the experiment show that the drive system with the controller designed by the proposed method provides a better output speed response than that of the conventional method. The controller design procedure based on the proposed technique can be used to create a software package that helps engineers achieve the optimal controller design for induction motor speed control based on the indirect vector concept.
Keywords: indirect vector control, induction motor, adaptive tabu search, control design, artificial intelligence
Procedia PDF Downloads 400
1856 An Integrated Cognitive Performance Evaluation Framework for Urban Search and Rescue Applications
Authors: Antonio D. Lee, Steven X. Jiang
Abstract:
A variety of techniques and methods are available to evaluate cognitive performance in Urban Search and Rescue (USAR) applications. However, traditional cognitive performance evaluation techniques typically incorporate either the conscious or systematic aspect, failing to take into consideration the subconscious or intuitive aspect. This leads to incomplete measures and produces ineffective designs. In order to fill the gaps in past research, this study developed a theoretical framework to facilitate the integration of situation awareness (SA) and intuitive pattern recognition (IPR) to enhance the cognitive performance representation in USAR applications. This framework provides guidance to integrate both SA and IPR in order to evaluate the cognitive performance of the USAR responders. The application of this framework will help improve the system design.
Keywords: cognitive performance, intuitive pattern recognition, situation awareness, urban search and rescue
Procedia PDF Downloads 330
1855 Global Direct Search Optimization of a Tuned Liquid Column Damper Subject to Stochastic Load
Authors: Mansour H. Alkmim, Adriano T. Fabro, Marcus V. G. De Morais
Abstract:
In this paper, a global direct search optimization algorithm for a tuned liquid column damper (TLCD), a class of passive structural control device used to reduce vibration, is presented. The objective is to find optimized parameters for the TLCD under stochastic loads derived from different wind power spectral densities. A verification is made considering the analytical solution of an undamped primary system under white-noise excitation. Finally, a numerical example considering a simplified wind turbine model is given to illustrate the efficacy of the TLCD. Results from the random vibration analysis are shown for four types of random wind excitation models, and the response PSDs obtained show good vibration attenuation.
Keywords: generalized pattern search, parameter optimization, random vibration analysis, vibration suppression
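The keyword list points to a generalized pattern search; a minimal compass-search sketch of that family of direct search methods is given below, with a placeholder quadratic objective standing in for the response-variance criterion and illustrative parameter names.

```python
def compass_search(f, x0, step=1.0, tol=1e-6, max_iter=1000):
    """Minimal generalized pattern search (compass search): poll the
    objective along +/- each coordinate direction, accept any improving
    point, and shrink the step when the poll fails."""
    x, fx = list(x0), f(x0)
    it = 0
    while step > tol and it < max_iter:
        improved = False
        for i in range(len(x)):
            for sign in (+1.0, -1.0):
                trial = list(x)
                trial[i] += sign * step
                ft = f(trial)
                if ft < fx:
                    x, fx, improved = trial, ft, True
                    break
            if improved:
                break
        if not improved:
            step *= 0.5
        it += 1
    return x, fx

# example: tune two illustrative TLCD parameters (head-loss coefficient,
# tuning ratio) against a placeholder response-variance objective
objective = lambda p: (p[0] - 1.2) ** 2 + 4.0 * (p[1] - 0.98) ** 2
print(compass_search(objective, [0.5, 0.5]))
```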
Procedia PDF Downloads 276
1854 Improving Research by the Integration of a Collaborative Dimension in an Information Retrieval (IR) System
Authors: Amel Hannech, Mehdi Adda, Hamid Mcheick
Abstract:
In computer science, finding useful information is still one of the most active and important research topics. The most popular applications of information retrieval (IR) are search engines, which meet users' specific needs and aim to locate relevant information on the web. However, these search engines have some limitations related to the relevance of the results and the ease of exploring those results. In this context, we proposed in previous works a Multi-Space Search Engine model that is based on a multidimensional interpretation universe. In the present paper, we integrate an additional dimension that offers users new research experiences. The added component is based on creating user profiles and calculating the similarity between them, which then allows the use of collaborative filtering when retrieving search results. To evaluate the effectiveness of the proposed model, a prototype was developed. The experiments showed that the additional dimension improved the relevance of the results by predicting the items of interest to users based on their own experience and the experience of other, similar users. The offered personalization service allows users to approve the pertinent items, which enriches their profiles and further improves retrieval.
Keywords: information retrieval, v-facets, user behavior analysis, user profiles, topical ontology, association rules, data personalization
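As an illustration of the collaborative dimension, the sketch below applies user-based collaborative filtering with cosine similarity over hypothetical user profiles; the profile construction, v-facets, and association rules of the actual system are not modeled.

```python
import math

def cosine(u, v):
    """Cosine similarity between two sparse user profiles (dicts item -> rating)."""
    common = set(u) & set(v)
    num = sum(u[i] * v[i] for i in common)
    den = math.sqrt(sum(x * x for x in u.values())) * \
          math.sqrt(sum(x * x for x in v.values()))
    return num / den if den else 0.0

def recommend(target, profiles, k=2):
    """Score items the target user has not seen by the similarity-weighted
    interest of the k most similar profiles."""
    neighbours = sorted(profiles.items(),
                        key=lambda kv: cosine(target, kv[1]), reverse=True)[:k]
    scores = {}
    for _, prof in neighbours:
        w = cosine(target, prof)
        for item, rating in prof.items():
            if item not in target:
                scores[item] = scores.get(item, 0.0) + w * rating
    return sorted(scores, key=scores.get, reverse=True)

# hypothetical profiles built from approved (clicked/validated) results
profiles = {"u1": {"doc_a": 1, "doc_b": 1, "doc_c": 1},
            "u2": {"doc_a": 1, "doc_d": 1},
            "u3": {"doc_e": 1}}
print(recommend({"doc_a": 1, "doc_b": 1}, profiles))
```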
Procedia PDF Downloads 264
1853 Crow Search Algorithm-Based Task Offloading Strategies for Fog Computing Architectures
Authors: Aniket Ganvir, Ritarani Sahu, Suchismita Chinara
Abstract:
The rapid digitization of various aspects of life is leading to the creation of smart IoT ecosystems, where interconnected devices generate significant amounts of valuable data. However, these IoT devices face constraints such as limited computational resources and bandwidth. Cloud computing emerges as a solution by offering ample resources for offloading tasks efficiently, but it introduces latency issues, especially for time-sensitive applications. Fog computing (FC) addresses these latency concerns by bringing computation and storage closer to the network edge, minimizing data travel distance, and enhancing efficiency. Offloading tasks to fog nodes or the cloud can conserve energy and extend IoT device lifespan. The offloading process is intricate, with tasks categorized as full or partial, and its optimization presents an NP-hard problem. Traditional greedy search methods struggle to address the complexity of task offloading efficiently. To overcome this, the efficient crow search algorithm (ECSA) is proposed as a meta-heuristic optimization algorithm. ECSA aims to effectively optimize computation offloading, providing solutions to this challenging problem.
Keywords: IoT, fog computing, task offloading, efficient crow search algorithm
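For orientation, here is a sketch of the basic crow search algorithm that ECSA extends; the offloading cost function, its coefficients, and the decision encoding (fraction of each task offloaded) are illustrative assumptions, not the paper's model.

```python
import random

def crow_search(f, dim, lo, hi, n_crows=20, iters=100, ap=0.1, fl=2.0):
    """Basic crow search algorithm (CSA): each crow follows another crow's
    memory; with awareness probability `ap` the followed crow 'notices' and
    the follower jumps to a random position instead."""
    rand = random.uniform
    pos = [[rand(lo, hi) for _ in range(dim)] for _ in range(n_crows)]
    mem = [p[:] for p in pos]                   # each crow's best-known position
    fit = [f(p) for p in mem]
    for _ in range(iters):
        for i in range(n_crows):
            j = random.randrange(n_crows)       # crow i follows crow j
            if random.random() >= ap:
                new = [pos[i][d] + fl * random.random() * (mem[j][d] - pos[i][d])
                       for d in range(dim)]
            else:
                new = [rand(lo, hi) for _ in range(dim)]
            new = [min(max(v, lo), hi) for v in new]
            pos[i] = new
            fn = f(new)
            if fn < fit[i]:
                mem[i], fit[i] = new, fn
    best = min(range(n_crows), key=lambda i: fit[i])
    return mem[best], fit[best]

# toy offloading cost: fraction of each of 5 tasks sent to the fog node (0..1),
# trading illustrative transmission energy against local processing delay
cost = lambda x: sum(0.6 * v + 0.8 * (1 - v) ** 2 for v in x)
print(crow_search(cost, dim=5, lo=0.0, hi=1.0))
```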
Procedia PDF Downloads 58
1852 Improving Load Frequency Control of Multi-Area Power System by Considering Uncertainty by Using Optimized Type 2 Fuzzy Pid Controller with the Harmony Search Algorithm
Authors: Mehrdad Mahmudizad, Roya Ahmadi Ahangar
Abstract:
This paper presents a method for designing type-2 fuzzy PID controllers in order to solve the Load Frequency Control (LFC) problem. The Harmony Search (HS) algorithm is used to tune the measurement factors and the uncertainty of the membership functions of Interval Type-2 Fuzzy Proportional Integral Differential (IT2FPID) controllers in order to reduce the frequency deviation resulting from load oscillations. The simulation results show that the performance of the proposed IT2FPID LFC, in terms of error, settling time, and resistance against different load oscillations, is better than that of PID and Type-1 Fuzzy Proportional Integral Differential (T1FPID) controllers.
Keywords: load frequency control, fuzzy-pid controller, type 2 fuzzy system, harmony search algorithm
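The harmony search loop itself is compact; the sketch below shows the standard HS steps (harmony memory consideration, pitch adjustment, random selection) applied to a placeholder three-gain tuning problem, since the IT2FPID membership functions and the multi-area power system model are beyond a short example.

```python
import random

def harmony_search(f, dim, lo, hi, hms=20, hmcr=0.9, par=0.3, bw=0.05, iters=2000):
    """Basic harmony search: build a new harmony from memory with probability
    `hmcr`, pitch-adjust it with probability `par`, and replace the worst
    harmony in memory if the new one is better."""
    memory = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(hms)]
    costs = [f(h) for h in memory]
    for _ in range(iters):
        new = []
        for d in range(dim):
            if random.random() < hmcr:
                v = random.choice(memory)[d]              # memory consideration
                if random.random() < par:
                    v += random.uniform(-bw, bw)          # pitch adjustment
            else:
                v = random.uniform(lo, hi)                # random selection
            new.append(min(max(v, lo), hi))
        c = f(new)
        worst = max(range(hms), key=lambda i: costs[i])
        if c < costs[worst]:
            memory[worst], costs[worst] = new, c
    best = min(range(hms), key=lambda i: costs[i])
    return memory[best], costs[best]

# illustrative use: tune three controller gains against a placeholder cost
# (in the paper this would be a frequency-deviation criterion such as ITAE)
cost = lambda g: (g[0] - 0.8) ** 2 + (g[1] - 0.4) ** 2 + (g[2] - 0.2) ** 2
print(harmony_search(cost, dim=3, lo=0.0, hi=2.0))
```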
Procedia PDF Downloads 279
1851 Enhanced Arabic Semantic Information Retrieval System Based on Arabic Text Classification
Authors: A. Elsehemy, M. Abdeen, T. Nazmy
Abstract:
Since the appearance of the Semantic Web, many semantic search techniques and models have been proposed to exploit the information in ontologies and enhance traditional keyword-based search. Many advances have been made for languages such as English, German, French, and Spanish. However, other languages such as Arabic are not yet fully supported. In this paper, we present a framework for ontology-based information retrieval for the Arabic language. Our system consists of four main modules, namely a query parser, an indexer, a search module, and a ranking module. Our approach includes building a semantic index by linking ontology concepts to documents, including an annotation weight for each link, to be used in ranking the results. We also augmented the framework with an automatic document categorizer, which enhances the overall document ranking. We have built three Arabic domain ontologies, Sports, Economics, and Politics, as examples for the Arabic language, together with a knowledge base that consists of 79 classes and more than 1456 instances. The system is evaluated using the precision and recall metrics. We performed many retrieval operations on a sample of 40,316 documents with a size of 320 MB of pure text. The results show that semantic search enhanced with text classification gives better performance than the system without classification.
Keywords: Arabic text classification, ontology based retrieval, Arabic semantic web, information retrieval, Arabic ontology
Procedia PDF Downloads 526
1850 Ant System with Acoustic Communication
Authors: Saad Bougrine, Salma Ouchraa, Belaid Ahiod, Abdelhakim Ameur El Imrani
Abstract:
Ant colony optimization (ACO) is an ant algorithm framework that takes inspiration from the foraging behaviour of ant colonies. Indeed, ACO algorithms use chemical communication, represented by pheromone trails, to build good solutions. However, real ants use several communication channels to interact. Thus, this paper introduces acoustic communication between ants while they are foraging. This process allows fine, local exploration of the search space and permits the best solution found to be improved.
Keywords: acoustic communication, ant colony optimization, local search, traveling salesman problem
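Since the keywords point to the traveling salesman problem, the sketch below shows the classical pheromone-only ant system on a small TSP instance; the acoustic channel proposed in the paper is not modeled, and the parameter values are illustrative.

```python
import random

def ant_system_tsp(dist, n_ants=10, iters=100, alpha=1.0, beta=3.0, rho=0.5, q=1.0):
    """Classical ant system for the TSP: ants build tours guided by pheromone
    (chemical channel only; an acoustic channel would add a second, local
    communication signal on top of this baseline)."""
    n = len(dist)
    tau = [[1.0] * n for _ in range(n)]
    best_tour, best_len = None, float("inf")
    for _ in range(iters):
        tours = []
        for _ in range(n_ants):
            tour = [random.randrange(n)]
            while len(tour) < n:
                i = tour[-1]
                cand = [j for j in range(n) if j not in tour]
                w = [tau[i][j] ** alpha * (1.0 / dist[i][j]) ** beta for j in cand]
                tour.append(random.choices(cand, weights=w)[0])
            length = sum(dist[tour[k]][tour[(k + 1) % n]] for k in range(n))
            tours.append((tour, length))
            if length < best_len:
                best_tour, best_len = tour, length
        # evaporate, then deposit pheromone proportional to tour quality
        tau = [[(1 - rho) * t for t in row] for row in tau]
        for tour, length in tours:
            for k in range(n):
                a, b = tour[k], tour[(k + 1) % n]
                tau[a][b] += q / length
                tau[b][a] += q / length
    return best_tour, best_len

# small symmetric instance
D = [[0, 2, 9, 10], [2, 0, 6, 4], [9, 6, 0, 8], [10, 4, 8, 0]]
print(ant_system_tsp(D))
```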
Procedia PDF Downloads 587
1849 A New Multi-Target, Multi-Agent Search and Rescue Path Planning Approach
Authors: Jean Berger, Nassirou Lo, Martin Noel
Abstract:
Perfectly suited for natural or man-made emergency and disaster management situations such as floods, earthquakes, tornadoes, or tsunamis, multi-target search path planning for a team of rescue agents is known to be computationally hard, and most techniques developed so far fall short of successfully estimating the optimality gap. A novel mixed-integer linear programming (MIP) formulation is proposed to optimally solve the multi-target, multi-agent discrete search and rescue (SAR) path planning problem. Aimed at maximizing the cumulative probability of successful target detection, it captures anticipated feedback information associated with possible observation outcomes resulting from projected path execution, while modeling discrete agent actions over all possible moving directions. Problem modeling further takes advantage of a network representation to encompass the decision variables, expedite compact constraint specification, and yield a substantial problem-solving speed-up. The proposed MIP approach uses the CPLEX optimization machinery, efficiently computing near-optimal solutions for practical-size problems, while giving a robust upper bound obtained from a Lagrangean integrality constraint relaxation. Should a target eventually be positively detected during plan execution, a new problem instance would simply be reformulated from the current state and then solved over the next decision cycle. A computational experiment shows the feasibility and the value of the proposed approach.
Keywords: search path planning, search and rescue, multi-agent, mixed-integer linear programming, optimization
Procedia PDF Downloads 372
1848 Rapid Algorithm for GPS Signal Acquisition
Authors: Fabricio Costa Silva, Samuel Xavier de Souza
Abstract:
A Global Positioning System (GPS) receiver is responsible for determining position, velocity, and timing information by using satellite signals. To obtain this information, it is necessary to combine an incoming signal with a locally generated one. The procedure called acquisition needs to find two pieces of information: the frequency and the phase of the incoming signal. This is very time-consuming, so several techniques exist to reduce the computational complexity, but each of them puts project issues in conflict. In this paper, we present a method that can reduce the computational complexity by reducing the search space and parallelizing the search.
Keywords: GPS, acquisition, complexity, parallelism
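A common way to organize this two-dimensional search is the parallel code-phase technique, sketched below with NumPy; the random ±1 spreading code, sampling rate, and Doppler grid are stand-ins for a real C/A code and receiver front end, and this is not the reduced-search method proposed in the paper.

```python
import numpy as np

def acquire(signal, code, fs, doppler_bins):
    """Parallel code-phase search: for each Doppler bin, wipe off the carrier
    and correlate against all code phases at once via FFT, returning the
    (Doppler, code-phase) cell with the largest correlation peak."""
    n = len(signal)
    t = np.arange(n) / fs
    code_fft = np.conj(np.fft.fft(code))
    best = (0.0, 0, 0.0)                                   # (power, phase, doppler)
    for fd in doppler_bins:
        baseband = signal * np.exp(-2j * np.pi * fd * t)   # carrier wipe-off
        corr = np.abs(np.fft.ifft(np.fft.fft(baseband) * code_fft)) ** 2
        k = int(np.argmax(corr))
        if corr[k] > best[0]:
            best = (corr[k], k, fd)
    return best[2], best[1]                                # Doppler (Hz), code phase (samples)

# synthetic example with a random +/-1 spreading code standing in for a C/A code
rng = np.random.default_rng(0)
fs, n = 1.023e6, 1023
code = rng.choice([-1.0, 1.0], n)
true_fd, true_phase = 2500.0, 300
t = np.arange(n) / fs
rx = np.roll(code, true_phase) * np.exp(2j * np.pi * true_fd * t)
rx += 0.5 * (rng.normal(size=n) + 1j * rng.normal(size=n))
print(acquire(rx, code, fs, doppler_bins=np.arange(-5000, 5001, 500)))
```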
Procedia PDF Downloads 539
1847 Speedup Breadth-First Search by Graph Ordering
Abstract:
Breadth-First Search (BFS) is a core graph algorithm that is widely used for graph analysis. As it is frequently used in many graph applications, improving BFS performance is essential. In this paper, we present a graph ordering method that reorders the graph nodes to achieve better data locality, thus improving BFS performance. Our method is based on the observation that sibling relationships dominate the cache access pattern during BFS traversal. Therefore, we propose a frequency-based model to construct the graph order. First, we optimize the graph order according to the nodes' visit frequency; nodes with high visit frequency are processed with priority. Second, we try to maximize the overlap of child nodes layer by layer. As this is proved to be NP-hard, we propose a heuristic method that greatly reduces the preprocessing overhead. We conduct extensive experiments on 16 real-world datasets. The results show that our method achieves performance comparable to the state-of-the-art methods while the graph ordering overhead is only about 1/15.
Keywords: breadth-first search, BFS, graph ordering, graph algorithm
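To make the idea concrete, the sketch below relabels nodes by degree, a crude stand-in for the visit-frequency model described above, and then runs a standard queue-based BFS over the relabelled adjacency lists; the layer-by-layer overlap heuristic is not reproduced.

```python
from collections import deque

def reorder_by_degree(adj):
    """Relabel nodes so that high-degree nodes (a simple stand-in for the
    visit-frequency model) receive small, contiguous IDs, which tends to
    pack frequently touched adjacency lists close together."""
    order = sorted(adj, key=lambda u: len(adj[u]), reverse=True)
    new_id = {u: i for i, u in enumerate(order)}
    return {new_id[u]: sorted(new_id[v] for v in adj[u]) for u in adj}

def bfs(adj, source):
    """Standard queue-based BFS returning hop distances from `source`."""
    dist = {source: 0}
    q = deque([source])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

graph = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 3, 4], 3: [1, 2], 4: [2]}
relabelled = reorder_by_degree(graph)
print(bfs(relabelled, source=0))
```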
Procedia PDF Downloads 138
1846 Rapid Parallel Algorithm for GPS Signal Acquisition
Authors: Fabricio Costa Silva, Samuel Xavier de Souza
Abstract:
A Global Positioning System (GPS) receiver is responsible for determining position, velocity, and timing information by using satellite signals. To obtain this information, it is necessary to combine an incoming signal with a locally generated one. The procedure called acquisition needs to find two pieces of information: the frequency and the phase of the incoming signal. This is very time-consuming, so several techniques exist to reduce the computational complexity, but each of them puts project issues in conflict. In this paper, we present a method that can reduce the computational complexity by reducing the search space and parallelizing the search.
Keywords: GPS, acquisition, low complexity, parallelism
Procedia PDF Downloads 503
1845 Semantic Search Engine Based on Query Expansion with Google Ranking and Similarity Measures
Authors: Ahmad Shahin, Fadi Chakik, Walid Moudani
Abstract:
Our study elaborates a potential solution for a search engine that involves semantic technology to retrieve information and display it meaningfully. Semantic search engines are not widely used on the web, as the majority are still in the beta stage or under construction. Current semantic search applications face many problems; the major one is to analyze and compute the meaning of a query in order to retrieve relevant information. Another problem is the ontology-based index and its updates. Ranking results according to concept meaning and its relation to the query is another challenge. In this paper, we offer a light meta-engine (QESM) which uses Google search, and therefore Google's index, with some adaptations to its returned results by adding multi-query expansion. The mission was to find a reliable ranking algorithm that involves semantics and uses concepts and meanings to rank results. At the beginning, the engine finds synonyms of each query term entered by the user based on a lexical database. Then, query expansion is applied to generate different semantically analogous sentences; these are generated randomly by combining the found synonyms and the original query terms. Our model suggests the use of semantic similarity measures between two sentences. Practically, we used this method to calculate the semantic similarity between each query and the description of each page's content generated by Google. The generated sentences are sent to the Google engine one by one, and the results are ranked again all together with the adapted ranking method (QESM). Finally, our system places the Google pages with higher similarities at the top of the results. We conducted experiments with 6 different queries and observed that most of the QESM rankings were altered with respect to Google's originally generated pages. On the tested queries, QESM frequently achieves better accuracy than Google; in the worst cases, it behaves like Google.
Keywords: semantic search engine, Google indexing, query expansion, similarity measures
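The expansion-and-re-rank idea can be illustrated with a tiny, self-contained sketch; the synonym lexicon, the Jaccard word-overlap measure, and the page descriptions below are all assumptions standing in for the lexical database, semantic similarity measure, and Google-generated descriptions used by QESM.

```python
from itertools import product

SYNONYMS = {"cheap": ["cheap", "affordable", "budget"],   # illustrative lexicon,
            "laptop": ["laptop", "notebook"]}              # not the paper's database

def expand(query):
    """Generate semantically analogous query variants by combining
    synonyms of each query term."""
    options = [SYNONYMS.get(t, [t]) for t in query.split()]
    return [" ".join(combo) for combo in product(*options)]

def jaccard(a, b):
    """A simple word-overlap similarity standing in for the semantic
    sentence-similarity measure used by the meta-engine."""
    sa, sb = set(a.lower().split()), set(b.lower().split())
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

def rerank(query, pages):
    """Score each page description against every expanded query, keep the
    best match, then sort pages by that score."""
    variants = expand(query)
    scored = [(max(jaccard(v, desc) for v in variants), url)
              for url, desc in pages]
    return [url for _, url in sorted(scored, reverse=True)]

pages = [("p1", "budget notebook deals for students"),
         ("p2", "history of laptop computers"),
         ("p3", "gaming desktop reviews")]
print(rerank("cheap laptop", pages))
```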
Procedia PDF Downloads 426
1844 Genetic Algorithm for In-Theatre Military Logistics Search-and-Delivery Path Planning
Authors: Jean Berger, Mohamed Barkaoui
Abstract:
Discrete search path planning in a time-constrained, uncertain environment relying upon imperfect sensors is known to be hard, and the problem-solving techniques proposed so far to compute efficient path plans in near real time are mainly limited to providing solutions a few moves long. A new information-theoretic, open-loop decision model that explicitly incorporates false-alarm sensor readings is presented to solve a single-agent military logistics search-and-delivery path planning problem with anticipated feedback. The decision model consists in minimizing the expected entropy over a given time horizon, considering anticipated possible observation outcomes. The model captures the uncertainty associated with observation events for all possible scenarios, with entropy representing a measure of uncertainty about the searched target location. Feedback information resulting from possible sensor observation outcomes along the projected path plan is exploited to update anticipated unit target occupancy beliefs. For the first time, a compact belief update formulation is generalized to explicitly include false positive observation events that may occur during plan execution. A novel genetic algorithm is then proposed to efficiently solve search path planning, providing near-optimal solutions for practical, realistic problem instances. Given the run-time performance of the algorithm, a natural extension to a closed-loop environment that progressively integrates real visit outcomes on a rolling time horizon can easily be envisioned. Computational results show the value of the approach in comparison with alternate heuristics.
Keywords: search path planning, false alarm, search-and-delivery, entropy, genetic algorithm
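The belief update and entropy objective described above can be sketched compactly; the detection and false-alarm probabilities, the four-cell grid, and the uniform prior below are illustrative assumptions, and the genetic algorithm that searches over paths is not shown.

```python
import math

def update_belief(belief, cell, detection, pd=0.8, pfa=0.1):
    """Bayesian update of target-occupancy beliefs after searching one cell
    with an imperfect sensor (detection probability `pd`, false-alarm
    probability `pfa`)."""
    post = []
    for i, p in enumerate(belief):
        if detection:
            like = pd if i == cell else pfa
        else:
            like = (1 - pd) if i == cell else (1 - pfa)
        post.append(like * p)
    z = sum(post)
    return [p / z for p in post]

def entropy(belief):
    """Uncertainty about the target location (to be minimized in expectation)."""
    return -sum(p * math.log2(p) for p in belief if p > 0)

# uniform prior over 4 cells; search cell 0 and observe no detection
prior = [0.25] * 4
posterior = update_belief(prior, cell=0, detection=False)
print(entropy(prior), entropy(posterior))
```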
Procedia PDF Downloads 360