Search results for: Monte Carlo algorithm
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3882

3552 Investigation of Efficient Production of ¹³⁵La for the Auger Therapy Using Medical Cyclotron in Poland

Authors: N. Zandi, M. Sitarz, J. Jastrzebski, M. Vagheian, J. Choinski, A. Stolarz, A. Trzcinska

Abstract:

¹³⁵La, with a half-life of 19.5 h, can be considered a good candidate for Auger therapy. ¹³⁵La decays almost 100% by electron capture to stable ¹³⁵Ba. In this study, all important reactions leading to ¹³⁵La production are investigated in detail, and the corresponding theoretical yields calculated with the Monte Carlo method (MCNPX code) are presented. Among them, the most suitable reaction is selected on the basis of cost-effectiveness and production yield, taking into account the Polish facilities equipped with a medical cyclotron. ¹³⁵La is produced with the 16.5 MeV proton beam of a General Electric PETtrace cyclotron through the ¹³⁵Ba(p,n)¹³⁵La reaction. Moreover, to allow a consistent comparison between the theoretical calculations and the experimental measurements, the beam current and the proton beam energy are measured experimentally, and the measured proton energy is used as the entrance energy for the theoretical calculations. Finally, the production yield is measured and compared with the results obtained with the MCNPX code. The results show that the experimental measurements and the theoretical calculations are in good agreement.
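
The abstract does not spell out how a theoretical yield is obtained from an excitation function; the Python sketch below illustrates the standard thick-target yield integral that such estimates approximate. All cross-section and stopping-power values are placeholders for illustration, not data from the paper.

import numpy as np

AVOGADRO = 6.022e23      # atoms per mole
MOLAR_MASS = 135.0       # g/mol, hypothetically enriched 135Ba target

def thick_target_yield(energy_mev, sigma_mb, stopping_mev_cm2_per_g):
    """Atoms of 135La produced per incident proton: integrate sigma(E) / (dE/d(rho*x))
    over the proton slowing-down path from threshold to the entrance energy."""
    sigma_cm2 = np.asarray(sigma_mb) * 1e-27                    # mb -> cm^2
    integrand = sigma_cm2 / np.asarray(stopping_mev_cm2_per_g)  # g/MeV per target atom
    return (AVOGADRO / MOLAR_MASS) * np.trapz(integrand, energy_mev)

# Placeholder excitation function and stopping power between 5 MeV and the 16.5 MeV entrance energy.
E = np.linspace(5.0, 16.5, 60)                       # MeV
sigma = 300.0 * np.exp(-((E - 11.0) / 3.0) ** 2)     # mb, hypothetical shape
stopping = 30.0 - 0.9 * E                            # MeV*cm^2/g, hypothetical
print("135La atoms per incident proton:", thick_target_yield(E, sigma, stopping))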

Keywords: efficient ¹³⁵La production, proton cyclotron energy measurement, MCNPX code, theoretical and experimental production yield

Procedia PDF Downloads 126
3551 Multimodal Optimization of Density-Based Clustering Using Collective Animal Behavior Algorithm

Authors: Kristian Bautista, Ruben A. Idoy

Abstract:

A bio-inspired metaheuristic based on the theory of collective animal behavior (CAB) was applied to density-based clustering modeled as a multimodal optimization problem. The algorithm was tested on synthetic, Iris, Glass, Pima, and Thyroid data sets in order to measure its effectiveness relative to the CDE-based clustering algorithm. Preliminary testing showed that one of the parameter settings was ineffective for clustering, prompting further investigation. It was revealed that fine-tuning the distance δ3, which determines the extent to which a given data point will be clustered, helped improve the quality of the cluster output. Although the modification of δ3 significantly improved the solution quality and cluster output of the algorithm, the results suggest that there is no difference between the population means of the solutions obtained with the original and the modified parameter settings for any of the data sets. This implies that using either setting has no effect on obtaining the best global and local animal positions. The results also suggest that the CDE-based clustering algorithm outperforms the CAB density-based clustering algorithm on all data sets. Nevertheless, the CAB density-based algorithm is still a good clustering algorithm, because over thirty trial runs it identified the correct number of classes of some data sets more frequently and with a much smaller standard deviation, indicating potential for clustering high-dimensional data sets. Further investigation of the post-processing stage of the algorithm is therefore recommended.

Keywords: clustering, metaheuristics, collective animal behavior algorithm, density-based clustering, multimodal optimization

Procedia PDF Downloads 205
3550 Hardware for Genetic Algorithm

Authors: Fariborz Ahmadi, Reza Tati

Abstract:

The genetic algorithm is a soft computing method that works on a set of solutions. These solutions are called chromosomes, and the best one is taken as the solution of the problem. The main drawback of this algorithm is that, after some generations, it may regenerate chromosomes that were already produced in earlier generations, which reduces the convergence speed. From another perspective, most genetic algorithms are implemented in software, and less work has been done on hardware implementations. Our work implements a genetic algorithm in hardware that does not produce chromosomes already generated in previous generations. Most of the genetic operators are implemented without producing repeated chromosomes, so genetic diversity is preserved. Preserved genetic diversity means the algorithm not only avoids converging to a local optimum but also reaches the global optimum. The proposed approach is considerably faster than software implementations, and the evaluation results show that it is also faster than existing hardware implementations.
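
The circuit details are not given in the abstract; the following Python sketch only captures the central idea in software form: genetic operators whose offspring are rejected if the chromosome has already appeared in an earlier generation. The fitness function, encoding, and parameters are illustrative assumptions, not the hardware design.

import random

def fitness(bits):                      # toy objective: maximize the number of 1s
    return sum(bits)

def mutate(bits, rate=0.05):
    return tuple(b ^ (random.random() < rate) for b in bits)

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def ga_no_repeats(length=32, pop_size=20, generations=100):
    seen = set()                        # plays the role of the duplicate-filter memory
    pop = []
    while len(pop) < pop_size:
        c = tuple(random.randint(0, 1) for _ in range(length))
        if c not in seen:
            seen.add(c)
            pop.append(c)
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            child = mutate(crossover(*random.sample(parents, 2)))
            if child not in seen:       # only genuinely new chromosomes enter the population
                seen.add(child)
                children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = ga_no_repeats()
print(fitness(best), "ones out of", len(best))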

Keywords: hardware, genetic algorithm, computer science, engineering

Procedia PDF Downloads 476
3549 A Kruskal Based Heuristic for the Application of Spanning Tree

Authors: Anjan Naidu

Abstract:

In this paper, we first discuss the minimum spanning tree and then use Kruskal's algorithm to obtain it. Based on Kruskal's algorithm, we propose a heuristic that applies the spanning-tree concept to find the minimum cost in a practical application.
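
The abstract gives no pseudocode; the Python sketch below shows the standard Kruskal procedure it builds on, using a union-find structure over a small hypothetical edge list (the graph data and function names are illustrative, not taken from the paper).

def kruskal(num_vertices, edges):
    """edges: list of (weight, u, v). Returns (total weight, chosen edges)."""
    parent = list(range(num_vertices))

    def find(x):                       # path-compressing find
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    mst, total = [], 0
    for w, u, v in sorted(edges):      # consider edges by increasing weight
        ru, rv = find(u), find(v)
        if ru != rv:                   # keep the edge only if it joins two components
            parent[ru] = rv
            mst.append((u, v, w))
            total += w
    return total, mst

edges = [(4, 0, 1), (2, 0, 2), (5, 1, 2), (10, 1, 3), (3, 2, 3)]
print(kruskal(4, edges))   # -> (9, [(0, 2, 2), (2, 3, 3), (0, 1, 4)])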

Keywords: minimum spanning tree, algorithm, heuristic, application, subject classification 97K90

Procedia PDF Downloads 422
3548 Application of Imperialist Competitive Algorithm for Optimal Location and Sizing of Static Compensator Considering Voltage Profile

Authors: Vahid Rashtchi, Ashkan Pirooz

Abstract:

This paper applies the Imperialist Competitive Algorithm (ICA) to find the optimal location and size of a Static Compensator (STATCOM) in power systems. The output of the algorithm is a two-dimensional array indicating the best bus number and the optimal STATCOM size that minimize the deviation of all bus voltages from their nominal values. Simulations are performed on the IEEE 5-, 14-, and 30-bus test systems, and comparisons are made between ICA and the well-known Particle Swarm Optimization (PSO) algorithm. The results show that this method can be considered one of the most precise evolutionary methods for optimum compensator placement in electrical grids.

Keywords: evolutionary computation, imperialist competitive algorithm, power systems compensation, static compensators, voltage profile

Procedia PDF Downloads 586
3547 Measurement and Analysis of Radiation Doses to Radiosensitive Organs from CT Examination of the Cervical Spine Using Radiochromic Films and Monte Carlo Simulation Based Software

Authors: Khaled Soliman, Abdullah Alrushoud, Abdulrahman Alkhalifah, Raed Albathi, Salman Altymiat

Abstract:

The radiation dose received by patients undergoing Computed Tomography (CT) examination of the cervical spine was evaluated using Gafchromic XR-QA2 films and the CT-Expo software (ver. 2.3), in order to document our clinical dose values and to compare our results with benchmarks reported in the current literature. Radiochromic films have recently been used as a practical dosimetry tool that provides dose-profile information not available from the standard ionisation chamber routinely used in CT dosimetry. We developed an in-house program that uses the films to calculate the Entrance Dose Length Product (EDLP, in mGy·cm) and relates the EDLP to various organ doses calculated with the CT-Expo software. We also calculated a conversion factor (in mSv/mGy·cm) relating the EDLP to the effective dose (ED) from the examination using the CT-Expo software. Variability among different types of CT scanners and dose modulation methods is reported for at least three major CT brands available at our medical institution. This work describes the dosimetry method and reports the results. The method can be used for in-vivo dosimetry, but this work only reports results obtained from studies of an adult female anthropomorphic phantom.
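
As a rough illustration of the two quantities involved, the Python sketch below integrates a film dose profile into an EDLP and multiplies it by a conversion factor to obtain an effective dose. The profile shape and the 0.0059 mSv/(mGy·cm) factor are placeholders assumed for illustration, not values from this study.

import numpy as np

z_cm = np.linspace(-10, 10, 201)                  # position along the scan axis
dose_mgy = 20.0 * np.exp(-(z_cm / 6.0) ** 2)      # placeholder entrance-dose profile from the film

edlp = np.trapz(dose_mgy, z_cm)                   # entrance dose-length product, mGy.cm
k_ed = 0.0059                                     # assumed conversion factor, mSv per mGy.cm
print(f"EDLP = {edlp:.1f} mGy.cm, effective dose ~ {edlp * k_ed:.2f} mSv")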

Keywords: CT dosimetry, gafchromic films, XR-QA2, CT-Expo software

Procedia PDF Downloads 453
3546 Particle Filter State Estimation Algorithm Based on Improved Artificial Bee Colony Algorithm

Authors: Guangyuan Zhao, Nan Huang, Xuesong Han, Xu Huang

Abstract:

In order to solve the sample impoverishment problem of the traditional particle filter and achieve accurate state estimation in nonlinear systems, a particle filter based on an improved artificial bee colony (ABC) algorithm is proposed. The algorithm simulates the foraging and optimization process of bees and moves particles toward the high-likelihood region of the posterior probability, improving the rationality of the particle distribution. The opposition-based learning (OBL) strategy is introduced to optimize the initial population of the artificial bee colony algorithm. A convergence factor is introduced into the neighborhood search strategy to limit the search range and improve the convergence speed. Finally, the crossover and mutation operations of the genetic algorithm are introduced into the search mechanism of the onlooker bees, so that the algorithm quickly escapes local extrema and continues searching for the global extremum, improving its optimization ability. The simulation results show that the improved method can improve the estimation accuracy of the particle filter, ensure the diversity of particles, and improve the rationality of the particle distribution.
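
Of the three modifications listed, the opposition-based learning step is the simplest to show in isolation. The Python sketch below initialises a population by also evaluating the "opposite" of every random food source and keeping the better half; the objective function, bounds, and sizes are illustrative assumptions, not parameters from the paper.

import numpy as np

rng = np.random.default_rng(0)

def objective(x):                      # placeholder fitness (lower is better)
    return np.sum(x ** 2)

def obl_init(pop_size, dim, lower, upper):
    x = rng.uniform(lower, upper, size=(pop_size, dim))
    x_opp = lower + upper - x          # opposite population
    merged = np.vstack([x, x_opp])
    scores = np.apply_along_axis(objective, 1, merged)
    best_idx = np.argsort(scores)[:pop_size]
    return merged[best_idx]            # keep the pop_size best points of both sets

init_pop = obl_init(pop_size=10, dim=4, lower=-5.0, upper=5.0)
print(init_pop.shape)                  # (10, 4)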

Keywords: particle filter, impoverishment, state estimation, artificial bee colony algorithm

Procedia PDF Downloads 123
3545 Seismic Fragility of Base-Isolated Multi-Story Piping System in Critical Facilities

Authors: Bu Seog Ju, Ho Young Son, Yong Hee Ryu

Abstract:

This study focuses on the evaluation of the seismic fragility of a multi-story piping system installed in critical structures isolated with triple friction pendulum bearings. The concept of this study is to isolate the critical building structure as well as the nonstructural components, especially the piping system, in order to mitigate earthquake damage and achieve a reliable seismic design. The building system and the multi-story piping system were modeled in OpenSees. In particular, the triple friction pendulum isolator accounted for the coupled vertical and horizontal behavior of the building system subjected to seismic ground motions. To generate the seismic fragility of the base-isolated multi-story piping system, analyses with 21 selected seismic ground motions were carried out using Monte Carlo simulation to account for the uncertainties in demand. Finally, system-level fragility curves corresponding to the limit state of the piping system were constructed at each T-joint, which is a common failure point in piping systems during and after an earthquake. Additionally, the system-level fragilities were evaluated at the first-floor and second-floor levels of the critical structure.
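
The Monte Carlo logic behind a single fragility point can be sketched in a few lines: at each ground-motion intensity level, the probability of failure is the fraction of simulated demands that exceed the T-joint capacity. All distributions and numbers in this Python sketch are placeholders, not results from the study.

import numpy as np

rng = np.random.default_rng(1)
intensities_g = np.array([0.1, 0.2, 0.4, 0.6, 0.8, 1.0])   # peak ground acceleration levels
capacity = 1.0                                              # normalised T-joint capacity
n_sim = 1000                                                # Monte Carlo realisations per level

for pga in intensities_g:
    # lognormal demand model with placeholder median and dispersion
    demand = rng.lognormal(mean=np.log(0.9 * pga), sigma=0.4, size=n_sim)
    p_fail = np.mean(demand > capacity)
    print(f"PGA = {pga:.1f} g -> P(failure) = {p_fail:.3f}")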

Keywords: fragility, friction pendulum bearing, nonstructural component, seismic

Procedia PDF Downloads 130
3544 Numerical Response of Planar HPGe Detector for ²⁴¹Am Contamination of Various Shapes

Authors: M. Manohari, Himanshu Gupta, S. Priyadharshini, R. Santhanam, S. Chandrasekaran, B. Venkatraman

Abstract:

Injection is one of the potential routes of intake in a radioactive facility. The internal dose due to this intake is monitored at the Radiation Emergency Medical Centre, IGCAR, using a portable planar HPGe detector. A contaminated wound may have different shapes, and in a reprocessing facility the potential for wound contamination with actinides is higher. Efficiency is one of the input parameters for the estimation of internal dose. Estimating these efficiencies experimentally would be tedious and cumbersome, so numerical estimation can supplement experiment. As an initial step in this study, ²⁴¹Am contaminations of different shapes are studied. The portable planar HPGe detector was modeled using the Monte Carlo code FLUKA, and the effects of parameters such as the distance of the contamination from the detector and the radius of a circular contamination were studied. Efficiency values for point and surface contaminations located at different distances were estimated. The effect of the surface-source radius on the efficiency was more pronounced when the source was at 1 cm than at a source-to-detector distance of 10 cm: at 1 cm, the efficiency decreased quadratically as the radius increased, while at 10 cm it decreased linearly. The point-source efficiency varied exponentially with the source-to-detector distance.

Keywords: Planar HPGe, efficiency value, injection, surface source

Procedia PDF Downloads 23
3543 Nonlinear Power Measurement Algorithm of the Input Mix Components of the Noise Signal and Pulse Interference

Authors: Alexey V. Klyuev, Valery P. Samarin, Viktor F. Klyuev, Andrey V. Klyuev

Abstract:

A power measurement algorithm for the components of an input mix of a noise signal and pulse interference is considered. The efficiency of the algorithm has been analyzed for different interference-to-signal ratios, and its performance features have been explored through numerical experiments.

Keywords: noise signal, pulse interference, signal power, spectrum width, detection

Procedia PDF Downloads 318
3542 A Tagging Algorithm in Augmented Reality for Mobile Device Screens

Authors: Doga Erisik, Ahmet Karaman, Gulfem Alptekin, Ozlem Durmaz Incel

Abstract:

Augmented reality (AR) is a type of virtual reality that aims to duplicate the real-world environment on a computer's video feed. The mobile application built for this project (called SARAS) enables annotating real-world points of interest (POIs) located near the mobile user. In this paper, we introduce a robust and simple algorithm for placing labels in an augmented reality system. The system places the labels of POIs, whose GPS coordinates are given, on the mobile device screen. The proposed algorithm is compared to an existing one in terms of energy consumption and accuracy. The results show that the proposed algorithm gives better results in energy consumption and accuracy while standing still, and acceptably accurate results while driving. The technique provides benefits to AR browsers with its open-access algorithm. Going forward, the algorithm will be improved to react more rapidly to position changes while driving.
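
The abstract does not detail the placement computation; a minimal sketch of one common approach is shown below: compute the bearing from the user's GPS position to the POI and map its offset from the device heading onto the screen. The field-of-view value, coordinates, and function names are illustrative assumptions, not SARAS internals.

import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial bearing from the user (lat1, lon1) to the POI (lat2, lon2)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return (math.degrees(math.atan2(y, x)) + 360.0) % 360.0

def label_x(poi_bearing, device_heading, screen_width_px=1080, fov_deg=60.0):
    """Horizontal pixel of the label, or None if the POI is outside the camera view."""
    offset = (poi_bearing - device_heading + 540.0) % 360.0 - 180.0   # wrap to -180..180
    if abs(offset) > fov_deg / 2:
        return None
    return int(screen_width_px * (0.5 + offset / fov_deg))

b = bearing_deg(41.0082, 28.9784, 41.0086, 28.9802)    # user -> POI, illustrative coordinates
print(label_x(b, device_heading=70.0))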

Keywords: accurate tagging algorithm, augmented reality, localization, location-based AR

Procedia PDF Downloads 348
3541 An Authentic Algorithm for Ciphering and Deciphering Called Latin Djokovic

Authors: Diogen Babuc

Abstract:

The motivation for this work is the question of how many devote themselves to discovering something new in a world of science where much has been revealed, yet much remains unknown. Methods: The insightful elements of this algorithm are the ciphering and deciphering algorithms of Playfair, Caesar, and Vigenère. Only a few of their main properties are taken and modified, with the aim of forming the specific functionality of the algorithm called Latin Djokovic. Specifically, a string is entered as input data. A key k is given, with a random value between a and b = a+3. The obtained value is stored in a variable and kept constant during the run of the algorithm. According to the given key, the string is divided into several substrings, each of length k characters. The next step encodes each substring in the list of existing substrings. Encoding is performed on the basis of the Caesar algorithm, i.e., shifting by k characters; however, k is incremented by 1 when moving to the next substring in the list. When the value of k becomes greater than b+1, it returns to its initial value. The algorithm proceeds in the same way until the last substring in the list is traversed. Results: Using this polyalphabetic method, ciphering and deciphering of strings are achieved. The algorithm also works for a 100-character string. The x character is not used when the number of characters in a substring is incompatible with the expected length. The algorithm is simple to implement, but it is questionable whether it performs better than the other methods in terms of execution time and storage space.
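
The encryption step, as described above, can be sketched in Python: block length k, a shift that starts at k, grows by one per block, and resets once it exceeds b + 1. The choice of a, the lower-case alphabet handling, and the function name are assumptions made for illustration, not the authors' reference implementation.

import random
import string

def latin_djokovic_encrypt(text, a=3):
    b = a + 3
    k = random.randint(a, b)                 # key: block length and initial shift, constant for the run
    blocks = [text[i:i + k] for i in range(0, len(text), k)]
    shift, out = k, []
    for block in blocks:
        for ch in block:
            if ch in string.ascii_lowercase:
                out.append(chr((ord(ch) - ord("a") + shift) % 26 + ord("a")))
            else:
                out.append(ch)               # leave characters outside a-z untouched
        shift += 1                           # the next block is shifted by one more position
        if shift > b + 1:                    # once the shift exceeds b + 1, return to the initial value
            shift = k
    return k, "".join(out)

key, cipher = latin_djokovic_encrypt("attackatdawn")
print(key, cipher)                           # deciphering reverses the same shift schedule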

Keywords: ciphering, deciphering, authentic, algorithm, polyalphabetic cipher, random key, methods comparison

Procedia PDF Downloads 82
3540 6-Degree-Of-Freedom Spacecraft Motion Planning via Model Predictive Control and Dual Quaternions

Authors: Omer Burak Iskender, Keck Voon Ling, Vincent Dubanchet, Luca Simonini

Abstract:

This paper presents a Guidance and Control (G&C) strategy to approach and synchronize with potentially rotating targets. The proposed strategy generates and tracks a safe trajectory for space servicing missions, including tasks like approaching, inspecting, and capturing. The main objective of this paper is to validate the G&C laws using a Hardware-In-the-Loop (HIL) setup with realistic rendezvous and docking equipment. Throughout this work, the assumption of full relative state feedback is relaxed by using onboard sensors that introduce realistic errors and delays, and the proposed closed-loop approach demonstrates robustness to this challenge. Moreover, the G&C blocks are unified via the Model Predictive Control (MPC) paradigm, and the coupling between translational and rotational motion is addressed via a dual quaternion based kinematic description. In this work, G&C is formulated as a convex optimization problem in which constraints such as thruster limits and output constraints are handled explicitly. Furthermore, the Monte Carlo method is used to evaluate the robustness of the proposed method to initial condition errors, uncertainty in the target's motion and attitude, and actuator errors. A capture scenario is tested on a robotic test bench with onboard sensors that estimate the position and orientation of a drifting satellite through camera imagery. Finally, the approach is compared with currently used robust H-infinity controllers and with the guidance profile provided by the industrial partner. The HIL experiments demonstrate that the proposed strategy is a potential candidate for future space servicing missions because 1) the algorithm is real-time implementable, as convex programming offers deterministic convergence properties and guarantees a finite-time solution, 2) critical physical and output constraints are respected, 3) robustness to sensor errors and uncertainties in the system is proven, and 4) it couples translational motion with rotational motion.

Keywords: dual quaternion, model predictive control, real-time experimental test, rendezvous and docking, spacecraft autonomy, space servicing

Procedia PDF Downloads 124
3539 Multiple Fault Diagnosis in Digital Circuits using Critical Path Tracing and Enhanced Deduction Algorithm

Authors: Mohamed Mahmoud

Abstract:

This paper develops an effect-cause analysis technique for fault diagnosis in digital circuits. The core of our technique is the Enhanced Deduction Algorithm, which processes the real response of the circuit under test (CUT) to the applied test set T in order to deduce the values of the internal lines. An experimental version of the algorithm has been implemented in C++; the code is about 7592 lines. The internal values are determined based on the logic values under the permanent stuck-at fault model. A backtracking strategy guarantees that the actual values are covered by at least one solution, or that no solution is found.

Keywords: enhanced deduction algorithm, backtracking strategy, automatic test equipment, verification

Procedia PDF Downloads 101
3538 Travel Behavior Simulation of Bike-Sharing System Users in Kaohsiung City

Authors: Hong-Yi Lin, Feng-Tyan Lin

Abstract:

In a bike-sharing system (BSS), users can easily rent bikes from any station in the city for mid-range or short-range trips. A BSS can also be integrated with other types of transport systems, especially green transportation systems such as rail transport and buses. Since a BSS records the time and place of each pickup and return, the operational data can reflect a more authentic and dynamic picture of user behavior. Furthermore, the land uses around docking stations are highly associated with the origins and destinations of BSS users. As urban researchers, our concern is to take the BSS into consideration during the urban planning process and enhance the quality of urban life. This research focuses on simulating the travel behavior of BSS users in Kaohsiung. First, rules of user behavior were derived by analyzing operational data and the land use patterns near docking stations. Then, integrating the Monte Carlo method, these rules were embedded into a travel behavior simulation model implemented in NetLogo, an agent-based modeling tool. The simulation model allows us to foresee the rent-return behavior of the BSS in order to choose potential locations for docking stations. It can also provide insights and recommendations for planning and policies for the future BSS.
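
The rule-based Monte Carlo core of such a simulation can be sketched in a few lines of Python: each simulated user samples an origin station and a destination from probabilities that, in the real model, would be derived from the BSS operational data and the surrounding land use. The station names and probabilities below are placeholders, not Kaohsiung data.

import random

stations = ["Central Park", "MRT Station", "University", "Harbor"]
origin_prob = [0.4, 0.3, 0.2, 0.1]          # would come from observed pickups
dest_prob = {                                # would come from observed origin-destination pairs
    "Central Park": [0.0, 0.5, 0.3, 0.2],
    "MRT Station":  [0.4, 0.0, 0.4, 0.2],
    "University":   [0.3, 0.5, 0.0, 0.2],
    "Harbor":       [0.3, 0.4, 0.3, 0.0],
}

def simulate_trips(n_users):
    counts = {}
    for _ in range(n_users):
        o = random.choices(stations, weights=origin_prob)[0]
        d = random.choices(stations, weights=dest_prob[o])[0]
        counts[(o, d)] = counts.get((o, d), 0) + 1
    return counts

for (o, d), n in sorted(simulate_trips(10000).items(), key=lambda kv: -kv[1])[:5]:
    print(f"{o} -> {d}: {n} trips")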

Keywords: agent-based model, bike-sharing system, BSS operational data, simulation

Procedia PDF Downloads 309
3537 Performance of the New Laboratory-Based Algorithm for HIV Diagnosis in Southwestern China

Authors: Yanhua Zhao, Chenli Rao, Dongdong Li, Chuanmin Tao

Abstract:

The Chinese Center for Disease Control and Prevention (CCDC) issued a new laboratory-based algorithm for HIV diagnosis in April 2016, which initially screens with a combined HIV-1/HIV-2 antigen/antibody fourth-generation immunoassay (IA), followed, when reactive, by an HIV-1/HIV-2 undifferentiated antibody IA in duplicate. Reactive specimens with concordant results undergo supplemental tests with western blots or HIV-1 nucleic acid tests (NATs), while specimens with discordant results receive HIV-1 NATs, p24 antigen tests, or 2-4 week follow-up tests. However, little data evaluating the application of the new algorithm have been reported to date. This study evaluated the performance of the new laboratory-based HIV diagnostic algorithm in an inpatient population of Southwest China over its initial 6 months, compared with the old algorithm. Plasma specimens collected from inpatients from May 1, 2016, to October 31, 2016, were submitted to the laboratory and screened for HIV infection by both the new testing algorithm and the old version. The sensitivity and specificity of the algorithms and the differences in the categorized numbers of plasma specimens were calculated. Under the new algorithm, 170 of the 52 749 plasma specimens were confirmed as HIV-positive (0.32%). The sensitivity and specificity of the new algorithm were 100% (170/170) and 100% (52 579/52 579), respectively, while 167 HIV-1-positive specimens were identified by the old algorithm, with a sensitivity of 98.24% (167/170) and a specificity of 100% (52 579/52 579). Three acute HIV-1 infections (AHIs) and two early HIV-1 infections (EHIs) were identified by the new algorithm; the former were missed by the old procedure. Compared with the old version, the new algorithm produced fewer WB-indeterminate results (2 vs. 16, p = 0.001), which led to fewer follow-up tests. Therefore, the new HIV testing algorithm is more sensitive for detecting acute HIV-1 infections while maintaining the ability to verify established HIV-1 infections, and it can dramatically decrease the number of WB-indeterminate specimens.
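
The screening flow described above can be summarised as a small decision function. The Python sketch below only illustrates the published sequence of tests, with hypothetical argument names; it is not a substitute for the official CCDC procedure.

def new_algorithm(fourth_gen_reactive, duplicate_ab_reactive=None,
                  supplemental_positive=None, nat_positive=None):
    """Return a triage decision for one specimen."""
    if not fourth_gen_reactive:                      # screening Ag/Ab fourth-generation IA
        return "HIV negative"
    if duplicate_ab_reactive:                        # both antibody IA repeats reactive (concordant)
        if supplemental_positive:                    # western blot or HIV-1 NAT
            return "HIV-1 infection confirmed"
        return "supplemental test needed"
    # discordant results after a reactive screen: possible acute infection
    if nat_positive:
        return "acute HIV-1 infection (NAT positive)"
    return "HIV-1 NAT / p24 antigen or 2-4 week follow-up"

print(new_algorithm(True, duplicate_ab_reactive=False, nat_positive=True))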

Keywords: algorithm, diagnosis, HIV, laboratory

Procedia PDF Downloads 380
3536 Approximate Similarity Measurement of Web Sites Using Genetic Algorithms and Binary Trees

Authors: Doru Anastasiu Popescu, Dan Rădulescu

Abstract:

In this paper, we determine the similarity of two HTML web applications. We use a genetic algorithm to determine the most significant web pages of each application (not every web page of a site is used). Using these significant web pages, we find the similarity value between the two applications. The algorithm is efficient because it uses a reduced number of web pages for the comparisons, but it returns an approximate value of the similarity. Binary trees are used to store the tags from the significant pages. The algorithm was implemented in Java.

Keywords: Tag, HTML, web page, genetic algorithm, similarity value, binary tree

Procedia PDF Downloads 337
3535 Optimal Sizing and Placement of Distributed Generators for Profit Maximization Using Firefly Algorithm

Authors: Engy Adel Mohamed, Yasser Gamal-Eldin Hegazy

Abstract:

This paper presents a firefly-based algorithm for the optimal sizing and allocation of distributed generators for profit maximization. The distributed generators in the proposed algorithm are of photovoltaic and combined heat and power technologies. Combined heat and power distributed generators are modeled as voltage-controlled nodes, while photovoltaic distributed generators are modeled as constant power nodes. The proposed algorithm is implemented in the MATLAB environment and tested on the unbalanced IEEE 37-node feeder. The results show the effectiveness of the proposed algorithm in the optimal selection of distributed generator size and site in order to maximize the total system profit.

Keywords: distributed generators, firefly algorithm, IEEE 37-node feeder, profit maximization

Procedia PDF Downloads 419
3534 A Parallel Implementation of Artificial Bee Colony Algorithm within CUDA Architecture

Authors: Selcuk Aslan, Dervis Karaboga, Celal Ozturk

Abstract:

The Artificial Bee Colony (ABC) algorithm is one of the most successful swarm intelligence based metaheuristics. It has been applied to a number of constrained and unconstrained numerical and combinatorial optimization problems. In this paper, we present a parallelized version of the ABC algorithm by adapting the employed and onlooker bee phases to the Compute Unified Device Architecture (CUDA) platform, a graphics processing unit (GPU) programming environment by NVIDIA. The execution speed and results of the proposed approach and of the sequential version of the ABC algorithm are compared on functions typically used as benchmarks for optimization algorithms. Tests on standard benchmark functions with different colony sizes and numbers of parameters showed that the proposed parallelization approach decreases the total execution time consumed by the employed and onlooker bee phases and achieves similar or better quality of results compared to the standard sequential implementation of the ABC algorithm.

Keywords: Artificial Bee Colony algorithm, GPU computing, swarm intelligence, parallelization

Procedia PDF Downloads 353
3533 Fast Prediction Unit Partition Decision and Accelerating the Algorithm Using CUDA for Intra and Inter Prediction of HEVC

Authors: Qiang Zhang, Chun Yuan

Abstract:

Since the PU (Prediction Unit) decision process is the most time-consuming part of the emerging HEVC (High Efficiency Video Coding) standard in intra- and inter-frame coding, this paper proposes a fast PU decision algorithm and speeds it up using CUDA (Compute Unified Device Architecture). In intra-frame coding, the fast PU decision algorithm uses texture features to skip intra-frame prediction or to terminate the intra-frame prediction for smaller PU sizes. In inter-frame coding, the fast PU decision algorithm exploits the similarity of the motion vectors of a PU's own two Nx2N-size PUs and the hierarchical structure of the CU (Coding Unit) partition to skip some PU partition modes, thereby reducing the number of motion estimations. The accelerated algorithm using CUDA is based on the fast PU decision algorithm and uses the GPU so that the motion search and the gradient computation can be carried out in parallel. The proposed algorithm achieves up to 57% time saving compared to HM 10.0 with little rate-distortion loss (0.043 dB drop and 1.82% bitrate increase on average).

Keywords: HEVC, PU decision, inter prediction, intra prediction, CUDA, parallel

Procedia PDF Downloads 379
3532 Off-Grid Sparse Inverse Synthetic Aperture Imaging by Basis Shift Algorithm

Authors: Mengjun Yang, Zhulin Zong, Jie Gao

Abstract:

In this paper, a new and robust algorithm is proposed to achieve high resolution for inverse synthetic aperture radar (ISAR) imaging in the compressive sensing (CS) framework. Traditional CS-based methods have to assume that the unknown scatterers lie exactly on the pre-divided grid; otherwise, their reconstruction performance drops significantly. In the proposed algorithm, several basis shifts are utilized to achieve the same effect as grid refinement. The detailed implementation of the basis shift algorithm is presented in this paper. The simulations show that the basis shift algorithm improves the imaging precision, and the effectiveness and feasibility of the proposed method are confirmed by the simulation results.

Keywords: ISAR imaging, sparse reconstruction, off-grid, basis shift

Procedia PDF Downloads 244
3531 Explicit Numerical Approximations for a Pricing Weather Derivatives Model

Authors: Clarinda V. Nhangumbe, Ercília Sousa

Abstract:

Weather derivatives are financial instruments used to cover non-catastrophic weather events and can take the form of standard (plain vanilla) products or structured (exotic) products. The underlying asset, in this case, is a weather index, such as temperature, rainfall, humidity, wind, or snowfall. The complexity of the weather derivatives structure exposes the weaknesses of the Black-Scholes framework. Therefore, under the risk-neutral probability measure, the option price of a weather contract can be given as the unique solution of a two-dimensional partial differential equation (parabolic in one direction and hyperbolic in the other), with an initial condition and subject to adequate boundary conditions. To calculate the price of the option, one can use numerical methods such as Monte Carlo simulations or implicit finite difference schemes combined with semi-Lagrangian methods. This paper proposes two explicit methods, namely, first-order upwind in the hyperbolic direction combined with Lax-Wendroff in the parabolic direction, and first-order upwind in the hyperbolic direction combined with second-order upwind in the parabolic direction. One of the advantages of these methods is that they take into consideration the boundary conditions obtained from the financial interpretation and deal efficiently with the different choices of the convection coefficients.

Keywords: incomplete markets, numerical methods, partial differential equations, stochastic process, weather derivatives

Procedia PDF Downloads 71
3530 Distributed Coverage Control by Robot Networks in Unknown Environments Using a Modified EM Algorithm

Authors: Mohammadhosein Hasanbeig, Lacra Pavel

Abstract:

In this paper, we study a distributed control algorithm for the problem of unknown-area coverage by a network of robots. The coverage objective is to locate a set of targets in the area and to minimize the robots' energy consumption. The robots have no prior knowledge about the location or the number of targets in the area. One efficient approach that can be used to compensate for the robots' lack of knowledge is to incorporate an auxiliary learning algorithm into the control scheme. A learning algorithm allows the robots to explore and study the unknown environment and eventually overcome their lack of knowledge. The control algorithm itself is modeled based on game theory, where the networked robots use their collective information to play a non-cooperative potential game. The algorithm is tested via simulations to verify its performance and adaptability.

Keywords: distributed control, game theory, multi-agent learning, reinforcement learning

Procedia PDF Downloads 433
3529 Optimal Planning of Dispatchable Distributed Generators for Power Loss Reduction in Unbalanced Distribution Networks

Authors: Mahmoud M. Othman, Y. G. Hegazy, A. Y. Abdelaziz

Abstract:

This paper proposes a novel heuristic algorithm that aims to determine the best size and location of distributed generators in unbalanced distribution networks. The proposed heuristic algorithm can deal with planning cases in which power loss is to be optimized without violating the system's practical constraints. The distributed generation units in the proposed algorithm are modeled as voltage-controlled nodes with the flexibility to be converted to constant power factor nodes in case of reactive power limit violation. The proposed algorithm is implemented in MATLAB and tested on the IEEE 37-node feeder. The results obtained show the effectiveness of the proposed algorithm.

Keywords: distributed generation, heuristic approach, optimization, planning

Procedia PDF Downloads 498
3528 A Case Study on the Numerical-Probability Approach for Deep Excavation Analysis

Authors: Komeil Valipourian

Abstract:

Urban development and the growing need for infrastructure have increased the importance of deep excavations. In this study, after introducing probability analysis as an important issue, an attempt has been made to apply it to the deep excavation project of the Bangkok Metro as a case study. For this purpose, a numerical probability model was developed based on the finite difference method and a Monte Carlo sampling approach. The results indicate that disregarding probability in this project would result in an inappropriate design of the retaining structure. Therefore, a probabilistic redesign of the support is proposed and carried out as one application of probability analysis. A 50% reduction in the flexural strength of the structure increases the failure probability by only 8% within the allowable range and helps improve economic conditions while maintaining mechanical efficiency. Given the lack of efficient design in most deep excavations, an attempt was made, by considering geometrical and geotechnical variability, to develop an optimum practical design standard for deep excavations based on failure probability. On this basis, a practical relationship is presented for estimating the maximum allowable horizontal displacement, which can help improve design conditions without carrying out the full probability analysis.
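
The Monte Carlo part of such an analysis reduces to sampling the variable parameters, evaluating the response model, and counting realisations that exceed the allowable displacement. The Python sketch below does this with a deliberately trivialised response model; all distributions and numbers are placeholders, not values from the Bangkok case study.

import numpy as np

rng = np.random.default_rng(42)
n_samples = 100_000
allowable_disp_mm = 50.0

# variable soil and structural parameters (placeholder probabilistic models)
soil_modulus = rng.lognormal(mean=np.log(40.0), sigma=0.2, size=n_samples)   # MPa
wall_stiffness = rng.normal(loc=1.0, scale=0.1, size=n_samples)              # normalised EI

# trivialised response: displacement grows as the soil softens and the wall gets more flexible
max_disp_mm = 1800.0 / (soil_modulus * wall_stiffness)

p_failure = np.mean(max_disp_mm > allowable_disp_mm)
print(f"estimated failure probability: {p_failure:.4f}")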

Keywords: numerical probability modeling, deep excavation, allowable maximum displacement, finite difference method (FDM)

Procedia PDF Downloads 105
3527 A Flexible Bayesian State-Space Modelling for Population Dynamics of Wildlife and Livestock Populations

Authors: Sabyasachi Mukhopadhyay, Joseph Ogutu, Hans-Peter Piepho

Abstract:

We aim to model the dynamics of wildlife and pastoral livestock populations in order to understand their population changes and hence to support wildlife conservation and human welfare. The study is motivated by age- and sex-structured population counts in different regions of the Serengeti-Mara during the period 1989-2003. Developing reliable and realistic models for the population dynamics of large herbivore populations can be a very complex and challenging exercise. However, the Bayesian statistical domain offers flexible computational methods that enable the development and efficient implementation of complex population dynamics models. In this work, we have used a novel Bayesian state-space model to analyse the dynamics of the topi and hartebeest populations in the Serengeti-Mara ecosystem of East Africa. The state-space model involves survival probabilities of the animals, which further depend on various factors, such as monthly rainfall and habitat size, that have caused recent declines in the numbers of these herbivore populations and potentially threaten their future viability in the ecosystem. Our study shows that seasonal rainfall is the most important factor shaping the population sizes of the animals and indicates the age class most severely affected by changes in weather conditions.

Keywords: bayesian state-space model, Markov Chain Monte Carlo, population dynamics, conservation

Procedia PDF Downloads 183
3526 Autonomous Kuka Youbot Navigation Based on Machine Learning and Path Planning

Authors: Carlos Gordon, Patricio Encalada, Henry Lema, Diego Leon, Dennis Chicaiza

Abstract:

The following work presents a proposal for the autonomous navigation of mobile robots, implemented on an omnidirectional Kuka Youbot. We have integrated the Robot Operating System (ROS) and machine learning algorithms. Two ROS distributions are used, ROS Hydro and ROS Kinetic. ROS Hydro manages the nodes for odometry, kinematics, and path planning with statistical and probabilistic, global and local algorithms based on Adaptive Monte Carlo Localization (AMCL) and Dijkstra. Meanwhile, ROS Kinetic is responsible for the detection of dynamic objects that may lie on the points of the planned trajectory and obstruct the path of the Kuka Youbot. The detection is managed by an artificial vision module based on a neural network trained with the single shot multibox detector (SSD) system, where the main dynamic objects to be detected are human beings and domestic animals, among others. When objects are detected, the system modifies the trajectory or waits for the dynamic obstacle to move. Finally, the obstacles are avoided along the planned trajectory, and the Kuka Youbot can reach its goal thanks to the machine learning algorithms.

Keywords: autonomous navigation, machine learning, path planning, robotic operative system, open source computer vision library

Procedia PDF Downloads 154
3525 Block Based Imperial Competitive Algorithm with Greedy Search for Traveling Salesman Problem

Authors: Meng-Hui Chen, Chiao-Wei Yu, Pei-Chann Chang

Abstract:

The imperial competitive algorithm (ICA) simulates a multi-agent process. Each agent is like a kingdom that has its own countries; the strongest country in each agent is called the imperialist, and the others are colonies. Countries compete with the imperialist of the same kingdom by evolving, and a country may move in the search space to find better solutions with higher fitness and become the new imperialist. The main idea of this paper is to use this peculiarity of ICA to explore the search space and solve combinatorial problems. In addition, we study the use of greedy search to increase the local search ability. To verify the proposed algorithm, experiments are carried out on the traveling salesman problem (TSP) using instances from the traveling salesman problem library (TSPLIB). The results show that the proposed algorithm achieves higher performance than other known methods.

Keywords: traveling salesman problem, artificial chromosomes, greedy search, imperial competitive algorithm

Procedia PDF Downloads 443
3524 The Diffusion of Membrane Nanodomains with Specific Ganglioside Composition

Authors: Barbora Chmelova, Radek Sachl

Abstract:

Gangliosides are amphipathic membrane lipids. Owing to their bulky oligosaccharide chains, containing one or more sialic acids linked to a hydrophobic ceramide base, gangliosides are classified among the glycosphingolipids. This unique structure gives gangliosides a high self-aggregating tendency and, therefore, leads to the formation of nanoscopic clusters called nanodomains. Gangliosides are preferentially present in the extracellular membrane leaflet of all human tissues and thus have an impact on a huge number of biological processes, such as intercellular communication, cell signalling, membrane trafficking, and regulation of receptor activity. Defects in their metabolism, impairment of proper ganglioside function, or changes in their organization lead to serious health conditions such as Alzheimer's and Parkinson's diseases, autoimmune diseases, tumour growth, etc. This work mainly focuses on the organization of gangliosides into nanodomains and their dynamics within the plasma membrane. Current research investigates the static characterization of ganglioside nanodomains; nevertheless, information about their diffusion is missing. In our study, fluorescence correlation spectroscopy is combined with stimulated emission depletion (STED-FCS), which unites diffraction-unlimited spatial resolution with high temporal resolution. By comparing experiments performed on model vesicles containing 4% of either GM1, GM2, or GM3 with Monte Carlo simulations of diffusion on the plasma membrane, we describe ganglioside clustering, the diffusion of nanodomains, and even the diffusion of ganglioside molecules inside the investigated nanodomains.

Keywords: gangliosides, nanodomains, STED-FCS, fluorescence microscopy, membrane diffusion

Procedia PDF Downloads 60
3523 Genetic Algorithm for Bi-Objective Hub Covering Problem

Authors: Abbas Mirakhorli

Abstract:

A hub covering problem is a type of hub location problem that tries to maximize the coverage area with the least number of installed hubs. There have not been many studies in the literature on multi-objective hub covering location problems. Thus, in this paper, a bi-objective model for the hub covering problem is presented. The two objectives considered are the minimization of total transportation costs and the maximization of the coverage of origin-destination nodes. A genetic algorithm is presented to solve the model as the number of nodes increases. The genetic algorithm is capable of solving the model when the number of nodes increases beyond 20, and it solves the model in less time.

Keywords: facility location, hub covering, multi-objective optimization, genetic algorithm

Procedia PDF Downloads 36