Search results for: optimization/inverse mapping
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4662

3882 Energy Benefits of Urban Platooning with Self-Driving Vehicles

Authors: Eduardo F. Mello, Peter H. Bauer

Abstract:

The primary focus of this paper is the generation of energy-optimal speed trajectories for heterogeneous electric vehicle platoons in urban driving conditions. Optimal speed trajectories are generated for individual vehicles and for an entire platoon under the assumption that they can be executed without errors, as would be the case for self-driving vehicles. It is then shown that optimizing for the “average vehicle in the platoon” generates transportation energy savings similar to optimizing speed trajectories for each vehicle individually. The introduced approach only requires the lead vehicle to run the optimization software, while the remaining vehicles need only adaptive cruise control capability. The achieved energy savings are typically between 30% and 50% for stop-to-stop segments in cities. The prime motivation for urban platooning is that platoons efficiently utilize the available space, and minimizing transportation energy in cities is important for environmental, power, and range reasons.

Keywords: electric vehicles, energy efficiency, optimization, platooning, self-driving vehicles, urban traffic

Procedia PDF Downloads 175
3881 Design-Analysis and Optimization of 10 MW Permanent Magnet Surface Mounted Off-Shore Wind Generator

Authors: Mamidi Ramakrishna Rao, Jagdish Mamidi

Abstract:

With advancing technology, the market environment for wind power generation systems has become highly competitive. The industry has been moving towards higher wind generator power ratings, in particular off-shore generator ratings. Current off-shore wind turbine generators are in the power range of 10 to 12 MW. Unlike traditional induction motors, slow-speed permanent magnet surface mounted (PMSM) high-power generators are relatively challenging and designed differently. In this paper, PMSM generator design features are discussed and analysed, with the focus on armature windings, harmonics, and the permanent magnets. For the power ratings under consideration, the generator air-gap diameters are in the range of 8 to 10 meters, and the active material weighs ~60 tons or more; material weight therefore becomes one of the critical parameters. The Particle Swarm Optimization (PSO) technique is used for weight reduction and performance improvement. Four independent design variables are considered: air gap diameter, stack length, magnet thickness, and winding current density. To account for core and teeth saturation, prevent demagnetization due to short-circuit armature currents, and maintain minimum efficiency, suitable penalty functions are applied. To verify performance, a detailed analysis and 2D flux plotting are done for the optimized design.
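
A minimal sketch of the penalty-function PSO loop described above, over the four design variables named in the abstract; the weight model, bounds, penalty limits, and PSO constants are illustrative assumptions, not the authors' generator model:

```python
import numpy as np

rng = np.random.default_rng(0)

def weight(x):
    # Toy stand-in for the active-material weight model (kg).
    d_gap, l_stk, t_mag, j_cu = x  # air-gap dia. (m), stack (m), magnet (m), A/mm^2
    return 7.7e3 * np.pi * d_gap * l_stk * 0.05 + 7.5e3 * np.pi * d_gap * l_stk * t_mag

def penalized_cost(x):
    cost = weight(x)
    eff = 0.98 - 0.002 * x[3]               # hypothetical efficiency vs current density
    cost += 1e6 * max(0.0, 0.955 - eff)     # penalty: efficiency floor violated
    cost += 1e6 * max(0.0, 0.004 - x[2])    # penalty: demagnetization margin violated
    return cost

lo = np.array([8.0, 1.0, 0.004, 2.0])       # lower bounds of the design variables
hi = np.array([10.0, 2.0, 0.02, 6.0])       # upper bounds

n, iters = 30, 200
x = rng.uniform(lo, hi, (n, 4))
v = np.zeros((n, 4))
pbest = x.copy()
pcost = np.array([penalized_cost(p) for p in x])
gbest = pbest[pcost.argmin()]

for _ in range(iters):
    r1, r2 = rng.random((n, 4)), rng.random((n, 4))
    v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
    x = np.clip(x + v, lo, hi)
    c = np.array([penalized_cost(p) for p in x])
    better = c < pcost
    pbest[better], pcost[better] = x[better], c[better]
    gbest = pbest[pcost.argmin()]

print("optimized design:", gbest.round(4), "weight (kg):", round(weight(gbest), 1))
```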

Keywords: offshore wind generator, PMSM, PSO optimization, design optimization

Procedia PDF Downloads 141
3880 Consideration of Uncertainty in Engineering

Authors: A. Mohammadi, M. Moghimi, S. Mohammadi

Abstract:

Engineers need computational methods that provide solutions less sensitive to environmental effects, so techniques should be used that take uncertainty into account to control and minimize the risk associated with design and operation. To consider uncertainty in an engineering problem, the optimization problem should be solved over a suitable range of each uncertain input variable instead of just one estimated point. With deterministic optimization, a large computational burden is required to consider every possible and probable combination of uncertain input variables. Several methods have been reported in the literature to deal with problems under uncertainty. In this paper, these methods are presented and analyzed.
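
A minimal sketch of the contrast drawn above between a single-point deterministic evaluation and a Monte Carlo evaluation over an uncertain input; the objective and the load distribution are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

def objective(x, load):
    # Toy design objective depending on a design variable x and an input load.
    return (x - 2.0) ** 2 + 0.5 * load * x

x = 1.5                                       # candidate design
nominal_load = 1.0
print("deterministic cost:", objective(x, nominal_load))

# Treat the load as uncertain (here: normal around its estimate) and look at
# the distribution of outcomes instead of a single point evaluation.
loads = rng.normal(nominal_load, 0.3, size=10_000)
costs = objective(x, loads)
print("mean cost:", costs.mean(), "95th percentile:", np.percentile(costs, 95))
```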

Keywords: uncertainty, Monte Carlo simulation, stochastic programming, scenario method

Procedia PDF Downloads 404
3879 Fault Diagnosis of Manufacturing Systems Using AntTreeStoch with Parameter Optimization by ACO

Authors: Ouahab Kadri, Leila Hayet Mouss

Abstract:

In this paper, we present three diagnostic modules for complex and dynamic systems. These modules are based on three ant colony algorithms: AntTreeStoch, Lumer & Faieta, and Binary Ant Colony. We chose these algorithms for their simplicity and wide application range. However, we cannot use these algorithms in their basic forms, as they have several limitations, so to use them in a diagnostic system we have proposed three variants. We have tested these algorithms on datasets obtained from two industrial systems: a clinkering system and a pasteurization system.

Keywords: ant colony algorithms, complex and dynamic systems, diagnosis, classification, optimization

Procedia PDF Downloads 294
3878 The Optimization of Decision Rules in Multimodal Decision-Level Fusion Scheme

Authors: Andrey V. Timofeev, Dmitry V. Egorov

Abstract:

This paper introduces an original method for parametric optimization of the structure of a multimodal decision-level fusion scheme, which combines the partial classification results obtained from an assembly of mono-modal classifiers. As a result, a multimodal fusion classifier with the minimum total error rate has been obtained.
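
A minimal sketch of decision-level fusion with a weight tuned to minimize the total error rate; the two mono-modal classifier scores and labels are synthetic stand-ins:

```python
import numpy as np

rng = np.random.default_rng(2)
y = rng.integers(0, 2, 500)                       # ground-truth class labels
s1 = y + rng.normal(0, 0.8, 500)                  # modality-1 classifier score
s2 = y + rng.normal(0, 1.2, 500)                  # modality-2 score (noisier)

def total_error(w):
    # Fuse scores at decision level and measure the total error rate.
    fused = w * s1 + (1 - w) * s2
    return np.mean((fused > 0.5).astype(int) != y)

weights = np.linspace(0, 1, 101)
errors = [total_error(w) for w in weights]
best = weights[int(np.argmin(errors))]
print(f"best weight for modality 1: {best:.2f}, total error rate: {min(errors):.3f}")
```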

Keywords: classification accuracy, fusion solution, total error rate, multimodal fusion classifier

Procedia PDF Downloads 458
3877 Drought Detection and Water Stress Impact on Vegetation Cover Sustainability Using Radar Data

Authors: E. Farg, M. M. El-Sharkawy, M. S. Mostafa, S. M. Arafat

Abstract:

Mapping water stress provides important baseline data for sustainable agriculture. Recent developments in Sentinel-1 data allow the acquisition of high-resolution images with varied polarization capabilities. This study was conducted to detect and quantify vegetation water content from canopy backscatter, extracting spatial information to support drought mapping activities throughout newly reclaimed sandy soils in the western Nile delta, Egypt. The performance of radar imagery in agriculture strongly depends on the sensor's polarization capability. The dual-polarization capability of Sentinel-1 improves the ability to detect water stress, and the backscatter from structural components improves the identification and separation of vegetation types with various canopy structures from other features. The fieldwork data allowed identification of water stress zones based on land cover structure; these classes were used to produce a harmonized water stress map. The analysis techniques used and the results show the high capability of active sensor data in water stress mapping and monitoring, especially when integrated with multi-spectral medium-resolution images. Areas cropped under subsoil drip irrigation showed less drought and water stress than those under center-pivot sprinkler irrigation, which points to a high level of evaporation from the soil surface in the initial growth stages. Results show a strong relationship between vegetation indices, such as the Normalized Difference Vegetation Index (NDVI), and the observed radar backscattering. In addition, observational evidence showed that radar backscatter is highly sensitive to vegetation water stress and has substantial potential for monitoring and detecting drought in vegetative cover.

Keywords: canopy backscatter, drought, polarization, NDVI

Procedia PDF Downloads 137
3876 Pavement Maintenance and Rehabilitation Scheduling Using a Genetic Algorithm-Based Multi-Objective Optimization Technique

Authors: Ashwini Gowda K. S, Archana M. R, Anjaneyappa V

Abstract:

This paper presents a pavement maintenance and management system (PMMS) to obtain optimum pavement maintenance and rehabilitation strategies and maintenance scheduling for a network using a multi-objective genetic algorithm (MOGA). The optimal maintenance and rehabilitation strategy maximizes the pavement condition index of the road sections in a network with minimum maintenance and rehabilitation cost during the planning period. In this paper, NSGA-II is applied to perform the maintenance optimization; this maintenance approach is expected to preserve and improve the existing condition of the highway network in a cost-effective way. The proposed PMMS is applied to a network whose pavements are assessed using the pavement condition index (PCI). The maximum and minimum maintenance costs for a planning period of 20 years obtained from the non-dominated solutions were found to be 5.190x10¹⁰ ₹ and 4.81x10¹⁰ ₹, respectively.
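
A minimal sketch of the non-dominated sorting at the heart of NSGA-II-style selection, applied to (cost, PCI) pairs; the candidate maintenance plans are synthetic stand-ins:

```python
import numpy as np

rng = np.random.default_rng(3)
cost = rng.uniform(4.0e10, 6.0e10, 50)           # planning-period M&R cost (₹)
pci = rng.uniform(40, 95, 50)                    # resulting network condition index

objs = np.column_stack([cost, -pci])             # both objectives to be minimized

def non_dominated(points):
    # Keep points for which no other point is better in all objectives.
    keep = []
    for i, p in enumerate(points):
        dominated = any(np.all(q <= p) and np.any(q < p)
                        for j, q in enumerate(points) if j != i)
        if not dominated:
            keep.append(i)
    return keep

front = non_dominated(objs)
print("Pareto-optimal plans (cost ₹, PCI):")
for i in front:
    print(f"  {cost[i]:.3e}  {pci[i]:.1f}")
```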

Keywords: genetic algorithm, maintenance and rehabilitation, optimization technique, pavement condition index

Procedia PDF Downloads 141
3875 Hyperspectral Mapping Methods for Differentiating Mangrove Species along Karachi Coast

Authors: Sher Muhammad, Mirza Muhammad Waqar

Abstract:

It is necessary to monitor and identify mangrove types and their spatial extent near coastal areas because they play an important role in the coastal ecosystem and environmental protection. This research aims at identifying and mapping mangrove types along the Karachi coast, ranging from 24.79 to 24.85 degrees in latitude and 66.91 to 66.97 degrees in longitude, using hyperspectral remote sensing data and techniques. An image acquired in February 2012 by the Hyperion sensor has been used for this research. Image preprocessing includes geometric and radiometric correction followed by Minimum Noise Fraction (MNF) and Pixel Purity Index (PPI). The output of MNF and PPI has been analyzed by visualizing it in n dimensions for endmember extraction. Well-distributed clusters on the n-dimensional scatter plot have been selected with the region of interest (ROI) tool as endmembers. These endmembers have been used as input to the classification techniques applied to identify and map mangrove species, including Spectral Angle Mapper (SAM), Spectral Feature Fitting (SFF), and Spectral Information Divergence (SID). Only two types of mangroves, namely Avicennia marina (white mangrove) and Avicennia germinans (black mangrove), have been observed throughout the study area.
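
A minimal sketch of the SAM classification rule used above: each pixel is assigned to the endmember with the smallest spectral angle. The endmember spectra and the pixel are synthetic stand-ins:

```python
import numpy as np

rng = np.random.default_rng(4)
white_mangrove = rng.random(196)                  # Hyperion-like 196-band endmember
black_mangrove = rng.random(196)
endmembers = np.stack([white_mangrove, black_mangrove])

def spectral_angle(pixel, ref):
    # Angle between two spectra, invariant to illumination scaling.
    cos = pixel @ ref / (np.linalg.norm(pixel) * np.linalg.norm(ref))
    return np.arccos(np.clip(cos, -1.0, 1.0))     # radians

pixel = white_mangrove + rng.normal(0, 0.05, 196) # noisy observation of class 0
angles = [spectral_angle(pixel, e) for e in endmembers]
print("angles (rad):", np.round(angles, 3), "-> class", int(np.argmin(angles)))
```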

Keywords: mangrove, hyperspectral, hyperion, SAM, SFF, SID

Procedia PDF Downloads 354
3874 Linkage between Trace Element Distribution and Growth Ring Formation in Japanese Red Coral (Paracorallium japonicum)

Authors: Luan Trong Nguyen, M. Azizur Rahman, Yusuke Tamenori, Toshihiro Yoshimura, Nozomu Iwasaki, Hiroshi Hasegawa

Abstract:

This study investigated the distribution of magnesium (Mg), phosphorus (P), sulfur (S) and strontium (Sr) using micro X-ray fluorescence (µ-XRF) along the annual growth rings in the skeleton of the Japanese red coral Paracorallium japonicum. The Mg, P and S distributions in µ-XRF mapping images correspond to the dark and light bands along the annual growth rings observed in microscopic images of the coral skeleton. The µ-XRF mapping data showed a positive correlation (r = 0.6) between the P and S distributions in the coral skeleton. A contrasting distribution pattern of S and Mg along the axial skeleton of P. japonicum indicates a weak negative correlation (r = -0.2) between these two trace elements. The distribution patterns of S, P and Mg reveal a linkage between their distributions and the formation of dark/light bands along the annual growth rings in the axial skeleton of P. japonicum. Sulfur and P were distributed in the organic-matrix-rich dark bands, while Mg was distributed in the light bands of the annual growth rings.

Keywords: µ-XRF, trace element, precious coral, Paracorallium japonicum

Procedia PDF Downloads 435
3873 Reflectance Imaging Spectroscopy Data (Hyperspectral) for Mineral Mapping in the Orientale Basin Region on the Moon Surface

Authors: V. Sivakumar, R. Neelakantan

Abstract:

Mineral mapping on the Moon's surface provides clues to understand the origin, evolution, stratigraphy and geological history of the Moon. Recently, reflectance imaging spectroscopy has played a significant role in identifying minerals on planetary surfaces in the visible to NIR region of the electromagnetic spectrum. The Moon Mineralogy Mapper (M3) onboard Chandrayaan-1 provides unprecedented spectral data of the lunar surface. Here, we used M3 sensor data (hyperspectral imaging spectroscopy) to analyse the mineralogy of the Orientale basin region on the Moon. Reflectance spectra were sampled from different locations of the basin, and the continuum was removed using the ENvironment for Visualizing Images (ENVI) software. Reflectance spectra of unknown mineral composition were compared with known Reflectance Experiment Laboratory (RELAB) spectra to discriminate mineralogy. Minerals such as olivine, low-Ca pyroxene (LCP), high-Ca pyroxene (HCP) and plagioclase were identified. In addition to these minerals, an unusual type of spectral signature was identified, which indicates probable Fe-Mg-spinel lithology in the basin region.

Keywords: chandrayaan-1, moon mineralogy mapper, mineral, mare orientale, moon

Procedia PDF Downloads 381
3872 Mapping the Relationship between Elements of Urban Morphology and Density of Crime

Authors: Fabio Salvador Aparecido Santos, Spencer Chainey, Richard Wortley

Abstract:

Urban morphology can be understood as the study of the physical form of cities through its elements. Crime, in turn, can be oversimplified as an action that breaks the rules established in a certain society. This study connects these two subjects through the relationship between elements of urban morphology and the density of crime occurrences. We consider that there is a research gap concerning the influence of urban features on crime occurrences using statistical methods and mapping techniques in Geographic Information Systems. The investigation comprises three main phases. The first phase involves examining how theoretical principles associated with urban morphology can be viewed in terms of their influence on crime patterns. The second phase involves the development of tools to model elements of urban morphology and to measure the relationship between these morphological elements and patterns of crime. The third phase involves determining the extent to which elements of the urban environment can contribute to crime reduction. Understanding the relationship between urban morphology and crime patterns in a Latin American context will help highlight the influence urban planning has on the crime problems that emerge in these settings, and how effectively urban planning can contribute to reducing crime.

Keywords: agent-based modelling, environmental criminology, geographic information system, urban morphology

Procedia PDF Downloads 122
3871 Design of an Intelligent Fire Detection System Based on Neural Network and Particle Swarm Optimization

Authors: Majid Arvan, Peyman Beygi, Sina Rokhsati

Abstract:

Timely detection of fire in buildings is of great importance. Employing intelligent methods for data processing in fire detection systems leads to a significant reduction of fire damage at the lowest cost. In this paper, the raw data obtained from fire detection sensor networks in buildings are processed using intelligent methods based on neural networks, and the likelihood of fire is predicted. To enhance the quality of the system, the noise in the sensor data is reduced by wavelet analysis and application of the SVD technique. Meanwhile, the proposed neural network is trained using particle swarm optimization (PSO). In the simulation work, the data are collected from a sensor network inside the room and applied to the proposed network. The outputs are then compared with a conventional MLP network. The simulation results demonstrate the superiority of the proposed method over the conventional one.
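
A minimal sketch of the SVD-based noise reduction step mentioned above: the noisy sensor series is embedded in a Hankel-type trajectory matrix, truncated to its dominant singular components, and reconstructed by anti-diagonal averaging. The signal shape, window length, and retained rank are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(14)
t = np.linspace(0, 10, 400)
clean = 0.01 * np.exp(0.2 * t)                    # slowly rising "fire" signature
noisy = clean + rng.normal(0, 0.02, t.size)

L = 100
H = np.lib.stride_tricks.sliding_window_view(noisy, L).T  # Hankel-like embedding
U, s, Vt = np.linalg.svd(H, full_matrices=False)
H_low = (U[:, :2] * s[:2]) @ Vt[:2]               # rank-2 approximation

# Anti-diagonal averaging maps the low-rank matrix back to a 1-D series.
denoised = np.zeros_like(noisy)
counts = np.zeros_like(noisy)
for i in range(H_low.shape[0]):
    denoised[i:i + H_low.shape[1]] += H_low[i]
    counts[i:i + H_low.shape[1]] += 1
denoised /= counts

print("noise std before/after:", np.std(noisy - clean).round(4),
      np.std(denoised - clean).round(4))
```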

Keywords: intelligent fire detection, neural network, particle swarm optimization, fire sensor network

Procedia PDF Downloads 375
3870 A Novel Approach of NPSO on Flexible Logistic (S-Shaped) Model for Software Reliability Prediction

Authors: Pooja Rani, G. S. Mahapatra, S. K. Pandey

Abstract:

In this paper, we propose a novel approach combining Neural Network and Particle Swarm Optimization methods for software reliability prediction. We first explain how to apply a compound function in a neural network so that we can derive a Flexible Logistic (S-shaped) Growth Curve (FLGC) model. This model mathematically represents software failure as a random process and can be used to evaluate software development status during testing. To avoid trapping in local minima, we apply the Particle Swarm Optimization method to train the proposed model using failure test data sets. We derive our proposed model using computational intelligence modeling; thus, the proposed model becomes a Neuro-Particle Swarm Optimization (NPSO) model. We test the model with different inertia weights for the particle and velocity updates, and select results based on the best inertia weight, compared with personal-best-oriented PSO (pPSO), which helps to choose the local best in the network neighborhood. The applicability of the proposed model is demonstrated through a real failure test data set. The results obtained from the experiments show that the proposed model has fairly accurate prediction capability for software reliability.

Keywords: software reliability, flexible logistic growth curve model, software cumulative failure prediction, neural network, particle swarm optimization

Procedia PDF Downloads 340
3869 Flood Monitoring in the Vietnamese Mekong Delta Using Sentinel-1 SAR with Global Flood Mapper

Authors: Ahmed S. Afifi, Ahmed Magdy

Abstract:

Satellite monitoring is an essential tool to study, understand, and map large-scale environmental changes that affect humans, climate, and biodiversity. The Sentinel-1 Synthetic Aperture Radar (SAR) instrument provides a rich collection of data with all-weather capability, short revisit time, and high spatial resolution that can be used effectively in flood management. Floods occur when an overflow of water submerges dry land, and these temporarily flooded areas need to be distinguished from permanent water bodies. In this study, we use the Global Flood Mapper (GFM), a new Google Earth Engine application that allows users to quickly map floods using Sentinel-1 SAR. The GFM enables users to manually adjust the flood map parameters, e.g., the threshold of the Z-value for the VV and VH bands and the elevation and slope mask thresholds. The composite R:G:B image obtained by coupling the Sentinel-1 bands (VH:VV:VH) reduces false classification to a large extent compared to using a single band (e.g., the VH polarization band). The flood mapping algorithm in the GFM and Otsu thresholding are compared with Sentinel-2 optical data, and the results show that the GFM algorithm can overcome the misclassification of flooded areas in An Giang, Vietnam.
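
A minimal sketch of Otsu thresholding applied to a SAR backscatter band to split low-backscatter water from land; the water/land statistics below are illustrative assumptions, not An Giang data:

```python
import numpy as np

rng = np.random.default_rng(5)
water = rng.normal(-20.0, 1.5, 4000)              # flooded pixels, VH band (dB)
land = rng.normal(-11.0, 2.0, 6000)
img = np.concatenate([water, land])

def otsu_threshold(values, bins=256):
    # Pick the threshold maximizing between-class variance of the histogram.
    hist, edges = np.histogram(values, bins=bins)
    mids = 0.5 * (edges[:-1] + edges[1:])
    w0 = np.cumsum(hist)
    w1 = w0[-1] - w0
    m0 = np.cumsum(hist * mids) / np.maximum(w0, 1)
    m1 = (np.sum(hist * mids) - np.cumsum(hist * mids)) / np.maximum(w1, 1)
    between = w0[:-1] * w1[:-1] * (m0[:-1] - m1[:-1]) ** 2
    return mids[np.argmax(between)]

t = otsu_threshold(img)
print(f"Otsu threshold: {t:.1f} dB, flooded fraction: {(img < t).mean():.2f}")
```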

Keywords: SAR backscattering, Sentinel-1, flood mapping, disaster

Procedia PDF Downloads 95
3868 Optimal Design of Linear Generator to Recharge the Smartphone Battery

Authors: Jin Ho Kim, Yujeong Shin, Seong-Jin Cho, Dong-Jin Kim, U-Syn Ha

Abstract:

Due to the development of the information industry and its technologies, cellular phones must not only provide communication but also functions such as Internet access, e-banking, entertainment, etc.; these phones are called smartphones. With the various functions of smartphones, their performance has improved, and battery capacity has been increased gradually. Recently, linear generators have been embedded in smartphones in order to recharge the smartphone's battery. In this study, optimization is performed and a change in the permanent magnet array is examined in order to increase efficiency. We propose an optimal design using design of experiments (DOE) to maximize the generated induced voltage. The thickness of the pole shoe and permanent magnet (PM), the height of the pole shoe and PM, and the thickness of the coil are determined to be the design variables. We made 25 sampling points using an orthogonal array according to the four design variables. We performed electromagnetic finite element analysis to predict the generated induced voltage using the commercial electromagnetic analysis software ANSYS Maxwell. Then, we built an approximate model using the Kriging algorithm and derived optimal values of the design variables using an evolutionary algorithm. The commercial optimization software PIAnO (Process Integration, Automation, and Optimization) was used with these algorithms. The optimization result shows that the generated induced voltage is improved.
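
A minimal sketch of the sample-fit-search surrogate loop described above, using scikit-learn's Gaussian process regressor as the Kriging model; the voltage response, sampling plan, and random search are illustrative stand-ins for the FE analysis, orthogonal array, and evolutionary algorithm:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(6)

def induced_voltage(x):
    # Toy response with a peak at an interior design point.
    return -np.sum((x - np.array([0.6, 0.4, 0.5, 0.7])) ** 2, axis=-1)

X = rng.random((25, 4))                     # 25 sampled designs, 4 normalized variables
y = induced_voltage(X)                      # stand-in for the FE solver output

kriging = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), normalize_y=True)
kriging.fit(X, y)

# Cheap global search on the surrogate instead of re-running the FE solver.
cand = rng.random((20_000, 4))
pred = kriging.predict(cand)
best = cand[np.argmax(pred)]
print("surrogate optimum:", best.round(3), "predicted voltage:", pred.max().round(4))
```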

Keywords: smartphone, linear generator, design of experiment, approximate model, optimal design

Procedia PDF Downloads 338
3867 Coastal Flood Mapping of Vulnerability Due to Sea Level Rise and Extreme Weather Events: A Case Study of St. Ives, UK

Authors: S. Vavias, T. R. Brewer, T. S. Farewell

Abstract:

Coastal floods have been identified as an important natural hazard that can cause significant damage to populated built-up areas and related infrastructure, as well as to ecosystems and habitats. This study attempts to fill the gap associated with the development of preliminary assessments of coastal flood vulnerability for compliance with the EU Directive on the Assessment and Management of Flood Risks (2007/60/EC). In this context, a methodology has been created by taking into account three major parameters: the maximum wave run-up modelled from historical weather observations, the highest tide according to historic time series, and sea level rise projections due to climate change. A high-resolution digital terrain model (DTM) derived from LIDAR data has been used to integrate the estimated flood events in a GIS environment. The flood vulnerability map created shows potential risk areas and can play a crucial role in the coastal zone planning process. The proposed method has the potential to be a powerful tool for policy and decision makers in spatial planning and strategic management.
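
A minimal sketch of the overlay step that flags terrain lying below the combined water level built from the three parameters above; the elevation grid and the level values are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(7)
dtm = rng.uniform(0.0, 12.0, (100, 100))    # LIDAR-derived elevations (m)

highest_tide = 3.2                          # from historic tide series (m), assumed
max_wave_runup = 1.8                        # modelled from weather observations (m), assumed
sea_level_rise = 0.6                        # climate projection for target year (m), assumed

flood_level = highest_tide + max_wave_runup + sea_level_rise
vulnerable = dtm < flood_level              # boolean vulnerability mask per cell
print(f"flood level {flood_level:.1f} m -> {vulnerable.mean():.1%} of cells at risk")
```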

Keywords: coastal floods, vulnerability mapping, climate change, extreme weather events

Procedia PDF Downloads 388
3866 Using Group Concept Mapping to Identify a Pharmacy-Based Trigger Tool to Detect Adverse Drug Events

Authors: Rodchares Hanrinth, Theerapong Srisil, Peeraya Sriphong, Pawich Paktipat

Abstract:

The trigger tool is a low-cost, low-tech method to detect adverse events through clues called triggers. The Institute for Healthcare Improvement (IHI) has developed the Global Trigger Tool for measuring and preventing adverse events. However, this tool is not specific to detecting adverse drug events, so a pharmacy-based trigger tool is needed to detect adverse drug events (ADEs). Group concept mapping is an effective method for conceptualizing various ideas from diverse stakeholders, and this technique was used to identify pharmacy-based triggers to detect ADEs. The aim of this study was to involve pharmacists in conceptualizing, developing, and prioritizing a feasible trigger tool to detect adverse drug events in a provincial hospital in the northeastern part of Thailand. The study was conducted during the 6-month period between April 1 and September 30, 2017. Study participants were 20 pharmacists (17 hospital pharmacists and 3 pharmacy lecturers) engaging in three concept mapping workshops. In these meetings, the concept mapping technique created by Trochim, a highly structured qualitative group technique for idea generation and sharing, was used to produce and structure participants' views on which triggers had the potential to detect ADEs. During the workshops, participants (n = 20) were asked to individually rate the feasibility and potentiality of each trigger and to group the triggers into relevant categories to enable multidimensional scaling and hierarchical cluster analysis. The outputs of the analysis included the trigger list, cluster list, point map, point rating map, cluster map, and cluster rating map. The three workshops together resulted in 21 different triggers that were structured in a framework forming 5 clusters: drug allergy, drug-induced diseases, dosage adjustment in renal diseases, potassium concerning, and drug overdose. The first cluster is drug allergy, for example the doctor’s order for dexamethasone injection combined with chlorpheniramine injection. The diagnosis of drug-induced hepatitis in a patient taking anti-tuberculosis drugs is a trigger in the ‘drug-induced diseases’ cluster. For the third cluster, the doctor’s order for enalapril combined with ibuprofen in a patient with chronic kidney disease is an example of a trigger. The doctor’s order for digoxin in a patient with hypokalemia is a trigger in the ‘potassium concerning’ cluster. Finally, the doctor’s order for naloxone in narcotic overdose was classified as a trigger in the ‘drug overdose’ cluster. This study generated triggers similar to some in the IHI Global Trigger Tool, especially in the medication module, such as drug allergy and drug overdose. However, this tool has some specific aspects, including drug-induced diseases, dosage adjustment in renal diseases, and potassium concerning, which are not contained in other trigger tools. The pharmacy-based trigger tool is suitable for pharmacists in hospitals to detect potential adverse drug events using the clues of triggers.
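
A minimal sketch of the point-map and cluster analysis behind group concept mapping, under the assumption that each participant sorts the 21 triggers into piles; the sorting data here is a random stand-in for the workshop results:

```python
import numpy as np
from sklearn.manifold import MDS
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(8)
n_triggers, n_raters = 21, 20
# Each rater sorts the 21 triggers into up to 5 piles (hypothetical data).
sorts = rng.integers(0, 5, (n_raters, n_triggers))

# Dissimilarity = fraction of raters who placed a pair in different piles.
same_pile = (sorts[:, :, None] == sorts[:, None, :]).mean(axis=0)
dissimilarity = 1.0 - same_pile

# Multidimensional scaling yields the 2-D "point map".
points = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(dissimilarity)

# Ward hierarchical clustering of the point map yields the cluster map.
clusters = fcluster(linkage(points, method="ward"), t=5, criterion="maxclust")
print("cluster assignment of each trigger:", clusters)
```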

Keywords: adverse drug events, concept mapping, hospital, pharmacy-based trigger tool

Procedia PDF Downloads 155
3865 Sparsity-Based Unsupervised Unmixing of Hyperspectral Imaging Data Using Basis Pursuit

Authors: Ahmed Elrewainy

Abstract:

Mixing in hyperspectral imaging occurs due to the low spatial resolution of the cameras used. The pure materials (“endmembers”) present in the scene share the spectral pixels in different proportions called “abundances”. Unmixing of the data cube is an important task for determining the endmembers present in the cube for analysis of these images. Unsupervised unmixing is done with no prior information about the given data cube. Sparsity is one of the recent approaches used in source recovery and unmixing techniques. The l1-norm optimization problem “basis pursuit” can be used as a sparsity-based approach to solve this unmixing problem, where the endmembers are assumed to be sparse in an appropriate domain known as a dictionary. This optimization problem is solved using the proximal method of iterative thresholding. The l1-norm basis pursuit optimization problem, as a sparsity-based unmixing technique, was used to unmix real and synthetic hyperspectral data cubes.
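
A minimal sketch of sparse unmixing by iterative soft-thresholding (ISTA) on the l1-regularized problem min (1/2)||Da - x||² + λ||a||₁ for a single pixel, where D holds endmember spectra (the dictionary) and a the abundances; the data is synthetic:

```python
import numpy as np

rng = np.random.default_rng(9)
bands, n_endmembers = 100, 12
D = rng.random((bands, n_endmembers))              # dictionary of endmember spectra
a_true = np.zeros(n_endmembers)
a_true[[2, 7]] = [0.6, 0.4]                        # sparse true abundances
x = D @ a_true + rng.normal(0, 0.01, bands)        # observed mixed pixel

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

step = 1.0 / np.linalg.norm(D.T @ D, 2)            # 1/L, L = Lipschitz constant
lam = 0.05
a = np.zeros(n_endmembers)
for _ in range(500):
    # Gradient step on the data term, then proximal (soft-threshold) step.
    a = soft_threshold(a - step * D.T @ (D @ a - x), step * lam)

print("recovered abundances:", np.round(a, 2))
```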

Keywords: basis pursuit, blind source separation, hyperspectral imaging, spectral unmixing, wavelets

Procedia PDF Downloads 191
3864 Application of Neutron Stimulated Gamma Spectroscopy for Soil Elemental Analysis and Mapping

Authors: Aleksandr Kavetskiy, Galina Yakubova, Nikolay Sargsyan, Stephen A. Prior, H. Allen Torbert

Abstract:

Determining soil elemental content and distribution (mapping) within a field are key features of modern agricultural practice. While traditional chemical analysis is a time-consuming and labor-intensive multi-step process (e.g., sample collection, transport to the laboratory, physical preparation, and chemical analysis), neutron-gamma soil analysis can be performed in-situ. This analysis is based on the registration of gamma rays emitted from nuclei upon interaction with neutrons. Soil elements such as Si, C, Fe, O, Al, K, and H (moisture) can be assessed with this method. Data received from the analysis can be directly used for creating soil elemental distribution maps (based on ArcGIS software) suitable for agricultural purposes. The neutron-gamma analysis system developed for field application consisted of an MP320 Neutron Generator (Thermo Fisher Scientific, Inc.), 3 sodium iodide gamma detectors (SCIONIX, Inc.) with a total volume of 7 liters, 'split electronics' (XIA, LLC), a power system, and an operational computer. Paired with GPS, this system can be used in scanning mode to acquire gamma spectra while traversing a field. From the acquired spectra, soil elemental content can be calculated and combined with geographical coordinates in a geographical information system (i.e., ArcGIS) to produce elemental distribution maps suitable for agricultural purposes. Special software has been developed that acquires gamma spectra, processes and sorts data, calculates soil elemental content, and combines these data with measured geographic coordinates to create soil elemental distribution maps. For example, 5.5 hours were needed to acquire the data necessary for creating a carbon distribution map of an 8.5 ha field. This paper briefly describes the physics behind the neutron-gamma analysis method, the physical construction of the measurement system, and its main characteristics and modes of operation when conducting field surveys. Soil elemental distribution maps resulting from field surveys are presented and discussed. Comparison of these maps with maps created on the basis of chemical analysis and soil moisture measurements determined by soil electrical conductivity showed similar patterns, and the maps created by neutron-gamma analysis were reproducible as well. Based on these facts, it can be asserted that neutron-stimulated soil gamma spectroscopy paired with a GPS system is fully applicable to agricultural field mapping of soil elements.

Keywords: ArcGIS mapping, neutron gamma analysis, soil elemental content, soil gamma spectroscopy

Procedia PDF Downloads 129
3863 The Effect of Initial Sample Size and Increment in Simulation Samples on a Sequential Selection Approach

Authors: Mohammad H. Almomani

Abstract:

In this paper, we examine the effect of the initial sample size and the increment in simulation samples on the performance of a sequential approach used in selecting the top m designs when the number of alternative designs is very large. The sequential approach consists of two stages. In the first stage, ordinal optimization is used to select a subset that overlaps with the set of actual best k% designs with high probability. In the second stage, the optimal computing budget is used to select the top m designs from the selected subset. We apply the selection approach to a generic example under several parameter settings, with different choices of initial sample size and increment in simulation samples, to explore the impacts on the performance of this approach. The results show that the choice of initial sample size and the increment in simulation samples does affect the performance of the selection approach.
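
A minimal sketch of the experiment's logic, using a simplified greedy allocation as a stand-in for the two-stage ordinal-optimization/OCBA procedure; the design means, budgets, and settings are synthetic assumptions:

```python
import numpy as np

rng = np.random.default_rng(12)
n_designs, m = 200, 5
true_mean = rng.normal(0, 1, n_designs)
true_top = set(np.argsort(true_mean)[-m:])

def select(n0, increment, budget=20_000):
    # Stage 1: n0 initial samples per design; stage 2: allocate increments
    # to the currently most promising designs until the budget is spent.
    counts = np.full(n_designs, n0, dtype=float)
    sums = np.array([rng.normal(true_mean[i], 1, n0).sum() for i in range(n_designs)])
    spent = n0 * n_designs
    while spent < budget:
        best = np.argsort(sums / counts)[-m:]
        for i in best:
            sums[i] += rng.normal(true_mean[i], 1, increment).sum()
            counts[i] += increment
        spent += m * increment
    return set(np.argsort(sums / counts)[-m:])

for n0, inc in [(5, 5), (5, 50), (50, 5), (50, 50)]:
    hits = sum(select(n0, inc) == true_top for _ in range(20))
    print(f"n0={n0:2d}, increment={inc:2d}: correct selections {hits}/20")
```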

Keywords: Large Scale Problems, Optimal Computing Budget Allocation, ordinal optimization, simulation optimization

Procedia PDF Downloads 345
3862 Research on Knowledge Graph Inference Technology Based on Proximal Policy Optimization

Authors: Yihao Kuang, Bowen Ding

Abstract:

With the increasing scale and complexity of knowledge graphs, modern knowledge graphs contain more and more types of entity, relationship, and attribute information. Therefore, in recent years, it has become a trend for knowledge graph inference to use reinforcement learning to deal with large-scale, incomplete, and noisy knowledge graphs and to improve inference effectiveness and interpretability. The Proximal Policy Optimization (PPO) algorithm utilizes a near-end strategy optimization approach, which allows extensive updates of policy parameters while constraining the update extent to maintain training stability. This characteristic enables PPO to converge to improved strategies more rapidly, often demonstrating enhanced performance early in the training process. Furthermore, PPO has the advantage of offline learning, effectively utilizing historical experience data for training and enhancing sample utilization. This means that even with limited resources, PPO can train efficiently for reinforcement learning tasks. Based on these characteristics, this paper aims to obtain a better and more efficient inference effect by introducing PPO into knowledge inference technology.
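
As a concrete reference for the stability mechanism described above, a minimal numpy sketch of PPO's clipped surrogate objective; the log-probabilities and advantages are synthetic stand-ins for a path-finding policy over a graph:

```python
import numpy as np

rng = np.random.default_rng(10)
old_logp = np.log(rng.uniform(0.05, 0.5, 64))    # log-probs of chosen relations (old policy)
new_logp = old_logp + rng.normal(0, 0.3, 64)     # log-probs after a gradient step
advantage = rng.normal(0, 1, 64)                 # e.g., whether the path reached the target

eps = 0.2
ratio = np.exp(new_logp - old_logp)              # pi_new / pi_old per decision
clipped = np.clip(ratio, 1 - eps, 1 + eps)       # constrain the update extent
surrogate = np.minimum(ratio * advantage, clipped * advantage).mean()
print("clipped surrogate objective:", surrogate)  # maximized w.r.t. policy parameters
```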

Keywords: reinforcement learning, PPO, knowledge inference

Procedia PDF Downloads 224
3861 Aerial Survey and 3D Scanning Technology Applied to the Survey of Cultural Heritage of Su-Paiwan, an Aboriginal Settlement, Taiwan

Authors: April Hueimin Lu, Liangj-Ju Yao, Jun-Tin Lin, Susan Siru Liu

Abstract:

This paper discusses the application of aerial survey technology and 3D laser scanning technology in the surveying and mapping of the settlements and slate houses of the old Taiwanese aborigines. The relics of the old Taiwanese aborigines, with thousands of years of history, are widely distributed in the deep mountains of Taiwan, over a vast area with inconvenient transportation. When constructing basic data for cultural assets, it is necessary to apply new technology to carry out efficient and accurate settlement mapping. In this paper, taking old Paiwan as an example, an aerial survey of a settlement of about 5 hectares and 3D laser scanning of a slate house were carried out. The obtained orthophoto image was used as an important basis for drawing the settlement map. The 3D landscape data of topography and buildings derived from the aerial survey are important for subsequent preservation planning, while the 3D building scan provides a more detailed record of architectural forms and materials. The 3D settlement data from the aerial survey can be further applied to a 3D virtual model and animation of the settlement for virtual presentation. The information from the 3D scanning of the slate house can also be used for digital archives and data queries through network resources. The results of this study show that, in large-scale settlement surveys, aerial survey technology is used to capture the topography of settlements with buildings and spatial information of the landscape, while 3D scanning is applied for small-scale records of individual buildings. This application of 3D technology greatly increases the efficiency and accuracy of the survey and mapping of aboriginal settlements, and is very helpful for further preservation planning and rejuvenation of aboriginal cultural heritage.

Keywords: aerial survey, 3D scanning, aboriginal settlement, settlement architecture cluster, ecological landscape area, old Paiwan settlements, slate house, photogrammetry, SfM, MVS, point cloud, SIFT, DSM, 3D model

Procedia PDF Downloads 149
3860 Multi-Objective Optimization of a Solar-Powered Triple-Effect Absorption Chiller for Air-Conditioning Applications

Authors: Ali Shirazi, Robert A. Taylor, Stephen D. White, Graham L. Morrison

Abstract:

In this paper, a detailed simulation model of a solar-powered triple-effect LiBr–H2O absorption chiller is developed to supply both the cooling and heating demand of a large-scale building, aiming to reduce fossil fuel consumption and greenhouse gas emissions in the building sector. TRNSYS 17 is used to simulate the performance of the system over a typical year. A combined energetic-economic-environmental analysis is conducted to determine the system's annual primary energy consumption and total cost, which are considered as two conflicting objectives. A multi-objective optimization of the system is performed using a genetic algorithm to minimize these objectives simultaneously. The optimization results show that the final optimal design of the proposed plant has a solar fraction of 72% and leads to an annual primary energy saving of 0.69 GWh and an annual CO2 emissions reduction of ~166 tonnes, as compared to a conventional HVAC system. The economics of this design, however, are not appealing without public funding, which is often the case for many renewable energy systems. The results show that a good funding policy is required in order for these technologies to achieve satisfactory payback periods within the lifetime of the plant.

Keywords: economic, environmental, multi-objective optimization, solar air-conditioning, triple-effect absorption chiller

Procedia PDF Downloads 229
3859 A Resource Optimization Strategy for CPU (Central Processing Unit) Intensive Applications

Authors: Junjie Peng, Jinbao Chen, Shuai Kong, Danxu Liu

Abstract:

On the basis of traditional resource allocation strategies, the usage of resources on physical servers in a cloud data center is highly uncertain. Resources are wasted if too few tasks are assigned to a server; conversely, the server is overloaded if too many tasks are assigned. This is especially obvious when the applications are of the same type, because of their similar resource preferences. Considering that CPU-intensive applications are one of the most common application types in the cloud, we studied the optimization strategy for CPU-intensive applications on the same server. We used resource preferences to analyze the case in which multiple CPU-intensive applications run simultaneously, and put forward a model which can predict the execution time of CPU-intensive applications running simultaneously. Based on the prediction model, we proposed a method to select the appropriate number of applications for a machine. Experiments show that the model can accurately predict the execution time of CPU-intensive applications. To improve the execution efficiency of applications, we also propose a priority-based scheduling model for CPU-intensive applications. Extensive experiments verify the validity of the scheduling model.
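
A minimal sketch of the prediction-then-selection idea: fit a simple model of execution time versus the number of co-located CPU-intensive applications, then pick the largest count that still meets a latency budget. The timing measurements, quadratic fit, and budget are hypothetical stand-ins for the paper's prediction model:

```python
import numpy as np

# Hypothetical measured execution times (s) vs number of co-located apps.
counts = np.array([1, 2, 3, 4, 5, 6, 7, 8])
exec_time = np.array([10.1, 10.3, 10.6, 11.4, 13.0, 15.9, 20.2, 26.5])

# Quadratic fit captures the contention-driven slowdown.
model = np.polynomial.Polynomial.fit(counts, exec_time, deg=2)

budget = 14.0                                    # acceptable execution time (s)
feasible = [n for n in range(1, 13) if model(n) <= budget]
print("predicted times:", [round(float(model(n)), 1) for n in range(1, 9)])
print("max apps within budget:", max(feasible))
```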

Keywords: cloud computing, CPU intensive applications, resource optimization, strategy

Procedia PDF Downloads 271
3858 Sensitivity Analysis of Prestressed Post-Tensioned I-Girder and Deck System

Authors: Tahsin A. H. Nishat, Raquib Ahsan

Abstract:

Sensitivity analysis of the design parameters of an optimization procedure can be a significant factor in designing any structural system. The objectives of this study are to analyze the sensitivity of the deck slab thickness parameter obtained from both the conventional and the optimum design methodology of a pre-stressed post-tensioned I-girder and deck system, and to compare the relative significance of slab thickness. For the analysis of the conventional method, the values of 14 design parameters obtained by the conventional iterative design of a real-life I-girder bridge project have been considered. For the analysis of the optimization method, cost optimization of this system has been done using the global optimization methodology 'Evolutionary Operation (EVOP)'. The problem, from which optimum values of the 14 design parameters have been obtained, contains 14 explicit constraints and 46 implicit constraints. For both types of design parameters, sensitivity analysis has been conducted on the deck slab thickness parameter, which can be highly sensitive around the obtained optimum solution. Deviations of slab thickness on both the upper and lower side of its optimum value have been considered, reflecting realistic ranges of variation during construction; the remaining parameters have been kept unchanged. For small deviations from the optimum value, compliance with the explicit and implicit constraints has been examined, and variations in the cost have been estimated. It is found that, without violating any constraint, the deck slab thickness obtained by the conventional method can be increased up to 25 mm, whereas the slab thickness obtained by cost optimization can be increased only up to 0.3 mm. This result suggests that slab thickness is less sensitive in the conventional method of design. Therefore, for realistic design purposes, sensitivity analysis should be conducted for either design procedure of the girder and deck system.

Keywords: sensitivity analysis, optimum design, evolutionary operations, PC I-girder, deck system

Procedia PDF Downloads 128
3857 Bias-Corrected Estimation Methods for Receiver Operating Characteristic Surface

Authors: Khanh To Duc, Monica Chiogna, Gianfranco Adimari

Abstract:

With three diagnostic categories, assessment of the performance of diagnostic tests is achieved by the analysis of the receiver operating characteristic (ROC) surface, which generalizes the ROC curve for binary diagnostic outcomes. The volume under the ROC surface (VUS) is a summary index usually employed for measuring the overall diagnostic accuracy. When the true disease status can be exactly assessed by means of a gold standard (GS) test, unbiased nonparametric estimators of the ROC surface and VUS are easily obtained. In practice, unfortunately, disease status verification via the GS test could be unavailable for all study subjects, due to the expensiveness or invasiveness of the GS test. Thus, often only a subset of patients undergoes disease verification. Statistical evaluations of diagnostic accuracy based only on data from subjects with verified disease status are typically biased. This bias is known as verification bias. Here, we consider the problem of correcting for verification bias when continuous diagnostic tests for three-class disease status are considered. We assume that selection for disease verification does not depend on disease status, given test results and other observed covariates, i.e., we assume that the true disease status, when missing, is missing at random. Under this assumption, we discuss several solutions for ROC surface analysis based on imputation and re-weighting methods. In particular, verification bias-corrected estimators of the ROC surface and of VUS are proposed, namely, full imputation, mean score imputation, inverse probability weighting and semiparametric efficient estimators. Consistency and asymptotic normality of the proposed estimators are established, and their finite sample behavior is investigated by means of Monte Carlo simulation studies. Two illustrations using real datasets are also given.
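
A minimal sketch of the inverse-probability-weighting idea behind the proposed corrections, shown here for a simple prevalence estimate rather than the full VUS estimator; the data-generating process and verification model are synthetic assumptions:

```python
import numpy as np

rng = np.random.default_rng(13)
n = 20_000
disease = rng.random(n) < 0.3                     # true (gold standard) status
test = disease + rng.normal(0, 1, n)              # continuous diagnostic test

# Verification probability depends on the test result (missing at random
# given the test), so verified subjects are a biased subsample.
pi = 1 / (1 + np.exp(-(test - 0.5)))
verified = rng.random(n) < pi

naive = disease[verified].mean()                  # biased: high scores over-verified
# IPW (Hajek) estimator: reweight verified subjects by 1/pi.
ipw = np.sum(disease[verified] / pi[verified]) / np.sum(1 / pi[verified])
print(f"true prevalence 0.30, naive {naive:.3f}, IPW {ipw:.3f}")
```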

Keywords: imputation, missing at random, inverse probability weighting, ROC surface analysis

Procedia PDF Downloads 408
3856 The Bernstein Expansion for Exponentials in Taylor Functions: Approximation of Fixed Points

Authors: Tareq Hamadneh, Jochen Merker, Hassan Al-Zoubi

Abstract:

Bernstein's expansion for exponentials in Taylor functions provides lower and upper optimization values for the range of the original function. These values converge to the original function if the degree is elevated or the domain is subdivided. A Taylor polynomial can be applied so that the exponential is represented by a polynomial of finite degree over a given domain. The Bernstein basis has two main properties: its elements sum to 1 and are positive for all x ∈ (0, 1). In this work, we prove the existence of fixed points for exponential functions in a given domain using the optimization values of Bernstein. The Bernstein basis of finite degree T over a domain D is defined non-negatively. Any polynomial p of degree t can be expanded into the Bernstein form of maximum degree t ≤ T, where we only need to compute the Bernstein coefficients in order to optimize the original polynomial. The main property is that p(x) is bounded by the minimum and maximum Bernstein coefficients (the Bernstein bound). If the bound is contained in the given domain, then we say that p(x) has fixed points in the same domain.
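
A minimal sketch of the bounding argument, using an illustrative exponential: it computes the Bernstein coefficients of g(x) = p(x) - x for the degree-4 Taylor polynomial p of exp(x)/4 on [0,1], and shows the enclosure tightening under degree elevation. Since the endpoint coefficients equal g(0) and g(1), their sign change certifies a fixed point of p:

```python
import numpy as np
from math import comb, factorial

def bernstein_coeffs(a, n):
    """Bernstein coefficients on [0,1] of p(x) = sum_j a[j] x^j, degree-raised to n."""
    return [sum(comb(i, j) / comb(n, j) * a[j]
                for j in range(min(i, len(a) - 1) + 1))
            for i in range(n + 1)]

a = [0.25 / factorial(j) for j in range(5)]       # Taylor coefficients of exp(x)/4
g = [a[0], a[1] - 1.0] + a[2:]                    # monomial coefficients of p(x) - x

for n in (4, 8, 16):                              # degree elevation tightens the bound
    c = bernstein_coeffs(g, n)
    print(f"degree {n:2d}: {min(c):+.4f} <= p(x) - x <= {max(c):+.4f}")

# c[0] = g(0) > 0 and c[-1] = g(1) < 0, so p(x) = x somewhere in (0, 1).
```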

Keywords: Bernstein polynomials, stability of control functions, numerical optimization, Taylor function

Procedia PDF Downloads 127
3855 Optimization of Floor Heating System in the Incompressible Turbulent Flow Using Constructal Theory

Authors: Karim Farahmandfar, Hamidolah Izadi, Mohammadreza Rezaei, Amin Ardali, Ebrahim Goshtasbi Rad, Khosro Jafarpoor

Abstract:

Statistics show that a large share of annual energy consumption is devoted to meeting the heating demand of buildings. Therefore, it is vital to economize energy consumption and to find solutions to this issue. One system for heating buildings is floor heating. Floor heating performance is based on convection and radiation; in addition to creating favorable heating conditions, this method leads to energy savings. The goal of this article is to outline constructal theory and introduce an optimization method for branch networks in floor heating. There are several steps to this end. First, the pressure drop between two points of the network is calculated; this pressure drop is a function of the pipe diameter and other parameters. After that, the amount of heat transfer is determined. Consequently, combining these two functions yields the final objective function. It should be mentioned that the flow is laminar.
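
A minimal sketch of the two functions being combined, assuming laminar Hagen-Poiseuille flow in a single branch as stated above; the fluid properties, the heat-output model, and the trade-off weight are illustrative assumptions:

```python
import numpy as np

mu = 0.001         # water dynamic viscosity (Pa*s)
L = 20.0           # branch length (m), assumed
Q = 1.0e-4         # volumetric flow rate (m^3/s), assumed
h_per_area = 80.0  # effective heat output per unit pipe surface (W/m^2), assumed

d = np.linspace(0.008, 0.03, 200)                 # candidate pipe diameters (m)
dp = 128.0 * mu * L * Q / (np.pi * d ** 4)        # laminar pressure drop (Pa)
heat = h_per_area * np.pi * d * L                 # heat released to the floor (W)

# Combined objective: pumping-power penalty minus heat benefit (weight is a choice).
cost = dp * Q - 0.05 * heat                       # dp*Q is pumping power in W
print(f"best diameter: {d[np.argmin(cost)] * 1000:.1f} mm")
```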

Keywords: constructal theory, optimization, floor heating system, turbulent flow

Procedia PDF Downloads 306
3854 Statistical Optimization of Vanillin Production by Pycnoporus cinnabarinus 1181

Authors: Swarali Hingse, Shraddha Digole, Uday Annapure

Abstract:

The present study investigates the biotransformation of ferulic acid to vanillin by Pycnoporus cinnabarinus and its optimization using the one-factor-at-a-time method as well as a statistical approach. The effects of various physicochemical parameters and medium components were studied using the one-factor-at-a-time method. Screening of the significant factors was carried out using an L25 Taguchi orthogonal array, and the selected significant factors were then further optimized using response surface methodology (RSM). The significant media components obtained using the Taguchi L25 orthogonal array were glucose, KH2PO4, and yeast extract. Further, a Box-Behnken design was used to investigate the interactive effects of the three most significant media components. The final medium obtained after optimization using RSM, containing glucose (34.89 g/L), diammonium tartrate (1 g/L), yeast extract (1.47 g/L), MgSO4•7H2O (0.5 g/L), KH2PO4 (0.15 g/L), and CaCl2•2H2O (20 mg/L), resulted in amplification of vanillin production from 30.88 mg/L to 187.63 mg/L.
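
A minimal sketch of the response-surface step: fit a second-order model to design runs in three coded factors (standing in for glucose, KH2PO4, and yeast extract) and locate its optimum on a grid. The runs and yields are synthetic stand-ins, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(11)
X = rng.uniform(-1, 1, (15, 3))                   # coded factor levels of 15 runs
true = lambda x: 150 - 30*(x[:, 0]-0.4)**2 - 20*(x[:, 1]+0.2)**2 - 25*(x[:, 2]-0.1)**2
y = true(X) + rng.normal(0, 2, 15)                # "measured" vanillin (mg/L)

def design_matrix(X):
    # Full quadratic model: intercept, linear, interaction, and square terms.
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1*x2, x1*x3, x2*x3, x1**2, x2**2, x3**2])

beta, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)

grid = np.array(np.meshgrid(*[np.linspace(-1, 1, 21)] * 3)).reshape(3, -1).T
pred = design_matrix(grid) @ beta
print("predicted optimum (coded levels):", grid[np.argmax(pred)])
```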

Keywords: ferulic acid, pycnoporus cinnabarinus, response surface methodology, vanillin

Procedia PDF Downloads 372
3853 Effect of the Initial Billet Shape Parameters on the Final Product in a Backward Extrusion Process for Pressure Vessels

Authors: Archana Thangavelu, Han-Ik Park, Young-Chul Park, Joon-Hong Park

Abstract:

In this numerical study, we propose a method for evaluating the backward extrusion process of steel pressure vessels. Demand for lighter and stiffer products has been increasing in recent years, especially in automobile engineering. Through detailed finite element analysis, effective stress, strain, and velocity profiles have been obtained within the optimal range. The process design of a forward and backward extrusion axisymmetric part has been studied. Forging is mainly used because forged products are highly reliable and possess superior mechanical properties compared to conventionally produced parts. Computational simulations of 3D hot forging with various billet dimensions and optimization of weight are carried out using the Taguchi Orthogonal Array (OA) optimization technique. The technique used in this study can be applied to newly developed materials to investigate their forgeability for more complicated shapes in the closed hot-die forging process.

Keywords: backward extrusion, hot forging, optimization, finite element analysis, Taguchi method

Procedia PDF Downloads 300