Search results for: multi objective particle swarm optimization (MOPSO)
12754 Optimal Design of Linear Generator to Recharge the Smartphone Battery
Authors: Jin Ho Kim, Yujeong Shin, Seong-Jin Cho, Dong-Jin Kim, U-Syn Ha
Abstract:
Due to the development of the information industry and its technologies, cellular phones must not only provide communication but also offer functions such as Internet access, e-banking, and entertainment. These phones are called smartphones. Because of these various functions, smartphone performance has improved and battery capacity has gradually increased. Recently, linear generators have been embedded in smartphones in order to recharge the smartphone's battery. In this study, optimization is performed and a change in the array of permanent magnets is examined in order to increase efficiency. We propose an optimal design using design of experiments (DOE) to maximize the generated induced voltage. The thickness of the pole shoe and permanent magnet (PM), the height of the pole shoe and PM, and the thickness of the coil are chosen as design variables. We generated 25 sampling points using an orthogonal array over the design variables. We performed electromagnetic finite element analysis with the commercial software ANSYS Maxwell to predict the generated induced voltage. We then built an approximate model using the Kriging algorithm and derived optimal values of the design variables using an evolutionary algorithm. The commercial optimization software PIAnO (Process Integration, Automation, and Optimization) was used for these steps. The optimization result shows that the generated induced voltage is improved.
Keywords: smartphone, linear generator, design of experiment, approximate model, optimal design
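A minimal sketch of the surrogate-plus-evolutionary-search workflow the abstract describes, with scikit-learn's Gaussian process regressor standing in for the Kriging model and SciPy's differential evolution for the evolutionary algorithm; the design variables, bounds, and response function below are illustrative placeholders, not the PIAnO/ANSYS Maxwell setup:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel
from scipy.optimize import differential_evolution

# Hypothetical DOE sample: 25 points in 4 design variables (e.g. pole-shoe
# thickness, PM thickness, pole-shoe/PM height, coil thickness), normalized
# to [0, 1], with stand-in "induced voltage" responses instead of FEA results.
rng = np.random.default_rng(0)
X = rng.random((25, 4))                      # stand-in for the orthogonal array
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2 + 0.5 * X[:, 2] * X[:, 3]

# Kriging-style surrogate (Gaussian process regression).
gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(), normalize_y=True)
gp.fit(X, y)

# Evolutionary search on the surrogate: maximize the predicted voltage.
result = differential_evolution(lambda x: -gp.predict(x.reshape(1, -1))[0],
                                bounds=[(0.0, 1.0)] * 4, seed=0)
print("optimal (normalized) design variables:", result.x)
print("predicted induced voltage:", -result.fun)
```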
Procedia PDF Downloads 344
12753 Utility Analysis of API Economy Based on Multi-Sided Platform Markets Model
Authors: Mami Sugiura, Shinichi Arakawa, Masayuki Murata, Satoshi Imai, Toru Katagiri, Motoyoshi Sekiya
Abstract:
The API (Application Programming Interface) economy, in which many participants join, interact, and form the economy, is expected to increase collaboration between information services through APIs and, thereby, to increase the market value generated from such collaborations. In this paper, we introduce API evaluators, which activate the API economy by reviewing and/or evaluating APIs, and develop a multi-sided API economy model that formulates interactions among the platform provider, API developers, consumers, and API evaluators. By obtaining the equilibrium that maximizes the utility of all participants, the impact of API evaluators on participant utility in the API economy is revealed. Numerical results show that, with API evaluators present, the numbers of developers and consumers increase by 1.5% and the utility of the platform provider increases by 2.3%. We also discuss strategies for the platform provider to maximize its utility in the presence of API evaluators.
Keywords: API economy, multi-sided markets, API evaluator, platform, platform provider
Procedia PDF Downloads 185
12752 Development of a Non-Dispersive Infrared Multi Gas Analyzer for a TMS
Authors: T. V. Dinh, I. Y. Choi, J. W. Ahn, Y. H. Oh, G. Bo, J. Y. Lee, J. C. Kim
Abstract:
A non-dispersive infrared (NDIR) multi-gas analyzer has been developed to monitor emissions of carbon monoxide (CO) and sulfur dioxide (SO2) from various industries. The NDIR technique for gas measurement is based on wavelength-specific absorption in the infrared spectrum as a way to detect particular gases. NDIR analyzers have been widely applied in tele-monitoring systems (TMS). The advantages of the NDIR analyzer are low energy consumption and low cost compared with other spectroscopic methods. However, zero/span drift and interference are pressing issues to be solved. In this work, a multi-pathway technique based on an optical White cell was employed to improve the sensitivity of the analyzer. A pyroelectric detector was used to detect the infrared radiation. The analytical range of the analyzer was 0-200 ppm. The instrument response time was < 2 min. The detection limits of CO and SO2 were < 4 ppm and < 6 ppm, respectively. The zero and span drift over 24 h was less than 3%. The linearity of the analyzer was within 2.5% of reference values. The precision and accuracy of both the CO and SO2 channels were < 2.5% relative standard deviation. In general, the analyzer performed well; however, the detection limits and 24 h drift should be improved to make it a more competitive instrument.
Keywords: analyzer, CEMS, monitoring, NDIR, TMS
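As a rough illustration of the absorption principle behind NDIR sensing (not the developed analyzer's calibration), a Beer-Lambert-style back-calculation of concentration from transmitted intensity, with made-up numbers:

```python
import math

def ndir_concentration(I, I0, epsilon, path_length):
    """Infer gas concentration from NDIR transmittance via the Beer-Lambert law.

    I           -- detector signal with absorbing gas present
    I0          -- reference (zero-gas) detector signal
    epsilon     -- molar absorption coefficient at the filter wavelength [L/(mol*cm)]
    path_length -- effective optical path length [cm] (a White cell multiplies this)
    Returns the concentration in mol/L.
    """
    absorbance = math.log10(I0 / I)          # A = log10(I0 / I)
    return absorbance / (epsilon * path_length)

# Illustrative (made-up) numbers: a White cell folding the beam to a 100 cm path.
print(ndir_concentration(I=0.92, I0=1.00, epsilon=20.0, path_length=100.0))
```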
Procedia PDF Downloads 255
12751 Sparsity-Based Unsupervised Unmixing of Hyperspectral Imaging Data Using Basis Pursuit
Authors: Ahmed Elrewainy
Abstract:
Mixing in hyperspectral imaging occurs due to the low spatial resolution of the cameras used. The pure materials (“endmembers”) present in the scene share the spectral pixels in different amounts called “abundances”. Unmixing the data cube is an important task for identifying the endmembers present in the cube and analyzing these images. Unsupervised unmixing is performed with no prior information about the given data cube. Sparsity is one of the recent approaches used in source recovery and unmixing techniques. The l1-norm optimization problem “basis pursuit” can be used as a sparsity-based approach to solve this unmixing problem, where the endmembers are assumed to be sparse in an appropriate domain known as a dictionary. This optimization problem is solved using the proximal method “iterative thresholding”. The l1-norm basis pursuit optimization problem was used as a sparsity-based unmixing technique to unmix real and synthetic hyperspectral data cubes.
Keywords: basis pursuit, blind source separation, hyperspectral imaging, spectral unmixing, wavelets
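A minimal sketch of the iterative (soft-)thresholding proximal method for the l1-norm problem mentioned above, applied to a toy dictionary and a synthetically mixed pixel rather than real hyperspectral data:

```python
import numpy as np

def ista(A, b, lam=0.1, n_iter=500):
    """Iterative soft-thresholding (ISTA) for min 0.5*||Ax - b||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)           # gradient of the smooth part
        z = x - grad / L                   # gradient step
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # proximal (soft-threshold) step
    return x

# Toy example: a random "dictionary" of 50 spectra and a pixel mixed from 3 of them.
rng = np.random.default_rng(1)
A = rng.standard_normal((30, 50))
x_true = np.zeros(50)
x_true[[4, 17, 32]] = [0.6, 0.3, 0.1]      # sparse "abundances"
b = A @ x_true
x_hat = ista(A, b, lam=0.05)
print("largest recovered coefficients:", np.argsort(-np.abs(x_hat))[:3])
```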
Procedia PDF Downloads 194
12750 Particle Jetting Induced by the Explosive Dispersal
Authors: Kun Xue, Lvlan Miu, Jiarui Li
Abstract:
Jetting structures are widely found in particle rings or shells dispersed by a central explosion. In contrast, some explosive dispersals of particles result only in a dispersed cloud without distinctive structures. Employing a method coupling compressible computational fluid dynamics and the discrete element method (CCFD-DEM), we reveal the underlying physics governing the formation of the jetting structure, which is related to the competition between shock compaction and gas infiltration, the two major processes during shock interaction with the granular media. If shock compaction dominates gas infiltration, discernible jetting structures are expected, precipitated by agglomerates of fast-moving particles induced by the heterogeneous network of force chains. Otherwise, particles are uniformly accelerated by the interstitial flows and no distinguishable jetting structures are formed. We proceed to devise a phase map of jetting formation in the space defined by two dimensionless parameters that characterize the timescales of shock compaction and gas infiltration, respectively.
Keywords: compressible multiphase flows, DEM, granular jetting, pattern formation
Procedia PDF Downloads 75
12749 Optimizing Recycling and Reuse Strategies for Circular Construction Materials with Life Cycle Assessment
Authors: Zhongnan Ye, Xiaoyi Liu, Shu-Chien Hsu
Abstract:
Rapid urbanization has led to a significant increase in construction and demolition waste (C&D waste), underscoring the need for sustainable waste management strategies in the construction industry. Aiming to enhance the sustainability of urban construction practices, this study develops an optimization model to suggest optimal recycling and reuse strategies for C&D waste, including concrete and steel. By employing Life Cycle Assessment (LCA), the model evaluates the environmental impacts of the adopted construction materials throughout their lifecycle. The model optimizes the quantity of materials to recycle or reuse, the selection of specific recycling and reuse processes, and logistics decisions related to the transportation and storage of recycled materials, with the objective of minimizing the overall environmental impact, quantified in terms of carbon emissions, energy consumption, and associated costs, while adhering to a range of constraints. These constraints include capacity limitations, quality standards for recycled materials, compliance with environmental regulations, budgetary limits, and temporal considerations such as project deadlines and material availability. The resulting strategies are expected to be both cost-effective and environmentally beneficial, promoting a circular economy within the construction sector, aligning with global sustainability goals, and providing a scalable framework for managing construction waste in densely populated urban environments. The model helps reduce the carbon footprint of construction projects, conserve valuable resources, and support the industry's transition towards a more sustainable future.
Keywords: circular construction, construction and demolition waste, material recycling, optimization modeling
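A minimal sketch of the kind of recycle/reuse allocation the abstract describes, posed as a small linear program with SciPy; the impact factors, waste quantities, and capacity limit are hypothetical placeholders:

```python
import numpy as np
from scipy.optimize import linprog

# Decision variables: tonnes of concrete recycled, concrete landfilled,
# steel reused, steel landfilled.
# Hypothetical life-cycle carbon factors [kg CO2e per tonne handled].
carbon = np.array([12.0, 35.0, 8.0, 60.0])

# Mass balance: recycled/reused + landfilled must equal each waste stream.
A_eq = np.array([[1, 1, 0, 0],
                 [0, 0, 1, 1]])
b_eq = np.array([1000.0, 250.0])           # tonnes of concrete and steel C&D waste

# Capacity limit of the concrete recycling facility (tonnes it can process).
A_ub = np.array([[1, 0, 0, 0]])
b_ub = np.array([900.0])

res = linprog(c=carbon, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * 4, method="highs")
print("tonnes [recycle concrete, landfill concrete, reuse steel, landfill steel]:", res.x)
print("minimum total carbon impact [kg CO2e]:", res.fun)
```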
Procedia PDF Downloads 55
12748 Numerical Simulation of Urea Water Solution Evaporation Behavior inside the Diesel Selective Catalytic Reduction System
Authors: Kumaresh Selvakumar, Man Young Kim
Abstract:
Selective catalytic reduction (SCR) converts nitrogen oxides with the aid of a catalyst by adding aqueous urea into the exhaust stream. In this work, the urea water droplets sprayed into the exhaust gases are treated with Lagrangian particle tracking. The evaporation of ammonia from a single droplet of urea water solution is investigated computationally with a convection-diffusion controlled model. The conversion to ammonia due to thermolysis of the urea water droplets is measured downstream at different sections using a finite rate/eddy dissipation model. A mixer installed upstream enhances the distribution of ammonia over the entire domain, which is calculated for different time steps. Calculations are made over the respective duration such that complete decomposition of urea is possible at a much shorter residence time.
Keywords: convection-diffusion controlled model, Lagrangian particle tracking, selective catalytic reduction, thermolysis
Procedia PDF Downloads 403
12747 The Effect of Initial Sample Size and Increment in Simulation Samples on a Sequential Selection Approach
Authors: Mohammad H. Almomani
Abstract:
In this paper, we examine the effect of the initial sample size and the increment in simulation samples on the performance of a sequential approach used in selecting the top m designs when the number of alternative designs is very large. The sequential approach consists of two stages. In the first stage, ordinal optimization is used to select a subset that overlaps with the set of the actual best k% designs with high probability. In the second stage, optimal computing budget allocation is used to select the top m designs from the selected subset. We apply the selection approach to a generic example under several parameter settings, with different choices of initial sample size and increment in simulation samples, to explore the impacts on the performance of this approach. The results show that the choice of initial sample size and increment in simulation samples does affect the performance of the selection approach.
Keywords: large scale problems, optimal computing budget allocation, ordinal optimization, simulation optimization
Procedia PDF Downloads 354
12746 Introduction to Multi-Agent Deep Deterministic Policy Gradient
Authors: Xu Jie
Abstract:
As a key network security method, cryptographic services must cope with problems such as the wide variety of cryptographic algorithms, high concurrency requirements, random job crossovers, and instantaneous surges in workloads. Their complexity and dynamics also make it difficult for traditional static security policies to cope with an ever-changing environment of cyber threats. Traditional resource scheduling algorithms are inadequate when facing complex decision-making problems in dynamic environments. A network cryptographic resource allocation algorithm based on reinforcement learning is proposed, aiming to optimize task energy consumption, migration cost, and the fitness of differentiated services (including user, data, and task security). By modeling the multi-job collaborative cryptographic service scheduling problem as a multi-objective job flow scheduling problem and using a multi-agent reinforcement learning method, efficient scheduling and optimal configuration of cryptographic service resources are achieved. By introducing reinforcement learning, resource allocation strategies can be adjusted in real time in a dynamic environment, improving resource utilization and achieving load balancing. Experimental results show that this algorithm has significant advantages in path planning length, system delay, and network load balancing, and effectively solves the problem of complex resource scheduling in cryptographic services.
Keywords: multi-agent reinforcement learning, non-stationary dynamics, multi-agent systems, cooperative and competitive agents
Procedia PDF Downloads 21
12745 Modelling of Multi-Agent Systems for the Scheduling of Multi-EV Charging from Power Limited Sources
Authors: Manan’Iarivo Rasolonjanahary, Chris Bingham, Nigel Schofield, Masoud Bazargan
Abstract:
This paper presents research on, and the application of, model predictive scheduled charging of electric vehicles (EVs) subject to a limited available power resource. To focus on algorithmic and operational characteristics, the EV interface to the source is modelled as a battery state equation during the charging operation. The researched methods allow for priority scheduling of EV charging in a multi-vehicle regime subject to limited source power availability. Priority attribution for each connected EV is described. The validity of the developed methodology is shown through simulation of different charging scenarios for multiple connected EVs, including non-scheduled and scheduled operation with various numbers of vehicles. The performance of the developed algorithms is also reported, with recommendations on the choice of suitable parameters.
Keywords: model predictive control, non-scheduled, power limited sources, scheduled and stop-start battery charging
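A minimal sketch of priority-based allocation of a limited source power among connected EVs, with the battery state advanced by a simple integrator; the priorities, power limits, and battery parameters are illustrative assumptions, not the paper's model predictive algorithm:

```python
from dataclasses import dataclass

@dataclass
class EV:
    name: str
    soc: float          # state of charge, 0..1
    capacity_kwh: float
    max_charge_kw: float
    priority: float     # higher value = charged first

def schedule_step(evs, available_kw, dt_h=0.25):
    """Allocate limited power to EVs by priority and update battery states."""
    remaining = available_kw
    for ev in sorted(evs, key=lambda e: e.priority, reverse=True):
        if ev.soc >= 1.0 or remaining <= 0.0:
            continue
        p = min(ev.max_charge_kw, remaining)                   # power granted this step
        energy = p * dt_h                                      # kWh delivered this step
        ev.soc = min(1.0, ev.soc + energy / ev.capacity_kwh)   # simple battery state update
        remaining -= p
    return remaining                                           # unused power, if any

fleet = [EV("A", 0.2, 40.0, 7.0, priority=0.9),
         EV("B", 0.6, 60.0, 11.0, priority=0.4),
         EV("C", 0.1, 30.0, 7.0, priority=0.7)]
for _ in range(8):                                             # simulate 2 hours in 15-min steps
    schedule_step(fleet, available_kw=10.0)
print([(ev.name, round(ev.soc, 2)) for ev in fleet])
```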
Procedia PDF Downloads 155
12744 Study on Seismic Response Feature of Multi-Span Bridges Crossing Fault
Authors: Yingxin Hui
Abstract:
Understanding the seismic response features of bridges crossing faults is the basis of seismic fortification. Taking a multi-span bridge crossing an active fault under construction as an example, the seismic ground motions at the bridge site were generated following a hybrid simulation methodology. Multi-support excitation displacement input models and nonlinear time history analysis were used to calculate the seismic response of the structures, and the results were compared with those for a bridge in the near-fault region. The results showed that the seismic response features of bridges crossing a fault differ from those of bridges in the near-fault region. Designing as for a bridge in the near-fault region, ignoring the effect of crossing the fault, would yield unsafe and unreasonable results. Seismic fortification design should be based on the actual seismic response features, which can reduce the adverse effects caused by structural damage.
Keywords: bridge engineering, seismic response feature, across faults, rupture directivity effect, fling step
Procedia PDF Downloads 429
12743 Chronic Impact of Silver Nanoparticle on Aerobic Wastewater Biofilm
Authors: Sanaz Alizadeh, Yves Comeau, Arshath Abdul Rahim, Sunhasis Ghoshal
Abstract:
The application of silver nanoparticles (AgNPs) in personal care products and various household and industrial products has resulted in an inevitable environmental exposure to such engineered nanoparticles (ENPs). Ag ENPs released via household and industrial wastes reach water resource recovery facilities (WRRFs), yet the fate and transport of ENPs in WRRFs and their potential risk to biological wastewater processes are poorly understood. Accordingly, our main objective was to elucidate the impact of long-term continuous exposure to AgNPs on the biological activity of aerobic wastewater biofilm. The fate, transport and toxicity of 10 μg/L and 100 μg/L PVP-stabilized AgNPs (50 nm) were evaluated in an attached-growth biological treatment process using lab-scale moving bed bioreactors (MBBRs). Two MBBR systems for organic matter removal were fed with a synthetic influent and operated at a hydraulic retention time (HRT) of 180 min and a 60% volumetric filling ratio of Anox-K5 carriers with a specific surface area of 800 m2/m3. Both reactors were operated for 85 days after reaching steady-state conditions to develop a mature biofilm. The impact of AgNPs on the biological performance of the MBBRs was characterized over a period of 64 days in terms of the filtered biodegradable COD (SCOD) removal efficiency, biofilm viability and key enzymatic activities (α-glucosidase and protease). The AgNPs were quantitatively characterized using single-particle inductively coupled plasma mass spectrometry (spICP-MS), simultaneously determining the particle size distribution, particle concentration and dissolved silver content in influent, bioreactor and effluent samples. The generation of reactive oxygen species (ROS) and the resulting oxidative stress were assessed as the proposed toxicity mechanism of AgNPs. Results indicated that a low concentration of AgNPs (10 μg/L) did not significantly affect the SCOD removal efficiency, whereas a significant reduction in treatment efficiency (37%) was observed at 100 μg/L AgNPs. Neither the viability nor the enzymatic activities of the biofilm were affected at 10 μg/L AgNPs, but the higher concentration of AgNPs induced cell membrane integrity damage, resulting in a 31% loss of viability and reductions in α-glucosidase and protease enzymatic activities of 31% and 29%, respectively, over the 64-day exposure period. The elevated intercellular ROS in the biofilm at the higher AgNP concentration over time was consistent with reduced biological biofilm performance, confirming the occurrence of nanoparticle-induced oxidative stress in the heterotrophic biofilm. The spICP-MS analysis demonstrated a decrease in nanoparticle concentration over the first 25 days, indicating significant partitioning of AgNPs into the biofilm matrix in both reactors. After 25 days, however, the nanoparticle concentration in the effluent of both reactors increased, indicating a decreased retention capacity for AgNPs in the biofilm. The observed significant detachment of biofilm also contributed to a higher release of nanoparticles, owing to the cell-wall-destabilizing properties of AgNPs as an antimicrobial agent. The removal efficiency of PVP-AgNPs and the biological responses of the biofilm were a function of nanoparticle concentration and exposure time.
This study contributes to a better understanding of the fate and behavior of AgNPs in biological wastewater processes, providing key information that can be used to predict the environmental risks of ENPs in aquatic ecosystems.
Keywords: biofilm, silver nanoparticle, single particle ICP-MS, toxicity, wastewater
Procedia PDF Downloads 267
12742 Multi Agent Based Pre-Hospital Emergency Management Architecture
Authors: Jaleh Shoshtarian Malak, Niloofar Mohamadzadeh
Abstract:
Managing pre-hospital emergency patients requires real-time practices and efficient resource utilization. Since we face a distributed network of healthcare providers, services and applications, choosing the right resources and treatment protocol for the patient's situation is a critical task. Delivering care to emergency patients at the right time and with suitable treatment settings can save lives and prevent further complications. In recent years, Multi Agent Systems (MAS) have introduced strong solutions for real-time, distributed and complicated problems. In this paper, we propose a multi-agent-based pre-hospital emergency management architecture to manage coordination, collaboration, treatment protocols and healthcare provider selection between the different parties in pre-hospital emergency care in a self-organizing manner. We used the AnyLogic Agent Based Modeling (ABM) tool to simulate our proposed architecture. We have analyzed and described the functionality of the EMS center, ambulance, consultation center, EHR repository and quality of care monitoring as the main collaborating agents. Future work includes implementation of the proposed architecture and evaluation of its impact on improving patient quality of care.
Keywords: multi agent systems, pre-hospital emergency, simulation, software architecture
Procedia PDF Downloads 423
12741 Structural Analysis of Phase Transformation and Particle Formation in Metastable Metallic Thin Films Grown by Plasma-Enhanced Atomic Layer Deposition
Authors: Pouyan Motamedi, Ken Bosnick, Ken Cadien, James Hogan
Abstract:
Growth of conformal ultrathin metal films has recently attracted considerable attention. Plasma-enhanced atomic layer deposition (PEALD) is a method capable of growing conformal thin films at low temperatures with exemplary control over thickness. The authors have recently reported the growth of metastable epitaxial nickel thin films via PEALD, along with a comprehensive characterization of the films and a study of the relationship between the growth parameters and the film characteristics. The goal of the current study is to use these films as a case study to investigate temperature-activated phase transformation and agglomeration in ultrathin metallic films. For this purpose, metastable hexagonal nickel thin films were annealed using a controlled heating/cooling apparatus. The transformations in the crystal structure were observed via in-situ synchrotron X-ray diffraction. The samples were annealed to various temperatures in the range of 400-1100 °C. The onset and progression of particle formation were studied in-situ via laser measurements. In addition, a four-point probe measurement tool was used to record the changes in the resistivity of the films, which is affected by phase transformation as well as by roughening and agglomeration. Thin films annealed at various temperature steps were then studied via atomic force microscopy, scanning electron microscopy and high-resolution transmission electron microscopy in order to better understand the correlated mechanisms through which phase transformation and particle formation occur. The results indicate that the onset of the hcp-to-bcc transformation is at 400 °C, while particle formation commences at 590 °C. If the annealed films are quenched after transformation but prior to agglomeration, they show a noticeable drop in resistivity. This can be attributed to the fact that the hcp films are grown epitaxially and are under severe tensile strain, and annealing leads to relaxation of the mismatch strain. In general, the results shed light on the nature of structural transformation in nickel thin films, and in metallic thin films in general.
Keywords: atomic layer deposition, metastable, nickel, phase transformation, thin film
Procedia PDF Downloads 327
12740 Lubrication Performance of Multi-Level Gear Oil in a Gasoline Engine
Authors: Feng-Tsai Weng, Dong-Syuan Cai, Tsochu-Lin
Abstract:
A vehicle gasoline engine converts gasoline into power so that the car can move, and lubricants are important for both engines and gear boxes. Manufacturers produce numerous engine oils and gear oils for engines and gear boxes to SAE International standards. Some products not only improve the lubrication of both the engine and the gear box but can also raise the power of the vehicle, as can easily be seen in the advertisements declared by the manufacturers. To observe lubrication performance, a multi-level (heavy duty) gear oil was used as the engine oil in a gasoline vehicle. The oil was checked about every 10,000 kilometers. Finally, the engine was disassembled in detail, cleaned, and its parts were measured. The wear of the engine components was checked and recorded. Based on the experimental results, some gear oils appear suitable for use as engine oil in particular vehicles. Vehicle owners should change oil periodically, about every 6,000 miles (or 10,000 kilometers). Used car owners may change engine oil at even longer intervals.
Keywords: multi-level gear oil, engine oil, viscosity, abrasion
Procedia PDF Downloads 318
12739 Topical Delivery of Griseofulvin via Lipid Nanoparticles
Authors: Yann Jean Tan, Hui Meng Er, Choy Sin Lee, Shew Fung Wong, Wen Huei Lim
Abstract:
Griseofulvin is a long-standing fungistatic agent against dermatophytosis. Nevertheless, it has several drawbacks, such as poor and highly variable bioavailability, long treatment duration, systemic side effects and drug interactions. Targeted treatment of the superficial skin infection dermatophytosis via the topical route could therefore be beneficial. However, griseofulvin is only available as an oral preparation, which generates interest in developing a topical formulation of griseofulvin using lipid nanoparticles as the vehicle. A lipid nanoparticle is a submicron colloidal carrier with a solid (lipid) core. It combines the advantages of various traditional carriers and is a promising vehicle for topical delivery. Griseofulvin-loaded lipid nanoparticles produced using a high-pressure homogenization method were characterized and investigated for their skin-targeting effect in vitro. They had a mean particle size of 179.8±4.9 nm with a polydispersity index of 0.306±0.011 and showed higher skin permeation and a better skin-targeting effect compared with a griseofulvin suspension.
Keywords: lipid nanoparticles, griseofulvin, topical, dermatophytosis
Procedia PDF Downloads 456
12738 Research on Knowledge Graph Inference Technology Based on Proximal Policy Optimization
Authors: Yihao Kuang, Bowen Ding
Abstract:
With the increasing scale and complexity of knowledge graphs, modern knowledge graphs contain more and more types of entity, relationship, and attribute information. Therefore, in recent years it has become a trend for knowledge graph inference to use reinforcement learning to deal with large-scale, incomplete, and noisy knowledge graphs and to improve inference quality and interpretability. The Proximal Policy Optimization (PPO) algorithm uses a proximal policy optimization approach: it allows substantial updates of policy parameters while constraining the update extent to maintain training stability. This characteristic enables PPO to converge to improved policies more rapidly, often demonstrating enhanced performance early in training. Furthermore, PPO can effectively reuse collected experience data for training, improving sample utilization; even with limited resources, PPO can train efficiently for reinforcement learning tasks. Based on these characteristics, this paper aims to obtain a better and more efficient inference effect by introducing PPO into knowledge inference technology.
Keywords: reinforcement learning, PPO, knowledge inference
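A minimal numerical sketch of the clipped surrogate objective that gives PPO the bounded policy updates described above (toy log-probabilities and advantages, not the knowledge-graph agent itself):

```python
import numpy as np

def ppo_clip_loss(logp_new, logp_old, advantages, eps=0.2):
    """Clipped surrogate objective: L = -E[min(r*A, clip(r, 1-eps, 1+eps)*A)]."""
    ratio = np.exp(logp_new - logp_old)                # probability ratio r_t(theta)
    unclipped = ratio * advantages
    clipped = np.clip(ratio, 1.0 - eps, 1.0 + eps) * advantages
    return -np.mean(np.minimum(unclipped, clipped))    # negated for minimization

# Toy batch: clipping removes the incentive to move the policy too far in one update.
logp_old = np.log(np.array([0.25, 0.10, 0.40]))
logp_new = np.log(np.array([0.45, 0.05, 0.41]))
adv = np.array([1.0, -0.5, 0.2])
print("PPO clipped loss:", ppo_clip_loss(logp_new, logp_old, adv))
```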
Procedia PDF Downloads 241
12737 The Search of Anomalous Higgs Boson Couplings at the Large Hadron Electron Collider and Future Circular Electron Hadron Collider
Authors: Ilkay Turk Cakir, Murat Altinli, Zekeriya Uysal, Abdulkadir Senol, Olcay Bolukbasi Yalcinkaya, Ali Yilmaz
Abstract:
The Higgs boson was discovered by the ATLAS and CMS experimental groups in 2012 at the Large Hadron Collider (LHC). The production and decay properties of the Higgs boson, its Standard Model (SM) couplings, and limits on the effective scale of the Higgs boson's couplings with other bosons are investigated at particle colliders. Deviations from SM estimates are parametrized by effective Lagrangian terms to investigate Higgs couplings; this is a model-independent method for describing new physics. In this study, the sensitivity to anomalous couplings of neutral gauge bosons with the Higgs boson is investigated using the parameters of the Large Hadron electron Collider (LHeC) and the Future Circular electron-hadron Collider (FCC-eh) with a model-independent approach. Using the MadGraph5_aMC@NLO multi-purpose event generator with the parameters of the LHeC and FCC-eh, bounds on the anomalous Hγγ, HγZ and HZZ couplings in the e− p → e− q H process are obtained. Detector simulations are also taken into account in the calculations.
Keywords: anomalous couplings, FCC-eh, Higgs, Z boson
Procedia PDF Downloads 206
12736 Fuzzy Vehicle Routing Problem for Extreme Environment
Authors: G. Sirbiladze, B. Ghvaberidze, B. Matsaberidze
Abstract:
A fuzzy vehicle routing problem is considered in a possibilistic environment. A new criterion, maximization of the expected reliability of movement on closed routes, is constructed. The objective of the research is to implement a two-stage scheme for the solution of this problem. In the first stage, a sample of so-called “promising” routes is selected based on the algorithm of preferences. In the second stage, a new bi-criteria problem is solved for the selected promising routes: minimization of the total traveled distance and maximization of route reliability. The problem is stated as a fuzzy partitioning problem. Two possible solutions of this scheme are considered.
Keywords: vehicle routing problem, fuzzy partitioning problem, multiple-criteria optimization, possibility theory
Procedia PDF Downloads 547
12735 A Comparative Analysis Approach Based on Fuzzy AHP, TOPSIS and PROMETHEE for the Selection Problem of GSCM Solutions
Authors: Omar Boutkhoum, Mohamed Hanine, Abdessadek Bendarag
Abstract:
Sustainable economic growth is nowadays driving firms toward the adoption of many green supply chain management (GSCM) solutions. However, evaluating and selecting these solutions requires careful decisions and is complex owing to the many associated factors. To resolve this problem, a comparative analysis approach based on multi-criteria decision-making methods is proposed for the adequate evaluation of sustainable supply chain management solutions. In the present paper, we propose an integrated decision-making model based on FAHP (Fuzzy Analytic Hierarchy Process), TOPSIS (Technique for Order of Preference by Similarity to Ideal Solution) and PROMETHEE (Preference Ranking Organisation METHod for Enrichment Evaluations) to contribute to a better understanding and the development of new sustainable strategies for industrial organizations. Because the selected criteria vary in importance, FAHP is used to identify the evaluation criteria and assign an importance weight to each criterion, while the TOPSIS and PROMETHEE methods use these weighted criteria as inputs to evaluate and rank the alternatives. The main objective is to provide a comparative analysis based on the TOPSIS and PROMETHEE processes to help make sound and reasoned decisions related to the GSCM solution selection problem.
Keywords: GSCM solutions, multi-criteria analysis, decision support system, TOPSIS, FAHP, PROMETHEE
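A minimal sketch of the TOPSIS stage of such an integrated model, taking the FAHP-derived weights as given; the decision matrix, weights, and criterion directions below are illustrative:

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives with TOPSIS.

    matrix  -- alternatives x criteria performance matrix
    weights -- criterion weights (e.g. from FAHP), summing to 1
    benefit -- True for benefit criteria, False for cost criteria
    """
    norm = matrix / np.linalg.norm(matrix, axis=0)     # vector normalization
    v = norm * weights                                 # weighted normalized matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)          # distance to the ideal solution
    d_neg = np.linalg.norm(v - anti, axis=1)           # distance to the anti-ideal solution
    return d_neg / (d_pos + d_neg)                     # closeness coefficient (higher = better)

# Three hypothetical GSCM solutions rated on four criteria.
scores = np.array([[7.0, 0.30, 8.0, 5.0],
                   [6.0, 0.25, 9.0, 7.0],
                   [8.0, 0.40, 6.0, 6.0]])
weights = np.array([0.35, 0.25, 0.25, 0.15])           # e.g. obtained from FAHP
benefit = np.array([True, False, True, True])          # second criterion is a cost
print("closeness coefficients:", topsis(scores, weights, benefit))
```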
Procedia PDF Downloads 162
12734 Fracture Crack Monitoring Using Digital Image Correlation Technique
Authors: B. G. Patel, A. K. Desai, S. G. Shah
Abstract:
The main objective of this paper is to develop a new measurement technique that does not touch the object. Digital image correlation (DIC) is an advanced measurement technique used to measure particle displacement with very high accuracy. This powerful, innovative technique correlates two image segments to determine the similarity between them. For this study, nine geometrically similar beam specimens of different sizes, with fibers (steel and glass) and without fibers, were tested under three-point bending in a closed-loop servo-controlled machine with crack mouth opening displacement (CMOD) control at an opening rate of 0.0005 mm/sec. Digital images were captured before loading (undeformed state) and at different instances of loading, and were analyzed using correlation techniques to compute the surface displacements, crack opening and sliding displacements, load-point displacement, crack length and crack tip location. It was seen that the CMOD and vertical load-point displacement computed using DIC analysis match well with those measured experimentally.
Keywords: digital image correlation, fibres, self compacting concrete, size effect
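A minimal sketch of the correlation step at the heart of DIC: a zero-normalized cross-correlation coefficient between a reference subset and a shifted copy of the image, with an integer-pixel displacement search (synthetic arrays, not the beam images; a full DIC code would add subpixel refinement):

```python
import numpy as np

def zncc(f, g):
    """Zero-normalized cross-correlation between two equally sized image subsets."""
    f = f - f.mean()
    g = g - g.mean()
    return float(np.sum(f * g) / np.sqrt(np.sum(f**2) * np.sum(g**2)))

# Synthetic speckle image and a "deformed" copy rigidly shifted by (dy=1, dx=2).
rng = np.random.default_rng(2)
image = rng.random((40, 40))
deformed = np.roll(image, shift=(1, 2), axis=(0, 1))
ref = image[10:30, 10:30]                      # reference subset in the undeformed image

# Search for the integer-pixel displacement that maximizes the correlation.
best = max(((dx, dy, zncc(ref, deformed[10 + dy:30 + dy, 10 + dx:30 + dx]))
            for dx in range(-3, 4) for dy in range(-3, 4)),
           key=lambda t: t[2])
print("estimated displacement (dx, dy) and correlation:", best)
```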
Procedia PDF Downloads 387
12733 Air Pollution: The Journey from Single Particle Characterization to in vitro Fate
Authors: S. Potgieter-Vermaak, N. Bain, A. Brown, K. Shaw
Abstract:
It is well known from public news media that air pollution is a health hazard and is responsible for early deaths. Quantifying the relationship between air quality and health is a probing question not easily answered. It is known that airborne particulate matter (APM) <2.5 µm deposits in the tracheal and alveolar zones, and our research probes the possibility of quantifying pulmonary injury by linking reactive oxygen species (ROS) in these particles to DNA damage. Currently, APM mass concentration is linked to early deaths, and only limited studies probe the influence of other properties on human health. To predict the full extent and type of impact, particles need to be characterised for chemical composition and structure. APM is routinely analysed for its bulk composition, but of late, analyses at the micro level probing single-particle character using micro-analytical techniques have been considered. The latter, single particle analysis (SPA), permits one to obtain detailed information on the chemical character of nano- to micron-sized particles. This paper aims to provide a snapshot of studies using data obtained from chemical characterisation and its link with in-vitro studies to inform on personal health risks. For this purpose, two studies are compared, namely the bioaccessibility of the inhalable fraction of urban road dust versus total suspended particulates (TSP) collected in the same urban environment. The significant influence of metals such as Cu and Fe in TSP on DNA damage is illustrated. The speciation of Hg (determined by SPA) in different urban environments, rather than its concentration, proved to dictate its bioaccessibility in artificial lung fluids.
Keywords: air pollution, human health, in-vitro studies, particulate matter
Procedia PDF Downloads 225
12732 A Resource Optimization Strategy for CPU (Central Processing Unit) Intensive Applications
Authors: Junjie Peng, Jinbao Chen, Shuai Kong, Danxu Liu
Abstract:
With traditional resource allocation strategies, the usage of resources on physical servers in a cloud data center is highly uncertain: assigning too few tasks wastes resources, while assigning too many causes overload. This is especially obvious when the applications are of the same type, because of their shared resource preferences. Considering that CPU intensive applications are one of the most common application types in the cloud, we studied an optimization strategy for CPU intensive applications running on the same server. We used resource preferences to analyze the case in which multiple CPU intensive applications run simultaneously and put forward a model that can predict the execution time of CPU intensive applications running simultaneously. Based on the prediction model, we propose a method to select the appropriate number of applications for a machine. Experiments show that the model can accurately predict the execution time of CPU intensive applications. To improve the execution efficiency of applications, we also propose a priority-based scheduling model for CPU intensive applications. Extensive experiments verify the validity of the scheduling model.
Keywords: cloud computing, CPU intensive applications, resource optimization, strategy
Procedia PDF Downloads 276
12731 Sensitivity Analysis of Prestressed Post-Tensioned I-Girder and Deck System
Authors: Tahsin A. H. Nishat, Raquib Ahsan
Abstract:
Sensitivity analysis of the design parameters of an optimization procedure can become a significant factor when designing any structural system. The objectives of this study are to analyze the sensitivity of the deck slab thickness parameter obtained from both the conventional and the optimum design methodology of a pre-stressed post-tensioned I-girder and deck system, and to compare the relative significance of slab thickness. For the analysis of the conventional method, the values of 14 design parameters obtained by the conventional iterative design method for a real-life I-girder bridge project have been considered. For the analysis of the optimization method, cost optimization of this system has been carried out using the global optimization methodology 'Evolutionary Operation (EVOP)'. The problem from which the optimum values of the 14 design parameters have been obtained contains 14 explicit constraints and 46 implicit constraints. For both sets of design parameters, sensitivity analysis has been conducted on the deck slab thickness parameter, which can become highly sensitive at the obtained optimum solution. Deviations of slab thickness on both the upper and lower side of its optimum value have been considered, reflecting its realistic possible range of variation during construction. In this procedure, the remaining parameters have been kept unchanged. For small deviations from the optimum value, compliance with the explicit and implicit constraints has been examined, and variations in cost have also been estimated. It is found that, without violating any constraint, the deck slab thickness obtained by the conventional method can be increased by up to 25 mm, whereas the slab thickness obtained by cost optimization can be increased only by up to 0.3 mm. This result suggests that slab thickness is less sensitive in the conventional design method. Therefore, for realistic design purposes, a sensitivity analysis should be conducted for either design procedure of the girder and deck system.
Keywords: sensitivity analysis, optimum design, evolutionary operations, PC I-girder, deck system
Procedia PDF Downloads 135
12730 The Bernstein Expansion for Exponentials in Taylor Functions: Approximation of Fixed Points
Authors: Tareq Hamadneh, Jochen Merker, Hassan Al-Zoubi
Abstract:
Bernstein's expansion for exponentials in Taylor functions provides lower and upper optimization values for the range of the original function. These values converge to the original function if the degree is elevated or the domain is subdivided. A Taylor polynomial can be applied so that the exponential is approximated by a polynomial of finite degree over a given domain. Bernstein's basis has two main properties: its sum equals 1, and each basis function is positive for all x ∈ (0, 1). In this work, we prove the existence of fixed points for exponential functions in a given domain using the Bernstein optimization values. The Bernstein basis of finite degree T over a domain D is defined non-negatively. Any polynomial p of degree t can be expanded into Bernstein form of maximum degree t ≤ T, where we only need to compute the Bernstein coefficients in order to bound the original polynomial. The main property is that p(x) is bounded below and above by the minimum and maximum Bernstein coefficients (the Bernstein bound). If this bound is contained in the given domain, then we say that p(x) has fixed points in that domain.
Keywords: Bernstein polynomials, stability of control functions, numerical optimization, Taylor function
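A minimal sketch of the Bernstein enclosure described above: converting a power-form polynomial on [0, 1] to its Bernstein coefficients and reading off lower and upper bounds on its range; the degree-4 Taylor polynomial of exp(x) is used as an illustrative input:

```python
import numpy as np
from math import comb, factorial

def bernstein_coefficients(a):
    """Bernstein coefficients on [0, 1] of p(x) = sum_i a[i] * x**i (same degree)."""
    n = len(a) - 1
    return np.array([sum(comb(k, i) / comb(n, i) * a[i] for i in range(k + 1))
                     for k in range(n + 1)])

# Illustrative example: the degree-4 Taylor polynomial of exp(x) about 0, on [0, 1].
a = [1.0 / factorial(i) for i in range(5)]
b = bernstein_coefficients(a)

# Enclosure property: min(b) <= p(x) <= max(b) for all x in [0, 1].
x = np.linspace(0.0, 1.0, 201)
p = np.polyval(a[::-1], x)                 # polyval expects highest-degree coefficient first
print("Bernstein bounds:", b.min(), b.max())
print("true range of p: ", p.min(), p.max())
```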
Procedia PDF Downloads 134
12729 Optimization of Floor Heating System in the Incompressible Turbulent Flow Using Constructal Theory
Authors: Karim Farahmandfar, Hamidolah Izadi, Mohammadreza Rezaei, Amin Ardali, Ebrahim Goshtasbi Rad, Khosro Jafarpoor
Abstract:
Statistics show that a large share of annual energy consumption goes to meeting demand in buildings. Therefore, it is vital to economize energy consumption and to find solutions to this issue. One system for heating a building is floor heating, whose performance is based on convection and radiation; in addition to creating a favorable heating condition, this method leads to energy savings. The goal of this article is to outline constructal theory and introduce an optimization method for branch networks in floor heating. There are several steps to achieve this purpose. First, the pressure drop between two points of the network is calculated as a function of pipe diameter and other parameters. After that, the amount of heat transfer is determined. Consequently, the combination of these two functions yields the final objective function. It is necessary to mention that the flow is laminar.
Keywords: constructal theory, optimization, floor heating system, turbulent flow
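Since the abstract states that the flow is laminar, the pressure drop between two points of a branch can be sketched with the Hagen-Poiseuille relation; the pipe length, diameter, flow rate, and viscosity below are illustrative values, not the article's network:

```python
import math

def laminar_pressure_drop(flow_m3s, length_m, diameter_m, mu=1.0e-3):
    """Hagen-Poiseuille pressure drop [Pa] for laminar pipe flow:
    dP = 128 * mu * L * Q / (pi * D**4)."""
    return 128.0 * mu * length_m * flow_m3s / (math.pi * diameter_m ** 4)

# Illustrative branch of a floor-heating loop: 20 m of 10 mm pipe at 0.01 L/s of water.
q = 0.01e-3                                    # volumetric flow rate in m^3/s
print("pressure drop [Pa]:", laminar_pressure_drop(q, 20.0, 0.010))
```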
Procedia PDF Downloads 318
12728 On Energy Condition Violation for Shifting Negative Mass Black Holes
Authors: Manuel Urueña Palomo
Abstract:
In this paper, we introduce the study of a new solution to gravitational singularities that violates the energy conditions of the Penrose-Hawking singularity theorems. We consider that a shift to negative energies, and thus to negative masses, takes place at the event horizon of a black hole, justified by the original, singular and exact Schwarzschild solution. These negative energies are supported by relativistic particle physics through the negative-energy solutions of the Dirac equation, in which a time transformation yields a negative-energy particle. In both general relativity and full Newtonian mechanics, these negative masses are predicted to be repulsive. It is demonstrated that the model fits actual observations and could possibly clarify the size of observed and unexplained supermassive black holes, when considering the inflation that would take place inside the event horizon, where massive particles interact antigravitationally. An approximate solution of the proposed model could be simulated in order to compare it with these observations.
Keywords: black holes, CPT symmetry, negative mass, time transformation
Procedia PDF Downloads 148
12727 Application of Multilinear Regression Analysis for Prediction of Synthetic Shear Wave Velocity Logs in Upper Assam Basin
Authors: Triveni Gogoi, Rima Chatterjee
Abstract:
Shear wave velocity (Vs) estimation is an important approach in the seismic exploration and characterization of a hydrocarbon reservoir. Various methods exist for predicting S-wave velocity when a recorded S-wave log is not available, but all of them are empirical mathematical models. Shear wave velocity can be estimated from P-wave velocity by applying Castagna's equation, which is the most common approach; however, the constants used in Castagna's equation vary for different lithologies and geological set-ups. In this study, multiple regression analysis has been used for the estimation of S-wave velocity. The EMERGE module of the Hampson-Russell software has been used for the generation of the S-wave log. Both single-attribute and multi-attribute analyses have been carried out for the generation of synthetic S-wave logs in the Upper Assam basin. The Upper Assam basin, situated in northeastern India, is one of the most important petroleum provinces of India. The present study was carried out using four wells in the study area; S-wave velocity was available for three of them. The main objective of the present study is the prediction of shear wave velocity for wells where S-wave velocity information is not available. The three wells having S-wave velocity were first used to test the reliability of the method, and the generated S-wave log was compared with the actual S-wave log. Single-attribute analysis was carried out for these three wells within the depth range 1700-2100 m, which corresponds to the Barail Group of Oligocene age. The Barail Group, the primary producing reservoir of the basin, is the main target zone in this study. A system-generated list of attributes with varying degrees of correlation was produced, and the attribute with the highest correlation was chosen for the single-attribute analysis. Crossplots between the attributes show the deviation of points from the line of best fit. The final result of the analysis was compared with the available S-wave log and shows a good visual fit, with a correlation of 72%. Next, multi-attribute analysis was carried out for the same data using all the wells within the same analysis window. A high correlation of 85% was observed between the output log from the analysis and the recorded S-wave log. The close fit between the synthetic and recorded S-wave logs validates the reliability of the method. For further authentication, the generated S-wave data from the wells were tied to the seismic and correlated with it. A synthetic shear wave log has been generated for well M2, where the S-wave log is not available, and it shows a good correlation with the seismic. Neutron porosity, density, acoustic impedance (AI) and P-wave velocity proved to be the most significant variables in this statistical method for S-wave generation. The multilinear regression method can thus be considered a reliable technique for the generation of shear wave velocity logs in this study.
Keywords: Castagna's equation, multi linear regression, multi attribute analysis, shear wave logs
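A minimal sketch of the multi-attribute step: fitting a multilinear regression of S-wave velocity on several log attributes and predicting Vs where it was not recorded; the attribute names follow the abstract, but the data are synthetic (the Vs trend is merely Castagna-like, not calibrated to the basin):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Synthetic training "logs" from wells where Vs was recorded:
# columns are neutron porosity, density, acoustic impedance (AI) and Vp.
rng = np.random.default_rng(3)
X_train = np.column_stack([rng.uniform(0.05, 0.35, 300),     # neutron porosity (frac)
                           rng.uniform(2.2, 2.7, 300),       # density (g/cc)
                           rng.uniform(6.0, 12.0, 300),      # AI (km/s * g/cc)
                           rng.uniform(2.5, 4.5, 300)])      # Vp (km/s)
# Synthetic Vs built mainly from Vp (Castagna-like trend) plus noise.
vs_train = 0.86 * X_train[:, 3] - 1.17 + rng.normal(0, 0.05, 300)

model = LinearRegression().fit(X_train, vs_train)
print("attribute weights:", model.coef_, "intercept:", model.intercept_)

# Predict a synthetic Vs value for a well (e.g. M2) where only the attributes exist.
X_new = np.array([[0.20, 2.45, 9.0, 3.4]])
print("predicted Vs (km/s):", model.predict(X_new))
```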
Procedia PDF Downloads 226
12726 Nursing Experience in Improving Physical and Mental Well-Being of a Patient with Premature Menopause Osteoporosis and Sarcopenia in Nursing-Led Multi-Discipline Care
Authors: Huang Chiung Chiu
Abstract:
This article describes the nursing experience of assisting an outpatient with premature menopause, osteoporosis and sarcopenia through a multi-discipline care model. The nursing period was from September 22nd, 2020, to December 7th, 2020, during which data were collected through interviews with the patient, observation, and physical assessment. It was found that the main health problems were insufficient nutrition, reduced physical activity, insomnia, and potentially dangerous falls. As an outpatient nurse, the author observed that in recent years the age group of women with premature menopause, osteoporosis and sarcopenia has shifted downward. Integrated multi-disciplinary interventions were provided upon the initial diagnosis of osteoporosis and sarcopenia. In the outpatient care setting, collaborative teamwork among doctors, nutritionists, osteoporosis educators, rehabilitation specialists, physical therapists and other specialized teams was applied to provide individualized, integrated multi-disciplinary care. Through empathy and the establishment of attentive care, companionship and trust, we discussed care plans and treatment guidelines with the patient, providing accurate and complete disease information and feedback education to strengthen the patient's knowledge and motivation for exercise. Nursing guidance regarding dietary nutrition and adjustment of the daily routine was provided to increase self-care ability, improve the health problems of muscle weakness and insomnia, and prevent falls. For patients with postmenopausal osteoporosis and sarcopenia, it is recommended that nurses coordinate the multi-discipline integrated care model, adjust the patient's lifestyle and diet, and establish a regular exercise plan so that cases can be evaluated holistically to improve the quality of care and physical and mental comfort.
Keywords: multi-discipline care model, premature menopause, osteoporosis, sarcopenia, insomnia
Procedia PDF Downloads 117
12725 Statistical Optimization of Vanillin Production by Pycnoporus Cinnabarinus 1181
Authors: Swarali Hingse, Shraddha Digole, Uday Annapure
Abstract:
The present study investigates the biotransformation of ferulic acid to vanillin by Pycnoporus cinnabarinus and its optimization using the one-factor-at-a-time method as well as a statistical approach. The effect of various physicochemical parameters and medium components was studied using the one-factor-at-a-time method. Screening of the significant factors was carried out using an L25 Taguchi orthogonal array, and these selected significant factors were then further optimized using response surface methodology (RSM). The significant media components identified using the Taguchi L25 orthogonal array were glucose, KH2PO4 and yeast extract. A Box-Behnken design was then used to investigate the interactive effects of the three most significant media components. The final medium obtained after optimization using RSM, containing glucose (34.89 g/L), diammonium tartrate (1 g/L), yeast extract (1.47 g/L), MgSO4•7H2O (0.5 g/L), KH2PO4 (0.15 g/L), and CaCl2•2H2O (20 mg/L), resulted in an increase in vanillin production from 30.88 mg/L to 187.63 mg/L.
Keywords: ferulic acid, Pycnoporus cinnabarinus, response surface methodology, vanillin
Procedia PDF Downloads 381