Search results for: optimal formulation
3533 Optimal and Best Timing for Capturing Satellite Thermal Images of Concrete Object
Authors: Toufic Abd El-Latif Sadek
Abstract:
The concrete object represents concrete areas, such as buildings. The easiest and most efficient extraction of concrete objects from satellite thermal images occurs at specific times on specific days of the year, by avoiding the time intervals in which different objects give the same or very similar brightness. The aim of the study is to identify these times, so as to obtain the best original data and hence better extraction and better analysis of the concrete object. The study used seven sample objects, asphalt, concrete, metal, rock, dry soil, vegetation, and water, located at one carefully chosen site so that all objects yield homogeneous data acquired at the same time under the same weather conditions. The samples were placed on the roof of a building at a position determined by the Global Positioning System (GPS), with geographical coordinates: latitude 33 degrees 37 minutes, longitude 35 degrees 28 minutes, height 600 m. It was found that the best time is 2:00 pm in February, 4:00 pm in March, 12:00 pm in April and May, 5:00 pm in August, and 11:00 am in October. The best time in June and November is 2:00 pm.
Keywords: best timing, concrete areas, optimal, satellite thermal images
Procedia PDF Downloads 355
3532 Financial Portfolio Optimization in Electricity Markets: Evaluation via Sharpe Ratio
Authors: F. Gökgöz, M. E. Atmaca
Abstract:
Electricity plays an indispensable role in human life and the economy. It is a unique product or service that must be balanced instantaneously: since electricity is not stored, generation and consumption must be kept in proportion. Effective and efficient use of electricity is very important not only for society but also for the environment. A competitive electricity market is one of the best ways to provide a suitable platform for the effective and efficient use of electricity. On the other hand, it carries risks that must be carefully managed by the market players, and risk management is an essential part of their decision making. In this paper, risk management through diversification is applied with the help of Markowitz’s mean-variance, down-side, and semi-variance methods in a case study. The performance of the optimal electricity sale solutions is measured and evaluated via the Sharpe ratio, and the optimal portfolio solutions are improved. Two years of historical weekday price data from the Turkish day-ahead market are used to demonstrate the approach.
Keywords: electricity market, portfolio optimization, risk management in electricity market, sharpe ratio
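As an illustration of the evaluation step, the Sharpe ratio of a sale strategy can be computed from its return series. The sketch below uses made-up weekly returns; the two strategies and all figures are hypothetical, not the paper's data:

```python
import statistics

def sharpe_ratio(returns, risk_free_rate=0.0):
    """Ratio of mean excess return to its standard deviation (no annualisation)."""
    excess = [r - risk_free_rate for r in returns]
    return statistics.mean(excess) / statistics.stdev(excess)

# Two hypothetical weekly return series for electricity sale strategies
strategy_a = [0.02, 0.01, 0.03, -0.01, 0.02]
strategy_b = [0.05, -0.04, 0.06, -0.03, 0.05]

ratio_a = sharpe_ratio(strategy_a)
ratio_b = sharpe_ratio(strategy_b)
best = "A" if ratio_a > ratio_b else "B"   # higher ratio = better risk-adjusted
```

Strategy B has the higher mean return here, but its volatility penalises it, which is exactly what the Sharpe ratio is meant to capture.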
Procedia PDF Downloads 366
3531 Topology Optimization of Structures with Web-Openings
Authors: D. K. Lee, S. M. Shin, J. H. Lee
Abstract:
The topology optimization technique uses constant element densities as design parameters. The optimal distribution of material densities between voids (0) and solids (1) in the design domain determines the topology: regions with high element density values become occupied by solids, while regions with near-zero density values remain voids. The void regions of topology optimization results therefore provide design information for deciding appropriate positions of web-openings in a structure. In contrast to the basic objective of the topology optimization technique, which is to obtain the optimal topology of a structure, the present study proposes the idea that topology optimization results can also be used to decide proper web-opening positions. Numerical examples of linear elastostatic structures demonstrate the efficiency of the proposed design process using topology optimization to determine the proper positions of web-openings.
Keywords: topology optimization, web-opening, structure, element density, material
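The idea of reading web-opening positions off the optimised density field can be sketched in a few lines. The density values and the 0.3 cut-off below are illustrative assumptions, not taken from the paper:

```python
def candidate_openings(densities, threshold=0.3):
    """Indices of elements whose optimised density is below the threshold;
    these void regions suggest candidate web-opening positions."""
    return [i for i, rho in enumerate(densities) if rho < threshold]

# Hypothetical optimised element densities along a beam web (0 = void, 1 = solid)
densities = [0.95, 0.90, 0.05, 0.10, 0.88, 0.92, 0.08, 0.85]
openings = candidate_openings(densities)   # elements 2, 3 and 6 are near-void
```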
Procedia PDF Downloads 475
3530 Numerical Study of Elastic Performances of Sandwich Beam with Carbon-Fibre Reinforced Skins
Authors: Soukaina Ounss, Hamid Mounir, Abdellatif El Marjani
Abstract:
Sandwich materials with composite reinforced skins are widely required in advanced construction applications to ensure resistant structures. Their light weight, high flexural stiffness, and good thermal insulation make them a suitable solution for efficient structures with high rigidity and good energy safety. In this paper, the mechanical behavior of a sandwich beam with composite skins reinforced by unidirectional carbon fibers is investigated numerically by analyzing the impact of the reinforcement specifications on the longitudinal elastic modulus, in order to select the sandwich configuration that combines good rigidity with accurate convergence to the analytical approach proposed to verify the numerical simulations. The study starts by testing the flexural performance of skins with various fiber orientations and volume fractions to determine those to use in the sandwich beam. A reinforcement inclination of 30° with a volume ratio of 60% is compared with a 60° fiber orientation at a 40% volume fraction; the latter guarantees the chosen skins an important rigidity with an optimal fiber concentration and greatly improves convergence to the analytical results for the sandwich model, owing to the crucial role of the core as a transverse shear absorber. A resistant sandwich beam is then elaborated from face-sheets, each made of two layers of the selected skins with fibers oriented at 60°, and an epoxy core; this beam has a longitudinal elastic modulus of 54 GPa (gigapascals), which matches the analytical value to within a negligible error of 2%.
Keywords: fibers orientation, fibers volume ratio, longitudinal elastic modulus, sandwich beam
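The longitudinal modulus of a unidirectional lamina loaded along its fibres can be estimated with the rule of mixtures. The sketch below uses typical handbook values for carbon fibre and epoxy, which are illustrative assumptions; off-axis orientations such as 30° or 60° would additionally require a laminate transformation, which is not shown:

```python
def longitudinal_modulus(e_fibre, e_matrix, v_fibre):
    """Rule of mixtures for a unidirectional lamina loaded along its fibres."""
    return v_fibre * e_fibre + (1.0 - v_fibre) * e_matrix

# Illustrative handbook values in GPa: carbon fibre ~230, epoxy matrix ~3.5
e1_60 = longitudinal_modulus(230.0, 3.5, 0.60)   # 60% fibre volume fraction
e1_40 = longitudinal_modulus(230.0, 3.5, 0.40)   # 40% fibre volume fraction
```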
Procedia PDF Downloads 173
3529 Non-Interactive XOR Quantum Oblivious Transfer: Optimal Protocols and Their Experimental Implementations
Authors: Lara Stroh, Nikola Horová, Robert Stárek, Ittoop V. Puthoor, Michal Mičuda, Miloslav Dušek, Erika Andersson
Abstract:
Oblivious transfer (OT) is an important cryptographic primitive. Any multi-party computation can be realised with OT as a building block. XOR oblivious transfer (XOT) is a variant where the sender Alice has two bits, and a receiver, Bob, obtains either the first bit, the second bit, or their XOR. Bob should not learn anything more than this, and Alice should not learn what Bob has learned. Perfect quantum OT with information-theoretic security is known to be impossible. We determine the smallest possible cheating probabilities for unrestricted dishonest parties in non-interactive quantum XOT protocols using symmetric pure states and present an optimal protocol which outperforms classical protocols. We also "reverse" this protocol so that Bob becomes the sender of a quantum state and Alice the receiver who measures it while still implementing oblivious transfer from Alice to Bob. Cheating probabilities for both parties stay the same as for the unreversed protocol. We optically implemented both the unreversed and the reversed protocols and cheating strategies, noting that the reversed protocol is easier to implement.
Keywords: oblivious transfer, quantum protocol, cryptography, XOR
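The XOT functionality itself is easy to state classically. A minimal sketch of the ideal (trusted-party) behaviour that the quantum protocol approximates:

```python
def ideal_xot(a0, a1, choice):
    """Ideal XOT functionality (trusted-party view): Bob learns exactly one of
    a0, a1, or a0 XOR a1; Alice learns nothing about Bob's choice."""
    if choice == 0:
        return a0
    if choice == 1:
        return a1
    if choice == "xor":
        return a0 ^ a1
    raise ValueError("choice must be 0, 1 or 'xor'")

received = ideal_xot(1, 0, "xor")   # Bob asked for the XOR of Alice's bits
```

The cryptographic difficulty is realising this functionality without a trusted party while bounding how much either side can cheat.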
Procedia PDF Downloads 128
3528 Parametric Appraisal of Robotic Arc Welding of Mild Steel Material by Principal Component Analysis-Fuzzy with Taguchi Technique
Authors: Amruta Rout, Golak Bihari Mahanta, Gunji Bala Murali, Bibhuti Bhusan Biswal, B. B. V. L. Deepak
Abstract:
The use of industrial robots for welding operations is one of the chief signs of contemporary welding. Modeling of weld joint parameters and weld process parameters is one of the most crucial aspects of robotic welding. Because the weld process parameters affect the weld joint parameters in different ways, a multi-objective optimization technique has to be utilized to obtain an optimal setting of the weld process parameters. In this paper, a hybrid optimization technique, Principal Component Analysis (PCA) combined with fuzzy logic, is proposed to obtain an optimal setting of weld process parameters such as wire feed rate, welding current, gas flow rate, welding speed, and nozzle-tip-to-plate distance. The weld joint parameters considered for optimization are the depth of penetration, yield strength, and ultimate strength. PCA is a very efficient multi-objective technique for converting correlated and dependent parameters, such as the weld joint parameters, into uncorrelated and independent variables. In this approach, there is also no need to check the correlation among the responses, as no individual weights are assigned to them. A fuzzy inference engine can efficiently incorporate these aspects into its internal hierarchy, thereby overcoming various limitations of existing optimization approaches. Finally, the Taguchi method is used to obtain the optimal setting of the weld process parameters. It is concluded that the hybrid technique has its own advantages and can be used for quality improvement in industrial applications.
Keywords: robotic arc welding, weld process parameters, weld joint parameters, principal component analysis, fuzzy logic, Taguchi method
Procedia PDF Downloads 180
3527 Optimization of E-motor Control Parameters for Electrically Propelled Vehicles by Integral Squared Method
Authors: Ibrahim Cicek, Melike Nikbay
Abstract:
Electrically propelled vehicles, whether road or aerial, are currently studied for their robust maneuvers and cost-efficient transport operations. The main power-generating systems of such vehicles are electrified by selecting proper components and assembling them as an e-powertrain. Generally, the e-powertrain components are selected considering the target performance requirements. Since the main propulsion component is the drive unit, the e-motor control system is responsible for achieving the performance targets. In this paper, the optimization of e-motor control parameters is studied by the Integral Squared Error (ISE) method. The overall aim is to minimize the power consumption of such vehicles, depending on the mission profile, while maintaining smooth maneuvers for passenger comfort. The sought-after values of the control parameters are computed using optimal control theory. The system is modeled as a closed-loop linear control system with calibratable parameters.
Keywords: optimization, e-powertrain, optimal control, electric vehicles
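The ISE criterion can be illustrated on a toy closed loop. The first-order plant, the gains, and the Euler integration below are illustrative assumptions, not the authors' vehicle model:

```python
def ise_for_gain(k, setpoint=1.0, dt=0.01, t_end=5.0, tau=1.0):
    """Integral of squared error for a toy first-order closed loop
    dy/dt = (k*e - y)/tau with e = setpoint - y, integrated by forward Euler."""
    y, ise, t = 0.0, 0.0, 0.0
    while t < t_end:
        e = setpoint - y
        ise += e * e * dt               # accumulate the ISE cost
        y += dt * (k * e - y) / tau     # advance the closed-loop dynamics
        t += dt
    return ise

ise_low = ise_for_gain(1.0)
ise_high = ise_for_gain(10.0)   # the larger gain tracks faster, so lower ISE here
```

A parameter optimizer would search over `k` (and any other calibratable parameters) to minimize this cost.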
Procedia PDF Downloads 133
3526 Modeling and Optimal Control of Acetylene Catalytic Hydrogenation Reactor in Olefin Plant by Artificial Neural Network
Authors: Faezeh Aghazadeh, Mohammad Javad Sharifi
Abstract:
The application of neural networks to model a full-scale industrial acetylene hydrogenation unit in an olefin plant has been studied. The operating variables studied are the input temperature of the reactor, the output temperature of the reactor, the hydrogen ratio of the reactor, [C₂H₂]input, and [C₂H₆]input. These operating variables were used as the input to the constructed neural network to predict [C₂H₆]output at any time as the output, or target. The constructed neural network was found to be highly precise in predicting [C₂H₆]output for new input data that were not seen during training, showing its applicability to determine [C₂H₆]output for any operating conditions. The enhancement of [C₂H₆]output compared with [C₂H₆]input was a consequence of the low selectivity of the acetylene hydrogenation to ethylene.
Keywords: acetylene hydrogenation, Pd-Ag/Al₂O₃, artificial neural network, modeling, optimal design
Procedia PDF Downloads 278
3525 A Risk-Based Comprehensive Framework for the Assessment of the Security of Multi-Modal Transport Systems
Authors: Mireille Elhajj, Washington Ochieng, Deeph Chana
Abstract:
The challenges of the rapid growth in the demand for transport have traditionally been seen within the context of the problems of congestion, air quality, climate change, safety, and affordability. However, there are increasing threats, including crime-related ones such as cyber-attacks, that threaten the security of the transport of people and goods. To the best of the authors’ knowledge, this paper presents, for the first time, a comprehensive framework for the assessment of the current and future security issues of multi-modal transport systems. The proposed approach is based on a structured framework starting with a detailed specification of the transport asset map (transport system architecture), followed by the identification of vulnerabilities. The asset map and vulnerabilities are used to identify the various approaches for exploitation of the vulnerabilities, leading to the creation of a set of threat scenarios. The threat scenarios are then transformed into risks and their categories, with insights for their mitigation. The consideration of the mitigation space is holistic and includes the formulation of appropriate policies and tactics and/or technical interventions. The quality of the framework is ensured through a structured and logical process that identifies the stakeholders, reviews the relevant documents (including policies) and identifies gaps, incorporates targeted surveys to augment the reviews, and uses subject matter experts for validation. The approach to categorising security risks is an extension of the methods typically employed: partitioning risks into either physical or cyber categories is too limited for developing mitigation policies and tactics or interventions for transport systems, where an interplay between physical and cyber processes is very often the norm.
This interplay is rapidly taking on increasing significance for security, as emerging cyber-physical technologies are shaping the future of all transport modes. Examples include: Connected Autonomous Vehicles (CAVs) in road transport; the European Rail Traffic Management System (ERTMS) in rail transport; the Automatic Identification System (AIS) in maritime transport; advanced Communications, Navigation and Surveillance (CNS) technologies in air transport; and the Internet of Things (IoT). The framework adopts a risk categorisation scheme that considers risks as falling within the following threat→impact relationships: Physical→Physical, Cyber→Cyber, Cyber→Physical, and Physical→Cyber. The framework thus enables a more complete risk picture to be developed for today’s transport systems and, more importantly, is readily extendable to account for emerging trends in the sector that will define future transport systems. The framework facilitates the audit and retro-fitting of mitigations in current transport operations and the analysis of security management options for the next generation of transport, enabling strategic aspirations such as security-by-design and the co-design of safety and security to be achieved. An initial application of the framework to transport systems has shown that intra-modal consideration of security measures is sub-optimal, and that a holistic, multi-modal approach that also addresses the intersections and transition points of such networks is required, as their vulnerability is high. This is in line with traveller-centric transport service provision, widely accepted as the future of mobility services. In summary, a risk-based framework is proposed for use by stakeholders to comprehensively and holistically assess the security of transport systems.
It requires a detailed understanding of the transport architecture so that a detailed vulnerability analysis can be undertaken; it creates threat scenarios and transforms them into risks, which form the basis for the formulation of interventions.
Keywords: mitigations, risk, transport, security, vulnerabilities
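The four threat→impact categories can be captured directly as a lookup. The AIS spoofing scenario used here is an illustrative assumption:

```python
# The four threat→impact categories used by the framework
CATEGORIES = {
    ("physical", "physical"): "Physical→Physical",
    ("cyber", "cyber"): "Cyber→Cyber",
    ("cyber", "physical"): "Cyber→Physical",
    ("physical", "cyber"): "Physical→Cyber",
}

def categorise(threat, impact):
    """Map an identified threat scenario to its risk category."""
    return CATEGORIES[(threat, impact)]

# Illustrative scenario: spoofing a ship's AIS signal is a cyber threat
# that can cause a physical impact (a collision or grounding)
category = categorise("cyber", "physical")
```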
Procedia PDF Downloads 166
3524 Everolimus Loaded Polyvinyl Alcohol Microspheres for Sustained Drug Delivery in the Treatment of Subependymal Giant Cell Astrocytoma
Authors: Lynn Louis, Bor Shin Chee, Marion McAfee, Michael Nugent
Abstract:
This article aims to develop a sustained-release formulation of microspheres containing the mTOR inhibitor Everolimus (EVR), using polyvinyl alcohol (PVA) to enhance the bioavailability of the drug and to overcome its poor solubility. This paper builds on recent work on the manufacture of microspheres using the sessile droplet technique, in which the polymer-drug solution is frozen by suspending the droplets in pre-cooled ethanol vials immersed in liquid nitrogen. The spheres were subjected to 6 freezing cycles, and to 3 freezing cycles with thawing, to obtain proper geometry, prevent aggregation, and achieve physical cross-linking. The prepared microspheres were characterised for surface morphology by SEM, where a 3-D porous structure was observed. The in vitro release studies showed a 62.17% release over 12.5 days, indicating sustained release due to good encapsulation. This is considerably more than the 49.06% release achieved within 4 hours from a solvent-cast Everolimus film, made in this work as a control with no freeze-thaw cycles. A prolonged release of Everolimus using a polymer-based drug delivery system is essential to reach optimal therapeutic concentrations in treating SEGA tumours without systemic exposure. These results suggest that the combination of PVA and Everolimus, via a rheological synergism, enhanced the bioavailability of the hydrophobic drug Everolimus. Physico-chemical characterisation using DSC and FTIR analysis showed compatibility of the drug with the polymer, and the stability of the drug was maintained owing to the high molecular weight of the PVA. The obtained results indicate that the developed PVA/EVR microspheres are highly suitable as a potential drug delivery system with improved bioavailability for treating subependymal giant cell astrocytoma (SEGA).
Keywords: drug delivery system, everolimus, freeze-thaw cycles, polyvinyl alcohol
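As a rough illustration (not the authors' model), a first-order release law F(t) = 1 − exp(−kt) can be fitted to the two reported release figures to compare rate constants:

```python
import math

def rate_constant(fraction_released, t_days):
    """Solve F(t) = 1 - exp(-k*t) for the first-order rate constant k (per day)."""
    return -math.log(1.0 - fraction_released) / t_days

k_microspheres = rate_constant(0.6217, 12.5)     # 62.17% released over 12.5 days
k_film = rate_constant(0.4906, 4.0 / 24.0)       # 49.06% released over 4 hours
```

Under this simple model, the film's rate constant comes out well over an order of magnitude larger than the microspheres', which is consistent with the sustained-release claim.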
Procedia PDF Downloads 131
3523 The Economics of Justice as Fairness
Authors: Antonio Abatemarco, Francesca Stroffolini
Abstract:
In the economic literature, Rawls’ Theory of Justice is usually interpreted in a two-stage setting, where priority for the worst-off individual is imposed as a distributive value judgment. In this paper, instead, we model Rawls’ Theory in a three-stage setting; that is, a separating line is drawn between the original position, the educational stage, and working life. We thereby challenge the common interpretation of Rawls’ Theory of Justice as Fairness by showing that the Theory goes well beyond the definition of a distributive value judgment, in such a way as to embrace efficiency issues as well. In our model, inequalities are shown to be permitted as far as they stimulate greater effort in education in the population, and thus economic growth. To our knowledge, this is the only way for inequality to be ‘bought’ by both the most-advantaged and, above all, the least-advantaged individual, as suggested by the Difference Principle. Finally, by recalling the old tradition of ‘universal ex-post efficiency’, we show that a unique optimal social contract does not exist behind the veil of ignorance; more precisely, only the set of potentially Rawls-optimal social contracts can be identified a priori, and partial justice orderings derived accordingly.
Keywords: justice, Rawls, inequality, social contract
Procedia PDF Downloads 225
3522 Special Features of Phacoemulsification Technique for Dense Cataracts
Authors: Shilkin A.G., Goncharov D.V., Rotanov D.A., Voitecha M.A., Kulyagina Y.I., Mochalova U.E.
Abstract:
Context: Phacoemulsification is a surgical technique used to remove cataracts, but it entails a higher number of complications when dense cataracts are present. The risk factors include a thin posterior capsule, dense nucleus fragments, and prolonged exposure to high-power ultrasound. To minimize these complications, various methods are used. Research aim: The aim of this study is to develop and implement optimal methods of ultrasound phacoemulsification for dense cataracts in order to minimize postoperative complications. Methodology: The study involved 36 eyes of dogs with dense cataracts over a period of 5 years. The surgeries were performed using a LEICA 844 surgical microscope and an Oertli Faros phacoemulsifier. The surgical techniques included the optimal technique for breaking the nucleus, bimanual surgery, and the use of Akahoshi prechoppers. Findings: The complications observed during surgery included rupture of the posterior capsule and the need for anterior vitrectomy. Complications in the postoperative period included corneal edema and uveitis. Theoretical importance: This study contributes to the field by providing insights into the special features of phacoemulsification for dense cataracts. It highlights the importance of using specific techniques and settings to minimize complications. Data collection and analysis procedures: The data were collected from surgeries performed on dogs with dense cataracts; the complications were documented and analyzed. Question addressed: The study addressed the question of how to minimize complications during phacoemulsification surgery for dense cataracts. Conclusion: By following the optimal techniques and settings and using prechoppers, surgery for dense cataracts can be made safer and faster, minimizing the risks and complications.
Keywords: dense cataracts, phacoemulsification, phacoemulsification of cataracts in elderly dogs, complications of phacoemulsification
Procedia PDF Downloads 63
3521 A Stochastic Vehicle Routing Problem with Ordered Customers and Collection of Two Similar Products
Authors: Epaminondas G. Kyriakidis, Theodosis D. Dimitrakos, Constantinos C. Karamatsoukis
Abstract:
The vehicle routing problem (VRP) is a well-known problem in Operations Research and has been widely studied during the last fifty-five years. The context of the VRP is that of delivering or collecting products to or from customers who are scattered in a geographical area and have placed orders for these products. A vehicle or a fleet of vehicles start their routes from a depot and visit the customers in order to satisfy their demands. Special attention has been given to the capacitated VRP, in which the vehicles have limited carrying capacity for the goods that are delivered or collected. In the present work, we develop and analyze a mathematical model for a specific capacitated stochastic vehicle routing problem which has many realistic applications: a vehicle starts its route from a depot and visits N customers according to a particular sequence in order to collect from them two similar but not identical products, named product 1 and product 2. Each customer possesses items of either product 1 or product 2 with known probabilities. The number of items of product 1 or product 2 that each customer possesses is a discrete random variable with known distribution. The actual quantity and the actual type of product that each customer possesses are revealed only when the vehicle arrives at the customer’s site. The vehicle has two compartments, compartment 1 and compartment 2, suitable for loading product 1 and product 2, respectively. However, it is permitted to load items of product 1 into compartment 2 and items of product 2 into compartment 1; these actions incur costs due to extra labor. The vehicle is allowed during its route to return to the depot to unload the items of both products.
The travel costs between consecutive customers and the travel costs between the customers and the depot are known. The objective is to find the optimal routing strategy, i.e., the routing strategy that minimizes the total expected cost of servicing all customers, among all possible strategies. A suitable dynamic programming algorithm can be developed for the determination of the optimal routing strategy. It is also possible to prove that the optimal routing strategy has a specific threshold-type structure: for each customer, the optimal actions are characterized by some critical integers. This structural result enables us to design a special-purpose dynamic programming algorithm that operates only over strategies having this structural property. Extensive numerical results provide strong evidence that the special-purpose dynamic programming algorithm is considerably more efficient than the initial dynamic programming algorithm. Furthermore, if we consider the same problem without the assumption that the customers are ordered, numerical experiments indicate that the optimal routing strategy can be computed if N is smaller than or equal to eight.
Keywords: dynamic programming, similar products, stochastic demands, stochastic preferences, vehicle routing problem
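A dynamic program of the kind described can be sketched for a simplified single-product variant with ordered customers: after each customer the vehicle either drives straight on or returns to the depot preventively to unload. The distances, demand distribution, and capacity below are made up for illustration:

```python
from functools import lru_cache

Q = 3                              # vehicle capacity
depot = [None, 2.0, 3.0, 2.0]      # distance between customer j and the depot
step = [None, 1.0, 1.0]            # distance from customer j to customer j + 1
demand = {1: 0.5, 2: 0.5}          # each customer holds 1 or 2 items, equally likely
N = 3

@lru_cache(maxsize=None)
def f(j, q):
    """Minimum expected remaining cost after serving customer j with q free capacity."""
    if j == N:
        return depot[N]                      # route ends: drive back to the depot
    direct = step[j]                         # go straight to the next customer
    preventive = depot[j] + depot[j + 1]     # detour via the depot to unload first
    for k, p in demand.items():
        if k <= q:
            direct += p * f(j + 1, q - k)
        else:                                # capacity exceeded: forced round trip
            direct += p * (2 * depot[j + 1] + f(j + 1, q + Q - k))
        preventive += p * f(j + 1, Q - k)
    return min(direct, preventive)

# Expected cost of the optimal strategy, starting from the depot with Q free
start = depot[1] + sum(p * f(1, Q - k) for k, p in demand.items())
```

In this toy instance the recursion already exhibits the threshold behaviour the abstract describes: the preventive return becomes attractive only when the free capacity drops below a critical level.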
Procedia PDF Downloads 257
3520 Optimization of Passive Vibration Damping of Space Structures
Authors: Emad Askar, Eldesoky Elsoaly, Mohamed Kamel, Hisham Kamel
Abstract:
The objective of this article is to improve the passive vibration damping of a solar array (SA) used in space structures through the effective application of numerical optimization. A case study of an SA is used for demonstration. A finite element (FE) model was created and verified by experimental testing. Optimization was then conducted by coupling the FE model with a genetic algorithm to find the optimal placement of aluminum circular patches that suppress the first two bending mode shapes. The results were verified by experimental testing. Finally, a parametric study was conducted with the FE model in which the patch locations, material type, and shape were varied one at a time, and the results were compared with the optimal ones. The results clearly show that, through the proper application of FE modeling and numerical optimization, passive vibration damping of space structures can be successfully achieved.
Keywords: damping optimization, genetic algorithm optimization, passive vibration damping, solar array vibration damping
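A genetic search for patch placement can be sketched on a toy objective. The 10-element beam, the sinusoidal mode-shape proxy, and all GA settings below are illustrative assumptions, not the paper's FE model:

```python
import math
import random

random.seed(0)

# Toy objective: reward patch positions where the first two bending mode
# shapes of a 10-element beam have large amplitude (sinusoidal proxy).
def damping_score(positions):
    return sum(abs(math.sin(math.pi * (p + 0.5) / 10))
               + abs(math.sin(2 * math.pi * (p + 0.5) / 10))
               for p in positions)

def genetic_search(generations=40, pop_size=20):
    pop = [random.sample(range(10), 2) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=damping_score, reverse=True)
        survivors = pop[: pop_size // 2]          # elitist selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)    # crossover of two parents
            child = [a[0], b[1]]
            if random.random() < 0.2:             # occasional mutation
                child[random.randrange(2)] = random.randrange(10)
            if child[0] == child[1]:              # keep the two patches apart
                child[1] = (child[1] + 1) % 10
            children.append(child)
        pop = survivors + children
    return max(pop, key=damping_score)

best_patches = genetic_search()
```

In the real study the fitness evaluation would be an FE modal analysis rather than this closed-form proxy, which is what makes the GA's derivative-free search attractive.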
Procedia PDF Downloads 451
3519 Credit Card Fraud Detection with Ensemble Model: A Meta-Heuristic Approach
Authors: Gong Zhilin, Jing Yang, Jian Yin
Abstract:
The purpose of this paper is to develop a novel system for credit card fraud detection based on sequential modeling of data using hybrid deep learning models. The projected model encapsulates five major phases: pre-processing, imbalanced-data handling, feature extraction, optimal feature selection, and fraud detection with an ensemble classifier. The collected raw data (input) are pre-processed to enhance their quality by dealing with missing data, noisy data, and null values. The pre-processed data are class-imbalanced in nature and are therefore handled with a K-means clustering-based SMOTE model. From the class-balanced data, the most relevant features are extracted: improved Principal Component Analysis (PCA) features, statistical features (mean, median, standard deviation), and higher-order statistical features (skewness and kurtosis). Among the extracted features, the most optimal features are selected with the Self-improved Arithmetic Optimization Algorithm (SI-AOA), a conceptual improvement of the standard Arithmetic Optimization Algorithm. The deep learning models employed are Long Short-Term Memory (LSTM), Convolutional Neural Network (CNN), and an optimized Quantum Deep Neural Network (QDNN). The LSTM and CNN are trained with the extracted optimal features, and their outcomes enter as input to the optimized QDNN, which provides the final detection outcome. Since the QDNN is the ultimate detector, its weight function is fine-tuned with the SI-AOA.
Keywords: credit card, data mining, fraud detection, money transactions
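The imbalance-handling step can be illustrated with a simplified SMOTE-style interpolation, omitting the k-nearest-neighbour selection and the K-means clustering of the actual model:

```python
import random

random.seed(42)

def smote_like(minority, n_new):
    """Generate synthetic minority-class points by interpolating between two
    randomly chosen minority samples (simplified SMOTE, no k-NN step)."""
    synthetic = []
    for _ in range(n_new):
        a, b = random.sample(minority, 2)
        gap = random.random()                      # position along the segment
        synthetic.append(tuple(ai + gap * (bi - ai) for ai, bi in zip(a, b)))
    return synthetic

# Toy 2-D fraud features (e.g. scaled amount, scaled hour of day)
fraud = [(0.9, 0.1), (0.8, 0.2), (0.95, 0.15)]
new_points = smote_like(fraud, 5)
```

Each synthetic point lies on a segment between two real fraud samples, so the oversampled class stays inside the region the genuine minority data occupies.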
Procedia PDF Downloads 131
3518 Optimization of High Flux Density Design for Permanent Magnet Motor
Authors: Dong-Woo Kang
Abstract:
This paper presents an optimal magnet shape for a spoke-type interior permanent magnet synchronous motor using ferrite magnets. Generally, a permanent magnet motor using ferrite magnets has lower output power and efficiency than a rare-earth magnet motor, because the ferrite magnet has lower magnetic energy than the rare-earth magnet. Nevertheless, ferrite magnet motors are used in many industrial products owing to their cost effectiveness. In this paper, the authors propose a high-power-density design of a ferrite permanent magnet synchronous motor. Furthermore, because the motor design has to take the manufacturing process into account, the design is simulated using the finite element method to analyze demagnetization, magnetizing, and structural stiffness. In particular, the magnet shape and dimensions are decided to satisfy these properties. Finally, the authors design an optimal motor for the target system; the final design is manufactured and evaluated experimentally.
Keywords: demagnetization, design optimization, magnetic analysis, permanent magnet motors
Procedia PDF Downloads 377
3517 Mathematical Model and Algorithm for the Berth and Yard Resource Allocation at Seaports
Authors: Ming Liu, Zhihui Sun, Xiaoning Zhang
Abstract:
This paper studies a deterministic container transportation problem, jointly optimizing the berth allocation, quay crane assignment, and yard storage allocation at container ports. The problem is formulated as an integer program to coordinate the decisions. Because of its large scale, it is then transformed into a set-partitioning formulation, and a branch-and-price algorithm framework is provided to solve it.
Keywords: branch-and-price, container terminal, joint scheduling, maritime logistics
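The set-partitioning formulation can be illustrated on a toy instance solved by brute force in place of branch-and-price. The columns and their costs below are made up:

```python
from itertools import combinations

# Candidate columns: (vessels served, cost). Each column is one feasible
# berth/crane/yard plan; the data here are invented for three vessels.
columns = [
    ({"A"}, 4), ({"B"}, 5), ({"C"}, 3),
    ({"A", "B"}, 7), ({"B", "C"}, 6), ({"A", "C"}, 8),
]
vessels = {"A", "B", "C"}

def best_partition(columns, vessels):
    """Brute-force the set-partitioning model: cover every vessel exactly once
    at minimum total cost (branch-and-price does this implicitly at scale)."""
    best_cost, best_sel = None, None
    for r in range(1, len(columns) + 1):
        for sel in combinations(columns, r):
            served = [v for s, _ in sel for v in s]
            if sorted(served) == sorted(vessels):   # exact cover, no overlap
                cost = sum(c for _, c in sel)
                if best_cost is None or cost < best_cost:
                    best_cost, best_sel = cost, sel
    return best_cost, best_sel

cost, chosen = best_partition(columns, vessels)
```

Branch-and-price avoids this exponential enumeration by pricing out promising columns on demand, but the optimality condition it enforces is the same exact-cover constraint shown here.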
Procedia PDF Downloads 293
3516 A Method of Effective Planning and Control of Industrial Facility Energy Consumption
Authors: Aleksandra Aleksandrovna Filimonova, Lev Sergeevich Kazarinov, Tatyana Aleksandrovna Barbasova
Abstract:
A method for the effective planning and control of industrial facility energy consumption is offered. The method allows the management and full control of complex production facilities to be arranged optimally, in accordance with the criterion of minimal technical and economic losses under forecasting control. The method is based on the optimal construction of power-efficiency characteristics with a prescribed accuracy. The problem of optimally designing the forecasting model is solved on the basis of three criteria: maximizing the weighted sum of the forecasting points matched with the prescribed accuracy; solving the problem by standard principles under incomplete statistical data, on the basis of minimizing a regularized function; and minimizing the technical and economic losses due to forecasting errors.
Keywords: energy consumption, energy efficiency, energy management system, forecasting model, power efficiency characteristics
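The first criterion, weighting the forecasting points that must be matched accurately, can be illustrated with a weighted least-squares fit of a power-efficiency characteristic. The data and weights below are hypothetical:

```python
def weighted_linear_fit(x, y, w):
    """Weighted least-squares fit y ≈ a + b*x: larger weights emphasise the
    forecasting points that must be matched with the prescribed accuracy."""
    sw = sum(w)
    mx = sum(wi * xi for wi, xi in zip(w, x)) / sw
    my = sum(wi * yi for wi, yi in zip(w, y)) / sw
    b = (sum(wi * (xi - mx) * (yi - my) for wi, xi, yi in zip(w, x, y))
         / sum(wi * (xi - mx) ** 2 for wi, xi in zip(w, x)))
    a = my - b * mx
    return a, b

# Hypothetical energy-vs-output data, with more weight on the operating
# points closest to the planned production level
output = [10, 20, 30, 40]      # production level
energy = [105, 198, 305, 402]  # measured energy consumption
weights = [1, 1, 4, 4]
a, b = weighted_linear_fit(output, energy, weights)
```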
Procedia PDF Downloads 393
3515 A Serious Game to Upgrade the Learning of Organizational Skills in Nursing Schools
Authors: Benoit Landi, Hervé Pingaud, Jean-Benoit Culie, Michel Galaup
Abstract:
Serious games have been widely disseminated in the field of digital learning. They have proved their utility in improving skills through virtual environments that simulate the field in which new competencies have to be developed and assessed. This paper describes how we created CLONE, a serious game whose purpose is to help nurses create an efficient work plan in a hospital care unit. In CLONE, the number of patients to take care of is similar to the reality of the job, going far beyond what is currently practiced in nursing school classrooms. This similarity with the operational field proportionally increases the number of activities to be scheduled. Moreover, the team is very often composed of registered nurses and nurse assistants who must share the work in accordance with regulatory obligations. Therefore, on the one hand, building a short-term plan is a complex task with a large amount of data to deal with, and on the other hand, good clinical practices have to be applied systematically. We present how reference plans have been defined by formulating an optimization problem using the expertise of teachers. This formulation ensures the feasibility of the gameplay for the scenario that has been produced and enhanced throughout the game design process. It was also crucial to steer a player toward a specific gaming strategy. As one of our most important learning outcomes is a clear understanding of the workload concept, its factual calculation for each caregiver over time and its inclusion in the nurse's reasoning during plan elaboration are focal points. We demonstrate how to modify the game scenario to create a digital environment in which these somewhat abstract principles can be understood and applied. Finally, we report on a pilot involving a thousand undergraduate nursing students.
Keywords: care planning, workload, game design, hospital nurse, organizational skills, digital learning, serious game
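The workload calculation at the heart of the game can be sketched as follows; the plan entries, load points, and limit are invented for illustration:

```python
# Hypothetical care plan: (caregiver, start_hour, duration_h, load_points_per_h)
plan = [
    ("nurse_1", 8, 1, 3), ("nurse_1", 9, 2, 2),
    ("assistant_1", 8, 2, 2), ("assistant_1", 10, 1, 1),
]

def workload(plan, caregiver):
    """Total workload points assigned to one caregiver over the shift."""
    return sum(d * load for who, _, d, load in plan if who == caregiver)

def is_balanced(plan, caregivers, limit):
    """Good-practice check: no caregiver exceeds the workload limit."""
    return all(workload(plan, c) <= limit for c in caregivers)

nurse_load = workload(plan, "nurse_1")                       # 1*3 + 2*2 = 7
ok = is_balanced(plan, ["nurse_1", "assistant_1"], limit=8)
```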
Procedia PDF Downloads 191
3514 Formulation and Evaluation of Piroxicam Hydrotropic Starch Gel
Authors: Mohammed Ghazwani, Shyma Ali Alshahrani, Zahra Abdu Yousef, Taif Torki Asiri, Ghofran Abdur Rahman, Asma Ali Alshahrani, Umme Hani
Abstract:
Background and introduction: Piroxicam is a nonsteroidal anti-inflammatory drug characterized by low solubility and high permeability, used to reduce pain, swelling, and joint stiffness from arthritis. Hydrotropes are a class of compounds that increase the aqueous solubility of otherwise insoluble solutes. Aim: The objective of the present study was to formulate and optimize a Piroxicam hydrotropic starch gel for topical application, using sodium salicylate and sodium benzoate as hydrotropic salts and potato starch as the gelling agent. Materials and methods: The prepared Piroxicam hydrotropic starch gel was characterized for various physicochemical parameters, including drug content, pH, tube extrudability, and spreadability; all formulations were subjected to in vitro diffusion studies for six hours in 100 ml of phosphate buffer (pH 7.4), and gel strength was determined. Results: All formulations were white and opaque in appearance with good homogeneity. The pH of the formulations was between 6.9 and 7.9. Drug content ranged from 96.8% to 99.4%. Spreadability plays an important role in patient compliance and helps in the uniform application of gel to the skin, as gels should spread easily; F4 showed a spreadability of 2.4 cm, the highest among all formulations. In the in vitro diffusion studies, extrudability and gel strength were also best with F4 in comparison with the other formulations; hence F4 was selected as the optimized formulation. Conclusion: Isolated potato starch was successfully employed to prepare the gel. The hydrotropic salt sodium salicylate increased the solubility of Piroxicam and yielded a stable gel, whereas the gel prepared with sodium benzoate changed color from white to light yellowish one week after preparation. The hydrotropic potato starch gel is proposed as a suitable vehicle for the topical delivery of Piroxicam.
Keywords: Piroxicam, potato starch, hydrotropic salts, hydrotropic starch gel
Procedia PDF Downloads 145
3513 Revolutionizing Manufacturing: Embracing Additive Manufacturing with Eggshell Polylactide (PLA) Polymer
Authors: Choy Sonny Yip Hong
Abstract:
This abstract presents an exploration into the creation of a sustainable bio-polymer compound for additive manufacturing, specifically 3D printing, with a focus on eggshells and polylactide (PLA) polymer. The project began with exploratory research and experiments, testing a variety of food by-products to create bio-polymers; after careful evaluation, the combination of eggshells and PLA polymer produced the most promising results. The initial mixing of the two materials involved heating them just above the melting point. To make the compound 3D printable, the research focused on finding the optimal formulation and production process. The process started with precise measurement of the PLA and eggshell materials. The PLA was placed in a heating oven to remove any absorbed moisture; this drying involved gradual moisture evaporation and required several hours. Handmade test samples were created to guide the planning of 3D-printed versions, and scrap PLA was recycled and ground into a powder. The PLA and eggshell materials were then fed into the hopper of a filament-making machine, whose four heating elements controlled the temperature of the melted compound mixture, allowing filament of accurate and consistent thickness to be extruded and wound onto a wheel. During the testing phase, trials were conducted with different percentages of eggshell in the PLA mixture; however, poor extrusion results were observed for mixtures with a high eggshell percentage (20%).
Samples were created, and continuous improvement and optimization were pursued to achieve filaments with good performance. To test the 3D printability of the DIY filament, a 3D printer was configured to print it smoothly and consistently. Printed samples were mechanically tested with a universal testing machine to determine their mechanical properties, allowing the filament's performance and suitability for additive manufacturing applications to be evaluated. In conclusion, the project explores the creation of a sustainable bio-polymer compound from eggshells and PLA polymer for 3D printing. The findings contribute to the advancement of additive manufacturing, offering opportunities for design innovation, carbon footprint reduction, supply chain optimization, and collaboration. The use of an eggshell PLA polymer in additive manufacturing has the potential to transform the manufacturing industry, providing a sustainable alternative and enabling the production of intricate, customized products.
Keywords: additive manufacturing, 3D printing, eggshell PLA polymer, design innovation, carbon footprint reduction, supply chain optimization, collaborative potential
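The mixing step can be illustrated with a back-of-envelope helper (not from the paper): the component masses for a filament batch with a target eggshell weight fraction, such as the 20% mixture trialed in the study.

```python
# Back-of-envelope helper (not from the paper): component masses for a filament
# batch with a target eggshell weight fraction, e.g. the 20% mixture trialed.
def batch_masses(total_g, eggshell_frac):
    eggshell = total_g * eggshell_frac
    return {"eggshell_g": eggshell, "pla_g": total_g - eggshell}

print(batch_masses(500.0, 0.20))  # {'eggshell_g': 100.0, 'pla_g': 400.0}
```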
Procedia PDF Downloads 74
3512 Failure Analysis of Electrode, Nozzle Plate, and Powder Injector during Air Plasma Spray Coating
Authors: Nemes Alexandra
Abstract:
The aim of the research is to develop an optimal microstructure for steel coatings on aluminum surfaces, for application on crankcase cylinder bores. For the proper design of the coating microstructure, it is important to control the plasma gun unit properly, so the maximum operating time during which the plasma gun can work optimally before its degradation was determined. Objectives: The aim is to determine the optimal operating time of the plasma gun between renovations, where a renovation involves replacing the examined components of the plasma gun: the electrode, nozzle plate, and powder injector. Methodology: Plasma jet and particle flux analysis with PFI (a diagnostic tool for all kinds of thermal spraying processes); CT reconstruction and analysis of new and used plasma guns; failure analysis of electrodes, nozzle plates, and powder injectors; and microscopic examination of the coating microstructure. Contributions: As a result of the failure analysis detailed above, the use of the plasma gun was capped at 100 operating hours in order to obtain an optimal coating microstructure.
Keywords: APS, air plasma spray, failure analysis, electrode, nozzle plate, powder injector
Procedia PDF Downloads 120
3511 Optimal Design of the Power Generation Network in California: Moving towards 100% Renewable Electricity by 2045
Authors: Wennan Long, Yuhao Nie, Yunan Li, Adam Brandt
Abstract:
To fight climate change, the California government passed Senate Bill No. 100 (SB-100) in September 2018, which sets a target of 100% renewable electricity by the end of 2045. In this case study, a capacity expansion problem is solved using a binary quadratic programming model. The optimal locations and capacities of potential renewable power plants (i.e., solar, wind, biomass, geothermal, and hydropower), the phase-out schedule of existing fossil-based (natural gas) power plants, and the transmission of electricity across the entire network are determined with the minimal total annualized cost, measured by net present value (NPV). The results show that the renewable electricity contribution could increase to 85.9% by 2030 and reach 100% by 2035. Fossil-based power plants would be totally phased out around 2035, and solar and wind would become the dominant renewable energy resources in the California electricity mix.
Keywords: 100% renewable electricity, California, capacity expansion, mixed integer non-linear programming
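The binary build-or-not structure of such a capacity expansion problem can be sketched with a brute-force toy. The candidate plants, capacities, and annualized costs below are hypothetical; the actual study solves a binary quadratic program over the full California network with transmission and phase-out decisions.

```python
# Toy capacity-expansion sketch: pick a minimum-cost subset of hypothetical
# candidate plants that meets a renewable capacity target. Brute force over
# binary build decisions; the real model is far larger and includes transmission.
from itertools import product

candidates = [  # (name, capacity_GW, annualized_cost) -- all values invented
    ("solar_1", 5.0, 3.0),
    ("wind_1",  4.0, 2.8),
    ("geo_1",   1.5, 1.5),
    ("bio_1",   1.0, 1.2),
]
target_GW = 8.0  # stand-in for the renewable supply requirement

best = None
for choice in product([0, 1], repeat=len(candidates)):  # every build/no-build combo
    cap = sum(c * p[1] for c, p in zip(choice, candidates))
    cost = sum(c * p[2] for c, p in zip(choice, candidates))
    if cap >= target_GW and (best is None or cost < best[0]):
        best = (cost, choice)

cost, choice = best
built = [p[0] for c, p in zip(choice, candidates) if c]
print(built, cost)
```

Real instances replace this enumeration with a mathematical programming solver, since the number of binary combinations grows exponentially.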
Procedia PDF Downloads 171
3510 Optimal Load Control Strategy in the Presence of Stochastically Dependent Renewable Energy Sources
Authors: Mahmoud M. Othman, Almoataz Y. Abdelaziz, Yasser G. Hegazy
Abstract:
This paper presents a load control strategy based on a modification of the Big Bang-Big Crunch optimization method. The proposed strategy aims to determine the optimal load to be controlled and the corresponding control time in order to minimize the energy purchased from the substation. The strategy helps the distribution network operator rely on renewable energy sources to supply the system demand. The renewable energy sources in this study are modeled using the diagonal band copula method and sequential Monte Carlo simulation in order to accurately capture the multivariate stochastic dependence between wind power, photovoltaic power, and the system demand. The proposed algorithms are implemented in the MATLAB environment and tested on the IEEE 37-node feeder. Several case studies are conducted, and the subsequent discussion shows the effectiveness of the proposed algorithm.
Keywords: big bang big crunch, distributed generation, load control, optimization, planning
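A minimal sketch of the Big Bang-Big Crunch idea the strategy builds on: scatter a random population (Big Bang), contract it to a fitness-weighted center of mass (Big Crunch), then re-scatter around that center with shrinking spread. A toy one-dimensional objective stands in for the real load-control cost, and all parameters are illustrative.

```python
# Minimal Big Bang-Big Crunch (BB-BC) sketch. A toy 1-D objective stands in
# for the load-control cost; population size, iterations, and the 1/f
# weighting are illustrative choices.
import random

def bbbc_minimize(f, lo, hi, pop=40, iters=60, seed=1):
    rng = random.Random(seed)
    xs = [rng.uniform(lo, hi) for _ in range(pop)]       # Big Bang: random spread
    for k in range(1, iters + 1):
        # Big Crunch: contract to a fitness-weighted center of mass
        # (weights 1/f for minimization; f must stay positive here)
        w = [1.0 / (f(x) + 1e-12) for x in xs]
        xc = sum(wi * xi for wi, xi in zip(w, xs)) / sum(w)
        # next Big Bang: re-spread around the center, shrinking with k
        xs = [min(hi, max(lo, xc + rng.gauss(0, 1) * (hi - lo) / k))
              for _ in range(pop)]
    return min(xs, key=f)

best = bbbc_minimize(lambda x: (x - 3.2) ** 2 + 1.0, 0.0, 10.0)
print(best)  # lands close to the true minimizer 3.2
```

The paper's modification and the discrete load/time decision variables are not reproduced here; this only shows the core contract-and-respread loop.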
Procedia PDF Downloads 347
3509 Identifying Dominant Anaerobic Microorganisms for Degradation of Benzene
Authors: Jian Peng, Wenhui Xiong, Zheng Lu
Abstract:
An optimal amendment recipe (nutrients and electron acceptors) was developed, and the dominant indigenous benzene-degrading microorganisms were characterized in this study. Several lessons were learned while developing the optimal amendment recipe: (1) salinity and a high initial benzene concentration were detrimental to benzene biodegradation; (2) a large dose of amendments can shorten the lag time before benzene biodegradation begins; (3) toluene was an essential co-substrate for promoting benzene degradation activity. The stable isotope probing study identified incorporation of 13C from 13C-benzene into microorganisms, which can be considered direct evidence of benzene biodegradation. Quantitative polymerase chain reaction analysis identified nitrate reduction as the dominant mechanism for benzene removal. Microbial analyses (denaturing gradient gel electrophoresis and 16S ribosomal RNA) demonstrated that members of the genera Dokdonella, Pusillimonas, and Advenella were predominant within the microbial community and involved in anaerobic benzene bioremediation.
Keywords: benzene, enhanced anaerobic bioremediation, stable isotope probing, biosep biotrap
Procedia PDF Downloads 342
3508 Elephant Herding Optimization for Service Selection in QoS-Aware Web Service Composition
Authors: Samia Sadouki Chibani, Abdelkamel Tari
Abstract:
Web service composition combines available services to provide new functionality. Given the number of available services with similar functionalities but different non-functional attributes (QoS), the problem of finding a QoS-optimal web service composition is an optimization problem belonging to the NP-hard class; an optimal solution therefore cannot be found by exact algorithms within a reasonable time. In this paper, a bio-inspired meta-heuristic is presented to address QoS-aware web service composition. It is based on the Elephant Herding Optimization (EHO) algorithm, which is inspired by the herding behavior of elephant groups. EHO is characterized by a process of dividing the population into sub-populations (clans) and recombining them; this process allows the exchange of information between local searches to move toward a global optimum, mitigating the early stagnation in local optima that other evolutionary algorithms often cannot avoid. The experimental evaluation shows that our proposal significantly outperforms PSO, with better fitness values and faster convergence.
Keywords: bio-inspired algorithms, elephant herding optimization, QoS optimization, web service composition
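The clan-based search can be sketched as a minimal EHO loop on a toy one-dimensional problem. The QoS composition encoding is omitted, and the parameter values and the explicit best-so-far bookkeeping are illustrative choices, not the authors' exact operators.

```python
# Minimal Elephant Herding Optimization (EHO) sketch on a toy 1-D problem.
# Clans search locally; a separating operator injects fresh solutions.
# Parameters and the best-so-far bookkeeping are illustrative.
import random

def eho_minimize(f, lo, hi, clans=4, per_clan=10, iters=80,
                 alpha=0.5, beta=0.1, seed=7):
    rng = random.Random(seed)
    herd = [[rng.uniform(lo, hi) for _ in range(per_clan)] for _ in range(clans)]
    best = min((x for clan in herd for x in clan), key=f)
    for _ in range(iters):
        for c, clan in enumerate(herd):
            clan.sort(key=f)                          # clan[0] is the matriarch
            matriarch, center = clan[0], sum(clan) / len(clan)
            best = min(best, matriarch, key=f)        # keep the best-so-far solution
            # clan updating: elephants move toward their matriarch
            new = [min(hi, max(lo, x + alpha * (matriarch - x) * rng.random()))
                   for x in clan]
            new[0] = min(hi, max(lo, beta * center))  # matriarch follows the clan center
            new[-1] = rng.uniform(lo, hi)             # separating: worst elephant leaves
            herd[c] = new
    return best

best = eho_minimize(lambda x: (x - 2.0) ** 2, 0.0, 5.0)
```

For service selection, each position would instead encode a choice of concrete service per abstract task, with f aggregating the composition's QoS.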
Procedia PDF Downloads 328
3507 Estimating the Properties of Polymer Concrete Using the Response Surface Method
Authors: Oguz Ugurkan Akkaya, Alpaslan Sipahi, Ozgur Firat Pamukcu, Murat Yasar, Tolga Guler, Arif Ulu, Ferit Cakir
Abstract:
With the growth of the human population and the expansion and renovation of cities, infrastructure systems need to be manufactured to be more durable and long-lasting. Cost-effective and durable manufacturing of components is a general problem across engineering disciplines, so it is important to determine the most suitable mixture. This study focuses on the optimal mixture design of polymer concrete. For this purpose, the lower and upper limits of the three main components of the polymer concrete are determined, and the effects of these three principal components on the compressive strength, tensile strength, and unit price of the polymer concrete are estimated using the response surface method. A Box-Behnken design is used to plan the experiments. Compressive strength, tensile strength, and unit price are successfully estimated with coefficients of determination (R²) of 0.82, 0.92, and 0.90, respectively, and the optimum mixture is determined.
Keywords: Box-Behnken Design, compressive strength, mechanical tests, polymer concrete, tensile strength
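The experimental plan behind this approach can be sketched by generating a Box-Behnken design in coded units. This is the generic construction (each pair of factors at their ±1 levels with the rest at mid-level, plus center points); the study's actual component ranges are not reproduced here.

```python
# Generic Box-Behnken design in coded units (-1, 0, +1), the kind of plan
# used for the three polymer-concrete components; actual ranges not reproduced.
from itertools import combinations, product

def box_behnken(n_factors, center_points=3):
    runs = []
    for i, j in combinations(range(n_factors), 2):    # each pair of factors...
        for a, b in product((-1, 1), repeat=2):       # ...at the four +-1 corners
            row = [0] * n_factors                     # remaining factors at mid-level
            row[i], row[j] = a, b
            runs.append(row)
    runs += [[0] * n_factors for _ in range(center_points)]
    return runs

design = box_behnken(3)
print(len(design))  # 15 runs: 12 edge midpoints + 3 center points
```

A quadratic response surface is then fitted to the measured responses at these runs, which is what yields the reported R² values and the optimum mixture.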
Procedia PDF Downloads 172
3506 Utilizing Bario Rice, a Natural Red-Pigmented Rice from Sarawak, Malaysia, in the Development of Gluten-Free Bread
Authors: Macdalyna Esther Ronie, Hasmadi Mamat, Ahmad Hazim Abdul Aziz, Mohamad Khairi Zainol
Abstract:
Current trends in gluten-free food development are increasingly leaning towards the utilization of pigmented rice flour, with a particular focus here on Bario Merah Sederhana (BMS), a red-pigmented rice native to Sarawak, Malaysia. This study evaluates the nutritional, textural, and sensory attributes of gluten-free rice bread produced from blends of BMS rice flour and potato starch, denoted F1 (100% BMS rice flour), F2 (90% rice flour, 10% potato starch), F3 (80% rice flour, 20% potato starch), and F4 (70% rice flour, 30% potato starch). Compared with conventional wheat bread, these gluten-free formulations exhibit higher levels of ash and crude fiber along with lower carbohydrate content. Notably, the crude protein content of the rice bread decreases significantly (p<0.05) as the proportion of rice flour decreases, and it remains below that of wheat bread owing to the higher protein content of wheat flour. The crumb of the rice bread appears darker owing to the red pigment in the rice flour, while the crust is lighter than that of the control sample, possibly attributable to a reduced Maillard reaction. Among the rice bread formulations, F4 stands out with the least dough and bread hardness, accompanied by the highest levels of stickiness and springiness in dough and bread, respectively. In sensory evaluations, wheat bread garnered the highest rating (p<0.05); however, among the rice breads, F4 emerged as a viable and acceptable formulation, as indicated by its commendable scores for color (7.03), flavor (5.73), texture (6.03), and overall acceptability (6.18). These findings underscore the potential of BMS in gluten-free breadmaking, with the formulation of 70% rice flour and 30% potato starch emerging as a well-received and suitable option.
Keywords: gluten-free bread, bario rice, proximate composition, sensory evaluation
Procedia PDF Downloads 243
3505 Ta-DAH: Task Driven Automated Hardware Design of Free-Flying Space Robots
Authors: Lucy Jackson, Celyn Walters, Steve Eckersley, Mini Rai, Simon Hadfield
Abstract:
Space robots will play an integral part in exploring the universe and beyond. A correctly designed space robot will facilitate on-orbit assembly (OOA), satellite servicing, and active debris removal (ADR). However, problems arise when trying to design such a system, as it is a highly complex, multidimensional problem on which there is little research, and current design techniques are slow and specific to terrestrial manipulators. This paper presents a solution to the slow speed of robotic hardware design and generalizes the technique to free-flying space robots. It presents Ta-DAH Design, an automated design approach that utilises a multi-objective cost function in an iterative and automated pipeline. The approach leverages prior knowledge and facilitates the faster output of optimal designs. The result is a system that can optimise the size of the base spacecraft, the manipulator, and some key subsystems for any given task. Presented in this work are the methodology behind Ta-DAH Design and a number of optimal space robot designs.
Keywords: space robots, automated design, on-orbit operations, hardware design
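The multi-objective cost function idea can be illustrated with a hedged sketch: score candidate robot configurations with a weighted sum of normalized objectives. All names, numbers, weights, and reference scales below are invented; the abstract does not disclose Ta-DAH's actual cost terms.

```python
# Hedged sketch of a multi-objective design cost: weighted sum of normalized
# objectives over candidate configurations. All values are invented; these
# are NOT Ta-DAH's actual cost terms.

candidates = {  # name: (mass_kg, reach_m, power_W)
    "compact":  (120.0, 1.5, 300.0),
    "balanced": (180.0, 2.5, 450.0),
    "long_arm": (260.0, 3.5, 700.0),
}
weights = {"mass": 0.5, "reach": -0.3, "power": 0.2}  # negative weight = reward

def cost(spec):
    mass, reach, power = spec
    # normalize each objective by a crude reference scale before weighting
    return (weights["mass"] * mass / 100.0
            + weights["reach"] * reach / 1.0
            + weights["power"] * power / 100.0)

best = min(candidates, key=lambda name: cost(candidates[name]))
print(best)  # lowest-cost candidate under these weights
```

An automated pipeline would evaluate such a cost inside an iterative loop that proposes and refines configurations rather than scoring a fixed list.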
Procedia PDF Downloads 73
3504 Development of an Efficient Algorithm for Cessna Citation X Speed Optimization in Cruise
Authors: Georges Ghazi, Marc-Henry Devillers, Ruxandra M. Botez
Abstract:
Aircraft flight trajectory optimization has been identified as a promising solution for reducing both airline costs and the net carbon footprint of aviation. Nowadays, this role is mainly attributed to the flight management system, an onboard multi-purpose computer responsible for providing the crew with an optimized flight plan from one destination to the next. To accomplish this function, the flight management system uses a variety of look-up tables to instantly compute the optimal speed and altitude for each flight regime. Because the cruise is the longest segment of a typical flight, the proposed algorithm focuses on minimizing fuel consumption for this flight phase. In this paper, a complete methodology to estimate the aircraft performance and subsequently compute the optimal cruise speed is presented. Results showed that the obtained performance database was accurate enough to predict the flight costs associated with the cruise phase.
Keywords: Cessna Citation X, cruise speed optimization, flight cost, cost index, golden section search
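The golden section search named in the keywords can be sketched against a toy cruise-cost curve. The cost model and its constants below are hypothetical stand-ins, not the Citation X performance database.

```python
# Golden-section search applied to a toy cruise-cost curve; the cost model
# and constants are hypothetical, not the Citation X performance database.
import math

def golden_section_min(f, a, b, tol=1e-4):
    inv_phi = (math.sqrt(5) - 1) / 2                  # ~0.618
    c, d = b - inv_phi * (b - a), a + inv_phi * (b - a)
    while b - a > tol:
        if f(c) < f(d):                               # minimum lies in [a, d]
            b, d = d, c
            c = b - inv_phi * (b - a)
        else:                                         # minimum lies in [c, b]
            a, c = c, d
            d = a + inv_phi * (b - a)
    return (a + b) / 2

# toy cost: drag-related fuel burn grows with speed, time-related cost shrinks
cost = lambda v: 0.002 * v ** 2 + 171500.0 / v        # v in knots, arbitrary units
v_opt = golden_section_min(cost, 250.0, 500.0)
print(round(v_opt))  # 350 for this toy model
```

Golden section search suits this setting because the cost-index trade-off between fuel and time typically yields a unimodal cost-versus-speed curve, which the bracketing method exploits without derivatives.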
Procedia PDF Downloads 292