Search results for: time base maintenance
19953 Exploratory Study to Obtain a Biolubricant Base from Transesterified Oils of Animal Fats (Tallow)
Authors: Carlos Alfredo Camargo Vila, Fredy Augusto Avellaneda Vargas, Debora Alcida Nabarlatz
Abstract:
Due to the current need to implement environmentally friendly technologies, the possibility of using renewable raw materials to produce bioproducts such as biofuels, or, in this case, biolubricant bases from residual oils (tallow) originating from the bovine industry, has been studied. It is therefore hypothesized that, through the study and control of the operating variables involved in the reverse transesterification method, a high-performance biolubricant base can be obtained at laboratory scale using animal fats from the bovine industry as raw material, as an alternative for material recovery and environmental benefit. To implement this process, the crude tallow oil must first be esterified to decrease its acid value (initially > 1 mg KOH/g oil) by acid catalysis with sulfuric acid and methanol, at a 7.5:1 methanol:tallow molar ratio and 1.75% w/w catalyst, at 60°C for 150 minutes. Once this conditioning is completed, biodiesel is obtained from the improved tallow through an experimental design for the transesterification method, evaluating the effects of the process variables, namely the methanol:improved tallow molar ratio and the catalyst (KOH) percentage, on the methyl ester content (% FAME). The highest FAME content (92.5%) is obtained with a 7.5:1 methanol:improved tallow ratio and 0.75% catalyst at 60°C for 120 minutes. Although this % FAME does not make the biodiesel suitable for commercialization, it does qualify it ( > 90%) for use as a raw material in obtaining biolubricant bases. Finally, once the biodiesel is obtained, an experimental design is carried out to obtain biolubricant bases using the reverse transesterification method, studying the effects of the biodiesel:TMP (trimethylolpropane) molar ratio and the catalyst percentage on viscosity and yield as response variables.
As a result, a biolubricant base is obtained that meets the ISO VG 32 requirements (viscosity and viscosity index; classification for industrial lubricants according to ASTM D 2422) for commercial lubricant bases, using a 4:1 biodiesel:TMP molar ratio and 0.51% catalyst at 120°C and a pressure of 50 mbar for 180 minutes. It should be highlighted that the product obtained consists of two phases, a liquid one and a solid one, the first being the object of study, leaving the classification and possible application of the second unknown. It is therefore recommended to carry out more in-depth studies to characterize both phases, and to improve the production method by optimizing the process variables to achieve superior results.
Keywords: biolubricant base, bovine tallow, renewable resources, reverse transesterification
Procedia PDF Downloads 117
19952 Dynamic Risk Identification Using Fuzzy Failure Mode Effect Analysis in Fabric Process Industries: A Research Article as Management Perspective
Authors: A. Sivakumar, S. S. Darun Prakash, P. Navaneethakrishnan
Abstract:
In and around Erode District, it is estimated that more than 1250 chemical and allied textile fabric processing industries have been affected, partially closed, or shut down for various reasons such as poor management, poor supplier performance, lack of planning for productivity, fluctuation of output, poor investment, waste analysis, labor problems, capital/labor ratio, accumulation of stocks, poor maintenance of resources, deficiencies in fabric quality, low capacity utilization, age of plant and equipment, high investment and input but low throughput, poor research and development, lack of energy, workers' fear of job loss, workforce mix, and work ethic. The main objective of this work is to analyze existing conditions in the textile fabric sector; validate the break-even of Total Productivity (TP); and analyze, design, and implement fuzzy sets and mathematical programming for the improvement of productivity and quality dimensions in the fabric processing industry, in a way that is compatible with the reality of textile and fabric processing industries. The highest-risk events along the productivity and quality dimensions were found by fuzzy systems, and the results are summarized for the textile fabric processing industry.
Keywords: break even point, fuzzy crisp data, fuzzy sets, productivity, productivity cycle, total productive maintenance
Procedia PDF Downloads 338
19951 Data Clustering in Wireless Sensor Network Implemented on Self-Organization Feature Map (SOFM) Neural Network
Authors: Krishan Kumar, Mohit Mittal, Pramod Kumar
Abstract:
Wireless sensor networks are among the most promising communication networks for monitoring remote environmental areas. In such a network, all the sensor nodes communicate with each other via radio signals. The sensor nodes have capabilities for sensing, data storage, and processing, and collect information from neighboring nodes toward a particular node. Data collection and processing are done by data aggregation techniques. For data aggregation in the sensor network, a clustering technique is implemented by means of a self-organizing feature map (SOFM) neural network. Some of the sensor nodes are selected as cluster head nodes. Information is aggregated from non-cluster-head nodes to cluster head nodes and then transferred to the base station (or sink nodes). The aim of this paper is to manage the huge amount of data with the help of the SOFM neural network: clustered data are selected for transfer to the base station instead of the whole information aggregated at the cluster head nodes. This reduces battery consumption for managing the huge volume of data, and the network lifetime is enhanced to a great extent.
Keywords: artificial neural network, data clustering, self organization feature map, wireless sensor network
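The clustering idea described in this abstract can be sketched in a few lines. Below is a minimal, illustrative self-organizing-map-style clustering of synthetic node readings; the network size, feature count, and learning schedule are assumptions for illustration only, not the authors' implementation (a full SOFM would also use a neighborhood function).

```python
import numpy as np

# Minimal sketch of SOFM-style clustering of sensor-node readings
# (hypothetical data; winner-take-all update, no neighborhood function).
rng = np.random.default_rng(0)
readings = rng.random((100, 2))   # 100 nodes, 2 features each (assumed)
weights = rng.random((4, 2))      # 4 prototype vectors, one per cluster

lr = 0.5
for epoch in range(20):
    for x in readings:
        bmu = np.argmin(np.linalg.norm(weights - x, axis=1))  # best-matching unit
        weights[bmu] += lr * (x - weights[bmu])               # pull winner toward x
    lr *= 0.9                                                 # decay learning rate

# Each node is assigned to its nearest prototype; nodes sharing a prototype
# form one cluster, whose head would aggregate and forward data.
clusters = [int(np.argmin(np.linalg.norm(weights - x, axis=1))) for x in readings]
print(sorted(set(clusters)))   # cluster indices actually in use
```

In the paper's setting, only the aggregated cluster-level data would then be forwarded to the base station, which is what reduces battery drain.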
Procedia PDF Downloads 517
19950 Co-Integration Model for Predicting Inflation Movement in Nigeria
Authors: Salako Rotimi, Oshungade Stephen, Ojewoye Opeyemi
Abstract:
The maintenance of price stability is one of the macroeconomic challenges facing Nigeria as a nation. This paper attempts to build a co-integrated multivariate time series model for inflation movement in Nigeria using data extracted from the abstract of statistics of the Central Bank of Nigeria (CBN) from 2008 to 2017. The Johansen cointegration test suggests at least one co-integration vector describing the long-run relationship between the Consumer Price Index (CPI), Food Price Index (FPI), and Non-Food Price Index (NFPI). All three series show an increasing pattern, which is a sign of non-stationarity in each series. Furthermore, model predictability was established with the root mean square error, mean absolute error, mean absolute percentage error, and Theil's U statistic for n-step forecasting. The results show that the long-run coefficient of the Consumer Price Index (CPI) has a positive long-run relationship with the Food Price Index (FPI) and Non-Food Price Index (NFPI).
Keywords: economic, inflation, model, series
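A small sketch of what a cointegrating (long-run) relation looks like, using synthetic data rather than the CBN series: two non-stationary random walks share a stable linear relation that ordinary least squares recovers. In practice the Johansen test (e.g. `coint_johansen` in statsmodels) would be applied to CPI, FPI, and NFPI; the coefficient 1.5 below is an arbitrary assumption.

```python
import numpy as np

# Illustrative cointegration sketch (synthetic data, not the paper's series):
# fpi is a random walk (I(1)); cpi tracks 1.5 * fpi plus stationary noise,
# so the two are individually non-stationary but cointegrated.
rng = np.random.default_rng(42)
n = 500
fpi = np.cumsum(rng.normal(size=n))               # non-stationary random walk
cpi = 1.5 * fpi + rng.normal(scale=0.5, size=n)   # long-run relation + noise

X = np.column_stack([fpi, np.ones(n)])
beta, *_ = np.linalg.lstsq(X, cpi, rcond=None)    # OLS long-run coefficient
resid = cpi - X @ beta                            # residual should be stationary

print(round(float(beta[0]), 2))   # estimate close to the true 1.5
```

The residual series staying bounded (while both inputs wander) is the signature of cointegration that the Johansen test formalizes.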
Procedia PDF Downloads 244
19949 A Cooperative, Autonomous, and Continuously Operating Drone System Offered to Railway and Bridge Industry: The Business Model Behind
Authors: Paolo Guzzini, Emad Samuel M. Ebeid
Abstract:
Bridges and railways are critical infrastructures. Ensuring safety for transport using such assets is a primary goal, as it directly impacts people's lives. However, improving safety may require increased investment in O&M, and optimizing resource usage for asset maintenance therefore becomes crucial. Drones4Safety (D4S), a European project funded under the H2020 Research and Innovation Action (RIA) program, aims to increase the safety of European civil transport by building a system that relies on three main pillars:
• drones operating autonomously in swarm mode;
• drones able to recharge themselves using inductive phenomena produced by transmission lines near the bridge and railway assets to be inspected;
• acquired data analyzed with AI-empowered algorithms for defect detection.
This paper describes the business model behind this disruptive project. The business model is structured in two parts:
• the first part focuses on the design of the business model canvas, to explain the value provided by the Drones4Safety project;
• the second part defines a detailed financial analysis, with the target of calculating the IRR (internal rate of return) and the NPV (net present value) of the investment over a 7-year plan (2 years to run the project + 5 years post-implementation).
For the financial analysis, two different points of view are assumed:
• the point of view of the Drones4Safety company in charge of designing, producing, and selling the new system;
• the point of view of the utility company that will adopt the new system in its O&M practices.
Assuming the point of view of the Drones4Safety company, three scenarios were considered:
• selling the drones, where revenues are produced by drone sales;
• renting the drones, where revenues are produced by drone rental (with a time-based model);
• selling the data acquisition service, where revenues are produced by sales of the pictures acquired by the drones.
Assuming the point of view of a utility adopting the D4S system, a fourth scenario was analyzed, taking into account the decremental costs related to the change of operation and maintenance practices. The paper shows, for both companies, which key parameters most affect the business model and which scenarios are sustainable.
Keywords: a swarm of drones, AI, bridges, railways, drones4safety company, utility companies
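The NPV/IRR appraisal mentioned above is standard discounted-cash-flow arithmetic and can be sketched briefly. All cash flows below are hypothetical placeholders for a 7-year plan (2 outlay years + 5 revenue years), not figures from the project.

```python
# Hedged sketch of the NPV/IRR calculation for a 7-year investment plan.
# Cash flows and the 8% discount rate are illustrative assumptions only.

def npv(rate, cashflows):
    """Net present value; cashflows[t] falls at the end of year t."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.99, hi=10.0):
    """Internal rate of return by bisection (assumes one sign change)."""
    for _ in range(200):
        mid = (lo + hi) / 2
        if npv(lo, cashflows) * npv(mid, cashflows) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

# two project years of outlays, then five years of net revenues (illustrative)
flows = [-400, -400, 250, 260, 270, 280, 290]
print(round(npv(0.08, flows), 1))      # NPV at an 8% discount rate
print(round(irr(flows) * 100, 1))      # IRR as a percentage
```

The IRR is simply the rate at which the NPV crosses zero, which is why a bracketing root-finder suffices.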
Procedia PDF Downloads 141
19948 Developing a Maturity Model of Digital Twin Application for Infrastructure Asset Management
Authors: Qingqing Feng, S. Thomas Ng, Frank J. Xu, Jiduo Xing
Abstract:
Faced with unprecedented challenges, including aging assets, lack of maintenance budget, overtaxed and inefficient usage, and an outcry for better service quality from society, today's infrastructure systems have become the main focus of many metropolises pursuing sustainable urban development and improved resilience. The digital twin, one of the most innovative enabling technologies nowadays, may open up new ways of tackling various infrastructure asset management (IAM) problems. A digital twin application for IAM, as its name indicates, is an evolving digital model of the intended infrastructure with functions including real-time monitoring; what-if event simulation; and scheduling, maintenance, and management optimization based on technologies like IoT, big data, and AI. Up to now, there are already numerous global initiatives of digital twin applications, such as 'Virtual Singapore' and 'Digital Built Britain'. With digital twin technology progressively permeating the IAM field, it is necessary to consider the maturity of such applications and how institutional or industrial digital twin application processes will evolve in the future. To address the lack of such a benchmark, a draft maturity model is developed for digital twin application in the IAM field. Firstly, an overview of current smart city maturity models is given, based on which the draft Maturity Model of Digital Twin Application for Infrastructure Asset Management (MM-DTIAM) is developed for multiple stakeholders to evaluate and derive informed decisions. The development process follows a systematic approach with four major procedures, namely scoping, designing, populating, and testing. Through in-depth literature review, interviews, and focus group meetings, the key domain areas are populated, defined, and iteratively tuned. Finally, a case study of several digital twin projects is conducted for self-verification.
The findings of the research reveal that: (i) the developed maturity model outlines five maturing levels leading to an optimised digital twin application, covering the aspects of strategic intent, data, technology, governance, and stakeholders' engagement; (ii) based on the case study, levels 1 to 3 are already partially implemented in some initiatives, while level 4 is on the way; and (iii) more practice is still needed to refine the draft so that the key domain areas are mutually exclusive and collectively exhaustive.
Keywords: digital twin, infrastructure asset management, maturity model, smart city
Procedia PDF Downloads 157
19947 Additive Weibull Model Using Warranty Claim and Finite Element Analysis Fatigue Analysis
Authors: Kanchan Mondal, Dasharath Koulage, Dattatray Manerikar, Asmita Ghate
Abstract:
This paper presents an additive reliability model using warranty data and Finite Element Analysis (FEA) data. Warranty data for any product give insight into its underlying issues and are often used by reliability engineers to build prediction models to forecast the failure rate of parts. But there is one major limitation in using warranty data for prediction: warranty periods constitute only a small fraction of the total lifetime of a product, most of the time covering only the infant mortality and useful life zones of the bathtub curve. Predicting with warranty data alone in these cases does not generally provide results with the desired accuracy. The failure rate of a mechanical part is driven by random issues initially and by wear-out or usage-related issues at later stages of its lifetime. For better predictability of the failure rate, one needs to explore its behavior in the wear-out zone of the bathtub curve. Due to cost and time constraints, it is not always possible to test samples until failure, but FEA fatigue analysis can provide the failure-rate behavior of a part well beyond the warranty period, more quickly and at lower cost. In this work, the authors propose an Additive Weibull Model that makes use of both warranty and FEA fatigue analysis data for predicting failure rates. It involves modeling two data sets for a part, one with existing warranty claims and the other with fatigue life data. Hazard-rate-based Weibull estimation is used for modeling the warranty data, whereas S-N-curve-based Weibull parameter estimation is used for the FEA data. The two separate sets of Weibull parameters are estimated and combined to form the proposed Additive Weibull Model for prediction.
Keywords: bathtub curve, fatigue, FEA, reliability, warranty, Weibull
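The mechanics of an additive Weibull model can be sketched directly from the hazard-rate definition: the two fitted hazards add, so the combined reliability is the product of the two Weibull survivor functions. The shape/scale parameters below are illustrative stand-ins (early-life shape < 1 from warranty claims, wear-out shape > 1 from fatigue data), not the authors' estimates.

```python
import math

# Additive Weibull sketch: h(t) = h1(t) + h2(t), so
# R(t) = exp(-(t/a1)**b1 - (t/a2)**b2). Parameters are hypothetical.

def weibull_hazard(t, shape, scale):
    """Weibull hazard rate (b/a) * (t/a)**(b-1)."""
    return (shape / scale) * (t / scale) ** (shape - 1)

def additive_reliability(t, p1, p2):
    """Reliability when two Weibull hazards act additively."""
    (b1, a1), (b2, a2) = p1, p2
    return math.exp(-((t / a1) ** b1) - ((t / a2) ** b2))

warranty = (0.8, 5000.0)    # early-life fit from warranty claims (assumed)
fatigue = (3.5, 20000.0)    # wear-out fit from FEA S-N data (assumed)

for t in (1000.0, 10000.0, 25000.0):
    h = weibull_hazard(t, *warranty) + weibull_hazard(t, *fatigue)
    print(t, round(additive_reliability(t, warranty, fatigue), 4), h)
```

With these example parameters the early-life term dominates the hazard at small t and the fatigue term takes over near the wear-out zone, which is exactly the bathtub behavior the abstract describes.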
Procedia PDF Downloads 73
19946 Evaluating and Reducing the Impact of Aircraft Technical Delays and Cancellations on Operational Reliability: Case Study of an Airline Operator
Authors: Adel A. Ghobbar, Ahmad Bakkar
Abstract:
Although special care is given to maintenance, aircraft systems fail, and these failures cause delays and cancellations. The occurrence of delays and cancellations affects operators and manufacturers negatively. To reduce technical delays and cancellations, one should be able to determine the important systems causing them. The goal of this research is to find a method of identifying the most expensive delay- and cancellation-causing systems for airline operators. A predictive model was introduced to forecast failures and their impact, after research that identified the information needed to tackle the problems addressed by this paper. Data were obtained from the manufacturer's service reliability team database. Subsequently, delay and cancellation evaluation methods were identified. No cost estimation methods were used, due to their complexity. The model was developed to take into account the frequency of delays and cancellations, using weighting factors to give an indication of the severity of their duration. The weighting factors are based on customer experience. The data analysis has shown that delay and cancellation events are not seasonal and do not follow any specific trends. The use of weighting factors does influence the shortlist over short periods (monthly), but not over the analyzed period of three years. The landing gear and the navigation system are among the top three factors causing delays and cancellations for all three aircraft types. The results confirmed that cooperation between certain operators and the manufacturer reduces the impact of delays and cancellations.
Keywords: reliability, availability, delays & cancellations, aircraft maintenance
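The frequency-plus-duration-weighting scheme described above can be illustrated with a tiny ranking sketch. The weight table, duration bands, and event records are all hypothetical assumptions; the paper derives its weights from customer experience.

```python
# Illustrative severity ranking: each delay's duration (minutes) maps to a
# weight, and weights accumulate per aircraft system. All values are assumed.
severity_weights = [(15, 1), (60, 3), (180, 6), (float("inf"), 10)]

def weight(minutes):
    for limit, w in severity_weights:
        if minutes <= limit:
            return w

# (system, delay duration in minutes); hypothetical event log
events = [("landing gear", 200), ("navigation", 45), ("landing gear", 30),
          ("APU", 10), ("navigation", 500)]

scores = {}
for system, minutes in events:
    scores[system] = scores.get(system, 0) + weight(minutes)

# systems ranked by accumulated frequency-times-severity score
ranking = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
print(ranking)
```

Ranking by the weighted score rather than raw event counts is what lets a few long delays outrank many short ones, which matches the abstract's point that weighting changes the monthly shortlist.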
Procedia PDF Downloads 132
19945 The Budget Impact of the DISCERN™ Diagnostic Test for Alzheimer’s Disease in the United States
Authors: Frederick Huie, Lauren Fusfeld, William Burchenal, Scott Howell, Alyssa McVey, Thomas F. Goss
Abstract:
Alzheimer’s disease (AD) is a degenerative brain disease characterized by memory loss and cognitive decline that presents a substantial economic burden for patients and health insurers in the US. This study evaluates the payer budget impact of the DISCERN™ test in the diagnosis and management of patients with symptoms of dementia evaluated for AD. DISCERN™ comprises three assays that assess critical factors related to AD, which regulate memory, the formation of synaptic connections among neurons, and the levels of amyloid plaques and neurofibrillary tangles in the brain, and can provide a quicker, more accurate diagnosis than tests in the current diagnostic pathway (CDP). An Excel-based model with a three-year horizon was developed to assess the budget impact of DISCERN™ compared with the CDP in a Medicare Advantage plan with 1M beneficiaries. Model parameters were identified through a literature review and verified through consultation with clinicians experienced in the diagnosis and management of AD. The model assesses direct medical costs/savings for patients in the following categories:
• Diagnosis: costs of diagnosis using DISCERN™ and the CDP.
• False negative (FN) diagnosis: incremental cost of care avoidable with a correct AD diagnosis and appropriately directed medication.
• True positive (TP) diagnosis: AD medication costs; the cost of a later TP diagnosis with the CDP versus DISCERN™ in the year of diagnosis, and savings from the delay in AD progression due to appropriate AD medication in patients who are correctly diagnosed after a FN diagnosis.
• False positive (FP) diagnosis: cost of AD medication for patients who do not have AD.
A one-way sensitivity analysis was conducted to assess the effect of varying key clinical and cost parameters by ±10%. An additional scenario analysis was developed to evaluate the impact of individual inputs.
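The cost categories above can be combined into a schematic budget-impact comparison. Every input below (cohort size, test costs, sensitivities, downstream costs, prevalence) is a hypothetical placeholder, not a parameter from the study; the point is only the accounting structure.

```python
# Schematic budget-impact arithmetic for two diagnostic pathways.
# All numeric inputs are illustrative assumptions, not the study's values.

def pathway_cost(n_tested, test_cost, sensitivity, specificity, prevalence,
                 fn_cost, tp_cost, fp_cost):
    """Total cost: testing plus downstream TP/FN/FP consequences."""
    n_pos = n_tested * prevalence
    n_neg = n_tested - n_pos
    return (n_tested * test_cost
            + n_pos * sensitivity * tp_cost           # true positives (medication)
            + n_pos * (1 - sensitivity) * fn_cost     # missed cases progress faster
            + n_neg * (1 - specificity) * fp_cost)    # treated without disease

cohort = 25_000  # patients evaluated for AD over the horizon (assumed)
cdp = pathway_cost(cohort, 1500, 0.70, 0.70, 0.5, 12_000, 6_000, 2_500)
new_test = pathway_cost(cohort, 2000, 0.90, 0.90, 0.5, 12_000, 6_000, 2_500)

print(round((cdp - new_test) / 1e6, 2))   # net savings in $M (illustrative)
```

This mirrors the abstract's finding qualitatively: a more expensive but more accurate test can still save money overall when FN and FP downstream costs dominate.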
In the base scenario, DISCERN™ is estimated to decrease costs by $4.75M over three years, equating to approximately $63.11 saved per test per year for a cohort followed over three years. While the diagnosis cost is higher with DISCERN™ than with CDP modalities, this cost is offset by the higher overall costs associated with the CDP, due to the longer time needed to receive a TP diagnosis and the larger number of patients who receive a FN diagnosis and progress more rapidly than if they had received appropriate AD medication. The sensitivity analysis shows that the three parameters with the greatest impact on savings are: reduced sensitivity of DISCERN™, improved sensitivity of the CDP, and a reduction in the percentage of disease progression avoided with appropriate AD medication. A scenario analysis in which DISCERN™ reduces the utilization of computed tomography from 21% in the base case to 16%, magnetic resonance imaging from 37% to 27%, and cerebrospinal fluid biomarker testing, positron emission tomography, electroencephalograms, and polysomnography testing from 4%, 5%, 10%, and 8%, respectively, in the base case to 0%, results in an overall three-year net savings of $14.5M. DISCERN™ improves the rate of accurate, definitive diagnosis of AD earlier in the disease and may generate savings for Medicare Advantage plans.
Keywords: Alzheimer’s disease, budget, dementia, diagnosis
Procedia PDF Downloads 138
19944 Assessment of Causes of Building Collapse in Nigeria
Authors: Olufemi Oyedele
Abstract:
Building collapse (BC) in Nigeria is becoming a regular occurrence, each incident causing great losses of lives and materials. A building collapse is a situation where a building that has been completed and occupied, completed but not occupied, or under construction collapses on its own due to the action or inaction of man, or due to a natural event like an earthquake, storm, flooding, tsunami, or wildfire. It is different from building demolition. There are various causes of building collapse, and each case requires expert judgment to decide the cause. The rate of building collapse is a reflection of the level of organization and control of building activities and of the degree of sophistication of the construction professionals in a country. This study used a case-study approach, examining the causes of six (6) collapsed buildings (CB) across Nigeria. Samples of materials from the sites of the collapsed buildings were taken for testing and analysis, while critical observations were made at the sites to note the conditions of the ground (building base). The study found that the majority of building collapses in Nigeria were due to poor workmanship and sub-standard building materials, followed by bad building bases and poor design. The National Building Code 2006 is not effective due to lack of enforcement, and the physical development departments of the states and the Federal Capital Territory are mere agents of corruption, allowing all types of construction without building approvals.
Keywords: building collapse, concrete tests, differential settlement, integrity test, quality control
Procedia PDF Downloads 535
19943 Performance Assessment of Multi-Level Ensemble for Multi-Class Problems
Authors: Rodolfo Lorbieski, Silvia Modesto Nassar
Abstract:
Many supervised machine learning tasks require decision making across numerous different classes. Multi-class classification has several applications, such as face recognition, text recognition, and medical diagnostics. The objective of this article is to analyze an adapted Stacking method for multi-class problems, which combines ensembles within the ensemble itself. For this purpose, a training scheme similar to Stacking was used, but with three levels, where the final decision-maker (level 2) performs its training by combining the outputs of a pair of meta-classifiers (level 1) from the tree-based and Bayesian families, which are in turn trained by pairs of base classifiers (level 0) of the same family. This strategy seeks to promote diversity among the ensembles forming the level-2 meta-classifier. Three performance measures were used: (1) accuracy, (2) area under the ROC curve, and (3) time, for three factors: (a) datasets, (b) experiments, and (c) levels. To compare the factors, a three-way ANOVA test was executed for each performance measure, considering 5 datasets by 25 experiments by 3 levels. A triple interaction between factors was observed only for time. Accuracy and area under the ROC curve presented similar results, showing a double interaction between level and experiment, as well as for the dataset factor. It was concluded that level 2 had an average performance above the other levels and that the proposed method is especially efficient for multi-class problems when compared to binary problems.
Keywords: stacking, multi-layers, ensemble, multi-class
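The three-level stacking mechanics can be sketched with trivial stand-in learners: level-0 models emit scores, level-1 meta-models combine them, and a level-2 decision-maker combines the level-1 outputs. The "classifiers" below are simple feature-averaging functions on synthetic data, purely to show the data flow; they are not the article's tree-based and Bayesian models.

```python
import numpy as np

# Minimal multi-level stacking sketch (illustrative stand-in learners only).
rng = np.random.default_rng(1)
X = rng.random((200, 4))
y = (X[:, 0] + X[:, 1] > 1.0).astype(int)   # synthetic binary target

def learner(cols):
    """A stand-in level-0 'classifier': score = mean of selected features."""
    return lambda X: X[:, cols].mean(axis=1)

def stack(models, Z):
    """Feed one level's outputs to the next as a feature matrix."""
    return np.column_stack([m(Z) for m in models])

level0 = [learner([0]), learner([1]), learner([2, 3])]
Z1 = stack(level0, X)                        # level-0 outputs -> level-1 inputs

level1 = [lambda Z: Z[:, :2].mean(axis=1),   # stand-in "meta-classifier" pair
          lambda Z: Z.mean(axis=1)]
Z2 = stack(level1, Z1)                       # level-1 outputs -> level-2 input

pred = (Z2.mean(axis=1) > 0.5).astype(int)   # level-2 decision-maker
accuracy = float((pred == y).mean())
print(round(accuracy, 2))
```

In a real setting each level would be fitted with cross-validated out-of-fold predictions to avoid leakage; here the point is only the level-0 to level-2 wiring.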
Procedia PDF Downloads 269
19942 Nurses' Knowledge and Attitudes about Clinical Governance
Authors: Sedigheh Salemi, Mahnaz Sanjari, Maryam Aalaa, Mohammad Mirzabeigi
Abstract:
Clinical governance is the framework within which health service providers are accountable for continuously improving the quality of their services. This cross-sectional study was conducted with 661 nurses working in 35 government hospitals across 9 provinces in Iran. The study was approved by the Nursing Council and was carried out with the authorization of the Research Ethics Committee. The questionnaire included 24 questions, of which 4 focused on defining clinical governance from the nurses' perspective. Reliability was evaluated by Cronbach's alpha (α = 0.83). Statistical analyses were performed using SPSS version 16. Approximately 40% of nurses correctly answered that clinical governance is not a 'system of punishment and rewards for the staff'. Most nurses believed that 'clinical efficacy' is one of the main components of clinical governance. Few nurses correctly responded that 'evidence-based practice' and 'management' are not part of clinical governance. A small number of nurses correctly answered that the 'maintenance of patient records' and 'recognizing adverse effects' are not the nurse's roles in clinical governance. The most frequent 'do not know' answers concerned the 'maintenance of patient records'. Most nurses unanimously believed that the implementation of clinical governance leads to 'promoting the quality of care'. About a third of nurses correctly stated that the implementation of clinical governance will not lead to 'an increase in salaries and benefits of the medical team'. As members of the health team, nurses are responsible for participating in quality improvement, and it is necessary to create an environment in which clinical care will flourish and serve to preserve high standards.
Keywords: clinical governance, nurses, salary, health team
Procedia PDF Downloads 430
19941 A Detection Method of Faults in Railway Pantographs Based on Dynamic Phase Plots
Authors: G. Santamato, M. Solazzi, A. Frisoli
Abstract:
Systems for detecting damage in railway pantographs effectively reduce maintenance costs and improve time scheduling. In this paper, we present an approach to designing a monitoring tool that fits strong customer requirements such as portability and ease of use. The pantograph was modeled to estimate its dynamical properties, since no data were available. With the aim of focusing on suspension health, a two-degrees-of-freedom (DOF) scheme was adopted, with parameters calculated by means of analytical dynamics. A Finite Element Method (FEM) modal analysis verified the model with an acceptable error. The detection strategy looks for alterations in phase-plot topology induced by defects. In order to test the suitability of the method, dashpot leakage was simulated on the lumped model. The results are interesting because changes in the phase plots are more appreciable than frequency shifts. Further calculations as well as experimental tests will support future developments of this smart strategy.
Keywords: pantograph models, phase plots, structural health monitoring, damage detection
Procedia PDF Downloads 362
19940 Identifying and Reviewing Effective Factors on Relationship Marketing in the National Iranian Drilling Company from Managers' View
Authors: Hoda Ghorbani
Abstract:
Today, many markets are mature and face congested competition, with supply considerably greater than demand. In light of such changes, organizations must equip themselves in advance and be ready to contend with their rivals. In this regard, relationship marketing tries to lower the cost of attracting new customers by establishing and maintaining long-run relations with current customers, thereby increasing corporate profitability. Consequently, identifying relationship marketing and its effective factors is essential for maintaining the market and improving corporate competitive potential. The present study deals with identifying the factors affecting relationship marketing in the National Iranian Drilling Company (NIDC) from the managers' point of view. The methodology of this study is of the descriptive-survey type. In addition to an extensive review of secondary sources and interviews with experienced members of NIDC, the researcher identified the related factors and distributed a questionnaire, including 31 questions, among 144 participants from corporate managers and first-rank principals. After gathering the data, they were analyzed using the binomial test as well as the Analytic Hierarchy Process (AHP) of pair-wise comparisons. The study results showed that variables such as communication, commitment, conflict management, and trust affect relationship marketing, in that order of preference.
Keywords: marketing relationship, trust, commitment, communication, conflict management
Procedia PDF Downloads 371
19939 Particle Swarm Optimization Based Vibration Suppression of a Piezoelectric Actuator Using Adaptive Fuzzy Sliding Mode Controller
Authors: Jin-Siang Shaw, Patricia Moya Caceres, Sheng-Xiang Xu
Abstract:
This paper aims to integrate the particle swarm optimization (PSO) method with an adaptive fuzzy sliding mode controller (AFSMC) to achieve vibration attenuation in a piezoelectric actuator subject to base excitation. The piezoelectric actuator is a complicated system made of ferroelectric materials, and its performance can be affected by a nonlinear hysteresis loop, unknown system parameters, and external disturbances. In this study, an adaptive fuzzy sliding mode controller is proposed for vibration control of the system: the fuzzy sliding mode controller is designed to tackle the unknown parameters and external disturbances of the system, and the adaptive algorithm is aimed at fine-tuning this controller for error convergence. The particle swarm optimization method is used to find the optimal controller parameters for the piezoelectric actuator. PSO starts with a population of random possible solutions, called particles. The particles move through the search space with dynamically adjusted speed and direction that change according to their historical behavior, allowing the values of the particles to quickly converge toward the best solutions for the proposed problem. In this paper, an initial set of controller parameters is applied to the piezoelectric actuator, which is subject to resonant base excitation with large-amplitude vibration. The resulting vibration suppression is about 50%. PSO is then applied to search for an optimal controller in the neighborhood of this initial controller. The performance of the optimal fuzzy sliding mode controller found by PSO indeed improves vibration attenuation up to 97.8%. Finally, an adaptive version of the fuzzy sliding mode controller is adopted to further improve vibration suppression. Simulation results verify the performance of the adaptive controller, with 99.98% vibration reduction.
In other words, the vibration of a piezoelectric actuator subject to resonant base excitation can be almost completely suppressed using this PSO-based adaptive fuzzy sliding mode controller.
Keywords: adaptive fuzzy sliding mode controller, particle swarm optimization, piezoelectric actuator, vibration suppression
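The PSO search loop described above is a short, standard algorithm. The sketch below minimizes a stand-in objective (the sphere function) rather than the paper's vibration-attenuation cost for AFSMC parameters; the inertia and acceleration coefficients are common textbook defaults, not values from the paper.

```python
import random

# Minimal PSO sketch (illustrative objective and hyperparameters).
def pso(objective, dim=2, n_particles=30, iters=300, w=0.7, c1=1.5, c2=1.5):
    random.seed(3)
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # per-particle best positions
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]   # swarm-wide best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])   # cognitive pull
                             + c2 * r2 * (gbest[d] - pos[i][d]))     # social pull
                pos[i][d] += vel[i][d]
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest

sphere = lambda p: sum(x * x for x in p)   # stand-in cost function
best = pso(sphere)
print([round(x, 3) for x in best])         # near the optimum at the origin
```

In the paper's setting, `objective` would evaluate the simulated vibration level for a candidate set of fuzzy sliding mode controller parameters.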
Procedia PDF Downloads 146
19938 Developing a Cloud Intelligence-Based Energy Management Architecture Facilitated with Embedded Edge Analytics for Energy Conservation in Demand-Side Management
Authors: Yu-Hsiu Lin, Wen-Chun Lin, Yen-Chang Cheng, Chia-Ju Yeh, Yu-Chuan Chen, Tai-You Li
Abstract:
Demand-Side Management (DSM) has the potential to reduce the electricity costs and carbon emissions associated with electricity use in modern society. A home Energy Management System (EMS), commonly used by residential consumers in the downstream sector of a smart grid to monitor, control, and optimize the energy efficiency of domestic appliances, is a system of computer-aided functionalities serving as an energy audit for residential DSM. Implementing fault detection and classification for the monitored, controlled, and optimized domestic appliances is one of the most important steps toward preventive maintenance, such as residential air conditioning and heating preventive maintenance in residential/industrial DSM. In this study, a cloud intelligence-based green EMS built on an Internet of Things (IoT) technology stack is developed for residential DSM. In the EMS, Arduino MEGA Ethernet communication-based smart sockets, each with a real-time clock chip that keeps track of the current time (timestamps synchronized via the Network Time Protocol), are designed and implemented to take readings of load phenomena reflected in the sensed voltage and current signals. A Network-Attached Storage device, providing data access to a heterogeneous group of IoT clients via Hypertext Transfer Protocol (HTTP) methods, is configured as the data store for the parsed sensor readings. Lastly, a desktop computer with a WAMP software bundle (the Microsoft® Windows operating system, Apache HTTP Server, the MySQL relational database management system, and the PHP programming language) serves as the data science analytics engine for a dynamic web application/RESTful web service of the residential DSM, incorporating Artificial Intelligence (AI)/Computational Intelligence. An abstract computing machine, the Java Virtual Machine, enables the desktop computer to run Java programs, and a mash-up of Java, the R language, and Python is well suited and configured for the AI in this study.
Having the ability to send real-time push notifications to IoT clients, the desktop computer implements Google-maintained Firebase Cloud Messaging to engage IoT clients across Android/iOS devices and provide a mobile notification service for residential/industrial DSM. In this study, in order to realize edge intelligence, whereby edge devices avoid network latency and the much-needed connectivity of Internet connections for the Internet of Services, support secure access to data stores, and provide immediate analytical and real-time actionable insights at the edge of the network, we upgrade the designed and implemented smart sockets into embedded AI Arduino ones (called embedded AIduino). To realize edge analytics with the proposed embedded AIduino for data analytics, an Arduino Ethernet shield (WizNet W5100) with a micro SD card connector is adopted and used. The SD library is included for reading parsed data from, and writing parsed data to, an SD card. And an Artificial Neural Network library for Arduino MEGA, ArduinoANN, is imported and used for the locally-embedded AI implementation. The embedded AIduino in this study can be developed further for applications in manufacturing industry energy management and sustainable energy management; in sustainable energy management, for example, rotating machinery diagnostics can identify energy loss from gross misalignment and unbalance of rotating machines in power plants.
Keywords: demand-side management, edge intelligence, energy management system, fault detection and classification
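The tiny feedforward networks that a library like ArduinoANN runs on a microcontroller reduce to a handful of weighted sums and sigmoid activations. A minimal forward-pass sketch, shown here in Python for clarity (the weights are hand-set to solve XOR; training and all names are illustrative, not taken from ArduinoANN):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Hand-set weights implementing XOR: one "OR"-like hidden unit, one
# "AND"-like hidden unit; the output fires when OR is on and AND is off.
W_HID = [(10.0, 10.0, -5.0),    # OR-like unit: w0, w1, bias
         (10.0, 10.0, -15.0)]   # AND-like unit: w0, w1, bias
W_OUT = (10.0, -20.0, -5.0)     # output: w_or, w_and, bias

def predict(x0, x1):
    """2-2-1 sigmoid network forward pass."""
    h = [sigmoid(w0 * x0 + w1 * x1 + b) for (w0, w1, b) in W_HID]
    return sigmoid(W_OUT[0] * h[0] + W_OUT[1] * h[1] + W_OUT[2])
```

On an Arduino MEGA the same arithmetic fits comfortably in C arrays of `float`, which is what makes locally-embedded inference feasible at the socket.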
Procedia PDF Downloads 251
19937 Prime Mover Sizing for Base-Loaded Combined Heating and Power Systems
Authors: Djalal Boualili
Abstract:
This article considers the problem of sizing prime movers for combined heating and power (CHP) systems operating at full load to satisfy a fraction of a facility's electric load, i.e., a base load. Prime mover sizing is examined using three criteria: operational cost, carbon dioxide emissions (CDE), and primary energy consumption (PEC). The sizing process leads to ratios of the conversion factors applied to imported electricity to the conversion factors applied to the fuel consumed. These ratios are labelled R_Cost, R_CDE, and R_PEC, depending on whether the conversion factors are associated with operational cost, CDE, or PEC, respectively. Analytical results show that in order to achieve savings in operational cost, CDE, or PEC, the ratios must be larger than a unique constant R_Min that depends only on the efficiencies of the CHP components. Savings in operational cost, CDE, or PEC due to CHP operation are explicitly formulated using simple equations. This facilitates the process of comparing the tradeoffs of optimizing the savings of one criterion over the other two, a task that has traditionally been accomplished through computer simulations. A hospital building, located in Chlef, Algeria, was used as an example to apply the methodology presented in this article.
Keywords: sizing, heating and power, ratios, energy consumption, carbon dioxide emissions
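The ratio criterion can be made concrete with a simple displaced-boiler energy balance. The sketch below is an illustration under assumed definitions (electrical efficiency eta_e, heat-recovery efficiency eta_h, displaced-boiler efficiency eta_b; the function names are ours, not the paper's), and it reproduces the stated property that savings are positive exactly when the conversion-factor ratio exceeds a constant depending only on component efficiencies:

```python
def r_min(eta_e, eta_h, eta_b):
    """Break-even ratio of the electricity conversion factor to the
    fuel conversion factor, from component efficiencies alone."""
    return (1.0 - eta_h / eta_b) / eta_e

def chp_savings(fuel_in, eta_e, eta_h, eta_b, c_elec, c_fuel):
    """Savings (in cost, CDE, or PEC units, depending on the two
    conversion factors) from burning fuel_in units of fuel in the
    prime mover instead of importing electricity and firing a boiler."""
    electricity = eta_e * fuel_in          # displaced grid import
    heat = eta_h * fuel_in                 # displaced boiler output
    baseline = electricity * c_elec + (heat / eta_b) * c_fuel
    return baseline - fuel_in * c_fuel
```

With eta_e = 0.30, eta_h = 0.45, eta_b = 0.85, the break-even ratio is about 1.57: a conversion-factor ratio of 1.6 yields savings, 1.5 yields a loss.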
Procedia PDF Downloads 231
19936 Entropy Generation Analysis of Cylindrical Heat Pipe Using Nanofluid
Authors: Morteza Ghanbarpour, Rahmatollah Khodabandeh
Abstract:
In this study, the second law of thermodynamics is employed to evaluate heat pipe thermal performance. Specifically, the potential of nanofluids to decrease the entropy generation of cylindrical heat pipes is studied, and the results are compared with experimental data. Cylindrical copper heat pipes of 200 mm length and 6.35 mm outer diameter were fabricated and tested with distilled water and water-based Al2O3 nanofluids with volume concentrations of 1-5% as working fluids. Nanofluids are nanotechnology-based colloidal suspensions fabricated by suspending nanoparticles in a base liquid. These fluids have shown potential to enhance the heat transfer properties of the base liquids used in heat transfer applications. As the working fluid passes between the different states of the heat pipe cycle, entropy is generated. The different sources of irreversibility in the heat pipe thermodynamic cycle are investigated, and the effect of the nanofluid on each of these sources is studied. Both the experimental and theoretical studies reveal that nanofluid is a good choice for minimizing the entropy generation in the heat pipe thermodynamic cycle, which results in higher thermal performance and efficiency of the system.
Keywords: heat pipe, nanofluid, thermodynamics, entropy generation, thermal resistance
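At the coarsest level, the second-law figure of merit follows from treating the heat pipe as a conductor between the evaporator and condenser wall temperatures. A minimal sketch of that bookkeeping (an illustration, not the paper's detailed irreversibility breakdown; function names are ours):

```python
def entropy_generation(q, t_evap, t_cond):
    """Entropy generated (W/K) by transferring q watts from the
    evaporator wall at t_evap (K) to the condenser wall at t_cond (K)."""
    assert t_evap > t_cond > 0
    return q / t_cond - q / t_evap

def thermal_resistance(q, t_evap, t_cond):
    """Overall thermal resistance (K/W) of the heat pipe at load q."""
    return (t_evap - t_cond) / q
```

A nanofluid that lowers the thermal resistance narrows the temperature difference at a fixed heat load, and hence lowers the entropy generation, which is the mechanism the study quantifies.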
Procedia PDF Downloads 470
19935 A Formal Verification Approach for Linux Kernel Designing
Authors: Zi Wang, Xinlei He, Jianghua Lv, Yuqing Lan
Abstract:
The Kernel, though widely used, is complicated, and errors caused by bugs are often costly. Statistically, more than half of the mistakes occur in the design phase. Thus, we introduce a modeling method, KMVM (Linux Kernel Modeling and Verification Method), based on type theory, for the proper design and correct exploitation of the Kernel. In the model, the Kernel is separated into six levels: subsystem, dentry, file, struct, func, and base. Each level is treated as a type. The types are specified by their structure and relationships. At the same time, we use a demanding path to express the function to be implemented. The correctness of the design is verified by recursively checking the type relationships and type existence. The method has been applied to verify the OPEN operation of the VFS (virtual file system) in the Linux Kernel. We have also designed and developed a set of secure communication mechanisms in the Kernel with verification.
Keywords: formal approach, type theory, Linux Kernel, software program
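The recursive check over typed levels can be sketched as follows. This is a toy illustration of the idea only (the model contents, names, and the `verify_path` helper are hypothetical, not KMVM's actual notation): a demanding path is accepted when every typed node exists, the six levels descend in order, and each node is a declared child of its predecessor.

```python
# The six levels, in containment order.
LEVELS = ["subsystem", "dentry", "file", "struct", "func", "base"]

# Toy model: (name, level) -> declared children.
MODEL = {
    ("vfs", "subsystem"): [("open_dentry", "dentry")],
    ("open_dentry", "dentry"): [("file_obj", "file")],
    ("file_obj", "file"): [("file_operations", "struct")],
    ("file_operations", "struct"): [("vfs_open", "func")],
    ("vfs_open", "func"): [("base_ops", "base")],
    ("base_ops", "base"): [],
}

def verify_path(path):
    """Recursively check type existence and the parent/child
    relationship along a demanding path of (name, level) pairs."""
    for i, (name, level) in enumerate(path):
        if (name, level) not in MODEL or level not in LEVELS:
            return False                      # type does not exist
        if i > 0:
            prev = path[i - 1]
            if LEVELS.index(level) <= LEVELS.index(prev[1]):
                return False                  # levels must descend
            if (name, level) not in MODEL[prev]:
                return False                  # not a declared child
    return True
```

Reordering or inserting an undeclared node makes the same path fail, which is the sense in which the design check catches mistakes before implementation.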
Procedia PDF Downloads 137
19934 A Modular Solution for Large-Scale Critical Industrial Scheduling Problems with Coupling of Other Optimization Problems
Authors: Ajit Rai, Hamza Deroui, Blandine Vacher, Khwansiri Ninpan, Arthur Aumont, Francesco Vitillo, Robert Plana
Abstract:
Large-scale critical industrial scheduling problems are based on Resource-Constrained Project Scheduling Problems (RCPSP) that necessitate integration with other optimization problems (e.g., vehicle routing, supply chain, or unique industrial ones), thus requiring practical solutions (i.e., modular and computationally efficient, with feasible solutions). To the best of our knowledge, the current industrial state of the art does not address this holistic problem. We propose an original modular solution that answers the issues exhibited by the delivery of complex projects. With three interlinked entities (projects, tasks, resources), each having its own constraints, it uses a greedy heuristic with a dynamic cost function for each task and a situational assessment at each time step. It handles large-scale data and can be easily integrated with other optimization problems, already existing industrial tools, and unique constraints as required by the use case. The solution has been tested and validated by domain experts on three use cases: outage management in Nuclear Power Plants (NPPs), planning of a future NPP maintenance operation, and application in the defense industry to supply chain and factory relocation. In the first use case, the solution, in addition to the resources' availability and the tasks' logical relationships, also integrates several project-specific constraints for outage management, such as handling resource incompatibility, updating task priorities, pausing tasks in specific circumstances, and adjusting dynamic units of resources. With more than 20,000 tasks and multiple constraints, the solution provides a feasible schedule within 10-15 minutes on a standard computer. This time-effective simulation corresponds with the nature of the problem and the requirement to run several scenarios (30-40 simulations) before finalizing the schedules. 
The second use case is a factory relocation project where production lines must be moved to a new site while ensuring the continuity of their production. This generates the challenge of merging job shop scheduling and the RCPSP with location constraints. Our solution allows the automation of the production tasks while considering the expected production rate. The simulation algorithm manages the use and movement of resources and products to respect a given relocation scenario. The last use case establishes a future maintenance operation in an NPP. The project contains complex and hard constraints, such as a strict Finish-Start precedence relationship (i.e., successor tasks have to start immediately after their predecessors while respecting all constraints), shareable coactivity for managing workspaces, and the requirement that "cyclic" resources be in a specific state (they can have multiple possible states, with only one active at a time) to perform tasks (which can require unique combinations of several cyclic resources). Our solution satisfies the requirement of minimizing the state changes of cyclic resources coupled with makespan minimization. It offers a solution for 80 cyclic resources with 50 incompatibilities between levels in less than a minute. Conclusively, we propose a fast and feasible modular approach to various industrial scheduling problems, validated by domain experts and compatible with existing industrial tools. This approach can be further enhanced by the use of machine learning techniques on historically repeated tasks to gain further insights for delay risk mitigation measures.
Keywords: deterministic scheduling, optimization coupling, modular scheduling, RCPSP
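The core mechanism described above, a greedy heuristic that at each time step starts the best eligible task subject to precedence and resource capacity, can be sketched for a toy single-resource RCPSP. This is a minimal illustration under assumed data structures (one renewable resource, a static priority standing in for the dynamic cost function; all names are ours, not the industrial tool's):

```python
def greedy_schedule(tasks, capacity):
    """Serial greedy heuristic for a toy RCPSP (assumes the precedence
    graph is acyclic). tasks: name -> {"dur", "demand", "preds", "prio"}.
    At each time step, start every eligible task in priority order;
    when nothing can start, advance time to the next task finish."""
    start, finish = {}, {}
    t = 0
    while len(finish) < len(tasks):
        used = sum(tasks[n]["demand"] for n in start if finish[n] > t)
        progressed = False
        for name in sorted((n for n in tasks if n not in start),
                           key=lambda n: tasks[n]["prio"]):
            task = tasks[name]
            if all(p in finish and finish[p] <= t for p in task["preds"]) \
               and used + task["demand"] <= capacity:
                start[name] = t
                finish[name] = t + task["dur"]
                used += task["demand"]
                progressed = True
        if not progressed:
            # advance to the earliest finish of a running task
            t = min(f for f in finish.values() if f > t)
    return start, finish
```

The industrial version replaces the static `prio` with a cost function re-evaluated at each time step (the "situational assessment"), which is what lets it absorb project-specific constraints like pausing and resource incompatibility.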
Procedia PDF Downloads 198
19933 Adsorption and Corrosion Inhibition of New Synthesized Thiophene Schiff Base on Mild Steel in HCl Solution
Authors: H. Elmsellem, A. Aouniti, S. Radi, A. Chetouani, B. Hammouti
Abstract:
The synthesis of new organic molecules offers various molecular structures containing heteroatoms and substituents for corrosion protection in the acid pickling of metals. The most commonly synthesized compounds are nitrogen heterocyclic compounds, which are known to be excellent complex- or chelate-forming substances with metals. The choice of the inhibitor is based on two considerations: first, it can be synthesized conveniently from relatively cheap raw materials; second, it contains an electron cloud on the aromatic ring, or electronegative atoms such as nitrogen and oxygen in relatively long-chain compounds. In the present study, (NE)-2-methyl-N-(thiophen-2-ylmethylidene)aniline (T) was synthesized, and its inhibiting action on the corrosion of mild steel in 1 M hydrochloric acid was examined by different corrosion methods, such as weight loss, potentiodynamic polarization, and electrochemical impedance spectroscopy (EIS). The experimental results suggest that this compound is an efficient corrosion inhibitor and that the inhibition efficiency increases with increasing inhibitor concentration. Adsorption of this compound on the mild steel surface obeys Langmuir's isotherm. The correlation between quantum chemical calculations and the inhibition efficiency of the investigated compound is discussed using the Density Functional Theory (DFT) method.
Keywords: mild steel, Schiff base, inhibition, corrosion, HCl, quantum chemical
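The two data-analysis steps behind such a study are small enough to sketch: inhibition efficiency from weight-loss rates, and the standard linearized Langmuir test, in which C/θ plotted against C gives a slope near 1 and an intercept of 1/K_ads. A minimal sketch with illustrative numbers (not the paper's data):

```python
def inhibition_efficiency(w_blank, w_inhibited):
    """Percent inhibition efficiency from weight-loss corrosion rates
    without (w_blank) and with (w_inhibited) the inhibitor."""
    return 100.0 * (w_blank - w_inhibited) / w_blank

def langmuir_fit(conc, theta):
    """Least-squares slope and intercept of C/theta versus C.
    A slope close to 1 indicates Langmuir adsorption; the intercept
    estimates 1/K_ads."""
    y = [c / t for c, t in zip(conc, theta)]
    n = len(conc)
    mx, my = sum(conc) / n, sum(y) / n
    slope = (sum((c - mx) * (v - my) for c, v in zip(conc, y))
             / sum((c - mx) ** 2 for c in conc))
    return slope, my - slope * mx
```

Surface coverage θ is taken as the inhibition efficiency divided by 100, so the fit needs only the weight-loss series at several concentrations.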
Procedia PDF Downloads 332
19932 Multi Attribute Failure Mode Analysis of the Catering Systems: A Case Study of Sefako Makgatho Health Sciences University in South Africa
Authors: Mokoena Oratilwe Penwell, Seeletse Solly Matshonisa
Abstract:
The demand for quality products is a vital factor determining the success of a producing company, and the reality of this demand influences customer satisfaction. At Sefako Makgatho Health Sciences University (SMU), concerns over the quality of the food being sold have been raised mostly by students and staff, who are the primary consumers of the food sold by the cafeteria. Suspicions of food poisoning and occurrences of diarrhea related to food from the cafeteria, amongst others, have been raised. However, minimal measures have been taken to resolve the issue of food quality. New service providers have been appointed, and still the same trends are observed; the quality of the food seems to deteriorate continuously. This paper uses multi-attribute failure mode analysis (MAFMA) for failure detection and minimization on the machines used for food production by the SMU catering company before the food is sold to staff and students, so as to improve production plant reliability and performance. The Analytic Hierarchy Process (AHP) is used for the severity ranking of the weight criteria and the development of the hierarchical structure for the cafeteria company. Amongst other potential issues detected, maintenance of the machines and equipment used for food preparation was of concern. Also, the staff lacked sufficient hospitality skills, and supervision and management in the cafeteria needed greater attention to mitigate some of the failures occurring in the food production plant.
Keywords: MAFMA, food quality, maintenance, supervision
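The AHP weighting step used for the severity ranking reduces to deriving a priority vector from a pairwise-comparison matrix and checking its consistency. A minimal sketch using the geometric-mean approximation (an illustration of standard AHP, not the paper's actual matrices):

```python
from math import prod

def ahp_priorities(pairwise):
    """Priority vector by the geometric-mean (row) method, plus the
    consistency index CI = (lambda_max - n) / (n - 1); CI near 0
    means the pairwise judgments are consistent."""
    n = len(pairwise)
    gm = [prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gm)
    w = [g / total for g in gm]
    # estimate lambda_max from A.w
    aw = [sum(pairwise[i][j] * w[j] for j in range(n)) for i in range(n)]
    lam = sum(aw[i] / w[i] for i in range(n)) / n
    ci = (lam - n) / (n - 1)
    return w, ci
```

In practice the criteria (e.g., maintenance, hygiene, supervision) are compared pairwise on Saaty's 1-9 scale, and the resulting weights feed the multi-attribute failure mode ranking.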
Procedia PDF Downloads 135
19931 Enhancement of MIMO H₂S Gas Sweetening Separator Tower Using Fuzzy Logic Controller Array
Authors: Muhammad M. A. S. Mahmoud
Abstract:
The natural gas sweetening process is a controlled process that must be run at maximum efficiency and with the highest quality. In this work, due to the complexity and nonlinearity of the process, the H₂S gas separation and the intelligent fuzzy controller used to enhance the process are simulated in MATLAB/Simulink. A new design of fuzzy control for the gas separator is discussed in this paper. The design is based on the utilization of linear state estimation to generate the internal knowledge base that stores input-output pairs. The obtained input/output pairs are then used to design a feedback fuzzy controller. The proposed closed-loop fuzzy control system maintains the asymptotic stability of the system while enhancing its time response, achieving better control of the concentration of the gas output from the tower. Simulation studies are carried out to illustrate the performance of the gas separator system.
Keywords: gas separator, gas sweetening, intelligent controller, fuzzy control
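The fuzzy-controller machinery involved is compact: fuzzify an error with membership functions, fire a small rule base, and defuzzify. A minimal zero-order Sugeno sketch (the membership breakpoints, rule outputs, and names are illustrative assumptions, not the paper's design, which is built from state-estimation input-output pairs):

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b,
    falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Rule base: concentration error -> valve correction (singleton outputs).
RULES = [
    (lambda e: tri(e, -2.0, -1.0, 0.0), -0.5),  # error negative -> close
    (lambda e: tri(e, -1.0,  0.0, 1.0),  0.0),  # error near zero -> hold
    (lambda e: tri(e,  0.0,  1.0, 2.0), +0.5),  # error positive -> open
]

def fuzzy_correction(error):
    """Weighted-average defuzzification over the fired rules."""
    weights = [mu(error) for mu, _ in RULES]
    total = sum(weights)
    if total == 0.0:
        return 0.0
    return sum(w * out for w, (_, out) in zip(weights, RULES)) / total
```

The output interpolates smoothly between the rule consequents, which is what gives the closed loop its improved time response compared with a bang-bang scheme.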
Procedia PDF Downloads 471
19930 An Advanced Exponential Model for Seismic Isolators Having Hardening or Softening Behavior at Large Displacements
Authors: Nicolò Vaiana, Giorgio Serino
Abstract:
In this paper, an advanced Nonlinear Exponential Model (NEM) is presented that is able to simulate the uniaxial dynamic behavior of seismic isolators having a continuously decreasing tangent stiffness with increasing displacement in the relatively large displacements range, together with a hardening or softening behavior at large displacements. The mathematical model is validated by comparing the experimental force-displacement hysteresis loops obtained during cyclic tests, conducted on a helical wire rope isolator and a recycled rubber-fiber reinforced bearing, with those predicted analytically. Good agreement between the experimental and simulated results shows that the proposed model can be an effective numerical tool for predicting the force-displacement relationship of seismic isolation devices within the large displacements range. Compared to the widely used Bouc-Wen model, which is unable to simulate the response of seismic isolators at large displacements, the proposed one avoids the numerical solution of a first-order nonlinear ordinary differential equation at each time step of a nonlinear time history analysis, thus reducing the computational effort. Furthermore, the proposed model can simulate the smooth transition of the hysteresis loops from small to large displacements by adopting only one set of five parameters, determined from the experimental hysteresis loops having the largest amplitude.
Keywords: base isolation, hardening behavior, nonlinear exponential model, seismic isolators, softening behavior
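A backbone curve with the qualitative features described, tangent stiffness decaying exponentially from an initial value toward a lower asymptote, plus a term producing hardening or softening at large displacements, can be sketched as below. This is one plausible shape consistent with the abstract's description, not the paper's actual five-parameter formulation; all symbols are illustrative:

```python
import math

def backbone_force(x, ka, kb, alpha, beta=0.0):
    """Backbone force: tangent stiffness decays from ka (at the origin)
    to kb with characteristic displacement alpha; the cubic term gives
    hardening (beta > 0) or softening (beta < 0) at large x."""
    s = math.copysign(1.0, x)
    return (kb * x
            + s * (ka - kb) * alpha * (1.0 - math.exp(-abs(x) / alpha))
            + beta * x ** 3)

def tangent_stiffness(x, ka, kb, alpha, beta=0.0):
    """Analytical derivative of backbone_force with respect to x."""
    return kb + (ka - kb) * math.exp(-abs(x) / alpha) + 3.0 * beta * x ** 2
```

Because the force is an explicit function of displacement, evaluating it at each time step needs no ODE solve, which is the computational advantage claimed over Bouc-Wen.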
Procedia PDF Downloads 329
19929 Application of Medical Information System for Image-Based Second Opinion Consultations – Georgian Experience
Authors: Kldiashvili Ekaterina, Burduli Archil, Ghortlishvili Gocha
Abstract:
Introduction – The medical information system (MIS) is at the heart of information technology (IT) implementation policies in healthcare systems around the world. Different architectures and application models of MIS have been developed. Despite obvious advantages and benefits, the application of MIS in everyday practice is slow. Objective – Against the background of an analysis of the existing models of MIS in Georgia, a multi-user web-based approach has been created. This presentation will describe the architecture of the system and its application for image-based second opinion consultations. Methods – The MIS has been created with .Net technology and an SQL database architecture. It realizes local (intranet) and remote (internet) access to the system and management of the databases. The MIS is a fully operational approach, which is successfully used for medical data registration and management as well as for the creation, editing, and maintenance of electronic medical records (EMR). Five hundred Georgian-language electronic medical records from the cervical screening activity, illustrated by images, were selected for second opinion consultations. Results – The primary goal of the MIS is patient management. However, the system can be successfully applied for image-based second opinion consultations. Discussion – The ideal of healthcare in the information age must be to create a situation where healthcare professionals spend more time creating knowledge from medical information and less time managing medical information. The application of easily available and adaptable technology, together with improvement of the infrastructure conditions, is the basis for eHealth applications. Conclusion – The MIS is a promising and relevant technology solution. It can be successfully and effectively used for image-based second opinion consultations.
Keywords: digital images, medical information system, second opinion consultations, electronic medical record
Procedia PDF Downloads 450
19928 General Time-Dependent Sequenced Route Queries in Road Networks
Authors: Mohammad Hossein Ahmadi, Vahid Haghighatdoost
Abstract:
Spatial databases have been an active area of research for years. In this paper, we study how to answer General Time-Dependent Sequenced Route queries. Given a user's origin and destination over a time-dependent road network graph, an ordered list of categories of interest, and a departure time interval, our goal is to find the minimum travel time path, along with the best departure time, that minimizes the total travel time from the source location to the given destination, passing through a sequence of points of interest belonging to each of the specified categories of interest. The challenge of this problem is the added complexity over optimal sequenced route queries: first, the road network is time-dependent, and second, the user defines a departure time interval instead of one single departure time instance. For processing general time-dependent sequenced route queries, we propose two solutions, the Discrete-Time and Continuous-Time Sequenced Route approaches, finding approximate and exact solutions, respectively. Our proposed approaches traverse the road network based on the A*-search paradigm, equipped with an efficient heuristic function for shrinking the search space. Extensive experiments are conducted to verify the efficiency of our proposed approaches.
Keywords: trip planning, time dependent, sequenced route query, road networks
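The search-space structure behind such queries is a product of the road graph with a "categories visited so far" counter. A minimal label-setting sketch for a single fixed departure time, with a zero heuristic instead of the paper's A* heuristic and assuming FIFO time-dependent edge costs (all names and the simplifications are ours):

```python
import heapq

def td_sequenced_route(graph, source, dest, categories, poi, depart):
    """Search over (node, next-category-index) states. graph[u] is a
    list of (v, travel_time_fn), where travel_time_fn(t) gives the
    time-dependent cost of leaving u at time t. poi maps a node to
    its category label. Returns the minimum total travel time."""
    best = {}
    pq = [(depart, source, 0)]     # (arrival time, node, categories done)
    while pq:
        t, u, k = heapq.heappop(pq)
        if k < len(categories) and poi.get(u) == categories[k]:
            k += 1                 # visit this point of interest
        if u == dest and k == len(categories):
            return t - depart      # total travel time
        if best.get((u, k), float("inf")) <= t:
            continue
        best[(u, k)] = t
        for v, w in graph.get(u, []):
            heapq.heappush(pq, (t + w(t), v, k))
    return None
```

Sweeping `depart` over a discretized departure interval and keeping the best result is essentially the Discrete-Time approximation; the exact Continuous-Time approach instead propagates arrival-time functions rather than scalars.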
Procedia PDF Downloads 321
19927 Evaluation of JCI Accreditation for Medical Technology in Saudi Arabian Hospitals: A Study Case of PSMMC
Authors: Hamad Albadr
Abstract:
The Joint Commission International (JCI) accreditation process is intended to improve the safety and quality of care in the international community through the provision of education, publications, consultation, and evaluation services. These standards apply to the entire organization as well as to each department, unit, or service within the organization. Medical technology, which comprises both medical equipment and devices, is an essential part of health care. Appropriate management of equipment maintenance ensures that medical technology is safe, that equipment life is maximized, and that total costs are minimized. JCI medical technology evaluation and accreditation use standards, intents, and measurable elements. This paper focuses on the evaluation of the JCI standards for medical technology in Saudi Arabian hospitals, with a case study of PSMMC. The standards define the performance expectations, structures, or functions that must be in place for a hospital to be accredited by JCI, through measurable elements that are scored during the survey process. The requirements for full compliance appear especially in the Facility Management and Safety (FMS) section, which requires that the hospital establish and implement a program for inspecting, testing, and maintaining medical technology and for documenting the results. To ensure that medical technology is available for use and functioning properly, the hospital performs and documents: an inventory of medical technology; regular inspections of medical technology; testing of medical technology according to its use and the manufacturers' requirements; and performance of preventive maintenance.
Keywords: joint commission international (JCI) accreditation, medical technology, Saudi Arabia, Saudi Arabian hospitals
Procedia PDF Downloads 557
19926 Discrete-Event Modeling and Simulation Methodologies: Past, Present and Future
Authors: Gabriel Wainer
Abstract:
Modeling and Simulation (M&S) methods have been used to better analyze the behavior of complex physical systems, and it is now common to use simulation as a part of the scientific and technological discovery process. M&S advanced thanks to improvements in computer technology, which, in many cases, resulted in the development of simulation software using ad-hoc techniques. Formal M&S appeared in order to improve the development of very complex simulation systems. Some of these techniques proved successful in providing a sound base for the development of discrete-event simulation models, improving the ease of model definition and enhancing application development tasks, reducing costs and favoring reuse. The DEVS formalism is one of these techniques, and it has proved successful in providing means for modeling while reducing development complexity and costs. DEVS model development is based on a sound theoretical framework. The independence of M&S tasks makes it possible to run DEVS models on different environments (personal computers, parallel computers, real-time equipment, and distributed simulators) and middleware. We will present a historical perspective of discrete-event M&S methodologies, showing different modeling techniques. We will introduce the origins and general ideas of DEVS and compare it with some of these techniques. We will then show the current status of DEVS M&S, and we will discuss a technological perspective on solving current M&S problems (including real-time simulation, interoperability, and model-centered development techniques). We will show some examples of the current use of DEVS, including applications in different fields. 
We will finally present current open topics in the area, which include advanced methods for centralized, parallel, or distributed simulation, the need for real-time modeling techniques, and our views on these fields.
Keywords: modeling and simulation, discrete-event simulation, hybrid systems modeling, parallel and distributed simulation
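The DEVS structure referred to above is concrete enough to sketch: an atomic model is defined by a time-advance function ta, an output function (lambda), and internal and external transition functions (delta_int, delta_ext), driven by a root coordinator. A toy sketch in Python (one atomic model and a flat coordinator; real DEVS tools handle coupled models, ports, and bags of events):

```python
class Counter:
    """Minimal DEVS atomic model: counts external events and emits
    the tally every `period` time units."""
    def __init__(self, period):
        self.period = period
        self.count = 0
        self.sigma = period            # time remaining to next internal event

    def ta(self):                      # time-advance function
        return self.sigma

    def output(self):                  # lambda: output just before delta_int
        return self.count

    def delta_int(self):               # internal transition: reset the cycle
        self.count = 0
        self.sigma = self.period

    def delta_ext(self, elapsed):      # external transition: count the event,
        self.count += 1                # resume the interrupted wait
        self.sigma -= elapsed


def simulate(model, events, until):
    """Flat root coordinator: interleave sorted external event times
    with the model's internal events; return (time, output) pairs."""
    last, out, i = 0.0, [], 0
    while True:
        t_int = last + model.ta()
        t_ext = events[i] if i < len(events) else float("inf")
        if min(t_int, t_ext) > until:
            return out
        if t_ext < t_int:
            model.delta_ext(t_ext - last)
            last = t_ext
            i += 1
        else:
            out.append((t_int, model.output()))
            model.delta_int()
            last = t_int
```

The separation visible here, model behavior in the class, simulation mechanics in the coordinator, is exactly the independence of M&S tasks that lets the same DEVS model run on sequential, parallel, or distributed simulators.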
Procedia PDF Downloads 323
19925 Finite Element Modeling and Nonlinear Analysis for Seismic Assessment of Off-Diagonal Steel Braced RC Frame
Authors: Keyvan Ramin
Abstract:
The geometric nonlinearity of the Off-Diagonal Bracing System (ODBS) can serve as a complementary mechanism, covering and extending the material nonlinearity of reinforced concrete. In the initial phase, finite element models are built for the flexural frame, the X-braced frame, and the ODBS-braced frame system. The different models are then investigated through various analyses and verified against the experimental results for the flexural and X-braced frames. Analytical assessments are performed with three-dimensional finite element models. Nonlinear static (pushover) analysis is carried out to obtain the performance level and seismic behavior, and the response modification factors are then calculated from each model's pushover curve. In the next phase, the cracks observed in the finite element models, especially in the RC members of all three systems, are evaluated; the finite element assessment of the cracks generated in the ODBS-braced frame is performed for various time steps. Nonlinear dynamic time history analyses are performed on models with different numbers of stories for three records: the El Centro, Naghan, and Tabas earthquake accelerograms. After scaling the accelerograms, dynamic analysis is performed on the flexural frame, the X-braced frame, and the ODBS-braced frame, one by one. A base point on the RC frame is selected to investigate the corresponding displacement under each record. Hysteresis curves are assessed throughout the study, and the equivalent viscous damping of the ODBS system is estimated according to the references. The results in each section show that the ODBS system has acceptable seismic behavior, and the conclusions converge when the ODBS system is used in a reinforced concrete frame.
Keywords: FEM, seismic behaviour, pushover analysis, geometric nonlinearity, time history analysis, equivalent viscous damping, passive control, crack investigation, hysteresis curve
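The equivalent viscous damping estimate mentioned above is conventionally taken as the energy dissipated per hysteresis loop over 4*pi times the peak elastic strain energy. A minimal sketch of that post-processing step on a digitized force-displacement loop (an illustration of the standard formula, with our own function names; not the paper's specific procedure):

```python
import math

def loop_area(points):
    """Shoelace area of a closed force-displacement loop, given as
    (displacement, force) vertices in traversal order: the energy
    dissipated per cycle."""
    area = 0.0
    n = len(points)
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0

def equivalent_viscous_damping(points):
    """xi_eq = E_D / (4 * pi * E_S), with E_S = 0.5 * F_max * d_max
    the peak elastic strain energy of the cycle."""
    e_d = loop_area(points)
    d_max = max(abs(x) for x, _ in points)
    f_max = max(abs(f) for _, f in points)
    return e_d / (4.0 * math.pi * 0.5 * f_max * d_max)
```

Applying this to the loops recorded under each scaled accelerogram yields a per-record damping ratio, which is how the fatter ODBS loops translate into a higher equivalent damping.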
Procedia PDF Downloads 378
19924 Iranian Processed Cheese under Effect of Emulsifier Salts and Cooking Time in Process
Authors: M. Dezyani, R. Ezzati bbelvirdi, M. Shakerian, H. Mirzaei
Abstract:
Sodium hexametaphosphate (SHMP) is commonly used as an emulsifying salt (ES) in process cheese, although rarely as the sole ES. It appears that no published studies exist on the effect of SHMP concentration on the properties of process cheese when the pH is kept constant; pH is well known to affect process cheese functionality. The detailed interactions between the added phosphate, casein (CN), and indigenous Ca phosphate are poorly understood. We studied the effect of the concentration of SHMP (0.25-2.75%) and holding time (0-20 min) on the textural and rheological properties of pasteurized process Cheddar cheese using a central composite rotatable design. All cheeses were adjusted to pH 5.6. The meltability of the process cheese (as indicated by the decrease in the loss tangent parameter from small amplitude oscillatory rheology, the degree of flow, and the melt area from the Schreiber test) decreased with an increase in the concentration of SHMP. Holding time also led to a slight reduction in meltability. The hardness of the process cheese increased as the concentration of SHMP increased. Acid-base titration curves indicated that the buffering peak at pH 4.8, which is attributable to residual colloidal Ca phosphate, was shifted to lower pH values with increasing concentration of SHMP. The insoluble Ca and the total and insoluble P contents increased as the concentration of SHMP increased. The proportion of insoluble P as a percentage of total (indigenous and added) P decreased with an increase in ES concentration because some of the (added) SHMP formed soluble salts. The results of this study suggest that SHMP chelated the residual colloidal Ca phosphate and dispersed CN; the newly formed Ca-phosphate complex remained trapped within the process cheese matrix, probably by cross-linking CN. 
Increasing the concentration of SHMP helped to improve fat emulsification and CN dispersion during cooking, both of which probably helped to reinforce the structure of the process cheese.
Keywords: Iranian processed cheese, emulsifying salt, rheology, texture
Procedia PDF Downloads 431