Search results for: SERVQUAL methodology
5026 Generation of Automated Alarms for Plantwide Process Monitoring
Authors: Hyun-Woo Cho
Abstract:
Early detection of incipient abnormal operations is essential in plant-wide process management in order to improve product quality and process safety. Generating warning signals or alarms for operating personnel plays an important role in process automation and intelligent plant health monitoring. Various methodologies have been developed and utilized in this area, such as expert systems, mathematical model-based approaches, and multivariate statistical approaches. This work presents a nonlinear empirical monitoring methodology based on the real-time analysis of massive process data. Such big data include measurement noise and unwanted variations unrelated to true process behavior, so these unnecessary patterns are removed in a data pre-processing step to enhance detection speed and accuracy. The performance of the methodology was demonstrated using simulated process data. The case study showed that detection speed and performance were improved significantly irrespective of the size and location of abnormal events.
Keywords: detection, monitoring, process data, noise
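For illustration, a minimal sketch of the multivariate statistical monitoring family cited above (the paper's own nonlinear method is not specified in the abstract): a PCA model fitted on normal operating data with Hotelling T² and SPE alarm statistics. Variable names and the choice of statistics are assumptions, not the authors' implementation.

```python
import numpy as np

def fit_pca_monitor(X_normal, n_comp=3):
    """Fit a PCA monitoring model on normal operating data (rows = samples)."""
    mu, sigma = X_normal.mean(0), X_normal.std(0)
    Z = (X_normal - mu) / sigma                      # remove unit differences
    _, S, Vt = np.linalg.svd(Z, full_matrices=False)
    P = Vt[:n_comp].T                                # retained loading vectors
    lam = (S[:n_comp] ** 2) / (len(Z) - 1)           # retained component variances
    return mu, sigma, P, lam

def t2_spe(x, mu, sigma, P, lam):
    """Hotelling T^2 and squared prediction error for one new sample."""
    z = (x - mu) / sigma
    t = P.T @ z
    t2 = float(t @ (t / lam))                        # variation inside the model plane
    spe = float(((z - P @ t) ** 2).sum())            # residual variation outside the plane
    return t2, spe                                   # alarm when either exceeds its control limit
```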
Procedia PDF Downloads 252
5025 Development of National Scale Hydropower Resource Assessment Scheme Using SWAT and Geospatial Techniques
Authors: Rowane May A. Fesalbon, Greyland C. Agno, Jodel L. Cuasay, Dindo A. Malonzo, Ma. Rosario Concepcion O. Ang
Abstract:
The Department of Energy of the Republic of the Philippines estimates that the country’s energy reserves for 2015 are dwindling, as observed in the rotating power outages in several localities. To help address the energy crisis, a national hydropower resource assessment scheme is developed. Hydropower is derived from flowing water and a difference in elevation; it is a renewable energy resource deemed abundant in the Philippines, an archipelagic country rich in bodies of water and water resources. The objective of this study is to develop a methodology for a national hydropower resource assessment using hydrologic modeling and geospatial techniques in order to generate resource maps for future reference and use by the government and other stakeholders. The methodology is focused on two models: the implementation of the Soil and Water Assessment Tool (SWAT) for the river discharge, and the use of geospatial techniques to analyze the topography, obtain the head, and generate the theoretical hydropower potential sites. The methodology is tightly coupled with Geographic Information Systems to maximize the use of geodatabases and the spatial significance of the determined sites. The hydrologic model used in this workflow is SWAT integrated in the GIS software ArcGIS. The head is determined by a developed algorithm that utilizes a Synthetic Aperture Radar (SAR)-derived digital elevation model (DEM) with a resolution of 10 meters. The initial results of the developed workflow indicate theoretical hydropower potential in the river reaches ranging from pico (less than 5 kW) to mini (1-3 MW).
Keywords: ArcSWAT, renewable energy, hydrologic model, hydropower, GIS
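The theoretical potential named above follows the standard hydropower relation P = ρgQH, combining the SWAT discharge Q with the DEM-derived head H. A minimal sketch (the efficiency factor and any thresholds other than the pico and mini labels quoted in the abstract are assumptions):

```python
RHO, G = 1000.0, 9.81            # water density (kg/m^3), gravitational acceleration (m/s^2)

def theoretical_power_kw(discharge_m3s, head_m, efficiency=1.0):
    """Theoretical hydropower potential P = rho * g * Q * H, returned in kW."""
    return RHO * G * discharge_m3s * head_m * efficiency / 1000.0

def classify_site(p_kw):
    """Size classes; only the pico (<5 kW) and mini (1-3 MW) labels come from the abstract."""
    if p_kw < 5:
        return "pico"
    if p_kw < 1000:
        return "micro/mini (lower range)"
    if p_kw <= 3000:
        return "mini"
    return "small or larger"

q, h = 2.5, 40.0                                  # example reach: 2.5 m^3/s discharge, 40 m head
print(classify_site(theoretical_power_kw(q, h)))  # ~981 kW of theoretical potential
```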
Procedia PDF Downloads 313
5024 Investment Projects Selection Problem under Hesitant Fuzzy Environment
Authors: Irina Khutsishvili
Abstract:
In the present research, a decision support methodology for the multi-attribute group decision-making (MAGDM) problem is developed, namely for the selection of investment projects. The objective of the investment project selection problem is to choose the best project among the set of projects seeking investment, or to rank all projects in descending order. The project selection is made considering a set of weighted attributes. To evaluate the attributes in our approach, expert assessments are used. In the proposed methodology, lingual expressions (linguistic terms) given by all experts are used as initial attribute evaluations, since they are the most natural and convenient representation of experts' evaluations. Then lingual evaluations are converted into trapezoidal fuzzy numbers, and the aggregate trapezoidal hesitant fuzzy decision matrix is built. The case is considered when information on the attribute weights is completely unknown. The attribute weights are identified based on the De Luca and Termini information entropy concept, determined in the context of hesitant fuzzy sets. The decisions are made using the extended Technique for Order Performance by Similarity to Ideal Solution (TOPSIS) method under a hesitant fuzzy environment. Hence, the methodology is based on a trapezoidal valued hesitant fuzzy TOPSIS decision-making model with entropy weights. The ranking of alternatives is performed by the proximity of their distances to both the fuzzy positive-ideal solution (FPIS) and the fuzzy negative-ideal solution (FNIS). For this purpose, the weighted hesitant Hamming distance is used. An example of investment decision-making is shown that clearly explains the procedure of the proposed methodology.
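A simplified, crisp-number sketch of the entropy-weight and TOPSIS ranking steps described above. The hesitant trapezoidal fuzzy arithmetic and the weighted hesitant Hamming distance used in the paper are replaced here by ordinary numbers and Euclidean distances, and the project data are invented:

```python
import numpy as np

def entropy_weights(X):
    """Shannon-entropy attribute weights (columns = attributes), crisp version."""
    P = X / X.sum(0)
    E = -(P * np.log(P + 1e-12)).sum(0) / np.log(len(X))
    d = 1.0 - E                          # degree of divergence per attribute
    return d / d.sum()

def topsis_rank(X, w, benefit=None):
    """Closeness coefficient to the ideal solution for each alternative (row)."""
    benefit = np.ones(X.shape[1], bool) if benefit is None else benefit
    V = w * X / np.linalg.norm(X, axis=0)          # weighted normalized matrix
    pis = np.where(benefit, V.max(0), V.min(0))    # positive-ideal solution
    nis = np.where(benefit, V.min(0), V.max(0))    # negative-ideal solution
    d_pos = np.linalg.norm(V - pis, axis=1)
    d_neg = np.linalg.norm(V - nis, axis=1)
    return d_neg / (d_pos + d_neg)                 # higher = closer to the ideal project

X = np.array([[7, 5, 9.], [6, 8, 7.], [9, 4, 6.]])   # 3 projects x 3 attributes (toy data)
scores = topsis_rank(X, entropy_weights(X))
print(np.argsort(-scores))                           # ranking of projects, best first
```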
Procedia PDF Downloads 1175023 Barriers and Enablers to Public Innovation in the Central Region of Colombia: A Characterization from Measurement through the Item Response Methodology and Comparative Analysis
Authors: Yessenia Parrado, Ana Barbosa, Daniela Mahe, Sebastian Toro, Jhon Garcia
Abstract:
The purpose of this work is to present the identification and characterization of the barriers and enablers to public innovation in the Central Region of Colombia, using a mixed methodology, in research carried out in 2020 by the Laboratory of Innovation, Creativity and New Technologies of the National University of Colombia in alliance with the National Planning Department. Based on the research, the index of barriers to regional and departmental public innovation was built, which reflects how difficult it is for territorial entities to overcome the barriers present across three dimensions: organizational structure of the entity, generation of public value, and governance processes. The index was built using the item response methodology and multiple correspondence analysis, applied to an institutional information form for public entities and a perception form for public servants. This investigation involved 36 entities and 1,038 public servants from the departments of Huila, Meta, Boyacá, Cundinamarca, Tolima, and the Capital District. In this exercise, it was identified that the departmental indices range between 13 and 44 and that the regional index was 30 out of 100. From the analysis of the information, it was possible to establish that the main barriers are the lack of specialized agencies for public innovation exercises, the lack of qualified personnel and work methodologies for public innovation, inadequate information management, the lack of feedback on learning between governmental and non-governmental entities, the inability of initiatives to generate binding participation mechanisms, and the lack of qualification of citizens to participate in these processes.
Keywords: item response, public innovation, quantitative analysis, comparative analysis
Procedia PDF Downloads 125
5022 Advanced Mechatronic Design of Robot Manipulator Using Hardware-In-The-Loop Simulation
Authors: Reza Karami, Ali Akbar Ebrahimi
Abstract:
This paper discusses concurrent engineering of robot manipulators, based on the Holistic Concurrent Design (HCD) methodology and by using a hardware-in-the-loop simulation platform. The methodology allows for considering numerous design variables with different natures concurrently. It redefines the ultimate goal of design based on the notion of satisfaction, resulting in the simplification of the multi-objective constrained optimization process. It also formalizes the effect of designer’s subjective attitude in the process. To enhance modeling efficiency for both computation and accuracy, a hardware-in-the-loop simulation platform is used, which involves physical joint modules and the control unit in addition to the software modules. This platform is implemented in the HCD design architecture to reliably evaluate the design attributes and performance super criterion during the design process. The resulting overall architecture is applied to redesigning kinematic, dynamic and control parameters of an industrial robot manipulator.Keywords: concurrent engineering, hardware-in-the-loop simulation, robot manipulator, multidisciplinary systems, mechatronics
Procedia PDF Downloads 454
5021 Seismic Performance of Isolated Bridge Configurations with Soil Structure Interaction
Authors: Davide Forcellini
Abstract:
The most recent developments in earthquake engineering are based on design concepts that prescribe performance rather than on the more traditional prescriptive approaches. The paper aims to assess the effects of isolation devices and soil-structure interaction on a benchmark bridge, adopting a Performance-Based Earthquake Engineering methodology. Several isolated configurations of abutment and pier connections, employing the most representative isolation devices, are compared. The suitability of isolation systems depends on many factors, mainly connected with ground effects. In this regard, the second purpose of this paper is to assess the effects of soil-structure interaction (SSI) on the studied bridge configurations. The contributions of the isolation technique and of soil-structure interaction are assessed by evaluating the effects at increasing Peak Ground Acceleration (PGA) levels in terms of repair cost and repair time quantities.
Keywords: base isolation, bridge, earthquake engineering, non-linearity, PBEE methodology, seismic assessment, soil structure interaction
Procedia PDF Downloads 430
5020 Degumming of Eri Silk Fabric with Ionic Liquid
Authors: Shweta K. Vyas, Rakesh Musale, Sanjeev R. Shukla
Abstract:
Eri silk is a non-mulberry silk that is obtained without killing the silkworms, and hence it is also known as Ahimsa silk. In the present study, the results of degumming eri silk with alkaline peroxide have been compared with those obtained by using the ionic liquid (IL) 1-butyl-3-methylimidazolium chloride [BMIM]Cl. Experiments were designed to find the optimum processing parameters for degumming of eri silk by response surface methodology. The statistical software Design-Expert 6.0 was used for regression analysis and graphical analysis of the responses obtained by running the set of designed experiments. Analysis of variance (ANOVA) was used to estimate the statistical parameters. A polynomial equation of quadratic order was employed to fit the experimental data. The quality of the model and its terms were evaluated by the F-test. Three-dimensional surface plots were prepared to study the effect of the variables on the different responses. The optimum conditions for the IL treatment were selected from the predicted combinations, and the experiments were repeated under these conditions to determine reproducibility.
Keywords: silk degumming, ionic liquid, response surface methodology, ANOVA
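The response surface fit mentioned above is a second-order polynomial in the design factors. A minimal numpy sketch of fitting such a quadratic surface by least squares for two coded factors (the factor meanings, runs, and response values below are invented, not the study's data):

```python
import numpy as np

def quadratic_design_matrix(x1, x2):
    """Second-order RSM model: y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2."""
    return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])

# Toy coded factor levels (e.g., IL concentration and temperature) and a measured response
x1 = np.array([-1, -1,  1,  1, 0, 0, 0, -1.41, 1.41])
x2 = np.array([-1,  1, -1,  1, 0, 0, 0,  0.00, 0.00])
y  = np.array([62, 70, 68, 79, 75, 76, 74, 60, 72.0])   # e.g., degumming weight loss (%)

X = quadratic_design_matrix(x1, x2)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ beta
r2 = 1 - ((y - y_hat) ** 2).sum() / ((y - y.mean()) ** 2).sum()
print(np.round(beta, 2), round(r2, 3))   # fitted coefficients and R^2 of the quadratic fit
```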
Procedia PDF Downloads 593
5019 Model-Based Process Development for the Comparison of a Radial Riveting and Roller Burnishing Process in Mechanical Joining Technology
Authors: Tobias Beyer, Christoph Friedrich
Abstract:
Modern simulation methodology using finite element models is a recognized tool for product design and optimization. Likewise, manufacturing process design is increasingly becoming a focus of simulation methodology, here too in order to obtain reliable results from fewer real-life tests. In this article, two process simulations used for the mechanical joining of components, radial riveting and roller burnishing, are explained. In the first step, the required boundary conditions are developed and implemented in the respective simulation models. This is followed by process space validation. With the help of the validated models, the interdependencies of the input parameters are investigated and evaluated by means of sensitivity analyses. Limit case investigations are carried out and evaluated with the aid of the process simulations. Likewise, a comparison of the two joining methods with each other becomes possible.
Keywords: FEM, model-based process development, process simulation, radial riveting, roller burnishing, sensitivity analysis
Procedia PDF Downloads 108
5018 The Cardiac Diagnostic Prediction Applied to a Designed Holter
Authors: Leonardo Juan Ramírez López, Javier Oswaldo Rodriguez Velasquez
Abstract:
We have designed a Holter that measures the heart's activity for over 24 hours, implemented a prediction methodology, and generated alarms as well as indicators for patients and treating physicians. Various diagnostic advances have been developed in clinical cardiology thanks to Holter implementation; however, their interpretation has largely been conditioned by clinical analysis and by measurements adjusted to diverse population characteristics, thus turning it into a subjective examination. Such an approach requires vast population studies for validation, which, in turn, have not achieved the ultimate goal: mortality prediction. Given this context, our Insight Research Group developed a mathematical methodology that assesses cardiac dynamics through entropy and probability, creating a numerical and geometrical attractor which allows quantifying the normalcy of chronic and acute disease as well as the evolution between such states, and our Tigum Research Group developed a Holter device with 12 channels and advanced computer software. This has been shown in different contexts with 100% sensitivity and specificity results.
Keywords: attractor, cardiac, entropy, Holter, mathematical, prediction
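The entropy-and-probability assessment mentioned above can be illustrated, in a much simplified form, by estimating the Shannon entropy of the distribution of RR intervals recorded by the Holter. The binning scheme and the synthetic data below are assumptions for illustration, not the authors' attractor construction:

```python
import numpy as np

def shannon_entropy_from_rr(rr_ms, bin_ms=50):
    """Shannon entropy of the probability distribution of RR intervals over fixed-width bins."""
    edges = np.arange(rr_ms.min(), rr_ms.max() + bin_ms, bin_ms)
    counts, _ = np.histogram(rr_ms, bins=edges)
    p = counts[counts > 0] / counts.sum()          # empirical bin probabilities
    return float(-(p * np.log(p)).sum())           # entropy in nats

rr = np.random.default_rng(0).normal(820, 45, size=5000)   # synthetic 24-h RR-interval series (ms)
print(round(shannon_entropy_from_rr(rr), 3))
```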
Procedia PDF Downloads 169
5017 Multidimensional Inequality and Deprivation Among Tribal Communities of Andhra Pradesh, India
Authors: Sanjay Sinha, Mohd Umair Khan
Abstract:
The level of income inequality in India has been worrisome; the World Inequality Report termed India a “poor and unequal country, with an affluent elite”. As important as income is for understanding inequality and deprivation, it is just one dimension. The historical roots and current realities of inequality and deprivation in India lie in many non-income dimensions, such as housing, nutrition, education, agency, and sense of inclusion, which are often ignored, especially in solution-oriented research. The level of inequality and deprivation among tribal communities is one such case. There is a corpus of literature establishing that the tribal communities in India are disadvantaged on various grounds. Given their rural geography, issues of access to and quality of basic facilities such as education and healthcare are often unaddressed. COVID-19 has further exacerbated this challenge, and climate change will make it even more worrying. With this background, a succinct measurement tool at the village level is necessary to design short- to medium-term actions with reference to risk mitigation for tribal communities. This research paper examines the level of inequality and deprivation among the tribal communities in the rural areas of the Andhra Pradesh state of India using a Multidimensional Inequality and Deprivation Index based on the Alkire-Foster methodology. The methodology is theoretically grounded in the capability approach propounded by Amartya Sen, emphasizing the achievement of the “beings and doings” (functionings) an individual has reason to value. The index comprises five domains (Livelihood, Food Security, Education, Health, and Housing), and these domains are divided into sixteen indicators. This assessment is followed by domain-wise short-term and long-term solutions.
Keywords: Andhra Pradesh, Alkire-Foster methodology, deprivation, inequality, multidimensionality, poverty, tribal
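The Alkire-Foster methodology named above aggregates indicator-level deprivations into an adjusted headcount ratio M0 = H x A using a dual cutoff. A minimal sketch (the domain weights, the poverty cutoff k, and the toy data are illustrative assumptions, not the paper's choices):

```python
import numpy as np

def alkire_foster_m0(deprivation, weights, k=1/3):
    """Adjusted headcount M0 = H * A for a 0/1 deprivation matrix (rows = households)."""
    w = np.asarray(weights, float) / np.sum(weights)
    score = deprivation @ w                        # weighted deprivation score per household
    poor = score >= k                              # dual cutoff: multidimensionally poor
    H = poor.mean()                                # headcount ratio
    A = score[poor].mean() if poor.any() else 0.0  # average intensity among the poor
    return H * A, H, A

# Toy data: 4 households x 5 domains (1 = deprived), equal domain weights
D = np.array([[1, 1, 0, 1, 0],
              [0, 0, 0, 1, 0],
              [1, 1, 1, 1, 1],
              [0, 1, 0, 0, 0]])
print(alkire_foster_m0(D, weights=[1, 1, 1, 1, 1]))   # (M0, H, A) = (0.4, 0.5, 0.8)
```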
Procedia PDF Downloads 159
5016 Effective Energy Saving of a Large Building through Multiple Approaches
Authors: Choo Hong Ang
Abstract:
The most popular approach to saving energy in large commercial buildings in Malaysia is to replace the existing high kW/ton chiller plant with one of lower kW/ton. This approach, however, entails a large capital outlay with a long payback period of up to 7 years. This paper shows that by using multiple approaches, other than replacing the existing chiller plant, an energy saving of up to 20% is possible. The main methodology adopted was to identify and then plug all heat ingress paths into the building, including putting up glass structures to prevent mixing of internal air-conditioned air with the ambient environment and replacing air curtains with glass doors. This alone could save up to 10% of the energy bill. Another measure was to convert the fixed-speed motors of air handling units (AHUs) to variable speed drives (VSDs) and to change escalators to the motion-sensor type. Other measures included reducing heat load by blocking air supply to unoccupied parcels, rescheduling chiller plant operation, changing fluorescent lights to LED lights, and converting from tariff B to tariff C1. A case example of Komtar, the tallest building in Penang, is given here. The total energy bill for Komtar was USD 2,303,341 in 2016 but was reduced to USD 1,842,927.39 in 2018, a significant saving of USD 460,413.86 or 20%. In terms of kWh, there was a reduction from 18,302,204 kWh in 2016 to 14,877,105 kWh in 2018, a reduction of 3,425,099 kWh or 18.71%. The measures used were relatively low cost, and the payback period was merely 24 months. With this achievement, the Komtar building was awarded champion of the Malaysian National Energy Award 2019 and second runner-up of the ASEAN Energy Award. This experience shows that a strong commitment to energy saving is the key to effective results.
Keywords: chiller plant, energy saving measures, heat ingress, large building
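A small sketch reproducing the percentage savings quoted above from the annual figures given in the abstract:

```python
def pct_saving(before, after):
    """Percentage reduction between two annual figures."""
    return (before - after) / before * 100

bill_2016, bill_2018 = 2_303_341.00, 1_842_927.39     # USD, from the abstract
kwh_2016, kwh_2018 = 18_302_204.0, 14_877_105.0       # kWh, from the abstract

print(f"bill saving:   {pct_saving(bill_2016, bill_2018):.1f}%")   # ~20.0%
print(f"energy saving: {pct_saving(kwh_2016, kwh_2018):.2f}%")     # ~18.71%
```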
Procedia PDF Downloads 105
5015 The Phenomena of False Cognates and Deceptive Cognates: Issues to Foreign Language Learning and Teaching Methodology Based on Set Theory
Authors: Marilei Amadeu Sabino
Abstract:
The aim of this study is to establish differences between the terms ‘false cognates’, ‘false friends’ and ‘deceptive cognates’, usually considered to be synonyms. It will be shown they are not synonyms, since they do not designate the same linguistic process or phenomenon. Despite their differences in meaning, many pairs of formally similar words in two (or more) different languages are true cognates, although they are usually known as ‘false’ cognates – such as, for instance, the English and Italian lexical items ‘assist x assistere’; ‘attend x attendere’; ‘argument x argomento’; ‘apology x apologia’; ‘camera x camera’; ‘cucumber x cocomero’; ‘fabric x fabbrica’; ‘factory x fattoria’; ‘firm x firma’; ‘journal x giornale’; ‘library x libreria’; ‘magazine x magazzino’; ‘parent x parente’; ‘preservative x preservativo’; ‘pretend x pretendere’; ‘vacancy x vacanza’, to name but a few examples. Thus, one of the theoretical objectives of this paper is firstly to elaborate definitions establishing a distinction between the words that are definitely ‘false cognates’ (derived from different etyma) and those that are just ‘deceptive cognates’ (derived from the same etymon). Secondly, based on Set Theory and on the concepts of equal sets, subsets, intersection of sets and disjoint sets, this study is intended to elaborate some theoretical and practical questions that will be useful in identifying more precisely similarities and differences between cognate words of different languages, and according to graphic interpretation of sets it will be possible to classify them and provide discernment about the processes of semantic changes. Therefore, these issues might be helpful not only to the Learning of Second and Foreign Languages, but they could also give insights into Foreign and Second Language Teaching Methodology. Acknowledgements: FAPESP – São Paulo State Research Support Foundation – the financial support offered (proc. n° 2017/02064-7).Keywords: deceptive cognates, false cognates, foreign language learning, teaching methodology
Procedia PDF Downloads 337
5014 Mourning Motivations for Celebrities in Instagram: A Case Study of Mohammadreza Shajarian's Death
Authors: Zahra Afshordi
Abstract:
Instagram, as an everyday-life social network, hosts everything from the ultrasound image of an unborn fetus to pictures of newly placed gravestones and funerals. It is a platform that allows its users to create a second identity independently from, and at the same time in relation to, their real-space identity. The motives behind this identification are what this article is about. This article studies the motivations of Instagram users mourning for celebrities, with a focus on the death of Mohammadreza Shajarian. Shajarian's death had a wide echo among Persian-speaking Instagram users. The purpose of this qualitative survey is to comprehend and study users' motivations in posting mourning and memorializing content. The methodology of the essay is a hybrid one, consisting of content analysis and open-ended interviews. The results highlight that users' motives are more than just simple sympathy and include political protest, gaining cultural capital, reaching social status, and escaping from solitude.
Keywords: case study, celebrity, identity, Instagram, mourning, qualitative survey
Procedia PDF Downloads 156
5013 Selection of Pichia kudriavzevii Strain for the Production of Single-Cell Protein from Cassava Processing Waste
Authors: Phakamas Rachamontree, Theerawut Phusantisampan, Natthakorn Woravutthikul, Peerapong Pornwongthong, Malinee Sriariyanun
Abstract:
A total of 115 yeast strains isolated from local cassava processing wastes were measured for crude protein content. Among these strains, strain MSY-2 possessed the highest protein concentration (>3.5 mg protein/mL). Using molecular identification tools, it was identified as a strain of Pichia kudriavzevii based on the similarity of the D1/D2 domain of the 26S rDNA region. In this study, Response Surface Methodology (RSM) was applied to optimize protein production by the MSY-2 strain. The tested parameters were the carbon content, nitrogen content, and incubation time. The model yielded a coefficient of determination (R²) of 0.7194, indicating that a large proportion of the variation in the response could be explained by the model and supporting its significance. Under the optimal conditions, protein was produced at up to 3.77 g per L of culture, and the MSY-2 strain contained 66.8 g of protein per 100 g of cell dry weight. These results revealed the feasibility of applying the novel yeast strain in single-cell protein production.
Keywords: single cell protein, response surface methodology, yeast, cassava processing waste
Procedia PDF Downloads 403
5012 Vibration Propagation in Structures Through Structural Intensity Analysis
Authors: Takhchi Jamal, Ouisse Morvan, Sadoulet-Reboul Emeline, Bouhaddi Noureddine, Gagliardini Laurent, Bornet Frederic, Lakrad Faouzi
Abstract:
Structural intensity is a technique that can be used to indicate both the magnitude and direction of power flow through a structure, from the excitation source to the dissipation sink. However, current analyses are limited to the low-frequency range. At medium and high frequencies, a rotational component appears in the field, masking the energy flow and making it difficult or impossible to interpret. The objective of this work is to implement a methodology to filter out the rotational components of the structural intensity field in order to fully understand the energy flow in complex structures. The approach is based on the Helmholtz decomposition, which splits the structural intensity field into rotational, irrotational, and harmonic components. Only the irrotational component is needed to describe the net power flow from a source to a dissipative zone in the structure. The methodology has been applied to academic structures and allows a good analysis of the energy transfer paths.
Keywords: structural intensity, power flow, Helmholtz decomposition, irrotational intensity
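A minimal sketch of a Helmholtz-type split of a sampled 2-D intensity field into irrotational and rotational parts via a Fourier-space projection. Periodic boundaries are assumed and the harmonic component is not separated here; this is a generic illustration, not the authors' implementation:

```python
import numpy as np

def helmholtz_split_2d(Ix, Iy):
    """Project a periodic 2-D field (Ix, Iy) onto its irrotational part; the remainder is rotational."""
    ny, nx = Ix.shape
    kx = np.fft.fftfreq(nx) * 2 * np.pi
    ky = np.fft.fftfreq(ny) * 2 * np.pi
    KX, KY = np.meshgrid(kx, ky)
    k2 = KX**2 + KY**2
    k2[0, 0] = 1.0                                         # avoid division by zero at the mean mode
    Fx, Fy = np.fft.fft2(Ix), np.fft.fft2(Iy)
    div_hat = 1j * (KX * Fx + KY * Fy)                     # divergence in Fourier space
    Gx = np.real(np.fft.ifft2(-1j * KX * div_hat / k2))    # irrotational (curl-free) part
    Gy = np.real(np.fft.ifft2(-1j * KY * div_hat / k2))
    return (Gx, Gy), (Ix - Gx, Iy - Gy)                    # (irrotational, rotational) components
```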
Procedia PDF Downloads 178
5011 Adjusting Electricity Demand Data to Account for the Impact of Loadshedding in Forecasting Models
Authors: Migael van Zyl, Stefanie Visser, Awelani Phaswana
Abstract:
The electricity landscape in South Africa is characterized by frequent occurrences of loadshedding, a measure implemented by Eskom to manage electricity generation shortages by curtailing demand. Loadshedding, classified into stages ranging from 1 to 8 based on severity, involves the systematic rotation of power cuts across municipalities according to predefined schedules. However, this practice introduces distortions in recorded electricity demand, posing challenges to accurate forecasting essential for budgeting, network planning, and generation scheduling. Addressing this challenge requires the development of a methodology to quantify the impact of loadshedding and integrate it back into metered electricity demand data. Fortunately, comprehensive records of loadshedding impacts are maintained in a database, enabling the alignment of Loadshedding effects with hourly demand data. This adjustment ensures that forecasts accurately reflect true demand patterns, independent of loadshedding's influence, thereby enhancing the reliability of electricity supply management in South Africa. This paper presents a methodology for determining the hourly impact of load scheduling and subsequently adjusting historical demand data to account for it. Furthermore, two forecasting models are developed: one utilizing the original dataset and the other using the adjusted data. A comparative analysis is conducted to evaluate forecast accuracy improvements resulting from the adjustment process. By implementing this methodology, stakeholders can make more informed decisions regarding electricity infrastructure investments, resource allocation, and operational planning, contributing to the overall stability and efficiency of South Africa's electricity supply system.Keywords: electricity demand forecasting, load shedding, demand side management, data science
Procedia PDF Downloads 61
5010 An Investigation on Material Removal Rate of EDM Process: A Response Surface Methodology Approach
Authors: Azhar Equbal, Anoop Kumar Sood, M. Asif Equbal, M. Israr Equbal
Abstract:
In the present work, a response surface methodology (RSM) based central composite design (CCD) is used for analyzing the electrical discharge machining (EDM) process. For experimentation, mild steel is selected as the workpiece and copper is used as the electrode. Three machining parameters, namely current (I), spark-on time (Ton), and spark-off time (Toff), are selected as the input variables. The output or response chosen is the material removal rate (MRR), which is to be maximized. To reduce the number of runs, a face-centered central composite design (FCCCD) was used. ANOVA was used to determine the significance of the parameters and their interactions. The suitability of the model is tested using the Anderson-Darling (AD) plot. The results show that all the parameters considered, i.e., current, pulse-on time, and pulse-off time, have a dominant effect on the MRR. Finally, the optimized parameter setting for maximizing MRR is found through main-effect plot analysis.
Keywords: EDM, electrode, MRR, RSM, ANOVA
Procedia PDF Downloads 305
5009 Structural Evaluation of Airfield Pavement Using Finite Element Analysis Based Methodology
Authors: Richard Ji
Abstract:
Nondestructive deflection testing has been accepted widely as a cost-effective tool for evaluating the structural condition of airfield pavements. Backcalculation of pavement layer moduli can be used to characterize the existing pavement condition in order to compute the load-bearing capacity of the pavement. This paper presents an improved best-fit backcalculation methodology based on deflection predictions obtained using the finite element method (FEM). The best-fit approach is based on minimizing the squared error between falling weight deflectometer (FWD) measured deflections and FEM-predicted deflections. The concrete elastic modulus and the modulus of subgrade reaction were then back-calculated using Heavy Weight Deflectometer (HWD) deflections collected at the National Airport Pavement Testing Facility (NAPTF) test site. Compared to currently available methods, it is an alternative and more versatile approach for considering concrete slab geometry and HWD testing locations.
Keywords: nondestructive testing, pavement moduli backcalculation, finite element method, concrete pavements
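A minimal sketch of the best-fit idea described above: search for the layer parameters that minimize the squared error between a measured deflection basin and a predicted one. The forward model below is a crude placeholder standing in for the paper's FEM, and the sensor layout, data, and parameter ranges are assumptions:

```python
import numpy as np
from scipy.optimize import minimize

radii = np.array([0.0, 0.3, 0.6, 0.9, 1.2, 1.5, 1.8])       # deflection sensor offsets (m)

def predicted_deflections(e_pcc, k_sub):
    """Placeholder forward model standing in for the FEM prediction of the deflection basin."""
    ell = (e_pcc / k_sub) ** 0.25                            # pseudo radius of relative stiffness
    return 1000.0 / (k_sub * (1.0 + radii / ell))            # toy basin shape, not the paper's FEM

rng = np.random.default_rng(1)
measured = predicted_deflections(30000.0, 80.0) + rng.normal(0, 0.02, radii.size)  # synthetic HWD data

def squared_error(p):                                        # best-fit objective
    return float(((predicted_deflections(*p) - measured) ** 2).sum())

res = minimize(squared_error, x0=[10000.0, 40.0], bounds=[(1e3, 1e5), (10.0, 500.0)])
print(np.round(res.x, 1))                                    # recovered (E_pcc, k_sub), close to (30000, 80)
```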
Procedia PDF Downloads 166
5008 Graphical User Interface for Presenting Matlab Work for Reduction of Chromatic Dispersion Using Digital Signal Processing for Optical Communication
Authors: Muhammad Faiz Liew Abdullah, Bhagwan Das, Nor Shahida, Abdul Fattah Chandio
Abstract:
This study presents the design features of a Graphical User Interface (GUI) for chromatic dispersion (CD) reduction using digital signal processing (DSP) techniques. The GUI is specially designed for the Windows platform. The simulation results obtained from Matlab are presented via this GUI. After the Matlab results are imported into the GUI, it can present the work on Windows 7 and later versions without Matlab software. The first part of the GUI contains the research methodology block diagram, and in the second part, the output of each stage is shown in a separate area reserved for result display. Each stage of the methodology has captions to display the results. This GUI is very helpful during presentations: instead of making slides, it presents all the work easily in the absence of other software such as Matlab, LabVIEW, or MS PowerPoint. The GUI is designed using C programming in MS Visual Studio.
Keywords: Matlab simulation results, C programming, MS Visual Studio, chromatic dispersion
Procedia PDF Downloads 462
5007 Disaggregating and Forecasting the Total Energy Consumption of a Building: A Case Study of a High Cooling Demand Facility
Authors: Juliana Barcelos Cordeiro, Khashayar Mahani, Farbod Farzan, Mohsen A. Jafari
Abstract:
Energy disaggregation has been a focus of many energy companies, since energy efficiency can be achieved when the breakdown of energy consumption is known. Companies have been investing in technologies to come up with software and/or hardware solutions that can provide this type of information to the consumer. On the other hand, not everyone can afford these technologies. Therefore, in this paper, we present a methodology for breaking down the aggregate consumption and identifying the high-demanding end-use profiles. These energy profiles are then used to build a forecast model for optimal control purposes. A facility with a high cooling load is used as an illustrative case study to demonstrate the results of the proposed methodology. We apply high-level energy disaggregation through a pattern recognition approach in order to extract the consumption profile of its rooftop packaged units (RTUs) and present a forecast model for the energy consumption.
Keywords: energy consumption forecasting, energy efficiency, load disaggregation, pattern recognition approach
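The abstract does not detail its pattern recognition step; one common realization of the idea is clustering daily load profiles to isolate the high-demand (cooling/RTU-like) pattern, sketched below on synthetic data:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
hours = np.arange(24)
base = 40 + 10 * np.sin((hours - 6) / 24 * 2 * np.pi)            # mild daily load shape (kW)
cooling = base + 60 * np.exp(-((hours - 15) ** 2) / 18)          # strong afternoon cooling peak

# Synthetic daily profiles: 40 "base" days and 40 "high cooling" days
days = np.vstack([base + rng.normal(0, 3, 24) for _ in range(40)] +
                 [cooling + rng.normal(0, 3, 24) for _ in range(40)])

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(days)
centroids = km.cluster_centers_
high = centroids[centroids.sum(axis=1).argmax()]                  # the high-demand end-use profile
print(np.round(high, 1))                                          # 24-h profile to feed the forecast model
```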
Procedia PDF Downloads 277
5006 Exploring the Non-Verbalizable in Conservation Grazing: The Contradictions Illuminated by a ‘Go-Along’ Methodology
Authors: James Ormrod
Abstract:
This paper is concerned with volunteer livestock checking. Based on a pilot study consisting of ‘go-along’ interviews with livestock checkers, it argues that there are limitations to the insights that can be generated from approaches to ‘discourse analysis’ that would focus only on the verbalizable aspects of the practice. Volunteer livestock checking takes place across Europe as part of conservation projects aimed at maintaining particular habitats through the reintroduction of grazing animals. Volunteers are variously called ‘urban shepherds’, because these practices often take place on urban fringes, or ‘lookerers’, as their role is to make visual checks on the animals. Pilot research that took place on the South Downs (a chalk downland habitat on the South Coast of the UK) involved researchers accompanying volunteers as they checked on livestock. They were asked to give an account of what they were doing and then answer semi-structured interview questions. Participants drew on popular discourses on conservation and biodiversity, as framed by the local council who run the programme. They also framed their relationships to the animals in respect to the more formal limitations of their role as identified through the conservation programme. And yet these discourses, significant as they are, do not adequately explain why volunteers are drawn to, and emotionally invested in, lookering. The methodology employed allowed participants instead to gesture to features of the landscape and to recall memories, and for the researchers to see how volunteers interacted with the animals and the landscape in embodied and emotionally loaded ways. The paper argues that a psychosocial perspective that pays attention to the contradictions and tensions made visible through this methodology helps develop a fuller understanding of volunteer livestock checking as a social practice.Keywords: conservation, human-animal relations, lookering, volunteering
Procedia PDF Downloads 133
5005 The Customization of 3D Last Form Design Based on Weighted Blending
Authors: Shih-Wen Hsiao, Chu-Hsuan Lee, Rong-Qi Chen
Abstract:
The last is regarded as the critical foundation of shoe design and development. Not only does the last relate to the wearing comfort of shoes, but it also aids shoe styling and manufacturing. In order to enhance the efficiency and application of last development, a computer-aided methodology for customized last form design is proposed in this study. Reverse engineering is mainly applied to the process of scanning the last form. Then the minimum-energy method is used for the revision of surface continuity, and the surface of the last is reconstructed from the feature curves of the scanned last. Once the surface of a last is reconstructed, on the foundation of the proposed last form reconstruction module, the weighted arithmetic mean method is applied to the shape morphing calculation (which differs from grading of the last control mesh), and a subdivision algorithm is used to create the last mesh surface; thus, foot-fitting 3D last forms of different sizes are generated from the original form features with their functions retained. Finally, the practicability of the proposed methodology is verified through subsequent case studies.
Keywords: 3D last design, customization, reverse engineering, weighted morphing, shape blending
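A minimal sketch of the weighted arithmetic mean (weighted blending) applied to corresponding vertices of two last meshes; the vertex correspondence, blend weight, and coordinates are assumptions for illustration:

```python
import numpy as np

def blend_last_forms(vertices_a, vertices_b, weight_a=0.5):
    """Weighted arithmetic mean of two lasts with vertex-to-vertex correspondence."""
    if vertices_a.shape != vertices_b.shape:
        raise ValueError("meshes must share the same vertex correspondence")
    return weight_a * vertices_a + (1.0 - weight_a) * vertices_b

# Toy vertex arrays (N x 3): a scanned base last and a target size/shape variant
base   = np.array([[0.0, 0.0, 0.0], [10.0, 2.0, 1.0], [20.0, 1.5, 3.0]])
target = np.array([[0.0, 0.0, 0.0], [11.0, 2.4, 1.2], [22.0, 1.8, 3.3]])

print(blend_last_forms(base, target, weight_a=0.7))   # 70/30 blend toward the base form
```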
Procedia PDF Downloads 339
5004 Optimization of Operational Water Quality Parameters in a Drinking Water Distribution System Using Response Surface Methodology
Authors: Sina Moradi, Christopher W. K. Chow, John Van Leeuwen, David Cook, Mary Drikas, Patrick Hayde, Rose Amal
Abstract:
Chloramine is commonly used as a disinfectant in drinking water distribution systems (DWDSs), particularly in Australia and the USA. Maintaining a chloramine residual throughout the DWDS is important in ensuring microbiologically safe water is supplied at the customer’s tap. In order to simulate how chloramine behaves when it moves through the distribution system, a water quality network model (WQNM) can be applied. In this work, the WQNM was based on mono-chloramine decomposition reactions, which enabled prediction of mono-chloramine residual at different locations through a DWDS in Australia, using the Bentley commercial hydraulic package (Water GEMS). The accuracy of WQNM predictions is influenced by a number of water quality parameters. Optimization of these parameters in order to obtain the closest results in comparison with actual measured data in a real DWDS would result in both cost reduction as well as reduction in consumption of valuable resources such as energy and materials. In this work, the optimum operating conditions of water quality parameters (i.e. temperature, pH, and initial mono-chloramine concentration) to maximize the accuracy of mono-chloramine residual predictions for two water supply scenarios in an entire network were determined using response surface methodology (RSM). To obtain feasible and economical water quality parameters for highest model predictability, Design Expert 8.0 software (Stat-Ease, Inc.) was applied to conduct the optimization of three independent water quality parameters. High and low levels of the water quality parameters were considered, inevitably, as explicit constraints, in order to avoid extrapolation. The independent variables were pH, temperature and initial mono-chloramine concentration. The lower and upper limits of each variable for two water supply scenarios were defined and the experimental levels for each variable were selected based on the actual conditions in studied DWDS. It was found that at pH of 7.75, temperature of 34.16 ºC, and initial mono-chloramine concentration of 3.89 (mg/L) during peak water supply patterns, root mean square error (RMSE) of WQNM for the whole network would be minimized to 0.189, and the optimum conditions for averaged water supply occurred at pH of 7.71, temperature of 18.12 ºC, and initial mono-chloramine concentration of 4.60 (mg/L). The proposed methodology to predict mono-chloramine residual can have a great potential for water treatment plant operators in accurately estimating the mono-chloramine residual through a water distribution network. Additional studies from other water distribution systems are warranted to confirm the applicability of the proposed methodology for other water samples.Keywords: chloramine decay, modelling, response surface methodology, water quality parameters
Procedia PDF Downloads 224
5003 Towards Safety-Oriented System Design: Preventing Operator Errors by Scenario-Based Models
Authors: Avi Harel
Abstract:
Most accidents are commonly attributed in hindsight to human errors, yet most methodologies for safety focus on technical issues. According to the Black Swan theory, this paradox is due to insufficient data about the ways systems fail. The article presents a study of the sources of errors and proposes a methodology for utility-oriented design, comprising methods for coping with each of the sources identified. Accident analysis indicates that errors typically result from the difficulties of operating under exceptional conditions. Therefore, following STAMP, the focus should be on preventing exceptions. Exception analysis indicates that exceptions typically involve an improper account of the operational scenario, due to deficiencies in system integration. The methodology proposes a model, which is a formal definition of the system operation, as well as principles and guidelines for safety-oriented system integration. The article calls for developing and integrating tools for recording and analyzing system activity during operation, which are required to implement and validate the model.
Keywords: accidents, complexity, errors, exceptions, interaction, modeling, resilience, risks
Procedia PDF Downloads 195
5002 Benchmarking of Pentesting Tools
Authors: Esteban Alejandro Armas Vega, Ana Lucila Sandoval Orozco, Luis Javier García Villalba
Abstract:
Benchmarking of tools for dynamic analysis of vulnerabilities in web applications is done periodically, because these tools update their knowledge bases and search algorithms from time to time in order to improve their accuracy. Unfortunately, the vast majority of these evaluations are made by software enthusiasts who publish their results on blogs or on non-academic websites, and always with the same evaluation methodology. Similarly, most academics who have carried out this type of analysis from a scientific standpoint follow the same methodology as the empirical authors. This paper is motivated by the interest in finding answers to questions that many users of these tools have been asking over the years, such as whether a tool truly tests and evaluates every vulnerability it claims to, or whether it really delivers a complete report of all the vulnerabilities tested and exploited. These kinds of questions have also motivated previous work, but without real answers. The aim of this paper is to show results that truly answer, at least for the tested tools, all those unanswered questions. All the results have been obtained by changing the common model of benchmarking used in all those previous works.
Keywords: cybersecurity, IDS, security, web scanners, web vulnerabilities
Procedia PDF Downloads 319
5001 Predicting Machine-Down of Woodworking Industrial Machines
Authors: Matteo Calabrese, Martin Cimmino, Dimos Kapetis, Martina Manfrin, Donato Concilio, Giuseppe Toscano, Giovanni Ciandrini, Giancarlo Paccapeli, Gianluca Giarratana, Marco Siciliano, Andrea Forlani, Alberto Carrotta
Abstract:
In this paper, we describe a machine learning methodology for Predictive Maintenance (PdM) applied to woodworking industrial machines. PdM is a prominent strategy consisting of all the operational techniques and actions required to ensure machine availability and to prevent a machine-down failure. One of the challenges with the PdM approach is to design and develop an embedded smart system to monitor the health status of the machine. The proposed approach allows multiple connected machines to be screened simultaneously, thus providing real-time monitoring that can be integrated with maintenance management. This is achieved by applying temporal feature engineering techniques and training an ensemble of classification algorithms to predict the Remaining Useful Lifetime of woodworking machines. The effectiveness of the methodology is demonstrated by testing on an independent sample of additional woodworking machines that did not present a machine-down event.
Keywords: predictive maintenance, machine learning, connected machines, artificial intelligence
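A minimal sketch of the pipeline described above: rolling-window temporal features from a sensor channel feeding an ensemble classifier that flags windows approaching a machine-down event. The window size, features, labels, and synthetic signal are assumptions, not the authors' setup:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def window_features(signal, window=50):
    """Rolling-window temporal features (mean, std, linear trend) from one sensor channel."""
    feats = []
    for start in range(0, len(signal) - window, window):
        seg = signal[start:start + window]
        slope = np.polyfit(np.arange(window), seg, 1)[0]
        feats.append([seg.mean(), seg.std(), slope])
    return np.array(feats)

rng = np.random.default_rng(0)
healthy = rng.normal(1.0, 0.05, 5000)                                  # synthetic vibration level, healthy
degrading = rng.normal(1.0, 0.05, 5000) + np.linspace(0, 0.8, 5000)    # drifting toward failure

X = np.vstack([window_features(healthy), window_features(degrading)])
y = np.r_[np.zeros(len(X) // 2), np.ones(len(X) // 2)]                 # 1 = approaching machine-down

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
print(clf.predict(window_features(degrading)[-3:]))                    # flags the last windows as at-risk
```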
Procedia PDF Downloads 226
5000 Analysis of Surface Hardness, Surface Roughness and near Surface Microstructure of AISI 4140 Steel Worked with Turn-Assisted Deep Cold Rolling Process
Authors: P. R. Prabhu, S. M. Kulkarni, S. S. Sharma, K. Jagannath, Achutha Kini U.
Abstract:
In the present study, response surface methodology has been used to optimize turn-assisted deep cold rolling process of AISI 4140 steel. A regression model is developed to predict surface hardness and surface roughness using response surface methodology and central composite design. In the development of predictive model, deep cold rolling force, ball diameter, initial roughness of the workpiece, and number of tool passes are considered as model variables. The rolling force and the ball diameter are the significant factors on the surface hardness and ball diameter and numbers of tool passes are found to be significant for surface roughness. The predicted surface hardness and surface roughness values and the subsequent verification experiments under the optimal operating conditions confirmed the validity of the predicted model. The absolute average error between the experimental and predicted values at the optimal combination of parameter settings for surface hardness and surface roughness is calculated as 0.16% and 1.58% respectively. Using the optimal processing parameters, the hardness is improved from 225 to 306 HV, which resulted in an increase in the near surface hardness by about 36% and the surface roughness is improved from 4.84µm to 0.252 µm, which resulted in decrease in the surface roughness by about 95%. The depth of compression is found to be more than 300µm from the microstructure analysis and this is in correlation with the results obtained from the microhardness measurements. Taylor Hobson Talysurf tester, micro Vickers hardness tester, optical microscopy and X-ray diffractometer are used to characterize the modified surface layer.Keywords: hardness, response surface methodology, microstructure, central composite design, deep cold rolling, surface roughness
Procedia PDF Downloads 420
4999 Interventions and Supervision in Mental Health Services: Experiences of a Working Group in Brazil
Authors: Sonia Alberti
Abstract:
The Regional Conference to Restructure Psychiatric Care in Latin America, convened by the Pan American Health Organization (PAHO) in 1990, oriented the Brazilian Federal Act in 2001 that stipulated the psychiatric reform which requires deinstitutionalization and community-based treatment. Since then, the 15 years’ experience of different working teams in mental health led an academic working group – supervisors from personal practices, professors and researchers – to discuss certain clinical issues, as well as supervisions, and to organize colloquia in different cities as a methodology. These colloquia count on the participation of different working teams from the cities in which they are held, with team members with different levels of educational degrees and prior experiences, in order to increase dialogue right where it does not always appear to be possible. The principal aim of these colloquia is to gain interlocution between practitioners and academics. Working with the theory of case constructions, this methodology revealed itself helpful in unfolding new solutions. The paper also observes that there is not always harmony between what the psychiatric reform demands and clinical ethics.Keywords: mental health, supervision, clinical cases, Brazilian experience
Procedia PDF Downloads 273
4998 Application of Life Cycle Assessment “LCA” Approach for a Sustainable Building Design under Specific Climate Conditions
Authors: Djeffal Asma, Zemmouri Noureddine
Abstract:
In order for building designers to be able to balance environmental concerns with other performance requirements, they need clear and concise information. For certain decisions during the design process, qualitative guidance, such as design checklists or guidelines, may not be sufficient for evaluating the environmental benefits of different building materials, products, and designs. In this case, quantitative information, such as that generated through a life cycle assessment, provides the most value. LCA provides a systematic approach to evaluating the environmental impacts of a product or system over its entire life. In the case of buildings, the life cycle includes the extraction of raw materials; the manufacturing, transporting, and installing of building components or products; and the operating and maintaining of the building. By integrating LCA into the building design process, designers can evaluate the life cycle impacts of building designs, materials, components, and systems, and choose the combinations that reduce the building's life cycle environmental impact. This article attempts to give an overview of the integration of LCA methodology in the context of building design and focuses on the use of this methodology for environmental considerations concerning process design and optimization. A multiple case study was conducted in order to assess the benefits of LCA as a decision-making aid during the first stages of building design under the specific climate conditions of the northeast region of Algeria. It is clear that the LCA methodology can help to assess and reduce the environmental impact of a building design and its components, even if the implementation process is rather long and complicated and lacks a global approach that includes human factors. It is also demonstrated that using LCA for multi-objective optimization of the building process will certainly facilitate improvements in design and decision making for both new design and retrofit projects.
Keywords: life cycle assessment, buildings, sustainability, elementary schools, environmental impacts
Procedia PDF Downloads 546
4997 Finding the Elastic Field in an Arbitrary Anisotropic Media by Implementing Accurate Generalized Gaussian Quadrature Solution
Authors: Hossein Kabir, Amir Hossein Hassanpour Mati-Kolaie
Abstract:
In the current study, the elastic field in an anisotropic elastic medium is determined by implementing a general semi-analytical method. In this methodology, the displacement field is computed as a finite sum of functions with unknown coefficients. These functions exactly satisfy both the homogeneous and inhomogeneous boundary conditions of the proposed medium. It is worth mentioning that the unknown coefficients are determined by implementing the principle of minimum potential energy. The numerical integration is implemented by employing the Generalized Gaussian Quadrature solution. Furthermore, with the aid of the calculated coefficients, the displacement field, as well as the other parameters of the elastic field, can be obtained. Finally, a comparison of the previous analytical method with the current semi-analytical method demonstrates the efficacy of the present methodology.
Keywords: anisotropic elastic media, semi-analytical method, elastic field, generalized Gaussian quadrature solution
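For the numerical integration step, a minimal sketch using standard Gauss-Legendre nodes and weights as a simplified stand-in for the generalized Gaussian quadrature mentioned above; the integrand is a made-up example, not a term from the paper:

```python
import numpy as np

def gauss_legendre_integral(f, a, b, n=8):
    """Integrate f on [a, b] with an n-point Gauss-Legendre rule."""
    nodes, weights = np.polynomial.legendre.leggauss(n)
    x = 0.5 * (b - a) * nodes + 0.5 * (b + a)          # map nodes from [-1, 1] to [a, b]
    return 0.5 * (b - a) * np.sum(weights * f(x))

# Example: a smooth integrand standing in for a strain-energy density term
f = lambda x: np.sin(x) ** 2 * np.exp(-x)
print(gauss_legendre_integral(f, 0.0, 2.0))            # converges quickly for smooth integrands
print(gauss_legendre_integral(f, 0.0, 2.0, n=20))      # refinement check
```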
Procedia PDF Downloads 321