Search results for: integrated definition for process description capture (IDEF3) method
32149 Paternity Index Analysis on Disputed Paternity Cases at Sardjito Hospital Yogyakarta, Indonesia
Authors: Taufik Hidayat, Yudha Nurhantari, Bambang U. D. Rianto
Abstract:
Introduction: The examination of Short Tandem Repeat (STR) loci on nuclear DNA is very useful in solving paternity cases. The purpose of this study is to describe the disputed paternity cases and to analyse the paternity index/probability of paternity based on Indonesian allele frequencies at Sardjito Hospital Yogyakarta. Method: This was an observational study with a cross-sectional analytic method. The population and sample comprised all cases of disputed paternity from January 2011 to June 2015 that fulfilled the inclusion and exclusion criteria and were examined at the Forensic Medicine Unit of Sardjito Hospital, Medical Faculty of Gadjah Mada University. The paternity index was calculated with the EasyDNA program by Fung (2013). The analysis compared the results through an unpaired categorical test, using the Kolmogorov-Smirnov test. The study was designed with a 95% confidence interval (CI), α = 5%, and a significance level of p < 0.05. Results: Of 42 disputed paternity cases, 32 (76.2%) were trio cases and 10 (23.8%) were duo cases without a mother. The majority of the fathers' estimated ages were 21-30 years (33.3%), and the majority of the mothers' ages were 31-40 years (38.1%). The majority of the children examined for paternity were under 12 months old (47.6%). The majority of clients were ethnically Javanese. Inclusion was concluded in 57.1% of cases and exclusion in 42.9%. The Kolmogorov-Smirnov test obtained a p-value of 0.673. Conclusion: There is no significant difference in the paternity index/probability of paternity based on Indonesian allele frequency between trio and duo paternity cases.
Keywords: disputed paternity, paternity index, probability of paternity, short tandem
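The unpaired comparison of paternity-index values between trio and duo cases rests on the two-sample Kolmogorov-Smirnov statistic: the maximum vertical distance between the two empirical CDFs. A rough illustrative sketch follows; this is not the authors' EasyDNA workflow, and the paternity-index values below are made up for demonstration only:

```python
def ks_statistic(a, b):
    """Two-sample Kolmogorov-Smirnov statistic:
    max distance between the empirical CDFs of samples a and b."""
    def ecdf(xs, t):
        return sum(1 for x in xs if x <= t) / len(xs)
    all_pts = sorted(set(a) | set(b))
    return max(abs(ecdf(a, t) - ecdf(b, t)) for t in all_pts)

# Hypothetical paternity-index values for trio and duo cases (illustrative only)
trio_pi = [1e4, 5e5, 2e6, 8e6, 3e7]
duo_pi = [2e4, 1e5, 9e5, 4e6, 1e7]
d = ks_statistic(trio_pi, duo_pi)  # small d suggests similar distributions
```

In practice, the p-value (0.673 in the study) is then derived from this statistic and the sample sizes.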
Procedia PDF Downloads 173
32148 Elvis Improved Method for Solving Simultaneous Equations in Two Variables with Some Applications
Authors: Elvis Adam Alhassan, Kaiyu Tian, Akos Konadu, Ernest Zamanah, Michael Jackson Adjabui, Ibrahim Justice Musah, Esther Agyeiwaa Owusu, Emmanuel K. A. Agyeman
Abstract:
This paper shows how to solve simultaneous equations using the Elvis improved method. The Elvis improved method proceeds as follows: make one variable in the first equation the subject; make the same variable in the second equation the subject; equate the results and simplify to obtain the value of the other unknown variable; then put the value found into one of the equations from the first or second steps and simplify for the remaining unknown variable. The difference between the Elvis improved method and the substitution method is that with the Elvis improved method, the same variable is made the subject in both equations and the two resulting expressions are equated, unlike the substitution method, where one variable is made the subject of only one equation and substituted into the other equation. After describing the Elvis improved method, findings from 100 secondary students and the views of 5 secondary tutors are presented to demonstrate the effectiveness of the method. The study's purpose is illustrated with hypothetical examples.
Keywords: simultaneous equations, substitution method, elimination method, graphical method, Elvis improved method
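For a general linear pair a1·x + b1·y = c1 and a2·x + b2·y = c2, the four steps above can be sketched in code. The function below is an illustrative rendering of the method's steps, not code from the paper:

```python
def elvis_solve(a1, b1, c1, a2, b2, c2):
    """Solve a1*x + b1*y = c1 and a2*x + b2*y = c2 by the Elvis idea:
    make x the subject of BOTH equations, equate, and solve for y."""
    # Step 1-2: x = (c1 - b1*y)/a1  and  x = (c2 - b2*y)/a2
    # Step 3: equating and simplifying gives y:
    y = (a2 * c1 - a1 * c2) / (a2 * b1 - a1 * b2)
    # Step 4: back-substitute y into the first rearrangement:
    x = (c1 - b1 * y) / a1
    return x, y

# Example: 2x + 3y = 8 and x - y = -1  ->  x = 1, y = 2
x, y = elvis_solve(2, 3, 8, 1, -1, -1)
```

The equating step is what distinguishes this from substitution, which would insert one rearrangement directly into the other original equation.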
Procedia PDF Downloads 137
32147 Environmental Life Cycle Assessment of Circular, Bio-Based and Industrialized Building Envelope Systems
Authors: N. Cihan Kayaçetin, Stijn Verdoodt, Alexis Versele
Abstract:
The construction industry accounts for one-third of all waste generated in European Union (EU) countries. The Circular Economy Action Plan of the EU aims to tackle this issue and aspires to enhance the sustainability of the construction industry by adopting more circular principles and bio-based material use. The Interreg Circular Bio-Based Construction Industry (CBCI) project was conceived to research how this adoption can be facilitated. For this purpose, an approach is developed that integrates technical, legal and social aspects and provides business models for circular designing and building with bio-based materials. In the scope of the project, the research outputs are displayed in a real-life setting by constructing a demo terraced single-family house, the living lab (LL), located in Ghent (Belgium). The realization of the LL is conducted in a step-wise approach that includes iterative processes for the design, description, criteria definition and multi-criteria assessment of building components. The essence of the research lies in an exploratory approach to the state-of-the-art building envelope and technical system options for achieving an optimum combination for circular and bio-based construction. For this purpose, nine preliminary designs (PD) for the building envelope are generated, covering three basic construction methods: masonry, lightweight steel construction and wood framing construction, supplemented with bio-based construction methods such as cross-laminated timber (CLT) and massive wood framing. A comparative analysis of the PDs was conducted by utilizing several complementary tools to assess circularity. This paper focuses on the life cycle assessment (LCA) approach for evaluating the environmental impact of the LL Ghent. The adoption of an LCA methodology was considered critical for providing a comprehensive set of environmental indicators.
The PDs were developed at the component level, in particular for the (i) inclined roof, (ii-iii) front and side façades, (iv) internal walls and (v-vi) floors. The assessment was conducted on two levels: component and building. The options for each component were compared in a first iteration, and then the PDs, as assemblies of components, were further analyzed. The LCA was based on a functional unit of one square meter of each component, and CEN indicators were utilized for impact assessment over a reference study period of 60 years. A total of 54 building components composed of 31 distinct materials were evaluated in the study. The results indicate that wood framing construction supplemented with bio-based construction methods performs environmentally better than the masonry or steel-construction options. An analysis of the correlation between the total weight of components and environmental impact was also conducted. Masonry structures displayed high environmental impact and weight, steel structures displayed low weight but relatively high environmental impact, and wood framing construction displayed both low weight and low environmental impact. The study provided valuable outputs on two levels: (i) several improvement options at the component level, through substitution of materials with critical weight and/or impact per unit, and (ii) feedback on environmental performance for the decision-making process during the design phase of a circular single-family house.
Keywords: circular and bio-based materials, comparative analysis, life cycle assessment (LCA), living lab
Procedia PDF Downloads 183
32146 Investigation of the Properties of Biochar Obtained by Dry and Wet Torrefaction in a Fixed and in a Fluidized Bed
Authors: Natalia Muratova, Dmitry Klimov, Rafail Isemin, Sergey Kuzmin, Aleksandr Mikhalev, Oleg Milovanov
Abstract:
We investigated the processing of poultry litter into biochar using dry torrefaction (DT) in a fixed and in a fluidized bed of quartz sand blown with nitrogen, as well as wet torrefaction (WT) in a fluidized bed in a water steam medium at a temperature of 300 °C. The torrefaction technology affects the duration of the heat treatment process and the characteristics of the biochar: the process of separating CO₂, CO, H₂ and CH₄ from a portion of fresh poultry litter during torrefaction in a fixed bed is completed after 2400 seconds, but in a fluidized bed after 480 seconds. During WT in a fluidized bed of quartz sand, this process ends 840 seconds after loading a portion of fresh litter, but in a fluidized bed of litter particles previously subjected to torrefaction, the process ends in 350-450 seconds. In terms of the H/C and O/C ratios, the litter obtained after DT and WT treatment corresponds to lignite. WT in a fluidized bed yields a biochar whose specific pore area is twice that of the biochar obtained after DT in a fluidized bed. Biochar obtained by treating poultry litter in a fluidized bed using the DT or WT method is recommended for use not only as a biofuel but also as an adsorbent or a soil fertilizer.
Keywords: biochar, poultry litter, dry and wet torrefaction, fixed bed, fluidized bed
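The lignite classification above is conventionally read off a van Krevelen-style plot of atomic H/C versus O/C, computed from elemental mass fractions. A minimal sketch of that conversion, using hypothetical composition values since the abstract reports none:

```python
def atomic_ratios(c_wt, h_wt, o_wt):
    """Convert elemental mass fractions (wt%) to atomic H/C and O/C ratios
    using molar masses C = 12.011, H = 1.008, O = 15.999 g/mol."""
    c_mol = c_wt / 12.011
    h_mol = h_wt / 1.008
    o_mol = o_wt / 15.999
    return h_mol / c_mol, o_mol / c_mol

# Hypothetical biochar composition (wt%), for illustration only
hc, oc = atomic_ratios(c_wt=60.0, h_wt=4.0, o_wt=20.0)
```

Points in roughly the H/C 0.8-1.3, O/C 0.2-0.4 region of the diagram are typically labelled lignite, which is the comparison the abstract is making.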
Procedia PDF Downloads 157
32145 RASPE: Risk Advisory Smart System for Pipeline Projects in Egypt
Authors: Nael Y. Zabel, Maged E. Georgy, Moheeb E. Ibrahim
Abstract:
A knowledge-based expert system with the acronym RASPE is developed as an application tool to help decision makers in construction companies make informed decisions about managing risks in pipeline construction projects. Expert systems were chosen from among the available artificial intelligence techniques because an expert system is well suited to representing a domain's knowledge and the reasoning behind domain-specific decisions. The knowledge-based expert system captures the knowledge in the form of conditional rules which represent various project scenarios and potential risk mitigation/response actions. The knowledge built into RASPE is utilized through the underlying inference engine, which allows the firing of rules relevant to the project scenario under consideration. This paper provides an overview of the knowledge acquisition process and describes the knowledge structure, which is divided into four major modules. The paper shows one module in full detail for illustration purposes and concludes with insightful remarks.
Keywords: expert system, knowledge management, pipeline projects, risk management
Procedia PDF Downloads 311
32144 Intelligent Process Data Mining for Monitoring for Fault-Free Operation of Industrial Processes
Authors: Hyun-Woo Cho
Abstract:
Real-time fault monitoring and diagnosis of large-scale production processes is helpful and necessary for operating industrial processes safely and efficiently while producing good final product quality. Unusual and abnormal events may have a serious impact on the process, such as malfunctions or breakdowns. This work utilizes process measurement data obtained on-line for the safe and fault-free operation of industrial processes. To this end, the proposed intelligent process data monitoring framework was evaluated on a simulated process. The monitoring scheme extracts the fault pattern in a reduced space for reliable data representation. Moreover, this work compares linear and nonlinear techniques for the monitoring purpose. The nonlinear technique produced more reliable monitoring results and outperformed the linear methods. The adoption of the qualitative monitoring model helps to reduce the sensitivity of the fault pattern to noise.
Keywords: process data, data mining, process operation, real-time monitoring
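The abstract does not name its specific linear and nonlinear techniques, but the general pattern of data-driven monitoring can be illustrated with a crude stand-in: fit per-variable statistics on normal-operation data, then flag new samples whose summed squared standardized deviation is large. This is a simplification of T²-style monitoring, for illustration only:

```python
import math

def fit_monitor(train):
    """Fit per-variable mean and standard deviation from normal-operation
    data (a crude stand-in for projection-based monitoring schemes)."""
    n = len(train)
    means = [sum(col) / n for col in zip(*train)]
    stds = [math.sqrt(sum((x - m) ** 2 for x in col) / n)
            for col, m in zip(zip(*train), means)]
    return means, stds

def t2_statistic(sample, means, stds):
    """Sum of squared standardized deviations; large values flag a fault."""
    return sum(((x - m) / s) ** 2 for x, m, s in zip(sample, means, stds))

# Tiny synthetic example: two process variables under normal operation
train = [[1.0, 2.0], [1.2, 2.1], [0.8, 1.9], [1.0, 2.0]]
means, stds = fit_monitor(train)
normal_t2 = t2_statistic([1.1, 2.05], means, stds)  # close to training data
fault_t2 = t2_statistic([2.0, 3.0], means, stds)    # far from training data
```

A real scheme would first project the data into a reduced space (linearly or nonlinearly) before computing such a statistic, as the abstract describes.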
Procedia PDF Downloads 640
32143 Microstructures Evolution of a Nano/Ultrafine Grained Low Carbon Steel Produced by Martensite Treatment Using Accumulative Roll Bonding
Authors: Mehdi Salari
Abstract:
This work introduces a new experimental method of martensite treatment incorporating accumulative roll bonding (ARB), used to produce a nano/ultrafine grained structure in low carbon steel. The ARB process, up to 4 cycles, was performed under unlubricated conditions, while the annealing process was carried out in the temperature range of 450-550°C for 30-100 min. The microstructures of the deformed and annealed specimens were investigated. The results showed that in the specimens annealed at 450°C for 30 or 60 min, recrystallization could not be completed. Decreasing the annealing time and temperature increased the volume fraction of the martensite cell blocks. Fully equiaxed nano/ultrafine grained ferrite developed from the martensite cell blocks during annealing at around 500°C for 100 min.
Keywords: martensite process, accumulative roll bonding, recrystallization, nanostructure, plain carbon steel
Procedia PDF Downloads 379
32142 Ecological Impacts of Cage Farming: A Case Study of Lake Victoria, Kenya
Authors: Mercy Chepkirui, Reuben Omondi, Paul Orina, Albert Getabu, Lewis Sitoki, Jonathan Munguti
Abstract:
Globally, the decline in capture fisheries, as a result of the growing population and increasing awareness of the nutritional benefits of white meat, has led to the development of aquaculture. Aquaculture is anticipated to meet the increasing demand for food from a human population that is likely to increase further by 2050. Statistics show that more than 50% of the global future fish diet will come from aquaculture. Aquaculture began commercializing some decades ago; this is attributed to technological advancement from traditional to modern culture systems, including cage farming. Cage farming technology has grown rapidly since its inception in Lake Victoria, Kenya. Currently, over 6,000 cages have been set up in Kenyan waters, and this offers an excellent opportunity to realize the Kenyan government's strategy to eliminate food insecurity and malnutrition, create employment and promote a Blue Economy. However, as an open farming enterprise, cage farming is likely to emit a large bulk of waste, altering the ecosystem integrity of the lake through increased chlorophyll-a pigments, alteration of the plankton community and macroinvertebrates, fish genetic pollution, and transmission of fish diseases and pathogens. Cage farming further increases the nutrient loads, leading to the production of harmful algal blooms, thus negatively affecting aquatic and human life. Despite the ecological transformation, cage farming provides a platform for the achievement of the Sustainable Development Goals of 2030, especially food security and nutrition. Therefore, there is a need for Integrated Multitrophic Aquaculture as part of the Blue Transformation for ecosystem monitoring.
Keywords: aquaculture, ecosystem, blue economy, food security
Procedia PDF Downloads 79
32141 Application of Nonlinear Model to Optimize the Coagulant Dose in Drinking Water Treatment
Authors: M. Derraz, M. Farhaoui
Abstract:
In water treatment processes, the determination of the optimal dose of the coagulant is an issue of particular concern. Coagulant dosing is correlated with raw water quality, which depends on several parameters (turbidity, pH, temperature, conductivity, etc.). The objective of this study is to provide water treatment operators with a tool that predicts the optimum coagulant dose and can, at times, replace the manual method (jar testing) used in the plant. The model is constructed using actual process data from a water treatment plant located in the middle of Morocco (Meknes).
Keywords: coagulation process, aluminum sulfate, model, coagulant dose
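The core idea of mapping raw-water quality to an optimum dose can be sketched with a deliberately simplified one-variable least-squares fit on hypothetical jar-test records; the actual model in the paper is nonlinear and uses several input parameters:

```python
def fit_linear(x, y):
    """Ordinary least-squares fit y ≈ a*x + b. A minimal stand-in for the
    paper's nonlinear model; real plants use more inputs and a nonlinear form."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    b = my - a * mx
    return a, b

# Hypothetical jar-test records: raw-water turbidity (NTU) -> optimal alum dose (mg/L)
turbidity = [10, 50, 100, 200, 400]
dose = [12, 20, 28, 45, 80]
a, b = fit_linear(turbidity, dose)
predicted = a * 150 + b  # predicted dose for 150 NTU raw water
```

Once calibrated on historical jar-test data, such a model lets operators estimate the dose directly from on-line sensor readings instead of running a jar test for every change in raw-water quality.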
Procedia PDF Downloads 277
32140 Removal of Tar Contents in Syngas by Using Different Fuel from Downdraft Biomass Gasification System
Authors: Muhammad Awais, Wei Li, Anjum Munir
Abstract:
Biomass gasification is a process of converting solid biomass ingredients into a combustible gas which can be used in electricity generation. Despite its applications in many fields, biomass gasification technology still faces many syngas cleaning issues. Tar production in the biomass gasification process is one of the biggest challenges for this technology. The aim of this study is to evaluate the tar contents in syngas produced from wood chips, corn cobs, coconut shells and a mixture of corn cobs and wood chips as biomass fuels, and the tar removal efficiency of different cleaning units integrated with the gasifier. The performance of the different cleaning units, i.e., cyclone separator, wet scrubber, biomass filter, and auxiliary filter, was tested under two biomass fuels. The results of this study indicate that wood chips produced less tar, 1736 mg/Nm³, compared to corn cobs, which produced 2489 mg/Nm³. It was also observed that coconut shells produced a high amount of tar. When wood chips were used as fuel, the syngas tar contents were reduced from 6600 to 112 mg/Nm³, while in the case of corn cobs, they were reduced from 7500 mg/Nm³ to 220 mg/Nm³. The overall tar removal efficiencies of the cyclone separator, wet scrubber, biomass filter, and auxiliary filter were 72%, 63%, 74%, and 35%, respectively.
Keywords: biomass, gasification, tar, cleaning system, biomass filter
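The per-stage efficiencies combine multiplicatively on the tar remaining after each stage, which is consistent with the overall wood-chip figures (6600 down to 112 mg/Nm³). A small sketch of that arithmetic, assuming the stages act in series in the order listed:

```python
def removal_efficiency(inlet, outlet):
    """Fractional tar removal across a cleaning stage (or the whole train)."""
    return (inlet - outlet) / inlet

# Overall figure reported for wood chips: 6600 -> 112 mg/Nm^3
overall_wood = removal_efficiency(6600, 112)  # about 98.3%

# Staged efficiencies act on the tar remaining after the previous stage:
residual = 6600.0
for eff in (0.72, 0.63, 0.74, 0.35):  # cyclone, scrubber, biomass filter, auxiliary
    residual *= (1 - eff)
# residual ends up near the reported 112 mg/Nm^3 outlet value
```

The product 6600 × 0.28 × 0.37 × 0.26 × 0.65 lands at roughly 116 mg/Nm³, close to the 112 mg/Nm³ reported, supporting the series interpretation.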
Procedia PDF Downloads 174
32139 Iron Catalyst for Decomposition of Methane: Influence of Al/Si Ratio Support
Authors: A. S. Al-Fatesh, A. A. Ibrahim, A. M. AlSharekh, F. S. Alqahtani, S. O. Kasim, A. H. Fakeeha
Abstract:
Hydrogen is the expected fuel of the future since it produces energy without any pollution. It can be used as a fuel directly or through a fuel cell. It is also used in the chemical and petrochemical industries as a reducing agent or in hydrogenation processes. It is produced by different methods, such as reforming of hydrocarbons, electrolysis, and methane decomposition. The objective of the present paper is to study the methane decomposition reaction at 700°C and 800°C. The catalysts were prepared via the impregnation method using 20% Fe and different proportions of combined alumina and silica support in the following ratios: [100%, 90%, 80%, and 0% Al₂O₃/SiO₂]. The prepared catalysts were calcined and activated at 600°C and 500°C, respectively. The reaction was carried out in a fixed bed reactor at atmospheric pressure using 0.3 g of catalyst and a feed gas ratio of 1.5/1 CH₄/N₂ with a total flow rate of 25 mL/min. Catalyst characterizations (TPR, TGA, BET, XRD, etc.) have been employed to study the behavior of the catalysts before and after the reaction. Moreover, a brief description of the weight loss and the CH₄ conversions versus time on stream for the different support ratios over the 20%Fe/Al₂O₃/SiO₂ catalysts has been added as well. The results of the TGA analysis showed higher weight losses for catalysts operated at 700°C than at 800°C. For 90% Al₂O₃/SiO₂, the activity decreases with time on stream at a reaction temperature of 800°C, from an initial CH₄ conversion of 73.9% to 46.3% over a period of 300 min, whereas the activity for the same catalyst increases from 47.1% to 64.8% when a reaction temperature of 700°C is employed. Likewise, for 80% Al₂O₃/SiO₂ the trend of activity is similar to that of 90% Al₂O₃/SiO₂, but with a different rate of activity variation. It can be inferred from the activity results that the ratio of Al₂O₃ to SiO₂ is crucial and directly proportional to the activity: whenever the Al/Si ratio decreases, the activity declines.
Indeed, the CH₄ conversion of the 100% SiO₂ support was less than 5%.
Keywords: Al₂O₃, SiO₂, CH₄ decomposition, hydrogen, iron
Procedia PDF Downloads 179
32138 Logical-Probabilistic Modeling of the Reliability of Complex Systems
Authors: Sergo Tsiramua, Sulkhan Sulkhanishvili, Elisabed Asabashvili, Lazare Kvirtia
Abstract:
The paper presents logical-probabilistic methods, models, and algorithms for the reliability assessment of complex systems, based on which a web application for structural analysis and reliability assessment of systems was created. It is important to design systems based on structural analysis, research, and evaluation of efficiency indicators. One of the important efficiency criteria is the reliability of the system, which depends on the components of the structure. Quantifying the reliability of large-scale systems is a computationally complex process, and it is advisable to perform it with the help of a computer. Logical-probabilistic modeling is one of the effective means of describing the structure of a complex system and quantitatively evaluating its reliability, and it formed the basis of our application. The reliability assessment process included the following stages, which were reflected in the application: 1) construction of a graphical scheme of the structural reliability of the system; 2) transformation of the graphical scheme into a logical representation and modeling of the shortest paths of successful functioning of the system; 3) description of the system operability condition with a logical function in disjunctive normal form (DNF); 4) transformation of the DNF into orthogonal disjunctive normal form (ODNF) using the orthogonalization algorithm; 5) replacement of the logical elements with probabilistic elements in the ODNF, obtaining a reliability estimation polynomial and quantifying reliability; 6) calculation of the “weights” of the elements of the system. Using the logical-probabilistic methods, models and algorithms discussed in the paper, special software was created, by means of which a quantitative assessment of the reliability of systems of complex structure is produced.
As a result, structural analysis of systems, research, and the design of optimally structured systems are carried out.
Keywords: complex systems, logical-probabilistic methods, orthogonalization algorithm, reliability of systems, “weights” of elements
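Stages 3-5 above amount to evaluating the probability that at least one minimal path of working elements exists. For small systems this can be checked directly by enumerating all component states, as in the illustrative sketch below; the paper's orthogonalization algorithm exists precisely to avoid this exponential enumeration on large systems:

```python
from itertools import product

def system_reliability(n, paths, p):
    """Exact reliability of a monotone system given its minimal path sets.
    The operability condition is the DNF 'some minimal path has all its
    components working'; we sum the probabilities of the states satisfying it."""
    total = 0.0
    for state in product([0, 1], repeat=n):
        if any(all(state[i] for i in path) for path in paths):
            prob = 1.0
            for i, s in enumerate(state):
                prob *= p[i] if s else (1 - p[i])
            total += prob
    return total

# Series-parallel example: system works if (element 0 AND element 1) OR element 2
rel = system_reliability(3, [[0, 1], [2]], [0.9, 0.9, 0.8])
```

For this example the ODNF route gives the same answer analytically: P(AB) + P(C) - P(ABC) = 0.81 + 0.8 - 0.648 = 0.962.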
Procedia PDF Downloads 66
32137 Care as a Situated Universal: Defining Care as a Practical Phenomenology Study
Authors: Amanda Aliende da Matta
Abstract:
This communication presents an aspect of phenomenon selection in an applied hermeneutic phenomenology study on care and vulnerability: the need to consider care as a situated universal. For that, we will first present the study and its methodology. Secondly, we will expose the need to understand phenomena as situation-defined, incorporating feminist thought. In an informatics class for 14-year-olds, we explained the exercise: students have to make a 5-slide presentation on a topic of their choice. A does it on streetwear, B on Cristiano Ronaldo, C on Marvel, but J did it on Down syndrome. Introducing it to the class, J explains the physical and cognitive differences caused by the trisomy; when asked to explain further, he says: "they are angels, teacher," and shows us a poster on his cellphone that says: if you laugh at a different child, he will laugh with you, because his innocence outweighs your ignorance. The anecdote shows, better than any theoretical explanation, something that some vulnerable people have; something beautiful and special but difficult to define. Let's call this something caring. The research has the main objective of accounting for the experience of caregiving in vulnerability, and it will be carried out with Applied Hermeneutic Phenomenology (AHP). The method's objective is to investigate lived human experience in its pre-reflexive dimension in order to know its meaning structures. Contrary to other research methods, AHP does not produce theory about a specific context but seeks the meaning of the lived experience in its character as human experience. However, it is necessary that we understand care as defined in a concrete situation. We cannot start the research with an a priori definitive concept of care, or we would fall into the mistake of closing ourselves off to only what we already know, as explained by Levinas. We incorporate, then, the notion of situated universals.
Faithful to phenomenology, the definition of the phenomenon should start with an investigation of the word's etymology: the word cura, in its etymological root, means care. And care comes from the Latin word cogitātus/cōgĭto, which means "to pursue something in mind" and "to consider thoroughly." The verb cōgĭto, meanwhile, is composed of co- (altogether) and agitare (to deal with or think committedly about something, to concern oneself with) / ăgĭto (to set in motion, to move). Care, therefore, has at its origin a meditation on something, a concern about something, a verb that carries a sense of action and movement. To care is to act out of concern for something or someone. This etymology, though, is not the final definition of the phenomenon, but only its skeleton. It needs to be embodied in the concrete situation to become a possible lived experience. This means that the lived experience descriptions (LEDs) should be selected by taking into consideration how, and whether, care was engendered in each concrete experience. Defining the phenomenon has to take situated knowledge into consideration.
Keywords: applied hermeneutic phenomenology, care ethics, hermeneutics, phenomenology, situated universalism
Procedia PDF Downloads 88
32136 A Process to Support Multidisciplinary Teams to Design Serious Games
Authors: Naza Djafarova, Tony Bates, Margaret Verkuyl, Leonora Zefi, Ozgur Turetken, Alex Ferworn, Mastrilli Paula, Daria Romaniuk, Kosha Bramesfeld, Anastasia Dimitriadou, Cheryl To
Abstract:
Designing serious games for education is a challenging and resource-intensive effort. A well-designed process that balances pedagogical principles with game mechanics can help to simplify the design of serious games and increase efficiency. Multidisciplinary teams involved in designing serious games can benefit tremendously from such a process in their endeavours to develop and implement these games at undergraduate and graduate levels. This presentation will outline research results on gaps identified within existing processes and frameworks and present an adapted process that emerged from the research. The research methodology was based on a survey, semi-structured interviews and workshops for testing the adapted process for game design. Based on the findings, the authors propose a simple process for the pre-production stage of serious game design that may help guide multidisciplinary teams in their work. This process was used to facilitate team brainstorming and is currently being tested to assess whether multidisciplinary teams find value in using it when designing serious games.
Keywords: serious game-design, multidisciplinary team, game design framework, learning games, multidisciplinary game design process
Procedia PDF Downloads 429
32135 Robust Numerical Scheme for Pricing American Options under Jump Diffusion Models
Authors: Salah Alrabeei, Mohammad Yousuf
Abstract:
The goal of option pricing theory is to help investors manage their money, enhance returns and control their financial future by theoretically valuing their options. However, most option pricing models have no analytical solution. Furthermore, not all numerical methods are efficient for solving these models, because the models have non-smooth payoffs or discontinuous derivatives at the exercise price. In this paper, we solve American options under jump-diffusion models by using efficient time-dependent numerical methods. Several techniques are integrated to overcome the computational complexity. The Fast Fourier Transform (FFT) algorithm is used as a matrix-vector multiplication solver, which reduces the complexity from O(M²) to O(M log M). A partial fraction decomposition technique is applied to the rational approximation schemes to overcome the complexity of inverting polynomials of matrices. The proposed method is easy to implement in serial or parallel versions. Numerical results are presented to demonstrate the accuracy and efficiency of the proposed method.
Keywords: integral differential equations, jump-diffusion model, American options, rational approximation
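The O(M log M) matrix-vector product is available because discretized jump operators have Toeplitz/circulant structure, and circulant matrices are diagonalized by the DFT. A self-contained sketch for the circulant case, assuming a power-of-two length; this is an illustration of the FFT trick, not the paper's pricing scheme:

```python
import cmath

def fft(a, invert=False):
    """Radix-2 Cooley-Tukey FFT; the length of a must be a power of two."""
    n = len(a)
    if n == 1:
        return a[:]
    even = fft(a[0::2], invert)
    odd = fft(a[1::2], invert)
    sign = 1 if invert else -1
    out = [0j] * n
    for k in range(n // 2):
        w = cmath.exp(sign * 2j * cmath.pi * k / n)
        out[k] = even[k] + w * odd[k]
        out[k + n // 2] = even[k] - w * odd[k]
    return out

def circulant_matvec(c, x):
    """Multiply the circulant matrix with first column c by vector x
    in O(M log M): the product equals IFFT(FFT(c) * FFT(x))."""
    n = len(c)
    fc = fft([complex(v) for v in c])
    fx = fft([complex(v) for v in x])
    y = fft([a * b for a, b in zip(fc, fx)], invert=True)
    return [v.real / n for v in y]  # normalize the unscaled inverse transform
```

Multiplying by the unit vector e0 returns the first column itself, a quick sanity check that the convolution indexing matches the circulant definition C[i][j] = c[(i - j) mod n].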
Procedia PDF Downloads 120
32134 Decision Support System for the Management and Maintenance of Sewer Networks
Authors: A. Bouamrane, M. T. Bouziane, K. Boutebba, Y. Djebbar
Abstract:
This paper aims to develop a decision support tool that provides solutions to the problems of sewer network management and maintenance, assisting the manager in ranking sections by priority of intervention while taking account of technical, economic, social and environmental standards as well as the manager's strategy. The solution uses the Analytic Network Process (ANP) developed by Thomas Saaty, coupled with a set of tools for modelling and collecting integrated data from a geographic information system (GIS). It provides the decision maker with a tool adapted to the reality on the ground and effective in use relative to the manager's means and objectives.
Keywords: multi-criteria decision support, maintenance, Geographic Information System, modelling
Procedia PDF Downloads 637
32133 Building Information Modelling Based Value for Money Assessment in Public-Private Partnership
Authors: Guoqian Ren, Haijiang Li, Jisong Zhang
Abstract:
Over the past 40 years, urban development has undergone large-scale, high-speed expansion, beyond what was previously considered normal and in a manner not proportionally related to population growth or physical considerations. With more scientific and refined decision-making in the urban construction process, new urbanization approaches, aligned with the public-private partnerships (PPPs) which evolved in the early 1990s, have become acceptable and, in some situations, even better solutions to outstanding urban municipal construction projects, especially in developing countries. However, as the main driving force for delivering urban public services, PPPs remain problematic regarding the value for money (VFM) process in most large-scale construction projects. This paper therefore reviews recent PPP articles in popular project management journals and relevant toolkits, published in the last 10 years, to identify the indicators that influence VFM within PPPs across regions. With increasing concerns about profitability and environmental and social impacts, the current PPP structure requires a more integrated platform to manage multi-performance project life cycles. Building information modelling (BIM), a popular approach to the procurement process in AEC sectors, provides the potential to ensure VFM while also working in tandem with a semantic approach to holistically measure life cycle costs (LCC) and achieve better sustainability. This paper suggests that BIM applied to the entire PPP life cycle could support holistic decision-making regarding VFM processes and thus meet service targets.
Keywords: public-private partnership, value for money, building information modelling, semantic approach
Procedia PDF Downloads 209
32132 Comparative Performance of Retting Methods on Quality Jute Fibre Production and Water Pollution for Environmental Safety
Authors: A. K. M. Zakir Hossain, Faruk-Ul Islam, Muhammad Alamgir Chowdhury, Kazi Morshed Alam, Md. Rashidul Islam, Muhammad Humayun Kabir, Noshin Ara Tunazzina, Taufiqur Rahman, Md. Ashik Mia, Ashaduzzaman Sagar
Abstract:
The jute retting process is one of the key factors for the excellent jute fibre production as well as maintaining water quality. The traditional method of jute retting is time-consuming and hampers the fish cultivation by polluting the water body. Therefore, a low cost, time-saving, environment-friendly, and improved technique is essential for jute retting to overcome this problem. Thus the study was focused to compare the extent of water pollution and fibre quality of two retting systems, i.e., traditional retting practices over-improved retting method (macha retting) by assessing different physico-chemical and microbiological properties of water and fibre quality parameters. Water samples were collected from the top and bottom of the retting place at the early, mid, and final stages of retting from four districts of Bangladesh viz., Gaibandha, Kurigram, Lalmonirhat, and Rangpur. Different physico-chemical parameters of water samples viz., pH, dissolved oxygen (DO), conductivity (CD), total dissolved solids (TDS), hardness, calcium, magnesium, carbonate, bicarbonate, chloride, phosphorus and sulphur content were measured. Irrespective of locations, the DO of the final stage retting water samples was very low as compared to the mid and early stage, and the DO of traditional jute retting method was significantly lower than the improved macha method. The pH of the water samples was slightly more acidic in the traditional retting method than that of the improved macha method. Other physico-chemical parameters of the water sample were found higher in the traditional method over-improved macha retting in all the stages of retting. Bacterial species were isolated from the collected water samples following the dilution plate technique. Microbiological results revealed that water samples of improved macha method contained more bacterial species that are supposed to involve in jute retting as compared to water samples of the traditional retting method. 
The bacterial species were then identified by 16S rDNA sequencing. Most of the identified species belong to the genera Pseudomonas, Bacillus, Pectobacterium, and Stenotrophomonas. In addition, the tensile strength of the jute fibre was tested, and the results revealed that the improved macha method gave higher mechanical strength than the traditional method in most of the locations. The overall results indicate that both water and fibre quality were better in the improved macha retting method than in the traditional method. Therefore, the time-saving and cost-friendly improved macha retting method can be widely adopted for the jute retting process to obtain quality jute fibre and to keep the environment clean and safe.
Keywords: jute retting methods, physico-chemical parameters, retting microbes, tensile strength, water quality
Procedia PDF Downloads 157
32131 Method for Auto-Calibrate Projector and Color-Depth Systems for Spatial Augmented Reality Applications
Authors: R. Estrada, A. Henriquez, R. Becerra, C. Laguna
Abstract:
Spatial Augmented Reality is a variation of Augmented Reality in which a Head-Mounted Display is not required. This variation is useful in cases where the need for a Head-Mounted Display is itself a limitation. To achieve this, Spatial Augmented Reality techniques substitute the technological elements of Augmented Reality: the virtual world is projected onto a physical surface. To create an interactive spatial augmented experience, the application must be aware of the spatial relations that exist between its core elements. Here, the core elements are a projection system and an input system, and the process of achieving this spatial awareness is called system calibration. The Spatial Augmented Reality system is considered calibrated if the projected virtual-world scale matches the real-world scale, meaning that a virtual object maintains its perceived dimensions when projected into the real world. Likewise, the input system is calibrated if the application knows the relative position of a point in the projection plane with respect to the RGB-depth sensor's origin. Any kind of projection technology can be used (light-based projectors, close-range projectors, or screens) as long as it complies with the defined constraints; the method was tested on different configurations. The proposed procedure does not rely on a physical marker, minimizing human intervention in the process. The tests were made using a Kinect V2 as the input sensor and several projection devices. To test the method, the defined constraints were applied to a variety of physical configurations; once the method was executed, several variables were measured to assess its performance. It was demonstrated that the method can solve different arrangements, giving the user a wide range of setup possibilities.
Keywords: color depth sensor, human computer interface, interactive surface, spatial augmented reality
Procedia PDF Downloads 124
32130 Towards Automatic Calibration of In-Line Machine Processes
Authors: David F. Nettleton, Elodie Bugnicourt, Christian Wasiak, Alejandro Rosales
Abstract:
In this presentation, preliminary results are given for the modeling and calibration of two different industrial winding MIMO (Multiple Input Multiple Output) processes using machine learning techniques. In contrast to previous approaches, which have typically used ‘black-box’ linear statistical methods together with a definition of the mechanical behavior of the process, we use non-linear machine learning algorithms together with a ‘white-box’ rule induction technique to create a supervised model of the fitting error between the expected and real force measures. The final objective is to build a precise model of the winding process in order to control the tension of the material being wound, in the first case, and the friction of the material passing through the die, in the second case. Case 1, tension control of a winding process: a plastic web is unwound from a first reel, passes over a traction reel, and is rewound onto a third reel. The objectives are (i) to train a model to predict the web tension and (ii) to calibrate, i.e., find the input values that result in a given tension. Case 2, friction force control of a micro-pullwinding process: a core plus resin passes through a first die, two winding units then wind an outer layer around the core, and the material makes a final pass through a second die. The objectives are (i) to train a model to predict the friction on die 2 and (ii) to calibrate, i.e., find the input values that result in a given friction on die 2. Different machine learning approaches were tested to build the models: Kernel Ridge Regression, Support Vector Regression (with a Radial Basis Function kernel), and MPART (rule induction with a continuous-valued output). As a preliminary step, the MPART rule induction algorithm was used to build an explicative model of the error (the difference between expected and real friction on die 2). Modeling the error behavior using explicative rules helps improve the overall process model.
Once the models are built, the inputs are calibrated by generating Gaussian random numbers for each input (taking into account its mean and standard deviation) and comparing the model output to a target (desired) output until the closest fit is found. The results of empirical testing show that high precision is obtained both for the trained models and for the calibration process. The learning step is the slowest part of the process (max. 5 minutes for this data), but it can be done offline just once. The calibration step is much faster, obtaining in under one minute a precision error of less than 1×10⁻³ for both outputs. To summarize, in the present work two processes have been modeled and calibrated. A fast processing time and high precision have been achieved, which can be further improved by using heuristics to guide the Gaussian calibration. Error behavior has been modeled to help improve the overall process understanding. This has relevance for the quick optimal set-up of many different industrial processes that use a pull-winding type process to manufacture fibre-reinforced plastic parts. Acknowledgements to the Openmind project, which is funded by Horizon 2020, the European Union's funding programme for Research & Innovation, Grant Agreement number 680820.
Keywords: data model, machine learning, industrial winding, calibration
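The Gaussian random-search calibration described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the toy surrogate model, input means, and standard deviations are hypothetical stand-ins for the trained winding-process model.

```python
import random

def calibrate(model, target, means, stds, n_samples=10000, seed=42):
    """Sample candidate inputs from per-input Gaussians and keep the
    candidate whose model output is closest to the target output."""
    rng = random.Random(seed)
    best_inputs, best_err = None, float("inf")
    for _ in range(n_samples):
        candidate = [rng.gauss(m, s) for m, s in zip(means, stds)]
        err = abs(model(candidate) - target)
        if err < best_err:
            best_inputs, best_err = candidate, err
    return best_inputs, best_err

# Hypothetical surrogate: friction as a linear function of two inputs.
toy_model = lambda x: 0.5 * x[0] + 0.2 * x[1]
inputs, err = calibrate(toy_model, target=3.0, means=[4.0, 5.0], stds=[1.0, 1.0])
```

In a real setting, `model` would be the trained regression model and the means and standard deviations would come from the recorded process data.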
Procedia PDF Downloads 241
32129 Superhydrophobic, Heteroporous Flexible Ceramic for Micro-Emulsion Separation, Oil Sorption, and Recovery of Fats, Oils, and Grease from Restaurant Wastewater
Authors: Jhoanne Pedres Boñgol, Zhang Liu, Yuyin Qiu, King Lun Yeung
Abstract:
Flexible ceramic sorbent material can be a viable technology to capture and recover the emulsified fats, oils, and grease (FOG) that often cause sanitary sewer overflows. This study investigates the sorption capacity and recovery rate of a ceramic material in surfactant-stabilized oil-water emulsions by synthesizing a silica aerogel, SiO₂–X, via an acid-base sol-gel method followed by ambient pressure drying. The SiO₂–X is amorphous, microstructured, lightweight, flexible, and highly oleophilic. It displays spring-back behavior, apparent at 80% compression, with a compressive strength of 0.20 MPa, and can support a weight 1,000 times its own. The contact angles measured at 0° and 177° in oil and water, respectively, confirm its oleophilicity and hydrophobicity, while its thermal stability even at 450 °C is confirmed via TGA. In the pure oil phase, the qe,AV of 1 × 1 mm SiO₂–X is 7.5 g g⁻¹ at tqe = 10 min, and the qe,AV is 6.05 to 6.76 g g⁻¹ at tqe = 24 h in O/W emulsion. The filter ceramic can be reused 50 times with 75-80% FOG recovery by manual compression.
Keywords: adsorption, aerogel, emulsion, FOG
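The sorption capacity figures above follow the common gravimetric definition: grams of oil retained per gram of dry sorbent. A minimal sketch, with masses chosen purely to reproduce the reported 7.5 g g⁻¹ pure-oil figure (not measured data):

```python
def sorption_capacity(m_dry_g, m_saturated_g):
    """Equilibrium sorption capacity qe in g of oil per g of sorbent,
    from the dry mass and the oil-saturated mass of the sample."""
    return (m_saturated_g - m_dry_g) / m_dry_g

# Hypothetical: a 0.20 g aerogel piece weighing 1.70 g after saturation.
qe = sorption_capacity(0.20, 1.70)
```

With these illustrative masses the capacity works out to 7.5 g g⁻¹, consistent with the pure-oil-phase value reported in the abstract.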
Procedia PDF Downloads 157
32128 Automated Manual Handling Risk Assessments: Practitioner Experienced Determinants of Automated Risk Analysis and Reporting Being a Benefit or Distraction
Authors: S. Cowley, M. Lawrance, D. Bick, R. McCord
Abstract:
Technology that automates manual handling (musculoskeletal disorder, or MSD) risk assessments is increasingly available to ergonomists, engineers, and generalist health and safety practitioners alike. The risk assessment process is generally based on wearable motion sensors that capture information about worker movements for real-time or post-hoc analysis. Traditionally, MSD risk assessment is undertaken with the assistance of a checklist, such as that from the SafeWork Australia code of practice, with the expert assessor observing the task and ideally engaging the worker in a discussion about the detail. Automation enables the non-expert to complete assessments and does not always require the assessor to be present. This clearly has cost and time benefits for the practitioner, but is it an improvement on assessment by a human? Human risk assessments draw on the knowledge and expertise of the assessor but, like all risk assessments, are highly subjective. The complexity of the checklists and models used in the process can be off-putting and will sometimes lead to the assessment becoming the focus and the end rather than a means to an end; the focus on risk control is lost. Automated risk assessment handles the complexity of the assessment for the assessor and delivers a simple risk score that enables decision-making about risk control. Being machine-based, it is objective and will deliver the same result each time it assesses an identical task. However, the WHS professional needs to know that this emergent technology asks the right questions and delivers the right answers, and whether it improves the risk assessment process and its results or simply distances the professional from the task and the worker. They need clarity as to whether automation of manual task risk analysis and reporting leads to risk control or draws focus away from the worker.
Critically, they need evidence as to whether automation in this area of hazard management leads to better risk control or just a bigger collection of assessments. Examining the practitioner-experienced determinants of automated manual task risk analysis and reporting being a benefit or a distraction will build an understanding of emergent risk assessment technology, its use, and the things to consider when making decisions about adopting and applying these technologies.
Keywords: automated, manual-handling, risk-assessment, machine-based
Procedia PDF Downloads 119
32127 From Electroencephalogram to Epileptic Seizures Detection by Using Artificial Neural Networks
Authors: Gaetano Zazzaro, Angelo Martone, Roberto V. Montaquila, Luigi Pavone
Abstract:
Seizures are the main factor affecting the quality of life of epileptic patients. The diagnosis of epilepsy, and hence the identification of the epileptogenic zone, is commonly made by continuous Electroencephalogram (EEG) signal monitoring. Seizure identification in EEG signals is performed manually by epileptologists, and this process is usually very long and error-prone. The aim of this paper is to describe an automated method able to detect seizures in EEG signals, using the knowledge discovery in databases process and data mining methods and algorithms, which can support physicians during seizure detection. Our detection method is based on an Artificial Neural Network classifier, trained by applying the multilayer perceptron algorithm and by using a software application called Training Builder, which has been developed for the massive extraction of features from EEG signals. This tool covers all the data preparation steps, ranging from signal processing to data analysis techniques, including the sliding window paradigm, dimensionality reduction algorithms, information theory, and feature selection measures. The final model shows excellent performance, reaching an accuracy of over 99% in tests on data of a single patient retrieved from a publicly available EEG dataset.
Keywords: artificial neural network, data mining, electroencephalogram, epilepsy, feature extraction, seizure detection, signal processing
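The pipeline described (sliding-window feature extraction followed by a multilayer perceptron classifier) can be sketched on synthetic data. This is an illustration only: a minimal NumPy-only MLP stands in for the authors' Training Builder tool, the features (mean, standard deviation, line length) are common EEG choices rather than the paper's exact set, and the "seizure" segment is simulated as higher-amplitude noise.

```python
import numpy as np

def window_features(signal, win, step):
    """Sliding-window paradigm: per-window mean, standard deviation,
    and line length (sum of absolute first differences)."""
    rows = []
    for start in range(0, len(signal) - win + 1, step):
        w = signal[start:start + win]
        rows.append([w.mean(), w.std(), np.abs(np.diff(w)).sum()])
    return np.array(rows)

def train_mlp(X, y, hidden=8, lr=0.3, epochs=2000, seed=0):
    """Minimal one-hidden-layer perceptron (tanh hidden units, sigmoid
    output) trained with full-batch gradient descent on cross-entropy."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0, 0.5, (X.shape[1], hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.5, hidden); b2 = 0.0
    for _ in range(epochs):
        H = np.tanh(X @ W1 + b1)
        p = 1.0 / (1.0 + np.exp(-(H @ W2 + b2)))
        g = (p - y) / len(y)                    # gradient w.r.t. output logit
        gH = np.outer(g, W2) * (1.0 - H ** 2)   # back-propagate to hidden layer
        W2 -= lr * (H.T @ g); b2 -= lr * g.sum()
        W1 -= lr * (X.T @ gH); b1 -= lr * gH.sum(axis=0)
    return lambda Z: (1.0 / (1.0 + np.exp(-(np.tanh(Z @ W1 + b1) @ W2 + b2))) > 0.5).astype(int)

# Simulated background vs. higher-amplitude 'seizure' EEG segments.
rng = np.random.default_rng(1)
background = rng.normal(0.0, 1.0, 256 * 20)
seizure = rng.normal(0.0, 5.0, 256 * 20)
X = np.vstack([window_features(background, 256, 128),
               window_features(seizure, 256, 128)])
X = (X - X.mean(axis=0)) / X.std(axis=0)        # standardize each feature
y = np.array([0] * (len(X) // 2) + [1] * (len(X) // 2))
predict = train_mlp(X, y)
accuracy = (predict(X) == y).mean()
```

A real system would of course evaluate on held-out data from annotated recordings; the training accuracy here simply shows the pipeline wired end to end.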
Procedia PDF Downloads 188
32126 Guidelines to Designing Generic Protocol for Responding to Chemical, Biological, Radiological and Nuclear Incidents
Authors: Mohammad H. Yarmohammadian, Mehdi Nasr Isfahani, Elham Anbari
Abstract:
Introduction: Awareness of the use of chemical, biological, and nuclear agents in everyday industrial and non-industrial incidents has increased recently; the release of these materials can be accidental or intentional. Since hospitals are at the forefront of confronting Chemical, Biological, Radiological and Nuclear (CBRN) incidents, the goal of the present research was to provide a generic protocol for CBRN incidents through a comparative review of the CBRN protocols and guidelines of different countries and a review of various books, handbooks, and papers. Method: The integrative approach, or research synthesis, was adopted in this study. First, a simple narrative review of programs, books, handbooks, and papers about responses to CBRN incidents in different countries was carried out. Then the most important and functional information was discussed in the form of a generic protocol in focus group sessions and subsequently confirmed. Results: Findings indicated that most countries had various protocols, guidelines, and handbooks for hazardous materials or CBRN incidents. The final outcome of the research synthesis was a 50-page generic protocol whose main topics included an introduction, the definition and classification of CBRN agents, the four major phases of the incident and disaster management cycle, the hospital response management plan, equipment, and recommended supplies and antidotes for decontamination (radiological/nuclear, chemical, biological); each of these also had subtopics.
Conclusion: The majority of international protocols, guidelines, and handbooks, as well as international and Iranian books and papers, emphasize the importance of the incident command system, determining the safety degree of decontamination zones, maps of decontamination zones, the decontamination process, triage classifications, personal protective equipment, and supplies and antidotes for decontamination; these are the minimum requirements for such incidents and are consistent with the provided generic protocol.
Keywords: hospital, CBRN, decontamination, generic protocol, CBRN incidents
Procedia PDF Downloads 295
32125 Analysis of a Discrete-time Geo/G/1 Queue Integrated with (s, Q) Inventory Policy at a Service Facility
Authors: Akash Verma, Sujit Kumar Samanta
Abstract:
This study examines a discrete-time Geo/G/1 queueing-inventory system operating under an (s, Q) inventory policy. Customers are assumed to arrive according to a Bernoulli process. Each customer demands a single item, with arbitrarily distributed service time. The inventory is replenished by an outside supplier, and the lead time for replenishment follows a geometric distribution. There is a single server and infinite waiting space in this facility. Demands must wait in the specified waiting area during a stock-out period. Customers are served on a first-come-first-served basis. With the help of the embedded Markov chain technique, we determine the joint probability distribution of the number of customers in the system and the number of items in stock at the post-departure epoch using the matrix-analytic approach. We relate the system-length distributions at the post-departure and outside observer's epochs to determine the joint probability distribution at the outside observer's epoch. We use the probability distributions at random epochs to determine the waiting-time distribution. We obtain the performance measures needed to construct the cost function. The optimum values of the order quantity and reordering point are found numerically for a variety of model parameters.
Keywords: discrete-time queueing inventory model, matrix analytic method, waiting-time analysis, cost optimization
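A slot-by-slot simulation conveys the dynamics of the system described above. This sketch is not the paper's analytical matrix-analytic solution: it assumes geometric (rather than general) service times, at most one outstanding order, a fixed within-slot event order, and illustrative numeric parameters.

```python
import random

def simulate(p_arr=0.3, p_srv=0.5, p_lead=0.2, s=3, Q=10, slots=100000, seed=7):
    """Discrete-time queue with (s, Q) replenishment: Bernoulli(p_arr)
    arrivals, geometric(p_srv) services that each consume one stocked
    item, and a geometric(p_lead) replenishment lead time."""
    rng = random.Random(seed)
    queue, stock, on_order, area = 0, s + Q, False, 0
    for _ in range(slots):
        if rng.random() < p_arr:                 # Bernoulli arrival
            queue += 1
        if queue > 0 and stock > 0 and rng.random() < p_srv:
            queue -= 1                           # service completion...
            stock -= 1                           # ...consumes one item
        if on_order and rng.random() < p_lead:
            stock += Q                           # replenishment arrives
            on_order = False
        if stock <= s and not on_order:
            on_order = True                      # reorder at level s
        area += queue
    return area / slots                          # time-averaged queue length

avg_queue = simulate()
```

Sweeping `s` and `Q` over a grid and attaching holding, ordering, and waiting costs to the simulated averages would mirror the numerical cost optimization the authors perform analytically.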
Procedia PDF Downloads 43
32124 Pre-Eliminary Design Adjustable Workstation for Piston Assembly Line Considering Anthropometric for Indonesian People
Authors: T. Yuri M. Zagloel, Inaki M. Hakim, Syarafi A. M.
Abstract:
The manufacturing process is considered one of the most important activities in a business process. It correlates with the productivity and quality of the product, so that industries can fulfill customer demand. With increasing demand from customers, industries must improve their manufacturing capability, for example by shortening lead times and reducing waste in their processes. Lean manufacturing is one of the tools for waste elimination in manufacturing or service industries. Workforce development is a lean manufacturing practice that can reduce waste generated by the operator, such as the waste of unnecessary motion. An anthropometric approach is proposed to determine the recommended measurements of the operator's work area. The method obtains a set of dimensions from Indonesian people that relate to the piston workstation. The result of this research is a new design for the work area that takes the ergonomic aspect into account.
Keywords: adjustable, anthropometric, ergonomic, waste
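A common way to turn anthropometric statistics into an adjustability range is to cover the 5th to 95th percentile of the relevant body dimension, assuming it is normally distributed. A minimal sketch; the mean and standard deviation below are hypothetical, not measured Indonesian anthropometric data:

```python
def percentile_range(mean_cm, sd_cm):
    """Adjustability range covering the 5th-95th percentile of a normally
    distributed body dimension (z = 1.645 for the 95th percentile)."""
    z = 1.645
    return mean_cm - z * sd_cm, mean_cm + z * sd_cm

# Hypothetical standing elbow-height statistics for the worker population.
low, high = percentile_range(mean_cm=100.0, sd_cm=5.0)
# the workstation would need roughly 91.8 to 108.2 cm of height adjustment
```

Repeating this for each workstation-relevant dimension (elbow height, reach, eye height) yields the full set of recommended adjustment ranges.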
Procedia PDF Downloads 400
32123 Part Variation Simulations: An Industrial Case Study with an Experimental Validation
Authors: Narendra Akhadkar, Silvestre Cano, Christophe Gourru
Abstract:
Injection-molded parts are widely used in power system protection products. One of the biggest challenges in an injection molding process is the shrinkage and warpage of the molded parts. These geometrical variations may have an adverse effect on product quality, functionality, cost, and time-to-market. The situation becomes more challenging for intricate shapes and in mass production using multi-cavity tools. To control the effects of shrinkage and warpage, it is very important to correctly identify the input parameters that could affect product performance. With advances in computer-aided engineering (CAE), different tools are available to simulate the injection molding process. For our case study, we used the Moldflow Insight tool. Our aim is to predict the spread of the functional dimensions and geometrical variations on the part due to variations in input parameters such as material viscosity, packing pressure, mold temperature, melt temperature, and injection speed. The input parameters may vary during batch production or due to variations in the machine process settings. To perform an accurate product assembly variation simulation, the first step is to perform an individual part variation simulation to render realistic tolerance ranges. In this article, we present a method to simulate part variations arising from input parameter variation during batch production. The method is based on computer simulations and experimental validation using a full factorial design of experiments (DoE). The robustness of the simulation model is verified through an input-parameter-wise sensitivity analysis performed using simulations and experiments; all the results show a very good correlation in the material flow direction. There exists a non-linear interaction between the material and the input process variables.
It is observed that parameters such as packing pressure, material, and mold temperature play an important role in the spread of the functional dimensions and geometrical variations. This method will allow us in the future to develop accurate, realistic virtual prototypes based on trusted simulated process variation and, therefore, to increase product quality and potentially decrease the time to market.
Keywords: correlation, molding process, tolerance, sensitivity analysis, variation simulation
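The full factorial design of experiments mentioned above can be enumerated mechanically. The factor names and levels below are hypothetical stand-ins for the study's actual process settings:

```python
from itertools import product

def full_factorial(levels):
    """Enumerate every run of a full factorial design from a mapping of
    factor names to their levels."""
    names = list(levels)
    return [dict(zip(names, combo)) for combo in product(*levels.values())]

# Hypothetical two-level factors loosely based on the molding study.
runs = full_factorial({
    "packing_pressure_MPa": (60, 80),
    "mold_temp_C": (40, 60),
    "melt_temp_C": (220, 240),
})
# 2^3 = 8 runs covering every factor-level combination
```

Each run dictionary would then drive one simulation (or one molding trial), and comparing the measured dimensions across runs gives the factor-wise sensitivities.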
Procedia PDF Downloads 178
32122 Integrated Evaluation of Green Design and Green Manufacturing Processes Using a Mathematical Model
Authors: Yuan-Jye Tseng, Shin-Han Lin
Abstract:
In this research, a mathematical model for the integrated evaluation of green design and green manufacturing processes is presented. When designing a product, there can be alternative options for designing the detailed components that fulfill the same product requirement. In these design alternatives, the components of the product can be designed with different materials and detailed specifications. If several design alternatives are proposed, the different materials and specifications can affect the manufacturing processes. In this paper, a new concept for integrating green design and green manufacturing processes is presented. A green design can be determined based on the manufacturing processes of the designed product by evaluating green criteria, including energy usage and environmental impact, in addition to the traditional criterion of manufacturing cost. With this concept, a mathematical model is developed to find the green design and the associated green manufacturing processes. In the mathematical model, the cost items include material cost, manufacturing cost, and green-related cost. The green-related cost items include energy cost and environmental cost. The objective is to find the green design and green manufacturing process decisions that minimize the total cost. In practical applications, the decision-making can be used to select a good green design case and its green manufacturing processes. An example product is illustrated, showing that the model is practical and useful for the integrated evaluation of green design and green manufacturing processes.
Keywords: supply chain management, green supply chain, green design, green manufacturing, mathematical model
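The model's cost structure (material plus manufacturing plus green-related energy and environmental costs, minimized over design alternatives) can be sketched as follows; the alternatives and their cost figures are invented for illustration and are not from the paper's example product:

```python
def total_cost(design):
    """Total cost = material + manufacturing + green-related costs
    (energy and environmental), per the cost items of the model."""
    return (design["material"] + design["manufacturing"]
            + design["energy"] + design["environmental"])

# Hypothetical design alternatives with per-unit cost items.
alternatives = {
    "aluminium": {"material": 12.0, "manufacturing": 8.0, "energy": 3.0, "environmental": 2.0},
    "steel":     {"material": 9.0,  "manufacturing": 10.0, "energy": 4.0, "environmental": 3.5},
    "composite": {"material": 15.0, "manufacturing": 6.0, "energy": 2.0, "environmental": 1.0},
}
best = min(alternatives, key=lambda k: total_cost(alternatives[k]))
```

Note how including the green-related items can change the ranking: an alternative with the cheapest material is not necessarily the one with the lowest total cost.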
Procedia PDF Downloads 807
32121 Artificial Generation of Visual Evoked Potential to Enhance Visual Ability
Authors: A. Vani, M. N. Mamatha
Abstract:
Visual signal processing in human beings occurs in the occipital lobe of the brain. The signals generated in the brain are universal across human beings and are called Visual Evoked Potentials (VEP). Generally, visually impaired people lose sight because of severe damage to the eyes' natural photosensors only, while the occipital lobe is still functioning. In this paper, a technique of artificially generating VEP is proposed to enhance the visual ability of the subject. The system uses electrical photoreceptors to capture an image and processes the image to detect and recognize the subject or object. This voltage is further processed and can be transmitted wirelessly to a BioMEMS implanted in the occipital lobe of the patient's brain. The proposed BioMEMS consists of an array of electrodes that generate a neuron potential similar to the VEP of sighted people. Thus, the neurons get the visual data from the BioMEMS, which helps in generating partial vision or sight for the visually challenged patient.
Keywords: BioMEMS, neuro-prosthetic, openvibe, visual evoked potential
Procedia PDF Downloads 315
32120 Effects of Aerodynamic on Suspended Cables Using Non-Linear Finite Element Approach
Authors: Justin Nwabanne, Sam Omenyi, Jeremiah Chukwuneke
Abstract:
This work presents a structural non-linear static analysis of a horizontal taut cable using the Finite Element Analysis (FEA) method. The FEA was first performed analytically to determine the tension at each nodal point, and subsequently computationally, based on the finite element displacement method, using the FEA software ANSYS 14.0 to determine the cable's behaviour under the influence of the aerodynamic forces imposed on it. A convergence procedure is incorporated into the method to prevent excessive displacements during the computations. The work compares the two FEA cases by examining the effectiveness of the analytical model in describing the response with few degrees of freedom and the ability of the adopted non-linear finite element procedure to capture the complex features of cable dynamics under the aerodynamic external influence. The results show that the analytic FEM results without aerodynamic influence exhibit a parabolic response with an optimum deflection at nodal points 12 and 13, with the cable weight at nodes 12 and 13 having the value -1.002936 N, while the cable tension shows an optimum deflection value for nodes 12 and 13 of -189396.97 kg/km. The maximum displacement for the cable system obtained from ANSYS 14.0 was 4483.83 mm for the X, Y, and Z components of displacement at node number 2, while the maximum displacement obtained was 4218.75 mm over all directional components. The dynamic behaviour of the taut cable investigated has application in a typical power transmission line. Aerodynamic influences on the cables, considered via the FEA approach in ANSYS 14.0, showed a complex modal behaviour, as expected.
Keywords: aerodynamics, cable tension and weight, finite element analysis, nodal, non-linear model, optimum deflection, suspended cable, transmission line
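The parabolic response reported for the taut cable is consistent with the classical small-sag approximation, in which a cable under uniform load w and horizontal tension H deflects as y(x) = w * x * (L - x) / (2H), with the maximum w * L^2 / (8H) at midspan. A minimal sketch; the span, load, and tension values are hypothetical, not the study's data:

```python
def sag(x, span_m, w_N_per_m, tension_N):
    """Small-sag parabolic deflection of a taut cable under uniform
    load: y(x) = w * x * (L - x) / (2 * H)."""
    return w_N_per_m * x * (span_m - x) / (2.0 * tension_N)

# Hypothetical transmission-line values: 300 m span, 15 N/m cable
# weight, 30 kN horizontal tension.
span, w, H = 300.0, 15.0, 30000.0
mid_sag = sag(span / 2.0, span, w, H)  # maximum sag w*L^2/(8*H) = 5.625 m
```

Evaluating `sag` at equally spaced nodal points reproduces the parabolic deflection shape that the analytic FEM results exhibit before aerodynamic loading is introduced.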
Procedia PDF Downloads 278