Search results for: process modelling
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 16573

15313 On the Problems of Human Concept Learning within Terminological Systems

Authors: Farshad Badie

Abstract:

The central focus of this article is on the fact that knowledge is constructed from the interaction between humans’ experiences and their conceptions of constructed concepts. The logical characterisation of human inductive learning over humans’ constructed concepts within terminological systems, and the provision of a logical background for theorising over the Human Concept Learning Problem (HCLP) in terminological systems, are the main contributions of this research. This research connects with the topics ‘human learning’, ‘epistemology’, ‘cognitive modelling’, ‘knowledge representation’ and ‘ontological reasoning’.

Keywords: human concept learning, concept construction, knowledge construction, terminological systems

Procedia PDF Downloads 325
15312 Effect of Electromagnetic Fields on Protein Extraction from Shrimp By-Products for Electrospinning Process

Authors: Guido Trautmann-Sáez, Mario Pérez-Won, Vilbett Briones, María José Bugueño, Gipsy Tabilo-Munizaga, Luis Gonzáles-Cavieres

Abstract:

Shrimp by-products are a valuable source of protein. However, traditional protein extraction methods have limitations in terms of their efficiency. In this work, protein extraction from shrimp (Pleuroncodes monodon) industrial by-products was assisted with ohmic heating (OH), microwave (MW) and pulsed electric field (PEF). Extraction was performed by a chemical method (using NaOH and HCl 2M) assisted with OH, MW and PEF in a continuous flow system (5 ml/s). The extracts were characterised by protein determination, differential scanning calorimetry (DSC) and Fourier-transform infrared (FTIR) spectroscopy. Results indicate a 19.25% (PEF), 3.65% (OH) and 28.19% (MW) improvement in protein extraction efficiency. The most efficient method was selected for the electrospinning process and fiber production.

Keywords: electrospinning process, emerging technology, protein extraction, shrimp by-products

Procedia PDF Downloads 89
15311 Pattern Recognition Using Feature Based Die-Map Clustering in the Semiconductor Manufacturing Process

Authors: Seung Hwan Park, Cheng-Sool Park, Jun Seok Kim, Youngji Yoo, Daewoong An, Jun-Geol Baek

Abstract:

As big data analysis becomes increasingly important, yield prediction using data from the semiconductor process is essential. In general, yield prediction and analysis of the causes of failure are closely related. The purpose of this study is to analyze the patterns that affect the final test results using die-map-based clustering. Many studies have been conducted using die data from the semiconductor test process. However, such analyses are limited because the test data are less directly related to the final test results. Therefore, this study proposes a framework for analysis through clustering using more detailed data than the existing die data. The study consists of three phases. In the first phase, a die map is created from the fail-bit data in each sub-area of the die. In the second phase, clustering using the map data is performed. In the third phase, patterns that affect the final test result are identified. Finally, the proposed three steps are applied to actual industrial data, and the experimental results show the potential for field application.
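
A minimal illustration of the second phase, assuming k-means is used as the clustering algorithm (the abstract does not name one) and that each die map from phase one has been summarised into a fixed-length feature vector of fail-bit counts per sub-area; the array names, data and cluster count are hypothetical.

    import numpy as np
    from sklearn.cluster import KMeans

    # Hypothetical input: one row per die, one column per sub-area,
    # each entry a fail-bit count extracted in phase 1.
    rng = np.random.default_rng(0)
    die_maps = rng.poisson(lam=3.0, size=(500, 64)).astype(float)

    # Normalise so clustering reflects the spatial fail pattern, not the total fail count.
    features = die_maps / (die_maps.sum(axis=1, keepdims=True) + 1e-9)

    # Phase 2: group dies with similar fail-bit patterns (the cluster count is an assumption).
    kmeans = KMeans(n_clusters=5, n_init=10, random_state=0).fit(features)

    # Phase 3 would then cross-tabulate cluster labels against final test results.
    print(np.bincount(kmeans.labels_))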

Keywords: die-map clustering, feature extraction, pattern recognition, semiconductor manufacturing process

Procedia PDF Downloads 402
15310 Optimization of Bio-Diesel Production from Rubber Seed Oils

Authors: Pawit Tangviroon, Apichit Svang-Ariyaskul

Abstract:

Rubber seed oil is an attractive alternative feedstock for biodiesel production because it is not part of the food chain. Rubber seed oil contains a large amount of free fatty acids, which causes problems in biodiesel production because free fatty acids can react with the alkaline catalyst. Acid esterification is used as a pre-treatment to convert these unwanted compounds into biodiesel. Phase separation of oil and methanol occurs at a low ratio of methanol to oil and causes a low reaction rate and conversion. Acid esterification therefore requires a large excess of methanol in order to increase the miscibility of methanol in oil and, accordingly, a more expensive separation process. In this work, the kinetics of esterification of rubber seed oil with methanol is developed from available experimental results. A reactive distillation process was designed using the Aspen Plus program. The effects of operating parameters such as feed ratio, molar reflux ratio, feed temperature, and feed stage are investigated in order to find the optimum conditions. Results show that the reactive distillation process performs better than the conventional process: it consumes less feed methanol and less energy while yielding higher product purity. This work can be used as a guideline for further development of industrial-scale biodiesel production using reactive distillation.
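
As a rough illustration of the kinetic side of such a study, the sketch below integrates a simple reversible second-order rate law for acid-catalysed esterification of free fatty acids (FFA) with methanol; the rate constants, initial concentrations and rate form are hypothetical placeholders, not the kinetics developed in the paper, and the Aspen Plus reactive-distillation design itself is not reproduced here.

    import numpy as np
    from scipy.integrate import solve_ivp

    # Hypothetical reversible esterification: FFA + MeOH <-> Ester + H2O
    k_f, k_r = 0.05, 0.01          # assumed forward/reverse rate constants (L/mol/min)

    def rates(t, c):
        ffa, meoh, ester, water = c
        r = k_f * ffa * meoh - k_r * ester * water
        return [-r, -r, r, r]

    c0 = [0.5, 6.0, 0.0, 0.0]       # assumed initial concentrations (mol/L), excess methanol
    sol = solve_ivp(rates, (0.0, 120.0), c0, dense_output=True)

    t = np.linspace(0.0, 120.0, 7)
    ffa_t = sol.sol(t)[0]
    print("FFA conversion after 120 min:", round(1.0 - ffa_t[-1] / c0[0], 3))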

Keywords: biodiesel, reactive distillation, rubber seed oil, transesterification

Procedia PDF Downloads 351
15309 A Learning Process for Aesthetics of Language in Thai Poetry for High School Teachers

Authors: Jiraporn Adchariyaprasit

Abstract:

The aesthetics of language in Thai poetry emerge from the combination of sounds and meanings. The appreciation of such beauty can be achieved by means of education, acquisition of knowledge, and training. This research aims to study the learning process of the aesthetics of language in Thai poetry for high school teachers in Bangkok and nearby provinces. Ten participants were selected by purposive sampling for in-depth interviews. According to the research, there are four patterns in the learning process of the aesthetics of language in Thai poetry: 1) the study of the characteristics and patterns of poetry, 2) training in poetic reading, 3) the study of the social and cultural contexts of a poem’s creation, and 4) the study of other disciplines related to poetry, such as linguistics, traditional dance, and so on.

Keywords: aesthetics, poetry, Thai poetry, poetry learning

Procedia PDF Downloads 436
15308 Ensemble Machine Learning Approach for Estimating Missing Data from CO₂ Time Series

Authors: Atbin Mahabbati, Jason Beringer, Matthias Leopold

Abstract:

To address the global challenges of climate and environmental changes, there is a need for quantifying and reducing uncertainties in environmental data, including observations of carbon, water, and energy. Global eddy covariance flux tower networks (FLUXNET) and their regional counterparts (i.e., OzFlux, AmeriFlux, China Flux, etc.) were established in the late 1990s and early 2000s to address this demand. Despite the capability of eddy covariance to validate process modelling analyses, field surveys and remote sensing assessments, there are serious concerns regarding the challenges associated with the technique, e.g. data gaps and uncertainties. To address these concerns, this research has developed an ensemble model to fill the data gaps in CO₂ flux, avoiding the limitations of using a single algorithm and therefore providing lower error and reduced uncertainty in the gap-filling process. In this study, data from five towers in the OzFlux Network (Alice Springs Mulga, Calperum, Gingin, Howard Springs and Tumbarumba) during 2013 were used to develop an ensemble machine learning model, using five feedforward neural networks (FFNN) with different structures combined with an eXtreme Gradient Boosting (XGB) algorithm. The former, the FFNNs, provided the primary estimations in the first layer, while the latter, XGB, used the outputs of the first layer as its input to provide the final estimations of CO₂ flux. The introduced model showed slight superiority over each single FFNN and the XGB when each of these two methods was used individually, with overall RMSEs of 2.64, 2.91, and 3.54 g C m⁻² yr⁻¹, respectively (3.54 provided by the best FFNN). The most significant improvement occurred in the estimation of the extreme diurnal values (during midday and sunrise), as well as nocturnal estimations, which are generally considered among the most challenging parts of CO₂ flux gap-filling. The towers, as well as seasonality, showed different levels of sensitivity to the improvements provided by the ensemble model. For instance, Tumbarumba showed more sensitivity compared to Calperum, where the differences between the ensemble model on the one hand and the FFNNs and XGB on the other were the smallest of all five sites. Besides, the performance difference between the ensemble model and its components used individually was more significant during the warm season (Jan, Feb, Mar, Oct, Nov, and Dec) compared to the cold season (Apr, May, Jun, Jul, Aug, and Sep) due to the higher amount of photosynthesis, which led to a larger range of CO₂ exchange. In conclusion, the introduced ensemble model slightly improved the accuracy of CO₂ flux gap-filling and the robustness of the model. Therefore, ensemble machine learning models are potentially capable of improving data estimation and regression outcomes when there seems to be no more room for improvement using a single algorithm.
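
A compact sketch of the two-layer stacking idea described above, assuming scikit-learn's MLPRegressor stands in for the five FFNNs and the xgboost package for the XGB layer; the synthetic data, feature count and network sizes are placeholders rather than the study's actual drivers or configuration.

    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.model_selection import train_test_split
    from xgboost import XGBRegressor

    # Synthetic stand-in for meteorological drivers (X) and CO2 flux (y).
    rng = np.random.default_rng(1)
    X = rng.normal(size=(2000, 6))
    y = X @ rng.normal(size=6) + 0.3 * rng.normal(size=2000)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)

    # First layer: several FFNNs with different structures (sizes are assumptions).
    ffnns = [MLPRegressor(hidden_layer_sizes=h, max_iter=2000, random_state=1).fit(X_tr, y_tr)
             for h in [(16,), (32,), (16, 16), (32, 16), (64,)]]
    stack_tr = np.column_stack([m.predict(X_tr) for m in ffnns])
    stack_te = np.column_stack([m.predict(X_te) for m in ffnns])

    # Second layer: XGB takes the first-layer outputs as input and gives the final estimate.
    # (A real stacking setup would train on out-of-fold first-layer predictions.)
    xgb = XGBRegressor(n_estimators=200, max_depth=3).fit(stack_tr, y_tr)
    rmse = np.sqrt(np.mean((xgb.predict(stack_te) - y_te) ** 2))
    print("ensemble RMSE on the synthetic test split:", round(float(rmse), 3))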

Keywords: carbon flux, Eddy covariance, extreme gradient boosting, gap-filling comparison, hybrid model, OzFlux network

Procedia PDF Downloads 139
15307 Development of Market Penetration for High Energy Efficiency Technologies in Alberta’s Residential Sector

Authors: Saeidreza Radpour, Md. Alam Mondal, Amit Kumar

Abstract:

Market penetration of high energy efficiency technologies has key impacts on energy consumption and GHG mitigation. It is also useful for managing the policies formulated by public or private organizations to achieve energy or environmental targets. Energy intensity in Alberta's residential sector was 148.8 GJ per household in 2012, which is 39% more than the Canadian average of 106.6 GJ and the highest per-household energy consumption among the provinces. Appliance energy intensity in Alberta was 15.3 GJ per household in 2012, 14% higher than the average of the other provinces and territories in Canada. In this research, a framework has been developed to analyze the market penetration and market share of high energy efficiency technologies in the residential sector. The overall methodology was based on the development of data-intensive models to estimate the market penetration of appliances in the residential sector over a time period. The developed models were a function of a number of macroeconomic and technical parameters. The mathematical equations were developed based on twenty-two years of historical data (1990-2011). The models were analyzed through a series of statistical tests. The market shares of high efficiency appliances were estimated based on related variables such as capital and operating costs, discount rate, appliance lifetime, annual interest rate, incentives and maximum achievable efficiency over the period 2015 to 2050. Results show that the market penetration of refrigerators is higher than that of other appliances. The stock of refrigerators per household is anticipated to increase from 1.28 in 2012 to 1.314 and 1.328 in 2030 and 2050, respectively. Modelling results show that the market penetration rate of stand-alone freezers will decrease between 2012 and 2050; freezer stock per household will decline from 0.634 in 2012 to 0.556 and 0.515 in 2030 and 2050, respectively. The stock of dishwashers per household is expected to increase from 0.761 in 2012 to 0.865 and 0.960 in 2030 and 2050, respectively. The increase in the market penetration rate of clothes washers and clothes dryers is nearly parallel: the stock of clothes washers and clothes dryers per household is expected to rise from 0.893 and 0.979 in 2012 to 0.960 and 1.0 in 2050, respectively. This presentation will include a detailed discussion of the modelling methodology and results.
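
The abstract does not spell out its market-share formula; the sketch below only illustrates one common way appliance-stock models allocate market share, using an annualized life-cycle cost with a logit-type weighting. All numbers, the cost formula and the weighting exponent are hypothetical and are not the study's values.

    # Hypothetical market-share allocation between a standard and a high-efficiency appliance.
    def annualized_cost(capital, operating, discount_rate, lifetime):
        # Capital recovery factor spreads the purchase price over the appliance lifetime.
        crf = discount_rate / (1.0 - (1.0 + discount_rate) ** -lifetime)
        return capital * crf + operating

    def market_shares(costs, v=8.0):
        # Logit-type weighting: cheaper options get a larger share; v controls sensitivity.
        weights = [c ** -v for c in costs]
        total = sum(weights)
        return [w / total for w in weights]

    standard = annualized_cost(capital=900.0, operating=65.0, discount_rate=0.08, lifetime=15)
    efficient = annualized_cost(capital=1150.0, operating=40.0, discount_rate=0.08, lifetime=15)
    print(market_shares([standard, efficient]))   # e.g. share captured by the efficient unit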

Keywords: appliances efficiency improvement, energy star, market penetration, residential sector

Procedia PDF Downloads 285
15306 Modelling and Optimization of a Combined Sorption Enhanced Biomass Gasification with Hydrothermal Carbonization, Hot Gas Cleaning and Dielectric Barrier Discharge Plasma Reactor to Produce Pure H₂ and Methanol Synthesis

Authors: Vera Marcantonio, Marcello De Falco, Mauro Capocelli, Álvaro Amado-Fierro, Teresa A. Centeno, Enrico Bocci

Abstract:

Concerns about energy security, energy prices, and climate change have led scientific research towards sustainable alternatives to fossil fuels, such as renewable energy sources coupled with hydrogen as an energy vector and carbon capture and conversion technologies. Among the technologies investigated in recent decades, biomass gasification has acquired great interest owing to the possibility of low-cost, CO₂-negative hydrogen production from a large variety of widely available organic wastes. Upstream and downstream treatments were then studied in order to maximize hydrogen yield, reduce the content of organic and inorganic contaminants below the admissible levels for the coupled technologies, and capture and convert carbon dioxide. However, studies analysing a whole process made up of all those technologies are still missing. In order to fill this gap, the present paper investigates the coexistence of hydrothermal carbonization (HTC), sorption enhanced gasification (SEG), hot gas cleaning (HGC), and CO₂ conversion by a dielectric barrier discharge (DBD) plasma reactor for H₂ production from biomass waste by means of the Aspen Plus software. The proposed model aims to identify and optimise the performance of the plant by varying operating parameters (such as temperature, CaO/biomass ratio, separation efficiency, etc.). The carbon footprint of the global plant is 2.3 kg CO₂/kg H₂, lower than the latest limit value imposed by the European Commission for hydrogen to be considered “clean”, which was set at 3 kg CO₂/kg H₂. The hydrogen yield referred to the whole plant is 250 g H₂/kg biomass.
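
The two headline figures above can be combined in a simple unit-conversion check (illustrative arithmetic on the abstract's own numbers, not part of the flowsheet model): at 250 g H₂ per kg of biomass and 2.3 kg CO₂ per kg H₂, the plant emits roughly 0.58 kg CO₂ per kg of biomass processed.

    # Combining the quoted plant figures (illustrative arithmetic only).
    h2_yield = 0.250           # kg H2 produced per kg biomass
    footprint = 2.3            # kg CO2 emitted per kg H2
    co2_per_kg_biomass = h2_yield * footprint
    print(round(co2_per_kg_biomass, 3), "kg CO2 per kg biomass")   # ~0.575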

Keywords: biomass gasification, hydrogen, Aspen Plus, sorption enhanced gasification

Procedia PDF Downloads 78
15305 Treatment of Healthcare Wastewater Using The Peroxi-Photoelectrocoagulation Process: Predictive Models for Chemical Oxygen Demand, Color Removal, and Electrical Energy Consumption

Authors: Samuel Fekadu A., Esayas Alemayehu B., Bultum Oljira D., Seid Tiku D., Dessalegn Dadi D., Bart Van Der Bruggen A.

Abstract:

The peroxi-photoelectrocoagulation process was evaluated for the removal of chemical oxygen demand (COD) and color from healthcare wastewater. A 2-level full factorial design with center points was created to investigate the effect of the process parameters, i.e., initial COD, H₂O₂, pH, reaction time and current density. Furthermore, the total energy consumption and average current efficiency of the system were evaluated. Predictive models for % COD removal, % color removal and energy consumption were obtained. The initial COD and pH were found to be the most significant variables in the reduction of COD and color in the peroxi-photoelectrocoagulation process. Hydrogen peroxide only has a significant effect on the treated wastewater when combined with other input variables in the process, such as pH, reaction time and current density. In the peroxi-photoelectrocoagulation process, current density appears not as a single effect but rather as an interaction effect with H₂O₂ in reducing COD and color. Lower energy expenditure was observed at higher initial COD, shorter reaction time and lower current density. The average current efficiency was found to be as low as 13% and as high as 777%. Overall, the study showed that hybrid electrochemical oxidation can be applied effectively and efficiently for the removal of pollutants from healthcare wastewater.
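
For readers unfamiliar with 2-level factorial modelling, the sketch below fits a main-effects-plus-interactions regression of the kind such designs produce, using coded factor levels (-1/+1) for three of the factors named above; the response values are invented for illustration and are not the study's data.

    import numpy as np
    from itertools import combinations

    # Coded 2-level design in three factors (e.g. pH, reaction time, current density).
    levels = np.array([[-1, -1, -1], [1, -1, -1], [-1, 1, -1], [1, 1, -1],
                       [-1, -1, 1], [1, -1, 1], [-1, 1, 1], [1, 1, 1]], dtype=float)
    y = np.array([41, 48, 55, 63, 44, 52, 60, 71], dtype=float)   # invented % COD removals

    # Build columns: intercept, main effects, and two-factor interactions.
    cols = [np.ones(len(levels))] + [levels[:, i] for i in range(3)]
    cols += [levels[:, i] * levels[:, j] for i, j in combinations(range(3), 2)]
    X = np.column_stack(cols)

    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    print("intercept, main effects, interactions:", np.round(coef, 2))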

Keywords: electrochemical oxidation, UV, healthcare pollutants removals, factorial design

Procedia PDF Downloads 79
15304 PID Control of Quad-Rotor Unmanned Vehicle Based on Lagrange Approach Modelling

Authors: A. Benbouali, H. Saidi, A. Derrouazin, T. Bessaad

Abstract:

Aerial robotics is a very exciting research field dealing with a variety of subjects, including the attitude control. This paper deals with the control of a four rotor vertical take-off and landing (VTOL) Unmanned Aerial Vehicle. The paper presents a mathematical model based on the approach of Lagrange for the flight control of an autonomous quad-rotor. It also describes the controller architecture which is based on PID regulators. The control method has been simulated in closed loop in different situations. All the calculation stages and the simulation results have been detailed.
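
The paper's Lagrange-based model and full controller are not reproduced here; the sketch below only illustrates the PID regulator structure on a simplified single-axis (altitude) model, with assumed gains and vehicle parameters rather than the paper's values.

    class PID:
        def __init__(self, kp, ki, kd, dt):
            self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
            self.integral, self.prev_err = 0.0, 0.0

        def update(self, err):
            self.integral += err * self.dt
            deriv = (err - self.prev_err) / self.dt
            self.prev_err = err
            return self.kp * err + self.ki * self.integral + self.kd * deriv

    # Simplified altitude loop: m*z'' = u - m*g (assumed mass and gains).
    m, g, dt = 1.2, 9.81, 0.01
    pid = PID(kp=6.0, ki=1.5, kd=4.0, dt=dt)
    z, z_dot, target = 0.0, 0.0, 1.0
    for _ in range(int(5.0 / dt)):
        u = m * g + pid.update(target - z)        # thrust command around hover
        z_dot += (u / m - g) * dt
        z += z_dot * dt
    print("altitude after 5 s:", round(z, 3))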

Keywords: quad-rotor, Lagrange approach, proportional integral derivative (PID) controller, Matlab/Simulink

Procedia PDF Downloads 400
15303 Influence of [Emim][OAc] and Water on Gelatinization Process and Interactions with Starch

Authors: Shajaratuldur Ismail, Nurlidia Mansor, Zakaria Man

Abstract:

Thermoplastic starch (TPS) plasticized by 1-ethyl-3-methylimidazolium acetate [Emim][OAc] was obtained through a gelatinization process. The gelatinization process occurred in the presence of water and [Emim][OAc] as plasticizer at high temperature (90˚C). The influence of [Emim][OAc] and water on the gelatinization and on the interactions with starch has been studied over a range of compositions. A homogeneous mass was obtained for the samples containing 35, 40 and 43.5% water, which showed that water plays an important role in the gelatinization process. Detailed IR spectroscopy analysis showed a decrease in hydrogen bonding intensity and a strong interaction between the acetate anion in [Emim][OAc] and the starch hydroxyl groups in the presence of [Emim][OAc]. The starch-[Emim][OAc]-water mixture at 10-3-8.7 presented a homogeneous mass, lower hydrogen bonding intensity and a strong interaction between the acetate anion in [Emim][OAc] and the starch hydroxyl groups.

Keywords: starch, ionic liquid, 1-ethyl-3-methylimidazolium acetate, plasticizer, gelatinization, IR spectroscopy

Procedia PDF Downloads 229
15302 A Generalisation of Pearson's Curve System and Explicit Representation of the Associated Density Function

Authors: S. B. Provost, Hossein Zareamoghaddam

Abstract:

A univariate density approximation technique whereby the derivative of the logarithm of a density function is assumed to be expressible as a rational function is introduced. This approach which extends Pearson’s curve system is solely based on the moments of a distribution up to a determinable order. Upon solving a system of linear equations, the coefficients of the polynomial ratio can readily be identified. An explicit solution to the integral representation of the resulting density approximant is then obtained. It will be explained that when utilised in conjunction with sample moments, this methodology lends itself to the modelling of ‘big data’. Applications to sets of univariate and bivariate observations will be presented.
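
A brief symbolic sketch of the stated approach (the polynomial orders p and q and the boundary-term assumptions are generic here, not those determined in the paper): the log-density derivative is taken to be a rational function, and integrating against powers of x turns the moment conditions into linear equations for the coefficients.

    \frac{d}{dx}\,\ln f(x) = \frac{\sum_{i=0}^{p} a_i x^{i}}{\sum_{j=0}^{q} b_j x^{j}}
    \quad\Longrightarrow\quad
    f'(x)\sum_{j=0}^{q} b_j x^{j} = f(x)\sum_{i=0}^{p} a_i x^{i}.

Multiplying both sides by x^k, integrating over the support, and integrating the left-hand side by parts (assuming the boundary terms vanish) gives, with \mu_m = E[X^m],

    -\sum_{j=0}^{q} (k+j)\, b_j\, \mu_{k+j-1} = \sum_{i=0}^{p} a_i\, \mu_{k+i},
    \qquad k = 0, 1, 2, \ldots,

a linear system in the a_i and b_j; Pearson's curve system is recovered for p = 1, q = 2.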

Keywords: density estimation, log-density, moments, Pearson's curve system

Procedia PDF Downloads 280
15301 Composing Method of Decision-Making Function for Construction Management Using Active 4D/5D/6D Objects

Authors: Hyeon-Seung Kim, Sang-Mi Park, Sun-Ju Han, Leen-Seok Kang

Abstract:

As BIM (Building Information Modeling) application continually expands, the visual simulation techniques used for facility design and construction process information are becoming increasingly advanced and diverse. For building structures, BIM application is design-oriented, utilizing 3D objects for conflict management, whereas for civil engineering structures, the usability of nD object-oriented construction stage simulation is important in construction management. Simulations of 5D and 6D objects, in which cost and resources are linked along with the process simulation in 4D objects, are commonly used, but they do not provide a decision-making function for process management problems that occur on site because they mostly focus on the visual representation of the current status of process information. In this study, an nD CAD system is constructed that facilitates an optimized schedule simulation that minimizes process conflict, a construction duration reduction simulation according to execution progress status, an optimized process plan simulation according to project cost change by year, and an optimized resource simulation for field resource mobilization capability. Through this system, the usability of conventional simple simulation objects is expanded to that of active simulation objects with which decision-making is possible. Furthermore, to close the gap between field process situations and planned 4D process objects, a technique is developed to facilitate a comparative simulation through the coordinated synchronization of an actual video object acquired by an on-site web camera and a VR concept 4D object. This synchronization and simulation technique can also be applied to smartphone video objects captured in the field in order to increase the usability of the 4D object. Because yearly project costs change frequently in civil engineering construction, an annual process plan should be recomposed appropriately according to project cost decreases/increases compared with the plan. In the 5D CAD system provided in this study, an active 5D object utilization concept is introduced to perform a simulation in an optimized process planning state by finding a process optimized for the changed project cost without changing the construction duration through a technique such as a genetic algorithm. Furthermore, in resource management, an active 6D object utilization function is introduced that can analyze and simulate an optimized process plan within the possible scope of moving resources, by considering those resources that can be moved under a given field condition, instead of using a simple resource change simulation by schedule. The introduction of an active BIM function is expected to increase the field utilization of conventional nD objects.
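
As a generic illustration of the "technique such as a genetic algorithm" mentioned above (not the system's actual implementation), the toy GA below selects which activities to execute within a revised yearly budget while maximizing earned progress; the activities, costs and fitness rule are invented.

    import random

    random.seed(0)
    # Invented activities: (progress contribution, cost); pick a subset within the revised yearly budget.
    activities = [(random.uniform(1, 10), random.uniform(5, 40)) for _ in range(20)]
    budget = 200.0

    def fitness(bits):
        progress = sum(p for b, (p, c) in zip(bits, activities) if b)
        cost = sum(c for b, (p, c) in zip(bits, activities) if b)
        return progress if cost <= budget else 0.0   # infeasible plans score zero

    def evolve(pop_size=40, generations=60, mutation=0.05):
        pop = [[random.randint(0, 1) for _ in activities] for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=fitness, reverse=True)
            parents = pop[: pop_size // 2]
            children = []
            while len(children) < pop_size - len(parents):
                a, b = random.sample(parents, 2)
                cut = random.randrange(1, len(activities))
                child = a[:cut] + b[cut:]                       # one-point crossover
                child = [1 - g if random.random() < mutation else g for g in child]
                children.append(child)
            pop = parents + children
        return max(pop, key=fitness)

    best = evolve()
    print("best feasible progress within budget:", round(fitness(best), 2))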

Keywords: 4D, 5D, 6D, active BIM

Procedia PDF Downloads 276
15300 The Study of Spray Drying Process for Skimmed Coconut Milk

Authors: Jaruwan Duangchuen, Siwalak Pathaveerat

Abstract:

Coconut (Cocos nucifera) belongs to the family Arecaceae. Coconut juice and meat are consumed as food and dessert in several regions of the world. Coconut juice is low in protein, and arginine is its main amino acid. Coconut meat is the endosperm of the coconut and has nutritional value; it is composed of carbohydrate, protein and fat. The objective of this study is to utilize by-products from the virgin coconut oil extraction process by converting the skimmed coconut milk into a powder. The skimmed coconut milk was separated from the coconut milk in the virgin coconut oil extraction process and consists of approximately 6.4% protein, 7.2% carbohydrate, 0.27% dietary fiber, 6.27% sugar, 3.6% fat and 86.93% moisture. This skimmed coconut milk can be made into a powder for a value-added product by spray drying. The factors affecting the yield and properties of dry skimmed coconut milk in the spray drying process are the inlet air temperature, the outlet air temperature and the maltodextrin concentration. Maltodextrin contents (15, 20%), outlet air temperatures (80 ºC, 85 ºC, 90 ºC) and inlet air temperatures (190 ºC, 200 ºC, 210 ºC) were tested in the skimmed coconut milk spray drying process. The spray dryer air flow rate was kept at 0.2698 m³/s. Moisture content (2.22-3.23%), bulk density (0.4-0.67 g/mL), solubility in water, wettability (4.04-19.25 min), color and particle size were analyzed for the powder samples. The maximum yield (18.00%) of spray-dried coconut milk powder was obtained at an inlet temperature of 210 °C, an outlet temperature of 80 °C and 20% maltodextrin, with a drying time of 27.27 seconds. Amino acid analysis by HPLC (UV detector) showed that the most abundant amino acids are glutamine (16.28%), arginine (10.32%) and glycine (9.59%).

Keywords: skimmed coconut milk, spray drying, virgin coconut oil process (VCO), maltodextrin

Procedia PDF Downloads 332
15299 Finite Element Modelling for the Development of a Planar Ultrasonic Dental Scaler for Prophylactic and Periodontal Care

Authors: Martin Hofmann, Diego Stutzer, Thomas Niederhauser, Juergen Burger

Abstract:

Dental biofilm is the main etiologic factor for caries and for periodontal and peri-implant infections. In addition to the risk of tooth loss, periodontitis is also associated with an increased risk of systemic diseases such as atherosclerotic cardiovascular disease and diabetes. For this reason, dental hygienists use ultrasonic scalers for prophylactic and periodontal care of the teeth. However, current instruments are limited in their dimensions and operating frequencies. The innovative design of a planar ultrasonic transducer introduces a new type of dental scaler. The flat titanium-based design allows the mass to be significantly reduced compared to a conventional screw-mounted Langevin transducer, resulting in a more efficient and controllable scaler. For the development of the novel device, multi-physics finite element analysis was used to simulate and optimise various design concepts. This process was supported by prototyping and electromechanical characterisation. The feasibility and potential of a planar ultrasonic transducer have already been confirmed by our current prototypes, which achieve higher performance compared to commercial devices. Operating at the desired resonance frequency of 28 kHz with a driving voltage of 40 Vrms results in an in-plane tip oscillation with a displacement amplitude of up to 75 μm, with less than 8% out-of-plane movement and an energy transformation factor of 1.07 μm/mA. In a further step, we will adapt the design to two additional resonance frequencies (20 and 40 kHz) to obtain information about the most suitable mode of operation. In addition to the already integrated characterization methods, we will evaluate the clinical efficiency of the different devices in an in vitro setup with an artificial biofilm pocket model.

Keywords: ultrasonic instrumentation, ultrasonic scaling, piezoelectric transducer, finite element simulation, dental biofilm, dental calculus

Procedia PDF Downloads 122
15298 Characterization and Modelling of Groundwater Flow towards a Public Drinking Water Well Field: A Case Study of Ter Kamerenbos Well Field

Authors: Buruk Kitachew Wossenyeleh

Abstract:

Groundwater is the largest freshwater reservoir in the world. Like the other reservoirs of the hydrologic cycle, it is a finite resource. This study focused on groundwater modeling of the Ter Kamerenbos well field to understand the groundwater flow system and the impact of different scenarios. The study area covers 68.9 km² in the Brussels Capital Region and is situated in two river catchments, i.e., the Zenne River and the Woluwe Stream. The aquifer system has three layers, but in the modeling they are considered as one layer due to their hydrogeological properties. The catchment aquifer system is replenished by direct recharge from rainfall. The groundwater recharge of the catchment is determined using the spatially distributed water balance model WetSpass, and it varies annually from zero to 340 mm. This groundwater recharge is used as the top boundary condition for the groundwater modeling of the study area. During the groundwater modeling using Processing MODFLOW, constant head boundary conditions are used at the north and south boundaries of the study area, and head-dependent flow boundary conditions at the east and west boundaries. The groundwater model is calibrated manually and automatically using observed hydraulic heads in 12 observation wells. The model performance evaluation showed that the root mean square error is 1.89 m and the NSE is 0.98. The head contour map of the simulated hydraulic heads indicates the flow direction in the catchment, mainly from the Woluwe to the Zenne catchment. The simulated head in the study area varies from 13 m to 78 m. The higher hydraulic heads are found in the southwest of the study area, which has forest as the land-use type. This calibrated model was run for a climate change scenario and a well operation scenario. Climate change may cause the groundwater recharge to increase by 43% or decrease by 30% in 2100 relative to current conditions for the high and low climate change scenarios, respectively. The groundwater head varies from 13 m to 82 m for the high climate change scenario, whereas for the low climate change scenario it varies from 13 m to 76 m. If doubling of the pumping discharge is assumed, the groundwater head varies from 13 m to 76.5 m; if a shutdown of the pumps is assumed, the head varies in the range of 13 m to 79 m. It is concluded that the groundwater model is done in a satisfactory way with some limitations, and the model output can be used to understand the aquifer system under steady-state conditions. Finally, some recommendations are made for the future use and improvement of the model.
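
The two calibration statistics quoted above are simple to reproduce; the sketch below computes the RMSE and the Nash-Sutcliffe efficiency (NSE) from observed and simulated heads, using invented values for the 12 observation wells rather than the study's data.

    import numpy as np

    def rmse(obs, sim):
        return float(np.sqrt(np.mean((np.asarray(sim) - np.asarray(obs)) ** 2)))

    def nse(obs, sim):
        obs, sim = np.asarray(obs, float), np.asarray(sim, float)
        return float(1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2))

    # Invented heads (m) at 12 observation wells.
    observed  = [14.2, 21.5, 33.0, 45.8, 52.1, 60.3, 63.9, 68.4, 70.2, 73.5, 75.8, 77.6]
    simulated = [15.0, 20.1, 34.2, 44.9, 53.8, 58.7, 65.1, 67.2, 71.9, 72.8, 77.3, 76.5]
    print("RMSE (m):", round(rmse(observed, simulated), 2), "NSE:", round(nse(observed, simulated), 3))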

Keywords: Ter Kamerenbos, groundwater modelling, WetSpass, climate change, well operation

Procedia PDF Downloads 152
15297 Developing Indicators in System Mapping Process Through Science-Based Visual Tools

Authors: Cristian Matti, Valerie Fowles, Eva Enyedi, Piotr Pogorzelski

Abstract:

The system mapping process can be defined as a knowledge service where a team of facilitators, experts and practitioners facilitate a guided conversation, enable the exchange of information and support an iterative curation process. System mapping processes rely on science-based tools to introduce and simplify a variety of components and concepts of socio-technical systems through metaphors while facilitating an interactive dialogue process to enable the design of co-created maps. System maps work then as “artifacts” to provide information and focus the conversation into specific areas around the defined challenge and related decision-making process. Knowledge management facilitates the curation of that data gathered during the system mapping sessions through practices of documentation and subsequent knowledge co-production for which common practices from data science are applied to identify new patterns, hidden insights, recurrent loops and unexpected elements. This study presents empirical evidence on the application of these techniques to explore mechanisms by which visual tools provide guiding principles to portray system components, key variables and types of data through the lens of climate change. In addition, data science facilitates the structuring of elements that allow the analysis of layers of information through affinity and clustering analysis and, therefore, develop simple indicators for supporting the decision-making process. This paper addresses methodological and empirical elements on the horizontal learning process that integrate system mapping through visual tools, interpretation, cognitive transformation and analysis. The process is designed to introduce practitioners to simple iterative and inclusive processes that create actionable knowledge and enable a shared understanding of the system in which they are embedded.

Keywords: indicators, knowledge management, system mapping, visual tools

Procedia PDF Downloads 195
15296 Evaluation of Traditional Methods in Construction and Their Effects on Reinforced-Concrete Buildings Behavior

Authors: E. H. N. Gashti, M. Zarrini, M. Irannezhad, J. R. Langroudi

Abstract:

Using ETABS software, this study analyzed 23 buildings to evaluate the effects of mistakes during the construction phase on the buildings' structural behavior. For modelling, two different loadings were assumed: 1) design loading and 2) loading due to the effects of mistakes in the construction phase. The results determined that the use of traditional construction methods resulted in a significant increase in dead loads and consequently intensified the displacements and base shears of the buildings under seismic loads.

Keywords: reinforced-concrete buildings, construction mistakes, base-shear, displacements, failure

Procedia PDF Downloads 270
15295 Effects of Strain-Induced Melt Activation Process on the Structure and Morphology Mg₂Si in Al-15%Mg₂Si Composite

Authors: Reza Eslami-Farsani, Mohammad Alipour

Abstract:

The effect of deformation on the semisolid microstructure and degree of globularity of an Al–15%Mg₂Si composite produced by the strain induced melt activation (SIMA) process was studied. A deformation of 25% was used. After deformation, the samples were heated to a temperature above the solidus and below the liquidus point and maintained under isothermal conditions at three different temperatures (560, 580 and 595 °C) for varying times (5, 10, 20 and 40 min). The microstructural study was carried out on the alloy by optical microscopy. It was observed that strain-induced deformation and subsequent melt activation caused a globular morphology of the Mg₂Si particles. The results showed that, for the desired microstructure of the alloy during the SIMA process, the optimum temperature and time are 595 °C and 40 min, respectively.

Keywords: deformation, semisolid, SIMA, Mg₂Si phase, modification

Procedia PDF Downloads 282
15294 Numerical Analysis of Wire Laser Additive Manufacturing for Low Carbon Steels

Authors: Juan Manuel Martinez Alvarez, Michele Chiumenti

Abstract:

This work explores the benefit of thermo-metallurgical simulation to tackle the Wire Laser Additive Manufacturing (WLAM) of low-carbon steel components. The finite element analysis is calibrated by process monitoring via thermal imaging and thermocouple measurements, to study the complex thermo-metallurgical behavior inherent to the WLAM process of low carbon steel parts. A critical aspect is the analysis of the heterogeneity in the resulting microstructure. This heterogeneity depends on both the thermal history and the residual stresses experienced during the WLAM process. Because low carbon grades are highly sensitive to quenching, a high-gradient microstructure often arises due to the layer-by-layer metal deposition in WLAM. The different phases have been identified by scanning electron microscopy. A clear influence of the heterogeneities on the final mechanical performance has been established by the subsequent mechanical characterization. The thermo-metallurgical analysis has been used to determine the actual thermal history and the corresponding thermal gradients during the printing process, and the correlation between the thermo-mechanical evolution, the printing parameters and the scanning sequence has been established. Therefore, an enhanced printing strategy, including an optimized process window, has been used to minimize the microstructure heterogeneity at ArcelorMittal.

Keywords: additive manufacturing, numerical simulation, metallurgy, steel

Procedia PDF Downloads 71
15293 A Decentralized Application for Secure Data Handling of Wireless Networks Using Ethereum Smart Contracts

Authors: Midhun Xavier

Abstract:

This paper introduces a method to verify multi-agent systems in industrial control systems using blockchain technology. The proposed solution makes it possible to record and verify each process that occurs while generating a customized product using Ethereum-based smart contracts. Node-RED software agents are developed with the help of semantic web technologies, and these software agents interact with IEC 61499 function blocks to execute the processes. The agent associated with each mechatronic component and its controller can communicate with the blockchain to record the various events that occur during each process, and the smart contract then helps to verify the process order of the customized product.

Keywords: blockchain, Ethereum, node-red, IEC 61499, multi-agent system, MQTT

Procedia PDF Downloads 94
15292 Technologies of Isolation and Separation of Anthraquinone Derivatives

Authors: Dmitry Yu. Korulkin, Raissa A. Muzychkina

Abstract:

In this review, generalized data on different methods of extraction, separation and purification of natural and modified anthraquinones are presented. The basic regularities of the isolation process are analyzed. The effects of temperature, pH, extractant polarity, catalysts and other factors on the isolation process are revealed.

Keywords: anthraquinones, isolation, extraction, polarity, chromatography, precipitation, bioactivity, phytopreparation, chrysophanol, aloe-emodin, emodin, physcion

Procedia PDF Downloads 341
15291 Additive Friction Stir Manufacturing Process: Interest in Understanding Thermal Phenomena and Numerical Modeling of the Temperature Rise Phase

Authors: Antoine Lauvray, Fabien Poulhaon, Pierre Michaud, Pierre Joyot, Emmanuel Duc

Abstract:

Additive Friction Stir Manufacturing (AFSM) is a new industrial process that follows the emergence of friction-based processes. The AFSM process is a solid-state additive process using the energy produced by friction at the interface between a rotating non-consumable tool and a substrate. Friction depends on various parameters such as axial force, rotation speed and friction coefficient. The feeder material is a metallic rod that flows through a hole in the tool. Unlike Friction Stir Welding (FSW), where abundant literature exists and addresses many aspects from process implementation to characterization and modeling, there are still few research works focusing on AFSM. Therefore, there is still a lack of understanding of the physical phenomena taking place during the process. This research work aims at a better understanding and implementation of the AFSM process, thanks to numerical simulation and experimental validation performed on a prototype effector. Such an approach is considered a promising way to study the influence of the process parameters and finally identify a relevant process window. The deposition of material through the AFSM process takes place in several phases; in chronological order, these are the docking phase, the dwell time phase, the deposition phase, and the removal phase. The present work focuses on the dwell time phase, which produces the temperature rise of the system composed of the tool, the filler material, and the substrate through pure friction. Analytic modeling of the heat generation based on friction considers the rotational speed and the contact pressure as main parameters. Another parameter considered influential is the friction coefficient, assumed to be variable due to the self-lubrication of the system as temperature rises or to the smoothing of the roughness of the materials in contact over time. This study proposes, through numerical modeling followed by experimental validation, to question the influence of the various input parameters on the dwell time phase. Rotation speed, temperature, spindle torque, and axial force are the main parameters monitored during the experiments and serve as reference data for the calibration of the numerical model. This research shows that the geometry of the tool as well as fluctuations of the input parameters, such as axial force and rotational speed, are very influential on the temperature reached and/or the time required to reach the targeted temperature. The main outcome is the prediction of a process window, which is a key result for a more efficient process implementation.
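
A common analytic starting point for the friction-based heat input discussed above (an assumption here, not necessarily the paper's exact formulation) integrates the frictional shear power μ·p·ω·r over a circular tool face of radius R; all numbers below are placeholders.

    import math

    def friction_heat_power(mu, pressure, omega, radius):
        # Q = ∫_0^R mu * p * (omega * r) * 2*pi*r dr = (2/3) * pi * mu * p * omega * R^3
        return (2.0 / 3.0) * math.pi * mu * pressure * omega * radius ** 3

    mu = 0.3                       # assumed friction coefficient (varies with temperature)
    pressure = 40e6                # assumed contact pressure from the axial force (Pa)
    rpm = 800.0
    omega = 2.0 * math.pi * rpm / 60.0
    radius = 0.008                 # assumed tool face radius (m)
    print("frictional heat input:", round(friction_heat_power(mu, pressure, omega, radius), 1), "W")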

Keywords: numerical model, additive manufacturing, friction, process

Procedia PDF Downloads 147
15290 GIS Data Governance: GIS Data Submission Process for Build-in Project, Replacement Project at Oman Electricity Transmission Company

Authors: Rahma Saleh Hussein Al Balushi

Abstract:

Oman Electricity Transmission Company's (OETC) vision is to be a renowned world-class transmission grid by 2025, and one of the indications of achieving this vision is obtaining Asset Management ISO 55001 certification, which requires setting out documented Standard Operating Procedures (SOPs). Hence, a documented SOP for the Geographical Information System (GIS) data process has been established. Also, to effectively manage and improve OETC power transmission, asset data and information need to be governed by the Asset Information & GIS department. This paper describes in detail the current GIS data submission process and the journey of developing it. The methodology used to develop the process is based on three main pillars: system and end-user requirements; risk evaluation; and data availability and accuracy. The output of this paper shows the dramatic change in the process used, which subsequently results in more efficient, accurate, and up-to-date data. Furthermore, thanks to this process, GIS has been, and is, ready to be integrated with other systems and to act as the source of data for all OETC users. Some decisions related to issuing no-objection certificates (NOC) for excavation permits and to scheduling asset maintenance plans in the Computerized Maintenance Management System (CMMS) have consequently been made based on GIS data availability. On the other hand, defining agreed and documented procedures for data collection, data system updates, data release/reporting and data alterations has also contributed to reducing missing attributes and to enhancing the data quality index of GIS transmission data. A considerable difference in Geodatabase (GDB) completeness percentage was observed between 2017 and 2022. Overall, it is concluded that through governance, the Asset Information & GIS department can control the GIS data process and collect, properly record, and manage asset data and information within the OETC network. This control extends to other applications and systems integrated with or related to GIS systems.

Keywords: asset management ISO55001, standard procedures process, governance, CMMS

Procedia PDF Downloads 125
15289 Studies on Lucrative Process Layout for Medium Scale Industries

Authors: Balamurugan Baladhandapani, Ganesh Renganathan, V. R. Sanal Kumar

Abstract:

In this paper, a comprehensive review of various factory layouts has been carried out for designing a lucrative process layout for medium scale industries. Industry databases reveal that the end-product rejection rate is on the order of 10%, amounting to a large profit loss. In order to avoid these rejection rates and increase quality product production, an intermediate non-destructive testing facility (INDTF) has been recommended for increasing the overall profit. We observed through detailed case studies that introducing an INDTF in medium scale industries allows defective products to be identified well before their final shape, avoiding expensive downstream processing. Additionally, the defective products identified during the intermediate stage can be effectively utilized for other applications or recycling, thereby reducing the overall wastage of raw materials and increasing profit. We concluded that the prudent design of a factory layout through the critical path method, facilitated by an INDTF, will warrant a profitable outcome.

Keywords: intermediate non-destructive testing, medium scale industries, process layout design

Procedia PDF Downloads 502
15288 The Mediating Effect of Taxpayers’ Compliance on Internal Business Process-Tax Revenue Relationship: A Case Study at the Directorate General of Taxation in Indonesia

Authors: Efrizal, Ferdiansyah, Noorlailie Soewarno, Bambang Tjahjadi

Abstract:

Tax revenue plays an important role in the State Budget of the Government of Indonesia (GOI). The GOI keeps raising the tax revenue portion of the Budget from year to year. The low tax ratio of 11 percent in Indonesia shows a big opportunity to collect taxes in the future. The Directorate General of Taxation (DGT) is the institution mandated by law to collect tax revenue. This is a case study using quantitative and qualitative approaches. The study introduces the contingent factor of taxpayers' compliance as the mediating variable and internal business process as the independent variable, and aims to empirically test contingency theory, especially the mediating effect of taxpayers' compliance on the internal business process-tax revenue relationship. The internal business processes of the DGT include servicing, counseling, expanding, supervising, inspecting, and enforcing. Secondary data from 31 regional offices representing 293 tax offices in Indonesia were collected and analyzed using Partial Least Squares. The results showed the following: (1) internal business process affected tax revenue; (2) taxpayers' compliance did not mediate the internal business process-tax revenue relationship; and (3) taxpayers' compliance affected tax revenue. In-depth interviews revealed that the DGT needs to make more innovations in its business processes in the future.

Keywords: innovations, internal business process, taxpayers’ compliance, tax revenue

Procedia PDF Downloads 356
15287 The Effect of Solution Density on the Synthesis of Magnesium Borate from Boron-Gypsum

Authors: N. Tugrul, E. Sariburun, F. T. Senberber, A. S. Kipcak, E. Moroydor Derun, S. Piskin

Abstract:

Boron-gypsum is a waste which occurs in the boric acid production process. In this study, the boron content of this waste is evaluated for use in the synthesis of magnesium borates; such utilization of this kind of waste is more useful than storage or disposal. Magnesium borates, which are a sub-class of boron minerals, are useful additive materials for industry due to their remarkable thermal and mechanical properties. Magnesium borates were obtained hydrothermally at different temperatures. The novelty of this study is the investigation of the effect of solution density on the magnesium borate synthesis process, in order to increase the possibility of using boron-gypsum as a raw material. After the synthesis process, the products were subjected to XRD and FT-IR to identify and characterize their crystal structure, respectively.

Keywords: boron-gypsum, hydrothermal synthesis, magnesium borate, solution density

Procedia PDF Downloads 396
15286 Towards a Computational Model of Consciousness: Global Abstraction Workspace

Authors: Halim Djerroud, Arab Ali Cherif

Abstract:

We assume that conscious functions are implemented automatically; in other words, consciousness, as well as the non-conscious aspects of human thought, planning, and perception, is produced by biologically adaptive algorithms. We propose that the mechanisms of consciousness can be produced using adaptive algorithms similar to those executed by these mechanisms. In this paper, we propose a computational model of consciousness, the ”Global Abstraction Workspace”, which is an internal environment model conceived as a multi-agent system. This system is able to evolve and generate new data and processes as well as actions in the environment.

Keywords: artificial consciousness, cognitive architecture, global abstraction workspace, multi-agent system

Procedia PDF Downloads 340
15285 The Forensic Swing of Things: The Current Legal and Technical Challenges of IoT Forensics

Authors: Pantaleon Lutta, Mohamed Sedky, Mohamed Hassan

Abstract:

The inability of organizations to put in place management control measures for Internet of Things (IoT) complexities continues to be a risk concern. Policy makers have been left scrambling to find measures to combat these security and privacy concerns. IoT forensics is a cumbersome process, as there is no standardization of IoT products and no or only limited historical data are stored on the devices. This paper highlights why IoT forensics is a unique adventure and brings out the legal challenges encountered in the investigation process. A quadrant model is presented to study the conflicting aspects of IoT forensics. The model analyses the effectiveness of the forensic investigation process versus the admissibility and integrity of the evidence, taking into account user privacy and the providers' compliance with laws and regulations. Our analysis concludes that a semi-automated forensic process using machine learning could eliminate the human factor from the profiling and surveillance processes and hence resolve the issues of data protection (privacy and confidentiality).

Keywords: cloud forensics, data protection Laws, GDPR, IoT forensics, machine Learning

Procedia PDF Downloads 150
15284 Creep Behaviour of Asphalt Modified by Waste Polystyrene and Its Hybrids with Crumb Rubber and Low-Density Polyethylene

Authors: Soheil Heydari, Ailar Hajimohammadi, Nasser Khalili

Abstract:

Polystyrene, made from a monomer called styrene, is a rigid and easy-to-mould polymer that is widely used for many applications, from foam packaging to disposable containers. Considering that the degradation of waste polystyrene takes up to 500 years, there is an urgent need for a sustainable application for waste polystyrene. This study evaluates the application of waste polystyrene as an asphalt modifier. The inclusion of waste plastics in asphalt is practised either by the dry process or by the wet process. In the dry process, plastics are added straight into the asphalt mixture, and in the wet process, they are mixed and digested into bitumen. In this article, polystyrene was used as an asphalt modifier in a dry process; however, the mixing process was precisely designed to make sure that the polymer is melted and modified in the binder. It was expected that, due to the rigidity of polystyrene, it would have positive effects on the permanent deformation of the asphalt mixture. Therefore, mixtures with different polystyrene contents were manufactured, Marshall specimens were prepared, and dynamic creep tests were conducted to evaluate the permanent deformation of the modified mixtures. This is a common repeated loading test conducted at different stress levels and temperatures: loading cycles are applied to the asphalt concrete specimen until failure occurs, and with the deformation recorded continuously, the cumulative permanent strain is determined and reported as a function of the number of cycles. Also, to the best of our knowledge, this is the first time hybrid mixes of polystyrene with crumb rubber and low-density polyethylene have been made and compared with a polystyrene-modified mixture. The test results of this study showed that the hybrid mix of polystyrene and low-density polyethylene has the highest resistance against permanent deformation. However, the polystyrene-modified mixture outperformed the hybrid mix of polystyrene and crumb rubber, and both demonstrated much lower permanent deformation than the unmodified specimen.
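
The reporting step described above (cumulative permanent strain versus loading cycles) is straightforward to sketch; the deformation readings and specimen height below are invented, and the flow-number estimate simply takes the cycle with the smallest strain rate, one common convention rather than the study's procedure.

    import numpy as np

    height_mm = 63.5                                   # assumed Marshall specimen height
    cycles = np.arange(1, 2001)
    # Invented permanent deformation readings (mm): rapid primary creep then a slow secondary stage.
    deformation_mm = 0.9 * (1.0 - np.exp(-cycles / 150.0)) + 2.5e-4 * cycles

    strain_percent = 100.0 * deformation_mm / height_mm
    strain_rate = np.gradient(strain_percent, cycles)
    flow_number = int(cycles[np.argmin(strain_rate)])   # onset of tertiary creep would show a minimum

    print("permanent strain at 2000 cycles (%):", round(strain_percent[-1], 3))
    print("estimated flow number (cycle of minimum strain rate):", flow_number)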

Keywords: permanent deformation, waste plastics, polystyrene, hybrid plastics, hybrid mix, hybrid modification, dry process

Procedia PDF Downloads 105