Search results for: process data

34415 Utilising an Online Data Collection Platform for the Development of a Community Engagement Database: A Case Study on Building Inter-Institutional Partnerships at UWC

Authors: P. Daniels, T. Adonis, P. September-Brown, R. Comalie

Abstract:

The community engagement unit at the University of the Western Cape was tasked with establishing a community engagement database. The database would store information on all community engagement projects related to the university. The wealth of knowledge obtained from the various disciplines would be used to facilitate interdisciplinary collaboration within the university, as well as community-university partnership opportunities. The purpose of this qualitative study was to explore electronic data collection through the development of a database. Two types of electronic data collection platforms were used, namely an online questionnaire and email. The semi-structured questionnaire was used to collect data related to community engagement projects from different faculties and departments at the university. There are many benefits to using an electronic data collection platform, such as reduced costs and time, ease of reaching large numbers of potential respondents, and the possibility of providing anonymity to participants. Despite all the advantages of the electronic platform, there were just as many challenges, as depicted in our findings. The findings suggest that certain barriers existed when using an electronic platform for data collection, even in an academic environment where knowledge and resources were in abundance. One of the challenges experienced in this process was the lack of dissemination of information via email to staff within faculties. The online software used for the questionnaire had its own limitations, such as the questionnaire only being accessible from the same electronic device. In a few cases, academics only completed the questionnaire after a telephonic prompt or a face-to-face meeting, which raises the question: is higher education in South Africa ready to embrace electronic platforms in data collection?

Keywords: community engagement, database, data collection, electronic platform, electronic tools, knowledge sharing, university

Procedia PDF Downloads 250
34414 Integration of a Microbial Electrolysis Cell and an Oxy-Combustion Boiler

Authors: Ruth Diego, Luis M. Romeo, Antonio Morán

Abstract:

In the present work, a study of the coupling of a bioelectrochemical system with an oxy-combustion boiler is carried out; specifically, it is proposed to connect the combustion gas outlet of a boiler with a microbial electrolysis cell (MEC), where the CO2 from the gases is transformed into methane in the cathode chamber, and the oxygen produced in the anode chamber is recirculated to the oxy-combustion boiler. The MEC mainly consists of two electrodes (anode and cathode) immersed in an aqueous electrolyte; these electrodes are separated by a proton exchange membrane (PEM). In this case, the anode is abiotic (where oxygen is produced), and it is at the cathode that an electroactive biofilm is formed with microorganisms that catalyze the CO2 reduction reactions. Real data from an oxy-combustion process in a boiler of around 20 MW thermal have been used for this study and are combined with data obtained on a smaller scale (laboratory-pilot scale) to determine the yields that could be obtained, considering the system as environmentally sustainable energy storage. In this way, an attempt is made to integrate a relatively conventional energy production system (oxy-combustion) with a biological system (microbial electrolysis cell), which is a challenge to be addressed in this type of new hybrid scheme. A novel concept is thus presented with the basic dimensioning of the necessary equipment and the efficiency of the global process. In this work, it has been calculated that the efficiency of this power-to-gas system based on MEC cells, when coupled to industrial processes, is of the same order of magnitude as the most promising equivalent routes. The proposed process has two main limitations: the overpotentials at the electrodes, which penalize the overall efficiency, and the need for storage tanks for the process gases. The results of the calculations carried out in this work show that realistic electrode potentials achieve acceptable performance. Regarding the tanks, with adequate dimensioning, it is possible to achieve complete autonomy. The proposed system, called OxyMES, provides energy storage without energetically penalizing the process when compared to an oxy-combustion plant with conventional CO2 capture. According to the results obtained, this system can be applied as a measure to decarbonize an industry by changing the original fuel of the oxy-combustion boiler to the biogas generated in the MEC cell. It could also be used to neutralize CO2 emissions from industry by converting the CO2 to methane and then injecting it into the natural gas grid.

Keywords: microbial electrolysis cells, oxy-combustion, CO2, power-to-gas

Procedia PDF Downloads 86
34413 Improving the Students’ Writing Skill by Using Brainstorming Technique

Authors: M. Z. Abdul Rofiq Badril Rizal

Abstract:

This research aims to determine the improvement of students’ English writing skill through the use of the brainstorming technique. The technique helps students overcome difficulties in generating ideas, leads them to arrange their ideas well, and keeps them focused on the topic developed in the writing. The research method used is classroom action research. The data sources of the research are an English teacher, who acts as an observer, and the 35 students of class X.MIA5. Test results and observations were collected as the data in this research. Based on the results of cycle one, the percentage of students who reached the minimum accomplishment criteria (MAC) was 76.31%. This shows that the research had to be continued to cycle two because its aim had not been accomplished: not all of the students’ scores had reached the MAC. After the research was continued to cycle two and the weaknesses were addressed, the teaching and learning process ran better. In the test conducted at the end of the learning process in cycle two, all of the students reached the minimum score of 76 or above, as set by the minimum accomplishment criteria. This means the research was successful, and the percentage of students who reached the minimum accomplishment criteria was 100%. Therefore, the writer concludes that the brainstorming technique is able to improve the students’ English writing skill in the tenth grade of SMAN 2 Jember.

Keywords: brainstorming technique, improving, writing skill, knowledge and innovation engineering

Procedia PDF Downloads 350
34412 A Review of the Run to Run (R to R) Control in the Manufacturing Processes

Authors: Khalil Aghapouramin, Mostafa Ranjbar

Abstract:

Run-to-Run (R2R) control was developed to monitor and control various semiconductor manufacturing processes based upon fundamental engineering frameworks. This technology allows corrections to be made in the optimum direction. Such control has always shown significant potential and has appeared in a variety of processes. The term run-to-run refers to the case where control actions are applied between the batches of silicon wafers produced in a manufacturing process. In the present work, a brief review of run-to-run control, which is mainly effective in manufacturing processes, is presented.
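
For illustration, the following minimal Python sketch (not taken from the reviewed paper) shows the most widely cited R2R scheme, an exponentially weighted moving average (EWMA) controller that updates its offset estimate after every run and re-tunes the recipe toward a target; the linear process model, gain, noise, and drift values are assumptions made for the example.

```python
import numpy as np

def ewma_r2r(target, slope, intercept0, n_runs, lam=0.3, seed=0):
    """Simulate a single-input EWMA run-to-run controller.

    Assumed linear process: y = slope * u + true_intercept + noise.
    After each run the intercept estimate is updated with an EWMA filter
    and the next recipe input u is chosen so the predicted output hits target.
    """
    rng = np.random.default_rng(seed)
    a_hat = intercept0            # current intercept estimate
    true_a = intercept0 + 0.5     # unknown process offset (illustrative)
    inputs, outputs = [], []
    for _ in range(n_runs):
        u = (target - a_hat) / slope                    # recipe for this run
        y = slope * u + true_a + rng.normal(0, 0.05)    # measured wafer result
        a_hat = lam * (y - slope * u) + (1 - lam) * a_hat  # EWMA update
        true_a += 0.02                                  # slow tool drift between runs
        inputs.append(u)
        outputs.append(y)
    return inputs, outputs

if __name__ == "__main__":
    _, y = ewma_r2r(target=1.0, slope=2.0, intercept0=0.0, n_runs=20)
    print([round(v, 3) for v in y])  # outputs track the 1.0 target despite drift
```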

Keywords: Run-to-Run (R2R) control, manufacturing, process in engineering, manufacturing controls

Procedia PDF Downloads 466
34411 The Influence of Human Factors Education on the Irish Registered Pre-Hospital Practitioner within the National Ambulance Service

Authors: Desmond Wade, Alfredo Ormazabal

Abstract:

Background: Ever since the registration process for pre-hospital practitioners commenced in Ireland in the year 2000 through the Irish Government Statutory Instrument (SI 109 of 2000) process, the approach to the education of these professionals has changed drastically. The progression from the traditional behaviourist to the current constructivist approach has been based on experiences from other sectors and industries, nationally and internationally. Today, the delivery of a safe and efficient ambulance service heavily depends on its practitioners’ range of technical skills, academic knowledge, and overall competences. As these increase, so does the level of complexity of paramedics’ everyday practice. This has made it inevitable to consider the 'Human Factor' as a source of potential risk and has led formative institutions like the National Ambulance Service College to include it in their curriculum. Methods: This paper used a mixed-method approach, in which both an online questionnaire and a set of semi-structured interviews were the sources of primary data. The data were analysed using qualitative and quantitative data analysis. Conclusions: The evidence presented leads to the conclusion that in the National Ambulance Service there is a considerable lack of education on Human Factors, and the levels of understanding of how to manage Human Factors in practice vary across its spectrum. Paramedic practitioners in Ireland seem to understand that the responsibility for patient care lies with the team, rather than with the most hierarchically senior practitioner present at the scene.

Keywords: human factors, ergonomics, stress, decision making, pre-hospital care, paramedic, education

Procedia PDF Downloads 139
34410 Uncovering the Complex Structure of Building Design Process Based on Royal Institute of British Architects Plan of Work

Authors: Fawaz A. Binsarra, Halim Boussabaine

Abstract:

The notion of complexity science has been attracting the interest of researchers and professionals due to the need to better understand the dynamics and interaction structures of complex systems. In addition, complexity analysis has been used as an approach to investigate complex systems that contain a large number of components which interact with each other to accomplish specific outcomes and give rise to specific behaviors. The design process is considered a complex activity that involves a large number of interacting components, which can be grouped as design tasks, the design team, and the components of the design process. These three main aspects of the building design process consist of several components that interact with each other as a dynamic system with complex information flow. In this paper, the goal is to uncover the complex structure of information interactions in the building design process. The information interactions of the Royal Institute of British Architects Plan of Work 2013 are investigated as a case study; modelling these interactions with network analysis software to uncover the structure and complexity of the building design process is expected to significantly enhance the efficiency of the building design process outcomes.
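
As a rough illustration of the network-analysis step described above, the Python sketch below builds a small directed graph of information flows between RIBA Plan of Work 2013 stages and ranks them by centrality; the edge list is hypothetical and does not reproduce the interactions extracted in the study.

```python
import networkx as nx

# Hypothetical information-flow links between RIBA Plan of Work 2013 stages;
# the real study derives these interactions from the plan's task/team data.
edges = [
    ("0 Strategic Definition", "1 Preparation and Brief"),
    ("1 Preparation and Brief", "2 Concept Design"),
    ("2 Concept Design", "3 Developed Design"),
    ("3 Developed Design", "4 Technical Design"),
    ("4 Technical Design", "5 Construction"),
    ("2 Concept Design", "4 Technical Design"),   # assumed feed-forward of design information
    ("3 Developed Design", "2 Concept Design"),   # assumed iteration loop
]

G = nx.DiGraph(edges)
print("nodes:", G.number_of_nodes(), "edges:", G.number_of_edges())

# Centrality indicates which stages concentrate information exchange.
for stage, c in sorted(nx.betweenness_centrality(G).items(), key=lambda x: -x[1]):
    print(f"{stage}: {c:.3f}")
```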

Keywords: complexity, process, building design, RIBA, design complexity, network, network analysis

Procedia PDF Downloads 507
34409 Improving Digital Data Security Awareness among Teacher Candidates with Digital Storytelling Technique

Authors: Veysel Çelik, Aynur Aker, Ebru Güç

Abstract:

Developments in information and communication technologies have increased both the speed of producing information and the speed of accessing new information. Accordingly, the daily lives of individuals have started to change. New concepts such as e-mail, e-government, e-school, and e-signature have emerged. For this reason, prospective teachers, who will be future teachers or school administrators, are expected to have high awareness of digital data security. The aim of this study is to reveal the effect of the digital storytelling technique on the data security awareness of pre-service teachers in computer and instructional technology education departments. For this purpose, participants were selected on the principle of volunteering among third-year students studying in the Computer and Instructional Technologies Department of the Faculty of Education at Siirt University. In the research, the pretest/posttest quasi-experimental research model, one of the experimental research models, was used. In this framework, a 6-week lesson plan on digital data security awareness was prepared in accordance with the digital storytelling technique. Students in the experimental group formed groups of 3-6 people among themselves. The groups were asked to prepare short videos or animations on digital data security awareness. The completed videos were watched and evaluated together with the prospective teachers during the evaluation process, which lasted approximately 2 hours. In the research, both quantitative and qualitative data collection tools were used: the digital data security awareness scale and a semi-structured interview form consisting of open-ended questions developed by the researchers. According to the data obtained, the digital storytelling technique was effective in creating data security awareness and producing lasting behavior changes in computer and instructional technology students.

Keywords: digital storytelling, self-regulation, digital data security, teacher candidates, self-efficacy

Procedia PDF Downloads 111
34408 The Use of Performance Indicators for Evaluating Models of Drying Jackfruit (Artocarpus heterophyllus L.): Page, Midilli, and Lewis

Authors: D. S. C. Soares, D. G. Costa, J. T. S., A. K. S. Abud, T. P. Nunes, A. M. Oliveira Júnior

Abstract:

Mathematical models of drying are used for the purpose of understanding the drying process in order to determine important parameters for the design and operation of the dryer. The jackfruit is a fruit with high consumption in the Northeast and high perishability. It is necessary to apply techniques that extend its conservation so that it can be distributed to regions with low consumption. This study aimed to analyse several mathematical models (Page, Lewis, and Midilli) and to indicate the one that best fits the conditions of the convective drying process, using performance indicators associated with each model: accuracy (Af) and noise factors (Bf), root mean square error (RMSE), and standard error of prediction (%SEP). Jackfruit drying was carried out in a convective tray dryer at a temperature of 50°C for 9 hours. It was observed that the Midilli model was more accurate, with Af: 1.39, Bf: 1.33, RMSE: 0.01%, and SEP: 5.34. However, the use of the Midilli model is not appropriate for process control purposes due to the need for four tuning parameters. With the performance indicators used in this paper, the Page model showed similar results with only two parameters. It is concluded that the best correlation between the experimental and estimated data is given by the Page model.
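
A brief Python sketch of how the Page model can be fitted to a drying curve and scored with RMSE is given below; the moisture-ratio data are synthetic placeholders, not the jackfruit measurements reported in the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

def page_model(t, k, n):
    """Page thin-layer drying model: moisture ratio MR = exp(-k * t**n)."""
    return np.exp(-k * t**n)

# Illustrative drying curve (hours vs. dimensionless moisture ratio); the paper's
# data come from jackfruit dried at 50 °C for 9 h in a tray dryer.
t = np.array([0.0, 1, 2, 3, 4, 5, 6, 7, 8, 9])
mr = np.array([1.0, 0.78, 0.60, 0.46, 0.35, 0.27, 0.20, 0.15, 0.11, 0.08])

params, _ = curve_fit(page_model, t, mr, p0=(0.1, 1.0))
pred = page_model(t, *params)
rmse = np.sqrt(np.mean((mr - pred) ** 2))
print(f"k = {params[0]:.4f}, n = {params[1]:.4f}, RMSE = {rmse:.4f}")
```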

Keywords: drying, models, jackfruit, biotechnology

Procedia PDF Downloads 364
34407 Modeling and Optimizing of Sinker Electric Discharge Machine Process Parameters on AISI 4140 Alloy Steel by Central Composite Rotatable Design Method

Authors: J. Satya Eswari, J. Sekhar Babub, Meena Murmu, Govardhan Bhat

Abstract:

Electrical discharge machining (EDM) is an unconventional manufacturing process based on the removal of material from a part by means of a series of repeated electrical sparks, created by electric pulse generators at short intervals between an electrode tool and the part to be machined, immersed in a dielectric fluid. In this paper, a study is performed on the influence of the factors of peak current, pulse-on time, interval time, and power supply voltage. The output responses measured were material removal rate (MRR) and surface roughness. Finally, the parameters were optimized for maximum MRR with the desired surface roughness. Response surface methodology (RSM) involves establishing mathematical relations between the design variables and the resulting responses and optimizing the process conditions; however, RSM is not free from problems when applied to multi-factor and multi-response situations. A design of experiments (DOE) technique is used to select the optimum machining conditions for machining AISI 4140 using EDM. The purpose of this paper is to determine the optimal factors of the EDM process and to investigate the feasibility of design of experiments techniques. The workpieces used were rectangular plates of AISI 4140 grade steel alloy. The study of the optimized settings of key machining factors, such as pulse-on time, gap voltage, flushing pressure, input current, and duty cycle, on material removal and surface roughness has been carried out using a central composite design. The objective is to maximize the material removal rate (MRR). The central composite design data are used to develop second-order polynomial models with interaction terms. Insignificant coefficients are eliminated from these models using the Student's t-test, and the F-test is used to check the goodness of fit. The CCD is first used to determine the optimal factors of the EDM process for maximizing the MRR. The responses are further treated through an objective function to establish the same set of key machining factors that satisfy the optimization problem of the EDM process. The results demonstrate the better performance of CCD-data-based RSM for optimizing the EDM process.
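
The sketch below illustrates, in Python, how a two-factor central composite rotatable design can be used to fit a second-order response surface with an interaction term by least squares; the coded factor levels and MRR values are invented for the example and are not the AISI 4140 results of the study.

```python
import numpy as np

# Coded factor settings for a two-factor central composite rotatable design (illustrative);
# x1 ~ peak current, x2 ~ pulse-on time. y ~ material removal rate (made-up values).
alpha = np.sqrt(2)
X = np.array([
    [-1, -1], [1, -1], [-1, 1], [1, 1],                 # factorial points
    [-alpha, 0], [alpha, 0], [0, -alpha], [0, alpha],   # axial (rotatable) points
    [0, 0], [0, 0], [0, 0],                             # centre points
])
y = np.array([5.1, 8.3, 6.0, 11.2, 4.8, 10.5, 5.5, 9.0, 7.9, 8.1, 8.0])

# Second-order response surface:
# y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
x1, x2 = X[:, 0], X[:, 1]
A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
labels = ["b0", "b1", "b2", "b12", "b11", "b22"]
print(dict(zip(labels, np.round(coef, 3))))

# The fitted surface can then be maximized (e.g. with scipy.optimize) to find
# the coded factor settings that give the highest predicted MRR.
```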

Keywords: electric discharge machining (EDM), modeling, optimization, CCRD

Procedia PDF Downloads 326
34406 Resistance Spot Welding of Boron Steel 22MnB5 with Complex Welding Programs

Authors: Szymon Kowieski, Zygmunt Mikno

Abstract:

The study involved the optimization of process parameters during resistance spot welding of Al-coated martensitic boron steel 22MnB5, applied in hot stamping, performed using a programme with a multiple current impulse mode and a programme with a variable pressure force. The aim of this research work was to determine the possibilities for a growth in welded joint strength and to identify the expansion of the welding lobe. The process parameters were adjusted on the basis of welding process simulation and compared with experimental data. 22MnB5 steel is known for its tendency to reach high hardness values in weld nuggets, often leading to interfacial failures (observed in the study-related tests). In addition, during resistance spot welding, many production-related factors can affect process stability, e.g. welding lobe narrowing, and lead to a deterioration of quality. Resistance spot welding performed using the above-named welding programme featuring three levels of force made it possible to achieve an 82% extension of the welding lobe. Joints made using the multiple current impulse programme, where the total welding time was below 1.4 s, revealed a change in the peeling mode (to full plug) and an increase in weld tensile shear strength of 10%.

Keywords: 22MnB5, hot stamping, interfacial fracture, resistance spot welding, simulation, single lap joint, welding lobe

Procedia PDF Downloads 367
34405 Formation of Clipped Forms in Hausa Language

Authors: Maryam Maimota Shehu

Abstract:

Words are the basic building blocks of a language. In the everyday usage of a language, words are used, and new words are formed and reformed in order to contain and accommodate all entities, phenomena, qualities and every aspect of life. Although many studies have been conducted on morphological processes in the Hausa language, most of the works concentrated on borrowing, affixation, reduplication and derivation, while clipping has been neglected to the extent that only a few scholars have cited some examples in the language. Therefore, the current study investigates and examines clipping as one of the word formation processes found in the language. The study focuses its main attention on clipping as a word-formation process and how this process is used in the formation of words and their occurrence in Hausa sentences. In order to achieve these aims, the research answered the following questions: 1) Is clipping used as a process of word formation in Hausa? 2) What are the words formed using this process? This study utilizes the Natural Morphology Theory proposed by Dressler (1985), which was adopted by Belly (2007). The data of this study have been collected from newspaper articles, novels, and the written literature of the Hausa language. Based on the findings, this study found that there exist many kinds of words formed in the Hausa language using clipping, in sentences and in discourse, which previous findings did not reveal or explain in detail. Another part of the findings shows that clipping in the Hausa language occurs on nouns, verbs, adjectives, reduplicated words and compounds while retaining their meanings and grammatical classes.

Keywords: clipping, Hausa language, morphology, word formation processes

Procedia PDF Downloads 444
34404 Machine Learning Strategies for Data Extraction from Unstructured Documents in Financial Services

Authors: Delphine Vendryes, Dushyanth Sekhar, Baojia Tong, Matthew Theisen, Chester Curme

Abstract:

Much of the data that inform the decisions of governments, corporations and individuals are harvested from unstructured documents. Data extraction is defined here as a process that turns non-machine-readable information into a machine-readable format that can be stored, for instance, in a database. In financial services, introducing more automation in data extraction pipelines is a major challenge. Information sought by financial data consumers is often buried within vast bodies of unstructured documents, which have historically required thorough manual extraction. Automated solutions provide faster access to non-machine-readable datasets, in a context where untimely information quickly becomes irrelevant. Data quality standards cannot be compromised, so automation requires high data integrity. This multifaceted task is broken down into smaller steps: ingestion, table parsing (detection and structure recognition), text analysis (entity detection and disambiguation), schema-based record extraction, user feedback incorporation. Selected intermediary steps are phrased as machine learning problems. Solutions leveraging cutting-edge approaches from the fields of computer vision (e.g. table detection) and natural language processing (e.g. entity detection and disambiguation) are proposed.
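
A schematic Python sketch of the staged pipeline described above is shown below; every function and the schema are hypothetical stubs meant only to show how the steps (ingestion, table parsing, entity detection, schema-based record extraction) can be decomposed and chained, not the authors' implementation.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Document:
    raw_text: str
    tables: List[List[List[str]]] = field(default_factory=list)
    entities: List[Dict] = field(default_factory=list)
    records: List[Dict] = field(default_factory=list)

def ingest(path: str) -> Document:
    """Hypothetical ingestion step: read the unstructured source."""
    with open(path, encoding="utf-8") as f:
        return Document(raw_text=f.read())

def parse_tables(doc: Document) -> Document:
    """Placeholder for ML table detection and structure recognition."""
    doc.tables = []  # a computer-vision model would populate detected table cells here
    return doc

def detect_entities(doc: Document) -> Document:
    """Placeholder for NLP entity detection and disambiguation."""
    doc.entities = []  # an NER model would populate typed, disambiguated entities here
    return doc

def extract_records(doc: Document, schema: Dict) -> Document:
    """Map detected entities and tables onto a target schema for storage."""
    doc.records = [{k: None for k in schema}]  # filled from entities/tables in practice
    return doc

# Steps are chained so each stage can be replaced, evaluated, or retrained independently:
# schema = {"issuer": str, "coupon": float, "maturity": str}
# doc = extract_records(detect_entities(parse_tables(ingest("filing.txt"))), schema)
```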

Keywords: computer vision, entity recognition, finance, information retrieval, machine learning, natural language processing

Procedia PDF Downloads 93
34403 The Process of Crisis: Model of Its Development in the Organization

Authors: M. Mikušová

Abstract:

The main aim of this paper is to present a clear and comprehensive picture of the process of a crisis in an organization, which will help to better understand its possible developments. To describe the sequence of individual steps and to indicate their causation and possible variants of development, a detailed flow diagram with verbal commentary is applied. For simplicity, the process of the crisis is observed in four basic phases: symptoms of the crisis, diagnosis, action, and prevention. The model highlights the complexity of the phenomenon of the crisis and the fact that its various phases are interwoven.

Keywords: crisis, management, model, organization

Procedia PDF Downloads 276
34402 Preparation of Magnetic Hydroxyapatite Composite by Wet Chemical Process for Phycobiliproteins Adsorption

Authors: Shu-Jen Chen, Yi-Chien Wan, Ruey-Chi Wang

Abstract:

Hydroxyapatite (Ca10(PO4)6(OH)2, HAp) can be applied to the fabrication of bone replacement materials, dental filling composites, and the adsorption of biomolecules and dyes. The integration of HAp and magnetic materials offers several advantages for bio-separation processes because magnetic adsorbents can be recovered with an applied magnetic field. C-phycocyanin (C-PC) and allophycocyanin (APC), isolated from Spirulina platensis, can be used in fluorescent labeling probes, health care foods, and clinical diagnostic reagents. Although the purification of C-PC and APC by HAp adsorption has been reported, the adsorption of C-PC and APC by magnetic HAp composites has not been reported yet. Therefore, the fabrication of HAp with magnetic silica nanoparticles for protein adsorption was investigated in this work. First, magnetic silica particles were prepared by coating a silica layer on Fe3O4 nanoparticles with a reverse micelle method. Then, the Fe3O4@SiO2 nanoparticles were mixed with calcium carbonate to obtain magnetic silica/calcium carbonate composites (Fe3O4@SiO2/CaCO3). The Fe3O4@SiO2/CaCO3 was further reacted with K2HPO4 to prepare the magnetic silica/hydroxyapatite composites (Fe3O4@SiO2/HAp). The adsorption experiments indicated that the adsorption capacity of Fe3O4@SiO2/HAp toward C-PC and APC was highest at pH 6. The adsorption of C-PC and APC by Fe3O4@SiO2/HAp could be correlated by the pseudo-second-order model, indicating that chemical adsorption dominates the adsorption process. Furthermore, the adsorption data showed that the adsorption of C-PC and APC on Fe3O4@SiO2/HAp followed the Langmuir isotherm. The isoelectric points of C-PC and APC were around 5.0. Additionally, the zeta potential data showed that the Fe3O4@SiO2/HAp composite was negatively charged at pH 6. Accordingly, the adsorption mechanism of Fe3O4@SiO2/HAp toward C-PC and APC should be governed by hydrogen bonding rather than electrostatic interaction. On the other hand, compared to C-PC, the Fe3O4@SiO2/HAp shows higher adsorption affinity toward APC. Although the Fe3O4@SiO2/HAp cannot recover C-PC and APC from Spirulina platensis homogenate, it can be applied to separate C-PC and APC.
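
As an illustration of the isotherm analysis mentioned above, the Python sketch below fits the Langmuir model to equilibrium adsorption data with nonlinear least squares; the Ce/qe values and initial guesses are illustrative and are not the measured C-PC/APC data.

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(ce, qmax, kl):
    """Langmuir isotherm: qe = qmax * KL * Ce / (1 + KL * Ce)."""
    return qmax * kl * ce / (1.0 + kl * ce)

# Illustrative equilibrium data (Ce in mg/L, qe in mg/g); the study's actual
# C-PC/APC adsorption data on Fe3O4@SiO2/HAp are not reproduced here.
ce = np.array([5, 10, 20, 40, 80, 160], dtype=float)
qe = np.array([12.0, 20.5, 31.0, 40.2, 46.5, 49.8])

(qmax, kl), _ = curve_fit(langmuir, ce, qe, p0=(50.0, 0.05))
r2 = 1 - np.sum((qe - langmuir(ce, qmax, kl)) ** 2) / np.sum((qe - qe.mean()) ** 2)
print(f"qmax = {qmax:.1f} mg/g, KL = {kl:.3f} L/mg, R^2 = {r2:.3f}")
```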

Keywords: hydroxyapatite, magnetic, C-phycocyanin, allophycocyanin

Procedia PDF Downloads 134
34401 Sequential Pattern Mining from Data of Medical Record with Sequential Pattern Discovery Using Equivalent Classes (SPADE) Algorithm (A Case Study : Bolo Primary Health Care, Bima)

Authors: Rezky Rifaini, Raden Bagus Fajriya Hakim

Abstract:

This research was conducted at the Bolo Primary Health Care center in Bima Regency. The purpose of the research is to find the association patterns formed in the medical record database of Bolo Primary Health Care's patients. The data used are secondary data from the PHC's medical record database. Sequential pattern mining is the technique used for the analysis. Transaction data were generated from Patient_ID, Check_Date, and diagnosis. Sequential Pattern Discovery using Equivalence classes (SPADE) is one of the algorithms in sequential pattern mining; this algorithm finds frequent sequences in transaction data using a vertical database layout and a sequence join process. The results of the SPADE algorithm are frequent sequences that are then used to form rules. This technique is used to find the association patterns between item combinations. Based on the sequential association rule analysis with the SPADE algorithm, for a minimum support of 0.03 and a minimum confidence of 0.75, three sequential association patterns were obtained based on the sequence of Patient_ID, Check_Date, and diagnosis data in the Bolo PHC.
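
A minimal Python sketch of the vertical id-list idea behind SPADE is given below: each diagnosis keeps a list of (patient sequence, position) entries, and a temporal join of two id-lists yields the support of a two-step sequence; the toy visit sequences and the 0.5 support threshold are assumptions for the example, not the PHC data.

```python
from collections import defaultdict

# Toy patient visit sequences (diagnoses ordered by check date per patient).
# Real input comes from Patient_ID / Check_Date / diagnosis in the PHC database.
sequences = {
    1: ["ISPA", "Gastritis", "ISPA"],
    2: ["Gastritis", "Hypertension"],
    3: ["ISPA", "Hypertension"],
    4: ["ISPA", "Gastritis"],
}

# Vertical layout: item -> list of (sequence id, position), the core SPADE structure.
idlists = defaultdict(list)
for sid, seq in sequences.items():
    for pos, item in enumerate(seq):
        idlists[item].append((sid, pos))

def support_2seq(a, b):
    """Fraction of sequences in which item b occurs after item a (id-list join)."""
    sids = set()
    for sid_a, pos_a in idlists[a]:
        for sid_b, pos_b in idlists[b]:
            if sid_a == sid_b and pos_b > pos_a:
                sids.add(sid_a)
    return len(sids) / len(sequences)

min_support = 0.5  # illustrative threshold, not the study's 0.03
for a in idlists:
    for b in idlists:
        s = support_2seq(a, b)
        if a != b and s >= min_support:
            print(f"<{a} -> {b}>  support = {s:.2f}")
```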

Keywords: diagnosis, primary health care, medical record, data mining, sequential pattern mining, SPADE algorithm

Procedia PDF Downloads 383
34400 A Study of Electrowetting-Assisted Mold Filling in Nanoimprint Lithography

Authors: Wei-Hsuan Hsu, Yi-Xuan Huang

Abstract:

Nanoimprint lithography (NIL) offers the advantages of sub-10 nm features and low cost. NIL patterns the resist by physical deformation using a mold, which can easily reproduce the required nano-scale pattern. However, variations in process parameters and environmental conditions seriously affect reproduction quality, so ensuring the quality of the imprinted pattern is essential for industry. In this study, the authors used electrowetting technology to assist mold filling in the NIL process. A special mold structure was designed to induce electrowetting. During the imprinting process, when a voltage was applied between the mold and the substrate, the surface of the mold could be switched between hydrophilic and hydrophobic states. Both simulation and experiment confirmed that electrowetting technology can assist mold filling and avoid incomplete filling. The proposed method can also reduce crack formation during the de-molding process. Therefore, electrowetting technology can improve the process quality of NIL.
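
The voltage-induced wettability switching exploited here is commonly described by the textbook Young–Lippmann relation; the short Python sketch below evaluates it for a few illustrative voltages, with the contact angle, dielectric thickness, permittivity, and surface tension values assumed for the example rather than taken from the study.

```python
import math

def lippmann_contact_angle(theta0_deg, voltage, eps_r, d, gamma):
    """Young-Lippmann relation for electrowetting on a dielectric:
    cos(theta_V) = cos(theta_0) + eps0 * eps_r * V^2 / (2 * d * gamma),
    with theta0_deg the zero-voltage contact angle, d the dielectric
    thickness (m), and gamma the liquid surface tension (N/m)."""
    eps0 = 8.854e-12
    cos_v = math.cos(math.radians(theta0_deg)) + eps0 * eps_r * voltage**2 / (2 * d * gamma)
    cos_v = min(cos_v, 1.0)  # ignore contact-angle saturation in this sketch
    return math.degrees(math.acos(cos_v))

# Illustrative numbers only (resist-like liquid on a 1 micron dielectric):
for v in (0, 10, 20, 30, 40):
    angle = lippmann_contact_angle(95.0, v, eps_r=3.0, d=1e-6, gamma=0.03)
    print(f"{v:3d} V -> contact angle {angle:.1f} deg")
```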

Keywords: electrowetting, mold filling, nano-imprint, surface modification

Procedia PDF Downloads 151
34399 Processing of Input Material as a Way to Improve the Efficiency of the Glass Production Process

Authors: Joanna Rybicka-Łada, Magda Kosmal, Anna Kuśnierz

Abstract:

One of the main problems of the glass industry is the still high consumption of energy needed to produce the glass mass, as well as the increasing prices of energy, fuels, and raw materials. Therefore, comprehensive actions are being taken to improve the entire production process. The key element of these activities, from filling the set to receiving the finished product, is the melting process, whose tasks include dissolving the components of the set, removing bubbles from the resulting melt, and obtaining a chemically homogeneous glass melt. This process consumes over 90% of the total energy needed in the production process. The processes occurring in the set during its conversion have a significant impact on the further stages and speed of the melting process and, thus, on its overall effectiveness. The speed and course of the reactions depend on the chemical nature of the raw materials, the degree of their fragmentation, their thermal treatment, and the form of the introduced set. An opportunity to minimize segregation and accelerate the conversion of glass sets may lie in the development of new technologies for preparing and dosing sets. The previously preferred traditional method of melting the set, based on mixing all glass raw materials together in loose form, can be replaced with a set in a thickened form, a solution that avoids dust formation during filling and is already available on the market. The aim of the project was to develop a glass set in a selectively or completely densified form and to examine the influence of set processing on the melting process and the properties of the glass.

Keywords: glass, melting process, glass set, raw materials

Procedia PDF Downloads 44
34398 Parameter Optimization and Thermal Simulation in Laser Joining of Coach Peel Panels of Dissimilar Materials

Authors: Masoud Mohammadpour, Blair Carlson, Radovan Kovacevic

Abstract:

The quality of laser welded-brazed (LWB) joints was strongly dependent on the main process parameters; therefore, the effects of laser power (3.2–4 kW), welding speed (60–80 mm/s), and wire feed rate (70–90 mm/s) on mechanical strength and surface roughness were investigated in this study. A comprehensive optimization process by means of response surface methodology (RSM) and a desirability function was used for multi-criteria optimization. The experiments were planned based on a Box–Behnken design, implementing linear and quadratic polynomial equations for predicting the desired output properties. Finally, validation experiments were conducted at an optimized process condition, which exhibited good agreement between the predicted and experimental results. AlSi3Mn1 was selected as the filler material for joining aluminum alloy 6022 and hot-dip galvanized steel in a coach peel configuration. The high scanning speed could keep the IMC layer as thin as 5 µm. Thermal simulations of the joining process were conducted by the finite element method (FEM), and the results were validated against experimental data. The Fe/Al interfacial thermal history showed that the duration of the critical temperature range (700–900 °C) in this high-scanning-speed process was less than 1 s. This short interaction time leads to the formation of a reaction-controlled IMC layer instead of diffusion-controlled mechanisms.

Keywords: laser welding-brazing, finite element, response surface methodology (RSM), multi-response optimization, cross-beam laser

Procedia PDF Downloads 341
34397 Evaluation of Free Technologies as Tools for Business Process Management

Authors: Julio Sotomayor, Daniel Yucra, Jorge Mayhuasca

Abstract:

The article presents an evaluation of free technologies for business process automation, with emphasis only on tools compatible with the General Public License (GPL). The compendium of technologies was based on promoting a service-oriented enterprise architecture (SOA) and the establishment of a business process management system (BPMS). The methodology used for the selection of tools was Agile UP. This proposal allows businesses to achieve technological sovereignty and independence, in addition to promoting service orientation and the development of free software based on components.

Keywords: BPM, BPMS suite, open-source software, SOA, enterprise architecture, business process management

Procedia PDF Downloads 269
34396 Copula Autoregressive Methodology for Simulation of Solar Irradiance and Air Temperature Time Series for Solar Energy Forecasting

Authors: Andres F. Ramirez, Carlos F. Valencia

Abstract:

The increasing interest in the application of renewable energy strategies and the path toward diminishing the use of carbon-related energy sources have encouraged the development of novel strategies for the integration of solar energy into the electricity network. A correct inclusion of the fluctuating energy output of a photovoltaic (PV) energy system into an electric grid requires improvements in the forecasting and simulation methodologies for solar energy potential and an understanding not only of the mean value of the series but of the associated underlying stochastic process. We present a methodology for the synthetic generation of bivariate solar irradiance (shortwave flux) and air temperature time series based on copula functions to represent the cross-dependence and temporal structure of the data. We explore the advantages of using this nonlinear time series method over traditional approaches that use a transformation of the data to normal distributions as an intermediate step. The use of copulas gives flexibility to represent the serial variability of the real data in the simulation and allows more control over the desired properties of the data. We use discrete zero-mass density distributions to capture the nature of solar irradiance, alongside vector generalized linear models for the time-dependent distributions of the bivariate time series. We found that the copula autoregressive methodology used, including the zero-mass characteristics of the solar irradiance time series, generates a significant improvement over state-of-the-art strategies. These results will help to better understand the fluctuating nature of solar energy forecasting and the underlying stochastic process, and to quantify the potential of integrating a photovoltaic (PV) energy generating system into a country's electricity network. Experimental analysis and real data application substantiate the usage and convenience of the proposed methodology for forecasting solar irradiance time series and solar energy across northern hemisphere, southern hemisphere, and equatorial zones.
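
A rough Python sketch of a Gaussian-copula autoregressive generator is shown below: a latent bivariate AR(1) Gaussian process supplies the temporal and cross dependence, and the probability integral transform maps it to a zero-mass-plus-gamma irradiance marginal and a normal temperature marginal; the correlations, zero-mass probability, and marginal parameters are assumptions, and the paper's vector generalized linear model formulation is not reproduced.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
T = 365            # daily steps (illustrative)
rho_t = 0.6        # assumed temporal (lag-1) correlation on the Gaussian scale
rho_x = 0.5        # assumed cross-correlation irradiance <-> temperature
p_zero = 0.08      # assumed probability of (near) zero-irradiance days

# 1) Latent bivariate Gaussian AR(1) process (the copula's dependence layer).
cov = np.array([[1.0, rho_x], [rho_x, 1.0]])
L = np.linalg.cholesky(cov)
z = np.zeros((T, 2))
for t in range(1, T):
    z[t] = rho_t * z[t - 1] + np.sqrt(1 - rho_t**2) * (L @ rng.standard_normal(2))

# 2) Probability integral transform to uniforms, then to the target marginals.
u = stats.norm.cdf(z)
# Irradiance: zero-mass spike plus a gamma tail (kWh/m^2/day, illustrative parameters).
irr = np.where(u[:, 0] < p_zero, 0.0,
               stats.gamma.ppf((u[:, 0] - p_zero) / (1 - p_zero), a=4.0, scale=1.3))
# Air temperature: normal marginal (deg C, illustrative parameters).
temp = stats.norm.ppf(u[:, 1], loc=22.0, scale=4.0)

print("mean irradiance:", round(irr.mean(), 2), "| zero days:", int((irr == 0).sum()))
print("irradiance-temperature correlation:", round(np.corrcoef(irr, temp)[0, 1], 2))
```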

Keywords: copula autoregressive, solar irradiance forecasting, solar energy forecasting, time series generation

Procedia PDF Downloads 305
34395 Decision Support System in Air Pollution Using Data Mining

Authors: E. Fathallahi Aghdam, V. Hosseini

Abstract:

Environmental pollution is not limited to a specific region or country; that is why sustainable development, as a necessary process for improvement, pays attention to issues such as the destruction of natural resources, the degradation of biological systems, global pollution, and climate change, especially in developing countries. According to the World Health Organization, Tehran (the capital of Iran), as a developing city, is one of the most polluted cities in the world in terms of air pollution. In this study, three pollutants, namely particulate matter less than 10 microns, nitrogen oxides, and sulfur dioxide, were evaluated in Tehran using data mining techniques through the CRISP approach. The data from 21 air pollution measuring stations in different areas of Tehran were collected from 1999 to 2013. The commercial software Clementine was selected for this study. Using the software, Tehran was divided into distinct clusters in terms of the mentioned pollutants. As a data mining technique, clustering is usually used as a prologue to other analyses; therefore, the similarity of the clusters was evaluated in this study by analyzing local conditions, traffic behavior, and industrial activities. In fact, the results of this research can support decision-making systems, help managers improve performance and decision making, and assist in urban studies.
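
Clementine is proprietary, so the sketch below shows an equivalent clustering step in Python with scikit-learn: standardized station-level pollutant averages are grouped with k-means; the synthetic PM10/NOx/SO2 values and the choice of three clusters are assumptions for illustration, not the Tehran data.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Illustrative mean concentrations per monitoring station (columns: PM10, NOx, SO2);
# the study uses 21 Tehran stations observed from 1999 to 2013.
rng = np.random.default_rng(7)
stations = np.vstack([
    rng.normal([90, 70, 25], [8, 6, 3], size=(7, 3)),   # heavy-traffic-like profile
    rng.normal([60, 40, 15], [6, 5, 2], size=(7, 3)),   # intermediate profile
    rng.normal([40, 25, 8],  [5, 4, 2], size=(7, 3)),   # cleaner profile
])

X = StandardScaler().fit_transform(stations)            # put pollutants on a common scale
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)

for label in range(3):
    members = stations[km.labels_ == label]
    print(f"cluster {label}: {len(members)} stations, "
          f"mean PM10/NOx/SO2 = {members.mean(axis=0).round(1)}")
```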

Keywords: data mining, clustering, air pollution, crisp approach

Procedia PDF Downloads 415
34394 A Study of Various Ontology Learning Systems from Text and a Look into Future

Authors: Fatima Al-Aswadi, Chan Yong

Abstract:

With the large volume of unstructured data increasing day by day on the web, the motivation for representing the knowledge in these data in a machine-processable form has increased. Ontology is one of the major cornerstones of representing information in a more meaningful way on the Semantic Web. The goal of ontology learning from text is to elicit and represent domain knowledge in a machine-readable form. This paper aims to give a follow-up review of ontology learning systems from text and some of their defects. Furthermore, it discusses how far the ontology learning process can be enhanced in the future.

Keywords: concept discovery, deep learning, ontology learning, semantic relation, semantic web

Procedia PDF Downloads 492
34393 An Investigation into the Views of Distant Science Education Students Regarding Teaching Laboratory Work Online

Authors: Abraham Motlhabane

Abstract:

This research analysed the written views of science education students regarding the teaching of laboratory work in the online mode. The research adopted a qualitative methodology. The qualitative research was aimed at investigating small and distinct groups, normally regarded as a single-site study. Qualitative research was used to describe and analyse the phenomena from the students' perspective. This means the research began with worldview assumptions and used theoretical lenses on the research problem to inquire into the meanings held by individual students. The research was conducted with three groups of students studying for a Postgraduate Certificate in Education, a Bachelor of Education, and an honours Bachelor of Education, respectively. In each of the study programmes, the science education module is compulsory. Five science education students from each study programme were purposively selected to participate in this research; therefore, 15 students participated. To analyse the data, the data were first printed, and hard copies were used in the analysis. The data were read several times, and key concepts and ideas were highlighted. Themes and patterns were identified to describe the data. Coding, as a process of organising and sorting data, was used. The findings of the study are very diverse: some students are in favour of online laboratory work, whereas other students argue that science can only be learnt through hands-on experimentation.

Keywords: online learning, laboratory work, views, perceptions

Procedia PDF Downloads 123
34392 Traffic Congestions Modeling and Predictions by Social Networks

Authors: Bojan Najdenov, Danco Davcev

Abstract:

Reducing traffic congestion and the pollution and waste of resources that come with it has been a big challenge in the past decades. Having reliable systems to facilitate the process of modeling and predicting traffic conditions would not only reduce environmental pollution but would also save people time and money. Social networks play a big role in people's lives nowadays, providing them with means of communicating and sharing thoughts and ideas, thereby generating huge knowledge bases through crowdsourcing. In addition, crowdsourcing as a concept provides mechanisms for fast and relatively reliable data generation, and many services are used on a regular basis because they are powered mainly by the public as the main content providers. In this paper, we present the Social-NETS-Traffic-Control System (SNTCS), which should serve as a facilitator in the process of modeling and predicting traffic congestion. The main contribution of our system is that it integrates data from social networks such as Twitter and also implements a custom-created crowdsourcing subsystem with which users report traffic conditions using an Android application. Our first experience with the usage of the system confirms that the integrated approach allows easy extension of the system with other social networks and represents a very useful tool for traffic control.

Keywords: traffic, congestion reduction, crowdsource, social networks, twitter, android

Procedia PDF Downloads 461
34391 The Communication Library DIALOG for iFDAQ of the COMPASS Experiment

Authors: Y. Bai, M. Bodlak, V. Frolov, S. Huber, V. Jary, I. Konorov, D. Levit, J. Novy, D. Steffen, O. Subrt, M. Virius

Abstract:

Modern experiments in high energy physics impose great demands on the reliability, the efficiency, and the data rate of Data Acquisition Systems (DAQ). This contribution focuses on the development and deployment of the new communication library DIALOG for the intelligent, FPGA-based Data Acquisition System (iFDAQ) of the COMPASS experiment at CERN. The iFDAQ, utilizing a hardware event builder, is designed to be able to read out data at the maximum rate of the experiment. The DIALOG library is a communication system for both distributed and mixed environments; it provides a network-transparent inter-process communication layer. Using the high-performance and modern C++ framework Qt and its Qt Network API, the DIALOG library presents an alternative to the previously used DIM library. The DIALOG library was fully incorporated into all processes in the iFDAQ during the 2016 run. From the software point of view, it can be considered a significant improvement of the iFDAQ in comparison with the previous run. To extend the debugging possibilities, online monitoring of communication among processes via the DIALOG GUI is a desirable feature. In the paper, we present the DIALOG library from several perspectives and discuss it in detail. Moreover, an efficiency measurement and a comparison with the DIM library with respect to the iFDAQ requirements are provided.

Keywords: data acquisition system, DIALOG library, DIM library, FPGA, Qt framework, TCP/IP

Procedia PDF Downloads 300
34390 Deterioration Prediction of Pavement Load Bearing Capacity from FWD Data

Authors: Kotaro Sasai, Daijiro Mizutani, Kiyoyuki Kaito

Abstract:

Expressways in Japan have been built at an accelerating pace since the 1960s with the aid of rapid economic growth. About 40 percent of the length of expressways in Japan is now 30 years old or older and has become superannuated. Time-related deterioration has therefore reached a degree at which administrators, from the standpoint of operation and maintenance, are forced to take prompt measures on a large scale aimed at repairing inner damage deep in pavements. Such measures have already been implemented for bridge management in Japan and are also expected to be embodied in pavement management; thus, planning methods for these measures are increasingly in demand. Deterioration of the layers around the road surface, such as the surface course and binder course, occurs in the early stages of the whole pavement deterioration process, around 10 to 30 years after construction. These layers have been repaired primarily because inner damage usually becomes significant after outer damage, and because surveys for measuring inner damage, such as Falling Weight Deflectometer (FWD) surveys and open-cut surveys, are costly and time-consuming, which has made it difficult for administrators to focus on inner damage as much as they should. As expressways today suffer serious time-related deterioration deriving from the long time span since they entered service, it is obvious that the idea of repairing layers deep in pavements, such as the base course and subgrade, must be taken into consideration when planning maintenance on a large scale. This sort of maintenance requires precisely predicting degrees of deterioration as well as grasping the present condition of pavements. Methods for predicting deterioration are either mechanical or statistical. While few mechanical models have been presented, as far as the authors know, previous studies have presented statistical methods for predicting deterioration in pavements. One describes the deterioration process by estimating a Markov deterioration hazard model, while another illustrates it by estimating a proportional deterioration hazard model. Both studies analyze deflection data obtained from FWD surveys and present statistical methods for predicting the deterioration process of the layers around the road surface; however, the base course and subgrade layers remain unanalyzed. In this study, data collected from FWD surveys are analyzed to predict the deterioration process of layers deep in pavements, in addition to the surface layers, by means of estimating a deterioration hazard model using continuous indexes. This model can prevent the loss of information that occurs when setting rating categories in a Markov deterioration hazard model for evaluating degrees of deterioration in roadbeds and subgrades. By portraying continuous indexes, the model can predict deterioration in each layer of the pavement and evaluate it quantitatively. Additionally, as the model can also depict the probability distribution of the indexes at an arbitrary point in time and establish a risk control level arbitrarily, it is expected that this study will provide knowledge, such as life cycle cost, that informs the decision-making process on where and when to perform maintenance.
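
To make the cited Markov deterioration hazard idea concrete, the short Python sketch below assumes a constant hazard rate for leaving each condition rating, so sojourn times are exponential and state survival probabilities and expected durations follow directly; the hazard rates are illustrative, and this is not the continuous-index model proposed in the study.

```python
import numpy as np

# Illustrative hazard rates (1/year) for leaving each condition rating; the Markov
# deterioration hazard model assumes exponential sojourn times, so the probability
# of still being in state i after t years is exp(-lambda_i * t).
hazard = {1: 0.08, 2: 0.12, 3: 0.20}   # ratings 1 (good) .. 3 (poor); rating 4 is terminal

def survival(state, t):
    """P(pavement section is still in `state` t years after entering it)."""
    return np.exp(-hazard[state] * t)

def expected_duration(state):
    """Expected sojourn time in a state under a constant hazard rate."""
    return 1.0 / hazard[state]

for s in hazard:
    print(f"state {s}: P(stay 10 yr) = {survival(s, 10):.2f}, "
          f"expected duration = {expected_duration(s):.1f} yr")

# Expected time from new construction (state 1) until reaching the terminal rating:
print("expected time to worst rating:",
      round(sum(expected_duration(s) for s in hazard), 1), "yr")
```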

Keywords: deterioration hazard model, falling weight deflectometer, inner damage, load bearing capacity, pavement

Procedia PDF Downloads 366
34389 Optimal Design of Shape for Increasing the Bonding Pressure Drawing of Hot Clad Pipes by Finite Element Method Analysis

Authors: Seok-Hyeon Park, Joon-Hong Park, Mok-Tan-Ahn, Seong-Hun Ha

Abstract:

Clad pipe, used as a transportation pipe for corrosive crude oil, is made of two different materials, with the internal and external materials being different. Most clad pipes are produced by hot rolling; however, problems arise due to high product prices and an excessive number of process steps. Therefore, in this study, the hot drawing process, which offers advantages in product cost, number of process steps, and productivity, is applied. Because the drawing process pulls the material through the die and reduces the cross-sectional area, the shape of the mold greatly influences the formability of the material and the bonding pressure between the two materials. Also, in the case of hot drawing, if the mold shape is not suitable, the increased fluidity of the material may cause problems such as tearing and excessive stretching. Therefore, in this study, we try to find the mold shape that suppresses the occurrence of defects in the hot drawing process and maximizes the bonding pressure between the two materials through mold shape optimization design by FEM analysis.

Keywords: clad pipe, hot drawing, bonding pressure, mold shape

Procedia PDF Downloads 285
34388 A General Framework for Measuring the Internal Fraud Risk of an Enterprise Resource Planning System

Authors: Imran Dayan, Ashiqul Khan

Abstract:

Internal corporate fraud, which is fraud carried out by internal stakeholders of a company, affects the well-being of the organisation just like its external counterpart. Even if such an act is carried out for the short-term benefit of a corporation, it is ultimately harmful to the entity in the long run. Internal fraud often relies upon aberrations from usual business processes. Business processes are the lifeblood of a company in the modern managerial context; such processes are developed and fine-tuned over time as a corporation grows through its life stages. Modern corporations have embraced technological innovations in their business processes, and Enterprise Resource Planning (ERP) systems being at the heart of such business processes is a testimony to that. Since ERP systems record a huge amount of data in their event logs, the logs are a treasure trove for anyone trying to detect fraudulent activities hidden within day-to-day business operations and processes. This research utilises the ERP systems in place within corporations to assess the likelihood of prospective internal fraud by developing a framework for measuring fraud risk with process mining techniques, thereby finding risky designs and loose ends within these business processes. This framework helps not only in identifying existing cases of fraud in the records of the event log but also signals the overall riskiness of certain business processes, and hence draws attention to redesigning such processes to reduce the chance of future internal fraud while improving internal control within the organisation. The research adds value by applying the concepts of process mining to the analysis of ERP event logs, a modern-day record of business processes, and develops a framework that should be useful to internal stakeholders for strengthening internal control as well as provide external auditors with a tool of use in cases of suspicion. The research demonstrates its usefulness through a few case studies conducted on large corporations with complex business processes and an ERP in place.
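
A minimal process-mining sketch in Python is given below: it derives directly-follows counts from a toy ERP event log and flags transitions that fall outside a reference ("to-be") relation as potential control deviations; the log, activities, and allowed relation are invented for illustration and do not represent the framework's actual rules.

```python
from collections import Counter, defaultdict

# Toy ERP event log: (case id, activity) rows, ordered by timestamp within a case.
event_log = [
    ("PO-1", "Create PO"), ("PO-1", "Approve PO"), ("PO-1", "Receive Goods"), ("PO-1", "Pay Invoice"),
    ("PO-2", "Create PO"), ("PO-2", "Receive Goods"), ("PO-2", "Pay Invoice"),   # approval skipped
    ("PO-3", "Create PO"), ("PO-3", "Approve PO"), ("PO-3", "Pay Invoice"),      # goods receipt skipped
]

# Reference ("to-be") directly-follows relation; anything outside it is suspicious.
allowed = {("Create PO", "Approve PO"), ("Approve PO", "Receive Goods"),
           ("Receive Goods", "Pay Invoice")}

# Group events into traces per case.
traces = defaultdict(list)
for case, activity in event_log:
    traces[case].append(activity)

# Count directly-follows pairs within each trace.
dfg = Counter()
for trace in traces.values():
    dfg.update(zip(trace, trace[1:]))

for (a, b), n in dfg.items():
    flag = "" if (a, b) in allowed else "  <-- deviation, potential fraud indicator"
    print(f"{a} -> {b}: {n}{flag}")
```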

Keywords: enterprise resource planning, fraud risk framework, internal corporate fraud, process mining

Procedia PDF Downloads 315
34387 Handwriting Velocity Modeling by Artificial Neural Networks

Authors: Mohamed Aymen Slim, Afef Abdelkrim, Mohamed Benrejeb

Abstract:

Handwriting is a physical demonstration of a complex cognitive process learnt by humans from childhood. People with disabilities or suffering from various neurological diseases face many difficulties resulting from problems located in the muscle stimuli (EMG) or the signals from the brain (EEG), which arise at the writing stage. The handwriting velocity of the same writer or of different writers varies according to different criteria: age, attitude, mood, writing surface, etc. Therefore, it is interesting to build an experimental database of records taking, as the primary reference, the writing speed of different writers, which would allow studying the global system during the handwriting process. This paper deals with a new approach to modeling the handwriting system based on the velocity criterion through the concepts of artificial neural networks, specifically Radial Basis Function (RBF) neural networks. The simulation results obtained show a satisfactory agreement between the responses of the developed neural model and the experimental data for various letters and forms, and thus the efficiency of the proposed approach.
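
A minimal Python sketch of a Gaussian RBF network fitted by linear least squares to a velocity profile is shown below; the bell-shaped synthetic signal, the number of centres, and the kernel width are assumptions standing in for the experimental handwriting velocity records.

```python
import numpy as np

def rbf_design(t, centers, width):
    """Gaussian radial basis design matrix Phi[i, j] = exp(-(t_i - c_j)^2 / (2 * width^2))."""
    return np.exp(-((t[:, None] - centers[None, :]) ** 2) / (2.0 * width**2))

# Synthetic pen-tip velocity profile (two bell-shaped strokes plus noise), standing in
# for the experimental velocity records of a written letter.
t = np.linspace(0, 1, 200)
velocity = 1.2 * np.exp(-((t - 0.3) / 0.08) ** 2) + 0.8 * np.exp(-((t - 0.7) / 0.1) ** 2)
velocity += np.random.default_rng(1).normal(0, 0.02, t.size)

centers = np.linspace(0, 1, 15)                 # evenly spaced RBF centres (assumed)
Phi = rbf_design(t, centers, width=0.06)
weights, *_ = np.linalg.lstsq(Phi, velocity, rcond=None)  # output-layer weights

reconstruction = Phi @ weights
rmse = np.sqrt(np.mean((velocity - reconstruction) ** 2))
print(f"RBF units: {centers.size}, reconstruction RMSE: {rmse:.4f}")
```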

Keywords: Electro Myo Graphic (EMG) signals, experimental approach, handwriting process, Radial Basis Functions (RBF) neural networks, velocity modeling

Procedia PDF Downloads 423
34386 The Economic Limitations of Defining Data Ownership Rights

Authors: Kacper Tomasz Kröber-Mulawa

Abstract:

This paper addresses the topic of data ownership from an economic perspective and provides examples of the economic limitations of data property rights, identified using the methods and approaches of the economic analysis of law. To build a background for the economic focus, a short overview of data and data ownership in the EU's legal system is provided first. It includes a short introduction to their political and social importance and highlights relevant viewpoints. This stresses the importance of a Single Market for data, but also the far-reaching regulations of data governance and privacy (including the distinction between personal and non-personal data, and between data held by public bodies and by private businesses). The main discussion of this paper builds upon the briefly outlined legal basis as well as the methods and approaches of the economic analysis of law.

Keywords: antitrust, data, data ownership, digital economy, property rights

Procedia PDF Downloads 62