Search results for: process data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 34909

34549 Learning Compression Techniques on Smart Phone

Authors: Farouk Lawan Gambo, Hamada Mohammad

Abstract:

Data compression encodes files in fewer bits than their original representation. It is particularly advantageous on the internet because the smaller a file is, the faster it can be transferred. However, many of the concepts in data compression are abstract in nature and therefore difficult for some students (engineering students in particular) to digest. This paper studies the learning preferences of engineering students, who tend to have strong active, sensing, visual, and sequential learning preferences. It also reviews the three shifts that technology-aided learning has undergone, of which mobile learning is considered the form that will integrate the other forms of the education process. Finally, we propose the design and implementation of a mobile learning application, developed with a software engineering methodology, to enhance the traditional teaching and learning of data compression techniques.

Keywords: data compression, learning preference, mobile learning, multimedia

Procedia PDF Downloads 420
34548 Real-Time Sensor Fusion for Mobile Robot Localization in an Oil and Gas Refinery

Authors: Adewole A. Ayoade, Marshall R. Sweatt, John P. H. Steele, Qi Han, Khaled Al-Wahedi, Hamad Karki, William A. Yearsley

Abstract:

Understanding the behavioral characteristics of sensors is a crucial step in fusing data from several sensors of different types. This paper introduces a practical, real-time approach to integrating heterogeneous sensor data to achieve higher localization accuracy for a mobile robot than would be possible from any individual sensor. We use this approach in both indoor and outdoor environments, and it is especially appropriate for sparse and featureless environments such as oil and gas refineries. We have studied the individual contribution of each sensor's data to the overall accuracy achieved by the fusion process. A sequential-update Extended Kalman Filter (EKF) with validation gates was used to integrate GPS data, compass data, WiFi data, Inertial Measurement Unit (IMU) data, vehicle velocity, and pose estimates from a fiducial marker system. Results show that the approach can enable a mobile robot to navigate autonomously in any environment using a priori information.
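
As an illustration of the sequential-update idea, the sketch below fuses scalar readings one at a time, each guarded by a validation gate. It is a simplified one-dimensional stand-in for the full EKF described in the abstract; the sensor variances, gate threshold, and sample measurements are assumed values, not figures from the paper.

```python
def sequential_update(x, P, measurements, gate=9.0):
    """Fuse scalar measurements one at a time, skipping gated outliers.

    x, P         : state estimate and its variance
    measurements : list of (z, R) pairs per sensor (reading, noise variance)
    gate         : chi-square threshold on the normalized innovation (1 dof)
    """
    for z, R in measurements:
        v = z - x                  # innovation
        S = P + R                  # innovation variance (H = 1)
        if v * v / S > gate:       # validation gate: reject implausible reading
            continue
        K = P / S                  # Kalman gain
        x = x + K * v
        P = (1.0 - K) * P
    return x, P

# Hypothetical example: GPS, compass-derived, and WiFi position estimates of a
# robot near 10 m; the third reading is an outlier and is rejected by the gate.
x, P = sequential_update(x=9.0, P=4.0,
                         measurements=[(10.2, 1.0), (9.8, 2.0), (25.0, 1.0)])
```

Processing measurements sequentially lets each sensor be gated against the current estimate, so a single faulty reading (here 25.0) does not corrupt the fused state.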

Keywords: inspection mobile robot, navigation, sensor fusion, sequential update extended Kalman filter

Procedia PDF Downloads 441
34547 Video Analytics on Pedagogy Using Big Data

Authors: Jamuna Loganath

Abstract:

Education is the key to the development of any individual's personality. Today's students will be tomorrow's citizens of the global society, and a student's education is the edifice on which his or her future is built. Schools should therefore provide an all-round development of students so as to foster a healthy society. The behavior and attitude of students in school play an essential role in the success of the education process. Reports of misbehavior such as clowning, harassing classmates, and verbal insults are becoming frequent in schools today. If this issue is left unattended, it may foster negative attitudes and increase delinquent behavior. The need of the hour is therefore to find a solution: to monitor students' behavior in school, give necessary feedback, and mentor them to develop a positive attitude and become successful adults. Nevertheless, measuring students' behavior and attitude is extremely challenging. No present technology has proven effective for this measurement because the actions, reactions, interactions, and responses of students are rarely captured as data, owing to their complexity. The purpose of this proposal is to recommend an effective supervising system, after carrying out a feasibility study, for measuring the behavior of students. This can be achieved by equipping schools with CCTV cameras. CCTV cameras installed in schools capture the facial expressions and interactions of students inside and outside their classrooms. The real-time raw video captured from the CCTV can be uploaded to the cloud over a network. The video feeds are distributed across nodes in the same rack, or different racks of the same cluster, in Hadoop HDFS. The feeds are split into small frames and analyzed using various pattern recognition algorithms and the MapReduce programming model.
Then, the video frames are compared with a benchmarking database of good behavior. When misbehavior is detected, an alert message can be sent to the counseling department, which helps it mentor the students. This will help improve the effectiveness of the education process. As video feeds come from multiple geographical areas (schools in different parts of the world), big data helps in real-time analysis, revealing patterns, trends, and associations, especially those relating to human behavior and interactions. It also handles data that cannot be analyzed by traditional software such as RDBMSs and OODBMSs, and has proven successful in handling human reactions with ease. Therefore, big data could play a vital role in addressing this issue, and the effectiveness of the education process can be enhanced with the help of video analytics using the latest big data technology.
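
The comparison step can be sketched in miniature: frames are reduced to feature vectors (here just stand-in lists of numbers) and matched against a benchmark database of "good behavior" vectors; frames too far from every benchmark entry are flagged for the counseling department. The features, threshold, and data below are invented for illustration and are not from the proposal.

```python
import math

def euclidean(a, b):
    """Plain Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def flag_misbehavior(frame_vec, benchmark_db, threshold=2.0):
    """Return True if the frame is dissimilar to all benchmark vectors."""
    return min(euclidean(frame_vec, ref) for ref in benchmark_db) > threshold

benchmark_db = [[0.1, 0.2, 0.1], [0.2, 0.1, 0.3]]   # "good behavior" features
alerts = [flag_misbehavior(f, benchmark_db)
          for f in ([0.15, 0.18, 0.2], [3.0, 2.5, 2.8])]
# alerts -> [False, True]: only the second frame triggers an alert
```

In the proposed system, this per-frame check would run inside the map phase of a MapReduce job, with alerts aggregated per school in the reduce phase.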

Keywords: big data, cloud, CCTV, education process

Procedia PDF Downloads 220
34546 Marosok Tradition in the Process of Buying and Selling Cattle in Payakumbuh: A Comparative Study between Adat Law and Positive Law of Indonesia

Authors: Mhd. Zakiul Fikri, M. Agus Maulidi

Abstract:

Indonesia is a constitutional state. As such, Indonesia does not use a single legal system but adopts three: the European continental legal system (the positive law of Indonesia), the adat (customary) law system, and religious legal systems. This study discusses the Marosok tradition in the process of buying and selling cattle in Payakumbuh as a comparative study between adat law and the positive law of Indonesia. The objectives of this research are: first, to find the philosophical meaning of the Marosok tradition in Payakumbuh; second, to find the legal implications of the Marosok tradition reviewed from the perspectives of adat law and the positive law of Indonesia; and third, to find the legal procedure for arbitrating disputes which may arise after the process of buying and selling cattle, based on the positive law and adat law adopted in Indonesia. This is empirical legal research using two approaches, the statute approach and the conceptual approach. Data were obtained through interviews, observations, and documents or books, and were analyzed inductively. This study found that: first, the tradition of Marosok carries the meaning of harmonizing social life, keeping people from negative debate, envy, and arrogance. Second, the Marosok tradition is part of adat law in Indonesia; it is a form of contract law in the process of buying and selling. When the practice of the Marosok tradition as adat law is compared with the provisions of Article 1320 of the Civil Code on the terms of validity of a contract, the elements contained in those provisions are met in the practice of Marosok. Thus, the practice of Marosok in the process of buying and selling cattle in Payakumbuh is justified in the view of the positive law of Indonesia. Finally, all kinds of disputes arising from contracts made under the Marosok tradition can be resolved by the positive law and adat law of Indonesia.

Keywords: Adat law, contract, Indonesia, Marosok

Procedia PDF Downloads 290
34545 Impact of Process Parameters on Tensile Strength of Fused Deposition Modeling Printed Crisscross Polylactic Acid

Authors: Shilpesh R. Rajpurohit, Harshit K. Dave

Abstract:

Additive manufacturing has gained popularity in recent times due to its capability to create prototypes as well as functional end-use products directly from CAD data, without any specific tooling requirement. Fused deposition modeling (FDM) is one of the widely used additive manufacturing techniques for creating functional end-use polymer parts comparable with injection-molded parts. FDM-printed parts find application in various fields such as automotive, aerospace, medical, and electronics. However, the application of FDM parts is greatly limited by their poor mechanical properties. Proper selection of process parameters can enhance the mechanical performance of printed parts. In the present study, an experimental investigation has been carried out to study the mechanical performance of printed parts with respect to process variables. Three process variables, viz. raster angle, raster width, and layer height, have been varied to understand their effect on tensile strength. Further, the effect of process variables on the fractured surfaces has also been investigated.

Keywords: 3D Printing, fused deposition modeling, layer height, raster angle, raster width, tensile strength

Procedia PDF Downloads 176
34544 Software Systems in Implementation of Process Safety in the Chemical Process Industry

Authors: Maryam Shayan

Abstract:

Software systems have been utilized in chemical process industry safety operation and design to enhance their effectiveness. This paper gives a brief survey and analysis of the state of the art and the impact of software systems on process safety. A study was carried out by interviewing staff responsible for process safety practices in the Iranian chemical process industry and reviewing the literature on technology for process safety. This article investigates the functional and operational attributes of software systems for safety and attempts to classify the software according to its level of impact in the management hierarchy. The study contributes to a better understanding of the role of Information and Communication Technology in process safety, its future trends, and possible gaps for research and development.

Keywords: software systems, chemical process industry, process safety, management hierarchy, information and communication technology

Procedia PDF Downloads 344
34543 An Interpretative Phenomenological Analysis on the Concept of Friends of Children in Conflict with the Law

Authors: Karla Kristine Bay, Jovie Ann Gabin, Allana Joyce Sasotona

Abstract:

This research employed an Interpretative Phenomenological Analysis to explore the experiences of Children in Conflict with the Law (CICL), shedding light on their concept of 'friends'. Derived from this context are the following objectives: 1) determining the differentiation of the forms of friends of the CICL; 2) presenting the process of attachment towards detachment in the formation of friendship; and 3) discussing the experiences and reflections of the CICL on the 'self' arising from their encounter with friendship. The data were gathered from the CICL's individual drawings of their representations of the self, family, friends, community, and Bahay Kalinga, used as subjects in the meaning-making process through the Filipino Psychology methods of pagtatanong-tanong (interview) and pakikipagkwentuhan (conversation); data analysis produced a synthesis of seventeen individual cases. Overall, the results generated three superordinate themes on the differentiation of the forms of friends: friends with good influences, friends with bad influences, and friends within the family. Two superordinate themes were produced on the process of attachment towards detachment, namely the social, emotional, and psychological experiences of attachment, and the emotional and psychological experiences of detachment. Lastly, two superordinate themes were created on the experiences and reflections of the CICL on the 'self' arising from their encounter with friendship: recognition of the 'self' as a responsible agent in developing healthy relationships between the self and others, and reconstruction of the self from the collective experiences of healing, forgiveness, and acceptance.
These findings, together with supporting theories, discuss the impact of friendship on the emergence of criminal behavior and other dispositions, springing from the child's dissociation from the family that led to finding belongingness in an external group called friends.

Keywords: children in conflict with the law, criminal behavior, friends, interpretative phenomenological analysis

Procedia PDF Downloads 216
34542 A Framework for Information Quality in Accounting Information Systems Adoption

Authors: Wongsim Manirath

Abstract:

To implement accounting information system (AIS) adoption successfully, it is important to consider the quality of information management and understand the Information Quality (IQ) factors influencing adoption. This research explores ways of managing AIS adoption within organisations and has led to the development of a framework for understanding the AIS adoption process. The research used qualitative, interpretive evidence: the framework was developed from ten case studies in which qualitative data were collected through interviews, examining how IQ is managed throughout the AIS adoption process. A special focus is placed on determining how organisation size influences information quality practices. The findings are especially useful to SMEs, many of which desire to grow; by dealing better with IQ issues, they stand a better chance of a successful future.

Keywords: data quality, information quality, accounting information system, information management

Procedia PDF Downloads 442
34541 Mixed Integer Programming-Based One-Class Classification Method for Process Monitoring

Authors: Younghoon Kim, Seoung Bum Kim

Abstract:

One-class classification plays an important role in detecting outliers and abnormalities among normal observations. In previous research, several attempts were made to extend the scope of one-class classification techniques to statistical process control problems. In most previous approaches, such as the support vector data description (SVDD) control chart, the design of the control limits is based on the assumption that the proportion of abnormal observations is approximately equal to an expected Type I error rate in Phase I. Because of the limitation of one-class classification techniques based on convex optimization, the proportion of abnormal observations cannot be made exactly equal to the expected Type I error rate: controlling the Type I error rate requires optimizing constraints with integer decision variables, which convex optimization cannot accommodate. This limitation is undesirable, from both theoretical and practical perspectives, for constructing effective control charts. In this work, to address it, we propose a one-class classification algorithm based on mixed integer programming, which can solve problems formulated with both continuous and integer decision variables. The proposed method minimizes the radius of a spherically shaped boundary subject to the constraint that the number of normal observations it encloses equals a constant value specified by the user. By modifying this constant, users can exactly control the proportion of normal data described by the boundary; thus, the proportion of abnormal observations can be made theoretically equal to an expected Type I error rate in Phase I. Moreover, analogous to SVDD, the boundary can be made to describe complex structures by using kernel functions. A new multivariate control chart based on the algorithm is proposed.
This chart uses a monitoring statistic that characterizes the degree to which a point is abnormal, as obtained through the proposed one-class classification, and its control limit is established by the radius of the boundary. The usefulness of the proposed method was demonstrated through experiments with simulated data and real process data from a thin-film transistor liquid crystal display process.
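
The key idea of exactly controlling the enclosed count can be sketched in a radically simplified form: with the boundary center fixed at the data mean (rather than optimized, as in the full mixed-integer program), setting the radius to the k-th smallest distance makes the sphere enclose exactly k of n observations, so the Phase I Type I error rate is (n - k)/n by construction. The data below are invented for illustration.

```python
import math

def spherical_boundary(points, k):
    """Return (center, radius) enclosing exactly k of the points,
    with the center fixed at the sample mean."""
    dim = len(points[0])
    center = [sum(p[i] for p in points) / len(points) for i in range(dim)]
    dists = sorted(math.dist(p, center) for p in points)
    return center, dists[k - 1]          # k-th smallest distance

data = [(0.0, 0.1), (0.2, -0.1), (-0.1, 0.0), (0.1, 0.2), (5.0, 5.0)]
center, radius = spherical_boundary(data, k=4)   # treat 1 of 5 as abnormal
inside = sum(math.dist(p, center) <= radius for p in data)
# inside == 4: the boundary excludes exactly one observation, as requested
```

The proposed method additionally optimizes the center and, via kernel functions, the boundary shape, but the count constraint works the same way.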

Keywords: control chart, mixed integer programming, one-class classification, support vector data description

Procedia PDF Downloads 154
34540 The Influence of Students’ Learning Factor and Parents’ Involvement in Their Learning and Suspension: The Application of Big Data Analysis of Internet of Things Technology

Authors: Chih Ming Kung

Abstract:

This study is an empirical examination of student enrollment and dropout rates from the perspectives of students' learning, parents' involvement, and the learning process. Methods: An investigation was conducted using data collected from an Internet of Things (IoT)-based entry website, records of parents' participation, and the installation pattern of an exit-poll website. Results: The study found that the degree of involvement, the attractiveness of courses, self-performance, and departmental loyalty exert significant influences on four aspects of learning benefits: psychological, physical, social, and educational. Parents' participation also exerts a significant influence on learning benefits. A suitable cloud-based tool was designed to collect dynamic big data on students' learning process. Conclusion: The results can serve as valuable references for the government when making and promoting related policies with a more macro view, and are expected to help schools in the practical study of enrollment promotion.

Keywords: students’ learning factor, parents’ involvement, technology

Procedia PDF Downloads 122
34539 Optimization of Process Parameters in Wire Electrical Discharge Machining of Inconel X-750 for Dimensional Deviation Using Taguchi Technique

Authors: Mandeep Kumar, Hari Singh

Abstract:

The effective optimization of machining process parameters dramatically affects the cost and production time of machined components, as well as the quality of the final products. This paper presents the optimization of a wire electrical discharge machining (WEDM) operation using Inconel X-750 as the work material, with the objective of minimizing dimensional deviation. Six input process parameters of WEDM, namely spark gap voltage, pulse-on time, pulse-off time, wire feed rate, peak current, and wire tension, were chosen as variables to study the process performance. Taguchi's design of experiments methodology was used for planning and designing the experiments. The analysis of variance was carried out for the raw data as well as for the signal-to-noise ratios. Four input parameters and one two-factor interaction were found to be statistically significant for their effects on the response of interest. Confirmation experiments were also performed to validate the predicted results.
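
Since dimensional deviation is a smaller-the-better characteristic in Taguchi analysis, the signal-to-noise ratio for each trial takes the standard form S/N = -10 log10(mean of squared responses). The sketch below shows that computation; the replicate deviations are made-up values, not measurements from the study.

```python
import math

def sn_smaller_the_better(responses):
    """Taguchi S/N ratio (dB) for a smaller-the-better characteristic."""
    return -10.0 * math.log10(sum(y * y for y in responses) / len(responses))

# Three hypothetical replicate dimensional deviations (mm) for one trial
sn = sn_smaller_the_better([0.02, 0.03, 0.025])
# Higher S/N is better: the parameter level with the largest mean S/N
# across trials is selected as optimal
```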

Keywords: ANOVA, DOE, inconel, machining, optimization

Procedia PDF Downloads 176
34538 Application of Knowledge Discovery in Database Techniques in Cost Overruns of Construction Projects

Authors: Mai Ghazal, Ahmed Hammad

Abstract:

Cost overruns in construction projects are a worldwide challenge, since cost performance is one of the main measures of success, along with schedule performance. To overcome this problem, studies have been conducted to investigate the factors behind cost overruns, and projects' historical data have been analyzed to extract new and useful knowledge from them. This research studies and analyzes the effect of some factors causing cost overruns, using historical data from completed construction projects, and then uses these factors to estimate the probability of a cost overrun occurring and to predict its percentage for future projects. First, an intensive literature review was done on the factors that cause cost overruns in construction projects, followed by another review of previous research on applying data mining to cost overruns. Second, a data warehouse was designed in which organizations can store their future project data in a well-organized way so that it can easily be analyzed later. Third, twelve quantitative factors whose data are frequently available in construction projects were selected as the analyzed factors and suggested predictors for the proposed model.
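
One simple way to turn such historical data into an overrun probability is a frequency estimate over comparable past projects. The sketch below is a hypothetical illustration of that step, not the paper's model: the two factors, the tolerance, and the records are invented, whereas the paper selects twelve quantitative factors from real project data.

```python
def overrun_probability(history, project, tolerance=0.2):
    """Share of comparable past projects (all factors within +/- tolerance,
    relative) that ended in a cost overrun; None if no history matches."""
    def comparable(rec):
        return all(abs(rec[k] - v) <= tolerance * max(abs(v), 1e-9)
                   for k, v in project.items())
    matches = [rec for rec in history if comparable(rec)]
    if not matches:
        return None
    return sum(rec["overrun"] for rec in matches) / len(matches)

history = [
    {"duration_months": 12, "budget_musd": 5.0, "overrun": 1},
    {"duration_months": 11, "budget_musd": 4.6, "overrun": 0},
    {"duration_months": 30, "budget_musd": 20.0, "overrun": 1},
]
p = overrun_probability(history, {"duration_months": 12, "budget_musd": 5.0})
# p == 0.5: one of the two comparable past projects overran its budget
```

A well-structured data warehouse, as proposed in the abstract, is what makes such queries over completed projects cheap to run.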

Keywords: construction management, construction projects, cost overrun, cost performance, data mining, data warehousing, knowledge discovery, knowledge management

Procedia PDF Downloads 336
34537 Managing Uncertainty in Unmanned Aircraft System Safety Performance Requirements Compliance Process

Authors: Achim Washington, Reece Clothier, Jose Silva

Abstract:

System Safety Regulations (SSR) are a central component of the airworthiness certification of Unmanned Aircraft Systems (UAS). There is significant debate on the setting of appropriate SSR for UAS. Putting this debate aside, the challenge lies in how to apply the system safety process to UAS, which lack the data and operational heritage of conventionally piloted aircraft. The limited knowledge and lack of operational data result in uncertainty in the system safety assessment of UAS. This uncertainty can lead to incorrect compliance findings and the potential certification and operation of UAS that do not meet minimum safety performance requirements. The existing system safety assessment and compliance processes, as used for conventional piloted aviation, do not adequately account for this uncertainty, limiting their suitability for UAS. This paper discusses the challenges of undertaking system safety assessments for UAS and presents current and envisaged research towards addressing these challenges. It aims to highlight the main advantages of adopting a risk-based framework for the System Safety Performance Requirement (SSPR) compliance process, one capable of taking into consideration the uncertainty associated with each output of the system safety assessment process. This study makes clear that a framework tailored to UAS would allow a more rational, transparent, and systematic approach to decision making, reducing the need for conservative assumptions while taking the risk posed by each UAS into consideration when determining its state of compliance with the SSR.
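
The risk-based idea can be sketched with a small Monte Carlo experiment: instead of comparing a point estimate of the UAS failure rate against the requirement, treat the rate as an uncertain quantity and compute the probability that the target is met. The lognormal spread and the 1e-6 per-flight-hour target below are illustrative stand-ins, not values from any regulation.

```python
import math
import random

random.seed(42)

def compliance_probability(median_rate, gsd, target, n=100_000):
    """Monte Carlo estimate of P(failure rate <= target) when the rate is
    lognormally distributed with the given median and geometric std dev."""
    mu, sigma = math.log(median_rate), math.log(gsd)
    met = sum(random.lognormvariate(mu, sigma) <= target for _ in range(n))
    return met / n

# Median estimate just below the target, but with wide uncertainty:
p = compliance_probability(median_rate=8e-7, gsd=3.0, target=1e-6)
# A point-estimate check would declare compliance outright, while the
# risk-based view shows a substantial probability of non-compliance
```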

Keywords: Part 1309 regulations, risk models, uncertainty, unmanned aircraft systems

Procedia PDF Downloads 160
34536 Methodology for Temporary Analysis of Production and Logistic Systems on the Basis of Distance Data

Authors: M. Mueller, M. Kuehn, M. Voelker

Abstract:

In small and medium-sized enterprises (SMEs), the challenge is to create a well-grounded and reliable basis for process analysis, optimization, and planning despite a lack of data. SMEs have limited access to methods with which they can effectively and efficiently analyse processes and identify cause-and-effect relationships in order to generate the necessary database and derive optimization potential from it. The implementation of digitalization within the framework of Industry 4.0 thus becomes a particular necessity for SMEs. For these reasons, this abstract presents an analysis methodology with the objective of developing an SME-appropriate methodology for efficient, temporarily deployable data collection and evaluation in flexible production and logistics systems as a basis for process analysis and optimization. The overall methodology focuses on retrospective, event-based tracing and analysis of material flow objects. The technological basis consists of Bluetooth Low Energy (BLE) transmitters, so-called beacons, and smart mobile devices (SMDs), e.g. smartphones, as receivers, between which distance data can be measured and from which motion profiles can be derived. The distance is determined using the Received Signal Strength Indicator (RSSI), a measure of signal field strength between transmitter and receiver. The focus is the development of a software-based methodology for interpreting the relative movements of transmitters and receivers based on distance data. The main research concerns the selection and implementation of pattern recognition methods for automatic process recognition, as well as methods for visualizing relative distance data. Because the database is already categorized by process type, classification methods from the field of supervised learning (e.g. support vector machines) are used.
The necessary data quality requires the selection of suitable methods as well as filters for smoothing the signal variations that occur in the RSSI, the integration of methods for determining correction factors for possible sources of signal interference (columns, pallets), and the configuration of the technology used. The parameter settings on which the respective algorithms are based have a further significant influence on the result quality of the classification methods, correction models, and visualization methods used. Studies have already shown that the accuracy of classification algorithms can be improved by up to 30% through selected parameter variation; similar potential can be observed when varying the parameters of methods and filters for signal smoothing. There is therefore increased interest in obtaining detailed results on the influence of parameter and factor combinations on data quality in this area. The overall methodology is realized with a modular software architecture consisting of independent modules for data acquisition, data preparation, and data storage. The demonstrator for initialization and data acquisition is available as a mobile Java-based application. The data preparation, including methods for signal smoothing, is Python-based, with the possibility of varying parameter settings and storing them in the database (SQLite). The evaluation is divided into two separate software modules with database connections: automated assignment of defined process classes to distance data using selected classification algorithms, and visualization and reporting via a graphical user interface (GUI).
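
The distance derivation can be sketched as two steps: smooth the RSSI readings (here with a plain trailing moving average) and convert them to distance with the standard log-distance path-loss model. The reference power at 1 m and the path-loss exponent below are typical indoor values assumed for illustration; the methodology additionally applies correction factors for interference sources such as columns and pallets.

```python
def moving_average(values, window=3):
    """Smooth RSSI fluctuations with a trailing moving average."""
    return [sum(values[max(0, i - window + 1):i + 1]) /
            len(values[max(0, i - window + 1):i + 1])
            for i in range(len(values))]

def rssi_to_distance(rssi, rssi_at_1m=-59.0, path_loss_exp=2.0):
    """Log-distance path-loss model: d = 10 ** ((P_1m - RSSI) / (10 * n))."""
    return 10 ** ((rssi_at_1m - rssi) / (10 * path_loss_exp))

raw = [-72, -70, -74, -71, -73]          # beacon RSSI samples (dBm), invented
smoothed = moving_average(raw)
distances = [rssi_to_distance(r) for r in smoothed]
# all distances land near 4-5 m; without smoothing they would jump more
```

Sequences of such distances per beacon-receiver pair form the motion profiles that the supervised classifiers then assign to process classes.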

Keywords: event-based tracing, machine learning, process classification, parameter settings, RSSI, signal smoothing

Procedia PDF Downloads 103
34535 Thermal Comfort Study of School Buildings in South Minahasa Regency Case Study: SMA Negeri 1 Amurang, Indonesia

Authors: Virgino Stephano Moniaga

Abstract:

Thermal comfort inside a building can affect students in their learning process: students' learning improves when classroom conditions are comfortable. This study is conducted at SMA Negeri 1 Amurang, a senior high school located in South Minahasa Regency. Based on a preliminary survey, students were generally not satisfied with the existing level of comfort, which affected teaching and learning in the classroom. The purpose of this study is to analyze the comfort level of classroom occupants and recommend building design solutions that can improve the thermal comfort of the classrooms. Three naturally ventilated classrooms will be selected for thermal comfort measurements. The measured data comprise personal data (clothing and student activity), air humidity, air temperature, mean radiant temperature, and air flow velocity. Simultaneously, the students will be asked to fill out a questionnaire on the level of comfort felt at the time. The results of the field measurements and questionnaires will be analyzed based on the PMV and PPD indices to decide whether the classrooms are comfortable. This study can be continued to obtain a more optimal design solution for improving the thermal comfort of the classrooms. The expected results can improve the quality of the teaching and learning process between teachers and students, further assisting the government's efforts to improve the quality of national education.
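
The PPD step of such an analysis follows the standard ISO 7730 relation, which maps a Predicted Mean Vote (PMV) to the Predicted Percentage of Dissatisfied (PPD). The PMV values below are illustrative, not measurements from the classrooms.

```python
import math

def ppd_from_pmv(pmv):
    """ISO 7730: PPD = 100 - 95 * exp(-0.03353*PMV^4 - 0.2179*PMV^2)."""
    return 100.0 - 95.0 * math.exp(-0.03353 * pmv ** 4 - 0.2179 * pmv ** 2)

neutral = ppd_from_pmv(0.0)   # even at PMV = 0, 5% of occupants are dissatisfied
warm = ppd_from_pmv(1.5)      # a warm classroom; typical comfort criteria
                              # require PPD at or below roughly 10%
```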

Keywords: classrooms, PMV, PPD, thermal comfort

Procedia PDF Downloads 291
34534 Improving Security in Healthcare Applications Using Federated Learning System With Blockchain Technology

Authors: Aofan Liu, Qianqian Tan, Burra Venkata Durga Kumar

Abstract:

Data security is of the utmost importance in healthcare, as sensitive patient information is constantly transmitted and analyzed by many different parties. Federated learning, which enables data to be evaluated locally on devices rather than transferred to a central server, has emerged as a potential solution for protecting the privacy of user information. However, federated learning alone might not be adequate to protect against data breaches and unauthorized access; in this context, blockchain technology can provide the system with extra protection. This study proposes a distributed federated learning system built on blockchain technology to enhance security in healthcare, making it possible for a wide variety of healthcare providers to collaborate on data analysis without raising concerns about data confidentiality. The technical aspects of the system, including the design and implementation of distributed learning algorithms, consensus mechanisms, and smart contracts, are also investigated. The proposed technique is a workable alternative that addresses healthcare security concerns while fostering collaborative research and data exchange.
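
A toy sketch of the combination, under heavy assumed simplifications: each provider contributes local model weights, the server aggregates them FedAvg-style weighted by sample count, and every accepted update is appended to a hash-chained ledger so that tampering with any past update invalidates the chain. A real system would exchange gradients from actual models and add a consensus protocol and smart contracts, none of which are shown here.

```python
import hashlib
import json

def fedavg(updates):
    """Weighted average of local weights: updates = [(weights, n_samples)]."""
    total = sum(n for _, n in updates)
    dim = len(updates[0][0])
    return [sum(w[i] * n for w, n in updates) / total for i in range(dim)]

def append_block(chain, update):
    """Append an update to the ledger, chained by SHA-256 of the predecessor."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    payload = json.dumps({"update": update, "prev": prev}, sort_keys=True)
    chain.append({"update": update, "prev": prev,
                  "hash": hashlib.sha256(payload.encode()).hexdigest()})

chain = []
hospital_updates = [([0.2, 0.4], 100), ([0.6, 0.0], 300)]  # invented weights
for weights, _ in hospital_updates:
    append_block(chain, weights)
global_weights = fedavg(hospital_updates)   # sample-weighted aggregation
```

Verifying the chain means recomputing each block's hash from its payload and checking it matches the successor's `prev` field.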

Keywords: data privacy, distributed system, federated learning, machine learning

Procedia PDF Downloads 84
34533 The Face Sync-Smart Attendance

Authors: Bekkem Chakradhar Reddy, Y. Soni Priya, Mathivanan G., L. K. Joshila Grace, N. Srinivasan, Asha P.

Abstract:

Currently, there are many problems related to marking attendance in schools, offices, and other places, and organizations tasked with collecting daily attendance data have numerous concerns. There are different ways to mark attendance; the most common is collecting the data manually by calling each student, which is a long and error-prone process. Many new technologies now help to mark attendance automatically, reducing work and recording the data. We propose to implement attendance marking using the latest technologies: a system based on face identification and analysis. The project is developed by gathering faces and analyzing the data, using deep learning algorithms to recognize faces effectively. The attendance data are recorded and forwarded to the host by email. The project was implemented in Python; the libraries used are OpenCV (cv2), face_recognition, and smtplib.
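
The reporting step can be sketched with the standard library alone: the day's recognized names are formatted into an email message with `email.mime.text` and sent via `smtplib`. The names, addresses, and SMTP host below are placeholders; the send itself is commented out so the sketch stays self-contained.

```python
from email.mime.text import MIMEText
import smtplib  # needed only when actually sending

def build_attendance_mail(present, sender, recipient):
    """Build an attendance report email from a list of recognized names."""
    body = "Attendance for today:\n" + "\n".join(f"- {name}" for name in present)
    msg = MIMEText(body)
    msg["Subject"] = f"Attendance report ({len(present)} present)"
    msg["From"] = sender
    msg["To"] = recipient
    return msg

msg = build_attendance_mail(["Asha", "Bekkem", "Priya"],
                            "noreply@school.example", "host@school.example")
# To send (assumed SMTP host and credentials):
# with smtplib.SMTP("smtp.school.example", 587) as s:
#     s.starttls()
#     s.login(user, password)
#     s.send_message(msg)
```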

Keywords: python, deep learning, face recognition, OpenCV, smtplib, dlib

Procedia PDF Downloads 29
34532 A Process to Support Multidisciplinary Teams to Design Serious Games

Authors: Naza Djafarova, Tony Bates, Margaret Verkuyl, Leonora Zefi, Ozgur Turetken, Alex Ferworn, Mastrilli Paula, Daria Romaniuk, Kosha Bramesfeld, Anastasia Dimitriadou, Cheryl To

Abstract:

Designing serious games for education is a challenging and resource-intensive effort. If a well-designed process that balances pedagogical principles with game mechanics is in place, it can help to simplify the design process of serious games and increase efficiency. Multidisciplinary teams involved in designing serious games can benefit tremendously from such a process in their endeavours to develop and implement these games at undergraduate and graduate levels. This paper presentation will outline research results on identified gaps within existing processes and frameworks and present an adapted process that emerged from the research. The research methodology was based on a survey, semi-structured interviews and workshops for testing the adapted process for game design. Based on the findings, the authors propose a simple process for the pre-production stage of serious game design that may help guide multidisciplinary teams in their work. This process was used to facilitate team brainstorming, and is currently being tested to assess if multidisciplinary teams find value in using it in their process of designing serious games.

Keywords: serious game design, multidisciplinary team, game design framework, learning games, multidisciplinary game design process

Procedia PDF Downloads 394
34531 Feature Selection Approach for the Classification of Hydraulic Leakages in Hydraulic Final Inspection Using Machine Learning

Authors: Christian Neunzig, Simon Fahle, Jürgen Schulz, Matthias Möller, Bernd Kuhlenkötter

Abstract:

Manufacturing companies face global competition and enormous cost pressure. The use of machine learning applications can help reduce production costs and create added value. Predictive quality secures product quality through data-supported predictions, using machine learning models as a basis for decisions on test results. Machine learning methods can process large amounts of data, deal with unfavourable row-column ratios, detect dependencies between the covariates and the given target, and assess the multidimensional influence of all input variables on the target. Real production data are often subject to highly fluctuating boundary conditions and unbalanced data sets; changes in production data manifest themselves in trends, systematic shifts, and seasonal effects. Machine learning applications therefore require intensive pre-processing and feature selection. Data pre-processing includes rule-based data cleaning, the application of dimensionality reduction techniques, and the identification of comparable data subsets. Within the real data set of Bosch hydraulic valves used here, the comparability of production conditions within certain time periods is identified by applying a concept drift method. Furthermore, a classification model is developed to evaluate the feature importance in different subsets within the identified time periods. By selecting comparable and stable features, the number of features used can be reduced significantly without a strong decrease in predictive power. The use of cross-process production data along the value chain of hydraulic valves is a promising approach for predicting the quality characteristics of workpieces. In this research, an AdaBoost classifier is used to predict the leakage of hydraulic valves based on geometric gauge blocks from machining, mating data from assembly, and hydraulic measurement data from end-of-line testing. In addition, the most suitable methods are selected, and accurate quality predictions are achieved.
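
As a rough illustration of the approach described, the sketch below trains scikit-learn's AdaBoostClassifier on synthetic data (standing in for the proprietary Bosch valve data) and keeps only the features whose boosting importance exceeds the mean. The mean-importance threshold is one simple selection rule, not necessarily the one used by the authors.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the valve data: 20 process features (gauge,
# assembly, end-of-line measurements) with a binary "leakage" target.
X, y = make_classification(n_samples=500, n_features=20, n_informative=5,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = AdaBoostClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)

# Keep only features whose boosting importance is above the mean,
# then retrain on the reduced feature set.
keep = clf.feature_importances_ > clf.feature_importances_.mean()
clf_small = AdaBoostClassifier(n_estimators=100, random_state=0)
clf_small.fit(X_tr[:, keep], y_tr)

print(keep.sum(), "of", X.shape[1], "features kept")
print("full model accuracy:", round(clf.score(X_te, y_te), 3))
print("reduced model accuracy:", round(clf_small.score(X_te[:, keep], y_te), 3))
```

On stable features, the reduced model typically retains most of the full model's predictive power, which is the effect the paper reports.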

Keywords: classification, machine learning, predictive quality, feature selection

Procedia PDF Downloads 141
34530 Reviewing Privacy Preserving Distributed Data Mining

Authors: Sajjad Baghernezhad, Saeideh Baghernezhad

Abstract:

Given the ever-growing volume of data being produced, methods such as data mining for extracting knowledge have become unavoidable. One key issue in data mining is the inherently distributed nature of the data: the parties that create or collect it, whether corporate or private, do not freely give their information to others, and there is no guarantee that someone can mine specific data without intruding on the owner's privacy. Privacy-preserving approaches depend on whether the data is partitioned vertically or horizontally across parties and on the protection technique applied when data is sent and gathered. In this study, we comprehensively compare privacy-preserving data mining methods, including general techniques such as data randomization and cryptographic encoding, and examine the strong and weak points of each.
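
One of the general methods mentioned, random data (randomization), can be sketched in a few lines: each owner adds zero-mean noise to a sensitive value before sharing it, so individual records are masked while aggregates remain recoverable. The numbers below are synthetic and purely illustrative.

```python
import random
import statistics

# Randomization: each data owner perturbs a sensitive value with zero-mean
# noise before sharing; the miner can still recover aggregates such as the
# mean, because the noise cancels out in expectation.
rng = random.Random(42)

true_values = [rng.gauss(50, 5) for _ in range(10_000)]   # private data
noise = [rng.gauss(0, 20) for _ in true_values]           # masking noise
shared = [v + n for v, n in zip(true_values, noise)]      # what is sent

true_mean = statistics.fmean(true_values)
recovered_mean = statistics.fmean(shared)   # close to the true mean
print(round(true_mean, 1), round(recovered_mean, 1))
```

The trade-off the survey discusses is visible here: larger noise gives each individual record more protection but makes aggregate estimates noisier.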

Keywords: data mining, distributed data mining, privacy protection, privacy preserving

Procedia PDF Downloads 492
34529 Integrated Model for Enhancing Data Security Performance in Cloud Computing

Authors: Amani A. Saad, Ahmed A. El-Farag, El-Sayed A. Helali

Abstract:

Cloud computing is an important and promising field of the recent decade. It allows people all over the world to share resources, services, and information. Although the advantages of using clouds are great, a cloud also carries many risks, and data security is the most important and critical problem of cloud computing. This research proposes a new security model for cloud computing that ensures secure communication, hides information from other users, and saves the user's time. In the proposed model, the Blowfish encryption algorithm is used for exchanging information or data, and the SHA-2 cryptographic hash algorithm is used for data integrity. User authentication relies on a username and password, with the password protected by a one-way SHA-2 hash. The proposed system shows an improvement in the processing time of uploading and downloading files on the cloud in secure form.
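
The SHA-2 parts of the proposed model can be sketched with Python's standard library alone (the Blowfish encryption of the payload would require a third-party library such as PyCryptodome and is omitted). The salt shown is standard practice rather than something the model specifies.

```python
import hashlib
import hmac
import os

# One-way SHA-2 password storage for the authentication step.
def hash_password(password: str, salt: bytes) -> bytes:
    return hashlib.sha256(salt + password.encode()).digest()

def verify_password(password: str, salt: bytes, stored: bytes) -> bool:
    # constant-time comparison avoids timing side channels
    return hmac.compare_digest(hash_password(password, salt), stored)

# SHA-2 integrity check for an uploaded file: the cloud stores the digest
# and the client recomputes it after download to detect tampering.
def file_digest(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

salt = os.urandom(16)
stored = hash_password("s3cret", salt)
print(verify_password("s3cret", salt, stored))   # True
print(verify_password("wrong", salt, stored))    # False
print(file_digest(b"report.pdf contents")[:16])
```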

Keywords: cloud computing, data security, SaaS, PaaS, IaaS, Blowfish

Procedia PDF Downloads 452
34528 The Right to Data Portability and Its Influence on the Development of Digital Services

Authors: Roman Bieda

Abstract:

The General Data Protection Regulation (GDPR) will come into force on 25 May 2018, creating a new legal framework for the protection of personal data in the European Union. Article 20 of the GDPR introduces a right to data portability. This right allows data subjects to receive the personal data which they have provided to a data controller in a structured, commonly used, and machine-readable format, and to transmit this data to another data controller. By facilitating the transfer of personal data between IT environments (e.g. applications), the right to data portability will also make it easier to change service providers (e.g. to switch banks or cloud computing providers). It will therefore contribute to the development of competition and the digital market. The aim of this paper is to discuss the right to data portability and its influence on the development of new digital services.

Keywords: data portability, digital market, GDPR, personal data

Procedia PDF Downloads 445
34527 Evidence Theory Enabled Quickest Change Detection Using Big Time-Series Data from Internet of Things

Authors: Hossein Jafari, Xiangfang Li, Lijun Qian, Alexander Aved, Timothy Kroecker

Abstract:

Traditionally in sensor networks, and recently in the Internet of Things, numerous heterogeneous sensors are deployed in a distributed manner to monitor a phenomenon that can often be modeled by an underlying stochastic process. The big time-series data collected by the sensors must be analyzed to detect changes in the stochastic process as quickly as possible while keeping the false alarm rate tolerable. However, sensors may differ in accuracy and sensitivity range, and they degrade over time. As a result, the collected time-series data contain uncertainties and are sometimes conflicting. In this study, we present a framework that exploits the ability of Evidence Theory (a.k.a. Dempster-Shafer and Dezert-Smarandache Theories) to represent and manage uncertainty and conflict, in order to achieve fast change detection and deal effectively with complementary hypotheses. Specifically, the Kullback-Leibler divergence is used as the similarity metric to calculate the distances between the estimated current distribution and the pre- and post-change distributions. Mass functions are then calculated, and the related combination rules are applied to combine the mass values across all sensors. Furthermore, we estimate the minimum number of sensors that need to be combined, improving computational efficiency. A cumulative sum (CUSUM) test is then applied to the ratio of pignistic probabilities to detect and declare the change for decision-making purposes. Simulation results using both synthetic data and real data from an experimental setup demonstrate the effectiveness of the presented schemes.
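
A greatly simplified, single-sensor version of the detection pipeline can be sketched as follows: the closed-form KL divergence between assumed Gaussian pre- and post-change distributions, and a CUSUM test on the log-likelihood ratios. The evidence-theoretic combination of mass functions across sensors and the pignistic transformation are omitted, and all distribution parameters are illustrative.

```python
import random

def gaussian_kl(mu0, mu1, sigma):
    """KL divergence between two Gaussians with equal variance."""
    return (mu0 - mu1) ** 2 / (2 * sigma ** 2)

def cusum(samples, mu0, mu1, sigma, threshold):
    """Page's CUSUM test: accumulate log-likelihood ratios and declare a
    change when the running statistic crosses the threshold."""
    stat = 0.0
    for n, x in enumerate(samples):
        llr = (mu1 - mu0) / sigma**2 * (x - (mu0 + mu1) / 2)
        stat = max(0.0, stat + llr)     # reset at zero, never go negative
        if stat > threshold:
            return n                    # sample index of the alarm
    return None

rng = random.Random(1)
pre = [rng.gauss(0.0, 1.0) for _ in range(200)]    # pre-change regime
post = [rng.gauss(1.5, 1.0) for _ in range(200)]   # post-change regime

alarm = cusum(pre + post, mu0=0.0, mu1=1.5, sigma=1.0, threshold=10.0)
print("KL(pre || post):", gaussian_kl(0.0, 1.5, 1.0))
print("change declared at sample:", alarm)
```

The larger the KL divergence between the regimes, the faster the CUSUM statistic drifts upward after the change, so detection delay shrinks.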

Keywords: CUSUM, evidence theory, KL divergence, quickest change detection, time series data

Procedia PDF Downloads 306
34526 Presenting a Model in the Analysis of Supply Chain Management Components by Using Statistical Distribution Functions

Authors: Ramin Rostamkhani, Thurasamy Ramayah

Abstract:

One of the most important topics for today's industrial organizations is the challenging issue of supply chain management. Scientists and researchers have published numerous practical articles and models in this field, especially in the last decade. In this research, to the best of our knowledge for the first time, the modeling of supply chain management data using well-known statistical distribution functions is considered. In an analytical process, describing different aspects of these functions, including the probability density, cumulative distribution, reliability, and failure functions, leads to the most suitable statistical distribution function for each component of supply chain management, which can then be applied to predict the future behavior of that component. Providing a model that fits the best statistical distribution function to each supply chain management component would be a significant advance in understanding the behavior of supply chain elements in today's industrial organizations. As a final step, the results of the proposed model are demonstrated by comparing process capability indices before and after its implementation, and the approach is verified through the relevant assessment. The introduced approach can save the time and cost required to achieve organizational goals and can increase added value in the organization.
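
The core idea of matching a component's data to a distribution can be illustrated with a maximum-likelihood comparison in plain Python. The lead-time data below are synthetic, and a full implementation would consider many more candidate distributions (plus the reliability and failure functions discussed above).

```python
import math
import random
import statistics

def normal_loglik(xs):
    """Log-likelihood of the data under a normal MLE fit."""
    mu = statistics.fmean(xs)
    sigma = statistics.pstdev(xs)
    return sum(-0.5 * math.log(2 * math.pi * sigma**2)
               - (x - mu) ** 2 / (2 * sigma**2) for x in xs)

def exponential_loglik(xs):
    """Log-likelihood under an exponential MLE fit (rate = 1/mean)."""
    lam = 1 / statistics.fmean(xs)
    return sum(math.log(lam) - lam * x for x in xs)

rng = random.Random(7)
# Hypothetical supplier lead times in days (an exponential-like delay process).
lead_times = [rng.expovariate(0.5) for _ in range(2000)]

fits = {"normal": normal_loglik(lead_times),
        "exponential": exponential_loglik(lead_times)}
best = max(fits, key=fits.get)
print("best-fitting distribution:", best)
```

The winning distribution's density and cumulative functions can then be used to forecast the component's future behavior, as the abstract proposes.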

Keywords: analyzing, process capability indices, statistical distribution functions, supply chain management components

Procedia PDF Downloads 66
34525 Compliance and Assessment Process of Information Technology in Accounting, in Turkey

Authors: Kocakaya Eda, Argun Doğan

Abstract:

This study analyzed the present state of information technology in the field of accounting through a bibliometric analysis of scientific studies on the impact of e-billing and tax management transformation in Turkey. Using comparative bibliometric analysis, we examined the innovations and positive effects of the e-transformation process in accounting, both in businesses and in the information technologies used for accounting and tax management. Based on the data obtained from these analyses, we offer suggestions on the use of information technologies in accounting and tax management and highlight the positive and negative effects of e-transformation on the analyzed activities of the enterprises. With the e-transformation that will be realized through the most efficient use of information technologies in Turkey, the synergy and efficiency of IT developments in accounting and finance should be revealed in the light of scientific data, from the smallest business to the largest economic enterprise.

Keywords: information technologies, e-invoice, e-tax management, e-transformation, accounting programs

Procedia PDF Downloads 93
34524 Removal of Hexavalent Chromium from Aqueous Solutions by Biosorption Using Macadamia Nutshells: Effect of Different Treatment Methods

Authors: Vusumzi E. Pakade, Themba D. Ntuli, Augustine E. Ofomaja

Abstract:

Macadamia nutshell biosorbents treated by three different methods (raw Macadamia nutshell powder (RMN), acid-treated Macadamia nutshell (ATMN), and base-treated Macadamia nutshell (BTMN)) were investigated for the adsorption of Cr(VI) from aqueous solutions. Fourier transform infrared spectroscopy (FT-IR) spectra of free and Cr(VI)-loaded sorbents, as well as thermogravimetric analysis (TGA), revealed that the acid and base treatments modified the surface properties of the sorbents. The optimum conditions for Cr(VI) adsorption were pH 2, a contact time of 10 h, an adsorbent dosage of 0.2 g L-1, and a concentration of 100 mg L-1. The different treatment methods altered the surface characteristics of the sorbents and produced different maximum binding capacities of 42.5, 40.6, and 37.5 mg g-1 for RMN, ATMN, and BTMN, respectively. The data were fitted to the Langmuir, Freundlich, Redlich-Peterson, and Sips isotherms; no single model could clearly explain the data, perhaps due to the complexity of the processes taking place. The kinetic modeling results showed that Cr(VI) biosorption by the Macadamia sorbents was better described by pseudo-second-order kinetics, indicating chemical sorption. These results show that the three treatment methods yielded different surface properties, which in turn influenced the adsorption of Cr(VI) differently.
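
The pseudo-second-order fit mentioned above is commonly done on the linearized form t/qt = 1/(k·qe²) + t/qe. The sketch below fits noise-free synthetic data generated with the reported RMN capacity of 42.5 mg/g and an assumed rate constant, so the recovered parameters simply confirm the algebra.

```python
def pseudo_second_order(t, qe, k):
    """Uptake qt at time t under pseudo-second-order kinetics."""
    return (k * qe**2 * t) / (1 + k * qe * t)

def linreg(xs, ys):
    """Ordinary least squares: returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

def fit_pseudo_second_order(times, qt):
    """Fit the linearized form t/qt = 1/(k*qe^2) + t/qe; return (qe, k)."""
    slope, intercept = linreg(times, [t / q for t, q in zip(times, qt)])
    qe = 1 / slope                   # slope = 1/qe
    k = slope ** 2 / intercept       # intercept = 1/(k*qe^2)
    return qe, k

# Synthetic uptake curve: qe = 42.5 mg/g (the RMN capacity reported above)
# and an assumed rate constant k = 0.01 g/(mg*h).
times = [0.5, 1, 2, 4, 6, 8, 10]
qt = [pseudo_second_order(t, 42.5, 0.01) for t in times]
qe_fit, k_fit = fit_pseudo_second_order(times, qt)
print(round(qe_fit, 1), round(k_fit, 4))   # 42.5 0.01
```

With real, noisy uptake data the regression would recover qe and k only approximately, and the R² of this line is the usual goodness-of-fit check.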

Keywords: biosorption, chromium(VI), isotherms, Macadamia, reduction, treatment

Procedia PDF Downloads 239
34523 Operational Advantages of Tungsten Inert Gas over Metal Inert Gas Welding Process

Authors: Emmanuel Ogundimu, Esther Akinlabi, Mutiu Erinosho

Abstract:

In this research, studies were done on the material characterization of type 304 austenitic stainless steel welds produced by the TIG (Tungsten Inert Gas) and MIG (Metal Inert Gas) welding processes. The research aimed to establish optimized process parameters that result in a defect-free weld joint. A homogeneous distribution of iron (Fe), chromium (Cr), and nickel (Ni) was observed at the welded joint of all six samples. The welded sample produced at a current of 170 A by the TIG welding process had the highest ultimate tensile strength (UTS) value of 621 MPa in the weld zone, while the sample produced by the MIG process at a welding current of 150 A had the lowest UTS value of 568 MPa. It was therefore established that the TIG welding process is more appropriate than the MIG welding process for welding type 304 austenitic stainless steel.

Keywords: microhardness, microstructure, tensile strength, shear stress, MIG welding, TIG welding, TIG-MIG welding

Procedia PDF Downloads 162
34522 Efficiency of Membrane Distillation to Produce Fresh Water

Authors: Sabri Mrayed, David Maccioni, Greg Leslie

Abstract:

Seawater desalination has been accepted as one of the most effective solutions to the growing problem of a diminishing clean drinking water supply. Currently, two desalination technologies dominate the market: thermally driven multi-stage flash distillation (MSF) and membrane-based reverse osmosis (RO). In recent years, however, membrane distillation (MD) has emerged as a potential alternative to these established means of desalination. This research project set out to determine the viability of MD as an alternative to MSF and RO for seawater desalination. Specifically, the project involved a thermodynamic analysis of the process based on the second law of thermodynamics to determine the efficiency of MD. Data were obtained from experiments carried out on a laboratory rig. To determine the exergy values required for the exergy analysis, two separate models were built in Engineering Equation Solver: the 'Minimum Separation Work Model' and the 'Stream Exergy Model'. The second-law efficiency of the MD process was found to be 17.3%, and the energy consumption was determined to be 4.5 kWh to produce one cubic meter of fresh water. The results indicate that MD has potential as a technique for seawater desalination compared to RO and MSF, but only if an alternative energy source, such as green or waste energy, is available to provide the thermal input to the process. If the process were required to power itself, it would be highly inefficient and in no way thermodynamically viable as a commercial desalination process.
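
The reported 17.3% figure is consistent with a second-law efficiency computed as the minimum separation work divided by the actual specific energy consumption. The minimum-work value of about 0.78 kWh/m³ for seawater used below is a commonly cited estimate, an assumption here; the paper computes its own value in EES.

```python
# Second-law (exergy) efficiency: the ratio of the thermodynamic minimum
# separation work to the actual energy consumed per cubic metre of product.
w_min = 0.78      # kWh/m^3, assumed minimum separation work for seawater
w_actual = 4.5    # kWh/m^3, measured MD consumption (from the paper)

eta_II = w_min / w_actual
print(f"second-law efficiency: {eta_II:.1%}")   # 17.3%
```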

Keywords: desalination, exergy, membrane distillation, second law efficiency

Procedia PDF Downloads 336
34521 Cognitive Science Based Scheduling in Grid Environment

Authors: N. D. Iswarya, M. A. Maluk Mohamed, N. Vijaya

Abstract:

A grid is an infrastructure that allows large distributed data sets from multiple locations to be deployed toward a common goal. Scheduling data-intensive applications is challenging because the data sets involved are very large. Only two solutions exist to tackle this issue: either the computation that requires the huge data sets is transferred to the data site, or the required data sets are transferred to the computation site. The former scenario is usually impossible, since the servers are storage/data servers with little or no computational capability; hence, the second scenario is considered for further exploration. During scheduling, transferring huge data sets from one site to another requires substantial network bandwidth. To mitigate this issue, this work incorporates cognitive science, the study of the human brain and its related activities, into scheduling. Current research mainly focuses on incorporating cognitive science into various computational modeling techniques. In this work, the problem-solving approach of the human brain is studied and applied to data-intensive scheduling in grid environments. A cognitive engine (CE) is designed and deployed at various grid sites. The intelligent agents in the CE help analyze each request and build the knowledge base. Depending on the link capacity, a decision is taken on whether to transfer the data sets or to partition them. The agents also predict the next request so that the requesting site can be served with data sets in advance, reducing data availability time and data transfer time. A replica catalog and a metadata catalog created by the agents assist in the decision-making process.
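
The transfer-versus-partition decision driven by link capacity can be caricatured in a few lines. The deadline-based rule and all numbers below are illustrative; the knowledge base and request prediction performed by the agents are not modelled.

```python
import math

def plan_request(data_size_gb, link_capacity_gbps, deadline_s):
    """Toy version of the cognitive engine's rule: ship the whole data set
    if the link can move it before the deadline, otherwise partition it
    across replica sites."""
    transfer_s = data_size_gb * 8 / link_capacity_gbps   # GB -> gigabits
    if transfer_s <= deadline_s:
        return "transfer", transfer_s
    parts = math.ceil(transfer_s / deadline_s)
    return f"partition into {parts} subsets", transfer_s

# 100 GB over a 1 Gbps link with a 10-minute budget: too slow to ship whole.
action, t = plan_request(100, 1.0, 600)
print(action, f"(full transfer would take {t:.0f} s)")
```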

Keywords: data grid, grid workflow scheduling, cognitive artificial intelligence

Procedia PDF Downloads 370
34520 Use of Machine Learning in Data Quality Assessment

Authors: Bruno Pinto Vieira, Marco Antonio Calijorne Soares, Armando Sérgio de Aguiar Filho

Abstract:

Nowadays, a massive amount of information is produced by different data sources, including mobile devices and transactional systems. In this scenario, concerns arise about how to establish and maintain data quality, which is now treated as a product to be defined, measured, analyzed, and improved to meet the needs of its consumers: the people who use these data in decision-making and company strategy. Information of low quality can lead to issues that consume time and money, such as missed business opportunities, inadequate decisions, and poor risk management. Identifying, evaluating, and selecting data sources of adequate quality has become a costly task for users, since the sources themselves provide no information about their quality. Traditional data quality control methods are based on user experience or business rules, which limits performance and slows down the process with less than desirable accuracy. Using advanced machine learning algorithms, it is possible to exploit computational resources to overcome these challenges and add value for companies and users. In this study, machine learning is applied to data quality analysis on different datasets, comparing the performance of the techniques across the dimensions of quality assessment. As a result, we produce a ranking of the approaches used, as well as a system able to carry out data quality assessment automatically.
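
As a baseline for what such a system scores, two classic quality dimensions, completeness and uniqueness, can be computed directly. A rule-based sketch like the one below could supply features or labels for the machine learning ranking described; the sample data and the equal weighting of dimensions are illustrative.

```python
def completeness(rows, fields):
    """Fraction of cells that are filled in."""
    filled = sum(1 for r in rows for f in fields if r.get(f) not in (None, ""))
    return filled / (len(rows) * len(fields))

def uniqueness(rows, key):
    """Fraction of distinct values in the key column."""
    values = [r.get(key) for r in rows]
    return len(set(values)) / len(values)

def rank_sources(sources, fields, key):
    """Score each source on two quality dimensions and rank best-first."""
    scored = {name: (completeness(rows, fields) + uniqueness(rows, key)) / 2
              for name, rows in sources.items()}
    return sorted(scored, key=scored.get, reverse=True)

crm = [{"id": 1, "email": "a@x.com"}, {"id": 2, "email": ""}]          # one gap
erp = [{"id": 1, "email": "a@x.com"}, {"id": 1, "email": "a@x.com"}]   # dup key
print(rank_sources({"crm": crm, "erp": erp}, ["id", "email"], "id"))
# ['crm', 'erp']
```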

Keywords: machine learning, data quality, quality dimension, quality assessment

Procedia PDF Downloads 120