Search results for: process data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 35195

34265 The Roles of Parental Involvement in the Teaching-Learning Process of Students with Special Needs: Perceptions of Special Needs Education Teachers

Authors: Chassel T. Paras, Tryxzy Q. Dela Cruz, Ma. Carmela Lousie V. Goingco, Pauline L. Tolentino, Carmela S. Dizon

Abstract:

In implementing inclusive education, parental involvement is considered an irreplaceable contributing factor. Parental involvement is an indispensable aspect of the teaching-learning process and has a remarkable effect on students' academic performance. However, there are still differences in the viewpoints, expectations, and needs of parents and teachers that are not yet fully conveyed in their relationship; hence, the perceptions of SNED teachers are essential to their collaboration with parents. This qualitative study explored how SNED teachers perceive the roles of parental involvement in the teaching-learning process of students with special needs. To answer this question, one-on-one, face-to-face, semi-structured interviews were conducted with three SNED teachers in a selected public school in Angeles City, Philippines, that offers special needs education services. The gathered data were then analyzed using Interpretative Phenomenological Analysis (IPA). The results revealed four superordinate themes: (1) roles of parental involvement, (2) parental involvement opportunities, (3) barriers to parental involvement, and (4) parent-teacher collaboration practices. These results indicate that SNED teachers are aware of the roles and importance of parental involvement; however, despite parent-teacher collaboration, barriers still impede parental involvement. SNED teachers also acknowledge the major role of parents, who serve as central figures in the teaching-learning process of their children with special needs. Lastly, these results can serve as input for developing a school-facilitated parental involvement framework that encompasses the contribution of SNED teachers in planning, developing, and evaluating parental involvement programs, which future researchers can also use in their studies.

Keywords: parental involvement, special needs education, teaching-learning process, teachers’ perceptions, special needs education teachers, interpretative phenomenological analysis

Procedia PDF Downloads 87
34264 Privacy Preserving Data Publishing Based on Sensitivity in Context of Big Data Using Hive

Authors: P. Srinivasa Rao, K. Venkatesh Sharma, G. Sadhya Devi, V. Nagesh

Abstract:

Privacy Preserving Data Publication is a major concern today because the volume of data published through the internet is increasing day by day; owing to its size, this data is referred to as Big Data. This project addresses privacy preservation in the context of Big Data using Hive, a data warehousing solution. We implemented Nearest Similarity Based clustering (NSB) with bottom-up generalization to achieve (v,l)-anonymity. (v,l)-anonymity addresses sensitivity vulnerabilities and ensures individual privacy. We also calculate sensitivity levels by a simple comparison method using index values, classifying records into different levels of sensitivity. Experiments were carried out in the Hive environment to verify the efficiency of the algorithms with Big Data. The framework also supports the execution of existing algorithms without any changes. The model in the paper outperforms existing models.
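
The abstract does not spell out the (v,l)-anonymity conditions. One common reading, sketched below in Python, is that every equivalence class over the quasi-identifiers must contain at least v records and at least l distinct sensitive values; all names, fields, and thresholds here are illustrative assumptions, not the authors' Hive implementation:

```python
from collections import defaultdict

def satisfies_vl_anonymity(records, quasi_ids, sensitive, v, l):
    """Check a hypothetical reading of (v,l)-anonymity: every
    equivalence class over the quasi-identifiers must hold at
    least v records and at least l distinct sensitive values."""
    classes = defaultdict(list)
    for rec in records:
        key = tuple(rec[a] for a in quasi_ids)
        classes[key].append(rec[sensitive])
    return all(len(vals) >= v and len(set(vals)) >= l
               for vals in classes.values())

# Toy records whose quasi-identifiers are already generalized
data = [
    {"age": "20-30", "zip": "560**", "disease": "flu"},
    {"age": "20-30", "zip": "560**", "disease": "diabetes"},
    {"age": "20-30", "zip": "560**", "disease": "asthma"},
]
print(satisfies_vl_anonymity(data, ["age", "zip"], "disease", v=3, l=2))
```

Bottom-up generalization would then coarsen the quasi-identifiers (for example, widening age bands) until a check of this kind passes.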

Keywords: sensitivity, sensitive level, clustering, Privacy Preserving Data Publication (PPDP), bottom-up generalization, Big Data

Procedia PDF Downloads 276
34263 Performance Analysis of Scalable Secure Multicasting in Social Networking

Authors: R. Venkatesan, A. Sabari

Abstract:

The development of social networking on the internet calls for a scalable, authenticated, and secure group communication model such as multicasting. Multicasting is an inter-network service that offers efficient delivery of data from a source to multiple destinations. Even though multicast has been very successful at providing an efficient, best-effort data delivery service for huge groups, it has proven complex to extend multicast with other features in a scalable way. Separately, the requirement for securing electronic information has become gradually more apparent. As multicast applications are deployed for mainstream purposes, the need to secure multicast communications will become significant.

Keywords: multicasting, scalability, security, social network

Procedia PDF Downloads 277
34262 Various Models of Quality Management Systems

Authors: Mehrnoosh Askarizadeh

Abstract:

People, processes, and IT are the most important assets of any organization. Optimal utilization of these resources has been a research question in business for many decades. The business world has responded by inventing various methodologies that can be used to address problems of quality improvement, process efficiency, continuous improvement, waste reduction, automation, strategy alignment, etc. Some of these methodologies can be collectively called Business Process Quality Management (BPQM) methodologies. In essence, the first references to process management can be traced back to Frederick Taylor and scientific management. Time and motion studies addressed the improvement of manufacturing process efficiency. The ideas of scientific management were in use for quite a long period, until more advanced quality management techniques were developed in Japan and the USA. One of the first prominent methods was Total Quality Management (TQM), which evolved during the 1980s. At about the same time, Six Sigma (SS) originated at Motorola as a separate method; SS spread, evolved, and later joined with the ideas of Lean manufacturing to form Lean Six Sigma. In the 1990s, due to emerging IT technologies, the beginning of globalization, and strengthening competition, companies recognized the need for better process and quality management. Business Process Management (BPM) emerged as a novel methodology that takes all of this into account and helps to align IT technologies with business processes and quality management. In this article, we study various aspects of the above-mentioned methods and identify their relations.

Keywords: e-process, quality, TQM, BPM, lean, six sigma, CPI, information technology, management

Procedia PDF Downloads 417
34261 Electrical Decomposition of Time Series of Power Consumption

Authors: Noura Al Akkari, Aurélie Foucquier, Sylvain Lespinats

Abstract:

Load monitoring is a management process for energy consumption aimed at energy savings and energy efficiency. Non-Intrusive Load Monitoring (NILM) is one load monitoring method used for disaggregation purposes. NILM is a technique for identifying individual appliances based on analysis of the whole-residence data retrieved from the main power meter of the house. Our NILM framework starts with data acquisition, followed by data preprocessing, then event detection and feature extraction, and finally general appliance modeling and identification. The event detection stage is a core component of the NILM process, since event detection techniques lead to the extraction of appliance features, which are required for accurate identification of household devices. In this research work, we aim to develop a new event detection methodology with accurate load disaggregation to extract appliance features. The extracted time-domain features are used for tuning general appliance models in the appliance identification and classification steps. We use unsupervised algorithms such as Dynamic Time Warping (DTW). The proposed method relies on detecting the areas of operation of each residential appliance based on the power demand, then detecting the times at which each selected appliance changes state. In order to fit the capabilities of existing smart meters in practice, we work on low-frequency data sampled at 1/60 Hz (one reading per minute). The data is simulated with the Load Profile Generator (LPG) software, which had not previously been considered for NILM purposes in the literature. LPG is a numerical tool that simulates the behaviour of people inside the house to generate residential energy consumption data. The proposed event detection method targets low-consumption loads that are difficult to detect, and it facilitates the extraction of specific features used for general appliance modeling. To the best of our knowledge, few unsupervised techniques have been employed with low-frequency data, in contrast to the many supervised techniques used for such cases. We extract the power interval within which the selected appliance operates, along with a time vector of the values delimiting the appliance's state transitions. Appliance signatures are then formed from the extracted power, geometrical, and statistical features, and these signatures are used to tune general model types for appliance identification using unsupervised algorithms. The method is evaluated using both simulated data from LPG and the real Reference Energy Disaggregation Dataset (REDD). Performance is measured with confusion-matrix-based metrics: accuracy, precision, recall, and error rate. The performance of our methodology is then compared with other detection techniques previously used in the literature, such as techniques based on statistical variation and abrupt change (variance sliding window and cumulative sum).
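
Since DTW is the matching tool named above, a minimal pure-NumPy sketch of the classic dynamic-programming recurrence may be useful; the power segments are invented toy values at the 1/60 Hz rate, not data from LPG or REDD:

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic-programming DTW between two 1-D sequences,
    e.g. a measured power segment and an appliance template."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j],      # insertion
                                 D[i, j - 1],      # deletion
                                 D[i - 1, j - 1])  # match
    return D[n, m]

# Hypothetical power readings in watts, one sample per minute
segment  = np.array([0, 0, 120, 125, 118, 0, 0], dtype=float)
template = np.array([0, 122, 121, 0], dtype=float)
print(dtw_distance(segment, template))
```

A small DTW distance against an appliance's template then votes for that appliance, which is why no labeled training data is required.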

Keywords: electrical disaggregation, DTW, general appliance modeling, event detection

Procedia PDF Downloads 60
34260 Investigation of Heat Transfer Mechanism Inside Shell and Tube Latent Heat Thermal Energy Storage Systems

Authors: Saeid Seddegh, Xiaolin Wang, Alan D. Henderson, Dong Chen, Oliver Oims

Abstract:

The main objective of this research is to study the heat transfer processes and phase change behaviour of a phase change material (PCM) in shell-and-tube latent heat thermal energy storage (LHTES) systems. This paper compares the thermal behaviour of vertical and horizontal shell-and-tube heat energy storage systems using a pure thermal conduction model and a combined conduction-convection heat transfer model. The model is first validated against published experimental data from the literature and then used to study the temperature variation, solid-liquid interface, phase distribution, and total melting and solidification times during the melting and solidification of PCMs. The simulated results show that the combined convection-conduction model better describes the energy transfer in PCMs during the melting process. In contrast, conduction dominates during the solidification process, since the two models show little difference there. It was also concluded that, during the charging process in the horizontal orientation, convective heat transfer strongly affects melting of the upper part of the solid PCM and is less significant during melting of the lower half. In the vertical orientation, however, convective heat transfer is equally active throughout the charging process. In the solidification process, the thermal behaviour shows no difference between horizontal and vertical systems.

Keywords: latent heat thermal energy storage, phase change material, natural convection, melting, solidification, shell and tube heat exchanger

Procedia PDF Downloads 538
34259 A Fuzzy Kernel K-Medoids Algorithm for Clustering Uncertain Data Objects

Authors: Behnam Tavakkol

Abstract:

Uncertain data mining algorithms consider uncertainty in data in different ways, such as by representing a data object as a sample of points or as a probability distribution. Fuzzy methods have long been used for clustering traditional (certain) data objects; they produce non-crisp cluster labels. For uncertain data, however, besides some uncertain fuzzy k-medoids algorithms, not many other fuzzy clustering methods have been developed. In this work, we develop a fuzzy kernel k-medoids algorithm for clustering uncertain data objects. The developed algorithm is superior to existing fuzzy k-medoids algorithms in clustering data sets with non-linearly separable clusters.
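
The abstract does not give the update rules, so the following is only a sketch of the generic ingredients the algorithm's name implies: FCM-style fuzzy memberships plus medoid updates, with all distances computed in an RBF feature space via the kernel trick. It operates on plain (certain) points; the handling of uncertain objects is the paper's contribution and is not reproduced here.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def fuzzy_kernel_kmedoids(X, c=2, fuzz=2.0, gamma=1.0, iters=100, seed=0):
    # Squared feature-space distances from the kernel matrix:
    # ||phi(x) - phi(y)||^2 = K(x,x) - 2 K(x,y) + K(y,y)
    K = rbf_kernel(X, gamma)
    diag = K.diagonal()
    D2 = np.maximum(diag[:, None] - 2 * K + diag[None, :], 1e-12)
    rng = np.random.default_rng(seed)
    medoids = rng.choice(len(X), size=c, replace=False)
    for _ in range(iters):
        d = D2[:, medoids]                    # point-to-medoid distances
        u = 1.0 / d ** (1.0 / (fuzz - 1))     # FCM-style memberships
        u /= u.sum(axis=1, keepdims=True)
        # each medoid moves to the point minimizing its weighted cost
        new = np.array([np.argmin((u[:, j] ** fuzz) @ D2) for j in range(c)])
        if np.array_equal(new, medoids):
            break
        medoids = new
    return medoids, u

X = np.vstack([np.random.randn(20, 2), np.random.randn(20, 2) + 4.0])
meds, U = fuzzy_kernel_kmedoids(X, c=2, gamma=0.5)
print("medoid indices:", meds, "| membership matrix:", U.shape)
```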

Keywords: clustering algorithm, fuzzy methods, kernel k-medoids, uncertain data

Procedia PDF Downloads 197
34258 Beyond Personal Evidence: Using Learning Analytics and Student Feedback to Improve Learning Experiences

Authors: Shawndra Bowers, Allie Brandriet, Betsy Gilbertson

Abstract:

This paper highlights how Auburn Online's instructional designers leveraged student and faculty data to update and improve online course design and instructional materials. When designing and revising online courses, it can be difficult for faculty to know which strategies are most likely to engage learners and improve educational outcomes in a specific discipline. It can also be difficult to identify which metrics are most useful for understanding and improving teaching, learning, and course design. At Auburn Online, the instructional designers use a suite of data on students' performance, participation, satisfaction, and engagement, as well as faculty perceptions, to inform sound learning and design principles that guide growth-mindset consultations with faculty. The consultations allow the instructional designer, along with the faculty member, to co-create an actionable course improvement plan. Auburn Online gathers learning analytics from a variety of sources that any instructor or instructional design team may have access to at their own institution. Participation and performance data, such as page views, assignment submissions, and aggregate grade distributions, are collected from the learning management system. Engagement data is pulled from the video hosting platform and includes unique viewers, views and downloads, minutes delivered, and the average duration each video is viewed. Student satisfaction is obtained through a short survey embedded at the end of each instructional module; this survey is included in each course every time it is taught. The survey data is then analyzed by an instructional designer for trends and pain points in order to identify areas that can be modified, such as course content and instructional strategies, to better support student learning. This analysis, along with the instructional designer's recommendations, is presented in a comprehensive report to instructors in an hour-long consultation where instructional designers collaborate with the faculty member on how and when to implement improvements. Auburn Online has developed a triage strategy of priority 1 and 2 level changes to be implemented in future course iterations. This data-informed decision-making process helps instructors focus on what will work best in their teaching environment while addressing the areas that need additional attention. As a student-centered process, it has created improved learning environments for students and has been well received by faculty. It has also proven effective in addressing the need for improvement while removing the feeling that the faculty member's teaching is being personally attacked. The process that Auburn Online uses is laid out, along with the three-tier maintenance and revision guide that will be used over a three-year implementation plan. This information can help others determine which components of the maintenance and revision plan they want to utilize, and guide them in creating a similar approach. The data will be used to analyze, revise, and improve courses by providing recommendations and models of good practice, determining and disseminating best practices that demonstrate an impact on student success.

Keywords: data-driven, improvement, online courses, faculty development, analytics, course design

Procedia PDF Downloads 40
34257 Progressing Institutional Quality Assurance and Accreditation of Higher Education Programmes

Authors: Dominique Parrish

Abstract:

Globally, higher education institutions are responsible for the quality assurance and accreditation of their educational programmes (Courses). The primary purpose of these activities is to ensure that the educational standards of the governing higher education authority are met and the quality of the education provided to students is assured. Despite policies and frameworks being established in many countries, to improve the veracity and accountability of quality assurance and accreditation processes, there are reportedly still mistakes, gaps and deficiencies in these processes. An analysis of Australian universities’ quality assurance and accreditation processes noted that significant improvements were needed in managing these processes and ensuring that review recommendations were implemented. It has also been suggested that the following principles are critical for higher education quality assurance and accreditation to be effective and sustainable: academic standards and performance outcomes must be defined, attainable and monitored; those involved in providing the higher education must assume responsibility for the associated quality assurance and accreditation; potential academic risks must be identified and management solutions developed; and the expectations of the public, governments and students should be considered and incorporated into Course enhancements. This phenomenological study, which was conducted in a Faculty of Science, Medicine and Health in an Australian university, sought to systematically and iteratively develop an effective quality assurance and accreditation process that integrated the evidence-based principles of success and promoted meaningful and sustainable change. Qualitative evaluative feedback was gathered, over a period of eleven months (January - November 2014), from faculty staff engaged in the quality assurance and accreditation of forty-eight undergraduate and postgraduate Courses. Reflexive analysis was used to analyse the data and inform ongoing modifications and developments to the assurance and accreditation process as well as the associated supporting resources. The study resulted in the development of a formal quality assurance and accreditation process together with a suite of targeted resources that were identified as critical for success. The research findings also provided some insights into the institutional enablers that were antecedents to successful quality assurance and accreditation processes as well as meaningful change in the educational practices of academics. While longitudinal data will be collected to further assess the value of the assurance and accreditation process on educational quality, early indicators are that there has been a change in the pedagogical perspectives and activities of academic staff and growing momentum to explore opportunities to further enhance and develop Courses. This presentation will explain the formal quality assurance and accreditation process as well as the component parts, which resulted from this study. The targeted resources that were developed will be described, the pertinent factors that contributed to the success of the process will be discussed and early indicators of sustainable academic change as well as suggestions for future research will be outlined.

Keywords: academic standards, quality assurance and accreditation, phenomenological study, process, resources

Procedia PDF Downloads 358
34256 Democracy Bytes: Interrogating the Exploitation of Data Democracy by Radical Terrorist Organizations

Authors: Nirmala Gopal, Sheetal Bhoola, Audecious Mugwagwa

Abstract:

This paper discusses the continued infringement and exploitation of data by non-state actors for destructive purposes, with an emphasis on radical terrorist organizations. It discusses how terrorist organizations access and use data to further their nefarious agendas. It further examines how cybersecurity, designed as a tool to curb data exploitation, has been ineffective in addressing global citizens' concerns about how their data can be kept safe and used only for its acquired purpose. The study interrogates several policies and data protection instruments, such as the Data Protection Act, cybersecurity policies, the Protection of Personal Information (PPI), and the General Data Protection Regulation (GDPR), to understand data use and storage in democratic states. The study outcomes point to the fact that international cybersecurity and cybercrime legislation, policies, and conventions have not curbed violations of data access and use by radical terrorist groups. The study recommends ways to enhance cybersecurity and reduce cyber risks using democratic principles.

Keywords: cybersecurity, data exploitation, terrorist organizations, data democracy

Procedia PDF Downloads 181
34255 Summarizing Data Sets for Data Mining by Using Statistical Methods in Coastal Engineering

Authors: Yunus Doğan, Ahmet Durap

Abstract:

Coastal regions are among the areas most heavily used by both the natural balance and a growing population. In coastal engineering, the most valuable data concern wave behavior, and the amount of this data becomes very large because observations run over periods of hours, days, and months. In this study, statistical methods such as wave spectrum analysis and standard descriptive statistics are used. The goal is to discover profiles of different coastal areas using these statistical methods and thus to obtain an instance-based data set, distilled from the big data, for analysis with data mining algorithms. In the experimental studies, six sample data sets on wave behavior, obtained from 20-minute observations in Mersin Bay, Turkey, were converted to an instance-based form, and different clustering techniques from data mining were used to discover similar coastal places. Moreover, the study discusses how this summarization approach can be used in other fields that collect big data, such as medicine.
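
As one concrete example of the standard statistical methods applied to a wave record, the sketch below computes the significant wave height H1/3 by zero-up-crossing analysis of a 20-minute record; the record is synthetic, not the Mersin Bay data:

```python
import numpy as np

def significant_wave_height(eta):
    """Zero-up-crossing analysis of a surface-elevation record:
    split the signal into individual waves and return H_1/3, the
    mean of the highest one-third of wave heights."""
    idx = np.where((eta[:-1] < 0) & (eta[1:] >= 0))[0]  # up-crossings
    heights = [eta[a:b].max() - eta[a:b].min()
               for a, b in zip(idx[:-1], idx[1:])]
    heights = np.sort(heights)[::-1]
    n3 = max(1, len(heights) // 3)
    return heights[:n3].mean()

# Synthetic 20-minute record sampled at 2 Hz (values are invented)
t = np.arange(0, 1200, 0.5)
eta = np.sin(2 * np.pi * t / 8) + 0.3 * np.sin(2 * np.pi * t / 5.1)
print(significant_wave_height(eta))
```

A handful of summary values like H1/3, mean period, and spectral moments is exactly the kind of instance-based row that the clustering step can then operate on.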

Keywords: clustering algorithms, coastal engineering, data mining, data summarization, statistical methods

Procedia PDF Downloads 344
34254 The Optimization of TICSI in the Convergence Mechanism of Urban Water Management

Authors: M. Macchiaroli, L. Dolores, V. Pellecchia

Abstract:

With the recent Resolution n. 580/2019/R/idr, the Italian Regulatory Authority for Energy, Networks, and Environment (ARERA) introduced a new mechanism for determining urban water tariffs, the regulatory scheme of Convergence, aimed at water operators characterized by persistent critical issues in the planning and organization of the service and in the implementation of the interventions necessary to improve infrastructure and management quality. The aim of this regulatory scheme is to overcome the Water Service Divide in order to improve the stability of local institutional structures, technical quality, and contractual quality, as well as to guarantee transparency for users of the service. The Convergence scheme presupposes identifying the cost items to be considered in the tariff in parametric terms, distinguishing three possible cases according to the type of historical data available to the Manager. The study focuses, in particular, on operators that have neither data on tariff revenues nor data on operating costs. In this case, the Manager's Constraint on Revenues (VRG) is estimated on the basis of a reference benchmark and becomes the starting point for defining the structure of the tariff classes, in compliance with the TICSI provisions (Integrated Text for tariff classes, ARERA Resolution n. 665/2017/R/idr). The proposed model builds on recent studies of optimization models for the definition of tariff classes in compliance with the constraints dictated by TICSI under the Convergence mechanism, offering a support tool for Managers and the local water regulatory Authority in the decision-making process.
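
The abstract does not state the optimization model itself. Purely as an illustration of the kind of constrained problem involved, the sketch below chooses tariff-class prices as close as possible to a benchmark while exactly recovering a given VRG; all volumes, prices, and numbers are hypothetical:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical data: billed volumes per tariff class, benchmark
# tariffs, and the revenue constraint (VRG) to be recovered.
volumes = np.array([1.2e6, 8.0e5, 3.5e5])   # m^3 per class
t_ref   = np.array([1.10, 1.60, 2.40])      # EUR/m^3 benchmark
VRG     = 3.1e6                             # EUR

# Stay as close as possible to the benchmark tariffs while
# exactly meeting the revenue constraint (equality constraint).
res = minimize(
    lambda t: np.sum((t - t_ref) ** 2),
    x0=t_ref,
    constraints=[{"type": "eq",
                  "fun": lambda t: t @ volumes - VRG}],
    bounds=[(0, None)] * 3,
)
print(res.x)   # adjusted tariff per class, EUR/m^3
```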

Keywords: decision-making process, economic evaluation of projects, optimizing tools, urban water management, water tariff

Procedia PDF Downloads 101
34253 A Study on Optimum Shape According to Equivalent Stress Distributions at the Die and Plug in the Multi-Pass Drawing Process

Authors: Yeon-Jong Jeong, Mok-Tan Ahn, Seok-Hyeon Park, Seong-Hun Ha, Joon-Hong Park, Jong-Bae Park

Abstract:

The multi-pass drawing process is an important technique for forming shapes that cannot be produced in a single operation. The number of passes and the shape of the die are important factors influencing the productivity and formability of the product. The half angle of the die and of the mandrel affects the drawing force and also affects the completion of the final shape. Thus, reducing the number of passes and optimizing the die shape are necessary to improve the formability of the billet. By analyzing the load on the die through FEM analysis and taking the formability of the material into consideration, this study presents a die model.

Keywords: multi-pass shape drawing, equivalent stress, finite element method (FEM), optimum shape

Procedia PDF Downloads 460
34252 A Multi-Role Oriented Collaboration Platform for Distributed Disaster Reduction in China

Authors: Linyao Qiu, Zhiqiang Du

Abstract:

With the rapid development of urbanization, economic growth, and steady population increase in China, the widespread devastation, economic damage, and loss of human life caused by numerous forms of natural disasters are becoming increasingly serious every year. Disaster management requires the available and effective cooperation of different roles and organizations throughout the whole process, including mitigation, preparedness, response, and recovery. Due to the imbalance of regional development in China, the disaster management capabilities of the national and provincial disaster reduction centers are uneven. When an undeveloped area suffers a disaster, the local reduction department can neither independently obtain first-hand information, such as high-resolution remote sensing images from satellites and aircraft, nor is a sharing mechanism provided for the department to directly access data resources deployed elsewhere. Most existing disaster management systems operate in a typical passive, data-centric mode and serve a single department, so resources cannot be fully shared. This impediment keeps local departments and groups from rapid emergency response and decision-making. In this paper, we introduce a collaborative platform for distributed disaster reduction. To address the imbalance in shared data sources and technology in the disaster reduction process, we propose a multi-role-oriented collaboration business mechanism, capable of scheduling and allocating multiple resources for optimal utilization, to link various roles in different places for collaborative reduction work. The platform fully considers the differing equipment conditions across provinces and provides several service modes to satisfy the technology needs of disaster reduction. An integrated collaboration system based on a focusing-service mechanism is designed and implemented for resource scheduling, functional integration, data processing, task management, collaborative mapping, and visualization. Actual applications illustrate that the platform supports data sharing and business collaboration between national and provincial departments well and can significantly improve disaster reduction capability in China.

Keywords: business collaboration, data sharing, distributed disaster reduction, focusing service

Procedia PDF Downloads 279
34251 Performance Comparison of Different Regression Methods for a Polymerization Process with Adaptive Sampling

Authors: Florin Leon, Silvia Curteanu

Abstract:

Developing complete mechanistic models for polymerization reactors is not easy, because complex reactions occur simultaneously, a large number of kinetic parameters are involved, and the chemical and physical phenomena of mixtures involving polymers are sometimes poorly understood. To overcome these difficulties, empirical models based on sampled data can be used instead, namely the regression methods typical of the machine learning field. They have the ability to learn the trends of a process without any knowledge of its particular physical and chemical laws, and are therefore useful for modeling complex processes such as the free radical polymerization of methyl methacrylate carried out in a batch bulk process. The goal is to generate accurate predictions of monomer conversion, number-average molecular weight, and weight-average molecular weight. This process exhibits non-linear gel and glass effects. For this purpose, an adaptive sampling technique is presented, which selects more samples around the regions where the values vary the most. Several machine learning methods are used for the modeling and their performance is compared: support vector machines, k-nearest neighbor, and random forest, as well as an original algorithm, large margin nearest neighbor regression. The suggested method provides very good results compared to the other well-known regression algorithms.
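
The paper's exact sampling criterion is not given in the abstract. The sketch below illustrates the general idea with a simple bisect-the-steepest-segment rule, using a logistic toy curve as a stand-in for a conversion profile with a sharp gel-effect rise:

```python
import numpy as np

def adaptive_sample(f, x_lo, x_hi, n_init=8, n_total=40):
    """Sketch of variation-driven adaptive sampling: start from a
    coarse uniform grid, then repeatedly bisect the interval whose
    endpoint values differ the most (e.g. around a gel-effect
    region of a conversion curve)."""
    xs = list(np.linspace(x_lo, x_hi, n_init))
    ys = [f(x) for x in xs]
    while len(xs) < n_total:
        jumps = np.abs(np.diff(ys))
        k = int(np.argmax(jumps))             # steepest segment
        x_new = 0.5 * (xs[k] + xs[k + 1])     # bisect it
        xs.insert(k + 1, x_new)
        ys.insert(k + 1, f(x_new))
    return np.array(xs), np.array(ys)

# Toy stand-in for monomer conversion vs. time with a sharp rise
x, y = adaptive_sample(lambda t: 1 / (1 + np.exp(-20 * (t - 0.6))), 0, 1)
print(len(x), x.min(), x.max())
```

Most of the extra samples end up clustered around t = 0.6, where the curve is steepest, which is precisely where a regression model needs the densest training data.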

Keywords: batch bulk methyl methacrylate polymerization, adaptive sampling, machine learning, large margin nearest neighbor regression

Procedia PDF Downloads 290
34250 Detergent Removal from Rinsing Water by Peroxi Electrocoagulation Process

Authors: A. Benhadji, M. Taleb Ahmed

Abstract:

Among the various methods of treatment, advanced oxidation processes (AOP) are the most promising. In this study, the Peroxi Electrocoagulation Process (PEP) was investigated for the treatment of detergent wastewater and compared with electrooxidation treatment. The results showed that the chemical oxygen demand (COD) was high (7584 mg O2/L) while the biochemical oxygen demand was low (250 mg O2/L); this wastewater was therefore hardly biodegradable. The electrochemical process for detergent removal was carried out in a 1 L glass reactor fitted with three electrodes, using a direct current (DC) supply. Samples were taken at various current densities (0.0227 A/cm² to 0.0378 A/cm²) and reaction times (1, 2, 3, 4, and 5 hours), and the COD was determined. The results indicated that the COD removal efficiency of PEP increased with current intensity, with the highest removal efficiency, 77%, reached after 5 h of treatment.

Keywords: AOP, COD, detergent, PEP, wastewater

Procedia PDF Downloads 104
34249 Methodology for the Multi-Objective Analysis of Data Sets in Freight Delivery

Authors: Dale Dzemydiene, Aurelija Burinskiene, Arunas Miliauskas, Kristina Ciziuniene

Abstract:

Data flows and the purposes of reporting data differ and depend on business needs. Different parameters are reported and transferred regularly during freight delivery. These business practices form the data set constructed for each time point, containing all the information required for freight-moving decisions. As a significant amount of these data is used for various purposes, an integrated methodological approach must be developed in response. The proposed methodology contains several steps: (1) collecting context data sets and validating the data; (2) multi-objective analysis for optimizing freight transfer services. For data validation, the study applies Grubbs' outlier analysis, particularly for data cleaning and for identifying the statistical significance of data-reporting event cases. The Grubbs test is often used because it tests one extreme value at a time against the boundaries of the standard normal distribution. In the study area, the test has not been widely applied, except where the Grubbs test was used to detect outliers in fuel consumption data. In this study, the authors apply the method with a confidence level of 99%, as sketched below. For the multi-objective analysis, the authors select those constructions of genetic algorithms that offer the best chances of extracting the best solution; genetic algorithm schemes are an effective technique for freight delivery management, and an adaptable genetic algorithm is therefore applied to describe the process of choosing an effective transportation corridor. In this study, multi-objective genetic algorithm methods are used to optimize the data evaluation and select the appropriate transport corridor. The authors suggest a methodology for multi-objective analysis that evaluates collected context data sets and uses this evaluation to determine a delivery corridor for freight transfer services in the multi-modal transportation network. The multi-objective analysis includes safety components, the number of accidents per year, and freight delivery time in the multi-modal transportation network. The proposed methodology has practical value in the management of multi-modal transportation processes.
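
The Grubbs step at the 99% confidence level can be written in a few lines. The sketch below is a generic two-sided Grubbs test applied to invented fuel-consumption readings; SciPy's t-distribution supplies the critical value:

```python
import numpy as np
from scipy import stats

def grubbs_outlier(x, alpha=0.01):
    """Two-sided Grubbs test: flag the single most extreme value if
    G = max|x - mean| / s exceeds the critical value at the given
    significance level (alpha = 0.01 for 99% confidence)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    i = np.argmax(np.abs(x - x.mean()))
    G = abs(x[i] - x.mean()) / x.std(ddof=1)
    t2 = stats.t.ppf(1 - alpha / (2 * n), n - 2) ** 2
    G_crit = (n - 1) / np.sqrt(n) * np.sqrt(t2 / (n - 2 + t2))
    return (i, x[i]) if G > G_crit else None

# Invented fuel-consumption readings (L/100 km), one suspicious
fuel = [31.2, 30.8, 32.1, 31.5, 30.9, 48.7, 31.7, 31.0]
print(grubbs_outlier(fuel))   # expect the 48.7 reading to be flagged
```

In a cleaning loop, the flagged value is removed and the test repeated until no further outlier is detected.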

Keywords: multi-objective, analysis, data flow, freight delivery, methodology

Procedia PDF Downloads 165
34248 Analysis on the Need of Engineering Drawing and Feasibility Study on 3D Model Based Engineering Implementation

Authors: Parthasarathy J., Ramshankar C. S.

Abstract:

Engineering drawings play an important role in every part of industry and are influential in every phase of the product development process. Traditionally, drawings have been used for communication in industry because they are the clearest way to represent product manufacturing information. Until recently, manufacturing activities were driven by engineering data captured in 2D paper documents or digital representations of those documents. Engineering drawings remain indispensable, yet they have a disadvantage: the document-based approach is prone to errors and requires costly re-entry of data at every stage of the manufacturing life cycle. There is therefore a need to eliminate engineering drawings throughout the product development process and to implement 3D Model Based Engineering (3D MBE or 3D MBD). Adopting MBD appears to be the next logical step in continuing to reduce time-to-market and improve product quality. Ideally, by fully applying the MBD concept, the product definition will no longer rely on engineering drawings throughout the product lifecycle. This project examines the need for engineering drawings and their influence in various parts of an industry, along with the need to implement 3D Model Based Engineering, its advantages, and the technical barriers that must be overcome in order to implement it. The project also addresses the requirements for neutral formats and their realisation, so that digital product definition principles can be implemented in a lightweight format. To prove the concepts of 3D Model Based Engineering, a screw jack body part is also demonstrated. At ZF Windpower Coimbatore Limited, 3D Model Based Definition has been applied to the Torque Arm (machining and casting), steel tube, pinion shaft, cover, and energy tube.

Keywords: engineering drawing, model based engineering MBE, MBD, CAD

Procedia PDF Downloads 412
34247 Multi-Pass Shape Drawing Process Design for Manufacturing of Automotive Reinforcing Agent with Closed Cross-Section Shape using Finite Element Method Analysis

Authors: Mok-Tan Ahn, Hyeok Choi, Joon-Hong Park

Abstract:

The multi-stage drawing process is an important technique for forming shapes that cannot be produced in a single operation. The number of passes and the shape of the die are important factors influencing the productivity and formability of the product. The half angle of the die and of the mandrel affects the drawing force and also affects the completion of the final shape. Thus, reducing the number of passes and optimizing the die shape are necessary to improve the formability of the billet. The purpose of this study is to analyze the load on the die through FEM analysis and, taking the formability of the material into consideration, to present a die model.

Keywords: automotive reinforcing agent, multi-pass shape drawing, automotive parts, FEM analysis

Procedia PDF Downloads 441
34246 Access to Health Data in Medical Records in Indonesia in Terms of Personal Data Protection Principles: The Limitation and Its Implication

Authors: Anny Retnowati, Elisabeth Sundari

Abstract:

This research aims to elaborate on the meaning of personal data protection principles for patient access to health data in medical records in Indonesia, and on its implications. The method is normative legal research, examining Indonesian health law regarding the patient's right to access their health data in medical records. The data are analysed qualitatively using the interpretation method to elaborate on how personal data protection principles are limited with respect to patients' access to their data in medical records. The results show that patients only have the right to obtain copies of their health data in medical records; there is no right to inspect the records directly at any time. Indonesian health law thus limits the principle of patients' broad access to their health data in medical records. This restriction implies a reduction of personal data protection as part of human rights. The research contributes by showing that a limitation of personal data protection may violate human rights.

Keywords: access, health data, medical records, personal data, protection

Procedia PDF Downloads 66
34245 Conceptualizing the Knowledge to Manage and Utilize Data Assets in the Context of Digitization: Case Studies of Multinational Industrial Enterprises

Authors: Martin Böhmer, Agatha Dabrowski, Boris Otto

Abstract:

The trend of digitization significantly changes the role of data for enterprises. Data turn from an enabler into an intangible organizational asset that requires management and qualifies as a tradeable good. The idea of a networked economy has gained momentum in the data domain as collaborative approaches to data management emerge. Traditional organizational knowledge consequently needs to be extended by comprehensive knowledge about data. Knowledge about data is vital for organizations to ensure that data quality requirements are met and that data can be effectively utilized and sovereignly governed. As this specific knowledge has so far received little attention from academics, the aim of the research presented in this paper is to conceptualize it by proposing a 'data knowledge model'. Relevant model entities have been identified using a design science research (DSR) approach that iteratively integrates insights from several industry case studies and from the literature.

Keywords: data management, digitization, industry 4.0, knowledge engineering, metamodel

Procedia PDF Downloads 337
34244 Estimation of Morbidity Level of Industrial Labour Conditions at Zestafoni Ferroalloy Plant

Authors: M. Turmanauli, T. Todua, O. Gvaberidze, R. Javakhadze, N. Chkhaidze, N. Khatiashvili

Abstract:

Background: The mining process has a significant influence on human health and quality of life. In recent years, events in Georgia have affected industrial working processes; in particular, minimal labor safety requirements, workplace hygiene standards, and work-rest regimes are often not observed. This situation is frequently caused by a lack of responsibility, awareness, and knowledge on the part of both workers and employers. Control and protection of working conditions have worsened in many industries. Materials and Methods: To evaluate the current situation, a prospective epidemiological study using face-to-face interviews was conducted at the Georgian "Manganese Zestafoni Ferroalloy Plant" in 2011-2013. 65.7% of employees (1428 bulletins) were surveyed, and the incidence rates of temporary disability days were studied. Results: The average length of a single episode of temporary disability was studied, both by sex group and for the whole cohort. According to the classes of harmfulness, the following results were obtained: class 2.0: 10.3%; 3.1: 12.4%; 3.2: 35.1%; 3.3: 12.1%; 3.4: 17.6%; 4.0: 12.5%. Among the employees, 47.5% and 83.1% were tobacco and alcohol consumers, respectively. By age group and years of work, prevalence was highest among those aged 50 and over and those with 21 or more years of work, respectively; the data revealed morbidity rates increasing with age and years of work. Diseases of the bone and articular system and connective tissue, aggravation of chronic respiratory diseases, ischemic heart disease, hypertension, and cerebral blood circulation disorders were found to be the leading conditions. High morbidity prevalence was observed in workplaces with unsatisfactory labor conditions from a hygienic point of view. Conclusion: According to the data received, the causes of morbidity are as follows: unsafe labor conditions; incomplete preventive medical examinations (preliminary and periodic); lack of access to appropriate health care services; and deficient gathering, recording, and analysis of morbidity data. This epidemiological study was conducted at the JSC "Manganese Ferro Alloy Plant" under the State program "Prevention of Occupational Diseases" (program code 35 03 02 05).

Keywords: occupational health, mining process, morbidity level, cerebral blood discirculation

Procedia PDF Downloads 413
34243 Process Modeling and Problem Solving: Connecting Two Worlds by BPMN

Authors: Gionata Carmignani, Mario G. C. A. Cimino, Franco Failli

Abstract:

Business Processes (BPs) are the key instrument for understanding how companies operate at an organizational level: taking an as-is view of the workflow and addressing its issues by identifying a to-be model. In recent years, the Business Process Model and Notation (BPMN) has become a de facto standard for modeling processes. However, this standard does not explicitly incorporate Problem-Solving (PS) knowledge in the Process Modeling (PM) results, so such knowledge cannot be shared or reused. Narrowing this gap is today a challenging research area. In this paper, we present a framework able to capture PS knowledge and to improve a workflow. The framework extends the BPMN specification by incorporating new general-purpose elements. A pilot scenario is also presented and discussed.

Keywords: business process management, BPMN, problem solving, process mapping

Procedia PDF Downloads 394
34242 Interpretation of Heritage Revitalization

Authors: Jarot Mahendra

Abstract:

The primary objective of this paper is to provide a view on the interpretation of the revitalization of heritage buildings. This is achieved by analyzing the concept of interpretation from the perspectives of law, urban spatial planning, and stakeholders, and then developing a theoretical framework of interpretation in cultural resources management through the issues of identity, heritage as a process, and authenticity in heritage. In revitalizing heritage buildings, interpretation through these three issues can serve as a communication process to express the meaning and relevance of heritage to the community, so as to avoid the conflicts that arise and develop from the differing perspectives of stakeholders. Using a case study in Indonesia, this study focuses on the revitalization of the heritage site of the National Gallery of Indonesia (GNI). GNI is a cultural institution that, in carrying out its function as the center of Indonesian art development and as an art museum, uses several historical buildings, some designated as heritage and some not yet designated under the regulations applicable in Indonesia. The revitalization of the heritage buildings is a step towards meeting the space needs of GNI's current functions. The revitalization master plan includes physical interventions on heritage buildings and the removal of some historic buildings, with new buildings to be constructed in their place. A research matrix was used to map the main elements of the study (the concept of GNI revitalization, heritage as identity, heritage as a process, and authenticity in heritage). Expert interviews and document studies were the main tools used in collecting data. Qualitative data were then analyzed through content analysis and template analysis. The study identifies the significance of the historic buildings (both those designated as heritage and those not) in terms of their historical, architectural, educational, and cultural value. This significance becomes the basis for revisiting the revitalization master plan, which is then reviewed against the applicable regulations and the spatial layout of Jakarta. The interpretation that emerges is threefold: (1) GNI is one element of the embodiment of the National Cultural Center in the context of the region, alongside the National Monument, the National Museum, and the National Library in the same area, so the heritage gives identity not only to past culture but also to the culture of the current community; (2) heritage should be seen as a dynamic cultural process within the community's cultural change, developing along with urban development, so that heritage buildings can remain alive side by side with modern buildings while still observing the principles of heritage preservation; (3) the authenticity of heritage should balance the cultural heritage conservation approach with urban development; authenticity can serve as a 'value transmitter' and can thus be used to evaluate, preserve, and manage heritage buildings, considering both tangible and intangible aspects.

Keywords: authenticity, culture process, identity, interpretation, revitalization

Procedia PDF Downloads 130
34241 An Application of a Feedback Control System to Minimize Unforeseen Disruption in a Paper Manufacturing Industry in South Africa

Authors: Martha E. Ndeley

Abstract:

Operations management is a key element of the manufacturing process. During this process, however, a number of unforeseen disruptions can bring the process to a standstill: machine breakdown, employee absenteeism, and improper scheduling. When this happens, the shop floor is forced into rescheduling, and typical strategies reschedule only a limited part of the initial schedule to match the pre-schedule at some point, with the objective of creating a reliable new schedule, which in the long run gets disrupted again. In this work, we developed a feedback control system that minimizes any form of disruption before its impact becomes severe. The model was tested in a paper manufacturing plant, and the results revealed that if a disruption is minimized in its initial state, its impact becomes unnoticeable.

Keywords: disruption, machine, absenteeism, scheduling

Procedia PDF Downloads 290
34240 System Identification and Quantitative Feedback Theory Design of a Lathe Spindle

Authors: M. Khairudin

Abstract:

This paper investigates system identification and quantitative feedback theory (QFT) design for the robust control of a lathe spindle. The dynamics of the lathe spindle are uncertain and time-varying due to variation in the depth of cut during the cutting process. System identification was used to obtain a dynamic model of the lathe spindle. In this work, real-time system identification is used to construct linear models of the nonlinear system; these linear models and their uncertainty bound can then be used for controller synthesis. The real-time nonlinear system identification process yields a set of linear models of the lathe spindle that represent the operating ranges of the dynamic system. With a selected input signal, output response data are acquired and system identification is performed using Matlab to obtain a linear model of the system. Practical design steps are presented in which QFT-based conditions are formulated to obtain a compensator and pre-filter to control the lathe spindle. The performance of the proposed controller is evaluated in terms of the velocity responses of the lathe machine spindle while incorporating depth variation in the cutting process.
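
The abstract does not state the model structure used. As an illustration of the identification step that precedes QFT design, here is a minimal least-squares ARX fit in Python on synthetic, spindle-like input-output data; the model order and all numbers are invented:

```python
import numpy as np

def identify_arx(u, y, na=2, nb=2):
    """Least-squares fit of a discrete ARX model
    y[k] = -a1*y[k-1] - ... - a_na*y[k-na]
           + b1*u[k-1] + ... + b_nb*u[k-nb],
    a common first step for building the linear models (and their
    uncertainty bounds) that QFT design works from."""
    n = max(na, nb)
    rows = [np.r_[-y[k - np.arange(1, na + 1)],
                  u[k - np.arange(1, nb + 1)]]
            for k in range(n, len(y))]
    theta, *_ = np.linalg.lstsq(np.array(rows), y[n:], rcond=None)
    return theta[:na], theta[na:]      # AR and input coefficients

# Synthetic spindle-like response to a random excitation (toy data)
rng = np.random.default_rng(1)
u = rng.standard_normal(500)
y = np.zeros(500)
for k in range(2, 500):
    y[k] = 1.5 * y[k-1] - 0.7 * y[k-2] + 0.5 * u[k-1] \
           + 0.1 * rng.standard_normal()
a, b = identify_arx(u, y)
print(a, b)   # should roughly recover [-1.5, 0.7] and [0.5, 0.0]
```

Repeating such a fit over segments recorded at different depths of cut yields the family of linear models whose spread defines the QFT uncertainty templates.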

Keywords: lathe spindle, QFT, robust control, system identification

Procedia PDF Downloads 521
34239 Psychological and Ethical Factors in African American Custody Litigation

Authors: Brian Carey Sims

Abstract:

The current study examines psychological factors relevant to child custody litigation among African American fathers. Thirty-seven fathers engaged in various stages of custody litigation involving their children were surveyed about their perceptions of racial stereotypes, parental motivations, and racialized dynamics of the court/ legal process. Data were analyzed using a Critical Race Theory model designed to statistically isolate fathers’ perceptions of the existence and maintenance of structural racism through the legal process. Results indicate significant correlations between fathers’ psychological measures and structural outcomes of their cases. Findings are discussed in terms of ethical implications for family court judicial systems and attorney practice.

Keywords: ethics, family, legal psychology, policy, race

Procedia PDF Downloads 333
34238 Analysis and Forecasting of Bitcoin Price Using Exogenous Data

Authors: J-C. Leneveu, A. Chereau, L. Mansart, T. Mesbah, M. Wyka

Abstract:

Extracting and interpreting information from Big Data represents a major stake for years to come in several sectors, such as finance. Currently, numerous methods (such as technical analysis) are used to try to understand and anticipate market behavior, with mixed results, because it still seems impossible to predict a financial trend exactly. The increase in the data available on the Internet, and its diversity, represents a great opportunity for the financial world: alongside standard financial data, it is possible to draw on exogenous data to take more macroeconomic factors into account. Coupling the interpretation of these data with standard methods could yield more precise trend predictions. In this paper, in order to observe the influence of exogenous data on price, independently of the other usual effects occurring in classical markets, the behavior of Bitcoin users is introduced into a model reconstituting Bitcoin value, which is elaborated and tested for prediction purposes.
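
The abstract does not specify the model's form. Purely as an illustration of regressing value on exogenous behavioral series, the sketch below fits a least-squares model of a synthetic price on two invented user-behavior signals:

```python
import numpy as np

# Hypothetical daily series: two exogenous user-behavior signals
# (e.g. active wallet count, forum activity) and a synthetic price
# generated from them, standing in for real Bitcoin data.
rng = np.random.default_rng(0)
wallets = rng.normal(size=365).cumsum()
forum   = rng.normal(size=365).cumsum()
price   = 2.0 * wallets + 0.8 * forum + rng.normal(scale=3.0, size=365)

# Least-squares fit of price on the exogenous regressors + intercept
X = np.column_stack([np.ones(365), wallets, forum])
beta, *_ = np.linalg.lstsq(X, price, rcond=None)
print(beta)   # should roughly recover [0, 2.0, 0.8]
```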

Keywords: big data, bitcoin, data mining, social network, financial trends, exogenous data, global economy, behavioral finance

Procedia PDF Downloads 341
34237 Knowledge Audit Model for Requirement Elicitation Process

Authors: Laleh Taheri, Noraini C. Pa, Rusli Abdullah, Salfarina Abdullah

Abstract:

Knowledge plays an important role in the success of any organization. Software development organizations are highly knowledge-intensive, especially in their Requirement Elicitation Process (REP). There are several problems in communicating and using knowledge in REP, such as misunderstandings, out-of-scope requirements, conflicting information, and changing requirements; all of these problems occur when transmitting requirements knowledge during REP. Several studies have been carried out on REP in order to solve these requirements problems. Knowledge Audit (KA) approaches have been proposed for managing knowledge in human resources, finance, and manufacturing, but there is a lack of studies applying KA to the requirement elicitation process. Therefore, this paper proposes a KA model for REP that supports the acquisition of good requirements.

Keywords: knowledge audit, requirement elicitation process, KA model, knowledge in requirement elicitation

Procedia PDF Downloads 328
34236 The Usage of Bridge Estimator for HEGY Seasonal Unit Root Tests

Authors: Huseyin Guler, Cigdem Kosar

Abstract:

The aim of this study is to propose the Bridge estimator for seasonal unit root tests. Seasonality is an important feature of many economic time series: some variables contain seasonal patterns, and forecasts that ignore important seasonal patterns have high variance. It is therefore very important to deal with seasonality in seasonal macroeconomic data. There are several methods to eliminate the impact of seasonality in time series. One of them is filtering the data; however, this method leads to undesired consequences in unit root tests, especially if the data are generated by a stochastic seasonal process. Another method is to use seasonal dummy variables. Some seasonal patterns may result from stationary seasonal processes, which are modelled using seasonal dummies; but if the seasonal pattern varies and changes over time, so that the seasonal process is non-stationary, deterministic seasonal dummies are inadequate to capture it, and it is not suitable to use them for modeling such seasonally non-stationary series. Instead, it is necessary to take seasonal differences if there are seasonal unit roots in the series. Different methods have been proposed in the literature to test for seasonal unit roots, such as the Dickey-Hasza-Fuller (DHF) and Hylleberg-Engle-Granger-Yoo (HEGY) tests. The HEGY test can also be used to test for seasonal unit roots at different frequencies (monthly, quarterly, and semiannual). Another issue in unit root tests is lag selection: lagged dependent variables are added to the model in seasonal unit root tests, as in ordinary unit root tests, to overcome the autocorrelation problem. In this case, it is necessary first to choose the lag length and determine any deterministic components (a constant and trend), and then use the proper model to test for seasonal unit roots. However, this two-step procedure can lead to size distortions and a lack of power in seasonal unit root tests. Recent studies show that Bridge estimators are good at selecting the optimal lag length while differentiating non-stationary from stationary models for non-seasonal data. The advantage of this estimator is that it eliminates the two-step nature of conventional unit root tests, which leads to a gain in size and power. In this paper, the Bridge estimator is proposed for testing seasonal unit roots in a HEGY model. A Monte Carlo experiment is carried out to determine the efficiency of this approach and to compare the size and power of the method with the HEGY test. Since the Bridge estimator performs well in model selection, our approach may yield some gain in size and power over the HEGY test.
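
The quarterly HEGY regression referred to above can be written down directly. The sketch below builds the standard Hylleberg et al. auxiliary regressors and fits the regression by plain OLS on a toy seasonal random walk; the paper's contribution, replacing the lag-selection step with a Bridge-penalized fit, is not reproduced here:

```python
import numpy as np

def hegy_quarterly_ols(y, lags=1):
    """OLS fit of the quarterly HEGY auxiliary regression
       D4 y_t = c + pi1*y1_{t-1} + pi2*y2_{t-1}
                + pi3*y3_{t-2} + pi4*y3_{t-1}
                + sum_j b_j * D4 y_{t-j} + e_t."""
    y = np.asarray(y, float)
    n = len(y)
    y1 = np.full(n, np.nan); y2 = np.full(n, np.nan)
    y3 = np.full(n, np.nan); d4 = np.full(n, np.nan)
    for t in range(3, n):
        y1[t] = y[t] + y[t-1] + y[t-2] + y[t-3]     # zero frequency
        y2[t] = -(y[t] - y[t-1] + y[t-2] - y[t-3])  # semiannual frequency
    for t in range(2, n):
        y3[t] = -(y[t] - y[t-2])                    # annual frequency
    for t in range(4, n):
        d4[t] = y[t] - y[t-4]                       # seasonal difference
    start = 4 + lags
    rows = [[1.0, y1[t-1], y2[t-1], y3[t-2], y3[t-1]]
            + [d4[t-j] for j in range(1, lags + 1)]
            for t in range(start, n)]
    beta, *_ = np.linalg.lstsq(np.array(rows), d4[start:], rcond=None)
    return beta   # [const, pi1, pi2, pi3, pi4, b_1 ... b_lags]

# Toy quarterly seasonal random walk: y_t = y_{t-4} + e_t
rng = np.random.default_rng(0)
y = np.zeros(200)
for t in range(4, 200):
    y[t] = y[t-4] + rng.standard_normal()
print(hegy_quarterly_ols(y, lags=2))
```

Testing pi1 through pi4 against zero then distinguishes the zero-frequency, semiannual, and annual unit roots; the Bridge penalty would act on the lagged-difference coefficients, selecting the lag length within the same estimation.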

Keywords: bridge estimators, HEGY test, model selection, seasonal unit root

Procedia PDF Downloads 314