Search results for: process data

34851 Inter-Personal and Inter-Organizational Relationships in Supply Chain Integration: A Resource Orchestration Perspective

Authors: Bill Wang, Paul Childerhouse, Yuanfei Kang

Abstract:

Purpose: The purpose of this research is to extend resource orchestration theory (ROT) into the supply chain management (SCM) area to investigate dyadic relationships at both the individual and organizational levels in supply chain integration (SCI). We also explore the interaction mechanism between inter-personal relationships (IPRs) and inter-organizational relationships (IORs) throughout the SCI process. Methodology/approach: The research employed an exploratory multiple case study approach of four New Zealand companies. The data were collected via semi-structured interviews with top, middle, and lower level managers and operators from different departments of both suppliers and customers, triangulated with company archival data. Findings: The research highlights the important role of both IPRs and IORs throughout the SCI process. Both IPRs and IORs are valuable, inimitable resources, but IORs are formal and exterior while IPRs are informal and subordinated. In the initial stage of the SCI process, IPRs are seen as key resource antecedents to IOR building, while the three IPR dimensions work differently: personal credibility acts as an icebreaker to strengthen confidence when forming IORs, personal affection acts as a gatekeeper, and personal communication expedites the IOR-building process. In the maintenance and development stage, IORs and IPRs interact with each other continuously: good interaction between IPRs and IORs can facilitate the SCI process, while poor interaction can damage it. Over the life-cycle of the SCI process, IPRs can facilitate the formation and development of IORs, while IOR development can cultivate IPR ties. Of the three IPR dimensions, personal communication plays a more important role in developing IORs than personal credibility and personal affection. Originality/value: This research contributes to ROT in the supply chain management literature by highlighting the interaction of IPRs and IORs in SCI. The intangible resources and capabilities of the three IPR dimensions need to be orchestrated and nurtured to achieve efficient and effective IORs in SCI. IPRs and IORs also need to be orchestrated in terms of the breadth, depth, and life-cycle of the whole SCI process. Our study provides further insight into the rarely explored inter-personal level of SCI. Managerial implications: Our research provides top management with further evidence of the significant roles of IPRs at different levels when working with trading partners. This highlights the need to actively manage and develop these soft IPR skills as an intangible competitive resource. Further, the research identifies when staff with specific skills and connections should be utilized during the different stages of building and maintaining inter-organizational ties. More importantly, top management needs to orchestrate and balance the resources of IPRs and IORs.

Keywords: case study, inter-organizational relationships, inter-personal relationships, resource orchestration, supply chain integration

Procedia PDF Downloads 229
34850 System Dietadhoc® - A Fusion of Human-Centred Design and Agile Development for the Explainability of AI Techniques Based on Nutritional and Clinical Data

Authors: Michelangelo Sofo, Giuseppe Labianca

Abstract:

In recent years, the scientific community's interest in the exploratory analysis of biomedical data has increased exponentially. In the field of nutritional biology, the curative process based on the analysis of clinical data is a very delicate operation, because there are multiple solutions for managing diet-related pathologies (for example, intolerances and allergies, cholesterol metabolism, diabetic pathologies, arterial hypertension, and even obesity and breathing and sleep problems). In this research work, a system was created that is capable of evaluating various dietary regimes for specific patient pathologies. The system is founded on a mathematical-numerical model and was tailored to the real working needs of an expert in human nutrition using human-centered design (ISO 9241-210); it therefore keeps pace with continuous scientific progress in the field and evolves through the experience of managed clinical cases (a machine learning process). DietAdhoc® is a decision support system for nutrition specialists treating patients of both sexes (from 18 years of age), developed with an agile methodology. Its task is to draw up the biomedical and clinical profile of the specific patient by applying two algorithmic optimization approaches to the nutritional data, plus a symbolic solution obtained by transforming the relational database underlying the system into a deductive database. For all three solution approaches, particular emphasis has been given to the explainability of the suggested clinical decisions through flexible and customizable user interfaces. Furthermore, the system has multiple software modules based on time series and visual analytics techniques that allow the specialist to evaluate the complete picture of the situation and the evolution of the diet assigned for specific pathologies.

Keywords: medical decision support, physiological data extraction, data driven diagnosis, human centered AI, symbiotic AI paradigm

Procedia PDF Downloads 10
34849 The Effect of Self and Peer Assessment Activities in Second Language Writing: A Washback Effect Study on the Writing Growth during the Revision Phase in the Writing Process: Learners’ Perspective

Authors: Musbah Abdussayed

Abstract:

The washback effect refers to the influence of assessment on teaching and learning; this effect can be either positive or negative. This study implemented, sequentially, self-assessment (SA) and peer assessment (PA) and examined the washback effect of self and peer assessment (SPA) activities on writing growth during the revision phase of the writing process. Twenty advanced learners of Arabic as a second language from a private school in the USA participated in the study. The participants composed and then revised a short Arabic story as part of a midterm grade. Qualitative data were collected, analyzed, and synthesized from ten interviews with the learners and from the twenty learners’ post-reflective journals. The findings indicate positive washback effects on the learners’ writing growth. The PA activity enhanced descriptions and meaning, promoted creativity, and improved textual coherence, whereas the SA activity led to detecting editing issues. Furthermore, both SPA activities had washback effects in common, including helping the learners meet the writing genre conventions and developing metacognitive awareness. However, the findings also demonstrate negative washback effects on the learners’ attitudes during the revision phase, including bias toward self-evaluation during the SA activity and reluctance to rate peers’ writing performance during the PA activity. The findings suggest that self- and peer assessment activities are essential teaching and learning tools that can be used sequentially to help learners tackle multiple writing areas during the revision phase of the writing process.

Keywords: self assessment, peer assessment, washback effect, second language writing, writing process

Procedia PDF Downloads 63
34848 A Gamification Teaching Method for Software Measurement Process

Authors: Lennon Furtado, Sandro Oliveira

Abstract:

The importance of an effective measurement program lies in the ability to control and predict what can be measured. Thus, a measurement program can provide a basis for decision-making that supports the interests of an organization. However, an effective measurement program can only be implemented with a team of software engineers well trained in the measurement area. The literature indicates that few computer science courses include the teaching of the software measurement process in their curricula, and even those that do generally present only basic theoretical concepts of the process with little or no measurement in practice, which results in a lack of student motivation to learn the measurement process. In this context, according to some experts in software process improvement, one of the most widely used approaches to maintaining motivation and commitment to a software process improvement program is gamification. Therefore, this paper presents a proposal for teaching the measurement process through gamification, which seeks to improve student motivation and performance in the assimilation of tasks related to software measurement by incorporating game elements into the practice of the measurement process, making it more attractive for learning. To validate the proposal, a comparison will be made between two distinct groups of 20 students from a Software Quality class: a control group and an experimental group. The control group will learn the software measurement process without the gamification proposal, while the experimental group will learn it using the gamification proposal. The paper will then analyze the objective and subjective results of each group: the objective result is the grade each student reaches at the end of the course, and the subjective result is a post-course questionnaire collecting each student's opinion of the teaching method. Finally, this paper aims to support or refute the following hypothesis: the gamification proposal for teaching the software measurement process adequately motivates students and gives them the competence necessary for the practical application of the measurement process.

Keywords: education, gamification, software measurement process, software engineering

Procedia PDF Downloads 310
34847 Review on Optimization of Drinking Water Treatment Process

Authors: M. Farhaoui, M. Derraz

Abstract:

In drinking water treatment, the optimization of the treatment process is an issue of particular concern. In general, the process consists of many units, such as settling, coagulation, flocculation, sedimentation, filtration, and disinfection. Optimizing the process consists of measures to decrease management and monitoring expenses and improve the quality of the produced water. The objective of this study is to provide water treatment operators with methods and practices that enable them to attain the most effective use of the facility and, consequently, optimize the price per cubic meter of treated water. This paper reviews the optimization of the drinking water treatment process by analyzing all of the treatment units and offers solutions to maximize treatment performance without compromising water quality standards. Some of these solutions and methods were applied in the water treatment plant located in central Morocco (Meknes).

Keywords: coagulation process, optimization, turbidity removal, water treatment

Procedia PDF Downloads 413
34846 Constructivist Grounded Theory of Intercultural Learning

Authors: Vaida Jurgile

Abstract:

Intercultural learning is one of the approaches taken to understand the cultural diversity of the modern world, to accept changes in cultural identity and otherness, and to express tolerance. During intercultural learning, students develop their abilities to interact and communicate with their group members. These abilities help them understand social and cultural differences, form their identity, and give meaning to intercultural learning. Intercultural education recognizes that a true understanding of the differences and similarities of another culture is necessary in order to lay the foundations for working together with others, which contributes to the promotion of intercultural dialogue, appreciation of diversity, and cultural exchange. Therefore, it is important to examine the concept of intercultural learning as revealed through students’ learning experiences, and to understand how this learning takes place and what significance this phenomenon has in higher education. At a scientific level, intercultural learning should be explored in order to uncover the influence of cultural identity, i.e., intercultural learning should be seen in a local context. This experience provides an opportunity to learn from various everyday intercultural learning situations. Intercultural learning can be not only a form of learning but also a tool for building understanding between people of different cultures. The research object of the study is the process of intercultural learning. The aim of the dissertation is to develop a grounded theory of the process of learning in an intercultural study environment, revealing students’ learning experiences. The research strategy chosen in this study is constructivist grounded theory (GT). GT is an inductive method that seeks to form a theory by applying the systematic collection, synthesis, analysis, and conceptualization of data. Targeted data collection was based on the analysis of data provided by previous research participants, which revealed the need for further research participants. Only students with at least half a year of study experience, i.e., who had completed at least one semester of intercultural studies, were purposefully selected, using snowball sampling. Eighteen interviews were conducted with students representing three different fields of science (social sciences, humanities, and technology sciences). In the process of intercultural learning, language expresses and embodies cultural reality and a person’s cultural identity. It is through language that individual experiences are expressed and the world in which Others exist is perceived. Increased emphasis is placed on the fact that language conveys certain ‘signs’ of communication and perception with cultural value, enabling students to identify the Self and the Other. Language becomes an important tool in the process of intercultural communication because it is only through language that learners can communicate, exchange information, and understand each other. Thus, in the process of intercultural learning, language either promotes interpersonal relationships with foreign students or leads to mutual rejection.

Keywords: intercultural learning, grounded theory, students, other

Procedia PDF Downloads 54
34845 Application of Blockchain Technology in Geological Field

Authors: Mengdi Zhang, Zhenji Gao, Ning Kang, Rongmei Liu

Abstract:

The management and application of geological big data is an important part of China's national big data strategy. With the implementation of the national big data strategy, geological big data management becomes more and more critical. At present, there are still many technological barriers, as well as conceptual confusion, in many aspects of geological big data management and application, such as data sharing, intellectual property protection, and application technology. Therefore, making better use of new technologies for deeper exploration and wider application of geological big data is a key task. In this paper, we briefly introduce the basic principles of blockchain technology and then analyze the dilemmas in the application of geological data. Based on this analysis, we bring forward some feasible patterns and scenarios for applying blockchain to geological big data and put forward several suggestions for future work in geological big data management.

Keywords: blockchain, intellectual property protection, geological data, big data management

Procedia PDF Downloads 79
34844 Comparative Analysis of Enzyme Activities Concerned in Decomposition of Toluene

Authors: Ayuko Itsuki, Sachiyo Aburatani

Abstract:

In recent years, pollution of the environment by toxic substances has become a serious problem. While there are many methods of environmental clean-up, methods based on microorganisms are considered reasonable and safe for the environment. Compost is known to catabolize malodorous substances during its production process; however, the mechanism of this catabolizing system is not yet known. In the catabolization process, organic matter is turned into inorganic matter by enzymes released from the many microorganisms that live in the compost. In other words, the cooperation of activated enzymes in the compost decomposes malodorous substances. Thus, clarifying the interactions among enzymes is important for revealing the system that catabolizes malodorous substances in compost. In this study, we utilized a statistical method to infer the interactions among enzymes. We developed a method that combines partial correlation with cross-correlation to estimate the relevance between enzymes, especially from time series data with few variables. Because cross-correlation is used, we can estimate not only the associative structure but also the reaction pathway. We applied the developed method to the measured enzyme data and estimated the interactions among the enzymes in the decomposition mechanism of toluene.
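
As a rough illustration of the statistical idea described above (not the authors' exact implementation), the sketch below computes lagged cross-correlations and a partial correlation for three hypothetical enzyme activity series; the enzyme names and values are placeholders.

```python
# Illustrative sketch only: lagged cross-correlation (suggests reaction order)
# plus partial correlation (direct association with a third enzyme removed).
# Enzyme names and the toy measurements below are hypothetical.
import numpy as np

def cross_correlation(x, y, max_lag=3):
    """Pearson correlation of x against y shifted by each lag."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    out = {}
    for lag in range(-max_lag, max_lag + 1):
        if lag < 0:
            a, b = x[:lag], y[-lag:]
        elif lag > 0:
            a, b = x[lag:], y[:-lag]
        else:
            a, b = x, y
        out[lag] = np.corrcoef(a, b)[0, 1]
    return out

def partial_correlation(x, y, z):
    """Correlation of x and y after regressing out the control series z."""
    Z = np.column_stack([np.ones(len(z)), z])
    rx = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
    ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
    return np.corrcoef(rx, ry)[0, 1]

# Toy activity measurements over eight sampling times (hypothetical values).
protease = np.array([1.2, 1.5, 1.9, 2.4, 2.2, 1.8, 1.4, 1.1])
lipase   = np.array([0.8, 1.0, 1.4, 1.9, 2.1, 1.9, 1.5, 1.2])
urease   = np.array([2.0, 2.1, 2.3, 2.6, 2.5, 2.3, 2.1, 2.0])

print(cross_correlation(protease, lipase))            # lagged dependence
print(partial_correlation(protease, lipase, urease))  # direct association
```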

Keywords: enzyme activities, comparative analysis, compost, toluene

Procedia PDF Downloads 266
34843 Non-Invasive Data Extraction from Machine Display Units Using Video Analytics

Authors: Ravneet Kaur, Joydeep Acharya, Sudhanshu Gaur

Abstract:

Artificial Intelligence (AI) has the potential to transform manufacturing by improving shop floor processes such as production, maintenance, and quality. However, industrial datasets are notoriously difficult to extract in a real-time, streaming fashion, thus negating potential AI benefits. A prime example is specialized industrial controllers operated by custom software, which complicates the process of connecting them to an Information Technology (IT) based data acquisition network. Security concerns may also limit direct physical access to these controllers for data acquisition. To connect the Operational Technology (OT) data stored in these controllers to an AI application in a secure, reliable, and available way, we propose a novel Industrial IoT (IIoT) solution in this paper. In this solution, we demonstrate how video cameras can be installed on a factory shop floor to continuously obtain images of the controller human-machine interfaces (HMIs). We propose image pre-processing to segment the HMI into regions of streaming data and regions of fixed meta-data. We then evaluate the performance of multiple Optical Character Recognition (OCR) technologies, such as Tesseract and Google Vision, in recognizing the streaming data, and test them on typical factory HMIs under realistic lighting conditions. Finally, we use the meta-data to match the OCR output with the temporal, domain-dependent context of the data to improve the accuracy of the output. Our IIoT solution enables reliable and efficient data extraction, which will improve the performance of subsequent AI applications.
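
A minimal sketch of this extraction pipeline is shown below, using OpenCV and the Tesseract OCR engine; the frame file name, region coordinates, and the assumption that the streaming field is numeric are illustrative, not the authors' configuration.

```python
# Sketch: crop fixed HMI regions from a camera frame, binarize, run OCR, and use
# the known context (an RPM field must be numeric) to reject OCR noise.
# File name, region boxes and field names below are assumptions.
import re
import cv2
import pytesseract

REGIONS = {
    "spindle_rpm": (120, 80, 160, 40),   # streaming value, (x, y, w, h) in pixels
    "batch_id":    (120, 140, 220, 40),  # fixed meta-data
}

def read_region(frame, box):
    x, y, w, h = box
    roi = frame[y:y + h, x:x + w]
    gray = cv2.cvtColor(roi, cv2.COLOR_BGR2GRAY)
    # Otsu thresholding helps under uneven shop-floor lighting.
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    return pytesseract.image_to_string(binary, config="--psm 7").strip()

frame = cv2.imread("hmi_frame.png")
raw = {name: read_region(frame, box) for name, box in REGIONS.items()}

# Contextual check: keep only digits and decimal point for the numeric field.
rpm_text = re.sub(r"[^0-9.]", "", raw["spindle_rpm"])
rpm = float(rpm_text) if rpm_text else None
print(raw["batch_id"], rpm)
```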

Keywords: human machine interface, industrial internet of things, internet of things, optical character recognition, video analytics

Procedia PDF Downloads 105
34842 The Role of People and Data in Complex Spatial-Related Long-Term Decisions: A Case Study of Capital Project Management Groups

Authors: Peter Boyes, Sarah Sharples, Paul Tennent, Gary Priestnall, Jeremy Morley

Abstract:

Significant long-term investment projects can involve complex decisions. These are often described as capital projects, and the factors that contribute to their complexity include budgets, motivating reasons for investment, stakeholder involvement, interdependent projects, and the delivery phases required. The complexity of these projects often requires management groups to be established involving stakeholder representatives; these teams are inherently multidisciplinary. This study uses two university campus capital projects as case studies for this type of management group. Because the projects interact with wider campus infrastructure and users, decisions are made at varying spatial granularity throughout the project lifespan. This spatial-related context brings complexity to the group decisions. Sensemaking is the process used to achieve group situational awareness of a complex situation, enabling the team to arrive at a consensus and make a decision. The purpose of this study is to understand the role of people and data in complex spatial-related long-term decision-making and sensemaking processes. The paper aims to identify and present issues experienced in practical settings of these types of decisions. A series of exploratory semi-structured interviews with members of the two projects elicited an understanding of their operation. From two stages of thematic analysis, inductive and deductive, emergent themes were identified around the group structure, the data usage, and the decision making within these groups. When data were made available to the group, there were commonly issues with the perceived veracity and validity of the data presented; this impacted the ability of the group to reach consensus and, therefore, to make decisions. Similarly, there were different responses to forecasted or modelled data, shaped by the experience and occupation of the individuals within the multidisciplinary management group. This paper provides an understanding of the further support required for team sensemaking and decision making in complex capital projects. The paper also discusses the barriers to effective decision making found in this setting and suggests opportunities to develop decision support systems for this team strategic decision-making process. Recommendations are made for further research into the sensemaking and decision-making processes of this complex spatial-related setting.

Keywords: decision making, decisions under uncertainty, real decisions, sensemaking, spatial, team decision making

Procedia PDF Downloads 125
34841 Microarray Data Visualization and Preprocessing Using R and Bioconductor

Authors: Ruchi Yadav, Shivani Pandey, Prachi Srivastava

Abstract:

Microarrays provide a rich source of data on the molecular workings of cells. Each microarray reports on the abundance of tens of thousands of mRNAs. Virtually every human disease is being studied using microarrays, with the hope of finding the molecular mechanisms of disease. Bioinformatics analysis plays an important part in processing the information embedded in large-scale expression profiling studies and in laying the foundation for biological interpretation. A basic yet challenging task in the analysis of microarray gene expression data is the identification of changes in gene expression that are associated with particular biological conditions. Careful statistical design and analysis are essential to improve the efficiency and reliability of microarray experiments throughout the data acquisition and analysis process. One of the most popular platforms for microarray analysis is Bioconductor, an open source and open development software project based on the R programming language. This paper describes specific procedures for conducting quality assessment, visualization, and preprocessing of Affymetrix GeneChip data, details the different Bioconductor packages used to analyze Affymetrix microarray data, and describes the analysis and outcome of each plot.

Keywords: microarray analysis, R language, affymetrix visualization, bioconductor

Procedia PDF Downloads 472
34840 Innovation and Employment in Sub-Saharan Africa: Evidence from Uganda Microdata

Authors: Milton Ayoki, Edward Bbaale

Abstract:

This paper analyses the relationship between innovation and employment at the firm level, with the objective of understanding the contribution of different innovation strategies to fostering employment growth in Uganda. We use the National Innovation Survey (micro-data on 705 Ugandan firms) for the period 2011-2014, closely follow the structured approach of Harrison et al. (2014), and relate employment growth to process innovations and to the growth of sales due, separately, to innovative and unchanged products. We find positive effects of product innovation on employment at the firm level, while process innovation has no discernible impact on employment. Although there is evidence of labour displacement in some cases where firms only introduce new processes, this effect is compensated for by employment growth from new products, which most firms introduce simultaneously with new processes. Results suggest that the source of innovation, as well as the size of the innovating firms or the end users of the innovation, matters for job growth. Innovation that is developed within the firm itself (the user) and involves larger firms has a greater impact on employment than innovation developed outside or within smaller firms. In addition, innovative firms are one and a half times more likely to survive in an innovation-driven economy than those that do not innovate. These results have important implications for policymakers and stakeholders in the innovation ecosystem. Supporting policies need to be correctly tailored, since the impacts depend on the innovation strategy (type) and on the characteristics and sector of the innovative firms (small, large, industry, etc.). Policies to spur investment, particularly in innovative sectors and firms with high growth potential, would have long-lasting effects on job creation. JEL Classification: D24, J0, J20, L20, O30.

Keywords: employment, process innovation, product innovation, Sub-Saharan Africa

Procedia PDF Downloads 156
34839 Smart Sensor Data to Predict Machine Performance with IoT-Based Machine Learning and Artificial Intelligence

Authors: C. J. Rossouw, T. I. van Niekerk

Abstract:

The global manufacturing industry is utilizing the internet and cloud-based services to further explore and optimize manufacturing processes in support of the movement into the Fourth Industrial Revolution (4IR). From a third-world and African perspective, the 4IR is hindered by the fact that many manufacturing systems developed during the third industrial revolution are not inherently equipped to utilize the internet and the services of the 4IR, which slows the progression of these manufacturing industries into the 4IR. This research focuses on the development of a non-invasive and cost-effective cyber-physical IoT system that exploits a machine’s vibration to expose semantic characteristics of the manufacturing process and utilizes these results in a real-time cloud-based machine condition monitoring system, with the intention of optimizing the system. A microcontroller-based IoT sensor was designed to acquire a machine’s mechanical vibration data, process it in real-time, and transmit it to a cloud-based platform via Wi-Fi and the internet. Time-frequency Fourier analysis was applied to the vibration data to form an image representation of the machine’s behaviour. This data was used to train a Convolutional Neural Network (CNN) to learn semantic characteristics in the machine’s behaviour and relate them to a state of operation. The same data was also used to train a Convolutional Autoencoder (CAE) to detect anomalies in the data. Real-time edge-based artificial intelligence was achieved by deploying the CNN and CAE on the sensor to analyse the vibration. A cloud platform was deployed to visualize the vibration data and the results of the CNN and CAE in real-time. The cyber-physical IoT system was deployed on a semi-automated metal granulation machine with a set of trained machine learning models. Using a single sensor, the system was able to accurately visualize three states of the machine’s operation in real-time. The system was also able to detect a variance in the material being granulated. The research demonstrates how non-IoT manufacturing systems can be equipped with edge-based artificial intelligence to establish a remote machine condition monitoring system.
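
As a rough sketch of the time-frequency step (not the authors' exact parameters), the following shows how a raw vibration trace could be turned into a spectrogram image of the kind used as CNN input; the sampling rate, signal content, and window sizes are assumptions.

```python
# Sketch: convert a vibration trace into a spectrogram "image" for a CNN.
# Sampling rate, the synthetic signal and window sizes are assumed values.
import numpy as np
from scipy import signal

fs = 4000                              # Hz, assumed accelerometer sampling rate
t = np.arange(0, 2.0, 1 / fs)

# Synthetic vibration: a 50 Hz running component plus a growing 380 Hz fault tone.
vibration = (np.sin(2 * np.pi * 50 * t)
             + 0.3 * t * np.sin(2 * np.pi * 380 * t)
             + 0.1 * np.random.randn(t.size))

f, frames, Sxx = signal.spectrogram(vibration, fs=fs, nperseg=256, noverlap=128)
image = 10 * np.log10(Sxx + 1e-12)     # dB scale, shape (frequency bins, time frames)

print(image.shape)                     # one "picture" of machine behaviour per window
```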

Keywords: IoT, cyber-physical systems, artificial intelligence, manufacturing, vibration analytics, continuous machine condition monitoring

Procedia PDF Downloads 82
34838 A Unique Exact Approach to Handle a Time-Delayed State-Space System: The Extraction of Juice Process

Authors: Mohamed T. Faheem Saidahmed, Ahmed M. Attiya Ibrahim, Basma GH. Elkilany

Abstract:

This paper discusses the application of a Time Delay Control (TDC) compensation technique to the juice extraction process in a sugar mill. The objective is to improve the control performance of the process and increase extraction efficiency. The paper presents the mathematical model of the juice extraction process and the design of the TDC compensation controller. Simulation results show that the TDC compensation technique can effectively suppress the time delay effect in the process and improve control performance. Extraction efficiency is also significantly increased with the application of the TDC compensation technique. The proposed approach, simulated in MATLAB, provides a practical solution for improving the juice extraction process in sugar mills.
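
For background, the classical Smith-predictor form of dead-time compensation (listed in the keywords below) is summarized here; this is generic reference material rather than the paper's exact TDC law. G_0(s) is the delay-free process model, L the dead time, and C(s) the primary controller; the second relation assumes a perfect model.

```latex
% Smith-predictor dead-time compensation, shown for reference only.
\[
  C_{\mathrm{eq}}(s) = \frac{C(s)}{1 + C(s)\,G_0(s)\bigl(1 - e^{-Ls}\bigr)},
  \qquad
  \frac{Y(s)}{R(s)} = \frac{C(s)\,G_0(s)}{1 + C(s)\,G_0(s)}\, e^{-Ls}.
\]
```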

Keywords: time delay control (TDC), exact and unique state space model, delay compensation, Smith predictor

Procedia PDF Downloads 81
34837 Information Management Approach in the Prediction of Acute Appendicitis

Authors: Ahmad Shahin, Walid Moudani, Ali Bekraki

Abstract:

This research presents a predictive data mining model for the accurate diagnosis of acute appendicitis in patients, with the purpose of maximizing health service quality, minimizing morbidity/mortality, and reducing cost. Acute appendicitis is the most common disease requiring timely, accurate diagnosis and surgical intervention. Although the treatment of acute appendicitis is simple and straightforward, its diagnosis is still difficult because no single sign, symptom, laboratory test, or imaging examination accurately confirms the diagnosis of acute appendicitis in all cases. This contributes to increased morbidity and negative appendectomy rates. In this study, the authors propose to generate an accurate model for predicting acute appendicitis in patients, based, firstly, on a segmentation technique associated with the ABC algorithm to segment the patients; secondly, on applying fuzzy logic to process the massive volume of heterogeneous and noisy data (age, sex, fever, white blood cell count, neutrophilia, CRP, urine, ultrasound, CT, appendectomy, etc.) in order to express knowledge and analyze the relationships among the data in a comprehensive manner; and thirdly, on applying a dynamic programming technique to reduce the number of data attributes. The proposed model is evaluated against a set of benchmark techniques and on a set of benchmark classification problems (osteoporosis, diabetes, and heart disease) obtained from the UCI repository and other data sources.

Keywords: healthcare management, acute appendicitis, data mining, classification, decision tree

Procedia PDF Downloads 343
34836 Narrative Study to Resilience and Adversity's Response

Authors: Yun Hang Stanley Cheung

Abstract:

In recent years, many educators and entrepreneurs have suggested that students’ and workers’ ability to respond to adversity is very important, as it affects their problem-solving strategies and ultimate success in their career or life. Resilience is discussed as the process of bouncing back and the ability to adapt well in response to adversity; being resilient does not mean living without any stress or difficulty, but growing and thriving under pressure. The purpose of this study is to describe the process of resilience and the response to adversity. The narrative inquiry aims at understanding the experiential process of responding to adversity and the associated problem-solving strategies (such as emotion control, motivation, and the decision-making process), as well as making the experience into a life story that may be evaluated by its teller and its listeners. The narrative study describes the researcher’s own experience of responding to adversity during recovery from serious burn injuries sustained in a hill fire at the age of 12, as well as the adversities and obstacles related to the tragedy after the physical recovery. Sense-Making Theory and McCormack’s Lenses were used as the constructive perspective and for data analysis. In conclusion, this study describes a life story of fighting adversity, and the narratives yield some suggestions, pointing out that positive thinking is necessary to build up resilience and the ability to respond immediately to adversity. Some problem-solving strategies toward adversity are also discussed, which are helpful for resilience education for youth and young adults.

Keywords: adversity response, life story, narrative inquiry, resilience

Procedia PDF Downloads 303
34835 Frequent Item Set Mining for Big Data Using MapReduce Framework

Authors: Tamanna Jethava, Rahul Joshi

Abstract:

Frequent item sets play an essential role in many data mining tasks that try to find interesting patterns in databases. The term typically refers to a set of items that frequently appear together in a transaction dataset. Several mining algorithms are used for frequent item set mining, yet most do not scale to the type of data we are presented with today, so-called “big data”. Big data is a collection of large datasets. Our approach is to perform frequent item set mining over large datasets in a scalable and speedy way. MapReduce, along with HDFS, is used to find frequent item sets from big data on a large cluster. This paper focuses on using a pre-processing and mining algorithm as a hybrid approach for big data on the Hadoop platform.
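
The single-machine sketch below imitates the first MapReduce pass of such an approach: mappers emit candidate item sets per transaction and a reducer sums the counts and applies a minimum support threshold. The transactions and threshold are toy values; on Hadoop, the mapper and reducer would run as separate distributed tasks over HDFS splits.

```python
# Toy, single-process imitation of a MapReduce pass for frequent item-set counting.
from collections import defaultdict
from itertools import combinations

transactions = [
    {"bread", "milk"},
    {"bread", "diaper", "beer", "eggs"},
    {"milk", "diaper", "beer", "cola"},
    {"bread", "milk", "diaper", "beer"},
    {"bread", "milk", "diaper", "cola"},
]
MIN_SUPPORT = 3   # minimum number of transactions an item set must appear in

def mapper(transaction):
    # Emit (itemset, 1) for candidate 1- and 2-item sets from one transaction.
    for item in transaction:
        yield frozenset([item]), 1
    for pair in combinations(sorted(transaction), 2):
        yield frozenset(pair), 1

def reducer(pairs):
    counts = defaultdict(int)
    for key, value in pairs:
        counts[key] += value
    return {k: v for k, v in counts.items() if v >= MIN_SUPPORT}

emitted = (kv for t in transactions for kv in mapper(t))
for itemset, count in sorted(reducer(emitted).items(), key=lambda kv: -kv[1]):
    print(set(itemset), count)
```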

Keywords: frequent item set mining, big data, Hadoop, MapReduce

Procedia PDF Downloads 422
34834 Modeling Palm Oil Quality During the Ripening Process of Fresh Fruits

Authors: Afshin Keshvadi, Johari Endan, Haniff Harun, Desa Ahmad, Farah Saleena

Abstract:

Experiments were conducted to develop a model for analyzing the ripening process of oil palm fresh fruits in relation to the oil yield and oil quality of the palm oil produced. This research was carried out on 8-year-old Tenera (Dura × Pisifera) palms planted in 2003 at the Malaysian Palm Oil Board Research Station. Fresh fruit bunches were harvested from designated palms from January until May 2010. The bunches were divided into three regions (top, middle, and bottom), and fruits from the outer and inner layers were randomly sampled for analysis at 8, 12, 16, and 20 weeks after anthesis to establish relationships between maturity and oil development in the mesocarp and kernel. Computations on data related to ripening time, oil content, and oil quality were performed using several computer software programs (MSTAT-C, SAS, and Microsoft Excel). Nine nonlinear mathematical models were fitted to the collected data using MATLAB software. The results showed that mean mesocarp oil percent increased from 1.24% at 8 weeks after anthesis to 29.6% at 20 weeks after anthesis. Fruits from the top part of the bunch had the highest mesocarp oil content, at 10.09%. The lowest kernel oil percent, 0.03%, was recorded at 12 weeks after anthesis. Palmitic acid and oleic acid comprised more than 73% of total mesocarp fatty acids at 8 weeks after anthesis and increased to more than 80% at fruit maturity at 20 weeks. The logistic model, with the highest R² and the lowest root mean square error, was found to be the best-fit model.
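
A minimal sketch of fitting such a logistic ripening curve is shown below. Only the 8- and 20-week mesocarp oil values are taken from the abstract; the two intermediate points and the starting parameter guesses are illustrative placeholders, not the study's data.

```python
# Sketch: fit a three-parameter logistic model of mesocarp oil content vs. weeks
# after anthesis. The 12- and 16-week values below are assumed, not measured.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, r, t0):
    """K: asymptotic oil %, r: growth rate per week, t0: inflection point (weeks)."""
    return K / (1.0 + np.exp(-r * (t - t0)))

weeks = np.array([8.0, 12.0, 16.0, 20.0])      # weeks after anthesis
oil_pct = np.array([1.24, 8.0, 22.0, 29.6])    # %, middle two values are placeholders

params, _ = curve_fit(logistic, weeks, oil_pct, p0=[30.0, 0.5, 14.0], maxfev=5000)
K, r, t0 = params
rmse = np.sqrt(np.mean((logistic(weeks, *params) - oil_pct) ** 2))
print(f"K = {K:.1f} %, r = {r:.2f}/week, t0 = {t0:.1f} weeks, RMSE = {rmse:.2f}")
```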

Keywords: oil palm, oil yield, ripening process, anthesis, fatty acids, modeling

Procedia PDF Downloads 305
34833 Like Making an Ancient Urn: Metaphor Conceptualization of L2 Writing

Authors: Muhalim Muhalim

Abstract:

Drawing on Lakoff’s theory of metaphor conceptualization, this article explores the conceptualization of second language writing (L2W) by ten student-teachers in Indonesia via metaphors. The ten postgraduate English language teaching students, who are also (former) English teachers, received seven days of intervention in teaching and learning L2. Using introspective logs and a focus group discussion, the results show that all participants unanimously perceive L2W as a process-oriented rather than a product-oriented activity. Specifically, the metaphor conceptualizations exhibit three categories of process-oriented L2W: deliberate process, learning process, and problem-solving process. However, it has to be clarified from the outset that this categorization is not rigid, because some of the properties of the metaphors might belong to other categories. The results of the study and implications for English language teaching are further discussed.

Keywords: metaphor conceptualisation, second language, learning writing, teaching writing

Procedia PDF Downloads 405
34832 The Role Of Data Gathering In NGOs

Authors: Hussaini Garba Mohammed

Abstract:

Background/Significance: The lack of data gathering affects NGOs worldwide in obtaining good information about educational and health-related issues among communities in any country and around the world. For example, HIV/AIDS, smoking (tuberculosis), and COVID-19 virus carriers are becoming a serious public health problem, especially among older men and women. However, in some countries there is no fully detailed survey assessment of communities, villages, and rural areas to show the percentage of victims and patients, especially for the global COVID-19 virus. These data are essential to inform programming targets, strategies, and priorities, and to obtain good information through data gathering in any society.

Keywords: reliable information, data assessment, data mining, data communication

Procedia PDF Downloads 175
34831 Understanding Cruise Passengers’ On-board Experience throughout the Customer Decision Journey

Authors: Sabina Akter, Osiris Valdez Banda, Pentti Kujala, Jani Romanoff

Abstract:

This paper examines the relationship between on-board environmental factors and overall customer satisfaction in the context of the cruise on-board experience. The on-board environmental factors considered are ambient, layout/design, social, product/service, and on-board enjoyment factors. The study presents a data-driven framework and model for the on-board cruise experience. The data were collected from 893 respondents through a self-administered online questionnaire about their cruise experience. This study reveals the cruise passengers’ on-board experience through the customer decision journey, based on publicly available data. Pearson correlation and regression analysis were applied, and the results show a positive and significant relationship between the environmental factors and the on-board experience. These data help in understanding the cruise passengers’ on-board experience, which will inform the ultimate decision-making process in cruise ship design.
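
A minimal sketch of the correlation and regression step is shown below; the column names and file are assumptions about how the questionnaire data might be stored, not the study's actual dataset.

```python
# Sketch: Pearson correlations of each on-board factor with overall satisfaction,
# followed by a multiple linear regression. Column and file names are assumed.
import pandas as pd
from scipy import stats
import statsmodels.api as sm

factors = ["ambient", "layout_design", "social", "product_service", "enjoyment"]
df = pd.read_csv("cruise_survey.csv")      # hypothetical file with 893 responses

for col in factors:
    r, p = stats.pearsonr(df[col], df["overall_satisfaction"])
    print(f"{col:16s} r = {r:+.2f}  p = {p:.3f}")

X = sm.add_constant(df[factors])
model = sm.OLS(df["overall_satisfaction"], X).fit()
print(model.summary())
```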

Keywords: cruise behavior, customer activities, on-board environmental factors, on-board experience, user or customer satisfaction

Procedia PDF Downloads 164
34830 Additive Friction Stir Manufacturing Process: Interest in Understanding Thermal Phenomena and Numerical Modeling of the Temperature Rise Phase

Authors: Antoine Lauvray, Fabien Poulhaon, Pierre Michaud, Pierre Joyot, Emmanuel Duc

Abstract:

Additive Friction Stir Manufacturing (AFSM) is a new industrial process that follows the emergence of friction-based processes. AFSM is a solid-state additive process that uses the energy produced by friction at the interface between a rotating non-consumable tool and a substrate. Friction depends on various parameters, such as the axial force, the rotation speed, and the friction coefficient. The feeder material is a metallic rod that flows through a hole in the tool. Unlike Friction Stir Welding (FSW), for which abundant literature exists addressing many aspects from process implementation to characterization and modeling, there are still few research works focusing on AFSM. Therefore, there is still a lack of understanding of the physical phenomena taking place during the process. This research work aims at a better understanding and implementation of the AFSM process, thanks to numerical simulation and experimental validation performed on a prototype effector. Such an approach is considered a promising way of studying the influence of the process parameters and finally identifying a relevant process window. The deposition of material through the AFSM process takes place in several phases; in chronological order, these are the docking phase, the dwell time phase, the deposition phase, and the removal phase. The present work focuses on the dwell time phase, during which the temperature of the system composed of the tool, the filler material, and the substrate rises due to pure friction. Analytic modeling of friction-based heat generation considers the rotational speed and the contact pressure as its main parameters. Another influential parameter is the friction coefficient, assumed to be variable due to the self-lubrication of the system as temperature rises and to the smoothing of the roughness of the materials in contact over time. This study proposes, through numerical modeling followed by experimental validation, to examine the influence of the various input parameters on the dwell time phase. Rotation speed, temperature, spindle torque, and axial force are the main parameters monitored during the experiments and serve as reference data for the calibration of the numerical model. This research shows that the geometry of the tool, as well as fluctuations of the input parameters such as axial force and rotational speed, strongly influence the temperature reached and/or the time required to reach the targeted temperature. The main outcome is the prediction of a process window, which is a key result for a more efficient process implementation.
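
For orientation, a standard sliding-friction estimate of the heat generated during the dwell phase is recalled below, assuming a flat circular contact of radius R (a simplification, not necessarily the paper's tool geometry): the local flux μ p ω r is integrated over the contact area.

```latex
% Sliding-friction heat generation under a flat circular contact (simplifying assumption):
\[
  Q \;=\; \int_{0}^{R} \mu\, p\, \omega\, r \cdot 2\pi r \, \mathrm{d}r
    \;=\; \frac{2}{3}\,\pi\,\mu\, p\,\omega\, R^{3},
\]
% with mu the (possibly temperature-dependent) friction coefficient, p the contact
% pressure and omega the rotational speed, i.e. the parameters listed above.
```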

Keywords: numerical model, additive manufacturing, friction, process

Procedia PDF Downloads 141
34829 Holistic Risk Assessment Based on Continuous Data from the User’s Behavior and Environment

Authors: Cinzia Carrodano, Dimitri Konstantas

Abstract:

Risk is part of our lives. In today’s society, risk is connected to our safety, and safety has become a major priority in our lives. Each person lives his or her life based on an evaluation of the risk he or she is ready to accept and sustain, and the level of safety he or she wishes to reach, based on highly personal criteria. The assessment of the risk a person takes in a complex environment, and the impact of other people’s actions and of events on our perception of risk, are elements to be considered. The concept of Holistic Risk Assessment (HRA) aims at developing a methodology and a model that will allow us to take into account elements outside the direct influence of the individual and provide a personalized risk assessment. The concept is based on the fact that, in the near future, we will be able to gather and process extremely large amounts of data about an individual and his or her environment in real time. The interaction and correlation of these data is the key element of the holistic risk assessment. In this paper, we present the HRA concept and describe the most important elements and considerations.

Keywords: continuous data, dynamic risk, holistic risk assessment, risk concept

Procedia PDF Downloads 119
34828 Skid-Mounted Gathering System Hydrate Control and Process Simulation Optimization

Authors: Di Han, Lingfeng Li, Peixue Zhang, Yuzhuo Zhang

Abstract:

Natural gas extracted at the wellhead of a gas well undergoes a rapid decrease in temperature, along with a decrease in pressure, after passing through the throttle valve, which creates the conditions for hydrate formation. In order to solve the problem of hydrate formation during wellhead gathering, effective preventive measures should be taken. In this paper, we first introduce the principle of the natural gas throttling temperature drop and the theoretical basis of the hydrate inhibitor injection calculation, and then use HYSYS software to simulate the three processes and determine the key process parameters. The hydrate control process applicable to the design of natural gas wellhead gathering skids was determined by comparing the hydrate control effectiveness, the energy consumption of the key equipment, and the process adaptability.
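
As general background on the inhibitor injection calculation mentioned above (not the paper's own correlation), the widely used Hammerschmidt relation estimates the hydrate depression temperature from the inhibitor concentration, where W is the inhibitor mass fraction (wt%) in the aqueous phase, M its molar mass, and K_H an inhibitor-specific constant taken from the literature.

```latex
% Hammerschmidt estimate of hydrate depression; shown for background only.
\[
  \Delta T_{\mathrm{depression}} \;=\; \frac{K_{H}\, W}{M\,(100 - W)}
\]
```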

Keywords: natural gas, hydrate control, skid design, HYSYS

Procedia PDF Downloads 80
34827 Dimension Free Rigid Point Set Registration in Linear Time

Authors: Jianqin Qu

Abstract:

This paper proposes a rigid point set matching algorithm in arbitrary dimensions based on the idea of symmetric covariant functions. A group of functions of the points in the set is formulated using rigid invariants. Each of these functions computes a pair of correspondences from the given point set. The computed correspondences are then used to recover the unknown rigid transform parameters. Each computed point can be geometrically interpreted as a weighted mean center of the point set. The algorithm is compact, fast, and dimension free, without any optimization process. It either computes the desired transform for noiseless data in linear time or fails quickly in exceptional cases. Experimental results for synthetic data and 2D/3D real data are provided, which demonstrate potential applications of the algorithm to a wide range of problems.
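
For comparison only, the sketch below shows the standard SVD-based (Kabsch) recovery of a rigid transform from known correspondences; this is not the covariant-function method proposed in the paper (which also recovers the correspondences), but it illustrates the dimension-free transform-recovery step.

```python
# Reference sketch: SVD-based (Kabsch) rigid fit with known correspondences,
# valid in any dimension d. Not the paper's covariant-function algorithm.
import numpy as np

def rigid_fit(P, Q):
    """Return R, t minimizing ||R @ p + t - q|| over corresponding rows of P, Q (n x d)."""
    p_mean, q_mean = P.mean(axis=0), Q.mean(axis=0)
    H = (P - p_mean).T @ (Q - q_mean)                 # d x d cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.eye(H.shape[0])
    D[-1, -1] = np.sign(np.linalg.det(Vt.T @ U.T))    # avoid reflections
    R = Vt.T @ D @ U.T
    t = q_mean - R @ p_mean
    return R, t

# 3-D toy example with a known rotation and translation.
rng = np.random.default_rng(0)
P = rng.normal(size=(100, 3))
c, s = np.cos(0.4), np.sin(0.4)
R_true = np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])
Q = P @ R_true.T + np.array([1.0, -2.0, 0.5])

R_est, t_est = rigid_fit(P, Q)
print(np.allclose(R_est, R_true), t_est)
```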

Keywords: covariant point, point matching, dimension free, rigid registration

Procedia PDF Downloads 164
34826 Multiloop Fractional Order PID Controller Tuned Using Cuckoo Algorithm for Two Interacting Conical Tank Process

Authors: U. Sabura Banu, S. K. Lakshmanaprabu

Abstract:

Improvements in meta-heuristic algorithms encourage control engineers to design optimal controllers for industrial processes. Most real-world industrial processes are non-linear multivariable processes with high interaction. Even in a single sub-process unit, thousands of loops are present, most of them interacting in nature. Optimal controller design for such processes is still a challenging task. Closed-loop controller design with multiloop PID involves a tedious procedure: performing an interaction study, auto-tuning the PID of the loop with the higher interaction, and finally detuning the controller to accommodate the effects of the other process variables. Fractional order PID controllers have recently been replacing integer order PID controllers. The design of a Multiloop Fractional Order (MFO) PID controller is more complicated still. The cuckoo algorithm, a swarm intelligence technique, is used to tune the MFO PID controller optimally and with ease by minimizing the Integral Time Absolute Error (ITAE). The closed-loop performance is tested under servo, regulatory, and servo-regulatory conditions.
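
For reference, the tuning problem can be summarized as minimizing the ITAE cost over the parameters of the two fractional order PI^λD^μ controllers; the summation over the two loops is an assumption about how the multiloop case is aggregated.

```latex
% ITAE cost and the standard fractional order PID form, shown for reference.
\[
  J_{\mathrm{ITAE}} \;=\; \sum_{i=1}^{2} \int_{0}^{T} t\,\bigl|e_{i}(t)\bigr|\,\mathrm{d}t,
  \qquad
  C_{i}(s) \;=\; K_{p,i} + \frac{K_{I,i}}{s^{\lambda_i}} + K_{d,i}\, s^{\mu_i}.
\]
```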

Keywords: cuckoo algorithm, multiloop fractional order PID controller, two interacting conical tank process

Procedia PDF Downloads 494
34825 Examination of the Satisfaction Levels of Pre-Service Teachers Concerning E-Learning Process in Terms of Different Variables

Authors: Agah Tugrul Korucu

Abstract:

Significant changes for the better have taken place in the body of information and in the use of technology available in the field of education, induced by the technological changes of the 21st century. It is mainly the job of teachers and pre-service teachers to integrate information and communication technologies into education by conveying the use of technology to individuals. While pre-service teachers are conducting lessons using technology, the methods they have developed are important factors for the requirements of the lesson and for the satisfaction levels of the students. The aim of this study is to examine the satisfaction levels of pre-service teachers regarding e-learning, in a technological environment in which lesson activities are conducted through an online learning environment, in terms of various variables. The study group of the research is composed of 156 pre-service teachers who were students in the departments of Computer and Teaching Technologies, Art Teaching, and Pre-school Teaching in the 2014-2015 academic year. The quantitative research method was adopted for this study; the survey (scanning) model was employed in collecting the data. “The Satisfaction Scale regarding the E-learning Process”, developed by Gülbahar, and a personal information form developed by the researcher were used as the means of collecting the data. The Cronbach α reliability coefficient, which is the internal consistency coefficient of the scale, is 0.91. The SPSS statistical package and the techniques of mean, standard deviation, percentage, correlation, t-test, and analysis of variance were used in the analysis of the data.
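
For reference, the internal-consistency coefficient reported above (α = 0.91) is defined as follows, where k is the number of scale items, σᵢ² the variance of item i, and σ_X² the variance of the total score.

```latex
% Cronbach's alpha, the internal-consistency coefficient of the scale.
\[
  \alpha \;=\; \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma_{i}^{2}}{\sigma_{X}^{2}}\right)
\]
```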

Keywords: online learning environment, integration of information technologies, e-learning, e-learning satisfaction, pre-service teachers

Procedia PDF Downloads 347
34824 The Application of Data Mining Technology in Building Energy Consumption Data Analysis

Authors: Liang Zhao, Jili Zhang, Chongquan Zhong

Abstract:

Energy consumption data, in particular data involving public buildings, are impacted by many factors: the building structure, climate/environmental parameters, construction, system operating conditions, and user behavior patterns. Traditional methods of data analysis are insufficient. This paper delves into data mining technology to determine its application in the analysis of building energy consumption data, including energy consumption prediction, fault diagnosis, and optimal operation. Recent literature is reviewed and summarized, the problems faced by data mining technology in the area of energy consumption data analysis are enumerated, and research points for future studies are given.

Keywords: data mining, data analysis, prediction, optimization, building operational performance

Procedia PDF Downloads 846
34823 Using Mind Mapping and Morphological Analysis within a New Methodology for Teaching Students of Products’ Design

Authors: Kareem Saber

Abstract:

Many product design instructors search for ways to help students develop their designs simply, by reducing the number of design stages and extrapolating simple design process forms to achieve design creativity. The researcher therefore extrapolated a new design process form called “hierarchical design”, which reduces the design process to three stages, and tried that methodology on about two hundred students. The trial led to great results, as students could develop designs characterized by creativity and innovation. This demonstrated the success and effectiveness of the proposed methodology.

Keywords: mind mapping, morphological analysis, product design, design process

Procedia PDF Downloads 164
34822 Customers' Perception towards the Service Marketing Mix and Frequency of Use of Mercedes Benz Automobile Service, Thailand

Authors: Pranee Tridhoskul

Abstract:

This research paper aims to examine the relationship between the service marketing mix and customers’ frequency of use of service at Mercedes Benz Auto Repair Centres under Thonburi Group, Thailand. From a population of 2,267 customers who used the service of Thonburi Group’s Auto Repair Centres, a sample of 340 customers was drawn using a probability sampling technique. Systematic random sampling was applied, with a questionnaire used to collect the data at Thonburi Group’s Auto Repair Centres. Means and Pearson’s correlations were used in analyzing the data. The study found a medium level of customer perception towards the product and service of Thonburi Group’s Auto Repair Centres, as well as towards price, place or distribution channel, and promotion. The people who provided the service were also perceived at a medium level, whereas the physical evidence and the service process were perceived at a high level. Furthermore, a correlation appeared between the physical evidence and service process and customers’ frequency of use of the automobile service per year.

Keywords: service marketing mix, behavior, Mercedes Auto Service Centre, frequency of use

Procedia PDF Downloads 322