Search results for: process data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 35359

34849 Social Semantic Web-Based Analytics Approach to Support Lifelong Learning

Authors: Khaled Halimi, Hassina Seridi-Bouchelaghem

Abstract:

The purpose of this paper is to describe how learning analytics approaches based on social semantic web techniques can be applied to enhance lifelong learning experiences from a connectivist perspective. To this end, a prototype of a system called SoLearn (Social Learning Environment) that supports this approach was developed. We observed and studied literature related to lifelong learning systems, the social semantic web and ontologies, connectivism theory and learning analytics approaches, and reviewed systems implemented in these fields, to extract and draw conclusions about the features necessary for enhancing the lifelong learning process. The semantic analytics of learning can be used for viewing, studying and analysing the massive data generated by learners, which helps them to understand, through recommendations, charts and figures, their learning and behaviour, and to detect where they have weaknesses or limitations. This paper emphasises that implementing a learning analytics approach based on social semantic web representations can enhance the learning process. On the one hand, the analysis process leverages the meaning expressed by the semantics presented in the ontology (relationships between concepts). On the other hand, the analysis process exploits the discovery of new knowledge by means of the inference mechanisms of the semantic web.
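
The following is a minimal, hypothetical sketch (not the authors' SoLearn code) of the kind of semantic-web inference the abstract refers to: relationships between concepts asserted in a small ontology are traversed with a SPARQL property path, so that implicit knowledge (a concrete quiz being a learning activity) is discovered. It assumes the rdflib package; the namespace and all names are illustrative.

```python
# Hypothetical sketch: assert a tiny ontology and let a SPARQL property path
# infer that a concrete quiz is (implicitly) a learning activity.
from rdflib import Graph, Namespace, RDF, RDFS

EX = Namespace("http://example.org/solearn#")   # hypothetical namespace
g = Graph()
g.bind("ex", EX)

# Ontology: relationships between concepts, plus one asserted fact about a learner
g.add((EX.Quiz, RDFS.subClassOf, EX.AssessmentActivity))
g.add((EX.AssessmentActivity, RDFS.subClassOf, EX.LearningActivity))
g.add((EX.quiz42, RDF.type, EX.Quiz))
g.add((EX.quiz42, EX.completedBy, EX.learner7))

# Discovery of new knowledge: quiz42 is never typed as a LearningActivity,
# but the subclass chain makes it one.
results = g.query("""
    PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
    PREFIX ex:   <http://example.org/solearn#>
    SELECT ?activity WHERE { ?activity a/rdfs:subClassOf* ex:LearningActivity . }
""")
for row in results:
    print(row.activity)   # -> http://example.org/solearn#quiz42
```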

Keywords: connectivism, learning analytics, lifelong learning, social semantic web

Procedia PDF Downloads 198
34848 The Search of the Possibility of Running a Six Sigma Process in an IT Education Center

Authors: Mohammad Amini, Aliakbar Alijarahi

Abstract:

This research, titled 'The Search of the Possibility of Running a Six Sigma Process in an IT Education Center', aims to test the possibility of running the Six Sigma process and using it in an IT education center system. Six Sigma is a good method for reducing process errors. To evaluate running Six Sigma in the IT education center, some variables relevant to this process were selected: the amount of support from the organization's top management for the process; the current specialty; the ability of the training system to compensate for deficiencies; the degree of match between the current culture and the Six Sigma culture; and the current quality compared with the quality gained from running Six Sigma. To evaluate these variables, we selected four questions and, to obtain the answers, prepared a questionnaire form with 28 questions and distributed it in our target population. Our working environment is highly competitive, and the organization needs to reduce errors to a minimum; otherwise it loses its customers. The questionnaire form was given to 55 people and was completed and returned by 50 of them. After analysing the forms, the following results were obtained: the IT education center needs to adopt and run this system (Six Sigma) to improve its process quality; most of the factors needed to run Six Sigma exist in the IT education center, but more support is needed.

Keywords: education, customer, self-action, quality, continuous improvement process

Procedia PDF Downloads 330
34847 Temperament as a Success Determinant in Formative Assessment

Authors: George Fomunyam Kehdinga

Abstract:

Assessment is a vital part of the educational process, and formative assessment is a way of ensuring that higher education achieves the desired effects. Different factors influence how students perform in assessments in general, and in formative assessment in particular, and temperament is one such determining factor. This paper, a qualitative case study of four universities in four different countries, examines how the temperamental make-up of students either empowers them to perform excellently in formative assessment or incapacitates their performance. The four universities were chosen from Cameroon, South Africa, the United Kingdom and the United States of America, and three students were chosen from each institution, six of them undergraduate students and six postgraduate students. Data in this paper were generated through qualitative interviews and document analyses, which were preceded by a temperament test. From the data generated, it was discovered that cholerics, who are natural leaders and hence do not struggle to express themselves, often perform excellently in formative assessment, while sanguines, who like cholerics are extroverts, perform relatively well. Phlegmatics and melancholics performed averagely and poorly respectively in formative assessment because they are naturally prone to fear and dislike such activities, as they like keeping to themselves. The paper therefore suggests that temperament is a success determinant in formative assessment. It also proposes that lecturers need an understanding of temperaments to be able to fully administer formative assessment in the lecture room. It also suggests that assessment should be balanced in the classroom so that some students are not naturally disadvantaged because of their temperamental make-up while others perform excellently. Lastly, the paper suggests that since formative assessment is a process of generating data, it should be contextualised or given an individualised approach so as to ensure that trustworthy data are generated.

Keywords: temperament, formative assessment, academic success, students

Procedia PDF Downloads 240
34846 Identifying and Analyzing the Role of Brand Loyalty towards Incumbent Smartphones in New Branded Smartphone Adoption: Approach by Dual Process Theory

Authors: Lee Woong-Kyu

Abstract:

Fierce competition in the smartphone market may encourage users to switch brands when buying a new smartphone. However, many smartphone users continue to use the same brand even though other branded smartphones are perceived to be more attractive. The purpose of this study is to identify and analyze the effects of brand loyalty toward an incumbent smartphone on new smartphone adoption. For this purpose, a research model including two hypotheses, a positive effect and a negative effect on rational judgments, is proposed based on dual process theory. For the validation of the research model, data were collected by surveying Korean university students and tested by a group comparison between high and low brand loyalty. The results show that the two hypotheses were statistically supported.

Keywords: brand loyalty, dual process theory, incumbent smartphone, smartphone adoption

Procedia PDF Downloads 281
34845 Energy Efficiency Assessment of Energy Internet Based on Data-Driven Fuzzy Integrated Cloud Evaluation Algorithm

Authors: Chuanbo Xu, Xinying Li, Gejirifu De, Yunna Wu

Abstract:

Energy Internet (EI) is a new form that deeply integrates the Internet and the entire energy process from production to consumption. The assessment of energy efficiency performance is of vital importance for the long-term sustainable development of an EI project. Although the newly proposed fuzzy integrated cloud evaluation algorithm considers the randomness of uncertainty, it relies too much on the experience and knowledge of experts. Fortunately, the enrichment of EI data has enabled the utilization of data-driven methods. Therefore, the main purpose of this work is to assess the energy efficiency of a park-level EI by combining a data-driven method with the fuzzy integrated cloud evaluation algorithm. Firstly, the indicators for energy efficiency are identified through a literature review. Secondly, an artificial neural network (ANN)-based data-driven method is employed to cluster the values of the indicators. Thirdly, the energy efficiency of the EI project is calculated through the fuzzy integrated cloud evaluation algorithm. Finally, the applicability of the proposed method is demonstrated by a case study.
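
As a rough illustration of the cloud-model component of such an evaluation, the sketch below generates cloud drops and membership degrees for one indicator grade using the standard forward normal cloud generator. The Ex/En/He values are illustrative assumptions, not results from the park-level EI case study.

```python
# Illustrative forward normal cloud generator: Ex (expectation), En (entropy),
# He (hyper-entropy) describe one evaluation grade; drops and membership
# degrees are generated from them.
import numpy as np

def cloud_drops(ex: float, en: float, he: float, n: int = 1000, seed: int = 0):
    rng = np.random.default_rng(seed)
    en_prime = np.abs(rng.normal(en, he, size=n))          # randomized entropy per drop
    x = rng.normal(ex, en_prime)                           # drop positions
    mu = np.exp(-((x - ex) ** 2) / (2 * en_prime ** 2))    # membership degrees
    return x, mu

x, mu = cloud_drops(ex=0.75, en=0.08, he=0.01)             # hypothetical grade "good"
print("mean drop:", round(float(x.mean()), 3),
      "mean membership:", round(float(mu.mean()), 3))
```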

Keywords: energy efficient, energy internet, data-driven, fuzzy integrated evaluation, cloud model

Procedia PDF Downloads 186
34844 Establishing Control Chart Limits for Rounded Measurements

Authors: Ran Etgar

Abstract:

The process of rounding off measurements in continuous variables is commonly encountered. Although it usually has minor effects, it can sometimes lead to poor outcomes in statistical process control using the X̄ chart. The traditional control limits can cause incorrect conclusions if applied carelessly. This study looks into the limitations of classical control limits, particularly the impact of asymmetry. An approach to determining the distribution function of the measured parameter ȳ is presented, resulting in a more precise method to establish the upper and lower control limits. The proposed method, while slightly more complex than Shewhart's original idea, is still user-friendly and accurate and only requires the use of two straightforward tables.
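
For context, the sketch below computes classical Shewhart X̄ chart limits from exact and from rounded subgroup data, illustrating the kind of distortion the paper addresses; it is not the corrected method proposed in the paper, and the rounding resolution and A2 constant are illustrative textbook values.

```python
# Illustrative comparison of Shewhart X-bar limits from exact vs. rounded data.
import numpy as np

rng = np.random.default_rng(0)
exact = rng.normal(loc=10.0, scale=1.0, size=(50, 5))      # 50 subgroups of size 5
rounded = np.round(exact / 0.5) * 0.5                      # measurements rounded off to 0.5

def xbar_limits(samples, a2=0.577):                        # A2 constant for n = 5
    xbar = samples.mean(axis=1)                            # subgroup means
    rbar = (samples.max(axis=1) - samples.min(axis=1)).mean()   # mean subgroup range
    centre = xbar.mean()
    return centre - a2 * rbar, centre, centre + a2 * rbar  # LCL, centre line, UCL

print("exact  :", xbar_limits(exact))
print("rounded:", xbar_limits(rounded))                    # limits shift under rounding
```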

Keywords: SPC, round-off data, control limit, rounding error

Procedia PDF Downloads 62
34843 Board Nomination and Selection Process in Indonesian State-Owned Enterprises

Authors: Synthia A. Sari

Abstract:

A transparent nomination and selection process is the first step to obtaining qualified board members. As the representative (agent) of the owners, the board must consist of competent and professional people. However, the development of a transparent and ideal nomination and selection process in Indonesian state-owned enterprises (SOEs) has been based on relatively little research. Considering the relative importance attached by boards to conducting their roles in their principal's interest across a variety of governance tasks in state-owned enterprises, the primary aim of this paper is to shed light on the extent to which the nomination and selection process affects the performance of the board in implementing good corporate governance in Indonesian SOEs. The exploratory nature of this study led to the adoption of a qualitative research methodology, which uses semi-structured interviews and publicly available documents to collect a range of data pertaining to board nomination and selection and the work of boards. Interviews were conducted with four informants from three Indonesian SOEs and the Ministry of SOEs. Findings in this study demonstrate that unclear job descriptions and expectations of board members, resulting from unclear functions of the board in Indonesian SOEs, make a transparent and accountable nomination and selection process hard to conduct. This situation is vulnerable to influence from political interests, and the process itself can degenerate into situations of political interference. In the end, it leads to choosing the wrong person for membership of the board. This study makes a significant contribution to several fields, namely human resource management, corporate governance, and Southeast Asian studies, by addressing the basic research gaps of board selection process issues in Indonesian SOEs. The gap is addressed by providing a more coherent framework for an effective nomination and selection system which reflects more clearly the real experiences of those actually involved at board level.

Keywords: board selection and nomination process, Indonesian state-owned enterprises, good corporate governance, political influence

Procedia PDF Downloads 257
34842 Object-Oriented Modeling Simulation and Control of Activated Sludge Process

Authors: J. Fernandez de Canete, P. Del Saz Orozco, I. Garcia-Moral, A. Akhrymenka

Abstract:

Object-oriented modeling is spreading in current simulation of wastewater treatment plants through the use of the individual components of the process and their relations to define the underlying dynamic equations. In this paper, we describe the use of the free-software OpenModelica simulation environment for the object-oriented modeling of an activated sludge process under feedback control. The performance of the controlled system was analyzed both under normal conditions and in the presence of disturbances. The described object-oriented approach represents a valuable tool in teaching and provides practical insight into the wastewater process control field.

Keywords: object-oriented programming, activated sludge process, OpenModelica, feedback control

Procedia PDF Downloads 373
34841 Using Building Information Modelling to Mitigate Risks Associated with Health and Safety in the Construction and Maintenance of Infrastructure Assets

Authors: Mohammed Muzafar, Darshan Ruikar

Abstract:

BIM, an acronym for Building Information Modelling, relates to the practice of creating a computer-generated model which is capable of displaying the planning, design, construction and operation of a structure. The resulting simulation is a data-rich, object-oriented, intelligent and parametric digital representation of the facility, from which views and data appropriate to various users' needs can be extracted and analysed to generate information that can be used to make decisions and to improve the process of delivering the facility. BIM also refers to a shift in culture that will influence the way the built environment and infrastructure operate and how they are delivered. One of the main issues of concern in the construction industry at present in the UK is its record on Health & Safety (H&S). It is, therefore, important that new technologies such as BIM are developed to help improve the quality of health and safety. Historically, the H&S record of the construction industry in the UK has been relatively poor compared to the manufacturing industries. BIM and the digital environment it operates within now allow us to use design and construction data in a more intelligent way. It allows data generated by the design process to be re-purposed to contribute to improving efficiencies in other areas of a project. This evolutionary step in design is not only creating exciting opportunities for the designers themselves but is also creating opportunities for every stakeholder in any given project. From designers, engineers and contractors through to H&S managers, BIM is accelerating a cultural change. The paper introduces the concept behind a research project that mitigates the H&S risks associated with the construction, operation and maintenance of assets through the adoption of BIM.

Keywords: building information modeling, BIM levels, health, safety, integration

Procedia PDF Downloads 236
34840 Development of a Process to Manufacture High Quality Refined Salt from Crude Solar Salt

Authors: Rathnayaka D. D. T., Vidanage P. W., Wasalathilake K. C., Wickramasingha H. W., Wijayarathne U. P. L., Perera S. A. S.

Abstract:

This paper describes the research carried out to develop a process to increase the NaCl percentage of crude salt obtained from the conventional solar evaporation process. In this study, refined salt was produced from crude solar salt by a chemico-physical method consisting of coagulation, precipitation and filtration. Initially, crude salt crystals were crushed and dissolved in water. Optimum amounts of calcium hydroxide, sodium carbonate and Poly Aluminium Chloride (PAC) were added to the solution respectively. The refined NaCl solution was separated out by a filtration process. The solution was tested for Total Suspended Solids, SO₄²⁻, Mg²⁺ and Ca²⁺. With the optimum dosage of reagents, the results showed that a level of 99.60% NaCl could be achieved. Further, this paper discusses the economic viability of the proposed process. An 83% profit margin can be achieved by this process, an increase of 112.3% compared to the traditional process.

Keywords: chemico-physical, economic, optimum, refined, solar salt

Procedia PDF Downloads 238
34839 3D Numerical Modelling of a Pulsed Pumping Process of a Large Dense Non-Aqueous Phase Liquid Pool: In situ Pilot-Scale Case Study of Hexachlorobutadiene in a Keyed Enclosure

Authors: Q. Giraud, J. Gonçalvès, B. Paris

Abstract:

Remediation of dense non-aqueous phase liquids (DNAPLs) represents a challenging issue because of their persistent behaviour in the environment. This pilot-scale study investigates, by means of in situ experiments and numerical modelling, the feasibility of a pulsed pumping process for a large amount of DNAPL in an alluvial aquifer. The main compound of the DNAPL is hexachlorobutadiene, an emerging organic pollutant. A low-permeability keyed enclosure was built at the location of the DNAPL source zone in order to isolate a finite undisturbed volume of soil, and a 3-month pulsed pumping process was applied inside the enclosure to exclusively extract the DNAPL. The water/DNAPL interface elevation at both the pumping and observation wells and the cumulative pumped volume of DNAPL were recorded. A total volume of about 20 m³ of pure DNAPL was recovered, since no water was extracted during the process. The three-dimensional, multiphase flow simulator TMVOC was used, and a conceptual model was elaborated and generated with the pre/post-processing tool mView. The numerical model consisted of 10 layers of variable thickness and 5060 grid cells. Numerical simulations reproduce the pulsed pumping process and show an excellent match between simulated and field data for the cumulative pumped volume of DNAPL, and a reasonable agreement between modelled and observed data for the evolution of the water/DNAPL interface elevations at the two wells. This study offers a new perspective in remediation, since DNAPL pumping system optimisation may be performed where a large amount of DNAPL is encountered.

Keywords: dense non-aqueous phase liquid (DNAPL), hexachlorobutadiene, in situ pulsed pumping, multiphase flow, numerical modelling, porous media

Procedia PDF Downloads 168
34838 Providing Security to Private Cloud Using Advanced Encryption Standard Algorithm

Authors: Annapureddy Srikant Reddy, Atthanti Mahendra, Samala Chinni Krishna, N. Neelima

Abstract:

In our present world, we are generating a lot of data, and we need a specific device to store all these data. Generally, we store data in pen drives, hard drives, etc. Sometimes we may lose the data due to the corruption of devices. To overcome these issues, we implemented a cloud space for storing the data, which provides more security to the data. We can access the data from anywhere in the world just by using the internet. We implemented all of this in Java using the NetBeans IDE. Once a user uploads the data, he does not have any rights to change it. Users' uploaded files are stored in the cloud with the file name set to the system time, and the directory is created with some random words. The cloud accepts the data only if the size of the file is less than 2 MB.
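
A minimal sketch of the storage policy described above, assuming the Python cryptography package (Fernet uses AES-128 in CBC mode internally): files of 2 MB or more are rejected, the stored file is named after the system time, the directory is built from random words, and the content is encrypted before being written. The paths, word list and key handling are illustrative assumptions, not the authors' Java implementation.

```python
# Hypothetical sketch of the described storage policy using the cryptography package.
import os, time, random
from cryptography.fernet import Fernet

WORDS = ["amber", "delta", "nimbus", "quartz", "raven", "sierra"]   # hypothetical word list
key = Fernet.generate_key()      # in a real system the key would be managed per user/service
cipher = Fernet(key)             # Fernet = AES-128-CBC plus HMAC under the hood

def upload(data: bytes, root: str = "cloud_store") -> str:
    if len(data) >= 2 * 1024 * 1024:                       # accept only files under 2 MB
        raise ValueError("file of 2 MB or more rejected")
    directory = os.path.join(root, "_".join(random.sample(WORDS, 2)))  # random-word directory
    os.makedirs(directory, exist_ok=True)
    path = os.path.join(directory, str(int(time.time())))  # file name = system time
    with open(path, "wb") as fh:
        fh.write(cipher.encrypt(data))                     # data is encrypted at rest
    return path

print(upload(b"example report contents"))
```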

Keywords: cloud space, AES, FTP, NetBeans IDE

Procedia PDF Downloads 193
34837 The Effect of Flue Gas Condensation on the Exergy Efficiency and Economic Performance of a Waste-To-Energy Plant

Authors: Francis Chinweuba Eboh, Tobias Richards

Abstract:

In this study, a waste-to-energy combined heat and power plant under construction was modelled and simulated with the Aspen Plus software. The base case process plant was evaluated and compared with the plant integrated with flue gas condensation (FGC) in order to find out the impact on exergy efficiency and economic feasibility, as well as the effect on overall system exergy losses and the revenue generated in the investigated plant. The economic evaluations were carried out using vendor cost data from the Aspen process economic analyser. The results indicate that a 4% increase in exergy efficiency and a 29% reduction in the exergy loss in the flue gas were obtained when flue gas condensation was incorporated. Furthermore, with the integrated FGC, the net present value (NPV) and income generated in the base process plant were increased by 29% and 10%, respectively, after 20 years of operation.
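
For readers unfamiliar with the economic measure used, the sketch below shows a plain net-present-value comparison of a base case against a case with added FGC; the cash flows, discount rate and 20-year horizon are placeholders, not the plant data reported in the study.

```python
# Placeholder NPV comparison; cash flows and discount rate are illustrative only.
def npv(rate, cash_flows):
    """Discount yearly cash flows; year 0 is the (negative) investment."""
    return sum(cf / (1.0 + rate) ** year for year, cf in enumerate(cash_flows))

base_case = [-50e6] + [6.0e6] * 20     # hypothetical investment and yearly net income
with_fgc = [-53e6] + [6.6e6] * 20      # FGC adds capital cost but raises heat revenue
rate = 0.06
print(f"NPV base case: {npv(rate, base_case) / 1e6:.1f} M")
print(f"NPV with FGC : {npv(rate, with_fgc) / 1e6:.1f} M")
```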

Keywords: economic feasibility, exergy efficiency, exergy losses, flue gas condensation, waste-to-energy

Procedia PDF Downloads 178
34836 Youth Voter Turnout in Jamaica: A Case Study of the 2016 General Election

Authors: Tracy-Ann Johnson-Myers

Abstract:

Since the early 1990s, voter turnout in Jamaica has been abysmal. More troubling, the group least interested in voting is the 'articulate minority' (educated youths, aged 18-35). Using surveys, media commentaries and data from the Electoral Commission of Jamaica, this study explores the relationship between educated youths and traditional politics in Jamaica. Specifically, it raises questions about why the 'articulate minority' did not vote in the 2016 general election. This is done by highlighting the political and socio-economic reasons affecting their participation in the electoral process, their opinions of who is responsible for low voter turnout in Jamaica, and what they think needs to be done to encourage people in general to vote. The findings reveal that the lack of interest in the democratic and electoral process by the 'articulate minority' is due to their growing distrust of politicians and political parties, and their lack of confidence in the political process.

Keywords: articulate minority, Jamaica, voter apathy, voter turnout

Procedia PDF Downloads 216
34835 Designing and Implementing a Tourist-Guide Web Service Based on Volunteer Geographic Information Using Open-Source Technologies

Authors: Javad Sadidi, Ehsan Babaei, Hani Rezayan

Abstract:

The advent of Web 2.0 makes it possible to scale down the costs of data collection and mapping, specifically if the process is done by volunteers. Every volunteer can be thought of as a free and ubiquitous sensor collecting spatial, descriptive as well as multimedia data for tourist services. The lack of large-scale information, such as real-time climate and weather conditions, population density, and other related data, can be considered one of the important challenges in developing countries for tourists trying to make the best decision in terms of time and place of travel. The current research aims to design and implement a spatiotemporal web map service using volunteer-submitted data. The service acts as a tourist-guide service in which tourists can search places of interest based on their requested time of travel. To design the service, a three-tier architecture, comprising data, logical processing, and presentation tiers, has been utilized. For implementing the service, open-source software programs, client- and server-side programming languages (such as OpenLayers2, AJAX, and PHP), Geoserver as a map server, and the Web Feature Service (WFS) standard have been used. The result is two distinct browser-based services: one for submitting spatial, descriptive, and multimedia volunteer data, and another for tourists and local officials. Local officials confirm the veracity of the volunteer-submitted information. In the tourist interface, a spatiotemporal search engine has been designed to enable tourists to find a tourist place based on province, city, and location at a specific time of interest. Implementing the tourist-guide service with this methodology means that current tourists participate in a free data collection and sharing process for future tourists, that data are shared and accessible in real time for all, and that a blind selection of travel destination is avoided; significantly, it decreases the cost of providing such services.
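
As an illustration of the service architecture, the sketch below shows the kind of WFS GetFeature request the tourist interface could issue to a Geoserver instance to retrieve volunteer-submitted places filtered by city and month. The endpoint, layer name and filter attributes are hypothetical assumptions, not the authors' actual service.

```python
# Hypothetical WFS GetFeature request; endpoint, layer and attributes are assumptions.
import requests

WFS_URL = "http://example.org/geoserver/wfs"          # hypothetical Geoserver endpoint
params = {
    "service": "WFS",
    "version": "1.1.0",
    "request": "GetFeature",
    "typeName": "vgi:tourist_places",                 # hypothetical volunteer-data layer
    "outputFormat": "application/json",
    "CQL_FILTER": "city = 'Tehran' AND month = 7",    # spatiotemporal search criteria
}
response = requests.get(WFS_URL, params=params, timeout=10)
for feature in response.json()["features"][:5]:       # GeoJSON features from volunteers
    print(feature["properties"].get("name"), feature["geometry"]["coordinates"])
```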

Keywords: VGI, tourism, spatiotemporal, browser-based, web mapping

Procedia PDF Downloads 80
34834 The Solvent Extraction of Uranium, Plutonium and Thorium from Aqueous Solution by 1-Hydroxyhexadecylidene-1,1-Diphosphonic Acid

Authors: M. Bouhoun Ali, A. Y. Badjah Hadj Ahmed, M. Attou, A. Elias, M. A. Didi

Abstract:

In this paper, the solvent extraction of uranium(VI), plutonium(IV) and thorium(IV) from aqueous solutions using 1-hydroxyhexadecylidene-1,1-diphosphonic acid (HHDPA) in treated kerosene has been investigated. The HHDPA was previously synthesized and characterized by FT-IR, ¹H NMR, ³¹P NMR spectroscopy and elemental analysis. The effects of contact time, initial pH, initial metal concentration, aqueous/organic phase ratio, extractant concentration and temperature on the extraction process have been studied. Empirical modelling was performed by using a 2⁵ full factorial design, and the regression equation for metal extraction was determined from the data. The conventional log-log analysis of the extraction data reveals that the ratios of extractant to extracted U(VI), Pu(IV) and Th(IV) are 1:1, 1:2 and 1:2, respectively. Thermodynamic parameters showed that the extraction process was exothermic and spontaneous. The obtained optimal parameters were applied to real effluents containing uranium(VI), plutonium(IV) and thorium(IV) ions.

Keywords: solvent extraction, uranium, plutonium, thorium, 1-hydroxyhexadecylidene-1,1-diphosphonic acid, aqueous solution

Procedia PDF Downloads 280
34833 A Deletion-Cost Based Fast Compression Algorithm for Linear Vector Data

Authors: Qiuxiao Chen, Yan Hou, Ning Wu

Abstract:

As the classic Douglas-Peucker Algorithm (DPA) has deficiencies such as a high risk of deleting key nodes by mistake, high complexity, time consumption and relatively slow execution speed, a new Deletion-Cost Based Compression Algorithm (DCA) for linear vector data was proposed. For each curve, the basic element of linear vector data, the deletion costs of all its middle nodes were calculated, and the minimum deletion cost was compared with the pre-defined threshold. If the former was greater than or equal to the latter, all remaining nodes were retained and the curve's compression process was finished. Otherwise, the node with the minimal deletion cost was deleted, its two neighbours' deletion costs were updated, and the same loop was repeated on the compressed curve until termination. In several comparative experiments using different types of linear vector data, DPA and DCA were compared in terms of compression quality and computing efficiency. The experiment results showed that DCA outperformed DPA in both compression accuracy and execution efficiency.
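
A minimal sketch of the deletion-cost loop described above follows. The cost function used here (the triangle area a node forms with its two neighbours) is an assumption for illustration, and the costs are recomputed on each pass for clarity rather than updating only the two neighbours as the paper describes.

```python
# Illustrative deletion-cost compression; the triangle-area cost is an assumption.
def tri_area(p, q, r):
    """Deletion cost of node q: area of the triangle it forms with its neighbours."""
    return abs((q[0] - p[0]) * (r[1] - p[1]) - (r[0] - p[0]) * (q[1] - p[1])) / 2.0

def dca_compress(curve, threshold):
    pts = list(curve)
    while len(pts) > 2:
        costs = [tri_area(pts[i - 1], pts[i], pts[i + 1]) for i in range(1, len(pts) - 1)]
        k = min(range(len(costs)), key=costs.__getitem__)
        if costs[k] >= threshold:       # cheapest deletion already too costly: stop
            break
        del pts[k + 1]                  # delete the node with the minimal deletion cost
    return pts

curve = [(0, 0), (1, 0.1), (2, -0.1), (3, 5), (4, 0.2), (5, 0)]
print(dca_compress(curve, threshold=0.5))   # the salient node near x = 3 is kept
```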

Keywords: Douglas-Peucker algorithm, linear vector data, compression, deletion cost

Procedia PDF Downloads 234
34832 Multilevel Gray Scale Image Encryption through 2D Cellular Automata

Authors: Rupali Bhardwaj

Abstract:

Cryptography is the science of using mathematics to encrypt and decrypt data; the data are converted into some other gibberish form, and then the encrypted data are transmitted. The primary purpose of this paper is to provide two levels of security through a two-step process: rather than transmitting the message bits directly, the message is first encrypted using 2D cellular automata and then scrambled with the Arnold Cat Map transformation; this provides an additional layer of protection and reduces the chance of the transmitted message being detected. A comparative analysis of the effectiveness of the scrambling technique is provided using scrambling degree measurement parameters, i.e. Gray Difference Degree (GDD) and Correlation Coefficient.
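
The sketch below illustrates the second step of the described process, the Arnold Cat Map scramble of an N x N grayscale image; the cellular-automata encryption step is omitted, and the toy image and iteration count are illustrative.

```python
# Illustrative Arnold Cat Map scramble of a square grayscale image.
import numpy as np

def arnold_cat(img: np.ndarray, iterations: int = 1) -> np.ndarray:
    n = img.shape[0]                                   # image must be N x N
    x, y = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    out = img.copy()
    for _ in range(iterations):
        scrambled = np.empty_like(out)
        scrambled[(x + y) % n, (x + 2 * y) % n] = out  # (x, y) -> (x + y, x + 2y) mod N
        out = scrambled
    return out

img = np.arange(16, dtype=np.uint8).reshape(4, 4)      # toy 4 x 4 "image"
print(arnold_cat(img, iterations=3))                   # pixel positions are permuted
```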

Keywords: scrambling, cellular automata, Arnold cat map, game of life, gray difference degree, correlation coefficient

Procedia PDF Downloads 360
34831 GRABTAXI: A Taxi Revolution in Thailand

Authors: Danuvasin Charoen

Abstract:

The study investigates the business process and business model of GRABTAXI. The paper also discusses how the company implemented strategies to gain competitive advantages. The data are derived from the analysis of secondary data and in-depth interviews with staff, taxi drivers, and key customers. The findings indicate that the company's competitive advantages come from being the first mover, emphasising the ease of use and tangible benefits of the application, and using a network effect strategy.

Keywords: taxi, mobile application, innovative business model, Thailand

Procedia PDF Downloads 293
34830 Adaptive Decision Feedback Equalizer Utilizing Fixed-Step Error Signal for Multi-Gbps Serial Links

Authors: Alaa Abdullah Altaee

Abstract:

This paper presents an adaptive decision feedback equalizer (ADFE) for multi-Gbps serial links utilizing a fixed-step error signal extracted from cross-points of received data symbols. The extracted signal is generated based on the violation of received data symbols against minimum detection requirements at the clock and data recovery (CDR) stage. The iterations of the adaptation process search for the optimum feedback tap coefficients to maximize the data eye opening and minimize the adaptation convergence time. The effectiveness of the proposed architecture is validated using simulation results for a serial link designed in an IBM 130 nm 1.2 V CMOS technology. The data link with variable channel lengths is analyzed using Spectre from Cadence Design Systems with BSIM4 device models.
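
As a simplified, software-level analogue of the described adaptation, the sketch below runs a sign-sign (fixed-step) update of two DFE feedback taps against a toy ISI channel; the channel taps, step size and symbol count are illustrative assumptions, not the circuit-level architecture evaluated in the paper.

```python
# Illustrative sign-sign adaptation of two DFE feedback taps over a toy ISI channel.
import numpy as np

rng = np.random.default_rng(1)
symbols = rng.choice([-1.0, 1.0], size=5000)                  # transmitted NRZ symbols
channel = np.array([1.0, 0.45, 0.2])                          # main cursor + post-cursor ISI
received = np.convolve(symbols, channel)[: len(symbols)]      # channel output

taps = np.zeros(2)                                            # DFE feedback taps
mu = 0.002                                                    # fixed adaptation step
decisions = np.zeros(len(symbols))
for k in range(2, len(symbols)):
    past = decisions[k - 2:k][::-1]                           # [d[k-1], d[k-2]]
    y = received[k] - taps @ past                             # ISI cancelled by feedback
    decisions[k] = 1.0 if y >= 0 else -1.0                    # slicer decision
    err = y - decisions[k]                                    # error at the decision point
    taps += mu * np.sign(err) * np.sign(past)                 # fixed-step (sign-sign) update

print("adapted feedback taps:", np.round(taps, 3))            # approach [0.45, 0.2]
```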

Keywords: adaptive DFE, CMOS equalizer, error detection, serial links, timing jitter, wire-line communication

Procedia PDF Downloads 106
34829 Development of a Shape Based Estimation Technology Using Terrestrial Laser Scanning

Authors: Gichun Cha, Byoungjoon Yu, Jihwan Park, Minsoo Park, Junghyun Im, Sehwan Park, Sujung Sin, Seunghee Park

Abstract:

The goal of this research is to estimate structural shape change using terrestrial laser scanning. This study proceeds with the development of a data reduction and shape change estimation algorithm for large-capacity scan data. The point cloud of the scan data was converted to voxels and sampled. A shape estimation technique is studied to detect changes in structures such as skyscrapers, bridges, and tunnels based on large point cloud data. The point cloud analysis applies the octree data structure to speed up the post-processing for change detection. The point cloud data are a relative representative value of the shape information and are used as a model for detecting point cloud changes in a data structure. The shape estimation model aims to develop a technology that can detect not only normal but also immediate structural changes in the event of disasters such as earthquakes, typhoons, and fires, thereby preventing major accidents caused by aging and disasters. The study is expected to improve the efficiency of structural health monitoring and maintenance.
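
A minimal sketch of the voxel-based data-reduction step mentioned above: points are mapped to voxel indices and one representative point is kept per occupied voxel. The voxel size is an illustrative assumption, and the octree post-processing step is not shown.

```python
# Illustrative voxel down-sampling of a point cloud.
import numpy as np

def voxel_downsample(points: np.ndarray, voxel_size: float) -> np.ndarray:
    """points: (N, 3) array of x, y, z coordinates; keeps one point per occupied voxel."""
    keys = np.floor(points / voxel_size).astype(np.int64)      # voxel index of every point
    _, first_idx = np.unique(keys, axis=0, return_index=True)  # first point in each voxel
    return points[np.sort(first_idx)]

cloud = np.random.default_rng(2).uniform(0.0, 10.0, size=(100_000, 3))
sampled = voxel_downsample(cloud, voxel_size=0.5)
print(len(cloud), "->", len(sampled), "points after voxel sampling")
```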

Keywords: terrestrial laser scanning, point cloud, shape information model, displacement measurement

Procedia PDF Downloads 223
34828 Assessment of Factors Influencing Business Process Harmonization: A Case Study in an Industrial Company

Authors: J. J. M. Trienekens, H. L. Romero, L. Cuenca

Abstract:

While process harmonization is increasingly mentioned and unanimously associated with several benefits, more understanding is needed of how it contributes to business process redesign and improvement. This paper presents the application, in an industrial case study, of a conceptual harmonization model of the relationship between drivers and effects of process harmonization. The drivers are called contextual factors which influence harmonization. Assessment of these contextual factors in a particular business domain clarifies the extent of harmonization that can be achieved, or that should be strived for. The case study shows how the conceptual harmonization model can be made operational and can act as a valuable assessment tool. From both qualitative and some quantitative assessment results, insights are discussed on the extent of harmonization that can be achieved, and action plans are defined for business (process) harmonization.

Keywords: case study, contextual factors, process harmonization, industrial company

Procedia PDF Downloads 380
34827 Early Childhood Education: Teachers' Ability to Assess

Authors: Ade Dwi Utami

Abstract:

Pedagogic competence is the basic competence of teachers in performing their tasks as educators. The ability to assess has become one of the demands within teachers' pedagogic competence. Teachers' ability to assess is related to curriculum instructions and applications. This research is aimed at obtaining data concerning teachers' ability to assess, which comprises understanding assessment, determining assessment type, tools and procedure, conducting the assessment process, and using assessment result information. It uses a mixed-methods explanatory technique in which qualitative data are used to verify the quantitative data obtained through a survey. The technique of quantitative data collection is a test, whereas the qualitative data collection is by observation, interview and documentation. The analyzed data are then processed through a proportion study technique to be categorized into high, medium and low. The result of the research shows that teachers' ability to assess can be grouped into three categories, namely 2% high, 4% medium and 94% low. The data show that teachers' ability to assess is still relatively low. Teachers lack knowledge and comprehension in assessment application. This statement is verified by the qualitative data showing that teachers did not state which aspect was assessed in learning, did not record children's behavior, and did not use the resulting data as a consideration when designing a program. Teachers have assessment documents, yet they only serve as a means of completing teachers' administration for the certification program. Thus, assessment documents were not used on the basis of acquired knowledge. This condition should become a consideration for teacher education institutions and the government to improve teachers' pedagogic competence, including the ability to assess.

Keywords: assessment, early childhood education, pedagogic competence, teachers

Procedia PDF Downloads 239
34826 Factors Affecting Green Supply Chain Management of Lampang Ceramics Industry

Authors: Nattida Wannaruk, Wasawat Nakkiew

Abstract:

This research aims to study the factors that affect the performance of green supply chain management in the Lampang ceramics industry. The data for this research were gathered by questionnaires from 20 factories in the Lampang ceramics industry. The research factors are divided into five major groups: green design, green purchasing, green manufacturing, green logistics and reverse logistics. The questionnaire consisted of four parts related to the factors of green supply chain management and to general information about the Lampang ceramics industry. The data were then analyzed using descriptive statistics, and the priority of each factor was determined using the analytic hierarchy process (AHP). The understanding of the factors affecting green supply chain management in the Lampang ceramics industry is indicated in the summary result, along with each factor's weight. The results of this research could contribute to the development of indicators or performance evaluation in the future.
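
A minimal sketch of the AHP weighting step is shown below: a pairwise comparison matrix for the five factor groups is reduced to priority weights via its principal eigenvector, and a consistency ratio is checked. The comparison values are illustrative, not the survey data from the 20 factories.

```python
# Illustrative AHP priority computation for the five factor groups.
import numpy as np

factors = ["green design", "green purchasing", "green manufacturing",
           "green logistics", "reverse logistics"]
A = np.array([[1,    3,   2,   4,   5],        # pairwise comparisons (illustrative)
              [1/3,  1,   1/2, 2,   3],
              [1/2,  2,   1,   3,   4],
              [1/4,  1/2, 1/3, 1,   2],
              [1/5,  1/3, 1/4, 1/2, 1]], dtype=float)

eigvals, eigvecs = np.linalg.eig(A)
k = int(np.argmax(eigvals.real))
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                        # priority vector (factor weights)

ci = (eigvals.real[k] - len(A)) / (len(A) - 1)  # consistency index
cr = ci / 1.12                                  # random index RI = 1.12 for n = 5
for name, w in zip(factors, weights):
    print(f"{name:20s} {w:.3f}")
print("consistency ratio:", round(cr, 3))       # judgements acceptable if CR < 0.10
```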

Keywords: Lampang ceramics industry, green supply chain management, analytic hierarchy process (AHP), factors affecting

Procedia PDF Downloads 323
34825 Trust and Conflict Resolution: Relationship Building for Learning

Authors: Jeff Dickie

Abstract:

This research paper combined grounded coding and research questions with the objective of investigating conflict resolution in the classroom. The students' answers concerning teaching were coded according to phrasal meanings, which revealed concepts. These concept codes then became input data for theoretical frameworks. The investigation indicated two conflicts: whether the information was valid, and whether to make the study effort, which was discussed in terms of perceptions of the teacher's competence in helping to learn. The relevant factors in helping to learn were predominantly emotional. These factors were important in the negotiation process of developing relationships. Information validity seemed to be the motivator to begin and participate effectively in the learning process. In effect, confidence in the learning negotiation process, with a focus on relationship building with the subject matter, seemed to be the motivator to make the study effort.

Keywords: coding, confidence, competence, conflict resolution, risk, trust, relationship building

Procedia PDF Downloads 416
34824 Efficient Frequent Itemset Mining Methods over Real-Time Spatial Big Data

Authors: Hamdi Sana, Emna Bouazizi, Sami Faiz

Abstract:

In recent years, there has been a huge increase in the use of spatio-temporal applications where data and queries are continuously moving. As a result, the need to process real-time spatio-temporal data is clear, and real-time stream data management has become a hot topic. The sliding window model and frequent itemset mining over dynamic data are among the most important problems in the context of data mining. The sliding window model for frequent itemset mining is a widely used model for data stream mining due to its emphasis on recent data and its bounded memory requirement. Existing methods use the traditional transaction-based sliding window model, where the window size is based on a fixed number of transactions. This model supposes that all transactions arrive at a constant rate, which is not suited to real-time applications, and the use of this model in such applications endangers their performance. Based on these observations, this paper relaxes the notion of window size and proposes the use of a timestamp-based sliding window model. In our proposed frequent itemset mining algorithm, support conditions are used to differentiate frequent and infrequent patterns. Thereafter, a tree is developed to incrementally maintain the essential information. We evaluate our contribution. The preliminary results are quite promising.
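
A minimal sketch of the timestamp-based sliding window idea follows: transactions are retained if their timestamp falls within the last span of seconds, rather than keeping a fixed number of transactions, and frequent itemsets are counted over the retained transactions. The support threshold, window span and data are illustrative, and the incremental tree structure of the paper is not reproduced.

```python
# Illustrative timestamp-based sliding window with naive itemset counting.
from collections import Counter, deque
from itertools import combinations

def update_window(window_q: deque, now: float, span: float, new_txn):
    """Append (timestamp, items) and evict transactions older than `span` seconds."""
    window_q.append(new_txn)
    while window_q and window_q[0][0] < now - span:
        window_q.popleft()

def frequent_itemsets(window_q: deque, min_support: int, max_size: int = 2):
    counts = Counter()
    for _, items in window_q:
        for size in range(1, max_size + 1):
            counts.update(combinations(sorted(items), size))
    return {iset: c for iset, c in counts.items() if c >= min_support}

w = deque()
stream = [(0.0, {"a", "b"}), (1.5, {"a", "c"}), (2.0, {"a", "b", "c"}), (4.0, {"b"})]
for ts, items in stream:
    update_window(w, now=ts, span=5.0, new_txn=(ts, items))
print(frequent_itemsets(w, min_support=2))   # itemsets frequent within the current window
```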

Keywords: real-time spatial big data, frequent itemset, transaction-based sliding window model, timestamp-based sliding window model, weighted frequent patterns, tree, stream query

Procedia PDF Downloads 147
34823 Visual Text Analytics Technologies for Real-Time Big Data: Chronological Evolution and Issues

Authors: Siti Azrina B. A. Aziz, Siti Hafizah A. Hamid

Abstract:

New approaches to analyzing and visualizing data streams on a real-time basis are important in enabling decision makers to make prompt decisions. Financial market trading and surveillance, large-scale emergency response and crowd control are some example scenarios that require real-time analytics and data visualization. This situation has led to the development of techniques and tools that support humans in analyzing the source data. With the emergence of Big Data and social media, new techniques and tools are required in order to process the streaming data. Today, a range of tools which implement some of these functionalities is available. In this paper, we present a chronological evaluation of the evolution of technologies supporting real-time analytics and visualization of data streams. Based on research papers published from 2002 to 2014, we gathered the general information, main techniques, challenges and open issues. The techniques for streaming text visualization are identified based on the Text Visualization Browser in chronological order. This paper aims to review the evolution of streaming text visualization techniques and tools, as well as to discuss the problems and challenges for each of the identified tools.

Keywords: information visualization, visual analytics, text mining, visual text analytics tools, big data visualization

Procedia PDF Downloads 388
34836 Students' Experience Enhancement through Simulation: A Process Flow in the Logistics and Transportation Field

Authors: Nizamuddin Zainuddin, Adam Mohd Saifudin, Ahmad Yusni Bahaudin, Mohd Hanizan Zalazilah, Roslan Jamaluddin

Abstract:

Students' enhanced experience through simulation is a crucial factor that brings reality to the classroom. The enhanced experience is all about developing, enriching and applying a generic process flow in the field of logistics and transportation. As educational technology has improved, the effective use of simulations has greatly increased, to the point where simulations should be considered a valuable, mainstream pedagogical tool. Additionally, in this era of ongoing (some say never-ending) assessment, simulations offer a rich resource for objective measurement and comparisons. Simulation is not just another in the long line of passing fads (or short-term opportunities) in educational technology. It is rather a real key to helping our students understand the world. It is a way for students to acquire experience of how things and systems in the world behave and react, without actually touching them. In short, it is about interactive pretending. Simulation is all about representing the real world, which includes grasping complex issues and solving intricate problems. Therefore, it is crucial, before simulating the real process of inbound and outbound logistics and transportation, that a generic process flow be developed. The paper focuses on the validation of the process flow by looking at the inputs gained from the sample. The sampling of the study focuses on multi-national and local manufacturing companies, third-party logistics companies (3PL) and a government agency, selected in Peninsular Malaysia. A simulation flow chart is proposed in the study that will be the generic flow in logistics and transportation. A qualitative approach was mainly used to gather data in the study. It was found from the study that the systems used in the outbound and inbound processes are Systems, Applications and Products (SAP) and Material Requirements Planning (MRP). Furthermore, some companies were using Enterprise Resource Planning (ERP) and Electronic Data Interchange (EDI) as part of the Suppliers Own Inventories (SOI) networking, as a result of globalized business between one country and another. Computerized documentation and transactions were all mandatory requirements of the Royal Customs and Excise Department. The generic process flow will be the basis for developing a simulation program to be used in the classroom with the objective of further enhancing the students' learning experience. Thus it will contribute to the body of knowledge on the enrichment of students' employability and will also be one of the ways to train new workers in the logistics and transportation field.

Keywords: enhancement, simulation, process flow, logistics, transportation

Procedia PDF Downloads 320
34821 Efficient Pre-Processing of Single-Cell Assay for Transposase Accessible Chromatin with High-Throughput Sequencing Data

Authors: Fan Gao, Lior Pachter

Abstract:

The primary tool currently used to pre-process 10X Chromium single-cell ATAC-seq data is Cell Ranger, which can take a very long time to run on standard datasets. To facilitate rapid pre-processing that enables reproducible workflows, we present a suite of tools called scATAK for pre-processing single-cell ATAC-seq data that is 15 to 18 times faster than Cell Ranger on mouse and human samples. Our tool can also calculate chromatin interaction potential matrices and generate open chromatin signal and interaction traces for cell groups. We used the scATAK tool to explore the chromatin regulatory landscape of a healthy adult human brain and unveil cell-type-specific features, and we show that it provides a convenient and computationally efficient approach for pre-processing single-cell ATAC-seq data.

Keywords: single-cell, ATAC-seq, bioinformatics, open chromatin landscape, chromatin interactome

Procedia PDF Downloads 146
34820 Application of FT-NIR Spectroscopy and Electronic Nose in On-line Monitoring of Dough Proofing

Authors: Madhuresh Dwivedi, Navneet Singh Deora, Aastha Deswal, H. N. Mishra

Abstract:

FT-NIR spectroscopy and an electronic nose were used to study the kinetics of dough proofing. Spectroscopy was conducted with an optic probe in the diffuse reflectance mode. The dough leavening was carried out at different temperatures (25 and 35°C) and constant RH (80%). Spectra were collected in the wavenumber range from 12,000 to 4,000 cm⁻¹ directly on the samples, every 5 min during proofing, for up to 2 hours. NIR spectra were corrected for the scatter effect, and a second-order derivative was applied to transform the spectra. Principal component analysis (PCA) was applied to the leavening process, and the process kinetics were calculated. PCA was performed on the data set and loadings were calculated. For leavening, four absorption zones (8,950-8,850, 7,200-6,800, 5,250-5,150 and 4,700-4,250 cm⁻¹) were involved in describing the process. Simultaneously, the electronic nose was also used for understanding the development of odour compounds during fermentation. The electronic nose was able to differentiate the samples on the basis of aroma generation at different times during fermentation. In order to rapidly differentiate samples based on odor, a principal component analysis was performed and successfully demonstrated in this study. The results suggest that the electronic nose and FT-NIR spectroscopy can be utilized for the online quality control of the fermentation process during the leavening of bread dough.
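
The sketch below illustrates the kind of pre-treatment and decomposition described above: synthetic spectra stand in for the FT-NIR data, a second-order derivative is taken with a Savitzky-Golay filter, and PCA scores and loadings are extracted. The window length, polynomial order and number of components are illustrative choices, not the study's settings.

```python
# Illustrative pre-treatment and PCA on synthetic spectra standing in for FT-NIR data.
import numpy as np
from scipy.signal import savgol_filter
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
wavenumbers = np.linspace(4000, 12000, 800)
spectra = np.array([np.exp(-((wavenumbers - 5200 - 10 * t) / 300) ** 2)
                    + 0.02 * rng.normal(size=wavenumbers.size)
                    for t in range(24)])                     # one spectrum every 5 min

d2 = savgol_filter(spectra, window_length=15, polyorder=2, deriv=2, axis=1)  # 2nd derivative

pca = PCA(n_components=2)
scores = pca.fit_transform(d2)                               # one score pair per time point
loadings = pca.components_                                   # spectral loadings
print("explained variance ratio:", np.round(pca.explained_variance_ratio_, 3))
print("PC1 scores (first three):", np.round(scores[:3, 0], 4))
```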

Keywords: FT-NIR, dough, e-nose, proofing, principal component analysis

Procedia PDF Downloads 375