Search results for: process data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 35196

33906 Design and Implementation of LabVIEW Based Relay Autotuning Controller for Level Setup

Authors: Manoj M. Sarode, Sharad P. Jadhav, Mukesh D. Patil, Pushparaj S. Suryawanshi

Abstract:

Although the PID controller is widely used in industrial processes, tuning its parameters is not easy: it is time consuming and requires expert personnel. Another drawback of the PID controller is that the process dynamics may change over time, for example due to variations in process load or normal wear and tear. To compensate for such changes, expert users must recalibrate the PID gains. Model-based controllers usually require a process model, but identifying one is a time-consuming job with no guarantee of model accuracy; if the identified model is inaccurate, controller performance may degrade. Model-based controllers are also quite expensive, and the whole implementation procedure can be tedious. To eliminate such issues, an autotuning PID controller becomes a vital element. A software-based relay feedback autotuning controller proves to be an efficient, upgradable and maintenance-free controller, in which the PID parameters can be obtained within a very short span of time. This paper presents the real-time implementation of a LabVIEW-based relay feedback autotuning PID controller. It was successfully developed and implemented to control the level of a laboratory setup, and its performance was analyzed for different setpoints and found satisfactory.
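
The following is a minimal sketch of the relay-feedback autotuning idea (Åström–Hägglund relay test followed by Ziegler–Nichols tuning), written in Python rather than LabVIEW. The first-order-plus-dead-time process parameters and the relay amplitude are illustrative assumptions, not values from the paper.

```python
# Relay test on a hypothetical tank level process G(s) = K*exp(-theta*s)/(tau*s + 1),
# then PID gains from the measured limit cycle. All numeric values are illustrative.
import numpy as np

def simulate_relay_test(K=2.0, tau=30.0, theta=5.0, d=1.0, dt=0.1, t_end=600.0):
    """Ideal relay around setpoint 0; returns time, process output and relay input."""
    n = int(t_end / dt)
    delay = int(theta / dt)
    y = np.zeros(n)
    u = np.zeros(n)
    for k in range(1, n):
        u[k] = d if y[k - 1] < 0 else -d          # relay switches on the control error sign
        u_delayed = u[k - delay] if k >= delay else 0.0
        y[k] = y[k - 1] + dt * (-y[k - 1] + K * u_delayed) / tau
    return np.arange(n) * dt, y, u

t, y, u = simulate_relay_test()
half = len(t) // 2                                 # use the settled part of the limit cycle
a = (y[half:].max() - y[half:].min()) / 2.0        # oscillation amplitude
crossings = np.where(np.diff(np.sign(u[half:])) > 0)[0]
Pu = np.mean(np.diff(crossings)) * (t[1] - t[0])   # ultimate period from relay switching
Ku = 4.0 * 1.0 / (np.pi * a)                       # ultimate gain Ku = 4d/(pi*a), with d = 1.0

# Ziegler-Nichols PID settings derived from the relay experiment
Kp, Ti, Td = 0.6 * Ku, 0.5 * Pu, 0.125 * Pu
print(f"Ku={Ku:.2f}, Pu={Pu:.1f}s -> Kp={Kp:.2f}, Ti={Ti:.1f}s, Td={Td:.1f}s")
```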

Keywords: autotuning, PID, liquid level control, recalibrate, LabVIEW, controller

Procedia PDF Downloads 374
33905 Project Management Agile Model Based on Project Management Body of Knowledge Guideline

Authors: Mehrzad Abdi Khalife, Iraj Mahdavi

Abstract:

This paper presents an agile model for the project management process. The Project Management Body of Knowledge (PMBOK) guideline has been selected as the platform, and a combination of computational science and artificial intelligence methodology has been added to the guideline to transform the standard into an agile project management process. The model is thus a combination of a practical standard, computational science and artificial intelligence. In this model, we present a communication model and protocols to keep the process agile, and we illustrate the collaboration of man and machine in the project management area with an artificial intelligence approach.

Keywords: artificial intelligence, conceptual model, man-machine collaboration, project management, standard

Procedia PDF Downloads 322
33904 TessPy – Spatial Tessellation Made Easy

Authors: Jonas Hamann, Siavash Saki, Tobias Hagen

Abstract:

Discretization of urban areas is a crucial aspect of many spatial analyses. The process of discretizing space into subspaces without overlaps or gaps is called tessellation. It aids the understanding of spatial structure and provides a framework for analyzing geospatial data. Tessellation methods can be divided into two groups: regular tessellations and irregular tessellations. While regular tessellation methods, like square grids or hexagon grids, are suitable for addressing pure geometry problems, they cannot take the unique characteristics of different subareas into account. Irregular tessellation methods, in contrast, allow the borders between subareas to be defined more realistically based on urban features like a road network or Points of Interest (POI). Even though Python is one of the most used programming languages for spatial analysis, there is currently no library that combines different tessellation methods to enable users and researchers to compare different techniques. To close this gap, we propose TessPy, an open-source Python package which combines all the above-mentioned tessellation methods and makes them easily accessible to everyone. The core functions of TessPy represent the five different tessellation methods: squares, hexagons, adaptive squares, Voronoi polygons, and city blocks. Using the regular methods, users can set the resolution of the tessellation, which defines the fineness of the discretization and the desired number of tiles. The irregular tessellation methods allow users to define which spatial data to consider (e.g., amenity, building, office) and how fine the tessellation should be. The spatial data used are open-source and provided by OpenStreetMap; they can be easily extracted and used for further analyses. Besides the methodology of the different techniques, the state of the art, including examples and future work, will be discussed. All dependencies can be installed using conda or pip; the former is recommended.
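
As a small illustration of the simplest (regular) case, the sketch below tiles a bounding box with non-overlapping square cells using shapely. It demonstrates the discretization concept only; it does not use the TessPy API, and the bounding box and cell size are arbitrary.

```python
# Minimal square-grid tessellation over a bounding box: no gaps, no overlaps.
import numpy as np
from shapely.geometry import box

def square_grid(minx, miny, maxx, maxy, cell_size):
    """Return a list of square polygons covering the bounding box."""
    xs = np.arange(minx, maxx, cell_size)
    ys = np.arange(miny, maxy, cell_size)
    return [box(x, y, x + cell_size, y + cell_size) for x in xs for y in ys]

# Example: roughly 500 m cells (0.005 deg) over a small urban bounding box
tiles = square_grid(8.66, 50.10, 8.72, 50.14, cell_size=0.005)
print(len(tiles), "tiles;", tiles[0].wkt)
```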

Keywords: geospatial data science, geospatial data analysis, tessellations, urban studies

Procedia PDF Downloads 110
33903 Artificial Neural Network in Predicting the Soil Response in the Discrete Element Method Simulation

Authors: Zhaofeng Li, Jun Kang Chow, Yu-Hsing Wang

Abstract:

This paper attempts to bridge soil properties and the mechanical response of soil in discrete element method (DEM) simulations. An artificial neural network (ANN) was therefore adopted, aiming to reproduce the stress-strain-volumetric response when soil properties are given. 31 biaxial shearing tests with varying soil parameters (e.g., initial void ratio and interparticle friction coefficient) were generated using DEM simulations. Based on these sets of training data, a three-layer neural network was established which can output the entire stress-strain-volumetric curve during the shearing process from the input soil parameters. Beyond the training data, 2 additional sets of data were generated to examine the validity of the network, and the stress-strain-volumetric curves for both cases were well reproduced using this network. Overall, the ANN was found promising in predicting soil behavior and reducing repetitive simulation work.
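
A hedged sketch of the mapping idea is shown below: a small feed-forward network takes soil parameters plus axial strain and outputs a stress and volumetric response, in the spirit of the paper's three-layer ANN. The data are synthetic placeholders generated from a toy relation, not DEM results.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n = 2000
e0 = rng.uniform(0.55, 0.85, n)          # initial void ratio
mu = rng.uniform(0.1, 0.7, n)            # interparticle friction coefficient
eps = rng.uniform(0.0, 0.15, n)          # axial strain along the shearing path
X = np.column_stack([e0, mu, eps])

# Toy "response": hyperbolic stress-strain curve plus a dilation-like volumetric term
q = (mu / (0.9 - e0)) * eps / (0.01 + eps)
ev = 0.3 * eps * (e0 - 0.7) - 0.05 * mu * eps**2
Y = np.column_stack([q, ev])

# One hidden layer, i.e. a three-layer network (input, hidden, output)
model = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
model.fit(X, Y)
print("R^2 on training data:", round(model.score(X, Y), 3))
```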

Keywords: artificial neural network, discrete element method, soil properties, stress-strain-volumetric response

Procedia PDF Downloads 378
33902 Public Participation as a Social Inclusion Tool in the Urban Planning Process: A Case Study of Abuja, Nigeria

Authors: Nwachi Prosper Louis, Cynthia Ogonna Ikesee

Abstract:

The urban planning system of cities varies by country, but in general, it is an instrument for establishing long-term sustainable frameworks and plans for social, institutional and economic development. In most communities there is limited knowledge, development, and implementation of effective and sustainable urban planning structures and plans that encourage social inclusion. This has led to social, economic and environmental deficiencies resulting in community isolation and segregation by class, ethnicity, and race. Encouraging public participation in the urban planning process is one of the instruments that cities can utilise to achieve better social inclusion outcomes. This paper explores how public participation can be used as a social inclusion tool in the urban planning process to achieve better outcomes in Abuja's urban planning system. The purpose of this study is to investigate the effectiveness of this approach. A conceptual model was also developed to evaluate the relationship between public participation and social inclusion outcomes in the urban planning process. Every community has its own way of life and challenges, and an understanding of these societal needs is paramount in the urban planning process. Therefore, the involvement of the public in identifying their needs, selecting priorities and identifying strategies offers a better chance of developing solutions that are sustainable, feasible and implementable.

Keywords: public participation, social inclusion, urban planning, urban planning process

Procedia PDF Downloads 170
33901 System-Driven Design Process for Integrated Multifunctional Movable Concepts

Authors: Oliver Bertram, Leonel Akoto Chama

Abstract:

In today's civil transport aircraft, the design of flight control systems is based on the experience gained from previous aircraft configurations, with a clear distinction between primary and secondary flight control functions for controlling the aircraft attitude and trajectory. Significant system improvements are now seen particularly in multifunctional moveable concepts, where the flight control functions are no longer considered separate but integral. This allows new functions to be implemented in order to improve the overall aircraft performance. However, the classical design process of flight controls is sequential and insufficiently interdisciplinary. In particular, the systems discipline is involved only rudimentarily in the early phase. In many cases, the task of systems design is limited to meeting the requirements of the upstream disciplines, which may lead to integration problems later. For this reason, approaching the design with incremental development is required to reduce the risk of a complete redesign. Although the potential and the path to multifunctional moveable concepts have been shown, the complete re-engineering of aircraft concepts with less classical moveable concepts is associated with a considerable design risk due to the lack of design methods. This represents an obstacle to major leaps in technology. This gap in the state of the art widens further if, in the future, unconventional aircraft configurations are to be considered, where no reference data or architectures are available. This means that the experience-based approach used for conventional configurations is of limited use and not applicable to the next generation of aircraft. In particular, there is a need for methods and tools for a rapid trade-off between new multifunctional flight control system architectures. To close this gap in the state of the art, an integrated system-driven design process for multifunctional flight control systems of non-classical aircraft configurations is presented. The overall goal of the design process is to find optimal solutions for single or combined target criteria quickly within the very large solution space for the flight control system. In contrast to the state of the art, all disciplines are involved in a holistic design in an integrated rather than a sequential process. To emphasize the systems discipline, this paper focuses on the methodology for designing moveable actuation systems within this integrated design process of multifunctional moveables. The methodology includes different approaches for creating system architectures, component design methods, and the process outputs necessary to evaluate the systems. An application example of a reference configuration is used to demonstrate the process and validate the results. For this, new unconventional hydraulic and electrical flight control system architectures are calculated, which result from the higher requirements of multifunctional moveable concepts. In addition to typical key performance indicators such as mass and power requirements, the results regarding the feasibility and wing integration aspects of the system components are examined and discussed. This is intended to show how systems design can influence and drive the wing and overall aircraft design.

Keywords: actuation systems, flight control surfaces, multi-functional movables, wing design process

Procedia PDF Downloads 127
33900 Challenges of School Leadership

Authors: Stefan Ninković

Abstract:

The main purpose of this paper is to examine different theoretical approaches and relevant empirical evidence and thus recognize some of the most pressing challenges faced by school leaders. This paper starts from the fact that the new mission of the school is characterized by the need for stronger coordination among students' academic, social and emotional learning. In this sense, school leaders need to focus their commitment, vision and leadership on issues of students' attitudes, language, cultural and social background, and sexual orientation. More specifically, they should know what good teaching is for students at risk, students whose first language is not dominant in school, those whose learning styles are not in accordance with usual teaching styles, or those who are stigmatized. There is a rather wide consensus that the traditionally popular concept of the school principal's instructional leadership is no longer sufficient. However, in a number of "pro-leadership" circles, including certain groups of academic researchers, consultants and practitioners, there is an established tendency to attribute to the school principal an extraordinary influence on school achievement. On the other hand, a situation in which all employees in the school are leaders is a utopia par excellence. Although leadership can obviously be distributed efficiently across the school, there are few findings that speak to the sources of this distribution and the factors making it sustainable. Another idea that is not particularly new, but has only recently gained in importance, is that the collective capacity of the school is an important resource that often remains under-cultivated. To understand the nature and power of collaborative school cultures, it is necessary to know that they operate in a way that makes all of their members' tacit knowledge explicit. In this sense, the question is how leaders in schools can shape collaborative culture and create social capital in the school. The pressure exerted on schools to systematically collect and use data has been accompanied by the need for school leaders to develop new competencies. The role of school leaders is critical in the process of assessing what data are needed and for what purpose. Different types of data are important: test results, data on students' absenteeism, satisfaction with school, teacher motivation, etc. One of the most important tasks of school leaders is data-driven decision making, as well as ensuring transparency of the decision-making process. Finally, the question arises whether the existing models of school leadership are compatible with current social and economic trends. It is necessary to examine whether and under what conditions schools need forms of leadership different from those that currently prevail. Closely related to this issue is the analysis of the adequacy of different approaches to leadership development in the school.

Keywords: educational changes, leaders, leadership, school

Procedia PDF Downloads 316
33899 Big Data Analysis with RHadoop

Authors: Ji Eun Shin, Byung Ho Jung, Dong Hoon Lim

Abstract:

It is almost impossible to store or analyze big data, which is increasing exponentially, with traditional technologies. Hadoop is a new technology that makes this possible, and the R programming language is by far the most popular statistical tool for big data analysis based on distributed processing with Hadoop. With RHadoop, which integrates the R and Hadoop environments, we implemented parallel multiple regression analysis with actual data of different sizes. Experimental results showed that our RHadoop system became much faster as the number of data nodes increased. We also compared the performance of our RHadoop with the lm function and the biglm package based on bigmemory. The results showed that our RHadoop was faster than the other packages owing to parallel processing, with the number of map tasks increasing as the data size grows.
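
The sketch below illustrates the map-reduce style of parallel multiple regression conceptually, in Python rather than RHadoop: each "map" task computes partial X'X and X'y sums over its chunk, and the "reduce" step combines them and solves the normal equations. The data are synthetic and the chunking is local, so this stands in for the distributed setting only.

```python
import numpy as np

rng = np.random.default_rng(1)
X = np.column_stack([np.ones(100_000), rng.normal(size=(100_000, 3))])
beta_true = np.array([2.0, 1.0, -0.5, 0.3])
y = X @ beta_true + rng.normal(scale=0.1, size=100_000)

def map_task(X_chunk, y_chunk):
    # Each mapper emits its local sufficient statistics.
    return X_chunk.T @ X_chunk, X_chunk.T @ y_chunk

chunks = np.array_split(np.arange(len(y)), 8)          # 8 "map tasks"
partials = [map_task(X[idx], y[idx]) for idx in chunks]

# Reduce: sum the partials and solve for the regression coefficients.
XtX = sum(p[0] for p in partials)
Xty = sum(p[1] for p in partials)
beta_hat = np.linalg.solve(XtX, Xty)
print(np.round(beta_hat, 3))
```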

Keywords: big data, Hadoop, parallel regression analysis, R, RHadoop

Procedia PDF Downloads 416
33898 Adjusting Electricity Demand Data to Account for the Impact of Loadshedding in Forecasting Models

Authors: Migael van Zyl, Stefanie Visser, Awelani Phaswana

Abstract:

The electricity landscape in South Africa is characterized by frequent occurrences of loadshedding, a measure implemented by Eskom to manage electricity generation shortages by curtailing demand. Loadshedding, classified into stages ranging from 1 to 8 based on severity, involves the systematic rotation of power cuts across municipalities according to predefined schedules. However, this practice introduces distortions in recorded electricity demand, posing challenges to accurate forecasting, which is essential for budgeting, network planning, and generation scheduling. Addressing this challenge requires the development of a methodology to quantify the impact of loadshedding and integrate it back into metered electricity demand data. Fortunately, comprehensive records of loadshedding impacts are maintained in a database, enabling the alignment of loadshedding effects with hourly demand data. This adjustment ensures that forecasts accurately reflect true demand patterns, independent of loadshedding's influence, thereby enhancing the reliability of electricity supply management in South Africa. This paper presents a methodology for determining the hourly impact of loadshedding and subsequently adjusting historical demand data to account for it. Furthermore, two forecasting models are developed: one utilizing the original dataset and the other using the adjusted data. A comparative analysis is conducted to evaluate the forecast accuracy improvements resulting from the adjustment process. By implementing this methodology, stakeholders can make more informed decisions regarding electricity infrastructure investments, resource allocation, and operational planning, contributing to the overall stability and efficiency of South Africa's electricity supply system.
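
A hedged sketch of the adjustment step follows: the recorded loadshedding impact is added back onto metered hourly demand so that forecasting models are trained on "true" demand. The column names (timestamp, metered_mw, curtailed_mw) and the numbers are hypothetical, not taken from the paper's database.

```python
import pandas as pd

demand = pd.DataFrame({
    "timestamp": pd.date_range("2023-06-01", periods=6, freq="h"),
    "metered_mw": [2400, 2350, 2200, 2100, 2300, 2450],
})
loadshedding = pd.DataFrame({
    "timestamp": pd.date_range("2023-06-01 02:00", periods=2, freq="h"),
    "curtailed_mw": [300, 350],          # estimated demand removed by stage-based cuts
})

# Align loadshedding impacts with hourly metered demand and reconstruct true demand
adjusted = demand.merge(loadshedding, on="timestamp", how="left").fillna({"curtailed_mw": 0})
adjusted["true_demand_mw"] = adjusted["metered_mw"] + adjusted["curtailed_mw"]
print(adjusted[["timestamp", "metered_mw", "curtailed_mw", "true_demand_mw"]])
```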

Keywords: electricity demand forecasting, load shedding, demand side management, data science

Procedia PDF Downloads 41
33897 A Mutually Exclusive Task Generation Method Based on Data Augmentation

Authors: Haojie Wang, Xun Li, Rui Yin

Abstract:

To address memorization overfitting in the meta-learning MAML algorithm, a method of generating mutually exclusive tasks based on data augmentation is proposed. This method generates a mutex task by mapping one feature of the data to multiple labels, so that the generated mutex task is inconsistent with the data distribution in the initial dataset. Because generating mutex tasks for all data would produce a large amount of invalid data and, in the worst case, lead to exponential growth of computation, this paper also proposes a key data extraction method that extracts only part of the data to generate the mutex tasks. The experiments show that the proposed mutex task generation method can effectively alleviate memorization overfitting in the meta-learning MAML algorithm.
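
Below is a minimal sketch of the mutually exclusive task idea: the same inputs are assigned different label mappings in different tasks, so no single input-to-label rule works across tasks and the meta-learner cannot simply memorize. This illustrates the general principle only, not the paper's exact procedure or its key data extraction step.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_mutex_tasks(X, y, n_tasks, n_classes):
    """Each task reuses the same data but with an independent random label permutation."""
    tasks = []
    for _ in range(n_tasks):
        perm = rng.permutation(n_classes)      # one feature/input now maps to a new label
        tasks.append((X, perm[y]))
    return tasks

X = rng.normal(size=(20, 8))                   # toy features
y = rng.integers(0, 4, size=20)                # toy labels over 4 classes
tasks = make_mutex_tasks(X, y, n_tasks=3, n_classes=4)
for i, (_, yt) in enumerate(tasks):
    print(f"task {i}: labels for first 5 samples -> {yt[:5]}")
```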

Keywords: data augmentation, mutex task generation, meta-learning, text classification

Procedia PDF Downloads 77
33896 Software User Experience Enhancement through Collaborative Design

Authors: Shan Wang, Fahad Alhathal, Daniel Hobson

Abstract:

User-centered design skills play an important role in crafting a positive and intuitive user experience for software applications. Embracing a user-centric design approach involves understanding the needs, preferences, and behaviors of the end-users throughout the design process. This mindset not only enhances the usability of the software but also fosters a deeper connection between the digital product and its users. This paper covers a 6-month knowledge exchange collaboration project between an academic institution and an industry partner in 2023, which aimed to improve the user experience of a digital platform used as a knowledge management tool: to understand users' preferences for features, identify sources of frustration, and pinpoint areas for enhancement. This research applied one of the most effective methods for implementing user-centered design, co-design workshops for testing the user onboarding experience, which involve the active participation of users in the design process. More specifically, in January 2023, we organized eight workshops with a diverse group of 11 individuals. Throughout these sessions, we accumulated a total of 11 hours of qualitative data in both video and audio formats. Subsequently, we conducted an analysis of user journeys, identifying common issues and potential areas for improvement. This analysis was pivotal in guiding the knowledge management software in prioritizing feature enhancements and design improvements. Employing a user-centered design thinking process, we developed a series of graphic design solutions in collaboration with the software management tool company. These solutions targeted the refinement of onboarding user experiences, workplace interfaces, and interactive design, and some of them were translated into tangible interfaces for the knowledge management tool. By actively involving users in the design process and valuing their input, developers can create products that are not only functional but also resonate with the end-users, ultimately leading to greater success in the competitive software landscape. In conclusion, this paper not only contributes insights into designing onboarding user experiences for software within a co-design approach but also presents key theories on leveraging the user-centered design process in software design to enhance overall user experiences.

Keywords: user experiences, co-design, design process, knowledge management tool, user-centered design

Procedia PDF Downloads 41
33895 Efficient Positioning of Data Aggregation Point for Wireless Sensor Network

Authors: Sifat Rahman Ahona, Rifat Tasnim, Naima Hassan

Abstract:

Data aggregation is a helpful technique for reducing the data communication overhead in wireless sensor networks. One of the important tasks in data aggregation is the positioning of the aggregator points. Although much work has been done on data aggregation, the efficient positioning of the aggregation points has received little attention. In this paper, the authors focus on the positioning, or placement, of the aggregation points in a wireless sensor network and propose an algorithm for selecting aggregator positions in a scenario where aggregator nodes are more powerful than sensor nodes.
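
As an illustrative baseline (not the paper's algorithm), one common way to position a small number of powerful aggregator nodes is to cluster the sensor coordinates and place an aggregator at each cluster centroid, which keeps sensor-to-aggregator distances short. The field size and node counts below are arbitrary.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
sensors = rng.uniform(0, 100, size=(200, 2))       # sensor (x, y) positions in a 100x100 field

k = 5                                              # number of aggregation points
km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(sensors)
aggregators = km.cluster_centers_

# Average distance from each sensor to its nearest aggregator
dists = np.linalg.norm(sensors[:, None] - aggregators[None], axis=2)
print("aggregator positions:\n", np.round(aggregators, 1))
print("mean sensor-to-aggregator distance:", round(dists.min(axis=1).mean(), 2))
```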

Keywords: aggregation point, data communication, data aggregation, wireless sensor network

Procedia PDF Downloads 142
33894 Geospatial Analysis for Predicting Sinkhole Susceptibility in Greene County, Missouri

Authors: Shishay Kidanu, Abdullah Alhaj

Abstract:

Sinkholes in the karst terrain of Greene County, Missouri, pose significant geohazards, imposing challenges on construction and infrastructure development, with potential threats to lives and property. To address these issues, understanding the influencing factors and modeling sinkhole susceptibility is crucial for effective mitigation through strategic changes in land use planning and practices. This study utilizes geographic information system (GIS) software to collect and process diverse data, including topographic, geologic, hydrogeologic, and anthropogenic information. Nine key sinkhole influencing factors, ranging from slope characteristics to proximity to geological structures, were carefully analyzed. The frequency ratio method establishes relationships between attribute classes of these factors and sinkhole events, deriving class weights that indicate their relative importance. Weighted integration of these factors is accomplished using the Analytic Hierarchy Process (AHP) and the Weighted Linear Combination (WLC) method in a GIS environment, resulting in a comprehensive sinkhole susceptibility index (SSI) model for the study area. Employing the Jenks natural breaks classification method, the SSI values are categorized into five distinct sinkhole susceptibility zones: very low, low, moderate, high, and very high. Validation of the model, conducted through the Area Under Curve (AUC) and Sinkhole Density Index (SDI) methods, demonstrates a robust correlation with sinkhole inventory data. The prediction rate curve yields an AUC value of 74%, indicating a 74% validation accuracy. The SDI result further supports the success of the sinkhole susceptibility model. The model offers reliable predictions of the future distribution of sinkholes, providing valuable insights for planners and engineers in the formulation of development plans and land-use strategies. Its application extends to enhancing preparedness and minimizing the impact of sinkhole-related geohazards on both infrastructure and the community.
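
The sketch below illustrates the frequency-ratio and weighted-linear-combination steps: class weights come from the ratio of sinkhole density within a factor class to the overall density, and the factor layers are combined with AHP-style weights into a susceptibility index. The rasters, classes and weights here are synthetic placeholders, not Greene County data.

```python
import numpy as np

def frequency_ratio(class_ids, sinkhole_mask):
    """FR per class = (% of sinkholes in class) / (% of area in class)."""
    fr = {}
    total_cells = class_ids.size
    total_sinks = sinkhole_mask.sum()
    for c in np.unique(class_ids):
        in_class = class_ids == c
        pct_area = in_class.sum() / total_cells
        pct_sink = sinkhole_mask[in_class].sum() / total_sinks
        fr[c] = pct_sink / pct_area
    return fr

rng = np.random.default_rng(7)
slope_class = rng.integers(1, 4, size=(100, 100))          # reclassified slope layer (3 classes)
depth_class = rng.integers(1, 4, size=(100, 100))          # reclassified depth-to-bedrock layer
sinkholes = rng.random((100, 100)) < 0.02                   # synthetic sinkhole inventory raster

fr_slope = frequency_ratio(slope_class, sinkholes)
fr_depth = frequency_ratio(depth_class, sinkholes)

# Weighted linear combination with illustrative AHP weights (sum to 1)
w_slope, w_depth = 0.6, 0.4
ssi = (w_slope * np.vectorize(fr_slope.get)(slope_class)
       + w_depth * np.vectorize(fr_depth.get)(depth_class))
print("SSI range:", round(ssi.min(), 2), "to", round(ssi.max(), 2))
```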

Keywords: sinkhole, GIS, analytical hierarchy process, frequency ratio, susceptibility, Missouri

Procedia PDF Downloads 59
33893 Spatial Econometric Approaches for Count Data: An Overview and New Directions

Authors: Paula Simões, Isabel Natário

Abstract:

This paper reviews a number of theoretical aspects of implementing an explicitly spatial perspective in econometrics for modelling non-continuous data in general, and count data in particular. It provides an overview of the several spatial econometric approaches available to model data that are collected with reference to location in space, from classical spatial econometrics to recent developments in spatial econometrics for count data in a Bayesian hierarchical setting. Considerable attention is paid to the inferential framework necessary for structurally consistent spatial econometric count models incorporating spatial lag autocorrelation, to the corresponding estimation and testing procedures under different assumptions, and to the constraints and implications embedded in the various specifications in the literature. This review combines insights from the classical spatial econometrics literature as well as from hierarchical modeling and analysis of spatial data, in order to look for possible new directions in the processing of count data in a spatial hierarchical Bayesian econometric context.

Keywords: spatial data analysis, spatial econometrics, Bayesian hierarchical models, count data

Procedia PDF Downloads 572
33892 CMMI Key Process Areas and FDD Practices

Authors: Rituraj Deka, Nomi Baruah

Abstract:

The development of information technology over the past few years has resulted in the design of more and more complex software. The outsourcing of software development places higher demands on the management of software development projects. Various software enterprises follow various paths in their pursuit of excellence, applying various principles, methods and techniques along the way. New research shows that organizations can benefit from using CMMI and Agile methodologies together, with the potential to dramatically improve business performance. This paper describes a mapping between CMMI key process areas (KPAs) and the Feature-Driven Development (FDD) communication perspective, so as to increase the understanding of how improvements can be made in the software development process.

Keywords: Agile, CMMI, FDD, KPAs

Procedia PDF Downloads 441
33891 Practice on Design Knowledge Management and Transfer across the Life Cycle of a New-Built Nuclear Power Plant in China

Authors: Danying Gu, Xiaoyan Li, Yuanlei He

Abstract:

As a knowledge-intensive industry, the nuclear industry places a high value on safety and quality. The life cycle of a nuclear power plant (NPP) can last 100 years, from initial research and design to decommissioning. How to implement high-quality knowledge management and how to contribute to a safer, more advanced and more economical NPP are the most important issues and responsibilities for knowledge management. As leaders of the nuclear industry, nuclear research and design institutes hold competitive advantages in advanced technology, knowledge and information, so the design knowledge management (DKM) of a nuclear research and design institute is the core of knowledge management in the whole nuclear industry. In this paper, the study and practice of DKM and knowledge transfer across the life cycle of a new-built NPP in China are introduced. For this digital, intelligent NPP, the whole design process is based on a digital design platform which includes an NPP engineering and design dynamic analyzer, a visualization engineering verification platform, a digital operation and maintenance support platform, and a digital equipment design and manufacture integrated collaborative platform. In order to transfer all design data and information across design, construction, commissioning and operation, the overall architecture of the new-built digital NPP should become a modern knowledge management system, so a digital information transfer model across the NPP life cycle is proposed in this paper. The challenges related to design knowledge transfer are also discussed, such as digital information handover, the data center and data sorting, and a unified data coding system. On the other hand, effective delivery of design information during the construction and operation phases will contribute to a comprehensive understanding of design ideas, components and systems for the construction contractor and the operating unit, largely increasing the safety, quality and economic benefits over the life cycle. The operation and maintenance records generated during NPP operation are of great significance for maintaining the operating state of the NPP, especially regarding the comprehensiveness, validity and traceability of the records. Therefore, the requirements of an online monitoring and smart diagnosis system for the NPP are also proposed, to help utility owners improve safety and efficiency.

Keywords: design knowledge management, digital nuclear power plant, knowledge transfer, life cycle

Procedia PDF Downloads 259
33890 Nanoparticle Based Green Inhibitor for Corrosion Protection of Zinc in Acidic Medium

Authors: Neha Parekh, Divya Ladha, Poonam Wadhwani, Nisha Shah

Abstract:

Nanoscaled materials have attracted tremendous interest as corrosion inhibitors due to their high surface area on metal surfaces. It is well known that zinc oxide nanoparticles have high reactivity towards aqueous acidic solutions. This work presents a new method of incorporating zinc oxide nanoparticles with white sesame seed extract (nano-green inhibitor) for corrosion protection of zinc in an acidic medium. The morphology of the zinc oxide nanoparticles was investigated by TEM and DLS. The corrosion inhibition efficiency of the green inhibitor and the nano-green inhibitor was determined by gravimetric and electrochemical impedance spectroscopy (EIS) methods. Gravimetric measurements suggested that the nano-green inhibitor is more effective than the green inhibitor. Furthermore, with increasing temperature, the inhibition efficiency increases for both inhibitors. In addition, it was established that the Temkin adsorption isotherm fits the experimental data well for both inhibitors. The effect of temperature and the Temkin adsorption isotherm revealed a chemisorption mechanism operating in the system. The activation energy (Ea) and other thermodynamic parameters of the inhibition process were calculated. The EIS data showed that charge transfer controls the corrosion process. The surface morphology of the zinc specimens in the absence and presence of the green inhibitor and the nano-green inhibitor was examined using scanning electron microscopy (SEM) and atomic force microscopy (AFM). The outcomes indicated the formation of a protective layer over the zinc specimen.
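
The following is a hedged sketch of the gravimetric analysis step: inhibition efficiency from weight loss, followed by a simple linear fit of surface coverage against ln(concentration) in the linearized Temkin-isotherm form. All concentrations and weight-loss values are illustrative, not the paper's measurements.

```python
import numpy as np

conc = np.array([0.05, 0.1, 0.2, 0.4, 0.8])              # inhibitor concentration (g/L), synthetic
w_blank = 0.120                                           # weight loss without inhibitor (g)
w_inh = np.array([0.060, 0.048, 0.038, 0.030, 0.024])    # weight loss with inhibitor (g), synthetic

ie = 100.0 * (w_blank - w_inh) / w_blank                  # inhibition efficiency (%)
theta = ie / 100.0                                        # surface coverage

# Temkin isotherm in linearized form: theta is linear in ln(C)
slope, intercept = np.polyfit(np.log(conc), theta, 1)
print("IE% :", np.round(ie, 1))
print(f"Temkin fit: theta = {intercept:.3f} + {slope:.3f} * ln(C)")
```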

Keywords: corrosion, green inhibitor, nanoparticles, zinc

Procedia PDF Downloads 429
33889 A NoSQL-Based Approach for Real-Time Management of Robotics Data

Authors: Gueidi Afef, Gharsellaoui Hamza, Ben Ahmed Samir

Abstract:

This paper deals with the continual growth of data, for which new data management solutions have emerged: NoSQL databases. They span several areas such as personalization, profile management, big data in real time, content management, catalogs, customer views, mobile applications, the Internet of Things, digital communication and fraud detection. Nowadays, these database management systems are increasingly adopted. These systems store data very well, and with the trend of big data, new storage challenges demand new structures and methods for managing enterprise data. The new intelligent machines in the e-learning sector thrive on more data, so smart machines can learn more and faster. Robotics is the use case on which we focus our tests. The implementation of NoSQL for robotics wrestles all the data robots acquire into a usable form, because with ordinary approaches to robotics we face severe limits in managing and finding the exact information in real time. Our proposed approach was demonstrated by experimental studies and a running example used as a use case.
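
A hedged sketch of storing and querying robot telemetry in a document-oriented NoSQL store (MongoDB via pymongo) is shown below. It assumes a MongoDB instance is reachable at localhost:27017, and the database, collection and field names are illustrative rather than taken from the paper.

```python
from datetime import datetime, timezone
from pymongo import MongoClient, DESCENDING

client = MongoClient("mongodb://localhost:27017")
telemetry = client["robotics"]["telemetry"]

# Schemaless insert: each robot can push readings with its own document structure.
telemetry.insert_one({
    "robot_id": "arm-01",
    "ts": datetime.now(timezone.utc),
    "joint_angles": [0.12, 1.05, -0.33],
    "gripper_force_n": 4.7,
})

# Real-time style query: most recent reading for a given robot.
latest = telemetry.find({"robot_id": "arm-01"}).sort("ts", DESCENDING).limit(1)
for doc in latest:
    print(doc["ts"], doc["joint_angles"])
```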

Keywords: NoSQL databases, database management systems, robotics, big data

Procedia PDF Downloads 330
33888 Increasing the Mastery of Kanji with Language Learning Strategies through Multimedia

Authors: Sherly Ferro Lensun, Donal Matheos Ratu, Elni Jeini Usoh, Helena M. L. Pandi, Mayske Rinny Liando

Abstract:

This study aims to gain a deep understanding of the process of, and the improvement resulting from, mastering Kanji with language learning strategies through multimedia. The research aims to obtain scientific data on the process and results of improving Kanji mastery by using the Chokusetsu strategy in Kanji learning. The method used in this research is action research as developed by Kemmis and McTaggart, known as the spiral model. This model consists of the following stages: planning, implementation, observation, and reflection. The research results in the following findings: (1) Kanji mastery comprises four major aspects, namely reading, writing, use in sentences, and memorizing, and these aspects show gradual improvement over time. (2) Students participate more in learning activities, which can be identified from positive behaviours such as responding when finishing exercises in class. (3) Students pay better attention to the lesson, shown by actively asking more questions or requesting more explanation from the lecturers, memorizing Kanji cards, finishing the task of making a Kanji card/house, doing the exercises more seriously, and finishing homework assignments punctually. (4) More attractive learning activities and tasks, in the form of more engaging colours and pictures, enable students to conduct self-evaluation of their learning process.

Keywords: Kanji, action research, language learning strategies, multimedia

Procedia PDF Downloads 159
33887 Evaluation and Assessment of Bioinformatics Methods and Their Applications

Authors: Fatemeh Nokhodchi Bonab

Abstract:

Bioinformatics, in its broad sense, involves the application of computer processes to solve biological problems. A wide range of computational tools is needed to effectively and efficiently process the large amounts of data being generated as a result of recent technological innovations in biology and medicine. A number of computational tools have been developed or adapted to deal with the experimental riches of complex and multivariate data and the transition from data collection to information or knowledge. These bioinformatics tools are being evaluated and applied in various medical areas, including early detection, risk assessment, classification, and prognosis of cancer. The goal of these efforts is to develop and identify bioinformatics methods with optimal sensitivity, specificity, and predictive capabilities. The recent flood of data from genome sequences and functional genomics has given rise to a new field, bioinformatics, which combines elements of biology and computer science. Bioinformatics conceptualizes biology in terms of macromolecules (in the sense of physical chemistry) and then applies "informatics" techniques (derived from disciplines such as applied mathematics, computer science, and statistics) to understand and organize the information associated with these molecules on a large scale. Here we propose a definition for this new field and review some of the research being pursued, particularly in relation to transcriptional regulatory systems.

Keywords: methods, applications, transcriptional regulatory systems, techniques

Procedia PDF Downloads 105
33886 Printed Thai Character Recognition Using Particle Swarm Optimization Algorithm

Authors: Phawin Sangsuvan, Chutimet Srinilta

Abstract:

This paper presents the application of the Particle Swarm Optimization (PSO) method to Thai optical character recognition (OCR). OCR consists of pre-processing, character recognition and post-processing; before entering the recognition stage, each character must be prepared by the pre-processing stage. PSO is an optimization method that belongs to the swarm intelligence family, based on the imitation of the social behavior patterns of animals. The route of each particle is determined by its own data and by data from neighboring particles; this interaction with neighbors is the advantage of particle swarms in finding the best solution. PSO has therefore attracted many researchers working on difficult problems, including character recognition. As in previous work, this research used a projection histogram to extract printed digit features and defined a simple fitness function for PSO. The results reveal that PSO achieves 67.73% accuracy on the testing dataset, so future work can explore improving the performance of PSO with a better fitness function.
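
A generic global-best PSO sketch is given below to illustrate the optimizer used in the paper. The fitness function here is a simple placeholder (distance between a candidate feature vector and a stored projection-histogram template) and the swarm parameters are conventional defaults, not the paper's actual recognition fitness or settings.

```python
import numpy as np

rng = np.random.default_rng(0)
template = rng.random(16)                       # a stored projection-histogram template

def fitness(x):
    return np.sum((x - template) ** 2)          # lower is better

n_particles, dim, iters = 30, 16, 200
w, c1, c2 = 0.7, 1.5, 1.5                       # inertia and acceleration coefficients

pos = rng.random((n_particles, dim))
vel = np.zeros((n_particles, dim))
pbest = pos.copy()
pbest_val = np.array([fitness(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(iters):
    r1, r2 = rng.random((n_particles, dim)), rng.random((n_particles, dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    vals = np.array([fitness(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()    # global best guides the whole swarm

print("best fitness:", round(pbest_val.min(), 6))
```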

Keywords: character recognition, histogram projection, particle swarm optimization, pattern recognition techniques

Procedia PDF Downloads 457
33885 Interior Design Pedagogy in the 21st Century: Personalised Design Process

Authors: Roba Zakariah Shaheen

Abstract:

In the 21st century, interior design pedagogy has developed rapidly due to social and economic factors. This paper presents research findings that show a significant relationship between educators and students in interior design education: students' personal traits, design process, and thinking process are significantly interrelated. Constructively, this paper presents how personal traits can guide educators in the interior design education domain to develop students' thinking processes. At the same time, it demonstrates how students should use their own personal traits to create their own design process. Constructivism is the theory underlying this research, as it supports grounded theory, which is the methodological approach of this research. Moreover, the Myers-Briggs Type Indicator was used to investigate personality traits scientifically, as a psychological strategy related to cognitive ability. Conclusions from this research strongly recommend that educators and students utilize their personal traits to foster interior design education.

Keywords: interior design, pedagogy, constructivism, grounded theory, personality traits, creativity

Procedia PDF Downloads 190
33884 Fuzzy Optimization Multi-Objective Clustering Ensemble Model for Multi-Source Data Analysis

Authors: C. B. Le, V. N. Pham

Abstract:

In modern data analysis, multi-source data appears more and more in real applications, and multi-source data clustering has emerged as an important issue in the data mining and machine learning community. Different data sources provide different information about the data, so linking multiple sources is essential to improve clustering performance. However, in practice multi-source data are often heterogeneous, uncertain, and large, which is considered a major challenge of multi-source data. Ensemble learning is a versatile machine learning approach in which learning techniques can work in parallel on big data, and clustering ensembles have been shown to outperform any standard clustering algorithm in terms of accuracy and robustness. However, most traditional clustering ensemble approaches are based on a single-objective function and single-source data. This paper proposes a new clustering ensemble method for multi-source data analysis: the fuzzy optimized multi-objective clustering ensemble method, called FOMOCE. Firstly, a clustering ensemble mathematical model based on the structure of the multi-objective clustering function, multi-source data, and dark knowledge is introduced. Then, rules for extracting dark knowledge from the input data, clustering algorithms, and base clusterings are designed and applied. Finally, a clustering ensemble algorithm is proposed for multi-source data analysis. The experiments were performed on standard sample data sets. The experimental results demonstrate the superior performance of the FOMOCE method compared to existing clustering ensemble methods and multi-source clustering methods.
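
For orientation, the sketch below shows a generic clustering-ensemble baseline (evidence accumulation via a co-association matrix): several base clusterings are combined into a consensus partition. It is a standard single-objective illustration of the ensemble idea, not the FOMOCE method itself, and the dataset is synthetic.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

X, _ = make_blobs(n_samples=300, centers=3, random_state=0)

# Base clusterings (different cluster counts / seeds act as ensemble members)
base_labels = [KMeans(n_clusters=k, n_init=10, random_state=s).fit_predict(X)
               for k, s in [(3, 0), (4, 1), (5, 2)]]

# Co-association matrix: fraction of base clusterings that place points i and j together
n = len(X)
coassoc = np.zeros((n, n))
for labels in base_labels:
    coassoc += (labels[:, None] == labels[None, :]).astype(float)
coassoc /= len(base_labels)

# Consensus clustering on 1 - co-association, treated as a distance matrix
dist = 1.0 - coassoc
np.fill_diagonal(dist, 0.0)
Z = linkage(squareform(dist), method="average")
consensus = fcluster(Z, t=3, criterion="maxclust")
print("consensus cluster sizes:", np.bincount(consensus)[1:])
```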

Keywords: clustering ensemble, multi-source, multi-objective, fuzzy clustering

Procedia PDF Downloads 165
33883 Index t-SNE: Tracking Dynamics of High-Dimensional Datasets with Coherent Embeddings

Authors: Gaelle Candel, David Naccache

Abstract:

t-SNE is an embedding method that the data science community has used widely. It helps with two main tasks: displaying results by coloring items according to their class or a feature value, and, for forensics, giving a first overview of the dataset distribution. Two interesting characteristics of t-SNE are its structure preservation property and its answer to the crowding problem, whereby all neighbors in high-dimensional space cannot be represented correctly in low-dimensional space. t-SNE preserves the local neighborhood, and similar items are nicely spaced by adjusting to the local density. These two characteristics produce a meaningful representation, in which the cluster area is proportional to its size in number of items, and relationships between clusters are materialized by closeness in the embedding. The algorithm is non-parametric: the transformation from high- to low-dimensional space is described but not learned, so two initializations of the algorithm lead to two different embeddings. In a forensic approach, analysts would like to compare two or more datasets using their embeddings. A naive approach would be to embed all datasets together; however, this process is costly, as the complexity of t-SNE is quadratic, and it would be infeasible for too many datasets. Another approach would be to learn a parametric model over an embedding built with a subset of data. While this approach is highly scalable, points could be mapped at exactly the same position, making them indistinguishable, and this type of model would be unable to adapt to new outliers or to concept drift. This paper presents a methodology for reusing an embedding to create a new one in which cluster positions are preserved. The optimization process minimizes two costs, one relative to the embedding shape and the second relative to the match with the support embedding. The embedding-with-support process can be repeated more than once with the newly obtained embedding, and the successive embeddings can be used to study the impact of one variable on the dataset distribution or to monitor changes over time. This method has the same complexity as t-SNE per embedding, and memory requirements are only doubled. For a dataset of n elements sorted and split into k subsets, the total embedding complexity is reduced from O(n²) to O(n²/k), and the memory requirement from n² to 2(n/k)², which enables computation on recent laptops. The method showed promising results on a real-world dataset, allowing the birth, evolution, and death of clusters to be observed. The proposed approach facilitates identifying significant trends and changes, which empowers the monitoring of high-dimensional datasets' dynamics.
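
A hedged sketch of the reuse idea follows: a first dataset is embedded with t-SNE, and a second dataset is then embedded with its initialization seeded from the 2-D positions of each point's nearest neighbor in the first (support) set, so cluster locations stay comparable across embeddings. This is a conceptual illustration using scikit-learn, not the paper's exact cost formulation.

```python
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.manifold import TSNE
from sklearn.neighbors import NearestNeighbors

X_old, _ = make_blobs(n_samples=400, centers=4, random_state=0)
X_new, _ = make_blobs(n_samples=400, centers=4, random_state=1)

emb_old = TSNE(n_components=2, random_state=0).fit_transform(X_old)

# Seed the new embedding from the support embedding: each new point starts at the
# 2-D position of its nearest neighbor in the old (high-dimensional) data.
nn = NearestNeighbors(n_neighbors=1).fit(X_old)
_, idx = nn.kneighbors(X_new)
init_new = emb_old[idx[:, 0]]

emb_new = TSNE(n_components=2, init=init_new, random_state=0).fit_transform(X_new)
print("old/new embedding shapes:", emb_old.shape, emb_new.shape)
```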

Keywords: concept drift, data visualization, dimension reduction, embedding, monitoring, reusability, t-SNE, unsupervised learning

Procedia PDF Downloads 129
33882 The Relationship between Job Stress and Handover Effectiveness of Nurses

Authors: Rujnan Tuna, Ayse Cil Akinci

Abstract:

Work life occupies an important place in human life: an employed person faces many stimuli from internal and external environments and is affected by them in positive or negative ways. The handover process, that is, the process of sharing information about the patient with other health professionals, is an important criterion for maintaining patient care and enhancing the quality of care provided. Handover is a key component for sustaining daily basic clinical practices and is also essential for maintaining safe patient care. This investigation followed a descriptive, correlational design in order to determine nurses' job stress and handover effectiveness and the relationship between them. The study was conducted with 192 nurses working in a public hospital in Istanbul between January and March 2017. A descriptive information form, the Job Stressors Scale, and the Handover Evaluation Scale were used to collect the data. The data were analyzed using IBM SPSS Statistics 22.0 statistical software. Approvals were obtained from the participants, the managers of the institution, and the ethics committee. As a result of the research, it was found that job stress was above the median value, with the highest score in the ‘work role conflict’ subdimension. It was also found that nurses' handover effectiveness was above the median value, with the highest score in the ‘quality of information’ subdimension. In the study, there was a weak negative correlation between the ‘work role overload’ subdimension of the Job Stressors Scale and the ‘interaction and support’ subdimension of the Handover Evaluation Scale. Further study is needed in order to maintain patient safety.

Keywords: handover, job stress, nurse, patient

Procedia PDF Downloads 146
33881 Benchmarking Service Quality among Quick-Service Restaurants towards Service Innovations

Authors: Scott Earthy Baldo, Anna Cred Patricia Barroma, Miguel Angelo Eñano, John Ares Hipolito, Orange Sundra Sison, Rixielle Gwendale Tumambing

Abstract:

Service innovation is the introduction of new ways of delivering service to customers, with the intention of improving existing service quality and attracting more customers. This research paper aims to identify the various service practices being implemented in the different quick-service restaurants along Morayta Street, Manila, Philippines, and to compare each establishment to the best within the industry through the process of benchmarking towards service innovations. To gather valuable data, a mixed-method approach was used, wherein qualitative data were taken from the managers of each establishment, indicating the service practices being used, and quantitative data were collected from customers and employees regarding their perception of the present service quality of each selected quick-service restaurant, in line with the current service innovations being implemented. This research was conducted in order to discern which service practices are effective in attracting customers and boosting their satisfaction, as a future reference for practitioners who plan to manage a quick-service restaurant and for students studying in the field of hospitality, specifically service.

Keywords: benchmarking, quick-service restaurants, service innovations, service quality

Procedia PDF Downloads 352
33880 Analysis of the Current and Ideal Situation of Iran’s Football Talent Management Process from the Perspective of the Elites

Authors: Mehran Nasiri, Ardeshir Poornemat

Abstract:

The aim of this study was to investigate the current and ideal situations of the talent identification process in Iranian football from the point of view of Iranian instructors of the Asian Football Confederation (AFC). This research was a descriptive-analytical study; in the data collection phase, a questionnaire was used whose face validity was confirmed by experts in physical education and sports science. The reliability of the questionnaire was estimated using Cronbach's alpha (0.91). The study involved 122 Iranian AFC instructors, selected by stratified random sampling. Descriptive statistics were used to describe the variables, and inferential statistics (chi-square) were used to test the hypotheses of the study at a significance level of p ≤ 0.05. The results of the chi-square tests showed that, from the instructors' point of view, the grass-roots scientific method was the best way to identify football players (0.001), ages below 10 were the best for talent identification (0.001), the Football Federation was the most important organization in talent identification (0.002), clubs were the most important institutions in developing talents (0.001), trained scouts of the Football Federation were the most appropriate group for talent identification (0.001), and referral by football academy coaches was the best way to attract talented football players in Iran (0.001). It was also found that there was a large difference between the current and ideal situations of the talent identification process in Iranian football from the point of view of the Iranian AFC instructors. Hence, it is recommended that policy makers for talent identification in Iranian football provide a comprehensive, clear and systematic model of the talent identification and development processes for clubs and football teams, so that the talent identification process helps to nurture football talents more efficiently.

Keywords: current situation, talent finding, ideal situation, instructors (AFC)

Procedia PDF Downloads 197
33879 Simulation of Multistage Extraction Process of Co-Ni Separation Using Ionic Liquids

Authors: Hongyan Chen, Megan Jobson, Andrew J. Masters, Maria Gonzalez-Miquel, Simon Halstead, Mayri Diaz de Rienzo

Abstract:

Ionic liquids offer excellent advantages over conventional solvents for the industrial extraction of metals from aqueous solutions, where such extraction processes bring opportunities for recovery, reuse, and recycling of valuable resources and more sustainable production pathways. Recent research on the use of ionic liquids for extraction confirms their high selectivity and low volatility, but there is relatively little focus on how their properties can be best exploited in practice. This work addresses gaps in research on process modelling and simulation, to support the development, design, and optimisation of these processes, focusing on the separation of the highly similar transition metals cobalt and nickel. The study exploits published experimental results, as well as new experimental results, relating to the separation of Co and Ni using trihexyl(tetradecyl)phosphonium chloride. This extraction agent is attractive because it is cheaper, more stable and less toxic than fluorinated hydrophobic ionic liquids. The process modelling work concerns the selection and/or development of suitable models for the physical properties, the distribution coefficients, the mass transfer phenomena, the extractor unit and the multi-stage extraction flowsheet. The distribution coefficient model for cobalt and HCl represents an anion exchange mechanism, supported by the literature and by COSMO-RS calculations. Parameters of the distribution coefficient models are estimated by fitting the models to published experimental extraction equilibrium results. The mass transfer model applies Newman's hard sphere model. Diffusion coefficients in the aqueous phase are obtained from the literature, while diffusion coefficients in the ionic liquid phase are fitted to dynamic experimental results. The mass transfer area is calculated from the Sauter mean diameter of liquid droplets of the dispersed phase, estimated from the Weber number inside the extractor. New experiments measure the interfacial tension between the aqueous and ionic liquid phases. Empirical models for predicting the density and viscosity of solutions under different metal loadings are also fitted to new experimental data. The extractor is modelled as a continuous stirred tank reactor with mass transfer between the two phases and perfect phase separation of the outlet flows. A multistage separation flowsheet simulation is set up to replicate a published experiment and compare model predictions with the experimental results. This simulation model is implemented in the gPROMS software for dynamic process simulation. The results of single-stage and multi-stage flowsheet simulations are shown to be in good agreement with the published experimental results. The estimated diffusion coefficient of cobalt in the ionic liquid phase is in reasonable agreement with published data for the diffusion coefficients of various metals in this ionic liquid. A sensitivity study with this simulation model demonstrates the usefulness of the models for process design. The simulation approach has the potential to be extended to account for other metals, acids, and solvents for the process development, design, and optimisation of extraction processes applying ionic liquids to metals separations, although a lack of experimental data currently limits the accuracy of models within the whole framework. Future work will focus on process development more generally and on the extractive separation of rare earths using ionic liquids.
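
As a generic illustration of the stage-wise calculation a flowsheet simulation performs, the sketch below evaluates a steady-state counter-current multistage extraction with a constant distribution coefficient using the Kremser relation. It is not the paper's dynamic gPROMS model; the distribution coefficient, flows and feed concentration are illustrative values.

```python
import numpy as np

D = 5.0                     # distribution coefficient: y = D * x at equilibrium
Q_aq, Q_org = 1.0, 0.5      # aqueous and organic phase flow rates (L/min)
x_feed = 10.0               # cobalt in aqueous feed (g/L); stripped solvent enters solute-free
N = 4                       # number of counter-current stages

E = D * Q_org / Q_aq        # extraction factor
# Kremser relation: fraction of solute remaining in the raffinate after N stages
remaining = (E - 1) / (E ** (N + 1) - 1) if not np.isclose(E, 1.0) else 1.0 / (N + 1)
x_raffinate = x_feed * remaining
recovery = 1.0 - remaining
print(f"raffinate: {x_raffinate:.3f} g/L, recovery: {100 * recovery:.1f}%")
```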

Keywords: distribution coefficient, mass transfer, COSMO-RS, flowsheet simulation, phosphonium

Procedia PDF Downloads 173
33878 Residual Life Estimation Based on Multi-Phase Nonlinear Wiener Process

Authors: Hao Chen, Bo Guo, Ping Jiang

Abstract:

Residual life (RL) estimation based on a multi-phase nonlinear Wiener process is studied in this paper; this is significant for complicated products with small samples. Firstly, a nonlinear Wiener model with random parameters is introduced, and a multi-phase nonlinear Wiener model is proposed to model the degradation of products whose degradation processes are nonlinear and separated into different phases. The multi-phase RL probability density function based on the presented model is then derived approximately in closed form, and parameter estimation is achieved with the method of maximum likelihood estimation (MLE). Finally, the method is applied to estimate the RL of a high-voltage pulse capacitor. Compared with three other models via the log-likelihood function (Log-LF) and the Akaike information criterion (AIC), the results show that the proposed degradation model captures the degradation process of high-voltage pulse capacitors better and provides a more reliable result.
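
The sketch below simulates a two-phase nonlinear Wiener degradation process, X(t) = a·Λ(t) + σ·B(Λ(t)) with a change point, and estimates the residual-life distribution by Monte Carlo first passage of a failure threshold. The drift/diffusion values, change point, threshold and time transformation Λ(t) = t^1.1 are illustrative assumptions, not the paper's capacitor parameters, and the closed-form PDF and MLE steps are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
dt, t_end, n_paths = 0.1, 200.0, 5000
t = np.arange(0.0, t_end, dt)
threshold, t_change = 10.0, 80.0

# Phase-dependent drift and diffusion; a power-law time transformation makes the model nonlinear
drift = np.where(t < t_change, 0.04, 0.10)
sigma = np.where(t < t_change, 0.30, 0.50)
lam = (t + dt) ** 1.1 - t ** 1.1                  # increments of Lambda(t) = t^1.1

paths = np.cumsum(drift * lam + sigma * np.sqrt(lam) * rng.normal(size=(n_paths, t.size)), axis=1)

# First-passage times across the failure threshold
crossed = paths >= threshold
first_idx = np.where(crossed.any(axis=1), crossed.argmax(axis=1), t.size - 1)
failure_time = t[first_idx]

t_now = 60.0                                      # current age of the unit
alive = failure_time > t_now
rl = failure_time[alive] - t_now                  # residual-life samples for surviving units
print(f"mean RL at t={t_now}: {rl.mean():.1f}, 10th percentile: {np.percentile(rl, 10):.1f}")
```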

Keywords: multi-phase nonlinear Wiener process, residual life estimation, maximum likelihood estimation, high-voltage pulse capacitor

Procedia PDF Downloads 439
33877 Perceived Barriers and Benefits of Technology-Based Progress Monitoring for Non-Academic Individual Education Program Goals

Authors: A. Drelick, T. Sondergeld, M. Decarlo-Tecce, K. McGinley

Abstract:

In 1975, a free, appropriate public education (FAPE) was granted to all students in the United States regardless of disability. As a result, the special education landscape has been reshaped through new policies and legislation. Progress monitoring, a specific component of an Individual Education Program (IEP), calls for the use of data collection to determine the appropriateness of the services provided to students with disabilities. The recent US Supreme Court ruling in Endrew F. v. Douglas County warrants giving increased attention to student progress, specifically with regard to improving functional, or non-academic, skills that are addressed outside the general education curriculum. While using technology to enhance data collection has become a common practice for measuring academic growth, its application to non-academic IEP goals is uncertain. A mixed-methods study examined current practices and rationales for implementing technology-based progress monitoring focused on non-academic IEP goals. Fifty-seven participants responded to an online survey regarding their progress monitoring programs for non-academic goals. After separate analysis and interpretation of the quantitative and qualitative results, the data were synthesized to produce meta-inferences that draw broader conclusions on the topic. For the purposes of this paper, specific focus is placed on the perceived barriers and benefits of implementing technology-based progress monitoring protocols for non-academic IEP goals. The findings of this study highlight factors impacting the use of technology-based progress monitoring. Perceived barriers to implementation include: (1) lack of training, (2) access to technology, (3) outdated or inoperable technology, (4) reluctance to change, (5) cost, (6) lack of individualization within technology-based programs, and (7) legal issues in special education; perceived benefits include: (1) overall ease of use, (2) accessibility, (3) organization, (4) potential for improved presentation of data, (5) streamlining of the progress-monitoring process, and (6) legal issues in special education. Based on these conclusions, recommendations are made to IEP teams, school districts, and software developers to improve the progress-monitoring process for functional skills.

Keywords: special education, progress monitoring, functional skills, technology

Procedia PDF Downloads 224