Search results for: software process engineering
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 20516

17786 Polymer Recycling by Biomaterial and Its Application in Grease Formulation

Authors: Amitkumar Barot, Vijaykumar Sinha

Abstract:

There is growing interest in the development of new materials based on recycled polymers from plastic waste, and in the field of lubricants much effort has also been spent on substituting petro-based raw materials with natural, renewable ones. This is driven by depleting fossil fuels and by strict environmental laws. In this context, a new technique for the formulation of grease has been developed that combines the chemical recycling of poly(ethylene terephthalate) (PET) with the use of castor oil (CO). Compared to the diols used in the chemical recycling of PET, castor oil is renewable, readily available, environmentally friendly, and cheaper, and hence more sustainable. The process parameters, such as CO concentration and temperature, were varied, and their influence was studied in order to establish a technically and commercially viable process. The depolymerized product thus formed finds application as a base oil in the formulation of grease. The depolymerized product has been characterized by various chemical and instrumental methods, while the formulated greases have been evaluated for their tribological properties. The grease formulated using this new environmentally friendly approach exhibits applicative properties similar, and in some cases superior, to those of a commercial grease obtained from non-renewable resources.

Keywords: castor oil, grease formulation, recycling, sustainability

Procedia PDF Downloads 203
17785 Personnel Selection Based on Step-Wise Weight Assessment Ratio Analysis and Multi-Objective Optimization on the Basis of Ratio Analysis Methods

Authors: Emre Ipekci Cetin, Ebru Tarcan Icigen

Abstract:

The personnel selection process is considered one of the most important and most difficult issues in human resources management. At the selection stage, applicants are evaluated against certain criteria and efforts are made to select the most appropriate candidate. However, this process can be complicated for the managers who carry it out. Candidates should be evaluated according to different criteria such as work experience, education, and foreign language level. It is crucial that a rational selection process is carried out by considering all the criteria in an integrated structure. In this study, the problem of choosing the front office manager of a 5-star accommodation enterprise operating in Antalya is addressed by using multi-criteria decision-making methods. In this context, the SWARA (Step-wise Weight Assessment Ratio Analysis) and MOORA (Multi-Objective Optimization on the basis of Ratio Analysis) methods, which have relatively few applications compared with other methods, have been used together. Firstly, the SWARA method was used to calculate the weights of the criteria and sub-criteria determined by the business. After the weights of the criteria were obtained, the MOORA method was used to rank the candidates using the ratio system and the reference point approach. Recruitment processes differ from sector to sector and from operation to operation. There are a number of criteria that must be taken into consideration by businesses in accordance with the structure of each sector. It is of utmost importance that these criteria are carefully selected and that all candidates are evaluated objectively within their framework. In this study, the staff selection process was handled by using the SWARA and MOORA methods together. A minimal sketch of how the two methods fit together is given below.
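
The sketch below implements SWARA weighting followed by the MOORA ratio system in Python. It is a minimal illustration, not the authors' implementation: the criteria, candidate scores, and comparative-importance values are hypothetical.

```python
import numpy as np

def swara_weights(s):
    """SWARA: s[j] is the comparative importance of criterion j+1 vs. criterion j
    (criteria already sorted from most to least important); s has n-1 entries."""
    k = np.r_[1.0, 1.0 + np.asarray(s, dtype=float)]  # k_1 = 1, k_j = s_j + 1
    q = np.cumprod(1.0 / k)                           # q_1 = 1, q_j = q_{j-1} / k_j
    return q / q.sum()                                # final weights sum to 1

def moora_ratio(X, w, benefit):
    """MOORA ratio system: X is candidates x criteria, w the SWARA weights,
    benefit a boolean mask (True = larger is better)."""
    N = X / np.sqrt((X ** 2).sum(axis=0))             # vector normalization per criterion
    signs = np.where(benefit, 1.0, -1.0)
    return (N * w * signs).sum(axis=1)                # higher score = better candidate

# Hypothetical example: 3 candidates scored on experience, education, language, salary demand
X = np.array([[8.0, 7.0, 6.0, 3200.0],
              [6.0, 9.0, 8.0, 3500.0],
              [7.0, 6.0, 9.0, 3000.0]])
w = swara_weights([0.25, 0.30, 0.35])                 # hypothetical comparative importances
scores = moora_ratio(X, w, benefit=[True, True, True, False])
print(np.argsort(-scores) + 1)                        # candidate ranking, best first
```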

Keywords: accommodation establishments, human resource management, multi-objective optimization on the basis of ratio analysis, multi-criteria decision making, step-wise weight assessment ratio analysis

Procedia PDF Downloads 325
17784 Decision-Making Process Based on Game Theory in the Process of Urban Transformation

Authors: Cemil Akcay, Goksun Yerlikaya

Abstract:

Buildings are the living spaces of people and play an active role in every aspect of life in today's world. While some structures have survived from the early ages, most buildings that completed their lifetime have not survived to the present day. Nowadays, buildings that do not meet the social, economic, and safety requirements of the age return to life through a transformation process. This transformation is called urban transformation. Urban transformation is the renewal of areas with a risk of disaster together with the technological infrastructure required by the structures. The transformation aims to prevent damage from earthquakes and other disasters by rebuilding non-earthquake-resistant buildings that have completed their economic life. It is essential to decide on the issues related to the transformation in places such as Istanbul, located in a first-degree earthquake zone, where most of the building stock should be transformed. In urban transformation, property owners, the local authority, and the contractor must reach an agreement. Considering that hundreds of thousands of property owners are sometimes involved in transformation areas, it is evident how difficult it is to reach an agreement and make decisions. For the optimization of these decisions, the use of game theory is proposed. The main problem addressed in this study is whether the urban transformation is carried out in place or the building or buildings are moved to a different location. The urban transformation planned for the Istanbul University Cerrahpaşa Medical Faculty Campus, which involves many stakeholders, was analyzed using game theory. The decisions made on this real urban transformation project, and the logical suitability of decisions taken without the use of game theory, were also examined using game theory. In each step of this study, the many decision-makers were classified according to a specific logical sequence, and in the game trees that emerged from this classification, Nash equilibria were sought and optimum decisions were determined. All decisions taken for this project were compared, using game theory, against the decisions taken without it, and based on the results, solutions for the decision phase of the urban transformation process are introduced. A game theory model was developed spanning the urban transformation process from beginning to end, particularly as a solution to the difficulty of making rational decisions in large-scale projects with many participants in the decision-making process. The use of such a decision-making mechanism can provide an optimum answer to the demands of the stakeholders. For the construction sector, it is unsurprising that game theory addresses what will be among the most critical issues of the coming years: planning and making the right decisions.
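
To make the Nash-equilibrium step concrete, the sketch below enumerates the pure-strategy Nash equilibria of a small two-player game in Python. The strategies and payoff values are hypothetical and only illustrate the kind of check applied at the nodes of a game tree; they are not taken from the Cerrahpaşa project.

```python
import numpy as np

# Hypothetical 2-player payoff matrices: rows = property owners' strategies
# (accept on-site transformation / demand relocation), columns = contractor's
# strategies (high offer / low offer). Values are illustrative utilities only.
A = np.array([[3, 1],    # payoffs to property owners
              [2, 0]])
B = np.array([[2, 3],    # payoffs to contractor
              [1, 1]])

def pure_nash(A, B):
    """Return all pure-strategy Nash equilibria (row, col) of a bimatrix game."""
    eq = []
    for i in range(A.shape[0]):
        for j in range(A.shape[1]):
            row_best = A[i, j] >= A[:, j].max()   # no profitable deviation for the row player
            col_best = B[i, j] >= B[i, :].max()   # no profitable deviation for the column player
            if row_best and col_best:
                eq.append((i, j))
    return eq

print(pure_nash(A, B))   # e.g. [(0, 1)]: on-site transformation with a low offer
```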

Keywords: urban transformation, game theory, decision making, multi-actor project

Procedia PDF Downloads 120
17783 Optimization of Titanium Leaching Process Using Experimental Design

Authors: Arash Rafiei, Carroll Moore

Abstract:

The leaching process, as the first stage of hydrometallurgy, is a multidisciplinary system involving material properties, chemistry, reactor design, mechanics, and fluid dynamics. Therefore, optimizing a leaching system by purely scientific methods requires considerable time and expense. In this work, a mixture of two titanium ores and one titanium slag is used for extracting titanium in the leaching stage of the TiO2 pigment production process. Optimum titanium extraction can be obtained from the following strategies: i) maximizing titanium extraction without selective digestion; and ii) optimizing selective titanium extraction by balancing maximum titanium extraction against minimum impurity digestion. The main difference between the two strategies lies in the process optimization framework. In the first strategy, the most important stage of the production process is treated as the main stage and the remaining stages are adapted to it. The second strategy optimizes the performance of more than one stage at once. The second strategy is technically more complex than the first, but it brings more economic and technical advantages for the leaching system. Obviously, each strategy has its own optimum operational zone, which differs from the other's, and the best operational zone is chosen according to the complexity and the economic and practical aspects of the leaching system. The experimental design has been carried out using the Taguchi method. The most important advantages of this methodology are that it involves the different technical aspects of the leaching process, minimizes the number of required experiments as well as time and expense, and accounts for parameter interactions through the principles of multifactor-at-a-time optimization. Leaching tests were done at laboratory batch scale with appropriate temperature control. The leaching tank geometry was considered an important factor in providing comparable agitation conditions. Data analysis was done using reactor design and mass balancing principles. Finally, the optimum zone for the operational parameters is determined for each leaching strategy and discussed with respect to its economic and practical aspects.
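
For readers unfamiliar with the Taguchi workflow, the sketch below runs the standard steps on made-up numbers: an L9 orthogonal array for four three-level factors, larger-the-better signal-to-noise ratios, and selection of the best level per factor. The factor names and extraction values are hypothetical, not the study's data.

```python
import numpy as np

# Standard L9(3^4) orthogonal array: 9 runs, four 3-level factors
# (e.g., acid concentration, temperature, time, solid/liquid ratio).
L9 = np.array([[1, 1, 1, 1], [1, 2, 2, 2], [1, 3, 3, 3],
               [2, 1, 2, 3], [2, 2, 3, 1], [2, 3, 1, 2],
               [3, 1, 3, 2], [3, 2, 1, 3], [3, 3, 2, 1]])

# Hypothetical titanium extraction results (%) from two replicate leaching tests per run.
extraction = np.array([[62, 60], [71, 73], [68, 66],
                       [75, 77], [80, 79], [70, 72],
                       [69, 71], [78, 76], [82, 84]], dtype=float)

# Larger-the-better signal-to-noise ratio for each run.
sn = -10 * np.log10(np.mean(1.0 / extraction**2, axis=1))

# Average S/N per factor level: the level with the highest mean S/N is preferred.
for f in range(L9.shape[1]):
    means = [sn[L9[:, f] == lvl].mean() for lvl in (1, 2, 3)]
    best = int(np.argmax(means)) + 1
    print(f"factor {f + 1}: best level = {best}, level means = {np.round(means, 2)}")
```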

Keywords: titanium leaching, optimization, experimental design, performance analysis

Procedia PDF Downloads 353
17782 Random Vertical Seismic Vibrations of the Long Span Cantilever Beams

Authors: Sergo Esadze

Abstract:

Seismic resistance norms require the calculation of cantilevers for the vertical components of the base seismic acceleration. Long span cantilevers, as a rule, must be calculated as separate construction elements. Depending on the architectural-planning solution, functional purpose, and environmental conditions of the building or structure being designed, long span cantilever construction may be of very different types: both by main bearing element (beam, truss, slab) and by material (reinforced concrete, steel). The choice among these is always linked to the load-bearing construction system of the building. Research on the vertical seismic vibration of these constructions requires an individual approach for each type (which is not specified in the norms) in correlation with the model of the seismic load. The latter may be given either as a deterministic load or as a random process. A loading model given as a random process is more adequate for this problem. In the present paper, two types of long span (from 6 m up to 12 m) reinforced concrete cantilever beams have been considered: a) cantilevers whose bearing elements, i.e., the elements into which they are fixed, have large cross-sections, with the cantilevers made with a haunch; b) cantilever beams with a load-bearing rod element. Calculation models are suggested separately for types a) and b). They are presented as systems with a finite number of degrees of freedom (concentrated masses). The end-fixing conditions correspond to the respective types. Vertical acceleration and the vertical component of the angular acceleration act on the masses. The model is based on the assumption of translational-rotational motion of the building in the vertical plane, caused by vertical seismic acceleration. Seismic accelerations are considered as random processes and represented by the product of a deterministic envelope function and a stationary random process. The problem is solved within the framework of the correlation theory of random processes. Solved numerical examples are given. The method is effective for solving such specific problems.
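
The sketch below generates one sample of a vertical seismic acceleration modeled, as described above, as a deterministic envelope function multiplied by a stationary random process. The envelope shape, correlation time, and amplitude are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
dt, T = 0.01, 20.0                                # time step [s], duration [s]
t = np.arange(0.0, T, dt)

envelope = (t / 2.0) * np.exp(1.0 - t / 2.0)      # deterministic envelope, peak at t = 2 s
white = rng.standard_normal(t.size)

# Simple stationary process: exponentially correlated noise via an AR(1) filter.
rho = np.exp(-dt / 0.1)                           # correlation over a 0.1 s time scale
stationary = np.zeros_like(white)
for k in range(1, t.size):
    stationary[k] = rho * stationary[k - 1] + np.sqrt(1.0 - rho**2) * white[k]

a_v = 0.3 * 9.81 * envelope * stationary          # vertical acceleration sample [m/s^2]
print(f"peak vertical acceleration: {np.abs(a_v).max():.2f} m/s^2")
```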

Keywords: cantilever, random process, seismic load, vertical acceleration

Procedia PDF Downloads 172
17781 Application of the Best Technique for Estimating the Rest-Activity Rhythm Period in Shift Workers

Authors: Rakesh Kumar Soni

Abstract:

Under free-living conditions, human biological clocks show a periodicity of 24 hours for numerous physiological, behavioral, and biochemical variables. However, this period is not the intrinsic period; rather, it merely exhibits synchronization with the solar clock. It is, therefore, most important to investigate the characteristics of the human circadian clock, especially in shift workers, who routinely confront conflicting social clocks. The aim of the present study was to investigate the rest-activity rhythm and to determine the best technique for the computation of periods in this rhythm in subjects randomly selected from different groups of shift workers. The rest-activity rhythm was studied in forty-eight shift workers from three different organizations, namely Newspaper Printing Press (NPP), Chhattisgarh State Electricity Board (CSEB) and Raipur Alloys (RA). Shift workers of NPP (N = 20) were working on a permanent night shift schedule (NS; 20:00-04:00). However, in CSEB (N = 14) and RA (N = 14), shift workers were working in a 3-shift system comprising rotations from night (NS; 22:00-06:00) to afternoon (AS; 14:00-22:00) and to morning shift (MS; 06:00-14:00). Each subject wore an Actiwatch (AW64, Mini Mitter Co. Inc., USA) for 7 and/or 21 consecutive days, only after furnishing informed consent. A one-minute epoch length was chosen for the collection of wrist activity data. The period was determined using Actiware sleep software (Periodogram), the Lomb-Scargle Periodogram (LSP) and spectral analysis software (Spectre). Other statistical techniques, such as ANOVA and Duncan's multiple-range test, were also used whenever required. A statistically significant circadian rhythm in rest-activity, gauged by cosinor analysis, was documented in all shift workers, irrespective of shift schedule. Results indicate that the efficiency of the technique in determining the period (τ) depended upon the clipping limits of the τs. It appears that the Spectre technique is more reliable.
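
As a concrete illustration of one of the period-estimation techniques named above, the sketch below applies a Lomb-Scargle periodogram to synthetic one-minute actigraphy epochs and reads off the dominant rest-activity period. The data are simulated, not the study's Actiwatch recordings.

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(1)
minutes = np.arange(7 * 24 * 60, dtype=float)              # 7 days of 1-minute epochs
true_period = 24.3 * 60                                    # synthetic period [min]
activity = (50 + 40 * np.sin(2 * np.pi * minutes / true_period)
            + rng.normal(0, 10, minutes.size))             # synthetic wrist activity counts

periods = np.linspace(20 * 60, 28 * 60, 2000)              # candidate periods, 20-28 h
ang_freqs = 2 * np.pi / periods                            # lombscargle expects angular frequencies
power = lombscargle(minutes, activity - activity.mean(), ang_freqs)

tau_hours = periods[np.argmax(power)] / 60.0
print(f"estimated rest-activity period: {tau_hours:.2f} h")
```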

Keywords: biological clock, rest activity rhythm, spectre, periodogram

Procedia PDF Downloads 147
17780 Energy Performance Gaps in Residences: An Analysis of the Variables That Cause Energy Gaps and Their Impact

Authors: Amrutha Kishor

Abstract:

Today, with rising global warming and the depletion of resources, every industry is moving toward sustainability and energy efficiency. As part of this movement, it is nowadays obligatory for architects to play their part by creating energy predictions for their designs. But in many cases, these predictions do not reflect the actual energy consumption of newly built buildings in operation. These discrepancies can be described as 'energy performance gaps'. This study aims to determine the underlying reasons for these gaps. Seven houses designed by Allan Joyce Architects, UK, from 1998 until 2019 were considered for this study. The data from the residents' energy bills were cross-referenced with the predictions made with the software SefairaPro and from energy reports. Results indicated that the predictions did not match the actual energy usage. An account of how energy was used in these seven houses was obtained by means of personal interviews. The main factors considered in the study were occupancy patterns, heating systems and usage, lighting profile and usage, and appliances' profile and usage. The study found that the main reasons for the energy gaps were the discrepancies between predicted and actual occupant usage and patterns of energy consumption. This study is particularly useful for energy-conscious architectural firms to fine-tune their approach to designing houses and analysing their energy performance. As the findings reveal that energy usage in homes varies based on the way residents use the space, they help deduce the most efficient technological combinations. This information can be used to set guidelines for future policies and regulations related to energy consumption in homes. This study can also be used by the developers of simulation software to understand how architects use their product and drive improvements in its future versions.

Keywords: architectural simulation, energy efficient design, energy performance gaps, environmental design

Procedia PDF Downloads 99
17779 Life Cycle Assessment of Residential Buildings: A Case Study in Canada

Authors: Venkatesh Kumar, Kasun Hewage, Rehan Sadiq

Abstract:

Residential buildings consume significant amounts of energy and produce a large amount of emissions and waste. However, there is a substantial potential for energy savings in this sector, which needs to be evaluated over the life cycle of residential buildings. Life Cycle Assessment (LCA) methodology has been employed to study the primary energy use and associated environmental impacts of different phases (i.e., product, construction, use, end of life, and beyond building life) for residential buildings. Four different alternatives of residential buildings in Vancouver (BC, Canada) with a 50-year lifespan have been evaluated: a High Rise Apartment (HRA), Low Rise Apartment (LRA), Single family Attached House (SAH), and Single family Detached House (SDH). The life cycle performance of the buildings is evaluated for embodied energy, embodied environmental impacts, operational energy, operational environmental impacts, total life cycle energy, and total life cycle environmental impacts. Estimation of operational energy and the LCA are performed using DesignBuilder software and the Athena Impact Estimator software, respectively. The study results revealed that over the life span of the buildings, energy use and environmental impacts follow an identical pattern. LRA is found to be the best alternative in terms of embodied energy use and embodied environmental impacts, while HRA showed the best life cycle performance in terms of minimum energy use and environmental impacts. A sensitivity analysis has also been carried out to study the influence of a building service lifespan of 50, 75, and 100 years on the relative significance of embodied energy and total life cycle energy. The life cycle energy requirement for SDH is found to be a significant component among the four types of residential buildings. The overall results disclose that the primary operation of these buildings accounts for 90% of the total life cycle energy, which far outweighs the minor differences in embodied effects between the buildings.
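
A minimal sketch of the life cycle energy bookkeeping and the lifespan sensitivity described above is given below. All numbers are hypothetical placeholders, not the study's DesignBuilder or Athena results; the point is only to show how the embodied share shrinks as the service lifespan grows.

```python
# Hypothetical values: embodied energy [GJ] and annual operational energy [GJ/yr].
buildings = {
    "HRA": (9000.0, 550.0),
    "LRA": (7000.0, 600.0),
    "SAH": (8000.0, 700.0),
    "SDH": (9500.0, 900.0),
}

for lifespan in (50, 75, 100):
    print(f"--- service lifespan: {lifespan} years ---")
    for name, (embodied, operational) in buildings.items():
        total = embodied + operational * lifespan      # total life cycle energy
        share = 100.0 * embodied / total               # embodied share of the total
        print(f"{name}: total = {total:,.0f} GJ, embodied share = {share:.1f}%")
```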

Keywords: building simulation, environmental impacts, life cycle assessment, life cycle energy analysis, residential buildings

Procedia PDF Downloads 447
17778 Optimization of Monascus Orange Pigments Production Using pH-Controlled Fed-Batch Fermentation

Authors: Young Min Kim, Deokyeong Choe, Chul Soo Shin

Abstract:

Monascus pigments, commonly used as natural colorants in Asia, have many biological activities, such as cholesterol level control, anti-obesity, anti-cancer, and anti-oxidant effects, that have recently been elucidated. In particular, amino acid derivatives of Monascus pigments are receiving much attention because they have higher biological activities than the original Monascus pigments. Previously, there have been two ways to produce amino acid derivatives: one-step production and two-step production. However, the one-step production has low purity, and the two-step production—precursor (orange pigments) fermentation followed by derivative synthesis—has low productivity and growth rate during its precursor fermentation step. In this study, it was verified that pH is a key factor affecting the stability of orange pigments and the growth rate of Monascus. With an optimal pH profile obtained by pH-stat fermentation, we designed a precursor (orange pigments) fermentation process based on pH-controlled fed-batch fermentation. The final concentration of orange pigments in this process increased to 5.5 g/L, which is about 30% higher than the concentration produced by the previously used precursor fermentation step.

Keywords: cultivation process, fed-batch fermentation, monascus pigments, pH stability

Procedia PDF Downloads 284
17777 Energy-Led Sustainability Assessment Approach for Energy-Efficient Manufacturing

Authors: Aldona Kluczek

Abstract:

In recent years, manufacturing processes have had to engage with sustainability issues in cost-effective ways that minimize energy use, decrease negative impacts on the environment, and are safe for society. However, attention has mostly been on separate sustainability assessment methods considering energy and material flow, energy consumption, and emission release or process control. In this paper, an energy-led sustainability assessment approach combining several methods—energy Life Cycle Assessment to assess environmental impact, Life Cycle Cost to analyze costs, and Social Life Cycle Assessment through an 'energy LCA-based value stream map'—is used to assess the energy sustainability of the hardwood lumber manufacturing process in terms of its technologies. The approach, integrating environmental, economic, and social issues, can be visualized for the considered energy-efficient technologies on a map of energy LCA-related (input and output) inventory data. It enables the most efficient technology for a given process to be identified through effective analysis of energy flow. It is also indicated that interventions in the considered technology should focus on environmental and economic improvements to achieve energy sustainability. The results indicate that the most intense energy losses are caused by the cogeneration technology. The environmental impact analysis shows that a substantial reduction of 34% can be achieved by improving it. From the LCC point of view, the result appears cost-effective when implemented at the plant where the improvement is used. Regarding the social dimension, every component of plant labor energy use in the life cycle of lumber production has positive energy benefits. The energy required to install the energy-efficient technology amounts to 30.32 kJ compared with the other components of plant labor energy, and it has the highest value in terms of energy-related social indicators. The paper presents the example of hardwood lumber production in order to demonstrate the applicability of the sustainability assessment method.

Keywords: energy efficiency, energy life cycle assessment, life cycle cost, social life cycle analysis, manufacturing process, sustainability assessment

Procedia PDF Downloads 233
17776 Feasibility of an Extreme Wind Risk Assessment Software for Industrial Applications

Authors: Francesco Pandolfi, Georgios Baltzopoulos, Iunio Iervolino

Abstract:

The impact of extreme winds on industrial assets and the built environment is gaining increasing attention from stakeholders, including the corporate insurance industry. This has led to a progressively more in-depth study of building vulnerability and fragility to wind. Wind vulnerability models are used in probabilistic risk assessment to relate a loss metric to an intensity measure of the natural event, usually a gust or a mean wind speed. In fact, vulnerability models can be integrated with the wind hazard, which consists of associating a probability to each intensity level in a time interval (e.g., by means of return periods) to provide an assessment of future losses due to extreme wind. This has also given impetus to world- and regional-scale wind hazard studies. Another approach often adopted for the probabilistic description of building vulnerability to wind is the use of fragility functions, which provide the conditional probability that selected building components will exceed certain damage states, given wind intensity. In fact, in the wind engineering literature, it is more common to find structural system- or component-level fragility functions than wind vulnerability models for an entire building. Loss assessment based on component fragilities requires some logical combination rules that define the building's damage state given the damage state of each component, and the availability of a consequence model that provides the losses associated with each damage state. When risk calculations are based on numerical simulation of a structure's behavior during extreme wind scenarios, the interaction of component fragilities is intertwined with the computational procedure. However, simulation-based approaches are usually computationally demanding and case-specific. In this context, the present work introduces the ExtReMe wind risk assESsment prototype Software, ERMESS, which is being developed at the University of Naples Federico II. ERMESS is a wind risk assessment tool for insurance applications to industrial facilities, collecting a wide assortment of available wind vulnerability models and fragility functions to facilitate their incorporation into risk calculations based on in-built or user-defined wind hazard data. This software implements an alternative method for building-specific risk assessment based on existing component-level fragility functions and on a number of simplifying assumptions for their interactions. The applicability of this alternative procedure is explored by means of an illustrative proof-of-concept example, which considers four main building components, namely: the roof covering, roof structure, envelope wall and envelope openings. The application shows that, despite the simplifying assumptions, the procedure can yield risk evaluations that are comparable to those obtained via more rigorous building-level simulation-based methods, at least in the considered example. The advantage of this approach is shown to lie in the fact that a database of building component fragility curves can be put to use for the development of new wind vulnerability models to cover building typologies not yet adequately covered by existing works and whose rigorous development is usually beyond the budget of portfolio-related industrial applications.
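
The sketch below illustrates the general idea of combining component-level fragility functions into a building-level expected loss under simple independence assumptions. The four components follow the proof-of-concept example above, but the lognormal fragility parameters and loss ratios are hypothetical and are not ERMESS values.

```python
from scipy.stats import lognorm

# Each component has a lognormal fragility (median gust speed, dispersion) and a
# consequence expressed as a fraction of the building replacement cost.
components = {                    # median [m/s], dispersion, loss ratio if damaged
    "roof covering":     (35.0, 0.30, 0.10),
    "roof structure":    (55.0, 0.25, 0.35),
    "envelope wall":     (60.0, 0.25, 0.30),
    "envelope openings": (40.0, 0.35, 0.15),
}

def expected_loss_ratio(v):
    """Expected loss ratio at gust speed v, assuming independent component damage."""
    loss = 0.0
    for median, beta, ratio in components.values():
        p_damage = lognorm.cdf(v, s=beta, scale=median)   # P(damage | gust speed v)
        loss += p_damage * ratio
    return min(loss, 1.0)

for v in (30.0, 45.0, 60.0):
    print(f"gust {v:.0f} m/s -> expected loss ratio {expected_loss_ratio(v):.2f}")
```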

Keywords: component wind fragility, probabilistic risk assessment, vulnerability model, wind-induced losses

Procedia PDF Downloads 171
17775 Non-Invasive Data Extraction from Machine Display Units Using Video Analytics

Authors: Ravneet Kaur, Joydeep Acharya, Sudhanshu Gaur

Abstract:

Artificial Intelligence (AI) has the potential to transform manufacturing by improving shop floor processes such as production, maintenance, and quality. However, industrial datasets are notoriously difficult to extract in a real-time, streaming fashion, thus negating potential AI benefits. A prime example is specialized industrial controllers operated by custom software, which complicates the process of connecting them to an Information Technology (IT) based data acquisition network. Security concerns may also limit direct physical access to these controllers for data acquisition. To connect the Operational Technology (OT) data stored in these controllers to an AI application in a secure, reliable, and available way, we propose a novel Industrial IoT (IIoT) solution in this paper. In this solution, we demonstrate how video cameras can be installed on a factory shop floor to continuously obtain images of the controller HMIs. We propose image pre-processing to segment the HMI into regions of streaming data and regions of fixed meta-data. We then evaluate the performance of multiple Optical Character Recognition (OCR) technologies, such as Tesseract and Google Vision, in recognizing the streaming data and test them for typical factory HMIs and realistic lighting conditions. Finally, we use the meta-data to match the OCR output with the temporal, domain-dependent context of the data to improve the accuracy of the output. Our IIoT solution enables reliable and efficient data extraction, which will improve the performance of subsequent AI applications.
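
A minimal sketch of the crop-preprocess-OCR step described above is shown below, using OpenCV and the pytesseract wrapper for Tesseract. The file name, region coordinates, and character whitelist are hypothetical; in practice they would come from the HMI segmentation step.

```python
import cv2
import pytesseract

frame = cv2.imread("hmi_frame.png")                   # one frame captured from the camera
x, y, w, h = 120, 80, 200, 60                         # hypothetical region showing a process value
roi = frame[y:y + h, x:x + w]

# Pre-process: grayscale, upscale, and binarize to improve OCR robustness.
gray = cv2.cvtColor(roi, cv2.COLOR_BGR2GRAY)
gray = cv2.resize(gray, None, fx=2, fy=2, interpolation=cv2.INTER_CUBIC)
_, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# OCR restricted to a single line of numeric characters.
text = pytesseract.image_to_string(
    binary, config="--psm 7 -c tessedit_char_whitelist=0123456789.")
print(f"recognized value: {text.strip()}")
```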

Keywords: human machine interface, industrial internet of things, internet of things, optical character recognition, video analytics

Procedia PDF Downloads 90
17774 Effects of Inlet Filtration Pressure Loss on Single and Two-Spool Gas Turbine

Authors: Enyia James Diwa, Dodeye Ina Igbong, Archibong Archibong Eso

Abstract:

Gas turbine operators have been faced with dramatic financial setbacks resulting from compressor fouling. A highly deregulated power industry with stiff market competition has made it imperative to devise means of reducing maintenance cost in order to yield maximum profit. Compressor fouling results from the deposition of contaminants, in the presence of oil and moisture, on the compressor blade or annulus surfaces, which leads to a loss in flow capacity and compressor efficiency. These combined effects reduce power output, increase heat rate, and cause creep life reduction. This paper also contains a model of two gas turbine engines built with Cranfield University software known as TURBOMATCH, which is simulation software for assessing engine fouling rate. The model engines are of different configurations and capacities, and operate in two different modes, constant output power and constant turbine inlet temperature, for two- and three-stage filter systems. The idea is to investigate which filtration system is more economically viable for gas turbine users based on performance only. The results demonstrate that the two-spool engine is slightly more beneficial than the single spool. This is a result of the higher pressure ratio of the two-spool engine as well as the deceleration of the high-pressure compressor and high-pressure turbine speed at constant TET. Meanwhile, the inlet filtration system was properly designed and balanced with a well-timed and economical compressor washing regime to control compressor fouling. The different technologies of inlet air filtration and compressor washing are considered, and an attempt at optimization with respect to the cost of a combination of both control measures is made.

Keywords: inlet filtration, pressure loss, single spool, two spool

Procedia PDF Downloads 303
17773 Behavior of hFOB 1.19 Cells in Injectable Scaffold Composed of Pluronic F127 and Carboxymethyl Hexanoyl Chitosan

Authors: Lie-Sian Yap, Ming-Chien Yang

Abstract:

This study demonstrated a novel injectable hydrogel scaffold composed of Pluronic F127, carboxymethyl hexanoyl chitosan (CA), and glutaraldehyde (GA) for encapsulating human fetal osteoblastic (hFOB) 1.19 cells. The hydrogel was prepared by mixing F127 and GA in CA solution at 4°C. The mechanical properties and cytotoxicity of this hydrogel were determined through rheological measurements and the MTT assay, respectively. After the encapsulation process, the morphology of the hFOB 1.19 cells was examined using fluorescent and confocal imaging. The results indicated that the Tgel of this system was around 30°C, where the sol-gel transformation occurred within 90 s, and the F127/CA/GA gel was able to remain intact in the medium for more than 1 month. In vitro cell culture assays revealed that the F127/CA/GA hydrogels were non-cytotoxic. Encapsulated hFOB 1.19 cells not only showed a spherical shape and formed colonies but also reduced their size. Moreover, the hFOB 1.19 cells remained alive after the encapsulation process. Based on these results, these F127/CA/GA hydrogels can be used to encapsulate cells for tissue engineering applications.

Keywords: carboxymethyl hexanoyl chitosan, cell encapsulation, hFOB 1.19, Pluronic F127

Procedia PDF Downloads 225
17772 Road Maintenance Management Decision System Using Multi-Criteria and Geographical Information System for Takoradi Roads, Ghana

Authors: Eric Mensah, Carlos Mensah

Abstract:

The road maintenance backlogs created as a result of deferred maintenance, especially in developing countries, have caused considerable deterioration of many road assets. This is usually due to difficulties encountered in selecting and prioritising maintainable roads based on objective criteria rather than political or other less important criteria. In order to ensure judicious use of limited resources for road maintenance, five factors were identified as the most important criteria for road management within the study area, based on the judgements of 40 experts. The results were further used to develop weightings using the Multi-Criteria Decision Process (MCDP) to analyse and select road alternatives according to the maintenance goal. Using Geographical Information Systems (GIS), maintainable roads were grouped using Jenks natural breaks to allow them to be further prioritised in order of importance for display on a dashboard of maps, charts, and tables. This reduces the problems of subjective maintenance and road selection, thereby reducing wastage of resources and easing the maintenance process through an objectively organised spatial decision support system.
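
The weighted-sum scoring and grouping step can be sketched as below. The criteria, weights, and road scores are hypothetical stand-ins for the expert-derived values, and a simple quantile split is used here in place of the Jenks natural breaks classification.

```python
import numpy as np

criteria = ["condition", "traffic", "socio-economic role", "safety", "cost"]
weights = np.array([0.30, 0.25, 0.20, 0.15, 0.10])          # hypothetical expert weights, sum to 1

roads = {                                                    # normalized scores in [0, 1]
    "Road A": [0.9, 0.7, 0.6, 0.8, 0.5],
    "Road B": [0.4, 0.9, 0.8, 0.6, 0.7],
    "Road C": [0.6, 0.3, 0.4, 0.5, 0.9],
}

# Weighted-sum score per road, then a 3-class grouping for display on the dashboard.
scores = {name: float(np.dot(weights, s)) for name, s in roads.items()}
cuts = np.quantile(list(scores.values()), [1 / 3, 2 / 3])    # stand-in for Jenks natural breaks

for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    priority = "high" if score >= cuts[1] else "medium" if score >= cuts[0] else "low"
    print(f"{name}: score = {score:.2f}, priority = {priority}")
```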

Keywords: decision support, geographical information systems, multi-criteria decision process, weighted sum

Procedia PDF Downloads 358
17771 Dynamic Process Model for Designing Smart Spaces Based on Context-Awareness and Computational Methods Principles

Authors: Heba M. Jahin, Ali F. Bakr, Zeyad T. Elsayad

Abstract:

Smart spaces can be defined as any working environment that integrates embedded computers, information appliances, and multi-modal sensors in order to remain focused on the interaction between the users, their activity, and their behavior in the space; hence, a smart space must be aware of its context and automatically adapt to its changing context by interacting with the physical environment through natural and multimodal interfaces, and by serving information proactively. This paper suggests a dynamic framework for the architectural design process of the space, based on the principles of computational methods and context-awareness, to help in creating a field of changes and modifications. It generates possibilities and concerns about the physical, structural, and user contexts. This framework is concerned with five main processes: gathering and analyzing data to generate smart design scenarios, parameters, and attributes; transforming these by coding into four types of models; connecting those models together in the interaction model, which represents the context-awareness system; transforming that model into a virtual and ambient environment, representing the physical and real environments, to act as a linkage phase between the users and the activities taking place in that smart space; and finally, the feedback phase from users of that environment, to ensure that the design of that smart space fulfills their needs. Therefore, the generated design process will help in designing smart spaces that can be adapted and controlled to answer the users' defined goals, needs, and activities.

Keywords: computational methods, context-awareness, design process, smart spaces

Procedia PDF Downloads 305
17770 Information Communication Technology (ICT) Using Management in Nursing College under the Praboromarajchanok Institute

Authors: Suphaphon Udomluck, Pannathorn Chachvarat

Abstract:

Information Communication Technology (ICT) usage management is essential for effective decision making in an organization. The Concerns-Based Adoption Model (CBAM) was employed as the conceptual framework. The purpose of the study was to assess the situation of ICT usage management in the Colleges of Nursing under the Praboromarajchanok Institute. The sample was drawn by multi-stage sampling from 10 participating colleges of nursing and included directors, vice directors, heads of learning groups, teachers, system administrators, and staff responsible for ICT, for a total of 280 participants. The instruments used were questionnaires comprising four parts: general information, ICT usage management, the Stages of Concern (SoC) questionnaire, and the Levels of Use (LoU) of ICT questionnaire. Reliability coefficients were tested; the alpha coefficients were 0.967 for ICT usage management, 0.884 for SoC, and 0.945 for LoU. The data were analyzed by frequency, percentage, mean, standard deviation, Pearson product-moment correlation, and multiple regression. The findings were as follows. The overall score of ICT usage management was at a high level, and its components were administration, hardware, software, and peopleware. The overall score of the Stages of Concern (SoC) about ICT was at a high level, and the overall score of the Levels of Use (LoU) of ICT was at a moderate level. ICT usage management had a positive relationship with the Stages of Concern (SoC) about ICT and the Levels of Use (LoU) of ICT (p < .01). The results of the multiple regression revealed that administration, hardware, software, and peopleware could predict SoC of ICT (18.5%) and LoU of ICT (20.8%). The factor that significantly influenced SoC was peopleware. The factors that significantly influenced LoU of ICT were administration, hardware, and peopleware.

Keywords: information communication technology (ICT), management, the concerns-based adoption model (CBAM), stages of concern (SoC), levels of use (LoU)

Procedia PDF Downloads 293
17769 The Implementation of the European Landscape Convention in Turkey: Opportunities and Constraints

Authors: Tutku Ak, Abdullah Kelkit, Cihad Öztürk

Abstract:

An increase in the number of multinational environmental agreements has been witnessed in the past decade, particularly in Europe. Success with implementation, however, shows variation. While many countries are willing to join these agreements, they do not always fully honor their obligations to put their commitments into practice. One reason for this is that countries have different legal and administrative systems. One example of an international multilateral environmental agreement is the European Landscape Convention (ELC). The ELC expresses a concern to achieve sustainable development based on a balanced and harmonious relationship between social needs, economic activity, and the environment. Member states are required to implement the convention in accordance with their own administrative structures, respecting subsidiarity. In particular, the importance of cooperation in the protection, management, and planning of landscape resources is expressed through the convention. This paper intends to give a broad view of the ELC's implementation process in Turkey and of the factors that have influenced that process. In this context, the paper focuses on the objectives of the convention for addressing the loss of European landscapes, and on the justification and tools used to accomplish these objectives. The degree to which these objectives have been implemented in Turkey, and the opportunities and constraints that have been faced during this process, are discussed.

Keywords: European landscape convention, implementation, multinational environmental agreements, policy tools

Procedia PDF Downloads 281
17768 An Evaluation of Education Provision for Students with Autism Spectrum Disorder in Ireland: The Role of the Special Needs Assistant

Authors: Claire P. Griffin

Abstract:

The education provision for students with special educational needs, including students with Autism Spectrum Disorder (ASD), has undergone significant national and international changes in recent years. In particular, an increase in resource-based provision has occurred across educational settings in an effort to support inclusive practices. This paper seeks to explore the role of the Special Needs Assistant (SNA) in supporting children with ASD in Irish schools. This research stems from the second national evaluation of ‘Education Provision for Students with Autism Spectrum Disorder in Ireland’ (NCSE, 2016). This research was commissioned by the National Council for Special Education (NCSE) in Ireland and conducted by a team of researchers from Mary Immaculate College, Limerick from February to July 2014. This study involved a multiple case study research strategy across 24 educational sites, as selected through a stratified sampling process. Research strategies included semi-structured interviews, classroom observations, documentary review and child conversations. Data analysis was conducted electronically using Nvivo software, with use of an additional quantitative recording mechanism based on scaled weighting criteria for collected data. Based on such information, key findings from the NCSE national evaluation will be presented and critically reviewed, with particular reference to the role of the SNA in supporting pupils with ASD. Examples of positive practice inherent within the SNA role will be outlined and contrasted with discrete areas for development. Based on such findings, recommendations for the evolving role of the SNA will be presented, with the aim of informing both policy and best practice within the field.

Keywords: autism spectrum disorder, inclusive education, paraprofessional, special needs assistant

Procedia PDF Downloads 261
17767 Comparing Xbar Charts: Conventional versus Reweighted Robust Estimation Methods for Univariate Data Sets

Authors: Ece Cigdem Mutlu, Burak Alakent

Abstract:

Maintaining the quality of manufactured products at a desired level depends on the stability of process dispersion and location parameters and on detecting perturbations in these parameters as promptly as possible. The Shewhart control chart is the most widely used technique in statistical process monitoring to monitor the quality of products and control process mean and variability. In the application of Xbar control charts, the sample standard deviation and sample mean are known to be the most efficient conventional estimators of process dispersion and location parameters, respectively, based on the assumption of independent and normally distributed datasets. On the other hand, there is no guarantee that real-world data will be normally distributed. When process parameters are estimated from Phase I data clouded with outliers, the efficiency of traditional estimators is significantly reduced and the performance of Xbar charts is undesirably low; e.g., occasional outliers in the rational subgroups of the Phase I data set may considerably affect the sample mean and standard deviation, resulting in a serious delay in the detection of inferior products in Phase II. For more efficient application of control charts, it is necessary to use estimators that are robust against the contaminations that may exist in Phase I. In the current study, we present a simple approach to constructing robust Xbar control charts using the average distance to the median, the Qn-estimator of scale, and the M-estimator of scale with logistic psi-function for the estimation of the process dispersion parameter, and the Harrell-Davis qth quantile estimator, the Hodges-Lehmann estimator, and the M-estimator of location with Huber psi-function and logistic psi-function for the estimation of the process location parameter. The Phase I efficiency of the proposed estimators and the Phase II performance of Xbar charts constructed from these estimators are compared with the conventional mean and standard deviation statistics, both under normality and against diffuse-localized and symmetric-asymmetric contaminations, using 50,000 Monte Carlo simulations in MATLAB. Consequently, it is found that robust estimators yield parameter estimates with higher efficiency against all types of contaminations, and Xbar charts constructed using robust estimators have higher power in detecting disturbances, compared to conventional methods. Additionally, utilizing individuals charts to screen outlier subgroups and employing different combinations of dispersion and location estimators on subgroups and individual observations are found to improve the performance of Xbar charts.
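
The general recipe of estimating robust limits in Phase I and monitoring in Phase II can be sketched as below. For brevity, this simplified stand-in uses the median of the subgroup means and a scaled MAD rather than the Qn, M-, Harrell-Davis, or Hodges-Lehmann estimators compared in the study, and the data are simulated.

```python
import numpy as np

rng = np.random.default_rng(2)
m, n = 30, 5                                       # 30 rational subgroups of size 5
phase1 = rng.normal(10.0, 1.0, size=(m, n))
phase1[3, 0] += 8.0                                # occasional Phase I outliers
phase1[17, 2] -= 7.0

subgroup_means = phase1.mean(axis=1)
center = np.median(subgroup_means)                 # robust location estimate

# Pooled within-subgroup MAD, scaled to estimate sigma under normality.
mad = np.median(np.abs(phase1 - np.median(phase1, axis=1, keepdims=True)))
sigma_hat = 1.4826 * mad

ucl = center + 3 * sigma_hat / np.sqrt(n)
lcl = center - 3 * sigma_hat / np.sqrt(n)
print(f"robust limits: LCL = {lcl:.3f}, CL = {center:.3f}, UCL = {ucl:.3f}")

# Phase II monitoring: flag subgroups whose mean falls outside the limits.
phase2 = rng.normal(10.8, 1.0, size=(10, n))       # shifted process
means2 = phase2.mean(axis=1)
signals = np.where((means2 > ucl) | (means2 < lcl))[0]
print(f"out-of-control subgroups in Phase II: {signals}")
```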

Keywords: average run length, M-estimators, quality control, robust estimators

Procedia PDF Downloads 175
17766 Nanda Ways of Knowing, Being and Doing: Our Process of Research Engagement and Research Impacts

Authors: Steven Kelly

Abstract:

A fundamental role of the researcher is research engagement, that is, the interaction between researchers and research end-users outside of academia for the mutually beneficial transfer of knowledge, technologies, methods, or resources, while research impact is the contribution that research makes to the economy, society, environment, or culture beyond the contribution to academic research. Ironically, traditional impact metrics in the academy are designed to focus on the outputs; they dismiss the important role engagement plays in fostering a collaborative process that leads to meaningful, ethical, and useful impacts. Dr. Kelly, a Nanda (First Nations) man himself, has worked closely with the Nanda community over the past decade, ensuring cultural protocols are upheld and implemented while doing research engagement. The focus was on the process, which was essential to foster a positive research impact culture. The contributions that flowed from this process were the naming of a new species of squat lobster in the Nanda language, a poster design in collaboration with The University of Melbourne, Museums Victoria and the Bundiyarra - IrraWanga language centre, media coverage, and the formation of the "Nanda language, Nanda country project". The Nanda language, Nanda country project is a language revitalization project focused on reconnecting Nanda people with the language and culture on Nanda Country. Such outcomes are imperative on the eve of the United Nations International Decade of Indigenous Languages. In this paper, Dr. Kelly will discuss how Nanda cultural practices informed research engagement to foster a collaborative process that, in turn, led to meaningful, ethical, and useful impacts within and outside of the academy.

Keywords: community collaboration, indigenous, nanda, research engagement, research impacts

Procedia PDF Downloads 95
17765 The Automatic Transliteration Model of Images of the Book Hamong Tani Using Statistical Approach

Authors: Agustinus Rudatyo Himamunanto, Anastasia Rita Widiarti

Abstract:

Transliteration of Javanese manuscripts is one of the methods used to preserve and pass on the wealth of literature of the past to the present generation in Indonesia. The manual transliteration process commonly requires philologists and takes a relatively long time. An automatic transliteration process is expected to shorten this time and thus help the work of philologists. The preprocessing and segmentation stages are carried out first to manage the document images, yielding script image units that are free from noise and similar in thickness, size, and slope. The next stage, feature extraction, is used to find unique characteristics that distinguish each Javanese script image. One of the characteristics used in this research is the number of black pixels in each image unit. Each Javanese script image contained in the training data undergoes the same process as the input characters. The system testing was performed with data from the book Hamong Tani. The book Hamong Tani was selected due to its content, age, and number of pages, which were considered sufficient as a model experimental input. Based on the results of testing the automatic transliteration process on random pages, the maximum percentage of correctness obtained was 81.53%. This success rate was obtained with a 32x32 pixel input image size and a 5x5 image window. With regard to these results, it can be concluded that the proposed automatic transliteration model is relatively good.
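
The black-pixel-count feature and a simple nearest-template match can be sketched as below; the 32x32 units and 5x5 windows follow the figures quoted above, but the training templates here are random placeholders rather than script units from Hamong Tani.

```python
import numpy as np

def window_features(unit, win=5):
    """unit: 32x32 binary array (1 = black pixel); returns black-pixel counts per window."""
    h, w = unit.shape
    counts = [unit[r:r + win, c:c + win].sum()
              for r in range(0, h, win)
              for c in range(0, w, win)]
    return np.array(counts, dtype=float)

def classify(unit, templates):
    """templates: dict mapping a Latin transliteration to a 32x32 training unit."""
    feats = window_features(unit)
    dists = {label: np.linalg.norm(feats - window_features(t))
             for label, t in templates.items()}
    return min(dists, key=dists.get)               # nearest template wins

rng = np.random.default_rng(3)
templates = {"ha": rng.integers(0, 2, (32, 32)), "na": rng.integers(0, 2, (32, 32))}
test_unit = templates["ha"].copy()
test_unit[0, 0] ^= 1                               # small perturbation of a known character
print(classify(test_unit, templates))              # expected: "ha"
```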

Keywords: Javanese script, character recognition, statistical, automatic transliteration

Procedia PDF Downloads 326
17764 Imputation Technique for Feature Selection in Microarray Data Set

Authors: Younies Saeed Hassan Mahmoud, Mai Mabrouk, Elsayed Sallam

Abstract:

Analysing DNA microarray data sets is a great challenge facing bioinformaticians due to the complexity of applying statistical and machine learning techniques. The challenge is doubled if the microarray data sets contain missing data, which happens regularly, because these techniques cannot deal with missing values. One of the most important data analysis processes on a microarray data set is feature selection, which finds the most important genes affecting a certain disease. In this paper, we introduce a technique for imputing the missing data in microarray data sets while performing feature selection.
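
A minimal sketch of the imputation-then-feature-selection workflow is shown below on a synthetic expression matrix. The k-nearest-neighbour imputer and univariate F-test ranking are common stand-ins and are not necessarily the specific techniques proposed in the paper.

```python
import numpy as np
from sklearn.impute import KNNImputer
from sklearn.feature_selection import SelectKBest, f_classif

rng = np.random.default_rng(4)
X = rng.normal(size=(40, 200))                 # 40 samples x 200 genes (synthetic)
y = rng.integers(0, 2, size=40)                # binary disease label
X[rng.random(X.shape) < 0.05] = np.nan         # ~5% missing expression values

# Impute missing values from the 5 nearest samples, then rank genes.
X_imputed = KNNImputer(n_neighbors=5).fit_transform(X)
selector = SelectKBest(score_func=f_classif, k=20).fit(X_imputed, y)

top_genes = np.argsort(selector.scores_)[::-1][:20]
print(f"top-ranked gene indices: {top_genes[:10]}")
```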

Keywords: DNA microarray, feature selection, missing data, bioinformatics

Procedia PDF Downloads 551
17763 The Role and Impact of Cold Spray Technology on Surface Engineering

Authors: Ionel Botef

Abstract:

Studies show that, for viable product realisation and maintenance, a spectrum of novel processing technologies and materials to improve performance and reduce costs and environmental impact must constantly be addressed. One of these technologies, namely the cold spray process, has enabled a broad range of coatings and applications, including many that have not been previously possible or commercially practical, hence its potential for new aerospace, electronics, or medical applications. Therefore, the purpose of this paper is to summarise the state of the art of this technology alongside its theoretical and experimental studies, and explore the role and impact of cold spraying on surface engineering.

Keywords: surface engineering, cold spray, ageing aircrafts, corrosion, microchannels, maintenance

Procedia PDF Downloads 589
17762 Towards a Comprehensive Framework on Civic Competence Development of Teachers: A Systematic Review of Literature

Authors: Emilie Vandevelde, Ellen Claes

Abstract:

This study aims to develop a comprehensive model for the civic socialization process of teachers. Citizenship has become one of the main objectives of European education systems. It is expected that teachers are well prepared and equipped with the necessary knowledge, skills, and attitudes to engage students in democratic citizenship. While a lot is known about young people's civic competence development and how schools and teachers (don't) support this process, less is known about how teachers themselves engage with (the teaching of) civics. Unlike the civic socialization process of young adolescents, which focuses on personal competence development, the civic socialization process of teachers includes the development of professional civic competences. These professional competences enable them to prepare pupils to carry out their civic responsibilities in thoughtful ways. Existing models for the civic socialization process of young adolescents do not take this dual purpose into account. Based on these observations, this paper investigates (1) what personal and professional civic competences teachers need to effectively teach civic education and (2) how teachers acquire these personal and professional civic competences. To answer the first research question, a systematic review of the literature on existing civic education frameworks was carried out and linked to the literature on teacher training. The second research question was addressed by adapting the Octagon model, developed by the International Association for the Evaluation of Educational Achievement (IEA), to the context of teachers. This was done by carrying out a systematic review of the recent literature linking three theoretical topics involved in teachers' civic competence development: theories about the civic socialization process of young adolescents, Shulman's (1987) theoretical assumptions on pedagogical content knowledge (PCK), and Nogueira & Moreira's (2012) framework for civic education teachers' knowledge together with the literature on teachers' professional development. This resulted in a comprehensive conceptual framework describing the personal and professional civic competences of civic education teachers. In addition, this framework is linked to the OctagonT model: a model that describes the processes through which teachers acquire these personal and professional civic competences. This model recognizes that teachers' civic socialization process is influenced by interconnected variables located at different levels in a multi-level structure: the individual teacher (e.g., civic beliefs), everyday contacts (e.g., teacher educators, the intended, informal, and hidden curriculum of the teacher training program, internship contacts, participation opportunities in teacher training, etc.), and the influence of the national educational context (e.g., the vision on civic education). Furthermore, implications for teacher education programs are described.

Keywords: civic education, civic competences, civic socialization, octagon model, teacher training

Procedia PDF Downloads 252
17761 Integrated Gas Turbine Performance Diagnostics and Condition Monitoring Using Adaptive GPA

Authors: Yi-Guang Li, Suresh Sampath

Abstract:

Gas turbine performance degrades over time, and the degradation is greatly affected by environmental, ambient, and operating conditions. The engines may degrade slowly under favorable conditions and waste engine life if a scheduled maintenance scheme is followed. They may also degrade quickly and fail before a scheduled overhaul if the conditions are unfavorable, resulting in serious secondary damage, loss of engine availability, and increased maintenance costs. To overcome these problems, gas turbine owners are gradually moving from scheduled maintenance to condition-based maintenance, where condition monitoring is one of the key supporting technologies. This paper presents an integrated adaptive GPA diagnostics and performance monitoring system developed at Cranfield University for gas turbine gas path condition monitoring. It has the capability to predict the performance degradation of the major gas path components of gas turbine engines, such as compressors, combustors, and turbines, using gas path measurement data. It is also able to predict key engine performance parameters for condition monitoring, such as the turbine entry temperature, which cannot be directly measured. The developed technology has been implemented in the digital twin computer software Pythia to support the condition monitoring of gas turbine engines. The capabilities of the integrated GPA condition monitoring system are demonstrated in three test cases using a model gas turbine engine similar to the GE aero-derivative LM2500 engine widely used in power generation and marine propulsion. The cases show that when the compressor of the model engine degrades, the Adaptive GPA is able to predict the degradation and the changing engine performance accurately using gas path measurements. The presented technology and software are generic, can be applied to different types of gas turbine engines, and provide crucial engine health and performance parameters to support condition monitoring and condition-based maintenance.
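
The core idea of gas path analysis, inferring component health changes from measurable gas path deviations, can be sketched in its simplest linear form as below. The influence coefficients and measurement deltas are hypothetical illustrations and are not taken from Pythia or the LM2500-like model engine.

```python
import numpy as np

# Health parameters: [compressor flow capacity %, compressor efficiency %]
# Measurements:      [spool speed %, compressor delivery pressure %, fuel flow %, EGT %]
H = np.array([[ 0.60, -0.30],      # influence coefficient matrix d(measurement)/d(health)
              [ 0.90, -0.50],
              [-0.20,  0.80],
              [-0.10,  0.70]])

z = np.array([-1.1, -1.8, 1.3, 1.2])             # measured deviations from the clean baseline [%]

# Least-squares estimate of the health-parameter deltas that best explain the deviations.
x_hat, *_ = np.linalg.lstsq(H, z, rcond=None)
print(f"estimated flow capacity change: {x_hat[0]:+.2f} %")
print(f"estimated efficiency change:    {x_hat[1]:+.2f} %")
```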

Keywords: gas turbine, adaptive GPA, performance, diagnostics, condition monitoring

Procedia PDF Downloads 65
17760 Cluster Analysis and Benchmarking for Performance Optimization of a Pyrochlore Processing Unit

Authors: Ana C. R. P. Ferreira, Adriano H. P. Pereira

Abstract:

Given the frequent variation of mineral properties throughout the Araxá pyrochlore deposit, an operation with highly variable quality and performance is expected, even if good homogenization work has been carried out before feeding the processing plants. These results could be improved and standardized if the blend composition parameters that most influence the processing route were determined and the types of raw materials then grouped by them, finally providing a reference of operational settings for each group. Associating the physical and chemical parameters of a unit operation with a benchmark, or even with an optimal reference of metallurgical recovery and product quality, results in reduced production costs, optimization of the mineral resource, and greater stability in the subsequent processes of the production chain that use the mineral of interest. Conducting a comprehensive exploratory data analysis to identify which characteristics of the ore are most relevant to the process route, combined with the use of machine learning algorithms for grouping the raw material (ore) and associating these groups with reference variables in the process benchmark, is a reasonable alternative for the standardization and improvement of mineral processing units. Clustering methods based on Decision Trees and K-Means were employed, together with algorithms based on benchmarking theory and criteria defined by the process team, in order to reference the best adjustments for processing the ore piles of each cluster. A clean user interface was created to present the outputs of the developed algorithm. The results were measured through the average time of adjustment and stabilization of the process after a new pile of homogenized ore enters the plant, as well as the average time needed to achieve the best processing result. Direct gains in the metallurgical recovery of the process were also measured. The results were promising, with a reduction in the adjustment and stabilization time when starting to process a new ore pile, as well as achievement of the benchmark. Also noteworthy are the gains in metallurgical recovery, which reflect a significant saving in ore consumption and a consequent reduction in production costs, hence a more rational use of the tailings dams and optimization of the life of the mineral deposit.
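
The grouping-plus-benchmark lookup described above can be sketched as below with scikit-learn. The assay values, cluster count, and benchmark settings are hypothetical placeholders, not data from the Araxá operation.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)
# Synthetic assays for 60 homogenized ore piles, e.g. Nb2O5, Fe, P2O5 [%].
assays = rng.normal(loc=[2.5, 12.0, 8.0], scale=[0.4, 2.0, 1.5], size=(60, 3))

scaler = StandardScaler().fit(assays)
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(scaler.transform(assays))

# Hypothetical benchmark table: best-known settings per cluster (e.g. collector
# dosage [g/t] and pulp pH) derived from historical top-performing campaigns.
benchmark = {0: {"collector_g_per_t": 450, "pH": 10.2},
             1: {"collector_g_per_t": 520, "pH": 10.5},
             2: {"collector_g_per_t": 400, "pH": 9.8}}

# A new pile is assigned to a cluster and inherits that cluster's reference settings.
new_pile = scaler.transform([[2.7, 13.5, 7.2]])
cluster = int(km.predict(new_pile)[0])
print(f"new pile assigned to cluster {cluster}: settings = {benchmark[cluster]}")
```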

Keywords: mineral clustering, machine learning, process optimization, pyrochlore processing

Procedia PDF Downloads 131
17759 Preparation Non-Woven Nanofiber Structures for Uniform and Rapid Drug Releasing Applications Using an Electrospinning Process

Authors: Cho-Liang Chung

Abstract:

Uniform and rapid drug release are important for trauma dressing applications. A low glass transition temperature polymer system and non-woven nanofiber structures were chosen as designs that provide rapid-release characteristics. In this study, polyvinylpyrrolidone, polysulfone, and polystyrene were dissolved in dimethylformamide to form precursor solutions. These solutions were blended with vitamin C to form the electrospinning solutions. The non-woven nanofiber structures were successfully prepared using an electrospinning process. The following instruments were used to analyze the characteristics of the non-woven nanofiber structures: atomic force microscopy (AFM), field emission scanning electron microscopy (FE-SEM), and X-ray diffraction (XRD). AFM was used to scan the nanofibers, with 3D graphics applied to explore their surface morphology. FE-SEM was used to explore the morphology of the non-woven structures. XRD was used to identify crystal structures in the non-woven structures. The morphology of the non-woven structures changed dramatically over different durations because of moisture absorption and the decreasing glass transition temperature; the non-woven nanofiber structures can therefore be applied to uniform and rapid drug release for trauma dressing applications.

Keywords: nanofibers, non-woven, electrospinning process, rapid drug releasing

Procedia PDF Downloads 127
17758 Synthesis and Characterization of PVDF, FG, PTFE, and PES Membrane Distillation Modified with Silver Nanoparticles

Authors: Lopez J., Mehrvar M., Quinones E., Suarez A., Romero C.

Abstract:

Silver nanoparticles (AgNP) are used to deliver heat to the surface of membrane distillation membranes in order to counteract thermal polarization and improve the desalination process. In this study, AgNP were deposited by a dip coating process onto commercial PVDF, hydrophilic FG, and hydrophobic PTFE membranes as substrates. The membranes were characterized by SEM, EDS, contact angle, and pore size distribution, and the heat delivery performance was measured using a UV lamp and a thermal camera. The presence of AgNP of 50-150 nm and the increase in energy absorption over the membrane were verified.

Keywords: silver nanoparticles, membrane distillation, plasmon effect, heat delivery

Procedia PDF Downloads 108
17757 Electrochemical Regeneration of GIC Adsorbent in a Continuous Electrochemical Reactor

Authors: S. N. Hussain, H. M. A. Asghar, H. Sattar, E. P. L. Roberts

Abstract:

Arvia™ introduced a novel technology consisting of adsorption followed by electrochemical regeneration with a graphite intercalation compound adsorbent, taking place in a single unit. The adsorbed species may lead to the formation of intermediate by-products due to incomplete mineralization during electrochemical regeneration. Therefore, the investigation of breakdown products due to incomplete oxidation is of great concern regarding the commercial applications of this process. In the present paper, the formation of chlorinated breakdown products during the continuous process of adsorption and electrochemical regeneration based on a graphite intercalation compound adsorbent has been investigated.

Keywords: GIC, adsorption, electrochemical regeneration, chlorophenols

Procedia PDF Downloads 293