Search results for: integrated definition for process description capture (IDEF3) method
33194 Non-Parametric Changepoint Approximation for Road Devices
Authors: Loïc Warscotte, Jehan Boreux
Abstract:
The scientific literature on changepoint detection is vast. Today, many methods are available to detect abrupt changes or slight drift in a signal, based, for example, on CUSUM or EWMA charts. However, these methods rely on strong assumptions, such as stationarity of the underlying stochastic process, or even independent, Gaussian-distributed noise at each time step. Recently, breakthrough research on locally stationary processes has widened the class of studied stochastic processes, with almost no assumptions on the signals or the nature of the changepoint. Despite its accurate mathematical description, this methodology quickly suffers from impractical time and space complexity for signals with high-rate data collection, if the characteristics of the process are completely unknown. In this paper, we therefore address the problem of making this theory usable for our purpose, which is monitoring a high-speed weigh-in-motion (HS-WIM) system towards direct enforcement without supervision. To this end, we first compute bounded approximations of the initial detection theory. Secondly, these approximating bounds are empirically validated by generating many independent long-run stochastic processes; both abrupt changes and drift are tested. Finally, this relaxed methodology is tested on real signals coming from a HS-WIM device in Belgium, collected over several months.
Keywords: changepoint, weigh-in-motion, process, non-parametric
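The abstract contrasts its approach with classical CUSUM/EWMA charts. As a point of reference only, a minimal two-sided CUSUM detector is sketched below; the reference mean, slack k, and threshold h are illustrative choices, not values from the paper.

```python
import numpy as np

def cusum(signal, target_mean, k=0.5, h=5.0):
    """Two-sided CUSUM: return the index of the first alarm, or None."""
    s_hi = s_lo = 0.0
    for i, x in enumerate(signal):
        s_hi = max(0.0, s_hi + (x - target_mean) - k)  # accumulates upward shifts
        s_lo = max(0.0, s_lo + (target_mean - x) - k)  # accumulates downward shifts
        if s_hi > h or s_lo > h:
            return i
    return None

rng = np.random.default_rng(0)
# Synthetic long-run signal with an abrupt mean shift at t = 500.
signal = np.concatenate([rng.normal(0, 1, 500), rng.normal(1.5, 1, 500)])
print(cusum(signal, target_mean=0.0))  # alarms shortly after the change
```

Note that this baseline presumes a known in-control mean and i.i.d. Gaussian noise, which are exactly the assumptions the locally stationary framework drops.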
Procedia PDF Downloads 78

33193 Arabic Lexicon Learning to Analyze Sentiment in Microblogs
Authors: Mahmoud B. Rokaya
Abstract:
The study of opinion mining and sentiment analysis covers the analysis of opinions, sentiments, evaluations, attitudes, and emotions. The rapid growth of social media, social networks, reviews, forum discussions, microblogs, and Twitter has led to a parallel growth in the field of sentiment analysis, which seeks to develop effective tools for capturing the trends of people. There are two approaches in the field: lexicon-based and corpus-based methods. A lexicon-based method uses a sentiment lexicon that includes sentiment words and phrases with assigned numeric scores. These scores reveal whether sentiment phrases are positive or negative, their intensity, and/or their emotional orientation. Creating lexicons manually is hard, which brings the need for adaptive automated methods for generating a lexicon. The proposed method generates dynamic lexicons based on the corpus and then classifies text using these lexicons. In the proposed method, different approaches are combined to generate lexicons from text, and tweets are classified into five classes instead of only positive or negative ones. The sentiment classification problem is written as an optimization problem whose goal is to find optimal sentiment lexicons. The solution was produced using mathematical programming approaches to find the best lexicon for classifying texts; a genetic algorithm was written to find the optimal lexicon. Then, a meta-level feature was extracted based on the optimal lexicon. The experiments were conducted on several datasets. Results, in terms of accuracy, recall and F-measure, outperformed the state-of-the-art methods proposed in the literature on some of the datasets. Based on the sentiment lexicons produced by the algorithm, a better understanding can be achieved of the Arabic language, the culture of Arab Twitter users, and the sentiment orientation of words in different contexts.
Keywords: social media, Twitter sentiment, sentiment analysis, lexicon, genetic algorithm, evolutionary computation
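The genetic-algorithm search for a lexicon can be illustrated with a toy sketch. It uses binary labels and a four-document corpus instead of the paper's five classes and real tweets; every name, score range, and parameter here is hypothetical.

```python
import random

# Toy corpus: (tokens, label) with labels in {-1, +1}.
corpus = [
    (["good", "great", "service"], 1),
    (["bad", "awful", "slow"], -1),
    (["great", "fast"], 1),
    (["awful", "service"], -1),
]
vocab = sorted({w for doc, _ in corpus for w in doc})

def classify(lexicon, doc):
    return 1 if sum(lexicon[w] for w in doc) >= 0 else -1

def fitness(lexicon):
    # Classification accuracy of the lexicon over the corpus.
    return sum(classify(lexicon, d) == y for d, y in corpus) / len(corpus)

def mutate(lex, rate=0.2):
    return {w: (s + random.gauss(0, 0.5) if random.random() < rate else s)
            for w, s in lex.items()}

def crossover(a, b):
    return {w: random.choice((a[w], b[w])) for w in vocab}

random.seed(1)
pop = [{w: random.uniform(-1, 1) for w in vocab} for _ in range(30)]
for _ in range(50):
    pop.sort(key=fitness, reverse=True)
    elite = pop[:10]  # keep the best lexicons, breed the rest
    pop = elite + [mutate(crossover(random.choice(elite), random.choice(elite)))
                   for _ in range(20)]

best = max(pop, key=fitness)
print(fitness(best), {w: round(s, 2) for w, s in best.items()})
```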
Procedia PDF Downloads 188

33192 Integrated Power Saving for Multiple Relays and UEs in LTE-TDD
Authors: Chun-Chuan Yang, Jeng-Yueng Chen, Yi-Ting Mai, Chen-Ming Yang
Abstract:
In this paper, the design of integrated sleep scheduling for relay nodes and user equipments under a Donor eNB (DeNB) in the mode of Time Division Duplex (TDD) in LTE-A is presented. The idea of virtual time is proposed to deal with the discontinuous pattern of the available radio resource in TDD, and based on the estimation of the traffic load, three power saving schemes in the top-down strategy are presented. Associated mechanisms in each scheme including calculation of the virtual subframe capacity, the algorithm of integrated sleep scheduling, and the mapping mechanisms for the backhaul link and the access link are presented in the paper. Simulation study shows the advantage of the proposed schemes in energy saving over the standard DRX scheme.
Keywords: LTE-A, relay, TDD, power saving
Procedia PDF Downloads 516

33191 A Relational View for Financial Metrics in Logistics Service Providers
Authors: Paulo Sergio Altman Ferreira
Abstract:
Relationship development plays an essential role in every logistics company. Logistics companies are service-based businesses, essentially performing material flow, warehousing, and inventory management for a wide range of customers. The service encounter between the logistics provider's personnel and the customers may form a connection that has a strong impact not only on the customers' overall satisfaction but also on their perception of receiving individualized services. Logistics services must drive value, and they closely influence the quality and costs of client-centered services. If we describe logistics value creation as the client's perception of quality divided by service costs, there is a requirement to better outline and explain the measures and analytics for logistics costs and relationship performance. This critical shift in understanding logistics services is a relevant contribution to capturing how relationship value can be quantified. It might involve moving our current perspective on logistics providers beyond uniquely measuring services in terms of activities, personnel levels, and financial/cost ratios. This paper argues that measuring the value creation accomplishments of logistics services needs to consider relational improvements across the wider range of logistics companies. Accurate logistics value requires a description of the financial impact of the relational perspective of the service.
Keywords: logistics services providers, financial metrics, relationship management, value creation
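The value definition quoted in the abstract (perceived quality divided by service cost) reduces to a one-line computation; the scores below are invented purely to show how the ratio behaves when a relational improvement raises perceived quality faster than cost.

```python
def logistics_value(quality_perception, service_cost):
    # Value creation as defined in the abstract: perceived quality over cost.
    return quality_perception / service_cost

print(logistics_value(8.0, 100.0))   # baseline relationship: 0.080
print(logistics_value(9.5, 105.0))   # after relational improvements: ~0.090
```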
Procedia PDF Downloads 150

33190 A Gamification Teaching Method for Software Measurement Process
Authors: Lennon Furtado, Sandro Oliveira
Abstract:
The importance of an effective measurement program lies in the ability to control and predict what can be measured. A measurement program can thus provide a basis for decision-making in support of the interests of an organization. However, an effective measurement program can only be applied by a team of software engineers well trained in the measurement area, and the literature indicates that there are few computer science courses that include the teaching of the software measurement process in their programs. Even these generally present only basic theoretical concepts of the process, with little or no measurement practice, which results in the students' lack of motivation to learn the measurement process. In this context, according to some experts in software process improvement, one of the most common approaches to maintaining motivation and commitment to a software process improvement program is the use of gamification. Therefore, this paper presents a proposal for teaching the measurement process through gamification, which seeks to improve student motivation and performance in assimilating tasks related to software measurement by incorporating game elements into the practice of the measurement process, making it more attractive for learning. As a way of validating the proposal, a comparison will be made between two distinct groups of 20 students of a Software Quality class: a control group of students who will not use the gamification proposal to learn the software measurement process, and an experiment group of students who will. This paper will analyze the objective and subjective results of each group: as the objective result, the student grades reached at the end of the course, and as the subjective result, a post-course questionnaire with the opinion of each student about the teaching method. Finally, this paper aims to prove or refute the following hypothesis: the gamification proposal for teaching the software measurement process appropriately motivates students and imparts the competence necessary for the practical application of the measurement process.
Keywords: education, gamification, software measurement process, software engineering
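The planned comparison of the two groups' final grades is a standard two-sample test; a sketch of how the objective result might be evaluated, with entirely simulated grades (the study's real data are not given):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Hypothetical final grades for the two groups of 20 students each.
control = rng.normal(70, 8, 20)      # taught without gamification
experiment = rng.normal(76, 8, 20)   # taught with the gamified method

# Welch's t-test: does the experiment group score significantly higher?
t, p = stats.ttest_ind(experiment, control, equal_var=False)
print(f"t = {t:.2f}, p = {p:.4f}")   # p < 0.05 would support the hypothesis
```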
Procedia PDF Downloads 314

33189 Structural Health Monitoring-Integrated Structural Reliability Based Decision Making
Authors: Caglayan Hizal, Kutay Yuceturk, Ertugrul Turker Uzun, Hasan Ceylan, Engin Aktas, Gursoy Turan
Abstract:
Monitoring concepts for structural systems have been investigated by researchers for decades, since such tools are quite convenient for determining intervention planning of structures. Despite considerable development in this regard, the use of monitoring data in reliability assessment and prediction models still needs to become more efficient. More specifically, reliability-based seismic risk assessment of engineering structures may play a crucial role in the post-earthquake decision-making process. After an earthquake, professionals can identify heavily damaged structures based on visual observations; among the others, it is hard to identify the ones with minimal signs of damage, even if they have experienced considerable structural degradation. Besides, visual observations are open to human interpretation, which makes the decision process controversial and thus less reliable. In this context, when a continuous monitoring system has been previously installed on the corresponding structure, this decision process can be completed rapidly and with higher confidence by means of the observed data. At this stage, the Structural Health Monitoring (SHM) procedure has an important role, since it makes it possible to estimate the system reliability based on a recursively updated mathematical model. Therefore, integrating an SHM procedure into the reliability assessment process comes forward as an important challenge, due to the uncertainties arising in the updated model in case of environmental, material, and earthquake-induced changes. In this context, this study presents a case study on SHM-integrated reliability assessment of continuously monitored, progressively damaged systems. The objective of this study is to get instant feedback on the current state of the structure after an extreme event, such as an earthquake, by using the observed data rather than visual inspections, so that the decision-making process after such an event can be carried out on a rational basis. In the near future, this can give wing to the design of self-reporting structures that can warn about their current situation after an extreme event.
Keywords: condition assessment, vibration-based SHM, reliability analysis, seismic risk assessment
Procedia PDF Downloads 143

33188 Measuring Social Dimension of Sustainable Development in New Zealand Cities
Authors: Taimaz Larimian
Abstract:
During recent years, sustainable development has increasingly influenced urban policy, housing, and planning in cities all over the world. Debates about sustainability no longer consider it solely an environmental concern but also incorporate social and economic dimensions. However, while the social dimension of sustainability is widely accepted, the exact definition of the concept is still vague. This study addresses this lack of specificity through a detailed exploration of social sustainability as the least studied pillar of sustainable development, and sheds light on the debate over its definition by developing a measurement model of the constitutive dimensions of the concept. With this aim, a conceptual framework is developed based on the existing literature, determining seven main dimensions of the social sustainability concept, namely: social interaction, safety and security, social equity, social participation, neighborhood satisfaction, housing satisfaction, and sense of place. The validity and reliability of the model are then tested using exploratory and confirmatory factor analysis. To do so, five case study neighborhoods from Dunedin city, with a range of urban forms and characters, are investigated in order to define the social sustainability concept and its constituent dimensions from people's perspective. The findings of this study present a clear definition of social sustainability at the neighborhood scale and highlight the different dimensions of the concept in the context of New Zealand cities. According to the results, among the investigated dimensions, neighborhood satisfaction and safety and security had the most influence on people's feeling of social sustainability in their neighborhood.
Keywords: social sustainability, factor analysis, neighborhood level, New Zealand cities
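An exploratory-factor-analysis step of the kind described can be sketched as follows. The survey matrix is simulated so that seven latent dimensions drive 21 Likert items; this stands in for the real Dunedin questionnaire data, which are not reproduced here.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
# Hypothetical survey: 200 respondents x 21 items, 3 items per dimension.
n, n_factors, items_per = 200, 7, 3
latent = rng.normal(size=(n, n_factors))
loadings = np.kron(np.eye(n_factors), np.ones((1, items_per)))  # block structure
X = latent @ loadings + 0.5 * rng.normal(size=(n, n_factors * items_per))

# Exploratory step: do seven rotated factors recover the item groupings?
fa = FactorAnalysis(n_components=n_factors, rotation="varimax")
fa.fit(X)
print(np.round(fa.components_, 1))  # each row should load on one 3-item block
```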
Procedia PDF Downloads 300

33187 Face Sketch Recognition in Forensic Application Using Scale Invariant Feature Transform and Multiscale Local Binary Patterns Fusion
Authors: Gargi Phadke, Mugdha Joshi, Shamal Salunkhe
Abstract:
Facial sketches are used as a crucial clue by criminal investigators for identifying suspects when the description of an eyewitness or victim is the only available evidence. A forensic artist develops a sketch, as per the verbal description given by an eyewitness, that shows the facial look of the culprit. In this paper, the fusion of the Scale Invariant Feature Transform (SIFT) and multiscale local binary patterns (MLBP) is proposed as a feature for recognizing forensic face sketch images in a gallery of mugshot photos. This work focuses on a comparative analysis of the proposed scheme with existing algorithms under different challenges, such as illumination change and rotation. Experimental results show that the proposed scheme can lead to better performance for the defined problem.
Keywords: SIFT feature, MLBP, PCA, face sketch
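One plausible reading of the SIFT + MLBP fusion is feature concatenation. The sketch below uses OpenCV's SIFT and scikit-image's uniform LBP at several radii; the pooling choice (mean of SIFT descriptors) is an assumption made here for brevity, not a detail taken from the paper.

```python
import cv2
import numpy as np
from skimage.feature import local_binary_pattern

def mlbp_histogram(gray, radii=(1, 2, 3)):
    """Multiscale LBP: concatenate uniform-LBP histograms over several radii."""
    feats = []
    for r in radii:
        p = 8 * r
        lbp = local_binary_pattern(gray, P=p, R=r, method="uniform")
        hist, _ = np.histogram(lbp, bins=p + 2, range=(0, p + 2), density=True)
        feats.append(hist)
    return np.concatenate(feats)

def fused_features(path):
    # path must point to a readable grayscale-convertible image.
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    sift = cv2.SIFT_create()
    _, desc = sift.detectAndCompute(gray, None)
    sift_vec = desc.mean(axis=0) if desc is not None else np.zeros(128)
    return np.concatenate([sift_vec, mlbp_histogram(gray)])

# A sketch could then be matched against mugshots by, e.g., cosine similarity
# between fused feature vectors.
```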
Procedia PDF Downloads 336

33186 Solving Weighted Number of Operation Plus Processing Time Due-Date Assignment, Weighted Scheduling and Process Planning Integration Problem Using Genetic and Simulated Annealing Search Methods
Authors: Halil Ibrahim Demir, Caner Erden, Mumtaz Ipek, Ozer Uygun
Abstract:
Traditionally, the three important manufacturing functions of process planning, scheduling, and due-date assignment are performed separately and sequentially. Over the past couple of decades, hundreds of studies have been done on integrated process planning and scheduling problems, and numerous studies have been performed on scheduling with due-date assignment, but unfortunately the integration of these three important functions has not been adequately addressed. Here, the integration of these three functions is studied using genetic, random-genetic hybrid, simulated annealing, random-simulated annealing hybrid, and random search techniques. As well, the importance of integrating these three functions and the power of meta-heuristics and hybrid heuristics are studied.
Keywords: process planning, weighted scheduling, weighted due-date assignment, genetic search, simulated annealing, hybrid meta-heuristics
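Of the search techniques listed, simulated annealing is the simplest to sketch. Below is a toy single-machine version: due dates are assigned as a multiple of processing time (a simplified stand-in for the paper's "weighted number of operations plus processing time" rule), and the annealer permutes the job order to reduce weighted tardiness. All job data and cooling parameters are hypothetical.

```python
import math
import random

# Hypothetical jobs: (processing_time, weight); due date assigned as K * processing_time.
jobs = [(4, 2), (3, 1), (7, 3), (2, 5), (5, 2)]
K = 2.0

def cost(order):
    t, total = 0, 0.0
    for j in order:
        p, w = jobs[j]
        t += p
        total += w * max(0, t - K * p)   # weighted tardiness under assigned due dates
    return total

random.seed(0)
order = list(range(len(jobs)))
best, best_cost, temp = order[:], cost(order), 10.0
while temp > 0.01:
    i, j = random.sample(range(len(order)), 2)
    cand = order[:]
    cand[i], cand[j] = cand[j], cand[i]  # neighbor: swap two jobs
    delta = cost(cand) - cost(order)
    if delta < 0 or random.random() < math.exp(-delta / temp):
        order = cand                      # accept improvements, sometimes worse moves
        if cost(order) < best_cost:
            best, best_cost = order[:], cost(order)
    temp *= 0.995                         # geometric cooling
print(best, best_cost)
```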
Procedia PDF Downloads 469

33185 Increasing the Efficiency of the Biomass Gasification Technology by Using the Organic Rankine Cycle
Authors: Jaroslav Frantík, Jan Najser
Abstract:
This article deals with increasing the energy efficiency of a plant by optimizing the process. The European Union is striving to achieve the climate-energy package in the area of increasing energy efficiency; the goal is to reduce primary energy consumption within the EU by 20% until 2020. The objective for saving energy consumption in the Czech Republic was set at 47.84 PJ (13.29 TWh). For reducing electricity consumption, it is possible to choose: a) mandatory increasing of energy efficiency, b) an alternative scheme, c) a combination of both actions. The Czech Republic has chosen the alternative scheme for reducing electricity consumption. The presentation is focused on the proposal of a technological unit dealing with the gasification of biomass, with an increase of output power. The synthesis gas obtained after gasification of biomass is used as fuel in a cogeneration process in a reciprocating internal combustion engine with classic production of heat and electricity. Subsequently, the ORC system is explained, which deals with the conversion of waste heat to electricity using a closed steam cycle with an organic medium. The electricity produced is distributed to the power grid as a further energy source, or it is used to partially cover the needs of the technological unit. Furthermore, a schematic description of the technology is presented, with the identification of energy flows, starting from biomass treatment by drying, through its conversion to gaseous fuel, to the production of electricity and the utilization of thermal energy with minimized losses. It has been found that using the ORC system increased the efficiency of the produced electricity by 7.5%.
Keywords: biomass, efficiency, gasification, ORC system
Procedia PDF Downloads 217

33184 The Implementation of Secton Method for Finding the Root of Interpolation Function
Authors: Nur Rokhman
Abstract:
A mathematical function gives a relationship between the variables composing the function. Interpolation can be viewed as a process of finding a mathematical function which goes through some specified points. There are many interpolation methods, namely: the Lagrange method, the Newton method, the Spline method, etc. Under some specific conditions, such as a big number of interpolation points, the interpolation function cannot be written explicitly; such a function consists of computational steps. Solving equations involving the interpolation function is therefore a problem of solving a non-linear equation. The Newton method will not work on the interpolation function, for the derivative of the interpolation function cannot be written explicitly. This paper shows the use of the Secton method to determine the numerical solution of equations involving the interpolation function. The experiment shows that the Secton method works better than the Newton method in finding the root of a Lagrange interpolation function.
Keywords: Secton method, interpolation, non linear function, numerical solution
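The core point, that a secant-type iteration needs only function evaluations while Newton needs a derivative the interpolant cannot supply in closed form, can be shown directly. The data points below are arbitrary; the routine treats the Lagrange interpolant purely as a black-box function.

```python
def lagrange(xs, ys, x):
    """Evaluate the Lagrange interpolation polynomial through (xs, ys) at x."""
    total = 0.0
    for i, (xi, yi) in enumerate(zip(xs, ys)):
        term = yi
        for j, xj in enumerate(xs):
            if j != i:
                term *= (x - xj) / (xi - xj)
        total += term
    return total

def secant(f, x0, x1, tol=1e-10, max_iter=100):
    """Secant iteration: needs only f-evaluations, no derivative."""
    for _ in range(max_iter):
        f0, f1 = f(x0), f(x1)
        if f1 - f0 == 0:
            break
        x0, x1 = x1, x1 - f1 * (x1 - x0) / (f1 - f0)
        if abs(x1 - x0) < tol:
            break
    return x1

xs, ys = [0.0, 1.0, 2.0, 3.0], [-1.0, 0.5, 1.2, 4.0]  # hypothetical data points
f = lambda x: lagrange(xs, ys, x)
print(secant(f, 0.0, 1.0))  # root of the interpolant between the first two nodes
```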
Procedia PDF Downloads 379

33183 Construction Technology of Modified Vacuum Pre-Loading Method for Slurry Dredged Soil
Authors: Ali H. Mahfouz, Gao Ming-Jun, Mohamad Sharif
Abstract:
Slurry dredged soil in coastal areas has a high water content, poor permeability, and low surface strength. Hence, it is infeasible to use the standard vacuum preloading method to treat this type of soil foundation. For this special case of super soft ground, a floating bridge is first constructed on the muddy soil and used as a service road and platform for implementing the modified vacuum preloading method. The modified technique of vacuum preloading and its construction process for super soft soil foundation improvement are then studied. Application of the modified vacuum preloading method shows that the technology and its construction process are highly suitable for improving super soft soil foundations in coastal areas.
Keywords: super soft foundation, dredger fill, vacuum preloading, foundation treatment, construction technology
Procedia PDF Downloads 609

33182 Performance Improvement of Information System of a Banking System Based on Integrated Resilience Engineering Design
Authors: S. H. Iranmanesh, L. Aliabadi, A. Mollajan
Abstract:
Integrated resilience engineering (IRE) is capable of returning banking systems to the normal state in extensive economic circumstances. In this study, the information system of a large bank (with several branches) is assessed and optimized under severe economic conditions. Data envelopment analysis (DEA) models are employed to achieve the objective of this study. Nine IRE factors are considered to be the outputs, and a dummy variable is defined as the input of the DEA models. A standard questionnaire is designed and distributed among executive managers, who are considered to be the decision-making units (DMUs). Reliability and validity of the questionnaire are examined based on Cronbach's alpha and the t-test. The most appropriate DEA model is determined based on average efficiency and a normality test. It is shown that the proposed integrated design provides higher efficiency than the conventional RE design. Results of the sensitivity and perturbation analysis indicate that self-organization, fault tolerance, and reporting culture together compose about 50 percent of the total weight.
Keywords: banking system, Data Envelopment Analysis (DEA), Integrated Resilience Engineering (IRE), performance evaluation, perturbation analysis
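A DEA model with nine outputs and a single dummy input can be written as the input-oriented CCR multiplier linear program. The sketch below solves it per DMU with scipy on invented questionnaire scores; the paper's data and its specific DEA variant are not reproduced here.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(3)
Y = rng.uniform(1, 5, size=(5, 9))   # 5 DMUs x 9 IRE factor scores (outputs)
x = np.ones(5)                       # single dummy input, fixed at 1

def ccr_efficiency(k):
    """Input-oriented CCR multiplier model for DMU k."""
    n, m = Y.shape
    # Variables z = [u (9 output weights), v (1 input weight)]; maximize u.Y[k].
    c = np.concatenate([-Y[k], [0.0]])               # linprog minimizes, so negate
    A_ub = np.hstack([Y, -x[:, None]])               # u.Y[j] - v*x[j] <= 0 for all j
    b_ub = np.zeros(n)
    A_eq = np.zeros((1, m + 1)); A_eq[0, m] = x[k]   # normalization: v*x[k] = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  A_eq=A_eq, b_eq=[1.0], bounds=(0, None))
    return -res.fun                                  # efficiency in (0, 1]

for k in range(5):
    print(f"DMU {k}: efficiency = {ccr_efficiency(k):.3f}")
```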
Procedia PDF Downloads 188

33181 Investigation of Light Transmission Characteristics and CO2 Capture Potential of Microalgae Panel Bioreactors for Building Façade Applications
Authors: E. S. Umdu, Ilker Kahraman, Nurdan Yildirim, Levent Bilir
Abstract:
Algae culture offers new applications in sustainable architecture, with its continuous productive cycle and a potential for high carbon dioxide capture. Microalgae itself has multiple functions, such as carbon dioxide fixation, biomass production, oxygen generation, and wastewater treatment. Incorporating microalgae cultivation processes and systems into building design to utilize this potential is promising. Microalgae cultivation systems, especially closed photobioreactors, can be implemented as components in buildings, and these systems can be accommodated in the façade of a building or in other urban infrastructure in the future. Applying microalgae bioreactors on a building's façade has the added benefit of acting as an effective insulation system, keeping out the heat of the summer and the chill of the winter. Furthermore, microalgae can give a dynamic appearance with a liquid façade that also works as an adaptive sunshade. Recently, the potential of microalgae as a building component for reducing the net energy demand of buildings has become a popular topic, and innovative design proposals and a handful of pilot applications have appeared. Yet there are only a handful of examples in application, and even less information on how these systems affect building energy behavior. Further studies on microalgae have mostly focused on a single-application approach, targeting either carbon dioxide utilization through biomass production or biofuel production. The main objective of this study is to investigate the effects of design parameters of microalgae panel bioreactors on light transmission characteristics and CO2 capture potential during the growth of Nannochloropsis occulata sp. A maximum reduction of 18 ppm in the CO2 level of the input air, at a light transmission of 14.10%, was achieved in 6-day growth cycles. Heat transfer behavior during these cycles was also inspected for possible façade applications.
Keywords: building façade, CO2 capture, light transmittance, microalgae
Procedia PDF Downloads 190

33180 Effect of Electromagnetic Fields on Protein Extraction from Shrimp By-Products for Electrospinning Process
Authors: Guido Trautmann-Sáez, Mario Pérez-Won, Vilbett Briones, María José Bugueño, Gipsy Tabilo-Munizaga, Luis Gonzáles-Cavieres
Abstract:
Shrimp by-products are a valuable source of protein; however, traditional protein extraction methods have limitations in terms of efficiency. In this work, protein extraction from shrimp (Pleuroncodes monodon) industrial by-products was assisted with ohmic heating (OH), microwave (MW), and pulsed electric field (PEF) treatments. Extraction was performed by a chemical method (using NaOH and HCl 2M) assisted with OH, MW, and PEF in a continuous flow system (5 ml/s), followed by protein determination, differential scanning calorimetry (DSC), and Fourier-transform infrared (FTIR) spectroscopy. Results indicate improvements in protein extraction efficiency of 19.25% (PEF), 3.65% (OH), and 28.19% (MW). The most efficient method was selected for the electrospinning process and fiber production.
Keywords: electrospinning process, emerging technology, protein extraction, shrimp by-products
Procedia PDF Downloads 89

33179 Application of Semantic Technologies in Rapid Reconfiguration of Factory Systems
Authors: J. Zhang, K. Agyapong-Kodua
Abstract:
The digital factory, based on visual design and simulation, has emerged as a mainstream approach to reducing the digital development life cycle. Some basic industrial systems are being integrated via semantic modelling, and products matching product (P)-process (P)-resource (R) requirements are designed to fulfill current customer demands. Nevertheless, product design is still limited to fixed product models and the known knowledge of product engineers. Therefore, this paper presents a rapid reconfiguration method based on semantic technologies with PPR ontologies to reuse both known and unknown knowledge. In order to avoid the influence of big data, our system uses a cloud manufactory and a distributed database to improve the efficiency of querying for matches to PPR requirements.
Keywords: semantic technologies, factory system, digital factory, cloud manufactory
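PPR matching queries of this kind are commonly expressed as SPARQL over an ontology graph. The sketch below uses rdflib with a made-up namespace and triples, since the paper's actual ontology is not given.

```python
from rdflib import Graph, Namespace, RDF

PPR = Namespace("http://example.org/ppr#")  # hypothetical PPR ontology namespace
g = Graph()
g.add((PPR.Gear42, RDF.type, PPR.Product))
g.add((PPR.Gear42, PPR.requiresProcess, PPR.Milling))
g.add((PPR.Milling, PPR.requiresResource, PPR.CNCMill))
g.add((PPR.Cell7, RDF.type, PPR.Resource))
g.add((PPR.Cell7, PPR.provides, PPR.CNCMill))

# Find resources whose capabilities satisfy a product's process requirements.
q = """
PREFIX ppr: <http://example.org/ppr#>
SELECT ?product ?resource WHERE {
  ?product  ppr:requiresProcess  ?proc .
  ?proc     ppr:requiresResource ?cap .
  ?resource ppr:provides         ?cap .
}"""
for row in g.query(q):
    print(row.product, "->", row.resource)
```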
Procedia PDF Downloads 487

33178 Evaluation of Double Displacement Process via Gas Dumpflood from Multiple Gas Reservoirs
Authors: B. Rakjarit, S. Athichanagorn
Abstract:
The double displacement process is a method in which gas is injected at an updip well to displace the oil bypassed by the waterflooding operation from a downdip water injector. As gas injection is costly and a large amount of gas is needed, gas dump-flood from multiple gas reservoirs is an attractive alternative. The objective of this paper is to demonstrate the benefits of the novel approach of the double displacement process via gas dump-flood from multiple gas reservoirs. A reservoir simulation model consisting of a dipping oil reservoir and several underlying layered gas reservoirs was constructed in order to investigate the performance of the proposed method. Initially, water was injected via the downdip well to displace oil towards the producer located updip. When the water cut at the producer became high, the updip well was shut in and perforated in the gas zones in order to dump gas into the oil reservoir; at this point, the downdip well was opened for production. In order to optimize oil recovery, oil production and water injection rates and the perforation strategy for the gas reservoirs were investigated for different numbers of gas reservoirs having various depths and thicknesses. Gas dump-flood from multiple gas reservoirs can help increase the oil recovery after implementation of waterflooding by up to 10%. Although this additional oil recovery is slightly lower than the one obtained in the conventional double displacement process, the proposed process requires only a small completion cost for the gas zones and no operating cost, while the conventional method incurs high capital investment in gas compression facilities and a high-pressure gas pipeline, as well as additional operating cost. From the simulation study, oil recovery can be optimized by producing oil at a suitable rate and perforating the gas zones with the right strategy, which depends on the depths, thicknesses, and number of the gas reservoirs. The conventional double displacement process has been studied and successfully implemented in many fields around the world. However, the method of dumping gas into the oil reservoir instead of injecting it from the surface during the second displacement process has never been studied. The study of this novel approach will help a practicing engineer to understand its benefits and implement it with minimum cost.
Keywords: gas dump-flood, multi-gas layers, double displacement process, reservoir simulation
Procedia PDF Downloads 408

33177 Material Failure Process Simulation by Improved Finite Elements with Embedded Discontinuities
Authors: Gelacio Juárez-Luna, Gustavo Ayala, Jaime Retama-Velasco
Abstract:
This paper shows the advantages of simulating the material failure process with improved finite elements with embedded discontinuities, using a new definition of the traction vector that depends on the discontinuity length and angle. In particular, two families of this kind of element are compared: kinematically optimal symmetric, and statically and kinematically optimal non-symmetric. The constitutive model describing the behavior of the material in the symmetric formulation is a traction-displacement jump relationship equipped with softening after the failure surface is reached. To show the validity of this symmetric formulation, representative numerical examples illustrating the performance of the proposed formulation are presented. It is shown that the non-symmetric family may over- or underestimate the energy required to create a discontinuity, as this effect is related to the total length of the discontinuity, a fact that is not noticed when the discontinuity path is a straight line.
Keywords: variational formulation, strong discontinuity, embedded discontinuities, strain localization
Procedia PDF Downloads 781

33176 On the Bootstrap P-Value Method in Identifying out of Control Signals in Multivariate Control Chart
Authors: O. Ikpotokin
Abstract:
In any production process, every product is meant to attain a certain standard, but the presence of an assignable cause of variability affects the process, thereby leading to low product quality. The ability to identify and remove this type of variability reduces its overall effect, thereby improving the quality of the product. In the case of a univariate control chart signal, it is easy to detect the problem and give a solution, since it is related to a single quality characteristic. However, the problems involved in the use of a multivariate control chart are the violation of the multivariate normality assumption and the difficulty in identifying the quality characteristic(s) that resulted in the out-of-control signals. The purpose of this paper is to examine the use of a non-parametric control chart (the bootstrap approach) for obtaining control limits, to overcome the problem of the multivariate distributional assumption, together with the p-value method for detecting out-of-control signals. Results from a performance study show that the proposed bootstrap method enables the setting of control limits that enhance the detection of out-of-control signals, while the p-value method enhances the identification of out-of-control variables.
Keywords: bootstrap control limit, p-value method, out-of-control signals, p-value, quality characteristics
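The bootstrap control limit can be sketched as a percentile of a monitoring statistic computed over resampled in-control observations, which sidesteps the normality assumption. The gamma-distributed data, the chosen quantile, and the simple per-variable contribution ranking below are illustrative, not the paper's procedure.

```python
import numpy as np

rng = np.random.default_rng(7)
# Phase I: in-control multivariate reference data (deliberately non-Gaussian).
reference = rng.gamma(2.0, 1.0, size=(200, 3))
mean, cov_inv = reference.mean(0), np.linalg.inv(np.cov(reference.T))

def t2(x):
    d = x - mean
    return float(d @ cov_inv @ d)          # Hotelling-type distance statistic

# Bootstrap: resample reference observations with replacement to approximate
# the in-control distribution of the statistic, then take a high percentile.
B = 5000
boot = np.array([t2(reference[rng.integers(0, len(reference))]) for _ in range(B)])
ucl = np.quantile(boot, 0.995)             # non-parametric upper control limit

# Phase II: monitor a new observation and rank per-variable contributions.
new = rng.gamma(2.0, 1.0, 3) + np.array([0.0, 3.0, 0.0])  # shift in variable 2
if t2(new) > ucl:
    contrib = (new - mean) ** 2 * np.diag(cov_inv)
    print("out of control; largest contributor: variable", int(np.argmax(contrib)) + 1)
```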
Procedia PDF Downloads 347

33175 Rehabilitation of the Blind Using Sono-Visualization Tool
Authors: Ashwani Kumar
Abstract:
In human beings, the eyes play a vital role, yet very little research has been done on rehabilitation for blind people. This paper discusses work that helps blind people recognize the basic shapes of objects, such as circles, squares, triangles, horizontal lines, vertical lines, and diagonal lines, as well as waveforms such as sinusoidal, square, and triangular. This is largely achieved by using a digital camera, which captures the visual information present in front of the blind person, and a software program, which performs the image processing operations; finally, the processed image is converted into sound. After the sound generation process, the generated sound is fed to the blind person through headphones for visualizing an imaginary image of the object. For visualizing the imaginary image of the object, the blind person needs to be trained; various training methods have been applied for recognizing the object.
Keywords: image processing, pixel, pitch, loudness, sound generation, edge detection, brightness
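A common sonification convention consistent with the keywords (pitch for vertical position, loudness for brightness) can be sketched as follows; the frequency range and column scan rate are arbitrary choices, not values from the paper.

```python
import numpy as np

def image_to_tones(gray, sample_rate=44100, col_duration=0.05):
    """Scan columns left to right; pixels high in the frame map to high pitch,
    pixel intensity maps to loudness."""
    h, w = gray.shape
    freqs = np.geomspace(200.0, 4000.0, h)[::-1]   # top row -> highest pitch
    t = np.linspace(0, col_duration, int(sample_rate * col_duration), endpoint=False)
    audio = []
    for col in range(w):
        column = gray[:, col].astype(float) / 255.0
        # Sum one sinusoid per row, weighted by that pixel's brightness.
        tone = (column[:, None] * np.sin(2 * np.pi * freqs[:, None] * t)).sum(0)
        audio.append(tone / max(1.0, column.sum()))  # keep loudness bounded
    return np.concatenate(audio)

# e.g. sonify edges: gray = cv2.Canny(cv2.imread("shape.png", 0), 100, 200)
```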
Procedia PDF Downloads 388

33174 Developing Digital Twins of Steel Hull Processes
Authors: V. Ložar, N. Hadžić, T. Opetuk, R. Keser
Abstract:
The development of digital twins strongly depends on efficient algorithms and their capability to mirror real-life processes. Nowadays, such efforts are required to establish the factories of the future, which face new demands of custom-made production; ship hull processes face these challenges too. Therefore, it is important to implement design and evaluation approaches based on production system engineering. In this study, the recently developed finite state method is employed to describe the steel hull process as a platform for the implementation of digital twinning technology; the application is justified by comparing the finite state method with the analytical approach. The method is employed to rebuild a model of a real shipyard ship hull process using a combination of serial and splitting lines. Key performance indicators such as the production rate, work in process, and probabilities of starvation and blockade are calculated and compared to the corresponding results obtained through a simulation approach using the software tool Enterprise Dynamics. This study confirms that the finite state method is a suitable tool for digital twinning applications. The conclusion highlights the advantages and disadvantages of the methods employed in this context.
Keywords: digital twin, finite state method, production system engineering, shipyard
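The KPIs named (production rate, work in process, starvation, blockade) can be illustrated on a two-machine serial line. The simulation below uses a Bernoulli machine model as a stand-in, since the finite state method itself is not detailed in the abstract; all reliability and buffer parameters are hypothetical.

```python
import random

def simulate_line(p1, p2, buffer_cap, steps=200_000, seed=0):
    """Two-machine Bernoulli serial line: estimate production rate, average WIP,
    starvation probability of M2, and blockage probability of M1."""
    random.seed(seed)
    buf = produced = starved = blocked = wip_sum = 0
    for _ in range(steps):
        m1_up, m2_up = random.random() < p1, random.random() < p2
        # Machine 2 first: it starves if the buffer is empty.
        if m2_up and buf > 0:
            buf -= 1
            produced += 1
        elif m2_up:
            starved += 1
        # Machine 1: it is blocked if the buffer is full.
        if m1_up and buf < buffer_cap:
            buf += 1
        elif m1_up:
            blocked += 1
        wip_sum += buf
    return produced / steps, wip_sum / steps, starved / steps, blocked / steps

print(simulate_line(p1=0.9, p2=0.85, buffer_cap=3))
```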
Procedia PDF Downloads 99

33173 Optimizing Machining Parameters of Plastic Material Using Taguchi Method
Authors: Jumazulhisham Abdul Shukor, Mohd. Sazali Said, Roshanizah Harun, Shuib Husin, Ahmad Razlee Ab Kadir
Abstract:
This paper applies the Taguchi optimization method to determine the best machining parameters for a pocket milling process on polypropylene (PP) using a CNC milling machine, where surface roughness is the response considered and carbide insert cutting tools are used. Three machining parameters, speed, feed rate, and depth of cut, are investigated, each at three levels: low, medium, and high (Taguchi orthogonal arrays). The settings of the machining parameters were determined using the Taguchi method, and the signal-to-noise (S/N) ratios are assessed to define the optimal levels and to predict the effect on surface roughness of the assigned parameters based on the L9 array. The final experimental outcomes are presented to verify that the optimization parameters recommended by the manufacturer are accurate.
Keywords: inserts, milling process, signal-to-noise (S/N) ratio, surface roughness, Taguchi Optimization Method
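Since surface roughness is a smaller-the-better response, the Taguchi S/N ratio is computed as -10 log10 of the mean squared response. A sketch with invented replicate measurements for three of the nine L9 runs:

```python
import numpy as np

# Hypothetical surface-roughness replicates (micrometers) for three L9 trials.
trials = {
    "run1 (low speed, low feed, low depth)":   [1.82, 1.79, 1.85],
    "run2 (low speed, med feed, med depth)":   [2.40, 2.35, 2.51],
    "run3 (low speed, high feed, high depth)": [3.10, 3.02, 3.18],
}

def sn_smaller_is_better(y):
    """Taguchi S/N ratio for a smaller-the-better response."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(y ** 2))

for name, y in trials.items():
    print(f"{name}: S/N = {sn_smaller_is_better(y):.2f} dB")
# The factor-level combination that maximizes the S/N ratio minimizes both the
# magnitude and the variation of the roughness.
```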
Procedia PDF Downloads 637

33172 Combustion and Emission Characteristics in a Can-Type Combustion Chamber
Authors: Selvakuma Kumaresh, Man Young Kim
Abstract:
Effective combustion can be accomplished through the development of a low-emission combustor. One of the significant factors influencing the entire combustion process is the mixing between a swirling angular jet (primary air) and the non-swirling inner jet (fuel). To study this fundamental flow, the chamber had to be designed in such a manner that the combustion process sustains itself in a continuous manner and the temperature of the products stays sufficiently below the maximum working temperature of the turbine. This study develops effective combustion with low unburned combustion products by adopting the concept of high swirl flow and motility of holes in the secondary chamber. Proper selection of a swirler is needed to reduce emission, which can be concluded from the emissions of NOx and CO2. The capture of CO2 is necessary to mitigate CO2 emissions from natural gas. Thus, the suppression of unburned gases is a meaningful objective for the development of a high-performance combustor that does not affect turbine blade temperature.
Keywords: combustion, emission, can-type combustion chamber, CFD, motility of holes, swirl flow
Procedia PDF Downloads 374

33171 The Use of Surveys to Combat Fake News in Media Literacy Education
Authors: Jaejun Jong
Abstract:
Fake news has recently become a serious international problem. Therefore, researchers and policymakers worldwide have sought to understand fake news and develop strategies to combat it. This study consists of two primary parts: (1) a literature review of how surveys were used to understand fake news and identify problems caused by fake news, and (2) a discussion of how surveys were used to fight back against fake news in educational settings. This second section specifically analyzes surveys used to evaluate a South Korean elementary school program designed to improve students' metacognition and critical thinking. This section seeks to identify potential problems that may occur in the elementary school setting. The literature review shows that surveys can help people to understand fake news based on its traits rather than its definition due to the lack of agreement on the definition of fake news. The literature review also shows that people are not good at identifying fake news or evaluating their own ability to identify fake news; indeed, they are more likely to share information that aligns with their previous beliefs. In addition, the elementary school survey data shows that there may be substantial errors in the program evaluation process, likely caused by processing errors or the survey procedure, though the exact cause is not specified. Such a significant error in evaluating the effects of the educational program prevents teachers from making proper decisions and accurately evaluating the program. Therefore, identifying the source of such errors would improve the overall quality of education, which would benefit both teachers and students.
Keywords: critical thinking, elementary education, program evaluation, survey
Procedia PDF Downloads 103

33170 Development and Validation of Integrated Continuous Improvement Framework for Competitiveness: Mixed Research of Ethiopian Manufacturing Industries
Authors: Haftu Hailu Berhe, Hailekiros Sibhato Gebremichael, Kinfe Tsegay Beyene, Haileselassie Mehari
Abstract:
The purpose of the study is to develop and validate an integrated, literature-based JIT, TQM, TPM, SCM and LSS framework through a combination of the PDCA cycle and the DMAIC methodology. The study adopted a mixed research approach. Accordingly, the qualitative study employed to develop the framework is based on identifying the unique and common practices of the JIT, TQM, TPM, SCM and LSS initiatives and the existing practice of their integration, identifying the existing gaps in frameworks and practices, and developing a new integrated JIT, TQM, TPM, SCM and LSS practice framework; very few previous studies of the unique and common practices of the five initiatives exist. The quantitative study working to validate the framework is based on empirical analysis of a self-administered questionnaire using a statistical package for social science. An integrated CI framework standing on a combination of the PDCA cycle and the DMAIC methodology is developed. The proposed framework is constructed as a project-based framework with five detailed implementation phases. Besides, the empirical analysis demonstrated that the proposed framework is valuable if adopted and implemented correctly. So far, no study has proposed and validated an integrated CI framework within the scope of this study; therefore, this is the earliest study to propose and validate such a framework for manufacturing industries. The proposed framework is applicable to manufacturing industries and can assist in achieving competitive advantage when manufacturing industries, institutions, and government offer unconditional efforts in implementing its full contents.
Keywords: integrated continuous improvement framework, just in time, total quality management, total productive maintenance, supply chain management, lean six sigma
Procedia PDF Downloads 139

33169 A Data-Driven Monitoring Technique Using Combined Anomaly Detectors
Authors: Fouzi Harrou, Ying Sun, Sofiane Khadraoui
Abstract:
Anomaly detection based on Principal Component Analysis (PCA) has been studied intensively and largely applied to multivariate processes with highly cross-correlated process variables. Monitoring metrics such as Hotelling's T2 and the Q statistic are usually used in PCA-based monitoring to elucidate pattern variations in the principal and residual subspaces, respectively. However, these metrics are ill suited to detect small faults. In this paper, Exponentially Weighted Moving Average (EWMA) charts based on the T2 and Q statistics, T2-EWMA and Q-EWMA, were developed for detecting faults in the process mean. The performance of the proposed methods was compared with that of the conventional PCA-based fault detection method using synthetic data. The results clearly show the benefit and effectiveness of the proposed methods over the conventional PCA method, especially for detecting small faults in highly correlated multivariate data.
Keywords: data-driven method, process control, anomaly detection, dimensionality reduction
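A sketch of the T2-EWMA/Q-EWMA idea: fit PCA on in-control data, compute Hotelling's T2 and Q for each new sample, and smooth either statistic with an EWMA so that small, persistent shifts accumulate. The data, the number of retained components, and the smoothing constant are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
# Training: in-control data with cross-correlated variables.
A = rng.normal(size=(4, 4))
X = rng.normal(size=(500, 4)) @ A
mu, sd = X.mean(0), X.std(0)
Xs = (X - mu) / sd
U, S, Vt = np.linalg.svd(Xs, full_matrices=False)
k = 2
P = Vt[:k].T                        # retained loadings
lam = (S[:k] ** 2) / (len(X) - 1)   # retained eigenvalues

def t2_q(x):
    xs = (x - mu) / sd
    scores = P.T @ xs
    t2 = float(scores @ (scores / lam))   # Hotelling's T2 in the PC subspace
    resid = xs - P @ scores
    return t2, float(resid @ resid)       # Q (squared prediction error)

def ewma_monitor(stream, stat_index, weight=0.2):
    """EWMA smoothing of T2 (stat_index=0) or Q (stat_index=1); compare the
    smoothed value against a control limit estimated from training data."""
    z = 0.0
    for x in stream:
        z = weight * t2_q(x)[stat_index] + (1 - weight) * z
        yield z

faulty = rng.normal(size=(50, 4)) @ A + np.array([0.3, 0, 0, 0])  # small mean shift
print(list(ewma_monitor(faulty, stat_index=0))[-1])  # grows as evidence accumulates
```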
Procedia PDF Downloads 299

33168 An Image Based Visual Servoing (IBVS) Approach Using a Linear-Quadratic Regulator (LQR) for Quadcopters
Authors: C. Gebauer, C. Henke, R. Vossen
Abstract:
Within the Mohamed Bin Zayed International Robotics Challenge (MBZIRC) 2020, a team of unmanned aerial vehicles (UAVs) is used to capture intruder drones by physical interaction; the challenge is motivated by UAV safety. The purpose of this work is to investigate the agility of a visually controlled quadcopter. The aim is to track and follow a highly dynamic target, e.g., an intruder quadcopter. Following is performed at close range, with an opponent velocity of up to 10 m/s. Additional limitations are imposed by the hardware itself: only monocular vision is present, and no additional knowledge about the target's state is available. An image based visual servoing (IBVS) approach is applied in combination with a Linear Quadratic Regulator (LQR). The IBVS is integrated into the LQR, and an optimal trajectory is computed within the projected three-dimensional image space. The approach has been evaluated on real quadcopter systems in different flight scenarios to demonstrate the system's stability.
Keywords: image based visual servoing, quadcopter, dynamic object tracking, linear-quadratic regulator
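The LQR part can be sketched on a double-integrator model of a single image-plane error coordinate, a common simplification that is assumed here rather than taken from the paper; the weight matrices are illustrative.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Double integrator: state = [pixel error, error rate], input = commanded acceleration.
A = np.array([[0.0, 1.0], [0.0, 0.0]])
B = np.array([[0.0], [1.0]])
Q = np.diag([10.0, 1.0])   # penalize pixel error more than its rate
R = np.array([[0.1]])      # control-effort penalty

P = solve_continuous_are(A, B, Q, R)   # solve the Riccati equation
K = np.linalg.inv(R) @ B.T @ P         # optimal state feedback u = -K x
print(K)

# Closed-loop poles: the servoing error decays with the LQR-optimal trade-off
# between tracking accuracy and control effort.
print(np.linalg.eigvals(A - B @ K))
```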
Procedia PDF Downloads 149

33167 Positive Disruption: Towards a Definition of Artist-in-Residence Impact on Organisational Creativity
Authors: Denise Bianco
Abstract:
Several studies on innovation and creativity in organisations emphasise the need to expand horizons and take on alternative and unexpected views to produce something new. This paper theorises the potential impact artists can have as creative catalysts, working embedded in non-artistic organisations. It begins from an understanding that in today's ever-changing scenario, organisations are increasingly seeking to open up new creative thinking through deviant behaviours to produce innovation, and that art residencies need to be critically revised in this specific context in light of their disruptive potential. On the one hand, this paper builds upon recent contributions made on workplace creativity and the related concepts of deviance and disruption. Research suggests that creativity is likely to be lower in work contexts where utter conformity is a cardinal value and higher in work contexts that show some tolerance for uncertainty and deviance. On the other hand, this paper draws attention to the Artist-in-Residence as a vehicle for epistemic friction between divergent and convergent thinking, which allows the creation of unparalleled ways of knowing in the dailiness of situated and contextualised social processes. In order to do so, this contribution brings together insights from the most relevant theories on organisational creativity and unconventional agile methods such as Art Thinking, and direct insights from ethnographic fieldwork in the context of embedded art residencies within work organisations, to propose a redefinition of the Artist-in-Residence and their potential impact on organisational creativity. The result is a re-definition of the embedded Artist-in-Residence in organisational settings from a more comprehensive, multi-disciplinary, and relational perspective that builds on three focal points. First, the notion that organisational creativity is a dynamic and synergistic process throughout which an idea is framed by recurrent activities subjected to multiple influences. Second, the definition of the embedded Artist-in-Residence as an assemblage of dynamic, productive relations and unexpected possibilities for new networks of relationality that encourage the recombination of knowledge. Third, and most importantly, the acknowledgment that embedded residencies are, at the very essence, bi-cultural knowledge contexts where creativity flourishes as the result of open-to-change processes that are highly relational, constantly negotiated, and contextualised in time and space.
Keywords: artist-in-residence, convergent and divergent thinking, creativity, creative friction, deviance and creativity
Procedia PDF Downloads 97

33166 Hydrogeological Study of the Different Aquifers in the Area of Biskra
Authors: A. Sengouga, Y. Imessaoudene, A. Semar, B. Mouhouche, M. Kadir
Abstract:
Biskra, or the Zibans, is located in a structural transition zone between the chain of the Saharan Atlas Mountains and the Sahara. It is an arid region where surface water resources are scarce, hence the importance of the lithological description and the evaluation of aquifer rock volumes, on which the mobilizable water contained in the various reservoirs (Quaternary, Mio-Pliocene, Eocene, and Continental Intercalaire) highly depends. Through a data synthesis based in particular on stratigraphic logs of boreholes, the description of aquifer heterogeneity and the determination of the spatial variability of aquifer occurrence became possible using geostatistical analysis, which allowed the mapping of aquifer thicknesses and their spatial variation. The thematic maps produced show the borehole positions, the shape of the substratum, and finally the aquifer thicknesses of the region. It is found that the high density of water points, especially borehole points, is superposed on the hydrologic reservoirs with significant thicknesses.
Keywords: log stratigraphic ArcGIS 10, geometry of aquifers, rocks reservoir volume, Biskra
Procedia PDF Downloads 460

33165 Modeling and Simulation of Pad Surface Topography by Diamond Dressing in Chemical-Mechanical Polishing Process
Authors: A.Chen Chao-Chang, Phong Pham-Quoc
Abstract:
The chemical-mechanical polishing (CMP) process has been widely applied in fabricating integrated circuits (ICs), using a soft polishing pad combined with a slurry composed of micron- or nano-scale abrasives that generate chemical reactions to remove substrate or film materials from the wafer. During the CMP process, pad uniformity usually works as a datum surface for wafer planarization, and pad asperities can dominate the microscopic pad-slurry-wafer interaction. However, pad topography can be changed by the mechanisms of CMP, and the pad needs to be re-conditioned or dressed by a diamond dresser with well-distributed diamond grits on a disc surface. It is still very complicated to analyze and understand the kinematics of the diamond dressing process under the effects of input variables, including the oscillation of the diamond dresser and the rotation speed ratio between the pad and the diamond dresser. This paper develops a generic geometric model to clarify the kinematic modeling of diamond dressing processes, covering dresser/pad motion, the pad cutting locus, the relative velocity of the diamond abrasive grits on the pad surface, and the overlap of cutting, for prediction of pad surface topography. Simulation results focus on comparing and analyzing the kinematics of diamond dressing on certain CMP tools; the significant parameters for the diamond dressing process are identified and discussed. Future study can apply the model to diamond dresser design and experimental verification of the pad dressing process.
Keywords: kinematic modeling, diamond dresser, pad cutting locus, CMP
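The kinematic core, the locus of a diamond grit expressed in the rotating pad frame with dresser oscillation superimposed, can be sketched as below; the geometry and speeds are invented for illustration and are not the paper's tool parameters.

```python
import numpy as np

def grit_locus_on_pad(t, pad_rpm, dresser_rpm, dresser_center_r, grit_r, sweep=None):
    """Position of one diamond grit expressed in the rotating pad frame."""
    wp = 2 * np.pi * pad_rpm / 60.0
    wd = 2 * np.pi * dresser_rpm / 60.0
    # Dresser center radius, optionally modulated by an oscillatory sweep.
    cr = dresser_center_r if sweep is None else dresser_center_r + sweep(t)
    # Grit position in the machine frame...
    gx = cr + grit_r * np.cos(wd * t)
    gy = grit_r * np.sin(wd * t)
    # ...then rotated by minus the pad angle into pad coordinates.
    c, s = np.cos(-wp * t), np.sin(-wp * t)
    return c * gx - s * gy, s * gx + c * gy

t = np.linspace(0, 10, 20000)
x, y = grit_locus_on_pad(t, pad_rpm=60, dresser_rpm=63, dresser_center_r=150.0,
                         grit_r=50.0,
                         sweep=lambda t: 20.0 * np.sin(2 * np.pi * 0.2 * t))
# The density of (x, y) samples approximates the overlap of cutting; a speed
# ratio like 60:63 keeps the locus from retracing itself every revolution.
```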
Procedia PDF Downloads 255