Search results for: process charts
15320 K-Pop Fandom: A Sub-Cultural Influencer on K-Pop Brand Attitude
Authors: Patricia P. M. C. Lourenco, Sang Yong Kim, Anaisa D. A. De Sena
Abstract:
K-Pop fandom is a paradoxical dichotomy of two conceptual contexts: the Korean single fandom and the international fandom; both strongly influence K-Pop brand attitude. Collectivist South Korean fans give their undivided support to a single artist's comeback in pursuit of a triple crown in domestic music charts. In contrast, individualist international fans collectively support a plethora of artists and collaborate amongst themselves in the continuous expansion of K-Pop into mainstream cultural glocalization in international music charts. The distinct idiosyncrasies between the two groups create a heterogeneous K-Pop brand attitude that is challenging to address in marketing terms, given the lack of homogeneity in the sub-cultural K-Pop fandom.
Keywords: K-Pop fandom, single-fandom, multi-fandom, individualism, collectivism, brand attitude, sub-culture
Procedia PDF Downloads 286
15319 Software Development for AASHTO and Ethiopian Roads Authority Flexible Pavement Design Methods
Authors: Amare Setegn Enyew, Bikila Teklu Wodajo
Abstract:
The primary aim of flexible pavement design is to ensure the development of economical and safe road infrastructure. However, failures can still occur due to improper or erroneous structural design. In Ethiopia, the design of flexible pavements relies on manual calculations and the selection of a pavement structure from a catalogue. The catalogue offers, in eight different charts, alternative structures for combinations of traffic and subgrade classes, as outlined in the Ethiopian Roads Authority (ERA) Pavement Design Manual 2001. Furthermore, design modification is allowed in accordance with the structural number principles outlined in the AASHTO 1993 Guide for Design of Pavement Structures. Nevertheless, the manual calculation and design process involves the use of nomographs, charts, tables, and formulas, which increases the likelihood of human errors and inaccuracies, and this may lead to unsafe or uneconomical road construction. To address this challenge, a software package called AASHERA has been developed for the AASHTO 1993 and ERA design methods, using the MATLAB language. The software accurately determines the required thicknesses of the flexible pavement surface, base, and subbase layers for the two methods. It also digitizes design inputs and references such as nomographs, charts, default values, and tables. Moreover, the software allows easier comparison of the two design methods in terms of results and cost of construction. AASHERA's accuracy has been confirmed through comparisons with designs from handbooks and manuals. The software can reduce human errors, inaccuracies, and time consumption compared to the conventional manual design methods employed in Ethiopia. AASHERA, with its validated accuracy, proves to be an indispensable tool for flexible pavement structure designers.
Keywords: flexible pavement design, AASHTO 1993, ERA, MATLAB, AASHERA
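The structural-number check that underlies the AASHTO 1993 method mentioned above can be sketched in a few lines. The layer coefficients, drainage factors, and thicknesses below are illustrative values, not outputs of AASHERA:

```python
# AASHTO 1993 structural-number equation (D_i in inches):
#   SN = a1*D1 + a2*m2*D2 + a3*m3*D3
# Hypothetical three-layer section for illustration only.
layers = [
    {"a": 0.44, "m": 1.0, "D": 4.0},   # asphalt surface course
    {"a": 0.14, "m": 1.0, "D": 8.0},   # granular base
    {"a": 0.11, "m": 1.0, "D": 12.0},  # granular subbase
]

def structural_number(layers):
    """Sum the layer contributions a_i * m_i * D_i (AASHTO 1993)."""
    return sum(layer["a"] * layer["m"] * layer["D"] for layer in layers)

sn = structural_number(layers)
```

A design tool such as the one described would compare this provided SN against the SN required for the design traffic and subgrade, iterating thicknesses until the requirement is met.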
Procedia PDF Downloads 63
15318 Application of Hyperbinomial Distribution in Developing a Modified p-Chart
Authors: Shourav Ahmed, M. Gulam Kibria, Kais Zaman
Abstract:
Control charts graphically verify variation in quality parameters. Attribute-type control charts deal with quality parameters that can only hold two states, e.g., good or bad, yes or no, etc. At present, the p-control chart is the one most commonly used for attribute-type data. In the construction of a p-control chart using the binomial distribution, the value of the proportion non-conforming must be known or estimated from limited sample information. Because the hyperbinomial distribution treats the fraction non-conforming (p) as a random variable rather than the constant assumed by the binomial distribution, it reduces the risk of false detection. In this study, a statistical control chart based on the hyperbinomial distribution is proposed for the case where no prior estimate of the proportion non-conforming is available and it must be estimated from limited sample information. We developed the control limits of the proposed modified p-chart using the mean and variance of the hyperbinomial distribution. The proposed modified p-chart can also utilize additional sample information when it is available. The study also validates the use of the modified p-chart by comparison with the result obtained using the cumulative distribution function of the hyperbinomial distribution. The study clearly indicates that the use of the hyperbinomial distribution in the construction of a p-control chart yields a much more accurate estimate of quality parameters than the binomial distribution.
Keywords: binomial distribution, control charts, cumulative distribution function, hyperbinomial distribution
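The construction described — control limits built from the mean and variance of the charting distribution — follows the usual k-sigma pattern. A generic sketch, shown with the classical binomial moments as a baseline since the hyperbinomial moments are not given in the abstract:

```python
import math

def k_sigma_limits(mean, variance, k=3.0):
    """Generic k-sigma control limits from the first two moments
    of the charting statistic's distribution."""
    sd = math.sqrt(variance)
    # A proportion cannot go below zero, so clip the lower limit.
    return max(0.0, mean - k * sd), mean + k * sd

# Classical binomial p-chart for comparison: known fraction
# non-conforming p and subgroup size n (illustrative values).
p, n = 0.05, 200
lcl, ucl = k_sigma_limits(p, p * (1 - p) / n)
```

Substituting the hyperbinomial mean and variance for the binomial moments above would yield the modified p-chart limits the study proposes.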
Procedia PDF Downloads 279
15317 GRCNN: Graph Recognition Convolutional Neural Network for Synthesizing Programs from Flow Charts
Authors: Lin Cheng, Zijiang Yang
Abstract:
Program synthesis is the task of automatically generating programs from a user specification. In this paper, we present a framework that synthesizes programs from flow charts, which serve as an accurate and intuitive specification. To do so, we propose a deep neural network called GRCNN that recognizes the graph structure of a flow chart from its image. GRCNN is trained end-to-end and predicts the edge and node information of the flow chart simultaneously. Experiments show that the accuracy rate of synthesizing a program is 66.4%, and the accuracy rates of recognizing edges and nodes are 94.1% and 67.9%, respectively. On average, it takes about 60 milliseconds to synthesize a program.
Keywords: program synthesis, flow chart, specification, graph recognition, CNN
Procedia PDF Downloads 119
15316 Characterising the Dynamic Friction in the Staking of Plain Spherical Bearings
Authors: Jacob Hatherell, Jason Matthews, Arnaud Marmier
Abstract:
Anvil staking is a cold-forming process used in the assembly of plain spherical bearings into a rod-end housing. The process ensures that the bearing outer lip conforms to the chamfer in the matching rod end to produce a lightweight mechanical joint with sufficient strength to meet the push-out load requirement of the assembly. Finite element (FE) analysis is used extensively to predict the behaviour of metal flow in cold-forming processes to support industrial manufacturing and product development. On-going research aims to validate FE models across a wide range of bearing and rod-end geometries by systematically isolating and understanding the uncertainties caused by variations in material properties, load-dependent friction coefficients and strain rate sensitivity. The improved confidence in these models aims to eliminate the costly and time-consuming experimental trials involved in introducing new bearing designs. Previous literature has shown that friction coefficients do not remain constant during cold-forming operations; however, the understanding of this phenomenon varies significantly and is rarely implemented in FE models. In this paper, a new approach to evaluating the relationship between normal contact pressure and friction coefficient is outlined, using friction calibration charts generated via iterative FE models and ring compression tests. Compared with previous research, this new approach greatly improves the prediction of the forming geometry and the forming load during the staking operation. This paper also aims to standardise the FE approach to modelling ring compression tests and determining the friction calibration charts.
Keywords: anvil staking, finite element analysis, friction coefficient, spherical plain bearing, ring compression tests
Procedia PDF Downloads 205
15315 A Posterior Predictive Model-Based Control Chart for Monitoring Healthcare
Authors: Yi-Fan Lin, Peter P. Howley, Frank A. Tuyl
Abstract:
Quality measurement and reporting systems are used in healthcare internationally. In Australia, the Australian Council on Healthcare Standards records and reports hundreds of clinical indicators (CIs) nationally across the healthcare system. These CIs are measures of performance in the clinical setting, and are used as a screening tool to help assess whether a standard of care is being met. Existing analysis and reporting of these CIs incorporate Bayesian methods to address sampling variation; however, such assessments are retrospective in nature, reporting upon the previous six or twelve months of data. The use of Bayesian methods within statistical process control for monitoring systems is an important pursuit to support more timely decision-making. Our research has developed and assessed a new graphical monitoring tool, similar to a control chart, based on the beta-binomial posterior predictive (BBPP) distribution to facilitate the real-time assessment of health care organizational performance via CIs. The BBPP charts have been compared with the traditional Bernoulli CUSUM (BC) chart by simulation. The more traditional “central” and “highest posterior density” (HPD) interval approaches were each considered to define the limits, and the multiple charts were compared via in-control and out-of-control average run lengths (ARLs), assuming that the parameter representing the underlying CI rate (proportion of cases with an event of interest) required estimation. Preliminary results have identified that the BBPP chart with HPD-based control limits provides better out-of-control run length performance than the central interval-based and BC charts. 
Further, the BC chart’s performance may be improved by using Bayesian parameter estimation of the underlying CI rate.
Keywords: average run length (ARL), bernoulli cusum (BC) chart, beta binomial posterior predictive (BBPP) distribution, clinical indicator (CI), healthcare organization (HCO), highest posterior density (HPD) interval
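The beta-binomial posterior predictive distribution at the heart of the BBPP chart has a closed-form pmf. A sketch with illustrative posterior parameters (not taken from the study's CI data):

```python
import math

def betaln(x, y):
    """log of the Beta function, computed via log-gamma for stability."""
    return math.lgamma(x) + math.lgamma(y) - math.lgamma(x + y)

def bbpp_pmf(k, n, a, b):
    """P(K = k) under the beta-binomial posterior predictive:
    posterior Beta(a, b) on the event rate, future sample of size n."""
    return math.comb(n, k) * math.exp(betaln(a + k, b + n - k) - betaln(a, b))

# Illustrative: posterior Beta(2, 18) (in-control rate around 10%),
# next reporting period of n = 20 cases.
probs = [bbpp_pmf(k, 20, 2.0, 18.0) for k in range(21)]
```

Control limits — central or HPD — would then be read off this predictive distribution rather than a plug-in binomial.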
Procedia PDF Downloads 201
15314 The Evaluation of Soil Liquefaction Potential Using Shear Wave Velocity
Authors: M. Nghizaderokni, A. Janalizadechobbasty, M. Azizi, M. Naghizaderokni
Abstract:
The liquefaction resistance of soils can be evaluated using laboratory tests such as cyclic simple shear, cyclic triaxial, and cyclic torsional shear, and field methods such as the Standard Penetration Test (SPT), the Cone Penetration Test (CPT), and Shear Wave Velocity (Vs). In this paper, a strong correlation between the shear wave velocity and the standard penetration resistance of granular soils was obtained. Using Seed's standard penetration test (SPT)-based soil liquefaction charts, new charts for soil liquefaction evaluation based on shear wave velocity data were developed for earthquakes of various magnitudes.
Keywords: soil, liquefaction, shear wave velocity, standard penetration resistance
Procedia PDF Downloads 395
15313 Challenges to Safe and Effective Prescription Writing in the Environment Where Digital Prescribing is Absent
Authors: Prashant Neupane, Asmi Pandey, Mumna Ehsan, Katie Davies, Richard Lowsby
Abstract:
Introduction/Background & aims: Safe and effective prescribing in hospitals directly and indirectly impacts the health of patients. Even though digital prescribing in the National Health Service (NHS), UK has been adopted in many tertiary centres along with district general hospitals, a significant number of NHS trusts still use paper prescribing, and we came across many irregularities in our daily clinical practice with it. The main aim of the study was to assess how safely and effectively we prescribe at our hospital, where there is no access to digital prescribing. Method/Summary of work: We conducted a prospective audit in the critical care department at Mid Cheshire Hospitals NHS Foundation Trust in which 20 prescription charts from different patients were randomly selected over a period of 1 month. We assessed 16 categories on each prescription chart and compared them to the standard trust guidelines on prescription. Results/Discussion: We collected data from 20 different prescription charts, evaluating 16 categories within each. The results showed an urgent need for improvement in 8 different sections. In 85% of the prescription charts, not all the prescribers who prescribed medications were identified: name, GMC number and signature were absent from the required prescriber identification section. In 70% of prescription charts, either the indication or the review date of the antimicrobials was absent. Units of medication were not documented correctly in 65% of the charts, and the allergy status of the patient was absent in 30%. The start date of medications was missing and alterations of the medications were not done properly in 35% of charts. The patient's name was not recorded in all desired sections of the chart in 50% of cases, and cancellations of medication were not done properly in 45% of the prescription charts.
Conclusion(s): From the audit and data analysis, we identified the areas in which prescription writing in the critical care department needs improvement. During meetings and conversations with experts from the pharmacy department, however, we realised that this audit represents only a specialised department of the hospital, where prescribing is limited to a certain number of prescribers. If we consider larger departments of the hospital, where patient turnover is much higher, the results could be much worse. The findings were discussed in the critical care MDT meeting, where suggestions regarding digital/electronic prescribing were raised. A presentation on safe and effective prescribing was given, and an awareness poster was prepared and attached at every bedside in critical care, where it is visible to prescribers. We consider this a temporary measure to improve the quality of prescribing; however, we strongly believe digital prescribing would do far more to control the weak areas seen in paper prescribing.
Keywords: safe prescribing, NHS, digital prescribing, prescription chart
Procedia PDF Downloads 119
15312 Development of Interaction Factors Charts for Piled Raft Foundation
Authors: Abdelazim Makki Ibrahim, Esamaldeen Ali
Abstract:
To analyse the load-settlement behaviour and predict the bearing capacity of piled raft foundations, this study established a series of finite element models with different foundation configurations and stiffnesses. Numerical modelling is used to study the behaviour of the piled raft foundation because of the complexity of the pile-raft-soil interaction and the lack of a reliable analytical method that can predict the behaviour of the piled raft foundation system. Simple analytical models are developed to predict the average settlement and the load sharing between the piles and the raft in a piled raft foundation system. A simple example demonstrating the application of these charts is included.
Keywords: finite element, pile-raft foundation, method, PLAXIS software, settlement
Procedia PDF Downloads 557
15311 Process Monitoring Based on Parameterless Self-Organizing Map
Authors: Young Jae Choung, Seoung Bum Kim
Abstract:
Statistical Process Control (SPC) is a popular technique for process monitoring. A widely used tool in SPC is the control chart, which is used to detect the abnormal status of a process and maintain its controlled status. Traditional control charts, such as Hotelling’s T² control chart, are effective techniques for detecting abnormal observations and monitoring processes. However, many complicated manufacturing systems exhibit nonlinearity because of the varying demands of the market, and in such cases the unregulated use of a traditional linear modeling approach may not be effective. In reality, many industrial processes exhibit nonlinear and time-varying properties because of fluctuations in process raw materials, slow shifts of the set points, aging of the main process components, seasonal effects, and catalyst deactivation. The use of traditional SPC techniques with time-varying data will degrade the performance of the monitoring scheme. To address these issues, in the present study, we propose a parameterless self-organizing map (PLSOM)-based control chart. The PLSOM-based control chart not only manages situations where the distribution or parameters of the target observations change, but also addresses the nonlinearity of modern manufacturing systems. The control limits of the proposed PLSOM chart are established by estimating the empirical significance level at a given percentile using a bootstrap method. Experimental results with simulated data and actual process data from a thin-film transistor-liquid crystal display process demonstrated the effectiveness and usefulness of the proposed chart.
Keywords: control chart, parameter-less self-organizing map, self-organizing map, time-varying property
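The bootstrap step used to set the chart's limits — estimating an empirical percentile of the in-control monitoring statistic — can be sketched as follows. The statistic values here are simulated stand-ins, not PLSOM outputs:

```python
import random

def bootstrap_upper_limit(stats, alpha=0.05, n_boot=1000, seed=0):
    """Upper control limit as the bootstrap estimate of the
    (1 - alpha) empirical percentile of in-control statistic values."""
    rng = random.Random(seed)
    n = len(stats)
    limits = []
    for _ in range(n_boot):
        # Resample with replacement and take the (1 - alpha) order statistic.
        resample = sorted(rng.choice(stats) for _ in range(n))
        idx = min(n - 1, int((1 - alpha) * n))
        limits.append(resample[idx])
    return sum(limits) / n_boot

# Simulated in-control monitoring statistics (placeholder for the
# distances the PLSOM-based chart would actually produce).
rng_data = random.Random(1)
in_control = [rng_data.random() for _ in range(200)]
ucl = bootstrap_upper_limit(in_control, alpha=0.05)
```

New observations whose statistic exceeds `ucl` would then signal an out-of-control condition.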
Procedia PDF Downloads 275
15310 Improving the Academic Performance of Students: Management Role of Head Teachers as a Key Contributing Factor
Authors: Dominic Winston Kaku
Abstract:
The academic performance of students is an area of great concern to the various stakeholders of education, because it is widely used as a measure of the success of the educational process. Several factors determine academic performance, such as school-related factors, teacher-related factors, and pupil- or student-related factors, among many others. It appears that the management role of head teachers as a determining factor of pupils' academic achievement has not been much investigated, yet it is an essential element in the educational process with a huge influence on students' academic performance. The aim of the research was to examine the management role of head teachers in improving the academic performance of students. The study employed a descriptive survey and was conducted among Junior High Schools in the Ellembelle District of the Western Region of Ghana. The respondents were the head teachers, teachers, and selected basic school (JHS) pupils in four selected public basic schools in the district. A questionnaire was used to collect primary data from a sample of 252 persons, comprising 226 JHS pupils together with all the JHS teachers and head teachers of the four selected schools. Descriptive statistics, specifically frequencies, percentages, pie charts, bar charts, means, and standard deviations, were used to analyse the data and formed the basis of the presentation of findings. The study discovered that planning academic activities, fostering relationships between the school and the community, supervising lessons, motivating staff, and punishing students who go wrong are some of the activities head teachers undertake to help improve students' academic performance.
Keywords: supervision, head teacher, academic performance, planning, motivation, relationships
Procedia PDF Downloads 64
15309 Strategies for Synchronizing Chocolate Conching Data Using Dynamic Time Warping
Authors: Fernanda A. P. Peres, Thiago N. Peres, Flavio S. Fogliatto, Michel J. Anzanello
Abstract:
Batch processes are widely used in the food industry and play an important role in the production of high added-value products, such as chocolate. Process performance is usually described by variables that are monitored as the batch progresses. Data arising from these processes are likely to display a strong correlation-autocorrelation structure, and are usually monitored using control charts based on multiway principal components analysis (MPCA). Process control of a new batch is carried out by comparing the trajectories of its relevant process variables with those in a reference set of batches that yielded products within specifications; it is clear that proper determination of the reference set is key to correctly signalling non-conforming batches in such quality control schemes. In chocolate manufacturing, misclassification of non-conforming batches in the conching phase may lead to significant financial losses, so the accuracy of process control grows in relevance. In addition, the main assumption in MPCA-based monitoring strategies is that all batches are synchronized in duration, both the new batch being monitored and those in the reference set. This assumption is often not satisfied in the chocolate manufacturing process, and as a consequence traditional techniques such as MPCA-based charts are not suitable for process control and monitoring. To address that issue, the objective of this work is to compare the performance of three dynamic time warping (DTW) methods in the alignment and synchronization of chocolate conching process variables’ trajectories, aimed at properly determining the reference distribution for multivariate statistical process control. The power of classification of batches into two categories (conforming and non-conforming) was evaluated using the k-nearest neighbor (KNN) algorithm.
Real data from a milk chocolate conching process were collected, and the following variables were monitored over time: frequency of soybean lecithin dosage, rotation speed of the shovels, current of the main motor of the conche, and chocolate temperature. A set of 62 batches with durations between 495 and 1,170 minutes was considered; 53% of the batches were known to be conforming based on lab test results and experts’ evaluations. Results showed that all three DTW methods tested were able to align and synchronize the conching dataset. However, the synchronized datasets obtained from these methods performed differently when input to the KNN classification algorithm. The method of Kassidas, MacGregor and Taylor (KMT) was deemed the best DTW method for aligning and synchronizing a milk chocolate conching dataset, presenting 93.7% accuracy, 97.2% sensitivity and 90.3% specificity in batch classification, and was considered the best option to determine the reference set for the milk chocolate dataset. This method was recommended due to the lowest number of iterations required to achieve convergence and the highest average accuracy in the testing portion using the KNN classification technique.
Keywords: batch process monitoring, chocolate conching, dynamic time warping, reference set distribution, variable duration
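The alignment at the heart of all three compared methods is dynamic time warping. A minimal univariate sketch of the classic DTW distance (the KMT method adds iterative, multivariate weighting on top of this core recursion):

```python
def dtw_distance(a, b):
    """Dynamic-time-warping distance between two sequences of possibly
    different lengths, using absolute-difference local cost."""
    inf = float("inf")
    n, m = len(a), len(b)
    # D[i][j] = minimal cumulative cost aligning a[:i] with b[:j].
    D = [[inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Extend the cheapest of the three admissible warping moves.
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

# Two batch trajectories of unequal duration align with zero cost
# when one is a time-stretched copy of the other.
d = dtw_distance([20, 25, 30, 30], [20, 25, 25, 30, 30, 30])
```

After alignment, each batch trajectory is warped onto a common time axis, making the synchronized set usable as an MPCA reference distribution or as KNN input.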
Procedia PDF Downloads 167
15308 Social Semantic Web-Based Analytics Approach to Support Lifelong Learning
Authors: Khaled Halimi, Hassina Seridi-Bouchelaghem
Abstract:
The purpose of this paper is to describe how learning analytics approaches based on social semantic web techniques can be applied to enhance lifelong learning experiences from a connectivist perspective. To this end, a prototype of a system called SoLearn (Social Learning Environment) that supports this approach was developed. We studied the literature related to lifelong learning systems, the social semantic web and ontologies, connectivism theory, and learning analytics approaches, and reviewed systems implemented in these fields, to extract and draw conclusions about the features necessary for enhancing the lifelong learning process. The semantic analytics of learning can be used for viewing, studying and analysing the massive data generated by learners, which helps them to understand, through recommendations, charts and figures, their learning and behaviour, and to detect where they have weaknesses or limitations. This paper emphasises that implementing a learning analytics approach based on social semantic web representations can enhance the learning process. On the one hand, the analysis process leverages the meaning expressed by the semantics presented in the ontology (relationships between concepts); on the other hand, it exploits the discovery of new knowledge by means of the inference mechanisms of the semantic web.
Keywords: connectivism, learning analytics, lifelong learning, social semantic web
Procedia PDF Downloads 214
15307 Simulation Model of Induction Heating in COMSOL Multiphysics
Authors: K. Djellabi, M. E. H. Latreche
Abstract:
The induction heating phenomenon depends on various factors, making the problem highly nonlinear. The mathematical analysis of this problem is in most cases very difficult and is usually reduced to simple cases. Further knowledge of induction heating systems is generated through trial-and-error procedures in production environments, but these are long and expensive. Numerical models of the induction heating problem are another approach to reducing the above-mentioned drawbacks. This paper deals with a simulation model of the induction heating problem, created in COMSOL Multiphysics. We present results of numerical simulations of the induction heating process in cylindrical workpieces, in an inductor with four coils. The modelling of the induction heating process was done with COMSOL Multiphysics Version 4.2a; for the study, we present the temperature charts.
Keywords: induction heating, electromagnetic field, inductor, numerical simulation, finite element
Procedia PDF Downloads 316
15306 Developing Variable Repetitive Group Sampling Control Chart Using Regression Estimator
Authors: Liaquat Ahmad, Muhammad Aslam, Muhammad Azam
Abstract:
In this article, we propose a control chart for the location parameter based on a repetitive group sampling scheme. The charting scheme is based on the regression estimator, an estimator that capitalizes on the relationship between the variables of interest to provide more sensitive control than the commonly used individual variables. The control limit coefficients have been estimated for different sample sizes for weakly and highly correlated variables. The monitoring of the production process follows the procedure of the Shewhart x-bar control chart, and its performance is verified through average run length calculations when a shift occurs in the average value of the estimator. It has been observed that less correlated variables have a more rapid false alarm rate.
Keywords: average run length, control charts, process shift, regression estimators, repetitive group sampling
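The regression estimator the chart plots adjusts the sample mean of the quality variable by the observed deviation of a correlated auxiliary variable from its known in-control mean. A minimal sketch (variable names, the regression coefficient, and the subgroup values are illustrative, not from the article):

```python
def regression_estimate(y, x, mu_x, b):
    """Regression estimator of the process location: the sample mean of y
    adjusted by b times the deviation of the auxiliary variable's sample
    mean from its known in-control mean mu_x."""
    y_bar = sum(y) / len(y)
    x_bar = sum(x) / len(x)
    return y_bar + b * (mu_x - x_bar)

# Illustrative subgroup where the auxiliary variable ran high:
# the estimate is pulled back toward the in-control relationship.
est = regression_estimate(y=[10.2, 10.6, 10.4], x=[5.4, 5.6, 5.5],
                          mu_x=5.0, b=0.8)
```

Plotting `est` against limits built from its sampling variance, instead of plotting `y_bar` alone, is what gives the chart its extra sensitivity when the two variables are correlated.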
Procedia PDF Downloads 565
15305 Mobile Learning: Toward Better Understanding of Compression Techniques
Authors: Farouk Lawan Gambo
Abstract:
Data compression shrinks files into fewer bits than their original representation. It is especially advantageous on the internet because the smaller a file is, the faster it can be transferred. However, most of the concepts in data compression are abstract in nature, making them difficult to digest for some students (engineers in particular). To determine the best approach to learning data compression techniques, this paper first studies the learning preferences of engineering students, who tend to have strong active, sensing, visual and sequential learning preferences; it also considers the advantages that mobility brings to learning: learning at the point of interest, efficiency, connection, and many more. A survey was carried out, through random sampling, with a reasonable number of students to see whether taking account of learning preferences and the advantages of mobile learning gives a promising improvement over the traditional way of learning. Evidence from data analysis using MS Excel, chosen with a view to error-free findings, shows that there is a significant difference in students after using learning content provided on a smartphone; the findings, presented in bar charts and pie charts, indicate that mobile learning promises to be a valuable mode of learning.
Keywords: data analysis, compression techniques, learning content, traditional learning approach
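The abstract's premise — a compressed file holds fewer bits and therefore transfers faster — can be demonstrated concretely with Python's standard zlib module (the sample payload is arbitrary):

```python
import zlib

# Highly repetitive payloads compress well: the DEFLATE-compressed
# form is far smaller than the original byte string.
original = b"mobile learning content " * 200
packed = zlib.compress(original)
ratio = len(packed) / len(original)

# Decompression is lossless: the original bytes are fully recovered.
restored = zlib.decompress(packed)
```

A short interactive demo like this, runnable on a phone, is exactly the kind of concrete artefact that can make an otherwise abstract compression concept tangible for visual and sensing learners.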
Procedia PDF Downloads 347
15304 Crack Width Evaluation for Flexural RC Members with Axial Tension
Authors: Sukrit Ghorai
Abstract:
Proof of crack width control is a basic condition for ensuring suitable performance in the serviceability limit state. Cracking in concrete can occur at any time, from the time of casting to years after the concrete has been set in place. Most codes struggle to offer a procedure for crack width calculation, and there is a lack of design charts with which designers can compute crack widths with ease. The focus of this study is to utilize design charts and parametric equations to calculate crack widths with minimum error. The paper contains a simplified procedure to calculate the crack width for reinforced concrete (RC) sections subjected to bending with an axial tensile force, following the guidelines of the Eurocode [DS EN 1992-1-1 & DS EN 1992-1-2]. Numerical examples demonstrate the application of the suggested procedure. Comparison with parallel analytical tools supports the validity of the results and shows the percentage deviation of crack width between the two procedures. The technique is simple, user-friendly and ready to evolve for a greater spectrum of section sizes and materials.
Keywords: concrete structures, crack width calculation, serviceability limit state, structural design, bridge engineering
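For orientation, the Eurocode crack width check that such charts tabulate reduces to Eq. (7.8) of EN 1992-1-1, w_k = s_r,max · (ε_sm − ε_cm), with the maximum crack spacing from Eq. (7.11). A sketch with illustrative section values (dimensions in mm; the mean strain difference is assumed directly rather than computed from a full section analysis with axial tension):

```python
def max_crack_spacing(c, phi, rho_p_eff, k1=0.8, k2=0.5):
    """EN 1992-1-1 Eq. (7.11): s_r,max = 3.4*c + 0.425*k1*k2*phi/rho_p,eff.
    k1 = 0.8 for high-bond bars; k2 = 0.5 for pure bending."""
    return 3.4 * c + 0.425 * k1 * k2 * phi / rho_p_eff

def crack_width(s_r_max, eps_sm_minus_ecm):
    """EN 1992-1-1 Eq. (7.8): w_k = s_r,max * (eps_sm - eps_cm)."""
    return s_r_max * eps_sm_minus_ecm

# Illustrative section: 30 mm cover, 16 mm bars, 2% effective
# reinforcement ratio, assumed strain difference of 0.0008.
w_k = crack_width(max_crack_spacing(30.0, 16.0, 0.02), 0.0008)
```

A design chart effectively pre-evaluates this pair of equations over a grid of section parameters so that w_k can be read off directly and compared against the code limit.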
Procedia PDF Downloads 383
15303 Information Communication Technology in Early Childhood Education: An Assessment of the Quality of ICT in the New Mega Primary Schools in Ondo State, Southwestern Nigeria
Authors: Oluyemi Christianah Ojo
Abstract:
This study seeks to investigate the quality of ICT provided in the new Caring Heart schools in Ondo State, Nigeria. The population for the study was all Caring Heart Mega Schools in Ondo State, Nigeria. Research questions were generated, and two instruments, CCCMS and TQCUC, were used to elicit information from the schools and the teachers. The study adopts a descriptive survey approach. The study revealed and concluded that ICT components were available and adequate in these schools; charts showing ICT components and other forms of computer devices used as instructional materials were available but not adequate; and teachers teaching computer studies are competent in the delivery of instruction and in handling computer equipment in the laboratory. The study recommended the provision of steady electricity, uninterrupted internet facilities, and adequate ICT components and charts for effective teaching delivery and learning.
Keywords: facilities, information communication technology, mega primary school, primary education
Procedia PDF Downloads 295
15302 The Moment of the Optimal Average Length of the Multivariate Exponentially Weighted Moving Average Control Chart for Equally Correlated Variables
Authors: Edokpa Idemudia Waziri, Salisu S. Umar
Abstract:
Hotelling's T² is a well-known statistic for detecting a shift in the mean vector of a multivariate normal distribution, and control charts based on it have been widely used in statistical process control for monitoring multivariate processes. Although it is a powerful tool, the T² statistic is deficient when the shift to be detected in the mean vector is small and consistent. The Multivariate Exponentially Weighted Moving Average (MEWMA) control chart is one of the control statistics used to overcome this drawback of Hotelling's T². In this paper, the probability distribution of the run length of the MEWMA control chart, when the quality characteristics exhibit substantial cross-correlation and the process is in control or out of control, was derived using the Markov chain algorithm. The probability functions and the moments of the run length distribution were also obtained, and they were consistent with existing results for the in-control and out-of-control situations. By simulation, the procedure identified a class of Average Run Lengths (ARLs) for the MEWMA chart when the process is in control and out of control. From our study, it was observed that the MEWMA scheme is quite adequate for detecting a small shift and a good way to improve the quality of goods and services in a multivariate situation. It was also observed that as the in-control average run length (ARL0) or the number of variables (p) increases, the optimum value ARLopt increases asymptotically, and that as the magnitude of the shift σ increases, the optimal ARLopt decreases. Finally, we use an example from the literature to illustrate our method and demonstrate its efficiency.
Keywords: average run length, Markov chain, multivariate exponentially weighted moving average, optimal smoothing parameter
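The MEWMA recursion itself is compact. The Python sketch below is an illustration, not the authors' Markov-chain derivation: it computes the plotting statistic T_i² using the exact covariance of Z_i, which converges to the asymptotic λ/(2−λ)Σ form; the chart signals when T_i² exceeds a threshold h chosen to give the desired in-control ARL0:

```python
import numpy as np

def mewma_statistics(X, sigma, lam=0.1):
    """Plotting statistic T_i^2 of the MEWMA chart (Lowry et al. form).

    X     : (n, p) array of observations (in-control mean assumed zero)
    sigma : (p, p) in-control covariance matrix
    lam   : smoothing parameter, 0 < lam <= 1
    """
    n, p = X.shape
    z = np.zeros(p)
    t2 = np.empty(n)
    for i in range(n):
        z = lam * X[i] + (1 - lam) * z          # vector EWMA recursion
        # exact covariance of Z_i; tends to lam/(2-lam) * sigma
        cov_z = (lam / (2 - lam)) * (1 - (1 - lam) ** (2 * (i + 1))) * sigma
        t2[i] = z @ np.linalg.solve(cov_z, z)   # T_i^2 = Z' Sigma_Z^{-1} Z
    return t2
```

With lam = 1 the recursion forgets the past entirely and T_i² reduces to the per-observation Hotelling T² statistic, which is a convenient sanity check.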
Procedia PDF Downloads 422
15301 Road Maintenance Management Decision System Using Multi-Criteria and Geographical Information System for Takoradi Roads, Ghana
Authors: Eric Mensah, Carlos Mensah
Abstract:
The road maintenance backlogs created by deferred maintenance, especially in developing countries, have caused considerable deterioration of many road assets. This is usually due to difficulties in selecting and prioritising roads for maintenance on objective criteria rather than political or other less relevant considerations. To ensure judicious use of limited road maintenance resources, five factors were identified as the most important criteria for road management within the study area, based on the judgements of 40 experts. The results were then used to develop weightings with the Multi-Criteria Decision Process (MCDP) to analyse and select road alternatives according to the maintenance goal. Using Geographical Information Systems (GIS), maintainable roads were grouped using Jenks natural breaks so that they could be further prioritised in order of importance for display on a dashboard of maps, charts, and tables. This reduces the problem of subjective maintenance and road selection, thereby reducing wastage of resources and easing the maintenance process through an objectively organised spatial decision support system.
Keywords: decision support, geographical information systems, multi-criteria decision process, weighted sum
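The weighted-sum step named in the keywords can be sketched in a few lines. Road names, criteria, and weights below are illustrative, not the study's actual five expert-derived criteria or weightings:

```python
def weighted_sum_scores(roads, weights):
    """Weighted-sum maintenance-priority score for each road candidate.

    roads   : dict mapping road name -> dict of normalised criterion
              scores in [0, 1] (higher = more urgent to maintain)
    weights : dict mapping criterion -> weight (weights sum to 1)
    """
    return {
        name: sum(weights[c] * s for c, s in scores.items())
        for name, scores in roads.items()
    }

# Hypothetical example: two candidate roads, three criteria
roads = {
    "Market Circle Rd": {"condition": 0.9, "traffic": 0.8, "cost": 0.4},
    "Harbour Rd":       {"condition": 0.5, "traffic": 0.6, "cost": 0.7},
}
weights = {"condition": 0.5, "traffic": 0.3, "cost": 0.2}
ranked = sorted(weighted_sum_scores(roads, weights).items(),
                key=lambda kv: kv[1], reverse=True)
```

The ranked list is what would then be binned with Jenks natural breaks for display on the GIS dashboard.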
Procedia PDF Downloads 376
15300 Reduction of Defects Using Seven Quality Control Tools for Productivity Improvement at Automobile Company
Authors: Abdul Sattar Jamali, Imdad Ali Memon, Maqsood Ahmed Memon
Abstract:
Production quality approaching zero defects is an objective of every manufacturing and service organization. To maintain and improve quality by reducing defects, organizations use statistical tools, and many such tools are available for assessing quality. Among them, the traditional seven quality control (7QC) tools are widely used in manufacturing and the automobile industry, and they have accordingly been applied successfully at an automobile company in Pakistan. A preliminary survey was carried out before implementing the 7QC tools on the assembly line, and two inspection points were chosen for data collection: the chassis line and the trim line. Defect data at these two points were collected with the aim of reducing defects and thereby improving productivity. Each of the 7QC tools showed its benefits in the results. Flow charts were developed for a better understanding of the inspection points for data collection, and check sheets to support the collection of defect data. Histograms represent the severity level of defects, while Pareto charts show their cumulative effect. Cause-and-effect diagrams were developed to find the root causes of each defect, and scatter diagrams to show whether defects were increasing or decreasing. P-control charts were developed to flag out-of-control points beyond the limits for corrective action. The successful implementation of the 7QC tools at the inspection points produced a considerable reduction in defect levels: on the chassis line from 132 defects to 13 (a 90% reduction), and on the trim line from 157 defects to 28 (an 82% reduction).
Since the company exercised only a few of the 7QC tools, it has not reaped their full benefit. It is therefore suggested that the company establish a mechanism for applying the 7QC tools in every section.
Keywords: check sheet, cause and effect diagram, control chart, histogram
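The p-control chart limits mentioned above follow a standard recipe: a centre line at the overall fraction defective and 3-sigma limits around it. The sketch below uses illustrative defect counts, not the company's chassis- or trim-line data:

```python
import math

def p_chart_limits(defect_counts, sample_sizes):
    """Centre line and 3-sigma control limits for a p-chart.

    defect_counts : defective units found in each inspected sample
    sample_sizes  : units inspected per sample (taken equal here,
                    so the average sample size is used)
    """
    p_bar = sum(defect_counts) / sum(sample_sizes)   # overall fraction defective
    n = sum(sample_sizes) / len(sample_sizes)        # average sample size
    sigma = math.sqrt(p_bar * (1 - p_bar) / n)
    ucl = p_bar + 3 * sigma
    lcl = max(0.0, p_bar - 3 * sigma)                # LCL floored at zero
    return lcl, p_bar, ucl
```

Any sample whose observed fraction defective falls above the UCL is an out-of-control point calling for corrective action.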
Procedia PDF Downloads 326
15299 Non-Parametric Changepoint Approximation for Road Devices
Authors: Loïc Warscotte, Jehan Boreux
Abstract:
The scientific literature on changepoint detection is vast. Today, many methods are available to detect abrupt changes or slight drift in a signal, based on CUSUM or EWMA charts, for example. However, these methods rely on strong assumptions, such as stationarity of the underlying stochastic process, or independent, Gaussian-distributed noise at each time step. Recently, breakthrough research on locally stationary processes has widened the class of stochastic processes studied, with almost no assumptions on the signals or on the nature of the changepoint. Despite its accurate mathematical description, this methodology quickly suffers from impractical time and space complexity for signals with high-rate data collection when the characteristics of the process are completely unknown. In this paper, we therefore address the problem of making this theory usable for our purpose, which is monitoring a high-speed weigh-in-motion (HS-WIM) system towards direct enforcement without supervision. To this end, we first compute bounded approximations of the initial detection theory. Secondly, these approximating bounds are empirically validated by generating many independent long-run stochastic processes; both abrupt changes and drift are tested. Finally, the relaxed methodology is tested on real signals from an HS-WIM device in Belgium, collected over several months.
Keywords: changepoint, weigh-in-motion, process, non-parametric
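For contrast with the relaxed methodology, the classical CUSUM detector the paper cites can be written in a few lines. This Python sketch is illustrative, not the authors' method, and shows the strong-assumption baseline (known in-control mean, one-sided upward shift):

```python
def cusum_alarm(signal, mu0, k, h):
    """One-sided (upward) CUSUM: index of the first alarm, or None.

    mu0 : in-control mean of the signal
    k   : allowance (reference value), often half the shift of interest
    h   : decision threshold
    """
    s = 0.0
    for i, x in enumerate(signal):
        s = max(0.0, s + (x - mu0) - k)   # recursive CUSUM statistic
        if s > h:
            return i
    return None
```

On a signal that jumps from mean 0 to mean 2, the statistic climbs by 1.5 per post-shift sample (with k = 0.5) and crosses h = 4 on the third sample after the shift.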
Procedia PDF Downloads 78
15298 The Transformation of Architecture through the Technological Developments in History: Future Architecture Scenario
Authors: Adel Gurel, Ozge Ceylin Yildirim
Abstract:
Nowadays, design and architecture are being affected and are undergoing change with the rapid advancements in technology, economics, politics, society, and culture. Architecture has been transforming with the latest developments since the inclusion of computers in design. Integration of design into the computational environment has revolutionized architecture, and new perspectives in architecture have been gained. The history of architecture shows the various technological developments and changes through which architecture has transformed over time. Therefore, analysing the integration between technology and the history of the architectural process makes it possible to build a consensus on how architecture is to proceed. In this study, each period that arises from the integration of technology into architecture is addressed within the historical process. At the same time, changes in architecture brought about by technology are identified as important milestones, and predictions regarding the future of architecture are made. Developments and changes in technology, and the use of technology in architecture over the years, are analysed comparatively in charts and graphs. The historical process of architecture and its transformation through technology are supported by a detailed literature review and consolidated with an examination of the focal points of 20th-century architecture under the titles parametric design, genetic architecture, simulation, and biomimicry. It is concluded from this historical research between past and present that developments in architecture cannot keep up with advancements in technology; recent developments in technology overshadow architecture, and technology even decides the direction of architecture.
As a result, a scenario is presented with regard to the reach of technology in the future of architecture and the role of the architect.
Keywords: computer technologies, future architecture, scientific developments, transformation
Procedia PDF Downloads 191
15297 The Impact of Technology on Architecture and Graphic Designs
Authors: Feby Zaki Raouf Fawzy
Abstract:
Nowadays, design and architecture are being affected and undergoing change with the rapid advancements in technology, economics, politics, society, and culture. Architecture has been transforming with the latest developments since the inclusion of computers in design. Integration of design into the computational environment has revolutionized architecture, and unique perspectives in architecture have been gained. The history of architecture shows the various technological developments and changes through which architecture has transformed over time. Therefore, analysing the integration between technology and the history of the architectural process makes it possible to build a consensus on how architecture is to proceed. In this study, each period that arises from the integration of technology into architecture is addressed within the historical process. At the same time, changes in architecture brought about by technology are identified as important milestones, and predictions regarding the future of architecture are made. Developments and changes in technology, and the use of technology in architecture over the years, are analysed comparatively in charts and graphs. The historical process of architecture and its transformation through technology is supported by a detailed literature review and consolidated with an examination of the focal points of 20th-century architecture under the titles parametric design, genetic architecture, simulation, and biomimicry. It is concluded from this historical research between past and present that developments in architecture cannot keep up with advancements in technology; recent developments in technology overshadow architecture, and technology even decides the direction of architecture.
As a result, a scenario is presented with regard to the reach of technology in the future of architecture and the role of the architect.
Keywords: design and development the information technology architecture, enterprise architecture, enterprise architecture design result, TOGAF architecture development method (ADM)
Procedia PDF Downloads 69
15296 The Effect of Artificial Intelligence on Urbanism, Architecture and Environmental Conditions
Authors: Abanoub Rady Shaker Saleb
Abstract:
Nowadays, design and architecture are being affected and are undergoing change with the rapid advancements in technology, economics, politics, society, and culture. Architecture has been transforming with the latest developments since the inclusion of computers in design. Integration of design into the computational environment has revolutionized architecture, and new perspectives in architecture have been gained. The history of architecture shows the various technological developments and changes through which architecture has transformed over time. Therefore, analysing the integration between technology and the history of the architectural process makes it possible to build a consensus on how architecture is to proceed. In this study, each period that arises from the integration of technology into architecture is addressed within the historical process. At the same time, changes in architecture brought about by technology are identified as important milestones, and predictions regarding the future of architecture are made. Developments and changes in technology, and the use of technology in architecture over the years, are analysed comparatively in charts and graphs. The historical process of architecture and its transformation through technology are supported by a detailed literature review and consolidated with an examination of the focal points of 20th-century architecture under the titles parametric design, genetic architecture, simulation, and biomimicry. It is concluded from this historical research between past and present that developments in architecture cannot keep up with advancements in technology; recent developments in technology overshadow architecture, and technology even decides the direction of architecture.
As a result, a scenario is presented with regard to the reach of technology in the future of architecture and the role of the architect.
Keywords: design and development the information technology architecture, enterprise architecture, enterprise architecture design result, TOGAF architecture development method (ADM)
Procedia PDF Downloads 69
15295 An Excel-Based Educational Platform for Design Analyses of Pump-Pipe Systems
Authors: Mohamed M. El-Awad
Abstract:
This paper describes an educational platform for design analyses of pump-pipe systems using Microsoft Excel, its Solver add-in, and the associated VBA programming language. The paper demonstrates the capabilities of the Excel-based platform, which suits the iterative nature of the design process better than the use of design charts and data tables. While VBA is used to develop a user-defined function that determines the standard pipe diameter, Solver is used to optimise the pipe diameter of the pipeline and to determine the operating point of the selected pump.
Keywords: design analyses, pump-pipe systems, Excel, solver, VBA
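The operating-point step that the platform delegates to Solver can be sketched outside Excel: find the flow rate at which the pump curve meets the system curve. The Python sketch below is illustrative (the curves are hypothetical quadratics, not the platform's VBA code), solved by simple bisection:

```python
def operating_point(pump_head, system_head, q_lo, q_hi, tol=1e-6):
    """Flow rate where the pump curve meets the system curve, by bisection.

    pump_head, system_head : head [m] as functions of flow rate Q [m^3/s]
    q_lo, q_hi             : bracket with pump head above / below system head
    """
    def f(q):
        return pump_head(q) - system_head(q)   # zero at the operating point
    while q_hi - q_lo > tol:
        q_mid = 0.5 * (q_lo + q_hi)
        if f(q_lo) * f(q_mid) <= 0:
            q_hi = q_mid
        else:
            q_lo = q_mid
    return 0.5 * (q_lo + q_hi)

# Illustrative curves: quadratic pump curve vs. static lift plus friction loss
pump = lambda q: 40.0 - 500.0 * q**2     # H_pump = 40 - 500 Q^2
system = lambda q: 10.0 + 250.0 * q**2   # H_sys  = 10 + 250 Q^2
q_op = operating_point(pump, system, 0.0, 0.5)
```

For these curves the intersection is at Q = 0.2 m^3/s (where both heads equal 20 m), which is exactly the kind of root-finding iteration Solver performs on the worksheet.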
Procedia PDF Downloads 166
15294 Implementation of Total Quality Management in a Small Scale Industry: A Case Study
Authors: Soham Lalwala, Ronita Singh, Yaman Pattanaik
Abstract:
In the present scenario of globalization and privatization, it is difficult for small-scale industries to survive the rapidly increasing competition. In a developing country, most of the gross output is generally obtained from small-scale industries, so quality plays a vital role in maintaining customer satisfaction. Total quality management (TQM) is an approach that enables employees to focus on quality rather than quantity, thereby improving the competitiveness, effectiveness, and flexibility of the whole organization. The objective of this paper is to present the application of TQM and develop a TQM model in a small-scale narrow-fabrics industry in Surat, India, named 'Rajdhani Lace & Borders'. Further, critical success factors relating to all the fabric processes involved were identified. The data were collected through a questionnaire survey. Once the data were collected, critical areas were visualized using different TQM tools such as cause-and-effect diagrams, control charts, and run charts. Overall responses were analysed, and factor analysis was used to develop the model. The study presented here will aid the management of the above-mentioned industry in identifying the weaker areas and thus offer a plausible way to improve the total productivity of the firm, along with effective utilization of resources and better customer satisfaction.
Keywords: critical success factors, narrow fabrics, quality, small scale industries, total quality management (TQM)
Procedia PDF Downloads 253
15293 Pension Reform in Georgia: Challenges, International Practice and Opportunities for Development
Authors: Manana Lobzhanidze
Abstract:
Reforming the pension system is urgent in Georgia due to socio-economic problems. Replacing the current pension system with a new one requires, on the one hand, an assessment of the challenges in this field and, on the other, a study of the best practices in foreign experience. Objectives: The aim of the research is to identify challenges in the pension reform process in Georgia, to study international experience, and to develop recommendations for the implementation of an effective pension system. Methodologies: A desk study was conducted, using methods of analysis, comparison, grouping, matrix charts, and scenario analysis. Findings: The advantages of an accumulative pension compared with the current pension system are identified. The main challenges are the non-targeting of pension contributions and an ineffective investment policy; the public's attitude towards the accumulative pension system is also determined.
Keywords: pension reform, challenges, international practice, opportunity for development
Procedia PDF Downloads 67
15292 Post-Traumatic Stress Disorder: Management at the Montfort Hospital
Authors: Kay-Anne Haykal, Issack Biyong
Abstract:
Post-traumatic stress disorder (PTSD) arises from exposure to a traumatic event and manifests as persistent re-experiencing of that event. Several psychiatric co-morbidities are associated with PTSD, including mood disorders, anxiety disorders, and substance abuse. The main objective was to compare the PTSD criteria reported in the literature with those used to diagnose patients in a francophone hospital, and to check the correspondence between the two. 700 medical charts of patients admitted to the medicine or psychiatry unit at the Montfort Hospital between April 2005 and March 2006 were identified with the following diagnoses: major depressive disorder, bipolar disorder, anxiety disorder, substance abuse, and PTSD. Multiple demographic criteria were assembled. Also, for every chart analyzed, the PTSD criteria according to the Diagnostic and Statistical Manual of Mental Disorders (DSM-IV) were found, identified, and grouped according to pre-established codes. The data were analysed using the receiver operating characteristic (ROC) method. A sample of 57 women and 50 men was studied, with ages ranging from 18 to 88 years and a median age of 48. According to the DSM-IV criteria, 12 patients should have received a diagnosis of PTSD, as opposed to only two identified in the medical charts. The ROC analysis establishes that, combining the PTSD and depression data, sensitivity varies between 0.127 and 0.282 and specificity between 0.889 and 0.917; examining the PTSD data alone, sensitivity jumps to 0.50 and specificity varies between 0.781 and 0.895. This study confirms the presence of under-diagnosed and under-treated PTSD, which causes severe disturbance for the affected individual.
Keywords: post-traumatic stress disorder, co-morbidities, diagnosis, mental health disorders
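The sensitivity and specificity figures above come from a standard 2x2 confusion table of chart diagnoses against DSM-IV criteria. The sketch below shows the computation; the counts are illustrative, chosen only to land near the reported PTSD-alone figures, and are not taken from the study's data:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity and specificity from a 2x2 confusion table.

    tp, fn : charts meeting the criteria that were / were not diagnosed
    tn, fp : charts not meeting the criteria that were correctly /
             incorrectly left undiagnosed
    """
    sensitivity = tp / (tp + fn)   # true positive rate
    specificity = tn / (tn + fp)   # true negative rate
    return sensitivity, specificity

# Hypothetical counts for illustration only
sens, spec = sensitivity_specificity(tp=6, fn=6, tn=85, fp=10)
```

With these hypothetical counts, sensitivity is 0.50 and specificity is about 0.89, in the range the abstract reports for the PTSD data alone.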
Procedia PDF Downloads 387
15291 The Determinants of Corporate Social Responsibility Disclosure Extent and Quality: The Case of Jordan
Authors: Hani Alkayed, Belal Omar, Eileen Roddy
Abstract:
This study investigates the determinants of the extent and quality of Corporate Social Responsibility Disclosure (CSRD) in Jordan. It examines factors that influence the extent and quality of CSR disclosure, such as corporate characteristics (size, gearing, firm's age, and industry type), corporate governance (board size, number of meetings, non-executive directors, female directors on the board, family directors on the board, foreign members, audit committee, type of external auditor, and CEO duality), and ownership structure (government ownership, institutional ownership, and ownership concentration). Legitimacy theory is utilised as the main theory for the theoretical framework. A quantitative approach is adopted, and the content analysis technique is used to measure the extent and quality of CSR disclosure in annual reports. The sample is drawn from the annual reports of 118 Jordanian companies over the period 2010-2015. A CSRD index is constructed, covering the following categories: environmental, human resources, product and consumers, and community involvement. A 7-point scale was developed to measure the quality of disclosure, where 0 = no disclosure; 1 = general disclosures (non-monetary); 2 = general disclosures (non-monetary) with pictures, charts, and graphs; 3 = descriptive/qualitative disclosures with specific details (non-monetary); 4 = descriptive/qualitative disclosures with specific details, pictures, charts, and graphs; 5 = numeric disclosures, full descriptions with supporting numbers; and 6 = numeric disclosures, full descriptions with supporting numbers, pictures, and charts. This study fills a gap in the literature on CSRD in Jordan, since previous studies have lacked a clear categorisation for measuring quality. The results show that the extent of CSRD in Jordan is higher than its quality.
Regarding the determinants of CSR disclosure, the following were found to have a significant relationship with both the extent and the quality of CSRD, except for non-executive directors, where a significant relationship was found only with the extent of CSRD: board size, non-executive directors, firm's age, foreign members on the board, number of board meetings, the presence of an audit committee, Big 4 auditors, government ownership, firm's size, and industry type.
Keywords: content analysis, corporate governance, corporate social responsibility disclosure, Jordan, quality of disclosure
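One plausible way to turn the per-item scores on the 0-6 scale into separate extent and quality indices is sketched below. The index definitions here are an assumption for illustration (extent as the share of items disclosed at all, quality as the mean score normalised by the scale maximum), not quoted from the paper:

```python
def csrd_extent_and_quality(item_scores, max_score=6):
    """Extent and quality indices from per-item scores on the 0-6 scale.

    item_scores : one score per CSRD checklist item (0 = not disclosed)
    Extent  = share of items disclosed at all (score > 0).
    Quality = mean score over all items, normalised by the scale maximum.
    Both definitions are assumptions for illustration.
    """
    n = len(item_scores)
    extent = sum(1 for s in item_scores if s > 0) / n
    quality = sum(item_scores) / (n * max_score)
    return extent, quality
```

Under these definitions, a firm disclosing most items only in general, non-monetary terms would score a high extent but a low quality, which is the pattern the study reports for Jordan.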
Procedia PDF Downloads 230