Search results for: participatory error correction process
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 16876

16486 X̄ and S Control Charts Based on Weighted Standard Deviation Method

Authors: Derya Karagöz

Abstract:

A Shewhart chart based on the normality assumption is not appropriate for skewed distributions since its Type-I error rate is inflated. This study presents X̄ and S control charts for monitoring the process variability of skewed distributions. We propose Weighted Standard Deviation (WSD) X̄ and S control charts. The standard deviation estimator is used to estimate the process standard deviation for the WSD X̄ and S control charts because it is simple and easy to compute. Unlike the Shewhart control chart, the proposed charts construct upper and lower limits that are asymmetric in accordance with the direction and degree of skewness. The performances of the proposed charts are compared with other heuristic charts for skewed distributions by means of a simulation study. The simulation studies show that the proposed control charts have good properties for skewed distributions and large sample sizes.
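
For illustration, the sketch below shows one common form of the weighted standard deviation (WSD) adjustment found in the control-chart literature: the skewness weight is taken as P_X = P(X ≤ X̄), and the 3-sigma X̄ limits are widened asymmetrically by the factors 2P_X and 2(1 - P_X). The constants, the S-chart limits, and the estimators used in the paper may differ; this is an assumed minimal sketch, not the authors' implementation.

```python
import numpy as np

def wsd_xbar_limits(phase1_data, n):
    """Asymmetric 3-sigma X-bar limits using a weighted standard deviation (WSD)
    adjustment. `phase1_data` are in-control observations, `n` is the subgroup size.
    Illustrative sketch only; not the charts proposed in the paper."""
    mu = phase1_data.mean()
    sigma = phase1_data.std(ddof=1)          # simple standard deviation estimator
    p_x = np.mean(phase1_data <= mu)         # skewness weight P(X <= mean)
    se = sigma / np.sqrt(n)
    ucl = mu + 3 * se * 2 * p_x              # widened towards the long (right) tail
    lcl = mu - 3 * se * 2 * (1 - p_x)        # tightened on the short tail
    return lcl, mu, ucl

rng = np.random.default_rng(0)
skewed = rng.weibull(1.2, size=2000)         # a right-skewed in-control process
print(wsd_xbar_limits(skewed, n=5))
```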

Keywords: weighted standard deviation, MAD, skewed distributions, S control charts

Procedia PDF Downloads 373
16485 Forecasting Models for Steel Demand Uncertainty Using Bayesian Methods

Authors: Watcharin Sangma, Onsiri Chanmuang, Pitsanu Tongkhow

Abstract:

A forecasting model for steel demand uncertainty in Thailand is proposed. It consists of trend, autocorrelation, and outliers in a hierarchical Bayesian framework. The proposed model uses a cumulative Weibull distribution function, latent first-order autocorrelation, and binary selection to account for trend, time-varying autocorrelation, and outliers, respectively. Gibbs sampling Markov Chain Monte Carlo (MCMC) is used for parameter estimation. The proposed model is applied to steel demand index data in Thailand. The root mean square error (RMSE), mean absolute percentage error (MAPE), and mean absolute error (MAE) criteria are used for model comparison. The study reveals that the proposed model is more appropriate than the exponential smoothing method.

Keywords: forecasting model, steel demand uncertainty, hierarchical Bayesian framework, exponential smoothing method

Procedia PDF Downloads 326
16484 Controlling Youths Participation in Politics in Sokoto State: A Constructive Inclusiveness for Good Governance in Nigeria

Authors: Umar Ubandawaki

Abstract:

Political participation involves voluntary and deliberate efforts by the members of a political system to determine the kinds of political institutions and individuals that will govern them and, equally, to influence the mobilization and allocation of the available societal resources. Over the years, youths in Nigeria have participated actively in political party rallies and in voting to elect their leaders and representatives in governance. This paper examines the categories and nature of participation in politics, as well as the factors that draw youths into politics in Sokoto State. Through the use of qualitative and quantitative data generated from focus group discussions, interviews and questionnaires, the paper finds that youths in Sokoto State have been drawn into participatory activities that encourage political thuggery and manipulation of electoral outcomes. Moreover, they are neglected in the mobilization and allocation of the available resources of the society, i.e., they are denied the dividends of good governance. The paper recommends that youths should be engaged in positive participatory activities to ensure inclusiveness and promote good governance in Nigeria. It is hoped that this will enlighten youths and policy implementers on constructive strategies for controlling youths' participation in politics in Nigeria.

Keywords: democracy, governance, inclusiveness, participation, politics

Procedia PDF Downloads 324
16483 Implementation of a Lattice Boltzmann Method for Multiphase Flows with High Density Ratios

Authors: Norjan Jumaa, David Graham

Abstract:

We present a Lattice Boltzmann Method (LBM) for multiphase flows with high viscosity and density ratios. The motion of the interface between fluids is modelled by solving the Cahn-Hilliard (CH) equation with LBM. Incompressibility of the velocity fields in each phase is imposed by using a pressure correction scheme. We use a unified LBM approach with separate formulations for the phase field, the pressureless Navier-Stokes (NS) equations and the pressure Poisson equation required for correction of the velocity field. The implementation has been verified for various test cases. Here, we present results for some complex flow problems including two-dimensional single and multiple mode Rayleigh-Taylor instability, and we obtain good results when comparing with those in the literature. The main focus of our work is related to interactions between aerated or non-aerated waves and structures, so we also present results for both high viscosity and low viscosity waves.

Keywords: lattice Boltzmann method, multiphase flows, Rayleigh-Taylor instability, waves

Procedia PDF Downloads 214
16482 Alternating Expectation-Maximization Algorithm for a Bilinear Model in Isoform Quantification from RNA-Seq Data

Authors: Wenjiang Deng, Tian Mou, Yudi Pawitan, Trung Nghia Vu

Abstract:

Estimation of isoform-level gene expression from RNA-seq data depends on simplifying assumptions, such as uniform read distribution, that are easily violated in real data. Such violations typically lead to biased estimates. Most existing methods provide a bias correction step(s) that is based on biological considerations, such as GC content, and applied to single samples separately. The main problem is that not all biases are known. For example, new technologies such as single-cell RNA-seq (scRNA-seq) may introduce new sources of bias not seen in bulk-cell data. This study introduces a method called XAEM based on a more flexible and robust statistical model. Existing methods are essentially based on a linear model Xβ, where the design matrix X is known and derived based on the simplifying assumptions. In contrast, XAEM considers Xβ as a bilinear model with both X and β unknown. Joint estimation of X and β is made possible by simultaneous analysis of multi-sample RNA-seq data. Compared to existing methods, XAEM automatically performs empirical correction of potentially unknown biases. XAEM implements an alternating expectation-maximization (AEM) algorithm, alternating between estimation of X and β. For speed, XAEM utilizes quasi-mapping for read alignment, thus leading to a fast algorithm. Overall, XAEM performs favorably compared to other recent advanced methods. For simulated datasets, XAEM obtains higher accuracy for multiple-isoform genes, particularly for paralogs. In a differential-expression analysis of a real scRNA-seq dataset, XAEM achieves substantially greater rediscovery rates in an independent validation set.
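
The alternation described above (update β with X fixed, then X with β fixed) can be illustrated with plain alternating least squares on a toy bilinear model Y ≈ Xβ observed across several samples. This toy sketch is an assumption for illustration only; XAEM itself works on RNA-seq read counts with an EM formulation and quasi-mapping, none of which is reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)
K, I, C = 30, 4, 12                       # read classes, isoforms, samples (toy sizes)
X_true = np.abs(rng.normal(size=(K, I)))
beta_true = np.abs(rng.normal(size=(I, C)))
Y = X_true @ beta_true + 0.01 * rng.normal(size=(K, C))   # observed multi-sample data

# Alternating updates for the bilinear model Y ~ X beta with both factors unknown.
X = np.abs(rng.normal(size=(K, I)))        # random start
for _ in range(200):
    beta, *_ = np.linalg.lstsq(X, Y, rcond=None)           # beta-step, X fixed
    beta = np.clip(beta, 0, None)                           # keep expressions non-negative
    Xt, *_ = np.linalg.lstsq(beta.T, Y.T, rcond=None)       # X-step, beta fixed
    X = np.clip(Xt.T, 0, None)
    X /= X.sum(axis=0, keepdims=True) + 1e-12               # fix scale ambiguity per isoform

print("relative reconstruction error:",
      np.linalg.norm(Y - X @ beta) / np.linalg.norm(Y))
```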

Keywords: alternating EM algorithm, bias correction, bilinear model, gene expression, RNA-seq

Procedia PDF Downloads 124
16481 The Linear Combination of Kernels in the Estimation of the Cumulative Distribution Functions

Authors: Abdel-Razzaq Mugdadi, Ruqayyah Sani

Abstract:

The Kernel Distribution Function Estimator (KDFE) method is the most popular method for nonparametric estimation of the cumulative distribution function. The kernel and the bandwidth are the most important components of this estimator. In this investigation, we replace the kernel in the KDFE with a linear combination of kernels to obtain a new estimator. The mean integrated squared error (MISE), the asymptotic mean integrated squared error (AMISE) and the asymptotically optimal bandwidth for the new estimator are derived. We propose a new data-based method to select the bandwidth for the new estimator. The new technique is based on the plug-in technique in density estimation. We evaluate the new estimator and the new technique using simulations and real-life data.
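
A minimal numerical sketch of the underlying estimator: the KDFE averages integrated kernels over the sample, and the "linear combination of kernels" can be illustrated by mixing two integrated Gaussian kernels with weights summing to one. The particular combination, the derived MISE/AMISE expressions, and the plug-in bandwidth selector of the paper are not reproduced; the weights, scales, and bandwidth below are arbitrary assumptions.

```python
import numpy as np
from scipy.stats import norm

def kdfe(x, sample, h, weights=(0.5, 0.5), scales=(1.0, 2.0)):
    """Kernel distribution function estimate at points x using a linear
    combination of two Gaussian integrated kernels (illustrative choice)."""
    x = np.atleast_1d(x)[:, None]
    u = (x - sample[None, :]) / h
    mix = sum(w * norm.cdf(u / s) for w, s in zip(weights, scales))
    return mix.mean(axis=1)

rng = np.random.default_rng(2)
sample = rng.exponential(scale=1.0, size=300)
grid = np.linspace(0, 5, 6)
print(np.round(kdfe(grid, sample, h=0.3), 3))   # estimated CDF on the grid
print(np.round(1 - np.exp(-grid), 3))           # true exponential CDF for comparison
```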

Keywords: estimation, bandwidth, mean square error, cumulative distribution function

Procedia PDF Downloads 549
16480 Estimation of Slab Depth, Column Size and Rebar Location of Concrete Specimen Using Impact Echo Method

Authors: Y. T. Lee, J. H. Na, S. H. Kim, S. U. Hong

Abstract:

In this study, an experimental investigation of the estimation of slab depth, column size and rebar location of concrete specimens is conducted using the Impact Echo (IE) method, a non-destructive test method based on stress waves. The slab specimen for depth estimation had an overall plan of 1800×300 mm and six different depths: 150 mm, 180 mm, 210 mm, 240 mm, 270 mm and 300 mm. The concrete column specimens were manufactured in three sizes: 300×300×300 mm, 400×400×400 mm and 500×500×500 mm. For the rebar-location specimen, ∅22 mm rebar was used in a specimen of 300×370×200 mm and placed at 130 mm and 150 mm from the top surface to the top of the rebar. The error rate of the slab depth estimates had an overall mean of 3.1%, and that of the column size estimates an overall mean of 1.7%. The mean error rate of the rebar location was 1.72% for the top, 1.19% for the bottom and 1.5% overall, showing relatively high accuracy.

Keywords: impact echo method, estimation, slab depth, column size, rebar location, concrete

Procedia PDF Downloads 323
16479 Perceptual Image Coding by Exploiting Internal Generative Mechanism

Authors: Kuo-Cheng Liu

Abstract:

In perceptual image coding, the objective is to shape the coding distortion such that the amplitude of distortion does not exceed the error visibility threshold, or to remove perceptually redundant signals from the image. While most research focuses on color image coding, the perceptual-based quantizers developed for luminance signals are often applied directly to chrominance signals, so the resulting color image compression methods are inefficient. In this paper, the internal generative mechanism is integrated into the design of a color image compression method. The internal generative mechanism working model based on structure-based spatial masking is used to assess subjective distortion visibility thresholds that are more consistent with human visual perception. The estimation method of structure-based distortion visibility thresholds for color components is further presented in a locally adaptive way to design the quantization process in the wavelet color image compression scheme. Since the lowest subband coefficient matrix of images in the wavelet domain preserves the local property of images in the spatial domain, the error visibility threshold inherent in each coefficient of the lowest subband for each color component is estimated by using the proposed spatial error visibility threshold assessment. The threshold inherent in each coefficient of the other subbands for each color component is then estimated in a locally adaptive fashion based on the distortion energy allocation. Because the error visibility thresholds are estimated using predicted and reconstructed signals of the color image, the coding scheme incorporated with the locally adaptive perceptual color quantizer does not require side information. Experimental results show that the entropies of the three color components obtained by using the proposed IGM-based color image compression scheme are lower than those obtained by using the existing color image compression method at perceptually lossless visual quality.

Keywords: internal generative mechanism, structure-based spatial masking, visibility threshold, wavelet domain

Procedia PDF Downloads 221
16478 Neural Network Approaches for Sea Surface Height Predictability Using Sea Surface Temperature

Authors: Luther Ollier, Sylvie Thiria, Anastase Charantonis, Carlos E. Mejia, Michel Crépon

Abstract:

Sea Surface Height Anomaly (SLA) is a signature of the sub-mesoscale dynamics of the upper ocean. Sea Surface Temperature (SST) is driven by these dynamics and can be used to improve the spatial interpolation of SLA fields. In this study, we focused on the temporal evolution of SLA fields. We explored the capacity of deep learning (DL) methods to predict short-term SLA fields using SST fields. We used simulated daily SLA and SST data from the Mercator Global Analysis and Forecasting System, with a resolution of (1/12)° in the North Atlantic Ocean (26.5-44.42°N, -64.25 to -41.83°E), covering the period from 1993 to 2019. Using a slightly modified image-to-image convolutional DL architecture, we demonstrated that SST is a relevant variable for controlling the SLA prediction. With a learning process inspired by the teacher-forcing method, we managed to improve the SLA forecast at five days by using the SST fields as additional information. We obtained SLA predictions with errors of 12 cm (20 cm) for scales smaller than the mesoscale at lead times of 5 days (20 days), respectively. Moreover, the information provided by the SST allows us to limit the SLA error to 16 cm at 20 days when learning the trajectory.

Keywords: deep-learning, altimetry, sea surface temperature, forecast

Procedia PDF Downloads 61
16477 Expanding Chance of Palm Oil Market into ASEAN Community: Case Study of Choomporn Palm Oil Cooperative

Authors: Pichamon Chansuchai

Abstract:

This paper studied the opportunity for expanding the palm oil market into the ASEAN community, taking the Choomporn Palm Oil Cooperative as a qualitative case study. The purpose was to study and analyze the expansion and linkage of liberalized trade in palm oil products under the terms of cooperation among ASEAN countries. Data were collected using participatory observation, in-depth interviews and focus groups with government officials, palm oil cooperative members, entrepreneurs and farmers to exchange opinions. The study found that the major competitors are Indonesia and Malaysia, ASEAN member countries whose production potential exceeds that of Thailand. The Thai government must therefore adopt policies to increase the competitiveness of Thailand's palm oil. Grants from the Free Trade Area fund should be used to add value to agricultural products and palm oil and to develop standard products that meet the needs of the member countries. Creating a learning center for the palm oil sector can transfer knowledge, support the development of palm varieties, and provide solutions for the whole process from planting to harvest and post-harvest care. Such development would expand market opportunities for Thailand's palm oil and make it competitive in the neighboring countries and the region.

Keywords: palm oil, market, cooperative, ASEAN

Procedia PDF Downloads 465
16476 Modelling the Long Run of Aggregate Import Demand in Libya

Authors: Said Yousif Khairi

Abstract:

For a developing economy such as Libya, imports of capital goods, raw materials and manufactured goods are vital for sustainable economic growth. In 2006, Libya imported LD 8 billion (US$ 6.25 billion), composed mainly of machinery and transport equipment (49.3%), raw materials (18%), and food products and live animals (13%). This represented about 10% of GDP. Thus, it is pertinent to investigate the factors affecting the volume of Libyan imports. An econometric model representing the aggregate import demand for Libya was developed and estimated using the bounds test procedure, which is based on an unrestricted error correction model (UECM). The data employed for the estimation cover 1970-2010. The results of the bounds test reveal that the volume of imports and its determinants, namely real income, the consumer price index and the exchange rate, are co-integrated. The findings indicate that, in the short run, the demand for imports is inelastic with respect to income and the price index, while the exchange rate variable is statistically significant. In the long run, the income elasticity is elastic, while the price elasticity and the exchange rate remain inelastic. This indicates that imports are important elements of Libyan economic growth in the long run.
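
The unrestricted error correction model behind the bounds test can be sketched as an OLS regression of the first difference of (log) imports on lagged differences and on the lagged levels of imports, income, prices, and the exchange rate, followed by an F-test that the lagged-level coefficients are jointly zero. The data below are synthetic placeholders and the lag structure is an assumption; the paper's actual specification and critical values are not reproduced.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
T = 41                                               # e.g. annual data, 1970-2010
df = pd.DataFrame({
    "m": np.cumsum(rng.normal(0.02, 0.05, T)),       # log imports (synthetic)
    "y": np.cumsum(rng.normal(0.03, 0.04, T)),       # log real income
    "p": np.cumsum(rng.normal(0.01, 0.03, T)),       # log consumer price index
    "e": np.cumsum(rng.normal(0.00, 0.06, T)),       # log exchange rate
})

# UECM: d(m_t) = a + b*d(m_{t-1}) + c*d(y_t) + ... + g1*m_{t-1} + g2*y_{t-1} + g3*p_{t-1} + g4*e_{t-1}
d = df.diff()
X = pd.concat({
    "dm_lag1": d["m"].shift(1), "dy": d["y"], "dp": d["p"], "de": d["e"],
    "m_lag1": df["m"].shift(1), "y_lag1": df["y"].shift(1),
    "p_lag1": df["p"].shift(1), "e_lag1": df["e"].shift(1),
}, axis=1)
data = pd.concat([d["m"].rename("dm"), X], axis=1).dropna()
res = sm.OLS(data["dm"], sm.add_constant(data.drop(columns="dm"))).fit()

# The bounds test is an F-test that the lagged-level coefficients are jointly zero.
print(res.f_test("m_lag1 = 0, y_lag1 = 0, p_lag1 = 0, e_lag1 = 0"))
```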

Keywords: import demand, UECM, bounds test, Libya

Procedia PDF Downloads 336
16475 Analytical Performance of Cobas C 8000 Analyzer Based on Sigma Metrics

Authors: Sairi Satari

Abstract:

Introduction: Six-sigma is a metric that quantifies the performance of processes as a rate of Defects-Per-Million Opportunities. Sigma methodology can be applied in the chemical pathology laboratory for evaluating process performance, providing evidence for process improvement in the quality assurance program. In the laboratory, these methods have been used to improve the timeliness of troubleshooting, reduce the cost and frequency of quality control, and minimize pre- and post-analytical errors. Aim: The aim of this study is to evaluate the sigma values of the Cobas 8000 analyzer based on the minimum requirement of the specification. Methodology: Twenty-one analytes were chosen in this study. The analytes were alanine aminotransferase (ALT), albumin, alkaline phosphatase (ALP), amylase, aspartate transaminase (AST), total bilirubin, calcium, chloride, cholesterol, HDL-cholesterol, creatinine, creatine kinase, glucose, lactate dehydrogenase (LDH), magnesium, potassium, protein, sodium, triglyceride, uric acid and urea. Total error was obtained from the Clinical Laboratory Improvement Amendments (CLIA). The bias was calculated from the end-of-cycle report of the Royal College of Pathologists of Australasia (RCPA) cycle from July to December 2016, and the coefficient of variation (CV) from six months of internal quality control (IQC). Sigma was calculated based on the formula: Sigma = (Total Error - Bias) / CV. The analytical performance was evaluated based on sigma: sigma > 6 is world class, sigma > 5 is excellent, sigma > 4 is good, sigma between 3 and 4 is satisfactory, and sigma < 3 is poor performance. Results: Based on the calculation, we found that 76% are world class (ALT, albumin, ALP, amylase, AST, total bilirubin, cholesterol, HDL-cholesterol, creatinine, creatine kinase, glucose, LDH, magnesium, potassium, triglyceride and uric acid), 14% are excellent (calcium, protein and urea), and 10% (chloride and sodium) require more frequent IQC to be performed per day. Conclusion: Based on this study, we found that IQC should be performed more frequently only for chloride and sodium to ensure accurate and reliable analysis for patient management.
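
The sigma calculation quoted in the abstract is a one-line formula; the sketch below simply applies Sigma = (Total Error - Bias) / CV with placeholder TEa, bias, and CV values (the study's actual figures are not reproduced) and maps sigma to the performance bands listed above.

```python
def sigma_metric(total_allowable_error, bias, cv):
    """Sigma = (TEa - |Bias|) / CV, with all quantities expressed in percent."""
    return (total_allowable_error - abs(bias)) / cv

def performance_band(sigma):
    # Bands roughly as listed in the abstract.
    if sigma > 6:
        return "world class"
    if sigma > 5:
        return "excellent"
    if sigma > 4:
        return "good"
    if sigma >= 3:
        return "satisfactory"
    return "poor"

# Placeholder figures for illustration only (not the study's data):
for analyte, tea, bias, cv in [("glucose", 10.0, 1.2, 1.3),
                               ("sodium",   4.0, 0.8, 0.9)]:
    s = sigma_metric(tea, bias, cv)
    print(f"{analyte}: sigma = {s:.1f} ({performance_band(s)})")
```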

Keywords: sigma metrics, analytical performance, total error, bias

Procedia PDF Downloads 151
16474 Individual Cylinder Ignition Advance Control Algorithms of the Aircraft Piston Engine

Authors: G. Barański, P. Kacejko, M. Wendeker

Abstract:

The impact of the ignition advance control algorithms of the ASz-62IR-16X aircraft piston engine on the combustion process is presented in this paper. This aircraft engine is a nine-cylinder 1000 hp engine with a special electronic ignition control system. The engine has two spark plugs per cylinder with an ignition advance angle dependent on load and the rotational speed of the crankshaft. Accordingly, in most cases, these angles are not optimal for the power generated. The scope of this paper is focused on developing algorithms to control the ignition advance angle in the electronic ignition control system of the engine. For this type of engine, i.e. a radial engine, the ignition advance angle should be controlled independently for each cylinder because of the design of such an engine and its crankshaft system. The ignition advance angle is controlled in an open-loop way, which means that the control signal (i.e. the ignition advance angle) is determined according to previously developed maps, i.e. recorded tables of the correlation between the ignition advance angle and engine speed and load. Load can be measured by engine crankshaft speed or intake manifold pressure. Due to the limited memory of the controller, the impact of other independent variables (such as cylinder head temperature or knock) on the ignition advance angle is given as a series of one-dimensional arrays known as corrective characteristics. The ignition advance angle finally applied combines the value calculated from the primary characteristics and several correction factors calculated from the correction characteristics. Individual cylinder control can proceed in line with certain indicators determined from the pressure registered in the combustion chamber. Control is assumed to be based on the following indicators: maximum pressure, maximum pressure angle, and indicated mean effective pressure. Additionally, a knocking combustion indicator was defined. Individual control can be applied to a single set of spark plugs only, which results from two fundamental ideas behind the design of the control system: the two ignition control systems operate independently if both operate simultaneously, and the entire individual control is performed for the front spark plug only, while the rear spark plug is controlled with a fixed (or specified) offset relative to the front one or from a reference map. The developed algorithms will be verified by simulation and engine test stand experiments. This work has been financed by the Polish National Centre for Research and Development, INNOLOT, under Grant Agreement No. INNOLOT/I/1/NCBR/2013.

Keywords: algorithm, combustion process, radial engine, spark plug

Procedia PDF Downloads 270
16473 Modeling Studies on the Elevated Temperatures Formability of Tube Ends Using RSM

Authors: M. J. Davidson, N. Selvaraj, L. Venugopal

Abstract:

The elevated temperature forming behaviour of thin-walled tube ends during expansion has been studied in the present work. The influence of the process parameters, namely the die angle, the die ratio and the operating temperature, on the expansion of tube ends at elevated temperatures is investigated. The range of operating parameters has been identified by performing extensive simulation studies. The hot forming parameters have been evaluated for AA2014 alloy to perform the simulation studies. The experimental matrix has been developed from the feasible range obtained from the simulation results. Design of experiments is used for the optimization of the process parameters. The Response Surface Method (RSM) with a Box-Behnken design (BBD) is used for developing the mathematical model for expansion. Analysis of variance (ANOVA) is used to analyze the influence of the process parameters on the expansion of tube ends. The effects of various process parameter combinations on expansion are analyzed through graphical representations. The developed model is found to be appropriate, as the coefficient of determination is very high, at 0.9726. The predicted values are found to coincide well with the experimental results, within acceptable error limits.
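
As a sketch of the modeling step only: a full second-order (quadratic) response-surface model can be fitted to a Box-Behnken-style design with ordinary least squares and its coefficient of determination inspected. The design points and the expansion response below are synthetic placeholders, not the paper's experimental data.

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

# Coded factors: die angle, die ratio, temperature (a 15-run Box-Behnken-style layout).
X = np.array([[-1, -1, 0], [1, -1, 0], [-1, 1, 0], [1, 1, 0],
              [-1, 0, -1], [1, 0, -1], [-1, 0, 1], [1, 0, 1],
              [0, -1, -1], [0, 1, -1], [0, -1, 1], [0, 1, 1],
              [0, 0, 0], [0, 0, 0], [0, 0, 0]], dtype=float)
rng = np.random.default_rng(4)
y = (20 + 3 * X[:, 0] - 2 * X[:, 1] + 4 * X[:, 2]
     + 1.5 * X[:, 0] * X[:, 2] - 2.0 * X[:, 1] ** 2
     + rng.normal(0, 0.3, len(X)))                     # placeholder expansion response

quad = PolynomialFeatures(degree=2, include_bias=False)  # full second-order RSM model
model = LinearRegression().fit(quad.fit_transform(X), y)
print("R^2 =", round(model.score(quad.transform(X), y), 4))
```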

Keywords: expansion, optimization, Response Surface Method (RSM), ANOVA, BBD, residuals, regression, tube

Procedia PDF Downloads 484
16472 Modeling of Diurnal Pattern of Air Temperature in a Tropical Environment: Ile-Ife and Ibadan, Nigeria

Authors: Rufus Temidayo Akinnubi, M. O. Adeniyi

Abstract:

Existing diurnal air temperature models simulate night-time air temperature over Nigeria with high biases. An improved parameterization is presented for modeling the diurnal pattern of air temperature (Ta), applicable to the calculation of turbulent heat fluxes in global climate models, based on the Nigeria Micrometeorological Experimental site (NIMEX) surface layer observations. Five diurnal Ta models for estimating hourly Ta from the daily maximum, daily minimum, and daily mean air temperature were validated using the root mean square error (RMSE), mean bias error (MBE) and scatter graphs. The original Fourier series model showed better performance for unstable air temperature parameterizations, while the stable Ta was strongly overestimated with a large error. The model was improved with the inclusion of the atmospheric cooling rate, which accounts for the temperature inversion that occurs under nocturnal boundary layer conditions. The MBE and RMSE estimated by the modified Fourier series model were reduced by 4.45 °C and 3.12 °C during the transitional period from dry to wet stable atmospheric conditions. The modified Fourier series model gave a good estimation of the diurnal pattern of Ta when compared with other existing models for a tropical environment.
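
To make the idea concrete, the sketch below interpolates hourly Ta from the daily extremes with a single-harmonic Fourier term and adds a crude linear nocturnal cooling correction. The functional form, peak hour, and cooling rate are assumptions for illustration; the paper's five models and fitted coefficients are not reproduced.

```python
import numpy as np

def diurnal_temperature(hours, t_min, t_max, peak_hour=15.0, cooling_rate=0.0):
    """Hourly air temperature from daily extremes using a single-harmonic Fourier
    term, with an optional linear nocturnal cooling correction (illustrative form)."""
    t_mean = 0.5 * (t_max + t_min)
    amplitude = 0.5 * (t_max - t_min)
    ta = t_mean + amplitude * np.cos(2 * np.pi * (hours - peak_hour) / 24.0)
    night = (hours < 7) | (hours > 19)                 # crude nocturnal window
    return ta - cooling_rate * night * np.abs(hours - peak_hour) / 24.0

hours = np.arange(24)
print(np.round(diurnal_temperature(hours, t_min=22.0, t_max=33.0, cooling_rate=1.5), 1))
```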

Keywords: air temperature, mean bias error, Fourier series analysis, surface energy balance

Procedia PDF Downloads 206
16471 Enabling Gender Equality in Leadership: An Exploration of Leadership and Self-Awareness, Using Community Participatory Action Research Methods

Authors: Robyn Jackaman

Abstract:

This research explores the characterization of leadership, self-awareness, and gender identity within a higher education institution. It responds to the widely researched issue of gender at senior management levels and the contemporary reflection of this issue in leadership, where gender diversity is lacking. Through organizational platforms, the University has self-identified issues relating to gender, equality, and representation. With equality central to the project, a Community Participatory Action Research approach was implemented. This approach was chosen as it is recognized for facilitating change within community contexts, which complements the university campus culture. Seventeen semi-structured interviews gave qualitative insight into working habitus (from both professional and academic services), leadership attributions and qualities, and the significance of gender within the workplace. The cross-disciplinary research team used framework analysis to code and categorize the data. Key findings presented categories in the significance of gender to personal and work identity, organizational change, and positive reflections on leadership characteristics and roles. This research has helped support the creation of tools to better assist the organization in gender equality, inclusion, and leadership development.

Keywords: gendered work, gender equality, leadership, university organization

Procedia PDF Downloads 145
16470 Prediction of California Bearing Ratio of a Black Cotton Soil Stabilized with Waste Glass and Eggshell Powder using Artificial Neural Network

Authors: Biruhi Tesfaye, Avinash M. Potdar

Abstract:

The laboratory test process to determine the California bearing ratio (CBR) of black cotton soils is not only expensive but also time-consuming. Hence, advance prediction of CBR plays a significant role, as it is applicable in pavement design. The prediction of the CBR of treated soil was executed with Artificial Neural Networks (ANNs), a computational tool based on the properties of the biological neural system. To observe CBR values, a combined eggshell and waste glass powder was added to the soil at 4, 8, 12, and 16% of the weight of the soil samples. Accordingly, the related laboratory tests were conducted to obtain the data required for the best model. The maximum CBR value of 5.8 was found at 8% eggshell-waste glass powder addition. The model was developed using CBR as the output layer variable. CBR was considered as a function of the joint effect of the liquid limit, plastic limit, plasticity index, optimum moisture content and maximum dry density. The best model found was an ANN with 5, 6 and 1 neurons in the input, hidden and output layers, respectively. The performance of the selected ANN was 0.99996, 4.44E-05, 0.00353 and 0.0067 for the correlation coefficient (R), mean square error (MSE), mean absolute error (MAE) and root mean square error (RMSE), respectively. The research presented above throws light on the future scope of stabilization with waste glass combined with different percentages of eggshell, leading to an economical design with a CBR acceptable for the pavement sub-base or base, as desired.
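
A minimal sketch of a 5-6-1 feed-forward network of the kind described (five soil-index inputs, six hidden neurons, one CBR output), here built with scikit-learn. The training data are random placeholders, not the laboratory results, and the original study may have used a different ANN toolbox and training algorithm.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Inputs: liquid limit, plastic limit, plasticity index, OMC, maximum dry density.
rng = np.random.default_rng(5)
X = rng.uniform([30, 15, 10, 10, 1.4], [70, 35, 40, 25, 1.9], size=(40, 5))
cbr = 2.0 + 5.0 * X[:, 4] - 0.03 * X[:, 2] + rng.normal(0, 0.2, 40)   # placeholder target

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(6,), max_iter=5000, random_state=0),  # 5-6-1 topology
)
model.fit(X, cbr)
pred = model.predict(X)
rmse = np.sqrt(np.mean((cbr - pred) ** 2))
print(f"R = {np.corrcoef(cbr, pred)[0, 1]:.4f}, RMSE = {rmse:.4f}")
```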

Keywords: CBR, artificial neural network, liquid limit, plastic limit, maximum dry density, OMC

Procedia PDF Downloads 161
16469 Transforming Butterworth Low Pass Filter into Microstrip Line Form at LC-Band Applications

Authors: Liew Hui Fang, Syed Idris Syed Hassan, Mohd Fareq Abd. Malek, Yufridin Wahab, Norshafinash Saudin

Abstract:

This paper implements a new approach for transforming a lumped-element circuit into microstrip line form for a Butterworth low pass filter operating in the LC band. The filter's lumped-element circuit and microstrip line form were first designed and simulated using Advanced Design System (ADS) to obtain the best filter characteristics based on the S-parameters, and were implemented on an FR4 substrate for orders N = 3, 4, 5, 6, 7, 8 and 9. The importance of the new transformation approach is that a fringing correction factor is incorporated into the designed microstrip line. The ADS simulation results proved that the response of the microstrip line circuit of the Butterworth low pass filter with the fringing correction factor is in excellent agreement with its lumped circuit. This shows that the new approach of transforming a lumped-element circuit into microstrip line form is able to overcome the complexity and size of the conventional Butterworth low pass filter (LPF) circuit design in microstrip line form.
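
For context, the normalized lumped-element values that such a maximally flat design starts from follow the standard closed form g_k = 2 sin((2k - 1)π / 2N) with unit source and load terminations; the transformation to microstrip dimensions and the fringing correction factor themselves are not sketched here. In practice, each g_k is then scaled to an inductance or capacitance using the system impedance and cutoff frequency before the microstrip transformation is applied.

```python
import numpy as np

def butterworth_prototype(order):
    """Normalized element values g_1..g_N of a maximally flat low-pass prototype
    (source and load g_0 = g_{N+1} = 1)."""
    k = np.arange(1, order + 1)
    return 2.0 * np.sin((2 * k - 1) * np.pi / (2.0 * order))

for n in (3, 5, 9):
    print(n, np.round(butterworth_prototype(n), 4))   # e.g. N=3 gives [1, 2, 1]
```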

Keywords: Butterworth low pass filter, number of order, microstrip line, microwave filter, maximally flat

Procedia PDF Downloads 303
16468 A Comparative Analysis on QRS Peak Detection Using BIOPAC and MATLAB Software

Authors: Chandra Mukherjee

Abstract:

The present paper represents work done in the field of ECG signal analysis using the MATLAB 7.1 platform. An accurate and simple ECG feature extraction algorithm is presented in this paper, and the developed algorithm is validated using BIOPAC software. To detect the QRS peak, the ECG signal is processed through the following stages: first derivative, second derivative, and then squaring of that second derivative. The efficiency of the developed algorithm is tested on ECG samples from different databases and on real-time ECG signals acquired using the BIOPAC system. First, a lead-wise threshold value is specified; the samples above that value are marked, and the points in the original signal where these marked samples face a change of slope are spotted as R-peaks. On the left and right sides of the R-peak, the points facing a change of slope are identified as the Q and S peaks, respectively. The built-in detection algorithm of the BIOPAC software is then performed on the same sample, and both outputs are compared. ECG baseline modulation correction is done after detecting the characteristic points. The efficiency of the algorithm is tested using validation parameters such as sensitivity and positive predictivity, and satisfactory values of these parameters were obtained.
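
A rough sketch of the stated pipeline: first derivative, second derivative, squaring, a threshold on the squared signal, and then locating the R-peak at the local maximum of the original signal near each marked sample. The threshold ratio, refractory window, and the synthetic test signal are assumptions; the paper's lead-wise thresholds, Q/S slope-reversal search, and baseline correction are not reproduced.

```python
import numpy as np

def detect_r_peaks(ecg, fs, threshold_ratio=0.5, min_distance_s=0.3):
    """Mark candidate R-peaks: square the second derivative, threshold it, then take
    the local maximum of the original signal around each marked sample."""
    d1 = np.diff(ecg, n=1)                 # first derivative
    d2 = np.diff(d1, n=1)                  # second derivative
    feature = d2 ** 2                      # squaring stage
    mask = feature > threshold_ratio * feature.max()
    peaks, last = [], -np.inf
    for idx in np.flatnonzero(mask):
        if idx - last > min_distance_s * fs:           # simple refractory period
            lo, hi = max(idx - 20, 0), min(idx + 20, len(ecg))
            peaks.append(lo + int(np.argmax(ecg[lo:hi])))
            last = idx
    return np.array(peaks)

fs = 250
t = np.arange(0, 5, 1 / fs)
ecg = 0.1 * np.sin(2 * np.pi * 1.0 * t)                # baseline wander
ecg[np.arange(60, len(t), 200)] += 1.0                 # crude R spikes every 0.8 s
print(detect_r_peaks(ecg, fs))
```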

Keywords: first derivative, variable threshold, slope reversal, baseline modulation correction

Procedia PDF Downloads 382
16467 Performance Evaluation of Production Schedules Based on Process Mining

Authors: Kwan Hee Han

Abstract:

The external environment of enterprises is changing rapidly, driven mainly by global competition, cost-reduction pressures, and new technology. In this situation, the production scheduling function plays a critical role in meeting customer requirements and attaining the goal of operational efficiency. It deals with short-term decision making in the production process of the whole supply chain. The major task of production scheduling is to seek a balance between customer orders and limited resources. In manufacturing companies, this task is difficult because it should efficiently utilize resource capacity under the careful consideration of many interacting constraints. At present, many computerized software solutions are utilized in enterprises to generate a realistic production schedule and overcome the complexity of schedule generation. However, most production scheduling systems do not provide sufficient information about the validity of the generated schedule beyond limited statistics. Process mining only recently emerged as a sub-discipline of both data mining and business process management. Process mining techniques enable the useful analysis of a wide variety of processes, such as process discovery, conformance checking, and bottleneck analysis. In this study, the performance of a generated production schedule is evaluated by mining the event log data of a production scheduling software system using process mining techniques, since every software system generates event logs for further use such as security investigation, auditing and error debugging. An application of the process mining approach is proposed for validating the goodness of production schedules generated by scheduling software systems. Using process mining techniques, major evaluation criteria such as the utilization of workstations, the existence of bottleneck workstations, critical process route patterns, and the work load balance of each machine over time are measured, and finally, the goodness of the production schedule is evaluated. By using the proposed process mining approach for evaluating the performance of generated production schedules, the quality of the production schedules of manufacturing enterprises can be improved.
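
One of the evaluation criteria named above, workstation utilization over the scheduling horizon, can be computed directly from an event log that records start and completion timestamps. The toy log and column names below are assumptions for illustration; real logs (e.g. in XES format) and dedicated process-mining tooling would normally be used.

```python
import pandas as pd

# A toy event log: one row per operation with start/complete timestamps.
log = pd.DataFrame({
    "order":       ["J1", "J1", "J2", "J2", "J3"],
    "workstation": ["WS1", "WS2", "WS1", "WS3", "WS1"],
    "start":    pd.to_datetime(["2024-05-01 08:00", "2024-05-01 10:00",
                                "2024-05-01 09:30", "2024-05-01 11:30",
                                "2024-05-01 13:00"]),
    "complete": pd.to_datetime(["2024-05-01 09:30", "2024-05-01 11:00",
                                "2024-05-01 10:30", "2024-05-01 13:00",
                                "2024-05-01 15:00"]),
})

horizon = (log["complete"].max() - log["start"].min()).total_seconds()
busy = (log["complete"] - log["start"]).dt.total_seconds().groupby(log["workstation"]).sum()
utilization = (busy / horizon).sort_values(ascending=False)
print(utilization.round(2))     # candidate bottlenecks have the highest utilization
```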

Keywords: data mining, event log, process mining, production scheduling

Procedia PDF Downloads 252
16466 Automatic Facial Skin Segmentation Using Possibilistic C-Means Algorithm for Evaluation of Facial Surgeries

Authors: Elham Alaee, Mousa Shamsi, Hossein Ahmadi, Soroosh Nazem, Mohammad Hossein Sedaaghi

Abstract:

The human face has a fundamental role in the appearance of individuals, so the importance of facial surgeries is undeniable. Thus, there is a need for appropriate and accurate facial skin segmentation in order to extract different features. Since the Fuzzy C-Means (FCM) clustering algorithm does not work appropriately for noisy images and outliers, in this paper we exploit the Possibilistic C-Means (PCM) algorithm in order to segment the facial skin. For this purpose, we first convert facial images from the RGB to the YCbCr color space. To evaluate the performance of the proposed algorithm, the database of Sahand University of Technology, Tabriz, Iran, was used. In order to gain a better understanding of the proposed algorithm, the FCM and Expectation-Maximization (EM) algorithms are also used for facial skin segmentation. The proposed method shows better results than the other segmentation methods. Results include a misclassification error of 0.032 and a region area error of 0.045 for the proposed algorithm.

Keywords: facial image, segmentation, PCM, FCM, skin error, facial surgery

Procedia PDF Downloads 558
16465 Assessing the Preparedness of Teachers for Their Role in an Inclusive Classroom: Photo-Voice as a Reflexive Tool

Authors: Nan Stevens

Abstract:

Photo-voice is a participatory method through which participants identify and represent their lived experiences and contexts through the use of photo imagery. Photo-voice is a qualitative research method that explores individuals' lived experiences and is known as a creative art form that helps researchers listen to the 'voice' of a certain population. A teacher educator at Thompson Rivers University, responsible for preparing new teachers for the demands of the profession in an ever-changing demographic, utilized the Photo-voice method to enable a self-study of emerging teachers' readiness for the inclusive classroom. Coding analysis was applied to 96 Photo-voice portfolios, created over two years within the Inclusive Education coursework of a Bachelor of Education (Elementary) program. Coding drew on students' written associations with their visual images, anecdotes attached to visual metaphors, and personal narratives that illustrated the professional development process in which they were engaged. Thematic findings include: 1) becoming an inclusive educator is a process; 2) one must be open to identifying and exploring one's fears and biases; and 3) an attitudinal shift enables relevant skill acquisition and readiness for working with diverse student needs.

Keywords: teacher education, inclusive education, professional development, Photo-voice

Procedia PDF Downloads 112
16464 Low-Cost Reversible Logic Serial Multipliers with Error Detection Capability

Authors: Mojtaba Valinataj

Abstract:

Nowadays, reversible logic has received much attention as one of the new fields for reducing power consumption. On the other hand, processing systems are vulnerable to different external effects. In this paper, some error-detecting reversible logic serial multipliers are proposed by incorporating parity-preserving gates. In this way, new designs are presented for signed parity-preserving serial multipliers based on Booth's algorithm by exploiting new arrangements of existing gates. The experimental results show that the proposed 4×4 multipliers reach up to 20%, 35%, and 41% enhancements in the number of constant inputs, quantum cost, and gate count, respectively, as reversible logic criteria, compared to previous designs. Furthermore, all the proposed designs have been generalized to n×n multipliers with general formulations to estimate the main reversible logic criteria as functions of the multiplier size.

Keywords: Booth’s algorithm, error detection, multiplication, parity-preserving gates, quantum computers, reversible logic

Procedia PDF Downloads 199
16463 Dialogue Meetings as an Arena for Collaboration and Reflection among Researchers and Practitioners

Authors: Kerstin Grunden, Ann Svensson, Berit Forsman, Christina Karlsson, Ayman Obeid

Abstract:

The research question of this article is whether the dialogue meetings method could be relevant for reflective learning among researchers and practitioners when welfare technology is to be implemented in municipalities. A testbed was planned for implementation in a retirement home in a Swedish municipality, and the practitioners worked on a pre-study of that testbed. In the article, the dialogue between the researchers and the practitioners in the dialogue meetings is described and analyzed. The potential of dialogue meetings as an arena for learning and reflection among researchers and practitioners is discussed. The research methodology approach is participatory action research with mixed methods (dialogue meetings, focus groups, participant observations). The main findings from the dialogue meetings were that the researchers learned more about the use of traditional research methods, and the practitioners learned more about how they could improve their use of the methods to facilitate change processes in their organization. These findings have the potential, for both the researchers and the practitioners, to result in more relevant use of research methods in change processes in organizations. It is concluded that dialogue meetings can be relevant for reflective learning among researchers and practitioners when welfare technology is to be implemented in a health care organization.

Keywords: dialogue meetings, implementation, reflection, test bed, welfare technology, participatory action research

Procedia PDF Downloads 117
16462 The Relevance of PISA Tests in the Decentralization of the Educational System in Romania

Authors: Nitu Marilena Cristina

Abstract:

Decentralization of the education system is an educational policy option that is necessary from the perspective of democratizing internal life and streamlining public service administration. The experience of recent years has shown that decisions taken at the central level do not take into account all situations and, especially, all the specific needs and interests of the various institutions and individuals. A democratic society implies that the decision-making process is brought closer to the place of application, allowing citizens to take part in decision-making that affects them directly or indirectly. Essentially, decentralization of pre-university education is the transfer of authority, responsibility and resources, in decision-making, general management and finance, to the educational units and the local community. This creates a framework for effective collaboration between school and community. Modern theories on the leadership of education advocate the adoption of decentralization measures and participatory strategies. Numerous countries confronted with an educational impasse have appealed to these strategies. Reform projects have begun to apply diversified and nuanced social decentralization models according to the specific social and educational situation. Analysis of the legal provisions and measures adopted in the framework of the reform process indicates that, at least formally, decentralization is the solution chosen.

Keywords: decentralization, educational, management, reforming

Procedia PDF Downloads 142
16461 Structural and Morphological Characterization of Inorganic Deposits in Spinal Ligaments

Authors: Sylwia Orzechowska, Andrzej Wróbel, Eugeniusz Rokita

Abstract:

Mineralization is a curious problem of connective tissues. The factors that may play a decisive role in the regulation of yellow ligament (YL) mineralization are still open questions. The aim of these studies was a detailed description of the chemical composition and morphology of mineral deposits in human yellow ligaments. Investigations of the structural features of the deposits were used to explain the impact of various factors on the mineralization process. The studies were carried out on 24 YL samples, surgically removed from patients suffering from spinal canal stenosis and from patients who sustained a trauma. Micro-computed tomography was used to describe the morphology of the mineral deposits. The X-ray fluorescence method and Fourier transform infrared spectroscopy were applied to determine the chemical composition of the samples. In order to eliminate the effect of blur in the microtomographic images, a correction method for the partial volume effect was used. Mineral deposits appear in 60% of the YL samples, both in patients with stenosis and in patients following injury. The mineral deposits have a heterogeneous structure and are a mixture of tissue and mineral grains. The volume of the mineral grains amounts to (1.9 ± 3.4)×10⁻³ mm³, while the density distribution of the grains occurs in two distinct ranges (1.75-2.15 and 2.15-2.5 g/cm³). Application of the partial volume effect correction allows accurate calculations by eliminating the averaging effect of gray levels in tomographic images. B-type carbonate-containing hydroxyapatite constitutes the mineral phase of the majority of YLs. The main phase of two samples was calcium pyrophosphate dihydrate (CPPD). The elemental composition of the minerals in all samples is almost identical. This pathology may be independent of spinal diseases, and it does not evoke canal stenosis. The two ranges of grain density indicate two stages of grain growth and degrees of maturity. The presence of CPPD crystals may coexist with other pathologies.

Keywords: FTIR, micro-tomography, mineralization, spinal ligaments

Procedia PDF Downloads 359
16460 Using the Bootstrap for Problems in Statistics

Authors: Brahim Boukabcha, Amar Rebbouh

Abstract:

The bootstrap method, based on the idea of exploiting all the information provided by the initial sample, allows us to study the properties of estimators. In this article, we present a theoretical study of the different bootstrap methods and use the resampling technique in statistical inference to calculate the standard error of an estimator, such as the mean, and to determine a confidence interval for an estimated parameter. We apply and test these methods in regression models and the Pareto model, giving the best approximations.
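
A minimal sketch of the nonparametric bootstrap described here: resample the data with replacement, recompute the statistic on each resample, and read off the bootstrap standard error and a percentile confidence interval. The number of resamples, the seed, and the Pareto-type sample are arbitrary illustrative choices.

```python
import numpy as np

def bootstrap(sample, statistic, n_boot=5000, alpha=0.05, seed=0):
    """Bootstrap standard error and percentile confidence interval of a statistic."""
    rng = np.random.default_rng(seed)
    n = len(sample)
    reps = np.array([statistic(rng.choice(sample, size=n, replace=True))
                     for _ in range(n_boot)])
    ci = np.percentile(reps, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    return reps.std(ddof=1), ci

rng = np.random.default_rng(42)
data = (rng.pareto(3.0, size=200) + 1.0) * 2.0            # a Pareto-type sample
se_mean, ci_mean = bootstrap(data, np.mean)
se_med, ci_med = bootstrap(data, np.median)
print(f"mean:   SE = {se_mean:.3f}, 95% CI = {np.round(ci_mean, 3)}")
print(f"median: SE = {se_med:.3f}, 95% CI = {np.round(ci_med, 3)}")
```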

Keywords: bootstrap, standard error, bias, jackknife, mean, median, variance, confidence interval, regression models

Procedia PDF Downloads 359
16459 The AI Method and System for Analyzing Wound Status in Wound Care Nursing

Authors: Ho-Hsin Lee, Yue-Min Jiang, Shu-Hui Tsai, Jian-Ren Chen, Mei-Yu XU, Wen-Tien Wu

Abstract:

This project presents an AI-based method and system for wound status analysis. The system uses a three-in-one sensor device, combining color and temperature sensing with a 3D sensor, to analyze wound status and provide wound information up to 2 mm below the surface, such as redness, heat, and blood circulation information. The system has a 90% accuracy rate, requiring only one manual correction in 70% of cases, with a one-second delay. The system also provides an offline application that allows for manual correction of the wound bed range using color-based guidance to estimate wound bed size with 96% accuracy and a maximum of one manual correction in 96% of cases, with a one-second delay. Additionally, AI-assisted wound bed range selection achieves 100% of cases without manual intervention, with an accuracy rate of 76%, while AI-based wound tissue type classification achieves an 85.3% accuracy rate for five categories. The AI system also includes similar case search and expert recommendation capabilities. For AI-assisted wound range selection, the system uses Wi-Fi 6 technology, increasing data transmission speeds by 22 times. The project aims to save up to 64% of the time required for human wound record keeping and reduce the estimated time to assess wound status by 96%, with an 80% accuracy rate. Overall, the proposed AI method and system integrate multiple sensors to provide accurate wound information and offer offline and online AI-assisted wound bed size estimation and wound tissue type classification. The system decreases delay time to one second, reduces the number of manual corrections required, saves time on wound record keeping, and increases data transmission speed, all of which have the potential to significantly improve wound care and management efficiency and accuracy.

Keywords: wound status analysis, AI-based system, multi-sensor integration, color-based guidance

Procedia PDF Downloads 80
16458 Meta Mask Correction for Nuclei Segmentation in Histopathological Image

Authors: Jiangbo Shi, Zeyu Gao, Chen Li

Abstract:

Nuclei segmentation is a fundamental task in digital pathology analysis and can be automated by deep learning-based methods. However, the development of such an automated method requires a large amount of data with precisely annotated masks, which is hard to obtain. Training with weakly labeled data is a popular solution for reducing the workload of annotation. In this paper, we propose a novel meta-learning-based nuclei segmentation method which follows the label correction paradigm to leverage data with noisy masks. Specifically, we design a fully convolutional meta-model that can correct noisy masks by using a small amount of clean meta-data. The corrected masks are then used to supervise the training of the segmentation model. Meanwhile, a bi-level optimization method is adopted to alternately update the parameters of the main segmentation model and the meta-model. Extensive experimental results on two nuclei segmentation datasets show that our method achieves state-of-the-art results. In particular, in some noise scenarios, it even exceeds the performance of training on supervised data.

Keywords: deep learning, histopathological image, meta-learning, nuclei segmentation, weak annotations

Procedia PDF Downloads 120
16457 Model-Driven and Data-Driven Approaches for Crop Yield Prediction: Analysis and Comparison

Authors: Xiangtuo Chen, Paul-Henry Cournéde

Abstract:

Crop yield prediction is a paramount issue in agriculture. The main idea of this paper is to find an efficient way to predict the yield of corn based on meteorological records. The prediction models used in this paper can be classified into model-driven approaches and data-driven approaches, according to their modeling methodologies. The model-driven approaches are based on mechanistic crop modeling. They describe crop growth in interaction with the environment as dynamical systems. But the calibration of such a dynamical system is difficult because it turns out to be a multidimensional non-convex optimization problem. An original contribution of this paper is to propose a statistical methodology, Multi-Scenarios Parameters Estimation (MSPE), for the parametrization of potentially complex mechanistic models from a new type of dataset (climatic data and final yield in many situations). It is tested with CORNFLO, a crop model for maize growth. On the other hand, the data-driven approach to yield prediction is free of the complex biophysical process, but it has some strict requirements on the dataset. A second contribution of the paper is the comparison of these model-driven methods with classical data-driven methods. For this purpose, we consider two classes of regression methods: methods derived from linear regression (Ridge and Lasso Regression, Principal Components Regression or Partial Least Squares Regression) and machine learning methods (Random Forest, k-Nearest Neighbor, Artificial Neural Network and SVM regression). The dataset consists of 720 records of corn yield at county scale provided by the United States Department of Agriculture (USDA) and the associated climatic data. A 5-fold cross-validation process and two accuracy metrics, the root mean square error of prediction (RMSEP) and the mean absolute error of prediction (MAEP), were used to evaluate the crop prediction capacity. The results show that among the data-driven approaches, Random Forest is the most robust and generally achieves the best prediction error (MAEP 4.27%). It also outperforms our model-driven approach (MAEP 6.11%). However, the method to calibrate the mechanistic model from easily accessible datasets offers several side perspectives. The mechanistic model can potentially help to highlight the stresses suffered by the crop or to identify biological parameters of interest for breeding purposes. For this reason, an interesting perspective is to combine these two types of approaches.
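
A sketch of the data-driven baseline only: a Random Forest regressor evaluated with 5-fold cross-validation and the two accuracy metrics named in the abstract (RMSEP and MAEP, here expressed as a percentage of the mean yield, which is an assumption). The features and yields are synthetic placeholders, not the USDA dataset, and the hyperparameters are arbitrary.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import KFold

rng = np.random.default_rng(7)
X = rng.normal(size=(720, 10))                                  # placeholder climate features
y = 9.0 + X[:, 0] - 0.5 * X[:, 3] + rng.normal(0, 0.4, 720)     # placeholder county yields

rmsep, maep = [], []
for train, test in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
    model = RandomForestRegressor(n_estimators=300, random_state=0)
    model.fit(X[train], y[train])
    err = model.predict(X[test]) - y[test]
    rmsep.append(np.sqrt(np.mean(err ** 2)) / y[test].mean() * 100)   # % of mean yield
    maep.append(np.mean(np.abs(err)) / y[test].mean() * 100)

print(f"RMSEP = {np.mean(rmsep):.2f}%, MAEP = {np.mean(maep):.2f}%")
```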

Keywords: crop yield prediction, crop model, sensitivity analysis, parameter estimation, particle swarm optimization, random forest

Procedia PDF Downloads 209