Search results for: ordinal scale
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6033

6033 Evaluation of the Effect of Learning Disabilities and Accommodations on the Prediction of the Exam Performance: Ordinal Decision-Tree Algorithm

Authors: G. Singer, M. Golan

Abstract:

Providing students with learning disabilities (LD) with extra time to grant them equal access to the exam is a necessary but insufficient condition to compensate for their LD; there should also be a clear indication that the additional time was actually used. For example, if students with LD use more time than students without LD and yet receive lower grades, this may indicate that a different accommodation is required. If they achieve higher grades but use the same amount of time, then the effectiveness of the accommodation has not been demonstrated. The main goal of this study is to evaluate the effect of including parameters related to LD and extended exam time, along with other commonly used characteristics (e.g., student background and ability measures such as high-school grades), on the ability of ordinal decision-tree algorithms to predict exam performance. We use naturally occurring data collected from hundreds of undergraduate engineering students. The sub-goals are i) to examine the improvement in prediction accuracy when the indicator of exam performance includes 'actual time used' in addition to the conventional indicator (exam grade) employed in most research; ii) to explore the effectiveness of extended exam time on exam performance for different courses and for LD students with different profiles (i.e., sets of characteristics). This is achieved by using the patterns (i.e., subgroups) generated by the algorithms to identify pairs of subgroups that differ in just one characteristic (e.g., course or type of LD) but have different outcomes in terms of exam performance (grade and time used). Since grade and time used exhibit an ordering form, we propose a method based on ordinal decision trees, which applies a weighted information-gain ratio (WIGR) measure for selecting the classifying attributes. Unlike other known ordinal algorithms, our method does not assume monotonicity in the data. The proposed WIGR is an extension of an information-theoretic measure, in the sense that it adjusts to the case of an ordinal target and takes into account the error severity between two different target classes. Specifically, we use ordinal C4.5, random-forest, and AdaBoost algorithms, as well as an ensemble technique composed of ordinal and non-ordinal classifiers. Firstly, we find that the inclusion of LD and extended exam-time parameters improves the prediction of exam performance (compared to specifications of the algorithms that do not include these variables). Secondly, when the indicator of exam performance includes 'actual time used' together with grade (as opposed to grade only), the prediction accuracy improves. Thirdly, our subgroup analyses show clear differences in the effect of extended exam time on exam performance among different courses and different student profiles. From a methodological perspective, we find that the ordinal decision-tree-based algorithms outperform their conventional, non-ordinal counterparts. Further, we demonstrate that the ensemble-based approach leverages the strengths of each type of classifier (ordinal and non-ordinal) and yields better performance than each classifier individually.
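The exact weighted information-gain ratio (WIGR) is specific to this study, but the general idea of scaling a C4.5-style gain ratio by the severity of ordinal errors can be sketched. The Python snippet below is a minimal illustration under assumed definitions, not the authors' WIGR: it scores a candidate binary split by the usual gain ratio and then weights it by how much the split reduces ordinal dispersion (mean absolute deviation from the median class); all function and variable names are hypothetical.

```python
import numpy as np

def entropy(labels):
    """Shannon entropy (bits) of a label vector."""
    if len(labels) == 0:
        return 0.0
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

def ordinal_dispersion(labels):
    """Mean absolute deviation from the median class: a crude measure of how
    ordinally 'spread out' the labels in a node are."""
    if len(labels) == 0:
        return 0.0
    labels = np.asarray(labels, dtype=float)
    return float(np.mean(np.abs(labels - np.median(labels))))

def ordinal_weighted_gain_ratio(y, left_mask):
    """Gain ratio of a binary split, scaled by the relative reduction in ordinal
    dispersion, so that splits separating ordinally distant classes score higher."""
    y = np.asarray(y)
    left, right = y[left_mask], y[~left_mask]
    n, nl, nr = len(y), len(left), len(right)
    gain = entropy(y) - (nl / n) * entropy(left) - (nr / n) * entropy(right)
    split_info = -sum(f * np.log2(f) for f in (nl / n, nr / n) if f > 0)
    gain_ratio = gain / split_info if split_info > 0 else 0.0
    parent_disp = ordinal_dispersion(y)
    child_disp = (nl / n) * ordinal_dispersion(left) + (nr / n) * ordinal_dispersion(right)
    ordinal_weight = 1.0 + (parent_disp - child_disp) / (parent_disp + 1e-12)
    return gain_ratio * ordinal_weight

# toy example: exam outcomes binned into ordered classes 0 < 1 < 2
y = [0, 0, 1, 2, 2, 2]
mask = np.array([True, True, True, False, False, False])
print(ordinal_weighted_gain_ratio(y, mask))
```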

Keywords: actual exam time usage, ensemble learning, learning disabilities, ordinal classification, time extension

Procedia PDF Downloads 99
6032 A Preliminary Conceptual Scale to Discretize the Distributed Manufacturing Continuum

Authors: Ijaz Ul Haq, Fiorenzo Franceschini

Abstract:

The distributed manufacturing methodology brings a new concept of decentralized manufacturing operations located in close proximity to end users. A preliminary scale, to measure distributed capacity and evaluate the positioning of firms, is developed in this research. In the first part of the paper, a literature review is presented, which highlights the explorative nature of the studies conducted to propose definitions and classifications owing to the novelty of this topic. From the literature, five dimensions of distributed manufacturing development stages have been identified: localization, manufacturing technologies, customization and personalization, digitalization, and democratization of design. Based on these determinants, a conceptual scale is proposed to measure the status of distributed manufacturing of a generic firm. A multiple case study is then conducted in two steps to test the conceptual scale and to identify the corresponding level of distributed potential in each case study firm.

Keywords: distributed manufacturing, distributed capacity, localized production, ordinal scale

Procedia PDF Downloads 162
6031 Self-Image of Police Officers

Authors: Leo Carlo B. Rondina

Abstract:

Self-image is an important factor in improving the self-esteem of personnel. The purpose of the study is to determine the self-image of the police. The respondents were 503 policemen assigned to different police stations in Davao City, chosen through random sampling. Using Exploratory Factor Analysis (EFA), latent construct variables of police image were identified as follows: professionalism, obedience, morality, and justice and fairness. Further, ordinal regression indicated a statistically significant effect for ages 21-40, which means that the respondent's age statistically improves self-image.

Keywords: police image, exploratory factor analysis, ordinal regression, Galatea effect

Procedia PDF Downloads 284
6030 Selection of Designs in Ordinal Regression Models under Linear Predictor Misspecification

Authors: Ishapathik Das

Abstract:

The purpose of this article is to find a method of comparing designs for ordinal regression models using quantile dispersion graphs in the presence of linear predictor misspecification. The true relationship between the response variable and the corresponding control variables is usually unknown. The experimenter assumes a certain form of the linear predictor of the ordinal regression model. The assumed form of the linear predictor may not always be correct. Thus, the maximum likelihood estimates (MLE) of the unknown parameters of the model may be biased due to misspecification of the linear predictor. In this article, the uncertainty in the linear predictor is represented by an unknown function. An algorithm is provided to estimate the unknown function at the design points where observations are available. The unknown function is estimated at all points in the design region using multivariate parametric kriging. The comparison of the designs is based on a scalar-valued function of the mean squared error of prediction (MSEP) matrix, which incorporates both the variance and the bias of the prediction caused by the misspecification in the linear predictor. The designs are compared using the quantile dispersion graphs approach. The graphs also visually depict the robustness of the designs to changes in the parameter values. Numerical examples are presented to illustrate the proposed methodology.
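The MSEP criterion combines prediction variance with the bias introduced by the unmodelled part of the linear predictor. As a hedged illustration only, the sketch below computes a scalar MSEP (variance plus squared bias) for an ordinary linear predictor with a known hypothetical misspecification term and summarizes it by quantiles over the design region, in the spirit of quantile dispersion graphs; it does not reproduce the paper's ordinal-model MSEP or the kriging estimation of the unknown function.

```python
import numpy as np

def msep_at_points(X_design, x_pred, f_design, f_pred, sigma2=1.0):
    """MSEP for a linear predictor x'beta when the true mean also contains an
    unmodelled term f(x). Returns variance + bias^2 at each prediction point."""
    XtX_inv = np.linalg.inv(X_design.T @ X_design)
    H = x_pred @ XtX_inv @ X_design.T            # maps observations to predictions
    var = sigma2 * np.einsum('ij,jk,ik->i', x_pred, XtX_inv, x_pred)
    bias = H @ f_design - f_pred                 # systematic error from misspecification
    return var + bias ** 2

# toy comparison of two one-factor designs on [-1, 1] under a quadratic misspecification
f = lambda x: 0.5 * x ** 2                       # hypothetical unmodelled term
x_grid = np.linspace(-1, 1, 101)
X_pred = np.column_stack([np.ones_like(x_grid), x_grid])
for pts in ([-1, -1, 1, 1], [-1, -0.5, 0.5, 1]):
    x = np.array(pts, dtype=float)
    X = np.column_stack([np.ones_like(x), x])
    msep = msep_at_points(X, X_pred, f(x), f(x_grid))
    # quantiles of MSEP over the design region, a crude analogue of quantile dispersion graphs
    print(pts, np.round(np.quantile(msep, [0.25, 0.5, 0.75, 1.0]), 3))
```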

Keywords: model misspecification, multivariate kriging, multivariate logistic link, ordinal response models, quantile dispersion graphs

Procedia PDF Downloads 390
6029 The Effect of Initial Sample Size and Increment in Simulation Samples on a Sequential Selection Approach

Authors: Mohammad H. Almomani

Abstract:

In this paper, we examine the effect of the initial sample size and the increment in simulation samples on the performance of a sequential approach used for selecting the top m designs when the number of alternative designs is very large. The sequential approach consists of two stages. In the first stage, ordinal optimization is used to select a subset that overlaps with the set of actual best k% designs with high probability. Then, in the second stage, the optimal computing budget is used to select the top m designs from the selected subset. We apply the selection approach to a generic example under several parameter settings, with different choices of initial sample size and increment in simulation samples, to explore the impacts on the performance of this approach. The results show that the choice of initial sample size and the increment in simulation samples does affect the performance of the selection approach.
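To make the role of the two tuning quantities concrete, the following sketch simulates a simplified two-stage procedure: an initial n0 replications per design for ordinal-optimization screening, then repeated increments of Δ replications within the screened subset until the budget is spent. It is an assumed toy setup (normal noise, equal increments within the subset), not the exact OCBA allocation used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sequential_select(means, n0, delta, budget, m=3, subset_frac=0.1):
    """Two-stage sketch: (1) sample every design n0 times and keep the best
    subset_frac fraction (ordinal-optimization style screening); (2) spend the
    remaining budget in increments of delta on the screened subset, then return
    the observed top-m designs. All names and parameters are illustrative."""
    k = len(means)
    sums = np.array([rng.normal(mu, 1.0, n0).sum() for mu in means])
    counts = np.full(k, n0)
    subset = np.argsort(sums / counts)[:max(m, int(subset_frac * k))]  # smaller mean = better
    spent = n0 * k
    while spent + delta * len(subset) <= budget:
        for i in subset:                          # equal increments within the subset
            sums[i] += rng.normal(means[i], 1.0, delta).sum()
            counts[i] += delta
        spent += delta * len(subset)
    return set(np.argsort(sums / counts)[:m])

true_means = np.linspace(0.0, 2.0, 100)           # designs 0..2 are the true top-3
true_top = {0, 1, 2}
for n0, delta in [(5, 5), (5, 20), (20, 5)]:
    hits = sum(sequential_select(true_means, n0, delta, budget=4000) == true_top
               for _ in range(200))
    print(f"n0={n0:3d} delta={delta:3d}  P(correct selection) ~ {hits / 200:.2f}")
```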

Keywords: large scale problems, optimal computing budget allocation, ordinal optimization, simulation optimization

Procedia PDF Downloads 354
6028 Assessing the Impacts of Urbanization on Urban Precincts: A Case of Golconda Precinct, Hyderabad

Authors: Sai Akhila Budaraju

Abstract:

Heritage sites are an integral part of cities and carry a sense of identity for cities and towns, but the process of urbanization poses a potential threat of loss of these heritage sites/monuments. The Central and State Governments have listed the historic Golconda Fort as a Monument of National Importance and the heritage precinct, with eight heritage-listed buildings and two historical sites, respectively, for conservation and preservation; due to the presence of the IT corridor 6 km away, which brings more people into the precinct, the area is under constant pressure. The heritage precinct possesses high property values, being a prime location connecting the IT corridor and CBD (central business district) areas. The primary objective of the study was to assess and identify the factors affecting the heritage precinct through mapping and documentation, identifying and assessing the factors through empirical analysis, ordinal regression analysis and a hedonic pricing model. Ordinal regression analysis was used to identify the factors that contribute to the changes in the precinct due to urbanization. The hedonic pricing model was used to understand and establish whether the presence of historical monuments is also a contributing factor to property value and to what extent it contributes. These methods and field visits indicate that the physical and socio-economic factors and the neighborhood characteristics of the precinct contribute to property values. The outcomes and the potential elements derived from the analysis of the Development Control Rules were framed as recommendations to integrate both the old and newly built environments.

Keywords: heritage planning, heritage conservation, hedonic pricing model, ordinal regression analysis

Procedia PDF Downloads 190
6027 Ordinal Regression with Fenton-Wilkinson Order Statistics: A Case Study of an Orienteering Race

Authors: Joonas Pääkkönen

Abstract:

In sports, individuals and teams are typically interested in final rankings. Final results, such as times or distances, dictate these rankings, also known as places. Places can be further associated with ordered random variables, commonly referred to as order statistics. In this work, we introduce a simple, yet accurate order statistical ordinal regression function that predicts relay race places with changeover-times. We call this function the Fenton-Wilkinson Order Statistics model. This model is built on the following educated assumption: individual leg-times follow log-normal distributions. Moreover, our key idea is to utilize Fenton-Wilkinson approximations of changeover-times alongside an estimator for the total number of teams as in the notorious German tank problem. This original place regression function is sigmoidal and thus correctly predicts the existence of a small number of elite teams that significantly outperform the rest of the teams. Our model also describes how place increases linearly with changeover-time at the inflection point of the log-normal distribution function. With real-world data from Jukola 2019, a massive orienteering relay race, the model is shown to be highly accurate even when the size of the training set is only 5% of the whole data set. Numerical results also show that our model exhibits smaller place prediction root-mean-square-errors than linear regression, mord regression and Gaussian process regression.
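A minimal sketch of the Fenton-Wilkinson step and the resulting sigmoidal place curve is given below. The leg-time parameters, number of legs, and team count are hypothetical placeholders rather than the Jukola 2019 data; the snippet only shows the moment-matching approximation of the sum of log-normal leg times and the place prediction as the team count times the approximating CDF.

```python
import numpy as np
from scipy.stats import lognorm

def fenton_wilkinson(mus, sigmas):
    """Approximate the sum of independent log-normal legs (parameters mus, sigmas
    of the underlying normals) by a single log-normal, matching mean and variance."""
    means = np.exp(np.array(mus) + np.array(sigmas) ** 2 / 2)
    varis = (np.exp(np.array(sigmas) ** 2) - 1) * means ** 2
    m, v = means.sum(), varis.sum()               # moments of the sum (independence assumed)
    sigma_fw = np.sqrt(np.log(1 + v / m ** 2))
    mu_fw = np.log(m) - sigma_fw ** 2 / 2
    return mu_fw, sigma_fw

def predicted_place(changeover_time, mu_fw, sigma_fw, n_teams):
    """Sigmoidal place regression: expected place = N * F(t), F the log-normal CDF."""
    return n_teams * lognorm.cdf(changeover_time, s=sigma_fw, scale=np.exp(mu_fw))

# hypothetical relay with 4 legs of ~35 minutes each and 1500 teams
mu_fw, sigma_fw = fenton_wilkinson(mus=[np.log(35)] * 4, sigmas=[0.25] * 4)
for t in (110, 140, 170, 200):                    # changeover times in minutes
    print(t, round(predicted_place(t, mu_fw, sigma_fw, n_teams=1500)))
```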

Keywords: Fenton-Wilkinson approximation, German tank problem, log-normal distribution, order statistics, ordinal regression, orienteering, sports analytics, sports modeling

Procedia PDF Downloads 123
6026 Research Opportunities in Business Process Management and Performance Measurement from a Constructivist View

Authors: R.T.O. Lacerda, L. Ensslin, S.R. Ensslin, L. Knoff

Abstract:

This research paper aims to discover research opportunities in business process management and performance measurement from a constructivist view. The nature of this research is exploratory and descriptive, and the research method was performed in a qualitative way. The process narrowed down 2142 articles, gathered after a search in scientific databases, and identified 16 articles that were relevant to the research and highly cited. The analysis found that most of the articles use a realistic approach and that there is a need to analyze the decision-making process in a singular manner. The measurement criteria are identified from scientific literature searches and, in most cases, use an ordinal scale without any integration process to present the results to the decision maker. Regarding management aspects, most of the articles do not have a structured process to measure the current situation and generate improvement opportunities.

Keywords: performance measurement, BPM, decision, research opportunities

Procedia PDF Downloads 309
6025 Multidimensional Poverty and Child Cognitive Development

Authors: Bidyadhar Dehury, Sanjay Kumar Mohanty

Abstract:

According to the Right to Education Act of India, education is the fundamental right of all children aged 6-14 years, irrespective of their status. Using unit-level data from the India Human Development Survey (IHDS), we tried to understand the inter-relationship between the level of poverty and the academic performance of children aged 8-11 years. The level of multidimensional poverty is measured using five dimensions and 10 indicators with the Alkire-Foster approach. The weighted deprivation score was obtained by giving equal weight to each dimension and to the indicators within each dimension. The weighted deprivation score varies from 0 to 1 and is grouped into four categories: non-poor, vulnerable, multidimensional poor and severe multidimensional poor. The academic performance index was measured using three variables (reading skills, math skills and writing skills) using PCA. Bivariate and multivariate analyses were used in the analysis. The outcome variable was ordinal, so the predicted probabilities were calculated using ordinal logistic regression. The predicted probability of a good academic performance index was 0.202 if the child was severe multidimensional poor, 0.235 if the child was multidimensional poor, 0.264 if the child was vulnerable, and 0.316 if the child was non-poor. Hence, if the level of poverty among children decreases from severe multidimensional poor to non-poor, the probability of good academic performance increases.
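The deprivation-score construction can be sketched as follows. The dimensions, indicators, and category cut-offs in this Python snippet are hypothetical placeholders (the abstract does not list them); it only illustrates the equal weighting of dimensions and the grouping of the 0-1 score into the four categories used above.

```python
import numpy as np
import pandas as pd

# hypothetical binary deprivation indicators grouped into 5 equally weighted dimensions
dimensions = {
    "education":  ["years_of_schooling", "school_attendance"],
    "health":     ["nutrition", "child_mortality"],
    "living_std": ["drinking_water", "sanitation"],
    "housing":    ["housing_material", "electricity"],
    "assets":     ["asset_ownership", "land"],
}

def deprivation_score(row):
    """Alkire-Foster style weighted score: each dimension gets weight 1/5, shared
    equally by its indicators; an indicator is 1 if deprived, else 0."""
    score = 0.0
    for indicators in dimensions.values():
        w = (1 / len(dimensions)) / len(indicators)
        score += w * sum(row[i] for i in indicators)
    return score

def poverty_category(score):
    """Grouping used in the abstract; the cut-offs below follow common 1/3 and 1/2
    conventions and are assumptions, not the study's exact values."""
    if score < 0.2:
        return "non-poor"
    if score < 1 / 3:
        return "vulnerable"
    if score < 0.5:
        return "multidimensional poor"
    return "severe multidimensional poor"

indicator_cols = [c for cols in dimensions.values() for c in cols]
df = pd.DataFrame(np.random.default_rng(1).integers(0, 2, (5, len(indicator_cols))),
                  columns=indicator_cols)
df["score"] = df.apply(deprivation_score, axis=1)
df["category"] = df["score"].apply(poverty_category)
print(df[["score", "category"]])
```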

Keywords: multidimensional poverty, academic performance index, reading skills, math skills, writing skills, India

Procedia PDF Downloads 589
6024 Reducing the Risk of Alcohol Relapse after Liver-Transplantation

Authors: Rebeca V. Tholen, Elaine Bundy

Abstract:

Background: Liver transplantation (LT) is considered the only curative treatment for end-stage liver disease (ESLD). The effects of alcoholism can cause irreversible liver damage, cirrhosis and subsequent liver failure. Alcohol relapse after transplant occurs in 20-50% of patients and increases the risk for recurrent cirrhosis, organ rejection, and graft failure. Alcohol relapse after transplant has been identified as a problem among liver transplant recipients at a large urban academic transplant center in the United States. Transplantation will reverse the complications of ESLD, but it does not treat underlying alcoholism or reduce the risk of relapse after transplant. The purpose of this quality improvement project is to implement and evaluate the effectiveness of the High-Risk Alcoholism Relapse (HRAR) Scale to screen and identify patients at high risk for alcohol relapse after receiving an LT. Methods: The HRAR Scale is a predictive tool designed to determine the severity of alcoholism and the risk of relapse after transplant. The scale consists of three variables identified as having the highest predictive power for early relapse: daily number of drinks, history of previous inpatient treatment for alcoholism, and number of years of heavy drinking. All adult liver transplant recipients at a large urban transplant center were screened with the HRAR Scale prior to hospital discharge. An ordinal score of zero to two is assigned for each variable, and the total score ranges from zero to six. High-risk scores are between three and six. Results: Descriptive statistics revealed that 25 patients were newly transplanted and discharged from the hospital during an 8-week period. 40% of patients (n=10) were identified as being at high risk for relapse and 60% at low risk (n=15). The daily number of drinks was determined by alcohol content (1 drink = 15 g of ethanol) and number of drinks per day. 60% of patients reported drinking 9-17 drinks per day, and 40% reported ≤ 9 drinks. 50% of high-risk patients reported drinking for ≥ 25 years, 40% for 11-25 years, and 10% for ≤ 11 years. For the number of inpatient treatments for alcoholism, 50% received inpatient treatment one time, 20% ≥ 1, and 30% reported never receiving inpatient treatment. Findings reveal the importance and value of a validated screening tool as a more efficient method than other screening methods alone. Integration of a structured clinical tool will help guide the drinking-history portion of the psychosocial assessment. Targeted interventions can be implemented for all high-risk patients. Conclusions: Our findings validate the effectiveness of utilizing the HRAR Scale to screen and identify patients who are at high risk for alcohol relapse post-LT. Recommendations to help maintain post-transplant sobriety include starting a transplant support group within the organization for all high-risk patients.
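The scoring logic of the HRAR Scale, as described above, can be sketched in a few lines. The per-item cut points below are assumptions inferred from the reported groupings (drinks per day, years of heavy drinking, inpatient treatments), not the published scale's exact values; only the 0-2 item scores, the 0-6 total, and the 3-6 high-risk band come from the text.

```python
def hrar_item_score(value, cutoffs):
    """Map a raw value to an ordinal 0-2 item score using two cut points."""
    low, high = cutoffs
    if value < low:
        return 0
    return 1 if value < high else 2

def hrar_total(drinks_per_day, years_heavy_drinking, inpatient_treatments):
    """Total HRAR score (0-6) from the three items described in the abstract;
    high risk is a total of 3-6. The per-item cut points here are assumptions."""
    score = (hrar_item_score(drinks_per_day, (9, 17))
             + hrar_item_score(years_heavy_drinking, (11, 25))
             + hrar_item_score(inpatient_treatments, (1, 2)))
    return score, ("high risk" if score >= 3 else "low risk")

print(hrar_total(drinks_per_day=12, years_heavy_drinking=26, inpatient_treatments=1))
# -> (4, 'high risk') under the assumed cut points
```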

Keywords: alcoholism, liver transplant, quality improvement, substance abuse

Procedia PDF Downloads 115
6023 Development of the Attitude towards Using Complementary Treatments Scale in Turkey

Authors: Ayşegül Bilge, Merve Uğuryol, Şeyda Dülgerler, Mustafa Yıldız

Abstract:

The purpose of this research is to establish the reliability and validity of the Attitude towards Using Complementary Treatments Scale. This is a methodological study planned to determine the validity and reliability of the Attitude towards Using Complementary Treatments Scale, which was developed by the researchers. The scale contains 23 questions covering the complementary and modern therapies individuals apply when they have health problems, and a 4-point Likert-type evaluation was used in preparing the questionnaire. A high score obtained from the scale indicates a positive attitude towards complementary therapies. In the course of the validity assessment of the scale, expert opinion was obtained, and the content validity of the scale was determined using Kendall's coefficient of concordance (W = 0.200, p = 0.460). In the course of the reliability assessment of the scale, the item-total score correlations of the 23 items were examined, and those under the 0.20 correlation limit were removed from the scale. As a result, the scale was reduced to 13 items. In the internal consistency analyses, Cronbach's alpha was found to be 0.79. As a result of the validity analyses of the Attitude towards Using Complementary Treatments Scale, the content and language validity were found to be at the expected level. The scale was determined to be highly reliable as a result of the reliability analyses. In conclusion, the Attitude towards Using Complementary Treatments Scale is a valid and reliable scale.
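The item-reduction and internal-consistency steps described above follow a standard recipe: drop items whose corrected item-total correlation falls below 0.20, then compute Cronbach's alpha on the remaining items. The sketch below illustrates this with simulated 4-point Likert responses; the data and item names are hypothetical, not the study's responses.

```python
import numpy as np
import pandas as pd

def corrected_item_total(df):
    """Correlation of each item with the total score of the remaining items."""
    return pd.Series({c: df[c].corr(df.drop(columns=c).sum(axis=1)) for c in df})

def cronbach_alpha(df):
    """Classic alpha: k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = df.shape[1]
    return k / (k - 1) * (1 - df.var(ddof=1).sum() / df.sum(axis=1).var(ddof=1))

# hypothetical 4-point Likert responses to 23 items; the first five items are
# deliberately unrelated to the construct so they fall below the 0.20 limit
rng = np.random.default_rng(2)
latent = rng.normal(size=(200, 1))
raw = latent + rng.normal(scale=1.5, size=(200, 23))
raw[:, :5] = rng.normal(size=(200, 5))
responses = pd.DataFrame(
    {f"item{i+1}": pd.qcut(raw[:, i], 4, labels=False) + 1 for i in range(23)})

r = corrected_item_total(responses)
kept = responses[r[r >= 0.20].index]              # drop items below the 0.20 correlation limit
print(f"{kept.shape[1]} items kept, Cronbach's alpha = {cronbach_alpha(kept):.2f}")
```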

Keywords: alternative health care, complementary treatment, instrument development, nursing practice

Procedia PDF Downloads 397
6022 Using Soil Texture Field Observations as Ordinal Qualitative Variables for Digital Soil Mapping

Authors: Anne C. Richer-De-Forges, Dominique Arrouays, Songchao Chen, Mercedes Roman Dobarco

Abstract:

Most digital soil mapping (DSM) products rely on machine learning (ML) prediction models and/or the use of pedotransfer functions (PTF) in which calibration data come from soil analyses performed in labs. However, many other observations (often qualitative, nominal, or ordinal) could be used as proxies of lab measurements or as input data for ML or PTF predictions. DSM and ML are briefly described with some examples taken from the literature. Then, we explore the potential of an ordinal qualitative variable, i.e., the hand-feel soil texture (HFST), which estimates the mineral particle-size distribution (PSD): % of clay (0-2 µm), silt (2-50 µm) and sand (50-2000 µm) in 15 classes. The PSD can also be determined by lab measurements (LAST) to obtain the exact proportions of these particle sizes. However, due to cost constraints, HFST observations are much more numerous and spatially dense than LAST. Soil texture (ST) is a very important soil parameter to map, as it controls many soil properties and functions. Therefore, an essential question arises: is it possible to use HFST as a proxy of LAST for calibration and/or validation of DSM predictions of ST? To answer this question, the first step is to compare HFST with LAST on a representative set where both types of information are available. This comparison was made on ca 17,400 samples representative of a French region (34,000 km2). The accuracy of HFST was assessed, and each HFST class was characterized by a probability distribution function (PDF) of its LAST values. This makes it possible to randomly replace HFST observations with LAST values while respecting the previously calculated PDF, and results in a very large increase in the number of observations available for the calibration or validation of PTF and ML predictions. Some preliminary results are shown. First, the comparison between HFST classes and LAST analyses showed that accuracies could be considered very good when compared to other studies. The causes of some inconsistencies were explored, and most of them were well explained by other soil characteristics. Then we show some examples applying these relationships and the increased number of data to several issues related to DSM. The first issue is: do the established PDFs enable the use of HFST class observations to improve the LAST soil texture prediction? For this objective, we replaced all topsoil HFST observations with values drawn from the PDF (100 replicates). Results were promising for the PTF we tested (a PTF predicting soil water holding capacity). For the question related to the ML prediction of LAST soil texture in the region, we did the same kind of replacement, but we implemented a 10-fold cross-validation using points where we had LAST values. We obtained only preliminary results, but they were rather promising. Then we show another example illustrating the potential of using HFST as validation data. As the HFST observations are very numerous in many countries, these promising results pave the way to an important improvement of DSM products in all the countries of the world.
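The replacement step, building a per-class distribution of lab values and drawing from it, can be sketched as follows. The texture class names, clay percentages, and replicate count in this snippet are hypothetical placeholders used only to illustrate the mechanism of substituting HFST observations with draws from their class-conditional PDFs.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)

def build_class_pdfs(calibration):
    """Empirical distribution of lab-measured clay content within each hand-feel
    texture class (silt and sand could be stored the same way)."""
    return {c: g["clay_pct"].to_numpy()
            for c, g in calibration.groupby("hfst_class")}

def impute_last_from_hfst(hfst_classes, class_pdfs, n_replicates=100):
    """Replace each HFST observation with draws from its class PDF, producing
    n_replicates plausible LAST-like values per site (as in the 100 replicates above)."""
    return np.column_stack([
        rng.choice(class_pdfs[c], size=n_replicates, replace=True)
        for c in hfst_classes])

# hypothetical calibration set where both HFST class and lab clay % are known
calibration = pd.DataFrame({
    "hfst_class": ["sandy loam"] * 50 + ["silt loam"] * 50 + ["clay loam"] * 50,
    "clay_pct": np.concatenate([rng.normal(12, 3, 50),
                                rng.normal(18, 4, 50),
                                rng.normal(32, 5, 50)]),
})
pdfs = build_class_pdfs(calibration)

# sites where only the hand-feel class is available
sites = ["silt loam", "clay loam", "sandy loam"]
replicates = impute_last_from_hfst(sites, pdfs)   # shape (100, 3)
print(replicates.mean(axis=0).round(1))           # one plausible clay % per site
```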

Keywords: digital soil mapping, improvement of digital soil mapping predictions, potential of using hand-feel soil texture, soil texture prediction

Procedia PDF Downloads 222
6021 Pharmaceutical Scale up for Solid Dosage Forms

Authors: A. Shashank Tiwari, S. P. Mahapatra

Abstract:

Scale-up is defined as the process of increasing batch size. Scale-up of a process can also be viewed as a procedure for applying the same process to different output volumes. There is a subtle difference between these two definitions: batch size enlargement does not always translate into a size increase of the processing volume. In mixing applications, scale-up is indeed concerned with increasing the linear dimensions from the laboratory to the plant size. On the other hand, processes exist (e.g., tableting) where the term ‘scale-up’ simply means enlarging the output by increasing the speed. To complete the picture, one should point out special procedures where an increase of the scale is counterproductive and ‘scale-down’ is required to improve the quality of the product. In moving from Research and Development (R&D) to production scale, it is sometimes essential to have an intermediate batch scale. This is achieved at the so-called pilot scale, which is defined as the manufacturing of drug product by a procedure fully representative of and simulating that used for full manufacturing scale. This scale also makes it possible to produce enough product for clinical testing and to manufacture samples for marketing. However, inserting an intermediate step between R&D and production scales does not, in itself, guarantee a smooth transition. A well-defined process may generate a perfect product both in the laboratory and the pilot plant and then fail quality assurance tests in production.

Keywords: scale up, research, size, batch

Procedia PDF Downloads 411
6020 Political Economy and Human Rights Engaging in Conversation

Authors: Manuel Branco

Abstract:

This paper argues that mainstream economics is one of the reasons that can explain the difficulty in fully realizing human rights because its logic is intrinsically contradictory to human rights, most especially economic, social and cultural rights. First, its utilitarianism, both in its cardinal and ordinal understanding, contradicts human rights principles. Maximizing aggregate utility along the lines of cardinal utility is a theoretical exercise that consists in ensuring as much as possible that gains outweigh losses in society. In this process an individual may get worse off, though. While mainstream logic is comfortable with this, human rights logic is not. Indeed, universality is a key principle in human rights and for this reason the maximization exercise should aim at satisfying all citizens’ requests when goods and services necessary to secure human rights are at stake. The ordinal version of utilitarianism, in turn, contradicts the human rights principle of indivisibility. Contrary to ordinal utility theory that ranks baskets of goods, human rights do not accept ranking when these goods and services are necessary to secure human rights. Second, by relying preferably on market logic to allocate goods and services, mainstream economics contradicts human rights because the intermediation of money prices and the purpose of profit may cause exclusion, thus compromising the principle of universality. Finally, mainstream economics sees human rights mainly as constraints to the development of its logic. According to this view, securing human rights would, then, be considered a cost weighing on economic efficiency and, therefore, something to be minimized. Fully realizing human rights needs, therefore, a different approach. This paper discusses a human rights-based political economy. This political economy, among other characteristics, should give up mainstream economics’ narrow utilitarian approach, give up its belief that market logic should guide all exchanges of goods and services between human beings, and finally give up its view of human rights as constraints on rational choice and consequently on good economic performance. Giving up mainstream’s narrow utilitarian approach means, first, embracing procedural utility and human rights-aimed consequentialism. Second, a more radical break can be imagined; non-utilitarian, or even anti-utilitarian, approaches may emerge, then, as alternatives, these two standpoints being not necessarily mutually exclusive, though. Giving up market exclusivity means embracing decommodification. More specifically, this means an approach that takes into consideration the value produced outside the market and an allocation process no longer necessarily centered on money prices. Giving up the view of human rights as constraints means, finally, considering human rights as an expression of wellbeing and a manifestation of choice. This means, in turn, an approach that uses indicators of economic performance other than growth at the macro level and profit at the micro level, because what we measure affects what we do.

Keywords: economic and social rights, political economy, economic theory, markets

Procedia PDF Downloads 150
6019 Delamination of Scale in a Fe Carbon Steel Surface by Effect of Interface Roughness and Oxide Scale Thickness

Authors: J. M. Lee, W. R. Noh, C. Y. Kim, M. G. Lee

Abstract:

Delamination of oxide scale has often been observed at the interface between Fe carbon steel and the oxide scale. Among several mechanisms of this delamination behavior, the tensile stress normal to the substrate-scale interface has been described as one of the main factors. The stress distribution at the interface is also known to be affected by the thermal expansion mismatch between substrate and oxide scale, creep behavior during cooling, and the geometry of the interface. In this study, stress states near the interface in a Fe carbon steel with oxide scale have been investigated using FE simulations. The thermal and mechanical properties of the oxide scales are taken from the literature, and those of the Fe carbon steel are measured using a tensile testing machine. In particular, the normal and shear stress components developed at the interface during bending are investigated. Preliminary numerical sensitivity analyses are provided to explain the effects of the interface geometry and oxide thickness on the delamination behavior.

Keywords: oxide scale, delamination, FE analysis, roughness, thickness, stress state

Procedia PDF Downloads 342
6018 Development of the ‘Teacher’s Counselling Competence Self-Efficacy Scale’

Authors: Riin Seema

Abstract:

Guidance and counseling as a whole-school responsibility is a global trend. Counseling is a specific competence, that consist of cognitive, emotional, attitudinal, and behavioral components. To authors best knowledge, there are no self-assessment scales for teachers in the whole world to measure teachers’ counseling competency. In 2016 an Estonian scale on teachers counseling competence was developed during an Interdisciplinary Project at Tallinn University. The team consisted of 10 interdisciplinary students (psychology, nursery school, special and adult education) and their supervisor. In 2017 another international Interdisciplinary Project was carried out for adapting the scale in English for international students. Firstly, the Estonian scale was translated by 2 professional translators, and then a group of international Erasmus students (again from psychology, nursery school, special and adult education) selected the most suitable translation for the scale. The developed ‘Teacher’s Counselling Competence Self-Efficacy Scale’ measures teacher’s self-efficacy beliefs in their own competence to perform different counseling tasks (creating a counseling relationship, using different reflection techniques, etc.). The scale consists of 47 questions in a 5-point numeric scale. The scale is created based on counseling theory and scale development and validation theory. The scale has been used as a teaching and learning material for counseling courses by 174 Estonian and 10 international student teachers. After filling out the scale, the students also reflected on the scale and their own counseling competencies. The study showed that the scale is unidimensional and has an excellent Cronbach alpha coefficient. Student’s qualitative feedback on the scale has been very positive, as the scale supports their self-reflection. In conclusion, the developed ‘Teacher’s Counselling Competence Self-Efficacy Scale’ is a useful tool for supporting student teachers’ learning.

Keywords: competency, counseling, self-efficacy, teacher students

Procedia PDF Downloads 144
6017 Experimental Investigation of Fluid Dynamic Effects on Crystallisation Scale Growth and Suppression in Agitation Tank

Authors: Prasanjit Das, M. M. K. Khan, M. G. Rasul, Jie Wu, I. Youn

Abstract:

Mineral scale formation is undoubtedly a more serious problem in the mineral industry than in other process industries. To better understand scale growth and suppression, an experimental model is proposed in this study for supersaturated crystallised solutions commonly found in mineral process plants. In this experiment, surface crystallisation of potassium nitrate (KNO3) on the wall of the agitation tank and the effects of agitation on scale growth and suppression are studied. The new quantitative scale suppression model predicts that at lower agitation speed the scale growth rate is enhanced, and at higher agitation speed the scale suppression rate increases due to the increased flow erosion effect. A lab-scale agitation tank with and without baffles was used as a benchmark in this study. The fluid dynamic effects on scale growth and suppression in the agitation tank with three different-sized impellers (diameters of 86, 114 and 160 mm, model A310 with flow number 0.56) at various rotational speeds (up to 700 rpm) and solutions of different concentration (4.5, 4.75 and 5.25 mol/dm3) were investigated. For further elucidation, the effects of impeller size on wall surface scale growth and suppression rates as well as the bottom settled scale accumulation rate are also discussed. Emphasis was placed on applications in the mineral industry, although the results are also relevant to other industrial applications.

Keywords: agitation tank, crystallisation, impeller speed, scale

Procedia PDF Downloads 220
6016 The Development of the Self-concept Scale for Elders in Taiwan

Authors: Ting-Chia Lien, Tzu-Yin Yen, Szu-Fan Chen, Tai-chun Kuo, Hung-Tse Lin, Yi-Chen Chung, Hock-Sen Gwee

Abstract:

The purpose of this study was to explore the results of a survey conducted in developing the “Self-Concept Scale for Elders”, which could provide community counseling and guidance institutions with a tool for practical application. The sample of this study consisted of 332 elders in Taiwan (male: 33.4%; female: 66.6%). Participants' ages ranged from 65 to 98 years. The measurement applied in this study is the “Self-Concept Scale for Elders”. After item and factor analyses, the preliminary version of the Self-Concept Scale for Elders was revised to the final version. The results were summarized as follows: 1) There are 10 items in the Self-Concept Scale for Elders. 2) The variance explained by the scale accounted for 77.15%, with satisfactory corrected item-total correlations and Cronbach's alpha = 0.87. 3) The content validity, criterion validity and construct validity were found to be satisfactory. Based on the findings, implications and suggestions are offered for reference regarding counselor education and future research.

Keywords: self-concept, elder, development scale, applied psychology

Procedia PDF Downloads 568
6015 Provision of Different Layers of Activities for Different Iranian Intermediate English as a Foreign Language Learners for the Beneficial Use of Films within Speaking Classes

Authors: Zahra Ebrahimi, Abbas Moradan

Abstract:

This study investigated the effect of applying different layers of activity on different Iranian intermediate EFL learners' oral proficiency and two of its components (fluency and accuracy) for the beneficial use of films within speaking classes. For this purpose, thirty Iranian intermediate EFL learners were selected based on availability sampling; they were divided into one experimental group and one control group, each consisting of 15 participants, who were proved to be homogeneous based on the results obtained from an IELTS oral proficiency test prior to the treatment. The experimental group received the treatment, which was applying different layers of speaking tasks according to learners' level of fluency and accuracy. The control group received the ordinary treatment of speaking classrooms. The materials for this study consisted of 11 English movies for each session, a voice-recorder device, and IELTS oral proficiency tests as well as two interviews based on Ur's oral scale for measuring fluency and accuracy. The treatment was run for 12 sessions over six weeks. At the end of the treatment, all the students in both the experimental and control groups were given a post-test interview based on Ur's scale. To compare and contrast the amount of progress of the learners in the different groups, the results of the speaking pre-test and post-test were analysed using t-tests. Moreover, multivariate analysis of variance was also used to check the hypotheses. Results showed that the application of different layers of activity with regard to students' level led to a significantly superior performance in the experimental group. Thus, this study verified the positive effect of implementing different layers of activities and tasks to achieve progress in speaking skill. It can also help to create a less stressful atmosphere of learning in which all the students are given specific time to speak, and lead them to become autonomous learners.

Keywords: differentiated instruction, learners’ style, multiple intelligence, speaking skill, task-based activities

Procedia PDF Downloads 141
6014 Grating Scale Thermal Expansion Error Compensation for Large Machine Tools Based on Multiple Temperature Detection

Authors: Wenlong Feng, Zhenchun Du, Jianguo Yang

Abstract:

To decrease the grating scale thermal expansion error, a novel method based on multiple temperature detection is proposed. Several temperature sensors are installed on the grating scale, and the temperatures at these sensors are recorded. The temperatures at every point on the grating scale are then calculated by interpolating between adjacent sensors. According to the thermal expansion principle, the grating scale thermal expansion error model can be established by integrating the variations of position and temperature. A novel compensation method is proposed in this paper. By applying the established error model, the grating scale thermal expansion error is decreased by 90% compared with no compensation. The residual positioning error of the grating scale is less than 15 µm/10 m, and the accuracy of the machine tool is significantly improved.
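The principle of the error model, interpolating sensor temperatures along the scale and integrating the local thermal strain over position, can be sketched as follows. The expansion coefficient, sensor layout, and temperatures below are assumptions for illustration, not the measured values from the machine tool.

```python
import numpy as np

def expansion_error(positions, sensor_pos, sensor_temp, t_ref=20.0, alpha=8e-6):
    """Cumulative thermal expansion error (same length unit as `positions`) at each
    reading position of a grating scale.

    Temperatures between sensors are obtained by linear interpolation, and the local
    expansion alpha*(T(s) - t_ref) is integrated along the scale, as in the model
    described above. alpha and the sensor layout are assumptions."""
    temps = np.interp(positions, sensor_pos, sensor_temp)
    strain = alpha * (temps - t_ref)
    # trapezoidal integration of the strain from 0 up to each position
    seg = np.diff(positions) * (strain[1:] + strain[:-1]) / 2
    return np.concatenate([[0.0], np.cumsum(seg)])

# 10 m scale with five temperature sensors (positions in mm, temperatures in °C)
x = np.linspace(0, 10_000, 1001)
sensors = np.array([0, 2_500, 5_000, 7_500, 10_000])
temps = np.array([21.0, 21.8, 22.5, 22.1, 21.4])
err = expansion_error(x, sensors, temps)
print(f"uncompensated error at full travel: {err[-1] * 1000:.1f} um over 10 m")
```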

Keywords: thermal expansion error of grating scale, error compensation, machine tools, integral method

Procedia PDF Downloads 363
6013 Developing New Media Credibility Scale: A Multidimensional Perspective

Authors: Hanaa Farouk Saleh

Abstract:

The main purposes of this study are to develop a scale that reflects emerging theoretical understandings of new media credibility, based on the evolution of credibility studies in western research; to identify the determinants of credibility in the media and its components by comparing traditional and new media credibility scales; and to build a cumulative scale to test new media credibility. This approach builds on western research using conceptualizations of media credibility, which focus on four principal components: source (journalist), message (article), medium (newspaper, radio, TV, web, etc.), and organization (owner of the medium), and adds user and cultural context as key components to assess new media credibility in particular. This study's value lies in its contribution to the conceptualization and development of new media credibility through the creation of a theoretical measurement tool. Future studies should explore this scale to test new media credibility, which represents a promising new approach in the efforts to define and measure the credibility of all media types.

Keywords: credibility scale, media credibility components, new media credibility scale, scale development

Procedia PDF Downloads 320
6012 Impacts of Financial Development and Operational Scale on Bank Efficiencies in Taiwan

Authors: Ying-Hsiu Chen, Pao-Peng Hsu

Abstract:

This paper adopts a two-stage data envelopment analysis to explore the impacts of financial development and bank operational scale on bank efficiencies. The sample comprises unbalanced panel data on 32 Taiwanese listed domestic commercial banks over the period 1998 to 2013. Empirical results show that technical efficiency is positively related to financial development, whereas the effect of financial development on scale efficiency is insignificant. Operational scale exerts a significantly positive effect on bank efficiencies, but the efficiency gain decreases gradually as operational scale increases. Furthermore, increases in the capital adequacy ratio and the market power of banks lead to growth in bank efficiencies.

Keywords: financial development, operational scale, efficiency, DEA

Procedia PDF Downloads 523
6011 Prediction of Fire Growth of the Office by Real-Scale Fire Experiment

Authors: Kweon Oh-Sang, Kim Heung-Youl

Abstract:

Estimating the engineering properties of fires is important in order to be prepared for the complex and various fire risks of large-scale structures such as super-tall buildings, large stadiums, and multi-purpose structures. In this study, a mock-up of a compartment measuring 2.4 (L) x 3.6 (W) x 2.4 (H) m was fabricated at the 10 MW LSC (Large Scale Calorimeter), and combustible office supplies were placed in the compartment for a real-scale fire test. The maximum heat release rate was 4.1 MW, and the total energy release obtained through the application of the t² fire growth rate was 6705.9 MJ.
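The t² idealization referred to above takes the heat release rate as Q(t) = αt² up to its peak. The sketch below derives a growth coefficient from the measured 4.1 MW peak under an assumed time-to-peak and integrates the curve for the released energy; the time values and hold phase are illustrative assumptions, so the resulting energy differs from the measured 6705.9 MJ.

```python
import numpy as np

def t_squared_hrr(t, alpha, q_peak):
    """Heat release rate (MW) for a t-squared growth fire capped at its peak."""
    return np.minimum(alpha * t ** 2, q_peak)

q_peak = 4.1                        # measured peak HRR, MW
t_growth = 300.0                    # assumed time to reach the peak, s (fast-growth range)
alpha = q_peak / t_growth ** 2      # growth coefficient, MW/s^2

t = np.linspace(0, 900, 901)        # assume the peak is held until 900 s, for illustration
q = t_squared_hrr(t, alpha, q_peak)
energy_MJ = np.sum((q[1:] + q[:-1]) / 2 * np.diff(t))   # area under the HRR curve
print(f"alpha = {alpha:.2e} MW/s^2, energy over 900 s ~ {energy_MJ:.0f} MJ")
```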

Keywords: fire growth, fire experiment, t2 curve, large scale calorimeter

Procedia PDF Downloads 334
6010 Pilot Scale Deproteinization Study on Fish Scale Using Response Surface Methodology

Authors: Fatima Bellali, Mariem Kharroubi

Abstract:

Fish scale wastes are one of the main sources for the production of value-added products such as collagen. The main aim of this study is to investigate the optimal conditions for sardine scale deproteinization using response surface methodology (RSM) at pilot scale. In order to identify the optimal conditions, a Box–Behnken-based design of experiments (DOE) was carried out. The model-predicted values of the product ash content were in good agreement with the experimental values (R2 = 0.9813). Finally, model-based optimization was carried out to identify the operating parameters (reaction time = 4 h and solid-liquid ratio = 1/10) and to obtain the lowest collagen content.
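The Box–Behnken/RSM workflow, fitting a second-order response surface to the design runs and optimizing over the coded factor space, can be sketched as follows. The factor names, the toy responses, and the resulting optimum in this snippet are hypothetical placeholders, not the study's measurements.

```python
import numpy as np
from itertools import combinations

def quadratic_design_matrix(X):
    """Full second-order model: intercept, linear, interaction and squared terms."""
    cols = [np.ones(len(X))] + [X[:, j] for j in range(X.shape[1])]
    cols += [X[:, i] * X[:, j] for i, j in combinations(range(X.shape[1]), 2)]
    cols += [X[:, j] ** 2 for j in range(X.shape[1])]
    return np.column_stack(cols)

# hypothetical coded Box-Behnken runs for (time, ratio, temperature) and toy responses
X = np.array([[-1,-1,0],[1,-1,0],[-1,1,0],[1,1,0],[-1,0,-1],[1,0,-1],[-1,0,1],[1,0,1],
              [0,-1,-1],[0,1,-1],[0,-1,1],[0,1,1],[0,0,0],[0,0,0],[0,0,0]], float)
y = np.array([8.1, 6.5, 7.2, 5.9, 8.4, 6.8, 7.9, 6.1,
              7.7, 7.0, 7.3, 6.6, 6.0, 6.2, 5.9])        # e.g. residual protein %, invented

beta, *_ = np.linalg.lstsq(quadratic_design_matrix(X), y, rcond=None)

# evaluate the fitted surface on a coded grid and take the minimising factor setting
grid = np.array(np.meshgrid(*[np.linspace(-1, 1, 21)] * 3)).reshape(3, -1).T
pred = quadratic_design_matrix(grid) @ beta
best = grid[np.argmin(pred)]
print("coded optimum (time, ratio, temperature):", np.round(best, 2))
```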

Keywords: pilot scale, Plackett and Burman design, fish waste, deproteinization

Procedia PDF Downloads 159
6009 Analysis of Efficiency Production of Grass Black Jelly (Mesona palustris) in Double Scale

Authors: Irvan Adhin Cholilie, Susinggih Wijana, Yusron Sugiarto

Abstract:

The aim of this research is to compare the results of black grass jelly produced at laboratory scale and at double scale. In this research, production at the laboratory scale uses 1 kg of black grass jelly with 5 liters of water, while the double scale uses 5 kg of black grass jelly and 75 liters of water. The results of organoleptic tests performed by 30 panelists (general) on the sample gels of black grass jelly powder produced at both the laboratory and double scale are not significantly different in color, odor, flavor, and texture. Proximate test results for the black grass jelly powder produced at laboratory and double scale also show no significant differences in all parameters. Black grass jelly powder from the double scale contains water, carbohydrate, crude fiber, and yield in the amounts of 12.25%, 43.7%, 5.89%, and 16.28%, respectively. The energy efficiencies of the boiling, draining, evaporation, drying, and milling processes are 85.11%, 76.97%, 99.64%, 99.99% and 99.39%, respectively. The utility needs include water of 0.1 m3 per batch at a cost of Rp 220.5 per batch, electricity of 20.01 kWh per batch at a cost of Rp 18,569.28 per batch, and LPG of 30 kg per batch costing Rp 234,000.00, so that the total cost of the process is Rp 252,789.78.

Keywords: black grass jelly, powder, mass balance, energy balance, cost

Procedia PDF Downloads 383
6008 Comparative Study of Isothermal and Cyclic Oxidation on Titanium Alloys

Authors: Poonam Yadav, Dong Bok Lee

Abstract:

Isothermal oxidation at 800°C for 50 h and cyclic oxidation at 600°C and 800°C for 40 h of pure Ti and Ti64 were performed in a muffle furnace. In cyclic oxidation, massive scale spallation occurred, and cracking and peeling of the oxide scale were observed at high temperature, indicating that the oxide scale formed during cyclic oxidation spalled off owing to thermal-shock stresses generated during repetitive oxidation and subsequent cooling. The scale thickness is larger in cyclic oxidation than in the isothermal case, due to inward diffusion of oxygen through the oxide scale and/or pores and cracks during cyclic oxidation.

Keywords: cyclic, diffusion, isothermal

Procedia PDF Downloads 917
6007 Adjustment and Scale-Up Strategy of Pilot Liquid Fermentation Process of Azotobacter sp.

Authors: G. Quiroga-Cubides, A. Díaz, M. Gómez

Abstract:

The genus Azotobacter has been widely used as a bio-fertilizer due to its significant effects on the stimulation and promotion of plant growth in various agricultural species of commercial interest. In order to obtain a sufficiently high viable cell concentration, a scale-up strategy for a liquid fermentation process (SmF) with two strains of A. chroococcum (named Ac1 and Ac10) was validated and adjusted at laboratory and pilot scale. A batch fermentation process under previously defined conditions was carried out in an Infors® Minifors bioreactor of 3.5 L, which served as a baseline for this research. For the purpose of increasing process efficiency, the effect of reducing the stirring speed was evaluated in combination with a fed-batch-type fermentation at laboratory scale. To reproduce the efficiency parameters obtained, a scale-up strategy with geometric and fluid dynamic similarities was evaluated. According to the analysis of variance, this scale-up strategy did not have a significant effect on cell concentration in laboratory and pilot fermentations (Tukey, p > 0.05). Regarding air consumption, the fermentation process at pilot scale showed a reduction of 23% versus the baseline. The reduction in energy consumption under laboratory and pilot scale conditions was 96.9% compared with the baseline.

Keywords: Azotobacter chroococcum, scale-up, liquid fermentation, fed-batch process

Procedia PDF Downloads 438
6006 Scaling-Down an Agricultural Waste Biogas Plant Fermenter

Authors: Matheus Pessoa, Matthias Kraume

Abstract:

Scale-down rules in process engineering help translate industrial-scale parameters into the lab scale. Several scale-down rules available in the literature, such as impeller power number, agitation device power input, substrate tip speed, Reynolds number and cavern development, were investigated in order to stipulate the rotational speed at which to operate an 11 L working volume lab-scale bioreactor within industrial process parameters. Herein, xanthan gum was used as a fluid with a viscosity representative of the fermentation broth of a hypothetical biogas plant using sewage sludge and sugar beet pulp as substrate, in a vessel with H/D = 1 and central agitation. The results showed that the cavern development strategy was the best method for establishing a rotational speed for bioreactor operation, while the other rules produced unrealistic values for the purposes of this article.

Keywords: anaerobic digestion, cavern development, scale down rules, xanthan gum

Procedia PDF Downloads 491
6005 A Comparative Performance of Polyaspartic Acid and Sodium Polyacrylate on Silicate Scale Inhibition

Authors: Ismail Bin Mohd Saaid, Abubakar Abubakar Umar

Abstract:

Despite the successes recorded by Alkaline/Surfactant/Polymer (ASP) flooding as an effective chemical EOR technique, this combined chemical EOR process is not free of serious problems, one of which is the scaling of downhole equipment. One of the major issues in the oil industry is how to control scale formation, whether in wellhead equipment, downhole pipelines or the actual field formation. The best approach to handling the challenge associated with oilfield scale formation is the application of scale inhibitors to avert scale formation. Chemical inhibitors have been employed for this purpose, but due to environmental regulations, the industry has focused on using green scale inhibitors to mitigate the formation of scales. This paper compares the scale inhibition performance of polyaspartic acid and sodium polyacrylate, both commercial green scale inhibitors, in mitigating silicate scales formed during Alkaline/Surfactant/Polymer flooding under static conditions. Both PASP and TH5000 are non-threshold inhibitors; therefore, their efficiency was only seen in delaying the deposition of the silicate scales.

Keywords: alkaline/surfactant/polymer flooding (ASP), polyaspartic acid (PASP), sodium polyacrylate (SPA)

Procedia PDF Downloads 350
6004 Effects of Employees’ Training Program on the Performance of Small Scale Enterprises in Oyo State

Authors: Itiola Kehinde Adeniran

Abstract:

The study examined the effect of employees' training on the performance of small scale enterprises in Oyo State. A structured questionnaire was used to collect data from 150 respondents through a purposive sampling method. Linear regression was used, with the aid of the Statistical Package for the Social Sciences (SPSS) version 20, to analyze the data collected in order to examine the effect of the independent variable, employees' training, on the dependent variable, the performance (profit) of small scale enterprises. The result revealed that employees' training has a significant effect on the performance of small scale enterprises. It was concluded that the predictor variable (training) explains 55.5% of the variance in enterprise performance (profitability). Therefore, the paper recommended that all small scale enterprises in Nigeria should embrace manpower training and development in order to improve employees' performance, leading to organizational profitability.

Keywords: training, employee performance, small scale enterprise, organizational profitability

Procedia PDF Downloads 384