Search results for: random deviation
2707 Longitudinal Study of the Phenomenon of Acting White in Hungarian Elementary Schools Analysed by Fixed and Random Effects Models
Authors: Lilla Dorina Habsz, Marta Rado
Abstract:
Popularity is affected by a variety of factors in primary school, such as academic achievement and ethnicity. The main goal of our study was to analyse whether acting white exists in Hungarian elementary schools; in other words, we observed whether Roma students penalize those in-group members who attain high academic achievement. Furthermore, we examined how popularity is influenced by changes in academic achievement in inter-ethnic relations. The empirical basis of our research was the 'competition and negative networks' longitudinal dataset, which was collected by the MTA TK 'Lendület' RECENS research group. This research followed 11- and 12-year-old students over a two-year period. The survey was analysed using fixed and random effects models. Overall, we found a positive correlation between grades and popularity, but no evidence for the acting white effect. However, better grades were more positively evaluated within the majority group than within the minority group, which may further increase inequalities.
Keywords: academic achievement, elementary school, ethnicity, popularity
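As a brief illustration of the fixed-effects logic the abstract relies on, the within estimator demeans each student's panel series, removing any time-invariant individual effect before estimating the grade-popularity slope. The sketch below uses synthetic data (not the RECENS dataset); the panel sizes and coefficients are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_students, n_waves = 200, 4  # hypothetical panel: students observed over 4 waves
alpha = rng.normal(0, 2, n_students)  # unobserved, time-invariant student effect
grades = rng.normal(3, 1, (n_students, n_waves)) + 0.5 * alpha[:, None]
popularity = 2.0 * grades + alpha[:, None] + rng.normal(0, 1, (n_students, n_waves))

# Within (fixed-effects) estimator: demean each student's series to remove
# the individual effect, then run pooled OLS on the demeaned variables.
gd = grades - grades.mean(axis=1, keepdims=True)
pdm = popularity - popularity.mean(axis=1, keepdims=True)
beta_fe = (gd * pdm).sum() / (gd ** 2).sum()
print(round(beta_fe, 2))  # close to the true slope of 2.0
```

Naive pooled OLS on the raw data would be biased upward here, because the student effect enters both grades and popularity; demeaning removes it.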
Procedia PDF Downloads 200
2706 Empirical Analyses of Students’ Self-Concepts and Their Mathematics Achievements
Authors: Adetunji Abiola Olaoye
Abstract:
The study examined students' self-concepts and mathematics achievement vis-à-vis three existing theoretical models: the Humanist self-concept (M1), the Contemporary self-concept (M2) and the Skills development self-concept (M3). The study comprised one research question, which was transformed into hypotheses vis-à-vis the existing theoretical models. The sample comprised twelve public secondary schools, from which twenty-five mathematics teachers, twelve counselling officers and one thousand Upper Basic II students were selected based on intact classes, as the school administrations and system did not allow for randomization. Two instruments, a 10-item Achievement Test in Mathematics (r1 = 0.81) and a 10-item student self-concept questionnaire (r2 = 0.75), were adapted, validated and used for the study. Data were analysed through descriptive statistics, one-way ANOVA, t-tests and correlation statistics at the 5% level of significance. Findings revealed means and standard deviations of pre-achievement test scores of (51.322, 16.10), (54.461, 17.85) and (56.451, 18.22) for the Humanist, Contemporary and Skill Development self-concepts respectively. Apart from that, the study showed a significant difference in the academic performance of students across the existing models (F-cal > F-value, df = (2, 997); P < 0.05). Furthermore, the study revealed students' achievement in mathematics and self-concept questionnaire scores with means and standard deviations of (57.4, 11.35) and (81.6, 16.49) respectively. Results confirmed an affirmative relationship with the Contemporary self-concept model, which identifies an individual's subject-specific self-concept as the primary determinant of higher academic achievement in the subject, as there is a statistical correlation between students' self-concept and mathematics achievement for the Contemporary model (M2), with -Z_cal < -Z_val, df = 998; P < 0.05.
The implications of the study are discussed, and recommendations and suggestions for further studies are proffered.
Keywords: contemporary, humanists, self-concepts, skill development
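The one-way ANOVA reported above can be sketched by hand. The snippet draws synthetic scores around the reported group means and standard deviations (illustrative only, not the study's data) and computes the F statistic from between- and within-group mean squares.

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical pre-test scores for the three self-concept models, drawn
# around the reported means/SDs (M1: 51.3/16.1, M2: 54.5/17.9, M3: 56.5/18.2).
groups = [rng.normal(m, s, 300) for m, s in [(51.3, 16.1), (54.5, 17.9), (56.5, 18.2)]]

# One-way ANOVA by hand: compare between-group to within-group mean squares.
all_x = np.concatenate(groups)
grand = all_x.mean()
ss_between = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
df_b, df_w = len(groups) - 1, len(all_x) - len(groups)
F = (ss_between / df_b) / (ss_within / df_w)
print(round(F, 2))  # compare against the F critical value at (df_b, df_w)
```

The same statistic is what `scipy.stats.f_oneway` would return for these three samples.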
Procedia PDF Downloads 237
2705 Probabilistic Gathering of Agents with Simple Sensors: Distributed Algorithm for Aggregation of Robots Equipped with Binary On-Board Detectors
Authors: Ariel Barel, Rotem Manor, Alfred M. Bruckstein
Abstract:
We present a probabilistic gathering algorithm for agents that can only detect the presence of other agents in front of or behind them. The agents act in the plane and are identical and indistinguishable, oblivious, and lack any means of direct communication. They do not have a common frame of reference in the plane and choose their orientation (direction of possible motion) at random. The analysis of the gathering process assumes that the agents act synchronously in selecting random orientations that remain fixed during each unit time-interval. Two algorithms are discussed. The first one assumes discrete jumps based on the sensing results given the randomly selected motion direction, and in this case, extensive experimental results exhibit probabilistic clustering into a circular region with radius equal to the step-size in time proportional to the number of agents. The second algorithm assumes agents with continuous sensing and motion, and in this case, we can prove gathering into a very small circular region in finite expected time.
Keywords: control, decentralized, gathering, multi-agent, simple sensors
Procedia PDF Downloads 164
2704 Analytical Slope Stability Analysis Based on the Statistical Characterization of Soil Shear Strength
Authors: Bernardo C. P. Albuquerque, Darym J. F. Campos
Abstract:
Increasing our ability to solve complex engineering problems is directly related to the processing capacity of computers. By means of such equipment, one is able to run numerical algorithms quickly and accurately. Besides the increasing interest in numerical simulations, probabilistic approaches are also of great importance, and statistical tools have shown their relevance to the modelling of practical engineering problems. In general, statistical approaches to such problems assume that the random variables involved follow a normal distribution. This assumption tends to provide incorrect results when skewed data are present, since normal distributions are symmetric about their means. Thus, in order to visualize and quantify this aspect, 9 statistical distributions (symmetric and skew) have been considered to model a hypothetical slope stability problem. The data modelled are the friction angles of a superficial soil in Brasilia, Brazil. Despite its apparent universality, the normal distribution did not qualify as the best fit. In the present effort, data obtained in consolidated-drained triaxial tests and saturated direct shear tests have been modelled and used to analytically derive the probability density function (PDF) of the safety factor of a hypothetical slope based on the Mohr-Coulomb rupture criterion. Based on this analysis, it is possible to explicitly derive the failure probability considering the friction angle as a random variable. Furthermore, it is possible to compare the stability analysis when the friction angle is modelled as a Dagum distribution (the distribution that presented the best fit to the histogram) and as a normal distribution. This comparison leads to relevant differences when analysed in light of risk management.
Keywords: statistical slope stability analysis, skew distributions, probability of failure, functions of random variables
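The failure-probability idea can be made concrete with a small Monte Carlo run. The sketch below uses a simple infinite-slope safety factor and a normal friction angle with illustrative parameters; the paper itself derives the PDF analytically and finds a Dagum distribution fits best, so this is only a numerical stand-in.

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical infinite-slope check: FS = tan(phi) / tan(beta), with the
# friction angle phi treated as a random variable (values are illustrative).
phi = np.radians(rng.normal(30, 3, 100_000))  # friction angle samples, deg -> rad
beta = np.radians(25)                          # slope inclination
fs = np.tan(phi) / np.tan(beta)
p_failure = (fs < 1.0).mean()                  # P(FS < 1) by Monte Carlo
print(f"P(FS < 1) ~ {p_failure:.3f}")
```

Swapping the normal sampler for a skewed distribution (e.g. via `scipy.stats.dagum` fits) is exactly the comparison the abstract describes.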
Procedia PDF Downloads 338
2703 Assessment of Carbon Dioxide Separation by Amine Solutions Using Electrolyte Non-Random Two-Liquid and Peng-Robinson Models: Carbon Dioxide Absorption Efficiency
Authors: Arash Esmaeili, Zhibang Liu, Yang Xiang, Jimmy Yun, Lei Shao
Abstract:
High-pressure carbon dioxide (CO2) absorption from a specific gas in a conventional column was evaluated with the Aspen HYSYS simulator using a wide range of single absorbents and blended solutions to estimate the outlet CO2 concentration, absorption efficiency and CO2 loading, and to choose the most suitable solution for CO2 capture with respect to environmental concerns. The property package (Acid Gas-Chemical Solvent), which is compatible with all solutions applied in this simulation study, estimates properties based on the electrolyte non-random two-liquid (E-NRTL) model for electrolyte thermodynamics and the Peng-Robinson equation of state for the vapor and liquid hydrocarbon phases. Among all the investigated single amines and blended solutions, piperazine (PZ) and the mixture of piperazine and monoethanolamine (MEA) were found to be the most effective absorbents for CO2 absorption, with high reactivity under the simulated operational conditions.
Keywords: absorption, amine solutions, Aspen HYSYS, carbon dioxide, simulation
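For a pure component, the Peng-Robinson equation of state mentioned above reduces to a cubic in the compressibility factor Z. The hand calculation below for CO2 vapor is only a sketch of that cubic (the paper relies on Aspen HYSYS's property package, not this calculation); the temperature and pressure are illustrative.

```python
import numpy as np

# Peng-Robinson compressibility factor for pure CO2 (standalone sketch).
R = 8.314  # J/(mol K)
Tc, Pc, omega = 304.13, 7.377e6, 0.225  # CO2 critical properties

def pr_z(T, P):
    k = 0.37464 + 1.54226 * omega - 0.26992 * omega**2
    alpha = (1 + k * (1 - np.sqrt(T / Tc))) ** 2
    a = 0.45724 * R**2 * Tc**2 / Pc * alpha
    b = 0.07780 * R * Tc / Pc
    A, B = a * P / (R * T) ** 2, b * P / (R * T)
    # Cubic in Z: Z^3 - (1-B)Z^2 + (A - 3B^2 - 2B)Z - (AB - B^2 - B^3) = 0
    roots = np.roots([1, -(1 - B), A - 3 * B**2 - 2 * B, -(A * B - B**2 - B**3)])
    real = roots[np.isreal(roots)].real
    return real.max()  # largest real root is the vapor-phase Z

print(round(pr_z(313.15, 1e6), 3))  # CO2 vapor at 40 C and 10 bar, slightly below 1
```

In a full absorber simulation this EOS handles the vapor phase while the E-NRTL model handles the electrolyte liquid, as the abstract notes.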
Procedia PDF Downloads 185
2702 Comparative Analysis of Effecting Factors on Fertility by Birth Order: A Hierarchical Approach
Authors: Ali Hesari, Arezoo Esmaeeli
Abstract:
Given the dramatic changes in fertility and higher-order births in Iran during recent decades, knowledge of the factors affecting different birth orders is of crucial importance. In this study, in keeping with the hierarchical structure of much social science data and the effect of variables at different levels of social phenomena, the determinants of different birth orders in the 365 days preceding the 1390 census are explored with a multilevel approach. The 2% individual row data of the 1390 census are analyzed with the HLM software. Three different hierarchical linear regression models are estimated for the analysis of first and second, third, and fourth and higher birth orders. The results display different outcomes for the three models. The individual-level variables entered in the equation are region of residence (rural/urban), age, educational level and labor participation status; the province-level variable is GDP per capita. Results show that the individual-level variables have different effects in these three models, and at the second level we find different random and fixed effects across the models.
Keywords: fertility, birth order, hierarchical approach, fixed effects, random effects
Procedia PDF Downloads 339
2701 Comparing Xbar Charts: Conventional versus Reweighted Robust Estimation Methods for Univariate Data Sets
Authors: Ece Cigdem Mutlu, Burak Alakent
Abstract:
Maintaining the quality of manufactured products at a desired level depends on the stability of process dispersion and location parameters and on detection of perturbations in these parameters as promptly as possible. The Shewhart control chart is the most widely used technique in statistical process monitoring to monitor the quality of products and control process mean and variability. In the application of Xbar control charts, the sample standard deviation and sample mean are known to be the most efficient conventional estimators of process dispersion and location, respectively, under the assumption of independent and normally distributed datasets. On the other hand, there is no guarantee that real-world data will be normally distributed. When process parameters are estimated from Phase I data clouded with outliers, the efficiency of traditional estimators is significantly reduced and the performance of Xbar charts is undesirably low; e.g. occasional outliers in the rational subgroups in the Phase I data set may considerably affect the sample mean and standard deviation, resulting in a serious delay in the detection of inferior products in Phase II. For more efficient application of control charts, estimators robust against contaminations that may exist in Phase I are required. In the current study, we present a simple approach to construct robust Xbar control charts using the average distance to the median, the Qn estimator of scale and the M-estimator of scale with logistic psi-function for the process dispersion parameter, and the Harrell-Davis qth quantile estimator, the Hodges-Lehmann estimator and the M-estimator of location with Huber and logistic psi-functions for the process location parameter.
Phase I efficiency of the proposed estimators and Phase II performance of Xbar charts constructed from these estimators are compared with the conventional mean and standard deviation statistics, both under normality and against diffuse-localized and symmetric-asymmetric contaminations, using 50,000 Monte Carlo simulations in MATLAB. It is found that robust estimators yield parameter estimates with higher efficiency against all types of contamination, and Xbar charts constructed using robust estimators have higher power in detecting disturbances than conventional methods. Additionally, utilizing individuals charts to screen outlier subgroups and employing different combinations of dispersion and location estimators on subgroups and individual observations are found to improve the performance of Xbar charts.
Keywords: average run length, M-estimators, quality control, robust estimators
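To make the robustness point concrete, here is a small sketch of the Hodges-Lehmann location estimator, one of the estimators listed above, against a single Phase I outlier. The subgroup is synthetic, not the study's simulation setup.

```python
import numpy as np

rng = np.random.default_rng(3)
# A rational subgroup contaminated with one gross outlier, as can happen
# in Phase I data: 24 in-control points around 10, plus one reading of 45.
subgroup = np.append(rng.normal(10, 1, 24), 45.0)

def hodges_lehmann(x):
    # Median of all pairwise Walsh averages (x_i + x_j) / 2, i <= j.
    i, j = np.triu_indices(len(x))
    return np.median((x[i] + x[j]) / 2)

print(round(subgroup.mean(), 2))           # sample mean, dragged up by the outlier
print(round(hodges_lehmann(subgroup), 2))  # robust location estimate, near 10
```

A chart centered on the contaminated sample mean would shift its control limits and delay Phase II detection; the robust estimate stays near the in-control level.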
Procedia PDF Downloads 190
2700 Reconstruction Spectral Reflectance Cube Based on Artificial Neural Network for Multispectral Imaging System
Authors: Iwan Cony Setiadi, Aulia M. T. Nasution
Abstract:
The multispectral imaging (MSI) technique has been used for skin analysis, especially for distant mapping of in-vivo skin chromophores, by analyzing spectral data at each reflected image pixel. For ergonomic purposes, our multispectral imaging system is decomposed into two parts: a light source compartment based on LEDs with 11 different wavelengths and a monochromatic 8-bit CCD camera with a C-mount objective lens. Software with a MATLAB GUI to control the system was also developed. Our system provides 11 monoband images and is coupled with software reconstructing hyperspectral cubes from these multispectral images. In this paper, we propose a new method to build a hyperspectral reflectance cube based on an artificial neural network algorithm. After preliminary corrections, a neural network is trained using the 32 natural colors from the X-Rite ColorChecker Passport, with reference spectra acquired by a spectrophotometer. This neural network is then used to retrieve a megapixel multispectral cube between 380 and 880 nm with a 5 nm resolution from a low-spectral-resolution multispectral acquisition. As hyperspectral cubes contain a spectrum for each pixel, comparison should be done between the theoretical values from the spectrophotometer and the reconstructed spectra. To evaluate the performance of the reconstruction, we used the Goodness of Fit Coefficient (GFC) and Root Mean Squared Error (RMSE). To validate the reconstruction, the set of 8 colour patches reconstructed by our MSI system and the one recorded by the spectrophotometer were compared. The average GFC was 0.9990 (standard deviation = 0.0010) and the average RMSE was 0.2167 (standard deviation = 0.064).
Keywords: multispectral imaging, reflectance cube, spectral reconstruction, artificial neural network
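The two reconstruction metrics can be written down directly. The sketch below evaluates GFC and RMSE on a synthetic reflectance spectrum sampled on the 380-880 nm, 5 nm grid the abstract describes; the spectrum itself is illustrative, not the paper's measured patches.

```python
import numpy as np

# Goodness of Fit Coefficient (GFC) and RMSE between a measured and a
# reconstructed spectrum sampled on the same wavelength grid.
def gfc(measured, reconstructed):
    num = np.abs(np.dot(measured, reconstructed))
    return num / (np.linalg.norm(measured) * np.linalg.norm(reconstructed))

def rmse(measured, reconstructed):
    return np.sqrt(np.mean((measured - reconstructed) ** 2))

wl = np.arange(380, 885, 5)                   # 380-880 nm at 5 nm steps
measured = np.exp(-((wl - 550) / 80.0) ** 2)  # synthetic reflectance peak
reconstructed = measured + np.random.default_rng(4).normal(0, 0.01, wl.size)

print(round(gfc(measured, reconstructed), 4))  # close to 1 for a good fit
print(round(rmse(measured, reconstructed), 4))
```

GFC is scale-invariant (it is the cosine between the two spectra), while RMSE penalizes absolute amplitude errors, which is why the paper reports both.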
Procedia PDF Downloads 322
2699 Swarm Optimization of Unmanned Vehicles and Object Localization
Authors: Venkataramana Sovenahalli Badigar, B. M. Suryakanth, Akshar Prasanna, Karthik Veeramalai, Vishwak Ram Vishwak Ram
Abstract:
Technological advances have led to widespread autonomy in vehicles. Empowering these autonomous vehicles with the intelligence to cooperate amongst themselves leads to more efficient use of the resources available to them. This paper proposes a demonstration of a swarm algorithm implemented on a group of autonomous vehicles. The demonstration involves two ground bots and an aerial drone which cooperate to locate an object of interest. The object of interest is modelled using a high-intensity light source which acts as a beacon. The ground bots are light-sensitive and move towards the beacon. The ground bots and the drone traverse random paths and jointly locate the beacon. This finds application in various scenarios where human intervention is difficult, such as search and rescue during natural disasters, delivering crucial packages in perilous situations, etc. Experimental results show that the modified swarm algorithm implemented in this system performs better than a fully random movement algorithm for object localization and tracking.
Keywords: swarm algorithm, object localization, ground bots, drone, beacon
Procedia PDF Downloads 257
2698 Bag of Words Representation Based on Fusing Two Color Local Descriptors and Building Multiple Dictionaries
Authors: Fatma Abdedayem
Abstract:
We propose an extension to the well-known Bag of Words (BOW) method, which has played a successful role in the field of image categorization. In practice, this method is based on representing an image with visual words. In this work, we first extract features from images using the Spatial Pyramid Representation (SPR) and two dissimilar color descriptors, opponent-SIFT and transformed-color-SIFT. Second, we fuse the color local features by joining the two histograms coming from these descriptors. Third, after collecting all features, we generate multiple dictionaries from n random feature subsets obtained by dividing all features into n random groups. Using these dictionaries separately, each image can be represented by n histograms, which are then concatenated horizontally to form the final histogram; this allows Multiple Dictionaries to be combined (MDBoW). In the final step, in order to classify images, we apply a Support Vector Machine (SVM) to the generated histograms. Experimentally, we used two dissimilar image datasets to test our proposal: Caltech 256 and PASCAL VOC 2007.
Keywords: bag of words (BOW), color descriptors, multi-dictionaries, MDBoW
Procedia PDF Downloads 297
2697 Analysis of the Unreliable M/G/1 Retrial Queue with Impatient Customers and Server Vacation
Authors: Fazia Rahmoune, Sofiane Ziani
Abstract:
Retrial queueing systems have been extensively used to stochastically model many problems arising in computer networks, telecommunications and telephone systems, among others. In this work, we consider an M/G/1 retrial queue with an unreliable server, random vacations and two types of primary customers, persistent and impatient. This model involves the unreliability of the server, which can be subject to physical breakdowns, and takes into account corrective maintenance for restoring service when a failure occurs. On the other hand, we consider random vacations, which can model preventive maintenance for improving system performance and preventing breakdowns. We give the necessary and sufficient stability condition of the system. We then obtain the joint probability distribution of the server state and the number of customers in orbit, and derive the most useful performance measures analytically. Moreover, we analyze the busy period of the system. Finally, we derive the stability condition and the generating function of the stationary distribution of the number of customers in the system when there are no vacations and no impatient customers, and when there are no vacations, server failures or impatient customers.
Keywords: modeling, retrial queue, unreliable server, vacation, stochastic analysis
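For intuition on the stability condition, a plain single-server queue (no retrials, vacations or breakdowns, so a much simplified stand-in for the paper's model) can be simulated with the Lindley recursion; such a queue is stable when the utilization rho = lambda * E[S] is below 1. The rates below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)
lam, mu = 0.5, 1.0  # arrival and service rates; rho = lam/mu = 0.5 < 1 -> stable
n = 50_000
inter = rng.exponential(1 / lam, n)  # inter-arrival times
serv = rng.exponential(1 / mu, n)    # service times

# Lindley recursion for waiting times: W_{k+1} = max(0, W_k + S_k - A_{k+1})
w, waits = 0.0, []
for a, s in zip(inter, serv):
    waits.append(w)
    w = max(0.0, w + s - a)
mean_wait = float(np.mean(waits))
print(round(mean_wait, 2))  # M/M/1 theory gives rho / (mu - lam) = 1.0
```

With rho >= 1 the simulated waits would drift upward without bound, which is the behavior the paper's analytical stability condition rules out.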
Procedia PDF Downloads 186
2696 Cross Professional Team-Assisted Teaching Effectiveness
Authors: Shan-Yu Hsu, Hsin-Shu Huang
Abstract:
The main purpose of this teaching research was to design an interdisciplinary team-assisted teaching method for trainees and interns and to review the effectiveness of this method on trainees' understanding of peritoneal dialysis. The subjects were fifth- and sixth-year trainees at a medical center's medical school. The teaching methods included media teaching, demonstration of technical operations, face-to-face communication with patients, case discussions, and field visits to the peritoneal dialysis room. Learning effectiveness was evaluated before and after the intervention, as well as verbally. Statistical analysis was performed using the SPSS paired-samples t-test to analyze whether there was a difference in peritoneal dialysis professional cognition before and after the teaching intervention. Descriptive statistics show that the average pre-test score was 74.44 (standard deviation 9.34) and the average post-test score was 95.56 (standard deviation 5.06). The paired-samples t-test gave p = 0.006, indicating a significant difference in the peritoneal dialysis professional cognition test before and after the intervention. The interdisciplinary team-assisted teaching method helps trainees and interns improve their professional awareness of peritoneal dialysis, and trainee physicians gave positive feedback on the method. This teaching research finds that clinical ability development education for trainees and interns can employ cross-professional team-assisted teaching methods to support clinical teaching guidance.
Keywords: monitor quality, patient safety, health promotion objective, cross-professional team-assisted teaching methods
Procedia PDF Downloads 143
2695 Nonlinear Analysis of Shear Deformable Deep Beam Resting on Nonlinear Two-Parameter Random Soil
Authors: M. Seguini, D. Nedjar
Abstract:
In this paper, a nonlinear analysis of a Timoshenko beam undergoing moderately large deflections and resting on a nonlinear two-parameter random foundation is presented, taking into account the effects of shear deformation, variation of the beam's properties and the spatial variability of soil characteristics. The probabilistic finite element analysis has been performed using Timoshenko beam theory with the Von Kármán nonlinear strain-displacement relationships, combined with Vanmarcke's theory and Monte Carlo simulations, implemented in a MATLAB program. Numerical examples of the newly developed model are conducted to confirm its efficiency and accuracy and the importance of accounting for the second foundation parameter (Winkler-Pasternak). The results obtained from the developed model are presented and compared with those available in the literature to examine how consideration of the shear and of the spatial variability of soil characteristics affects the response of the system.
Keywords: nonlinear analysis, soil-structure interaction, large deflection, Timoshenko beam, Euler-Bernoulli beam, Winkler foundation, Pasternak foundation, spatial variability
Procedia PDF Downloads 323
2694 KSVD-SVM Approach for Spontaneous Facial Expression Recognition
Authors: Dawood Al Chanti, Alice Caplier
Abstract:
Sparse representations of signals have received a great deal of attention in recent years. In this paper, the interest of using sparse representation as a means of performing sparse discriminative analysis between spontaneous facial expressions is demonstrated. An automatic facial expression recognition system is presented. It uses a KSVD-SVM approach made of three main stages: a pre-processing and feature extraction stage, which solves the problem of shared subspace distribution based on random projection theory to obtain low-dimensional discriminative and reconstructive features; a dictionary learning and sparse coding stage, which uses the KSVD model to learn discriminative under- or over-complete dictionaries for sparse coding; and finally a classification stage, which uses an SVM classifier for facial expression recognition. Our main concern is to be able to recognize non-basic affective states and non-acted expressions. Extensive experiments on the JAFFE static acted facial expression database and on the DynEmo dynamic spontaneous facial expression database exhibit very good recognition rates.
Keywords: dictionary learning, random projection, pose and spontaneous facial expression, sparse representation
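The random projection step in the first stage rests on the Johnson-Lindenstrauss property: projecting high-dimensional features onto a random low-dimensional subspace approximately preserves pairwise distances. A minimal numpy sketch (hypothetical dimensions, not the paper's feature pipeline):

```python
import numpy as np

rng = np.random.default_rng(7)
# 50 high-dimensional feature vectors, projected from 1000-D down to 64-D
# with a scaled Gaussian random matrix.
X = rng.normal(0, 1, (50, 1000))
P = rng.normal(0, 1, (1000, 64)) / np.sqrt(64)  # scaling keeps norms unbiased
Xp = X @ P

d_orig = np.linalg.norm(X[0] - X[1])  # distance before projection
d_proj = np.linalg.norm(Xp[0] - Xp[1])  # distance after projection
print(round(d_proj / d_orig, 2))  # ratio close to 1
```

The projection is data-independent and cheap, which is why it is attractive as a dimensionality-reduction front end before dictionary learning.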
Procedia PDF Downloads 305
2693 Improvement of Visual Acuity in Patient Undergoing Occlusion Therapy
Authors: Rajib Husain, Mezbah Uddin, Mohammad Shamsal Islam, Rabeya Siddiquee
Abstract:
Purpose: To determine the improvement of visual acuity in patients undergoing occlusion therapy. Methods: This was a prospective hospital-based study of newly diagnosed amblyopia seen at the pediatric clinic of Chittagong Eye Infirmary & Training Complex. Thirty-two subjects with refractive amblyopia were examined and a questionnaire was piloted. Included were all patients diagnosed with refractive amblyopia between 5 and 8 years of age, without previous amblyopia treatment, and whose parents were willing to participate in the study. Patients diagnosed with strabismic amblyopia were excluded. Patients were first given the best correction for a month. When the VA in the amblyopic eye did not improve over that month, occlusion treatment was started. Occlusion was done daily for 6-8 h together with vision therapy and was carried out for three months. Results: Of the 32 children studied, 31 had good compliance with amblyopia treatment, whereas one child had poor compliance. About 6% of the children had amblyopia from myopia, 7% from hyperopia, 32% from myopic astigmatism, 42% from hyperopic astigmatism and 13% from mixed astigmatism. The mean and standard deviation of the presenting average VA were 0.452 ± 0.275 logMAR, and after the intervention of amblyopia therapy with vision therapy the mean and standard deviation of VA were 0.155 ± 0.157 logMAR. Of all respondents, 21.85% had BCVA in the range 0-0.2 logMAR, 37.5% in the range 0.22-0.5 logMAR, 35.95% in the range 0.52-0.8 logMAR and 4.7% in the range 0.82-1 logMAR; after the intervention of occlusion therapy with vision therapy, 76.6% had VA in the range 0-0.2 logMAR, 21.85% in the range 0.22-0.5 logMAR and 1.5% in the range 0.52-0.8 logMAR. Conclusion: Amblyopia is a most important concern in the pediatric age group because it can lead to visual impairment.
Thus, this study concludes that occlusion therapy with vision therapy is probably one of the best treatment methods for amblyopic patients (age 5-8 years), and that compliance and age were the most critical factors predicting a successful outcome.
Keywords: amblyopia, occlusion therapy, vision therapy, eccentric fixation, visuoscopy
Procedia PDF Downloads 503
2692 Comparative Evaluation of Vanishing Interfacial Tension Approach for Minimum Miscibility Pressure Determination
Authors: Waqar Ahmad Butt, Gholamreza Vakili Nezhaad, Ali Soud Al Bemani, Yahya Al Wahaibi
Abstract:
Minimum miscibility pressure (MMP) plays a great role in determining the displacement efficiency of different gas injection processes. Experimental techniques for MMP determination include the industrially recommended slim tube, vanishing interfacial tension (VIT) and the rising bubble apparatus (RBA). In this paper, an MMP measurement study using the slim tube and VIT experimental techniques for two different crude oil samples (M and N), in both live and stock tank oil forms, is presented. VIT-measured MMP values for both 'M' and 'N' live crude oils were close to the slim tube determined MMP values, with 6.4% and 5% deviation respectively. For both oil samples in stock tank oil form, however, the VIT-measured MMP showed an unacceptably high deviation from the slim tube determined MMP. This difference appears to be related to the high content of stabilized heavier fractions in the crude oil and a lack of multiple-contact miscibility. None of the nine deployed crude oil and CO2 MMP correlations could yield a reliable MMP close to the slim tube determined MMP. Since the VIT-determined MMP values for both considered live crude oils closely match the slim tube determined values, VIT is confirmed as a reliable, reproducible, rapid and cheap alternative for live crude oil MMP determination. VIT MMP determination for the stock tank oil case needs further investigation of the stabilization/destabilization mechanism of the oil's heavier ends and of multiple-contact miscibility development.
Keywords: minimum miscibility pressure, interfacial tension, multiple contacts miscibility, heavier ends
Procedia PDF Downloads 268
2691 Reliability Analysis for Cyclic Fatigue Life Prediction in Railroad Bolt Hole
Authors: Hasan Keshavarzian, Tayebeh Nesari
Abstract:
The bolted rail joint is one of the most vulnerable areas in railway track. A comprehensive approach was developed for studying the reliability of fatigue crack initiation in the railroad bolt hole under random axle loads and random material properties. The operating conditions were also considered as stochastic variables. In order to obtain a comprehensive probabilistic model for fatigue crack initiation life prediction in the railroad bolt hole, we used FEM, the response surface method (RSM) and reliability analysis. A combined energy-density-based and critical-plane-based fatigue concept is used for the fatigue crack prediction. The dynamic loads were calculated according to the axle load, speed and track properties. The results show that the axle load is the most sensitive parameter, compared to Poisson's ratio, for fatigue crack initiation life. Also, the reliability index decreases slowly due to the high-cycle fatigue regime in this area.
Keywords: rail-wheel tribology, rolling contact mechanics, finite element modeling, reliability analysis
Procedia PDF Downloads 381
2690 A Comparative Analysis of Classification Models with Wrapper-Based Feature Selection for Predicting Student Academic Performance
Authors: Abdullah Al Farwan, Ya Zhang
Abstract:
In today's educational arena, it is critical to understand educational data and be able to evaluate important aspects of it, particularly data on student achievement. Educational Data Mining (EDM) is a research area that focuses on uncovering patterns and information in data from educational institutions. Teachers who are able to predict their students' class performance can use this information to improve their teaching. Such prediction has evolved into valuable knowledge that can be used for a wide range of objectives, for example in strategic planning for high-quality education. Based on previous data, this paper recommends employing data mining techniques to forecast students' final grades. In this study, five data mining methods, Decision Tree, JRip, Naive Bayes, Multi-layer Perceptron, and Random Forest with wrapper-based feature selection, were used on two datasets relating to Portuguese language and mathematics classes. The results showed the effectiveness of data mining methodologies in predicting student academic success. The classification accuracy achieved with the selected algorithms lies in the range of 80-94%. Among the selected classification algorithms, the lowest accuracy is achieved by the Multi-layer Perceptron algorithm, close to 70.45%, and the highest accuracy is achieved by the Random Forest algorithm, close to 94.10%. This work can assist educational administrators in identifying poorly performing students at an early stage and perhaps implementing motivational interventions to improve their academic success and prevent educational dropout.
Keywords: classification algorithms, decision tree, feature selection, multi-layer perceptron, Naïve Bayes, random forest, students' academic performance
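Wrapper-based feature selection, as used above, scores candidate feature subsets by the accuracy of the classifier itself rather than by a filter statistic. The toy forward-selection sketch below uses a nearest-centroid stand-in classifier on synthetic data (not the Portuguese/mathematics datasets, and not one of the paper's five classifiers):

```python
import numpy as np

rng = np.random.default_rng(6)
# Synthetic data: only features 0 and 2 carry signal about the label.
n = 400
X = rng.normal(0, 1, (n, 5))
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(0, 0.5, n) > 0).astype(int)
train, test = slice(0, 300), slice(300, n)

def accuracy(features):
    # Nearest-centroid classifier restricted to the chosen feature subset.
    Xtr, Xte = X[train][:, features], X[test][:, features]
    c0, c1 = Xtr[y[train] == 0].mean(0), Xtr[y[train] == 1].mean(0)
    pred = np.linalg.norm(Xte - c1, axis=1) < np.linalg.norm(Xte - c0, axis=1)
    return (pred.astype(int) == y[test]).mean()

selected, remaining = [], list(range(5))
while remaining:  # greedily add the feature that improves accuracy most
    best = max(remaining, key=lambda f: accuracy(selected + [f]))
    if selected and accuracy(selected + [best]) <= accuracy(selected):
        break
    selected.append(best)
    remaining.remove(best)
print(selected)  # the informative features should be picked up
```

In the paper's setting, the same loop would wrap each of the five classifiers in turn, which is why wrapper selection is accurate but more expensive than filter methods.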
Procedia PDF Downloads 166
2689 Testing the Weak Form Efficiency of Islamic Stock Market: Empirical Evidence from Indonesia
Authors: Herjuno Bagus Wicaksono, Emma Almira Fauni, Salma Amelia Dina
Abstract:
The Efficient Market Hypothesis (EMH) states that, in an efficient capital market, price fully reflects the information available in the market. This theory has influenced the behavior of many investors trading in the stock market. Numerous studies have been conducted to test the efficiency of stock markets in particular countries. Indonesia, as one of the emerging countries, has shown substantial growth in past years. Hence, this paper aims to examine the efficiency of the Islamic stock market in Indonesia in its weak form. Daily stock price data from the Indonesia Sharia Stock Index (ISSI) for the period October 2015 to October 2016 were used to perform the statistical tests: the run test and the serial correlation test. The results show that there is no serial correlation between current and past prices and that the market follows a random walk. This research concludes that the Indonesian Islamic stock market is weak-form efficient.
Keywords: efficient market hypothesis, Indonesia sharia stock index, random walk, weak form efficiency
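The run test used in the study can be sketched as follows: count runs of above/below-median observations and compare the count to its expectation under randomness. Here it is applied to simulated returns rather than ISSI data, so the numbers are illustrative.

```python
import math
import numpy as np

rng = np.random.default_rng(8)
returns = rng.normal(0, 0.01, 250)  # ~one year of simulated daily returns

# Wald-Wolfowitz runs test on above/below-median signs.
signs = returns > np.median(returns)
runs = 1 + int(np.sum(signs[1:] != signs[:-1]))
n1, n2 = int(signs.sum()), int((~signs).sum())
mu = 2 * n1 * n2 / (n1 + n2) + 1
var = 2 * n1 * n2 * (2 * n1 * n2 - n1 - n2) / ((n1 + n2) ** 2 * (n1 + n2 - 1))
z = (runs - mu) / math.sqrt(var)
print(round(z, 2))  # |z| < 1.96 -> cannot reject randomness at the 5% level
```

A significantly negative z (too few runs) would indicate trending prices, and a significantly positive z (too many runs) mean reversion; either would contradict the random walk.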
Procedia PDF Downloads 4602688 A Machine Learning Approach for Efficient Resource Management in Construction Projects
Authors: Soheila Sadeghi
Abstract:
Construction projects are complex and often subject to significant cost overruns due to the multifaceted nature of the activities involved. Accurate cost estimation is crucial for effective budget planning and resource allocation. Traditional methods for predicting overruns often rely on expert judgment or analysis of historical data, which can be time-consuming, subjective, and may fail to consider important factors. However, with the increasing availability of data from construction projects, machine learning techniques can be leveraged to improve the accuracy of overrun predictions. This study applied machine learning algorithms to enhance the prediction of cost overruns in a case study of a construction project. The methodology involved the development and evaluation of two machine learning models: Random Forest and Neural Networks. Random Forest can handle high-dimensional data, capture complex relationships, and provide feature importance estimates. Neural Networks, particularly Deep Neural Networks (DNNs), are capable of automatically learning and modeling complex, non-linear relationships between input features and the target variable. These models can adapt to new data, reduce human bias, and uncover hidden patterns in the dataset. The findings of this study demonstrate that both Random Forest and Neural Networks can significantly improve the accuracy of cost overrun predictions compared to traditional methods. The Random Forest model also identified key cost drivers and risk factors, such as changes in the scope of work and delays in material delivery, which can inform better project risk management. However, the study acknowledges several limitations. First, the findings are based on a single construction project, which may limit the generalizability of the results to other projects or contexts. 
Second, the dataset, although comprehensive, may not capture all relevant factors influencing cost overruns, such as external economic conditions or political factors. Third, the study focuses primarily on cost overruns, while schedule overruns are not explicitly addressed. Future research should explore the application of machine learning techniques to a broader range of projects, incorporate additional data sources, and investigate the prediction of both cost and schedule overruns simultaneously.Keywords: resource allocation, machine learning, optimization, data-driven decision-making, project management
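The feature-importance step that surfaced cost drivers such as scope changes and material delays might look like the following sketch. The data are synthetic and the feature names hypothetical; this is not the study's model or dataset.

```python
# Random Forest regressor on synthetic project data; feature importances
# are then used to rank candidate cost drivers.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 400
scope_changes = rng.poisson(2, n)            # count of scope-of-work changes
material_delay = rng.exponential(5, n)       # delivery delay, days
crew_size = rng.integers(5, 30, n)           # irrelevant here by construction
noise = rng.normal(0, 1, n)
overrun_pct = 3 * scope_changes + 0.8 * material_delay + noise  # synthetic target

X = np.column_stack([scope_changes, material_delay, crew_size])
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, overrun_pct)
names = ["scope_changes", "material_delay", "crew_size"]
ranking = sorted(zip(model.feature_importances_, names), reverse=True)
print(ranking)
```

On real project records the same `feature_importances_` attribute is what would flag scope changes and material delays as key risk factors.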
Procedia PDF Downloads 382687 Using Analytical Hierarchy Process and TOPSIS Approaches in Designing a Finite Element Analysis Automation Program
Authors: Ming Wen, Nasim Nezamoddini
Abstract:
Sophisticated numerical simulations like finite element analysis (FEA) involve a complicated process, from model setup to post-processing tasks, that requires the replication of time-consuming steps. Utilizing an FEA automation program simplifies the complexity of the involved steps while minimizing human errors in analysis setup, calculations, and results processing. One of the main challenges in designing FEA automation programs is to identify user requirements and link them to possible design alternatives. This paper presents a decision-making framework for designing a Python-based FEA automation program for modal analysis, frequency response analysis, and random vibration fatigue (RVF) analysis procedures. The analytical hierarchy process (AHP) and the technique for order preference by similarity to ideal solution (TOPSIS) are applied to evaluate design alternatives considering the feedback received from experts and program users.Keywords: finite element analysis, FEA, random vibration fatigue, process automation, analytical hierarchy process, AHP, TOPSIS, multiple-criteria decision-making, MCDM
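The AHP-to-TOPSIS pipeline described above can be condensed into a few lines of linear algebra. The pairwise judgments and alternative scores below are invented for illustration; they are not the expert feedback used in the paper.

```python
# AHP criterion weights via the principal eigenvector, then a TOPSIS ranking.
import numpy as np

# --- AHP: weights from a 3x3 pairwise comparison matrix (hypothetical judgments)
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency ratio (random index RI = 0.58 for n = 3)
CI = (eigvals.real[k] - len(A)) / (len(A) - 1)
CR = CI / 0.58

# --- TOPSIS: rank three design alternatives on the weighted (benefit) criteria
X = np.array([[7.0, 9.0, 6.0],
              [8.0, 6.0, 7.0],
              [6.0, 8.0, 9.0]])          # rows: alternatives, cols: criteria
R = X / np.linalg.norm(X, axis=0)        # vector-normalize each criterion
V = R * weights
ideal, anti = V.max(axis=0), V.min(axis=0)
d_pos = np.linalg.norm(V - ideal, axis=1)
d_neg = np.linalg.norm(V - anti, axis=1)
closeness = d_neg / (d_pos + d_neg)      # relative closeness to the ideal
best = int(np.argmax(closeness))
print(weights.round(3), round(CR, 3), closeness.round(3), best)
```

A CR below 0.1 is the usual acceptability threshold for the pairwise judgments; the alternative with the highest closeness wins.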
Procedia PDF Downloads 1122686 Quasi-Photon Monte Carlo on Radiative Heat Transfer: An Importance Sampling and Learning Approach
Authors: Utkarsh A. Mishra, Ankit Bansal
Abstract:
At high temperature, radiative heat transfer is the dominant mode of heat transfer. It is governed by various phenomena such as photon emission, absorption, and scattering. The solution of the governing integro-differential equation of radiative transfer is a complex process, even more so when the effects of a participating medium and wavelength-dependent properties are taken into consideration. Although a generic formulation of such a radiative transport problem can be modeled for a wide variety of problems with non-gray, non-diffusive surfaces, there is always a trade-off between the simplicity and the accuracy of the problem. Recently, solutions of complicated mathematical problems with statistical methods based on randomization of naturally occurring phenomena have gained significant importance. Photon bundles with discrete energy can be replicated with random numbers describing the emission, absorption, and scattering processes. Photon Monte Carlo (PMC) is a simple, yet powerful, technique for solving radiative transfer problems in complicated geometries with an arbitrary participating medium. The method increases the accuracy of estimation on the one hand, and the computational cost on the other. The participating media, generally gases such as CO₂, CO, and H₂O, present complex emission and absorption spectra. Modeling the emission/absorption accurately with random numbers requires weighted sampling, as different sections of the spectrum carry different importance. Importance sampling (IS) was implemented to sample random photons of arbitrary wavelength, and the sampled data provided unbiased training of MC estimators for better results. A better replacement for uniform random numbers is deterministic, quasi-random sequences. Halton, Sobol, and Faure low-discrepancy sequences are used in this study.
They possess better space-filling performance than the uniform random number generator and give rise to low-variance, stable Quasi-Monte Carlo (QMC) estimators with faster convergence. An optimal supervised learning scheme was further considered to reduce the computational cost of the PMC simulation. A one-dimensional plane-parallel slab problem with a participating medium was formulated. The histories of some randomly sampled photon bundles were recorded to train an Artificial Neural Network (ANN) back-propagation model. The flux, calculated using the standard quasi-PMC, was taken as the training target. Results obtained with the proposed model for the one-dimensional problem are compared with the exact analytical solution and with the PMC model using the line-by-line (LBL) spectral model. The approximate variance obtained was around 3.14%. Results were analyzed with respect to time and total flux in both cases. A significant reduction in variance as well as a faster rate of convergence was observed with the QMC method over the standard PMC method. However, the results obtained with the ANN method showed greater variance (around 25-28%) compared with the other cases. There is great scope for machine learning models to help further reduce computation cost once trained successfully. Multiple ways of selecting the input data, as well as various architectures, will be tried so that the problem environment can be fully represented to the ANN model. Better results can be achieved in this as yet unexplored domain.Keywords: radiative heat transfer, Monte Carlo Method, pseudo-random numbers, low discrepancy sequences, artificial neural networks
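The core variance-reduction idea, replacing pseudo-random samples with a Sobol low-discrepancy sequence, can be shown on a toy estimator (here π via quarter-circle hits) without any of the radiative transfer physics; this stands in for the photon-bundle sampling only.

```python
# Pseudo-random Monte Carlo vs. a Sobol quasi-Monte Carlo estimate of pi.
import numpy as np
from scipy.stats import qmc

def estimate_pi(points):
    # points: (n, 2) samples in the unit square; quarter-circle hit fraction * 4
    inside = (points ** 2).sum(axis=1) <= 1.0
    return 4.0 * inside.mean()

m = 12                                                 # 2**12 = 4096 samples
rng = np.random.default_rng(0)
pi_mc = estimate_pi(rng.random((2 ** m, 2)))           # pseudo-random sampling
pi_qmc = estimate_pi(qmc.Sobol(d=2, scramble=False).random_base2(m))  # Sobol

print(f"MC: {pi_mc:.4f}  QMC: {pi_qmc:.4f}  (target {np.pi:.4f})")
```

The better space-filling of the Sobol points typically yields a visibly smaller error at the same sample count, which is the behavior the study exploits for photon bundles.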
Procedia PDF Downloads 2232685 Development of IDF Curves for Precipitation in Western Watershed of Guwahati, Assam
Authors: Rajarshi Sharma, Rashidul Alam, Visavino Seleyi, Yuvila Sangtam
Abstract:
The Intensity-Duration-Frequency (IDF) relationship of rainfall amounts is one of the most commonly used tools in water resources engineering for the planning, design, and operation of water resources projects, or for protecting various engineering projects against design floods. The establishment of such relationships was reported as early as 1932 (Bernard). Since then, many sets of relationships have been constructed for several parts of the globe. The objective of this research is to derive the IDF relationship of rainfall for the western watershed of Guwahati, Assam. These relationships are useful in the design of urban drainage works, e.g. storm sewers, culverts, and other hydraulic structures. In the study, rainfall depth data for 10 years, viz. 2001 to 2010, were collected from the Regional Meteorological Centre, Borjhar, Guwahati. Firstly, the data were used to construct mass curves for rainfall durations of more than 7 hours, to calculate the maximum intensity and to form the intensity-duration curves. Gumbel’s frequency analysis technique was used to calculate the probable maximum rainfall intensities for return periods of 2 yr, 5 yr, 10 yr, 50 yr, and 100 yr from the maximum intensity. Finally, regression analysis was used to develop the intensity-duration-frequency (IDF) curve, from which the values of the constants ‘a’, ‘b’, and ‘c’ were found. The value of ‘a’ for which the sum of squared deviations is minimum was found to be 40, with the corresponding values of ‘c’ and ‘b’ being 0.744 and 1981.527, respectively. The results showed that in all cases the correlation coefficient is very high, indicating the goodness of fit of the formulae for estimating IDF curves in the region of interest.Keywords: intensity-duration-frequency relationship, mass curve, regression analysis, correlation coefficient
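The two computational steps in the abstract, Gumbel frequency analysis followed by a least-squares fit of an IDF formula of the assumed form i = a/(d + b)^c, can be sketched as follows. The rainfall values are synthetic, not the Borjhar record.

```python
# Gumbel frequency analysis for a return-period intensity, then a curve fit
# of the IDF formula i = a / (d + b)**c to duration-intensity pairs.
import numpy as np
from scipy.optimize import curve_fit

def gumbel_K(T):
    """Gumbel frequency factor for return period T (annual-maximum series)."""
    return -(np.sqrt(6) / np.pi) * (0.5772 + np.log(np.log(T / (T - 1))))

annual_max = np.array([62., 75., 58., 80., 71., 66., 90., 55., 77., 69.])  # mm/h
mean, std = annual_max.mean(), annual_max.std(ddof=1)
i_100yr = mean + gumbel_K(100) * std      # 100-year design intensity

def idf(d, a, b, c):
    return a / (d + b) ** c

durations = np.array([0.25, 0.5, 1., 2., 4., 7.])   # hours
intensity = idf(durations, 40., 0.3, 0.9)           # synthetic "observed" points
popt, _ = curve_fit(idf, durations, intensity, p0=[30., 0.5, 1.])
print(round(i_100yr, 1), popt.round(3))
```

With real data the fit would be repeated for each return period, giving one IDF curve per frequency.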
Procedia PDF Downloads 2442684 Dissocial Personality in Adolescents
Authors: Tsirekidze M., Aprasidze T.
Abstract:
Introduction: The problem of dissocial behavior is at the heart of the social sciences and psychiatry; however, it should be noted that its psychiatric aspect is little studied, and some issues remain controversial. This is complicated by the diversity of terminological concepts used to define “dissocial behavior”: “behavioral disorder”, “abnormal behavior”, “deviant behavior”, “delinquent behavior”, etc. The literature offers no comprehensive definition of the essence of dissocial behavior. Numerous attempts to systematize dissocial disorders must also be considered unsatisfactory, which is primarily related to the lack of solid criteria for defining this group of disorders. According to the clinical classification, dissocial behavior is divided into psychotic and non-psychotic forms. Such differentiation is conditional in nature, since it is not always possible to draw precise, clear distinctions between these forms; in addition, behavior disorders have transitional, so-called intermediate forms. One group of authors distinguishes two main forms of deviant behavior, in terms of both theoretical and practical significance: non-pathological and pathological. In recent years, the non-pathological form of behavior disorder in particular has become topical. It refers to a large group of forms of deviant behavior whose emergence is associated with psychologically full-fledged reactions of children and adolescents to stressful situations and extreme conditions. The difficulty is understandable: it is hard to draw a line between psychologically understandable reactions and psychogenically induced reactive states. In addition, the concept of a "normal" child and adolescent is, to some extent, a vague one, as is any definition of the norm in medicine.
From a practical (more precisely, pragmatic) point of view, the term "abnormal behavioral disorder" undoubtedly makes sense, especially for the purposes of forensic psychiatric examination. Non-pathological deviation mainly includes transient situational reactions, microsocial-pedagogical backwardness, and character accentuation. In our sample, deviant behavior was predominantly manifested in a non-pathological form, which, in our opinion, is due to the difficult socio-economic situation of the country, moral-ethical deprivation, and expressed frustration. Society itself is an indicator of deviation. Added to this are complicating factors such as microsocial-pedagogical backwardness, an unfavorable family environment, and parenting defects. The connection of such deviation with the structural features of the adolescent's personality is also considered. Aim: The topic of our discussion is dissocial behavior of the non-psychotic register. Methods: We surveyed 120 adolescents with deviant behaviors; 61% of them were diagnosed with various neuropsychiatric disorders. Results: Pathological forms of deviant behavior were observed in 13%, and non-pathological forms in 69%. A combination of non-pathological and pathological forms was present in 10% of cases. In cases of non-pathological deviation, microsocial-pedagogical backwardness was revealed in 62% and character accentuation in 22%; among the pathological forms, pathological reactions were observed in 21% and abnormal personality formation in 21%. Conclusion: It should be emphasized that, in the case of any of the above defects, if family psychotherapy and medical-pedagogical habilitation measures are provided for the adolescent, it is quite possible to prevent the abnormal development of the child's personality, correct his character, regulate behavior, and develop positive labor and social relations.Keywords: dissocial personality, deviant behavior, dissocial behavior, delinquent behavior
Procedia PDF Downloads 2202683 Joint Modeling of Bottle Use, Daily Milk Intake from Bottles, and Daily Energy Intake in Toddlers
Authors: Yungtai Lo
Abstract:
The current study follows an educational intervention on bottle-weaning to simultaneously evaluate the effect of the intervention on reducing bottle use, daily milk intake from bottles, and daily energy intake in toddlers aged 11 to 13 months. A shared parameter model and a random effects model are used to jointly model bottle use, daily milk intake from bottles, and daily energy intake. We show in the two joint models that the bottle-weaning intervention promotes bottle-weaning, reduces daily milk intake from bottles in toddlers not yet off bottles, and reduces daily energy intake. We also show that the odds of drinking from a bottle were positively associated with the amount of milk intake from bottles, and that increased daily milk intake from bottles was associated with increased daily energy intake. The effect of bottle use on daily energy intake operates through its effect on increasing daily milk intake from bottles, which in turn increases daily energy intake.Keywords: two-part model, semi-continuous variable, joint model, gamma regression, shared parameter model, random effects model
Procedia PDF Downloads 2872682 Influence of the Paint Coating Thickness in Digital Image Correlation Experiments
Authors: Jesús A. Pérez, Sam Coppieters, Dimitri Debruyne
Abstract:
In the past decade, the use of digital image correlation (DIC) techniques has increased significantly in the area of experimental mechanics, especially for materials behavior characterization. This non-contact tool enables full field displacement and strain measurements over a complete region of interest. The DIC algorithm requires a random contrast pattern on the surface of the specimen in order to perform properly. To create this pattern, the specimen is usually first coated using a white matt paint. Next, a black random speckle pattern is applied using any suitable method. If the applied paint coating is too thick, its top surface may not be able to exactly follow the deformation of the specimen, and consequently, the strain measurement might be underestimated. In the present article, a study of the influence of the paint thickness on the strain underestimation is performed for different strain levels. The results are then compared to typical paint coating thicknesses applied by experienced DIC users. A slight strain underestimation was observed for paint coatings thicker than about 30μm. On the other hand, this value was found to be uncommonly high compared to coating thicknesses applied by DIC users.Keywords: digital image correlation, paint coating thickness, strain
Procedia PDF Downloads 5152681 Stress Recovery and Durability Prediction of a Vehicular Structure with Random Road Dynamic Simulation
Authors: Jia-Shiun Chen, Quoc-Viet Huynh
Abstract:
This work develops a flexible-body dynamic model of an all-terrain vehicle (ATV), capable of recovering dynamic stresses while the ATV travels on random bumpy roads. The fatigue life of components is forecasted as well. By considering the interaction between dynamic forces and structural deformation, the proposed model achieves highly accurate structural stress prediction and fatigue life prediction. During the simulation, the stress time history of the ATV structure is retrieved for life prediction. Finally, the hot spots of the ATV frame are located, and the frame life under combined road conditions is forecasted as 25833.6 hr. If the vehicle is used eight hours daily, the total vehicle frame life is 8.847 years. Moreover, the reaction forces and deformation due to the dynamic motion can be described more accurately by flexible-body dynamics than by rigid-body dynamics. By supporting recommendations in the product design stage, before mass production, the proposed model can significantly lower development and testing costs.Keywords: flexible-body dynamics, vehicle dynamics, fatigue, durability
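The life-prediction step, from counted stress cycles to a frame life in hours and years, can be illustrated with a Basquin S-N curve and the Palmgren-Miner rule, a standard substitute for whatever damage model the authors actually used. Every number below (S-N constants, cycle counts) is invented for illustration.

```python
# Palmgren-Miner damage accumulation from cycle counts per stress amplitude.
import numpy as np

# Basquin S-N curve: N_f = C * S**(-m)  (cycles to failure at amplitude S, MPa)
C, m = 1e12, 3.0
def cycles_to_failure(S):
    return C * S ** (-m)

# Hypothetical cycle counts per operating hour at each amplitude
# (in practice these come from rainflow counting of the stress time history)
amplitudes = np.array([40., 80., 120.])        # MPa
counts_per_hour = np.array([500., 60., 5.])

damage_per_hour = np.sum(counts_per_hour / cycles_to_failure(amplitudes))
life_hours = 1.0 / damage_per_hour             # life when cumulative damage = 1
life_years = life_hours / (8 * 365)            # eight hours of use per day
print(round(life_hours, 1), round(life_years, 2))
```

The final division by 8 hours/day and 365 days/year is the same conversion the abstract applies to reach 8.847 years from 25833.6 hr.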
Procedia PDF Downloads 3942680 Relationship Between In-Service Training and Employees’ Feeling of Psychological Ownership
Authors: Mahsa Kallhor Mohammadi, Hamideh Reshadatjoo
Abstract:
This study examined the relationship between in-service training and employees’ feeling of psychological ownership. The research applied a descriptive survey design that investigated the correlations between variables. The target population was the 140 employees of a drilling fluid and waste management service company, and the sample was 123 employees who were selected randomly and encouraged to complete an electronic questionnaire designed on the basis of standard questionnaires for the research variables, covering 62 questions. The face validity of the questionnaire was supported by an experimental test, and its content validity was approved by the thesis supervisor and consulting advisor. For descriptive statistics, frequency tables and diagrams, measures of central tendency (mode, median, and mean), and measures of variability (variance, standard deviation, and quartile deviation) were used. For inferential statistics, the Pearson correlation coefficient was used to verify the relationships between the research variables. According to the results, all of the research hypotheses were supported. According to hypothesis 1, there was a positive and significant relationship between training policy-making and employees’ psychological ownership (r=0.408, α=0.05). According to hypothesis 2, there was a positive and significant relationship between training planning and employees’ psychological ownership (r=0.446, α=0.05). According to hypothesis 3, there was a positive and significant relationship between providing the training and employees’ psychological ownership (r=0.512, α=0.05). According to hypothesis 4, there was a positive and significant relationship between training performance management and employees’ psychological ownership (r=0.462, α=0.05). According to hypothesis 5, there was a positive and significant relationship between employees’ motivation and psychological ownership (r=0.694, α=0.05).
Therefore, through systematic in-service training that is aligned with the strategic goals of an organization and is based on scientific needs analysis, design, implementation, and evaluation, it is possible to improve employees’ sense of psychological ownership toward the organization.Keywords: in-service training, motivation, organizational behavior, psychological ownership
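The inferential step, a Pearson correlation checked at the 5% level, can be sketched on simulated scores; the correlation is built in at roughly the magnitude reported for hypothesis 5, but the data are not the study's.

```python
# Pearson correlation between two simulated questionnaire scores, with p-value.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(7)
n = 123                                    # sample size matching the study
motivation = rng.normal(50, 10, n)
ownership = 0.7 * motivation + rng.normal(0, 7, n)   # built-in positive link

r, p = pearsonr(motivation, ownership)
significant = p < 0.05                     # test at the 5% level used in the study
print(round(r, 3), significant)
```

The same `pearsonr` call, applied per hypothesis to the relevant questionnaire subscales, would reproduce the r values and significance decisions quoted above.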
Procedia PDF Downloads 612679 Analysis of Policy Issues on Computer-Based Testing in Nigeria
Authors: Samuel Oye Bandele
Abstract:
A policy is a system of principles that guides the activities and strategic decisions of an organisation in order to achieve stated objectives and meet expected outcomes. A Computer-Based Testing (CBT) policy is therefore a statement of intent to drive CBT programmes, and should be implemented as a procedure or protocol. Policies are generally adopted by an organisation or a nation. The concern in this paper is the consideration and analysis of issues that are significant to evolving an acceptable policy to drive the new CBT innovation in Nigeria. Public examinations and internal examinations in higher educational institutions in Nigeria are gradually making a radical shift from paper-based (paper-pencil) testing to computer-based testing. The need for an objective and empirical analysis of policy issues relating to CBT thus became expedient. The following are some of the issues in the evolution of CBT in Nigeria that were identified as requiring policy backing. Prominent among them are requirements for establishing CBT centres, the purpose of CBT, types and acquisition of CBT equipment, qualifications of staff (professional, technical, and regular), and security plans and the curbing of cheating during examinations, among others. A descriptive research design was employed, based on a population consisting of principal officers (policymakers), staff (teaching and non-teaching; policy implementers), CBT staff (technical and professional; policy support), and candidates (internal and external). A fifty-item researcher-constructed questionnaire on policy issues was employed to collect data from 600 subjects drawn from higher institutions in South West Nigeria, using purposive and stratified random sampling techniques. Data collected were analysed using descriptive (frequency counts, means, and standard deviations) and inferential (t-test, ANOVA, regression, and factor analysis) techniques.
Findings from this study showed, among other things, that the factor loadings had significant weights on the organisational and national policy issues on CBT innovation in Nigeria.Keywords: computer-based testing, examination, innovation, paper-based testing, paper pencil based testing, policy issues
Procedia PDF Downloads 2482678 Modeling of Particle Reduction and Volatile Compounds Profile during Chocolate Conching by Electronic Nose and Genetic Programming (GP) Based System
Authors: Juzhong Tan, William Kerr
Abstract:
Conching is a critical procedure in chocolate processing, in which special flavors develop and the smooth mouthfeel of the chocolate is achieved through particle size reduction of the cocoa mass and other additives. Therefore, determination of the particle size and the volatile compound profile of the cocoa mass is important for chocolate manufacturers to ensure the quality of chocolate products. Currently, precise particle size measurement is usually done by laser scattering, which is expensive and inaccessible to small and medium-sized chocolate manufacturers. Other alternatives, such as micrometers and microscopy, provide poor measurements and little information. Volatile compound analysis of cocoa during conching has similar problems due to its high cost and limited accessibility. In this study, a self-made electronic nose system consisting of gas sensors (TGS 800 and 2000 series) was inserted into a conching machine and used to monitor the volatile compound profile of chocolate during conching. A model correlating volatile compound profiles, along with factors including the content of cocoa, the content of sugar, and the temperature during conching, to the particle size of chocolate particles was established by genetic programming. The model was used to predict the particle size reduction of chocolates with different cocoa mass to sugar ratios (1:2, 1:1, 1.5:1, 2:1) at eight conching times (15 min, 30 min, 1 h, 1.5 h, 2 h, 4 h, 8 h, and 24 h), and the predictions were compared to laser scattering measurements of the same chocolate samples. 91.3% of the predictions were within ±5% of the laser scattering measurements, and 99.3% were within ±10%.Keywords: cocoa bean, conching, electronic nose, genetic programming
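The validation arithmetic at the end of the abstract, the share of predictions falling within ±5% and ±10% of the laser-scattering reference, reduces to a few lines; the arrays below are toy values, not the study's measurements.

```python
# Fraction of model predictions within +/-5% and +/-10% of reference values.
import numpy as np

laser = np.array([25.0, 22.0, 18.5, 15.0, 12.0, 10.5, 9.0, 8.0])       # reference
predicted = np.array([25.8, 21.2, 19.0, 14.4, 12.9, 10.3, 9.6, 8.1])   # GP model

rel_err = np.abs(predicted - laser) / laser
within_5 = np.mean(rel_err <= 0.05)
within_10 = np.mean(rel_err <= 0.10)
print(f"within +/-5%: {within_5:.1%}, within +/-10%: {within_10:.1%}")
```

Applied to the study's 32 chocolate/time combinations, this is the computation behind the reported 91.3% and 99.3% figures.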
Procedia PDF Downloads 255