Search results for: common vector approach
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 18644

18224 Behavior of Common Wheat under the Influence of Treated Waste Water

Authors: Chiahi Nadia

Abstract:

The aim of our work is to monitor the morpho-physiological and agronomic behavior of common wheat under the influence of treated wastewater. Physico-chemical analyses of the treated sewage were carried out, and our trials were conducted on two varieties of common wheat (Triticum aestivum L.), HD1220 and ARZ. For this, seedlings were grown under two different irrigation regimes, one using treated wastewater from the Sedrata WWTP (Wilaya of Souk Ahras, Algeria) and the other using stormwater as a control. The trials focused on soil and wheat parameters, and based on our results, the development, physiological, and yield parameters of common wheat appear to respond favorably to the use of these waters.

Keywords: common wheat (Triticum aestivum L.), purified wastewater, irrigation, morpho-physiological and agronomic parameters

Procedia PDF Downloads 46
18223 The Biosphere as a Supercomputer Directing and Controlling Evolutionary Processes

Authors: Igor A. Krichtafovitch

Abstract:

The evolutionary processes are not linear. Long periods of quiet, slow development turn into rather rapid emergences of new species and even phyla. During the Cambrian explosion, 22 new phyla were added to the 3 previously existing phyla. Contrary to common belief, natural selection, or survival of the fittest, cannot account for the dominant evolutionary vector, which is the steady and accelerating advent of more complex and more intelligent living organisms. Neither Darwinism nor alternative concepts, including panspermia and intelligent design, propose a satisfactory explanation for these phenomena. The proposed hypothesis offers a logical and plausible explanation of the evolutionary processes in general. It is based on two postulates: a) the Biosphere is a single living organism, all parts of which are interconnected, and b) the Biosphere acts as a giant biological supercomputer, storing and processing information in digital and analog forms. Such a supercomputer surpasses all human-made computers by many orders of magnitude. Living organisms are the product of the intelligent creative action of the biosphere supercomputer. Biological evolution is driven by the growing amount of information stored in living organisms and the increasing complexity of the biosphere as a single organism. The main evolutionary vector is not survival of the fittest but an accelerated growth of the computational complexity of living organisms. The following postulates summarize the proposed hypothesis: biological evolution as a natural origin and development of life is a reality. Evolution is a coordinated and controlled process. One of evolution's main development vectors is the growing computational complexity of living organisms and the biosphere's intelligence. The intelligent matter which conducts and controls global evolution is a gigantic bio-computer combining all living organisms on Earth. The information acts like software stored in and controlled by the biosphere. Random mutations trigger this software, as stipulated by Darwinian evolutionary theory, and it is further stimulated by the growing demand for the Biosphere's global memory storage and computational complexity. Greater memory volume requires a greater number of more intellectually advanced organisms for storing and handling it. More intricate organisms require greater computational complexity of the biosphere in order to keep control over the living world. This is an endless recursive endeavor with accelerating evolutionary dynamics. New species emerge when two conditions are met: a) crucial environmental changes occur and/or global memory storage volume reaches its limit, and b) biosphere computational complexity reaches a critical mass capable of producing more advanced creatures. The hypothesis presented here is a naturalistic concept of life's creation and evolution. It logically resolves many puzzling problems of current evolutionary theory: speciation, as a result of GM purposeful design; the vector of evolutionary development, as a need for growing global intelligence; punctuated equilibrium, occurring when the two conditions a) and b) above are met; the Cambrian explosion; and mass extinctions, occurring when more intelligent species should replace outdated creatures.

Keywords: supercomputer, biological evolution, Darwinism, speciation

Procedia PDF Downloads 140
18222 Construct the Fur Input Mixed Model with Activity-Based Benefit Assessment Approach of Leather Industry

Authors: M. F. Wu, F. T. Cheng

Abstract:

The leather industry is an important traditional industry that has provided leather products to the world for thousands of years. The fiercely competitive global environment and the common commitment to global carbon reduction have caused livestock supply quantities to fall, reduced the supply of salted and wet-blue leather material, and caused prices to skyrocket significantly. Exchange rate fluctuations have decreased sales revenue owing to differences in export exchange rates, compressing the overall profitability of the leather industry. This paper applies an activity-based benefit assessment approach to build a fitting fur input mixed model, where the fur input is wet blue, concerned with four key factors: the output rate of wet blue, the unit cost of wet blue, the yield rate, and the grade level of wet blue, in order to achieve a low-cost strategy under a given unit price of the company's leather product. The research findings indicate that applying this model may improve the input cost structure, decrease leather product inventories, and raise the competitive advantages of the enterprise in the future.

Keywords: activity-based benefit assessment approach, input mixed, output rate, wet blue

Procedia PDF Downloads 353
18221 A Medical Resource Forecasting Model for Emergency Room Patients with Acute Hepatitis

Authors: R. J. Kuo, W. C. Cheng, W. C. Lien, T. J. Yang

Abstract:

Taiwan is a hyperendemic area for the hepatitis B virus (HBV). The estimated number of HBsAg carriers in the general population over 20 years old is more than 3 million. Therefore, a case record review was conducted from January 2003 to June 2007 for all patients with a diagnosis of acute hepatitis admitted to the Emergency Department (ED) of a well-known teaching hospital. The cost of medical resource use is defined as the total medical fee. In this study, principal component analysis (PCA) is first employed to reduce the number of dimensions. Support vector regression (SVR) and an artificial neural network (ANN) are then used to develop the forecasting model. A total of 117 patients met the inclusion criteria, 61% of whom were hepatitis B related. The computational results show that the proposed PCA-SVR model outperforms the other compared algorithms. In conclusion, the Child-Pugh score and echogram can both be used to predict the cost of medical resources for patients with acute hepatitis in the ED.
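
A minimal sketch of such a PCA-SVR pipeline, written with scikit-learn; the random placeholder data standing in for the 117 patient records, the feature count, the number of principal components, and the train/test split are illustrative assumptions, not details from the paper.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.metrics import mean_absolute_percentage_error
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

# Placeholder for the 117 patients: clinical features (e.g., Child-Pugh score,
# echogram findings) and the total medical fee as the regression target.
rng = np.random.default_rng(0)
X, y = rng.random((117, 20)), rng.random(117) * 1e5

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

model = make_pipeline(
    StandardScaler(),           # scale features so none dominates the PCA
    PCA(n_components=5),        # dimensionality reduction (component count is an assumption)
    SVR(kernel="rbf", C=10.0),  # support vector regression on the principal components
)
model.fit(X_train, y_train)
print("MAPE:", mean_absolute_percentage_error(y_test, model.predict(X_test)))
```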

Keywords: acute hepatitis, medical resource cost, artificial neural network, support vector regression

Procedia PDF Downloads 405
18220 Applying Lean Six Sigma in an Emergency Department of a Private Hospital

Authors: Sarah Al-Lumai, Fatima Al-Attar, Nour Jamal, Badria Al-Dabbous, Manal Abdulla

Abstract:

Today, many commonly used industrial engineering tools and techniques are being used in hospitals around the world with the goal of producing a more efficient and effective healthcare system. A common quality improvement methodology known as Lean Six Sigma has been successful in manufacturing industries and, more recently, in healthcare. The objective of our project is to use the Lean Six Sigma methodology to reduce waiting time in the Emergency Department (ED) of a local private hospital. A comprehensive literature review was conducted to evaluate the success of Lean Six Sigma in the ED. According to a study conducted at Ibn Sina Hospital in Morocco, the most common problem patients complain about is waiting time. To ensure patient satisfaction, hospitals such as North Shore University Hospital were able to reduce waiting time by up to 37% using Lean Six Sigma. Other hospitals, such as the Johns Hopkins medical center, used Lean Six Sigma successfully to enhance overall patient flow, which ultimately decreased waiting time. Furthermore, it was found that capacity constraints, such as staff shortages and lack of beds, were among the main reasons behind long waiting times. With the use of Lean Six Sigma and bed management, hospitals like Memorial Hermann Southwest Hospital were able to reduce patient delays. Moreover, in order to implement Lean Six Sigma successfully in our project, two common methodologies were considered: DMAIC and DMADV. After assessing both, DMAIC was found to be the more suitable approach for our project because it is concerned with improving an already existing process. For all its successes, Lean Six Sigma has its limitations, especially in healthcare, but these can be minimized if properly approached.

Keywords: lean six sigma, DMAIC, hospital, methodology

Procedia PDF Downloads 471
18219 Multisymplectic Geometry and Noether Symmetries for the Field Theories and the Relativistic Mechanics

Authors: H. Loumi-Fergane, A. Belaidi

Abstract:

The problem of symmetries in field theory has been analyzed using geometric frameworks, such as multisymplectic models, in particular via the multivector field formalism. In this paper, we expand the vector fields associated with infinitesimal symmetries which give rise to invariant quantities, as Noether currents, for classical field theories and relativistic mechanics using multisymplectic geometry, where the Poincaré-Cartan form is greatly simplified by using the Second Order Partial Differential Equation (SOPDE) condition for multivector fields verifying the Euler equations. These symmetries have been classified naturally according to the construction of the fiber bundle used. In this work, unlike other works using the analytical method, our geometric model allows us, firstly, to distinguish the angular momenta of the gauge field obtained under different transformations, whereas elsewhere these momenta are gathered in a single expression obtained during a rotation in Minkowski space. Secondly, with no conditions imposed on the Lagrangian of the mechanics with respect to its dependence on time and on the coordinates q_i, the currents obtained naturally from the transformations are, respectively, the energy and the momentum of the system.
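
For context, the textbook Noether current that such constructions generalize can be written as follows; this is a standard result, not the paper's multisymplectic derivation. For time translations the conserved quantity is the energy, and for spatial translations it is the momentum, matching the currents named above.

```latex
% Classical Noether current for a Lagrangian density \mathcal{L}(\phi, \partial_\mu \phi)
% invariant under the infinitesimal transformation \phi \to \phi + \epsilon\,\delta\phi:
j^{\mu} = \frac{\partial \mathcal{L}}{\partial(\partial_{\mu}\phi)}\,\delta\phi,
\qquad
\partial_{\mu} j^{\mu} = 0 \quad \text{on solutions of the Euler--Lagrange equations.}
```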

Keywords: conservation laws, field theories, multisymplectic geometry, relativistic mechanics

Procedia PDF Downloads 184
18218 Flywheel Energy Storage Control Using SVPWM for Small Satellites Application

Authors: Noha El-Gohary, Thanaa El-Shater, A. A. Mahfouz, M. M. Sakr

Abstract:

High power conversion efficiency and long lifetime are important goals when designing a power supply subsystem for satellite applications. To fulfill these goals, this paper presents a power supply subsystem for small satellites in which a flywheel energy storage system is used as a secondary power source instead of a chemical battery. The model of the flywheel energy storage system is introduced, along with a DC bus regulation control algorithm for charging and discharging the flywheel based on the space vector pulse width modulation technique and motor current control. Simulation results show the operation of the flywheel in charging and discharging modes during the illumination and shadow periods. The advantages of the proposed system are confirmed by the simulation results of the power supply system.
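
As a minimal sketch of the space vector pulse width modulation technique named above (the standard two-level SVPWM timing calculation, not the paper's satellite-specific controller), the dwell times of the two adjacent active vectors and the zero vector over one switching period can be computed as follows; the voltage and period values are illustrative.

```python
import math

def svpwm_times(v_ref, theta, v_dc, t_s):
    """Return (t1, t2, t0) dwell times over one switching period t_s for a
    reference vector of magnitude v_ref at angle theta (rad), DC bus v_dc."""
    sector = int(theta // (math.pi / 3)) + 1        # active sector 1..6
    alpha = theta - (sector - 1) * math.pi / 3      # angle measured within the sector
    m = math.sqrt(3) * v_ref / v_dc                 # modulation index
    t1 = m * t_s * math.sin(math.pi / 3 - alpha)    # time on the first adjacent vector
    t2 = m * t_s * math.sin(alpha)                  # time on the second adjacent vector
    t0 = t_s - t1 - t2                              # remaining time on the zero vectors
    return t1, t2, t0

# Illustrative operating point: 150 V reference at 40 degrees, 300 V bus, 10 kHz switching.
print(svpwm_times(v_ref=150.0, theta=math.radians(40), v_dc=300.0, t_s=1e-4))
```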

Keywords: small satellites, flywheel energy storage system, space vector pulse width modulation, power conversion

Procedia PDF Downloads 377
18217 Optimal Feature Extraction Dimension in Finger Vein Recognition Using Kernel Principal Component Analysis

Authors: Amir Hajian, Sepehr Damavandinejadmonfared

Abstract:

In this paper, the issue of dimensionality reduction in finger vein recognition systems is investigated using kernel Principal Component Analysis (KPCA). One aspect of KPCA is finding the most appropriate kernel function for finger vein recognition, as there are several kernel functions which can be used within PCA-based algorithms. In this paper, however, another side of PCA-based algorithms, particularly KPCA, is investigated: the dimension of the feature vector, which is important especially in real-world applications of such algorithms. A fixed dimension of the feature vector has to be set to reduce the dimension of the input and output data and extract the features from them; a classifier is then applied to classify the data and make the final decision. We analyze KPCA (with polynomial, Gaussian, and Laplacian kernels) in detail and investigate the optimal feature extraction dimension in finger vein recognition using KPCA.
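
A minimal sketch of sweeping the KPCA feature dimension in the way the paper investigates, using scikit-learn; the random data standing in for finger vein images, the candidate dimensions, and the k-NN classifier are illustrative assumptions (the Laplacian kernel is not a built-in KernelPCA option and would be supplied as a precomputed Gram matrix).

```python
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X = rng.random((200, 1024))        # placeholder for flattened finger vein images
y = rng.integers(0, 20, size=200)  # placeholder subject labels

for kernel in ("poly", "rbf"):     # polynomial and Gaussian kernels from the paper
    for dim in (10, 20, 40, 80):   # candidate feature-vector dimensions
        feats = KernelPCA(n_components=dim, kernel=kernel).fit_transform(X)
        acc = cross_val_score(KNeighborsClassifier(), feats, y, cv=5).mean()
        print(f"kernel={kernel} dim={dim} accuracy={acc:.3f}")
```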

Keywords: biometrics, finger vein recognition, principal component analysis (PCA), kernel principal component analysis (KPCA)

Procedia PDF Downloads 342
18216 Proposal of a Virtual Reality Dynamism Augmentation Method for Sports Spectating

Authors: Hertzog Clara, Sakurai Sho, Hirota Koichi, Nojima Takuya

Abstract:

It is common to see graphics appearing on television while watching a sports game to provide information, but it is less common to see graphics specifically aiming to boost spectators' dynamism perception. It is even less common to see such graphics designed especially for virtual reality (VR). However, it appears that even with simple dynamic graphics, it would be possible to improve VR sports spectators' experience. So, in this research, we explain how graphics can be used in VR to improve the dynamism of a broadcast sports game, and we provide a simple example. This example consists of a white halo displayed around the video that blinks according to the game speed. We hope to increase people's awareness of VR sports spectating and the possibilities this display offers through dynamic graphics.

Keywords: broadcasting, graphics, sports spectating, virtual reality

Procedia PDF Downloads 65
18215 A Study Regarding Nanotechnologies as a Vector of New European Business Model

Authors: Adriana Radan Ungureanu

Abstract:

The industrial landscape is changing due to the financial crises, poor availability of raw materials, new discoveries, and interdisciplinary collaborations. New ideas shape the change through technologies and bring responses for a better life. The process of change is led by big players like states and companies, but they cannot keep their places on the market without the help of the small ones. The main tool of change is technology, and the entire developed world has dedicated efforts in this direction for decades. Even though expectations are not yet met, the search for adequate solutions is far from over. A relevant example is nanotechnology, where most discoveries still remain in the laboratory and have not yet found their way to the market. Faced with this situation, the right question could be: 'Is it worth investing in nanotechnology in the name of an uncertain future but with very little impact on the present?' This paper tries to find a positive answer from a three-dimensional approach, using a descriptive analysis based on available databases supplied by European case studies, reports, and literature.

Keywords: Europe, KETs, nanotechnology, technology

Procedia PDF Downloads 392
18214 An Approach for the Assessment of Semi-Elliptical Surface Crack

Authors: Muhammad Naweed, Usman Tariq Murtaza, Waseem Siddique

Abstract:

The pallet body approach is a finite element-based computational approach used for modeling and assessing a three-dimensional surface crack. The approach can insert the crack into an engineering structure, generate a high-quality hexahedral mesh in the cracked region, and compute the stress intensity factors along a semi-elliptical surface crack numerically. The objective of this work is to show that the stress intensity factors produced by the approach can be used with confidence for capturing the parameters governing fatigue crack growth.
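
To illustrate how stress intensity factors feed a fatigue crack growth calculation, here is a minimal Paris-law integration sketch; the material constants C and m, the geometry factor Y, and the loading values are illustrative assumptions, not values from the paper.

```python
import math

# Paris law: da/dN = C * (dK)^m, with dK = Y * d_sigma * sqrt(pi * a).
C, m = 1e-11, 3.0        # illustrative material constants (SI-consistent units)
Y = 0.73                 # illustrative geometry factor for a semi-elliptical surface crack
d_sigma = 120.0          # applied stress range, MPa
a, a_crit = 0.001, 0.01  # initial and final crack depths, m

cycles = 0
while a < a_crit:
    dK = Y * d_sigma * math.sqrt(math.pi * a)  # stress intensity factor range, MPa*sqrt(m)
    a += C * dK**m                             # crack extension over one load cycle
    cycles += 1

print(f"cycles to grow from 1 mm to 10 mm: {cycles}")
```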

Keywords: pallet body approach, semi-elliptical surface crack, stress intensity factors, fatigue crack growth

Procedia PDF Downloads 81
18213 First Survey of Seasonal Abundance and Daily Activity of Stomoxys calcitrans: In Zaouiet Sousse, the Sahel Area of Tunisia

Authors: Amira Kalifa, Faïek Errouissi

Abstract:

The seasonal changes and daily activity of Stomoxys calcitrans (Diptera: Muscidae) were examined using Vavoua traps in a dairy cattle farm in Zaouiet Sousse, in the Sahel area of Tunisia, from May to October 2014. Over this period, a total of 4366 hematophagous Diptera were captured, and Stomoxys calcitrans was the most commonly trapped species (96.52%). Analysis of the seasonal activity showed that S. calcitrans is bivoltine, with two peaks: a significant peak recorded in May-June, during the dry season, and a second, much weaker peak at the end of October. This seasonal pattern appears to depend on climatic factors, particularly the temperature of the manure and that of the air. The activity pattern of S. calcitrans was diurnal, with seasonal variations: the daily rhythm shows a peak between 11:00 and 15:00 in May and between 11:00 and 17:00 in June. These vector flies are important pests of livestock in Tunisia, where they are known as mechanical vectors of several pathogens and have a considerable economic and health impact on livestock. A better knowledge of their ecology is a prerequisite for more efficient control measures.

Keywords: cattle farm, daily rhythm, Stomoxys calcitrans, seasonal activity

Procedia PDF Downloads 250
18212 Low-Voltage Multiphase Brushless DC Motor for Electric Vehicle Application

Authors: Mengesha Mamo Wogari

Abstract:

In this paper, a low-voltage multiphase brushless DC motor with square-wave air-gap flux distribution is proposed for electric vehicle application. A ten-phase, 5 kW motor has been designed and simulated by finite element methods, demonstrating the desired high torque capability at low speed and flux-weakening operation for high-speed operation. The motor torque is proportional to the number of phases for a constant phase current and air-gap flux. The concept of vector control and a simple space vector modulation technique are used in MATLAB to control the motor, demonstrating a simple switching pattern for the selected number of phases. The low-voltage DC bus and inverter AC output are desirable characteristics for avoiding electric shock in the vehicle, both accidentally and during abnormal conditions. The switching devices for the inverter have a low voltage rating and are cost-effective, though their number is equal to twice the number of phases.

Keywords: brushless DC motors, electric vehicle, finite element methods, low-voltage inverter, multiphase

Procedia PDF Downloads 126
18211 Numerical Investigation of the Electromagnetic Common Rail Injector Characteristics

Authors: Rafal Sochaczewski, Ksenia Siadkowska, Tytus Tulwin

Abstract:

The paper describes the modeling of a fuel injector for common rail systems. A one-dimensional model of a solenoid-valve-controlled injector with a Valve Covered Orifice (VCO) nozzle was modeled in AVL Hydsim. This model reproduces the dynamic phenomena that occur in the injector. The accuracy of the calibration, based on adjusting the parameters of the control valve and the nozzle needle lift, was verified by comparing the numerical results of the injector flow rate. Our model is capable of precisely simulating injector operating parameters in relation to injection time and fuel pressure in the fuel rail. As a result, characteristics of the injector flow rate and backflow were obtained.
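
A one-dimensional injector model of this kind typically treats the nozzle holes as orifices. As a minimal, generic sketch (not the AVL Hydsim model itself), the steady flow rate through the nozzle follows the Bernoulli orifice equation; the discharge coefficient, hole geometry, and pressures below are illustrative values.

```python
import math

def orifice_flow_rate(c_d, d_hole, n_holes, p_rail, p_cyl, rho):
    """Steady volumetric flow rate [m^3/s] through the nozzle orifices,
    from the Bernoulli orifice equation Q = Cd * A * sqrt(2 * dp / rho)."""
    area = n_holes * math.pi * (d_hole / 2.0) ** 2
    return c_d * area * math.sqrt(2.0 * (p_rail - p_cyl) / rho)

# Illustrative values: 8 holes of 0.15 mm, 160 MPa rail pressure,
# 5 MPa cylinder back-pressure, diesel density 830 kg/m^3, Cd = 0.7.
q = orifice_flow_rate(0.7, 0.15e-3, 8, 160e6, 5e6, 830.0)
print(f"flow rate: {q * 1e6:.1f} cm^3/s")
```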

Keywords: common rail, diesel engine, fuel injector, modeling

Procedia PDF Downloads 393
18210 High Resolution Satellite Imagery and Lidar Data for Object-Based Tree Species Classification in Quebec, Canada

Authors: Bilel Chalghaf, Mathieu Varin

Abstract:

Forest characterization in Quebec, Canada, is usually assessed based on photo-interpretation at the stand level. For species identification, this often results in a lack of precision. Very high spatial resolution imagery, such as DigitalGlobe, and Light Detection and Ranging (LiDAR) have the potential to overcome the limitations of aerial imagery. To date, few studies have used such data to map a large number of species at the tree level using machine learning techniques. The main objective of this study is to map 11 individual species of tall trees (> 17 m) at the tree level using an object-based approach in the broadleaf forest of Kenauk Nature, Quebec. For the individual tree crown segmentation, three canopy-height models (CHMs) from LiDAR data were assessed: 1) the original, 2) a filtered, and 3) a corrected model. The corrected CHM gave the best accuracy and was then coupled with imagery to refine tree species crown identification. When compared with photo-interpretation, 90% of the objects represented a single species. For modeling, 313 variables were derived from 16-band WorldView-3 imagery and LiDAR data, using radiance, reflectance, pixel, and object-based calculation techniques. Variable selection procedures were employed to reduce their number from 313 to 16, using only 11 bands to aid reproducibility. For classification, a global approach using all 11 species was compared to a semi-hierarchical hybrid classification approach at two levels: (1) tree type (broadleaf/conifer) and (2) individual broadleaf (five) and conifer (six) species. Five different model techniques were used: (1) support vector machine (SVM), (2) classification and regression tree (CART), (3) random forest (RF), (4) k-nearest neighbors (k-NN), and (5) linear discriminant analysis (LDA). Each model was tuned separately for all approaches and levels. For the global approach, the best model was the SVM using eight variables (overall accuracy (OA): 80%, Kappa: 0.77). With the semi-hierarchical hybrid approach, at the tree type level, the best model was the k-NN using six variables (OA: 100% and Kappa: 1.00). At the level of identifying broadleaf and conifer species, the best model was the SVM, with OAs of 80% and 97% and Kappa values of 0.74 and 0.97, respectively, using seven variables for both models. This paper demonstrates that a hybrid classification approach gives better results and that using 16-band WorldView-3 with LiDAR data leads to more precise predictions for tree segmentation and classification, especially when the number of tree species is large.
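
A minimal sketch of the kind of classifier comparison the study performs, written with scikit-learn; the random placeholder features stand in for the 16 selected WorldView-3/LiDAR variables per crown object, and the parameter settings are illustrative, not the study's tuned values.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = rng.random((500, 16))          # placeholder for 16 image/LiDAR variables per crown
y = rng.integers(0, 11, size=500)  # placeholder labels for the 11 species

models = {
    "SVM": SVC(kernel="rbf"),
    "CART": DecisionTreeClassifier(),
    "RF": RandomForestClassifier(n_estimators=500),
    "k-NN": KNeighborsClassifier(n_neighbors=5),
    "LDA": LinearDiscriminantAnalysis(),
}
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: overall accuracy = {acc:.2f}")
```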

Keywords: tree species, object-based, classification, multispectral, machine learning, WorldView-3, LiDAR

Procedia PDF Downloads 110
18209 ANN Based Simulation of PWM Scheme for Seven Phase Voltage Source Inverter Using MATLAB/Simulink

Authors: Mohammad Arif Khan

Abstract:

This paper analyzes and presents the development of an Artificial Neural Network based controller for space vector modulation (ANN-SVPWM) of a seven-phase voltage source inverter. First, the conventional method of producing a sinusoidal output voltage by utilizing six active and one zero space vector to synthesize the input reference is elaborated, and then the new PWM scheme, called Artificial Neural Network based PWM, is presented. The ANN-based controller has the advantage of very fast implementation and analysis of the algorithms, and it avoids the direct computation of trigonometric and non-linear functions. The ANN controller uses an individual training strategy with fixed-weight and supervised models. A computer simulation program has been developed using MATLAB/Simulink together with the Neural Network Toolbox for training the ANN controller. A comparison of the proposed scheme with the conventional scheme is presented based on various performance indices. Extensive simulation results are provided to validate the findings.

Keywords: space vector PWM, total harmonic distortion, seven-phase, voltage source inverter, multi-phase, artificial neural network

Procedia PDF Downloads 434
18208 Clinician's Perspective of Common Factors of Change in Family Therapy: A Cross-National Exploration

Authors: Hassan Karimi, Fred Piercy, Ruoxi Chen, Ana L. Jaramillo-Sierra, Wei-Ning Chang, Manjushree Palit, Catherine Martosudarmo, Angelito Antonio

Abstract:

Background: The two psychotherapy camps, the randomized clinical trials (RCTs) camp and the common factors model camp, have competitively claimed specific explanations for therapy effectiveness. Recently, scholars have called for empirical evidence to show the role of common factors in therapeutic outcomes in marriage and family therapy. Purpose: This cross-national study aims to explore how clinicians, across different nations and theoretical orientations, attribute the contribution of common factors to therapy outcome. Method: A brief common factors questionnaire (CFQ, with a Cronbach's alpha of 0.77) was developed and administered in seven nations. A series of statistical analyses (paired-samples t-tests, independent-samples t-tests, ANOVA) were conducted to compare clinicians' perceived contributions of total common factors versus model-specific factors, to compare each pair of common factors categories, and to compare clinicians from collectivistic nations with clinicians from an individualistic nation. Results: Clinicians across seven nations attributed 86% of therapeutic change to common factors versus 14% to model-specific factors. Clinicians attributed 34% of therapeutic change to client factors, 26% to therapist factors, 26% to relationship factors, and 14% to model-specific techniques. The ANOVA test indicated that each of the three categories of common factors (client 34%, therapist 26%, relationship 26%) contributed more to therapeutic outcome than model-specific factors (techniques 14%). Clinicians with a psychology degree attributed more contribution to model-specific factors than clinicians with MFT and counseling degrees, who attributed more contribution to client factors. Clinicians from collectivistic nations attributed larger contributions to therapist factors (M=28.96, SD=12.75) than the US clinicians (M=23.22, SD=7.73), while the US clinicians attributed a larger contribution to client factors (M=39.02, SD=15.04) than clinicians from the collectivistic nations (M=28.71, SD=15.74). Conclusion: The findings indicate that clinicians across the globe attributed more than two-thirds of therapeutic change to common factors, which underscores the importance of training in the common factors model in the field. Common factors, like model-specific factors, vary in their contribution to therapy outcome in relation to the specific client, therapist, problem, treatment model, and sociocultural context. Sociocultural expectations and norms should be considered as a context in which both common factors and model-specific factors function toward therapeutic goals. Clinicians need to foster cultural competency, specifically regarding the divergent ways that common factors can be activated due to specific sociocultural values.

Keywords: common factors, model-specific factors, cross-national survey, therapist cultural competency, enhancing therapist efficacy

Procedia PDF Downloads 268
18207 Advances of Image Processing in Precision Agriculture: Using Deep Learning Convolution Neural Network for Soil Nutrient Classification

Authors: Halimatu S. Abdullahi, Ray E. Sheriff, Fatima Mahieddine

Abstract:

Agriculture is essential to the continuous existence of human life, as humans directly depend on it for the production of food. The exponential rise in population calls for a rapid increase in food production, with the application of technology to reduce laborious work and maximize production. Technology can aid and improve agriculture in several ways, through pre-planning and post-harvest, by the use of computer vision technology through image processing to determine the soil nutrient composition, the right amount, right time, and right place for the application of farm input resources like fertilizers, herbicides, and water, as well as weed detection and early detection of pests and diseases. This is precision agriculture, which is thought to be the solution required to achieve our goals. There has been significant improvement in the areas of image processing and data processing, which has been a major challenge. A database of images is collected through remote sensing and analyzed, and a model is developed to determine the right treatment plans for different crop types and different regions. Features of images from vegetation need to be extracted, classified, segmented, and finally fed into the model. Different techniques have been applied in these processes, from the use of neural networks, support vector machines, and fuzzy logic approaches to, most recently, the most effective approach, generating excellent results with the deep learning approach of convolutional neural networks for image classification. A deep convolutional neural network is used to determine the soil nutrients required in a plantation for maximum production. The experimental results on the developed model yielded an average accuracy of 99.58%.
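
A minimal sketch of a convolutional neural network for classifying soil images into nutrient classes, in the spirit of the approach above; the input size, layer widths, and the four hypothetical nutrient classes are illustrative assumptions, not the paper's architecture.

```python
import tensorflow as tf

num_classes = 4  # hypothetical nutrient categories
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(128, 128, 3)),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),  # learn local texture features
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(num_classes, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()  # training would call model.fit on labeled soil images
```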

Keywords: convolution, feature extraction, image analysis, validation, precision agriculture

Procedia PDF Downloads 294
18206 On the PTC Thermistor Model with a Hyperbolic Tangent Electrical Conductivity

Authors: M. O. Durojaye, J. T. Agee

Abstract:

This paper is on the one-dimensional, positive temperature coefficient (PTC) thermistor model with a hyperbolic tangent function approximation for the electrical conductivity. The method of asymptotic expansion was adopted to obtain the steady-state solution, and the unsteady-state response was obtained using the method of lines (MOL), a well-established numerical technique. The approach is to reduce the partial differential equation to a vector system of ordinary differential equations and solve it numerically. Our analysis shows that the hyperbolic tangent approximation introduced is well suited for the electrical conductivity. The numerical solutions obtained also exhibit the correct physical characteristics of the thermistor and are in good agreement with the exact steady-state solutions.
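
A minimal method-of-lines sketch for a thermistor-type equation u_t = u_xx + sigma(u), where the electrical conductivity sigma drops smoothly through a hyperbolic tangent step; the conductivity parameters, source form, and boundary conditions are illustrative assumptions, not the paper's exact model.

```python
import numpy as np
from scipy.integrate import solve_ivp

n = 101
x = np.linspace(0.0, 1.0, n)
dx = x[1] - x[0]

def sigma(u, delta=0.05):
    # conductivity falling smoothly from 1 toward 0 around the critical temperature u = 1
    return 0.5 * (1.0 - np.tanh((u - 1.0) / delta))

def rhs(t, u):
    # spatial discretization turns the PDE into a vector system of ODEs
    du = np.empty_like(u)
    du[1:-1] = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dx**2 + sigma(u[1:-1])
    du[0] = du[-1] = 0.0  # fixed-temperature (Dirichlet) boundaries
    return du

sol = solve_ivp(rhs, (0.0, 2.0), np.zeros(n), method="BDF")
print("max temperature at t = 2:", sol.y[:, -1].max())
```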

Keywords: electrical conductivity, hyperbolic tangent function, PTC thermistor, method of lines

Procedia PDF Downloads 301
18205 Machine Learning Approach for Predicting Students’ Academic Performance and Study Strategies Based on Their Motivation

Authors: Fidelia A. Orji, Julita Vassileva

Abstract:

This research aims to develop machine learning models for predicting students' academic performance and study strategies, which could be generalized to all courses in higher education. The key learning attributes (intrinsic, extrinsic, autonomy, relatedness, competence, and self-esteem) used in building the models were chosen based on prior studies, which revealed that these attributes are essential in students' learning process. Previous studies revealed the individual effects of each of these attributes on students' learning progress, but few have investigated their combined effect in predicting student study strategy and academic performance to reduce the dropout rate. To bridge this gap, we used scikit-learn in Python to build five machine learning models (decision tree, k-nearest neighbours, random forest, linear/logistic regression, and support vector machine) for both regression and classification tasks. The models were trained, evaluated, and tested for accuracy using data on 924 university dentistry students collected by Chilean authors through a quantitative research design. A comparative analysis of the models revealed that the tree-based models, such as the random forest (with a prediction accuracy of 94.9%) and the decision tree, show the best results compared to the linear, support vector, and k-nearest neighbours models. The models built in this research can be used to predict student performance and study strategy so that appropriate interventions can be implemented to improve student learning progress. Thus, incorporating strategies that could improve diverse student learning attributes in the design of online educational systems may increase the likelihood of students continuing with their learning tasks as required. Moreover, the results show that the attributes can be modelled together and used to adapt and personalize the learning process.
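
A minimal sketch of the dual regression/classification setup described above, using one of the listed models (random forest) with scikit-learn; the synthetic data standing in for the 924 students' six motivation attributes and their targets is an illustrative assumption.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier, RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# columns: intrinsic, extrinsic, autonomy, relatedness, competence, self-esteem
X = rng.random((924, 6))
grade = rng.random(924)                  # placeholder academic performance (regression target)
strategy = rng.integers(0, 3, size=924)  # placeholder study-strategy label (classification target)

r2 = cross_val_score(RandomForestRegressor(), X, grade, cv=5, scoring="r2").mean()
acc = cross_val_score(RandomForestClassifier(), X, strategy, cv=5).mean()
print(f"performance R^2 = {r2:.2f}, strategy accuracy = {acc:.2f}")
```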

Keywords: classification models, learning strategy, predictive modeling, regression models, student academic performance, student motivation, supervised machine learning

Procedia PDF Downloads 102
18204 A Multigranular Linguistic ARAS Model in Group Decision Making

Authors: Wiem Daoud Ben Amor, Luis Martínez López, Hela Moalla Frikha

Abstract:

Most multi-criteria group decision making (MCGDM) problems dealing with qualitative criteria require consideration of a large background of expert information. It is common for experts to have different degrees of knowledge when giving their assessments of alternatives against criteria, so it seems logical that they use different evaluation scales to express their judgments, i.e., multigranular linguistic scales. In this context, we propose an extension of the classical additive ratio assessment (ARAS) method to hierarchical linguistic terms for managing multigranular linguistic scales in uncertain contexts, where uncertainty is modeled by means of linguistic information. The proposed approach is called the extended hierarchical linguistics ARAS method (ARAS-ELH). Within the ARAS-ELH approach, the decision maker can examine the results (the ranking of the alternatives) in a decomposed style, i.e., not only at one level of the hierarchy but also at the intermediate ones. The developed approach also allows a feedback transformation, i.e., the collective final results of all experts can be transformed to any level of the extended linguistic hierarchy that each expert has previously used. The ARAS-ELH technique therefore makes it easier for decision makers to understand the results. Finally, an MCGDM case study is given to illustrate the proposed approach.

Keywords: additive ratio assessment, extended hierarchical linguistic, multi-criteria group decision making problems, multigranular linguistic contexts

Procedia PDF Downloads 186
18203 The Ability of Forecasting the Term Structure of Interest Rates Based on Nelson-Siegel and Svensson Model

Authors: Tea Poklepović, Zdravka Aljinović, Branka Marasović

Abstract:

Due to the importance of the yield curve and its estimation, it is essential to have valid methods for yield curve forecasting in cases where there are scarce issues of securities and/or weak trade on a secondary market. Therefore, in this paper, after estimating weekly yield curves on the Croatian financial market from October 2011 to August 2012 using the Nelson-Siegel and Svensson models, the yield curves are forecasted using a vector autoregressive model and neural networks. In general, it can be concluded that both forecasting methods have good prediction abilities, where forecasting of yield curves based on the Nelson-Siegel estimation model gives better results, in the sense of a lower mean squared error, than forecasting based on the Svensson model. Also, in this case, neural networks provide slightly better results. Finally, it can be concluded that the most appropriate way of yield curve prediction is neural networks using the Nelson-Siegel estimation of yield curves.
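
A minimal sketch of the Nelson-Siegel yield curve itself, the standard parametric form the paper estimates weekly before forecasting; the parameter values below are illustrative, not the fitted Croatian-market values.

```python
import numpy as np

def nelson_siegel(tau, beta0, beta1, beta2, lam):
    """Yield for maturity tau (years) under the Nelson-Siegel model:
    y(tau) = b0 + b1 * (1 - e^{-tau/lam}) / (tau/lam)
           + b2 * ((1 - e^{-tau/lam}) / (tau/lam) - e^{-tau/lam})."""
    x = tau / lam
    slope = (1.0 - np.exp(-x)) / x  # loading on the slope factor beta1
    curvature = slope - np.exp(-x)  # loading on the curvature factor beta2
    return beta0 + beta1 * slope + beta2 * curvature

maturities = np.array([0.25, 1.0, 2.0, 5.0, 10.0])
print(nelson_siegel(maturities, beta0=0.05, beta1=-0.02, beta2=0.01, lam=1.5))
```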

Keywords: Nelson-Siegel Model, neural networks, Svensson Model, vector autoregressive model, yield curve

Procedia PDF Downloads 293
18202 Incorporating Anomaly Detection in a Digital Twin Scenario Using Symbolic Regression

Authors: Manuel Alves, Angelica Reis, Armindo Lobo, Valdemar Leiras

Abstract:

In Industry 4.0, it is common to have a lot of sensor data. In this deluge of data, hints of possible problems are difficult to spot. The digital twin concept aims to help answer this problem, but it is mainly used as a monitoring tool to handle the visualisation of data. Failure detection is of paramount importance in any industry, and it consumes a lot of resources, so any improvement in this regard is of tangible value to the organisation. The aim of this paper is to add the ability to forecast test failures, curtailing detection times. To achieve this, several anomaly detection algorithms were compared with a symbolic regression approach: Isolation Forest, One-Class SVM, and an auto-encoder were explored, and for the symbolic regression the PySR library was used. The first results show that this approach is valid and can be added to the tools available in this context as a low-resource anomaly detection method since, after training, the only requirement is the calculation of a polynomial, a useful feature in the digital twin context.
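
A minimal sketch of one of the anomaly detection baselines named above, Isolation Forest via scikit-learn; the synthetic sensor readings and the contamination rate are illustrative assumptions (the symbolic regression alternative would fit a formula to the same data with a library such as PySR and flag large residuals).

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(1)
normal = rng.normal(0.0, 1.0, size=(950, 4))  # placeholder in-spec sensor readings
faulty = rng.normal(5.0, 1.0, size=(50, 4))   # placeholder failure signatures
X = np.vstack([normal, faulty])

detector = IsolationForest(contamination=0.05, random_state=0).fit(X)
labels = detector.predict(X)  # +1 = normal, -1 = anomaly
print("flagged anomalies:", int((labels == -1).sum()))
```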

Keywords: anomaly detection, digital twin, industry 4.0, symbolic regression

Procedia PDF Downloads 97
18201 An Analysis on Clustering Based Gene Selection and Classification for Gene Expression Data

Authors: K. Sathishkumar, V. Thiagarasu

Abstract:

Due to recent advances in DNA microarray technology, it is now feasible to obtain gene expression profiles of tissue samples at relatively low cost. Many scientists around the world take advantage of this gene profiling to characterize complex biological circumstances and diseases. Microarray techniques used in genome-wide gene expression and genome mutation analysis help scientists and physicians understand pathophysiological mechanisms, make diagnoses and prognoses, and choose treatment plans. DNA microarray technology has now made it possible to simultaneously monitor the expression levels of thousands of genes during important biological processes and across collections of related samples. Elucidating the patterns hidden in gene expression data offers a tremendous opportunity for an enhanced understanding of functional genomics. However, the large number of genes and the complexity of biological networks greatly increase the challenges of comprehending and interpreting the resulting mass of data, which often consists of millions of measurements. A first step toward addressing this challenge is the use of clustering techniques, which are essential in the data mining process for revealing natural structures and identifying interesting patterns in the underlying data. This work presents an analysis of several clustering algorithms proposed to deal effectively with gene expression data. Existing algorithms, such as the Support Vector Machine (SVM), the K-means algorithm, and evolutionary algorithms, are analyzed thoroughly to identify their advantages and limitations, and their performance is evaluated to determine the best approach. In order to improve the classification performance of the best approach in terms of accuracy, convergence behavior, and processing time, a hybrid clustering-based optimization approach has been proposed.

Keywords: microarray technology, gene expression data, clustering, gene Selection

Procedia PDF Downloads 302
18200 Low Complexity Deblocking Algorithm

Authors: Jagroop Singh Sidhu, Buta Singh

Abstract:

A low computational deblocking filter including three frequency-related modes (smooth mode, intermediate mode, and non-smooth mode for low-frequency, mid-frequency, and high-frequency regions, respectively) is proposed. The suggested approach requires zero additions, zero subtractions, zero multiplications (for the intermediate region), no divisions (for the non-smooth region), and no comparisons. The suggested method thus keeps the computation low and is suitable for block-based image coding systems. A comparison of the average number of operations for smooth, non-smooth, and intermediate regions (per pixel vector for each block) between the filter suggested by Chen and the proposed filter suggests that the proposed filter keeps the computation lower and is thus suitable for fast processing algorithms.

Keywords: blocking artifacts, computational complexity, non-smooth, intermediate, smooth

Procedia PDF Downloads 442
18199 Movie Genre Preference Prediction Using Machine Learning for Customer-Based Information

Authors: Haifeng Wang, Haili Zhang

Abstract:

Most movie recommendation systems have been developed to help customers find items of interest. This work introduces a predictive model usable by small and medium-sized enterprises (SMEs) that need a data-based and analytical approach to stock the right movies for local audiences and retain more customers. We used classification models to extract features from thousands of customers' demographic, behavioral, and social information to predict their movie genre preferences. In the implementation, a Gaussian kernel support vector machine (SVM) classification model and a logistic regression model were established to extract features from sample data, and their in-sample test errors were compared. A comparison of out-of-sample error was also made under different Vapnik-Chervonenkis (VC) dimensions in the machine learning algorithm to find and prevent overfitting. The Gaussian kernel SVM prediction model can correctly predict movie genre preferences in 85% of positive cases. The accuracy of the algorithm increased to 93% with a smaller VC dimension and less overfitting. These findings advance our understanding of how to use a machine learning approach to predict customers' preferences from a small data set and to design prediction tools for these enterprises.

Keywords: computational social science, movie preference, machine learning, SVM

Procedia PDF Downloads 239
18198 A Neural Approach for the Offline Recognition of the Arabic Handwritten Words of the Algerian Departments

Authors: Salim Ouchtati, Jean Sequeira, Mouldi Bedda

Abstract:

In this work, we present an offline system for the recognition of Arabic handwritten words of the Algerian departments. The study is based mainly on the evaluation of neural network performance, trained with the gradient back-propagation algorithm. The parameters used to form the input vector of the neural network are extracted from the binary images of the handwritten word by several methods: distribution parameters, the centered moments of the different projections, and Barr features. It should be noted that these methods are applied to segments obtained after dividing the binary image of the word into six segments. The classification is achieved by a multilayer perceptron. Detailed experiments were carried out, and satisfactory recognition results are reported.

Keywords: handwritten word recognition, neural networks, image processing, pattern recognition, features extraction

Procedia PDF Downloads 487
18197 Identifying Autism Spectrum Disorder Using Optimization-Based Clustering

Authors: Sharifah Mousli, Sona Taheri, Jiayuan He

Abstract:

Autism spectrum disorder (ASD) is a complex developmental condition involving persistent difficulties with social communication, restricted interests, and repetitive behavior. The challenges associated with ASD can interfere with an affected individual's ability to function in social, academic, and employment settings. Although, to the best of our knowledge, there is no effective medication known to treat ASD, early intervention can significantly improve an affected individual's overall development. Hence, an accurate diagnosis of ASD at an early phase is essential. The use of machine learning approaches improves and speeds up the diagnosis of ASD. In this paper, we focus on the application of unsupervised clustering methods to ASD, as the large volume of ASD data generated through hospitals, therapy centers, and mobile applications has no pre-existing labels. We conduct a comparative analysis of seven existing clustering approaches (K-means, agglomerative hierarchical, model-based, fuzzy C-means, affinity propagation, self-organizing maps, and linear vector quantisation) as well as the recently developed optimization-based clustering (COMSEP-Clust) approach. We evaluate the performance of the clustering methods extensively on real-world ASD datasets encompassing different age groups: toddlers, children, adolescents, and adults. Our experimental results suggest that the COMSEP-Clust approach outperforms the other seven methods in recognizing ASD with well-separated clusters.

Keywords: autism spectrum disorder, clustering, optimization, unsupervised machine learning

Procedia PDF Downloads 86
18196 Evaluating the Nexus between Energy Demand and Economic Growth Using the VECM Approach: Case Study of Nigeria, China, and the United States

Authors: Rita U. Onolemhemhen, Saheed L. Bello, Akin P. Iwayemi

Abstract:

The effectiveness of energy demand policy depends on identifying the key drivers of energy demand, both in the short run and the long run. This paper examines the influence of regional differences on the link between energy demand and other explanatory variables for Nigeria, China, and the USA using the Vector Error Correction Model (VECM) approach. The study employed annual time series data on energy consumption (ED), real gross domestic product per capita (RGDP), real energy prices (P), and urbanization (N) for a thirty-six-year sample period. The time series data are sourced from the World Bank's World Development Indicators (WDI, 2016) and the US Energy Information Administration (EIA). Results from the study show that all the independent variables (income, urbanization, and price) substantially affect long-run energy consumption in Nigeria, the USA, and China, whereas income has no significant effect on short-run energy demand in the USA and Nigeria. In addition, the long-run effect of urbanization is relatively stronger in China. Since urbanization is a key factor in energy demand, it is recommended that more attention be given to the development of rural communities to reduce the inflow of migrants into urban communities, which increases energy demand; energy excesses should be penalized, while energy management should be incentivized.
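
A minimal sketch of fitting a VECM of the kind described above, using statsmodels; the random-walk placeholder series standing in for one country's annual ED, RGDP, P, and N data and the lag/cointegration-rank choices are illustrative assumptions.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import VECM

rng = np.random.default_rng(0)
# placeholder 36-year annual series; cumulative sums mimic non-stationary levels
data = pd.DataFrame(rng.normal(size=(36, 4)).cumsum(axis=0),
                    columns=["ED", "RGDP", "P", "N"])

model = VECM(data, k_ar_diff=1, coint_rank=1, deterministic="ci")
result = model.fit()
print(result.alpha)  # short-run adjustment (error-correction) coefficients
print(result.beta)   # long-run cointegrating vector
```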

Keywords: economic growth, energy demand, income, real GDP, urbanization, VECM

Procedia PDF Downloads 279
18195 Music Genre Classification Based on Non-Negative Matrix Factorization Features

Authors: Soyon Kim, Edward Kim

Abstract:

In order to retrieve information from the massive stream of songs in the music industry, music search by title, lyrics, artist, mood, and genre has become more important. Despite the subjectivity and controversy over the definition of music genres across different nations and cultures, automatic genre classification systems that facilitate the process of music categorization have been developed. Manual genre selections by music producers serve as statistical data for designing automatic genre classification systems. In this paper, an automatic music genre classification system utilizing non-negative matrix factorization (NMF) is proposed. Short-term characteristics of the music signal can be captured based on timbre features such as the mel-frequency cepstral coefficient (MFCC), decorrelated filter bank (DFB), octave-based spectral contrast (OSC), and octave band sum (OBS). Long-term time-varying characteristics of the music signal can be summarized with (1) statistical features such as the mean, variance, minimum, and maximum of the timbre features and (2) modulation spectrum features such as the spectral flatness measure, spectral crest measure, spectral peak, spectral valley, and spectral contrast of the timbre features. In addition to these conventional basic long-term feature vectors, NMF-based feature vectors are proposed for use in genre classification. In the training stage, NMF basis vectors were extracted for each genre class. The NMF features were calculated in the log spectral magnitude domain (NMF-LSM) as well as in the basic feature vector domain (NMF-BFV). For NMF-LSM, the entire full-band spectrum was used; for NMF-BFV, only the low-band spectrum was used, since the high-frequency modulation spectrum of the basic feature vectors did not contain important information for genre classification. In the test stage, using the set of pre-trained NMF basis vectors, the genre classification system extracted the NMF weighting values of each genre as the NMF feature vectors. A support vector machine (SVM) was used as the classifier. The GTZAN multi-genre music database, composed of 10 genres with 100 songs per genre, was used for training and testing. To increase the reliability of the experiments, 10-fold cross validation was used. For a given input song, an extracted NMF-LSM feature vector was composed of 10 weighting values that corresponded to the classification probabilities for the 10 genres. An NMF-BFV feature vector also had a dimensionality of 10. Combined with the basic long-term features such as the statistical and modulation spectrum features, the NMF features provided increased accuracy with only a slight increase in feature dimensionality. The conventional basic features by themselves yielded 84.0% accuracy, whereas the basic features with NMF-LSM and NMF-BFV provided 85.1% and 84.2% accuracy, respectively. The basic features required a dimensionality of 460, while NMF-LSM and NMF-BFV each required a dimensionality of only 10. Combining the basic features with NMF-LSM and NMF-BFV, the SVM with a radial basis function (RBF) kernel produced a significantly higher classification accuracy of 88.3% with a feature dimensionality of 480.
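
A minimal, simplified sketch of using NMF weighting values as feature vectors for an SVM, in the spirit of the pipeline above (a single NMF rather than the paper's per-genre basis training); the synthetic non-negative spectra and the component count are illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import NMF
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.random((1000, 256))         # placeholder non-negative log-magnitude spectra
y = rng.integers(0, 10, size=1000)  # placeholder labels for the 10 GTZAN genres

nmf = NMF(n_components=10, init="nndsvda", max_iter=400)
W = nmf.fit_transform(X)            # per-song NMF weighting values as 10-D features

acc = cross_val_score(SVC(kernel="rbf"), W, y, cv=10).mean()  # 10-fold cross validation
print(f"10-fold accuracy: {acc:.3f}")
```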

Keywords: mel-frequency cepstral coefficient (MFCC), music genre classification, non-negative matrix factorization (NMF), support vector machine (SVM)

Procedia PDF Downloads 273