Search results for: in-phase component
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2578

2458 20 Definitions in 20 Years: Exploring the Evolution of Blended Learning Definitions from 2003-2022

Authors: Damian Gordon, Paul Doyle, Anna Becevel, Tina Baloh

Abstract:

The goal of this research is to explore the evolution of the concept of “blended learning” over a twenty-year period, to see whether the conceptualization has remained consistent or has become either more specific or more general. To achieve this goal, the term “blended learning” (and variations) was searched for in various bibliographical repositories for each year from 2003 to 2022 to locate, for each year, a highly cited paper that is not behind a paywall, so that the resulting definitions would be freely available to all academics. Each of the twenty unique definitions is explored to identify how it categorizes both the Classroom Component and the Computer Component of blended learning, as well as which discipline and which country each definition originates from, to see if there are any significant geographical variations. Based on this analysis, trends that appear in the definitions are noted, along with an overall interpretation of the notion of “Blended Learning.”

Keywords: blended learning, definitions of blended learning, e-learning, thematic searches

Procedia PDF Downloads 129
2457 Improvement of Environment and Climate Change Canada’s GEM-Hydro Streamflow Forecasting System

Authors: Etienne Gaborit, Dorothy Durnford, Daniel Deacu, Marco Carrera, Nathalie Gauthier, Camille Garnaud, Vincent Fortin

Abstract:

A new experimental streamflow forecasting system was recently implemented at Environment and Climate Change Canada’s (ECCC) Canadian Centre for Meteorological and Environmental Prediction (CCMEP). It relies on CaLDAS (Canadian Land Data Assimilation System) for the assimilation of surface variables, and on a surface prediction system that feeds a routing component. The surface energy and water budgets are simulated with the SVS (Soil, Vegetation, and Snow) Land-Surface Scheme (LSS) at 2.5-km grid spacing over Canada. The routing component is based on the Watroute routing scheme at 1-km grid spacing for the Great Lakes and Nelson River watersheds. The system is run in two distinct phases: an analysis phase and a forecast phase. During the analysis phase, CaLDAS outputs are used to force the routing system, which performs streamflow assimilation. In forecast mode, the surface component is forced with the Canadian GEM atmospheric forecasts and is initialized with a CaLDAS analysis. The streamflow performance of this new system is presented for 2019 and compared to that of ECCC’s current operational streamflow forecasting system, which differs from the new experimental system in many respects. The new streamflow forecasts are also compared to persistence. Overall, the new streamflow forecasting system presents promising results, highlighting the need for an elaborate assimilation phase before performing the forecasts. However, the system is still experimental and is continuously being improved. Some major recent improvements are presented here, including, for example, the assimilation of snow cover data from remote sensing, a backward propagation of assimilated flow observations, a new numerical scheme for the routing component, and a new reservoir model.

Keywords: assimilation system, distributed physical model, offline hydro-meteorological chain, short-term streamflow forecasts

Procedia PDF Downloads 130
2456 Assessment of an ICA-Based Method for Detecting the Effect of Attention in the Auditory Late Response

Authors: Siavash Mirahmadizoghi, Steven Bell, David Simpson

Abstract:

In this work, a new independent component analysis (ICA) based method for noise reduction in evoked potentials is evaluated on auditory late responses (ALR) captured with a 63-channel electroencephalogram (EEG) from 10 normal-hearing subjects. The performance of the new method is compared with a single-channel alternative in terms of signal-to-noise ratio (SNR), the number of channels with an SNR above an empirically derived statistical critical value, and an estimate of the effect of attention on the major components of the ALR waveform. The results show that the multichannel signal processing method can significantly enhance the quality of the ALR signal and also detect the effect of attention on the ALR better than the single-channel alternative.
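As a rough illustration of the multichannel idea described above, the sketch below applies scikit-learn's FastICA to a multichannel EEG recording and reconstructs the signal from a subset of components; the component count and the simplistic selection rule (most non-Gaussian components) are assumptions for illustration, not the authors' procedure.

```python
import numpy as np
from sklearn.decomposition import FastICA

# Illustrative only: denoise a multichannel recording (n_samples, n_channels)
# by keeping a few independent components and projecting back to channel space.
# The selection rule (largest absolute excess kurtosis) is a simple stand-in,
# not the criterion used in the paper.
def ica_denoise(eeg, n_components=20, n_keep=5):
    ica = FastICA(n_components=n_components, random_state=0)
    sources = ica.fit_transform(eeg)                 # (n_samples, n_components)

    centered = sources - sources.mean(axis=0)
    kurt = (centered**4).mean(axis=0) / (centered**2).mean(axis=0) ** 2 - 3.0
    keep = np.argsort(np.abs(kurt))[::-1][:n_keep]

    cleaned = np.zeros_like(sources)
    cleaned[:, keep] = sources[:, keep]
    return ica.inverse_transform(cleaned)            # back to channel space
```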

Keywords: auditory late response (ALR), attention, EEG, independent component analysis (ICA), multichannel signal processing

Procedia PDF Downloads 505
2455 Detection of Abnormal Process Behavior in Copper Solvent Extraction by Principal Component Analysis

Authors: Kirill Filianin, Satu-Pia Reinikainen, Tuomo Sainio

Abstract:

Frequent measurements of product stream quality create a data overload that becomes more and more difficult to handle. In the current study, plant history data with multiple variables were successfully treated by principal component analysis to detect abnormal process behavior, specifically in copper solvent extraction. The multivariate model is based on the concentration levels of the main process metals recorded by the industrial on-stream x-ray fluorescence analyzer. After mean-centering and normalization of the concentration data set, a two-dimensional multivariate model based on the principal component analysis algorithm was constructed. Normal operating conditions were defined through control limits assigned to squared score values on the x-axis and to residual values on the y-axis. Eighty percent of the data set was taken as the training set, and the multivariate model was tested with the remaining 20 percent. Model testing showed successful application of the control limits to detect abnormal behavior of the copper solvent extraction process as early warnings. Compared to conventional techniques that analyze one variable at a time, the proposed model allows on-line detection of a process failure using information from all process variables simultaneously. Complex industrial equipment combined with advanced mathematical tools may be used for on-line monitoring of both process stream composition and final product quality. Defining the normal operating conditions of the process supports reliable decision making in a process control room. Thus, industrial x-ray fluorescence analyzers equipped with an integrated data processing toolbox allow more flexibility in copper plant operation. The additional multivariate process control and monitoring procedures are recommended to be applied separately for the major components and for the impurities. Principal component analysis may be utilized not only to control the content of major elements in process streams, but also for continuous monitoring of the plant feed. The proposed approach has potential in on-line instrumentation, providing a fast, robust and cheap application with automation abilities.
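The monitoring scheme described above, with control limits on a squared-score statistic and on residuals, can be sketched as follows; this is a minimal illustration using scikit-learn, with empirical-percentile control limits standing in for whatever limits the plant actually uses.

```python
import numpy as np
from sklearn.decomposition import PCA

# Minimal sketch of PCA-based process monitoring with a T^2-like score statistic
# and a Q-like residual statistic; limits come from empirical percentiles of the
# training data, which is an assumption for illustration.
def fit_monitor(X_train, n_components=2, alpha=0.99):
    mu, sigma = X_train.mean(axis=0), X_train.std(axis=0)
    Z = (X_train - mu) / sigma                      # mean-center and normalize
    pca = PCA(n_components=n_components).fit(Z)

    scores = pca.transform(Z)
    t2 = np.sum(scores**2 / pca.explained_variance_, axis=1)
    q = np.sum((Z - pca.inverse_transform(scores))**2, axis=1)

    limits = (np.quantile(t2, alpha), np.quantile(q, alpha))
    return pca, mu, sigma, limits

def is_abnormal(x, pca, mu, sigma, limits):
    z = (np.atleast_2d(x) - mu) / sigma
    s = pca.transform(z)
    t2 = np.sum(s**2 / pca.explained_variance_, axis=1)
    q = np.sum((z - pca.inverse_transform(s))**2, axis=1)
    return (t2 > limits[0]) | (q > limits[1])       # early-warning flag
```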

Keywords: abnormal process behavior, failure detection, principal component analysis, solvent extraction

Procedia PDF Downloads 309
2454 Comparison of Different Data Acquisition Techniques for Shape Optimization Problems

Authors: Attila Vámosi, Tamás Mankovits, Dávid Huri, Imre Kocsis, Tamás Szabó

Abstract:

Non-linear FEM calculations are indispensable when important technical information, such as the operating performance of a rubber component, is desired. Rubber bumpers built into air-spring structures may undergo large deformations under load, which is in itself non-linear behavior. The changing contact range between the parts and the incompressibility of the rubber increase this non-linearity further. The material characterization of an elastomeric component is also a demanding engineering task. The shape optimization problem of rubber parts led to the study of FEM-based calculation processes. This type of problem has been posed and investigated by several authors. In this paper, the time demand of certain calculation methods is studied and possibilities for time reduction are presented.
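One common way to cut the time demand of such FEM studies, in line with the support vector regression listed in the keywords, is to train a cheap surrogate on a small set of expensive simulations; the sketch below is a hypothetical illustration with placeholder data, not the authors' workflow.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Hypothetical surrogate: predict a bumper response from shape parameters using
# a handful of expensive non-linear FEM runs. The "FEM results" here are random
# placeholders, not real simulation data.
rng = np.random.default_rng(0)
shape_params = rng.uniform(0.0, 1.0, size=(30, 3))   # e.g. radius, height, wall thickness
fem_response = rng.uniform(50.0, 150.0, size=30)     # e.g. stiffness from FEM (placeholder)

surrogate = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.5))
surrogate.fit(shape_params, fem_response)

# The cheap surrogate can now screen candidate shapes before running full FEM.
candidates = rng.uniform(0.0, 1.0, size=(1000, 3))
best = candidates[np.argmax(surrogate.predict(candidates))]
```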

Keywords: rubber bumper, data acquisition, finite element analysis, support vector regression

Procedia PDF Downloads 471
2453 Performance Improvement in a Micro Compressor for Micro Gas Turbine Using Computational Fluid Dynamics

Authors: Kamran Siddique, Hiroyuki Asada, Yoshifumi Ogami

Abstract:

Micro gas turbines (MGT) nowadays have a wide variety of applications, from drones to hybrid electric vehicles. As microfabrication technology improves, the size of the MGT is getting smaller. The overall performance of an MGT depends on its individual components, and each component’s performance is dependent on and interrelated with that of the others. Therefore, careful consideration needs to be given to each and every individual component of the MGT. In this study, the focus is on improving the performance of the compressor in order to improve the overall performance of the MGT. Computational Fluid Dynamics (CFD) is performed using the software FLUENT to analyze the design of a micro compressor. Operating parameters like mass flow rate and RPM, and design parameters like inner blade angle (IBA), outer blade angle (OBA), blade thickness and number of blades, are varied to study their effect on the performance of the compressor. The pressure ratio is used as a measure of the compressor’s performance: the higher the pressure ratio, the better the design. In this study, the target mass flow rate is 0.2 g/s and the RPM is required to be less than or equal to 900,000. So far, a pressure ratio above 3 has been achieved at a 0.2 g/s mass flow rate with 5 rotor blades, 0.36 mm blade thickness, 94.25 degrees OBA and 10.46 degrees IBA. The design in this study differs from a regular centrifugal compressor used in conventional gas turbines in that the compressor is designed with ease of manufacturability in mind. This study therefore proposes a compressor design that has a good pressure ratio and, at the same time, is easy to manufacture using current microfabrication technologies.

Keywords: computational fluid dynamics, FLUENT, microfabrication, RPM

Procedia PDF Downloads 162
2452 Implementation of ANN-Based MPPT for a PV System and Efficiency Improvement of DC-DC Converter by WBG Devices

Authors: Bouchra Nadji, Elaid Bouchetob

Abstract:

PV systems are common in residential and industrial settings because of their low upfront and operating costs throughout their lifetimes. Buck or boost converters are used in photovoltaic systems, regardless of whether the system is autonomous or connected to the grid. These converters have become less appealing because of their low efficiency, inadequate power density, and use of silicon for their power components. Traditional Si-based devices are getting close to their theoretical performance limits, which makes it more challenging to improve their performance and efficiency. GaN and SiC are the two WBG semiconductors with the most recent technological advancements currently available. Their tolerance of high temperatures and high switching frequencies allows the size of active and passive components to be reduced. Utilizing high-efficiency DC-DC boost converters is the primary emphasis of this work. These converters are intended for photovoltaic systems that use wave energy.
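As an illustration of the ANN-based MPPT idea, the sketch below trains a small neural network to map irradiance and module temperature to a converter duty cycle; the network architecture and the synthetic training data are assumptions, not the authors' design.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Illustrative ANN-based MPPT: map (irradiance, module temperature) to the duty
# cycle of a boost converter. The training pairs below are synthetic
# placeholders; in practice they would come from measured or simulated
# maximum-power-point tracking of the actual PV array.
rng = np.random.default_rng(1)
irradiance = rng.uniform(200, 1000, 500)        # W/m^2
temperature = rng.uniform(15, 60, 500)          # deg C
X = np.column_stack([irradiance, temperature])

# Placeholder "optimal duty cycle" labels (a smooth function of the inputs).
y = 0.4 + 0.3 * (irradiance / 1000) - 0.002 * (temperature - 25)

ann_mppt = MLPRegressor(hidden_layer_sizes=(10, 10), max_iter=2000, random_state=0)
ann_mppt.fit(X, y)

duty = ann_mppt.predict([[850.0, 40.0]])[0]     # duty-cycle command to the DC-DC stage
```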

Keywords: component, artificial intelligence, PV system, ANN MPPT, DC-DC converter

Procedia PDF Downloads 60
2451 Object-Oriented Program Comprehension by Identification of Software Components and Their Connexions

Authors: Abdelhak-Djamel Seriai, Selim Kebir, Allaoua Chaoui

Abstract:

During the last decades, object-oriented programming has been massively used to build large-scale systems. However, the evolution and maintenance of such systems become a laborious task because object-oriented programming does not offer a precise view of the functional building blocks of the system. This lack is caused by the fine granularity of classes and objects. In this paper, we use a post-object-oriented technology, namely software components, to propose an approach based on the identification of the functional building blocks of an object-oriented system by analyzing its source code. These functional blocks are specified as software components, and the result is a multi-layer component-based software architecture.
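A toy illustration of the general idea, grouping classes into candidate components from a class-dependency graph via community detection, is sketched below; the dependency list and the clustering method are illustrative assumptions, not the authors' identification algorithm.

```python
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Toy class-dependency graph: nodes are classes, edges are "uses" relations
# extracted from source code (the edge list here is invented for illustration).
deps = [
    ("Order", "Invoice"), ("Order", "Customer"), ("Invoice", "Customer"),
    ("Parser", "Lexer"), ("Parser", "AstNode"), ("Lexer", "Token"),
    ("ReportView", "Invoice"), ("ReportView", "PdfWriter"),
]
graph = nx.Graph(deps)

# Group tightly coupled classes into candidate software components via
# modularity-based community detection.
components = greedy_modularity_communities(graph)
for i, classes in enumerate(components):
    print(f"component {i}: {sorted(classes)}")
```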

Keywords: software comprehension, software component, object oriented, software architecture, reverse engineering

Procedia PDF Downloads 412
2450 Suitability of Black Box Approaches for the Reliability Assessment of Component-Based Software

Authors: Anjushi Verma, Tirthankar Gayen

Abstract:

Although reliability is an important attribute of quality, especially for mission-critical systems, there does not yet exist any versatile model for the reliability assessment of component-based software. The existing Black Box models are found to make various assumptions which may not always be realistic and may be quite contrary to the actual behaviour of software. They focus on observing the manner in which the system behaves without considering the structure of the system, the components composing the system, their interconnections, dependencies, usage frequencies, etc. As a result, the entropy (uncertainty) in assessment using these models is quite high. Though there are some models based on the operational profile, it sometimes becomes extremely difficult to obtain the exact operational profile concerned with a given operation. This paper discusses the drawbacks, deficiencies and limitations of Black Box approaches from the perspective of various authors and finally proposes a conceptual model for the reliability assessment of software.

Keywords: black box, faults, failure, software reliability

Procedia PDF Downloads 443
2449 Ultra-Tightly Coupled GNSS/INS Based on High Degree Cubature Kalman Filtering

Authors: Hamza Benzerrouk, Alexander Nebylov

Abstract:

In classical GNSS/INS integration designs, the loosely coupled approach uses the GNSS-derived position and velocity as the measurement vector. This design is suboptimal from the standpoint of mitigating GNSS outliers and outages. The tightly coupled GNSS/INS navigation filter mixes GNSS pseudorange and inertial measurements and obtains the vehicle navigation state as the final navigation solution. The ultra-tightly coupled GNSS/INS design combines the I (in-phase) and Q (quadrature) accumulator outputs of the GNSS receiver signal tracking loops and the INS navigation filter function into a single Kalman filter variant (EKF, UKF, SPKF, CKF or HCKF). EKF and UKF are the most used nonlinear filters in the literature and are well adapted to inertial navigation state estimation when integrated with GNSS signal outputs. In this paper, it is proposed to move a step forward with more accurate filters: the Cubature and High-Degree Cubature Kalman Filtering methods. Building on previous results on state estimation for INS/GNSS integration, the Cubature Kalman Filter (CKF) and the High-Degree Cubature Kalman Filter (HCKF) serve as references for the recently developed generalized cubature-rule-based Kalman Filter (GCKF). High-degree cubature rules are the kernel of the new solution, giving more accurate estimation with less computational complexity than the Gauss-Hermite Quadrature Kalman Filter (GHKF), which is not selected in this work because of its limited real-time applicability in high-dimensional state spaces. In the ultra-tightly (deeply) coupled GNSS/INS system, a dynamics EKF is used with transition matrix factorization together with GNSS block processing, which is well described in the paper; the intermediate frequency (IF) is assumed available through correlator samples at a rate of 500 Hz in the presented approach. GNSS (GPS+GLONASS) measurements are assumed available, and the modern SPKF and Cubature Kalman Filter (CKF) are compared with new versions of the CKF, called high-order CKF, based on spherical-radial cubature rules developed at the fifth order in this work. The estimation accuracy of the high-degree CKF is expected to be comparable to that of the GHKF; state estimation results are then observed and discussed for different initialization parameters. Results show more accurate navigation state estimation and a more robust GNSS receiver when the ultra-tightly coupled approach is applied based on the High-Degree Cubature Kalman Filter.
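For reference, the standard third-degree spherical-radial cubature rule underlying the CKF can be sketched as below; the fifth-order (high-degree) rule developed in the paper uses a larger point set, so this is only a baseline illustration with an assumed process model.

```python
import numpy as np

def ckf_predict(x, P, f, Q):
    """One CKF time update using the third-degree spherical-radial rule.

    x : state mean (n,), P : covariance (n, n), f : process model, Q : process
    noise covariance. Illustrative only; the paper's high-degree (fifth-order)
    rule uses more cubature points than the 2n shown here.
    """
    n = x.size
    S = np.linalg.cholesky(P)                                   # P = S S^T
    unit = np.sqrt(n) * np.concatenate([np.eye(n), -np.eye(n)]) # 2n unit points
    points = x + unit @ S.T                                     # cubature points

    propagated = np.array([f(p) for p in points])
    x_pred = propagated.mean(axis=0)                            # equal weights 1/(2n)
    diff = propagated - x_pred
    P_pred = diff.T @ diff / (2 * n) + Q
    return x_pred, P_pred
```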

Keywords: GNSS, INS, Kalman filtering, ultra tight integration

Procedia PDF Downloads 280
2448 Alumina Supported Copper-Manganese Catalysts for Combustion of Exhaust Gases: Effect of Preparation Method

Authors: Krasimir Ivanov, Elitsa Kolentsova, Dimitar Dimitrov

Abstract:

The development of active and stable catalysts without noble metals for the low-temperature oxidation of exhaust gases remains a significant challenge. The purpose of this study is to determine the influence of the preparation method on the catalytic activity of supported copper-manganese mixed oxides in terms of VOC oxidation. The catalysts were prepared by impregnation of γ-Al2O3 with copper and manganese nitrates and acetates, and the possibilities for CO, CH3OH and dimethyl ether (DME) oxidation were evaluated using continuous-flow equipment with a four-channel isothermal stainless steel reactor. The effects of the support, the Cu/Mn mole ratio, the heat treatment of the precursor, and the active component loading were investigated. Highly active alumina-supported Cu-Mn catalysts for CO and VOC oxidation were synthesized, and the effect of the preparation conditions on the activity behavior of the catalysts is discussed. The synergetic interaction between copper and manganese species increases the activity for complete oxidation over the mixed catalysts. The type of support, the calcination temperature and the active component loading, along with the catalyst composition, are important factors determining catalytic activity. A Cu/Mn molar ratio of 1:5, heat treatment at 450 °C and 20% active component loading are the best compromise for producing an active catalyst for the simultaneous combustion of CO, CH3OH and DME.

Keywords: copper-manganese catalysts, CO, VOCs oxidation, exhaust gases

Procedia PDF Downloads 412
2447 A Combination of Independent Component Analysis, Relative Wavelet Energy and Support Vector Machine for Mental State Classification

Authors: Nguyen The Hoang Anh, Tran Huy Hoang, Vu Tat Thang, T. T. Quyen Bui

Abstract:

Mental state classification is an important step towards realizing a control system based on electroencephalography (EEG) signals, which could benefit many paralyzed people, including those with locked-in syndrome or Amyotrophic Lateral Sclerosis. Considering that EEG signals are nonstationary and often contaminated by various types of artifacts, classifying thoughts into the correct mental states is not a trivial problem. In this work, our contribution is that we present and realize a novel model which integrates different techniques: independent component analysis (ICA), relative wavelet energy, and support vector machine (SVM), for this task. We applied our model to classify thoughts in two types of experiments, with either two or three mental states. The experimental results show that the presented model outperforms other models using Artificial Neural Networks, K-Nearest Neighbors, etc.
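A minimal sketch of such a three-stage pipeline is given below; the wavelet family, decomposition level, number of ICA components and data shapes are assumptions for illustration rather than the authors' exact settings.

```python
import numpy as np
import pywt
from sklearn.decomposition import FastICA
from sklearn.svm import SVC

# Sketch of the three-stage pipeline: ICA to unmix multichannel EEG, relative
# wavelet energy as features, SVM for mental-state classification. The wavelet
# ('db4'), level and trial shapes are assumptions, not the paper's settings.

def relative_wavelet_energy(signal, wavelet="db4", level=4):
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    energies = np.array([np.sum(c**2) for c in coeffs])
    return energies / energies.sum()

def extract_features(trials):
    # trials: array of shape (n_trials, n_samples, n_channels) of raw EEG
    feats = []
    for trial in trials:
        sources = FastICA(n_components=4, random_state=0).fit_transform(trial)
        feats.append(np.concatenate([relative_wavelet_energy(s) for s in sources.T]))
    return np.array(feats)

# Usage sketch: X_train / y_train would be labelled trials for two or three states.
# clf = SVC(kernel="rbf").fit(extract_features(X_train), y_train)
```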

Keywords: EEG, ICA, SVM, wavelet

Procedia PDF Downloads 384
2446 A Comprehensive Evaluation of Supervised Machine Learning for the Phase Identification Problem

Authors: Brandon Foggo, Nanpeng Yu

Abstract:

Power distribution circuits undergo frequent network topology changes that are often left undocumented. As a result, the documentation of a circuit’s connectivity becomes inaccurate with time. The lack of reliable circuit connectivity information is one of the biggest obstacles to modeling, monitoring, and controlling modern distribution systems. To enhance the reliability and efficiency of electric power distribution systems, the circuit’s connectivity information must be updated periodically. This paper focuses on one critical component of a distribution circuit’s topology - the secondary transformer to phase association. This topology component describes the set of phase lines that feed power to a given secondary transformer (and therefore a given group of power consumers). Recovering the documentation of this component is called Phase Identification, and it is typically performed with physical measurements. These measurements can take on the order of several months, but with supervised learning, the time required can be reduced significantly. This paper compares several such methods applied to Phase Identification for a large range of real distribution circuits, describes a method of training data selection, describes preprocessing steps unique to the Phase Identification problem, and ultimately describes a method which obtains high accuracy (> 96% in most cases, > 92% in the worst case) using only 5% of the measurements typically used for Phase Identification.

Keywords: distribution network, machine learning, network topology, phase identification, smart grid

Procedia PDF Downloads 299
2445 Statistical Analysis of Natural Images after Applying ICA and ISA

Authors: Peyman Sheikholharam Mashhadi

Abstract:

Difficulties in analyzing real-world images within the classical image processing and machine vision frameworks have motivated researchers to consider biology-based vision. It is a common belief that the mammalian visual cortex has adapted to the statistics of real-world images through the evolutionary process. There are two well-known, successful models of mammalian visual cortical cells: Independent Component Analysis (ICA) and Independent Subspace Analysis (ISA). In this paper, we statistically analyze the dependencies which remain in the components after applying these models to natural images. We also investigate the response of the feature detectors to gratings with various parameters in order to find optimal parameters for the feature detectors. Finally, the phase selectivity of the feature detectors in both models is considered.

Keywords: statistics, independent component analysis, independent subspace analysis, phase, natural images

Procedia PDF Downloads 339
2444 The Use of Creativity to Nudge Students Into Heutagogy: An Implementation in Graduate Business Education

Authors: Ricardo Bragança, Tom Vinaimont

Abstract:

This paper discusses the introduction of processes of self-determined learning (heutagogy) into a graduate course on financial modeling, using elements of entangled pedagogy and Biggs’ constructive alignment. To encourage learners to take control of their own learning journey and develop critical thinking and problem-solving skills, each session in the course receives tailor-made media-enhanced pedagogical assets. The design of those assets specifically supports entangled pedagogy, which opposes technological or pedagogical determinism in support of the collaborative integration of pedagogy and technology. Media assets for each of the ten sessions in this course consist of three components. The first component in this three-pronged approach is a game-cut-like cinematographic representation that introduces the context of the session. The second component represents a character from an open-source-styled community that encourages self-determined learning. The third component consists of a character, which refers to the in-person instructor and also aligns learning outcomes and assessment tasks, using Biggs’ constructive alignment, to the cinematographic and open-source-styled component. In essence, the course's metamorphosis helps students apply the concepts they've studied to actual financial modeling issues. The audio-visual media assets create a storyline throughout the course based on gamified and real-world applications, thus encouraging student engagement and interaction. The structured entanglement of pedagogy and technology also guides the instructor in the design of the in-class interactions and directs the focus on outcomes and assessments. The transformation process of this graduate course in financial modeling led to an institutional teaching award in 2021. The transformation of this course may be used as a model for other courses and programs in many disciplines to help with intended learning outcomes integration, constructive alignment, and Assurance of Learning.

Keywords: innovative education, active learning, entangled pedagogy, heutagogy, constructive alignment, project based learning, financial modeling, graduate business education

Procedia PDF Downloads 72
2443 Analysis of the Significance of Multimedia Channels Using Sparse PCA and Regularized SVD

Authors: Kourosh Modarresi

Abstract:

The abundance of media channels and devices has given users a variety of options to extract, discover, and explore information in the digital world. Since there is often a long and complicated path that a typical user may traverse before taking any (significant) action (such as purchasing goods and services), it is critical to know how each node (media channel) in the user’s path has contributed to the final action. In this work, the significance of each media channel is computed using statistical analysis and machine learning techniques. More specifically, “Regularized Singular Value Decomposition” and “Sparse Principal Component Analysis” have been used to compute the significance of each channel toward the final action. The results of this work are a considerable improvement compared to present approaches.
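As an illustration of how sparse principal components can be turned into channel-significance scores, the sketch below fits scikit-learn's SparsePCA to a placeholder journey-by-channel matrix; both the data and the scoring rule (summed absolute loadings) are assumptions, not the authors' method.

```python
import numpy as np
from sklearn.decomposition import SparsePCA

# Sketch of scoring media channels with Sparse PCA. Rows are user journeys,
# columns are channels (touch counts); the data are random placeholders.
rng = np.random.default_rng(0)
channels = ["search", "display", "email", "social", "video"]
journeys = rng.poisson(lam=1.0, size=(500, len(channels))).astype(float)

spca = SparsePCA(n_components=2, alpha=1.0, random_state=0)
spca.fit(journeys - journeys.mean(axis=0))

# Assumed significance measure: sum of absolute sparse loadings per channel.
significance = np.abs(spca.components_).sum(axis=0)
for name, score in sorted(zip(channels, significance), key=lambda t: -t[1]):
    print(f"{name}: {score:.3f}")
```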

Keywords: multimedia attribution, sparse principal component, regularization, singular value decomposition, feature significance, machine learning, linear systems, variable shrinkage

Procedia PDF Downloads 309
2442 The Association between Health-Related Quality of Life and Physical Activity in Different Domains with Other Factors in Croatian Male Police Officers

Authors: Goran Sporiš, Dinko Vuleta, Stefan Lovro

Abstract:

The purpose of the present study was to determine the associations between health-related quality of life (HRQOL) and physical activity (PA) in different domains. In this cross-sectional study, participants were 169 Croatian police officers (mean age 35.14±8.95 yrs, mean height 180.93±7.53 cm, mean weight 88.39±14.05 kg, mean body-mass index 26.90±3.39 kg/m²). The dependent variables were two general domains extracted from the HRQOL questionnaire: (1) the physical component scale (PCS) and (2) the mental component scale (MCS). The independent variables were job-related, transport, domestic and leisure-time PA, along with other factors: age, body-mass index, smoking status, psychological distress, socioeconomic status and time spent in sedentary behaviour. The associations between the dependent and independent variables were analyzed using multiple regression analysis. Significance was set at p < 0.05. PCS was positively associated with leisure-time PA (β 0.28, p < 0.001) and socioeconomic status (SES) (β 0.16, p=0.005), but inversely associated with job-related PA (β -0.15, p=0.012), domestic-time PA (β -0.14, p=0.014), age (β -0.12, p=0.050), psychological distress (β -0.43, p<0.001) and sedentary behaviour (β -0.15, p=0.009). MCS was positively associated with leisure-time PA (β 0.19, p=0.013) and SES (β 0.20, p=0.002), while inversely associated with age (β -0.23, p=0.001), psychological distress (β -0.27, p<0.001) and sedentary behaviour (β -0.22, p=0.001). Our results add new information about the associations between domain-specific PA and both the physical and mental component scales in police officers. Future studies should examine the same associations in other stressful occupations.

Keywords: health, fitness, police force, relations

Procedia PDF Downloads 299
2441 The Use of Degradation Measures to Design Reliability Test Plans

Authors: Stephen V. Crowder, Jonathan W. Lane

Abstract:

With short production development times, there is an increased need to demonstrate product reliability relatively quickly with minimal testing. In such cases there may be few if any observed failures. Thus it may be difficult to assess reliability using the traditional reliability test plans that measure only time (or cycles) to failure. For many components, degradation measures will contain important information about performance and reliability. These measures can be used to design a minimal test plan, in terms of number of units placed on test and duration of the test, necessary to demonstrate a reliability goal. In this work we present a case study involving an electronic component subject to degradation. The data, consisting of 42 degradation paths of cycles to failure, are first used to estimate a reliability function. Bootstrapping techniques are then used to perform power studies and develop a minimal reliability test plan for future production of this component.
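The bootstrap step can be illustrated with the minimal sketch below, which resamples failure data to see how the precision of a low reliability quantile depends on the number of units placed on test; the Weibull placeholder data stand in for the actual degradation-based estimates from the case study.

```python
import numpy as np

# Sketch of the bootstrap step: resample observed cycles-to-failure values and
# estimate how precisely a reliability quantile is pinned down for a given
# number of units on test. The data below are placeholders, not the 42
# degradation paths from the case study.
rng = np.random.default_rng(0)
cycles_to_failure = rng.weibull(a=2.0, size=42) * 10_000   # placeholder data

def bootstrap_quantile(data, n_units, q=0.10, n_boot=5000):
    """Spread of the estimated q-th failure quantile when only n_units are tested."""
    estimates = [
        np.quantile(rng.choice(data, size=n_units, replace=True), q)
        for _ in range(n_boot)
    ]
    return np.percentile(estimates, [2.5, 97.5])

# How does test-plan size affect the confidence interval on the B10 life?
for n_units in (5, 10, 20, 42):
    lo, hi = bootstrap_quantile(cycles_to_failure, n_units)
    print(f"{n_units:2d} units on test: 95% CI for B10 life = [{lo:.0f}, {hi:.0f}] cycles")
```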

Keywords: degradation measure, time to failure distribution, bootstrap, computational science

Procedia PDF Downloads 531
2440 Finding the Theory of Riba Avoidance: A Scoping Review to Set the Research Agenda

Authors: Randa Ismail Sharafeddine

Abstract:

The Islamic economic system is distinctive in that it implicitly recognizes money as a separate, independent component of production capable of assuming risk and so entitled to the same reward as other Entrepreneurial Factors of Production (EFP). Conventional theory does not identify money capital explicitly as a component of production; rather, interest is recognized as a reward for capital, the interest rate is the cost of money capital, and it is also seen as a cost of physical capital. The conventional theory of production examines how diverse non-entrepreneurial resources (Land, Labor, and Capital) are selected; however, the economic theory community is largely unaware of the reasons why these resources choose to remain non-entrepreneurial as opposed to becoming entrepreneurial. Should land, labor, and financial asset owners choose to work for others in return for rent, income, or interest, or should they engage in entrepreneurial risk-taking in order to profit? This is a decision often made in the real world, but it has never been effectively treated in economic theory. This article conducts a critical analysis of the conventional classification of factors of production and proposes a classification for resource allocation and income distribution (Rent, Wages, Interest, and Profits) that is more rational, even within the conventional theoretical framework, for evaluating and developing production and distribution theories. Money is an essential component of production in an Islamic economy, and it must be used to sustain economic activity.

Keywords: financial capital, production theory, distribution theory, economic activity, riba avoidance, institution of participation

Procedia PDF Downloads 91
2439 Robust Shrinkage Principal Component Parameter Estimator for Combating Multicollinearity and Outliers’ Problems in a Poisson Regression Model

Authors: Arum Kingsley Chinedu, Ugwuowo Fidelis Ifeanyi, Oranye Henrietta Ebele

Abstract:

The Poisson regression model (PRM) is a nonlinear model that belongs to the exponential family of distributions. The PRM is suitable for studying count variables using appropriate covariates, and it sometimes experiences the problem of multicollinearity in the explanatory variables and outliers in the response variable. This study aims to address the problems of multicollinearity and outliers jointly in a Poisson regression model. We developed an estimator called the robust modified jackknife PCKL parameter estimator by combining the principal component estimator, the modified jackknife KL estimator and the transformed M-estimator to address both problems in a PRM. The superiority conditions for this estimator were established, and the properties of the estimator were also derived. The estimator inherits the characteristics of the combined estimators, making it efficient in addressing both problems; it will also be of immediate interest to the research community and advances this study in terms of novelty compared to other work undertaken in this area. The performance of the robust modified jackknife PCKL estimator was compared with that of other existing estimators using the mean squared error (MSE) as the evaluation criterion, through a Monte Carlo simulation study and the use of real-life data. The results of the analytical study show that the estimator outperformed the other estimators considered, having the smallest MSE across all sample sizes, levels of correlation, percentages of outliers and numbers of explanatory variables.

Keywords: jackknife modified KL, outliers, multicollinearity, principal component, transformed M-estimator

Procedia PDF Downloads 66
2438 The Effect of MOOC-Based Distance Education in Academic Engagement and Its Components on Kerman University Students

Authors: Fariba Dortaj, Reza Asadinejad, Akram Dortaj, Atena Baziyar

Abstract:

The aim of this study was to determine the effect of distance education (based on MOOCs) on the components of academic engagement of Kerman PNU students. The research used a quasi-experimental method with single-stage cluster sampling (one class in the experimental group and one class in the control group). The statistical population consists of students of Kerman Payam Noor University, from whom 40 were selected as the sample (20 students in the control group and 20 students in the experimental group). To test the hypotheses, univariate analysis of covariance was used to offset initial differences between the experimental group and the control group. The instrument used in this study is the academic engagement questionnaire of Zerang (2012), which contains cognitive, behavioral and motivational engagement components. The results showed that there is no significant difference between the mean scores of the academic engagement components in the experimental group and the control group on the post-test, after elimination of the pre-test effect. The adjusted mean scores of the components of academic engagement in the experimental group were higher than the adjusted mean scores in the control group after the test. The use of technology-based education in distance education has been effective in increasing cognitive, motivational and behavioral engagement among students. The experimental variable, with an effect size of 0.26, predicted 26% of the variance in the cognitive engagement component; with an effect size of 0.47, it predicted 47% of the variance in the motivational engagement component; and with an effect size of 0.40, it predicted 40% of the variance in the behavioral engagement component. Thus, teaching with technology (MOOCs) has a positive impact on the academic engagement and academic performance of students in educational technology. The results suggest that MOOC technology be used to enrich the teaching of other PNU courses.

Keywords: educational technology, distance education, components of academic engagement, MOOC technology

Procedia PDF Downloads 149
2437 Prediction of Childbearing Orientations According to Couples' Sexual Review Component

Authors: Razieh Rezaeekalantari

Abstract:

Objective: The purpose of this study was to investigate the prediction of parenting orientations in terms of the components of couples' sexual review. Methods: This was a descriptive correlational study. The population consisted of 500 couples referred to Sari Health Center. Two hundred and fifteen (215) people were selected randomly using the Krejcie and Morgan sample size table. For data collection, the childbearing orientations scale and the Multidimensional Sexual Self-Concept Questionnaire were used. Results: For data analysis, means and standard deviations were used, and the research hypothesis was analyzed with correlation, regression and inferential statistics. Conclusion: The findings indicate that there is not a significant relationship between the tendency toward childbearing and the predictive value of sexual review (r = 0.84, sig = 219.19, P < 0.05). So, with 95% confidence, we conclude that there is not a meaningful relationship between sexual orientation and the tendency toward child-rearing.

Keywords: couples referring, health center, sexual review component, parenting orientations

Procedia PDF Downloads 219
2436 The Effect of Biochar, Inoculated Biochar and Compost on the Biological Component of the Soil

Authors: Helena Dvořáčková, Mikajlo Irina, Záhora Jaroslav, Elbl Jakub

Abstract:

Biochar can be produced from waste matter, and its application has been associated with returning large amounts of carbon to the soil. The impacts of this material on the physical and chemical properties of soil have been described. Most of the research work is dedicated to the hypothesis that this material has toxic effects on soil life, i.e., on the soil biological component. At present, methods that could eliminate these undesirable properties of biochar are being developed. One of the possibilities is to mix biochar with organic material, such as compost, or to focus on accelerating natural processes in the soil. In the experiment, compost was added, and toxic substances were eliminated by promoting microbial activity in an aerated water environment. Biochar was aerated for 7 days in a container with a volume of 20 l. Biochar modified in this way yielded six times higher biomass production and reduced mineral nitrogen leaching. Better results were achieved by mixing biochar with compost.

Keywords: leaching of nitrogen, soil, biochar, compost

Procedia PDF Downloads 328
2435 Evaluation of Real-Time Background Subtraction Technique for Moving Object Detection Using Fast-Independent Component Analysis

Authors: Naoum Abderrahmane, Boumehed Meriem, Alshaqaqi Belal

Abstract:

The background subtraction algorithm is a widely used technique for detecting moving objects in video surveillance by extracting the foreground objects from a reference background image. There are many challenges in designing a good background subtraction algorithm, such as changes in illumination, dynamic backgrounds (e.g., swinging leaves, rain, snow), and changes in the background itself, for example, vehicles moving and stopping. In this paper, we propose an efficient and accurate background subtraction method for moving object detection in video surveillance. The main idea is to use a developed fast independent component analysis (ICA) algorithm to separate background, noise, and foreground masks from an image sequence in practical environments. The fast-ICA algorithm is adapted and adjusted with a matrix calculation and a search for an optimal non-quadratic function to make it faster and more robust. Moreover, in order to estimate the de-mixing matrix and the denoising de-mixing matrix parameters, we propose converting all images to the YCrCb color space, where the luma component Y (the brightness of the color) gives suitable results. The proposed technique has been verified on the publicly available CDnet 2012 and CDnet 2014 datasets, and experimental results show that our algorithm can detect moving objects competently and accurately in challenging conditions, compared to other methods in the literature, in terms of quantitative and qualitative evaluations, at a real-time frame rate.
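A rough sketch of the core idea, converting frames to YCrCb, working on the luma channel and letting FastICA isolate a quasi-static background component, is given below; the window length, component count and threshold are assumptions, not the tuned algorithm evaluated in the paper.

```python
import numpy as np
import cv2
from sklearn.decomposition import FastICA

# Rough sketch: stack the luma (Y) channel of a short frame window and let ICA
# separate a quasi-static background component from moving-object content.
# Component count and threshold are illustrative assumptions.
def foreground_masks(frames_bgr, n_components=3, thresh=30):
    luma = [cv2.cvtColor(f, cv2.COLOR_BGR2YCrCb)[:, :, 0] for f in frames_bgr]
    h, w = luma[0].shape
    X = np.stack([y.reshape(-1) for y in luma]).astype(float)   # (n_frames, n_pixels)

    ica = FastICA(n_components=n_components, random_state=0)
    activations = ica.fit_transform(X)                 # per-frame component weights

    # Treat the component whose weight varies least across frames as background.
    bg = int(np.argmin(activations.std(axis=0)))
    bg_only = np.zeros_like(activations)
    bg_only[:, bg] = activations[:, bg]
    background = ica.inverse_transform(bg_only).reshape(-1, h, w)

    return np.abs(X.reshape(-1, h, w) - background) > thresh    # boolean masks
```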

Keywords: background subtraction, moving object detection, fast-ICA, de-mixing matrix

Procedia PDF Downloads 96
2434 Optimization of the Mechanical Performance of Fused Filament Fabrication Parts

Authors: Iván Rivet, Narges Dialami, Miguel Cervera, Michele Chiumenti

Abstract:

Process parameters in Additive Manufacturing (AM) play a critical role in the mechanical performance of the final component. In order to find the input configuration that guarantees the optimal performance of the printed part, the process-performance relationship must be found. Fused Filament Fabrication (FFF) is the selected demonstrative AM technology due to its great popularity in the industrial manufacturing world. A material model that considers the different printing patterns present in an FFF part is used. A voxelized mesh is built from the manufacturing toolpaths described in the G-Code file. An Adaptive Mesh Refinement (AMR) based on the octree strategy is used in order to reduce the complexity of the mesh while maintaining its accuracy. High-fidelity and cost-efficient Finite Element (FE) simulations are performed, and the influence of key process parameters on the mechanical performance of the component is analyzed. A robust optimization process based on appropriate failure criteria is developed to find the printing direction that leads to the optimal mechanical performance of the component. The Tsai-Wu failure criterion is implemented due to the orthotropic and heterogeneous constitutive nature of FFF components and because of the differences between the strengths in tension and compression. The optimization loop implements a modified version of an Anomaly Detection (AD) algorithm and uses the computed metrics to obtain the optimal printing direction. The developed methodology is verified with a case study on an industrial demonstrator.
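For reference, the plane-stress form of the Tsai-Wu criterion mentioned above can be evaluated as in the sketch below; the strength values are placeholders, and the usual approximation for the interaction term F12 is assumed.

```python
import numpy as np

# Plane-stress Tsai-Wu failure index for one material point of an FFF part.
# Strengths below are placeholders; the real ones depend on filament, raster
# pattern and printing direction. F12 uses the common -0.5*sqrt(F11*F22)
# approximation in the absence of biaxial test data.
def tsai_wu_index(s1, s2, t12, Xt, Xc, Yt, Yc, S):
    F1, F2 = 1/Xt - 1/Xc, 1/Yt - 1/Yc
    F11, F22, F66 = 1/(Xt*Xc), 1/(Yt*Yc), 1/S**2
    F12 = -0.5 * np.sqrt(F11 * F22)
    return F1*s1 + F2*s2 + F11*s1**2 + F22*s2**2 + F66*t12**2 + 2*F12*s1*s2

# Failure is predicted when the index reaches 1. Example with assumed strengths
# (MPa) and a stress state taken from an FE result:
index = tsai_wu_index(s1=30.0, s2=10.0, t12=8.0,
                      Xt=45.0, Xc=40.0, Yt=30.0, Yc=35.0, S=25.0)
print(f"Tsai-Wu index: {index:.2f} -> {'fails' if index >= 1 else 'safe'}")
```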

Keywords: additive manufacturing, optimization, printing direction, mechanical performance, voxelization

Procedia PDF Downloads 63
2433 Lean Production to Increase Reproducibility and Work Safety in the Laser Beam Melting Process Chain

Authors: C. Bay, A. Mahr, H. Groneberg, F. Döpper

Abstract:

Additive Manufacturing processes are becoming increasingly established in industry for the economic production of complex prototypes and functional components. Laser beam melting (LBM), the most frequently used Additive Manufacturing technology for metal parts, has been gaining in industrial importance for several years. The LBM process chain – from material storage to machine set-up and component post-processing – requires many manual operations. These steps often depend on the manufactured component and are therefore not standardized; instead, they rely on the experience of the machine operator, e.g., when levelling the build plate and adjusting the first powder layer in the LBM machine. This lack of standardization limits the reproducibility of the component quality. When processing metal powders with inhalable and alveolar particle fractions, the machine operator is at high risk due to the high reactivity and the toxic (e.g., carcinogenic) effect of the various metal powders. Faulty execution of an operation or unintentional omission of safety-relevant steps can impair the health of the machine operator. In this paper, all the steps of the LBM process chain are first analysed in terms of their influence on the two aforementioned challenges: reproducibility and work safety. Standardization to avoid errors increases the reproducibility of component quality as well as the adherence to, and correct execution of, safety-relevant operations. The corresponding lean method, 5S, is therefore applied in order to develop approaches in the form of recommended actions that standardize the work processes. These approaches are then evaluated in terms of ease of implementation and their potential for improving reproducibility and work safety. The analysis and evaluation showed that sorting tools and spare parts, as well as standardizing the workflow, are likely to increase reproducibility. Organizing the operational steps and the production environment decreases the hazards of material handling and consequently improves work safety.

Keywords: additive manufacturing, lean production, reproducibility, work safety

Procedia PDF Downloads 184
2432 Statistical Wavelet Features, PCA, and SVM-Based Approach for EEG Signals Classification

Authors: R. K. Chaurasiya, N. D. Londhe, S. Ghosh

Abstract:

The study of the electrical signals produced by neural activities of the human brain is called electroencephalography. In this paper, we propose an automatic and efficient EEG signal classification approach, used to classify the EEG signal into two classes: epileptic seizure or not. In the proposed approach, we start by extracting features using the Discrete Wavelet Transform (DWT) in order to decompose the EEG signals into sub-bands. These features, extracted from the detail and approximation coefficients of the DWT sub-bands, are used as input to Principal Component Analysis (PCA). The classification is based on reducing the feature dimension using PCA and deriving the support vectors using a Support Vector Machine (SVM). The experiments are performed on a real and standard dataset, and a very high level of classification accuracy is obtained.
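A compact sketch of the described pipeline, DWT sub-band statistics followed by PCA reduction and an SVM classifier, is given below; the wavelet, decomposition level and number of retained components are assumptions rather than the paper's settings.

```python
import numpy as np
import pywt
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Sketch of the DWT + PCA + SVM pipeline. Wavelet choice ('db4'), level and
# PCA size are illustrative assumptions.
def dwt_features(segment, wavelet="db4", level=5):
    coeffs = pywt.wavedec(segment, wavelet, level=level)
    feats = []
    for c in coeffs:                        # approximation + detail sub-bands
        feats += [c.mean(), c.std(), np.abs(c).max(), np.sum(c**2)]
    return np.array(feats)

def build_classifier(segments, labels):
    # segments: list of 1-D EEG epochs; labels: seizure / non-seizure
    X = np.array([dwt_features(s) for s in segments])
    clf = make_pipeline(StandardScaler(), PCA(n_components=10), SVC(kernel="rbf"))
    return clf.fit(X, labels)
```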

Keywords: discrete wavelet transform, electroencephalogram, pattern recognition, principal component analysis, support vector machine

Procedia PDF Downloads 638
2431 Neuromarketing: Discovering the Somatic Marker in the Consumer's Brain

Authors: Mikel Alonso López, María Francisca Blasco López, Víctor Molero Ayala

Abstract:

The present study explains the somatic marker theory of Antonio Damasio, which indicates that, when making a decision, images of stored or possible future scenarios (future memory) allow people to feel for a moment what would happen if they made a given choice, and how this is emotionally marked. This process can be conscious or unconscious. The development of new neuromarketing techniques such as functional magnetic resonance imaging (fMRI) brings a greater understanding of how the brain functions and of consumer behavior. The results observed in different studies using fMRI suggest that the somatic marker and future memories influence the decision-making process, adding a positive or negative emotional component to the options. This would mean that all decisions involve a present emotional component, with a rational cost-benefit analysis that can be performed later.

Keywords: emotions, decision making, somatic marker, consumer's brain

Procedia PDF Downloads 403
2430 Thermodynamics of the Local Hadley Circulation Over Central Africa

Authors: Landry Tchambou Tchouongsi, Appolinaire Derbetini Vondou

Abstract:

This study describes the local Hadley circulation (HC) during the December-February (DJF) and June-August (JJA) seasons in Central Africa (CA), from the divergent component of the mean meridional wind and also from a new method called the variation of the ψ vector. Historical data from the ERA5 reanalysis for the period 1983 to 2013 were used. The results show that the maximum of the upward branch of the local Hadley circulation in the DJF and JJA seasons is located over the Congo Basin (CB). However, seasonal and horizontal variations in the mean temperature gradient and in thermodynamic properties are largely associated with the distribution of convection and large-scale upward motion. Thus, temperatures over the CB show only a slight variation between the DJF and JJA seasons. Moreover, the energy transport of moist static energy (MSE) adequately captures the mean flow component of the HC over the tropics. Finally, the divergence over the CB is enhanced by the presence of the western Cameroon low and the contribution of warm, dry air currents coming from the Sahara.
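For the divergent-component diagnostic mentioned above, the mean meridional mass stream function can be obtained by integrating the zonal-mean meridional wind in pressure, ψ(φ, p) = (2πa cos φ / g) ∫ [v] dp; a minimal sketch with placeholder grids (standing in for ERA5 fields over Central Africa) is given below.

```python
import numpy as np

A_EARTH, G = 6.371e6, 9.81          # Earth radius (m), gravity (m/s^2)

def hadley_stream_function(v_mean, lat_deg, p_hpa):
    """Mean meridional mass stream function (kg/s).

    v_mean : zonal-mean meridional wind, shape (n_levels, n_lats), ordered
             from the model top down to the surface (placeholder ERA5 slice).
    lat_deg, p_hpa : latitude and pressure grids of matching sizes.
    """
    p_pa = np.asarray(p_hpa, dtype=float) * 100.0
    coslat = np.cos(np.deg2rad(lat_deg))
    psi = np.zeros_like(v_mean)
    for j, c in enumerate(coslat):
        # Cumulative pressure integral of [v] from the top downward (trapezoid rule).
        integral = np.concatenate([[0.0], np.cumsum(
            0.5 * (v_mean[1:, j] + v_mean[:-1, j]) * np.diff(p_pa))])
        psi[:, j] = 2 * np.pi * A_EARTH * c / G * integral
    return psi
```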

Keywords: circulation, reanalysis, thermodynamics, local Hadley

Procedia PDF Downloads 89
2429 Incremental Learning of Independent Topic Analysis

Authors: Takahiro Nishigaki, Katsumi Nitta, Takashi Onoda

Abstract:

In this paper, we present a method for applying Independent Topic Analysis (ITA) to a growing collection of documents. The number of documents has been increasing since the spread of the Internet, and ITA was presented as one method to analyze such document data. ITA is a method for extracting independent topics from document data by using Independent Component Analysis (ICA), a technique from signal processing. However, it is difficult to apply ITA to a growing number of documents, because ITA must use all of the document data, so the temporal and spatial costs are very high. Therefore, we present Incremental ITA, which extracts independent topics from a growing number of documents. Incremental ITA updates the independent topics when new document data are added, starting from the topics extracted from the previous data. We show the results of applying Incremental ITA to benchmark datasets.
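As a small illustration of topic extraction with ICA (the batch step that Incremental ITA would update as documents arrive, rather than refitting on everything), the sketch below applies FastICA to a TF-IDF matrix of a toy corpus; the corpus and component count are placeholders, not the paper's benchmark setup.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import FastICA

# Toy corpus standing in for a real document collection.
docs = [
    "solar panels convert sunlight into electricity",
    "the striker scored twice in the football match",
    "wind turbines generate renewable electricity",
    "the referee stopped the football match early",
]
tfidf = TfidfVectorizer(stop_words="english")
X = tfidf.fit_transform(docs).toarray()            # (n_docs, n_terms)

ica = FastICA(n_components=2, random_state=0)
doc_topic = ica.fit_transform(X)                   # document-topic activations
topic_term = ica.mixing_.T                         # (n_topics, n_terms) loadings

terms = np.array(tfidf.get_feature_names_out())
for k, row in enumerate(topic_term):
    top = terms[np.argsort(np.abs(row))[::-1][:4]]
    print(f"topic {k}: {', '.join(top)}")
```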

Keywords: text mining, topic extraction, independent, incremental, independent component analysis

Procedia PDF Downloads 309