Search results for: Least squares
151 Design of Variable Fractional-Delay FIR Differentiators
Authors: Jong-Jy Shyu, Soo-Chang Pei, Min-Han Chang
Abstract:
In this paper, the least-squares design of variable fractional-delay (VFD) finite impulse response (FIR) digital differentiators is proposed. The transfer function is formulated so that the Farrow structure can be applied to realize the designed system. The symmetric characteristics of the filter coefficients are also derived, which reduces complexity by saving almost half of the coefficients. Moreover, all the elements of the vectors and matrices involved in the optimization can be expressed in closed form, which makes the design easier. A design example is presented to illustrate the effectiveness of the proposed method.
Keywords: Differentiator, variable fractional-delay filter, FIR filter, least-squares method, Farrow structure.
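As a rough illustration of the least-squares design step (for a fixed fractional delay only, not the paper's variable-delay Farrow formulation), the following Python sketch fits an FIR differentiator to the desired response D(w) = jw*exp(-jw*tau) on a passband grid; the filter length, delay and band edge are arbitrary assumptions.

import numpy as np

# Minimal sketch: fixed-delay least-squares FIR differentiator.
# (The paper designs *variable* fractional-delay differentiators with a
# Farrow structure; this only illustrates the basic least-squares step.)
N = 31                      # filter length (assumed)
tau = (N - 1) / 2 + 0.3     # desired group delay, including a 0.3-sample fraction
wp = 0.9 * np.pi            # passband edge (assumption)
w = np.linspace(0, wp, 512) # dense frequency grid on the passband

# Desired response of a differentiator with delay tau: D(w) = j*w*exp(-j*w*tau)
D = 1j * w * np.exp(-1j * w * tau)

# H(e^{jw}) = sum_n h[n] e^{-jwn}  ->  linear in h, so stack a complex LS system
E = np.exp(-1j * np.outer(w, np.arange(N)))
A = np.vstack([E.real, E.imag])          # split real/imaginary parts
b = np.concatenate([D.real, D.imag])
h, *_ = np.linalg.lstsq(A, b, rcond=None)

# Check: the fitted response should approximate D on the passband
err = np.max(np.abs(E @ h - D))
print("max complex approximation error on passband:", err)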
150 Assessment Tool for Social Responsibility Performance According to the ISO 26000
Authors: W. Fethallah, L. Chraibi, N. Sefiani
Abstract:
The present paper is concerned with a statistical approach involving latent and manifest variables, applied in order to assess an organization's social responsibility performance. The main idea is to develop an assessment tool and a measurement of Social Responsibility Performance, enabling the company to characterize its performance with respect to the seven core subjects of the ISO 26000 standard. For this, we conceptualize a structural equation model (SEM) that describes various causal connections between the components of Social Responsibility. The SEM is solved using the Partial Least Squares (PLS) method, and the implementation is carried out in the XLSTAT software.
Keywords: Corporate social responsibility, latent and manifest variable, partial least squares, structural equation model.
149 Block Cipher Based on Randomly Generated Quasigroups
Authors: Deepthi Haridas, S Venkataraman, Geeta Varadan
Abstract:
Quasigroups are algebraic structures closely related to Latin squares and have many different applications. The construction of the block cipher is based on quasigroup string transformations. This article describes a block cipher based on a quasigroup of order 256, suitable for fast software encryption of messages written in the universal ASCII code. The novelty of this cipher lies in the fact that every time the cipher is invoked, a new pair of randomly generated quasigroups is used, which in turn is used to create a pair of quasigroups with dual operations. The cryptographic strength of the block cipher is examined by calculating the XOR-distribution tables. In this approach, certain algebraic operations allow quasigroups of huge order to be used without any requirement that they be stored.
Keywords: Quasigroups, Latin squares, block cipher, quasigroup string transformations.
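A minimal Python sketch of the quasigroup string e-transformation that such ciphers build on is given below; the order-256 quasigroup here is obtained by permuting a cyclic Latin square, which is only a stand-in for the paper's random quasigroup generator and its pair of dual operations.

import numpy as np

rng = np.random.default_rng(1)
n = 256
# Toy quasigroup of order 256: start from the cyclic Latin square
# L[a][b] = (a + b) mod n and shuffle rows/columns/symbols (an isotopy),
# which keeps the Latin-square property.  This is only illustrative; the
# paper uses its own random quasigroup generator and a dual pair.
base = (np.arange(n)[:, None] + np.arange(n)[None, :]) % n
r, c, s = rng.permutation(n), rng.permutation(n), rng.permutation(n)
Q = s[base[np.ix_(r, c)]]                 # quasigroup operation a*b = Q[a, b]

# Left division a \ b, needed for decryption: Q[a, Ldiv[a, b]] == b
Ldiv = np.empty_like(Q)
for a in range(n):
    Ldiv[a, Q[a]] = np.arange(n)

def e_transform(msg, leader):
    """Quasigroup string e-transformation: c[i] = c[i-1] * m[i]."""
    out, prev = [], leader
    for m in msg:
        prev = Q[prev, m]
        out.append(prev)
    return bytes(out)

def d_transform(ct, leader):
    """Inverse transformation: m[i] = c[i-1] left-divided into c[i]."""
    out, prev = [], leader
    for c_ in ct:
        out.append(Ldiv[prev, c_])
        prev = c_
    return bytes(out)

msg = b"least squares"
ct = e_transform(msg, leader=97)
assert d_transform(ct, leader=97) == msg
print(ct.hex())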
148 Application of the Total Least Squares Estimation Method for an Aircraft Aerodynamic Model Identification
Authors: Zaouche Mohamed, Amini Mohamed, Foughali Khaled, Aitkaid Souhila, Bouchiha Nihad Sarah
Abstract:
The aerodynamic coefficients are important in the evaluation of an aircraft's performance and stability-control characteristics. These coefficients can also be used in automatic flight control systems and in the mathematical model of a flight simulator. The study of the aerodynamic aspects of flying systems is a restricted domain, largely inaccessible to developers. Performing wind-tunnel tests to extract aerodynamic forces and moments requires specific and expensive facilities. Besides, the glaring lack of published documentation in this field of study makes the determination of aerodynamic coefficients complicated. This work is devoted to the identification of an aerodynamic model using an aircraft in a virtual simulated environment. We deal with the identification of the system, present an environment framework based on the Software-In-the-Loop (SIL) methodology, and use Microsoft Flight Simulator (FS-2004) as the environment for aircraft simulation. We propose the Total Least Squares Estimation (TLSE) technique to identify the aerodynamic parameters, which are unknown, variable, classified and used in the expression of the piloting law. In this paper, we define each aerodynamic coefficient as the mean of its numerical values. All other variations are considered as modeling uncertainties that will be compensated by the robustness of the piloting control.
Keywords: Aircraft aerodynamic model, Microsoft flight simulator, MQ-1 Predator, total least squares estimation, piloting the aircraft.
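For readers unfamiliar with the estimator itself, the following Python sketch shows generic total least squares for a linear model with noisy regressors, computed from the SVD of the augmented data matrix; the data are synthetic and the aerodynamic identification pipeline is not reproduced.

import numpy as np

# Minimal sketch of total least squares (TLS) for y ~ X @ theta when both
# X and y contain noise, via the SVD of the augmented matrix [X | y].
rng = np.random.default_rng(0)
n, p = 500, 3
theta_true = np.array([0.8, -1.5, 2.0])
X_clean = rng.normal(size=(n, p))
y_clean = X_clean @ theta_true
X = X_clean + 0.05 * rng.normal(size=(n, p))   # noisy regressors
y = y_clean + 0.05 * rng.normal(size=n)        # noisy output

# Ordinary least squares for comparison
theta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# TLS: right singular vector of [X | y] with the smallest singular value
Z = np.column_stack([X, y])
_, _, Vt = np.linalg.svd(Z, full_matrices=False)
v = Vt[-1]                        # direction of smallest singular value
theta_tls = -v[:p] / v[p]         # valid when v[p] != 0

print("OLS:", theta_ols)
print("TLS:", theta_tls)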
147 The Inverse Problem of Nonsymmetric Matrices with a Submatrix Constraint and its Approximation
Authors: Yongxin Yuan, Hao Liu
Abstract:
In this paper, we first give the representation of the general solution of the following least-squares problem (LSP): given matrices X ∈ R^(n×p), B ∈ R^(p×p) and A0 ∈ R^(r×r), find a matrix A ∈ R^(n×n) such that ‖X^T A X − B‖ = min, subject to A([1, r]) = A0, where A([1, r]) is the r×r leading principal submatrix of the matrix A. We then consider a best approximation problem: given an n × n matrix Ã with Ã([1, r]) = A0, find Â ∈ SE such that ‖Ã − Â‖ = min_{A∈SE} ‖Ã − A‖, where SE is the solution set of the LSP. We show that the best approximation solution Â is unique and derive an explicit formula for it.
Keywords: Inverse problem, Least-squares solution, model updating, Singular value decomposition (SVD), Optimal approximation.
146 Alternative Robust Estimators for the Shape Parameters of the Burr XII Distribution
Authors: F. Z. Doğru, O. Arslan
Abstract:
In general, classical methods such as maximum likelihood (ML) and least squares (LS) estimation are used to estimate the shape parameters of the Burr XII distribution. However, these estimators are very sensitive to outliers. To overcome this problem, we propose alternative robust estimators based on the M-estimation method for the shape parameters of the Burr XII distribution. We provide a small simulation study and a real data example to illustrate the performance of the proposed estimators relative to the ML and LS estimators. The simulation results show that the proposed robust estimators generally outperform the classical estimators in terms of bias and root mean square error when there are outliers in the data.
Keywords: Burr XII distribution, robust estimator, M-estimator, maximum likelihood, least squares.
145 Performance Analysis of Adaptive OFDM Pre- and Post-FFT Beamforming System
Authors: S. Elnobi, Iman El-Zahaby, Amr M. Mahros
Abstract:
In mobile communication systems, performance and capacity are affected by multipath fading, delay spread and Co-Channel Interference (CCI). For this reason, Orthogonal Frequency Division Multiplexing (OFDM) combined with an adaptive antenna array is required. The goal of OFDM is to improve the system performance against Inter-Symbol Interference (ISI). An array of adaptive antennas has been employed to suppress CCI by spatial techniques. To suppress CCI in OFDM systems, two main schemes, the pre-FFT and the post-FFT, have been proposed. In this paper, through a system-level simulation, the behavior of the pre-FFT and post-FFT beamformers for OFDM systems has been investigated based on two algorithms, namely Least Mean Squares (LMS) and Recursive Least Squares (RLS). The performance of the system is also discussed for the multipath fading channel specified by 3GPP Long Term Evolution (LTE).
Keywords: OFDM, Beamforming, Adaptive Antennas Array.
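The LMS weight update used by such beamformers is the standard one; a minimal Python sketch on a synthetic system-identification problem (not the pre/post-FFT array processing itself) is given below.

import numpy as np

# Minimal LMS sketch: identify an unknown FIR system from input/desired
# pairs.  The paper applies LMS (and RLS) weight updates to antenna-array
# snapshots before or after the FFT; the update rule is the same.
rng = np.random.default_rng(0)
h_true = np.array([0.7, -0.4, 0.2])          # unknown system (assumption)
x = rng.normal(size=5000)                    # input signal
d = np.convolve(x, h_true, mode="full")[:len(x)] + 0.01 * rng.normal(size=len(x))

M, mu = 3, 0.05                              # filter length, step size
w = np.zeros(M)
for n in range(M, len(x)):
    u = x[n - M + 1:n + 1][::-1]             # most recent M samples
    e = d[n] - w @ u                         # a-priori error
    w = w + mu * e * u                       # LMS update
print("estimated taps:", w)                  # should approach h_true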
144 Accurate Dimensional Measurement of 3D Round Holes Based on Stereo Vision
Authors: Zhiguo Ren, Lilong Cai
Abstract:
This paper presents an effective method to accurately reconstruct and measure the 3D curve edges of small industrial parts based on stereo vision. To effectively fit the curve of the measured parts using a series of line segments in the images, a coarse-to-fine strategy is employed based on multi-scale curve fitting. After reconstructing the 3D curve of a hole through a curved surface, its axis is adjusted so that it is parallel to the Z axis with least squares error, and the dimensions of the hole can then be calculated easily on the XY plane. Experimental results show that the presented method can accurately measure the dimensions of round holes through a curved surface.
Keywords: Stereo Vision, 3D Round Hole Measurement, Curve Fitting, 3D Curve Reconstruction, Least Squares Error.
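The axis-alignment step mentioned above can be illustrated with a least-squares line fit: the sketch below (synthetic points, not the stereo reconstruction) extracts the best-fit 3D direction by SVD and builds the rotation that maps it onto the Z axis.

import numpy as np

# Minimal sketch of the alignment step: fit the least-squares direction of a
# set of reconstructed 3D points and build a rotation mapping it onto Z.
def axis_to_z_rotation(points):
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, Vt = np.linalg.svd(pts - centroid, full_matrices=False)
    axis = Vt[0] / np.linalg.norm(Vt[0])       # principal direction = best-fit axis

    z = np.array([0.0, 0.0, 1.0])
    v = np.cross(axis, z)
    c = float(axis @ z)
    if np.isclose(c, -1.0):                    # axis anti-parallel to Z
        return np.diag([1.0, -1.0, -1.0]), centroid
    K = np.array([[0, -v[2], v[1]], [v[2], 0, -v[0]], [-v[1], v[0], 0]])
    R = np.eye(3) + K + K @ K / (1.0 + c)      # Rodrigues formula
    return R, centroid

# Noisy points sampled along a tilted axis
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 50)
axis_true = np.array([0.3, 0.1, 1.0]) / np.linalg.norm([0.3, 0.1, 1.0])
pts = np.outer(t, axis_true) + 0.001 * rng.normal(size=(50, 3))
R, c0 = axis_to_z_rotation(pts)
print("rotated axis:", R @ axis_true)          # close to [0, 0, +/-1]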
143 Model Predictive Fuzzy Control of Air-ratio for Automotive Engines
Authors: Hang-cheong Wong, Pak-kin Wong, Chi-man Vong, Zhengchao Xie, Shaojia Huang
Abstract:
Among all engine control variables, the automotive engine air-ratio plays an important role in reducing emissions and fuel consumption while maintaining satisfactory engine power. In order to control the air-ratio effectively, this paper presents a model predictive fuzzy control algorithm based on an online least-squares support vector machine prediction model and a fuzzy logic optimizer. The proposed control algorithm was also implemented on a real car for testing, and the results are highly satisfactory. Experimental results show that the proposed control algorithm can regulate the engine air-ratio to the stoichiometric value, 1.0, under external disturbance with less than 5% tolerance.
Keywords: Air-ratio, fuzzy logic, online least-squares support vector machine, model predictive control.
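The prediction-model ingredient, least-squares SVM regression, reduces training to a single linear system; the Python sketch below illustrates this on synthetic data with an RBF kernel and assumed hyperparameters, without the fuzzy-logic model predictive layer.

import numpy as np

# Minimal LS-SVM regression sketch with an RBF kernel: training reduces to
# one linear system.  This is only the prediction-model ingredient; the
# paper's fuzzy-logic model predictive controller on top of it is not shown.
def rbf(A, B, sigma=0.5):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(80, 1))
y = np.sin(3 * X[:, 0]) + 0.05 * rng.normal(size=80)

gamma = 100.0                      # regularization (assumption)
K = rbf(X, X)
n = len(y)
A = np.zeros((n + 1, n + 1))
A[0, 1:] = 1.0                     # [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
A[1:, 0] = 1.0
A[1:, 1:] = K + np.eye(n) / gamma
rhs = np.concatenate([[0.0], y])
sol = np.linalg.solve(A, rhs)
b, alpha = sol[0], sol[1:]

Xtest = np.linspace(-1, 1, 5)[:, None]
y_pred = rbf(Xtest, X) @ alpha + b
print(y_pred, np.sin(3 * Xtest[:, 0]))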
142 Equity Risk Premiums and Risk Free Rates in Modelling and Prediction of Financial Markets
Authors: Mohammad Ghavami, Reza S. Dilmaghani
Abstract:
This paper presents an adaptive framework for modelling financial markets using equity risk premiums, risk-free rates and volatilities. The recorded economic factors are initially used to train four adaptive filters for a certain limited period of time in the past. Once the systems are trained, the adjusted coefficients are used for modelling and prediction of an important financial market index. Two different approaches based on the least mean squares (LMS) and recursive least squares (RLS) algorithms are investigated. A performance analysis of each method in terms of the mean squared error (MSE) is presented and the results are discussed. Computer simulations carried out using recorded data show MSEs of 4% and 3.4% for next-month prediction using the LMS and RLS adaptive algorithms, respectively. For twelve-month prediction, the RLS method shows better trend estimation than the LMS algorithm.
Keywords: Prediction of financial markets, adaptive methods, MSE, LSE.
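The exponentially weighted RLS recursion referred to above is standard; the following Python sketch adapts a three-tap linear predictor on synthetic data (stand-ins for the recorded economic factors).

import numpy as np

# Standard exponentially weighted RLS sketch: adapt the coefficients of a
# linear predictor sample by sample.
rng = np.random.default_rng(0)
T, M = 400, 3
U = rng.normal(size=(T, M))                   # regressor vectors (stand-in data)
w_true = np.array([0.5, -0.2, 0.9])
d = U @ w_true + 0.02 * rng.normal(size=T)    # target series

lam, delta = 0.99, 100.0                      # forgetting factor, init scale
w = np.zeros(M)
P = delta * np.eye(M)                         # inverse correlation estimate
for t in range(T):
    u = U[t]
    k = P @ u / (lam + u @ P @ u)             # gain vector
    e = d[t] - w @ u                          # a-priori error
    w = w + k * e
    P = (P - np.outer(k, u @ P)) / lam
print("RLS estimate:", w)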
141 Unit Root Tests Based On the Robust Estimator
Authors: Wararit Panichkitkosolkul
Abstract:
Unit root tests based on a robust estimator for the first-order autoregressive process are proposed and compared with unit root tests based on the ordinary least squares (OLS) estimator. The percentiles of the null distributions of the unit root tests are also reported. The empirical probabilities of Type I error and the powers of the unit root tests are estimated via Monte Carlo simulation. Simulation results show that all unit root tests can control the probability of Type I error in all situations. The empirical power of the unit root tests based on the robust estimator is higher than that of the unit root tests based on the OLS estimator.
Keywords: Autoregressive, Ordinary least squares, Type I error, Power of the test, Monte Carlo simulation.
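A minimal Python sketch of an OLS-based unit root test for an AR(1) process is shown below; the null critical value is obtained by Monte Carlo simulation rather than from published tables, and the robust-estimator variants proposed in the paper are not included.

import numpy as np

# Sketch of a unit root test based on the OLS estimator of an AR(1) process
# y_t = rho * y_{t-1} + e_t.  The statistic is the usual t-ratio of
# (rho_hat - 1); its Dickey-Fuller-type null distribution is tabulated here
# by Monte Carlo.
rng = np.random.default_rng(0)

def ols_unit_root_stat(y):
    x, z = y[:-1], y[1:]
    rho = (x @ z) / (x @ x)                       # OLS slope without intercept
    resid = z - rho * x
    s2 = resid @ resid / (len(z) - 1)
    se = np.sqrt(s2 / (x @ x))
    return (rho - 1.0) / se

# Null distribution of the statistic under rho = 1 (random walk)
n, reps = 200, 2000
null_stats = np.array([ols_unit_root_stat(np.cumsum(rng.normal(size=n)))
                       for _ in range(reps)])
crit5 = np.percentile(null_stats, 5)              # 5% left-tail critical value

# Test a stationary series (rho = 0.8): should often reject the unit root
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.8 * y[t - 1] + rng.normal()
stat = ols_unit_root_stat(y)
print("statistic:", stat, "5% critical value:", crit5, "reject:", stat < crit5)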
140 Online Prediction of Nonlinear Signal Processing Problems Based on Kernel Adaptive Filtering
Authors: Hamza Nejib, Okba Taouali
Abstract:
This paper presents two of the best-known kernel adaptive filtering (KAF) approaches, the kernel least mean squares (KLMS) and the kernel recursive least squares (KRLS), in order to predict new outputs of nonlinear signal processing problems. Both of these methods implement a nonlinear transfer function using kernel methods in a particular space named the reproducing kernel Hilbert space (RKHS), where the model is a linear combination of kernel functions that transform the observed data from the input space to a high-dimensional feature space of vectors; this idea is known as the kernel trick. KAF thus develops the filters in the RKHS. We use two nonlinear signal processing problems, Mackey-Glass chaotic time series prediction and nonlinear channel equalization, to assess the performance of the presented approaches and, finally, to determine which of them is better adapted.
Keywords: KAF, KLMS, KRLS, kernel methods, RKHS, online prediction, signal processing.
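The KLMS recursion, in its simplest form (no sparsification or novelty criterion), is sketched below in Python on a simple synthetic nonlinear series rather than the Mackey-Glass data.

import numpy as np

# Basic kernel least mean squares (KLMS) sketch with a Gaussian kernel: the
# filter is a growing sum of kernels centred at past inputs.
rng = np.random.default_rng(0)
T = 1000
x = np.zeros(T)
for t in range(1, T):
    x[t] = 0.8 * x[t - 1] - 0.5 * x[t - 1] ** 3 + 0.1 * rng.normal()

L, eta, sigma = 5, 0.2, 0.5          # embedding length, step size, kernel width
def kernel(a, B):
    return np.exp(-((B - a) ** 2).sum(axis=1) / (2 * sigma ** 2))

centers, coeffs, errors = [], [], []
for t in range(L, T):
    u = x[t - L:t]                                   # input vector (past L samples)
    y_hat = 0.0
    if centers:
        y_hat = float(kernel(u, np.array(centers)) @ np.array(coeffs))
    e = x[t] - y_hat                                 # prediction error
    centers.append(u.copy())
    coeffs.append(eta * e)                           # KLMS update: a_t = eta * e_t
    errors.append(e ** 2)
print("mean squared error, first vs last 100 steps:",
      np.mean(errors[:100]), np.mean(errors[-100:]))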
139 Design of Two-Channel Quadrature Mirror Filter Banks Using Digital All-Pass Filters
Authors: Ju-Hong Lee, Yi-Lin Shieh
Abstract:
The paper deals with the minimax design of two-channel linear-phase (LP) quadrature mirror filter (QMF) banks using infinite impulse response (IIR) digital all-pass filters (DAFs). Based on the theory of two-channel QMF banks using two IIR DAFs, the design problem is appropriately formulated to result in an appropriate Chebyshev approximation for the desired group delay responses of the IIR DAFs and the magnitude response of the low-pass analysis filter. Through a frequency sampling and iterative approximation method, the design problem can be solved by utilizing a weighted least squares approach. The resulting two-channel QMF banks can possess approximately LP response without magnitude distortion. Simulation results are presented for illustration and comparison.
Keywords: Chebyshev approximation, Digital All-Pass Filter, Quadrature Mirror Filter, Weighted Least Squares.
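The inner weighted least squares solve used by such design iterations is generic; a small Python sketch is given below, with the actual QMF-bank error formulation replaced by an arbitrary synthetic system.

import numpy as np

# Generic weighted least squares step, the inner solve of WLS-type
# filter-design iterations: minimize || W^(1/2) (A x - b) ||_2.
def weighted_lstsq(A, b, w):
    sw = np.sqrt(w)
    x, *_ = np.linalg.lstsq(A * sw[:, None], b * sw, rcond=None)
    return x

rng = np.random.default_rng(0)
A = rng.normal(size=(200, 10))
x_true = rng.normal(size=10)
b = A @ x_true + 0.1 * rng.normal(size=200)
w = np.ones(200)
w[150:] = 10.0                     # emphasize the last 50 equations (e.g. a stopband)
print(weighted_lstsq(A, b, w))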
138 Mean-Square Performance of Adaptive Filter Algorithms in Nonstationary Environments
Authors: Mohammad Shams Esfand Abadi, John Hakon Husøy
Abstract:
Employing a recently introduced unified adaptive filter theory, we show how the performance of a large number of important adaptive filter algorithms can be predicted within a general framework in nonstationary environments. This approach is based on energy conservation arguments and does not need to assume a Gaussian or white distribution for the regressors. This general performance analysis can be used to evaluate the mean-square performance of the Least Mean Squares (LMS) algorithm, its normalized version (NLMS), the family of Affine Projection Algorithms (APA), the Recursive Least Squares (RLS), the Data-Reusing LMS (DR-LMS), its normalized version (NDR-LMS), the Block Least Mean Squares (BLMS), the Block Normalized LMS (BNLMS), the Transform Domain Adaptive Filters (TDAF) and the Subband Adaptive Filters (SAF) in nonstationary environments. We also establish general expressions for the steady-state excess mean square error in this environment for all these adaptive algorithms. Finally, we demonstrate through simulations that these results are useful in predicting adaptive filter performance.
Keywords: Adaptive filter, general framework, energy conservation, mean-square performance, nonstationary environment.
137 Evaluation of Model Evaluation Criterion for Software Development Effort Estimation
Authors: S. K. Pillai, M. K. Jeyakumar
Abstract:
Estimation of model parameters is necessary to predict the behavior of a system. Model parameters are estimated using optimization criteria, and most algorithms use historical data to estimate them. The known target (actual) values and the output produced by the model are compared, and the differences between the two form the basis for estimating the parameters. In order to compare different models developed using the same data, different criteria are used. Data obtained for small-scale projects are used here. We consider the software effort estimation problem using a radial basis function network. The accuracy comparison is made using various existing criteria for one and two predictors. We then propose a new criterion based on linear least squares for evaluation and compare the results for one and two predictors. We have also considered another data set and evaluated prediction accuracy using the new criterion. The new criterion is easier to comprehend than a single statistic. Although software effort estimation is considered here, this method is applicable to any modeling and prediction task.
Keywords: Software effort estimation, accuracy, Radial Basis Function, linear least squares.
136 Iterative Image Reconstruction for Sparse-View Computed Tomography via Total Variation Regularization and Dictionary Learning
Authors: XianYu Zhao, JinXu Guo
Abstract:
Recently, low-dose computed tomography (CT) has become highly desirable due to increasing attention to the potential risks of excessive radiation. For low-dose CT imaging, ensuring image quality while reducing radiation dose is a major challenge. To facilitate low-dose CT imaging, we propose an improved statistical iterative reconstruction scheme based on the Penalized Weighted Least Squares (PWLS) criterion combined with total variation (TV) minimization and sparse dictionary learning (DL) to improve reconstruction performance. We call this method "PWLS-TV-DL". In order to evaluate the PWLS-TV-DL method, we performed experiments on digital phantoms and physical phantoms, respectively. The experimental results show that our method is superior to other methods in image quality and computational efficiency, which confirms its potential for low-dose CT imaging.
Keywords: Low-dose computed tomography, penalized weighted least squares, total variation, dictionary learning.
135 Using the Technology Acceptance Model to Examine Seniors’ Attitudes toward Facebook
Authors: Chien-Jen Liu, Shu Ching Yang
Abstract:
Using the technology acceptance model (TAM), this study examined the external variables of technological complexity (TC) to acquire a better understanding of the factors that influence the acceptance of computer application courses by learners at Active Aging Universities. After the learners in this study had completed a 27-hour Facebook course, 44 learners responded to a modified TAM survey. Data were collected to examine the path relationships among the variables that influence the acceptance of Facebook-mediated community learning. The partial least squares (PLS) method was used to test the measurement and the structural model. The study results demonstrated that attitudes toward Facebook use directly influence behavioral intentions (BI) with respect to Facebook use, evincing a high prediction rate of 58.3%. In addition to the perceived usefulness (PU) and perceived ease of use (PEOU) measures that are proposed in the TAM, other external variables, such as TC, also indirectly influence BI. These four variables can explain 88% of the variance in BI and demonstrate a high level of predictive ability. Finally, limitations of this investigation and implications for further research are discussed.
Keywords: Technology acceptance model (TAM), technological complexity, partial least squares (PLS), perceived usefulness.
134 Construction of Space-Filling Designs for Three Input Variables Computer Experiments
Authors: Kazeem A. Osuolale, Waheed B. Yahya, Babatunde L. Adeleke
Abstract:
Latin hypercube designs (LHDs) have been applied in many computer experiments among the space-filling designs found in the literature. An LHD can be generated randomly, but a randomly chosen LHD may have poor properties and thus perform poorly in estimation and prediction. There is a connection between Latin squares and orthogonal arrays (OAs). A Latin square of order s is an arrangement of s symbols in s rows and s columns such that every symbol occurs exactly once in each row and once in each column, and such a square exists for every positive integer s. In this paper, a computer program was written to construct orthogonal array-based Latin hypercube designs (OA-LHDs). Orthogonal arrays were constructed from a Latin square of order s, and the OAs constructed were afterwards used to construct the desired Latin hypercube designs for three input variables for use in computer experiments. The LHDs constructed have better space-filling properties, and they can be used in computer experiments that involve only three input factors. The MATLAB 2012a computer package (www.mathworks.com/) was used for the development of the program that constructs the designs.
Keywords: Computer experiments, Latin squares, Latin hypercube designs, orthogonal array, space-filling designs.
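An illustrative Python analogue of the construction is sketched below: the (row, column, symbol) triples of a Latin square of order s form an orthogonal array OA(s^2, 3, s, 2) of strength two, and each OA column is then expanded into s^2 distinct levels in the usual OA-based LHD manner. This is not the authors' MATLAB program, only a small stand-in.

import numpy as np

# Sketch of an orthogonal-array-based Latin hypercube design (OA-LHD) for
# three input variables, starting from a cyclic Latin square.
rng = np.random.default_rng(0)
s = 5
L = (np.arange(s)[:, None] + np.arange(s)[None, :]) % s   # cyclic Latin square

# OA with s^2 runs and 3 factors at s levels, strength 2
oa = np.array([[i, j, L[i, j]] for i in range(s) for j in range(s)])

# Expand each column: the s runs at level j get a random permutation of
# {j*s, ..., j*s + s - 1}, giving s^2 distinct levels per factor.
n = s * s
lhd = np.empty_like(oa)
for col in range(3):
    for level in range(s):
        idx = np.where(oa[:, col] == level)[0]
        lhd[idx, col] = level * s + rng.permutation(s)

# Map to the unit cube (midpoints of the s^2 cells)
design = (lhd + 0.5) / n
assert all(len(set(lhd[:, c])) == n for c in range(3))    # Latin hypercube property
print(design[:5])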
133 A Mesh Free Moving Node Method To Analyze Flow Through Spirals of Orbiting Scroll Pump
Authors: I. Banerjee, A. K. Mahendra, T. K. Bera, B. G. Chandresh
Abstract:
The scroll pump belongs to the category of positive displacement pumps and can be used for continuous pumping of gases at low pressure, apart from general vacuum applications. The shape of the volume occupied by the gas moves and deforms continuously as the spiral orbits. To capture flow features in such a domain, where the mesh deformation varies with time in a complicated manner, a meshless solver was found to be very useful. The Least Squares Kinetic Upwind Method (LSKUM) is a kinetic-theory-based mesh-free Euler solver working on an arbitrary distribution of points. Here, upwinding is enforced at the molecular level based on the kinetic flux vector splitting (KFVS) scheme. In the present study we extend LSKUM to moving-node viscous flow applications. The new code for moving-node viscous flow, LSKUM-NS-MN, is validated on a standard airfoil pitching test case. Simulations of the flow through the scroll pump using the LSKUM-NS-MN code agree well with experimental pumping speed data.
Keywords: Least squares, moving node, pitching, spirals.
132 Evaluation of Best-Fit Probability Distribution for Prediction of Extreme Hydrologic Phenomena
Authors: Karim Hamidi Machekposhti, Hossein Sedghi
Abstract:
Probability distributions are the best method for forecasting extreme hydrologic phenomena such as rainfall and flood flows. In this research, in order to determine suitable probability distributions for estimating annual extreme rainfall and flood flow (discharge) series with different return periods, 40 years of precipitation data and 58 years of discharge data were collected from the Karkheh River in Iran. After homogeneity and adequacy tests, the data were analyzed with the Stormwater Management and Design Aid (SMADA) software using the residual sum of squares (R.S.S.). The best probability distribution was Log Pearson Type III, with R.S.S. values of 145.91 and 13.67 for peak discharge, and Log Pearson Type III with R.S.S. values of 141.08 and 8.95 for maximum discharge, at the Jelogir Majin and Pole Zal stations, respectively. The best distribution for maximum precipitation at the Jelogir Majin and Pole Zal stations was the Log Pearson Type III distribution, with R.S.S. values of 1.74 and 1.90, followed by the Pearson Type III distribution, with R.S.S. values of 1.53 and 1.69. Overall, the Log Pearson Type III distributions are acceptable distribution types for representing the statistics of extreme hydrologic phenomena in the Karkheh River in Iran, with the Pearson Type III distribution as a potential alternative.
Keywords: Karkheh river, log pearson type III, probability distribution, residual sum of squares.
131 Optimal Design of Two-Channel Recursive Parallelogram Quadrature Mirror Filter Banks
Authors: Ju-Hong Lee, Yi-Lin Shieh
Abstract:
This paper deals with the optimal design of two-channel recursive parallelogram quadrature mirror filter (PQMF) banks. The analysis and synthesis filters of the PQMF bank are composed of two-dimensional (2-D) recursive digital all-pass filters (DAFs) with nonsymmetric half-plane (NSHP) support region. The design problem can be facilitated by using the 2-D doubly complementary half-band (DC-HB) property possessed by the analysis and synthesis filters. For finding the coefficients of the 2-D recursive NSHP DAFs, we appropriately formulate the design problem to result in an optimization problem that can be solved by using a weighted least-squares (WLS) algorithm in the minimax (L∞) optimal sense. The designed 2-D recursive PQMF bank achieves perfect magnitude response and possesses satisfactory phase response without requiring extra phase equalizer. Simulation results are also provided for illustration and comparison.
Keywords: Parallelogram Quadrature Mirror Filter Bank, Doubly Complementary Filter, Nonsymmetric Half-Plane Filter, Weighted Least Squares Algorithm, Digital All-Pass Filter.
130 Analytical Authentication of Butter Using Fourier Transform Infrared Spectroscopy Coupled with Chemometrics
Authors: M. Bodner, M. Scampicchio
Abstract:
Fourier Transform Infrared (FT-IR) spectroscopy coupled with chemometrics was used to distinguish between butter samples and non-butter samples. Further, quantification of the content of margarine in adulterated butter samples was investigated. The fingerprinting region (1400-800 cm⁻¹) was used to develop unsupervised pattern recognition (Principal Component Analysis, PCA), supervised modeling (Soft Independent Modelling by Class Analogy, SIMCA), classification (Partial Least Squares Discriminant Analysis, PLS-DA) and regression (Partial Least Squares Regression, PLS-R) models. PCA of the fingerprinting region shows a clustering of the two sample types. All samples were classified into their rightful class by the SIMCA approach; however, nine adulterated samples (between 1% and 30% w/w of margarine) were classified as belonging to both the butter class and the non-butter class. The two-class PLS-DA model (R² = 0.73; RMSEP, Root Mean Square Error of Prediction, = 0.26% w/w) had a sensitivity of 71.4% and a Positive Predictive Value (PPV) of 100%. Its threshold was calculated at 7% w/w of margarine in adulterated butter samples. Finally, a PLS-R model (R² = 0.84, RMSEP = 16.54%) was developed. PLS-DA was a suitable classification tool and PLS-R a proper quantification approach. The results demonstrate that FT-IR spectroscopy combined with PLS-R can be used as a rapid, simple and safe method to distinguish pure butter samples from adulterated ones and to determine the degree of adulteration of butter samples with margarine.
Keywords: Adulterated butter, margarine, PCA, PLS-DA, PLS-R, SIMCA.
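The PLS-R modelling step can be illustrated with scikit-learn on synthetic stand-in spectra (the FT-IR data themselves are not available here); the number of latent variables and the spectral shapes are assumptions.

import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

# PLS regression sketch: predict an adulteration level (% margarine) from
# spectra.  The spectra below are synthetic stand-ins for the FT-IR
# fingerprint region; only the modelling step mirrors the paper.
rng = np.random.default_rng(0)
n_samples, n_wavenumbers = 120, 300
level = rng.uniform(0, 30, n_samples)                 # % margarine (simulated)
base = np.sin(np.linspace(0, 6 * np.pi, n_wavenumbers))
marker = np.exp(-0.5 * ((np.arange(n_wavenumbers) - 180) / 10.0) ** 2)
X = (base[None, :] + np.outer(level / 30.0, marker)
     + 0.05 * rng.normal(size=(n_samples, n_wavenumbers)))

X_tr, X_te, y_tr, y_te = train_test_split(X, level, test_size=0.3, random_state=0)
pls = PLSRegression(n_components=5)
pls.fit(X_tr, y_tr)
y_hat = pls.predict(X_te).ravel()
rmsep = np.sqrt(np.mean((y_hat - y_te) ** 2))
print("RMSEP (% w/w):", rmsep)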
129 Mediating Role of Social Responsibility on the Relationship between Consumer Awareness of Green Marketing and Purchase Intentions
Authors: Norazah Mohd Suki, Norbayah Mohd Suki
Abstract:
This research aims to examine the mediating effect of corporate social responsibility on the relationship between consumer awareness of green marketing and purchase intentions in the retail setting. Data from 200 valid questionnaires were analyzed using the partial least squares (PLS) approach for the analysis of structural equation models, with the SmartPLS computer program version 2.0, as PLS does not require the research data to have a multivariate normal distribution and is less sensitive to sample size than covariance-based approaches. The PLS results revealed that corporate social responsibility partially mediated the link between consumer awareness of green marketing and purchase intentions of the product in the retail setting. Marketing managers should allocate a sufficient portion of their budget to appropriate corporate social responsibility activities by engaging in voluntary programs, for a positive return on investment leading to increased business profitability and long-run business sustainability. The outcomes of the mediating effects of corporate social responsibility add a new impetus to the growing literature and preceding discoveries on consumer green marketing awareness, which is inadequately researched in the Malaysian setting. Directions for future research are also presented.
Keywords: Green marketing awareness, corporate social responsibility, partial least squares, purchase intention.
128 Applying Element Free Galerkin Method on Beam and Plate
Authors: Mahdad M’hamed, Belaidi Idir
Abstract:
This paper develops a meshless approach, called the Element Free Galerkin (EFG) method, which is based on the weak form of the governing partial differential equations and employs Moving Least Squares (MLS) interpolation to construct the meshless shape functions. The variational weak form is used in the EFG method, where the trial and test functions are approximated by the MLS approximation. Since the shape functions constructed by this discretization have the weight-function property based on the randomly distributed points, the essential boundary conditions can be implemented easily. The local weak form of the governing partial differential equations is obtained by the weighted residual method within a simple local quadrature domain. A spline function with high continuity is used as the weight function. The presently developed EFG method is a truly meshless method, as it does not require a mesh, either for the construction of the shape functions or for the integration of the local weak form. Several numerical examples of two-dimensional static structural analysis are presented to illustrate the performance of the present EFG method. They show that the EFG method is highly efficient to implement and highly accurate in computation. The present method is used to analyze the static deflection of beams and of a plate with a hole.
Keywords: Numerical computation, element-free Galerkin, moving least squares, meshless methods.
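The moving least squares approximation at the core of the EFG shape functions is sketched below in one dimension with a linear basis and a Gaussian weight; the Galerkin assembly for beams and plates is not shown.

import numpy as np

# Minimal 1D moving least squares (MLS) approximation with a linear basis
# p(x) = [1, x] and a Gaussian weight, i.e. the ingredient EFG uses to build
# its meshless shape functions.
def mls_fit(x_nodes, u_nodes, x_eval, radius=0.3):
    x_nodes = np.asarray(x_nodes, float)
    x_eval = np.atleast_1d(np.asarray(x_eval, float))
    u_eval = np.empty_like(x_eval)
    for k, x in enumerate(x_eval):
        w = np.exp(-((x_nodes - x) / radius) ** 2)              # Gaussian weight
        P = np.column_stack([np.ones_like(x_nodes), x_nodes])   # basis at nodes
        A = (P * w[:, None]).T @ P                              # moment matrix
        B = (P * w[:, None]).T @ u_nodes
        a = np.linalg.solve(A, B)                               # local coefficients
        u_eval[k] = np.array([1.0, x]) @ a                      # p(x)^T a(x)
    return u_eval

# Scattered nodes sampling u(x) = sin(2*pi*x)
rng = np.random.default_rng(0)
xs = np.sort(rng.uniform(0, 1, 25))
us = np.sin(2 * np.pi * xs)
xe = np.linspace(0, 1, 9)
print(np.round(mls_fit(xs, us, xe), 3))
print(np.round(np.sin(2 * np.pi * xe), 3))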
127 Compressed Sensing of Fetal Electrocardiogram Signals Based on Joint Block Multi-Orthogonal Least Squares Algorithm
Authors: Xiang Jianhong, Wang Cong, Wang Linyu
Abstract:
With the rise of medical IoT technologies, wireless body area networks (WBANs) can collect fetal electrocardiogram (FECG) signals to support telemedicine analysis. A compressed sensing (CS)-based WBAN system can avoid sampling a large amount of redundant information and reduce the complexity and computing time of data processing, but existing algorithms have poor signal compression and reconstruction performance. In this paper, a Joint Block Multi-Orthogonal Least Squares (JBMOLS) algorithm is proposed. We apply the FECG signal to the joint block sparse model (JBSM), and a comparative study of sparse transformations and measurement matrices is carried out. An FECG signal compression and transmission mode based on the Rbio5.5 wavelet, a Bernoulli measurement matrix, and the JBMOLS algorithm is proposed to improve the compression and reconstruction performance of FECG signals in CS-based WBANs. Experimental results show that the compression ratio (CR) required for accurate reconstruction of this transmission mode is increased by nearly 10%, and the runtime is reduced by about 30%.
Keywords: Telemedicine, fetal electrocardiogram, compressed sensing, joint sparse reconstruction, block sparse signal.
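For orientation, the sketch below recovers a synthetic block-sparse signal from Bernoulli measurements with plain orthogonal matching pursuit from scikit-learn; it is not the proposed JBMOLS algorithm, only the compressed-sensing setting it operates in.

import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

# Compressed-sensing recovery sketch using plain orthogonal matching pursuit
# (a standard greedy least-squares method) on a synthetic block-sparse signal.
rng = np.random.default_rng(0)
N, M = 256, 128                         # signal length, number of measurements
x = np.zeros(N)
x[40:48] = rng.normal(size=8)           # one nonzero block
x[180:188] = rng.normal(size=8)         # another nonzero block

Phi = rng.choice([-1.0, 1.0], size=(M, N)) / np.sqrt(M)   # Bernoulli matrix
y = Phi @ x                                                # compressed samples

omp = OrthogonalMatchingPursuit(n_nonzero_coefs=16)
omp.fit(Phi, y)
x_hat = omp.coef_
print("relative reconstruction error:",
      np.linalg.norm(x_hat - x) / np.linalg.norm(x))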
126 Accurate Visualization of Graphs of Functions of Two Real Variables
Authors: Zeitoun D. G., Thierry Dana-Picard
Abstract:
The study of a real function of two real variables can be supported by visualization using a Computer Algebra System (CAS). One type of constraint of such systems is due to the algorithms implemented, which yield continuous approximations of the given function by interpolation. This often masks discontinuities of the function and can produce strange plots that are not compatible with the mathematics. In recent years, point-based geometry has gained increasing attention as an alternative surface representation, both for efficient rendering and for flexible geometry processing of complex surfaces. In this paper we present different artifacts created by mesh surfaces near discontinuities and propose a point-based method that controls and reduces these artifacts. A least squares penalty method for the automatic generation of a mesh that controls the behavior of the chosen function is presented. The special feature of this method is its ability to improve the accuracy of the surface visualization near a set of interior points where the function may be discontinuous. The present method is formulated as a minimax problem and the non-uniform mesh is generated using an iterative algorithm. Results show that, for large poorly conditioned matrices, the new algorithm gives more accurate results than the classical preconditioned conjugate gradient algorithm.
Keywords: Function singularities, mesh generation, point allocation, visualization, collocation least squares method, Augmented Lagrangian method, Uzawa's algorithm, preconditioned conjugate gradient.
125 Multivariate Analytical Insights into Spatial and Temporal Variation in Water Quality of a Major Drinking Water Reservoir
Authors: Azadeh Golshan, Craig Evans, Phillip Geary, Abigail Morrow, Zoe Rogers, Marcel Maeder
Abstract:
Twenty-two physicochemical variables were determined in water samples collected weekly from January to December 2013 at three sampling stations located within a major drinking water reservoir. Classical Multivariate Curve Resolution Alternating Least Squares (MCR-ALS) analysis was used to investigate the environmental factors associated with the physicochemical variability of the water samples at each of the sampling stations. Matrix-augmentation MCR-ALS (MA-MCR-ALS) was also applied, and the two sets of results were compared for interpretative clarity. Links between these factors, reservoir inflows and catchment land uses were investigated and interpreted in relation to the chemical composition of the water and the resolved geographical distribution profiles. The results suggested that the major factors affecting reservoir water quality were those associated with agricultural runoff, with evidence of influence on algal photosynthesis within the water column. Water quality variability within the reservoir was also found to be strongly linked to physical parameters such as water temperature and the occurrence of thermal stratification. The two methods applied (MCR-ALS and MA-MCR-ALS) led to similar conclusions; however, MA-MCR-ALS appeared to provide results more amenable to interpretation of temporal and geographical variation than those obtained through classical MCR-ALS.
Keywords: Catchment management, drinking water reservoir, multivariate curve resolution alternating least squares, thermal stratification, water quality.
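A toy Python version of the MCR-ALS alternating scheme, with non-negativity imposed by clipping and synthetic data of roughly the same dimensions (52 weekly samples, 22 variables), is sketched below; real MCR-ALS implementations add further constraints and proper initial estimates.

import numpy as np

# Tiny MCR-ALS sketch: alternately solve D ~ C @ S.T for concentration
# profiles C and component profiles S by least squares, with non-negativity
# imposed by clipping.  Data here are synthetic.
rng = np.random.default_rng(0)
T, V, K = 52, 22, 2                              # samples, variables, factors
C_true = np.abs(rng.normal(size=(T, K)))
S_true = np.abs(rng.normal(size=(V, K)))
D = C_true @ S_true.T + 0.01 * rng.normal(size=(T, V))

C = np.abs(rng.normal(size=(T, K)))              # rough initial guess
for _ in range(200):
    S = np.clip(np.linalg.lstsq(C, D, rcond=None)[0].T, 0, None)
    C = np.clip(np.linalg.lstsq(S, D.T, rcond=None)[0].T, 0, None)
lack_of_fit = np.linalg.norm(D - C @ S.T) / np.linalg.norm(D)
print("lack of fit:", lack_of_fit)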
124 Subpixel Detection of Circular Objects Using Geometric Property
Authors: Wen-Yen Wu, Wen-Bin Yu
Abstract:
In this paper, we propose a method for detecting circular shapes with subpixel accuracy. First, the geometric properties of circles are used to find the diameters as well as the circumference pixels. The center and radius are then estimated from the circumference pixels. Both synthetic and real images have been tested with the proposed method. The experimental results show that the new method is efficient.
Keywords: Subpixel, least squares estimation, circle detection, Hough transformation.
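The least-squares estimation of the center and radius from circumference pixels can be illustrated with the algebraic (Kasa) circle fit below; the edge pixels are simulated, and the paper's diameter-based geometric steps are not reproduced.

import numpy as np

# Algebraic (Kasa) least-squares circle fit from edge-pixel coordinates,
# which yields a sub-pixel centre and radius.
def fit_circle(xs, ys):
    A = np.column_stack([2 * xs, 2 * ys, np.ones_like(xs)])
    rhs = xs ** 2 + ys ** 2
    (cx, cy, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    r = np.sqrt(c + cx ** 2 + cy ** 2)
    return cx, cy, r                            # centre (cx, cy) and radius r

# Noisy, pixel-quantized circumference points
rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 300)
x0, y0, R = 120.3, 87.6, 40.2
xs = np.round(x0 + R * np.cos(theta) + 0.3 * rng.normal(size=300))
ys = np.round(y0 + R * np.sin(theta) + 0.3 * rng.normal(size=300))
print(fit_circle(xs, ys))                       # close to (120.3, 87.6, 40.2)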
123 Classification of the Latin Alphabet as Pattern on ARToolkit Markers for Augmented Reality Applications
Authors: Mohamed Badeche, Mohamed Benmohammed
Abstract:
Augmented reality is a technique used to insert virtual objects into real scenes. One of the most widely used libraries in this area is the ARToolkit library. It is based on the recognition of markers that take the form of squares with a pattern inside. This pattern, which is mostly textual, is a source of confusion. In this paper, we present the results of a classification of Latin characters used as patterns on ARToolkit markers, in order to determine which of them are the most distinguishable.
Keywords: ARToolkit library, augmented reality, K-means, patterns
122 Convergence Analysis of a Prediction based Adaptive Equalizer for IIR Channels
Authors: Miloje S. Radenkovic, Tamal Bose
Abstract:
This paper presents the convergence analysis of a prediction-based blind equalizer for IIR channels. Predictor parameters are estimated using the recursive least squares algorithm. It is shown that the prediction error converges almost surely (a.s.) toward a scalar multiple of the unknown input symbol sequence. It is also proved that the convergence rate of the parameter estimation error is of the same order as that in the iterated logarithm law.
Keywords: Adaptive blind equalizer, recursive least squares, adaptive filtering, convergence analysis.