Search results for: least square estimates
2286 Robust Inference with a Skew T Distribution
Authors: M. Qamarul Islam, Ergun Dogan, Mehmet Yazici
Abstract:
There is a growing body of evidence that non-normal data are more prevalent in nature than normal data. Examples can be quoted from, but are not restricted to, the areas of Economics, Finance, and Actuarial Science. The non-normality considered here is expressed in terms of fat-tailedness and asymmetry of the relevant distribution. In this study, a skew t distribution that can be used to model data that exhibit inherently non-normal behavior is considered. This distribution has tails fatter than a normal distribution and also exhibits skewness. Although maximum likelihood estimates can be obtained by iteratively solving the likelihood equations, which are non-linear in form, this can be problematic in terms of convergence and in many other respects as well. Therefore, it is preferred to use the method of modified maximum likelihood, in which the likelihood estimates are derived by expressing the intractable non-linear likelihood equations in terms of standardized ordered variates and replacing the intractable terms by their linear approximations obtained from the first two terms of a Taylor series expansion about the quantiles of the distribution. These estimates, called modified maximum likelihood estimates, are obtained in closed form. Hence, they are easy to compute and to manipulate analytically. In fact, the modified maximum likelihood estimates are asymptotically equivalent to maximum likelihood estimates. Even in small samples, the modified maximum likelihood estimates are found to be approximately the same as the maximum likelihood estimates obtained iteratively. It is shown in this study that the modified maximum likelihood estimates are not only unbiased but substantially more efficient than the commonly used moment estimates or the least square estimates, which are known to be biased and inefficient in such cases. Furthermore, in conventional regression analysis it is assumed that the error terms are normally distributed, and hence the well-known least square method is considered a suitable and preferred method for making the relevant statistical inferences. However, a number of empirical studies have shown that non-normal errors are more prevalent. Even transforming and/or filtering techniques may not produce normally distributed residuals. Here, a study is done for multiple linear regression models with random errors that follow a non-normal pattern. Through an extensive simulation, it is shown that the modified maximum likelihood estimates of the regression parameters are plausibly robust to the distributional assumptions and to various data anomalies as compared to the widely used least square estimates. Relevant tests of hypothesis are developed and explored for desirable properties in terms of their size and power. The tests based upon modified maximum likelihood estimates are found to be substantially more powerful than the tests based upon least square estimates. Several examples are provided from the areas of Economics and Finance where such distributions are interpretable in terms of the efficient market hypothesis with respect to asset pricing, portfolio selection, risk measurement, capital allocation, etc.
Keywords: least square estimates, linear regression, maximum likelihood estimates, modified maximum likelihood method, non-normality, robustness
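As a hedged illustration of the comparison described above (not the paper's closed-form MML estimators), the following Python sketch fits a regression with skewed, fat-tailed synthetic errors by least squares and by a heavy-tailed maximum likelihood stand-in; the gamma-difference noise and the Student-t error model are assumptions:

```python
# A minimal sketch: least squares versus a heavy-tailed MLE when regression
# errors are skewed and fat-tailed. All data here are synthetic.
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(0)
n = 500
x = rng.uniform(0.0, 10.0, n)
# skewed, fat-tailed noise: difference of two gamma variates (illustrative)
eps = rng.gamma(1.0, 2.0, n) - rng.gamma(4.0, 0.5, n)
y = 2.0 + 0.5 * x + eps

# ordinary least squares
X = np.column_stack([np.ones(n), x])
beta_ls, *_ = np.linalg.lstsq(X, y, rcond=None)

# MLE under a Student-t error model (df=3), a stand-in for the skew t
def negloglik(theta):
    b0, b1, log_s = theta
    s = np.exp(log_s)
    return -np.sum(stats.t.logpdf((y - b0 - b1 * x) / s, df=3) - log_s)

res = optimize.minimize(negloglik, x0=[0.0, 0.0, 0.0], method="Nelder-Mead")
print("LS estimates:   ", beta_ls)
print("t-MLE estimates:", res.x[:2])
```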
Procedia PDF Downloads 397
2285 On Pooling Different Levels of Data in Estimating Parameters of Continuous Meta-Analysis
Authors: N. R. N. Idris, S. Baharom
Abstract:
A meta-analysis may be performed using aggregate data (AD) or individual patient data (IPD). In practice, studies may be available at both the IPD and AD levels. In this situation, both the IPD and AD should be utilised in order to maximize the available information. The statistical advantages of combining studies from different levels have not been fully explored. This study aims to quantify the statistical benefits of including available IPD when conducting a conventional summary-level meta-analysis. Simulated meta-analyses were used to assess the influence of the levels of data on overall meta-analysis estimates based on IPD only, AD only, and the combination of IPD and AD (mixed data, MD), under different study scenarios. The percentage relative bias (PRB), root mean-square error (RMSE), and coverage probability were used to assess the efficiency of the overall estimates. The results demonstrate that available IPD should always be included in a conventional meta-analysis using summary-level data, as it significantly increases the accuracy of the estimates. On the other hand, if more than 80% of the available data are at the IPD level, including the AD does not provide significant differences in terms of accuracy of the estimates. Additionally, combining the IPD and AD has a moderating effect on the bias of the estimated treatment effects, as the IPD tends to overestimate the treatment effects, while the AD has the tendency to produce underestimated effect estimates. These results may provide some guidance in deciding whether significant benefit is gained by pooling the two levels of data when conducting a meta-analysis.
Keywords: aggregate data, combined-level data, individual patient data, meta-analysis
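A minimal sketch of the mixed-data idea, assuming fixed-effect inverse-variance pooling and synthetic study data (the paper's simulation design is not reproduced):

```python
# Per-study mean differences are computed from (synthetic) IPD, combined with
# AD studies that report only an effect and its standard error, then pooled
# by inverse-variance weights.
import numpy as np

rng = np.random.default_rng(1)
effects, variances = [], []

# IPD studies: raw treatment/control outcomes are available
for n_t, n_c in [(40, 40), (60, 55)]:
    t = rng.normal(0.5, 1.0, n_t)   # treatment arm
    c = rng.normal(0.0, 1.0, n_c)   # control arm
    effects.append(t.mean() - c.mean())
    variances.append(t.var(ddof=1) / n_t + c.var(ddof=1) / n_c)

# AD studies: only a summary effect and standard error are available
for d, se in [(0.45, 0.20), (0.60, 0.25)]:
    effects.append(d)
    variances.append(se ** 2)

w = 1.0 / np.array(variances)
pooled = np.sum(w * np.array(effects)) / np.sum(w)
se_pooled = np.sqrt(1.0 / np.sum(w))
print(f"pooled effect = {pooled:.3f} (SE {se_pooled:.3f})")
```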
Procedia PDF Downloads 375
2284 The Effect of Non-Normality on CB-SEM and PLS-SEM Path Estimates
Authors: Z. Jannoo, B. W. Yap, N. Auchoybur, M. A. Lazim
Abstract:
The two common approaches to Structural Equation Modeling (SEM) are Covariance-Based SEM (CB-SEM) and Partial Least Squares SEM (PLS-SEM). There is much debate on the performance of CB-SEM and PLS-SEM for small sample sizes and when distributions are non-normal. This study evaluates the performance of CB-SEM and PLS-SEM under normality and non-normality conditions via a simulation. Monte Carlo simulation in the R programming language was employed to generate data based on a theoretical model with one endogenous and four exogenous variables. Each latent variable has three indicators. For normal distributions, CB-SEM estimates were found to be inaccurate for small sample sizes, while PLS-SEM could produce the path estimates. Meanwhile, for larger sample sizes, CB-SEM estimates have lower variability compared to PLS-SEM. Under non-normality, CB-SEM path estimates were inaccurate for small sample sizes. However, CB-SEM estimates are more accurate than those of PLS-SEM for sample sizes of 50 and above. The PLS-SEM estimates are not accurate unless the sample size is very large.
Keywords: CB-SEM, Monte Carlo simulation, normality conditions, non-normality, PLS-SEM
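A sketch of the data-generating step only, matching the design described above (one endogenous and four exogenous latents, three indicators each); the loadings and path values are assumptions, and the non-normal conditions would replace the normal draws:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100                                   # sample size to vary by condition
gamma = np.array([0.4, 0.3, 0.3, 0.2])    # hypothetical structural paths
loading = 0.8                             # hypothetical indicator loading

xi = rng.normal(size=(n, 4))                       # exogenous latents
eta = xi @ gamma + rng.normal(scale=0.5, size=n)   # endogenous latent

indicators = []
for latent in [eta] + [xi[:, j] for j in range(4)]:
    for _ in range(3):                    # three indicators per latent
        indicators.append(loading * latent + rng.normal(scale=0.6, size=n))
data = np.column_stack(indicators)        # n x 15 observed-variable matrix
print(data.shape)
```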
Procedia PDF Downloads 410
2283 Identification of Wiener Model Using Iterative Schemes
Authors: Vikram Saini, Lillie Dewan
Abstract:
This paper presents iterative schemes based on the least squares, hierarchical least squares, and stochastic approximation gradient methods for the identification of a Wiener model with a parametric structure. A gradient method based on stochastic approximation is presented for the parameter estimation of the Wiener model under noise conditions. Simulation results are presented for the Wiener model structure with different static non-linear elements in the presence of colored noise to provide a comparative analysis of the iterative methods. The stochastic gradient method shows improvement in estimation performance and provides fast convergence of the parameter estimates.
Keywords: hard non-linearity, least square, parameter estimation, stochastic approximation gradient, Wiener model
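A hedged stochastic-gradient sketch for a Wiener model (an FIR block followed by a static polynomial non-linearity); fixing the linear term of the non-linearity to one, to resolve the gain ambiguity, is an assumption and not the paper's exact scheme:

```python
import numpy as np

rng = np.random.default_rng(3)
N = 20000
u = rng.normal(size=N)

# true system: FIR block followed by y = v + 0.4*v^3, plus measurement noise
h_true, c3_true = np.array([1.0, 0.5, -0.2]), 0.4
v = np.convolve(u, h_true)[:N]
y = v + c3_true * v ** 3 + 0.05 * rng.normal(size=N)

h, c3, mu = np.zeros(3), 0.0, 0.002
for k in range(2, N):
    u_vec = u[k - 2:k + 1][::-1]           # [u_k, u_{k-1}, u_{k-2}]
    v_hat = h @ u_vec
    e = y[k] - (v_hat + c3 * v_hat ** 3)   # prediction error
    h += mu * e * (1.0 + 3.0 * c3 * v_hat ** 2) * u_vec  # chain rule on h
    c3 += mu * e * v_hat ** 3              # gradient step on the cubic term
print("h ->", h, " c3 ->", c3)
```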
Procedia PDF Downloads 405
2282 Critical Accounting Estimates and Transparency in Financial Reporting: An Observation of Financial Reporting under US GAAP
Authors: Ahmed Shaik
Abstract:
Estimates are critical in accounting, and Financial Reporting cannot be complete without them. There is a long list of accounting estimates that are required to compute Net Income and to determine the value of assets and liabilities. To name a few, the valuation of inventory, depreciation, valuation of goodwill, provision for bad debts, and estimated warranties require the use of different valuation models and forecasts. Different business entities within the same industry may use different approaches to measure the value of the financial items reported in the Income Statement and Balance Sheet. The disclosure notes do not provide enough details of the approach used by a business entity to arrive at the value of a financial item. This lack of detail in the disclosure notes makes it difficult to compare the financial performance of one business entity with another in the same industry. This paper is an attempt to identify the lack of sufficient information about accounting estimates in disclosure notes and the impact of this absence on the comparability of financial data and on financial analysis. It also suggests more detailed disclosure while taking care of the cost and benefit of making such disclosure.
Keywords: accounting estimates, disclosure notes, financial reporting, transparency
Procedia PDF Downloads 200
2281 Study on Bending Characteristics of Square Tube Using Energy Absorption Part
Authors: Shigeyuki Haruyama, Zefry Darmawan, Ken Kaminishi
Abstract:
When a square tube is subjected to a bending load, the rigidity of the entire tube is reduced if collapse occurs due to local stress concentration. Therefore, in this research, the influence of bending load on a square tube with an attached energy-absorbing part was examined and reported. The analysis was conducted using the Finite Element Method (FEM) to obtain the bending deflection and buckling points. Energy absorption was compared in terms of the rigidity of the attached part and of the square tube body. The buckling point was influenced by the rigidity of the attached part and the thickness ratio of the square tube.
Keywords: energy absorber, square tube, bending, rigidity
Procedia PDF Downloads 244
2280 Estimating Current Suicide Rates Using Google Trends
Authors: Ladislav Kristoufek, Helen Susannah Moat, Tobias Preis
Abstract:
Data on the number of people who have committed suicide tend to be reported with a substantial time lag of around two years. We examine whether online activity measured by Google searches can help us improve estimates of the number of suicide occurrences in England before official figures are released. Specifically, we analyse how data on the number of Google searches for the terms “depression” and “suicide” relate to the number of suicides between 2004 and 2013. We find that estimates drawing on Google data are significantly better than estimates using previous suicide data alone. We show that a greater number of searches for the term “depression” is related to fewer suicides, whereas a greater number of searches for the term “suicide” is related to more suicides. Data on suicide-related search behaviour can be used to improve current estimates of the number of suicide occurrences.
Keywords: nowcasting, search data, Google Trends, official statistics
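A simplified nowcasting sketch on synthetic series, echoing the comparison described above (a baseline autoregressive model versus one augmented with search-volume predictors); it is not the authors' exact model:

```python
import numpy as np

rng = np.random.default_rng(4)
T = 120
searches_suicide = rng.normal(size=T)
searches_depression = rng.normal(size=T)
suicides = (50 + 3.0 * searches_suicide - 2.0 * searches_depression
            + rng.normal(scale=2.0, size=T))

y, y_lag = suicides[1:], suicides[:-1]
base = np.column_stack([np.ones(T - 1), y_lag])           # lagged counts only
aug = np.column_stack([base, searches_suicide[1:],
                       searches_depression[1:]])          # plus search data

for name, X in [("baseline", base), ("with search data", aug)]:
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    print(f"{name}: in-sample RMSE = {np.sqrt(np.mean(resid ** 2)):.2f}")
```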
Procedia PDF Downloads 357
2279 A Review on the Perception of Beşiktaş Public Square
Authors: Neslinur Hizli, Berrak Kirbaş Akyürek
Abstract:
Beşiktaş, one of the historical coastal districts of İstanbul, is on the very edge of a radical transformation because of the approaching ‘Beşiktaş Public Square Project’. At this juncture, due to its location, presence on the coast, population density, and distance to the other centers of the city, the decisions to be taken are critical for the whole of Istanbul, which will be greatly affected by this transformation. As the new project aims to pedestrianize the area by placing vehicular traffic underground, Beşiktaş and its square will change from top to bottom. Among these considerations, the perception of the existing conditions of Beşiktaş, with its advantages and disadvantages, plays a significant role. The motivation for this paper is the lack of determination and clarity regarding the cognition of the Square. After a brief analysis of the historical transformation of the area, prominent studies on the criteria of the public square are reviewed. Through a cognitive mapping methodology, the characteristics of the Square, and of public space in general, are discussed from individual viewpoints. This study aims to review Beşiktaş Public Square from the perspective, mind, and behavior of its users. A cognitive map study with thirty (30) subjects is evaluated and categorized according to the five elements that Kevin Lynch defined as the images of the city. The results obtained were digitized and represented with tables and graphs. The findings of the research underline crucial issues regarding the approaching change in Beşiktaş. Thus, this study may help to develop comprehensive ideas and new suggestions for the Square.
Keywords: Beşiktaş public square, cognitive map, perception, public space
Procedia PDF Downloads 267
2278 Uncertainty Estimation in Neural Networks through Transfer Learning
Authors: Ashish James, Anusha James
Abstract:
The impressive predictive performance of deep learning techniques on a wide range of tasks has led to their widespread use. Estimating the confidence of these predictions is paramount for improving the safety and reliability of such systems. However, the uncertainty estimates provided by neural networks (NNs) tend to be overconfident and unreasonable. Ensembles of NNs typically produce good predictions, but their uncertainty estimates tend to be inconsistent. Inspired by this, this paper presents a framework that can quantitatively estimate uncertainties by leveraging advances in transfer learning through a slight modification to existing training pipelines. This promising algorithm is developed with the intention of deployment in real-world problems that already boast good predictive performance, by reusing pretrained models. The idea is to capture the behavior of the NNs trained for the base task by augmenting them with uncertainty estimates from a supplementary network. A series of experiments with known and unknown distributions shows that the proposed approach produces well-calibrated uncertainty estimates with high-quality predictions.
Keywords: uncertainty estimation, neural networks, transfer learning, regression
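A hedged PyTorch sketch of the general idea: a pretrained base regressor is frozen and a small supplementary head is trained to output a predictive variance via the Gaussian negative log-likelihood; the layer sizes and the loss are assumptions, not the paper's architecture:

```python
import torch
import torch.nn as nn

# frozen base regressor standing in for a pretrained model from the base task
base = nn.Sequential(nn.Linear(8, 64), nn.ReLU(), nn.Linear(64, 1))
for p in base.parameters():
    p.requires_grad = False

# supplementary head trained only to predict a positive variance
var_head = nn.Sequential(nn.Linear(8, 32), nn.ReLU(),
                         nn.Linear(32, 1), nn.Softplus())

opt = torch.optim.Adam(var_head.parameters(), lr=1e-3)
nll = nn.GaussianNLLLoss()

x = torch.randn(256, 8)       # stand-in training batch
y = torch.randn(256, 1)

for _ in range(200):
    opt.zero_grad()
    mean = base(x)            # frozen point predictions from the base task
    var = var_head(x) + 1e-6  # predictive variance, kept positive
    loss = nll(mean, y, var)
    loss.backward()
    opt.step()
print(float(loss))
```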
Procedia PDF Downloads 135
2277 Strategies of Spatial Optimization for Open Space in the Old-Age Friendly City: An Investigation of the Behavior of the Elderly in Xicheng Square in Hangzhou
Authors: Yunxiang Fang
Abstract:
With the aging trend continuing to accelerate, open space is important for the daily life of the elderly, and its old-age friendliness is worthy of attention. Based on behavioral observation and literature research, this paper studies the behavior of the elderly in urban open space. Through the investigation, classification, and quantitative analysis of the activity types, time characteristics, and spatial behavior patterns of the elderly in Xicheng Square in Hangzhou, it characterizes the square spaces suited to the psychological, physiological, and activity needs of the elderly, combined with findings from the literature. Finally, suggestions for improving the old-age friendliness of Xicheng Square are put forward, addressing microclimate, safety and accessibility, spatial richness, and service facility quality.
Keywords: behavior characteristics, old-age friendliness, open space, square
Procedia PDF Downloads 169
2276 Derivation of Bathymetry from High-Resolution Satellite Images: Comparison of Empirical Methods through Geographical Error Analysis
Authors: Anusha P. Wijesundara, Dulap I. Rathnayake, Nihal D. Perera
Abstract:
Bathymetric information is of fundamental importance to coastal and marine planning and management, nautical navigation, and scientific studies of marine environments. Satellite-derived bathymetry data provide detailed information in areas where conventional sounding data are lacking and conventional surveys are inaccessible. Two empirical approaches, the log-linear and the non-linear bathymetric inversion models, are applied to derive bathymetry from high-resolution multispectral satellite imagery. This study compares the two approaches by means of geographical error analysis for the site of Kankesanturai using WorldView-2 satellite imagery. The Levenberg-Marquardt method was used to calibrate the parameters of the non-linear inversion model, and multiple linear regression was applied to calibrate the log-linear inversion model. In order to calibrate both models, Single Beam Echo Sounding (SBES) data from the study area were used as reference points. Residuals were calculated as the difference between the derived depth values and the validation echo-sounder bathymetry data, and the geographical distribution of the model residuals was mapped. Spatial autocorrelation was calculated, and the performance of the two bathymetric models was compared by mapping the geographic errors of both. A spatial error model was constructed from the initial bathymetry estimates and the estimates of autocorrelation. This spatial error model is used to generate more reliable estimates of bathymetry by quantifying the autocorrelation of the model error and incorporating this into an improved regression model. The log-linear model (R²=0.846) performs better than the non-linear model (R²=0.692). Finally, the spatial error models improved the bathymetric estimates derived from the log-linear and non-linear models up to R²=0.854 and R²=0.704, respectively. The Root Mean Square Error (RMSE) was calculated for all reference points in various depth ranges. The magnitude of the prediction error increases with depth for both the log-linear and the non-linear inversion models. The overall RMSE for the log-linear and non-linear inversion models was ±1.532 m and ±2.089 m, respectively.
Keywords: log-linear model, multi spectral, residuals, spatial error model
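A sketch of the log-linear inversion step only, with synthetic band values standing in for the WorldView-2 reflectances and the echo-sounder reference depths:

```python
# Depth is modeled as a linear function of log-transformed band reflectances
# and calibrated against reference depths by least squares.
import numpy as np

rng = np.random.default_rng(5)
n = 300
bands = rng.uniform(0.01, 0.30, size=(n, 2))         # e.g. blue and green
depth = (4.0 - 6.0 * np.log(bands[:, 0]) + 2.5 * np.log(bands[:, 1])
         + rng.normal(scale=0.8, size=n))            # stand-in SBES depths

X = np.column_stack([np.ones(n), np.log(bands)])
coef, *_ = np.linalg.lstsq(X, depth, rcond=None)

resid = depth - X @ coef
rmse = np.sqrt(np.mean(resid ** 2))
r2 = 1.0 - resid.var() / depth.var()
print(f"RMSE = {rmse:.3f} m, R^2 = {r2:.3f}")
```

In the study, the residuals from this step would then be mapped and their spatial autocorrelation fed into the spatial error model.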
Procedia PDF Downloads 297
2275 Evolution of Multimodulus Algorithm Blind Equalization Based on Recursive Least Square Algorithm
Authors: Sardar Ameer Akram Khan, Shahzad Amin Sheikh
Abstract:
Blind equalization is an important technique within the equalization family. Multimodulus blind equalization algorithms remove the undesirable effects of inter-symbol interference (ISI) and handle the phase issues, saving the cost of a rotator at the receiver end. In this paper, a new algorithm named RLSMMA, combining the recursive least squares and multimodulus algorithms, is proposed; with a few assumptions, fast convergence and minimum Mean Square Error (MSE) are achieved. The merits of this technique are shown in simulations presenting MSE plots and the resulting filter outputs.
Keywords: blind equalizations, constant modulus algorithm, multi-modulus algorithm, recursive least square algorithm, quadrature amplitude modulation (QAM)
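For orientation, a sketch of the plain stochastic-gradient multimodulus algorithm (MMA); the RLS-based combination (RLSMMA) is the paper's contribution and is not reproduced here. The 16-QAM symbols and the toy channel are illustrative:

```python
import numpy as np

rng = np.random.default_rng(6)
N, L = 20000, 11
sym = (rng.choice([-3, -1, 1, 3], N)
       + 1j * rng.choice([-3, -1, 1, 3], N))                 # 16-QAM
x = np.convolve(sym, [1.0, 0.25 + 0.2j], mode="full")[:N]    # toy channel
x += 0.01 * (rng.normal(size=N) + 1j * rng.normal(size=N))

# MMA dispersion constant for the real and imaginary rails
R = np.mean(np.real(sym) ** 4) / np.mean(np.real(sym) ** 2)

w = np.zeros(L, complex)
w[L // 2] = 1.0                        # center-spike initialization
mu = 1e-6
for k in range(L, N):
    u = x[k:k - L:-1]
    yk = w @ u
    # per-rail multimodulus error signal
    e = (np.real(yk) * (np.real(yk) ** 2 - R)
         + 1j * np.imag(yk) * (np.imag(yk) ** 2 - R))
    w -= mu * e * np.conj(u)           # stochastic-gradient tap update
print("equalizer taps:", np.round(w, 3))
```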
Procedia PDF Downloads 644
2274 A Study of Rapid Replication of Square-Microlens Structures
Authors: Ting-Ting Wen, Jung-Ruey Tsai
Abstract:
This paper reports a method for the replication of micro-scale structures. By using an electromagnetic force-assisted imprinting system with a magnetic soft stamp bearing written square-microlens cavities, photopolymer square-microlens structures can be rapidly fabricated. Under the proper processing conditions, polymeric square-microlens structures with a feature width of 100.3 µm and a height of 15.2 µm can be successfully fabricated across a large area. Scanning electron microscopy (SEM) and surface profiler observations confirm that the micro-scale polymer structures are produced without defects or distortion and with good pattern fidelity over a 60×60 mm² area. This technique shows great potential for the efficient replication of micro-scale structure arrays at room temperature with high productivity and low cost.
Keywords: square-microlens structures, electromagnetic force-assisted imprinting, magnetic soft stamp
Procedia PDF Downloads 334
2273 Comparison Analysis of Multi-Channel Echo Cancellation Using Adaptive Filters
Authors: Sahar Mobeen, Anam Rafique, Irum Baig
Abstract:
Multichannel acoustic echo cancellation is a system identification application. In a real-time environment, the signal changes very rapidly, which requires adaptive algorithms such as the Least Mean Square (LMS), Leaky Least Mean Square (LLMS), Normalized Least Mean Square (NLMS), and average (AFA) algorithms, which have high convergence rates and are stable. LMS and NLMS are widely used adaptive algorithms due to their low computational complexity, while AFA is used for its high convergence rate. This research is based on a comparison of the cancellation of acoustic echo (generated in a room) through the LMS, LLMS, NLMS, AFA, and newly proposed average normalized leaky least mean square (ANLLMS) adaptive filters.
Keywords: LMS, LLMS, NLMS, AFA, ANLLMS
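An NLMS echo-cancellation sketch for a single channel (the multichannel case runs one such adaptive filter per echo path); the room impulse response and the signals are synthetic:

```python
import numpy as np

rng = np.random.default_rng(7)
N, L = 20000, 64
far_end = rng.normal(size=N)                                 # loudspeaker signal
h_room = rng.normal(size=L) * np.exp(-0.1 * np.arange(L))    # toy echo path
mic = np.convolve(far_end, h_room)[:N] + 0.01 * rng.normal(size=N)

w = np.zeros(L)
mu, eps = 0.5, 1e-6
err = np.empty(N - L)
for k in range(L, N):
    u = far_end[k:k - L:-1]          # most recent L far-end samples
    e = mic[k] - w @ u               # echo-cancelled (error) signal
    w += mu * e * u / (u @ u + eps)  # normalized step
    err[k - L] = e
print("residual echo power:", np.mean(err[-2000:] ** 2))
```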
Procedia PDF Downloads 566
2272 Nonuniformity Correction Technique in Infrared Video Using Feedback Recursive Least Square Algorithm
Authors: Flavio O. Torres, Maria J. Castilla, Rodrigo A. Augsburger, Pedro I. Cachana, Katherine S. Reyes
Abstract:
In this paper, we present a scene-based nonuniformity correction method using a modified recursive least squares algorithm with a feedback system on the updates. The feedback is designed to remove the impulsive noise contamination in images produced by a recursive least squares algorithm by measuring the output of the proposed algorithm. The key advantage of the method is its capacity to estimate the detector parameters and then compensate for impulsive noise contamination on a frame-by-frame basis. We define the algorithm and present several experimental results to demonstrate the efficacy of the proposed method in comparison to several previously published recursive least squares-based methods. We show that the proposed method removes the impulsive noise contamination from the image.
Keywords: infrared focal plane arrays, infrared imaging, least mean square, nonuniformity correction
Procedia PDF Downloads 143
2271 Tunable in Phase, out of Phase and T/4 Square-Wave Pulses in Delay-Coupled Optoelectronic Oscillators
Authors: Jade Martínez-Llinàs, Pere Colet
Abstract:
By exploring the possible dynamical regimes in a prototypical model for mutually delay-coupled optoelectronic oscillators (OEOs), it is shown here that two mutually coupled non-identical OEOs can generate, besides in- and out-of-phase square waves, stable square-wave pulses synchronized at a quarter of the period (T/4) in a broad parameter region. The key point for obtaining T/4 solutions is that the two OEOs operate with mixed feedback, namely with negative feedback in one and positive feedback in the other. Furthermore, the coexistence of multiple solutions provides a large degree of flexibility for tuning the frequency in the GHz range without changing any parameter. As a result, the system of two coupled OEOs is a good candidate for implementation in information encoding as a high-capacity memory device.
Keywords: nonlinear optics, optoelectronic oscillators, square waves, synchronization
Procedia PDF Downloads 370
2270 Estimation of Coefficients of Ridge and Principal Components Regressions with Multicollinear Data
Authors: Rajeshwar Singh
Abstract:
The presence of multicollinearity is common when handling several explanatory variables simultaneously, as they may exhibit a linear relationship among themselves. A great problem then arises in understanding the impact of the explanatory variables on the dependent variable, and the method of least squares estimation gives inexact estimates. In this case, it is advisable to detect its presence first before proceeding further. Ridge regression reduces the degree of its occurrence, while principal components regression gives good estimates in this situation. This paper discusses the well-known techniques of ridge and principal components regressions and applies them to obtain estimates of the coefficients by both techniques. In addition, this paper also discusses the conflicting claim on the discovery of the method of ridge regression based on available documents.
Keywords: conflicting claim on credit of discovery of ridge regression, multicollinearity, principal components and ridge regressions, variance inflation factor
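A compact sketch of both remedies on collinear synthetic data: ridge regression via its closed form and principal components regression via an eigendecomposition of the centered cross-product matrix:

```python
import numpy as np

rng = np.random.default_rng(8)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + 0.05 * rng.normal(size=n)           # nearly collinear with x1
X = np.column_stack([x1, x2])
y = 1.0 * x1 + 1.0 * x2 + rng.normal(scale=0.5, size=n)

Xc, yc = X - X.mean(0), y - y.mean()

# ridge: beta = (X'X + lambda*I)^{-1} X'y
lam = 1.0
beta_ridge = np.linalg.solve(Xc.T @ Xc + lam * np.eye(2), Xc.T @ yc)

# PCR: regress y on the leading principal component only
vals, vecs = np.linalg.eigh(Xc.T @ Xc)        # ascending eigenvalues
pc = Xc @ vecs[:, -1:]                        # dominant component scores
gamma = np.linalg.lstsq(pc, yc, rcond=None)[0]
beta_pcr = vecs[:, -1:] @ gamma               # map back to original variables

print("ridge:", beta_ridge, " PCR:", beta_pcr.ravel())
```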
Procedia PDF Downloads 418
2269 Polynomially Adjusted Bivariate Density Estimates Based on the Saddlepoint Approximation
Authors: S. B. Provost, Susan Sheng
Abstract:
An alternative bivariate density estimation methodology is introduced in this presentation. The proposed approach involves estimating the density function associated with the marginal distribution of each of the two variables by means of the saddlepoint approximation technique and applying a bivariate polynomial adjustment to the product of these density estimates. Since the saddlepoint approximation is utilized in the context of density estimation, such estimates are determined from empirical cumulant-generating functions. In the univariate case, the saddlepoint density estimate is itself adjusted by a polynomial. Given a set of observations, the coefficients of the polynomial adjustments are obtained from the sample moments. Several illustrative applications of the proposed methodology shall be presented. Since this approach relies essentially on a determinate number of sample moments, it is particularly well suited for modeling massive data sets.
Keywords: density estimation, empirical cumulant-generating function, moments, saddlepoint approximation
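A univariate sketch of the building block described above: a density estimate from the empirical cumulant-generating function (CGF) via the saddlepoint approximation; the bivariate polynomial adjustment is omitted:

```python
import numpy as np
from scipy.optimize import brentq

rng = np.random.default_rng(9)
data = rng.gamma(3.0, 1.0, 500)

def K(t):   # empirical cumulant-generating function
    return np.log(np.mean(np.exp(t * data)))

def K1(t):  # first derivative of the empirical CGF
    w = np.exp(t * data)
    return np.sum(data * w) / np.sum(w)

def K2(t):  # second derivative (exponentially tilted variance)
    w = np.exp(t * data)
    m = np.sum(data * w) / np.sum(w)
    return np.sum((data - m) ** 2 * w) / np.sum(w)

def saddlepoint_density(x, bracket=(-5.0, 0.99)):
    s = brentq(lambda t: K1(t) - x, *bracket)   # solve K'(s) = x
    return np.exp(K(s) - s * x) / np.sqrt(2.0 * np.pi * K2(s))

print(saddlepoint_density(3.0))
```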
Procedia PDF Downloads 280
2268 Acoustic Echo Cancellation Using Different Adaptive Algorithms
Authors: Hamid Sharif, Nazish Saleem Abbas, Muhammad Haris Jamil
Abstract:
An adaptive filter is a filter that self-adjusts its transfer function according to an optimization algorithm driven by an error signal. Because of the complexity of the optimization algorithms, most adaptive filters are digital filters. Adaptive filtering constitutes one of the core technologies in digital signal processing and finds numerous application areas in science as well as in industry. Adaptive filtering techniques are used in a wide range of applications, including adaptive noise cancellation and echo cancellation. Acoustic echo is a common occurrence in today’s telecommunication systems. The signal interference caused by acoustic echo is distracting to both users and causes a reduction in the quality of the communication. In this paper, we review different techniques of adaptive filtering for reducing this unwanted echo, examining the behavior of the Least Mean Square (LMS), Normalized Least Mean Square (NLMS), Variable Step-Size Least Mean Square (VSLMS), Variable Step-Size Normalized Least Mean Square (VSNLMS), New Varying Step Size LMS (NVSSLMS), and Recursive Least Square (RLS) algorithms to increase communication quality.
Keywords: adaptive acoustic, echo cancellation, LMS algorithm, adaptive filter, normalized least mean square (NLMS), variable step-size least mean square (VSLMS)
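As one concrete example from the variable step-size family reviewed above, a sketch of a Kwong-Johnston-style VSS-LMS update, in which the step size itself is adapted from the squared error; the constants are assumptions:

```python
import numpy as np

rng = np.random.default_rng(10)
N, L = 10000, 32
x = rng.normal(size=N)
h = rng.normal(size=L) * np.exp(-0.15 * np.arange(L))  # unknown echo path
d = np.convolve(x, h)[:N] + 0.01 * rng.normal(size=N)  # desired signal

w = np.zeros(L)
mu, alpha, gamma = 0.01, 0.97, 4e-4
mu_min, mu_max = 1e-4, 0.05
for k in range(L, N):
    u = x[k:k - L:-1]
    e = d[k] - w @ u
    # step size grows with the squared error, decays geometrically otherwise
    mu = np.clip(alpha * mu + gamma * e * e, mu_min, mu_max)
    w += mu * e * u
print("misalignment:", np.linalg.norm(w - h) / np.linalg.norm(h))
```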
Procedia PDF Downloads 80
2267 Evaluation of the Impact of Information and Communications Technology (ICT) on the Accuracy of Preliminary Cost Estimates of Building Projects in Nigeria
Authors: Nofiu A. Musa, Olubola Babalola
Abstract:
The study explored the effect of ICT on the accuracy of Preliminary Cost Estimates (PCEs) prepared by quantity surveying consulting firms in Nigeria for building projects, with a view to determining the desirability of the adoption and use of this technological innovation for preliminary estimating. Data pertinent to the study were obtained through a questionnaire survey conducted on a sample of one hundred and eight (108) quantity surveying firms selected from the list of registered firms compiled by the Nigerian Institute of Quantity Surveyors (NIQS), Lagos State Chapter, through systematic random sampling. The data obtained were analyzed with SPSS version 17 using Student’s t-tests at the 5% significance level. The results revealed that the mean bias and coefficient of variation of the PCEs of the firms are significantly lower in the post-ICT-adoption period than in the pre-ICT-adoption period, p < 0.05 in each case. The paper concluded that the adoption and use of the technological innovation (ICT) has significantly improved the accuracy of the Preliminary Cost Estimates (PCEs) of building projects; hence, it is desirable.
Keywords: accepted tender price, accuracy, bias, building projects, consistency, information and communications technology, preliminary cost estimates
Procedia PDF Downloads 428
2266 A Semiparametric Approach to Estimate the Mode of Continuous Multivariate Data
Authors: Tiee-Jian Wu, Chih-Yuan Hsu
Abstract:
Mode estimation is an important task because it has applications to data from a wide variety of sources. We propose a semi-parametric approach to estimate the mode of an unknown continuous multivariate density function. Our approach is based on a weighted average of a parametric density estimate using the Box-Cox transform and a non-parametric kernel density estimate. Our semi-parametric mode estimate improves on both the parametric and non-parametric mode estimates. Specifically, our mode estimate solves the inconsistency problem of parametric mode estimates (at large sample sizes) and reduces the variability of non-parametric mode estimates (at small sample sizes). The performance of our method at practical sample sizes is demonstrated by simulation examples and two real examples from the fields of climatology and image recognition.
Keywords: Box-Cox transform, density estimation, mode seeking, semiparametric method
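A univariate sketch of the semi-parametric blend described above: a Box-Cox-based parametric density (with the change-of-variable Jacobian) averaged with a kernel density estimate and maximized on a grid; the equal mixing weight is an assumption:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
data = rng.gamma(3.0, 1.0, 400)              # positive-valued sample

z, lam = stats.boxcox(data)                  # transformed data and lambda
mu, sigma = z.mean(), z.std(ddof=1)

def parametric_pdf(x):
    # normal density on the Box-Cox scale times the Jacobian x^(lam-1)
    zx = (x ** lam - 1.0) / lam
    return stats.norm.pdf(zx, mu, sigma) * x ** (lam - 1.0)

kde = stats.gaussian_kde(data)
alpha = 0.5                                  # hypothetical mixing weight

grid = np.linspace(data.min(), data.max(), 2000)
blend = alpha * parametric_pdf(grid) + (1 - alpha) * kde(grid)
print("mode estimate:", grid[np.argmax(blend)])
```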
Procedia PDF Downloads 284
2265 Analysis of Space Requirements of Chinese Square-Dancing Space through Newspaper Reports
Authors: Xiaobing Liu, Bo Zhang, Xiaolong Zhao
Abstract:
Square-dancing is one of the most popular new physical activities in China in recent years and has become a hotspot of Chinese landscape research. This paper collects 749 news reports from four authoritative newspapers in Harbin over 3 years and probes into the space-use needs of participants and non-participants of square-dancing. The research results are compared with the contents of three related planning and design codes in China, and some modification or supplementary suggestions are proposed from three aspects: the decision-making process, total-quantity control, and site design. Unlike traditional research, this study does not use data from interviews and questionnaires but instead analyzes the content of traditional media reports. To some extent, this avoids excessively subjective research results and enhances objectivity and authority.
Keywords: China, landscape, space design, square-dancing
Procedia PDF Downloads 265
2264 Study on the Impact of Size and Position of the Shear Field in Determining the Shear Modulus of Glulam Beam Using Photogrammetry Approach
Authors: Niaz Gharavi, Hexin Zhang
Abstract:
The shear modulus of a timber beam can be determined using the torsion test or the shear field test method. The shear field test method is based on measuring the shear distortion of the beam in the zone of constant transverse load in the standardized four-point bending test. The current code of practice advises using two metallic arms as an instrument to measure the diagonal displacement of the constructed square. The size and the position of the constructed square might influence the shear modulus determination. This study aimed to investigate the effects of the size and the position of the square in the shear field test method. A binocular stereo vision system was employed to determine the 3D displacement of a grid of target points. Six glue-laminated beams were produced and tested. Analysis of Variance (ANOVA) was performed on the acquired data to evaluate the significance of the size effect and the position effect of the square. The results show that the size of the square has a noticeable influence on the value of the shear modulus, while the position of the square within the area of constant shear force does not affect the measured mean shear modulus.
Keywords: shear field test method, structural-sized test, shear modulus of Glulam beam, photogrammetry approach
Procedia PDF Downloads 291
2263 Investigating the Impact of Task Demand and Duration on Passage of Time Judgements and Duration Estimates
Authors: Jesika A. Walker, Mohammed Aswad, Guy Lacroix, Denis Cousineau
Abstract:
There is a fundamental disconnect between the experience of time passing and the chronometric units by which time is quantified. Specifically, there appears to be no relationship between the passage of time judgments (PoTJs) and verbal duration estimates at short durations (e.g., < 2000 milliseconds). When a duration is longer than several minutes, however, evidence suggests that a slower feeling of time passing is predictive of overestimation. Might the length of a task moderate the relation between PoTJs and duration estimates? Similarly, the estimation paradigm (prospective vs. retrospective) and the mental effort demanded of a task (task demand) have both been found to influence duration estimates. However, only a handful of experiments have investigated these effects for tasks of long durations, and the results have been mixed. Thus, might the length of a task also moderate the effects of the estimation paradigm and task demand on duration estimates? To investigate these questions, 273 participants performed either an easy or difficult visual and memory search task for either eight or 58 minutes, under prospective or retrospective instructions. Afterward, participants provided a duration estimate in minutes, followed by a PoTJ on a Likert scale (1 = very slow, 7 = very fast). A 2 (prospective vs. retrospective) × 2 (eight minutes vs. 58 minutes) × 2 (high vs. low difficulty) between-subjects ANOVA revealed a two-way interaction between task demand and task duration on PoTJs, p = .02. Specifically, time felt faster in the more challenging task, but only in the eight-minute condition, p < .01. Duration estimates were transformed into RATIOs (estimate/actual duration) to standardize estimates across durations. An ANOVA revealed a two-way interaction between estimation paradigm and task duration, p = .03. Specifically, participants overestimated the task more if they were given prospective instructions, but only in the eight-minute task. Surprisingly, there was no effect of task difficulty on duration estimates. Thus, the demands of a task may influence ‘feeling of time’ and ‘estimation time’ differently, contributing to the existing theory that these two forms of time judgement rely on separate underlying cognitive mechanisms. Finally, a significant main effect of task duration was found for both PoTJs and duration estimates (ps < .001). Participants underestimated the 58-minute task (m = 42.5 minutes) and overestimated the eight-minute task (m = 10.7 minutes). Yet, they reported the 58-minute task as passing significantly slower on a Likert scale (m = 2.5) compared to the eight-minute task (m = 4.1). In fact, a significant correlation was found between PoTJ and duration estimation (r = .27, p < .001). This experiment thus provides evidence for a compensatory effect at longer durations, in which people underestimate a ‘slow feeling’ condition and overestimate a ‘fast feeling’ condition. The results are discussed in relation to heuristics that might alter the relationship between these two variables when conditions range from several minutes up to almost an hour.
Keywords: duration estimates, long durations, passage of time judgements, task demands
Procedia PDF Downloads 130
2262 Mean Square Responses of a Cantilever Beam with Various Damping Mechanisms
Authors: Yaping Zhao, Yimin Zhang
Abstract:
In the present paper, the stationary random vibration of a uniform cantilever beam is investigated. Two types of damping mechanism, i.e., external and internal viscous damping, are taken into account simultaneously. The excitation is support motion, taken to be ideal white noise. Because the two types of damping mechanism are considered concurrently, the product of the modal damping ratio and the natural frequency is no longer constant. As a result, the infinite definite integral encountered in the process of computing the mean square response is more complex than that in the existing literature. One notable contribution of this work is the accurate calculation of these definite integrals. The precise solution of the mean square response is thus finally obtained in infinite series form. Numerical examples are supplied, and the numerical outcomes confirm the validity of the theoretical analyses.
Keywords: random vibration, cantilever beam, mean square response, white noise
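For orientation, the standard single-mode mean-square response under ideal white noise with (two-sided) spectral density S_0 is shown below; when external and internal damping act together, the product ζ_nω_n varies with n, and the paper's series solution replaces this constant-damping shortcut:

```latex
% classical white-noise result for a single mode (modal cross-terms neglected)
E[q_n^2] = \frac{\pi S_0}{2\,\zeta_n \omega_n^{3}}, \qquad
E[w^2(x)] = \sum_{n=1}^{\infty} \phi_n^2(x)\, E[q_n^2]
```

Here q_n is the n-th modal coordinate, φ_n(x) the mode shape, ζ_n the modal damping ratio, and ω_n the natural frequency.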
Procedia PDF Downloads 384
2261 A Stepwise Approach to Automate the Search for Optimal Parameters in Seasonal ARIMA Models
Authors: Manisha Mukherjee, Diptarka Saha
Abstract:
Reliable forecasts of univariate time series data are often necessary in several contexts, and ARIMA models are quite popular among practitioners in this regard. Hence, choosing correct parameter values for ARIMA is a challenging yet imperative task. Thus, a stepwise algorithm is introduced to provide automatic and robust estimates of the parameters (p, d, q)(P, D, Q) used in seasonal ARIMA models. This process is focused on improving the overall quality of the estimates, and it alleviates the problems induced by the unidimensional nature of the methods that are currently used, such as auto.arima. The fast and automated search of the parameter space also ensures reliable estimates of the parameters that possess several desirable qualities, consequently resulting in higher test accuracy, especially in the case of noisy data. After vigorous testing on real as well as simulated data, the algorithm not only performs better than current state-of-the-art methods, it also completely obviates the need for human intervention due to its automated nature.
Keywords: time series, ARIMA, auto.arima, ARIMA parameters, forecast, R function
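A minimal grid-search sketch over seasonal ARIMA orders by AIC using statsmodels; the paper's stepwise refinements are not reproduced here:

```python
import itertools
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(12)
y = np.cumsum(rng.normal(size=200)) \
    + 5 * np.sin(np.arange(200) * 2 * np.pi / 12)   # toy seasonal series

best = (np.inf, None)
for p, d, q, P, D, Q in itertools.product(range(2), range(2), range(2),
                                          range(2), range(2), range(2)):
    try:
        res = SARIMAX(y, order=(p, d, q),
                      seasonal_order=(P, D, Q, 12)).fit(disp=False)
        if res.aic < best[0]:
            best = (res.aic, (p, d, q, P, D, Q))
    except Exception:
        continue   # skip non-convergent parameter combinations
print("best AIC:", best[0], "orders (p,d,q)(P,D,Q):", best[1])
```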
Procedia PDF Downloads 165
2260 Achieving Conviviality in Terms of Collective Experience through Creative Public Spaces in Namik Kemal Square, Famagusta, North Cyprus
Authors: Shirin Shaideh, Nina Shirkhanloo
Abstract:
Creative public spaces are needed to foster conviviality in urban form. Conviviality can be enhanced by facilitating a variety of opportunities to participate in communal activities and by promoting collective experiences. In this regard, Namik Kemal Square, a major public space of the Walled City of Famagusta in North Cyprus, was identified as a creative public space because it supports collective practices through the leisure activities that enclose the space. The square also hosts creative collaborations such as festivals and outdoor exhibitions. Accordingly, this paper focuses on the issue of conviviality in urban public space, from the perspective of the square, as a major indicator of its success. The survey first provides a theoretical framework for understanding conviviality in creative public space as a means of empowering collective experience. Secondly, it discusses the essential components of conviviality in the form of the square, and finally it investigates conviviality and its determinants in Namik Kemal Square. Hence, the main challenge of this study is to examine how convivial public spaces shape collective experience, what people expect from this kind of public space, and what they perceive as a good place to be in. Since it seems essential to respond positively and inclusively to people’s need to socialize in public spaces by involving them in collective and common practices, this article aims to tease out what gives some places personality and conviviality so that we can learn to design, maintain, and manage a better-quality built environment in the future.
Keywords: conviviality, creative public space, collective experience, Namik Kemal square
Procedia PDF Downloads 429
2259 The Sequential Estimation of the Seismoacoustic Source Energy in C-OTDR Monitoring Systems
Authors: Andrey V. Timofeev, Dmitry V. Egorov
Abstract:
A practical, efficient approach is suggested for the estimation of seismoacoustic source energy in C-OTDR monitoring systems. This approach provides a sequential plan for the confidence estimation of both the seismoacoustic source energy and the absorption coefficient of the soil. The sequential plan delivers non-asymptotic guaranteed accuracy of the obtained estimates in the form of non-asymptotic confidence regions with prescribed sizes. These confidence regions are valid for a finite sample size when the distributions of the observations are unknown. Thus, the suggested estimates are non-asymptotic and nonparametric, and they guarantee the prescribed estimation accuracy in the form of a prescribed confidence-region size and a prescribed confidence coefficient value.
Keywords: nonparametric estimation, sequential confidence estimation, multichannel monitoring systems, C-OTDR-system, non-linear regression
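For flavor only, a generic sequential fixed-width confidence-interval sketch (sample until the t-based interval for a mean is narrow enough); the paper's non-asymptotic, nonparametric guarantees rest on different machinery:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(13)
half_width, conf = 0.1, 0.95
samples = list(rng.normal(5.0, 1.0, 10))      # initial batch of observations

while True:
    n = len(samples)
    s = np.std(samples, ddof=1)
    t_crit = stats.t.ppf(0.5 + conf / 2, df=n - 1)
    if t_crit * s / np.sqrt(n) <= half_width:  # interval narrow enough: stop
        break
    samples.append(rng.normal(5.0, 1.0))       # draw one more observation

print(f"stopped at n = {n}, estimate = {np.mean(samples):.3f}")
```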
Procedia PDF Downloads 356
2258 The Effect of Accounting Conservatism on Cost of Capital: A Quantile Regression Approach for MENA Countries
Authors: Maha Zouaoui Khalifa, Hakim Ben Othman, Hussaney Khaled
Abstract:
Prior empirical studies have investigated the economic consequences of accounting conservatism by examining its impact on the cost of equity capital (COEC). However, the findings are not conclusive. We posit that the inconsistent results of such an association may be attributed to the regression models used in data analysis. To address this issue, we re-examine the effect of two dimensions of accounting conservatism, unconditional conservatism (U_CONS) and conditional conservatism (C_CONS), on the COEC for a sample of listed firms from Middle Eastern and North African (MENA) countries, applying the quantile regression (QR) approach developed by Koenker and Bassett (1978). While the classical ordinary least squares (OLS) method is widely used in empirical accounting research, it may produce inefficient and biased estimates in the case of departures from normality or long-tailed error distributions. The QR method is more powerful than OLS in handling this kind of problem. It allows the coefficients on the independent variables to shift across the distribution of the dependent variable, whereas the OLS method only estimates the conditional mean effects on a response variable. We find, as predicted, that U_CONS has a significant positive effect on the COEC, whereas C_CONS has a negative impact. The findings also suggest that the effects of the two dimensions of accounting conservatism differ considerably across COEC quantiles. By comparing results from the QR method with those of OLS, this study throws more light on the association between accounting conservatism and the COEC.
Keywords: unconditional conservatism, conditional conservatism, cost of equity capital, OLS, quantile regression, emerging markets, MENA countries
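A sketch of the OLS-versus-quantile-regression comparison using statsmodels; the variable names are placeholders for the conservatism measures and the COEC, and the data are synthetic:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(14)
n = 500
u_cons = rng.normal(size=n)
c_cons = rng.normal(size=n)
# fat-tailed errors, the situation where QR is expected to outperform OLS
coec = 0.3 * u_cons - 0.2 * c_cons + rng.standard_t(3, size=n)

X = sm.add_constant(np.column_stack([u_cons, c_cons]))
print("OLS:", sm.OLS(coec, X).fit().params)    # conditional-mean effects
for q in (0.25, 0.5, 0.75):
    res = sm.QuantReg(coec, X).fit(q=q)        # effects across quantiles
    print(f"q={q}:", res.params)
```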
Procedia PDF Downloads 355
2257 Square Concrete Columns under Axial Compression
Authors: Suniti Suparp, Panuwat Joyklad, Qudeer Hussain
Abstract:
It is a well-known fact that the actual lateral forces due to natural disasters, for example, earthquakes, floods, and storms, are difficult to predict accurately. Among these natural disasters, the highest numbers of deaths and injuries worldwide have so far been recorded for earthquakes. Therefore, there is always an urgent need to establish suitable strengthening methods for existing concrete and steel structures. This paper investigates the structural performance of square concrete columns strengthened using low-cost and easily available steel clamps. The salient features of these steel clamps are their comparatively low cost, easy availability, and ease of installation. To achieve the research objectives, a large-scale experimental program was established in which a total of 12 square concrete columns were constructed and tested under pure axial compression. Three square concrete columns were tested without any steel clamps to serve as reference specimens, whereas the remaining concrete columns were externally strengthened using steel clamps. The steel clamps were installed at different spacings to investigate the best configuration. The experimental results indicate that steel clamps are very effective in altering the structural performance of square concrete columns: the externally strengthened columns demonstrate higher load-carrying capacity and ductility compared with the control specimens.
Keywords: concrete, strength, ductility, pre-stressed, steel, clamps, axial compression, columns, stress and strain
Procedia PDF Downloads 130