Search results for: hypothesis
284 The Physics of Gravity: A Hypothesis Based on Classical Physics
Authors: I. V. Kuzminov
Abstract:
An alternative hypothesis for the physics of gravitation is put forward in this paper. The hypothesis is built on the laws of classical physics: the expansion of the Universe is proposed to explain the physics of gravity. The expansion of the Universe induces resistance from the gyroscopic forces of electron rotation. The second component of the gravitational force is the resistance arising from the second derivative of linear expansion. This hypothesis does not reject the existing computational foundation, particularly as that foundation is empirically constructed. The forces of gravitation and inertia share a common nature, as has been recognized before. The presented hypothesis does not criticize existing theories of gravitation; rather, it explores a separate theme. It is assumed that the expansion of the Universe is isotropic. The proposed hypothesis indicates a fundamental direction for further research; this article does not aim to cover all aspects of future investigations.
Keywords: Gyroscopic forces, the unity of the micro- and macrocosm, the expansion of the universe, the second derivative of expansion.
283 Websites for Hypothesis Testing
Authors: František Mošna
Abstract:
E-learning has become an efficient and widespread means of education at all levels of human activity, and statistics is no exception. Unfortunately, statistics teaching usually focuses on substituting values into formulas. Suitable websites can simplify and automate calculations, freeing attention and time for the basic principles of statistics, the mathematization of real-life situations, and the subsequent interpretation of results. We introduce our own website for hypothesis testing. Its didactic aspects, the technical possibilities of the individual tools, the experience of its use, and its advantages and disadvantages are discussed in this paper. This website is not a substitute for common statistical software, but it should significantly improve the teaching of statistics at universities.
Keywords: E-learning, hypothesis testing, PHP, websites.
282 Additional Considerations on a Sequential Life Testing Approach Using a Weibull Model
Authors: D. I. De Souza, D. R. Fonseca, R. Rocha
Abstract:
In this paper we develop further the sequential life testing approach presented in a previous article [1], using an underlying two-parameter Weibull sampling distribution. The minimum life will be considered equal to zero. We again provide rules for making one of the three possible decisions as each observation becomes available: accept the null hypothesis H0; reject the null hypothesis H0; or obtain additional information by making another observation. The product being analyzed is a new type of low-alloy, high-strength steel product. To estimate the shape and scale parameters of the underlying Weibull model we use a maximum likelihood approach for censored failure data. A new example further develops the proposed sequential life testing approach.
Keywords: Sequential Life Testing, Underlying Weibull Model, Maximum Likelihood Approach, Hypothesis Testing.
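As an illustrative aside, the censored-data maximum likelihood step mentioned above can be sketched in a few lines. This is a minimal sketch under simplifying assumptions, not the authors' implementation; the failure times, censoring flags, and starting values are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

def weibull_neg_loglik(params, t, observed):
    """Negative log-likelihood for a two-parameter Weibull
    (minimum life = 0) with right-censored observations."""
    shape, scale = params
    if shape <= 0 or scale <= 0:
        return np.inf
    z = t / scale
    # Density term for observed failures, survival term for censored units
    ll_fail = np.log(shape / scale) + (shape - 1) * np.log(z) - z**shape
    ll_cens = -z**shape
    return -np.sum(np.where(observed, ll_fail, ll_cens))

# Hypothetical failure times (hours); observed=False marks censored units
t = np.array([310.0, 450.0, 520.0, 610.0, 800.0, 800.0])
observed = np.array([True, True, True, True, False, False])

res = minimize(weibull_neg_loglik, x0=[1.5, 500.0],
               args=(t, observed), method="Nelder-Mead")
shape_hat, scale_hat = res.x
print(f"shape = {shape_hat:.3f}, scale = {scale_hat:.1f}")
```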
281 Distributed Motion Control Real-Time Contouring Algorithm Implementation and Performance Test
Authors: Francisco J. Lopez-Jaquez, Sandra E. Ramirez-Jara
Abstract:
This paper presents an implementation and performance test of a distributed motion control system based on a master-slave configuration, used to move a plasma-cutting torch over a predefined trajectory. The master is a general-purpose computer running an open-source operating system platform and software development tools. Software running on the master computer generates commands in real time, and we measure performance based on a selected set of differences between expected and observed distances. We test the null hypothesis that the outcome trajectory is identical to the input trajectory against the alternative hypothesis that it is shifted to the right or left of the input. We used the Wilcoxon signed-rank test for the hypothesis test.
Keywords: Distributed, motion, control, real-time, contouring.
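As an illustrative aside, the decision rule described above can be reproduced with SciPy's Wilcoxon signed-rank test; the signed deviations below are hypothetical stand-ins for the measured expected-minus-observed distances.

```python
import numpy as np
from scipy.stats import wilcoxon

# Hypothetical signed deviations (mm) between expected and observed positions
deviations = np.array([0.12, -0.05, 0.08, 0.03, -0.02, 0.10, 0.07, -0.01])

# H0: the median deviation is zero (output trajectory identical to input);
# a two-sided alternative detects a shift to either side.
stat, p_value = wilcoxon(deviations, alternative="two-sided")
print(f"W = {stat}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Reject H0: trajectory is shifted relative to the input.")
else:
    print("Fail to reject H0: no detectable shift.")
```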
280 Bootstrap and MLS Methods-Based Individual Bioequivalence Assessment
Authors: Kongsheng Zhang, Li Ge
Abstract:
Assessing bioequivalence is a one-sided hypothesis testing process. Bootstrap and modified large-sample (MLS) methods are considered for studying individual bioequivalence (IBE); the type I error and power of the hypothesis tests are simulated and compared with FDA (2001). The results show that the modified large-sample method is equivalent to the method of FDA (2001).
Keywords: Individual bioequivalence, bootstrap, Bayesian bootstrap, modified large-sample.
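As a generic illustration of the bootstrap side of this comparison: a one-sided IBE test concludes bioequivalence when the upper confidence bound of the criterion falls below a regulatory limit. The criterion values and limit below are placeholders, not the FDA (2001) aggregate IBE metric.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-subject criterion values (placeholder for the IBE metric)
eta = rng.normal(loc=-0.5, scale=0.8, size=24)
theta_limit = 0.0          # placeholder regulatory limit
n_boot = 5000

# Percentile bootstrap of the mean criterion
boot_means = np.array([rng.choice(eta, size=eta.size, replace=True).mean()
                       for _ in range(n_boot)])
upper_95 = np.percentile(boot_means, 95)

# One-sided test: conclude bioequivalence if the 95% upper bound < limit
print(f"95% upper bound = {upper_95:.3f}")
print("IBE concluded" if upper_95 < theta_limit else "IBE not concluded")
```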
279 Further Thoughts on a Sequential Life Testing Approach Using an Inverse Weibull Model
Authors: D. I. De Souza, G. P. Azevedo, D. R. Fonseca
Abstract:
In this paper we develop further the sequential life testing approach presented in a previous article [1], using an underlying two-parameter Inverse Weibull sampling distribution. The location parameter, or minimum life, will be considered equal to zero. Once again we provide rules for making one of the three possible decisions as each observation becomes available: accept the null hypothesis H0; reject the null hypothesis H0; or obtain additional information by making another observation. The product being analyzed is a new electronic component, and little information is available about the possible values of the parameters of the corresponding underlying Inverse Weibull sampling distribution. To estimate the shape and scale parameters of the underlying Inverse Weibull model we use a maximum likelihood approach for censored failure data. A new example further develops the proposed sequential life testing approach.
Keywords: Sequential Life Testing, Inverse Weibull Model, Maximum Likelihood Approach, Hypothesis Testing.
278 Linguistic, Pragmatic and Evolutionary Factors in Wason Selection Task
Authors: Olimpia Matarazzo, Fabrizio Ferrara
Abstract:
In two studies we tested the hypothesis that the appropriate linguistic formulation of a deontic rule – i.e., the formulation which clarifies the monadic nature of deontic operators – should produce more correct responses than the conditional formulation in the Wason selection task. We tested this assumption by presenting a prescription rule and a prohibition rule in conditional vs. proper deontic formulation. We contrasted this hypothesis with two other hypotheses derived from social contract theory and relevance theory. According to the first theory, a deontic rule expressed in terms of cost-benefit should elicit a cheater-detection module, sensitive to mental-state attributions and thus able to discriminate intentional rule violations from accidental ones. We tested this prediction by distinguishing the two types of violation. According to relevance theory, performance in the selection task should improve as cognitive effect increases and cognitive effort decreases. We tested this prediction by focusing the experimental instructions on the rule vs. the action covered by the rule. In study 1, in which 480 undergraduates participated, we tested these predictions through a 2 x 2 x 2 x 2 (type of rule x rule formulation x type of violation x experimental instructions) between-subjects design. In study 2 – carried out by means of a 2 x 2 (rule formulation x type of violation) between-subjects design – we retested the rule-formulation hypothesis against the cheater-detection hypothesis through a new version of the selection task in which intentional vs. accidental rule violations were better discriminated; 240 undergraduates participated in this study. Results corroborate our hypothesis and challenge the contrasting assumptions. However, they show that the conditional formulation of deontic rules produces a lower performance than what is reported in the literature.
Keywords: Deontic reasoning; evolutionary, linguistic, logical, pragmatic factors; Wason selection task.
277 Creation and Annihilation of Spacetime Elements
Authors: Dnyanesh P. Mathur, Gregory L. Slater
Abstract:
Gravitation and the expansion of the universe at large scale are generally regarded as two completely distinct phenomena. Yet, in the General Theory of Relativity (GR), both manifest as 'curvature' of spacetime. We propose a hypothesis which treats these two 'curvature-producing' phenomena as aspects of a single underlying process. This process treats spacetime itself as composed of discrete units (Plancktons) and is 'dynamic' in the sense that these elements of spacetime are continually being both created and annihilated. It is these two complementary processes of Planckton creation and Planckton annihilation that manifest themselves as 'cosmic expansion' on the one hand and as 'gravitational attraction' on the other. The Planckton hypothesis treats spacetime as a perfect fluid, in the same manner as the co-moving frame of reference of the Friedmann equations and the Gullstrand-Painlevé metric; i.e., the Planckton hypothesis replaces 'curvature' of spacetime with the 'flow' of Plancktons (spacetime). Here we discuss how this perspective may allow a unified description of both cosmological and gravitational acceleration, as well as providing a mechanism for inducing an irreducible action at every point, associated with the creation and annihilation of Plancktons, which could be identified as the zero point energy.
Keywords: Discrete spacetime, spacetime flow, zero point energy, dark energy.
276 Semantic Support for Hypothesis-Based Research from Smart Environment Monitoring and Analysis Technologies
Authors: T. S. Myers, J. Trevathan
Abstract:
Improvements in the data fusion and data analysis phases of research are imperative due to the exponential growth of sensed data. Currently, there are developments in the Semantic Sensor Web community exploring efficient methods for the reuse, correlation, and integration of web-based data sets and live data streams. This paper describes the integration of remotely sensed data with web-available static data for use in observational hypothesis testing and the analysis phase of research. The Semantic Reef system combines semantic technologies (e.g., well-defined ontologies and logic systems) with scientific workflows to enable hypothesis-based research. A framework is presented for how the data fusion concepts from the Semantic Reef architecture map to the Smart Environment Monitoring and Analysis Technologies (SEMAT) intelligent sensor network initiative. The data collected via SEMAT and the knowledge inferred by the Semantic Reef system are ingested into the Tropical Data Hub for data discovery, reuse, curation, and publication.
Keywords: Information architecture, semantic technologies, sensor networks, ontologies.
275 Change Detection and Non-Stationary Signals Tracking by Adaptive Filtering
Authors: Mounira RouaÐùnia, Noureddine Doghmane
Abstract:
In this paper we consider the problem of change detection and non-stationary signal tracking. Using parametric estimation of signals based on least-squares lattice adaptive filters, we consider statistical parametric methods for change detection using likelihood ratios and hypothesis tests. In order to track signal dynamics, we introduce a compensation procedure in the adaptive estimation. This improves the adaptive estimation performance and speeds up its convergence after changes are detected.
Keywords: Change detection, hypothesis test, likelihood ratio, least-squares lattice adaptive filters.
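For readers unfamiliar with likelihood-ratio change detection, a minimal CUSUM sketch on the filter residuals conveys the idea; the signal, shift size, and threshold below are hypothetical, and the paper's lattice-filter estimation is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical residual sequence with a mean shift at sample 200
resid = np.concatenate([rng.normal(0.0, 1.0, 200),
                        rng.normal(1.0, 1.0, 200)])

# One-sided CUSUM of log-likelihood ratios for a mean shift of size delta
delta, sigma2, threshold = 1.0, 1.0, 10.0
g = 0.0
for n, r in enumerate(resid):
    # Log-likelihood ratio increment for N(0, s2) vs. N(delta, s2)
    s = (delta / sigma2) * (r - delta / 2.0)
    g = max(0.0, g + s)
    if g > threshold:
        print(f"Change detected at sample {n}")
        break
```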
274 Integrated Method for Detection of Unknown Steganographic Content
Authors: Magdalena Pejas
Abstract:
This article presents an integrated method for the detection of steganographic content embedded by new, unknown programs. The method is based on data mining and aggregated hypothesis testing. The article contains the theoretical basics used to deploy the proposed detection system and describes the improvements proposed for the basic system idea. The main results of experiments and implementation details are then collected and described. Finally, example test results are presented.
Keywords: Steganography, steganalysis, data embedding, data mining, feature extraction, knowledge base, system learning, hypothesis testing, error estimation, black box program, file structure.
273 Vehicle Detection Method Using Haar-Like Features on a Real-Time System
Authors: Sungji Han, Youngjoon Han, Hernsoo Hahn
Abstract:
This paper presents a robust vehicle detection approach using Haar-like features. A strong edge feature can be obtained from Haar-like features, which makes the approach very effective at removing the shadow of a vehicle on the road and allows vehicle boundaries to be detected accurately. The vehicle detection algorithm can be divided into two main steps: hypothesis generation and hypothesis verification. In the first step, vehicle candidates are determined using features such as shadow, intensity, and vertical edges. In the second step, each candidate is classified as a vehicle or not by using the symmetry of vehicle edge features. In this research, we achieve a detection rate of over 15 frames per second on our embedded system.
Keywords: Vehicle detection, Haar-like feature, single camera, real time.
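As a point of reference, OpenCV ships a Haar-cascade detector built on the same Haar-like features; the sketch below uses it in place of the authors' custom generation/verification pipeline, and the cascade file and input frame names are hypothetical.

```python
import cv2

# Hypothetical cascade file trained for vehicles (not the authors' detector)
cascade = cv2.CascadeClassifier("cars_haar_cascade.xml")

frame = cv2.imread("road_frame.jpg")          # hypothetical input frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# Multi-scale scan: each hit is a hypothesis that a window contains a vehicle
candidates = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=4)
for (x, y, w, h) in candidates:
    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
cv2.imwrite("detections.jpg", frame)
```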
272 Tests for Gaussianity of a Stationary Time Series
Authors: Adnan Al-Smadi
Abstract:
One of the primary uses of higher-order statistics in signal processing has been the detection and estimation of non-Gaussian signals in Gaussian noise of unknown covariance. This is motivated by the ability of higher-order statistics to suppress additive Gaussian noise. In this paper, several methods to test a given process for non-Gaussianity are presented. These methods include the histogram plot, the kurtosis test, and hypothesis testing using cumulants and the bispectrum of the available sequence. The hypothesis testing is performed by constructing a statistic to test whether the bispectrum of the given signal is non-zero. A zero bispectrum is not a proof of Gaussianity; hence, other tests such as the kurtosis test should also be employed. Examples are given to demonstrate the performance of the presented methods.
Keywords: Non-Gaussian, bispectrum, kurtosis, hypothesis testing, histogram.
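A kurtosis-based Gaussianity check is available off the shelf; the sketch below is a minimal illustration on synthetic data, not the cumulant/bispectrum machinery of the paper.

```python
import numpy as np
from scipy.stats import kurtosis, kurtosistest

rng = np.random.default_rng(7)
gaussian = rng.normal(size=2000)
laplacian = rng.laplace(size=2000)   # heavy-tailed, non-Gaussian

for name, x in [("Gaussian", gaussian), ("Laplacian", laplacian)]:
    stat, p = kurtosistest(x)        # H0: kurtosis equals that of a normal
    print(f"{name}: excess kurtosis = {kurtosis(x):+.3f}, p = {p:.4f}")
```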
271 Organic Contribution on Particles Formed on the Pacific Ocean: From Phytoplankton Blooms to Climate
Authors: Petri Vaattovaara, Luke Cravigan, Zoran Ristovski, Marc Mallet, Ari Laaksonen, Sarah Lawson, Nick Talbot, Gustavo Olivares, Mike Harvey, Cliff Law
Abstract:
These SOAP-project Pacific Ocean measurements reveal that phytoplankton blooms combined with sunny conditions enable a secondary organic contribution to ultrafine particle size and composition, and thus to cloud formation ability, and ultimately to climate. This is in agreement with observations from other biologically active regions on the presence of secondary organics, although the exact fraction also depends on the local marine life (e.g., plankton blooms, seaweeds, corals). An organic contribution clearly needs to be added to the CLAW hypothesis.
Keywords: Climate, marine aerosols, phytoplankton, secondary organics, CLAW hypothesis.
270 Child Homicide Victimization and Community Context: A Research Note
Authors: Bohsiu Wu
Abstract:
Among serious crimes, child homicide is a rather rare event. However, the killing of children stirs up a type of emotion in society that overshadows other criminal acts. This study examines the relevance of three possible community-level explanations for child homicide: social deprivation, female empowerment, and social isolation. The social deprivation hypothesis posits that child homicide results from a lack of resources in communities. The female empowerment hypothesis argues that a higher female status translates into a higher capability to prevent child homicide. Finally, the social isolation hypothesis regards child homicide as a result of a lack of social connectivity. Child homicide data, aggregated by US postal ZIP codes in California from 1990 to 1999, were analyzed with a negative binomial regression. The results demonstrate that social deprivation is the most salient and consistent predictor among all factors in explaining child homicide victimization at the ZIP-code level. Both social isolation and female labor force participation are weak predictors of child homicide victimization across communities. Further, the negative binomial regression shows that it is communities with a higher, not lower, degree of female labor force participation that are associated with higher counts of child homicide. It is possible that poor communities with a higher level of female employment have a lesser capacity to provide the necessary care and protection for children. Policies aimed at reducing social deprivation and strengthening female empowerment have the potential to reduce child homicide in the community.
Keywords: Child homicide, deprivation, empowerment, isolation.
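A negative binomial regression of the kind reported here can be set up in a few lines of statsmodels; the covariates and counts below are simulated stand-ins for the ZIP-code-level measures.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 500  # hypothetical number of ZIP codes

df = pd.DataFrame({
    "deprivation": rng.normal(size=n),   # social deprivation index
    "female_lfp": rng.normal(size=n),    # female labor force participation
    "isolation": rng.normal(size=n),     # social isolation index
})
# Hypothetical overdispersed counts of child homicides
mu = np.exp(-2.0 + 0.6 * df["deprivation"] + 0.2 * df["female_lfp"])
df["homicides"] = rng.negative_binomial(n=1.5, p=1.5 / (1.5 + mu))

X = sm.add_constant(df[["deprivation", "female_lfp", "isolation"]])
model = sm.GLM(df["homicides"], X,
               family=sm.families.NegativeBinomial()).fit()
print(model.summary())
```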
269 Six Sigma Assessment in the Latvian Commercial Banking Sector
Abstract:
The goals of the present research are to assess Six Sigma implementation in Latvian commercial banks and to identify the perceived benefits of its implementation. To achieve these goals, the authors used a sequential explanatory method. To obtain empirical data, the authors developed a questionnaire and adapted it for the employees of Latvian commercial banks. The questions are related to Six Sigma implementation and its perceived benefits. The questionnaire mainly consists of closed questions evaluated on a 5-point Likert scale. The empirical data obtained show that of the two hypotheses put forward in the present research, Hypothesis 1 has to be rejected, while Hypothesis 2 has been partially confirmed. The authors also faced research limitations related to the fact that the questionnaire participants belong to different ranks of the organizational hierarchy.
Keywords: Six Sigma, Quality, Commercial banking sector, Latvia.
268 Internet Purchases in European Union Countries: Multiple Linear Regression Approach
Authors: Ksenija Dumičić, Anita Čeh Časni, Irena Palić
Abstract:
This paper examines the influence of economic and Information and Communication Technology (ICT) development on the recent increase in Internet purchases by individuals in European Union member states. After a growing trend in Internet purchases in the EU27 was noticed, all-possible-regressions analysis was applied using nine independent variables for 2011. Finally, two linear regression models were studied in detail. The simple linear regression analysis confirmed the research hypothesis that Internet purchases in the analyzed EU countries are positively correlated with the statistically significant variable Gross Domestic Product per capita (GDPpc). The multiple linear regression model with four regressors reflecting the level of ICT development indicates that ICT development is crucial for explaining Internet purchases by individuals, confirming the research hypothesis.
Keywords: European Union, Internet purchases, multiple linear regression model, outlier
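The two regression models can be reproduced with statsmodels; the variable names below follow the abstract, but the data frame holds hypothetical placeholder values rather than the Eurostat-style indicators.

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical country-level data for 2011 (placeholder values)
df = pd.DataFrame({
    "internet_purchases": [18, 32, 45, 12, 52, 28],  # % of individuals
    "gdp_pc":             [14, 26, 35, 10, 41, 22],  # GDP per capita, th. EUR
    "broadband":          [20, 28, 33, 15, 38, 25],  # ICT indicators below
    "internet_use":       [55, 70, 82, 48, 88, 66],
    "ict_skills":         [30, 45, 55, 25, 60, 40],
    "mobile_access":      [60, 75, 85, 50, 90, 70],
})

# Simple model: purchases on GDP per capita
simple = sm.OLS(df["internet_purchases"],
                sm.add_constant(df[["gdp_pc"]])).fit()

# Multiple model with four ICT-development regressors
multiple = sm.OLS(df["internet_purchases"],
                  sm.add_constant(df[["broadband", "internet_use",
                                      "ict_skills", "mobile_access"]])).fit()
print(simple.params, multiple.rsquared, sep="\n")
```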
267 Arterial CO2 Pressure Drives Ventilation with a Time Delay during Recovery from an Impulse-like Exercise without Metabolic Acidosis
Authors: R. Afroundeh, T. Arimitsu, R. Yamanaka, C. S. Lian, T. Yunoki, T. Yano, K. Shirakawa
Abstract:
We investigated the hypothesis that arterial CO2 pressure (PaCO2) drives ventilation (VE) with a time delay during recovery from short impulse-like exercise (10 s) with a work load of 200 W. VE and end-tidal CO2 pressure (PETCO2) were measured continuously during the rest, warm-up, exercise, and recovery periods. PaCO2 was predicted (PaCO2pre) from PETCO2 and tidal volume (VT). PETCO2 and PaCO2pre peaked at 20 s of recovery. VE increased and peaked at the end of exercise and then decreased during recovery; however, it peaked again at 30 s of recovery, 10 s later than the peak of PaCO2pre. The relationship between VE and PaCO2pre was not significant when using values obtained at the same time, but it was significant when the VE values were taken 10 s later than the PaCO2pre values. The results support our hypothesis that PaCO2 drives VE with a time delay.
Keywords: Arterial CO2 pressure, impulse-like exercise, time delay, ventilation.
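The 10-s lead of PaCO2pre over VE can be found objectively by shifting one series against the other and maximizing the correlation; the sketch below uses hypothetical 1-Hz recovery traces, not the measured data.

```python
import numpy as np

def best_lag(x, y, max_lag):
    """Return the lag (in samples) of y behind x that maximizes
    the Pearson correlation between x[t] and y[t + lag]."""
    lags = range(0, max_lag + 1)
    corrs = [np.corrcoef(x[:len(x) - k], y[k:])[0, 1] if k else
             np.corrcoef(x, y)[0, 1] for k in lags]
    return max(lags, key=lambda k: corrs[k]), corrs

# Hypothetical 1-Hz traces: VE echoes PaCO2pre with a 10-sample delay
rng = np.random.default_rng(5)
paco2 = 40 + 3 * np.exp(-((np.arange(120) - 20) / 15.0) ** 2)
ve = np.roll(paco2, 10) + rng.normal(0, 0.1, 120)

lag, corrs = best_lag(paco2, ve, max_lag=30)
print(f"VE lags PaCO2pre by {lag} s (r = {corrs[lag]:.3f})")
```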
266 An Investigation into the Role of Market Beta in Asset Pricing: Evidence from the Romanian Stock Market
Authors: Ioan Popa, Radu Lupu, Cristiana Tudor
Abstract:
In this paper, we apply the Fama-MacBeth (FM) methodology to the cross-section of Romanian-listed common stocks and investigate the explanatory power of market beta on the cross-section of common stock returns from the Bucharest Stock Exchange. Various assumptions are empirically tested, such as linearity, market efficiency, the "no systematic effect of non-beta risk" hypothesis, and the positive expected risk-return trade-off hypothesis. We find that the Romanian stock market shows the same properties as other emerging markets in terms of efficiency and the significance of linear risk-return models. Our analysis included weekly returns from January 2002 until May 2010, and portfolio formation, estimation, and testing were performed in a rolling manner using 51 observations (one year) for each stage of the analysis.
Keywords: Bucharest Stock Exchange, Fama-MacBeth methodology, systematic risk, non-linear risk-return dependence.
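The FM two-pass procedure referenced here has a compact generic form: time-series regressions give each portfolio's beta, then period-by-period cross-sectional regressions of returns on beta are averaged. The returns below are simulated placeholders, not Bucharest Stock Exchange data.

```python
import numpy as np

rng = np.random.default_rng(2)
T, N = 51, 20                      # 51 weekly observations, 20 portfolios

market = rng.normal(0.002, 0.02, T)
betas_true = rng.uniform(0.5, 1.5, N)
returns = market[:, None] * betas_true + rng.normal(0, 0.01, (T, N))

# Pass 1: time-series regression of each portfolio on the market
X = np.column_stack([np.ones(T), market])
betas_hat = np.linalg.lstsq(X, returns, rcond=None)[0][1]   # slope row

# Pass 2: cross-sectional regression each week, then average the slopes
Z = np.column_stack([np.ones(N), betas_hat])
gammas = np.array([np.linalg.lstsq(Z, returns[t], rcond=None)[0]
                   for t in range(T)])
gamma_mean = gammas.mean(axis=0)
gamma_t = gamma_mean / (gammas.std(axis=0, ddof=1) / np.sqrt(T))
print(f"risk premium = {gamma_mean[1]:.4f}, t-stat = {gamma_t[1]:.2f}")
```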
265 Analysis of the Coupled Stretching-Bending Problem of Stiffened Plates by a BEM Formulation Based on Reissner's Hypothesis
Authors: Gabriela R. Fernandes, Danilo H. Konda, Luiz C. F. Sanches
Abstract:
In this work, the plate bending formulation of the boundary element method (BEM), based on Reissner's hypothesis, is extended to the analysis of plates reinforced by beams, taking into account membrane effects. The formulation is derived by assuming a zoned body where each sub-region defines a beam or a slab, and all of them are represented by a chosen reference surface. Equilibrium and compatibility conditions are automatically imposed by the integral equations, which treat this composed structure as a single body. In order to reduce the number of degrees of freedom, the problem values defined on the interfaces are written in terms of their values on the beam axis. Initially, separate equations are derived for the bending and stretching problems, but in the final system of equations the two problems are coupled and cannot be treated separately. Finally, some numerical examples whose analytical results are known are presented to show the accuracy of the proposed model.
Keywords: Boundary elements, building floor structures, plate bending.
264 The Quality Assessment of Seismic Reflection Survey Data Using Statistical Analysis: A Case Study of Fort Abbas Area, Cholistan Desert, Pakistan
Authors: U. Waqas, M. F. Ahmed, A. Mehmood, M. A. Rashid
Abstract:
In geophysical exploration surveys, the quality of acquired data is of significant importance before the data processing and interpretation phases are executed. In this study, 2D seismic reflection survey data from the Fort Abbas area, Cholistan Desert, Pakistan were taken as a test case in order to assess their quality on a statistical basis, using the normalized root mean square error (NRMSE), Cronbach's alpha test (α), and null hypothesis tests (t-test and F-test). The analysis challenged the quality of the acquired data and highlighted significant errors in the acquired database. It is established that the study area is flat, tectonically little affected, and rich in oil and gas reserves. However, subsurface 3D modeling and contouring using the acquired database revealed high degrees of structural complexity and intense folding. The NRMSE showed the highest percentage of residuals between the estimated and predicted cases. The outcomes of the hypothesis testing also demonstrated the bias and erraticness of the acquired database, and the low estimated value of alpha (α) in Cronbach's alpha test confirmed its poor reliability. A database of very low quality needs excessive static correction or, in some cases, reacquisition of the data, which is usually not feasible on economic grounds. The outcomes of this study could be used to assess the quality of large databases and could further serve as a guideline for establishing database quality assessment models to support more informed decisions in hydrocarbon exploration.
Keywords: Data quality, null hypothesis, seismic lines, seismic reflection survey.
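The two headline statistics are easy to compute directly; the sketch below is a generic illustration on placeholder arrays, not the survey database.

```python
import numpy as np

def nrmse(observed, predicted):
    """Normalized RMSE: RMSE divided by the observed range."""
    rmse = np.sqrt(np.mean((observed - predicted) ** 2))
    return rmse / (observed.max() - observed.min())

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_cases, n_items) array."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Hypothetical estimated vs. predicted travel times along a seismic line
obs = np.array([1.20, 1.35, 1.42, 1.58, 1.71])
pred = np.array([1.25, 1.30, 1.50, 1.52, 1.80])
print(f"NRMSE = {nrmse(obs, pred):.3f}")

# Hypothetical repeated measurements (cases x items) for reliability
data = np.array([[3, 4, 3], [2, 2, 3], [4, 5, 4], [3, 3, 2]])
print(f"alpha = {cronbach_alpha(data):.3f}")
```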
263 Identification of Regulatory Mechanism of Orthostatic Response
Authors: E. Hlavacova, J. Chrenova, Z. Rausova, M. Vlcek, A. Penesova, L. Dedik
Abstract:
En bloc modeling assumes that all phases of the orthostatic test are modeled with a single mathematical model, which allows a complex parametric view of the orthostatic response. The work presents the implementation of a mathematical model for processing measurements of systolic blood pressure, diastolic blood pressure, and heart rate performed on volunteers during an orthostatic test. The original model hypothesis, that every postural change means only one stressor, did not comply with the measured time profiles of the physiological circulatory factors. The results of the identification support the hypothesis that the second postural change of the orthostatic test causes induced stressors, with the observation of a physiological regulation mechanism. The effect is most pronounced in the heart rate and diastolic blood pressure time profiles and least pronounced in the systolic blood pressure measurements. The presented study gives a new view of the orthostatic test, with impact on clinical practice.
Keywords: En bloc modeling, physiological circulatory factor, postural change, stressor
262 BEM Formulations Based on Kirchhoff's Hypothesis to Perform Linear Bending Analysis of Plates Reinforced by Beams
Authors: Gabriela R. Fernandes, Renato F. Denadai, Guido J. Denipotti
Abstract:
In this work, two formulations of the boundary element method (BEM) for the linear bending analysis of plates reinforced by beams are discussed. Both formulations are based on Kirchhoff's hypothesis, and they are obtained from the reciprocity theorem applied to zoned plates, where each sub-region defines a beam or a slab. In the first model the problem values are defined along the interfaces and the external boundary. Then, in order to reduce the number of degrees of freedom, kinematic hypotheses are assumed along the beam cross-section, leading to a second formulation where the collocation points are defined along the beam skeleton instead of being placed on the interfaces. In these formulations no approximation of the generalized forces along the interface is required. Moreover, compatibility and equilibrium conditions along the interface are automatically imposed by the integral equation. Thus, these formulations require fewer approximations, and the total number of degrees of freedom is reduced. The numerical examples discuss the differences between these two BEM formulations and compare the results to those of a well-known finite element code.
Keywords: Boundary elements, building floor structures, plate bending.
261 Analysis of Causality between Economic Growth and Carbon Emissions: The Case of Mexico 1971-2011
Authors: Mario Gómez, José Carlos Rodríguez
Abstract:
This paper analyzes the Environmental Kuznets Curve (EKC) hypothesis by testing the causality relationships between economic activity, trade openness, and carbon dioxide emissions in Mexico (1971-2011). The results show that there are three long-run relationships between production, trade openness, energy consumption, and carbon dioxide emissions. The EKC hypothesis was not verified in this research. Indeed, evidence was found of short-term unidirectional causality from GDP and GDP squared to carbon dioxide emissions and from GDP, GDP squared, and trade openness (TO) to energy consumption (EC), as well as bidirectional causality between TO and GDP. Finally, evidence was found of long-term unidirectional causality from all variables to carbon emissions. These results suggest that a reduction in energy consumption or economic activity, or an increase in trade openness, would reduce pollution.
Keywords: Energy consumption, environmental Kuznets curve, economic growth, causality, co-integration.
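Short-run causality findings of this kind are commonly obtained with Granger tests; the sketch below shows the statsmodels call on simulated stand-ins for the Mexican series (it omits the cointegration and error-correction steps that the long-run results require).

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(4)
T = 41  # 1971-2011

# Hypothetical stationary (differenced) series: GDP growth and CO2 growth,
# with CO2 partly driven by lagged GDP
gdp = rng.normal(0.02, 0.01, T)
co2 = 0.5 * np.roll(gdp, 1) + rng.normal(0, 0.005, T)

data = pd.DataFrame({"co2": co2[1:], "gdp": gdp[1:]})
# H0: 'gdp' does NOT Granger-cause 'co2' (second column -> first column)
results = grangercausalitytests(data[["co2", "gdp"]], maxlag=2)
```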
260 Between Legal Authority and Epistemic Competence: A Case Study of the Brazilian Supreme Court
Authors: Júlia Massadas
Abstract:
The objective of this paper is to analyze the role played by the institution of public hearings in the Brazilian Supreme Court. Public hearings have been regulated since 1999 by Brazilian Laws nº 9.868 and nº 9.882 and by the Internal Regiment of the Brazilian Supreme Court. According to this legislation, public hearings are supposed to be called when a matter or circumstance of fact must be clarified, which can be done by hearing the testimony of persons with expertise and authority in the theme related to the cause. This work investigates the role played by the public hearings and by the experts in the Brazilian Supreme Court. The hypotheses of this research are that (i) the public hearings in the Brazilian Supreme Court are used to uphold a rhetoric of democratic legitimacy of the Court's decisions, and (ii) the legislative intentions have been distorted. To test these hypotheses, the adopted methodology involves an empirical study of Brazilian jurisprudence. As a conclusion, it follows that the public hearings convened by the Brazilian Supreme Court do not correspond, in practice, to the role assigned to them by the Congress, since they do not properly serve epistemic interests. The public hearings not only fail to legitimate the decisions democratically but also fail to properly clarify technical issues.
Keywords: Brazilian Supreme Court, constitutional law, public hearings, epistemic competence, legal authority.
259 Tolerance and Perspective towards Disability: A Mixed Methods Study
Authors: L. Koštić, P. Karaman
Abstract:
Society encompasses many diversities according to sex, age, religion, abilities or disabilities, education, etc. Given these differences, everybody needs to be tolerated and equally included in society, and in order to provide quality inclusion, society needs to tolerate differences. This study relates to differences in disability. To examine tolerance towards disability and inclusion, the study was conducted with students attending regular elementary and high school. The main goal was to examine their attitudes towards their classmates and towards elderly people with disabilities. The study begins with the hypothesis that the environment has a highly developed tolerance towards people with disabilities, regardless of age. The sample was divided according to tasks and methodology of analysis. Students attending regular elementary school were asked to make drawings of their classmates with disabilities; the drawings were analyzed using quantitative methodology according to the colors the children used and the position of the character on the paper. Students attending high school and members of the general population were asked to complete a questionnaire designed for this study during a workshop held on the International Day for Tolerance; responses were analyzed using qualitative methodology. The hypothesis was confirmed.
Keywords: Classmates, disability, students, tolerance.
258 Experimenting the Influence of Input Modality on Involvement Load Hypothesis
Authors: Mohammad Hassanzadeh
Abstract:
As far as incidental vocabulary learning is concerned, the basic contention of the Involvement Load Hypothesis (ILH) is that retention of unfamiliar words is generally conditional upon the degree of involvement in processing them. This study examined input modality and incidental vocabulary uptake in a task-induced setting whereby three variously loaded task types (marginal glosses, fill-in task, and sentence writing) were alternately assigned to one group of students at Allameh Tabataba'i University (n = 21) during six classroom sessions. While one round of exposure comprised audiovisual material (TV talk shows), the second round consisted of textual materials with approximately similar subject matter (reading texts). In both conditions, however, the tasks were equivalent to one another. Taken together, the study pursued the dual objectives of, first, establishing a litmus test for the ILH and its proposed components of 'need', 'search', and 'evaluation', and, second, examining the superiority of exposure to audiovisual input over written input when tasks are incorporated. At the end of each treatment session, a vocabulary active recall test was administered to measure incidental gains. A one-way analysis of variance revealed that the audiovisual intervention yielded higher gains than the written version even when differing tasks were included. Meanwhile, task three (sentence writing) turned out to be the most efficient in tapping learners' active recall of the target vocabulary items. In addition to shedding light on the superiority of audiovisual input over written input when circumstances are relatively held constant, this study, for the most part, supported the underlying tenets of the ILH.
Keywords: Evaluation, incidental vocabulary learning, input mode, involvement load hypothesis, need, search.
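The one-way ANOVA reported above has a one-line SciPy equivalent; the recall scores below are hypothetical.

```python
from scipy.stats import f_oneway

# Hypothetical active-recall scores per task type (audiovisual condition)
glosses = [4, 5, 3, 6, 4, 5, 4]
fill_in = [6, 7, 5, 6, 7, 6, 5]
sentence_writing = [8, 9, 7, 8, 9, 8, 7]

f_stat, p_value = f_oneway(glosses, fill_in, sentence_writing)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```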
257 Variational Explanation Generator: Generating Explanation for Natural Language Inference Using Variational Auto-Encoder
Authors: Zhen Cheng, Xinyu Dai, Shujian Huang, Jiajun Chen
Abstract:
Recently, explanatory natural language inference has attracted much attention for the interpretability of logical relationship prediction, a task also known as explanation generation for Natural Language Inference (NLI). Existing explanation generators based on the discriminative encoder-decoder architecture have achieved noticeable results. However, we find that these discriminative generators usually produce explanations with correct evidence but incorrect logical semantics. This is because the logical information is implicitly encoded in the premise-hypothesis pairs and is difficult to model. In fact, the same logical information exists in both the premise-hypothesis pair and the explanation, and it is easy to extract the logical information that is explicitly contained in the target explanation. Hence we assume that there exists a latent space of logical information while generating explanations. Specifically, we propose a generative model called the Variational Explanation Generator (VariationalEG) with a latent variable to model this space. Trained under the guidance of the explicit logical information in the target explanations, the latent variable in VariationalEG can effectively capture the implicit logical information in premise-hypothesis pairs. Additionally, to tackle the problem of posterior collapse while training VariationalEG, we propose a simple yet effective approach called logic supervision on the latent variable to force it to encode logical information. Experiments on the explanation generation benchmark e-SNLI (explanation-Stanford Natural Language Inference) demonstrate that the proposed VariationalEG achieves a significant improvement over previous studies and yields a state-of-the-art result. Furthermore, we analyze the generated explanations to demonstrate the effect of the latent variable.
Keywords: Natural Language Inference, explanation generation, variational auto-encoder, generative model.
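The latent-variable machinery described here follows the standard VAE recipe: reparameterized sampling plus a KL term, with an auxiliary classifier on z standing in for the paper's logic supervision. The sketch below is a schematic PyTorch fragment under those assumptions, not the authors' released model; all dimensions and names are hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LatentLogicHead(nn.Module):
    """Reparameterized latent variable with an auxiliary logic classifier."""
    def __init__(self, enc_dim=256, z_dim=64, n_labels=3):
        super().__init__()
        self.to_mu = nn.Linear(enc_dim, z_dim)
        self.to_logvar = nn.Linear(enc_dim, z_dim)
        self.logic_clf = nn.Linear(z_dim, n_labels)  # entail/neutral/contradict

    def forward(self, h, logic_label):
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparam trick
        # KL(q(z|x) || N(0, I))
        kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
        # "Logic supervision": force z to predict the logic relation,
        # discouraging posterior collapse
        logic_loss = F.cross_entropy(self.logic_clf(z), logic_label)
        return z, kl, logic_loss

# z would then condition the decoder; total loss = reconstruction + kl + logic
head = LatentLogicHead()
z, kl, logic = head(torch.randn(8, 256), torch.randint(0, 3, (8,)))
```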
256 Investigation of Buoyant Parameters of k-ε Turbulence Model in Gravity Stratified Flows
Authors: A. Majid Bahari, Kourosh Hejazi
Abstract:
Different variants of the buoyancy-affected terms in the k-ε turbulence model have been utilized to predict the flow parameters more accurately and to investigate the applicability of alternative buoyant k-ε turbulence closures in the numerical simulation of a horizontal gravity current. The additional non-isotropic turbulent stress due to buoyancy has been considered in the production term, based on the Algebraic Stress Model (ASM). In order to account for the turbulent scalar fluxes, the general gradient diffusion hypothesis has been used along with the Boussinesq gradient diffusion hypothesis, with a variable turbulent Schmidt number and an additional empirical constant c3ε. To simulate the buoyant flow domain, a 2D vertical numerical model (WISE, Width Integrated Stratified Environments), based on the Reynolds-Averaged Navier-Stokes (RANS) equations, has been deployed, and the model has been further developed for the different k-ε turbulence closures. Results are compared against measured laboratory values of a saline gravity current to identify the most efficient turbulence model.
Keywords: Buoyant flows, Buoyant k-ε turbulence model, saline gravity current.
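For reference, the buoyancy production term that the gradient diffusion hypotheses yield in the k and ε equations is commonly written as follows; this is the standard textbook form, not necessarily the exact closure adopted in WISE.

```latex
% Buoyancy production under the Boussinesq gradient diffusion hypothesis,
% with turbulent Schmidt number \sigma_t:
G_b = -\frac{g}{\rho_0}\,\frac{\nu_t}{\sigma_t}\,
      \frac{\partial \rho}{\partial z},
% entering the dissipation equation through the empirical constant
% c_{3\varepsilon}:
\frac{D\varepsilon}{Dt} = \cdots
  + c_{1\varepsilon}\,\frac{\varepsilon}{k}\,
    \bigl(P_k + c_{3\varepsilon}\,G_b\bigr)
  - c_{2\varepsilon}\,\frac{\varepsilon^{2}}{k}
```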
255 Evolution of Performance Measurement Methods in Conditions of Uncertainty: The Implementation of Fuzzy Sets in Performance Measurement
Authors: E. A. Tkachenko, E. M. Rogova, V. V. Klimov
Abstract:
One of the basic issues of development management is performance measurement as a prerequisite for identifying the achievement of development objectives. The aim of our research is to develop an improved model for assessing a company's development results. The model should take into account the cyclical nature of development and the high degree of uncertainty involved in numerous management tasks. Our hypotheses may be formulated as follows. Hypothesis 1: the cycle of a company's development may be studied from the standpoint of a project cycle; to do that, the methods and tools of project analysis are to be used. Hypothesis 2: the problem of uncertainty when justifying managerial decisions within the framework of a company's development cycle can be solved through the use of the mathematical apparatus of fuzzy logic. A reasoned justification of the validity of these hypotheses is given in the article. The fuzzy logic toolkit is applied to the case of a technology shift within an enterprise. It is shown that some restrictions in performance measurement incurred by conventional methods can be eliminated by implementing the fuzzy logic apparatus in performance measurement models.
Keywords: Fuzzy logic, fuzzy sets, performance measurement, project analysis.
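The fuzzy-set step can be illustrated with triangular membership functions over a performance indicator; the linguistic terms and breakpoints below are hypothetical.

```python
def triangular(x, a, b, c):
    """Triangular membership with feet at a and c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical linguistic terms over a normalized performance score
x = 0.62
memberships = {
    "low":    triangular(x, -0.01, 0.0, 0.5),
    "medium": triangular(x, 0.2, 0.5, 0.8),
    "high":   triangular(x, 0.5, 1.0, 1.01),
}
print(memberships)  # partial membership in both 'medium' and 'high'
```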