Search results for: Bayesian estimation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2141

1121 Robust Variable Selection Based on Schwarz Information Criterion for Linear Regression Models

Authors: Shokrya Saleh A. Alshqaq, Abdullah Ali H. Ahmadini

Abstract:

The Schwarz information criterion (SIC) is a popular tool for selecting the best variables in regression datasets. However, SIC is defined using an unbounded estimator, namely the least-squares (LS) estimator, which is highly sensitive to outlying observations, especially bad leverage points. A method for robust variable selection based on SIC for linear regression models is thus needed. This study investigates the robustness properties of SIC by deriving its influence function and proposes a robust SIC based on the MM-estimation scale. The aim is to produce a criterion that can reliably select accurate models in the presence of vertical outliers and high leverage points. The advantages of the proposed robust SIC are demonstrated through a simulation study and an analysis of a real dataset.
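The classical criterion the abstract builds on can be sketched as follows, with the residual scale left pluggable so that a robust scale can replace the outlier-sensitive least-squares scale. The paper proposes an MM-estimation scale; the normalized MAD below is used purely as an illustrative stand-in.

```python
import numpy as np

def sic(y, X, scale_estimate=None):
    """Schwarz information criterion for a linear model y ~ X.

    scale_estimate: optional robust residual-scale estimate; when None,
    the classical least-squares scale is used. This is the slot where a
    robust scale (e.g. the paper's MM-estimation scale) would be plugged in.
    """
    n, p = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    s2 = scale_estimate ** 2 if scale_estimate is not None else resid @ resid / n
    return n * np.log(s2) + p * np.log(n)

def mad_scale(y, X):
    """Illustrative robust residual scale (normalized MAD of LS residuals);
    a stand-in for, not the same as, an MM-estimation scale."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1.4826 * np.median(np.abs(resid - np.median(resid)))
```

In use, each candidate subset of predictors is scored and the subset with the smallest SIC is selected.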

Keywords: influence function, robust variable selection, robust regression, Schwarz information criterion

Procedia PDF Downloads 140
1120 Combining the Dynamic Conditional Correlation and Range-GARCH Models to Improve Covariance Forecasts

Authors: Piotr Fiszeder, Marcin Fałdziński, Peter Molnár

Abstract:

The dynamic conditional correlation model of Engle (2002) is one of the most popular multivariate volatility models. However, this model is based solely on closing prices, although it has been documented in the literature that the high and low prices of the day can be used for efficient volatility estimation. We therefore suggest a model which incorporates high and low prices into the dynamic conditional correlation framework. Empirical evaluation of this model is conducted on three datasets: currencies, stocks, and commodity exchange-traded funds. The use of realized variances and covariances as proxies for the true variances and covariances allows us to reach a strong conclusion: our model outperforms not only the standard dynamic conditional correlation model but also a competing range-based dynamic conditional correlation model.
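The efficiency gain from daily high and low prices that motivates the model is commonly operationalised with range-based estimators such as Parkinson's; a minimal sketch is below (the paper's Range-GARCH recursion itself is not specified in the abstract):

```python
import numpy as np

def parkinson_var(high, low):
    """Parkinson (1980) range-based variance proxy for one period:
    (ln(H/L))^2 / (4 ln 2). Under a driftless diffusion it is markedly
    more efficient than the squared close-to-close return."""
    return np.log(high / low) ** 2 / (4.0 * np.log(2.0))
```

Such per-asset range variances can then feed the volatility stage of a DCC-type model in place of squared returns.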

Keywords: volatility, DCC model, high and low prices, range-based models, covariance forecasting

Procedia PDF Downloads 183
1119 ANFIS Approach for Locating Faults in Underground Cables

Authors: Magdy B. Eteiba, Wael Ismael Wahba, Shimaa Barakat

Abstract:

This paper presents a fault identification, classification, and fault location estimation method based on the discrete wavelet transform and the Adaptive Network Fuzzy Inference System (ANFIS) for medium-voltage cables in the distribution system. Different faults and locations are simulated in ATP/EMTP, and selected features of the wavelet-transformed signals are used as inputs for training the ANFIS. An accurate fault classifier and locator algorithm was then designed, trained, and tested using current samples only. The results obtained from the ANFIS output were compared with the real output, and the percentage error between them was found to be less than three percent. Hence, it can be concluded that the proposed technique offers high accuracy in both fault classification and fault location.

Keywords: ANFIS, fault location, underground cable, wavelet transform

Procedia PDF Downloads 513
1118 Competitiveness of African Countries through Open Quintuple Helix Model

Authors: B. G. C. Ahodode, S. Fekkaklouhail

Abstract:

Following the triple helix theory, this study evaluates the effect of the innovation system on African countries’ competitiveness while taking external contributions into account, given that developing countries (especially African countries) are characterized by weak innovation systems whose synergy operates more at the foreign level than at the domestic level. To do this, we used correlation tests, parsimonious regression techniques, and panel estimation between 2013 and 2016. Results show that the degree of innovation synergy has a significant effect on competitiveness in Africa. Specifically, while the opening system (OPESYS) and social system (SOCSYS) contribute, in order of importance, 0.634 and 0.284 significant points of increase in the GCI (at the 1% level), the political system (POLSYS) and educational system (EDUSYS) increase it by only 0.322 and 0.169 (at the 5% level), while the effect of the economic system (ECOSYS) on the Global Competitiveness Index is not significant.

Keywords: innovation system, innovation, competitiveness, Africa

Procedia PDF Downloads 69
1117 Introducing Two Species of Parastagonospora (Phaeosphaeriaceae) on Grasses from Italy and Russia, Based on Morphology and Phylogeny

Authors: Ishani D. Goonasekara, Erio Camporesi, Timur Bulgakov, Rungtiwa Phookamsak, Kevin D. Hyde

Abstract:

Phaeosphaeriaceae comprises a large number of species occurring mainly on grasses and cereal crops as endophytes, saprobes and, especially, pathogens. Parastagonospora is an important genus in Phaeosphaeriaceae that includes pathogens causing leaf and glume blotch on cereal crops. Currently, fifteen Parastagonospora species are described, including both pathogens and saprobes. In this study, one sexual morph species and one asexual morph species, occurring as saprobes on members of Poaceae, are introduced based on morphology and a combined molecular analysis of LSU, SSU, ITS, and RPB2 gene sequence data. The sexual morph species Parastagonospora elymi was isolated from a Russian sample of Elymus repens, a grass commonly known as couch grass, which is important for grazing animals, as a weed, and in traditional Austrian medicine. P. elymi is similar to the sexual morph of P. avenae in having cylindrical asci bearing eight overlapping biseriate, fusiform ascospores, but can be distinguished by its subglobose to conical, wider ascomata. In addition, no sheath was observed surrounding the ascospores. The asexual morph species was isolated from a specimen from Italy on Dactylis glomerata, a common grass of temperate regions. It is introduced as Parastagonospora macrouniseptata, a coelomycete, and bears a close resemblance to P. allouniseptata and P. uniseptata in having globose to subglobose, pycnidial conidiomata and hyaline, cylindrical, 1-septate conidia. However, the new species can be distinguished by its much larger conidiomata. In the phylogenetic analysis, which comprised maximum likelihood and Bayesian analyses, P. elymi received low bootstrap support but was well segregated from other strains within the Parastagonospora clade. P. neoallouniseptata formed a sister clade with P. allouniseptata with high statistical support.

Keywords: dothideomycetes, multi-gene analysis, Poaceae, saprobes, taxonomy

Procedia PDF Downloads 119
1116 Review on Quaternion Gradient Operator with Marginal and Vector Approaches for Colour Edge Detection

Authors: Nadia Ben Youssef, Aicha Bouzid

Abstract:

Gradient estimation is one of the most fundamental tasks in image processing in general, and for color images in particular, since research on color image gradients remains limited. The most widely used method is Di Zenzo’s gradient operator, which is based on a measure of the squared local contrast of color images. The gradient mechanism proposed in this paper is based on the principle of Di Zenzo’s approach using a quaternion representation. This edge detector is compared to a marginal approach based on the multiscale product of wavelet transforms and to another vector approach based on quaternion convolution and a vector gradient. The experimental results indicate that the proposed color gradient operator outperforms the marginal approach; however, it is less efficient than the second vector approach.
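Di Zenzo's operator, which the proposed detector builds on, measures squared local contrast via the largest eigenvalue of the multichannel structure tensor; a minimal NumPy sketch of that baseline (the quaternion formulation itself is not reproduced here):

```python
import numpy as np

def di_zenzo_edge_strength(img):
    """Di Zenzo multichannel gradient: build the 2x2 structure tensor from
    per-channel x/y derivatives and take its largest eigenvalue as the
    squared local contrast. img: H x W x C float array."""
    fx = np.gradient(img, axis=1)          # per-channel horizontal derivative
    fy = np.gradient(img, axis=0)          # per-channel vertical derivative
    gxx = (fx * fx).sum(axis=2)
    gyy = (fy * fy).sum(axis=2)
    gxy = (fx * fy).sum(axis=2)
    # largest eigenvalue of [[gxx, gxy], [gxy, gyy]] at each pixel
    tr = gxx + gyy
    disc = np.sqrt((gxx - gyy) ** 2 + 4.0 * gxy ** 2)
    return 0.5 * (tr + disc)
```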

Keywords: gradient, edge detection, color image, quaternion

Procedia PDF Downloads 234
1115 A Unified Model for Longshore Sediment Transport Rate Estimation

Authors: Aleksandra Dudkowska, Gabriela Gic-Grusza

Abstract:

Wind-wave-induced sediment transport is an important multidimensional and multiscale dynamic process affecting coastal seabed changes and coastline evolution. Knowledge of the sediment transport rate is needed to solve many environmental and geotechnical problems. Many types of sediment transport models exist, but none is widely accepted, because the process is not fully understood and sufficient measurement data to verify the proposed hypotheses are lacking. Different model types address longshore sediment transport (LST, discussed in this work) and cross-shore transport, which relate to different time and space scales of the processes, and describe bed-load (discussed in this work), suspended, or total sediment transport. LST models use, among other inputs, (i) the flow velocity near the bottom, which in the case of wave-current interaction in the coastal zone is a separate problem, and (ii) the critical bed shear stress, which strongly depends on the type of sediment and becomes complicated for heterogeneous sediment. Moreover, the LST rate is strongly dependent on local environmental conditions. To organize existing knowledge, a series of sediment transport model intercomparisons was carried out as part of the project “Development of a predictive model of morphodynamic changes in the coastal zone”. Four classical one-grid-point models were studied and intercompared over a wide range of bottom shear stress conditions, corresponding to wind-wave conditions appropriate for the coastal zone in Polish marine areas. The set of models comprises classical theories that assume a simplified influence of turbulence on sediment transport (Du Boys, Meyer-Peter & Müller, Ribberink, Engelund & Hansen). It turned out that the estimated values of longshore instantaneous mass sediment transport are in general agreement with earlier studies and measurements conducted in the area of interest.
However, none of the formulas really stands out from the rest as particularly suitable for the test location over the whole analyzed flow velocity range. Therefore, based on the models discussed, a new unified formula for longshore sediment transport rate estimation is introduced, which constitutes the main original result of this study. The sediment transport rate is calculated from the bed shear stress and the critical bed shear stress. The dependence on environmental conditions is expressed by a single coefficient (a constant or a function), so the model can be quite easily adjusted to local conditions. The importance of each model parameter for specific velocity ranges is discussed. Moreover, it is shown that the near-bottom flow velocity is the main determinant of longshore bed load in storm conditions. Thus, the accuracy of the results depends less on the sediment transport model itself and more on appropriate modeling of the near-bottom velocities.
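Of the classical formulas compared, the Meyer-Peter & Müller relation illustrates the shared structure: dimensionless transport driven by the excess of the Shields stress over its critical value, with local conditions folded into a coefficient. A sketch under that generic form (the study's own unified formula is not given in the abstract):

```python
import numpy as np

def bedload_rate(theta, theta_c=0.047, coeff=8.0, power=1.5):
    """Generic excess-shear bed-load law, Phi = a * (theta - theta_c)^b,
    in dimensionless (Einstein) form. With a = 8, b = 1.5 this is the
    classical Meyer-Peter & Müller formula; the study's unified model
    similarly carries the environmental dependence in a single
    coefficient (constant or function)."""
    excess = np.maximum(theta - theta_c, 0.0)   # no transport below critical stress
    return coeff * excess ** power
```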

Keywords: bedload transport, longshore sediment transport, sediment transport models, coastal zone

Procedia PDF Downloads 387
1114 Flow Field Analysis of a Liquid Ejector Pump Using Embedded Large Eddy Simulation Methodology

Authors: Qasim Zaheer, Jehanzeb Masud

Abstract:

The understanding of the entrainment and mixing phenomena in an ejector pump is of pivotal importance for its design and performance estimation. In this paper, the turbulent vortical structures that arise from the Kelvin-Helmholtz instability at the free surface between the motive and entrained fluid streams are simulated using an embedded LES methodology. The efficacy of embedded LES for simulating the complex flow field of an ejector pump is evaluated using ANSYS Fluent®. The enhanced mixing and entrainment due to the breakdown of larger eddies into smaller ones through vortex stretching is captured in this study. Moreover, flow field characteristics of the ejector pump, such as the pressure and velocity fields and mass flow rates, are analyzed and validated against experimental results.

Keywords: Kelvin Helmholtz instability, embedded LES, complex flow field, ejector pump

Procedia PDF Downloads 297
1113 Modelling High-Frequency Crude Oil Dynamics Using Affine and Non-Affine Jump-Diffusion Models

Authors: Katja Ignatieva, Patrick Wong

Abstract:

We investigate the dynamics of high-frequency energy prices, including crude oil and electricity prices. The returns of the underlying quantities are modelled using parametric models such as the stochastic volatility with correlated jumps (SVCJ) framework, as well as non-parametric alternatives, which are purely data-driven and do not require specification of the drift or the diffusion coefficient function. Using different statistical criteria, we investigate the performance of the considered parametric and non-parametric models in their ability to forecast price series and volatilities. Our models incorporate possible seasonalities in the underlying dynamics and utilise advanced estimation techniques for the dynamics of energy prices.

Keywords: stochastic volatility, affine jump-diffusion models, high frequency data, model specification, Markov chain Monte Carlo

Procedia PDF Downloads 104
1112 Direct Transient Stability Assessment of Stressed Power Systems

Authors: E. Popov, N. Yorino, Y. Zoka, Y. Sasaki, H. Sugihara

Abstract:

This paper discusses the performance of the critical trajectory method (CTrj) for power system transient stability analysis under various loading settings and heavy fault conditions. The method obtains the controlling unstable equilibrium point (CUEP), which is essential for the estimation of power system stability margins. The CUEP is computed by applying the CTrj to the boundary controlling unstable equilibrium point (BCU) method. The proposed method computes a trajectory on the stability boundary that starts from the exit point and reaches the CUEP under certain assumptions. The robustness and effectiveness of the method are demonstrated on six power system models and five loading conditions. The conventional simulation method is used as a benchmark, and the performance is compared with the BCU shadowing method.

Keywords: power system, transient stability, critical trajectory method, energy function method

Procedia PDF Downloads 386
1111 Aliasing Free and Additive Error in Spectra for Alpha Stable Signals

Authors: R. Sabre

Abstract:

This work focuses on the continuous-time symmetric alpha-stable process, frequently used to model signals with indefinitely growing variance that are often observed with an unknown additive error. The objective of this paper is to estimate this error from discrete observations of the signal. To that end, we propose a method based on smoothing the observations via a Jackson polynomial kernel, taking into account the width of the interval where the spectral density is non-zero. This technique avoids the aliasing phenomenon encountered when the estimation is made from discrete observations of a continuous-time process. We have studied the convergence rate of the estimator and have shown that it improves when the spectral density is zero at the origin. Thus, we set up an estimator of the additive error that can be subtracted to approach the original signal without error.

Keywords: spectral density, stable processes, aliasing, non-parametric

Procedia PDF Downloads 130
1110 Integrating Multiple Types of Value in Natural Capital Accounting Systems: Environmental Value Functions

Authors: Pirta Palola, Richard Bailey, Lisa Wedding

Abstract:

Societies and economies worldwide fundamentally depend on natural capital. Alarmingly, natural capital assets are quickly depreciating, posing an existential challenge for humanity. The development of robust natural capital accounting systems is essential for transitioning towards sustainable economic systems and ensuring sound management of capital assets. However, the accurate, equitable and comprehensive estimation of natural capital asset stocks and their accounting values still faces multiple challenges. In particular, the representation of socio-cultural values held by groups or communities has arguably been limited, as to date, the valuation of natural capital assets has primarily been based on monetary valuation methods and assumptions of individual rationality. People relate to and value the natural environment in multiple ways, and no single valuation method can provide a sufficiently comprehensive image of the range of values associated with the environment. Indeed, calls have been made to improve the representation of multiple types of value (instrumental, intrinsic, and relational) and diverse ontological and epistemological perspectives in environmental valuation. This study addresses this need by establishing a novel valuation framework, Environmental Value Functions (EVF), that allows for the integration of multiple types of value in natural capital accounting systems. The EVF framework is based on the estimation and application of value functions, each of which describes the relationship between the value and quantity (or quality) of an ecosystem component of interest. In this framework, values are estimated in terms of change relative to the current level instead of calculating absolute values. Furthermore, EVF was developed to also support non-marginalist conceptualizations of value: it is likely that some environmental values cannot be conceptualized in terms of marginal changes. 
For example, ecological resilience value may, in some cases, be best understood as a binary: it either exists (1) or is lost (0). In such cases, a logistic value function may be used as the discriminator. Uncertainty in the value function parameterization can be considered through, for example, Monte Carlo sampling analysis. The use of EVF is illustrated with two conceptual examples. For the first time, EVF offers a clear framework and concrete methodology for the representation of multiple types of value in natural capital accounting systems, simultaneously enabling 1) the complementary use and integration of multiple valuation methods (monetary and non-monetary); 2) the synthesis of information from diverse knowledge systems; 3) the recognition of value incommensurability; 4) marginalist and non-marginalist value analysis. Furthermore, with this advancement, the coupling of EVF and ecosystem modeling can offer novel insights to the study of spatial-temporal dynamics in natural capital asset values. For example, value time series can be produced, allowing for the prediction and analysis of volatility, long-term trends, and temporal trade-offs. This approach can provide essential information to help guide the transition to a sustainable economy.
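The binary resilience example can be made concrete with a logistic value function acting as the discriminator; the parameter names below are illustrative assumptions, not the paper's parameterisation:

```python
import math

def logistic_value(quantity, threshold, steepness=10.0):
    """Illustrative logistic value function for a near-binary value such as
    ecological resilience: value ~0 below the threshold quantity, ~1 above.
    `threshold` and `steepness` are assumed names; in the EVF framework
    they would be estimated, with uncertainty explored e.g. by Monte Carlo
    sampling over the parameterization."""
    return 1.0 / (1.0 + math.exp(-steepness * (quantity - threshold)))
```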

Keywords: economics of biodiversity, environmental valuation, natural capital, value function

Procedia PDF Downloads 194
1109 Feasibility Assessment of High-Temperature Superconducting AC Cable Lines Implementation in Megacities

Authors: Andrey Kashcheev, Victor Sytnikov, Mikhail Dubinin, Elena Filipeva, Dmitriy Sorokin

Abstract:

Various technical solutions aimed at improving the reliability of the power supply to consumers of a 110 kV substation are considered. For each technical solution, the results of the calculation and analysis of electrical modes and short-circuit currents in the electrical network are presented. Electric energy losses within the boundaries of the substation reconstruction were estimated in accordance with the methodology for determining the standards of technological losses of electricity during its transmission through electric networks. The technical and economic feasibility of using high-temperature superconducting cable lines (HTS CL) was assessed against a complex reconstruction of the 110 kV substation. It is shown that high-temperature superconducting AC cable lines are a possible alternative to the traditional technical solutions used in the reconstruction of substations.

Keywords: superconductivity, cable lines, superconducting cable, AC cable, feasibility

Procedia PDF Downloads 97
1108 Population Size Estimation Based on the GPD

Authors: O. Anan, D. Böhning, A. Maruotti

Abstract:

The purpose of the study is to estimate an elusive target population size under a truncated count model that accounts for heterogeneity. The proposed estimator is based on the generalized Poisson distribution (GPD), which extends the Poisson distribution by adding a dispersion parameter. It is thus a useful model for capture-recapture data where capture events are not homogeneous, and it can account for both over-dispersion and under-dispersion. The ratios of neighboring frequency counts are used as a tool for investigating whether the generalized Poisson or the Poisson distribution is valid. Since capture-recapture approaches do not observe the zero counts, the parameter estimates are obtained by modifying the EM algorithm for the zero-truncated generalized Poisson distribution. The properties and comparative performance of the proposed estimator were investigated through simulation studies. Furthermore, some empirical examples are presented to give insight into the behavior of the estimators.
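The estimation logic can be sketched as follows: the generalized Poisson pmf supplies the probability of a zero count (an unobserved animal), and the population size follows by inflating the observed count. The zero-truncated EM fit described in the abstract, which would supply the parameter estimates, is not shown.

```python
import math

def gpd_pmf(k, lam, theta):
    """Generalized Poisson pmf (Consul-Jain form):
    P(X=k) = lam * (lam + k*theta)^(k-1) * exp(-(lam + k*theta)) / k!.
    theta = 0 recovers the ordinary Poisson distribution."""
    return (lam * (lam + k * theta) ** (k - 1)
            * math.exp(-(lam + k * theta)) / math.factorial(k))

def horvitz_thompson_N(n_observed, lam, theta):
    """Horvitz-Thompson-style population-size estimate under a
    zero-truncated GPD: scale the number of observed units by the
    probability of being captured at least once."""
    p0 = gpd_pmf(0, lam, theta)     # probability of never being captured
    return n_observed / (1.0 - p0)
```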

Keywords: capture, recapture methods, ratio plot, heterogeneous population, zero-truncated count

Procedia PDF Downloads 435
1107 A Comparison of Methods for Estimating Dichotomous Treatment Effects: A Simulation Study

Authors: Jacqueline Y. Thompson, Sam Watson, Lee Middleton, Karla Hemming

Abstract:

Introduction: The odds ratio (estimated via logistic regression) is a well-established and common approach for estimating covariate-adjusted binary treatment effects when comparing a treatment and a control group with dichotomous outcomes. Its popularity is primarily due to its stability and robustness to model misspecification. The situation is different, however, for the relative risk and the risk difference, which are arguably easier to interpret and better suited to specific designs such as non-inferiority studies. So far, there is no equivalent, widely accepted approach for estimating an adjusted relative risk or risk difference in clinical trials, partly because the available candidate methods have not been comprehensively evaluated. Methods/Approach: A simulation study is designed to evaluate the performance of relevant candidate methods for estimating relative risks, representing both conditional and marginal estimation approaches. We consider the log-binomial generalised linear model (GLM) with iteratively weighted least-squares (IWLS) and model-based standard errors (SEs); the log-binomial GLM with convex optimisation and model-based SEs; the log-binomial GLM with convex optimisation and permutation tests; the modified-Poisson GLM with IWLS and robust SEs; log-binomial generalised estimating equations (GEE) with robust SEs; marginal standardisation with delta-method SEs; and marginal standardisation with permutation-test SEs. Independent and identically distributed datasets are simulated from a randomised controlled trial to evaluate these candidate methods. Simulations are replicated 10,000 times for each scenario across all combinations of sample sizes (200, 1,000, and 5,000), outcome rates (10%, 50%, and 80%), and covariate effects (ranging from -0.05 to 0.7) representing weak, moderate, or strong relationships.
Treatment effects (0, -0.5, and 1 on the log scale) cover the null (H0) and alternative (H1) hypotheses to evaluate coverage and power in realistic scenarios. Performance measures (bias, mean square error (MSE), relative efficiency, and convergence rates) are evaluated across scenarios covering a range of sample sizes, event rates, covariate prognostic strengths, and model misspecifications. Potential Results, Relevance & Impact: There are several methods for estimating unadjusted and adjusted relative risks. However, it is unclear which method is the most efficient, preserves the type-I error rate, is robust to model misspecification, or is the most powerful when adjusting for non-prognostic and prognostic covariates. GEE estimates may be biased when the outcome distributions are not from marginal binary data. Also, it appears that marginal standardisation and convex optimisation may perform better than the IWLS log-binomial GLM.
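Among the candidates, the modified-Poisson approach is the easiest to sketch: fit a log-link Poisson model to the binary outcome and exponentiate the treatment coefficient. The robust sandwich standard errors that the method requires for valid inference are omitted from this minimal sketch.

```python
import numpy as np

def modified_poisson_rr(y, treat):
    """Modified-Poisson point estimate of the relative risk for a binary
    outcome y (0/1) and treatment indicator treat (0/1): fit a Poisson GLM
    with log link by IWLS and return exp(beta_treatment)."""
    X = np.column_stack([np.ones_like(treat, dtype=float), treat.astype(float)])
    beta = np.zeros(2)
    for _ in range(50):                       # IWLS / Fisher scoring
        mu = np.exp(X @ beta)                 # fitted means (risks)
        z = X @ beta + (y - mu) / mu          # working response
        W = mu                                # Poisson working weights
        beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))
    return np.exp(beta[1])                    # relative risk
```

With only an intercept and a treatment term, the estimate reduces to the ratio of the two group risks, which makes the sketch easy to check.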

Keywords: binary outcomes, statistical methods, clinical trials, simulation study

Procedia PDF Downloads 115
1106 Capacity Estimation of Hybrid Automated Repeat Request Protocol for Low Earth Orbit Mega-Constellations

Authors: Arif Armagan Gozutok, Alper Kule, Burak Tos, Selman Demirel

Abstract:

Wireless communication chains require effective ways to keep throughput efficiency high while suffering location-dependent, time-varying burst errors. Several techniques have been developed to ensure that the receiver recovers the transmitted information without errors. The most fundamental approaches are error checking and correction, along with re-transmission of non-acknowledged packets. In this paper, stop-and-wait (SAW) and chase-combining (CC) hybrid automated repeat request (HARQ) protocols are compared and analyzed in terms of throughput and average delay for the low earth orbit (LEO) mega-constellation use case. Several assumptions and technological implementations are considered, as well as the use of low-density parity check (LDPC) codes together with several constellation orbit configurations.
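The LEO setting makes round-trip time the dominant cost for stop-and-wait: the link sits idle for one round trip per frame. A toy efficiency model (not the paper's HARQ analysis) shows why, and why chase combining helps by lowering the effective frame error rate on retransmissions:

```python
def saw_throughput(frame_time, rtt, frame_error_rate):
    """Toy stop-and-wait ARQ throughput efficiency: each frame occupies the
    link for frame_time plus one round trip, and succeeds with probability
    (1 - FER). Real HARQ-CC analysis would track combined retransmissions."""
    return (1.0 - frame_error_rate) * frame_time / (frame_time + rtt)
```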

Keywords: HARQ, LEO, satellite constellation, throughput

Procedia PDF Downloads 145
1105 Problems Occurring in the Process of Audit by Taking into Consideration their Theoretic Aspects against the Background of Reforms Conducted in a Country: The Example of Georgia

Authors: Levan Sabauri

Abstract:

The purpose of this article is to examine the theoretical aspects of auditing in the context of solving specific audit problems. The aim of an audit is the auditor's assessment of whether financial statements are prepared in accordance with the basic requirements of the current financial reporting framework. Concrete examples clearly show the problems that arise in an audit, which often include contradictions caused by discrepancies between matters regulated by legislation and reality. An important part of this work is the analysis of the reform of business accounting, reporting, and audit in Georgia and its comparison with EU countries. The article concentrates on the analysis of specific problems of auditing practice and proposes ways of solving them that take the theoretical aspects of the audit into consideration.

Keywords: audit, auditor, auditors’ ethic code, auditor’s risk, financial statement, objectivity

Procedia PDF Downloads 358
1104 In vivo Estimation of Mutation Rate of the Aleutian Mink Disease Virus

Authors: P.P. Rupasinghe, A.H. Farid

Abstract:

The Aleutian mink disease virus (AMDV, Carnivore amdoparvovirus 1) causes persistent infection, plasmacytosis, and the formation and deposition of immune complexes in various organs in adult mink, leading to glomerulonephritis, arteritis, and sometimes death. The disease has no cure and no effective vaccine, and the identification and culling of mink positive for anti-AMDV antibodies have not succeeded in controlling the infection in many countries. The failure to eradicate the virus from infected farms may be caused by keeping false-negative individuals on the farm or by virus transmission from wild animals or neighboring farms. The identification of sources of infection, which can be performed by comparing viral sequences, is important to the success of viral eradication programs, but high mutation rates could cause inaccuracies when viral sequences are used to trace an infection back to its origin. There is no published information on the mutation rate of AMDV, either in vivo or in vitro. In vivo estimation is the most accurate method, but it is difficult to perform because of inherent technical complexities, namely infecting live animals, the unknown number of viral generations (i.e., infection cycles), the removal of deleterious mutations over time, and genetic drift. The objective of this study was therefore to determine the mutation rate of AMDV. A homogenate was prepared from the spleen of one naturally infected American mink (Neovison vison) from Nova Scotia, Canada (parental template), and the near full-length genome of this isolate (91.6%, 4,143 bp) was bidirectionally sequenced. A group of black mink was inoculated with this homogenate (descendant mink). Spleen samples were collected from 10 descendant mink at 16 weeks post-inoculation (wpi) and from another 10 mink at 176 wpi, and their near full-length genomes were bidirectionally sequenced.
Sequences of these mink were compared with each other and with the sequence of the parental template. The number of nucleotide substitutions at 176 wpi was 3.1 times greater than at 16 wpi (113 vs 36), whereas the estimated mutation rate at 176 wpi was 3.1 times lower than at 16 wpi (9.13×10⁻⁴ vs 2.85×10⁻³ substitutions/site/year), showing a decreasing trend in the mutation rate per unit of time. Although there is no report of an in vivo estimate of the mutation rate of an animal DNA virus using the same method as the current study, these estimates are at the higher end of the values reported for DNA viruses determined by various techniques. Such high estimates are plausible given the wide range of diversity and pathogenicity of AMDV isolates. The results suggest that the increase in the number of nucleotide substitutions over time, and the subsequent divergence, make it difficult to accurately trace AMDV isolates back to their origin when several years elapse between two samplings.
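The reported rates follow the standard normalisation of substitutions per site per year; a one-line helper makes the unit explicit (the per-animal averaging used in the study is not detailed in the abstract):

```python
def mutation_rate(n_substitutions, n_sites, weeks_elapsed):
    """Substitution rate in substitutions per site per year, given the
    observed substitution count, the number of sequenced sites, and the
    elapsed time in weeks."""
    return n_substitutions / (n_sites * (weeks_elapsed / 52.0))
```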

Keywords: Aleutian mink disease virus, American mink, mutation rate, nucleotide substitution

Procedia PDF Downloads 125
1103 Predicting Trapezoidal Weir Discharge Coefficient Using Evolutionary Algorithm

Authors: K. Roushanger, A. Soleymanzadeh

Abstract:

Weirs are structures often used in irrigation, sewer networks, and flood protection. However, the hydraulic behavior of this type of weir is complex and difficult to predict accurately, and an accurate flow prediction over a weir mainly depends on the proper estimation of the discharge coefficient. In this study, the Genetic Expression Programming (GEP) approach was used to predict the discharge coefficients of trapezoidal and rectangular sharp-crested side weirs. Three different performance indexes were used as criteria for evaluating the models’ performance. The obtained results confirmed the capability of GEP in predicting the discharge coefficients of trapezoidal and rectangular side weirs. The results also revealed the influence of the downstream Froude number for the trapezoidal weir, and of the upstream Froude number for the rectangular weir, on the prediction of the discharge coefficient for both side weirs.

Keywords: discharge coefficient, genetic expression programming, trapezoidal weir

Procedia PDF Downloads 387
1102 Estimating Estimators: An Empirical Comparison of Non-Invasive Analysis Methods

Authors: Yan Torres, Fernanda Simoes, Francisco Petrucci-Fonseca, Freddie-Jeanne Richard

Abstract:

Non-invasive samples are an alternative to collecting genetic samples directly: they are obtained without handling the animal (e.g., scats, feathers, and hairs). Nevertheless, the use of non-invasive samples has some limitations. The main issue is degraded DNA, which leads to poorer extraction efficiency and genotyping. These errors delayed the widespread use of non-invasive genetic information for some years. Genotyping errors can be limited by using analysis methods that accommodate the errors and singularities of non-invasive samples. Genotype matching and population estimation algorithms stand out as important analysis tools that have been adapted to deal with those errors. Despite this recent development of analysis methods, an empirical comparison of their performance is still lacking. Comparing the methods on datasets that differ in size and structure can be useful for future studies, since non-invasive samples are a powerful tool for obtaining information, especially on endangered and rare populations. To compare the analysis methods, four datasets obtained from the Dryad digital repository were used. Three matching algorithms (Cervus, Colony, and Error Tolerant Likelihood Matching - ETLM) were used for matching genotypes and two algorithms for population estimation (Capwire and BayesN). The three matching algorithms showed different patterns of results. ETLM produced fewer unique individuals and recaptures. A similarity in the matched genotypes between Colony and Cervus was observed, which is unsurprising given the similarity of their pairwise-likelihood and clustering algorithms. The genotypes matched by ETLM showed almost no similarity with those matched by the other methods. The different clustering system and error model of ETLM seem to lead to a more stringent selection, although ETLM had the worst processing time and user interface of the compared methods. The population estimators performed differently across the datasets; the different estimators agreed on only one dataset. BayesN produced both higher and lower estimates than Capwire. Unlike Capwire, BayesN does not consider the total number of recaptures, only the recapture events, which makes the estimator sensitive to data heterogeneity, that is, different capture rates between individuals. In these examples, homogeneity of capture rates seems to be crucial for BayesN to work properly. Both methods are user-friendly and have reasonable processing times. An expanded analysis with simulated genotype data could clarify the sensitivity of the algorithms. The present comparison of matching methods indicates that Colony seems more appropriate for general use, considering a balance of processing time, interface, and robustness. The heterogeneity of recaptures strongly affected the BayesN estimates, leading to over- and underestimation of population numbers. Capwire is therefore advisable for general use, since it performs better in a wide range of situations.
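The dependence of population estimates on recapture structure can be illustrated with a minimal sketch. This assumes a simplified homogeneous capture-with-replacement model (every individual equally catchable); it reproduces neither Capwire's two-ratio MLE nor BayesN's Bayesian machinery, and the function names are illustrative only:

```python
import math
from collections import Counter

def loglik(N, counts, T):
    """Log-likelihood of population size N under a homogeneous
    capture-with-replacement model: T captures fall uniformly on
    N equally catchable individuals, k of which were observed."""
    k = len(counts)
    if N < k:
        return float("-inf")
    # log of N!/(N-k)!: ways to assign the k observed individuals
    ll = sum(math.log(N - i) for i in range(k))
    ll -= T * math.log(N)   # each of the T captures picks one of N animals
    return ll

def estimate_N(capture_ids, N_max=10_000):
    """Maximum-likelihood population size from a list of capture IDs."""
    counts = list(Counter(capture_ids).values())
    T, k = sum(counts), len(counts)
    return max(range(k, N_max + 1), key=lambda N: loglik(N, counts, T))
```

Fewer recaptures push the estimate up; under heterogeneous capture rates the homogeneity assumption breaks, which is the sensitivity the abstract attributes to BayesN.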

Keywords: algorithms, genetics, matching, population

Procedia PDF Downloads 143
1101 Equity Risk Premiums and Risk Free Rates in Modelling and Prediction of Financial Markets

Authors: Mohammad Ghavami, Reza S. Dilmaghani

Abstract:

This paper presents an adaptive framework for modelling financial markets using equity risk premiums, risk-free rates, and volatilities. The recorded economic factors are initially used to train four adaptive filters over a limited period in the past. Once the systems are trained, the adjusted coefficients are used for modelling and prediction of an important financial market index. Two different approaches, based on the least mean squares (LMS) and recursive least squares (RLS) algorithms, are investigated. The performance of each method in terms of the mean squared error (MSE) is analysed and the results are discussed. Computer simulations carried out using recorded data show MSEs of 4% and 3.4% for next-month prediction using the LMS and RLS adaptive algorithms, respectively. For twelve-month prediction, the RLS method shows better trend estimation than the LMS algorithm.
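A minimal sketch of the LMS side of such a one-step-ahead predictor (the RLS variant replaces the gradient step with a recursive least-squares update); the filter order, step size, and function name are illustrative assumptions, not the paper's configuration:

```python
import numpy as np

def lms_predict(x, order=4, mu=0.01):
    """One-step-ahead prediction of series x with an LMS adaptive filter.
    The weight vector w is nudged along the negative gradient of the
    instantaneous squared prediction error."""
    w = np.zeros(order)
    preds = np.zeros(len(x))
    for n in range(order, len(x)):
        u = x[n - order:n][::-1]   # most recent samples first
        preds[n] = w @ u           # predict the next value
        e = x[n] - preds[n]        # prediction error
        w += 2 * mu * e * u        # LMS weight update
    return preds
```

On a predictable signal the error shrinks as the coefficients converge, which is the trained-coefficient behaviour the abstract exploits.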

Keywords: adaptive methods, LSE, MSE, prediction of financial markets

Procedia PDF Downloads 336
1100 Enhancement of MIMO H₂S Gas Sweetening Separator Tower Using Fuzzy Logic Controller Array

Authors: Muhammad M. A. S. Mahmoud

Abstract:

Natural gas sweetening is a controlled process that must be run at maximum efficiency and with the highest quality. In this work, owing to the complexity and non-linearity of the process, the H₂S gas separation and the intelligent fuzzy controller used to enhance it are simulated in MATLAB/Simulink. A new fuzzy control design for the gas separator is discussed in this paper. The design is based on linear state estimation to generate the internal knowledge base that stores input-output pairs. The obtained input/output pairs are then used to design a feedback fuzzy controller. The proposed closed-loop fuzzy control system maintains the asymptotic stability of the system while improving its time response, achieving better control of the concentration of the output gas from the tower. Simulation studies are carried out to illustrate the performance of the gas separator system.
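The rule-evaluation core of such a fuzzy controller can be sketched as follows, assuming (hypothetically) three Mamdani-style rules with triangular memberships and centroid defuzzification; the membership ranges, rule set, and function names are illustrative, not the paper's knowledge base:

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_control(error):
    """Map a concentration error to a valve correction via three
    rules (negative / zero / positive error) and centroid defuzzification."""
    mu = {
        "neg": tri(error, -2.0, -1.0, 0.0),
        "zero": tri(error, -1.0, 0.0, 1.0),
        "pos": tri(error, 0.0, 1.0, 2.0),
    }
    out = {"neg": -1.0, "zero": 0.0, "pos": 1.0}  # consequent singletons
    num = sum(mu[k] * out[k] for k in mu)
    den = sum(mu.values()) or 1.0
    return num / den   # weighted average of fired rule outputs
```

Overlapping memberships make the control surface vary smoothly between rules, which is what gives fuzzy feedback its gentle time response.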

Keywords: gas separator, gas sweetening, intelligent controller, fuzzy control

Procedia PDF Downloads 471
1099 Particle Filter Implementation of a Non-Linear Dynamic Fall Model

Authors: T. Kobayashi, K. Shiba, T. Kaburagi, Y. Kurihara

Abstract:

For the elderly living alone, falls can be a serious problem in daily life. Some elderly people are unable to stand up without the assistance of a caregiver, and they may become unconscious after a fall, which can lead to serious after-effects such as hypothermia, dehydration, and sometimes even death. We treat the subject as an inverted pendulum and model its angle from the equilibrium position and its angular velocity. Because the model is non-linear, we implement the filtering with a particle filter, which can estimate the true states of a non-linear model. To evaluate the accuracy of the particle filter estimates, we calculate the root mean square error (RMSE) between the estimated angle/angular velocity and the true values generated by the simulation. The experiments give best-case RMSEs of 0.0141 rad for the angle and 0.1311 rad/s for the angular velocity.
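A bootstrap particle filter for such an inverted-pendulum state model can be sketched as below; the Euler dynamics, noise levels, particle count, and function names are illustrative assumptions, not the paper's actual configuration:

```python
import numpy as np

rng = np.random.default_rng(0)

def pendulum_step(theta, omega, dt=0.01, g=9.81, L=1.0):
    """One Euler step of the inverted pendulum: theta'' = (g/L) sin(theta)."""
    return theta + omega * dt, omega + (g / L) * np.sin(theta) * dt

def particle_filter(obs, n=1000, dt=0.01, q=0.05, r=0.1):
    """Bootstrap particle filter estimating angle and angular velocity
    from noisy angle observations (assumed N(0, r^2) measurement noise)."""
    theta = rng.normal(0.1, 0.1, n)   # initial particle cloud
    omega = rng.normal(0.0, 0.1, n)
    est = []
    for z in obs:
        theta, omega = pendulum_step(theta, omega, dt)
        theta = theta + rng.normal(0, q, n)   # process noise
        omega = omega + rng.normal(0, q, n)
        w = np.exp(-0.5 * ((z - theta) / r) ** 2) + 1e-300  # likelihood weights
        w /= w.sum()
        idx = rng.choice(n, size=n, p=w)      # multinomial resampling
        theta, omega = theta[idx], omega[idx]
        est.append((theta.mean(), omega.mean()))
    return np.array(est)
```

Because the likelihood weighting and resampling make no linearity assumption, the filter tracks the sin(θ) dynamics that would defeat a plain Kalman filter.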

Keywords: fall, microwave Doppler sensor, non-linear dynamics model, particle filter

Procedia PDF Downloads 213
1098 Machine Learning in Agriculture: A Brief Review

Authors: Aishi Kundu, Elhan Raza

Abstract:

"Necessity is the mother of invention" - the rapid increase in the global human population has directed the agricultural domain toward machine learning. Food, the most basic human need, is satisfied through farming, and farming is one of the major revenue generators for the Indian economy. Agriculture is thus both a source of employment and a pillar of the economy in developing countries like India. This paper provides a brief review of the progress made in implementing machine learning in the agricultural sector. Accurate predictions at the right time are necessary to boost production and to aid the timely, systematic distribution of agricultural commodities, making them available in the market faster and more effectively. The paper includes a thorough analysis of machine learning algorithms applied in different aspects of agriculture (crop management, soil management, water management, yield tracking, livestock management, etc.). Crop production is also affected by climate change; machine learning can analyse the changing patterns and suggest approaches that minimize loss and maximize yield. Machine learning models (regression, support vector machines, Bayesian models, artificial neural networks, decision trees, etc.) are used in smart agriculture to analyse sensor data and predict specific outcomes, which can be vital in increasing the productivity of the agricultural food industry. Machine learning is an ongoing technology helping farmers improve gains in agriculture and minimize losses, and this paper discusses how irrigation and farm management systems are evolving to operate efficiently in real time. Artificial intelligence (AI)-enabled programs are emerging to support farmers with extensive analysis of their data.
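As a minimal illustration of the regression branch of such models, the sketch below fits an ordinary least squares yield model; the rainfall/temperature/yield records are entirely hypothetical numbers invented for illustration, not data from the review:

```python
import numpy as np

# Hypothetical records: rainfall (mm), mean temperature (°C), yield (t/ha)
data = np.array([
    [620, 24, 3.1],
    [710, 23, 3.6],
    [540, 26, 2.7],
    [800, 22, 4.0],
    [660, 25, 3.2],
])
X = np.c_[np.ones(len(data)), data[:, :2]]    # intercept column + two features
y = data[:, 2]
coef, *_ = np.linalg.lstsq(X, y, rcond=None)  # ordinary least squares fit
pred = X @ coef                               # fitted yields
```

The fitted `coef` gives an intercept plus marginal effects of rainfall and temperature on yield; the same pipeline extends to the richer model families (SVMs, trees, neural networks) the review surveys.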

Keywords: machine learning, artificial intelligence, crop management, precision farming, smart farming, pre-harvesting, harvesting, post-harvesting

Procedia PDF Downloads 105
1097 The Labor Participation–Fertility Trade-off: The Case of the Philippines

Authors: Daphne Ashley Sze, Kenneth Santos, Ariane Gabrielle Lim

Abstract:

As women are now given more freedom and choice to pursue employment, the world's overall fertility has been decreasing, mainly due to the shift in time allocation between working and child-rearing. We therefore study the case of the Philippines, where a decreasing fertility rate coexists with increasing openness toward women's labor participation. We focus on the distinction between fertility and fecundity, the former being the manifestation of the latter, and aim to trace and compare the effects of both on women's employment status through the estimation of a reproduction function and a multinomial logistic function. The findings suggest that women's perception of employment opportunities in the Philippines underlies the negative relationship observed between fertility, fecundity, and women's employment status. Today, the traditional family roles of men and women have been converging, as both genders now have identical employment opportunities that continue to shape their preferences.
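Fitting a multinomial logistic function of the kind the abstract estimates can be sketched in a few lines; the gradient-descent settings, toy data, and function names below are illustrative assumptions, not the study's specification or dataset:

```python
import numpy as np

def softmax(z):
    """Row-wise softmax, shifted for numerical stability."""
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def fit_mnlogit(X, y, n_classes, lr=0.1, iters=2000):
    """Multinomial logistic regression by gradient descent on the
    negative log-likelihood; rows of X should include an intercept."""
    W = np.zeros((X.shape[1], n_classes))
    Y = np.eye(n_classes)[y]                  # one-hot targets
    for _ in range(iters):
        P = softmax(X @ W)                    # predicted class probabilities
        W -= lr * X.T @ (P - Y) / len(X)      # gradient step
    return W
```

With employment status coded as the categorical outcome and fecundity/fertility measures in `X`, the fitted coefficients play the role of the marginal effects the study interprets.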

Keywords: multinomial logistic function, tobit, fertility, women employment status, fecundity

Procedia PDF Downloads 606
1096 Causality Channels between Corruption and Democracy: A Threshold Non-Linear Analysis

Authors: Khalid Sekkat, Fredj Fhima, Ridha Nouira

Abstract:

This paper addresses three main limitations of the literature on the impact of corruption on democracy. These limitations relate to the distinction between causality and correlation, the components of democracy underlying the impact, and the shape of the relationship between corruption and democracy. The study uses recent developments in panel-data causality econometrics, breaks democracy down into different components, and examines the type of the relationship. The results show that Control of Corruption leads to a higher quality of democracy. The estimated coefficients of the components of democracy are significant at the 1% level, and their signs and magnitudes are in accordance with expectations except in a few cases. Overall, the results add to the literature in three respects: i) corruption has a causal effect on democracy and, hence, single-equation estimation may pose a problem; ii) the assumption of linearity in the relationship between control of corruption and democracy is also potentially problematic; and iii) the channels through which corruption affects democracy can be diverse, and disentangling them is useful from a policy perspective.

Keywords: corruption, governance, causality, threshold models

Procedia PDF Downloads 48
1095 Entropy-Based Multichannel Stationary Measure for Characterization of Non-Stationary Patterns

Authors: J. D. Martínez-Vargas, C. Castro-Hoyos, G. Castellanos-Dominguez

Abstract:

In this work, we propose a novel approach for measuring the stationarity level of a multichannel time series. The measure is based on a definition of stationarity over the time-varying spectrum, and it aims to quantify the relation between local stationarity (single channel) and global dynamic behaviour (multichannel dynamics). To assess the validity of the proposed approach, we use a well-known EEG-BCI database constructed to discriminate between motor-imagery tasks. Based on the premise that imagining movements increases the EEG dynamics, we use as discriminant features the proposed measure computed over an estimate of the non-stationary components of the input time series. As a measure of separability we use Student's t-test, and the results show that the measure accurately detects the brain areas, projected on the scalp, where the motor tasks are performed.
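A single-channel building block for such an entropy-based stationarity measure can be sketched as follows; this is a generic windowed spectral-entropy variance, assumed here for illustration, not the authors' exact time-varying-spectrum estimator:

```python
import numpy as np

def spectral_entropy(x):
    """Shannon entropy of the normalized power spectrum of x."""
    psd = np.abs(np.fft.rfft(x)) ** 2
    p = psd / psd.sum()
    p = p[p > 0]
    return -(p * np.log(p)).sum()

def stationarity_measure(x, win=128):
    """Variance of the spectral entropy across non-overlapping windows:
    a low value suggests a stationary spectrum, a high value a
    time-varying (non-stationary) one."""
    ents = [spectral_entropy(x[i:i + win])
            for i in range(0, len(x) - win + 1, win)]
    return np.var(ents)
```

Extending the idea to the multichannel case amounts to relating these per-channel profiles to the joint dynamics, which is the relation the proposed measure quantifies.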

Keywords: stationary measure, entropy, sub-space projection, multichannel dynamics

Procedia PDF Downloads 412
1094 Estimation of Natural Convection Heat Transfer from Plate-Fin Heat Sinks in a Closed Enclosure

Authors: Han-Taw Chen, Chung-Hou Lai, Tzu-Hsiang Lin, Ge-Jang He

Abstract:

This study applies the inverse method and three-dimensional commercial CFD software, in conjunction with experimental temperature data, to investigate the heat transfer and fluid flow characteristics of a plate-fin heat sink in a closed rectangular enclosure for various fin heights. The inverse method, combined with the finite difference method and the experimental temperature data, is applied to determine the heat transfer coefficient. The k-ε turbulence model is used to obtain the heat transfer and fluid flow characteristics within the fins. To validate the accuracy of the results, the average heat transfer coefficients are compared, and the calculated temperatures at selected measurement locations on the plate-fin are also compared with the experimental data.

Keywords: inverse method, FLUENT, k-ε model, heat transfer characteristics, plate-fin heat sink

Procedia PDF Downloads 460
1093 The Labor Participation-Fertility Trade-Off: Exploring Fecundity and Its Consequences to Women's Employment in the Philippines

Authors: Ariane C. Lim, Daphne Ashley L. Sze, Kenneth S. Santos

Abstract:

As women are now given more freedom and choice to pursue employment, the world's overall fertility has been decreasing, mainly due to the shift in time allocation between working and child-rearing. We therefore study the case of the Philippines, where a decreasing fertility rate coexists with increasing openness toward women's labor participation. We focus on the distinction between fertility and fecundity, the former being the manifestation of the latter, and aim to trace and compare the effects of both on women's employment status through the estimation of a reproduction function and a multinomial logistic function. The findings suggest that women's perception of employment opportunities in the Philippines underlies the negative relationship observed between fertility, fecundity, and women's employment status. Today, the traditional family roles of men and women have been converging, as both genders now have identical employment opportunities that continue to shape their preferences.

Keywords: multinomial logistic function, tobit, fertility, women employment status, fecundity

Procedia PDF Downloads 629
1092 Seismic Bearing Capacity Estimation of Shallow Foundations on Dense Sand Underlain by Loose Sand Strata by Using Finite Elements Limit Analysis

Authors: Pragyan Paramita Das, Vishwas N. Khatri

Abstract:

Using lower- and upper-bound finite-element limit analysis in conjunction with second-order cone programming (SOCP), the effect of seismic forces on the bearing capacity of a surface strip footing resting on dense sand underlain by a loose sand deposit is explored. The soil is assumed to obey the Mohr-Coulomb yield criterion and an associated flow rule. The angle of internal friction (ϕ) of the top and bottom layers is varied from 42° to 44° and from 32° to 34°, respectively. The seismic acceleration coefficient is varied from 0 to 0.3. The variation of bearing capacity with the thickness of the top layer is generated for various seismic acceleration coefficients, and a comparison is made with available solutions from the literature wherever applicable.

Keywords: bearing capacity, conic programming, finite elements, seismic forces

Procedia PDF Downloads 170