Search results for: Conditional Diagnosability
Paper Count: 65


35 Residual Life Prediction for a System Subject to Condition Monitoring and Two Failure Modes

Authors: Akram Khaleghei Ghosheh Balagh, Viliam Makis

Abstract:

In this paper, we investigate the residual life prediction problem for a partially observable system subject to two failure modes, namely a catastrophic failure and a failure due to system degradation. The system is subject to condition monitoring, and the degradation process is described by a hidden Markov model with unknown parameters. A parameter estimation procedure based on the EM algorithm is developed, and formulas for the conditional reliability function and the mean residual life are derived and illustrated by a numerical example.
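
The conditional reliability and mean residual life quantities named in the abstract can be illustrated with a much simpler, fully observable model. The sketch below assumes a discrete-time Markov chain with hypothetical transition probabilities and an absorbing failure state; it is not the paper's hidden Markov model or EM procedure.

```python
import numpy as np

# Minimal sketch (not the paper's partially observable model): a discrete-time
# Markov chain with two working states {0, 1} and an absorbing failure state 2.
P = np.array([[0.90, 0.08, 0.02],    # hypothetical transition probabilities
              [0.00, 0.85, 0.15],
              [0.00, 0.00, 1.00]])

def conditional_reliability(P, state, horizon):
    """R(t) = P(no failure up to t | current state), for t = 1..horizon."""
    p = np.zeros(P.shape[0]); p[state] = 1.0
    rel = []
    for _ in range(horizon):
        p = p @ P
        rel.append(1.0 - p[-1])        # probability of not being absorbed yet
    return np.array(rel)

def mean_residual_life(P, state, horizon=10_000):
    """E[T] = sum_{t>=0} P(T > t); R(0) = 1 when starting in a working state."""
    return 1.0 + conditional_reliability(P, state, horizon).sum()

print(conditional_reliability(P, state=0, horizon=5))
print(mean_residual_life(P, state=0))
```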

Keywords: Partially observable system, hidden Markov model, competing risks, residual life prediction.

34 The Effect of Oil Price Uncertainty on Food Price in South Africa

Authors: Goodness C. Aye

Abstract:

This paper examines the effect of the volatility of oil prices on food price in South Africa using monthly data covering the period 2002:01 to 2014:09. Food price is measured by the South African consumer price index for food, while oil price is proxied by the Brent crude oil price. The study employs the GARCH-in-mean VAR model, which allows the investigation of the effect of negative and positive shocks in oil price volatility on food price. The model also allows oil price uncertainty to be measured as the conditional standard deviation of a one-step-ahead forecast error of the change in oil price. The results show that oil price uncertainty has a positive and significant effect on food price in South Africa. The responses of food price to positive and negative oil price shocks are asymmetric.

Keywords: Oil price volatility, Food price, Bivariate GARCH-in-mean VAR, Asymmetric.

33 Developing of Fragility Curve for Two-Span Simply Supported Concrete Bridge in Near-Fault Area

Authors: S. Shirazian, M.R. Ghayamghamian, G.R. Nouri

Abstract:

Bridges are one of the main components of transportation networks. They should remain functional before and after an earthquake for emergency services; therefore, their seismic performance needs to be assessed under different seismic loadings. The fragility curve is one of the popular tools in seismic evaluations. Fragility curves are conditional probability statements, which give the probability of a bridge reaching or exceeding a particular damage level for a given intensity level. In this study, the seismic performance of a two-span simply supported concrete bridge is assessed. Due to the usual lack of empirical data, the analytical fragility curve was developed from the results of dynamic analyses of the bridge subjected to different time histories in a near-fault area.
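
As a worked illustration of such a conditional probability statement, the sketch below evaluates a lognormal fragility curve, a common parametric form; the median capacity and dispersion values are assumed for illustration and are not results from this paper.

```python
import numpy as np
from scipy.stats import norm

# A fragility curve is commonly modelled as a lognormal CDF:
#   P(damage >= DS | IM = x) = Phi((ln x - ln theta) / beta)
# theta (median capacity) and beta (dispersion) below are hypothetical values;
# in practice they would be fitted to the dynamic analysis results.
theta, beta = 0.45, 0.6          # median PGA capacity [g] and log-dispersion (assumed)

def fragility(im, theta, beta):
    return norm.cdf((np.log(im) - np.log(theta)) / beta)

for pga in (0.1, 0.3, 0.5, 0.8):
    print(f"P(damage | PGA = {pga:.1f} g) = {fragility(pga, theta, beta):.3f}")
```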

Keywords: Fragility curve, Seismic behavior, Time history analysis, Transportation Network.

32 Interactive Fuzzy Multi-objective Programming in Land Re-organisational Planning for Sustainable Rural Development

Authors: Bijaya Krushna Mangaraj, Deepak Kumar Das

Abstract:

Sustainability in a rural production system can only be achieved if it can suitably satisfy local requirements as well as outside demand as times change. With increased pressure from the food sector in a globalised world, the agrarian economy needs to re-organise its cultivable land system to be compatible with new management practices as well as the multiple needs of various stakeholders and the changing resource scenario. An attempt has been made to transform this problem into a multi-objective decision-making problem considering various objectives, resource constraints and conditional constraints. An interactive fuzzy multi-objective programming approach has been used for this purpose, taking a case study in the Indian context to demonstrate the validity of the method.

Keywords: Land re-organisation, Crop planning, Multi-objective Decision-Making, Fuzzy Goal Programming.

31 Modeling of Alpha-Particles’ Epigenetic Effects in Short-Term Test on Drosophila melanogaster

Authors: Z. M. Biyasheva, M. Zh. Tleubergenova, Y. A. Zaripova, A. L. Shakirov, V. V. Dyachkov

Abstract:

In recent years, interest in ecogenetic and biomedical problems related to the effects of radon and its daughter decay products on the population has increased significantly. Of particular interest is the assessment of the consequences of irradiation in hazardous radon areas, which include the Almaty region due to the large number of tectonic faults that enhance radon emanation. In connection with the foregoing, the purpose of this work was to study the genetic effects of exposure to supernormal radon doses using an alpha-radiation model. Irradiation does not affect the growth of the cell, but rather its ability to differentiate. In addition, irradiation can lead to somatic mutations, morphoses and modifications. This damage most likely arises from changes in the composition of the substances of the cell. Such changes are epigenetic since they affect the regulatory processes of ontogenesis. Variability in the expression of regulatory genes refers to conditional mutations that modify the formation of signs of intraspecific similarity. Characteristic features of these conditional mutations are the dominant type of their manifestation, phenotypic asymmetry and their instability across generations. Currently, the terms “morphosis” and “modification” are used to describe epigenetic variability, which is maintained in Drosophila melanogaster cultures using linked X-chromosomes, and the mutant X-chromosome is transmitted along the paternal line. In this paper, we investigated the epigenetic effects of alpha particles, whose source in nature is mainly radon and its daughter decay products. In the experiment, an isotope of plutonium-238 (Pu-238), generating radiation with an energy of about 5.5 MeV, was used as a source of alpha particles. In the first generation (F1), deformities or morphoses were found, which can be called “radiation syndromes” or mutations whose manifestation is similar to the pleiotropic action of genes. The proportion of morphoses in the experiment was 1.8%, and in the control 0.4%. In this experiment, the morphoses in the flies of the first and second generations looked like black spots or melanomas on different parts of the imago body; “generalized” melanomas; curled, curved wings; a shortened wing; a bubble on one wing; absence of one wing; deformation of the thorax; interruption and violation of tergite patterns; disruption of the distribution of ocular facets and bristles; and absence of pigmentation of the second and third legs. Statistical analysis by the chi-square method showed that the difference between experiment and control was significant at P ≤ 0.01. On this basis, it can be considered that alpha particles, which in the environment are mainly generated by radon and its isotopes, have a mutagenic effect that manifests itself mainly in the formation of morphoses or deformities.
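
The significance test reported above can be reproduced in outline with a chi-square test on a 2×2 contingency table. The abstract gives the morphosis rates (1.8% experiment, 0.4% control) but not the group sizes, so the counts in this sketch are hypothetical.

```python
from scipy.stats import chi2_contingency

# Sketch of the chi-square comparison described above. The abstract reports
# morphosis rates of 1.8% (exposed) and 0.4% (control) but not the sample
# sizes, so the counts below assume 1000 flies per group (hypothetical).
table = [[18, 1000 - 18],   # exposed:  morphoses, normal
         [4,  1000 - 4]]    # control:  morphoses, normal
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")   # p < 0.01 for these illustrative counts
```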

Keywords: Alpha-radiation, genotoxicity, morphoses, radioecology, radon.

30 Effects of the Stock Market Dynamic Linkages on the Central and Eastern European Capital Markets

Authors: Ioan Popa, Cristiana Tudor, Radu Lupu

Abstract:

The interdependences among stock market indices have long been studied by academics all over the world. The current financial crisis opened the door to a wide range of opinions concerning the understanding and measurement of the connections considered to underlie the controversial phenomenon of market integration. Using data on the log-returns of 17 stock market indices that include most of the CEE markets, from 2005 until 2009, our paper studies these dependences using a new methodological tool that takes into account both the volatility clustering effect and the stochastic properties of these linkages through a Dynamic Conditional System of Simultaneous Equations. We find that the crisis is well captured by our model, as it provides evidence for the high volatility – high dependence effect.

Keywords: Stock market interdependences, Dynamic System of Simultaneous Equations, financial crisis.

29 Uneven Development: Structural Changes and Income Outcomes across States in Malaysia

Authors: Siti Aiysyah Tumin

Abstract:

This paper looks at the nature of structural changes—the transition of employment from agriculture to manufacturing, then to different types of services—in different states in Malaysia and links it to income outcomes for households and workers. Specifically, this paper investigates the conditional association between the concentration of different economic activities and income outcomes (household incomes and employee wages) over almost four decades. Using publicly available state-level employment and income data, we found that a significant wage premium was associated with “modern” services (finance, real estate, professional, information and communication), which are urban-based services sectors that employ a larger proportion of skilled and educated workers. However, employment in manufacturing and other services subsectors was significantly associated with lower income dispersion and inequality, pointing to their importance for welfare improvements.

Keywords: Employment, labour market, structural change, wages.

28 Discovery of Fuzzy Censored Production Rules from Large Set of Discovered Fuzzy if then Rules

Authors: Tamanna Siddiqui, M. Afshar Alam

Abstract:

Censored Production Rules (CPRs) are an extension of standard production rules, concerned with the problems of reasoning with incomplete information subject to resource constraints and of reasoning efficiently with exceptions. A CPR has the form: IF A (Condition) THEN B (Action) UNLESS C (Censor), where C is the exception condition. Fuzzy CPRs are obtained by augmenting an ordinary fuzzy production rule “If X is A then Y is B” with an exception condition and are written in the form “If X is A then Y is B Unless Z is C”. Such rules are employed in situations in which the fuzzy conditional statement “If X is A then Y is B” holds frequently and the exception condition “Z is C” holds rarely. Thus the “If X is A then Y is B” part of the fuzzy CPR expresses important information, while the Unless part acts only as a switch that changes the polarity of “Y is B” to “Y is not B” when the assertion “Z is C” holds. The proposed approach is an attempt to discover fuzzy censored production rules from a set of discovered fuzzy if-then rules in the form: A(X) ⇒ B(Y) || C(Z).
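
A fuzzy CPR of this kind can be evaluated with a few lines of code. The sketch below is a minimal illustration, not the discovery algorithm proposed in the paper; the membership functions and the censor threshold are hypothetical.

```python
# Minimal sketch of evaluating a fuzzy censored production rule
# "If X is A then Y is B Unless Z is C". Membership functions are hypothetical.
def mu_A(x):            # membership of X in the fuzzy set A
    return max(0.0, min(1.0, (x - 20) / 10))

def mu_C(z):            # membership of Z in the censor set C
    return max(0.0, min(1.0, (z - 80) / 10))

def fire_cpr(x, z, censor_threshold=0.5):
    """Return (conclusion, strength). The censor only flips the polarity of
    the conclusion when 'Z is C' holds strongly enough."""
    strength = mu_A(x)
    if mu_C(z) >= censor_threshold:
        return "Y is not B", strength
    return "Y is B", strength

print(fire_cpr(x=28, z=50))   # censor does not hold -> ('Y is B', 0.8)
print(fire_cpr(x=28, z=95))   # censor holds        -> ('Y is not B', 0.8)
```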

Keywords: Uncertainty Quantification, Fuzzy if then rules, Fuzzy Censored Production Rules, Learning algorithm.

27 Investigating Real Ship Accidents with Descriptive Analysis in Turkey

Authors: İsmail Karaca, Ömer Söner

Abstract:

The use of advanced methods has been increasing day by day in the maritime sector, which is one of the sectors least affected by the COVID-19 pandemic. The aim is to minimize accidents, especially by using advanced methods in the investigation of marine accidents. This research conducted an exploratory statistical analysis of particular ship accidents in the database of the Transport Safety Investigation Center of Turkey. Forty-six ship accidents that occurred between 2010 and 2018 were selected from the database. In addition to the availability of a reliable and comprehensive database, taking advantage of robust statistical models for investigation is critical to improving the safety of ships. Thus, descriptive analysis has been used in the research to identify causes and conditional factors related to different types of ship accidents. The research outcomes underline the fact that environmental factors and the day/night ratio have a great influence on ship safety.

Keywords: Descriptive analysis, maritime industry, maritime safety, marine accident analysis.

26 Data Envelopment Analysis under Uncertainty and Risk

Authors: P. Beraldi, M. E. Bruni

Abstract:

Data Envelopment Analysis (DEA) is one of the most widely used techniques for evaluating the relative efficiency of a set of homogeneous decision making units. Traditionally, it assumes that input and output variables are known in advance, ignoring the critical issue of data uncertainty. In this paper, we deal with the problem of efficiency evaluation under uncertain conditions by adopting the general framework of stochastic programming. We assume that output parameters are represented by discretely distributed random variables, and we propose two different models defined according to a risk-neutral and a risk-averse perspective. The models have been validated by considering a real case study concerning the evaluation of the technical efficiency of a sample of individual firms operating in the Italian leather manufacturing industry. Our findings show the validity of the proposed approach as an ex-ante evaluation technique by providing the decision maker with useful insights depending on their degree of risk aversion.
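
The risk-averse model refers to Conditional Value at Risk (see the keywords). As a brief illustration of that risk measure alone, the sketch below computes VaR and CVaR for a discrete set of hypothetical scenario losses; it is not the paper's DEA formulation.

```python
import numpy as np

# Minimal sketch of Conditional Value at Risk (CVaR) for a discrete loss
# distribution. The scenario losses below are hypothetical, not the firms'
# data from the case study.
losses = np.array([2.0, 3.5, 1.0, 7.0, 4.2, 9.5, 0.5, 6.1, 3.3, 8.8])
alpha = 0.8                                    # confidence level

var = np.quantile(losses, alpha)               # Value at Risk at level alpha
cvar = losses[losses >= var].mean()            # mean loss in the worst (1 - alpha) tail
print(f"VaR_{alpha} = {var:.2f}, CVaR_{alpha} = {cvar:.2f}")
```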

Keywords: DEA, Stochastic Programming, Ex-ante evaluation technique, Conditional Value at Risk.

25 Geometric Representation of Modified Forms of Seven Important Failure Criteria

Authors: Ranajay Bhowmick

Abstract:

Elastoplastic analysis of a structural system involves defining a failure/yield criterion, flow rules and hardening rules. The failure/yield criterion defines the limit beyond which the material flows plastically and hardens/softens or remains perfectly plastic before ultimate collapse. The failure/yield criterion is represented geometrically in three/two-dimensional Haigh-Westergaard stress-space to facilitate a better understanding of the behavior of the material. In the present study, geometric representations in three- and two-dimensional stress-space of a few important failure/yield criteria are presented. The criteria presented are the modified forms obtained from the conditional solutions of the equation of stress invariants. A comparison of the failure/yield surfaces is also presented to assess the effectiveness of each, and it has been found that for identical conditions the Rankine criterion gives the largest values of limiting stresses.

Keywords: Deviatoric plane, failure criteria, geometric representation, hydrostatic axis, modified form.

24 On the Efficiency and Robustness of Commingle Wiener and Lévy Driven Processes for Vasicek Model

Authors: Rasaki O. Olanrewaju

Abstract:

The Wiener and Lévy driven processes are known self-standing Gaussian-Markov processes for fitting the non-linear dynamical Vasicek model. In this paper, a coincidental Gaussian density stationarity condition and the autocorrelation function of the two driven processes were established. This led to the conflation of the Wiener and Lévy processes so as to investigate the efficiency of estimates incorporated into the one-dimensional Vasicek model, which was estimated via the Maximum Likelihood (ML) technique. The conditional laws of the drift, diffusion and stationarity process were ascertained for the individual Wiener and Lévy processes, as well as for the commingling of the two processes, for a fixed-effect and autoregressive-like Vasicek model when applied to a financial series: the Naira-CFA Franc exchange rate. In addition, the model performance error of the merged driven process was small compared to that of the self-standing Wiener and Lévy driven processes.
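
For reference, the Vasicek dynamics dr = a(b - r)dt + sigma dW can be simulated with a simple Euler-Maruyama scheme under a Wiener driving process, as in the sketch below; the parameter values are illustrative and are not the estimates obtained in the paper.

```python
import numpy as np

# Minimal sketch of the one-dimensional Vasicek model dr = a(b - r)dt + sigma dW,
# simulated with an Euler-Maruyama scheme under a Wiener driving process.
# Parameter values are illustrative, not estimates from the Naira-CFA Franc series.
a, b, sigma = 0.5, 0.05, 0.02     # mean-reversion speed, long-run level, volatility
r0, T, n = 0.03, 5.0, 1_000
dt = T / n

rng = np.random.default_rng(0)
r = np.empty(n + 1)
r[0] = r0
for t in range(n):
    r[t + 1] = r[t] + a * (b - r[t]) * dt + sigma * np.sqrt(dt) * rng.standard_normal()

print(r[-1])
```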

Keywords: Wiener process, Lévy process, Vasicek model, drift, diffusion, Gaussian density stationarity.

23 Reliability Analysis of Underground Pipelines Using Subset Simulation

Authors: Kong Fah Tee, Lutfor Rahman Khan, Hongshuang Li

Abstract:

An advanced Monte Carlo simulation method called Subset Simulation (SS) is presented in this paper for time-dependent reliability prediction of underground pipelines. SS provides better resolution for low failure probability levels by efficiently investigating the rare failure events that are commonly encountered in pipeline engineering applications. In the SS method, random samples leading to progressive failure are generated efficiently and used for computing probabilistic performance by statistical variables. SS gains its efficiency by expressing a small failure probability as a product of larger conditional probabilities of a sequence of intermediate events. The efficiency of SS has been demonstrated by numerical studies, and attention in this work is devoted to scrutinising the robustness of the SS application in pipe reliability assessment. It is hoped that this development work can promote the use of SS tools for uncertainty propagation in the decision-making process of underground pipeline network reliability prediction.
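
The product-of-conditional-probabilities idea can be shown on a toy one-dimensional example. The sketch below only illustrates the decomposition; the actual SS algorithm generates the conditional samples by MCMC (modified Metropolis) rather than by the simple rejection used here, and the thresholds are arbitrary.

```python
import numpy as np

# Didactic illustration of the idea behind Subset Simulation: a small failure
# probability P(F) is written as a product of larger conditional probabilities
# over nested intermediate events F_1 ⊃ F_2 ⊃ ... ⊃ F_m = F. Here, for a toy
# 1-D Gaussian limit state, the conditional probabilities are estimated by
# simple rejection so the example stays short; thresholds are illustrative.
rng = np.random.default_rng(1)
thresholds = [1.5, 2.5, 3.5]        # nested events F_i = {X > b_i}

prob = 1.0
samples = rng.standard_normal(200_000)
for lo in thresholds:
    cond = samples[samples > lo].size / samples.size   # estimate of P(F_i | F_{i-1})
    prob *= cond
    samples = samples[samples > lo]                    # condition on F_i
print(f"P(X > {thresholds[-1]}) ~ {prob:.2e}")          # exact value is about 2.3e-04
```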

Keywords: Underground pipelines, Probability of failure, Reliability and Subset Simulation.

22 Monotonicity of Dependence Concepts from Independent Random Vector into Dependent Random Vector

Authors: Guangpu Chen

Abstract:

When the failure function is monotone, monotonic reliability methods can be used to greatly simplify and facilitate the reliability computations. However, these methods often work in a transformed iso-probabilistic space. To this end, a monotonic simulator or transformation is needed so that the transformed failure function is still monotone. This note first proves that the output distribution of the failure function is invariant under the transformation. It then presents some conditions under which the transformed function is still monotone in the newly obtained space. These concern the copulas and the dependence concepts. In many engineering applications, Gaussian copulas are often used to approximate the real-world copulas when the available information on the random variables is limited to the set of marginal distributions and the covariances. This note therefore focuses on the conditional monotonicity of the commonly used transformation from an independent random vector into a dependent random vector with Gaussian copulas.
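
The transformation in question maps independent standard normals to a dependent vector with prescribed marginals through a Gaussian copula. The sketch below is a generic Nataf-style construction with an assumed correlation matrix and exponential marginals, not the note's proof or its specific examples.

```python
import numpy as np
from scipy.stats import norm, expon

# Minimal sketch: an independent standard normal vector U is mapped to a
# dependent vector X with given marginals and a Gaussian copula (a Nataf/
# Rosenblatt-type construction). Correlation and marginals are illustrative.
R = np.array([[1.0, 0.7],
              [0.7, 1.0]])                 # copula correlation (positive definite)
L = np.linalg.cholesky(R)

rng = np.random.default_rng(0)
U = rng.standard_normal((10_000, 2))       # independent standard normals
Z = U @ L.T                                # correlated standard normals
V = norm.cdf(Z)                            # Gaussian copula: dependent uniforms
X = expon(scale=2.0).ppf(V)                # apply the target marginals

print(np.corrcoef(X, rowvar=False))        # dependence induced in the new space
```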

Keywords: Monotonic, Rosenblatt, Nataf transformation, dependence concepts, completely positive matrices, Gaussian copulas.

21 Spatio-Temporal Analysis and Mapping of Malaria in Thailand

Authors: Krisada Lekdee, Sunee Sammatat, Nittaya Boonsit

Abstract:

This paper proposes a GLMM with spatial and temporal effects for malaria data in Thailand. A Bayesian method is used for parameter estimation via Gibbs sampling MCMC. A conditional autoregressive (CAR) model is assumed to represent the spatial effects. The temporal correlation is presented through the covariance matrix of the random effects. The malaria quarterly data have been extracted from the Bureau of Epidemiology, Ministry of Public Health of Thailand. The factors considered are rainfall and temperature. The result shows that rainfall and temperature are positively related to the malaria morbidity rate. The posterior means of the estimated morbidity rates are used to construct the malaria maps. The top 5 highest morbidity rates (per 100,000 population) are in Trat (Q3, 111.70), Chiang Mai (Q3, 104.70), Narathiwat (Q4, 97.69), Chiang Mai (Q2, 88.51), and Chanthaburi (Q3, 86.82). According to the DIC criterion, the proposed model has a better performance than the GLMM with spatial effects but without temporal terms.

Keywords: Bayesian method, generalized linear mixed model (GLMM), malaria, spatial effects, temporal correlation.

20 RASPE – Risk Advisory Smart System for Pipeline Projects in Egypt

Authors: Nael Y. Zabel, Maged E. Georgy, Moheeb E. Ibrahim

Abstract:

A knowledge-based expert system with the acronym RASPE is developed as an application tool to help decision makers in construction companies make informed decisions about managing risks in pipeline construction projects. Choosing expert systems from all available artificial intelligence techniques is due to the fact that an expert system is more suited to representing a domain's knowledge and the reasoning behind domain-specific decisions. The knowledge-based expert system captures the knowledge in the form of conditional rules which represent various project scenarios and potential risk mitigation/response actions. The knowledge built into RASPE is utilized through the underlying inference engine, which allows the firing of rules relevant to the project scenario under consideration. The paper provides an overview of the knowledge acquisition process and describes the knowledge structure, which is divided into four major modules. The paper shows one module in full detail for illustration purposes and concludes with insightful remarks.

Keywords: Expert System, Knowledge Management, Pipeline Projects, Risk Management.

19 Volatility Switching between Two Regimes

Authors: Josip Visković, Josip Arnerić, Ante Rozga

Abstract:

Based on the fact that volatility is time-varying in high frequency data and that periods of high volatility tend to cluster, the most successful and popular models for modeling time-varying volatility are GARCH-type models. When financial returns exhibit sudden jumps due to structural breaks, standard GARCH models show high volatility persistence, i.e. integrated behavior of the conditional variance. In such situations, models in which the parameters are allowed to change over time are more appropriate. This paper compares different GARCH models in terms of their ability to describe structural changes in returns caused by the financial crisis in the stock markets of six selected Central and Eastern European countries. The empirical analysis demonstrates that the Markov regime switching GARCH model resolves the problem of excessive persistence and outperforms uni-regime GARCH models in forecasting volatility when sudden switching occurs in response to the financial crisis.

Keywords: Central and east European countries, financial crisis, Markov switching GARCH model, transition probabilities.

18 Software Reliability Prediction Model Analysis

Authors: L. Mirtskhulava, M. Khunjgurua, N. Lomineishvili, K. Bakuria

Abstract:

Software reliability prediction gives a great opportunity to measure the software failure rate at any point throughout system testing. A software reliability prediction model provides a technique for improving reliability. Software reliability is a very important factor in estimating overall system reliability, which depends on the individual component reliabilities. It differs from hardware reliability in that it reflects design perfection. The main reason for software reliability problems is the high complexity of software. Various approaches can be used to improve the reliability of software. We focus on a software reliability model in this article, assuming that there is time redundancy, the value of which (the number of repeated transmissions of basic blocks) can be an optimization parameter. We consider the given mathematical model under the assumption that the system may experience not only irreversible failures but also failures that can be regarded as self-repairing, which significantly affect the reliability and accuracy of information transfer. The main task of this paper is to find the time distribution function (DF) of the transmission of an instruction sequence that consists of a random number of basic blocks. We consider the system software unreliable; the time between adjacent failures has an exponential distribution.

Keywords: Exponential distribution, conditional mean time to failure, distribution function, mathematical model, software reliability.

17 Bayesian Decision Approach to Protection on the Flood Event in Upper Ayeyarwady River, Myanmar

Authors: Min Min Swe Zin

Abstract:

This paper introduces the foundations of Bayesian probability theory and the Bayesian decision method. The main goal of Bayesian decision theory is to minimize the expected loss of a decision, or equivalently the expected risk. The purposes of this study are to review the decision process on the issue of flood occurrences and to suggest a possible process for decision improvement. This study examines the problem structure of flood occurrences and theoretically explicates the decision-analytic approach based on Bayesian decision theory and its application to flood occurrences in environmental engineering. In this study, we discuss flood occurrences based on the annual maximum water level (in cm), a 43-year record available from 1965 to 2007 at the Sagaing gauging station on the Ayeyarwady River, with a drainage area of 120,193 sq km, using the Bayesian decision method. As a result, we discuss the loss and risk of whether vast areas of agricultural land will be inundated or not in the coming year, based on the two standard maximum water levels during the 43 years. We also forecast whether these lands will be safe from flood water during the next 10 years.
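
The prior beta distribution and conditional binomial likelihood named in the keywords combine into a standard conjugate update, sketched below with hypothetical exceedance counts and an assumed loss/cost pair; this is not the paper's actual record or loss function.

```python
from scipy.stats import beta

# Minimal sketch of a Bayesian update with a beta prior and binomial likelihood.
# The prior parameters and the counts of "flood years" are hypothetical, not the
# Sagaing record used in the paper.
a0, b0 = 1.0, 1.0                 # Beta(1, 1) prior on the yearly flood probability
flood_years, total_years = 12, 43 # hypothetical exceedance counts

post = beta(a0 + flood_years, b0 + total_years - flood_years)
p_hat = post.mean()
print(f"posterior mean flood probability: {p_hat:.3f}")

# Decision sketch: protect the land if the expected loss without protection
# exceeds the (assumed) cost of protection.
loss_if_flooded, protection_cost = 100.0, 20.0
expected_loss = p_hat * loss_if_flooded
print("protect" if expected_loss > protection_cost else "do not protect")
```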

Keywords: Bayesian decision method, conditional binomial distribution, minimax rules, prior beta distribution.

16 Economic Evaluation of Bowland Shale Gas Wells Development in the UK

Authors: Elijah Acquah-Andoh

Abstract:

The UK has had its fair share of the shale gas revolutionary wave blowing across the global oil and gas industry at present. Although its exploitation is widely agreed to have been delayed, shale gas was looked upon favorably by the UK Parliament when it was recognized as a genuine energy source and licenses were granted to industry to search for and extract the resource. Although this is significant progress by industry, there remains another test the UK fracking resource must pass in order to render shale gas extraction feasible: it must be economically extractible, and sustainably so. Developing unconventional resources is much more expensive and risky, and for shale gas wells, producing in commercial volumes is conditional upon drilling horizontal wells and hydraulic fracturing, techniques which increase CAPEX. Meanwhile, investment in shale gas development projects is sensitive to gas price and to technical and geological risks. Using a Two-Factor Model, the economics of the Bowland shale wells were analyzed and the operational conditions under which fracking is profitable in the UK were characterized. We find that there is a great degree of flexibility about Opex spending; hence Opex does not pose much threat to the fracking industry in the UK. However, we discover that Bowland shale gas wells fail to add value at a gas price of $8/MMBtu. A minimum gas price of $12/MMBtu, at an Opex of no more than $2/Mcf and a Capex of no more than $14.95M, is required to create value within the present petroleum tax regime in the UK fracking industry.

Keywords: Capex, economical, investment, profitability, shale gas development, sustainable.

15 The Profit Trend of Cosmetics Products Using Bootstrap Edgeworth Approximation

Authors: Edlira Donefski, Lorenc Ekonomi, Tina Donefski

Abstract:

Edgeworth approximation is one of the most important statistical methods and makes a considerable contribution to reducing the standard deviations of the independent variables' coefficients in a quantile regression model. This model estimates the conditional median or other quantiles. In this paper, we have applied approximating statistical methods to an economic problem. We have created and generated a quantile regression model to see how the profit gained is connected with the realized sales of cosmetic products in real data taken from a local business. The linear regression of the generated profit and the realized sales was not free of autocorrelation and heteroscedasticity, which is why we have used this model instead of linear regression. Our aim is to analyze in more detail the relation between the variables under study, the profit and the finalized sales, and how to minimize the standard errors of the independent variable involved in this study, the level of realized sales. The statistical methods that we have applied in our work are the Edgeworth approximation for independent and identically distributed (IID) cases, the bootstrap version of the model, and the Edgeworth approximation for the bootstrap quantile regression model. The graphics and the results presented here identify the best approximating model of our study.

Keywords: Bootstrap, Edgeworth approximation, independent and identically distributed, quantile.

14 A Two-Step Approach for Tree-structured XPath Query Reduction

Authors: Minsoo Lee, Yun-mi Kim, Yoon-kyung Lee

Abstract:

XML data has a very flexible tree structure, which makes it difficult to support the storing and retrieving of XML data. The node numbering scheme is one of the most popular approaches to storing XML in relational databases. Together with the node numbering storage scheme, structural joins can be used to efficiently process the hierarchical relationships in XML. However, in order to process a tree-structured XPath query containing several hierarchical relationships and conditional sentences on XML data, many structural joins need to be carried out, which results in a high query execution cost. This paper introduces mechanisms to reduce XPath queries that include branch nodes into a much more efficient form with fewer structural joins. A two-step approach is proposed. The first step merges duplicate nodes in the tree-structured query and the second step divides the query into sub-queries, shortens the paths and then merges the sub-queries back together. The proposed approach can contribute greatly to the efficient execution of XML queries. Experimental results show that the proposed scheme can reduce the query execution cost by up to an order of magnitude of the original execution cost.
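
For readers unfamiliar with node numbering, the sketch below shows the common (start, end, level) labelling and the interval-containment test that structural joins rely on; the tiny document is an invented example, not part of the paper's experiments.

```python
# Minimal sketch of the node numbering idea behind structural joins: each XML
# node is labelled (start, end, level) by a pre-order traversal, and node a is
# an ancestor of node d iff a.start < d.start and d.end < a.end. The tiny tree
# below is an illustrative example, not the paper's dataset.
labels = {                      # node: (start, end, level)
    "book":    (1, 10, 1),
    "chapter": (2, 7, 2),
    "title":   (3, 4, 3),
    "section": (5, 6, 3),
    "author":  (8, 9, 2),
}

def is_ancestor(a, d):
    (as_, ae, _), (ds, de, _) = labels[a], labels[d]
    return as_ < ds and de < ae

print(is_ancestor("book", "section"))     # True
print(is_ancestor("chapter", "author"))   # False
```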

Keywords: XML, XPath, tree-structured query, query reduction.

13 Computer Verification in Cryptography

Authors: Markus Kaiser, Johannes Buchmann

Abstract:

In this paper we explore the application of a formal proof system to verification problems in cryptography. Cryptographic properties concerning the correctness or security of some cryptographic algorithms are of great interest. Besides some basic lemmata, we explore an implementation of a complex function that is used in cryptography. More precisely, we describe formal properties of this implementation that we computer prove. We describe formalized probability distributions (σ-algebras, probability spaces and conditional probabilities). These are given in the formal language of the formal proof system Isabelle/HOL. Moreover, we computer prove Bayes' Formula. Besides, we describe an application of the presented formalized probability distributions to cryptography. Furthermore, this paper shows that computer proofs of complex cryptographic functions are possible by presenting an implementation of the Miller-Rabin primality test that admits formal verification. Our achievements are a step towards computer verification of cryptographic primitives. They describe a basis for computer verification in cryptography. Computer verification can be applied to further problems in cryptographic research, if the corresponding basic mathematical knowledge is available in a database.
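
For orientation, a plain (unverified) Python version of the Miller-Rabin test mentioned above is sketched below; the formally verified Isabelle/HOL implementation in the paper is of course quite different.

```python
import random

# Sketch of the Miller-Rabin probabilistic primality test (plain Python, not the
# formally verified implementation described in the paper).
def miller_rabin(n: int, rounds: int = 20) -> bool:
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):
        if n % p == 0:
            return n == p
    # write n - 1 = 2^r * d with d odd
    d, r = n - 1, 0
    while d % 2 == 0:
        d //= 2
        r += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False            # composite witness found
    return True                     # probably prime

print(miller_rabin(561))            # composite (Carmichael number) -> False
print(miller_rabin(2**127 - 1))     # Mersenne prime                -> True
```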

Keywords: prime numbers, primality tests, (conditional) probability distributions, formal proof system, higher-order logic, formal verification, Bayes' Formula, Miller-Rabin primality test.

12 Mining Network Data for Intrusion Detection through Naïve Bayesian with Clustering

Authors: Dewan Md. Farid, Nouria Harbi, Suman Ahmmed, Md. Zahidur Rahman, Chowdhury Mofizur Rahman

Abstract:

Network security attacks are violations of the information security policy that have received much attention from the computational intelligence community in the last decades. Data mining has become a very useful technique for detecting network intrusions by extracting useful knowledge from large amounts of network data or logs. The naïve Bayesian classifier is one of the most popular data mining algorithms for classification, providing an optimal way to predict the class of an unknown example. It has been shown that one set of probabilities derived from the data is not good enough to achieve a good classification rate. In this paper, we propose a new learning algorithm for mining network logs to detect network intrusions through a naïve Bayesian classifier, which first clusters the network logs into several groups based on the similarity of logs, and then calculates the prior and conditional probabilities for each group of logs. For classifying a new log, the algorithm checks to which cluster the log belongs and then uses that cluster's probability set to classify the new log. We tested the performance of the proposed algorithm by employing the KDD99 benchmark network intrusion detection dataset, and the experimental results proved that it improves detection rates as well as reduces false positives for different types of network intrusions.
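
The cluster-then-classify idea can be sketched with off-the-shelf components. The snippet below pairs k-means clustering with one Gaussian naive Bayes model per cluster on synthetic two-feature data; it is only an illustration of the idea, not the authors' algorithm, which computes the cluster-wise prior and conditional probabilities directly from the logs.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.naive_bayes import GaussianNB

# Synthetic two-feature "logs": two traffic profiles (clusters); within each,
# intrusions sit slightly offset from normal traffic. Stand-in for KDD99 records.
rng = np.random.default_rng(0)
X0 = rng.normal(0, 1, (200, 2));               y0 = np.zeros(200)   # normal, profile 0
X1 = rng.normal(0, 1, (200, 2)) + [0.0, 3.0];  y1 = np.ones(200)    # intrusion, profile 0
X2 = rng.normal(0, 1, (200, 2)) + [8.0, 0.0];  y2 = np.zeros(200)   # normal, profile 1
X3 = rng.normal(0, 1, (200, 2)) + [8.0, 3.0];  y3 = np.ones(200)    # intrusion, profile 1
X = np.vstack([X0, X1, X2, X3]); y = np.r_[y0, y1, y2, y3]

# Step 1: cluster the logs; step 2: fit one naive Bayes model per cluster.
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
models = {c: GaussianNB().fit(X[km.labels_ == c], y[km.labels_ == c])
          for c in range(km.n_clusters)}

# Classify a new log with the model of the cluster it falls into.
x_new = np.array([[8.2, 2.8]])
c = km.predict(x_new)[0]
print("predicted class:", int(models[c].predict(x_new)[0]))   # 1 = intrusion
```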

Keywords: Clustering, detection rate, false positive, naïve Bayesian classifier, network intrusion detection.

11 Numerical Optimization within Vector of Parameters Estimation in Volatility Models

Authors: J. Arneric, A. Rozga

Abstract:

In this paper, the usefulness of a quasi-Newton iteration procedure for parameter estimation of the conditional variance equation within the BHHH algorithm is presented. Analytical solution of the maximization of the likelihood function using first and second derivatives is too complex when the variance is time-varying. The advantage of the BHHH algorithm in comparison to other optimization algorithms is that it requires no third derivatives while having assured convergence. To simplify the optimization procedure, the BHHH algorithm uses an approximation of the matrix of second derivatives according to the information identity. However, parameter estimation in a/symmetric GARCH(1,1) models assuming a normal distribution of returns is not that simple, i.e. it is difficult to solve analytically. The maximum of the likelihood function can be found by an iterative procedure that continues until no further increase can be found. Because the solutions of the numerical optimization are very sensitive to the initial values, GARCH(1,1) model starting parameters are defined. The number of iterations can be reduced using starting values close to the global maximum. The optimization procedure is illustrated in the framework of modeling volatility on a daily basis of the most liquid stocks on the Croatian capital market: Podravka stocks (food industry), Petrokemija stocks (fertilizer industry) and Ericsson Nikola Tesla stocks (information and communications industry).
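
The objective being maximized is the Gaussian log-likelihood built on the GARCH(1,1) recursion h_t = omega + alpha*r_{t-1}^2 + beta*h_{t-1}. The sketch below sets it up and hands it to a general-purpose numerical optimizer in place of the BHHH iteration described in the paper; the simulated returns and starting values are placeholders, not the Croatian stock data.

```python
import numpy as np
from scipy.optimize import minimize

# Minimal sketch of GARCH(1,1) maximum likelihood estimation. scipy's
# general-purpose optimizer stands in for the BHHH iteration; the simulated
# returns replace the stock data used in the paper.
rng = np.random.default_rng(0)
r = 0.01 * rng.standard_normal(2_000)            # placeholder return series

def neg_loglik(params, r):
    omega, alpha, beta = params
    if omega <= 0 or alpha < 0 or beta < 0 or alpha + beta >= 1:
        return np.inf                            # enforce positivity/stationarity
    h = np.empty_like(r)
    h[0] = r.var()                               # starting value for the variance
    for t in range(1, len(r)):
        h[t] = omega + alpha * r[t - 1] ** 2 + beta * h[t - 1]
    return 0.5 * np.sum(np.log(2 * np.pi * h) + r ** 2 / h)

start = np.array([1e-5, 0.05, 0.90])             # starting parameters
res = minimize(neg_loglik, start, args=(r,), method="Nelder-Mead")
print(res.x)                                     # estimated (omega, alpha, beta)
```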

Keywords: Heteroscedasticity, Log-likelihood Maximization, Quasi-Newton iteration procedure, Volatility.

10 EFL Teachers’ Metacognitive Awareness as a Predictor of Their Professional Success

Authors: Saeedeh Shafiee Nahrkhalaji

Abstract:

Metacognitive knowledge increases EFL students' ability to be successful learners. Although this relationship has been investigated by a number of scholars, EFL teachers' explicit awareness of their cognitive knowledge has not been sufficiently explored. The aim of this study was to examine the role of EFL teachers' metacognitive knowledge in their pedagogical performance. Furthermore, the role played by years of academic education and teaching experience was also studied. Fifty female EFL teachers were selected. They completed the Metacognitive Awareness Inventory (MAI), which assessed six components of metacognition including procedural knowledge, declarative knowledge, conditional knowledge, planning, evaluating, and management strategies. Near the end of the academic semester, the students of each class filled in the Language Teacher Characteristics Questionnaire to evaluate their teachers' pedagogical performance. Four elements of the MAI (declarative knowledge, planning, evaluating, and management strategies) were found to be significantly correlated with EFL teachers' pedagogical success. A significant correlation was also established between metacognitive knowledge and EFL teachers' years of academic education and teaching experience. The findings obtained from this research have important implications for EFL teacher educators. The discussion concludes by setting out directions for future research.

Keywords: Metacognitive Knowledge, Pedagogical Performance, Language Teacher Characteristics Questionnaire, Metacognitive Awareness Inventory.

9 The Non-Stationary BINARMA(1,1) Process with Poisson Innovations: An Application on Accident Data

Authors: Y. Sunecher, N. Mamode Khan, V. Jowaheer

Abstract:

This paper considers the modelling of a non-stationary bivariate integer-valued autoregressive moving average model of order one (BINARMA(1,1)) with correlated Poisson innovations. The BINARMA(1,1) model is specified using the binomial thinning operator and by assuming that the cross-correlation between the two series is induced by the innovation terms only. Based on these assumptions, the non-stationary marginal and joint moments of the BINARMA(1,1) model are derived iteratively by using some initial stationary moments. As regards the estimation of the parameters of the proposed model, the conditional maximum likelihood (CML) estimation method is derived based on thinning and convolution properties. The forecasting equations of the BINARMA(1,1) model are also derived. A simulation study is also proposed in which BINARMA(1,1) count data are generated using a multivariate Poisson R code for the innovation terms. The performance of the BINARMA(1,1) model is then assessed through a simulation experiment, and the mean estimates of the model parameters obtained are all efficient, based on their standard errors. The proposed model is then used to analyse real-life accident data on the motorway in Mauritius, based on some covariates: policemen, daily patrol, speed cameras, traffic lights and roundabouts. The BINARMA(1,1) model is applied to the accident data and the CML estimates clearly indicate a significant impact of the covariates on the number of accidents on the motorway in Mauritius. The forecasting equations also provide reliable one-step ahead forecasts.
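
The binomial thinning operator at the heart of such models is easy to state: thinning a count X with probability alpha gives a Binomial(X, alpha) draw. The sketch below simulates a simple univariate INAR(1) series with Poisson innovations to show the operator in action; it is not the paper's bivariate BINARMA(1,1) specification or its CML estimator, and the parameter values are illustrative.

```python
import numpy as np

# Minimal sketch of the binomial thinning operator used to build INAR/BINARMA
# models: alpha thinning of X is Binomial(X, alpha). A simple INAR(1) series
# with Poisson innovations is simulated below (illustrative parameters only).
rng = np.random.default_rng(0)
alpha, lam, n = 0.6, 2.0, 500      # thinning probability, innovation mean, length

x = np.empty(n, dtype=int)
x[0] = rng.poisson(lam)
for t in range(1, n):
    survivors = rng.binomial(x[t - 1], alpha)   # binomial thinning of X_{t-1}
    x[t] = survivors + rng.poisson(lam)         # add the Poisson innovation
print(x[:20])
```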

Keywords: Non-stationary, BINARMA(1,1) model, Poisson innovations, CML.

8 Underivatized Amino Acid Analyses Using Liquid Chromatography-Tandem Mass Spectrometry in Scalp Hair of Children with Autism Spectrum Disorder

Authors: Ayat Bani Rashaid, Zain Khasawneh, Mazin Alqhazo, Shreen Nusair, Mohammad El-Khateeb, Mahmoud Bashtawi

Abstract:

Autism Spectrum disorder (ASD) is a psychiatric disorder with unknown etiology that mainly affects children in the first three years of life. Alterations of amino acid levels are believed to contribute to ASD. The levels of six essential amino acids (methionine, histidine, valine, leucine, threonine, and phenylalanine), five conditional amino acids (proline, tyrosine, glutamine, cysteine, and cystine), and five non-essential amino acids (asparagine, aspartic acid, alanine, serine, and glutamic acid) in hair samples of children with ASD (n = 25) were analyzed and compared to corresponding levels in healthy age-matched controls (n = 25). The results showed that the levels of methionine, alanine, and asparagine were significantly lower in the hair samples of ASD group compared to those of the control group (p ≤ 0.05). However, the levels of glutamic acid were significantly higher in the ASD group than the control group (p ≤ 0.05). The current findings could contribute towards further understanding of ASD etiology and provide specialists with a hair amino acid profile utilized as a biomarker for early diagnosis of ASD. Such biomarkers could participate in future developments of therapies that reduce ASD-related symptoms.

Keywords: Autism spectrum disorder, amino acids, liquid chromatography-tandem mass spectrometry, human hair.

7 Computing Entropy for Ortholog Detection

Authors: Hsing-Kuo Pao, John Case

Abstract:

Biological sequences from different species are called orthologs if they evolved from a sequence of a common ancestor species and they have the same biological function. Approximations of Kolmogorov complexity or entropy of biological sequences are already well known to be useful in extracting similarity information between such sequences - in the interest, for example, of ortholog detection. As is well known, the exact Kolmogorov complexity is not algorithmically computable. In practice one can approximate it by computable compression methods. However, such compression methods do not provide a good approximation to Kolmogorov complexity for short sequences. Herein is suggested a new approach to overcome the problem that compression approximations may not work well on short sequences. This approach is inspired by new, conditional computations of Kolmogorov entropy. A main contribution of the empirical work described shows the new set of entropy-based machine learning attributes provides good separation between positive (ortholog) and negative (non-ortholog) data - better than with good, previously known alternatives (which do not employ some means to handle short sequences well). Also empirically compared are the new entropy based attribute set and a number of other, more standard similarity attributes sets commonly used in genomic analysis. The various similarity attributes are evaluated by cross validation, through boosted decision tree induction C5.0, and by Receiver Operating Characteristic (ROC) analysis. The results point to the conclusion: the new, entropy based attribute set by itself is not the one giving the best prediction; however, it is the best attribute set for use in improving the other, standard attribute sets when conjoined with them.
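
The compression approximation mentioned above is often realized with the normalized compression distance (NCD). The sketch below computes it with zlib on toy sequences; it stands in for the general idea only and is not the paper's conditional-entropy attribute set.

```python
import zlib

# Sketch of a computable stand-in for Kolmogorov complexity: the length of a
# compressed string, and the normalized compression distance (NCD) built from it.
# Toy sequences replace real orthologs.
def C(s: bytes) -> int:
    return len(zlib.compress(s, 9))

def ncd(x: bytes, y: bytes) -> float:
    cx, cy, cxy = C(x), C(y), C(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

a = b"ATGGCGTACGTTAGC" * 20
b = b"ATGGCGTACGTTAGC" * 18 + b"ATGGCGTACGATAGC" * 2   # similar sequence
c = b"CCTTAAGGCCTTAAGG" * 20                            # unrelated sequence

print(f"NCD(a, b) = {ncd(a, b):.3f}   (similar: closer to 0)")
print(f"NCD(a, c) = {ncd(a, c):.3f}   (dissimilar: closer to 1)")
```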

Keywords: compression, decision tree, entropy, ortholog, ROC.

6 Evolutionary Approach for Automated Discovery of Censored Production Rules

Authors: Kamal K. Bharadwaj, Basheer M. Al-Maqaleh

Abstract:

In the recent past, there has been an increasing interest in applying evolutionary methods to Knowledge Discovery in Databases (KDD), and a number of successful applications of Genetic Algorithms (GA) and Genetic Programming (GP) to KDD have been demonstrated. The most predominant representation of the discovered knowledge is the standard Production Rule (PR) in the form If P Then D. PRs, however, are unable to handle exceptions and do not exhibit variable precision. Censored Production Rules (CPRs), an extension of PRs, were proposed by Michalski & Winston; they exhibit variable precision and support an efficient mechanism for handling exceptions. A CPR is an augmented production rule of the form: If P Then D Unless C, where C (Censor) is an exception to the rule. Such rules are employed in situations in which the conditional statement 'If P Then D' holds frequently and the assertion C holds rarely. By using a rule of this type we are free to ignore the exception condition when the resources needed to establish its presence are tight or there is simply no information available as to whether it holds or not. Thus, the 'If P Then D' part of the CPR expresses important information, while the Unless C part acts only as a switch and changes the polarity of D to ~D. This paper presents a classification algorithm based on an evolutionary approach that discovers comprehensible rules with exceptions in the form of CPRs. The proposed approach has a flexible chromosome encoding, where each chromosome corresponds to a CPR. Appropriate genetic operators are suggested and a fitness function is proposed that incorporates the basic constraints on CPRs. Experimental results are presented to demonstrate the performance of the proposed algorithm.

Keywords: Censored Production Rule, Data Mining, Machine Learning, Evolutionary Algorithms.
