Search results for: probabilistic programs
438 Service-Oriented Architecture for Object-Centric Information Fusion
Authors: Jeffrey A. Dunne, Kevin Ligozio
Abstract:
In many applications there is a broad variety of information relevant to a focal “object” of interest, and the fusion of such heterogeneous data types is desirable for classification and categorization. While these various data types can sometimes be treated as orthogonal (such as the hull number, superstructure color, and speed of an oil tanker), there are instances where inference on the correlations between quantities can provide improved fusion capabilities (such as the height, weight, and gender of a person). A service-oriented architecture has been designed and prototyped to support the fusion of information for such “object-centric” situations. It is modular, scalable, and flexible, and designed to support new data sources, fusion algorithms, and computational resources without affecting existing services. The architecture is designed to simplify the incorporation of legacy systems, support exact and probabilistic entity disambiguation, recognize and utilize multiple types of uncertainties, and minimize network bandwidth requirements.
Keywords: Data fusion, distributed computing, service-oriented architecture, SOA
437 Reliability Analysis of Tubular Joints of Offshore Platforms in Malaysia
Authors: Nelson J. Cossa, Narayanan S. Potty, Mohd Shahir Liew, Arazi B. Idrus
Abstract:
The oil and gas industry has moved towards Load and Resistance Factor Design through API RP2A - LRFD and the recently published international standard, ISO 19902, for the design of fixed steel offshore structures. ISO 19902 is intended to provide a harmonized design practice that offers a balanced structural fitness for purpose, economy and safety. As part of an ongoing work, the reliability analysis of tubular joints of the jacket structure has been carried out to calibrate the load and resistance factors for the design of offshore platforms in Malaysia, as proposed in the ISO. Probabilistic models have been established for the load effects (wave, wind and current) and the tubular joint strengths. In this study the First Order Reliability Method (FORM), coded in MATLAB, has been employed to evaluate the reliability index of typical joints designed using API RP2A - WSD and ISO 19902.
Keywords: FORM, Reliability Analysis, Tubular Joints
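To make the FORM step mentioned above concrete, the following is a minimal, hedged Python sketch of the Hasofer-Lind/Rackwitz-Fiessler iteration for a generic limit state with independent normal variables; the limit state function, means, standard deviations and tolerance are illustrative assumptions, not the calibration model used in the paper.

```python
import numpy as np
from scipy.stats import norm

def form_reliability(g, mu, sigma, tol=1e-6, max_iter=100):
    """HL-RF iteration for independent normal variables.

    g     : limit state function of the physical variables x (failure when g <= 0)
    mu    : means of the variables
    sigma : standard deviations of the variables
    Returns the reliability index beta and the FORM failure probability.
    """
    mu, sigma = np.asarray(mu, float), np.asarray(sigma, float)
    u = np.zeros_like(mu)                      # start at the standard-normal origin
    for _ in range(max_iter):
        x = mu + sigma * u                     # map back to physical space
        eps = 1e-6
        # numerical gradient of g with respect to u (chain rule through x = mu + sigma*u)
        grad = np.array([(g(x + eps * sigma * e) - g(x)) / eps
                         for e in np.eye(len(u))])
        u_new = (grad @ u - g(x)) / (grad @ grad) * grad   # HL-RF update
        if np.linalg.norm(u_new - u) < tol:
            u = u_new
            break
        u = u_new
    beta = np.linalg.norm(u)
    return beta, norm.cdf(-beta)

# Illustrative joint check: capacity R minus load effect S (both assumed normal).
beta, pf = form_reliability(lambda x: x[0] - x[1], mu=[420.0, 250.0], sigma=[35.0, 40.0])
print(f"beta = {beta:.3f}, Pf = {pf:.2e}")
```

For this linear example the iteration converges to the exact index beta = (420 - 250) / sqrt(35^2 + 40^2) ≈ 3.20, which is a useful sanity check for the sketch.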
436 Reliability Analysis of Underground Pipelines Using Subset Simulation
Authors: Kong Fah Tee, Lutfor Rahman Khan, Hongshuang Li
Abstract:
An advanced Monte Carlo simulation method, called Subset Simulation (SS), for the time-dependent reliability prediction of underground pipelines is presented in this paper. SS can provide better resolution for low failure probability levels by efficiently investigating the rare failure events that are commonly encountered in pipeline engineering applications. In the SS method, random samples leading to progressive failure are generated efficiently and used for computing probabilistic performance through statistical variables. SS gains its efficiency by expressing a small failure probability as a product of larger conditional probabilities of a sequence of intermediate events. The efficiency of SS has been demonstrated by numerical studies, and attention in this work is devoted to scrutinising the robustness of the SS application in pipe reliability assessment. It is hoped that this development work can promote the use of SS tools for uncertainty propagation in the decision-making process of underground pipeline network reliability prediction.
Keywords: Underground pipelines, Probability of failure, Reliability and Subset Simulation.
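As an illustration of the conditional-probability idea described in the abstract, here is a minimal, hedged Python sketch of basic Subset Simulation in standard normal space; the level probability p0, the random-walk proposal and the toy linear limit state are assumptions, and the sketch is not the pipeline model of the paper.

```python
import numpy as np

def subset_simulation(g, dim, n_per_level=1000, p0=0.1, max_levels=10, seed=0):
    """Basic Subset Simulation: failure is g(x) <= 0 in standard normal space.

    Each level keeps the p0*n worst samples as seeds and repopulates the
    conditional level with a Metropolis random walk, so the small failure
    probability is accumulated as a product of conditional probabilities."""
    rng = np.random.default_rng(seed)
    n_seeds = int(p0 * n_per_level)
    samples = rng.standard_normal((n_per_level, dim))
    values = np.array([g(x) for x in samples])
    p_f = 1.0
    for _ in range(max_levels):
        order = np.argsort(values)                 # smallest g = closest to failure
        threshold = values[order[n_seeds - 1]]
        if threshold <= 0.0:                       # failure domain reached
            return p_f * np.mean(values <= 0.0)
        p_f *= p0
        seeds = samples[order[:n_seeds]]
        seed_vals = values[order[:n_seeds]]
        chains, chain_vals = [], []
        steps = n_per_level // n_seeds
        for x, gx in zip(seeds, seed_vals):        # MCMC within the current subset
            for _ in range(steps):
                cand = x + rng.normal(scale=1.0, size=dim)   # random-walk proposal
                # Metropolis acceptance for the standard normal target
                if rng.random() < min(1.0, np.exp(0.5 * (x @ x - cand @ cand))):
                    g_cand = g(cand)
                    if g_cand <= threshold:        # stay inside the current subset
                        x, gx = cand, g_cand
                chains.append(x.copy())
                chain_vals.append(gx)
        samples, values = np.array(chains), np.array(chain_vals)
    return p_f * np.mean(values <= 0.0)

# Toy check: linear limit state with known answer Pf = Phi(-3) = 1.35e-3.
pf_est = subset_simulation(lambda x: 3.0 - x.sum() / np.sqrt(len(x)), dim=2)
print(f"estimated Pf ~ {pf_est:.2e} (exact 1.35e-03)")
```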
435 Heterogeneous Attribute Reduction in Noisy System based on a Generalized Neighborhood Rough Sets Model
Authors: Siyuan Jing, Kun She
Abstract:
Neighborhood Rough Sets (NRS) have been proven to be an efficient tool for heterogeneous attribute reduction. However, most research has focused on dealing with complete and noiseless data. In fact, most information systems are noisy, namely, filled with incomplete and inconsistent data. In this paper, we introduce a generalized neighborhood rough sets model, called VPTNRS, to deal with the problem of heterogeneous attribute reduction in noisy systems. We generalize the classical NRS model with a tolerance neighborhood relation and probabilistic theory. Furthermore, we use the neighborhood dependency to evaluate the significance of a subset of heterogeneous attributes and construct a forward greedy algorithm for attribute reduction based on it. Experimental results show that the model is efficient at dealing with noisy data.
Keywords: attribute reduction, incomplete data, inconsistent data, tolerance neighborhood relation, rough sets
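For orientation, the following is a minimal, hedged Python sketch of the classical neighborhood dependency measure and the forward greedy selection that the VPTNRS model generalizes; the neighborhood radius delta, the toy data and the use of the plain (non-tolerance, non-variable-precision) dependency are simplifying assumptions.

```python
import numpy as np

def neighborhood_dependency(X, y, attrs, delta=0.2):
    """Fraction of samples whose delta-neighborhood (over the chosen attributes)
    is pure in the decision label, i.e. the classical neighborhood positive region."""
    Xs = X[:, attrs]
    consistent = 0
    for i in range(len(X)):
        dist = np.linalg.norm(Xs - Xs[i], axis=1)
        neigh = y[dist <= delta]
        consistent += int(np.all(neigh == y[i]))
    return consistent / len(X)

def forward_greedy_reduct(X, y, delta=0.2, eps=1e-6):
    """Greedily add the attribute that most increases neighborhood dependency."""
    remaining, reduct, best = list(range(X.shape[1])), [], 0.0
    while remaining:
        gains = [(neighborhood_dependency(X, y, reduct + [a], delta), a) for a in remaining]
        gain, a = max(gains)
        if gain - best <= eps:        # no significant improvement: stop
            break
        reduct.append(a)
        remaining.remove(a)
        best = gain
    return reduct, best

# Toy numeric data: attribute 0 determines the label, attribute 1 is noise.
rng = np.random.default_rng(1)
X = rng.random((60, 2))
y = (X[:, 0] > 0.5).astype(int)
print(forward_greedy_reduct(X, y))    # attribute 0 should be selected first
```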
434 Metrology-Inspired Methods to Assess the Biases of Artificial Intelligence Systems
Authors: Belkacem Laimouche
Abstract:
With the field of Artificial Intelligence (AI) experiencing exponential growth, fueled by technological advancements that pave the way for increasingly innovative and promising applications, there is an escalating need to develop rigorous methods for assessing their performance in pursuit of transparency and equity. This article proposes a metrology-inspired statistical framework for evaluating bias and explainability in AI systems. Drawing from the principles of metrology, we propose a pioneering approach, using a concrete example, to evaluate the accuracy and precision of AI models, as well as to quantify the sources of measurement uncertainty that can lead to bias in their predictions. Furthermore, we explore a statistical approach for evaluating the explainability of AI systems based on their ability to provide interpretable and transparent explanations of their predictions.
Keywords: Artificial intelligence, metrology, measurement uncertainty, prediction error, bias, machine learning algorithms, probabilistic models, inter-laboratory comparison, data analysis, data reliability, bias impact assessment, bias measurement.
433 Solving Single Machine Total Weighted Tardiness Problem Using Gaussian Process Regression
Authors: Wanatchapong Kongkaew
Abstract:
This paper proposes an application of a probabilistic technique, namely Gaussian process regression, for estimating an optimal sequence for the single machine total weighted tardiness (SMTWT) scheduling problem. In this work, the Gaussian process regression (GPR) model is utilized to predict an optimal sequence for the SMTWT problem, and its solution is improved by using an iterated local search based on a simulated annealing scheme, called the GPRISA algorithm. The results show that the proposed GPRISA method achieves very good performance and a reasonable trade-off between solution quality and time consumption. Moreover, in terms of deviation from the best-known solution, the proposed mechanism noticeably outperforms recent existing approaches.
Keywords: Gaussian process regression, iterated local search, simulated annealing, single machine total weighted tardiness.
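This is not the GPRISA algorithm itself, but a minimal, hedged Python sketch of the SMTWT objective and a simulated-annealing-style swap search, included only to illustrate the kind of local improvement step the abstract describes; the job data, the EDD starting sequence, the cooling schedule and the move type are assumptions.

```python
import math
import random

def total_weighted_tardiness(seq, proc, due, weight):
    """SMTWT objective: sum over jobs of w_j * max(0, C_j - d_j)."""
    t, twt = 0, 0
    for j in seq:
        t += proc[j]
        twt += weight[j] * max(0, t - due[j])
    return twt

def sa_swap_search(proc, due, weight, T0=100.0, cooling=0.995, iters=20000, seed=0):
    """Simulated-annealing improvement of a starting sequence via random pairwise swaps."""
    rng = random.Random(seed)
    n = len(proc)
    seq = sorted(range(n), key=lambda j: due[j])     # earliest-due-date start
    best = cur = total_weighted_tardiness(seq, proc, due, weight)
    best_seq, T = seq[:], T0
    for _ in range(iters):
        i, k = rng.sample(range(n), 2)
        seq[i], seq[k] = seq[k], seq[i]
        cand = total_weighted_tardiness(seq, proc, due, weight)
        if cand <= cur or rng.random() < math.exp((cur - cand) / T):
            cur = cand
            if cur < best:
                best, best_seq = cur, seq[:]
        else:
            seq[i], seq[k] = seq[k], seq[i]          # undo the rejected swap
        T *= cooling
    return best_seq, best

proc   = [10, 10, 13, 4, 9, 4, 8, 15, 7, 1]
due    = [50, 38, 49, 12, 20, 105, 73, 45, 6, 64]
weight = [10, 5, 1, 5, 10, 1, 5, 10, 5, 1]
print(sa_swap_search(proc, due, weight))
```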
432 Bin Bloom Filter Using Heuristic Optimization Techniques for Spam Detection
Authors: N. Arulanand, K. Premalatha
Abstract:
A Bloom filter is a probabilistic and memory-efficient data structure designed to answer rapidly whether an element is present in a set. It can report that an element is definitely not in the set, while a positive answer only indicates membership with a certain probability. The trade-off in using a Bloom filter is a certain configurable risk of false positives. The odds of a false positive can be made very low if the number of hash functions is sufficiently large. For spam detection, a weight is attached to each set of elements. The spam weight for a word is a measure used to rate the e-mail. Each word is assigned to a Bloom filter based on its weight. The proposed work introduces an enhanced concept in Bloom filters called the Bin Bloom Filter (BBF). The performance of the BBF over the conventional Bloom filter is evaluated under various optimization techniques. A real-time data set and synthetic data sets are used for experimental analysis, and the results are demonstrated for bin sizes 4, 5, 6 and 7. Analysis of the results shows that the BBF using heuristic techniques performs better than the traditional Bloom filter in spam detection.
Keywords: Cuckoo search algorithm, levy’s flight, metaheuristic, optimal weight.
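For background, here is a minimal, hedged Python sketch of a plain Bloom filter (not the weighted Bin Bloom Filter proposed in the paper); the bit-array size, the double-hashing scheme and the word list are illustrative assumptions.

```python
import hashlib

class BloomFilter:
    """Plain Bloom filter: k hash positions per element over an m-bit array.
    Membership queries can return false positives but never false negatives."""

    def __init__(self, m_bits=1024, k_hashes=5):
        self.m, self.k = m_bits, k_hashes
        self.bits = bytearray(m_bits)          # one byte per bit, for clarity

    def _positions(self, item):
        # Double hashing: derive k indices from two independent digests.
        h1 = int.from_bytes(hashlib.md5(item.encode()).digest(), "big")
        h2 = int.from_bytes(hashlib.sha1(item.encode()).digest(), "big")
        return [(h1 + i * h2) % self.m for i in range(self.k)]

    def add(self, item):
        for p in self._positions(item):
            self.bits[p] = 1

    def __contains__(self, item):
        return all(self.bits[p] for p in self._positions(item))

spam_words = ["lottery", "winner", "free", "prize", "urgent"]
bf = BloomFilter()
for w in spam_words:
    bf.add(w)
print("free" in bf, "meeting" in bf)   # True, almost certainly False
```

For reference, the usual false-positive estimate for such a filter with n inserted elements is p ≈ (1 − e^(−kn/m))^k, which is what motivates tuning m and k.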
431 Cloud Computing Support for Diagnosing Researches
Authors: A. Amirov, O. Gerget, V. Kochegurov
Abstract:
One of the main biomedical problems lies in detecting dependencies in semi-structured data. The solution includes a biomedical portal and algorithms (integral rating health criteria, multidimensional data visualization methods). The biomedical portal allows diagnostic and research data to be processed in parallel using Microsoft System Center 2012 and Windows HPC Server cloud technologies. The service does not expose internal calculations to the user; instead, it provides a practical interface. When data are sent for processing, the user may track the status of the task and will receive the results as soon as the computation is completed. The service includes its own algorithms and allows the diagnosis and prediction of medical cases. The approved methods are based on complex system entropy methods, algorithms for determining the energy patterns of development and trajectory models of biological systems, and a logical-probabilistic approach with the blurring of images.
Keywords: Biomedical portal, cloud computing, diagnostic and prognostic research, mathematical data analysis.
430 Differential Protection for Power Transformer Using Wavelet Transform and PNN
Authors: S. Sendilkumar, B. L. Mathur, Joseph Henry
Abstract:
A new approach for the protection of power transformers is presented using a time-frequency transform known as the wavelet transform. Different operating conditions such as inrush, normal load, external fault and internal fault currents are sampled and processed to obtain wavelet coefficients. Different operating conditions produce variations in the wavelet coefficients. Features like energy and standard deviation are calculated using Parseval's theorem. These features are used as inputs to a PNN (probabilistic neural network) for fault classification. The proposed algorithm provides more accurate results even in the presence of noisy inputs and accurately identifies inrush and fault currents. The overall classification accuracy of the proposed method is found to be 96.45%. Simulation of the faults (with and without noise) was done using MATLAB and Simulink software, taking a data window of 2 cycles (40 ms) containing 800 samples. The algorithm was evaluated by using 10% Gaussian white noise.
Keywords: Power Transformer, differential Protection, internal fault, inrush current, Wavelet Energy, Db9.
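As a rough illustration of the feature/classifier pipeline described above, the following hedged Python sketch computes wavelet-band energy and standard deviation features (PyWavelets, db9 as in the keywords) and classifies them with a tiny Parzen-window PNN; the decomposition level, the synthetic current windows, and the smoothing parameter are assumptions, not the paper's settings.

```python
import numpy as np
import pywt   # PyWavelets

def wavelet_features(signal, wavelet="db9", level=4):
    """Energy and standard deviation of each detail band (Parseval-style energy)."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    feats = []
    for c in coeffs[1:]:                       # detail coefficients d1..dL
        feats += [float(np.sum(c ** 2)), float(np.std(c))]
    return np.array(feats)

def pnn_classify(x, train_X, train_y, sigma=0.5):
    """Probabilistic neural network = Parzen-window class-conditional densities."""
    scores = {}
    for label in np.unique(train_y):
        d = train_X[train_y == label] - x
        scores[label] = np.mean(np.exp(-np.sum(d ** 2, axis=1) / (2 * sigma ** 2)))
    return max(scores, key=scores.get)

# Illustrative 50 Hz current windows (800 samples over 2 cycles = 40 ms).
t = np.linspace(0, 0.04, 800, endpoint=False)
normal = np.sin(2 * np.pi * 50 * t)
fault = 3.0 * np.sin(2 * np.pi * 50 * t) + 0.4 * np.random.default_rng(0).standard_normal(800)
X = np.vstack([wavelet_features(normal), wavelet_features(fault)])
y = np.array(["normal", "fault"])
print(pnn_classify(wavelet_features(fault + 0.1), X, y))   # expected: "fault"
```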
429 Seismic Fragility of Weir Structure Considering Aging Degradation of Concrete Material
Authors: HoYoung Son, DongHoon Shin, WooYoung Jung
Abstract:
This study presents a seismic fragility framework for a concrete weir structure subjected to strong seismic ground motions; in particular, the concrete aging condition of the weir structure was taken into account. In order to understand the influence of concrete aging on the weir structure, the analytical seismic fragility of the structure was derived for pre- and post-deterioration of the concrete using probabilistic risk assessment. The condition of the weir structure after five years was assumed to represent concrete aging or deterioration, and for this condition the elastic modulus was simply reduced by about one-tenth compared with the initial condition of the weir structure. A 2D nonlinear finite element analysis considering the deterioration of concrete in the weir structure was performed using ABAQUS, a commercial structural analysis program. The seismic fragility analysis showed that the simplified concrete degradation resulted in an increase of almost 45% in the probability of failure at Limit State 3, in comparison with the initial construction stage.
Keywords: Weir, FEM, concrete, fragility, aging
428 Monotonicity of Dependence Concepts from Independent Random Vector into Dependent Random Vector
Authors: Guangpu Chen
Abstract:
When the failure function is monotone, some monotonic reliability methods can be used to greatly simplify and facilitate the reliability computations. However, these methods often work in a transformed iso-probabilistic space. To this end, a monotonic simulator or transformation is needed so that the transformed failure function is still monotone. This note first proves that the output distribution of the failure function is invariant under the transformation. It then presents some conditions under which the transformed function is still monotone in the newly obtained space. These conditions concern the copulas and the dependence concepts. In many engineering applications, Gaussian copulas are often used to approximate the real-world copulas when the available information on the random variables is limited to the set of marginal distributions and the covariances. This note therefore focuses on the conditional monotonicity of the commonly used transformation from an independent random vector into a dependent random vector with Gaussian copulas.
Keywords: Monotonic, Rosenblatt, Nataf transformation, dependence concepts, completely positive matrices, Gaussian copulas
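A minimal, hedged Python sketch of the independent-to-dependent mapping discussed above: independent standard normals are correlated through a Cholesky factor and pushed through target marginals, which is the Gaussian-copula construction. The marginals, the correlation matrix, and the shortcut of applying the correlation directly in Gaussian space (skipping the Nataf correlation correction) are simplifying assumptions.

```python
import numpy as np
from scipy.stats import norm, lognorm, expon

def gaussian_copula_transform(u, corr, marginal_ppfs):
    """Map independent standard normals u (n x d) to dependent variables with
    the given marginals and a Gaussian copula defined by 'corr'."""
    L = np.linalg.cholesky(corr)          # correlate in standard normal space
    z = u @ L.T
    p = norm.cdf(z)                       # uniform scores carrying the Gaussian dependence
    return np.column_stack([ppf(p[:, i]) for i, ppf in enumerate(marginal_ppfs)])

rng = np.random.default_rng(0)
u = rng.standard_normal((5000, 2))
corr = np.array([[1.0, 0.7],
                 [0.7, 1.0]])
marginals = [lambda q: lognorm.ppf(q, s=0.25, scale=10.0),   # e.g. a resistance
             lambda q: expon.ppf(q, scale=4.0)]              # e.g. a load effect
x = gaussian_copula_transform(u, corr, marginals)
print(np.corrcoef(x, rowvar=False))       # correlation induced through the copula
```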
427 Use of Bayesian Network in Information Extraction from Unstructured Data Sources
Authors: Quratulain N. Rajput, Sajjad Haider
Abstract:
This paper applies Bayesian networks to support information extraction from unstructured, ungrammatical, and incoherent data sources for semantic annotation. A tool has been developed that combines ontologies, machine learning, information extraction and probabilistic reasoning techniques to support the extraction process. Data acquisition is performed with the aid of knowledge specified in the form of an ontology. Due to the variable amount of information available on different data sources, it is often the case that the extracted data contain missing values for certain variables of interest. It is desirable in such situations to predict the missing values. The methodology presented in this paper first learns a Bayesian network from the training data and then uses it to predict missing data and to resolve conflicts. Experiments have been conducted to analyze the performance of the presented methodology. The results look promising, as the methodology achieves a high degree of precision and recall for information extraction and reasonably good accuracy for predicting missing values.
Keywords: Information Extraction, Bayesian Network, ontology, Machine Learning
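To illustrate only the missing-value step, here is a minimal, hedged Python sketch of a toy two-node Bayesian network with hand-specified conditional probability tables, queried by enumeration to predict an unobserved attribute; the attributes, the table values, and the network structure are illustrative assumptions, not the learned network from the paper.

```python
# Toy network: Category -> Price_band.  P(Category) and P(Price_band | Category)
# stand in for the tables a Bayesian network would learn from training records.
p_category = {"laptop": 0.4, "phone": 0.6}
p_price_given_cat = {
    "laptop": {"low": 0.1, "mid": 0.3, "high": 0.6},
    "phone":  {"low": 0.5, "mid": 0.4, "high": 0.1},
}

def predict_missing_category(observed_price):
    """Posterior P(Category | Price_band) by enumeration, used to fill the gap."""
    joint = {c: p_category[c] * p_price_given_cat[c][observed_price]
             for c in p_category}
    total = sum(joint.values())
    return {c: v / total for c, v in joint.items()}

# An extracted record with a missing 'category' but an observed price band:
posterior = predict_missing_category("high")
print(posterior)                             # {'laptop': 0.8, 'phone': 0.2}
print(max(posterior, key=posterior.get))     # value used to impute the field
```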
426 Accreditation and Quality Assurance of Nigerian Universities: The Management Imperative
Authors: F. O Anugom
Abstract:
The general functions of the university, amongst other things, include teaching, research and community service. Universities are recognized as the apex of learning, accumulating and imparting knowledge and skills of all kinds to students to enable them to be productive, earn their living and make optimum contributions to national development. This is equivalent to the production of human capital in the form of the high-level manpower needed to administer the educational society, be useful to the society and manage the economy. Quality has become a matter of major importance for university education in Nigeria. Accreditation is the systematic review of educational programs to ensure that acceptable standards of education, scholarship and infrastructure are being maintained. Accreditation ensures that institutions maintain quality. The process is designed to determine whether or not an institution has met or exceeded the published standards for accreditation, and whether it is achieving its mission and stated purposes. Ensuring quality assurance in the accreditation process falls into the hands of university management, which justified the need for this study. This study examined accreditation and quality assurance: the management imperative. Three research questions and three hypotheses guided the study. The design was a correlation survey with a population of 2,893 university administrators, out of which 578 heads of department and deans of faculties were sampled. The instrument for data collection was titled the Programme Accreditation Exercise scale, with high levels of reliability. The research questions were answered with Pearson 'r' statistics. T-test statistics were used to test the hypotheses. It was found, among others, that the quality of an accredited programme depends on the level of funding of universities in Nigeria. It was also indicated that the quality of programme accreditation and the physical facilities of universities in Nigeria are highly related, and it was revealed that programme accreditation is positively related to staffing in Nigerian universities. Based on the findings of the study, the researcher recommends that academic administrators should be included in the team of those who ensure quality programs in the universities. Private sector partnership should be encouraged to fund programs to ensure programme quality in the universities. Independent agencies should be engaged to monitor the activities of accreditation teams to avoid bias.
Keywords: Accreditation, quality assurance, NUC, physical facilities, staffing.
425 Developing of Knowledge-Based System for the Medical Treatment with Herbs
Authors: Rujijan Vichivanives
Abstract:
This research aims to create a knowledge-based system as a database for self-healthcare analysis, the diagnosis of simple illnesses, and the use of Thai herbs instead of modern medicine, based on the principles of Thai traditional medication theory. These were disseminated through website network programs within Suan Sunandha Rajabhat University. The population used in this study was divided into two groups: the first group consisted of four experts in Thai traditional medication and the second group was 300 website users. The methods used for collecting data were paper questionnaires and poll questionnaires on the website. The statistic used for analyzing the data was the average. The results were divided into three parts: the first part was the development of the knowledge-based system and the second part was the applied programs on the website; both parts were fulfilled and achieved according to the set goal. The third part was the evaluation of the study: the experts' viewpoints on the website design were evaluated at a good level of 4.20, and user satisfaction was found to be at a good level, with an average of 4.24. It was found that the young population of those under the age of 16 cared less about their health than older teenagers, working-age adults and those of older age. The research findings should be extended in order to encourage lifestyle modifications for people of all ages based on self-healthcare principles.
Keywords: Developing, Herbs, Knowledge-based system, Medical treatment.
424 Optimal Simultaneous Sizing and Siting of DGs and Smart Meters Considering Voltage Profile Improvement in Active Distribution Networks
Authors: T. Sattarpour, D. Nazarpour
Abstract:
This paper investigates the effect of the simultaneous placement of DGs and smart meters (SMs) on voltage profile improvement in active distribution networks (ADNs). Responsive loads have recently received substantial attention in power system studies, such as those on distributed generations (DGs). The existence of responsive loads in ADNs would have an undeniable effect on the sizing and siting of DGs. For this reason, an optimal framework is proposed for the sizing and siting of DGs and SMs in ADNs. SMs are taken into consideration for the sake of successfully implementing demand response programs (DRPs), such as direct load control (DLC), with end-side consumers. Targeting voltage profile improvement, the optimization procedure is solved by a genetic algorithm (GA) and tested on the IEEE 33-bus distribution test system. Different scenarios with variations in the number of DG units, individual or simultaneous placement of DGs and SMs, and an adaptive power factor (APF) mode for DGs to support reactive power have been established. The obtained results confirm the significant effect of DRPs and the APF mode in determining the optimal size and site of the DGs to be connected to the ADN, resulting in the improvement of the voltage profile as well.
Keywords: Active distribution network (ADN), distributed generations (DGs), smart meters (SMs), demand response programs (DRPs), adaptive power factor (APF).
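For reference, one common way to write the voltage-profile objective that such a GA-based placement minimizes is sketched below; the abstract does not state the paper's exact objective or constraints, so this formulation (sum of squared deviations from the nominal 1 p.u. voltage, subject to voltage and DG capacity limits plus the power-flow equations) is an illustrative assumption.

```latex
\min_{\text{DG/SM sites and sizes}} \; VD = \sum_{i=1}^{N_{\text{bus}}} \left( V_i - V_{\text{ref}} \right)^2 ,
\qquad V_{\text{ref}} = 1\,\text{p.u.},
\quad \text{s.t.} \quad V_{\min} \le V_i \le V_{\max}, \quad 0 \le P_{\text{DG},k} \le P_{\text{DG},k}^{\max}.
```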
423 Classifying of Maize Inbred Lines into Heterotic Groups using Diallel Analysis
Authors: Mozhgan Ziaie Bidhendi, Rajab Choukan, Farokh Darvish, Khodadad Mostafavi, Eslam Majidi
Abstract:
The selection of parents and breeding strategies for successful maize hybrid production is facilitated by heterotic grouping of parental lines and determination of their combining abilities. Fourteen maize inbred lines used in maize breeding programs in Iran were crossed in a diallel mating design. The 91 F1 hybrids and the 14 parental lines were studied over two years at four locations in Iran to investigate the combining ability of the genotypes for grain yield and to determine heterotic patterns among germplasm sources, using both Griffing's method and the biplot approach for diallel analysis. The graphical representation offered by the biplot analysis allowed a rapid and effective overview of the general combining ability (GCA) and specific combining ability (SCA) effects of the inbred lines, their performance in crosses, as well as grouping patterns of similar genotypes. GCA and SCA effects were significant for grain yield (GY). Based on significant positive GCA effects, the lines derived from LSC could be used as parents in crosses to increase GY. The maximum best-parent heterosis values and highest SCA effects resulted from the crosses B73 × MO17 and A679 × MO17 for GY. The best heterotic pattern was LSC × RYD, which would be potentially useful in maize breeding programs to obtain high-yielding hybrids in the same climate of Iran.
Keywords: biplot, diallel, Griffing, Heterotic pattern
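For context, Griffing's combining-ability decomposition referred to above is usually written as below (ignoring reciprocal effects); this is the textbook form of the model, not a formula reproduced from the paper.

```latex
Y_{ij} = \mu + g_i + g_j + s_{ij} + \varepsilon_{ij},
```

where Y_ij is the mean yield of the cross between lines i and j, μ is the overall mean, g_i and g_j are the general combining ability (GCA) effects of the parents, s_ij is the specific combining ability (SCA) effect of the cross, and ε_ij is the residual.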
422 Application of Java-based Pointcuts in Aspect Oriented Programming (AOP) for Data Race Detection
Authors: Sadaf Khalid, Fahim Arif
Abstract:
The wide applicability of concurrent programming practices in developing various software applications leads to different concurrency errors, amongst which data races are the most important. Java provides strong support for concurrent programming through its various concurrency packages. Aspect-oriented programming (AOP) is a modern programming paradigm facilitating the runtime interception of events of interest, and it can be effectively used to handle concurrency problems. AspectJ, an aspect-oriented extension to Java, facilitates the application of AOP concepts to data race detection. Volatile variables are usually considered thread safe, but they can become possible candidates for data races if non-atomic operations are performed concurrently upon them. Various data race detection algorithms have been proposed in the past, but this issue of volatility and atomicity is still unaddressed. The aim of this research is to propose some suggestions for incorporating certain conditions for data race detection in Java programs at volatile fields, by taking into account the support for atomicity in the Java concurrency packages and making use of pointcuts. Two simple test programs demonstrate the results of the research. The results are verified on two different Java Development Kits (JDKs) for the purpose of comparison.
Keywords: Aspect Bench Compiler (abc), Aspect Oriented Programming (AOP), AspectJ, Aspects, Concurrency packages, Concurrent programming, Cross-cutting Concerns, Data race, Eclipse, Java, Java Development Kits (JDKs), Pointcuts
421 A New Approach for Network Reconfiguration Problem in Order to Deviation Bus Voltage Minimization with Regard to Probabilistic Load Model and DGs
Authors: Mahmood Reza Shakarami, Reza Sedaghati
Abstract:
Recently, distributed generation technologies have received much attention for the potential energy savings and reliability assurances that might be achieved as a result of their widespread adoption. Distribution feeder reconfiguration (DFR) is one of the most important control schemes in distribution networks, and it can be affected by DGs. This paper presents a new approach to DFR in distribution networks considering wind turbines. The main objective of the DFR is to minimize the deviation of the bus voltage. Since the DFR is a nonlinear optimization problem, we apply the Adaptive Modified Firefly Optimization (AMFO) approach to solve it. As a result of the conflicting behavior of the single-objective functions, a fuzzy-based clustering technique is employed to reach the set of optimal solutions called Pareto solutions. The approach is tested on the IEEE 32-bus standard test system.
Keywords: Adaptive Modified Firefly Optimization (AMFO), Pareto solutions, feeder reconfiguration, wind turbines, bus voltage.
420 Text Mining of Twitter Data Using a Latent Dirichlet Allocation Topic Model and Sentiment Analysis
Authors: Sidi Yang, Haiyi Zhang
Abstract:
Twitter is a microblogging platform where millions of users daily share their attitudes, views, and opinions. Using a probabilistic Latent Dirichlet Allocation (LDA) topic model to discern the most popular topics in Twitter data is an effective way to analyze a large set of tweets and find a set of topics in a computationally efficient manner. Sentiment analysis provides an effective method to show the emotions and sentiments found in each tweet and an efficient way to summarize the results in a manner that is clearly understood. The primary goal of this paper is to explore text mining and to extract and analyze useful information from unstructured text using two approaches, LDA topic modelling and sentiment analysis, by examining Twitter plain text data in English. These two methods allow people to dig into data more effectively and efficiently. The LDA topic model and sentiment analysis can also be applied to provide insightful views in business and scientific fields.
Keywords: Text mining, Twitter, topic model, sentiment analysis.
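A minimal, hedged Python sketch of the LDA topic-modelling step using scikit-learn follows; the toy tweets, the number of topics, and the vectorizer settings are assumptions, and the paper's own preprocessing and sentiment-analysis pipeline is not shown.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

tweets = [
    "love the new phone battery lasts all day",
    "battery drains fast on this phone so disappointed",
    "great match tonight what a goal",
    "terrible referee ruined the match",
    "phone camera quality is amazing",
    "the team played well second half of the match",
]

# Bag-of-words counts, then a 2-topic LDA fit on the tiny corpus.
vectorizer = CountVectorizer(stop_words="english")
counts = vectorizer.fit_transform(tweets)
lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(counts)

terms = vectorizer.get_feature_names_out()
for k, topic in enumerate(lda.components_):
    top = [terms[i] for i in topic.argsort()[-4:][::-1]]
    print(f"topic {k}: {top}")
print(doc_topics.round(2))     # per-tweet topic proportions
```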
419 The Impact of Brand Loyalty on Product Performance
Authors: Tanzeel bin Abdul Rauf Patker, Saba Mateen
Abstract:
This research investigates the impact of brand loyalty on product performance and the factors that are considered most important for brand reputation. The variables selected for this research are brand quality, brand equity and brand reputation, to explore the impact of these variables on product performance. For this purpose, primary research has been conducted. The questionnaire survey for this research study was administered among the population, mainly at shopping malls. A sample size of 250 respondents has been taken into consideration. Customers from the shopping malls and university students constitute the sample for this research study, with random sampling (non-probabilistic) used as the sampling technique for conducting the research survey. According to the results obtained from the collected data, product performance shares a direct relationship with brand quality, brand equity, and brand reputation. The results also showed that brand quality and brand equity have a significant effect on product performance, whereas brand reputation has an insignificant effect on product performance.
Keywords: Product performance, brand quality, brand equity and brand reputation.
418 The Impact of Innovation Best Practices in Economic Development
Authors: Hanadi Mubarak AL-Mubaraki, Michael Busler
Abstract:
Innovation is the process of making changes, differences, and novelties in products and services, and of adding value and business practices to create economic and social benefit. The purpose of this paper is to identify the strengths and weaknesses of innovation programs in developed and developing countries. We used a mixed-methods approach, a quantitative survey and a qualitative multi-case study, to examine innovation best practices in developed and developing countries. In addition, four case studies of innovation organisations, selected on the basis of best practices and successful implementation in developed and developing countries, are examined. The research findings provide guidance, suggestions, and recommendations for future implementation in developed and developing countries for practitioners such as policy makers, governments, funded organizations, and strategic institutions. In conclusion, innovation programs are vital tools for economic growth and for knowledge and technology transfer, based on several indicators such as creativity, entrepreneurship, role of government, role of university, strategic focus, new products, survival rate, job creation, start-up companies, and number of patents. The authors aim to conduct future research that will include a comparative study of innovation case studies between developed and developing countries for policy implications worldwide. The originality of this study makes a contribution to the current literature on innovation best practices in developed and developing countries.
Keywords: Economic development, entrepreneurship, developed countries, innovation program.
417 Availability Analysis of Milling System in a Rice Milling Plant
Authors: P. C. Tewari, Parveen Kumar
Abstract:
This paper describes the availability analysis of the milling system of a rice milling plant using a probabilistic approach. The subsystems under study are special-purpose machines. The availability analysis of the system is carried out to determine the effect of the failure and repair rates of each subsystem on the overall performance (i.e. steady-state availability) of the system concerned. Further, on the basis of the effect of repair rates on the system availability, maintenance repair priorities have been suggested. The problem is formulated using a Markov birth-death process, assuming exponential distributions for the probable failure and repair rates. The first-order differential equations associated with the transition diagram are developed by using the mnemonic rule. These equations are solved using normalizing conditions and a recursive method to derive the steady-state availability expression of the system. The findings of the paper are presented and discussed with the plant personnel in order to adopt a suitable maintenance policy to increase the productivity of the rice milling plant.
Keywords: Markov process, milling system, availability modeling, rice milling plant.
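As a small illustration of the steady-state step described above, here is a hedged Python sketch for a toy two-subsystem system with exponential failure and repair rates; the rate values and the assumption that only one subsystem is down at a time are illustrative, not the plant's actual transition diagram.

```python
import numpy as np

# Exponential failure (lambda) and repair (mu) rates per hour (illustrative values).
lam_a, mu_a = 0.010, 0.50     # subsystem A (e.g. husker)
lam_b, mu_b = 0.005, 0.25     # subsystem B (e.g. polisher)

# States: 0 = both up, 1 = A down, 2 = B down (single failure at a time assumed).
Q = np.array([
    [-(lam_a + lam_b), lam_a,  lam_b],
    [ mu_a,           -mu_a,   0.0  ],
    [ mu_b,            0.0,   -mu_b ],
])

# Steady-state probabilities: solve pi @ Q = 0 with the normalizing condition sum(pi) = 1.
A = np.vstack([Q.T, np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

availability = pi[0]          # the system is available only when both subsystems are up
print(f"steady-state availability = {availability:.4f}")
print(1.0 / (1.0 + lam_a / mu_a + lam_b / mu_b))   # closed-form check for this 3-state model
```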
416 Unit Selection Algorithm Using Bi-grams Model For Corpus-Based Speech Synthesis
Authors: Mohamed Ali KAMMOUN, Ahmed Ben HAMIDA
Abstract:
In this paper, we present a novel statistical approach to corpus-based speech synthesis. Classically, phonetic information is defined and considered as the acoustic reference to be respected. In this way, many studies have been elaborated for acoustic unit classification. This type of classification allows units to be separated according to their symbolic characteristics. Indeed, target cost and concatenation cost were classically defined for unit selection. In corpus-based speech synthesis systems using large text corpora, the cost functions were limited to a juxtaposition of symbolic criteria, and the acoustic information of the units was not exploited in the definition of the target cost. In this manuscript, we take into consideration the unit phonetic information corresponding to the acoustic information. This is realized by defining a probabilistic linguistic bi-gram model that is used for unit selection. The selected units are extracted from the English TIMIT corpus.
Keywords: Unit selection, Corpus-based Speech Synthesis, Bigram model
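A minimal, hedged Python sketch of a bi-gram model of the kind mentioned above, estimated with add-one smoothing over unit (phone) label sequences; the tiny label sequences and the smoothing choice are assumptions, and the sketch does not reproduce the paper's unit-selection costs.

```python
from collections import Counter

def train_bigram(sequences):
    """Add-one smoothed bi-gram probabilities P(next | current) over unit labels."""
    unigrams, bigrams = Counter(), Counter()
    vocab = set()
    for seq in sequences:
        labels = ["<s>"] + seq + ["</s>"]
        vocab.update(labels)
        for a, b in zip(labels, labels[1:]):
            unigrams[a] += 1
            bigrams[(a, b)] += 1
    V = len(vocab)
    return lambda a, b: (bigrams[(a, b)] + 1) / (unigrams[a] + V)

# Toy phone-label sequences standing in for annotated corpus units.
corpus = [["sil", "hh", "eh", "l", "ow", "sil"],
          ["sil", "hh", "eh", "l", "p", "sil"],
          ["sil", "hh", "aw", "sil"]]
p = train_bigram(corpus)
print(p("hh", "eh"), p("hh", "aw"))   # relative likelihood of candidate successors
```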
415 Efficient Tools for Managing Uncertainties in Design and Operation of Engineering Structures
Authors: J. Menčík
Abstract:
Actual loads, material characteristics and other quantities often differ from the design values. This can cause worse function, a shorter life or failure of a civil engineering structure, a machine, a vehicle or another appliance. The paper shows the main causes of these uncertainties and deviations and presents a systematic approach and efficient tools for their elimination or for the mitigation of their consequences. Emphasis is put on the design stage, which is most important for ensuring reliability. The principles of robust design and important tools are explained, including FMEA, sensitivity analysis and probabilistic simulation methods. The lifetime prediction of long-life objects can be improved by long-term monitoring of the load response and damage accumulation in operation. The condition evaluation of engineering structures, such as bridges, is often based on visual inspection and verbal description. Here, methods based on fuzzy logic can reduce the subjective influences.
Keywords: Design, fuzzy methods, Monte Carlo, reliability, robust design, sensitivity analysis, simulation, uncertainties.
414 Fragility Analysis of Weir Structure Subjected to Flooding Water Damage
Authors: Oh Hyeon Jeon, WooYoung Jung
Abstract:
In this study, seepage analysis was performed for the water level difference between the upstream and downstream sides of a weir structure, for the safety evaluation of the weir structure against flooding. The Monte Carlo simulation method was employed by considering the probability distribution of the adjacent ground parameter, i.e., the permeability coefficient of the weir structure. Moreover, modeling of the weir structure was carried out by using a commercially available finite element program (ABAQUS). Based on this model, the characteristics of water seepage during flooding were determined at each water level, with consideration of the uncertainty of the corresponding permeability coefficient. Subsequently, a fragility function could be constructed based on the responses from the numerical analysis; these fragility function results could be used to determine the weakness of the weir structure subjected to flooding disaster. They can also be used as reference data for comprehensively predicting the probability of failure and the degree of damage of a weir structure.
Keywords: Weir structure, seepage, flood disaster fragility, probabilistic risk assessment, Monte-Carlo Simulation, permeability coefficient.
413 A Bayesian Network Reliability Modeling for FlexRay Systems
Authors: Kuen-Long Leu, Yung-Yuan Chen, Chin-Long Wey, Jwu-E Chen, Chung-Hsien Hsu
Abstract:
The increasing importance of FlexRay systems in the automotive domain continues to inspire related research. One primary issue among these studies is verifying the reliability of FlexRay systems, either from the protocol aspect or from the system design aspect. However, research rarely discusses the effect of network topology on system reliability. In this paper, we illustrate how to model the reliability of FlexRay systems with various network topologies by means of a well-known probabilistic reasoning technology, the Bayesian network. In this illustration, we especially investigate the effectiveness of the error containment built into the star topology and the fault-tolerant midpoint synchronization algorithm adopted in the FlexRay communication protocol. Through a FlexRay steer-by-wire case study, the influence of different topologies on the failure probability of the FlexRay steer-by-wire system is demonstrated. The notable value of this research is to show that Bayesian network inference is a powerful and feasible method for the reliability assessment of FlexRay systems.
Keywords: Bayesian Network, FlexRay, fault tolerance, network topology, reliability.
412 Selection of Intensity Measure in Probabilistic Seismic Risk Assessment of a Turkish Railway Bridge
Authors: M. F. Yilmaz, B. Ö. Çağlayan
Abstract:
The fragility curve is an effective and commonly used tool to determine the earthquake performance of structural and nonstructural components; it is also used to characterize the nonlinear behavior of bridges. There are many historical bridges in the Turkish railway network, and the earthquake performance of these bridges needs to be investigated. To derive a fragility curve, intensity measures (IMs) and engineering demand parameters (EDPs) need to be determined, and the relation between them needs to be derived. In this study, a typical simply supported steel girder riveted railway bridge is studied. Fragility curves of this bridge are derived using a two-parameter lognormal distribution. Time history analyses are carried out for 60 selected real earthquake records to determine the relation between IMs and EDPs. Moreover, the efficiency, practicality, and sufficiency of three different IMs are discussed. PGA, Sa(0.2s) and Sa(1s), the most commonly used IMs for fragility curves in the literature, are taken into consideration in terms of efficiency, practicality and sufficiency.
Keywords: Railway bridges, earthquake performance, fragility analyses, selection of intensity measures.
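A minimal, hedged Python sketch of the two-parameter lognormal fragility form mentioned above, P(damage ≥ DS | IM) = Φ(ln(IM/θ)/β); the median θ, the dispersion β, and the PGA values are illustrative assumptions, not results from the bridge study.

```python
import numpy as np
from scipy.stats import norm

def lognormal_fragility(im, theta, beta):
    """Two-parameter lognormal fragility: P(damage >= DS | IM = im).
    theta = median IM capacity, beta = lognormal dispersion."""
    return norm.cdf(np.log(np.asarray(im) / theta) / beta)

pga = np.array([0.1, 0.2, 0.4, 0.6, 0.8])        # intensity measure values in g
print(lognormal_fragility(pga, theta=0.45, beta=0.6).round(3))
# the probability of exceeding the damage state rises towards 1 as PGA grows
```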
411 Seismic Fragility Curves for Shallow Circular Tunnels under Different Soil Conditions
Authors: Siti Khadijah Che Osmi, Syed Mohd Ahmad
Abstract:
This paper presents a methodology to develop fragility curves for shallow tunnels so as to describe the relationship between seismic hazard and tunnel vulnerability. Emphasis is given to the influence of the surrounding soil material properties, because the dynamic behaviour of the tunnel mostly depends on them. Four sets of ground properties, ranging from stiff to soft soils, are selected. A 3D nonlinear time history analysis is used to evaluate the seismic response of the tunnel when subjected to five real earthquake ground motions of different intensities. The derived curves show the future probabilistic performance of the tunnels based on the predicted level of damage states corresponding to the peak ground acceleration. A comparison of the obtained results with the previous literature is provided to validate the reliability of the proposed fragility curves. The results show the significant role of soil properties and input motions in evaluating the seismic performance and response of shallow tunnels.
Keywords: Fragility analysis, seismic performance, tunnel lining, vulnerability.
410 Multinomial Dirichlet Gaussian Process Model for Classification of Multidimensional Data
Authors: Wanhyun Cho, Soonja Kang, Sangkyoon Kim, Soonyoung Park
Abstract:
We present a probabilistic multinomial Dirichlet classification model for multidimensional data with Gaussian process priors. Here, we consider an efficient computational method that can be used to obtain the approximate posteriors for the latent variables and parameters needed to define the multiclass Gaussian process classification model. We first investigate the process of inducing a posterior distribution for the various parameters and the latent function by using variational Bayesian approximations and an importance sampling method, and we then derive the predictive distribution of the latent function needed to classify new samples. The proposed model is applied to classify a synthetic multivariate dataset in order to verify its performance. The experimental results show that our model is more accurate than the other approximation methods.
Keywords: Multinomial Dirichlet classification model, Gaussian process priors, variational Bayesian approximation, importance sampling, approximate posterior distribution, marginal likelihood evidence.
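For orientation, the Dirichlet-multinomial building block of such a model has the closed-form posterior predictive shown below; this standard conjugacy result is given only for context and is not a formula taken from the paper, whose latent-function posterior requires the variational and importance-sampling machinery described above.

```latex
p(y_{\text{new}} = k \mid \mathbf{n}, \boldsymbol{\alpha})
  = \frac{n_k + \alpha_k}{\sum_{j=1}^{K} (n_j + \alpha_j)},
```

where n_k is the observed count of class k and α_k is the corresponding Dirichlet prior parameter.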
409 Prediction of Reusability of Object Oriented Software Systems using Clustering Approach
Authors: Anju Shri, Parvinder S. Sandhu, Vikas Gupta, Sanyam Anand
Abstract:
In the literature, there are metrics for identifying the quality of reusable components, but a framework that makes use of these metrics to precisely predict the reusability of software components still needs to be worked out. These reusability metrics, if identified in the design phase or even in the coding phase, can help to reduce rework by improving the quality of reuse of the software component and hence improve productivity due to a probabilistic increase in the reuse level. As the CK metric suite is the most widely used set of metrics for extracting the structural features of object-oriented (OO) software, a tuned CK metric suite, i.e. WMC, DIT, NOC, CBO and LCOM, is used in this study to obtain the structural analysis of OO-based software components. An algorithm has been proposed in which the tuned metric values of an OO software component are given as inputs to a K-means clustering system, and a decision tree is formed under 10-fold cross validation of the data to evaluate the component in terms of a linguistic reusability value. The developed reusability model has produced high-precision results as desired.
Keywords: CK-Metric, Decision Tree, K-means, Reusability.
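A minimal, hedged Python sketch of the clustering step: K-means applied to CK metric vectors (WMC, DIT, NOC, CBO, LCOM) with scikit-learn; the metric values, the number of clusters, and the mapping of clusters to linguistic reusability labels are illustrative assumptions, and the decision-tree/cross-validation stage is not shown.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Each row is one class/component: [WMC, DIT, NOC, CBO, LCOM] (illustrative values).
ck_metrics = np.array([
    [12, 2, 1,  4, 10],
    [45, 5, 0, 18, 60],
    [ 8, 1, 2,  3,  5],
    [30, 4, 1, 12, 40],
    [10, 2, 0,  5,  8],
    [50, 6, 3, 20, 75],
])

X = StandardScaler().fit_transform(ck_metrics)          # put metrics on one scale
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(labels)   # cluster ids; a later step would map each cluster to a
                # linguistic reusability value (e.g. "high" / "low") for the tree
```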