Search results for: stochastic modeling


2614 Remaining Useful Life (RUL) Assessment Using Progressive Bearing Degradation Data and ANN Model

Authors: Amit R. Bhende, G. K. Awari

Abstract:

Remaining useful life (RUL) prediction is one of the key technologies needed to realize the prognostics and health management that is being widely applied in many industrial systems to ensure high system availability over their life cycles. The present work proposes a data-driven method of RUL prediction based on multiple health state assessment for rolling element bearings. Bearing run-to-failure degradation data recorded under three different conditions are used, and a separate RUL prediction model is built for each condition. Feed-forward back-propagation neural network models are developed for the prediction modeling.
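
As a rough illustration of the data-driven approach described above (a sketch, not the authors' exact network), the following snippet trains a feed-forward back-propagation network on synthetic degradation features to predict RUL; the feature names, network size, and data are assumptions.

```python
# Hedged sketch: a feed-forward back-propagation network for RUL prediction.
# The synthetic "degradation features" and network size are illustrative only.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic run-to-failure data: RMS- and kurtosis-like features that drift as the
# bearing degrades; RUL decreases linearly from 100% of life to failure.
n = 500
life_fraction = np.linspace(0.0, 1.0, n)
rms = 0.5 + 2.0 * life_fraction**2 + rng.normal(0, 0.05, n)
kurtosis = 3.0 + 4.0 * life_fraction**3 + rng.normal(0, 0.1, n)
X = np.column_stack([rms, kurtosis])
rul = 1.0 - life_fraction              # remaining useful life as a fraction of total life

# Scale features, then fit a small feed-forward network (back-propagation training).
scaler = StandardScaler().fit(X)
model = MLPRegressor(hidden_layer_sizes=(16, 8), activation="relu",
                     solver="adam", max_iter=2000, random_state=0)
model.fit(scaler.transform(X), rul)

# Predict RUL for a new health-state observation (hypothetical feature values).
x_new = scaler.transform([[1.8, 5.5]])
print("Predicted RUL fraction:", model.predict(x_new)[0])
```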

Keywords: bearing degradation data, remaining useful life (RUL), back propagation, prognosis

Procedia PDF Downloads 434
2613 Parasitic Capacitance Modeling in Pulse Transformer Using FEA

Authors: D. Habibinia, M. R. Feyzi

Abstract:

Nowadays, specialized software is widely used to verify the performance of an electric machine prototype by evaluating a model of the system. These models mainly consist of electrical parameters such as inductances and resistances. However, when the operating frequency of the device is above 1 kHz, the effect of parasitic capacitances grows significantly. In this paper, a software-based procedure is introduced to model these capacitances within the electromagnetic simulation of the device. The case study is a high-frequency, high-voltage pulse transformer. Finite Element Analysis (FEA) software with coupled-field analysis is used in this method.
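
For context (not the paper's FEA workflow itself), a lumped parasitic capacitance is often estimated from the electrostatic field energy returned by a coupled-field solution via C = 2·W_e/V²; the sketch below assumes hypothetical energy and voltage values.

```python
# Hedged sketch: extracting an equivalent parasitic capacitance from the electrostatic
# energy W_e computed by a coupled-field FEA run (numbers are illustrative assumptions).
def equivalent_capacitance(field_energy_joules, applied_voltage_volts):
    """C = 2 * W_e / V^2 for a two-conductor system energized to V."""
    return 2.0 * field_energy_joules / applied_voltage_volts**2

# Example: assumed energy of 5e-4 J stored between winding layers at 10 kV.
w_e = 5e-4        # J, hypothetical FEA result
v = 10e3          # V, applied test voltage
c_parasitic = equivalent_capacitance(w_e, v)
print(f"Equivalent parasitic capacitance: {c_parasitic*1e12:.1f} pF")
```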

Keywords: finite element analysis, parasitic capacitance, pulse transformer, high frequency

Procedia PDF Downloads 513
2612 Evaluation of the Elastic Mechanical Properties of a Hybrid Adhesive Material

Authors: Moudar H. A. Zgoul, Amin Al Zamer

Abstract:

Adhesive materials and adhesion have been the focal point of multiple research works related to numerous applications, particularly in the aerospace and aviation industries. To improve on conventional adhesive materials, additives have been introduced to the mix in order to enhance their mechanical and physical properties, creating a hybrid adhesive material. Evaluating the elastic mechanical properties of such hybrid adhesives is therefore essential for modeling their behavior accurately. This paper presents an approach/tool to simulate the behavior of such hybrid adhesives in a way that will allow researchers to better understand their behavior while in service.

Keywords: adhesive materials, analysis, hybrid adhesives, mechanical properties, simulation

Procedia PDF Downloads 418
2611 Contribution to the Analytical Study of the Stability of a DC-DC Converter (Boost) Used for MPPT Control

Authors: Mohamed Amarouayache, Badia Amrouche, Gharbi Akila, Boukadoume Mohamed

Abstract:

This work is devoted to the modeling of a DC-DC (boost) converter used for MPPT applications in order to establish its stability conditions. To this end, we establish a linear mathematical model of the DC-DC converter using an average small-signal model. This model has allowed us to apply conventional linear control methods. A mathematical relationship between the duty cycle and the panel voltage has been set up. With this relationship, we specify the closed-loop stability conditions as a function of the system parameters (the storage elements, capacitance and inductance, and the PWM control).
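
To illustrate the kind of closed-loop check mentioned above (a generic sketch, not the authors' derivation), the snippet below builds a Routh array for a hypothetical third-order characteristic polynomial of the compensated boost converter and reports whether all first-column entries are positive; the coefficient values are placeholders, not results from the paper.

```python
# Hedged sketch: Routh-Hurwitz stability check for a hypothetical closed-loop
# characteristic polynomial a0*s^3 + a1*s^2 + a2*s + a3 of the boost/MPPT loop.
import numpy as np

def routh_array(coeffs):
    """Return the Routh array for a polynomial given by descending coefficients."""
    n = len(coeffs)
    cols = (n + 1) // 2
    table = np.zeros((n, cols))
    table[0, :len(coeffs[0::2])] = coeffs[0::2]
    table[1, :len(coeffs[1::2])] = coeffs[1::2]
    for i in range(2, n):
        for j in range(cols - 1):
            a, b = table[i - 2, 0], table[i - 1, 0]
            c, d = table[i - 2, j + 1], table[i - 1, j + 1]
            table[i, j] = (b * c - a * d) / b if b != 0 else 0.0
    return table

def is_stable(coeffs):
    """Stable if every entry in the first column of the Routh array is positive."""
    return bool(np.all(routh_array(coeffs)[:, 0] > 0))

# Hypothetical coefficients, e.g. from an average small-signal model with assumed
# L, C and PWM/compensator gains (illustrative only, not the paper's values).
coeffs = [1.0, 850.0, 2.1e6, 9.5e8]
print("Routh first column:", routh_array(coeffs)[:, 0])
print("Closed loop stable:", is_stable(coeffs))
```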

Keywords: MPPT, PWM, stability, criterion of Routh, average small signal model

Procedia PDF Downloads 440
2610 An Efficient Approach to Optimize the Cost and Profit of a Tea Garden by Using Branch and Bound Method

Authors: Abu Hashan Md Mashud, M. Sharif Uddin, Aminur Rahman Khan

Abstract:

In this paper, we formulate a new problem as a linear programming and integer programming problem and maximize profit within a limited budget and limited resources, based on the construction of a tea garden problem. It describes a new idea about how to optimize profit and focuses on the practical aspects of modeling and the challenges of providing a solution to a complex real-life problem. Finally, a comparative study is carried out among the graphical method, the simplex method, and the branch and bound method.
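
As a toy illustration of the formulation (the actual tea-garden data are not given in the abstract, so the numbers below are invented), this sketch solves a small profit-maximization problem first as an LP relaxation and then as an integer program using SciPy's branch-and-bound based MILP solver (assumes scipy >= 1.9).

```python
# Hedged sketch: a made-up tea-garden planning problem, NOT the paper's data.
# Decision variables: x1 = plots of tea variety A, x2 = plots of variety B.
# Maximize profit 40*x1 + 55*x2 subject to budget and labour limits.
import numpy as np
from scipy.optimize import linprog, milp, LinearConstraint, Bounds  # milp needs scipy >= 1.9

profit = np.array([40.0, 55.0])          # profit per plot (hypothetical units)
A = np.array([[300.0, 450.0],            # budget used per plot
              [5.0,   8.0]])             # labour days per plot
limits = np.array([10000.0, 157.0])      # available budget and labour

# LP relaxation (continuous plots) -- linprog minimizes, so negate the profit.
lp = linprog(-profit, A_ub=A, b_ub=limits, bounds=[(0, None), (0, None)], method="highs")
print("LP relaxation:", lp.x, "profit =", -lp.fun)

# Integer program (whole plots only), solved by a branch-and-bound based solver.
ip = milp(c=-profit,
          constraints=LinearConstraint(A, ub=limits),
          integrality=np.ones(2),        # both variables must be integer
          bounds=Bounds(0, np.inf))
print("Integer solution:", ip.x, "profit =", -ip.fun)
```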

Keywords: integer programming, tea garden, graphical method, simplex method, branch and bound method

Procedia PDF Downloads 621
2609 Prosody Generation in Neutral Speech Storytelling Application Using Tilt Model

Authors: Manjare Chandraprabha A., S. D. Shirbahadurkar, Manjare Anil S., Paithne Ajay N.

Abstract:

This paper proposes intonation modeling for prosody generation in neutral speech for Marathi (a language spoken in Maharashtra, India) storytelling applications. Audio storytelling devices are now very popular with children. In this paper, we propose a tilt model for stressed words in Marathi for speech modification. The tilt model predicts the modification in tone of neutral speech, and a GMM is used to identify the stressed words to be modified.

Keywords: tilt model, fundamental frequency, statistical parametric speech synthesis, GMM

Procedia PDF Downloads 390
2608 An Optimal Control Method for Reconstruction of Topography in Dam-Break Flows

Authors: Alia Alghosoun, Nabil El Moçayd, Mohammed Seaid

Abstract:

Modeling dam-break flows over non-flat beds requires an accurate representation of the topography, which is the main source of uncertainty in the model. Therefore, developing robust and accurate techniques for reconstructing topography in this class of problems would reduce the uncertainty in the flow system. In many hydraulic applications, experimental techniques have been widely used to measure the bed topography. In practice, experimental work in hydraulics may be very demanding in both time and cost. Meanwhile, computational hydraulics has served as an alternative to laboratory and field experiments. Unlike the forward problem, the inverse problem is used to identify the bed parameters from the given experimental data. In this case, the shallow water equations used for modeling the hydraulics need to be rearranged in a way that the model parameters can be evaluated from measured data. However, this approach is not always possible, and it suffers from stability restrictions. In the present work, we propose an adaptive optimal control technique to numerically identify the underlying bed topography from a given set of free-surface observation data. In this approach, a minimization function is defined to iteratively determine the model parameters. The proposed technique can be interpreted as a fractional-stage scheme. In the first stage, the forward problem is solved to determine the measurable parameters from known data. In the second stage, an adaptive-control Ensemble Kalman Filter is implemented to assimilate the observation data in order to obtain an accurate estimate of the topography. The main features of this method are, on the one hand, its ability to handle different complex geometries with no need for any rearrangement of the original model to rewrite it in an explicit form and, on the other hand, its strong stability for simulations of flows in different regimes containing shocks or discontinuities over any geometry. Numerical results are presented for a dam-break flow problem over a non-flat bed using different solvers for the shallow water equations. The robustness of the proposed method is investigated using different numbers of loops, sensitivity parameters, initial samples and locations of observations. The obtained results demonstrate the high reliability and accuracy of the proposed techniques.
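
A minimal sketch of the ensemble Kalman analysis step that the second stage relies on (generic perturbed-observation form, not the authors' adaptive scheme): the "state" vector holds bed-elevation parameters and the observations are free-surface levels; all dimensions, the observation operator and noise levels are assumptions.

```python
# Hedged sketch: one generic Ensemble Kalman Filter (EnKF) analysis step for
# estimating bed-topography parameters from free-surface observations.
# The forward operator H and all dimensions/noise levels are illustrative only.
import numpy as np

rng = np.random.default_rng(1)

n_param, n_obs, n_ens = 20, 8, 50                     # bed parameters, observations, ensemble size
ensemble = rng.normal(0.0, 1.0, (n_param, n_ens))     # prior ensemble of bed parameters

# Hypothetical linear observation operator mapping bed parameters to free-surface data.
H = rng.normal(0.0, 0.3, (n_obs, n_param))
obs = rng.normal(0.0, 1.0, n_obs)                     # measured free-surface levels (synthetic)
R = 0.1**2 * np.eye(n_obs)                            # observation error covariance

def enkf_update(ens, H, obs, R, rng):
    """Perturbed-observation EnKF analysis step."""
    n_obs, n_ens = H.shape[0], ens.shape[1]
    anomalies = ens - ens.mean(axis=1, keepdims=True)
    P = anomalies @ anomalies.T / (n_ens - 1)          # sample covariance of the parameters
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)       # Kalman gain
    perturbed = obs[:, None] + rng.normal(0, np.sqrt(R[0, 0]), (n_obs, n_ens))
    return ens + K @ (perturbed - H @ ens)             # analysis ensemble

analysis = enkf_update(ensemble, H, obs, R, rng)
print("Prior spread    :", ensemble.std(axis=1).mean())
print("Posterior spread:", analysis.std(axis=1).mean())
```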

Keywords: erodible beds, finite element method, finite volume method, nonlinear elasticity, shallow water equations, stresses in soil

Procedia PDF Downloads 128
2607 Requirements Gathering for Improved Software Usability and the Potential for Usage-Centred Design

Authors: Kholod J. Alotaibi, Andrew M. Gravell

Abstract:

Usability is an important software quality that is often neglected at the design stage. Although methods exist to incorporate elements of usability engineering, there is a need for more balanced, usability-focused methods that can enhance the experience of software usability for users. In this regard, the potential of Usage-Centered Design is explored with respect to requirements gathering and is shown to lead to high software usability, among other benefits. It achieves this through its focus on usage: defining essential use cases, conducting task modeling, encouraging user collaboration, refining requirements, and so on. The requirements gathering process in Usage-Centered Design (UgCD) is described in detail.

Keywords: requirements gathering, usability, usage-centred design, computer science

Procedia PDF Downloads 356
2606 Bayesian Parameter Inference for Continuous Time Markov Chains with Intractable Likelihood

Authors: Randa Alharbi, Vladislav Vyshemirsky

Abstract:

Systems biology is an important field of science which focuses on studying the behaviour of biological systems. Modelling is required to produce a detailed description of the elements of a biological system, their function, and their interactions. A well-designed model requires selecting a suitable mechanism which can capture the main features of the system, defining the essential components of the system, and representing an appropriate law that can define the interactions between its components. Complex biological systems exhibit stochastic behaviour. Thus, probabilistic models are suitable for describing and analysing biological systems. The Continuous-Time Markov Chain (CTMC) is one such probabilistic model; it describes the system as a set of discrete states with continuous-time transitions between them. The system is then characterised by a set of probability distributions that describe the transition from one state to another at a given time. The evolution of these probabilities through time can be obtained from the chemical master equation, which is analytically intractable but can be simulated. Uncertain parameters of such a model can be inferred using methods of Bayesian inference. Yet, inference in such a complex system is challenging, as it requires the evaluation of the likelihood, which is intractable in most cases. There are different statistical methods that allow simulating from the model despite the intractability of the likelihood. Approximate Bayesian computation is a common approach for tackling inference which relies on simulation of the model to approximate the intractable likelihood. Particle Markov chain Monte Carlo (PMCMC) is another approach, which is based on using sequential Monte Carlo to estimate the intractable likelihood. However, both methods are computationally expensive. In this paper we discuss the efficiency and possible practical issues for each method, taking into account their computational time. We demonstrate likelihood-free inference by analysing a model of the Repressilator using both methods. A detailed investigation is performed to quantify the difference between these methods in terms of efficiency and computational cost.
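
As a minimal illustration of the likelihood-free idea (a much simpler system than the Repressilator, and plain rejection ABC rather than the schemes compared in the paper), the sketch below infers the rates of a birth-death CTMC: simulate with the Gillespie algorithm and keep parameters whose summary statistic is close to the observed one. Priors, tolerance and summary statistic are assumptions.

```python
# Hedged sketch: rejection ABC for a simple birth-death CTMC (not the Repressilator),
# using Gillespie simulation in place of the intractable likelihood.
import numpy as np

rng = np.random.default_rng(2)

def gillespie_birth_death(birth, death, x0=10, t_end=10.0, rng=rng):
    """Simulate a birth-death chain and return the final population."""
    t, x = 0.0, x0
    while t < t_end:
        rates = np.array([birth, death * x])
        total = rates.sum()
        if total == 0:
            break
        t += rng.exponential(1.0 / total)              # time to next reaction
        if rng.random() < rates[0] / total:
            x += 1                                     # birth event
        else:
            x -= 1                                     # death event
    return x

# "Observed" data generated with known rates (birth=2.0, death=0.1).
observed = np.array([gillespie_birth_death(2.0, 0.1) for _ in range(20)])
obs_summary = observed.mean()

# Rejection ABC: sample rates from a uniform prior, keep those whose simulated
# summary statistic lies within a tolerance of the observed one.
accepted = []
for _ in range(1000):
    birth = rng.uniform(0.5, 5.0)
    death = rng.uniform(0.01, 0.5)
    sim = np.array([gillespie_birth_death(birth, death) for _ in range(20)])
    if abs(sim.mean() - obs_summary) < 2.0:            # tolerance epsilon
        accepted.append((birth, death))

accepted = np.array(accepted)
print("Accepted samples:", len(accepted))
print("Posterior mean (birth, death):", accepted.mean(axis=0))
```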

Keywords: approximate Bayesian computation (ABC), continuous-time Markov chains, sequential Monte Carlo, particle Markov chain Monte Carlo (PMCMC)

Procedia PDF Downloads 201
2605 Relevancy Measures of Errors in Displacements of Finite Elements Analysis Results

Authors: A. B. Bolkhir, A. Elshafie, T. K. Yousif

Abstract:

This paper highlights methods of error estimation for finite element analysis (FEA) results. It indicates that the discretization error can be reduced by performing finite element analysis with successively finer meshes or by extrapolating response predictions from an orderly sequence of relatively low-degree-of-freedom analysis results. In addition, the round-off error can be reduced by running the code at higher precision. The paper provides applications to finite element analysis results and draws conclusions based on the results of applying these methods of error estimation.
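
A worked sketch of the extrapolation idea mentioned above: given responses from two meshes of size h and 2h and a known (or estimated) convergence order p, Richardson extrapolation estimates the mesh-independent value. The displacement numbers below are illustrative, not results from the paper.

```python
# Hedged sketch: Richardson extrapolation of FEA displacement results from two meshes.
# f_exact ~ f_h + (f_h - f_2h) / (r**p - 1), with refinement ratio r and order p.
def richardson_extrapolate(f_coarse, f_fine, refinement_ratio=2.0, order=2.0):
    return f_fine + (f_fine - f_coarse) / (refinement_ratio**order - 1.0)

# Hypothetical tip displacements (mm) from a coarse mesh (size 2h) and a fine mesh (size h).
f_2h = 1.180
f_h = 1.215
estimate = richardson_extrapolate(f_2h, f_h)
discretization_error = estimate - f_h
print(f"Extrapolated displacement: {estimate:.4f} mm")
print(f"Estimated discretization error on the fine mesh: {discretization_error:.4f} mm")
```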

Keywords: finite element analysis (FEA), discretization error, round-off error, mesh refinement, Richardson extrapolation, monotonic convergence

Procedia PDF Downloads 494
2604 Static Modeling of the Delamination of a Composite Material Laminate in Mode II

Authors: Y. Madani, H. Achache, B. Boutabout

Abstract:

The purpose of this paper is to analyze numerically, by the three-dimensional finite element method using the ABAQUS code, the mechanical behavior of a unidirectional and multidirectional delaminated stratified composite under mechanical loading in mode II. This study consists of the determination of the energy release rate G in mode II as well as the distribution of equivalent von Mises stresses along the damaged zone, by varying several parameters such as the applied load and the delamination length. It allows us to deduce that a high energy release rate favors delamination at the free edges of a stratified plate subjected to bending.

Keywords: delamination, energy release rate, finite element method, stratified composite

Procedia PDF Downloads 175
2603 Modeling the Acquisition of Expertise in a Sequential Decision-Making Task

Authors: Cristóbal Moënne-Loccoz, Rodrigo C. Vergara, Vladimir López, Domingo Mery, Diego Cosmelli

Abstract:

Our daily interaction with computational interfaces is full of situations in which we go from inexperienced users to experts through self-motivated exploration of the same task. In many of these interactions, we must learn to find our way through a sequence of decisions and actions before obtaining the desired result. For instance, when drawing cash from an ATM, choices are presented in a step-by-step fashion so that a specific sequence of actions must be performed in order to produce the expected outcome. But, as they become experts in the use of such interfaces, do users adopt specific search and learning strategies? Moreover, if so, can we use this information to follow the process of expertise development and, eventually, predict future actions? This would be a critical step towards building truly adaptive interfaces that can facilitate interaction at different moments of the learning curve. Furthermore, it could provide a window into potential mechanisms underlying decision-making behavior in real-world scenarios. Here we tackle this question using a simple game interface that instantiates a 4-level binary decision tree (BDT) sequential decision-making task. Participants have to explore the interface and discover an underlying concept-icon mapping in order to complete the game. We develop a Hidden Markov Model (HMM)-based approach whereby a set of stereotyped, hierarchically related search behaviors act as hidden states. Using this model, we are able to track the decision-making process as participants explore, learn and develop expertise in the use of the interface. Our results show that partitioning the problem space into such stereotyped strategies is sufficient to capture a host of exploratory and learning behaviors. Moreover, using the modular architecture of stereotyped strategies as a Mixture of Experts, we are able to simultaneously ask the experts about the user's most probable future actions. We show that for those participants who learn the task, it becomes possible to predict their next decision, above chance, approximately halfway through the game. Our long-term goal is, on the basis of a better understanding of real-world decision-making processes, to inform the construction of interfaces that can establish dynamic conversations with their users in order to facilitate the development of expertise.
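
To make the modeling idea concrete (generic HMM machinery, not the authors' fitted model), the sketch below runs the forward algorithm for a small HMM whose hidden states stand for stereotyped search strategies and whose observations are user actions, then predicts the most probable next action; all probabilities are invented for illustration.

```python
# Hedged sketch: forward algorithm for a small HMM whose hidden states represent
# stereotyped search strategies and whose observations are discrete user actions.
# All probabilities below are illustrative, not fitted values from the study.
import numpy as np

# Hidden states: 0 = "exploratory", 1 = "goal-directed".
start = np.array([0.7, 0.3])
trans = np.array([[0.8, 0.2],          # exploration tends to persist...
                  [0.1, 0.9]])         # ...and expertise, once reached, is sticky
# Observations: 0 = wrong branch chosen, 1 = correct branch chosen.
emit = np.array([[0.6, 0.4],           # explorers often pick wrong branches
                 [0.1, 0.9]])          # experts mostly pick correct ones

def forward(observations):
    """Return the filtered state distribution after the observed action sequence."""
    belief = start * emit[:, observations[0]]
    belief /= belief.sum()
    for obs in observations[1:]:
        belief = (trans.T @ belief) * emit[:, obs]
        belief /= belief.sum()
    return belief

def predict_next_action(observations):
    """P(next observation) = sum over next states of transition * emission."""
    belief = forward(observations)
    next_state = trans.T @ belief
    return next_state @ emit

history = [0, 0, 1, 1, 1]              # user starts failing, then keeps succeeding
print("P(strategy | history):", forward(history))
print("P(next action = correct):", predict_next_action(history)[1])
```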

Keywords: behavioral modeling, expertise acquisition, hidden Markov models, sequential decision-making

Procedia PDF Downloads 251
2602 Consumer Load Profile Determination with Entropy-Based K-Means Algorithm

Authors: Ioannis P. Panapakidis, Marios N. Moschakis

Abstract:

With the continuous increase in smart meter installations across the globe, the need for processing the load data is evident. Clustering-based load profiling is built upon the use of unsupervised machine learning tools for the purpose of formulating the typical load curves or load profiles. The most commonly used algorithm in the load profiling literature is K-means. While the algorithm has been successfully tested in a variety of applications, its drawback is its strong dependence on the initialization phase. This paper proposes a novel modified form of K-means that addresses the aforementioned problem. Simulation results indicate the superiority of the proposed algorithm compared to the standard K-means.
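
The abstract does not spell out the exact modification, so the sketch below only illustrates the general idea of replacing random K-means initialization with an information-driven one on synthetic load curves: candidate seeds are scored by the Shannon entropy of their daily load shape and spread out greedily. Treat the data and the scoring rule as assumptions, not the authors' algorithm.

```python
# Hedged sketch: K-means load profiling with an entropy-guided initialization.
# The exact entropy criterion used in the paper is not given in the abstract;
# this scoring rule is an illustrative assumption.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)

# Synthetic daily load curves (consumers x 24 hourly readings), normalized per day.
n_consumers, n_clusters = 300, 4
hours = np.arange(24)
base_shapes = [np.exp(-((hours - peak) ** 2) / 18.0) for peak in (8, 13, 19, 23)]
loads = np.vstack([base_shapes[i % 4] + rng.normal(0, 0.05, 24) for i in range(n_consumers)])
loads = np.clip(loads, 1e-6, None)
profiles = loads / loads.sum(axis=1, keepdims=True)

# Shannon entropy of each normalized profile (flatter profiles -> higher entropy).
entropy = -(profiles * np.log(profiles)).sum(axis=1)

# Pick initial centroids greedily: start from the highest-entropy profile, then
# repeatedly add the candidate farthest from the seeds chosen so far.
seeds = [int(np.argmax(entropy))]
while len(seeds) < n_clusters:
    dists = np.min(
        np.linalg.norm(profiles[:, None, :] - profiles[seeds][None, :, :], axis=2), axis=1)
    seeds.append(int(np.argmax(dists)))

km = KMeans(n_clusters=n_clusters, init=profiles[seeds], n_init=1, random_state=0)
labels = km.fit_predict(profiles)
print("Cluster sizes:", np.bincount(labels))
```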

Keywords: clustering, load profiling, load modeling, machine learning, energy efficiency and quality

Procedia PDF Downloads 163
2601 A Compressor Map Optimizing Tool for Prediction of Compressor Off-Design Performance

Authors: Zhongzhi Hu, Jie Shen, Jiqiang Wang

Abstract:

A high-precision aeroengine model is needed when developing the engine control system. Compared with other main components, the axial compressor is the most challenging component to simulate. In this paper, a compressor map optimizing tool based on the introduction of a modifiable β function is developed for FWorks (FADEC Works). Three parameters (d, density; f, fitting coefficient; k₀, slope of the line β=0) are introduced to the β function to make it modifiable. A comparison of the traditional β function and the modifiable β function is carried out for a certain type of compressor. The interpolation errors show that both methods meet the modeling requirements, while the modifiable β function can predict compressor performance more accurately in the areas of the compressor map that users are interested in.

Keywords: beta function, compressor map, interpolation error, map optimization tool

Procedia PDF Downloads 265
2600 Structural Equation Modeling Exploration for the Multiple College Admission Criteria in Taiwan

Authors: Tzu-Ling Hsieh

Abstract:

When the Taiwan Ministry of Education implemented a new university multiple entrance policy in 2002, most colleges and universities still used test scores as the main admission criterion. With the forthcoming 12-year basic education curriculum, the Ministry of Education introduced a new college admission policy, to be implemented in 2021. The new policy highlights the importance of holistic education by placing more emphasis on the learning process in senior high school rather than only on the outcome of academic testing. However, the development of college admission criteria has not followed a thoughtful process, and universities and colleges do not have a clear idea of how to construct suitable multiple admission criteria. Although there are many studies in other countries that have implemented multiple college admission criteria for years, these studies cannot represent Taiwanese students, and they are limited by the lack of comparison between two different academic fields. Therefore, this study investigated multiple admission criteria and their relationship with college success. The study analyzed the Taiwan Higher Education Database, with 12,747 samples from 156 universities, and tested a conceptual framework that examines the relevant factors by structural equation modeling (SEM). The conceptual framework was adapted from Pascarella's general causal model and focuses on how different admission criteria predict students' college success. It considers the relationship between admission criteria and college success, as well as how motivation (one of the admission criteria) influences college success through the engagement behaviors of student effort and interactions with agents of socialization. After handling missing values and performing reliability and validity analyses, the study found that three indicators can significantly predict students' college success, defined as the average grade of the last semester: the Chinese language score on the college entrance exam, high school class rank, and the quality of student academic engagement. In addition, motivation significantly predicts the quality of student academic engagement and interactions with agents of socialization. However, the multi-group SEM analysis showed no difference in predicting college success between students from the liberal arts and the sciences. Finally, this study provides some suggestions for universities and colleges to develop multiple admission criteria based on this empirical research on Taiwanese higher education students.

Keywords: college admission, admission criteria, structural equation modeling, higher education, education policy

Procedia PDF Downloads 177
2599 The Importance of Clinical Pharmacy and Computer Aided Drug Design

Authors: Mario Hanna Louis Hanna

Abstract:

The use of CAD (Computer Aided Design) technology is ubiquitous in the architecture, engineering and construction (AEC) industry. This has led to its inclusion in the curriculum of architecture faculties in Nigeria as an important part of the training module. This article examines the ethical issues involved in implementing CAD content into the architectural training curriculum. Drawing on current literature, the study begins with the advantages of integrating CAD into architectural education and the responsibilities of the various stakeholders in the implementation process. It also examines issues related to the poor use of information technology and the perceived negative effect of CAD use on design creativity. Using a survey technique, information from the architecture department of Chukwuemeka Odumegwu Ojukwu University, Uli, was collected to serve as a case study on how the issues raised are being addressed. The article draws conclusions on what ensures a successful ethical implementation. Millions of people around the world suffer from hepatitis C, one of the world's deadliest diseases. Interferon (IFN) is a treatment option for patients with hepatitis C, but these treatments have their side effects. Our research targeted developing an oral small-molecule drug that targets hepatitis C virus (HCV) proteins and has fewer side effects. Our current study aims to develop a drug based on a small-molecule antiviral specific to the hepatitis C virus (HCV). Drug development using laboratory experiments is not only expensive but also time-consuming. Instead, in this in silico study, we used computational methods to propose a specific antiviral drug for the protein domains found in the hepatitis C virus. The study used homology modeling and ab initio modeling to generate the 3D structure of the proteins and then identified pockets within the proteins. Suitable ligands for the pockets were developed using the de novo drug design method, with the pocket geometry taken into account when designing the ligands. Among the various ligands generated, a different one has been proposed for each of the HCV protein domains.

Keywords: drug design, antiviral drug, in silico drug design, hepatitis C virus (HCV), CAD (Computer Aided Design), CAD education, education improvement, small-size contractor automatic pharmacy, PLC, control system, management system, communication

Procedia PDF Downloads 26
2598 Modeling of Turbulent Flow for Two-Dimensional Backward-Facing Step Flow

Authors: Alex Fedoseyev

Abstract:

This study investigates a generalized hydrodynamic equation (GHE) simplified model for the simulation of turbulent flow over a two-dimensional backward-facing step (BFS) at Reynolds number Re=132000. The GHE was derived from the generalized Boltzmann equation (GBE). The GBE was obtained from first principles from the chain of Bogolubov kinetic equations and considers particles of finite dimensions. The GHE has additional terms, temporal and spatial fluctuations, compared to the Navier-Stokes equations (NSE). These terms have a timescale multiplier τ, and the GHE becomes the NSE when τ is zero. The nondimensional τ is a product of the Reynolds number and the squared length scale ratio, τ=Re*(l/L)², where l is the apparent Kolmogorov length scale, and L is a hydrodynamic length scale. The BFS flow modeling results obtained by 2D calculations cannot match the experimental data for Re>450. One or two additional equations are required for the turbulence model to be added to the NSE, which typically has two to five parameters to be tuned for specific problems. It is shown that the GHE does not require an additional turbulence model, and the turbulent velocity results are in good agreement with the experimental results. A review of several studies on the simulation of flow over the BFS from 1980 to 2023 is provided. Most of these studies used different turbulence models when Re>1000. In this study, the 2D turbulent flow over a BFS with height H=L/3 (where L is the channel height) at Reynolds number Re=132000 was investigated using numerical solutions of the GHE (by a finite-element method) and compared to the solutions from the Navier-Stokes equations, the k–ε turbulence model, and experimental results. The comparison included the velocity profiles at X/L=5.33 (near the end of the recirculation zone, available from the experiment), the recirculation zone length, and the velocity flow field. The mean velocity of the NSE was obtained by averaging the solution over the number of time steps. The solution with a standard k–ε model shows a velocity profile at X/L=5.33 which has no backward flow. A standard k–ε model underpredicts the experimental recirculation zone length X/L=7.0±0.5 by a substantial 20-25%, and a more sophisticated turbulence model is needed for this problem. The obtained data confirm that the GHE results are in good agreement with the experimental results for turbulent flow over a two-dimensional BFS. A turbulence model was not required in this case. The computations were stable. The solution time for the GHE is the same as or less than that for the NSE and significantly less than that for the NSE with a turbulence model. The proposed approach was limited to 2D and only one Reynolds number. Further work will extend this approach to 3D flow and higher Re.

Keywords: backward-facing step, comparison with experimental data, generalized hydrodynamic equations, separation, reattachment, turbulent flow

Procedia PDF Downloads 59
2597 SVM-DTC Using for PMSM Speed Tracking Control

Authors: Kendouci Khedidja, Mazari Benyounes, Benhadria Mohamed Rachid, Dadi Rachida

Abstract:

In recent years, direct torque control (DTC) has become an alternative to the well-known vector control, especially for the permanent magnet synchronous motor (PMSM). However, it presents a problem of flux linkage and torque ripple. In order to solve this problem, the conventional DTC is combined with space vector pulse width modulation (SVPWM). This combined control approach has achieved great success in the control of the PMSM and has become a research hotspot. The main objective of this paper is to give an introduction to the DTC and SVPWM-DTC control of the PMSM, with each part of the system simulated in Matlab/Simulink based on its mathematical model. Moreover, the simulation results show that the improved SVPWM-DTC of the PMSM has good dynamic and static performance.

Keywords: PMSM, DTC, SVM, speed control

Procedia PDF Downloads 387
2596 Applications of Probabilistic Interpolation via Orthogonal Matrices

Authors: Dariusz Jacek Jakóbczak

Abstract:

Mathematics and computer science are interested in methods of 2D curve interpolation and extrapolation using a set of key points (knots). The proposed method of Hurwitz-Radon Matrices (MHR) is such a method. This novel method is based on the family of Hurwitz-Radon (HR) matrices, which possess columns composed of orthogonal vectors. The two-dimensional curve is interpolated via different functions used as probability distribution functions: polynomial, sine, cosine, tangent, cotangent, logarithm, exponential, arcsin, arccos, arctan, arccot or power function, as well as their inverse functions. It is shown how to build the orthogonal matrix operator and how to use it in the process of curve reconstruction.

Keywords: 2D data interpolation, Hurwitz-Radon matrices, MHR method, probabilistic modeling, curve extrapolation

Procedia PDF Downloads 523
2595 A Prediction Method of Pollutants Distribution Pattern: Flare Motion Using Computational Fluid Dynamics (CFD) Fluent Model with Weather Research Forecast Input Model during Transition Season

Authors: Benedictus Asriparusa, Lathifah Al Hakimi, Aulia Husada

Abstract:

A large amount of energy is wasted through the release of natural gas associated with the oil industry. This release harms the environment, particularly the condition of the atmospheric layers globally, and contributes to global warming. This research presents an overview of the methods employed by researchers at PT. Chevron Pacific Indonesia in the Minas area to develop a new prediction method for measuring and reducing gas flaring and its emissions. The method emphasizes advanced research involving analytical studies, numerical studies, modeling, and computer simulations, amongst other techniques. A flaring system is the controlled burning of natural gas in the course of routine oil and gas production operations. This burning occurs at the end of a flare stack or boom. The combustion process releases emissions of gases such as NO2, CO2, SO2, etc. This condition affects the chemical composition of the air and the environment around the boundary layer, mainly during the transition season. The transition season in Indonesia is very difficult to predict because of the interaction between two different air masses; this research focuses on the 2013 transition season. A simulation to create the new pattern of the pollutant distribution is therefore needed. This paper outlines trends in gas flaring modeling and current developments for predicting the dominant variables in the pollutant distribution. A Fluent model is used to simulate the distribution of pollutant gases coming out of the stack, whereas WRF model output is used to overcome the limitations of the analysis of meteorological data and atmospheric conditions in the study area. Based on the model runs, the most influential factor was wind speed. The goal of the simulation is to predict the new pollutant distribution pattern based on the times at which the fastest and slowest winds occur. According to the simulation results, the fastest wind (end of March) moves pollutants in a horizontal direction and the slowest wind (middle of May) moves pollutants vertically. Besides, with the flare stack designed in compliance with the EPA Oil and Gas Facility Stack Parameters, the pollutant concentrations remain under the NAAQS (National Ambient Air Quality Standards) threshold.

Keywords: flare motion, new prediction, pollutants distribution, transition season, WRF model

Procedia PDF Downloads 554
2594 Application of Sub-health Diagnosis and Reasoning Method for Avionics

Authors: Weiran An, Junyou Shi

Abstract:

Health management has become one of the design goals in the research and development of new-generation avionics systems, and it is an important complement to and development of testability and fault diagnosis technology. Currently, research on and application of avionics health-state division and diagnosis technology are still at the starting stage, and there is a lack of related technologies and methods. In this paper, based on a three-state health division of avionics products, state lateral-transfer coupling modeling and a diagnosis reasoning method that considers sub-health are investigated. Through the study of a typical application case, the feasibility and correctness of the method and the software are verified.

Keywords: sub-health, diagnosis reasoning, three-valued coupled logic, extended dependency model, avionics

Procedia PDF Downloads 331
2593 Natural Gas Production Forecasts Using Diffusion Models

Authors: Md. Abud Darda

Abstract:

Different options for natural gas production in wide geographic areas may be described through diffusion-of-innovation models. This type of modeling approach provides an indirect estimate of the ultimately recoverable resource (URR), captures the quantitative effects of observed strategic interventions, and allows ex-ante assessments of future scenarios over time. In order to ensure a sustainable energy policy, it is important to forecast the availability of this natural resource. Considering a finite life cycle, in this paper we investigate the natural gas production of Myanmar and Algeria, two important natural gas providers in the world energy market. A number of homogeneous and heterogeneous diffusion models, with convenient extensions, have been used. Model validation has also been performed in terms of prediction capability.
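
As a generic example of the model family mentioned above (the Bass innovation-diffusion model fitted to synthetic data, not the Myanmar or Algeria series), the sketch below estimates the ultimately recoverable resource m and the innovation/imitation coefficients p and q by nonlinear least squares.

```python
# Hedged sketch: fitting a Bass diffusion curve to synthetic cumulative production data.
# m plays the role of the ultimately recoverable resource (URR); p and q are the
# innovation and imitation coefficients. The data are synthetic, not Myanmar/Algeria.
import numpy as np
from scipy.optimize import curve_fit

def bass_cumulative(t, m, p, q):
    """Cumulative production under the Bass diffusion model."""
    e = np.exp(-(p + q) * t)
    return m * (1.0 - e) / (1.0 + (q / p) * e)

rng = np.random.default_rng(4)
t = np.arange(0, 40, dtype=float)                     # years since production start
true_m, true_p, true_q = 1000.0, 0.02, 0.25           # parameters used to make the synthetic series
data = bass_cumulative(t, true_m, true_p, true_q) * (1 + rng.normal(0, 0.02, t.size))

# Nonlinear least-squares fit; p0 is an initial guess, bounds keep parameters positive.
params, _ = curve_fit(bass_cumulative, t, data,
                      p0=[800.0, 0.05, 0.1],
                      bounds=([1.0, 1e-4, 1e-4], [1e5, 1.0, 2.0]))
m_hat, p_hat, q_hat = params
print(f"Estimated URR m = {m_hat:.0f}, p = {p_hat:.3f}, q = {q_hat:.3f}")
peak_year = np.log(q_hat / p_hat) / (p_hat + q_hat)   # time of peak annual production
print(f"Predicted production peak at t = {peak_year:.1f} years")
```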

Keywords: diffusion models, energy forecast, natural gas, nonlinear production

Procedia PDF Downloads 225
2592 Backward Erosion Piping through Vertically Layered Sands

Authors: K. Vandenboer, L. Dolphen, A. Bezuijen

Abstract:

Backward erosion piping is an important failure mechanism for water-retaining structures, a phenomenon that results in the formation of shallow pipes at the interface of a sandy or silty foundation and a cohesive cover layer. This paper studies the effect of two soil types on backward erosion piping, both in the case of a homogeneous sand layer and in a vertically layered sand sample, where the pipe is forced to grow successively through the different layers. Two configurations with vertical sand layers are tested; both result in wider pipes and higher critical gradients, making this an interesting topic in research on measures to prevent backward erosion piping failures.

Keywords: backward erosion piping, embankments, physical modeling, sand

Procedia PDF Downloads 388
2591 Embedded Visual Perception for Autonomous Agricultural Machines Using Lightweight Convolutional Neural Networks

Authors: René A. Sørensen, Søren Skovsen, Peter Christiansen, Henrik Karstoft

Abstract:

Autonomous agricultural machines act in stochastic surroundings and must therefore be able to perceive their surroundings in real time. This perception can be achieved using image sensors combined with advanced machine learning, in particular deep learning. Deep convolutional neural networks excel at labeling and perceiving color images, and since the cost of high-quality RGB cameras is low, the hardware cost of good perception depends heavily on memory and computation power. This paper investigates the possibility of designing lightweight convolutional neural networks for semantic segmentation (pixel-wise classification) with reduced hardware requirements, to allow for embedded usage in autonomous agricultural machines. Using compression techniques, a lightweight convolutional neural network is designed to perform real-time semantic segmentation on an embedded platform. The network is trained on two large datasets, ImageNet and Pascal Context, to recognize up to 400 individual classes. The 400 classes are remapped into agricultural superclasses (e.g. human, animal, sky, road, field, shelterbelt and obstacle) and the ability to provide accurate real-time perception of agricultural surroundings is studied. The network is applied to the case of autonomous grass mowing using the NVIDIA Tegra X1 embedded platform. Feeding case-specific images to the network results in a fully segmented map of the superclasses in the image. As the network is still being designed and optimized, only a qualitative analysis of the method is complete at the abstract submission deadline. Following this deadline, the finalized design is quantitatively evaluated on 20 annotated grass mowing images. Lightweight convolutional neural networks for semantic segmentation can be implemented on an embedded platform and show competitive performance with regard to accuracy and speed. It is feasible to provide cost-efficient perceptive capabilities related to semantic segmentation for autonomous agricultural machines.
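
A toy sketch of the architecture style described above, a small encoder-decoder that predicts a per-pixel map over a handful of agricultural superclasses. This is only an illustration of the idea; the network, training data, and compression pipeline used in the study are not reproduced here.

```python
# Hedged sketch: a toy lightweight encoder-decoder for semantic segmentation into a
# few agricultural superclasses (illustrative architecture, not the study's network).
import torch
import torch.nn as nn

SUPERCLASSES = ["human", "animal", "sky", "road", "field", "shelterbelt", "obstacle"]

class TinySegNet(nn.Module):
    def __init__(self, n_classes=len(SUPERCLASSES)):
        super().__init__()
        # Encoder: depthwise-separable style blocks keep the parameter count low.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(16, 16, 3, padding=1, groups=16), nn.Conv2d(16, 32, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(32, 32, 3, stride=2, padding=1, groups=32), nn.Conv2d(32, 64, 1),
            nn.ReLU(inplace=True),
        )
        # Decoder: upsample back to input resolution and predict per-pixel class scores.
        self.decoder = nn.Sequential(
            nn.Upsample(scale_factor=4, mode="bilinear", align_corners=False),
            nn.Conv2d(64, n_classes, 3, padding=1),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

model = TinySegNet()
frame = torch.randn(1, 3, 240, 320)            # a dummy camera frame
with torch.no_grad():
    logits = model(frame)                      # (1, n_classes, 240, 320)
pred = logits.argmax(dim=1)                    # per-pixel superclass map
print(logits.shape, pred.shape)
```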

Keywords: autonomous agricultural machines, deep learning, safety, visual perception

Procedia PDF Downloads 394
2590 Integrating Data Mining with Case-Based Reasoning for Diagnosing Sorghum Anthracnose

Authors: Mariamawit T. Belete

Abstract:

Cereal production and marketing are the means of livelihood for millions of households in Ethiopia. However, cereal production is constrained by technical and socio-economic factors. Among the technical factors, cereal crop diseases are the major contributors to low yield. The aim of this research is to develop an integrated data mining and knowledge-based system for sorghum anthracnose disease diagnosis that assists agricultural experts and development agents in making timely decisions. The anthracnose diagnosis system gathers information from the Melkassa Agricultural Research Center and attempts to score the anthracnose severity scale. The empirical research is designed around data exploration, modeling, and confirmatory procedures for testing hypotheses and predictions in order to draw sound conclusions. WEKA (Waikato Environment for Knowledge Analysis) was employed for the modeling. Knowledge-based systems are built with a variety of approaches depending on the knowledge representation method; case-based reasoning (CBR) is one of the popular approaches used in knowledge-based systems. CBR is a problem-solving strategy that uses previous cases to solve new problems. The system utilizes hidden knowledge extracted by employing clustering algorithms, specifically K-means clustering, from a sampled anthracnose dataset. Clustered cases with their centroid values are mapped to jCOLIBRI, and then the integrator application is created using NetBeans with JDK 8.0.2. The important parts of a case-based reasoning model include case retrieval, the similarity-measuring stage; reuse, which allows the domain expert to adapt the retrieved case solution to suit the current case; revise, to test the solution; and retain, to store the confirmed solution in the case base for future use. Evaluation of the system was done for both system performance and user acceptance. For testing the prototype, seven test cases were used. Experimental results show that the system achieves average precision and recall values of 70% and 83%, respectively. User acceptance testing was also performed involving five domain experts, and an average acceptance of 83% was achieved. Although the results of this study are promising, further investigation of hybrid approaches such as rule-based reasoning and of a pictorial retrieval process is recommended.
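
A minimal sketch of the retrieve step described above: past cases are clustered with K-means and a new problem is matched to the most similar stored cases. The symptom encoding and severity labels below are made-up stand-ins for the actual anthracnose attributes used with WEKA and jCOLIBRI in the study.

```python
# Hedged sketch: K-means clustering of past cases plus a nearest-neighbour "retrieve"
# step, the first stage of the CBR cycle. The features and labels are illustrative only.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(5)

# Case base: each row encodes (leaf-lesion score, stalk-rot score, humidity index),
# and each case stores a severity label used as its "solution".
cases = rng.uniform(0, 1, (60, 3))
severity = np.digitize(cases.mean(axis=1), [0.33, 0.66])   # 0=low, 1=medium, 2=high

# Cluster the case base; the cluster centroids act as an index for faster retrieval.
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(cases)

def retrieve(new_case, k=3):
    """Retrieve the k most similar cases from the cluster nearest to the query."""
    cluster = km.predict(new_case[None, :])[0]
    members = np.where(km.labels_ == cluster)[0]
    dists = np.linalg.norm(cases[members] - new_case, axis=1)
    return members[np.argsort(dists)[:k]]

query = np.array([0.8, 0.7, 0.6])                # a new field observation (hypothetical)
matches = retrieve(query)
print("Retrieved cases:", matches)
print("Suggested severity (majority of retrieved cases):",
      np.bincount(severity[matches]).argmax())
```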

Keywords: sorghum anthracnose, data mining, case based reasoning, integration

Procedia PDF Downloads 79
2589 Design and Optimization of Composite Canopy Structure

Authors: Prakash Kattire, Rahul Pathare, Nilesh Tawde

Abstract:

A canopy is an overhead roof structure generally used at the entrance of a building to provide shelter from rain and sun, and it may also be used for decorative purposes. In this paper, a canopy structure to cover a conveyor line has been studied. Most existing canopy structures are made of steel and glass, which makes the structure heavier, so the purpose of this study is the weight and cost optimization of the canopy. To achieve this goal, the materials of construction considered are polyvinyl chloride (PVC) natural composite, fiber reinforced plastic (FRP), and structural steel Fe250. Design and modeling were done in SolidWorks, whereas Altair Inspire software was used for the optimization of the structure. Through this study, it was found that there is a total weight reduction of 10% in the structure, with a sufficient reserve of structural strength.

Keywords: canopy, composite, FRP, PVC

Procedia PDF Downloads 144
2588 Groundwater Numerical Modeling, an Application of Remote Sensing, and GIS Techniques in South Darb El Arbaieen, Western Desert, Egypt

Authors: Abdallah M. Fayed

Abstract:

The study area is located in south Darb El Arbaieen, in the western desert of Egypt. It occupies the area between latitudes 22° 00′ and 22° 30′ North and longitudes 29° 30′ and 30° 00′ East, from the southern border of Egypt to the area north of Bir Kuraiym and from the area east of East Owienat to the area west of Tushka district; its area is about 2750 km². The famous features are the southern part of the Darb El Arbaieen road, G. Baraqat El Scab El Qarra, Bir Dibis, Bir El Shab and Bir Kuraiym. Interpretation of the soil stratification shows layers that are related to the Quaternary and Upper-Lower Cretaceous eras. The area is dissected by a series of NE-SW striking faults. The regional groundwater flow direction is SW-NE, with a hydraulic gradient of 1 m per 2 km. A mathematical modeling program has been applied to evaluate the groundwater potential of the main aquifer, the Nubian Sandstone, in the study area, and remote sensing is considered a powerful, accurate and time-saving technique in this respect. These techniques are widely used for illustrating and analysing different phenomena such as new development in the desert (land reclamation), residential development (new communities), urbanization, etc. A major objective of this work is to determine the new development areas in the western desert of Egypt during the period from 2003 to 2015 using remote sensing techniques; the impacts of present and future development have been evaluated using the two-dimensional numerical groundwater flow simulation package Visual MODFLOW 4.2. The package was used to construct and calibrate a numerical model that can simulate the response of the aquifer in the study area, in the form of changes in piezometric levels and salinity, under different management alternatives. The total simulation period is 100 years. After steady-state calibration, two different scenarios are simulated for groundwater development. Twenty-one production wells are installed in the study area and used in the model, with total discharges for the two scenarios of 105,000 m³/d and 210,000 m³/d. The drawdown was 11.8 m and 23.7 m for the two scenarios at the end of the 100 years. Contour maps of water heads and drawdown and hydrographs of piezometric head are presented. The drawdown was less than half of the saturated thickness (the safe-yield case).
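
As an independent back-of-the-envelope check of the kind of drawdown figures quoted above (not a substitute for the calibrated Visual MODFLOW model), the Theis analytic solution estimates drawdown around a single pumping well; the transmissivity and storativity below are assumed values, not the calibrated parameters.

```python
# Hedged sketch: Theis analytic drawdown for a single pumping well, as a rough
# cross-check of the numerical results. Aquifer parameters are assumed values.
import numpy as np
from scipy.special import exp1   # exponential integral E1(u) = Theis well function W(u)

def theis_drawdown(Q, T, S, r, t):
    """Drawdown s = Q/(4*pi*T) * W(u), with u = r^2 * S / (4*T*t)."""
    u = r**2 * S / (4.0 * T * t)
    return Q / (4.0 * np.pi * T) * exp1(u)

Q = 5000.0          # m^3/d pumped by one well (105,000 m^3/d spread over 21 wells)
T = 500.0           # m^2/d transmissivity (assumed)
S = 1e-4            # storativity (assumed, confined Nubian Sandstone)
r = 1000.0          # m, distance from the well
t = 100.0 * 365.0   # days, 100-year simulation horizon

print(f"Drawdown after 100 years at r = {r:.0f} m: {theis_drawdown(Q, T, S, r, t):.1f} m")
```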

Keywords: remote sensing, management of aquifer systems, simulation modeling, western desert, South Darb El Arbaieen

Procedia PDF Downloads 399
2587 Analysis of the Statistical Characterization of Significant Wave Data Exceedances for Designing Offshore Structures

Authors: Rui Teixeira, Alan O’Connor, Maria Nogal

Abstract:

The statistical theory of extreme events is progressively becoming a topic of growing interest in all fields of science and engineering. The changes currently experienced by the world, economic and environmental, emphasize the importance of dealing with extreme occurrences with improved accuracy. When it comes to the design of offshore structures, particularly offshore wind turbines, the importance of efficiently characterizing extreme events is of major relevance. Extreme events are commonly characterized by extreme value theory. As an alternative, the accurate modeling of the tails of statistical distributions and the characterization of low-occurrence events can be achieved with the application of the Peak-Over-Threshold (POT) methodology. The POT methodology allows for a more refined fit of the statistical distribution by truncating the data at a minimum value given by a predefined threshold u. For mathematically approximating the tail of the empirical statistical distribution, the Generalised Pareto distribution is widely used. However, in the case of exceedances of significant wave data (H_s), the 2-parameter Weibull and the Exponential distribution, the latter being a specific case of the Generalised Pareto distribution, are frequently used as alternatives. The Generalised Pareto, despite the existence of practical cases where it is applied, is not completely recognized as the adequate solution to model exceedances over a certain threshold u. References that set the Generalised Pareto distribution as a secondary solution in the case of significant wave data can be identified in the literature. In this framework, the current study intends to tackle the discussion of the application of statistical models to characterize exceedances of wave data. Comparisons of the application of the Generalised Pareto, the 2-parameter Weibull and the Exponential distribution are presented for different values of the threshold u. Real wave data obtained from four buoys along the Irish coast were used in the comparative analysis. Results show that the application of statistical distributions to characterize significant wave data needs to be addressed carefully, and that in each particular case one of the statistical models mentioned fits the data better than the others. Depending on the value of the threshold u, different results are obtained. Other variables of the fit, such as the number of points and the estimation of the model parameters, are analyzed and the respective conclusions are drawn. Some guidelines on the application of the POT method are presented. Modeling the tail of the distributions proves to be, for the present case, a highly non-linear task and, due to its growing importance, should be addressed carefully for an efficient estimation of very-low-occurrence events.
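
A small sketch of the comparison described above, using synthetic H_s records rather than the Irish buoy data: fit the Generalised Pareto, 2-parameter Weibull, and Exponential distributions to peaks over a threshold u and compare them with a Kolmogorov-Smirnov statistic. The return-level calculation and the sampling rate are illustrative assumptions.

```python
# Hedged sketch: Peak-Over-Threshold fitting of significant wave height exceedances.
# The data are synthetic, not the Irish buoy records used in the study.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)

# Synthetic significant wave height record (m) and a chosen threshold u.
hs = stats.weibull_min.rvs(c=1.6, scale=2.0, size=5000, random_state=rng)
u = np.quantile(hs, 0.95)
exceedances = hs[hs > u] - u                      # peaks over threshold, shifted to zero

candidates = {
    "Generalised Pareto": stats.genpareto,
    "2-parameter Weibull": stats.weibull_min,
    "Exponential": stats.expon,
}

for name, dist in candidates.items():
    params = dist.fit(exceedances, floc=0)        # fix the location at the threshold
    ks = stats.kstest(exceedances, dist.cdf, args=params)
    print(f"{name:>20s}: params = {np.round(params, 3)}, KS = {ks.statistic:.3f}")

# A 100-year style return level from the GPD fit (assuming 3-hourly records).
shape, loc, scale = stats.genpareto.fit(exceedances, floc=0)
n_per_year = 2920                                 # observations per year (assumption)
rate = exceedances.size / (hs.size / n_per_year)  # exceedances per year
return_period = 100.0
level = u + stats.genpareto.ppf(1 - 1.0 / (return_period * rate), shape, loc, scale)
print(f"Approximate 100-year significant wave height: {level:.2f} m")
```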

Keywords: extreme events, offshore structures, peak-over-threshold, significant wave data

Procedia PDF Downloads 272
2586 Employing Operations Research at Universities to Build Management Systems

Authors: Abdallah A. Hlayel

Abstract:

Operations research (OR) is a science that has had good success in developing and applying scientific methods for problem solving and decision-making. By using OR techniques, we can enhance the use of computer decision support systems to achieve optimal management for institutions. OR applies comprehensive analysis, including all the factors that affect the problem, and builds mathematical models to solve business or organizational problems. In addition, it improves decision-making and uses available resources efficiently. The adoption of OR by universities would definitely contribute to the development and enhancement of the performance of OR techniques. This paper provides an understanding of the structures, approaches and models of OR in problem solving and decision-making.

Keywords: best candidates' method, decision making, decision support system, operations research

Procedia PDF Downloads 443
2585 Characterization of the Viscoelastic Behavior of Polymeric Composites

Authors: Abir Abdessalem, Sahbi Tamboura, J. Fitoussi, Hachmi Ben Daly, Abbas Tcharkhtchi

Abstract:

Dynamic mechanical analysis (DMA) is one of the most commonly used experimental techniques to investigate the temperature and frequency dependence of the mechanical behavior of viscoelastic materials. The measured data are generally shifted by applying the time–temperature superposition (TTS) principle to obtain the viscoelastic system's master curve. The aim of this work is to present the methodology used to define the horizontal shift factor to be applied to the measured storage modulus, in order to indicate the validity of the TTS principle for this material system. This principle was successfully used to determine the long-term properties of Sheet Moulding Compound (SMC) composites.
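
A compact sketch of the shifting step described above. The shift factors here come from the WLF equation with textbook-style constants, which is an assumption; the paper determines the horizontal shift factor from the measured storage modulus, and the E' data below are synthetic.

```python
# Hedged sketch: building a master curve by horizontally shifting storage-modulus
# isotherms with WLF shift factors. The constants C1, C2 and the synthetic E' data
# are illustrative assumptions, not the SMC measurements from the study.
import numpy as np

def wlf_log_shift(T, T_ref, C1=17.4, C2=51.6):
    """log10(a_T) from the Williams-Landel-Ferry equation."""
    return -C1 * (T - T_ref) / (C2 + (T - T_ref))

T_ref = 60.0                                    # reference temperature, deg C (assumed)
temperatures = np.array([40.0, 60.0, 80.0])     # DMA isotherm temperatures
freq = np.logspace(-1, 2, 20)                   # test frequencies, Hz

master_freq, master_modulus = [], []
for T in temperatures:
    # Synthetic storage modulus: stiffer at low temperature / high frequency.
    storage_modulus = 2000.0 + 800.0 * np.tanh(0.8 * (np.log10(freq) - 0.02 * (T - T_ref)))
    a_T = 10.0 ** wlf_log_shift(T, T_ref)
    master_freq.append(freq * a_T)              # shift each isotherm along the frequency axis
    master_modulus.append(storage_modulus)

master_freq = np.concatenate(master_freq)
master_modulus = np.concatenate(master_modulus)
order = np.argsort(master_freq)
print("Reduced frequency range of the master curve: "
      f"{master_freq[order][0]:.2e} to {master_freq[order][-1]:.2e} Hz")
```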

Keywords: composite material, dynamic mechanical analysis, SMC composites, viscoelastic behavior, modeling

Procedia PDF Downloads 231