Search results for: Uncertainty.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 367

277 The Use of Dynamically Optimised High Frequency Moving Average Strategies for Intraday Trading

Authors: Abdalla Kablan, Joseph Falzon

Abstract:

This paper is motivated by the role of uncertainty in financial decision making and by how artificial intelligence and soft computing, with their uncertainty-reducing properties, can be applied to high-frequency algorithmic trading. The paper presents an optimized high-frequency trading system that has been combined with various moving averages to produce a hybrid system that outperforms trading systems relying solely on moving averages. It optimizes an adaptive neuro-fuzzy inference system that takes both the price and its moving average as inputs, learns to predict price movements from training data consisting of intraday data, dynamically switches between the best-performing moving averages, and decides when to buy or sell a given currency at high frequency.
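
As a rough illustration of the switching component only (the adaptive neuro-fuzzy inference system itself is not reproduced here), the Python sketch below scores a set of moving-average windows on recent data and keeps the best performer; the window set, the scoring rule, and the synthetic price series are assumptions for demonstration, not the authors' implementation.

```python
import numpy as np

def moving_average(prices, window):
    """Simple moving average; the first (window - 1) entries are NaN."""
    out = np.full(prices.shape, np.nan)
    csum = np.cumsum(prices, dtype=float)
    out[window - 1:] = (csum[window - 1:] - np.concatenate(([0.0], csum[:-window]))) / window
    return out

def pick_best_window(prices, windows, lookback=200):
    """Score each window by the profit of a naive crossover rule over the most
    recent `lookback` ticks and return the best-performing one."""
    recent = prices[-lookback:]
    returns = np.diff(recent)
    best_w, best_pnl = windows[0], -np.inf
    for w in windows:
        ma = moving_average(recent, w)
        position = (recent[:-1] > ma[:-1]).astype(float)  # long above the MA, flat below
        pnl = np.nansum(position * returns)
        if pnl > best_pnl:
            best_w, best_pnl = w, pnl
    return best_w

prices = 100.0 + np.cumsum(np.random.randn(1000))  # synthetic intraday price series
print(pick_best_window(prices, windows=[5, 10, 20, 50]))
```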

Keywords: Financial decision making, high frequency trading, adaptive neuro-fuzzy systems, moving average strategy.

276 Mechanical Structure Design Optimization by Blind Number Theory: Time-dependent Reliability

Authors: Zakari Yaou, Lirong Cui

Abstract:

In a product development process, understanding the functional behavior of the system, the role of components in achieving the system's functions, and the failure modes that arise if a component or subsystem fails its required function helps in developing an appropriate design validation and verification program for reliability assessment. Integrating these three issues helps design and reliability engineers identify weak spots in the design and plan future actions and testing programs. This case study demonstrates the advantage of unascertained theory in describing subjective cognitive uncertainty, then applies blind number (BN) theory to describe the uncertainty of the mechanical system failure process and, at the same time, uses the same theory to derive a further mechanical system reliability model. The practical calculations show that the BN model is simple and requires little computation while offering better forecasting capability, which gives it some value for macroscopic discussion.

Keywords: Mechanical structure design, time-dependent stochastic process, unascertained information, blind number theory.

275 Low Air Velocity Measurement Characteristics - Variation Due to Flow Regime

Authors: A. Pedišius, V. Janušas, A. Bertašienė

Abstract:

The paper relates air velocity values reproduced by a laser Doppler anemometer (LDA) and an ultrasonic anemometer (UA) to values calculated from flow rate measurements made with a gas meter whose calibration uncertainty is ±(0.15 – 0.30) %. The investigation was performed in a channel installed in the aerodynamic facility used as part of the national air velocity standard. The relations defined in the research confirm that the LDA and the UA are the most advantageous instruments for air velocity reproduction. The results affirm that the ultrasonic anemometer is a reliable and favourable instrument for measuring mean velocity or monitoring velocity stability in the range of 0.05 m/s to 10 m/s (15 m/s when the LDA is used). The main aim of this research is to investigate low-velocity regularities, starting from 0.05 m/s and covering the turbulent, laminar, and transitional flow regions. Theoretical and experimental results and a brief analysis of them are given in the paper. Maximum-to-mean velocity relations for transitional air flow, which has a unique distribution, are presented. Transitional flow, whose characteristics are distinctive and differ from those of laminar and turbulent flow, has not yet been analysed experimentally.
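
A minimal sketch of the comparison baseline as described: the reference mean velocity is obtained from the gas-meter flow rate divided by the channel cross-section, with first-order uncertainty propagation. The cross-section value and the uncertainty inputs below are illustrative assumptions, not the facility's data.

```python
import math

def mean_velocity(Q, A, u_Q_rel, u_A_rel):
    """Q: flow rate [m^3/s]; A: channel cross-section area [m^2];
    u_*_rel: relative standard uncertainties of Q and A."""
    v = Q / A
    u_v_rel = math.sqrt(u_Q_rel**2 + u_A_rel**2)  # first-order propagation for v = Q / A
    return v, v * u_v_rel

# Illustrative numbers only: 0.1 m^2 cross-section, 0.15 % gas-meter uncertainty
v, u_v = mean_velocity(Q=5.0e-3, A=0.1, u_Q_rel=0.0015, u_A_rel=0.0005)
print(f"v = {v:.4f} m/s +/- {u_v:.5f} m/s (k = 1)")
```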

Keywords: Laser Doppler anemometer, ultrasonic anemometer, air flow velocities, transitional flow regime, measurement, uncertainty.

274 Deep Reinforcement Learning for Optimal Decision-making in Supply Chains

Authors: Nitin Singh, Meng Ling, Talha Ahmed, Tianxia Zhao, Reinier van de Pol

Abstract:

We propose the use of Reinforcement Learning (RL) as a viable alternative for optimizing supply chain management, particularly in scenarios with stochasticity in product demands. RL’s adaptability to changing conditions and its demonstrated success in diverse fields of sequential decision-making make it a promising candidate for addressing supply chain problems. We investigate the impact of demand fluctuations in a multi-product supply chain system and develop RL agents with learned generalizable policies. We provide experimentation details for training RL agents and a statistical analysis of the results. We study the generalization ability of RL agents for different demand uncertainty scenarios and observe superior performance compared to agents trained with fixed demand curves. The proposed methodology has the potential to lead to cost reduction and increased profit for companies dealing with frequent inventory movement between supply and demand nodes.
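
The abstract does not specify the RL algorithm or environment; as a toy stand-in, the sketch below trains a tabular Q-learning agent on a single-node inventory problem with Poisson demand to show how a policy can adapt to demand uncertainty. All costs, capacities, and the demand distribution are assumptions, not the authors' setup.

```python
import numpy as np

rng = np.random.default_rng(0)
MAX_INV, MAX_ORDER = 20, 10
HOLD_COST, STOCKOUT_COST, UNIT_PROFIT = 0.1, 2.0, 1.0

def step(inventory, order):
    """One period: receive the order, face Poisson demand, return next state and reward."""
    demand = rng.poisson(5)
    available = min(inventory + order, MAX_INV)
    sold = min(available, demand)
    next_inv = available - sold
    reward = (UNIT_PROFIT * sold - HOLD_COST * next_inv
              - STOCKOUT_COST * max(demand - available, 0))
    return next_inv, reward

Q = np.zeros((MAX_INV + 1, MAX_ORDER + 1))  # tabular action-value function
alpha, gamma, eps = 0.1, 0.95, 0.1
state = 0
for _ in range(50_000):
    action = rng.integers(MAX_ORDER + 1) if rng.random() < eps else int(Q[state].argmax())
    next_state, reward = step(state, action)
    Q[state, action] += alpha * (reward + gamma * Q[next_state].max() - Q[state, action])
    state = next_state

print("Greedy order quantity per inventory level:", Q.argmax(axis=1))
```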

Keywords: Inventory Management, Reinforcement Learning, Supply Chain Optimization, Uncertainty.

273 Monte Carlo Analysis and Fuzzy Sets for Uncertainty Propagation in SIS Performance Assessment

Authors: Fares Innal, Yves Dutuit, Mourad Chebila

Abstract:

The object of this work is the probabilistic performance evaluation of safety instrumented systems (SIS), i.e. the average probability of dangerous failure on demand (PFDavg) and the average frequency of failure (PFH), taking into account the uncertainties related to the different parameters that come into play: failure rate (λ), common cause failure proportion (β), diagnostic coverage (DC), etc. This leads to an accurate and safe assessment of the safety integrity level (SIL) inherent to the safety function performed by such systems. This aim is in keeping with the requirements of the IEC 61508 standard with respect to handling uncertainty. To do this, we propose an approach that combines (1) Monte Carlo simulation and (2) fuzzy sets. The first method is appropriate where representative statistical data are available (using probability density functions of the relevant parameters), while the latter applies to cases characterized by vague and subjective information (using membership functions). The proposed approach is fully supported by a suitable computer code.
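
A minimal sketch of the Monte Carlo branch only (the fuzzy-set branch is not shown): parameter uncertainty is propagated through a simplified textbook 1oo2 PFDavg approximation. The architecture, the input distributions, and the proof-test interval are illustrative assumptions, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100_000
T_proof = 8760.0  # proof-test interval [h]

# Assumed input distributions for the uncertain parameters
lam_du = rng.lognormal(mean=np.log(2e-6), sigma=0.5, size=N)  # dangerous undetected failure rate [1/h]
beta = rng.uniform(0.02, 0.10, size=N)                        # common-cause failure proportion

# Simplified 1oo2 approximation: independent double failure + common-cause term
pfd = ((1.0 - beta) * lam_du) ** 2 * T_proof ** 2 / 3.0 + beta * lam_du * T_proof / 2.0

print(f"PFDavg mean = {pfd.mean():.2e}, 90% interval = "
      f"[{np.percentile(pfd, 5):.2e}, {np.percentile(pfd, 95):.2e}]")
```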

Keywords: Fuzzy sets, Monte Carlo simulation, Safety instrumented system, Safety integrity level.

272 Improved Torque Control of Electrical Load Simulator with Parameters and State Estimation

Authors: Nasim Ullah, Shaoping Wang

Abstract:

The Electrical Load Simulator (ELS) is an important ground-based hardware-in-the-loop simulator used for aerodynamic torque loading experiments on actuators under test. This work focuses on improving the transient response of the torque controller under ELS parameter uncertainty. The parameters of the load simulator are estimated online and the model is updated, eliminating the model error and improving the steady-state torque tracking response of the torque controller. To improve the transient control performance, the gain of the robust term of the sliding mode controller (SMC) is updated online using a fuzzy logic system based on the amount of uncertainty in the load simulator parameters. The states of the load simulator that cannot be measured directly are estimated using a Luenberger observer fed with the newly estimated parameters. The stability of the control scheme is verified using Lyapunov theory, and its validity is verified through simulations.

Keywords: ELS, Observer, Transient Performance, SMC, Extra Torque, Fuzzy Logic.

271 Modeling the Symptom-Disease Relationship by Using Rough Set Theory and Formal Concept Analysis

Authors: Mert Bal, Hayri Sever, Oya Kalıpsız

Abstract:

Medical Decision Support Systems (MDSSs) are sophisticated, intelligent systems that can draw inferences despite missing information and uncertainty. In such systems, the uncertainty is modeled with various soft computing methods such as Bayesian networks, rough sets, artificial neural networks, fuzzy logic, inductive logic programming, genetic algorithms, and hybrid methods formed by combining several of these. In this study, symptom-disease relationships are presented within a framework modeled with formal concept analysis and rough set theory, with diseases as objects and symptoms as attributes. After a concept lattice is formed, Bayes' theorem can be used to determine the relationships between attributes and objects. A discernibility relation, which forms the basis of rough sets, can be applied to the attribute data sets in order to reduce the attributes and decrease the computational complexity.

Keywords: Formal Concept Analysis, Rough Set Theory, Granular Computing, Medical Decision Support System.

270 Evaluation of New Product Development Projects using Artificial Intelligence and Fuzzy Logic

Authors: Orhan Feyzioğlu, Gülçin Büyüközkan

Abstract:

As a vital activity for companies, new product development (NPD) is also a very risky process due to the high degree of uncertainty encountered at every development stage and the inevitable dependence on how successfully previous steps were accomplished. Hence, there is an apparent need to evaluate new product initiatives systematically and make accurate decisions under uncertainty. Another major concern is the time pressure to launch a significant number of new products to preserve and increase the competitive power of the company. In this work, we propose an integrated decision-making framework based on neural networks and fuzzy logic to make appropriate decisions and accelerate the evaluation process. We are especially interested in the two initial stages, where new product ideas are selected (go/no-go decision) and the implementation order of the corresponding projects is determined. We show that this two-stage intelligent approach allows practitioners to roughly and quickly separate good and bad product ideas by making use of previous experience, and then analyze a much shorter list rigorously.

Keywords: Decision Making, Neural Networks, Fuzzy Theory and Systems, Choquet Integral, New Product Development.

269 Software Effort Estimation Models Using Radial Basis Function Network

Authors: E. Praynlin, P. Latha

Abstract:

Software effort estimation is the process of estimating the effort required to develop software. By estimating the effort, the cost and schedule required to develop the software can be determined. An accurate estimate helps the developer allocate resources appropriately in order to avoid cost and schedule overruns. Several methods are available for estimating the effort, among which soft computing based methods play a prominent role. Software cost estimation involves considerable uncertainty, and among soft computing methods, neural networks are good at handling uncertainty. In this paper, a Radial Basis Function Network (RBFN) is compared with a back-propagation network, and the results are validated using six data sets; it is found that the RBFN is best suited to estimating the effort. The results are validated using two tests: an error test and a statistical test.
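
As a rough sketch of the model class and the error metric (not the paper's network or its six data sets), the code below fits a radial basis function network by least squares on synthetic effort data and reports the Mean Magnitude of Relative Error; the centre-selection rule, kernel width, and data are all assumptions.

```python
import numpy as np

def rbf_design(X, centers, sigma):
    """Gaussian RBF design matrix."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def fit_rbfn(X, y, n_centers=12, sigma=3.0, seed=0):
    """Pick centres as a random subset of training points; solve output weights by least squares."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=n_centers, replace=False)]
    w, *_ = np.linalg.lstsq(rbf_design(X, centers, sigma), y, rcond=None)
    return centers, sigma, w

def predict(model, X):
    centers, sigma, w = model
    return rbf_design(X, centers, sigma) @ w

def mmre(actual, predicted):
    """Mean Magnitude of Relative Error."""
    return np.mean(np.abs(actual - predicted) / actual)

# Synthetic stand-in for an effort data set (size/complexity features -> effort)
rng = np.random.default_rng(2)
X = rng.uniform(1, 10, size=(60, 2))
y = 5.0 * X[:, 0] + 2.0 * X[:, 1] ** 1.5 + rng.normal(0, 1, 60)
model = fit_rbfn(X[:40], y[:40])
print("MMRE on the hold-out set:", mmre(y[40:], predict(model, X[40:])))
```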

Keywords: Software cost estimation, Radial Basis Function Network (RBFN), Back propagation function network, Mean Magnitude of Relative Error (MMRE).

268 A 3D Approach for Extraction of the Coronary Artery and Quantification of the Stenosis

Authors: Mahdi Mazinani, S. D. Qanadli, Rahil Hosseini, Tim Ellis, Jamshid Dehmeshki

Abstract:

Segmentation and quantification of stenosis is an important task in assessing coronary artery disease. One of the main challenges is measuring the real diameter of curved vessels. Moreover, uncertainty in the segmentation of the different tissues in the narrow vessel is an important issue that affects accuracy. This paper proposes an algorithm to extract the coronary arteries and measure the degree of stenosis. A Markovian fuzzy clustering method is applied to model the uncertainty arising from the partial volume effect. The algorithm comprises segmentation, centreline extraction, estimation of the plane orthogonal to the centreline, and measurement of the degree of stenosis. To evaluate accuracy and reproducibility, the approach has been applied to a vascular phantom and the results compared with the real diameters. The results for 10 patient datasets have been visually judged by a qualified radiologist. The results reveal the superiority of the proposed method over the conventional thresholding method (CTM) on both datasets.

Keywords: 3D coronary artery tree extraction, segmentation, quantification, fuzzy clustering, Markov random field.

267 The Computational Psycholinguistic Situational-Fuzzy Self-Controlled Brain and Mind System under Uncertainty

Authors: Ben Khayut, Lina Fabri, Maya Avikhana

Abstract:

Modern Artificial Narrow Intelligence (ANI) models cannot (a) function independently, situationally, and continuously without the human intelligence used to retrain and reprogram them, or (b) think, understand, be conscious, and cognize under uncertainty and changing environmental objects. To eliminate these shortcomings and build a new generation of Artificial Intelligence systems, the paper proposes a conception, model, and method of a Computational Psycholinguistic Cognitive Situational-Fuzzy Self-Controlled Brain and Mind System (CPCSFSCBMSUU). The system uses a neural network as its computational memory and activates functions of perception, identification of real objects, fuzzy situational control, and formation of images of these objects. These images and objects are used to model their psychological, linguistic, cognitive, and neural properties and features, whose meanings are identified, interpreted, generated, and formed with regard to the identified subject area, using the data, information, and knowledge accumulated in the memory. The functioning of the CPCSFSCBMSUU is carried out by its subsystems for fuzzy situational control of all processes, computational perception, identification of reactions and actions, psycholinguistic cognitive fuzzy logical inference, decision making, reasoning, systems thinking, planning, awareness, consciousness, cognition, intuition, and wisdom. In doing so, the system analyses and processes psycholinguistic, subject, visual, signal, sound, and other objects, accumulates and uses the data, information, and knowledge in the memory, and communicates and interacts with other computing systems, robots, and humans in order to solve joint tasks. To investigate the functional processes of the proposed system, the principles of situational control, fuzzy logic, psycholinguistics, informatics, and modern possibilities of data science were applied. The proposed self-controlled brain and mind system is intended for use as a plug-in in multilingual subject applications.

Keywords: Computational psycholinguistic cognitive brain and mind system, situational fuzzy control, uncertainty, AI.

266 Fuzzy Uncertainty Theory for Stealth Fighter Aircraft Selection in Entropic Fuzzy TOPSIS Decision Analysis Process

Authors: C. Ardil

Abstract:

The purpose of this paper is to present fuzzy TOPSIS in an entropic fuzzy environment. Due to the ambiguous concepts often represented in decision data, exact values are insufficient to model real-life situations. In this paper, the rating of each alternative is defined in fuzzy linguistic terms, which can be expressed with triangular fuzzy numbers. The weight of each criterion is then derived from the decision matrix using the entropy weighting method. Next, a vertex method is proposed to calculate the distance between two triangular fuzzy numbers. According to the TOPSIS concept, a closeness coefficient is defined to determine the ranking order of all alternatives by simultaneously calculating the distances to both the fuzzy positive-ideal solution (FPIS) and the fuzzy negative-ideal solution (FNIS). Finally, an illustrative example of selecting stealth fighter aircraft is shown at the end of this article to highlight the procedure of the proposed method. Correlation analysis and validation analysis using TOPSIS, WSM, and WPM methods were performed to compare the ranking order of the alternatives.
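
A compact sketch of one common entropy-weighted fuzzy TOPSIS variant using the vertex distance, to make the computational steps concrete; the alternatives, ratings, criterion count, and the element-wise construction of the fuzzy ideal solutions are illustrative assumptions rather than the paper's exact procedure.

```python
import numpy as np

# Triangular fuzzy ratings (l, m, u) for 3 alternatives x 3 benefit criteria
R = np.array([
    [[5, 7, 9], [3, 5, 7], [7, 9, 10]],
    [[7, 9, 10], [5, 7, 9], [3, 5, 7]],
    [[3, 5, 7], [7, 9, 10], [5, 7, 9]],
], dtype=float)
m, n, _ = R.shape

# Entropy weights computed from the defuzzified (centroid) decision matrix
X = R.mean(axis=2)
P = X / X.sum(axis=0)
e = -(P * np.log(P)).sum(axis=0) / np.log(m)
w = (1 - e) / (1 - e).sum()

# Normalize benefit criteria by the column-wise maximum upper bound, then weight
V = R / R[:, :, 2].max(axis=0)[None, :, None] * w[None, :, None]

# Fuzzy ideal solutions taken element-wise over the alternatives
FPIS, FNIS = V.max(axis=0), V.min(axis=0)

def vertex(a, b):
    """Vertex distance d(a, b) = sqrt((1/3) * sum of squared component gaps)."""
    return np.sqrt(((a - b) ** 2).mean(axis=-1))

d_plus = vertex(V, FPIS[None]).sum(axis=1)
d_minus = vertex(V, FNIS[None]).sum(axis=1)
cc = d_minus / (d_plus + d_minus)  # closeness coefficient
print("Closeness coefficients:", cc, "ranking (best first):", np.argsort(-cc))
```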

Keywords: stealth fighter aircraft selection, fuzzy uncertainty theory (FUT), fuzzy entropic decision (FED), fuzzy linguistic variables, triangular fuzzy numbers, multiple criteria decision making analysis, MCDMA, TOPSIS, WSM, WPM

265 Network-Constrained AC Unit Commitment under Uncertainty Using a Benders Decomposition Approach

Authors: B. Janani, S. Thiruvenkadam

Abstract:

In this work, the impact of adopting a stochastic approach to day-ahead unit commitment is evaluated, and comparisons between stochastic and deterministic unit commitment solutions are provided. The unit commitment model minimizes the total operating costs while respecting the units' technical constraints, such as ramping rates and minimum up and down times. Load shedding and wind power spilling are acceptable, but at inflated operational costs. The evaluation process consists of calculating the optimal unit commitment and verifying the fulfilment of the considered constraints. For the calculation of the optimal unit commitment, an algorithm based on Benders decomposition, namely on dual dynamic programming, was developed. Two approaches were considered for the construction of stochastic solutions. Data related to wind power outputs from two different operational days are considered in the analysis. Stochastic and deterministic solutions are compared against the actual measured wind power output on the operational day. Through a technique capable of finding representative wind power scenarios and their probabilities, the expected final operational cost can be analysed in more detail.

Keywords: Benders’ decomposition, network constrained AC unit commitment, stochastic programming, wind power uncertainty.

264 Resistance and Sub-Resistances of RC Beams Subjected to Multiple Failure Modes

Authors: F. Sangiorgio, J. Silfwerbrand, G. Mancini

Abstract:

Geometric and mechanical properties all influence the resistance of RC structures and may, in certain combinations of property values, increase the risk of a brittle failure of the whole system. This paper presents a statistical and probabilistic investigation of the resistance of RC beams designed according to Eurocodes 2 and 8 and subjected to multiple failure modes, under both the natural variation of material properties and the uncertainty associated with cross-section and transverse reinforcement geometry. A full probabilistic model based on the JCSS Probabilistic Model Code is derived. Different beams are studied through material nonlinear analysis via Monte Carlo simulations. The resistance model is consistent with Eurocode 2. Both a multivariate statistical evaluation and a data clustering analysis of the outcomes are then performed. Results show that the ultimate load behaviour of RC beams subjected to flexural and shear failure modes seems to be mainly influenced by the combination of the mechanical properties of both the longitudinal reinforcement and the stirrups and the tensile strength of concrete, the latter of which appears to affect the overall response of the system in a nonlinear way. The model uncertainty of the resistance model used in the analysis undoubtedly plays an important role in interpreting the results.
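
A bare-bones sketch of the Monte Carlo idea only (not the paper's JCSS-based model or its nonlinear analyses): material and geometric variables are sampled and the flexural resistance of a singly reinforced section is evaluated with the standard rectangular stress block. All distributions and section data are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 200_000
fy = rng.normal(560e6, 30e6, N)              # steel yield strength [Pa]
fc = rng.lognormal(np.log(38e6), 0.15, N)    # concrete compressive strength [Pa]
b = rng.normal(0.30, 0.005, N)               # section width [m]
d = rng.normal(0.45, 0.010, N)               # effective depth [m]
As = 4 * np.pi * 0.016 ** 2 / 4              # four 16 mm bars, taken as deterministic [m^2]

a = As * fy / (0.85 * fc * b)                # rectangular stress-block depth
MR = As * fy * (d - a / 2.0)                 # flexural resistance [N m]

print(f"mean = {MR.mean() / 1e3:.1f} kNm, CoV = {MR.std() / MR.mean():.3f}, "
      f"5% fractile = {np.percentile(MR, 5) / 1e3:.1f} kNm")
```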

Keywords: Modelling, Monte Carlo Simulations, Probabilistic Models, Data Clustering, Reinforced Concrete Members, Structural Design.

263 Probabilistic Life Cycle Assessment of the Nano Membrane Toilet

Authors: A. Anastasopoulou, A. Kolios, T. Somorin, A. Sowale, Y. Jiang, B. Fidalgo, A. Parker, L. Williams, M. Collins, E. J. McAdam, S. Tyrrel

Abstract:

Developing countries are nowadays confronted with great challenges related to domestic sanitation services in view of the imminent water scarcity. Contemporary sanitation technologies established in these countries are likely to pose health risks unless waste management standards are followed properly. This paper provides a solution to sustainable sanitation with the development of an innovative toilet system, called Nano Membrane Toilet (NMT), which has been developed by Cranfield University and sponsored by the Bill & Melinda Gates Foundation. The particular technology converts human faeces into energy through gasification and provides treated wastewater from urine through membrane filtration. In order to evaluate the environmental profile of the NMT system, a deterministic life cycle assessment (LCA) has been conducted in SimaPro software employing the Ecoinvent v3.3 database. The particular study has determined the most contributory factors to the environmental footprint of the NMT system. However, as sensitivity analysis has identified certain critical operating parameters for the robustness of the LCA results, adopting a stochastic approach to the Life Cycle Inventory (LCI) will comprehensively capture the input data uncertainty and enhance the credibility of the LCA outcome. For that purpose, Monte Carlo simulations, in combination with an artificial neural network (ANN) model, have been conducted for the input parameters of raw material, produced electricity, NOX emissions, amount of ash and transportation of fertilizer. The given analysis has provided the distribution and the confidence intervals of the selected impact categories and, in turn, more credible conclusions are drawn on the respective LCIA (Life Cycle Impact Assessment) profile of NMT system. Last but not least, the specific study will also yield essential insights into the methodological framework that can be adopted in the environmental impact assessment of other complex engineering systems subject to a high level of input data uncertainty.

Keywords: Sanitation systems, nano membrane toilet, LCA, stochastic uncertainty analysis, Monte Carlo Simulations, artificial neural network.

262 Calibration of Syringe Pumps Using Interferometry and Optical Methods

Authors: E. Batista, R. Mendes, A. Furtado, M. C. Ferreira, I. Godinho, J. A. Sousa, M. Alvares, R. Martins

Abstract:

Syringe pumps are commonly used for drug delivery in hospitals and clinical environments. These instruments are critical in neonatology and oncology, where any variation in the flow rate or drug dosing quantity can lead to severe incidents and even the death of the patient. It is therefore very important to determine the accuracy and precision of these devices using suitable calibration methods. The Volume Laboratory of the Portuguese Institute for Quality (LVC/IPQ) uses two different methods to calibrate syringe pumps from 16 nL/min up to 20 mL/min. The interferometric method uses an interferometer to monitor the distance travelled by the pusher block of the syringe pump in order to determine the flow rate. Knowing the internal diameter of the syringe with very high precision, the travelled distance, and the time needed for that travel, it is possible to calculate the flow rate of the fluid inside the syringe and its uncertainty. As an alternative to the gravimetric and interferometric methods, a methodology based on optical technology was also developed to measure flow rates; this method relies mainly on measuring the increase in the volume of a drop over time. The objective of this work is to compare the results of the calibration of two syringe pumps using the different methodologies described above. The obtained results were consistent for the three methods used. The uncertainty values were very similar for all three methods, being higher for the optical drop method due to setup limitations.
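
A minimal sketch of the interferometric data reduction as described: the flow rate follows from the syringe's internal diameter and the pusher displacement per unit time, with first-order uncertainty propagation. The numerical values and uncertainty components below are placeholders, not LVC/IPQ data.

```python
import math

def flow_rate(d_int, dx, dt, u_d_rel, u_x_rel, u_t_rel):
    """d_int: syringe internal diameter [m]; dx: pusher travel [m]; dt: elapsed time [s];
    u_*_rel: relative standard uncertainties of diameter, travel and time."""
    A = math.pi * d_int ** 2 / 4.0
    Q = A * dx / dt                                           # flow rate [m^3/s]
    u_Q_rel = math.sqrt((2 * u_d_rel) ** 2 + u_x_rel ** 2 + u_t_rel ** 2)
    return Q, Q * u_Q_rel

Q, u_Q = flow_rate(d_int=4.61e-3, dx=1.0e-3, dt=60.0,
                   u_d_rel=5e-4, u_x_rel=1e-5, u_t_rel=1e-6)
print(f"Q = {Q * 6e10:.2f} uL/min +/- {u_Q * 6e10:.3f} uL/min (k = 1)")
```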

Keywords: Calibration, interferometry, syringe pump, optical method, uncertainty.

261 Production Planning for Animal Food Industry under Demand Uncertainty

Authors: Pirom Thangchitpianpol, Suttipong Jumroonrut

Abstract:

This research investigates the distribution of demand for animal food and the optimum production quantity that minimizes cost. The data consist of customer purchase orders for laying-hen food, the price of the food, the inventory cost per unit, and the costs incurred when the food is out of stock, such as fines, overtime, and urgent purchases of material. They were collected from January 1990 to December 2013 at a factory in Nakhonratchasima province. The collected data are analyzed to explore the distribution of the monthly food demand for laying hens and to determine the inventory rate per unit. The results are used in a stochastic linear programming model for aggregate planning, from which the optimum production, and hence the minimum cost, can be obtained. Algorithms implemented in MATLAB, using the linprog tool, are used to obtain the solution. The distribution of the food demand for laying hens and random numbers are used in the model. The study shows that the monthly food demand for laying hens follows a normal distribution and provides the monthly average production amounts (in units of 30 kg) from January to December. The average minimum total cost over the 12 months is Baht 62,329,181.77; the production plan can therefore reduce the cost by 14.64% compared with the actual cost.
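
A small sketch of the modelling idea (not the study's actual MATLAB model or data): a scenario-based stochastic LP chooses a production quantity that minimizes expected production, holding, and shortage costs, solved here with SciPy's linprog; the demand scenarios, probabilities, and costs are invented for illustration.

```python
import numpy as np
from scipy.optimize import linprog

demand = np.array([900.0, 1000.0, 1100.0, 1300.0])  # demand scenarios [units of 30 kg]
prob = np.array([0.2, 0.4, 0.3, 0.1])
c_prod, c_hold, c_short = 10.0, 1.0, 25.0
S = len(demand)

# Decision vector z = [x, over_1..S, under_1..S], all >= 0 (linprog's default bounds)
c = np.concatenate(([c_prod], prob * c_hold, prob * c_short))
A_eq = np.zeros((S, 1 + 2 * S))
A_eq[:, 0] = 1.0
A_eq[:, 1:1 + S] = -np.eye(S)    # x - over_s + under_s = d_s for every scenario s
A_eq[:, 1 + S:] = np.eye(S)

res = linprog(c, A_eq=A_eq, b_eq=demand, method="highs")
print("optimal production:", res.x[0], "expected cost:", res.fun)
```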

Keywords: Animal food, Stochastic linear programming, Production planning, Demand Uncertainty.

260 From Type-I to Type-II Fuzzy System Modeling for Diagnosis of Hepatitis

Authors: Shahabeddin Sotudian, M. H. Fazel Zarandi, I. B. Turksen

Abstract:

Hepatitis is one of the most common and dangerous diseases that affect humankind, exposing millions of people to serious health risks every year. Diagnosis of hepatitis has always been a challenge for physicians. This paper presents an effective method for the diagnosis of hepatitis based on interval Type-II fuzzy logic. The proposed system includes three steps: pre-processing (feature selection), Type-I and Type-II fuzzy classification, and system evaluation. KNN-FD feature selection is used as the pre-processing step in order to exclude irrelevant features and to improve classification performance and efficiency in generating the classification model. In the fuzzy classification step, an “indirect approach” is used for fuzzy system modeling by implementing the exponential compactness and separation index to determine the number of rules in the fuzzy clustering approach. We first proposed a Type-I fuzzy system that had an accuracy of approximately 90.9%. In the proposed system, the process of diagnosis faces vagueness and uncertainty in the final decision; thus, the imprecise knowledge was managed by using interval Type-II fuzzy logic. The results obtained show that interval Type-II fuzzy logic can diagnose hepatitis with an average accuracy of 93.94%, the highest classification accuracy reached thus far. This rate of accuracy demonstrates that the Type-II fuzzy system performs better than the Type-I system and indicates a higher capability of the Type-II fuzzy system for modeling uncertainty.

Keywords: Hepatitis disease, medical diagnosis, type-I fuzzy logic, type-II fuzzy logic, feature selection.

259 Longitudinal Shear Modulus of Single Aramid, Carbon and Glass Fibres by Torsion Pendulum Tests

Authors: I Prasanna Kumar, Satya Prakash Kushwaha, Preetamkumar Mohite, Sudhir Kamle

Abstract:

The longitudinal shear moduli of single aramid, carbon, and glass fibres are measured in the present study. The well-known concept of a freely oscillating torsion pendulum is used to characterize the torsional modulus. A simple freely oscillating torsion pendulum setup is designed with two different types of plastic discs, horizontal and vertical, as the known mass of the pendulum. The period of the torsional oscillation is measured to determine the torsional rigidity of the fibre, and the shear modulus of the fibre is then calculated from its torsional rigidity. The mean shear moduli of the aramid, carbon, and glass fibres are 6.22±0.09, 18.5±0.91, and 38.1±3.55 GPa as measured with the horizontal-disc pendulum and 6.19±0.13, 18.1±1.34, and 39.5±1.83 GPa with the vertical-disc pendulum, respectively. The results obtained with the two pendulums differ by less than 5% and agree well with the results reported in the literature for these three types of fibres. Detailed uncertainty calculations are carried out for the measurements. The scatter as well as the uncertainty (or error) in the measured shear modulus of these fibres is less than 10%. For the aramid fibres the effect of gauge length on the shear modulus is also studied, and it is verified that the scatter in the measured shear modulus increases with gauge length and with the scatter in fibre diameter.
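
A short sketch of the standard freely oscillating torsion-pendulum relation assumed to underlie the data reduction (the paper's exact expressions may differ): the torsional stiffness follows from the measured period and the disc inertia, and the shear modulus from the fibre's polar second moment of area. The numerical inputs are placeholders of roughly the right order of magnitude for a carbon fibre.

```python
import math

def shear_modulus(T, I, L, d):
    """T: oscillation period [s]; I: disc moment of inertia [kg m^2];
    L: fibre gauge length [m]; d: fibre diameter [m]."""
    k_t = 4.0 * math.pi ** 2 * I / T ** 2   # torsional stiffness from the period [N m/rad]
    J = math.pi * d ** 4 / 32.0             # polar second moment of area [m^4]
    return k_t * L / J                      # G = k_t L / J  [Pa]

# Placeholder inputs, not the paper's measurements
G = shear_modulus(T=20.0, I=2.0e-9, L=0.025, d=7.0e-6)
print(f"G = {G / 1e9:.1f} GPa")
```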

Keywords: Aramid, carbon and glass fibres, longitudinal shear modulus, torsion pendulum.

258 Estimation of the Road Traffic Emissions and Dispersion in the Developing Countries Conditions

Authors: Hicham Gourgue, Ahmed Aharoune, Ahmed Ihlal

Abstract:

We present in this work our model of road traffic emissions (line sources) and of the dispersion of these emissions, named DISPOLSPEM (Dispersion of Poly Sources and Pollutants Emission Model). In its emission part, the model was designed to keep the bottom-up and top-down approaches consistent. It also allows emission inventories to be generated from a reduced set of input parameters adapted to the existing conditions in Morocco and other developing countries. Although several simplifications are made, the performance of the model is preserved. A further important advantage of the model is that it allows the emission rate uncertainty to be calculated with respect to each of the input parameters. In the dispersion part of the model, an improved line source model has been developed, implemented, and tested against a reference solution. It improves the accuracy of previous line-source Gaussian plume formulas without being too demanding in terms of computational resources. In the case study presented here, the biggest errors were associated with the ends of line source sections; these errors will be cancelled by adjacent sections of line sources during the simulation of a road network. In cases where the wind is parallel to the source line, combining the discretized source with the analytical line source formulas markedly reduces the error. Because this combination is applied only for a small number of wind directions, it should not excessively increase the calculation time.
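
An illustrative sketch of the discretized line-source idea only (not DISPOLSPEM itself): a road segment is approximated by a row of point sources, each contributing a standard Gaussian-plume term with ground reflection; the dispersion coefficients, source strength, and geometry are crude assumptions.

```python
import numpy as np

def point_plume(q, y, z, u, H, sigma_y, sigma_z):
    """Gaussian-plume concentration from one point source of strength q, with ground reflection."""
    return (q / (2.0 * np.pi * sigma_y * sigma_z * u)
            * np.exp(-y ** 2 / (2.0 * sigma_y ** 2))
            * (np.exp(-(z - H) ** 2 / (2.0 * sigma_z ** 2))
               + np.exp(-(z + H) ** 2 / (2.0 * sigma_z ** 2))))

def line_source(Q_per_m, x0, y0, x1, y1, receptor, u=3.0, H=0.5, n=200):
    """Discretize the road segment (x0, y0)-(x1, y1) into n point sources; x is the wind direction."""
    xs, ys = np.linspace(x0, x1, n), np.linspace(y0, y1, n)
    seg = np.hypot(x1 - x0, y1 - y0) / n            # road length represented by each point
    rx, ry, rz = receptor
    total = 0.0
    for sx, sy in zip(xs, ys):
        dx, dy = rx - sx, ry - sy
        if dx <= 0:                                 # only sources upwind of the receptor contribute
            continue
        sigma_y, sigma_z = 0.08 * dx, 0.06 * dx     # crude neutral-class dispersion coefficients
        total += point_plume(Q_per_m * seg, dy, rz, u, H, sigma_y, sigma_z)
    return total

print(line_source(Q_per_m=0.02, x0=0, y0=-100, x1=0, y1=100, receptor=(50.0, 0.0, 1.5)))
```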

Keywords: Air pollution, dispersion, emissions, line sources, road traffic, urban transport.

257 Enhanced Efficacy of Kinetic Power Transform for High-Speed Wind Field

Authors: Nan-Chyuan Tsai, Chao-Wen Chiang, Bai-Lu Wang

Abstract:

The three-time-scale plant model of a wind power generator, including a wind turbine, a flexible vertical shaft, a Variable Inertia Flywheel (VIF) module, an Active Magnetic Bearing (AMB) unit, and the applied wind sequence, is constructed. To keep the wind power generator operational when the spindle speed exceeds its rated value, the VIF is included so that the spindle speed can be appropriately slowed down once a stronger wind field is exerted. To prevent potential damage due to collision of the shaft with conventional bearings, the AMB unit is proposed to regulate the shaft position deviation. By a singular perturbation order-reduction technique, a lower-order plant model can be established for the synthesis of the feedback controller. Two major system parameter uncertainties, an additive uncertainty and a multiplicative uncertainty, are constituted by the wind turbine and the VIF, respectively. A Frequency Shaping Sliding Mode Control (FSSMC) loop is proposed to account for these uncertainties and suppress the unmodeled higher-order plant dynamics. Finally, the efficacy of the FSSMC is verified by intensive computer and experimental simulations for regulation of the shaft position deviation and counter-balancing of unpredictable wind disturbance.

Keywords: Sliding Mode Control, Singular Perturbation, Variable Inertia Flywheel.

256 Optimizing and Evaluating Performance Quality Control of the Production Process of Disposable Essentials Using a Vague Goal Programming Approach

Authors: Hadi Gholizadeh, Ali Tajdin

Abstract:

To have effective production planning, it is necessary to control the quality of processes. This paper aims at improving the performance of the disposable essentials process using statistical quality control and goal programming in a vague environment, which expresses the uncertainty arising from the measurement error that is always present in the real world. In this study, the conditions are therefore examined in a vague, distance-based environment. The disposable essentials process in Kach Company was studied. Statistical control tools were used to characterize the existing process for four factor responses: the averages of the disposable glasses' weights, heights, crater diameters, and volumes. Goal programming was then used to find the optimal factor settings in the vague environment, which captures the uncertainty of the initial information when some of the model parameters are vague; a fuzzy regression model is also used to predict the responses of the four factors described. Optimization results show that the process capability index values for the glasses' average weights, heights, crater diameters, and volumes were improved, increasing the quality of the products and reducing waste, which will lower the cost of the finished product and ultimately bring customer satisfaction, and this satisfaction will mean increased sales.
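
A minimal sketch of the crisp process capability indices referred to above (the vague/fuzzy extension is not reproduced): Cp and Cpk computed from sample statistics and specification limits. The sample data and limits are illustrative, not the company's figures.

```python
import numpy as np

def capability(samples, lsl, usl):
    """Crisp Cp and Cpk from the sample mean and standard deviation."""
    mu, sigma = samples.mean(), samples.std(ddof=1)
    cp = (usl - lsl) / (6.0 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3.0 * sigma)
    return cp, cpk

weights = np.random.default_rng(4).normal(3.02, 0.04, size=200)  # simulated glass weights [g]
print("Cp = %.2f, Cpk = %.2f" % capability(weights, lsl=2.90, usl=3.15))
```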

Keywords: Goal programming, quality control, vague environment, disposable glasses’ optimization, fuzzy regression.

255 Impact of Changes of the Conceptual Framework for Financial Reporting on the Indicators of the Financial Statement

Authors: Nadezhda Kvatashidze

Abstract:

The International Accounting Standards Board has updated the conceptual framework for financial reporting. The main reason is to address the accounting tasks created by market development and by business transactions with new economic content. Investors also call for greater transparency of information and accountability for results in order to make more accurate risk assessments and forecasts. All of this makes it necessary to further develop the conceptual framework for financial reporting so that users receive useful information. Market development and certain shortcomings of the conceptual framework revealed in practice require its reconsideration and new solutions. Some issues and concepts, such as the disclosure and supply of information, its qualitative characteristics, assessment, and measurement uncertainty, had to be supplemented and refined. The recognition criteria for certain elements of reporting (assets and liabilities) also had to be updated, and all of this is set out in the updated edition of the conceptual framework for financial reporting, a comprehensive collection of concepts underlying the preparation of the financial statement. The main objective of the revision is to improve financial reporting and to develop a clear package of concepts. This will help the International Accounting Standards Board (IASB) establish a common approach and reflection for similar transactions on the basis of mutually accepted concepts. As a result, companies will be able to develop coherent accounting policies for transactions or events to which no standard applies or for which a standard allows a choice of accounting policy.

Keywords: Conceptual framework, measurement basis, measurement uncertainty, neutrality, prudence, stewardship.

254 Probability-Based Damage Detection of Structures Using Model Updating with Enhanced Ideal Gas Molecular Movement Algorithm

Authors: M. R. Ghasemi, R. Ghiasi, H. Varaee

Abstract:

Model updating methods have received increasing attention for damage detection in structures based on measured modal parameters. In this paper, a probability-based damage detection (PBDD) procedure built on a model updating procedure is presented, in which a one-stage model-based damage identification technique based on the dynamic features of a structure is investigated. The presented framework uses a finite element updating method with a Monte Carlo simulation that accounts for the uncertainty caused by measurement noise. Enhanced ideal gas molecular movement (EIGMM) is used as the main algorithm for model updating. Ideal gas molecular movement (IGMM) is a multi-agent algorithm inspired by the movement of ideal gas molecules, which disperse rapidly in different directions and cover the entire available space owing to their high speed and their collisions with each other and with the surrounding barriers. In the IGMM algorithm, to reach optimal solutions, an initial population of gas molecules is randomly generated and the governing equations for the molecular velocities and collisions are applied. In this paper, an enhanced version of IGMM, which removes unchanged variables after a specified number of iterations, is developed. The proposed method is implemented on two numerical examples in the field of structural damage detection. The results show that the proposed method performs well and is competitive in the PBDD of structures.

Keywords: Enhanced ideal gas molecular movement, ideal gas molecular movement, model updating method, probability-based damage detection, uncertainty quantification.

253 Long Term Examination of the Profitability Estimation Focused on Benefits

Authors: Stephan Printz, Kristina Lahl, René Vossen, Sabina Jeschke

Abstract:

Strategic investment decisions are characterized by high innovation potential and long-term effects on the competitiveness of enterprises. Due to the uncertainty and risks involved in this complex decision-making process, the need arises for well-structured support activities. A method that considers cost and long-term added value is cost-benefit effectiveness estimation. One such method is the "profitability estimation focused on benefits" (PEFB) method developed at the Institute of Management Cybernetics at RWTH Aachen University. The method copes with the challenges associated with strategic investment decisions by integrating long-term non-monetary aspects whilst also mapping the chronological sequence of an investment within the organization's target system. Thus, the method is characterized as a holistic approach to the evaluation of the costs and benefits of an investment. This participation-oriented method was applied to business environments in many workshops. The results of the workshops are a library of more than 96 cost aspects as well as 122 benefit aspects. These aspects are preprocessed and comparatively analyzed with regard to their alignment with a series of risk levels. For the first time, an accumulation and a distribution of cost and benefit aspects regarding their impact and probability of occurrence are given. The results give evidence that the PEFB method combines precise measures of financial accounting with the incorporation of benefits. Finally, the results constitute the basis for using information technology and data science for decision support when applying the PEFB method.

Keywords: Cost-benefit analysis, multi-criteria decision, profitability estimation focused on benefits, risk and uncertainty analysis.

252 Normalization and Constrained Optimization of Measures of Fuzzy Entropy

Authors: K.C. Deshmukh, P.G. Khot, Nikhil

Abstract:

In the information theory literature, there is a need to compare the different measures of fuzzy entropy, which consequently gives rise to the need for normalized measures of fuzzy entropy. In this paper, we have discussed this need and hence developed some normalized measures of fuzzy entropy. It is also desirable to maximize entropy and to minimize directed divergence or distance; keeping this idea in mind, we have explained the method of optimizing different measures of fuzzy entropy.
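
To make the idea of normalization concrete, the sketch below uses one widely used measure (the De Luca and Termini fuzzy entropy) divided by its maximum value n·ln 2; this illustrates the general principle rather than the new measures proposed in the paper.

```python
import numpy as np

def fuzzy_entropy(mu):
    """De Luca-Termini entropy of a fuzzy set given its membership grades."""
    mu = np.clip(np.asarray(mu, dtype=float), 1e-12, 1 - 1e-12)
    return -np.sum(mu * np.log(mu) + (1 - mu) * np.log(1 - mu))

def normalized_fuzzy_entropy(mu):
    """Divide by the maximum value n*ln(2), attained when every grade equals 0.5."""
    return fuzzy_entropy(mu) / (len(mu) * np.log(2.0))

print(normalized_fuzzy_entropy([0.5, 0.5, 0.5]))  # 1.0, maximal fuzziness
print(normalized_fuzzy_entropy([0.0, 1.0, 1.0]))  # ~0.0, essentially a crisp set
```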

Keywords: Fuzzy set, Uncertainty, Fuzzy entropy, Normalization, Membership function

251 Entropy Measures on Neutrosophic Soft Sets and Their Application in Multi-Attribute Decision Making

Authors: I. Arockiarani

Abstract:

The focus of the paper is to furnish entropy measures for a neutrosophic set and a neutrosophic soft set as measures of the uncertainty that permeates the discourse and the system. Various characterizations of the entropy measures are derived. Further, we exemplify the concept by applying entropy to various real-time decision-making problems.

Keywords: Entropy measure, Hausdorff distance, neutrosophic set, soft set.

250 Average Turbulent Pipe Flow with Heat Transfer Using a Three-Equation Model

Authors: Khalid Alammar

Abstract:

The aim of this study is to evaluate a new three-equation turbulence model applied to flow and heat transfer through a pipe. Uncertainty is approximated by comparison with published direct numerical simulation results for fully developed flow. The error in the mean axial velocity, temperature, friction, and heat transfer is found to be negligible.

Keywords: Heat Transfer, Nusselt number, Skin friction, Turbulence.

249 Mean Codeword Lengths and Their Correspondence with Entropy Measures

Authors: R.K.Tuli

Abstract:

The objective of the present communication is to develop new genuine exponentiated mean codeword lengths and to study in depth the problem of the correspondence between well-known measures of entropy and mean codeword lengths. With the help of some standard measures of entropy, we have illustrated such a correspondence. In the literature, we usually come across many inequalities that are frequently used in information theory; keeping this idea in mind, we have developed such inequalities via a coding theory approach.
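
As a concrete instance of the classical correspondence (not the paper's new exponentiated lengths), the sketch below computes Shannon code lengths l_i = ceil(-log2 p_i), checks the Kraft inequality, and verifies H(P) <= L < H(P) + 1 for the mean codeword length L; the probability distribution is arbitrary.

```python
import math

def shannon_code_lengths(p):
    """Shannon code lengths l_i = ceil(-log2 p_i) for a probability distribution p."""
    return [math.ceil(-math.log2(pi)) for pi in p]

p = [0.4, 0.3, 0.2, 0.1]
lengths = shannon_code_lengths(p)
H = -sum(pi * math.log2(pi) for pi in p)            # Shannon entropy [bits]
L = sum(pi * li for pi, li in zip(p, lengths))      # mean codeword length [bits]
kraft = sum(2.0 ** -li for li in lengths)           # must be <= 1 for a uniquely decipherable code
print(f"H = {H:.3f} bits, L = {L:.3f} bits, Kraft sum = {kraft:.3f}")  # H <= L < H + 1
```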

Keywords: Codeword, code alphabet, uniquely decipherable code, mean codeword length, uncertainty, noiseless channel.

248 Automated Ranking of Hints

Authors: Sylvia Encheva

Abstract:

The importance of hints in an intelligent tutoring system is well understood; the problems related to delivering them, however, are numerous. In this paper we propose that the delivery of hints be based on their usefulness. By this we mean that a hint is regarded as useful to a student if the student succeeded in solving a problem after the hint was suggested to her or him. Methods from the theory of partial orderings are then applied, facilitating an automated process of offering individualized advice on how to proceed in order to solve a particular problem.

Keywords: Decision support services, uncertainty management, partial orderings.
