Search results for: stochastic uncertainty analysis
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 27504

27384 Evaluation of Reliability Flood Control System Based on Uncertainty of Flood Discharge, Case Study Wulan River, Central Java, Indonesia

Authors: Anik Sarminingsih, Krishna V. Pradana

Abstract:

The failure of a flood control system can be caused by various factors, one of which is neglecting the uncertainty of the design flood, so that the capacity of the flood control system is exceeded. Uncertainty is recognized as a serious issue in hydrological studies and is influenced by many factors, from the reading of water-level and rainfall data to the selection of the method of analysis. In hydrological modelling, the choice of models and parameters appropriate to the watershed conditions should be evaluated with a hydraulic model of the river as the drainage channel. The capacity of the river cross-section is the first line of defence, and its reliability describes the potential magnitude of flood risk. The case study in this research is the Wulan River in Central Java, which floods almost every year despite control measures such as levees, a floodway and a diversion. The flood-affected areas cover several sub-districts, mainly in Kabupaten Kudus and Kabupaten Demak. The first step is a frequency analysis of discharge observations at the Klambu weir, for which a time series is available from 1951 to 2013. The frequency analysis is performed with several distribution models: Gumbel, Normal, Log-Normal, Pearson Type III and Log-Pearson Type III. The confidence bands of these models, based on their standard deviations, overlap, so the maximum flood discharge for a lower return period may exceed the average discharge for a larger return period. The next step is a hydraulic analysis to evaluate the reliability of the river capacity against the flood discharges obtained from the several methods. The design flood discharge of the flood control system is selected as the result of the method closest to the bankfull capacity of the river.
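
The distribution-fitting step can be sketched in a few lines. The snippet below is a minimal illustration, not the authors' code: the discharge record is synthetic (the real study uses the 1951-2013 Klambu weir series), and it fits the distribution families named above with SciPy, then compares the 100-year design flood each one implies.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic annual-maximum discharges (m^3/s); a stand-in for the
# 63-year (1951-2013) Klambu weir record.
q = rng.gumbel(loc=800.0, scale=200.0, size=63)

candidates = {
    "Gumbel": stats.gumbel_r,
    "Normal": stats.norm,
    "Log-Normal": stats.lognorm,
    "Pearson III": stats.pearson3,
    "Log-Pearson III": None,  # fit Pearson III to log-discharge
}

T = 100.0                     # return period (years)
p = 1.0 - 1.0 / T             # non-exceedance probability

for name, dist in candidates.items():
    if name == "Log-Pearson III":
        params = stats.pearson3.fit(np.log(q))
        q_T = np.exp(stats.pearson3.ppf(p, *params))
    else:
        params = dist.fit(q)
        q_T = dist.ppf(p, *params)
    print(f"{name:16s} Q_100 = {q_T:8.1f} m^3/s")
```

The spread between the five Q_100 values is exactly the design-flood uncertainty the abstract describes.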

Keywords: design flood, hydrological model, reliability, uncertainty, Wulan river

Procedia PDF Downloads 258
27383 A Robust Optimization for Multi-Period Lost-Sales Inventory Control Problem

Authors: Shunichi Ohmori, Sirawadee Arunyanart, Kazuho Yoshimoto

Abstract:

We consider a periodic-review inventory control problem of minimizing production cost, inventory cost and lost sales under demand uncertainty, in which product demands are not specified exactly but are only known to belong to a given uncertainty set, while the constraints must hold for all possible values of the data from that set. We propose a robust optimization formulation that obtains the lowest possible cost while guaranteeing feasibility with respect to the range of order quantities and inventory levels under demand uncertainty. Our formulation is based on the adaptive robust counterpart, which supposes that the order quantity is an affine function of past demands. We derive the certainty-equivalent problem via second-order cone programming, which gives a 'not too pessimistic' worst case.
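
As a toy illustration of the box-uncertainty idea (not the paper's adaptive second-order cone formulation), the sketch below evaluates a fixed order plan against every demand profile on a discretised grid of the box, with hypothetical costs and demands, to obtain a brute-force worst-case cost:

```python
import itertools
import numpy as np

# Hypothetical 3-period instance: nominal demands and a box half-width.
d_nom, dev = [100, 120, 90], 20
orders = [110, 115, 100]           # candidate order plan (fixed here)
h, p = 1.0, 4.0                    # unit holding and lost-sales penalties

def cost(orders, demand):
    """Holding plus lost-sales cost of an order plan for one demand path."""
    inv, total = 0.0, 0.0
    for q, d in zip(orders, demand):
        inv += q - d
        if inv >= 0:
            total += h * inv
        else:
            total += p * (-inv)    # unmet demand is lost,
            inv = 0.0              # not backlogged
    return total

# Brute-force the worst case over a discretised box [d_nom - dev, d_nom + dev];
# a stand-in for the robust counterpart, tractable only for tiny horizons.
grids = [np.linspace(n - dev, n + dev, 5) for n in d_nom]
worst = max(cost(orders, d) for d in itertools.product(*grids))
print(f"nominal cost: {cost(orders, d_nom):.1f}, worst-case cost: {worst:.1f}")
```

The robust formulation in the abstract avoids this exponential enumeration by optimising the worst case directly.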

Keywords: robust optimization, inventory control, supply chain management, second-order cone programming

Procedia PDF Downloads 375
27382 The Effect of Perceived Environmental Uncertainty on Corporate Entrepreneurship Performance: A Field Study in a Large Industrial Zone in Turkey

Authors: Adem Öğüt, M. Tahir Demirsel

Abstract:

Rapid changes and developments today, besides the opportunities and facilities they offer to organizations, may also be a source of danger and difficulty due to uncertainty. In order to take advantage of opportunities and to take the necessary measures against possible uncertainties, organizations must continuously follow the changes and developments occurring in the business environment and develop flexible structures and strategies for alternative scenarios. Perceived environmental uncertainty is an outcome of managers' perceptions of the combined complexity, instability and unpredictability of the organizational environment. An environment that is perceived to be complex, rapidly changing, and difficult to predict creates high levels of uncertainty about the appropriate organizational responses to external circumstances. In an uncertain and complex environment, organizations facing cutthroat competition may succeed by developing their corporate entrepreneurial ability. Corporate entrepreneurship is a process that includes many elements, such as innovation, new business creation, renewal, risk-taking and proactiveness. Successful corporate entrepreneurship is a critical factor that contributes significantly to gaining a sustainable competitive advantage, renewing the organization and adapting to the environment. In this context, the objective of this study is to investigate the effect of managers' perceived environmental uncertainty on corporate entrepreneurship performance. The research was conducted on 222 business executives in one of the major industrial zones of Turkey, the Konya Organized Industrial Zone (KOS). According to the results, there is a positive, statistically significant relationship between perceived environmental uncertainty and corporate entrepreneurial activities.

Keywords: corporate entrepreneurship, entrepreneurship, industrial zone, perceived environmental uncertainty, uncertainty

Procedia PDF Downloads 280
27381 Global Direct Search Optimization of a Tuned Liquid Column Damper Subject to Stochastic Load

Authors: Mansour H. Alkmim, Adriano T. Fabro, Marcus V. G. De Morais

Abstract:

In this paper, a global direct search optimization algorithm for reducing the vibration of a tuned liquid column damper (TLCD), a class of passive structural control device, is presented. The objective is to find optimized parameters for the TLCD under stochastic loads derived from different wind power spectral densities. The approach is verified against the analytical solution of an undamped primary system under white-noise excitation. Finally, a numerical example considering a simplified wind turbine model is given to illustrate the efficacy of the TLCD. Results from the random vibration analysis are shown for four types of random-excitation wind models; the response PSDs obtained show good vibration attenuation.
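
A minimal generalized pattern search can be sketched as follows. The objective here is a hypothetical quadratic surrogate of the TLCD tuning problem, not the authors' wind-excited model:

```python
import numpy as np

def pattern_search(f, x0, step=0.5, tol=1e-6, max_iter=1000):
    """Minimal generalized pattern search: poll along +/- coordinate
    directions; move on success, halve the mesh on failure."""
    x, fx = np.asarray(x0, float), f(x0)
    directions = np.vstack([np.eye(len(x0)), -np.eye(len(x0))])
    it = 0
    while step > tol and it < max_iter:
        it += 1
        for v in directions:
            trial = x + step * v
            ft = f(trial)
            if ft < fx:                 # successful poll: accept, keep mesh
                x, fx = trial, ft
                break
        else:
            step *= 0.5                 # unsuccessful poll: refine mesh
    return x, fx

# Toy surrogate of the TLCD tuning problem: a response surface with its
# optimum at frequency ratio 1.0 and damping ratio 0.1 (hypothetical values).
f = lambda z: (z[0] - 1.0) ** 2 + 10.0 * (z[1] - 0.1) ** 2
x_opt, f_opt = pattern_search(f, [0.5, 0.5])
print(x_opt, f_opt)
```

In the paper the objective evaluation would instead be a random vibration analysis of the damped structure.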

Keywords: generalized pattern search, parameter optimization, random vibration analysis, vibration suppression

Procedia PDF Downloads 238
27380 Evolution of Performance Measurement Methods in Conditions of Uncertainty: The Implementation of Fuzzy Sets in Performance Measurement

Authors: E. A. Tkachenko, E. M. Rogova, V. V. Klimov

Abstract:

One of the basic issues of development management is performance measurement as a prerequisite for identifying the achievement of development objectives. The aim of our research is to develop an improved model for assessing a company's development results. The model should take into account the cyclical nature of development and the high degree of uncertainty involved in numerous management tasks. Our hypotheses may be formulated as follows. Hypothesis 1: the cycle of a company's development may be studied from the standpoint of a project cycle, using the methods and tools of project analysis. Hypothesis 2: the uncertainty involved in justifying managerial decisions within the framework of a company's development cycle can be addressed through the mathematical apparatus of fuzzy logic. A reasoned justification of the validity of these hypotheses is given in the article. The fuzzy logic toolkit is applied to the case of a technology shift within an enterprise. It is shown that some restrictions of conventional performance measurement methods can be eliminated by implementing the fuzzy logic apparatus in performance measurement models.
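
The fuzzy-set idea can be illustrated with a toy example: a KPI score is mapped to linguistic terms through triangular membership functions and then defuzzified. The terms, peaks and KPI value are hypothetical:

```python
def tri(x, a, b, c):
    """Triangular membership function with support (a, c) and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical linguistic terms for a performance KPI measured on [0, 100].
terms = {
    "poor":      lambda x: tri(x, -1, 0, 50),
    "adequate":  lambda x: tri(x, 25, 50, 75),
    "excellent": lambda x: tri(x, 50, 100, 101),
}

kpi = 62.0
memberships = {name: mu(kpi) for name, mu in terms.items()}
print(memberships)

# Defuzzify with a weighted average of the term peaks (centre-of-peaks).
peaks = {"poor": 0.0, "adequate": 50.0, "excellent": 100.0}
score = sum(memberships[t] * peaks[t] for t in terms) / sum(memberships.values())
print(round(score, 2))
```

The partial memberships (here the KPI is simultaneously somewhat "adequate" and somewhat "excellent") are exactly how fuzzy sets absorb measurement uncertainty that a crisp threshold would hide.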

Keywords: fuzzy logic, fuzzy sets, performance measurement, project analysis

Procedia PDF Downloads 337
27379 Valuing Social Sustainability in Agriculture: An Approach Based on Social Outputs’ Shadow Prices

Authors: Amer Ait Sidhoum

Abstract:

Interest in sustainability has gained ground among practitioners, academics and policy-makers due to growing stakeholder awareness of environmental and social concerns. This is particularly true for agriculture. However, relatively little research has been conducted on the quantification of social sustainability and the contribution of social issues to agricultural production efficiency. The main objective of this research is to propose a method for evaluating the prices of social outputs, more precisely their shadow prices, while allowing for the stochastic nature of agricultural production, that is, for production uncertainty. The assessment of social outputs' shadow prices is conducted within the methodological framework of nonparametric Data Envelopment Analysis (DEA). An output-oriented directional distance function (DDF) is implemented to represent the technology of a sample of Catalan arable crop farms and derive their efficiency scores. The overall production technology of our sample is assumed to be the intersection of two sub-technologies: the first models the production of random desirable agricultural outputs, while the second reflects the social outcomes of agricultural activities. Once a nonparametric production technology has been represented, the DDF primal approach can be used for efficiency measurement, while shadow prices are drawn from the dual representation of the DDF. Computing shadow prices is a way to assign an economic value to non-marketed social outcomes. Our research uses cross-sectional, farm-level data collected in 2015 from a sample of 180 Catalan arable crop farms specialized in the production of cereals, oilseeds and protein (COP) crops. Our results suggest that our sample farms show high performance scores, from 85% for the bad state of nature to 88% for the normal and ideal crop-growing conditions.
This suggests that farm performance increases as crop-growing conditions improve. Results also show that the average shadow prices of the desirable state-contingent output and of social outcomes are positive for both efficient and inefficient farms, suggesting that the production of marketable outputs and of non-marketable outputs makes a positive contribution to farm production efficiency. Results also indicate that social outputs' shadow prices are contingent upon the growing conditions, following an upward trend as crop-growing conditions improve. This finding suggests that efficient farms prefer to allocate more resources to the production of desirable outputs than to social outcomes. To our knowledge, this study represents the first attempt to compute shadow prices of social outcomes while accounting for the stochastic nature of the production technology. Our findings suggest that the decision-making processes of efficient farms in dealing with social issues are stochastic and strongly dependent on the growing conditions. This implies that policy-makers should adjust their instruments according to the stochastic environmental conditions. An optimal redistribution of rural development support, increasing the public payment as crop-growth conditions improve, would likely enhance the effectiveness of public policies.
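
The output-oriented DDF score can be computed as a small linear program. The sketch below uses a hypothetical 5-farm sample under constant returns to scale and omits the state-contingent and social sub-technologies of the paper:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical sample: 5 farms, 2 inputs (land, labour), 1 desirable output.
X = np.array([[10, 5], [8, 6], [12, 4], [9, 9], [11, 7]], float)
Y = np.array([[20.0], [18.0], [25.0], [15.0], [22.0]])

def ddf_score(k):
    """Output-oriented directional distance for farm k (direction g = y_k),
    under constant returns to scale. beta = 0 means the farm is efficient."""
    n = len(X)
    c = np.r_[-1.0, np.zeros(n)]                    # maximise beta
    A_in = np.c_[np.zeros((X.shape[1], 1)), X.T]    # sum_j l_j x_j <= x_k
    A_out = np.c_[Y[k][:, None], -Y.T]              # beta*y_k - sum_j l_j y_j <= -y_k
    res = linprog(c, A_ub=np.r_[A_in, A_out], b_ub=np.r_[X[k], -Y[k]],
                  bounds=[(0, None)] * (n + 1))
    return res.x[0]   # beta = feasible proportional output expansion

scores = [ddf_score(k) for k in range(len(X))]
for k, beta in enumerate(scores):
    print(f"farm {k}: beta = {beta:.3f}")
```

In the paper, the shadow prices of the social outcomes are then read off the dual variables of constraints like these.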

Keywords: data envelopment analysis, shadow prices, social sustainability, sustainable farming

Procedia PDF Downloads 92
27378 Understanding the Influence of Fibre Meander on the Tensile Properties of Advanced Composite Laminates

Authors: Gaoyang Meng, Philip Harrison

Abstract:

When manufacturing composite laminates, the fibre directions within the laminate are never perfectly straight and inevitably contain some degree of stochastic in-plane waviness or 'meandering'. In this work we aim to understand the relationship between the degree of meandering of the fibre paths and the resulting uncertainty in the laminate's final mechanical properties. To do this, a numerical tool is developed to automatically generate meandering fibre paths in each of the laminate's 8 plies (using Matlab); after mapping this information into finite element simulations (using Abaqus), the statistical variability of the tensile mechanical properties of a [45°/90°/-45°/0°]s carbon/epoxy (IM7/8552) laminate is predicted. The stiffness, first-ply failure strength and ultimate failure strength are obtained. Results are generated by specifying the degree of variability in the fibre paths, and the laminate is then examined in all directions (from 0° to 359° in increments of 1°). The resulting predictions are output as flower (polar) plots for convenient analysis. The average fibre orientation of each ply is determined by the laminate layup code [45°/90°/-45°/0°]s, but in each case the plies contain increasingly large amounts of in-plane waviness (quantified by the standard deviation of the fibre direction in each ply across the laminate). Four amounts of variability in the fibre direction are tested (2°, 4°, 6° and 8°). Results show that both the average tensile stiffness and the average tensile strength decrease, while their standard deviations increase, with an increasing degree of fibre meander. The variability in stiffness is found to be relatively insensitive to the rotation angle, but the variability in strength is sensitive. Specifically, the uncertainty in laminate strength is relatively low at orientations centred around multiples of 45° of rotation, and relatively high between these rotation angles.
To concisely represent all the information contained in the various polar plots, rotation-angle-dependent Weibull distribution equations are fitted to the data. The resulting equations can be used to quickly estimate the size of the error bars for the different mechanical properties resulting from the amount of fibre-direction variability contained within the laminate. A longer-term goal is to use these equations to quickly introduce realistic variability at the component level.
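
Fitting a Weibull distribution to simulated strengths, as described above, might look like the following sketch; the strength sample here is synthetic, not the paper's FE results:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Synthetic ultimate-strength samples (MPa) at one rotation angle; a stand-in
# for the FE-predicted laminate strengths (shape 15, scale 800, hypothetical).
strength = rng.weibull(15.0, size=200) * 800.0

# Fit a two-parameter Weibull (location fixed at zero).
shape, loc, scale = stats.weibull_min.fit(strength, floc=0.0)
print(f"shape = {shape:.1f}, scale = {scale:.1f} MPa")

# The fitted distribution gives quick error bars, e.g. a 95% interval:
lo, hi = stats.weibull_min.ppf([0.025, 0.975], shape, loc, scale)
print(f"95% of strengths in [{lo:.0f}, {hi:.0f}] MPa")
```

Repeating the fit at each rotation angle yields the angle-dependent Weibull parameters the abstract refers to.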

Keywords: advanced composite laminates, FE simulation, in-plane waviness, tensile properties, uncertainty quantification

Procedia PDF Downloads 51
27377 Application of Harris Hawks Optimization Metaheuristic Algorithm and Random Forest Machine Learning Method for Long-Term Production Scheduling Problem under Uncertainty in Open-Pit Mines

Authors: Kamyar Tolouei, Ehsan Moosavi

Abstract:

In open-pit mines, the long-term production scheduling optimization problem (LTPSOP) is a complicated problem involving many constraints, large datasets, and uncertainties. Uncertainty in the output is caused by several geological, economic, or technical factors. Due to its dimensions and NP-hard nature, it is usually difficult to find an ideal solution to the LTPSOP. The optimal schedule generally constrains the ore, metal, and waste tonnages, average grades, and cash flows of each period. Past decades have witnessed important developments in long-term production scheduling and optimization algorithms as researchers have become highly cognizant of the issue; even so, the LTPSOP cannot be considered a well-solved problem. Traditional production scheduling methods in open-pit mines apply a single estimated orebody model to produce optimal schedules. The smoothing effect of some geostatistical estimation procedures causes many mine schedules and production predictions to be unrealistic and imperfect. With the expansion of simulation procedures, the risks from grade uncertainty in ore reserves can be evaluated and organized through a set of equally probable orebody realizations. In this paper, to synthesize grade uncertainty into the strategic mine schedule, a stochastic integer programming framework is presented for the LTPSOP. The objective of the model is to maximize the net present value and simultaneously minimize the risk of deviation from the production targets under grade uncertainty, while satisfying all technical constraints and operational requirements. Instead of applying one estimated orebody model as input to optimize the production schedule, a set of equally probable orebody realizations is applied to synthesize grade uncertainty into the strategic mine schedule and to produce a more profitable and risk-aware production schedule.
A mixture of metaheuristic procedures and mathematical methods paves the way to an appropriate solution. This paper introduces a hybrid model combining the augmented Lagrangian relaxation (ALR) method and a metaheuristic algorithm, Harris Hawks optimization (HHO), to solve the LTPSOP under grade uncertainty. In this study, the HHO is employed to update the Lagrange coefficients. In addition, a machine learning method, random forest, is applied to estimate the gold grade in the mineral deposit, and Monte Carlo simulation is used with 20 realizations. The results show that the proposed versions improve considerably on the traditional methods. The outcomes were also compared with an ALR-genetic algorithm and an ALR-subgradient method. To demonstrate the applicability of the model, a case study on an open-pit gold mining operation is implemented. The framework shows the capability to minimize risk and to improve the expected net present value and financial profitability for the LTPSOP, and it controls geological risk more effectively than the traditional procedure by considering grade uncertainty within the hybrid model framework.
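
The risk-based evaluation over equally probable orebody realizations can be sketched as a small Monte Carlo NPV calculation. All figures below (tonnages, grades, prices, the 20 realizations) are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical 5-period schedule: tonnes mined per period and a discount rate.
tonnes = np.array([2.0e6, 2.0e6, 1.8e6, 1.5e6, 1.2e6])
price, cost_per_t, rate = 55.0, 30.0, 0.10       # $/g gold, $/t mined, discount

# 20 equally probable grade realizations (g/t) per period, standing in for
# geostatistical simulations: lognormal noise around a declining trend.
trend = np.array([1.8, 1.6, 1.5, 1.3, 1.2])
realizations = trend * rng.lognormal(0.0, 0.15, size=(20, 5))

disc = 1.0 / (1.0 + rate) ** np.arange(1, 6)
# NPV of each realization: revenue from contained metal minus mining cost.
npv = ((realizations * tonnes * price - tonnes * cost_per_t) * disc).sum(axis=1)

print(f"expected NPV: {npv.mean()/1e6:.1f} M$")
print(f"NPV std dev (grade risk): {npv.std()/1e6:.1f} M$")
# A deviation-risk measure of the kind used in stochastic scheduling:
target = npv.mean() * 0.9
print(f"P(NPV < 90% of mean) = {(npv < target).mean():.2f}")
```

The stochastic integer program in the abstract optimises the schedule itself against all realizations at once, rather than evaluating a fixed schedule as done here.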

Keywords: grade uncertainty, metaheuristic algorithms, open-pit mine, production scheduling optimization

Procedia PDF Downloads 63
27376 Scheduling Jobs with Stochastic Processing Times or Due Dates on a Server to Minimize the Number of Tardy Jobs

Authors: H. M. Soroush

Abstract:

The problem of scheduling products and services for on-time delivery is of paramount importance in today's competitive environments. It arises in many manufacturing and service organizations where it is desirable to complete jobs (products or services) with different weights (penalties) on or before their due dates. In such environments, schedulers must frequently decide whether to schedule a job based on its processing time, due date, and the penalty for tardy delivery in order to improve system performance. For example, it is common to measure the weighted number of late jobs or the percentage of on-time shipments to evaluate the performance of a semiconductor production facility or an automobile assembly line. In this paper, we address the problem of scheduling a set of jobs on a single server where the processing times or due dates of jobs are random variables and fixed weights (penalties) are imposed on late deliveries. The goal is to find the schedule that minimizes the expected weighted number of tardy jobs. The problem is NP-hard; however, we explore three scenarios wherein: (i) both processing times and due dates are stochastic; (ii) processing times are stochastic and due dates are deterministic; and (iii) processing times are deterministic and due dates are stochastic. We prove that special cases of these scenarios are solvable optimally in polynomial time and introduce efficient heuristic methods for the general cases. Our computational results show that the heuristics perform well, yielding either optimal or near-optimal sequences. The results also demonstrate that the stochasticity of processing times or due dates can affect scheduling decisions. Moreover, the proposed problem is general in the sense that its special cases reduce to some new and some classical stochastic single-machine models.
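
For scenario (ii) (stochastic processing times, deterministic due dates), the expected weighted number of tardy jobs under a given sequence has a simple closed form if processing times are independent normals. A sketch with hypothetical jobs, comparing an earliest-due-date heuristic with its reverse:

```python
import math

# Hypothetical jobs: (mean proc. time, proc.-time variance, due date, weight).
jobs = [(4.0, 1.0, 5.0, 3.0), (2.0, 0.5, 6.0, 1.0),
        (3.0, 2.0, 9.0, 2.0), (5.0, 1.0, 16.0, 4.0)]

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def expected_weighted_tardy(seq):
    """E[weighted number of tardy jobs] for a sequence, assuming independent
    normal processing times and deterministic due dates (scenario (ii))."""
    mu = var = total = 0.0
    for i in seq:
        m, v, d, w = jobs[i]
        mu, var = mu + m, var + v        # completion time C ~ N(mu, var)
        total += w * (1.0 - norm_cdf((d - mu) / math.sqrt(var)))
    return total

edd = sorted(range(len(jobs)), key=lambda i: jobs[i][2])   # earliest due date
print(f"EDD order {edd}: {expected_weighted_tardy(edd):.3f}")
print(f"Reversed   : {expected_weighted_tardy(edd[::-1]):.3f}")
```

An evaluation like this is the building block a heuristic would call when comparing candidate sequences.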

Keywords: number of late jobs, scheduling, single server, stochastic

Procedia PDF Downloads 453
27375 Low Cost Inertial Sensors Modeling Using Allan Variance

Authors: A. A. Hussen, I. N. Jleta

Abstract:

Micro-electromechanical system (MEMS) accelerometers and gyroscopes are suitable for the inertial navigation systems (INS) of many applications due to their low price, small dimensions and light weight. Their main disadvantage in comparison with classical sensors is worse long-term stability. The estimation accuracy is mostly affected by the time-dependent growth of inertial sensor errors, especially the stochastic errors. In order to eliminate the negative effect of these random errors, they must be accurately modeled, and a successful implementation depends on how well the noise statistics of the inertial sensors are characterized. In this paper, the Allan variance technique is used to model the stochastic errors of the inertial sensors. By performing a simple operation on the entire length of data, a characteristic curve is obtained whose inspection provides a systematic characterization of the various random errors contained in the inertial-sensor output data.
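
The 'simple operation on the entire length of data' is cluster averaging. A minimal non-overlapping Allan variance, checked on synthetic white noise (for which the Allan deviation should fall as 1/sqrt(tau)):

```python
import numpy as np

def allan_variance(y, fs, taus):
    """Non-overlapping Allan variance of a rate signal y sampled at fs Hz."""
    avar = []
    for tau in taus:
        m = int(round(tau * fs))                     # samples per cluster
        n = len(y) // m
        means = y[: n * m].reshape(n, m).mean(axis=1)
        avar.append(0.5 * np.mean(np.diff(means) ** 2))
    return np.array(avar)

rng = np.random.default_rng(3)
fs = 100.0
# Synthetic gyro output: white noise only (angle random walk), so the Allan
# deviation should follow a -1/2 slope on a log-log plot.
y = rng.normal(0.0, 0.1, size=200_000)
taus = np.array([0.1, 1.0, 10.0])
adev = np.sqrt(allan_variance(y, fs, taus))
print(adev)
```

On real sensor data, the slopes of different segments of this curve identify quantization noise, angle/velocity random walk, bias instability, and rate random walk in turn.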

Keywords: Allan variance, accelerometer, gyroscope, stochastic errors

Procedia PDF Downloads 396
27374 Elementary Education Outcome Efficiency in Indian States

Authors: Jyotsna Rosario, K. R. Shanmugam

Abstract:

Since elementary education is a merit good, considerable public resources are allocated to universalising it. However, elementary education outcomes vary across Indian states. Evidence indicates that while some states lag in elementary education outcomes primarily due to a lack of resources and poor schooling infrastructure, others lag despite resource abundance and well-developed schooling infrastructure. Addressing the issue of efficiency, the study employs stochastic frontier analysis on panel data for 27 Indian states from 2012-13 to 2017-18 to estimate the technical efficiency of state governments in generating enrolment. The mean efficiency of states was estimated at 58%. Punjab, Meghalaya, and West Bengal were found to be the most efficient states, whereas Jammu and Kashmir, Nagaland, Madhya Pradesh, and Odisha are among the most inefficient. This study emphasizes the efficient utilisation of public resources and helps identify best practices.
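
A normal/half-normal stochastic frontier can be estimated by maximum likelihood in a few lines. The sketch below uses simulated data, not the Indian state panel, and a pooled cross-section rather than a panel estimator:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(21)

# Simulated units: outcome = b0 + b1*x - inefficiency + noise.
n = 500
x = rng.uniform(0.0, 2.0, n)
u = np.abs(rng.normal(0.0, 0.3, n))    # one-sided inefficiency (half-normal)
v = rng.normal(0.0, 0.2, n)            # symmetric statistical noise
y = 1.0 + 0.5 * x - u + v

def negll(theta):
    """Negative log-likelihood of the normal/half-normal production frontier."""
    b0, b1, ls_u, ls_v = theta
    s_u, s_v = np.exp(ls_u), np.exp(ls_v)    # log-parameterised for positivity
    sigma = np.hypot(s_u, s_v)
    lam = s_u / s_v
    eps = y - b0 - b1 * x
    ll = (np.log(2.0 / sigma) + norm.logpdf(eps / sigma)
          + norm.logcdf(-eps * lam / sigma))
    return -ll.sum()

res = minimize(negll, x0=[float(y.mean()), 0.0, np.log(0.2), np.log(0.2)])
b0, b1, ls_u, ls_v = res.x
print(f"frontier: {b0:.2f} + {b1:.2f}*x, "
      f"sigma_u = {np.exp(ls_u):.2f}, sigma_v = {np.exp(ls_v):.2f}")
```

Unit-level efficiency scores would then follow from the conditional distribution of u given the composed residual (the Jondrow et al. decomposition).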

Keywords: technical efficiency, public expenditure, elementary education outcome, stochastic frontier analysis

Procedia PDF Downloads 139
27373 Multi-Criteria Based Robust Markowitz Model under Box Uncertainty

Authors: Pulak Swain, A. K. Ojha

Abstract:

Portfolio optimization deals with the problems of efficient asset allocation. Risk and expected return are two conflicting criteria in such problems: the investor prefers the return to be high and the risk to be low. Such problems can be solved with a multi-objective approach. However, the information we have about the input parameters is generally ambiguous, and the input values can fluctuate around some nominal values. We cannot ignore this uncertainty, as it can affect the asset allocation drastically. We therefore take a robust optimization approach to problems whose input parameters come under box uncertainty. In this paper, we solve the multi-criteria robust problem with the help of the ε-constraint method.
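
The ε-constraint scalarisation can be sketched directly: minimise portfolio variance subject to a return floor ε, then sweep ε to trace the efficient frontier. The returns and covariances below are hypothetical, and the robust (box-uncertainty) layer of the paper is omitted:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical 3-asset problem: expected returns and covariance matrix.
mu = np.array([0.08, 0.12, 0.15])
cov = np.array([[0.04, 0.01, 0.00],
                [0.01, 0.09, 0.02],
                [0.00, 0.02, 0.16]])

def min_risk_portfolio(eps):
    """epsilon-constraint scalarisation: minimise variance subject to an
    expected-return floor eps (long-only, fully invested)."""
    cons = [{"type": "eq", "fun": lambda w: w.sum() - 1.0},
            {"type": "ineq", "fun": lambda w: w @ mu - eps}]
    res = minimize(lambda w: w @ cov @ w, x0=np.ones(3) / 3,
                   bounds=[(0, 1)] * 3, constraints=cons)
    return res.x, res.fun

# Trace a piece of the efficient frontier by sweeping epsilon.
for eps in (0.09, 0.11, 0.13):
    w, var = min_risk_portfolio(eps)
    print(f"eps={eps:.2f}  return={w @ mu:.3f}  variance={var:.4f}")
```

Under box uncertainty, mu and cov would be replaced by their worst-case values over the box before this scalarisation is solved.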

Keywords: portfolio optimization, multi-objective optimization, ε-constraint method, box uncertainty, robust optimization

Procedia PDF Downloads 105
27372 Modelling High-Frequency Crude Oil Dynamics Using Affine and Non-Affine Jump-Diffusion Models

Authors: Katja Ignatieva, Patrick Wong

Abstract:

We investigate the dynamics of high-frequency energy prices, including crude oil and electricity prices. The returns of the underlying quantities are modelled using various parametric models, such as the stochastic volatility framework with correlated jumps (SVCJ), as well as non-parametric alternatives, which are purely data-driven and do not require specification of the drift or the diffusion coefficient function. Using different statistical criteria, we investigate the performance of the considered parametric and non-parametric models in their ability to forecast price series and volatilities. Our models incorporate possible seasonalities in the underlying dynamics and utilise advanced estimation techniques for the dynamics of energy prices.
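
An Euler-discretised simulation of a Heston-type model with jumps in returns (an SVCJ-style sketch; the parameter values are illustrative, not estimates from energy data) can be written as:

```python
import numpy as np

rng = np.random.default_rng(42)

T, n, dt = 1.0, 252, 1.0 / 252
kappa, theta, xi, rho = 3.0, 0.04, 0.3, -0.5      # variance dynamics
lam, mu_j, sig_j = 10.0, -0.01, 0.03              # jump intensity and sizes

s, v = np.log(60.0), 0.04                         # log price, variance
path = [s]
for _ in range(n):
    z1 = rng.normal()
    z2 = rho * z1 + np.sqrt(1 - rho**2) * rng.normal()   # correlated shocks
    jump = rng.normal(mu_j, sig_j) if rng.random() < lam * dt else 0.0
    s += -0.5 * v * dt + np.sqrt(max(v, 0.0) * dt) * z1 + jump
    v += kappa * (theta - v) * dt + xi * np.sqrt(max(v, 0.0) * dt) * z2
    v = max(v, 0.0)                               # full-truncation scheme
    path.append(s)

prices = np.exp(path)
print(f"final price: {prices[-1]:.2f}")
print(f"annualised realised vol: {np.std(np.diff(path)) * np.sqrt(252):.2%}")
```

Estimation of such a model from market data is typically done with MCMC, as the keywords below indicate.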

Keywords: stochastic volatility, affine jump-diffusion models, high-frequency data, model specification, Markov chain Monte Carlo

Procedia PDF Downloads 65
27371 Least Squares Solution for Linear Quadratic Gaussian Problem with Stochastic Approximation Approach

Authors: Sie Long Kek, Wah June Leong, Kok Lay Teo

Abstract:

The linear quadratic Gaussian model is a standard mathematical model for the stochastic optimal control problem. The combination of linear quadratic estimation and the linear quadratic regulator allows the state estimation and the optimal control policy to be designed separately; this is known as the separation principle. In this paper, an efficient computational method is proposed to solve the linear quadratic Gaussian problem. In our approach, the Hamiltonian function is defined and the necessary conditions are derived. In addition, the output error is defined and a least-squares optimization problem is introduced. By determining the first-order necessary condition, the gradient of the sum of squared output errors is established. From this point of view, the stochastic approximation approach is employed to update the optimal control policy. Once a given tolerance is reached, the iteration procedure is stopped and the optimal solution of the linear quadratic Gaussian problem is obtained. For illustration, an example of the linear quadratic Gaussian problem is studied, and the result shows the efficiency of the proposed approach. In conclusion, the applicability of the proposed approach for solving the linear quadratic Gaussian problem is clearly demonstrated.
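
The core update, a stochastic-approximation descent on a noisy gradient, can be illustrated on a scalar toy objective (not the full LQG recursion of the paper):

```python
import numpy as np

rng = np.random.default_rng(5)

# Minimise J(u) = (u - 2)^2 when only noisy gradient evaluations are
# available, via the Robbins-Monro iteration
#   u_{k+1} = u_k - a_k * g(u_k),  with a_k = 1/k, so that
#   sum a_k diverges and sum a_k^2 converges (the classical conditions).
def noisy_grad(u):
    return 2.0 * (u - 2.0) + rng.normal(0.0, 0.5)

u = 10.0
for k in range(1, 5001):
    u -= (1.0 / k) * noisy_grad(u)
print(f"estimate after 5000 iterations: {u:.3f}")
```

In the paper, the role of the noisy gradient is played by the gradient of the sum of squared output errors evaluated along simulated trajectories.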

Keywords: iteration procedure, least squares solution, linear quadratic Gaussian, output error, stochastic approximation

Procedia PDF Downloads 129
27370 Calibration of Hybrid Model and Arbitrage-Free Implied Volatility Surface

Authors: Kun Huang

Abstract:

This paper investigates whether the combination of local and stochastic volatility models can be calibrated exactly to any arbitrage-free implied volatility surface of European options. The risk-neutral Brownian bridge density is applied to calibrate the leverage function of our hybrid model. Furthermore, the tails of the marginal risk-neutral density are generated by a generalized extreme value distribution in order to capture the properties of asset returns. The local volatility is generated from the arbitrage-free implied volatility surface using the stochastic volatility inspired (SVI) parameterization.
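
The GEV tail-modelling step can be illustrated with SciPy on synthetic heavy-tailed returns; note SciPy's sign convention, in which a negative shape parameter corresponds to a heavy (Frechet) tail:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)

# Block maxima of heavy-tailed returns: by the Fisher-Tippett theorem their
# limiting law is the generalized extreme value (GEV) distribution.
returns = rng.standard_t(df=4, size=252 * 20) * 0.01   # synthetic returns
block_max = returns.reshape(20, 252).max(axis=1)       # yearly maxima

shape, loc, scale = stats.genextreme.fit(block_max)
print(f"GEV shape = {shape:.2f}, loc = {loc:.4f}, scale = {scale:.4f}")
q99 = stats.genextreme.ppf(0.99, shape, loc, scale)
print(f"99th percentile of the yearly maximum return: {q99:.3f}")
```

In the paper's construction, quantiles like q99 supply the tail mass that the interior of the risk-neutral density, recovered from quoted options, cannot pin down.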

Keywords: arbitrage free implied volatility, calibration, extreme value distribution, hybrid model, local volatility, risk-neutral density, stochastic volatility

Procedia PDF Downloads 234
27369 The European Research and Development Project Improved Nuclear Site Characterization for Waste Minimization in Decommissioning under Constrained Environment: Focus on Performance Analysis and Overall Uncertainty

Authors: M. Crozet, D. Roudil, T. Branger, S. Boden, P. Peerani, B. Russell, M. Herranz, L. Aldave de la Heras

Abstract:

The EURATOM work programme project INSIDER (Improved Nuclear Site Characterization for Waste minimization in Decommissioning under Constrained Environment) was launched in June 2017. This 4-year project has 18 partners and aims at improving the management of contaminated materials arising from decommissioning and dismantling (D&D) operations by proposing an integrated characterization methodology. This methodology is based on advanced statistical processing and modelling, coupled with adapted and innovative analytical and measurement methods, with respect to sustainability and economic objectives. In order to achieve these objectives, the approaches will then be applied to common case studies in the form of inter-laboratory comparisons on matrix-representative reference samples and benchmarking. Work Package 6 (WP6), 'Performance analysis and overall uncertainty', is in charge of analysing the benchmarking on real samples, organising the inter-laboratory comparison on synthetic certified reference materials, and establishing the overall uncertainty budget. Assessment of the outcome will be used to provide recommendations and guidance resulting in pre-standardization tests.

Keywords: decommissioning, sampling strategy, research and development, characterization, European project

Procedia PDF Downloads 328
27368 A Stochastic Volatility Model for Optimal Market-Making

Authors: Zubier Arfan, Paul Johnson

Abstract:

The electronification of financial markets and the rise of algorithmic trading have sparked a lot of interest from the mathematical community, in the market-making problem in particular. The research presented in this short paper solves the classic stochastic control problem to derive the strategy for a market-maker, and shows how to calibrate and simulate the strategy with real limit order book data for back-testing. The ambiguity of limit-order priority in back-testing is dealt with by considering optimistic and pessimistic priority scenarios. The basic model, although it does outperform a naive strategy, assumes constant volatility and is therefore not best suited to the LOB data. The Heston model is introduced to describe the price and variance process of the asset. The trader's constant absolute risk aversion utility function is optimised by numerically solving a three-dimensional Hamilton-Jacobi-Bellman partial differential equation to find the optimal limit order quotes. The results show that the stochastic volatility market-making model is more suitable for a risk-averse trader and is also less sensitive to calibration error than the constant-volatility model.
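
As a reference point, the constant-volatility benchmark that the stochastic-volatility model extends has well-known closed-form quotes (the Avellaneda-Stoikov model); a sketch with illustrative parameters:

```python
import math

# Closed-form Avellaneda-Stoikov quotes under constant volatility; all
# parameter values below are illustrative, not calibrated from LOB data.
s, q = 100.0, 2            # mid-price and current inventory (units)
gamma, sigma = 0.1, 2.0    # CARA risk aversion, volatility
k = 1.5                    # order-arrival decay parameter
t_left = 1.0               # time remaining in the trading horizon

# Reservation price shifts away from the mid to shed inventory risk.
r = s - q * gamma * sigma**2 * t_left
# Optimal total spread around the reservation price.
spread = gamma * sigma**2 * t_left + (2.0 / gamma) * math.log(1.0 + gamma / k)
bid, ask = r - spread / 2.0, r + spread / 2.0
print(f"reservation price {r:.3f}, bid {bid:.3f}, ask {ask:.3f}")
```

With stochastic volatility there is no such closed form, which is why the paper resorts to a three-dimensional HJB solve.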

Keywords: market-making, market microstructure, stochastic volatility, quantitative trading

Procedia PDF Downloads 109
27367 Computational Simulations on Stability of Model Predictive Control for Linear Discrete-Time Stochastic Systems

Authors: Tomoaki Hashimoto

Abstract:

Model predictive control is a kind of optimal feedback control in which control performance over a finite future is optimized with a performance index that has a moving initial time and a moving terminal time. This paper examines the stability of model predictive control for linear discrete-time systems with additive stochastic disturbances. A sufficient condition for the stability of the closed-loop system with model predictive control is derived by means of a linear matrix inequality. The objective of this paper is to show the results of computational simulations in order to verify the validity of the obtained stability condition.
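
For the unconstrained linear-quadratic case, the model predictive controller reduces to a finite-horizon LQ law obtained by backward Riccati recursion. A scalar sketch with an illustrative open-loop-unstable system and additive disturbances (not the paper's LMI stability analysis):

```python
import numpy as np

rng = np.random.default_rng(11)

# Scalar system x+ = a x + b u + w, regulated by receding-horizon control.
a, b, qc, rc, N = 1.2, 1.0, 1.0, 0.1, 10   # a > 1: open-loop unstable

def mpc_gain():
    """First feedback gain of the finite-horizon LQ problem, via backward
    Riccati recursion over the moving prediction horizon of length N."""
    p = qc                                  # terminal cost weight
    for _ in range(N):
        kgain = (a * b * p) / (rc + b * b * p)
        p = qc + a * a * p - kgain * (a * b * p)
    return kgain

K = mpc_gain()
x, xs = 5.0, []
for _ in range(50):
    u = -K * x                              # receding-horizon feedback
    x = a * x + b * u + rng.normal(0.0, 0.1)   # additive stochastic disturbance
    xs.append(x)

print(f"gain K = {K:.3f}, mean |x| over last 10 steps: "
      f"{np.mean(np.abs(xs[-10:])):.3f}")
```

The closed-loop state stays bounded near the origin despite the disturbances, which is the mean-square stability property the paper's LMI condition certifies in general.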

Keywords: computational simulations, optimal control, predictive control, stochastic systems, discrete-time systems

Procedia PDF Downloads 396
27366 Theoretical Appraisal of Satisfactory Decision: Uncertainty, Evolutionary Ideas and Beliefs, Satisfactory Time Use

Authors: Okay Gunes

Abstract:

Unsatisfactory experiences due to an information shortage regarding the future pay-offs of actual choices yield satisficing decision-making. This research will examine, for the first time in the literature, the motivation behind suboptimal decisions under uncertainty by scrutinising Adam Smith's and Jeremy Bentham's assumptions about the nature of the actions that lead to satisficing behavior, in order to clarify the theoretical background of a 'consumption-based satisfactory time' concept. The contribution of this paper with respect to the existing literature is threefold. Firstly, it is shown that Adam Smith's uncertainty is related to the problem of the constancy of ideas, and not directly to beliefs. Secondly, possessions, as in Jeremy Bentham's oeuvre, are assumed to be pleasing insofar as they protect and improve the actual or expected quality of life, so long as they reduce any displeasure due to the undesired outcomes of uncertainty. Finally, each consumption decision incurs its own satisfactory time period, owing to not feeling hungry, being healthy, not needing transportation, etc. This reveals that the level of satisfaction is indeed a behavioral phenomenon whose value depends on the simultaneous satisfaction derived from all activities.

Keywords: decision-making, idea and belief, satisficing, uncertainty

Procedia PDF Downloads 247
27365 Stochastic Modeling for Parameters of Modified Car-Following Model in Area-Based Traffic Flow

Authors: N. C. Sarkar, A. Bhaskar, Z. Zheng

Abstract:

The driving behavior in area-based (i.e., non-lane-based) traffic is induced by the presence of other individuals in the choice space within the driver’s visual perception area. The driving behavior of a subject vehicle is constrained by potential leaders, and the leaders change frequently over time. This paper determines stochastic models for the parameters of a modified intelligent driver model (MIDM) in area-based traffic, as found in developing countries. Parametric and non-parametric distributions are fitted to the parameters of the MIDM, and the goodness of fit of each parameter is assessed in two ways: graphically and statistically. The quantile-quantile (Q-Q) plot provides a graphical comparison between a parameter and a candidate theoretical distribution, while the Kolmogorov-Smirnov (K-S) test provides a statistical measure of fit. The distributions are fitted to a set of MIDM parameters estimated from real vehicle trajectory data collected in India. The fit of each parameter to a stochastic model is well represented, and the results support the applicability of the proposed modeling for the parameters of the MIDM in area-based traffic flow simulation.
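As a sketch of the statistical half of this procedure, the snippet below fits a candidate distribution to synthetic parameter samples and applies the K-S test; the lognormal choice and all numbers are placeholders, not the paper's estimates:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic stand-in for one estimated MIDM parameter (real values
# would come from the vehicle trajectory data)
samples = rng.lognormal(mean=0.5, sigma=0.3, size=500)

# Fit a candidate theoretical distribution by maximum likelihood
shape, loc, scale = stats.lognorm.fit(samples, floc=0)

# Kolmogorov-Smirnov statistic against the fitted distribution
# (caveat: fitting and testing on the same data makes the p-value optimistic)
ks_stat, p_value = stats.kstest(samples, 'lognorm', args=(shape, loc, scale))
good_fit = p_value > 0.05
```

A Q-Q plot of `samples` against the quantiles of `stats.lognorm(shape, loc, scale)` would supply the graphical counterpart described in the abstract.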

Keywords: area-based traffic, car-following model, micro-simulation, stochastic modeling

Procedia PDF Downloads 119
27364 Stochastic Control of Decentralized Singularly Perturbed Systems

Authors: Walid S. Alfuhaid, Saud A. Alghamdi, John M. Watkins, M. Edwin Sawan

Abstract:

Designing a controller for stochastic decentralized interconnected large-scale systems usually involves a high degree of complexity and computational effort. Noise, observability and controllability of all system states, connectivity, and channel bandwidth impose further constraints on design procedures for distributed large-scale systems. The quasi-steady-state model investigated in this paper is a reduced-order model of the original system obtained by singular perturbation techniques. An optimal control synthesis is presented for designing an observer-based feedback controller by standard stochastic control theory, using the Linear Quadratic Gaussian (LQG) approach and Kalman filter design, with reduced complexity and computation requirements. A numerical example is given to demonstrate the efficiency of the proposed method.
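The state-feedback half of such an LQG design on a reduced-order model can be sketched as follows; the matrices are illustrative stand-ins for a quasi-steady-state model, not taken from the paper:

```python
import numpy as np
from scipy.linalg import solve_discrete_are

# Assumed quasi-steady-state (reduced-order) model of the slow dynamics
A = np.array([[0.95, 0.05],
              [0.00, 0.90]])
B = np.array([[0.0],
              [1.0]])
Q = np.eye(2)            # state weighting
R = np.array([[1.0]])    # control weighting

# Discrete-time LQR gain: K = (R + B'PB)^{-1} B'PA, with P from the
# discrete algebraic Riccati equation; by LQG separation, the Kalman
# filter is designed independently from the dual Riccati equation
P = solve_discrete_are(A, B, Q, R)
K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)

# Closed-loop spectral radius; LQR on a stabilizable pair is stabilizing
rho = max(abs(np.linalg.eigvals(A - B @ K)))
```

Working on the reduced-order model keeps both Riccati equations small, which is where the savings in complexity and computation come from.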

Keywords: decentralized, optimal control, output, singular perturb

Procedia PDF Downloads 328
27363 Stochastic Modeling and Productivity Analysis of a Flexible Manufacturing System

Authors: Mehmet Savsar, Majid Aldaihani

Abstract:

Flexible Manufacturing Systems (FMS) are used to produce a variety of parts on the same equipment, so their utilization is higher than that of traditional machining systems. Higher utilization, on the other hand, results in more frequent equipment failures and an additional need for maintenance. It is therefore necessary to carefully analyze the operational characteristics and productivity of an FMS, or of a Flexible Manufacturing Cell (FMC), which is a smaller configuration of an FMS, before installation or during operation. Appropriate models should be developed to determine production rates based on operational conditions, including equipment reliability, availability, and repair capacity. In this paper, a stochastic model is developed for an automated FMC consisting of two machines served by two robots and a single repairman. The model is used to determine system productivity and equipment utilization under different operational conditions, including random machine failures, random repairs, and limited repair capacity. The results are compared to those of a previous study of an FMC with sufficient repair capacity assigned to each machine. The model should be useful for design engineers and operational managers analyzing the performance of manufacturing systems at the design or operational stage.
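A minimal sketch of this kind of stochastic model is a continuous-time Markov chain over the number of failed machines; the two-machine, one-repairman structure matches the abstract, but all rates below are assumed values:

```python
import numpy as np

# Illustrative two-machine, one-repairman cell (assumed rates)
lam, mu = 0.1, 0.5   # failure and repair rates per machine, per hour
r = 20.0             # production rate per working machine, parts/hour

# States = number of failed machines (0, 1, 2); CTMC generator matrix.
# With a single repairman, the repair rate stays mu even with 2 failures.
Qgen = np.array([
    [-2 * lam,       2 * lam,  0.0],
    [      mu, -(mu + lam),    lam],
    [     0.0,          mu,    -mu],
])

# Stationary distribution: solve pi Q = 0 subject to sum(pi) = 1
A = np.vstack([Qgen.T, np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

# Productivity follows from the expected number of working machines
expected_working = 2 * pi[0] + 1 * pi[1]
throughput = r * expected_working
```

Limited repair capacity shows up directly in the generator: with one repairman the transition rate out of the two-failures state is mu, not 2*mu, which lowers the stationary availability compared with a cell that has a repairman per machine.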

Keywords: flexible manufacturing, FMS, FMC, stochastic modeling, production rate, reliability, availability

Procedia PDF Downloads 483
27362 Relationship among Teams' Information Processing Capacity and Performance in Information System Projects: The Effects of Uncertainty and Equivocality

Authors: Ouafa Sakka, Henri Barki, Louise Cote

Abstract:

Uncertainty and equivocality are defined in the information processing literature as two task characteristics that require different information processing responses from managers. As uncertainty often stems from a lack of information, addressing it is thought to require the collection of additional data. Equivocality, on the other hand, stems from ambiguity and a lack of understanding of the task at hand, so addressing it is thought to require rich communication between those involved. Past research has provided only weak to moderate empirical support for these hypotheses. The present study contributes to this literature by defining uncertainty and equivocality at the project level and investigating their moderating effects on the association between several project information processing constructs and project performance. The information processing constructs considered are the amount of information collected by the project team and the richness and frequency of formal communications among team members when discussing the project’s follow-up reports. Data from 93 information system development (ISD) project managers were collected in a questionnaire survey and analyzed via the Fisher test for correlation differences. The results indicate that the highest project performance levels were observed in projects characterized by high uncertainty and low equivocality, in which project managers were provided with detailed and updated information on project costs and schedules. In addition, the findings show that information about user needs and technical aspects of the project is less useful for managing projects where uncertainty and equivocality are high. Further, while the strongest positive effect of interactive use of follow-up reports on performance occurred in projects where both uncertainty and equivocality were high, its weakest effect occurred when both were low.
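The Fisher test for correlation differences used here can be sketched as follows; the correlations and sample sizes are invented for illustration and are not the study's data:

```python
import math
from scipy.stats import norm

def fisher_z_diff(r1, n1, r2, n2):
    """Two-sided test for the difference between two independent Pearson
    correlations via Fisher's z-transformation."""
    z1, z2 = math.atanh(r1), math.atanh(r2)
    se = math.sqrt(1.0 / (n1 - 3) + 1.0 / (n2 - 3))   # SE of z1 - z2
    z = (z1 - z2) / se
    p = 2.0 * (1.0 - norm.cdf(abs(z)))
    return z, p

# Hypothetical example: correlation between interactive report use and
# performance in a high-uncertainty subgroup vs. a low-uncertainty one
z, p = fisher_z_diff(0.60, 45, 0.15, 48)
```

A significant z indicates that the two subgroups' correlations differ, which is how moderating effects of uncertainty and equivocality can be probed.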

Keywords: uncertainty, equivocality, information processing model, management control systems, project control, interactive use, diagnostic use, information system development

Procedia PDF Downloads 253
27361 Objective Assessment of the Evolution of Microplastic Contamination in Sediments from a Vast Coastal Area

Authors: Vanessa Morgado, Ricardo Bettencourt da Silva, Carla Palma

Abstract:

Environmental pollution by microplastics is well recognized: microplastics have been detected in various matrices from distinct environmental compartments worldwide, including remote areas. Various methodologies and techniques have been used to determine microplastics in such matrices, for instance in sediment samples from the ocean bottom. To determine microplastics in a sediment matrix, the sample is typically sieved through a 5 mm mesh, digested to remove the organic matter, and density-separated to isolate microplastics from the denser part of the sediment. The physical analysis consists of visual inspection under a stereomicroscope to determine particle size, colour, and shape. The chemical analysis is performed with an infrared spectrometer coupled to a microscope (micro-FTIR), allowing identification of the chemical composition of each microplastic, i.e., the type of polymer. Creating legislation and policies to control and manage (micro)plastic pollution is essential to protect the environment, namely coastal areas, and regulation must be grounded in the known relevance and trends of the pollution. This work discusses the assessment of contamination trends over a 700 km² oceanic area affected by contamination heterogeneity, sampling representativeness, and the uncertainty of the analysis of collected samples. The methodology developed identifies meaningful variations of microplastic contamination objectively, by Monte Carlo simulation of all uncertainty sources. It allowed us to conclude unequivocally, at a 99% confidence level, that the contamination level of the studied area did not vary significantly between two consecutive years (2018 and 2019) and that PET microplastics are the major polymer type. The know-how developed is crucial for the objective and binding determination of microplastic contamination in relevant environmental compartments.
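A minimal sketch of such a Monte Carlo comparison follows, with invented means and combined uncertainties rather than the study's data:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

# Illustrative contamination levels (e.g. items per kg of sediment) and
# combined standard uncertainties for two survey years -- assumed values
mean_2018, u_2018 = 120.0, 25.0
mean_2019, u_2019 = 135.0, 25.0

# Monte Carlo propagation of all uncertainty sources, here lumped into
# one combined standard uncertainty per year for simplicity
c18 = rng.normal(mean_2018, u_2018, N)
c19 = rng.normal(mean_2019, u_2019, N)
diff = c19 - c18

# 99% interval on the year-to-year difference; if it contains zero,
# the change is not significant at the 99% confidence level
lo, hi = np.percentile(diff, [0.5, 99.5])
significant = not (lo <= 0.0 <= hi)
```

With these assumed numbers the interval straddles zero, illustrating how heterogeneity and measurement uncertainty can swamp an apparent year-to-year difference.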

Keywords: measurement uncertainty, micro-ATR-FTIR, microplastics, ocean contamination, sampling uncertainty

Procedia PDF Downloads 51
27360 Representation Data without Lost Compression Properties in Time Series: A Review

Authors: Nabilah Filzah Mohd Radzuan, Zalinda Othman, Azuraliza Abu Bakar, Abdul Razak Hamdan

Abstract:

Uncertain data is an important issue in building a prediction model. The main objective of time series uncertainty analysis is to formulate uncertain data in order to gain knowledge and fit a low-dimensional model prior to the prediction task. This paper reviews the performance of a number of techniques for dealing with uncertain data, specifically those that handle the uncertain-data condition while minimizing the loss of compression properties.

Keywords: compression properties, uncertainty, uncertain time series, mining technique, weather prediction

Procedia PDF Downloads 389
27359 An Accelerated Stochastic Gradient Method with Momentum

Authors: Liang Liu, Xiaopeng Luo

Abstract:

In this paper, we propose an accelerated stochastic gradient method with momentum. The momentum term is a weighted average of the generated gradients, with weights that decay in inverse proportion to the iteration count. Stochastic gradient descent with momentum (SGDM) instead uses weights that decay exponentially with the iteration count to generate the momentum term. Based on exponential decay weights, variants of SGDM with complicated and hard-to-interpret update rules have been proposed to achieve better performance, whereas the momentum update rule of our method is as simple as that of SGDM. We provide theoretical convergence analyses showing that both the exponential decay weights and our inverse-proportional decay weights confine the variance of the parameter iterates to a region. Experimental results show that our method works well on many practical problems and outperforms SGDM.
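One plausible reading of weights that decay in inverse proportion to the iteration count is a running average of gradients, under which every past gradient ends up with weight 1/t; the contrast with SGDM's exponential weights can be verified numerically. This reconstruction is an assumption for illustration, not the paper's exact scheme:

```python
import numpy as np

def momentum_weights(update, t):
    """Recover the effective weight each past gradient g_i receives in the
    momentum term m_t by feeding one-hot gradient sequences (linearity)."""
    weights = []
    for i in range(t):
        m = 0.0
        for k in range(1, t + 1):
            g = 1.0 if k == i + 1 else 0.0
            m = update(m, g, k)
        weights.append(m)
    return np.array(weights)

# SGDM: m_k = beta*m + (1-beta)*g gives exponentially decaying weights
beta = 0.9
sgdm = momentum_weights(lambda m, g, k: beta * m + (1 - beta) * g, t=10)

# Running average: m_k = ((k-1)*m + g)/k gives every gradient weight 1/t,
# i.e. the weight applied at insertion decays like 1/k with the iteration
inv = momentum_weights(lambda m, g, k: ((k - 1) * m + g) / k, t=10)
```

After 10 steps the running-average scheme weights each gradient equally at 0.1, while SGDM's weight on the oldest gradient has decayed to (1-beta)*beta**9.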

Keywords: exponential decay rate weight, gradient descent, inverse proportional decay rate weight, momentum

Procedia PDF Downloads 123
27358 Synthesis of Filtering in Stochastic Systems on Continuous-Time Memory Observations in the Presence of Anomalous Noises

Authors: S. Rozhkova, O. Rozhkova, A. Harlova, V. Lasukov

Abstract:

We conduct the optimal synthesis of a root-mean-square objective filter to estimate the state vector in the case where, within an observation channel with memory, anomalous noises with unknown mathematical expectation are present in addition to the regular noises. The synthesis is carried out for continuous-time linear stochastic systems.

Keywords: mathematical expectation, filtration, anomalous noise, memory

Procedia PDF Downloads 207
27357 Estimation of Probabilistic Fatigue Crack Propagation Models of AZ31 Magnesium Alloys under Various Load Ratio Conditions by Using the Interpolation of a Random Variable

Authors: Seon Soon Choi

Abstract:

The purpose of this work is to present a fatigue crack propagation model that describes the stochastic fatigue crack growth behavior of a rolled magnesium alloy, AZ31, under various load ratio conditions. Fatigue crack propagation experiments were carried out in laboratory air under four load ratios, R, to investigate the crack growth behavior of AZ31. The stochastic fatigue crack growth behavior was analyzed by introducing an interpolated random variable, Z, into empirical fatigue crack propagation models: the Paris-Erdogan model, the Walker model, the Forman model, and the modified Forman model. The random variable was found to be useful in describing the stochastic fatigue crack growth behavior under various load ratio conditions, and a probabilistic model describing this behavior is proposed.
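The idea of a multiplicative random variable in a crack growth law can be sketched with the Paris-Erdogan model; the constants, geometry, and the lognormal choice for Z below are illustrative assumptions, not the AZ31 estimates:

```python
import numpy as np

rng = np.random.default_rng(1)

# Paris-Erdogan law with a multiplicative random factor Z:
#   da/dN = Z * C * (delta_K)**m_exp
# All constants are assumed for illustration (units: m, MPa*sqrt(m))
C, m_exp = 1e-11, 3.0
delta_sigma = 80.0        # stress range, MPa
Y = 1.12                  # geometry factor
a0, a_crit = 1e-3, 5e-3   # initial and critical crack lengths, m
dN = 500                  # cycle increment for the explicit integration

def cycles_to_failure(z):
    """Integrate the randomized Paris law until the crack reaches a_crit."""
    a, n = a0, 0
    while a < a_crit:
        delta_K = Y * delta_sigma * np.sqrt(np.pi * a)   # stress intensity range
        a += z * C * delta_K ** m_exp * dN
        n += dN
    return n

# A lognormal Z centered near 1 captures specimen-to-specimen scatter
lives = np.array([cycles_to_failure(z)
                  for z in rng.lognormal(mean=0.0, sigma=0.2, size=100)])
```

Each draw of Z yields one simulated crack growth curve; the spread of `lives` is the stochastic scatter such a probabilistic model is meant to capture.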

Keywords: magnesium alloys, fatigue crack propagation model, load ratio, interpolation of random variable

Procedia PDF Downloads 378
27356 Uncertainty Quantification of Crack Widths and Crack Spacing in Reinforced Concrete

Authors: Marcel Meinhardt, Manfred Keuser, Thomas Braml

Abstract:

Cracking of reinforced concrete is a complex phenomenon induced by direct loads or restraints affecting reinforced concrete structures as soon as the tensile strength of the concrete is exceeded. It is therefore important to predict where cracks will be located and how they will propagate. The bond theory and the crack formulas in current design codes, for example DIN EN 1992-1-1, are all based on the assumption that the reinforcement bars are embedded in homogeneous concrete, without taking into account the influence of transverse reinforcement and the real stress situation. However, it can often be observed that real structures such as walls, slabs, or beams show a crack spacing that is oriented to the transverse reinforcement bars or to the stirrups. Most finite element analysis studies use the smeared crack approach for crack prediction; its disadvantage is that the typical strain localization of a crack at the element level cannot be seen. Crack propagation in concrete is a discontinuous process characterized by factors such as the initially random distribution of defects and the scatter of material properties. Such behavior calls for adequate models and simulation methods, because traditional mechanical approaches deal mainly with average material parameters. This paper is concerned with modelling the initiation and propagation of cracks in reinforced concrete structures, considering the influence of transverse reinforcement and the real stress distribution in reinforced concrete (R/C) beams and plates in bending. A parameter study was carried out to investigate (I) the influence of the transverse reinforcement on the stress distribution in concrete in bending and (II) crack initiation as a function of the diameter and spacing of the transverse reinforcement bars.
The numerical investigations of crack initiation and propagation were carried out on a 2D reinforced concrete structure subjected to quasi-static loading with given boundary conditions. To model the uncertainty in the tensile strength of the concrete, correlated normally and lognormally distributed random fields with different correlation lengths were generated for the finite element analysis. The paper also presents and discusses different methods for generating random fields, e.g., the Covariance Matrix Decomposition Method. For all computations, a plastic constitutive law with softening was used to model crack initiation and tensile damage of the concrete. It was found that the distributions of crack spacing and crack width depend strongly on the random field used. These distributions were validated against experimental studies on R/C panels carried out at the Laboratory for Structural Engineering at the University of the German Armed Forces in Munich, and a recommendation is given for random field parameters that realistically model the uncertainty of the tensile strength. The aim of this research was to demonstrate a method in which the localization of strains and cracks, as well as the influence of transverse reinforcement on crack initiation and propagation, can be captured in finite element analysis.

Keywords: crack initiation, crack modelling, crack propagation, cracks, numerical simulation, random fields, reinforced concrete, stochastic

Procedia PDF Downloads 110
27355 Patients with Chronic Obstructive Pulmonary Disease: Feelings of Uncertainty

Authors: Kyngäs Helvi, Patala-Pudas, Kaakinen Pirjo

Abstract:

It has been reported that COPD patients may experience considerable emotional distress, which can compromise positive health outcomes. The aim of this study was to explore disease-related uncertainty as reported by Chronic Obstructive Pulmonary Disease (COPD) patients. Uncertainty was defined in terms of a lack of confidence, negative feelings, the patient's sense of confidence, and awareness of the sources of uncertainty. The research design was a non-experimental cross-sectional survey. The data (n=141) were collected by a validated questionnaire during COPD patients’ visits or admissions to a tertiary hospital; the response rate was 62%. The data were analyzed by statistical methods. Around 70% of the participants were male, with COPD diagnosed many years earlier. Fifty-four percent were under 65 years old, and 52% used an electronic respiratory aid apparatus (oxygen concentrator, ventilator, or electronic inhalation device). Forty-one percent of the participants smoked. Disease-related uncertainty was widely reported: seventy-three percent of the participants were uncertain about their knowledge of the disease, the pulmonary medication, and nutrition. One-quarter (25%) did not feel confident about managing a COPD exacerbation, and about forty percent (43%) reported that they did not have a written exacerbation decision aid indicating how to act on COPD symptoms. Over half of the respondents were uncertain about self-management behavior related to health habits such as exercise and nutrition, and over a third (37%) felt uncertain about self-management skills related to giving up smoking. Support from the care providers correlated significantly with the patients’ sense of confidence: COPD patients who felt no confidence reported receiving significantly less support in care.
Disease-related uncertainty should be considered more closely and broadly in the patient care context, and those strategies within patient education that enhance adherence should be strengthened and incorporated into standard practice.

Keywords: adherence, COPD, disease-management, uncertainty

Procedia PDF Downloads 209