Search results for: stochastic uncertainty analysis
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9126

8826 Quantifying Freeway Capacity Reductions by Rainfall Intensities Based on Stochastic Nature of Flow Breakdown

Authors: Hoyoung Lee, Dong-Kyu Kim, Seung-Young Kho, R. Eddie Wilson

Abstract:

This study quantifies the decrease in freeway capacity during rainfall. Traffic and rainfall data were gathered from highway agencies and the Wunderground weather service. Three inter-urban freeway sections and their nearest weather stations were selected as experimental sites. Capacity analysis found reductions in the maximum and mean pre-breakdown flow rates due to rainfall. The Kruskal-Wallis test also provided some evidence to suggest that the variance in the pre-breakdown flow rate is statistically insignificant. Potential applications of this study lie in the operation of real-time traffic management schemes such as Variable Speed Limits (VSL), Hard Shoulder Running (HSR), and Ramp Metering Systems (RMS), where speed or flow limits could be set based on a number of factors, including rainfall events and their intensities.
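
As an illustration of the statistical test named in this abstract, the minimal Python sketch below applies the Kruskal-Wallis test to pre-breakdown flow rates grouped by rainfall intensity. The flow-rate samples and intensity bins are hypothetical placeholders, not the study's data.

```python
import numpy as np
from scipy.stats import kruskal

rng = np.random.default_rng(42)

# Hypothetical pre-breakdown flow rates (veh/h/lane) under three rainfall bins;
# the means and spreads are illustrative, not the paper's measurements.
dry   = rng.normal(2000, 150, 60)
light = rng.normal(1900, 150, 60)
heavy = rng.normal(1750, 150, 60)

# Kruskal-Wallis H-test: nonparametric check that the groups share a distribution.
h_stat, p_value = kruskal(dry, light, heavy)
print(f"H = {h_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Reject H0: rainfall intensity shifts the pre-breakdown flow rate.")
```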

Keywords: Capacity randomness, flow breakdown, freeway capacity, rainfall.

8825 Analysis of the Main Concepts and Discussions Involving Sustainable Tourism

Authors: Veruska C. Dutra, Mary L. G. S. Senna

Abstract:

The development of tourism relies on the use of landscapes, natural or constructed, and involves a number of factors that contribute to the deterioration of nature. Tourist activity coupled with sustainable development has raised many questions about these terms, since literature searches show they are not well defined in this sense. The present study analyzes the main concepts and discussions involving sustainable tourism, providing reflections that can help answer one of the main questions in today's sector: whether its sustainability is a myth or a reality. The methodology of this study comprises discussions, theoretical studies, and bibliographic research. The results showed that scholars who address the issue often leave uncertainty in their discussions, demonstrating that many studies still need to be conducted in order to substantiate the claims that would form the basis of a truly sustainable tourism.

Keywords: Tourism, sustainability, development, discussions.

8824 Statistical Characteristics of Distribution of Radiation-Induced Defects under Random Generation

Authors: Pavlo Selyshchev

Abstract:

We consider fluctuations of the defect density taking into account defect interaction. A stochastic field of the displacement generation rate gives a random defect distribution. We determine the statistical characteristics (mean and dispersion) of the random field of the point defect distribution as functions of the defect generation parameters, the temperature, and the properties of the irradiated crystal.
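
A toy ensemble sketch of the kind of statistics described here: a rate equation with a randomly fluctuating generation term and a quadratic interaction term is integrated many times, and the mean and dispersion of the resulting defect density are computed. The equation, parameters, and distributions are invented for illustration, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy rate equation dN/dt = K(t) - a*N**2: the generation rate K fluctuates
# randomly; the quadratic term mimics defect-defect interaction (recombination).
a, dt, steps, runs = 1e-3, 0.01, 5000, 2000
N = np.zeros(runs)
for _ in range(steps):
    K = rng.normal(10.0, 2.0, runs)      # stochastic generation rate
    N += dt * (K - a * N**2)

print(f"mean defect density   = {N.mean():.2f}")
print(f"dispersion (variance) = {N.var():.2f}")
```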

Keywords: Irradiation, Primary Defects, Interaction, Fluctuations.

8823 Optimizing and Evaluating Performance Quality Control of the Production Process of Disposable Essentials Using a Vague Goal Programming Approach

Authors: Hadi Gholizadeh, Ali Tajdin

Abstract:

To have effective production planning, it is necessary to control the quality of processes. This paper aims at improving the performance of the disposable essentials process using statistical quality control and goal programming in a vague environment, which expresses uncertainty, because there is always measurement error in the real world. Therefore, in this study, the conditions are examined in a vague environment, that is, a distance-based environment. The disposable essentials process in Kach Company was studied. Statistical control tools were used to characterize the existing process for four factor responses: the average weights, heights, crater diameters, and volumes of disposable glasses. Goal programming was then utilized to find the combination of optimal factor settings in a vague environment, which captures the uncertainty of the initial information when some of the parameters of the models are vague; a fuzzy regression model is also used to predict the responses of the four described factors. Optimization results show that the process capability index values for the average weights, heights, crater diameters and volumes of the disposable glasses were improved. This increases the quality of the products and reduces waste, which lowers the cost of the finished product and ultimately brings customer satisfaction; this satisfaction, in turn, means increased sales.
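
A minimal sketch of the goal-programming idea used in this abstract: deviation variables measure the under- and over-achievement of each quality goal, and their weighted sum is minimized as a linear program. The two decision variables, goal equations, and targets are hypothetical, not Kach Company's data.

```python
from scipy.optimize import linprog

# Decision vector: [x1, x2, d1n, d1p, d2n, d2p]
# (x = machine settings; dXn/dXp = under-/over-achievement of goal X)
# Goal 1: 3*x1 + 2*x2 + d1n - d1p = 12   (e.g. target average weight)
# Goal 2: 1*x1 + 4*x2 + d2n - d2p = 10   (e.g. target average volume)
A_eq = [[3, 2, 1, -1, 0, 0],
        [1, 4, 0, 0, 1, -1]]
b_eq = [12, 10]

# Minimize the (equally weighted) sum of deviations from both goals.
c = [0, 0, 1, 1, 1, 1]
res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * 6)
x1, x2, *dev = res.x
print(f"settings: x1={x1:.2f}, x2={x2:.2f}, total deviation={res.fun:.3f}")
```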

Keywords: Goal programming, quality control, vague environment, disposable glasses’ optimization, fuzzy regression.

8822 Impact of Changes of the Conceptual Framework for Financial Reporting on the Indicators of the Financial Statement

Authors: Nadezhda Kvatashidze

Abstract:

The International Accounting Standards Board has updated the conceptual framework for financial reporting. The main reason behind the update is to address accounting tasks arising from market development and business transactions of new economic content. Investors also call for higher transparency of information and responsibility for results in order to make more accurate risk assessments and forecasts. All of this makes it necessary to develop the conceptual framework further so that users receive useful information. Market development and certain shortcomings of the conceptual framework revealed in practice require its reconsideration and new solutions. Some issues and concepts, such as disclosure and supply of information, its qualitative characteristics, assessment, and measurement uncertainty, had to be supplemented and refined. The criteria for recognition of certain elements of reporting (assets and liabilities) had to be updated as well, and all of this is set out in the updated edition of the conceptual framework for financial reporting, a comprehensive collection of concepts underlying the preparation of financial statements. The main objective of the revision is to improve financial reporting and to develop a clear package of concepts. This will support the International Accounting Standards Board (IASB) in setting a common approach for similar transactions on the basis of mutually accepted concepts. As a result, companies will be able to develop coherent accounting policies for transactions or events to which no standard applies or for which a standard allows a choice of accounting policy.

Keywords: Conceptual framework, measurement basis, measurement uncertainty, neutrality, prudence, stewardship.

8821 Effect of Urea Deep Placement Technology Adoption on the Production Frontier: Evidence from Irrigation Rice Farmers in the Northern Region of Ghana

Authors: Shaibu Baanni Azumah, William Adzawla

Abstract:

Rice is an important staple crop, with current demand higher than the domestic supply in Ghana. This has led to a high and unfavourable import bill. Therefore, recent policies and interventions in the agricultural sub-sector aim at promoting various improved agricultural technologies in order to improve domestic production and reduce the importation of rice. In this study, we examined the effect of the adoption of Urea Deep Placement (UDP) technology by rice farmers on the position of the production frontier. This involved 200 farmers selected through a multi-stage sampling technique in the Northern region of Ghana. A Cobb-Douglas stochastic frontier model was fitted. The results showed that the adoption of UDP technology shifts the output frontier outward and also moves farmers closer to the frontier. Farmers were also operating under diminishing returns to scale, which calls for redress. Other factors that significantly influenced rice production were farm size, labour, use of certified seeds and NPK fertilizer. Although there was room for improvement, the farmers were highly efficient (92%) compared to previous studies. Farmers' efficiency was improved through increased education, household size, experience, access to credit, and lack of extension service provision by MoFA. The study recommends the revision of Ghana's agricultural policy to include the UDP technology. Agricultural extension officers of the Ministry of Food and Agriculture (MoFA) should be trained on the UDP technology to support IFDC's drive to improve adoption by rice farmers. Rice farmers are also encouraged to expand their farm lands, improve plant population, and increase fertilizer usage to improve yields. Mechanisms through which credit can be made easily accessible and effectively utilised should be identified and promoted.
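
For readers unfamiliar with Cobb-Douglas frontier estimation, the sketch below uses corrected OLS (a simple deterministic-frontier stand-in, not the maximum-likelihood stochastic frontier fitted in the paper) on synthetic data: the inputs, coefficients, and noise are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic Cobb-Douglas data: ln y = b0 + b1*ln(land) + b2*ln(labour) + v - u,
# with noise v and nonnegative inefficiency u (placeholders, not survey data).
n = 200
land, labour = rng.lognormal(1, 0.3, n), rng.lognormal(2, 0.3, n)
v, u = rng.normal(0, 0.1, n), rng.exponential(0.15, n)
ln_y = 0.5 + 0.4 * np.log(land) + 0.5 * np.log(labour) + v - u

# Corrected OLS: fit by least squares, then shift the intercept so the
# fitted surface envelops the data from above (a simple frontier estimate).
X = np.column_stack([np.ones(n), np.log(land), np.log(labour)])
beta, *_ = np.linalg.lstsq(X, ln_y, rcond=None)
resid = ln_y - X @ beta
beta[0] += resid.max()                    # shift up to the frontier
eff = np.exp(resid - resid.max())         # technical efficiency in (0, 1]
print(f"elasticities: {beta[1]:.2f}, {beta[2]:.2f}; mean efficiency: {eff.mean():.2f}")
print(f"returns to scale: {beta[1] + beta[2]:.2f}  (< 1 means diminishing)")
```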

Keywords: Efficiency, rice farmers, stochastic frontier, UDP technology.

8820 Nonlinear Integral-Type Sliding Surface for Synchronization of Chaotic Systems with Unknown Parameters

Authors: Hongji Tang, Yanbo Gao, Yue Yu

Abstract:

This paper presents a new nonlinear integral-type sliding surface for synchronizing two different chaotic systems with parametric uncertainty. On the basis of the Lyapunov theorem and the average dwell time method, we obtain the control gains of the controllers derived to achieve chaos synchronization. In order to reduce the gains, the error system is modeled as a switching system. Through stability analysis, we obtain a sufficient condition for the robust stability of the error dynamics and then apply it to guide the design of the controllers. Finally, numerical examples are used to show the robustness and effectiveness of the proposed control strategy.

Keywords: Chaos synchronization, Nonlinear sliding surface, Control gains, Sliding mode control.

8819 Risk Quantification for Tunnel Excavation Process

Authors: J. Šejnoha, D. Jarušková, O. Špačková, E. Novotná

Abstract:

Construction of tunnels is connected with high uncertainty in the fields of cost, construction period, safety and impact on surroundings. Risk management has therefore become a common part of tunnel projects, especially after a set of fatal collapses occurred in the 1990s. Such collapses are usually caused by a combination of factors that can be divided into three main groups: unfavourable geological conditions, failures in design and planning, or failures in execution. This paper suggests a procedure enabling quantification of the excavation risk related to extraordinary accidents using FTA and ETA tools. It elaborates on a common process of risk analysis and enables the transfer of information and experience between particular tunnel construction projects. Further, it gives designers, management and other participants a guide on how to deal with the risk of such accidents and how to make qualified decisions based on a probabilistic approach.
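
As a hedged illustration of the FTA/ETA machinery named in this abstract, the toy sketch below propagates basic-event probabilities through AND/OR gates to a collapse top event and then splits consequences along an event-tree branch. The events, gate structure, independence assumption, and all probabilities are invented for illustration, not the paper's values.

```python
# Toy fault-tree calculation for a tunnel-collapse top event.
p_geology = 0.02   # unfavourable geological conditions
p_design  = 0.01   # failure in design/planning
p_exec    = 0.03   # failure in execution

def p_or(*ps):
    # OR gate under independence: 1 - prod(1 - p_i)
    out = 1.0
    for p in ps:
        out *= (1.0 - p)
    return 1.0 - out

def p_and(*ps):
    # AND gate under independence: prod(p_i)
    out = 1.0
    for p in ps:
        out *= p
    return out

# Collapse if bad geology coincides with a failure in design OR execution.
p_collapse = p_and(p_geology, p_or(p_design, p_exec))
print(f"P(top event) = {p_collapse:.5f}")

# Event-tree branch: given a collapse, split consequences by detection success.
p_detect = 0.7
print(f"P(collapse, undetected) = {p_collapse * (1 - p_detect):.6f}")
```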

Keywords: risk quantification, tunnel collapse, ETA, FTA, geotechnical risk

8818 Probability-Based Damage Detection of Structures Using Model Updating with Enhanced Ideal Gas Molecular Movement Algorithm

Authors: M. R. Ghasemi, R. Ghiasi, H. Varaee

Abstract:

Model updating methods have received increasing attention for damage detection in structures based on measured modal parameters. A probability-based damage detection (PBDD) procedure built on model updating is therefore presented in this paper, in which a one-stage model-based damage identification technique based on the dynamic features of a structure is investigated. The presented framework uses a finite element updating method with Monte Carlo simulation that considers the uncertainty caused by measurement noise. Enhanced ideal gas molecular movement (EIGMM) is used as the main algorithm for model updating. Ideal gas molecular movement (IGMM) is a multiagent optimization algorithm inspired by the movement of ideal gas molecules, which disperse rapidly in all directions and cover the entire available space owing to their high speeds and their collisions with each other and with the surrounding barriers. In the IGMM algorithm, the initial population of gas molecules is randomly generated, and the governing equations for molecular velocities and collisions are used to reach optimal solutions. In this paper, an enhanced version of IGMM, which removes unchanged variables after a specified number of iterations, is developed. The proposed method is implemented on two numerical examples in the field of structural damage detection. The results show that the proposed method performs well and is competitive in the PBDD of structures.

Keywords: Enhanced ideal gas molecular movement, ideal gas molecular movement, model updating method, probability-based damage detection, uncertainty quantification.

8817 Reliability-Based Topology Optimization Based on Evolutionary Structural Optimization

Authors: Sang-Rak Kim, Jea-Yong Park, Won-Goo Lee, Jin-Shik Yu, Seog-Young Han

Abstract:

This paper presents a Reliability-Based Topology Optimization (RBTO) based on Evolutionary Structural Optimization (ESO). An actual design involves uncertain conditions such as material properties, operational loads and dimensional variations. A Deterministic Topology Optimization (DTO) is obtained without considering these uncertainties. RBTO, however, involves the evaluation of probabilistic constraints, which can be done in two different ways: the reliability index approach (RIA) and the performance measure approach (PMA). The limit state function is approximated using Monte Carlo simulation and central composite design for the reliability analysis. ESO, one of the topology optimization techniques, is adopted for the topology optimization. Numerical examples are presented to compare DTO with RBTO.
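
A minimal sketch of the probabilistic-constraint evaluation mentioned above: Monte Carlo estimation of a failure probability P[g(X) < 0] for a toy resistance-minus-load limit state, with a crude reliability index computed under a normality assumption on g. The distributions and numbers are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(7)

# Monte Carlo estimate of P[g(X) < 0] for a toy limit state g = R - S.
n = 200_000
R = rng.normal(10.0, 1.0, n)     # resistance (e.g. allowable stress)
S = rng.normal(7.0, 1.5, n)      # load effect
g = R - S

pf = np.mean(g < 0)              # failure probability
beta = g.mean() / g.std()        # crude reliability index (assumes normal g)
print(f"P_f ≈ {pf:.4f}, reliability index β ≈ {beta:.2f}")
```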

Keywords: Evolutionary Structural Optimization, Performance Measure Approach, Reliability-Based Topology Optimization, Reliability Index Approach.

8816 A Novel Probabilistic Strategy for Modeling Photovoltaic Based Distributed Generators

Authors: Engy A. Mohamed, Yasser G. Hegazy

Abstract:

This paper presents a novel algorithm for modeling photovoltaic-based distributed generators for the purpose of optimal planning of distribution networks. The proposed algorithm utilizes the sequential Monte Carlo method in order to accurately account for the stochastic nature of photovoltaic-based distributed generators. The proposed algorithm is implemented in the MATLAB environment, and the results obtained are presented and discussed.
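
To make the sequential Monte Carlo idea concrete, the sketch below samples a chronological year of hourly PV output, drawing irradiance from a beta distribution, a common assumption in solar studies but not necessarily the distribution used in this paper. The rated power, daylight window, and beta parameters are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)

hours, days = 24, 365
rated_kw, g_std = 100.0, 1000.0          # rated power, standard irradiance W/m^2

def hourly_output(hour):
    if hour < 6 or hour > 18:            # night: no generation
        return 0.0
    g = 1000.0 * rng.beta(2.0, 2.5)      # stochastic irradiance sample
    return rated_kw * min(g / g_std, 1.0)

# Sequential Monte Carlo over a simulated year, preserving chronology.
series = np.array([hourly_output(h % 24) for h in range(hours * days)])
print(f"mean output: {series.mean():.1f} kW, "
      f"capacity factor: {series.mean() / rated_kw:.2%}")
```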

Keywords: Cumulative distribution function, distributed generation, Monte Carlo.

8815 Optimization of Air Pollution Control Model for Mining

Authors: Zunaira Asif, Zhi Chen

Abstract:

Sustainable air quality management is recognized as one of the most serious environmental concerns in mining regions. Mining operations emit various types of pollutants which have significant impacts on the environment. This study presents a stochastic control strategy, developing an air pollution control model to achieve a cost-effective solution. The optimization method is formulated to predict the cost of treatment using linear programming with an objective function and multiple constraints. The constraints focus mainly on two factors: the production of metal should not exceed the available resources, and air quality should meet the standard criteria for each pollutant. The applicability of this model is explored through a case study of an open pit metal mine in Utah, USA. The method uses meteorological data as a dispersion transfer function to reflect practical local conditions. The probabilistic analysis and the uncertainties in the meteorological conditions are handled by Monte Carlo simulation. Reasonable results have been obtained for selecting the optimized treatment technology for PM2.5, PM10, NOx, and SO2. Additional comparative analysis shows that the baghouse is the least-cost option for particulate matter, compared to the electrostatic precipitator and wet scrubbers, whereas non-selective catalytic reduction and dry flue-gas desulfurization are suitable for NOx and SO2 reduction, respectively. Thus, this model can aid planners in reducing these pollutants at a marginal cost by suggesting pollution control devices, while accounting for dynamic meteorological conditions and mining activities.
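
A toy version of the linear program described above: choose removal fractions for two pollutants so that post-treatment emissions meet air-quality limits at minimum cost. The emission rates, limits, and unit costs are invented for illustration, not the case-study data.

```python
from scipy.optimize import linprog

# x = [x_PM (baghouse), x_SO2 (dry FGD)], removal fraction in [0, 1].
cost = [120.0, 300.0]                    # $ per unit removal effort

# Emissions after treatment must not exceed the standard:
#   e_i * (1 - x_i) <= limit_i   ->   -e_i * x_i <= limit_i - e_i
e_pm, e_so2 = 80.0, 200.0                # uncontrolled emission rates
lim_pm, lim_so2 = 20.0, 50.0             # regulatory limits
A_ub = [[-e_pm, 0.0], [0.0, -e_so2]]
b_ub = [lim_pm - e_pm, lim_so2 - e_so2]

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=[(0, 1), (0, 1)])
print(f"PM removal: {res.x[0]:.2f}, SO2 removal: {res.x[1]:.2f}, "
      f"cost: {res.fun:.0f}")
```

Monte Carlo treatment of meteorological uncertainty, as in the paper, would amount to resampling the emission/dispersion inputs and re-solving this program per realization.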

Keywords: Air pollution, linear programming, mining, optimization, treatment technologies.

8814 Gender Based Variability Time Series Complexity Analysis

Authors: Ramesh K. Sunkaria, Puneeta Marwaha

Abstract:

Nonlinear methods of heart rate variability (HRV) analysis are becoming more popular. It has been observed that complexity measures quantify the regularity and uncertainty of cardiovascular RR-interval time series. In the present work, sample entropy (SampEn) has been evaluated in healthy normal sinus rhythm (NSR) male and female subjects for different data lengths and tolerance levels r. It is demonstrated that SampEn is small for higher values of tolerance r. Also, the SampEn value of the healthy female group is higher than that of the healthy male group for short data lengths; with an increase in data length the two groups overlap and become difficult to distinguish. SampEn gives inaccurate results by assigning the higher value to the female group, because male subjects have more complex HRV patterns than female subjects. Therefore, this traditional algorithm exhibits higher complexity for healthy female subjects than for healthy male subjects, which is a misleading observation. This may be due to the fact that SampEn does not account for the multiple time scales inherent in physiologic time series, and the hidden spatial and temporal fluctuations remain unexplored.
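
For reference, here is a compact (unoptimized) Python implementation of the sample entropy statistic evaluated in this abstract; the RR-interval series fed to it below is synthetic, where real HRV data would be used.

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy of series x with template length m and tolerance r*std(x)."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()
    n = len(x)

    def count_pairs(mm):
        # n - m templates of length mm (same count for both lengths, per the
        # standard SampEn definition); count pairs within Chebyshev tolerance.
        templates = np.array([x[i:i + mm] for i in range(n - m)])
        count = 0
        for i in range(len(templates) - 1):
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += int(np.sum(d <= tol))
        return count

    B, A = count_pairs(m), count_pairs(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

# Hypothetical RR-interval series (seconds), standing in for recorded HRV data.
rng = np.random.default_rng(5)
rr = 0.8 + 0.05 * rng.standard_normal(500)
print(f"SampEn(m=2, r=0.2) = {sample_entropy(rr):.3f}")
```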

Keywords: Heart rate variability, normal sinus rhythm group, RR interval time series, sample entropy.

8813 An Intelligent Approach of Rough Set in Knowledge Discovery Databases

Authors: Hrudaya Ku. Tripathy, B. K. Tripathy, Pradip K. Das

Abstract:

Knowledge Discovery in Databases (KDD) has evolved into an important and active area of research because of the theoretical challenges and practical applications associated with the problem of discovering (or extracting) interesting and previously unknown knowledge from very large real-world databases. Rough Set Theory (RST) is a mathematical formalism for representing uncertainty that can be considered an extension of classical set theory. It has been used in many different research areas, including those related to inductive machine learning and the reduction of knowledge in knowledge-based systems. One important concept related to RST is that of a rough relation. In this paper we present the current status of research on applying rough set theory to KDD, which will be helpful for handling the characteristics of real-world databases. The main aim is to show how rough sets and rough set analysis can be effectively used to extract knowledge from large databases.
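
As a minimal illustration of the rough-set machinery discussed here, the sketch below computes lower and upper approximations of a concept under the indiscernibility relation induced by condition attributes; the toy records are hypothetical, not drawn from a real database.

```python
# Objects and their condition-attribute values (hypothetical data).
records = {
    1: ("high", "yes"), 2: ("high", "yes"), 3: ("high", "no"),
    4: ("low", "no"),   5: ("low", "no"),   6: ("med", "yes"),
}
X = {1, 6}   # target concept, e.g. objects labelled "interesting"

# Equivalence classes: objects indiscernible on their attribute values.
classes = {}
for obj, attrs in records.items():
    classes.setdefault(attrs, set()).add(obj)

lower = {o for c in classes.values() if c <= X for o in c}   # certainly in X
upper = {o for c in classes.values() if c & X for o in c}    # possibly in X
print(f"lower: {lower}, upper: {upper}, boundary: {upper - lower}")
```

Objects 1 and 2 are indiscernible yet only 1 belongs to X, so they fall in the boundary region, which is exactly the uncertainty rough sets are designed to expose.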

Keywords: Data mining, Data tables, Knowledge discovery in database (KDD), Rough sets.

8812 A New Measurable Definition of Knowledge in New Growth Theory

Authors: Mohammad Ali Molaei

Abstract:

New Growth Theory helps us make sense of the ongoing shift from a resource-based economy to a knowledge-based economy. It underscores the point that the economic processes which create and diffuse new knowledge are critical to shaping the growth of nations, communities and individual firms. In many contributions to New (Endogenous) Growth Theory – though not in all – central reference is made to 'a stock of knowledge', a 'stock of ideas', etc., with this variable featuring centre-stage in the analysis. Yet it is immediately apparent that this is far from being a crystal-clear concept. The difficulty and uncertainty of capturing the value associated with knowledge is a real problem. The intent of this paper is to introduce new thinking and theorizing about knowledge and its measurability in New Growth Theory. Moreover, the study aims to synthesize various strains of the literature with a practical bearing on the knowledge concept. Through the institutional framework found within NGT, the knowledge concept can be measured indirectly. Institutions matter because they shape the environment for the production and employment of new knowledge.

Keywords: Institution framework, knowledge, New Growth Theory (NGT).

8811 On a Conjecture Regarding the Adam Optimizer

Authors: Mohamed Akrout, Douglas Tweed

Abstract:

The great success of deep learning relies on efficient optimizers, which are the algorithms that decide how to adjust network weights and biases based on gradient information. One of the most effective and widely used optimizers in recent years has been the method of adaptive moments, or Adam, but the mathematical reasons behind its effectiveness are still unclear. Attempts to analyse its behaviour have remained incomplete, in part because they hinge on a conjecture which has never been proven, regarding ratios of powers of the first and second moments of the gradient. Here we show that this conjecture is in fact false, but that a modified version of it is true, and can take its place in analyses of Adam.
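
For readers unfamiliar with the method, here is the textbook Adam update (Kingma & Ba) on a toy quadratic; the final line also prints the ratio of the bias-corrected first moment to the square root of the second, the kind of moment ratio the conjecture discussed above concerns. The toy objective and hyperparameters are illustrative only.

```python
import numpy as np

def grad(w):
    return 2.0 * (w - 3.0)               # gradient of (w - 3)^2

w, m, v = 0.0, 0.0, 0.0
alpha, b1, b2, eps = 0.1, 0.9, 0.999, 1e-8
for t in range(1, 201):
    g = grad(w)
    m = b1 * m + (1 - b1) * g            # first-moment estimate
    v = b2 * v + (1 - b2) * g * g        # second-moment estimate
    m_hat = m / (1 - b1 ** t)            # bias corrections
    v_hat = v / (1 - b2 ** t)
    w -= alpha * m_hat / (np.sqrt(v_hat) + eps)

print(f"w = {w:.4f} (optimum is 3), |m_hat|/sqrt(v_hat) = "
      f"{abs(m_hat) / np.sqrt(v_hat):.3f}")
```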

Keywords: Adam optimizer, Bock’s conjecture, stochastic optimization, average regret.

8810 On the Robust Stability of Homogeneous Perturbed Large-Scale Bilinear Systems with Time Delays and Constrained Inputs

Authors: Chien-Hua Lee, Cheng-Yi Chen

Abstract:

The stability test problem for homogeneous large-scale perturbed bilinear time-delay systems subjected to constrained inputs is considered in this paper. Both nonlinear uncertainties and interval systems are discussed. By utilizing the Lyapunov equation approach associated with linear algebraic techniques, several delay-independent criteria are presented to guarantee the robust stability of the overall systems. The main feature of the presented results is that, although the Lyapunov stability theorem is used, they do not involve any Lyapunov equation, which may be unsolvable. Furthermore, the proposed schemes can be applied to solve the stability analysis problem of large-scale time-delay systems.

Keywords: Homogeneous bilinear system, constrained input, time delay, uncertainty, transient response, decay rate.

8809 Reliability Analysis of Press Unit using Vague Set

Authors: S. P. Sharma, Monica Rani

Abstract:

In conventional reliability assessment, the reliability data of system components are treated as crisp values. The collected data have some uncertainties due to errors by human beings or machines, or from other sources. These uncertainty factors limit the understanding of system component failure because the data are incomplete. In these situations, we need to generalize classical methods to a fuzzy environment for studying and analyzing the systems of interest. Fuzzy set theory has been proposed to handle such vagueness by generalizing the notion of membership in a set. Essentially, in a Fuzzy Set (FS) each element is associated with a point value selected from the unit interval [0, 1], which is termed the grade of membership in the set. A Vague Set (VS), like an Intuitionistic Fuzzy Set (IFS), is a further generalization of an FS. Instead of the point-based membership used in FS, interval-based membership is used in VS. The interval-based membership in VS is more expressive in capturing the vagueness of data. In the present paper, vague set theory coupled with the conventional Lambda-Tau method is presented for the reliability analysis of repairable systems. The methodology uses Petri nets (PN) to model the system instead of a fault tree because it allows efficient simultaneous generation of minimal cut and path sets. The presented method is illustrated with the press unit of a paper mill.

Keywords: Lambda-Tau methodology, Petri nets, repairable system, vague fuzzy set.

8808 Stochastic Estimation of Wireless Traffic Parameters

Authors: Somenath Mukherjee, Raj Kumar Samanta, Gautam Sanyal

Abstract:

Different services based on different switching techniques in wireless networks lead to drastic changes in the properties of network traffic. Because of this diversity in services, network traffic is expected to undergo qualitative and quantitative variations. Hence, the assumption of traffic characteristics and the prediction of network events become more complex for wireless networks. In this paper, the traffic characteristics have been studied by collecting traces from the mobile switching centre (MSC). The traces include initiation and termination time, originating node, home station id, and foreign station id. Traffic parameters, namely call inter-arrival and holding times, were estimated statistically. The results show that call inter-arrival times in this wireless network are heavy-tailed and follow gamma distributions. They are asymptotically long-range dependent. It is also found that the call holding times are best fitted with a lognormal distribution. Based on these observations, an analytical model for performance estimation is also proposed.
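
The distribution fitting described here can be reproduced with scipy.stats, as in the sketch below; the two arrays are synthetic stand-ins for the MSC traces, and the gamma/lognormal parameters are invented.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)

# Synthetic stand-ins for MSC traces: inter-arrival times (gamma) and call
# holding times (lognormal); real traces would replace these arrays.
inter_arrival = rng.gamma(shape=0.8, scale=12.0, size=5000)     # seconds
holding = rng.lognormal(mean=4.0, sigma=0.9, size=5000)         # seconds

# Maximum-likelihood fits, with location pinned at zero for duration data.
a, loc, scale = stats.gamma.fit(inter_arrival, floc=0)
s, loc2, scale2 = stats.lognorm.fit(holding, floc=0)
print(f"gamma fit: shape={a:.2f}, scale={scale:.2f}")
print(f"lognormal fit: sigma={s:.2f}, median={scale2:.1f}s")

# Goodness of fit via Kolmogorov-Smirnov.
print(stats.kstest(inter_arrival, 'gamma', args=(a, loc, scale)))
print(stats.kstest(holding, 'lognorm', args=(s, loc2, scale2)))
```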

Keywords: Wireless networks, traffic analysis, long-range dependence, heavy-tailed distribution.

8807 Evaluation of Algorithms for Sequential Decision in Biosonar Target Classification

Authors: Turgay Temel, John Hallam

Abstract:

A sequential decision problem, based on the task of identifying the species of trees given acoustic echo data collected from them, is considered with well-known stochastic classifiers, including single and mixture Gaussian models. Echoes are processed with a preprocessing stage based on a model of mammalian cochlear filtering, using a new discrete low-pass filter characteristic. Stopping time performance of the sequential decision process is evaluated and compared. It is observed that the new low-pass filter processing results in faster sequential decisions.
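
To make the stopping-time idea concrete, here is Wald's sequential probability ratio test, a standard instance of this kind of sequential decision rule (the paper's classifiers differ); the two species are modeled as Gaussians with different means, and all parameters are hypothetical.

```python
import numpy as np
from scipy.stats import norm

def sprt(samples, mu0, mu1, sigma, alpha=0.01, beta=0.01):
    # Wald thresholds from the desired error probabilities.
    A, B = np.log((1 - beta) / alpha), np.log(beta / (1 - alpha))
    llr = 0.0
    for t, x in enumerate(samples, start=1):
        llr += norm.logpdf(x, mu1, sigma) - norm.logpdf(x, mu0, sigma)
        if llr >= A:
            return "species 1", t        # stop: accept H1
        if llr <= B:
            return "species 0", t        # stop: accept H0
    return "undecided", len(samples)

rng = np.random.default_rng(2)
echo_features = rng.normal(1.0, 1.0, 100)    # stream drawn from species 1
decision, stop_time = sprt(echo_features, mu0=0.0, mu1=1.0, sigma=1.0)
print(f"decision: {decision} after {stop_time} echoes")
```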

Keywords: Classification, neuro-spike coding, parametric model, Gaussian mixture with EM algorithm, sequential decision.

8806 Normalization and Constrained Optimization of Measures of Fuzzy Entropy

Authors: K.C. Deshmukh, P.G. Khot, Nikhil

Abstract:

In the literature of information theory, there is a need to compare different measures of fuzzy entropy, and this, consequently, gives rise to the need for normalizing measures of fuzzy entropy. In this paper, we discuss this need and develop some normalized measures of fuzzy entropy. It is also desirable to maximize entropy and to minimize directed divergence or distance. Keeping this idea in mind, we explain the method of optimizing different measures of fuzzy entropy.
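
One classic example of a normalized fuzzy entropy is the De Luca-Termini measure divided by its maximum n·ln(2), sketched below; this is an illustration of the normalization idea, not necessarily the measures developed in this paper.

```python
import numpy as np

def fuzzy_entropy_normalized(mu):
    """De Luca-Termini fuzzy entropy, normalized to [0, 1] by n*ln(2)."""
    mu = np.clip(np.asarray(mu, dtype=float), 1e-12, 1 - 1e-12)
    h = -np.sum(mu * np.log(mu) + (1 - mu) * np.log(1 - mu))
    return h / (len(mu) * np.log(2.0))

crisp = [0.0, 1.0, 1.0, 0.0]          # no vagueness -> entropy 0
maximal = [0.5, 0.5, 0.5, 0.5]        # maximal vagueness -> entropy 1
print(f"{fuzzy_entropy_normalized(crisp):.3f}")
print(f"{fuzzy_entropy_normalized(maximal):.3f}")
```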

Keywords: Fuzzy set, uncertainty, fuzzy entropy, normalization, membership function.

8805 Action Recognition in Video Sequences using a Mealy Machine

Authors: L. Rodriguez-Benitez, J. Moreno-Garcia, J.J. Castro-Schez, C. Solana, L. Jimenez

Abstract:

In this paper, the use of sequential machines for recognizing actions taken by objects detected by a general tracking algorithm is proposed. The system may deal with the uncertainty inherent in medium-level vision data. For this purpose, fuzzification of the input data is performed. Besides, this transformation allows data to be managed independently of the tracking application selected and enables adding characteristics of the analyzed scenario. The representation of actions by means of an automaton and the generation of the input symbols for the finite automaton, depending on the object and action compared, are described. The output of the comparison process between an object and an action is a numerical value that represents the membership of the object in the action. This value is computed depending on how similar the object and the action are. The work concludes with the application of the proposed technique to identify the behavior of vehicles in road traffic scenes.
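
A minimal Mealy-machine sketch in the spirit of this abstract: transitions consume a fuzzified input symbol and emit an output contributing to the action membership value. The states, symbols, weights, and the averaging rule are invented for illustration, not taken from the paper.

```python
# (state, input symbol) -> (next state, output)
transitions = {
    ("idle",     "moving_fast"): ("speeding", 0.6),
    ("idle",     "moving_slow"): ("cruising", 0.3),
    ("speeding", "moving_fast"): ("speeding", 0.9),
    ("speeding", "moving_slow"): ("cruising", 0.2),
    ("cruising", "moving_fast"): ("speeding", 0.5),
    ("cruising", "moving_slow"): ("cruising", 0.4),
}

def recognize(symbols, start="idle"):
    """Run the machine; average the outputs as the action membership value."""
    state, outputs = start, []
    for sym in symbols:
        state, out = transitions[(state, sym)]
        outputs.append(out)
    return sum(outputs) / len(outputs)

track = ["moving_slow", "moving_fast", "moving_fast", "moving_fast"]
print(f"membership of track in action 'overtaking': {recognize(track):.2f}")
```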

Keywords: Approximate reasoning, finite state machines, video analysis.

8804 Analysis of Thermal Deformation of a Rough Slider and Its Asperities and Its Impact on Load Generation in Parallel Sliders

Authors: Prawal Sinha, Getachew Adamu

Abstract:

Heating is inevitable in any bearing operation. It leads not only to the thinning of the lubricant but can also lead to thermal deformation of the bearing. The present work analyzes the influence of thermal deformation on the thermohydrodynamic lubrication of infinitely long tilted-pad slider rough bearings. As a consequence of heating, the slider is deformed and is assumed to take a parabolic shape. The asperities also expand, leading to a smaller effective film thickness. Two different types of surface roughness are considered: longitudinal roughness and transverse roughness. Christensen's stochastic approach is used to derive the Reynolds-type equations. Density and viscosity are considered to be temperature dependent. The modified Reynolds equation, momentum equation, continuity equation and energy equation are decoupled and solved using the finite difference method to yield various bearing characteristics. The numerical simulations show that the performance of the bearing is significantly affected by the thermal distortion of the slider and asperities, and even the parallel sliders seem to carry some load.

Keywords: Thermal deformation, tilted-pad slider bearing, longitudinal roughness, transverse roughness, load capacity.

8803 Defining a Framework for Holistic Life Cycle Assessment of Building Components

Authors: Naomi Grigoryan, Alexandros Loutsioli Daskalakis, Anna Elisse Uy, Yihe Huang, Aude Laurent (Webanck)

Abstract:

In response to the building and construction sectors accounting for a third of all energy demand and emissions, the European Union has introduced new laws and regulations in the construction sector that emphasize material circularity, energy efficiency, biodiversity, and social impact. Existing design tools assess sustainability in early-stage design for products or buildings; however, there is no standardized methodology for measuring the circularity performance of building components. Existing assessment methods for building components focus primarily on carbon footprint but lack the comprehensive analysis required to design for circularity. The research conducted in this paper covers the parameters needed to assess sustainability in the design process of architectural products such as doors, windows, and facades. It maps a framework for a tool that assists designers with real-time sustainability metrics. Assessing the life cycle of building components such as façades, windows, and doors involves the life cycle stages applied to product design as well as many of the methods used in the life cycle analysis of buildings. The current industry standards of sustainability assessment for metal building components follow cradle-to-grave life cycle assessment (LCA), track Global Warming Potential (GWP), and document the parameters used for an Environmental Product Declaration (EPD). By expanding on the Material Circularity Indicator (MCI) with additional indicators such as the Water Circularity Index (WCI), the Energy Circularity Index (ECI), the Social Circularity Index (SCI), and Life Cycle Economic Value (EV), and by calculating biodiversity risk and uncertainty, the assessment of an architectural product's impact can be targeted more specifically based on product requirements, performance, and lifespan. Broadening the scope of LCA calculation for products to incorporate aspects of building design allows product designers to account for the disassembly of architectural components. For example, the MCI for architectural products such as windows and facades is typically low due to the impact of glass, as 70% of glass ends up in landfills due to damage in the disassembly process. The low MCI can be countered by expanding beyond cradle-to-grave assessment and focusing the design process on disassembly, recycling, and repurposing with the help of real-time assessment tools. Design for Disassembly and Urban Mining have been integrated within the construction field only on small scales, as project-based exercises that do not address the entire supply chain of architectural products. By adopting more comprehensive sustainability metrics and incorporating uncertainty calculations, building components can be assessed more accurately with decarbonization and disassembly in mind, addressing the large-scale commercial construction markets, some of the most significant contributors to climate change.

Keywords: Architectural products, early-stage design, life cycle assessment, material circularity indicator.

8802 Entropy Measures on Neutrosophic Soft Sets and Its Application in Multi Attribute Decision Making

Authors: I. Arockiarani

Abstract:

The focus of this paper is to furnish entropy measures for a neutrosophic set and a neutrosophic soft set, which are measures of the uncertainty that permeates discourse and systems. Various characterizations of entropy measures are derived. Further, we exemplify this concept by applying entropy to various real-time decision-making problems.

Keywords: Entropy measure, Hausdorff distance, neutrosophic set, soft set.

8801 Sparsity-Aware Affine Projection Algorithm for System Identification

Authors: Young-Seok Choi

Abstract:

This work presents a new type of affine projection (AP) algorithm which incorporates the sparsity condition of a system. To exploit the sparsity of the system, a weighted l1-norm regularization is imposed on the cost function of the AP algorithm. By minimizing the cost function with subgradient calculus and choosing two distinct weightings for the l1-norm, two stochastic-gradient-based sparsity-regularized AP (SR-AP) algorithms are developed. Experimental results show that the SR-AP algorithms outperform their conventional AP counterparts in identifying sparse systems.
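
The core mechanism, an l1 subgradient term pulling small coefficients toward zero, can be seen in the zero-attracting LMS filter below; this is the simplest relative of the paper's SR-AP algorithms, shown as a stand-in rather than the paper's method, with an invented sparse system.

```python
import numpy as np

rng = np.random.default_rng(4)

# Zero-attracting LMS: standard LMS step plus the subgradient of an l1 penalty.
n_taps, n_iter = 16, 4000
w_true = np.zeros(n_taps)
w_true[[2, 9]] = [1.0, -0.5]             # sparse unknown system

w = np.zeros(n_taps)
mu, rho = 0.01, 5e-5                     # step size, zero-attraction weight
x_buf = np.zeros(n_taps)
for _ in range(n_iter):
    x_buf = np.roll(x_buf, 1)
    x_buf[0] = rng.standard_normal()                     # white input
    d = w_true @ x_buf + 0.01 * rng.standard_normal()    # noisy desired signal
    e = d - w @ x_buf
    w += mu * e * x_buf - rho * np.sign(w)   # LMS step + l1 subgradient pull

print(f"misalignment: {np.linalg.norm(w - w_true) / np.linalg.norm(w_true):.3e}")
```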

Keywords: System identification, adaptive filter, affine projection, sparsity, sparse system.

8800 Solutions to Probabilistic Constrained Optimal Control Problems Using Concentration Inequalities

Authors: Tomoaki Hashimoto

Abstract:

Recently, optimal control problems subject to probabilistic constraints have attracted much attention in many research fields. Although probabilistic constraints are generally intractable in optimization problems, several methods have been proposed to deal with them. In most methods, probabilistic constraints are transformed into deterministic constraints that are tractable in optimization problems. This paper examines a method for transforming probabilistic constraints into deterministic constraints for a class of probabilistic constrained optimal control problems.
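
One standard concentration-inequality route for such a transformation, not necessarily the one examined in this paper, uses the one-sided Chebyshev (Cantelli) bound: requiring mean(g) + k·std(g) ≤ 0 with k = sqrt((1-eps)/eps) guarantees P(g ≤ 0) ≥ 1-eps for any distribution. The toy constraint below is invented to show the (conservative) guarantee numerically.

```python
import numpy as np

rng = np.random.default_rng(6)

eps = 0.05
k = np.sqrt((1 - eps) / eps)             # Cantelli multiplier

# Toy constraint g(u, w) = u + w, decision u, zero-mean disturbance w.
sigma = 0.5
u = -k * sigma                 # largest u satisfying the deterministic surrogate
w = rng.normal(0.0, sigma, 1_000_000)
print(f"deterministic surrogate picks u = {u:.3f}")
print(f"empirical P(g <= 0) = {np.mean(u + w <= 0):.4f}  (required >= {1 - eps})")
```

Because the bound is distribution-free, the empirical satisfaction probability typically far exceeds the required level; sharper inequalities trade this conservatism for distributional assumptions.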

Keywords: Optimal control, stochastic systems, discrete-time systems, probabilistic constraints.

8799 Performance Analysis of Brain Tumor Detection Based On Image Fusion

Authors: S. Anbumozhi, P. S. Manoharan

Abstract:

Medical image fusion plays a vital role in the medical field for diagnosing brain tumors, which can be classified as benign or malignant. It is the process of integrating multiple images of the same scene into a single fused image in order to reduce uncertainty and minimize redundancy while extracting all the useful information from the source images. Fuzzy logic is used to fuse two brain MRI images with different views. The fused image is more informative than the source images. Texture and wavelet features are extracted from the fused image. A multilevel adaptive neuro-fuzzy classifier classifies the brain tumors based on the trained and tested features. The proposed method achieved 80.48% sensitivity, 99.9% specificity and 99.69% accuracy. Experimental results obtained from the fusion process prove that the proposed image fusion approach performs better than conventional fusion methodologies.

Keywords: Image fusion, Fuzzy rules, Neuro-fuzzy classifier.

8798 Reliability Analysis of Underground Pipelines Using Subset Simulation

Authors: Kong Fah Tee, Lutfor Rahman Khan, Hongshuang Li

Abstract:

An advanced Monte Carlo simulation method, called Subset Simulation (SS), for time-dependent reliability prediction of underground pipelines is presented in this paper. SS can provide better resolution at low failure probability levels by efficiently investigating the rare failure events that are commonly encountered in pipeline engineering applications. In the SS method, random samples leading to progressive failure are generated efficiently and used for computing probabilistic performance through statistical variables. SS gains its efficiency by expressing a small failure probability as a product of larger conditional probabilities of a sequence of intermediate events. The efficiency of SS has been demonstrated by numerical studies, and attention in this work is devoted to scrutinising the robustness of the SS application in pipe reliability assessment. It is hoped that this development work can promote the use of SS tools for uncertainty propagation in the decision-making process for underground pipeline network reliability prediction.
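
A compact sketch of the SS mechanics on a one-dimensional toy limit state (failure when a standard-normal variable exceeds 4, exact probability about 3.17e-05); a pipeline degradation model would replace g(x), and the level probability p0, sample size, and proposal are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(8)

def g(x):
    return 4.0 - x          # failure domain: g(x) <= 0

def subset_simulation(n=2000, p0=0.1, max_levels=12):
    x = rng.standard_normal(n)              # level 0: direct Monte Carlo
    p_f = 1.0
    for _ in range(max_levels):
        gx = g(x)
        thr = np.quantile(gx, p0)           # intermediate threshold
        if thr <= 0.0:                      # failure domain reached
            return p_f * np.mean(gx <= 0.0)
        p_f *= p0                           # conditional level probability
        seeds = x[gx <= thr]                # samples conditional on level i
        # Modified Metropolis: grow chains from the seeds to repopulate the level.
        out = []
        steps = int(np.ceil(n / len(seeds)))
        for s in seeds:
            cur = s
            for _ in range(steps):
                cand = cur + rng.normal(0.0, 1.0)
                accept = rng.random() < np.exp(0.5 * (cur**2 - cand**2))
                if accept and g(cand) <= thr:   # stay within the current level
                    cur = cand
                out.append(cur)
        x = np.asarray(out[:n])
    return p_f * np.mean(g(x) <= 0.0)

print(f"SS estimate of P(X > 4): {subset_simulation():.2e} (exact ~ 3.17e-05)")
```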

Keywords: Underground pipelines, Probability of failure, Reliability and Subset Simulation.

8797 Average Turbulent Pipe Flow with Heat Transfer Using a Three-Equation Model

Authors: Khalid Alammar

Abstract:

The aim of this study is to evaluate a new three-equation turbulence model applied to flow and heat transfer through a pipe. Uncertainty is approximated by comparison with published direct numerical simulation results for fully-developed flow. The error in the mean axial velocity, temperature, friction, and heat transfer is found to be negligible.

Keywords: Heat Transfer, Nusselt number, Skin friction, Turbulence.
