Search results for: stochastic uncertainty analysis
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 28406


27896 Exploring the Impact of Additive Manufacturing on Supply Chains: A Game-Theoretic Analysis of Manufacturer-Retailer Dynamics

Authors: Mohammad Ebrahim Arbabian

Abstract:

This paper investigates the impact of 3D printing, also known as additive manufacturing, on a multi-item supply chain comprising a manufacturer and a retailer. The supply chain operates under a wholesale-price contract and caters to stochastic customer demand, and this study delves into the largely unexplored question of how 3D printing technology reshapes supply chain dynamics. A distinguishing aspect of 3D printing is its versatility in producing various product types, yet its slower production pace compared to traditional methods poses a challenge. We analyze the trade-off between 3D printing's limited capacity and its enhancement of production flexibility. By delineating the economic circumstances favoring 3D printing adoption by the manufacturer, we establish the Stackelberg equilibrium in the retailer-manufacturer game. Additionally, we determine optimal order quantities for the retailer when 3D printing is an option for the manufacturer, ascertain optimal wholesale prices in the presence of 3D printing, and compute optimal profits for both parties in the supply chain.
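The Stackelberg logic of a wholesale-price contract can be sketched in miniature, assuming a simple newsvendor-style retailer facing uniform demand; all parameter values below are hypothetical and this is not the paper's multi-item model:

```python
import numpy as np

# Illustrative Stackelberg game under a wholesale-price contract:
# demand is uniform on [0, D]; the retailer (follower) best-responds with
# the newsvendor quantity, the manufacturer (leader) picks the wholesale price.
p, c, D = 10.0, 2.0, 100.0   # retail price, unit cost, max demand (hypothetical)

def retailer_order(w):
    # Newsvendor critical fractile for U[0, D] demand: q*(w) = D * (p - w) / p
    return D * max(p - w, 0.0) / p

def manufacturer_profit(w):
    return (w - c) * retailer_order(w)

# The leader optimizes while anticipating the follower's best response
ws = np.linspace(c, p, 100001)
w_star = ws[np.argmax([manufacturer_profit(w) for w in ws])]
q_star = retailer_order(w_star)
print(round(w_star, 2), round(q_star, 2))  # analytic optimum: w* = (p + c) / 2 = 6.0, q* = 40.0
```

The equilibrium is found by backward induction: the retailer's order quantity is solved first as a function of the wholesale price, then the manufacturer's price is chosen against that response curve.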

Keywords: additive manufacturing, supply chain management, contract theory, Stackelberg game, optimization

Procedia PDF Downloads 55
27895 The Relationship Between Soldiers’ Psychological Resilience, Leadership Style and Organisational Commitment

Authors: Rosita Kanapeckaite

Abstract:

The modern operational military environment combines change, uncertainty, complexity and ambiguity. Stiehm (2002) refers to such situations as VUCA situations, an acronym commonly used to describe the volatility, uncertainty, complexity and ambiguity of various conditions. Increasingly fast-paced military operations require military personnel to demonstrate readiness and resilience under stressful conditions in order to maintain the optimum cognitive and physical performance necessary for success. Military resilience can be defined as the ability to cope with the negative effects of setbacks and associated stress on military performance and combat effectiveness. In this environment, both current and future operations require, and place a higher priority on, enhancing and maintaining troop readiness and resilience to win decisively in multidimensional combat. This paper explores the phenomenon of soldiers' psychological resilience, theories of leadership, and commitment to the organisation. The aim of the study is to examine the relationship between soldiers' psychological resilience, leadership style and commitment to the organisation. The study involved 425 professional soldiers; the research method was a questionnaire survey. The instruments used were measures of psychological resilience, leadership styles and commitment to the organisation. Results: a transformational leadership style predicts higher psychological resilience, and psychologically resilient professional soldiers are more committed to the organisation. The study confirms the importance of soldiers' psychological resilience for their commitment to the organisation. The paper also discusses practical applications.

Keywords: resilience, commitment, soldiers, leadership style

Procedia PDF Downloads 71
27894 Fuzzy Total Factor Productivity by Credibility Theory

Authors: Shivi Agarwal, Trilok Mathur

Abstract:

This paper proposes a method to measure total factor productivity (TFP) change by credibility theory for fuzzy input and output variables. Total factor productivity change has been widely studied with crisp input and output variables; however, in some cases, the input and output data of decision-making units (DMUs) can only be measured with uncertainty. Such data can be represented as linguistic variables characterized by fuzzy numbers. The Malmquist productivity index (MPI) is widely used to estimate TFP change by calculating the total factor productivity of a DMU for different time periods using data envelopment analysis (DEA). The fuzzy DEA (FDEA) model is solved using credibility theory, and its results are used to measure the TFP change for fuzzy input and output variables. Finally, numerical examples are presented to illustrate the proposed method. The suggested methodology can be utilized for performance evaluation of DMUs and helps to assess their level of integration. It can also be applied to rank the DMUs, to identify the DMUs that are lagging behind, and to make recommendations as to how they can improve their performance to bring them on par with other DMUs.
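The MPI calculation underlying the abstract can be illustrated for the crisp, single-input/single-output case under constant returns to scale; the paper's fuzzy credibility-based DEA would replace these crisp efficiency scores with credibility-constrained ones, and all DMU data below are hypothetical:

```python
import math

# Minimal MPI sketch: under CRS with one input and one output, the DEA
# distance function reduces to the productivity ratio relative to the best
# observed ratio on the frontier of the reference period.
def efficiency(x, y, frontier):
    best = max(yy / xx for xx, yy in frontier)
    return (y / x) / best

# Hypothetical (input, output) data for three DMUs in periods t and t+1
period_t  = [(2.0, 4.0), (3.0, 9.0), (5.0, 10.0)]
period_t1 = [(2.0, 5.0), (3.0, 12.0), (5.0, 11.0)]

def malmquist(dmu):
    xt, yt = period_t[dmu]
    x1, y1 = period_t1[dmu]
    # Four distance functions: each period's point against each frontier
    d_t_t   = efficiency(xt, yt, period_t)
    d_t_t1  = efficiency(xt, yt, period_t1)
    d_t1_t  = efficiency(x1, y1, period_t)
    d_t1_t1 = efficiency(x1, y1, period_t1)
    # Geometric mean of the period-t and period-t+1 indices
    return math.sqrt((d_t1_t / d_t_t) * (d_t1_t1 / d_t_t1))

print(round(malmquist(1), 3))  # > 1 indicates TFP growth for DMU 1
```

In the general multi-input/multi-output case each distance function is the optimum of a DEA linear program rather than a simple ratio.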

Keywords: chance-constrained programming, credibility theory, data envelopment analysis, fuzzy data, Malmquist productivity index

Procedia PDF Downloads 358
27893 There's No End in Sight: An Interpretative Phenomenological Analysis of Quality of Life in Burning Mouth Syndrome Sufferers

Authors: R. McGrath, A. Trace, S. Curtin, C. McCreary

Abstract:

Introduction: Although much energy has been expended on the definition and etiology of Burning Mouth Syndrome (BMS), both remain contentious. There is agreement on the symptoms, but on little else, and approaches to treatment vary widely. It has been established, however, that the condition has a detrimental effect on sufferers' quality of life. Much research has focused on the physical impact of the syndrome; recently, some literature has turned to social, functional, and psychological factors. Yet there is very little qualitative research on how burning mouth syndrome affects the lives of sufferers, and the present study seeks to remedy this. Method: The study recruited five male participants who took part in semi-structured interviews lasting between 30 and 50 minutes. Data was analysed using Interpretative Phenomenological Analysis. Results: The study identified four super-ordinate themes: Lack of Control due to Uncertainty about Condition; Disruption to Internal Sense of Self; Negative Future Expectation due to Chronic Symptoms; and Sense of BMS as an Intrusive Force. Aspects of these themes reflect areas of reduction in quality of life. Conclusion: BMS damages an individual's quality of life in ways that have not been reflected in self-report surveys of health-related quality of life. The condition has serious implications for the individual's sense of self, identity, and future. The study recommends further qualitative research in this area, as well as the use of therapeutic interventions with BMS sufferers, which would benefit not only sufferers but also best practice in relation to their treatment.

Keywords: burning mouth syndrome, interpretative phenomenological analysis, qualitative research, quality of life

Procedia PDF Downloads 437
27892 Challenges of Cryogenic Fluid Metering by Coriolis Flowmeter

Authors: Evgeniia Shavrina, Yan Zeng, Boo Cheong Khoo, Vinh-Tan Nguyen

Abstract:

The present paper provides a review of error sources in cryogenic metering by Coriolis flowmeters (CFMs). Whereas these flowmeters allow accurate water metering, high uncertainty and low repeatability are commonly observed in cryogenic fluid metering, which is often necessary for effective renewable energy production and storage. The sources of these issues can be classified as general and cryogenic-specific challenges. An analysis of experimental and theoretical studies shows that material behaviour at cryogenic temperatures, composition variety, and multiphase presence are the most significant cryogenic-specific challenges. At the same time, pipeline diameter limitation, ambient vibration impact, and installation drawbacks may be highlighted as the most important general challenges of cryogenic metering by CFMs. Finally, the techniques that mitigate the impact of these challenges are reviewed, and directions for future development are indicated.

Keywords: Coriolis flowmeter, cryogenic, multicomponent flow, multiphase flow

Procedia PDF Downloads 147
27891 Quality of Service of Transportation Networks: A Hybrid Measurement of Travel Time and Reliability

Authors: Chin-Chia Jane

Abstract:

In a transportation network, travel time refers to the transmission time from source node to destination node, whereas reliability refers to the probability of a successful connection from source node to destination node. With an increasing emphasis on quality of service (QoS), both performance indexes are significant in the design and analysis of transportation systems. In this work, we extend the well-known flow network model for transportation networks so that travel time and reliability are integrated into the QoS measurement simultaneously. In the extended model, in addition to the general arc capacities, each intermediate node has a time weight, which is the travel time per unit of commodity going through the node. Meanwhile, arcs and nodes are treated as binary random variables that switch between operation and failure with associated probabilities. For a pre-specified travel time limitation and demand requirement, the QoS of a transportation network is the probability that the source can successfully transport the demand requirement to the destination while the total transmission time is under the travel time limitation. This work is pioneering: whereas the existing literature evaluates travel time reliability via a single optimal path, the proposed QoS measure captures the performance of the whole network system. To compute the QoS of transportation networks, we first transform the extended network model into an equivalent min-cost max-flow network model. In the transformed network, each original arc has a new travel time weight which takes value 0. Each intermediate node is replaced by two nodes u and v, and an arc directed from u to v. The newly generated nodes u and v are perfect nodes. The new direct arc has three weights: travel time, capacity, and operation probability. Then the universal set of state vectors is recursively decomposed into disjoint subsets of reliable, unreliable, and stochastic vectors until no stochastic vector is left.
The decomposition is made possible by applying an existing efficient min-cost max-flow algorithm. Because the reliable subsets are disjoint, the QoS can be obtained directly by summing their probabilities. Computational experiments are conducted on a benchmark network with 11 nodes and 21 arcs; five travel time limitations and five demand requirements are set to compute the QoS value. For comparison, we test the exhaustive complete enumeration method, and the results reveal that the proposed algorithm is much more efficient. In summary, a transportation network is analyzed by an extended flow network model where each arc has a fixed capacity, each intermediate node has a time weight, and both arcs and nodes are independent binary random variables. The quality of service of the transportation network integrates customer demands, travel time, and the probability of connection, and we present a decomposition algorithm to compute it efficiently.
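The QoS definition can be illustrated with the complete-enumeration baseline the paper compares against, on a toy two-path network; the component data are hypothetical and the time constraint is simplified to a per-node check, whereas the paper's model bounds the total transmission time:

```python
from itertools import product

# Toy QoS computation by complete enumeration: probability that the network
# can ship `demand` units from source to sink within the travel-time limit.
# Each intermediate node is an independent binary component described by
# (capacity, per-unit travel time, operation probability) -- hypothetical values.
nodes = [(3, 2, 0.9), (3, 3, 0.8)]   # two parallel intermediate nodes
demand, time_limit = 3, 3

def qos():
    total = 0.0
    for states in product([0, 1], repeat=len(nodes)):
        prob = 1.0
        usable_capacity = 0
        for up, (cap, t, p) in zip(states, nodes):
            prob *= p if up else (1 - p)
            if up and t <= time_limit:   # node works and respects the time limit
                usable_capacity += cap
        if usable_capacity >= demand:    # this state vector is "reliable"
            total += prob
    return total

print(round(qos(), 4))  # 1 - (1 - 0.9)(1 - 0.8) = 0.98
```

Enumeration grows as 2^n in the number of components, which is exactly why the paper's recursive decomposition into reliable/unreliable/stochastic subsets matters for larger networks.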

Keywords: quality of service, reliability, transportation network, travel time

Procedia PDF Downloads 218
27890 Information Disclosure and Financial Sentiment Index Using a Machine Learning Approach

Authors: Alev Atak

Abstract:

In this paper, we aim to create a financial sentiment index by investigating companies' voluntary information disclosures. We retrieve structured content from the financial reports of BIST 100 companies for the period 1998-2018 and extract relevant financial information for sentiment analysis through Natural Language Processing. We measure strategy-related disclosures and their cross-sectional variation and classify report content into generic sections using synonym lists divided into four main categories: liquidity risk profile, risk positions, intra-annual information, and exposure to risk. We use Word Error Rate and Cosine Similarity for comparing and measuring text similarity and derivation in sets of texts. In addition to performing text extraction, we provide a range of text analysis options, such as readability metrics, word counts using pre-determined lists (e.g., forward-looking, uncertainty, tone), and comparison with a reference corpus (at the word, part-of-speech and semantic level). We thereby create an analytical tool and a financial dictionary that depict the importance of granular financial disclosure for investors seeking to correctly identify risk-taking behavior, and hence make the aggregated effects traceable.
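The cosine-similarity step used for comparing report sections can be sketched with a bag-of-words vectorization; the two sentences are invented examples standing in for disclosure text, not data from the paper:

```python
import math
from collections import Counter

# Cosine similarity between two texts under a simple bag-of-words model:
# each text becomes a term-frequency vector, and similarity is the cosine
# of the angle between the vectors.
def cosine_similarity(text_a, text_b):
    a, b = Counter(text_a.lower().split()), Counter(text_b.lower().split())
    common = set(a) & set(b)
    dot = sum(a[w] * b[w] for w in common)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

s1 = "liquidity risk exposure increased this quarter"   # hypothetical disclosure text
s2 = "liquidity risk exposure decreased this quarter"
print(round(cosine_similarity(s1, s2), 3))  # 5 shared tokens of 6 -> 5/6
```

A production pipeline would add tokenization, stemming and weighting (e.g. TF-IDF) before the cosine step.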

Keywords: financial sentiment, machine learning, information disclosure, risk

Procedia PDF Downloads 92
27889 Strategic Risk Issues for Film Distributors of Hindi Film Industry in Mumbai: A Grounded Theory Approach

Authors: Rashmi Dyondi, Shishir K. Jha

Abstract:

The purpose of the paper is to address the strategic risk issues surrounding Hindi film distribution in Mumbai for a film distributor, who acts as an entrepreneur when launching a product (movie) in the market (film territory). The paper undertakes a fundamental review of films and risk in the Hindi film industry and applies the Grounded Theory technique to understand the complex phenomenon of the risk-taking behavior of film distributors (both independents and studios) in Mumbai. Rich in-depth interviews with distributors are coded to develop core categories through constant comparison, leading to conceptualization of the phenomena of interest. This paper is a first-of-its-kind attempt to understand the risk behavior of a distributor, which is akin to entrepreneurial risk behavior under conditions of uncertainty. Unlike the extensive scholarly work on the dynamics of the Hollywood motion picture industry, the Hindi film industry remains under-researched; in particular, how film distributors perceive risk is unexplored. Films are unique experience products, and the film distributor acts as an entrepreneur assuming high risks given the uncertainty in the motion picture business. With mighty corporate studios and astronomical film budgets posing serious business threats to independent distributors, there is a need for an in-depth qualitative enquiry (applying the grounded theory technique) to unravel the definition of risk for the independent distributors in Mumbai vis-à-vis the corporate studios. The need for good content was a challenge common to both groups in the present state of the industry; however, corporate studios, with their distinct ideologies, focus on their own productions and financial power, faced a different set of challenges than the independents (such as achieving sustainability in business).
Softer issues, such as market goodwill and relations with producers, honesty in business dealings and transparency, emerged as clear markers of independents' long-run success. The findings from the qualitative analysis stress the different elements of risk and challenge as perceived by the two groups of distributors in the Hindi film industry and provide a future research agenda for empirical investigation of the determinants of the box-office success of Hindi films distributed in Mumbai.

Keywords: entrepreneurial risk behavior, film distribution strategy, Hindi film industry, risk

Procedia PDF Downloads 311
27888 Mean Monthly Rainfall Prediction at Benina Station Using Artificial Neural Networks

Authors: Hasan G. Elmazoghi, Aisha I. Alzayani, Lubna S. Bentaher

Abstract:

Rainfall is a highly non-linear phenomenon, whose accurate prediction requires powerful supervised data mining techniques. In this study, the Artificial Neural Network (ANN) technique is used to predict mean monthly rainfall from historical data collected at the BENINA station in Benghazi over 31 years (1977-2006), and the results are compared against the observed values. A specific objective was to determine the best combination of weather variables to be used as inputs for the ANN model. Several statistical parameters were calculated, and an uncertainty analysis of the results is also presented. The best ANN model is then applied to the data of one year (2007) as a case study in order to evaluate the performance of the model. Simulation results reveal that the ANN technique is promising and can provide reliable estimates of rainfall.
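The statistical parameters used to compare predictions against observations can be sketched as follows; the observed and predicted values below are invented for illustration, not BENINA data, and the specific metrics (MAE, RMSE, R²) are common choices rather than necessarily the ones the paper reports:

```python
import math

# Typical goodness-of-fit statistics for a rainfall prediction model:
# mean absolute error, root mean square error, and the coefficient of
# determination (equivalent to Nash-Sutcliffe efficiency here).
observed  = [12.0, 0.5, 3.2, 25.1, 8.0, 0.0]   # hypothetical monthly rainfall (mm)
predicted = [10.5, 1.0, 2.8, 22.0, 9.1, 0.4]

n = len(observed)
errors = [p - o for p, o in zip(predicted, observed)]
mae  = sum(abs(e) for e in errors) / n
rmse = math.sqrt(sum(e * e for e in errors) / n)
mean_obs = sum(observed) / n
ss_res = sum(e * e for e in errors)
ss_tot = sum((o - mean_obs) ** 2 for o in observed)
r2 = 1 - ss_res / ss_tot
print(round(mae, 3), round(rmse, 3), round(r2, 3))
```

RMSE is always at least as large as MAE, and the gap between them indicates how much a few large errors dominate the misfit.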

Keywords: neural networks, rainfall, prediction, climatic variables

Procedia PDF Downloads 484
27887 Potential Impact of Climate Change on Suspended Sediment Changes in Mekong River Basin

Authors: Zuliziana Suif, Nordila Ahmad, Sengheng Hul

Abstract:

This paper evaluates the impact of climate change on suspended sediment changes in the Mekong River Basin. A distributed process-based sediment transport model is used to examine the potential impact of future climate on suspended sediment dynamics in the basin. To this end, climate scenarios from two General Circulation Models (GCMs) were considered in the scenario analysis. The simulation results show that sediment load and concentration increase by 0.64% to 69% in the near future (2041-2050) and by 2.5% to 95% in the far future (2090-2099). As the projected climate change impact on sediment varies remarkably between the different climate models, this uncertainty should be taken into account in sediment management. Overall, the changes in sediment load and concentration have important implications for sediment management.

Keywords: climate change, suspended sediment, Mekong River Basin, GCMs

Procedia PDF Downloads 436
27886 Cell-Cell Interactions in Diseased Conditions Revealed by Three Dimensional and Intravital Two Photon Microscope: From Visualization to Quantification

Authors: Satoshi Nishimura

Abstract:

Although much information has been garnered from the genomes of humans and mice, it remains difficult to extend that information to explain physiological and pathological phenomena. This is because the processes underlying life are by nature stochastic and fluctuate with time. Thus, we developed a novel "in vivo molecular imaging" method based on single- and two-photon microscopy. We visualized and analyzed many life phenomena, including common adult diseases. We integrated the knowledge obtained and established new models that will serve as the basis for new minimally invasive therapeutic approaches.

Keywords: two photon microscope, intravital visualization, thrombus, artery

Procedia PDF Downloads 366
27885 Development of a Data-Driven Method for Diagnosing the State of Health of Battery Cells, Based on the Use of an Electrochemical Aging Model, with a View to Their Use in Second Life

Authors: Desplanches Maxime

Abstract:

Accurate estimation of the remaining useful life of lithium-ion batteries for electronic devices is crucial. Data-driven methodologies encounter challenges related to data volume and acquisition protocols, particularly in capturing a comprehensive range of aging indicators. To address these limitations, we propose a hybrid approach that integrates an electrochemical model with state-of-the-art data analysis techniques, yielding a comprehensive database. Our methodology involves infusing an aging phenomenon into a Newman model, leading to the creation of an extensive database capturing various aging states based on non-destructive parameters. This database serves as a robust foundation for subsequent analysis. Leveraging advanced data analysis techniques, notably principal component analysis and t-Distributed Stochastic Neighbor Embedding, we extract pivotal information from the data. This information is harnessed to construct a regression function using either random forest or support vector machine algorithms. The resulting predictor demonstrates a 5% error margin in estimating remaining battery life, providing actionable insights for optimizing usage. Furthermore, the database was built from the Newman model calibrated for aging and performance using data from a European project called Teesmat. The model was then initialized numerous times with different aging values, for instance, with varying thicknesses of SEI (Solid Electrolyte Interphase). This comprehensive approach ensures a thorough exploration of battery aging dynamics, enhancing the accuracy and reliability of our predictive model. Of particular importance is our reliance on the database generated through the integration of the electrochemical model. This database serves as a crucial asset in advancing our understanding of aging states. 
Beyond its capability for precise remaining life predictions, this database-driven approach offers valuable insights for optimizing battery usage and adapting the predictor to various scenarios. This underscores the practical significance of our method in facilitating better decision-making regarding lithium-ion battery management.
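The dimensionality-reduction and regression pipeline described above can be sketched end to end; PCA is computed directly via SVD, a least-squares fit stands in for the random forest / SVM regressors, and all data are simulated with an invented SEI-thickness parameter rather than drawn from the Newman-model database:

```python
import numpy as np

# Sketch: extract principal components from correlated aging indicators,
# then regress remaining life on them. All data simulated (hypothetical).
rng = np.random.default_rng(0)
n = 200
sei_thickness = rng.uniform(0.1, 1.0, n)          # stand-in aging parameter
features = np.column_stack([
    sei_thickness + 0.01 * rng.normal(size=n),     # correlated indicators
    2.0 * sei_thickness + 0.02 * rng.normal(size=n),
    0.1 * rng.normal(size=n),                      # pure-noise dimension
])
remaining_life = 1000.0 * (1.0 - sei_thickness) + 5.0 * rng.normal(size=n)

# PCA via SVD: center the data, keep the first two components
X = features - features.mean(axis=0)
U, S, Vt = np.linalg.svd(X, full_matrices=False)
scores = X @ Vt[:2].T

# Linear regression on the principal components (proxy for RF/SVM)
A = np.column_stack([scores, np.ones(n)])
coef, *_ = np.linalg.lstsq(A, remaining_life, rcond=None)
pred = A @ coef
rel_err = np.mean(np.abs(pred - remaining_life)) / remaining_life.mean()
print(f"mean relative error: {rel_err:.3f}")
```

Because the first principal component captures the shared aging signal, even this linear proxy lands within a few percent of the true remaining life, consistent in spirit with the ~5% margin the abstract reports.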

Keywords: Li-ion battery, aging, diagnostics, data analysis, prediction, machine learning, electrochemical model, regression

Procedia PDF Downloads 65
27884 Building Information Models Utilization for Design Improvement of Infrastructure

Authors: Keisuke Fujioka, Yuta Itoh, Masaru Minagawa, Shunji Kusayanagi

Abstract:

In this study, building information models of underground temporary structures and adjacent embedded pipes were constructed to show the importance of information on underground pipes adjacent to the structures in enhancing the productivity of construction execution. Next, the bar chart used in the actual construction process was converted into a Gantt chart, and a critical path analysis was carried out to show that accurate information on the arrangement of existing underground pipes can be used to enhance the productivity of the construction of underground structures. In the analyzed project, significant construction delay was not caused by the unforeseen existence of underground pipes, thanks to the management ability of the construction manager. However, in many construction projects in developing countries, the existence of unforeseen embedded pipes often causes substantial delay. Design changes driven by uncertain position information for embedded pipes can also be an important risk for contractors in domestic construction. Therefore, CPM analyses were performed with project-management software for a scenario in which the influence of the tasks causing construction delay was assumed to be more significant. Through these analyses, the efficiency of information management on underground pipes and of BIM analysis in the design stage for workability improvement was indirectly confirmed.
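The critical path method (CPM) at the core of the analysis can be sketched with a toy schedule; the tasks and durations are hypothetical, with one task standing in for a delay caused by an unforeseen embedded pipe:

```python
# CPM sketch: project duration = length of the longest dependency chain.
# Tasks and durations are hypothetical, not the analyzed project's schedule.
tasks = {                      # task: (duration in days, predecessors)
    "excavate":       (5, []),
    "relocate pipes": (7, ["excavate"]),      # unforeseen-pipe delay source
    "temporary wall": (4, ["excavate"]),
    "foundation":     (6, ["relocate pipes", "temporary wall"]),
    "backfill":       (3, ["foundation"]),
}

memo = {}
def earliest_finish(task):
    # Forward pass: a task finishes after the latest of its predecessors
    if task not in memo:
        dur, preds = tasks[task]
        memo[task] = dur + max((earliest_finish(p) for p in preds), default=0)
    return memo[task]

project_duration = max(earliest_finish(t) for t in tasks)
print(project_duration)  # 5 + 7 + 6 + 3 = 21: "relocate pipes" is on the critical path
```

Removing the pipe-relocation task from the critical chain (e.g. with accurate BIM data on pipe positions) would shorten the project, which is the productivity argument the abstract makes.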

Keywords: building-information modelling, construction information modelling, design improvement, infrastructure

Procedia PDF Downloads 306
27883 Damage Micromechanisms of Coconut Fibers and Chopped Strand Mats of Coconut Fibers

Authors: Rios A. S., Hild F., Deus E. P., Aimedieu P., Benallal A.

Abstract:

The damage micromechanisms of chopped strand mats manufactured by compression of Brazilian coconut fiber, and of coconut fibers subjected to different external conditions (chemical treatment), were studied. Uniaxial tensile tests were performed together with Digital Image Correlation (DIC). The images captured during the tensile tests of the coconut fibers and coconut fiber mats showed a measurement uncertainty on the order of centipixels. The initial modulus (modulus of elasticity) and tensile strength decreased with increasing diameter for all four conditions of coconut fibers. The DIC showed heterogeneous deformation fields for coconut fibers and mats, and the displacement fields revealed the rupture process of the coconut fiber. Poisson's ratio of the mat was determined from the transverse and longitudinal deformations found in the elastic region.
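The Poisson's ratio determination reduces to the (negated) ratio of transverse to longitudinal strain in the elastic region; the strain values below are illustrative, not the measured DIC data:

```python
# Poisson's ratio from paired DIC strain readings in the elastic region.
# Strain values are hypothetical; real data would be field averages.
transverse_strain   = [-0.0006, -0.0012, -0.0018]   # contraction (negative)
longitudinal_strain = [ 0.0020,  0.0040,  0.0060]   # extension (positive)

# nu = -eps_transverse / eps_longitudinal, averaged over load steps
ratios = [-et / el for et, el in zip(transverse_strain, longitudinal_strain)]
poisson_ratio = sum(ratios) / len(ratios)
print(round(poisson_ratio, 3))  # 0.3 for these illustrative strains
```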

Keywords: coconut fiber, mechanical behavior, digital image correlation, micromechanism

Procedia PDF Downloads 456
27882 Juridically Secure Trade Mechanisms for Alternative Dispute Resolution in Transnational Business Negotiations

Authors: Linda Frazer

Abstract:

A pluralistic methodology focuses on promoting an understanding that an alternative juridical framework for the regulation of transnational business negotiations (TBN) between private business parties is fundamentally required. This paper deals with the evolving assessment of the author's doctoral research, which demonstrated that, due to insufficient juridical tools, negotiations are commonly misunderstood within the complexity of pluralistic and conflicting legal regimes. This inadequacy causes uncertainty in the enforcement of legal remedies, catching business parties by surprise. Consequently, parties cannot sufficiently anticipate when and how legal rights and obligations are created, often relying on oral or incomplete agreements which may lead to the misinterpretation of the extent of their legal rights and obligations. This uncertainty threatens business parties, who fear creating unintended legal obligations or, conversely, that the law will not enforce intended agreements that fail the tests of contractual validity. Default standards of communication and of conduct for monitoring our evolving global trade would help the law provide the security, predictability and foreseeability that TBN parties require during alternative dispute resolution. The conclusion of this study proposes new trade mechanisms, termed 'Bills of Negotiations' (BON), to enhance party autonomy and promote the ability of TBN parties to self-regulate within the boundaries of law. BON will be guided by a secure, juridically institutionalized setting that guides communications during TBN and resolves disputes arising along the negotiation process on a fast-track basis.

Keywords: alternative dispute resolution, ADR, good faith, juridical security, legal regulation, trade mechanisms, transnational business negotiations

Procedia PDF Downloads 140
27881 Measurements of Flow Mixing Behaviors Using a Wire-Mesh Sensor in a Wire-Wrapped 37-Pin Rod Assembly

Authors: Hyungmo Kim, Hwang Bae, Seok-Kyu Chang, Dong Won Lee, Yung Joo Ko, Sun Rock Choi, Hae Seob Choi, Hyeon Seok Woo, Dong-Jin Euh, Hyeong-Yeon Lee

Abstract:

Flow mixing characteristics in a wire-wrapped 37-pin rod bundle were measured using a wire-mesh sensing system for a sodium-cooled fast reactor (SFR). Subchannel flow mixing in SFR core subchannels is an essential characteristic for verification of the core thermal design and safety analysis. A dedicated test facility, including the wire-mesh sensor system and a tracing liquid injection system, was developed, and the conductivity fields at the end of the 37-pin rod bundle were visualized under several different flow conditions. The experimental results showed reasonable agreement with CFD results, and an uncertainty analysis of the mixing experiments was conducted to evaluate them.

Keywords: core thermal design, flow mixing, wire-mesh sensor, wire-wrap effect

Procedia PDF Downloads 626
27880 Considering Uncertainties of Input Parameters on Energy, Environmental Impacts and Life Cycle Costing by Monte Carlo Simulation in the Decision Making Process

Authors: Johannes Gantner, Michael Held, Matthias Fischer

Abstract:

The refurbishment of the building stock in terms of energy supply and efficiency is one of the major challenges of the German turnaround in energy policy. As the building sector accounts for 40% of Germany's total energy demand, additional insulation is key for energy-efficient refurbished buildings. Nevertheless, despite the energetic benefits, the environmental and economic performance of insulation materials is often questioned. The methods of Life Cycle Assessment (LCA) and Life Cycle Costing (LCC) can form a standardized basis for addressing these doubts, and they are becoming ever more important for material producers due to efforts such as the Product Environmental Footprint (PEF) or Environmental Product Declarations (EPD). With the increasing use of LCA and LCC information for decision support, the robustness and resilience of the results become crucial, especially for supporting decision and policy makers. LCA and LCC results are based on models which depend on technical parameters like efficiencies, material and energy demand, product output, etc. Nevertheless, the influence of parameter uncertainties on lifecycle results is usually not considered, or only studied superficially, although it cannot be neglected: in the example of an exterior wall, the overall lifecycle results vary by a factor of more than three. As a result, the simple best-case/worst-case analyses used in practice are not sufficient. They allow a first rough view of the results but do not take into account effects such as error propagation, so LCA practitioners cannot provide further guidance for decision makers. Probabilistic analyses enable LCA practitioners to gain a deeper understanding of LCA and LCC results and provide better decision support.
Within this study, the environmental and economic impacts of an exterior wall system over its whole lifecycle are illustrated, and the effects of different uncertainty analyses on the interpretation in terms of resilience and robustness are shown. The approaches of error propagation and Monte Carlo simulation are applied and combined with statistical methods in order to allow for a deeper understanding and interpretation. All in all, this study emphasizes the need for a deeper and more detailed probabilistic evaluation based on statistical methods. Only in this way can misleading interpretations be avoided and the results be used for resilient and robust decisions.
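The contrast between a best/worst-case spread and a Monte Carlo estimate can be sketched for a lifecycle impact that multiplies two uncertain parameters; the quantities, distributions and values below are hypothetical, not the exterior-wall study's data:

```python
import random
import statistics

# Hypothetical lifecycle impact: material demand (kg) x emission factor
# (kg CO2 per kg). Best/worst-case combines extremes; Monte Carlo samples
# the parameter distributions and reports a mean and spread.
random.seed(42)

def impact(material_kg, factor_kg_co2_per_kg):
    return material_kg * factor_kg_co2_per_kg

# Best/worst case: extreme combinations, with no sense of their likelihood
best  = impact(80, 1.0)    # 80
worst = impact(120, 3.0)   # 360

# Monte Carlo: propagate the parameter distributions through the model
samples = [impact(random.gauss(100, 10), random.uniform(1.0, 3.0))
           for _ in range(100_000)]
mean, sd = statistics.mean(samples), statistics.stdev(samples)
print(best, worst, round(mean, 1), round(sd, 1))
```

The Monte Carlo mean and standard deviation describe where the result actually concentrates, whereas the bare best-to-worst interval (80 to 360 here) overstates the plausible range and gives no probability weighting, which is the core argument of the abstract.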

Keywords: uncertainty, life cycle assessment, life cycle costing, Monte Carlo simulation

Procedia PDF Downloads 283
27879 Measuring Banking Risk

Authors: Mike Tsionas

Abstract:

The paper develops new indices of financial stability based on an explicit model of expected utility maximization by financial institutions subject to the classical technology restrictions of neoclassical production theory. The model can be estimated using standard econometric techniques, like GMM for dynamic panel data and latent factor analysis for the estimation of covariance matrices. An explicit functional form for the utility function is not needed, and we show how measures of risk aversion and prudence (downside risk aversion) can be derived and estimated from the model. The model is estimated using data for Eurozone countries, and we focus particularly on (i) the use of the modeling approach as an "early warning mechanism", (ii) the bank- and country-specific estimates of risk aversion and prudence (downside risk aversion), and (iii) the derivation of a generalized measure of risk that relies on loan-price uncertainty.

Keywords: financial stability, banking, expected utility maximization, sub-prime crisis, financial crisis, eurozone, PIIGS

Procedia PDF Downloads 347
27878 An Experimental Approach to the Influence of Tipping Points and Scientific Uncertainties in the Success of International Fisheries Management

Authors: Jules Selles

Abstract:

The Atlantic and Mediterranean bluefin tuna fishery has been considered the archetype of an overfished and mismanaged fishery. This crisis has demonstrated the role of public awareness and the importance of the interactions between science and management regarding scientific uncertainties. This work aims at investigating the policy-making process associated with a regional fisheries management organization. We propose a contextualized computer-based experimental approach in order to explore the effects of key factors on the cooperation process in a complex straddling-stock management setting. Namely, we analyze the effects of the introduction of a socio-economic tipping point and of the uncertainty surrounding the estimation of the resource level. Our approach is based on a Gordon-Schaefer bio-economic model which explicitly represents the decision-making process. Each participant plays the role of a stakeholder of ICCAT, represents a coalition of fishing nations involved in the fishery, and unilaterally decides a harvest policy for the coming year. The context of the experiment induces the incentives for exploitation and collaboration to achieve common sustainable harvest plans at the scale of the Atlantic bluefin tuna stock. Our rigorous framework allows testing how stakeholders who plan the exploitation of a fish stock (a common-pool resource) respond to two kinds of effects: (i) the inclusion of a drastic shift in the management constraints (beyond a socio-economic tipping point) and (ii) an increasing uncertainty in the scientific estimation of the resource level.
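
As a rough illustration of the Gordon-Schaefer-style stock dynamics underlying the experiment, the sketch below iterates a logistic stock under a proportional harvest policy. All parameter values (growth rate, carrying capacity, harvest fractions) are hypothetical placeholders, chosen only to show why harvesting faster than the stock grows collapses the resource:

```python
# Minimal Gordon-Schaefer-style stock dynamics (illustrative parameters only).
def simulate_stock(harvest_rate, r=0.3, K=1000.0, x0=500.0, years=50):
    """Logistic growth with a proportional harvest policy h_t = harvest_rate * x_t."""
    x = x0
    for _ in range(years):
        x = x + r * x * (1 - x / K) - harvest_rate * x
        x = max(x, 0.0)
    return x

# A moderate policy settles near the deterministic equilibrium K*(1 - f/r);
# a harvest fraction exceeding the intrinsic growth rate drives the stock down.
sustainable = simulate_stock(0.10)   # equilibrium ~ K*(1 - 0.10/0.30)
overfished = simulate_stock(0.35)    # f > r: stock collapses toward zero
```

In the experiment, each coalition's unilateral harvest choice plays the role of `harvest_rate`, which is what makes the cooperation problem nontrivial.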

Keywords: economic experiment, fisheries management, game theory, policy making, Atlantic Bluefin tuna

Procedia PDF Downloads 251
27877 The Effect of Spatial Variability on Axial Pile Design of Closed Ended Piles in Sand

Authors: Cormac Reale, Luke J. Prendergast, Kenneth Gavin

Abstract:

While significant improvements have been made in axial pile design methods over recent years, the influence of soil's natural variability has not been adequately accounted for within them. Soil variability is a crucial parameter to consider, as it can account for large variations in pile capacity across the same site. This paper seeks to address this knowledge deficit by demonstrating how soil spatial variability can be accommodated into existing cone penetration test (CPT) based pile design methods, in the form of layered non-homogeneous random fields. These random fields model the scope of a given property’s variance and define how it varies spatially. A Monte Carlo analysis of the pile will be performed, taking into account parameter uncertainty and spatial variability described using the measured scales of fluctuation. The results will be discussed in light of Eurocode 7, and the effect of spatial averaging on design capacities will be analysed.
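
The spatial-averaging effect mentioned above can be sketched in a few lines. This is a minimal illustration, not the paper's CPT-based method: it assumes an exponential autocorrelation structure (Vanmarcke's variance function), a lognormal shaft capacity, and entirely hypothetical numbers for the capacity, load, and scale of fluctuation:

```python
import math
import random

def variance_reduction(L, theta):
    """Vanmarcke's variance function for an exponential autocorrelation:
    the factor by which the variance of a property averaged over a length L
    is reduced relative to its point variance (theta = scale of fluctuation)."""
    a = L / theta
    return (2.0 / a**2) * (a - 1.0 + math.exp(-a))

# Hypothetical numbers: lognormal pile capacity [kN] against a fixed load.
random.seed(1)
mu_ln, cov = math.log(2000.0), 0.3           # median capacity, point COV
sigma_point = cov                             # approx. lognormal sigma (small COV)
sigma_avg = cov * math.sqrt(variance_reduction(L=20.0, theta=2.0))
load = 1200.0

fails_point = fails_avg = 0
for _ in range(20000):
    z = random.gauss(0.0, 1.0)                # same standard normal in both cases
    if math.exp(mu_ln + sigma_point * z) < load: fails_point += 1
    if math.exp(mu_ln + sigma_avg * z) < load: fails_avg += 1
```

Averaging the property along the shaft (pile length much larger than the scale of fluctuation) shrinks the effective variance, so the averaged case fails far less often than the naive point-variance case.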

Keywords: pile axial design, reliability, spatial variability, CPT

Procedia PDF Downloads 242
27876 Time-Dependent Reliability Analysis of Corrosion Affected Cast Iron Pipes with Mixed Mode Fracture

Authors: Chun-Qing Li, Guoyang Fu, Wei Yang

Abstract:

A significant portion of current water networks is made of cast iron pipes. Due to aging and deterioration, with corrosion being the most predominant mechanism, the failure rate of cast iron pipes is very high. Although considerable research has been carried out in the past few decades, most of it concerns the effect of corrosion on the structural capacity of pipes, using strength theory as the failure criterion. This paper presents a reliability-based methodology for the assessment of cracking failures in corrosion-affected cast iron pipes. A nonlinear limit state function taking into account all three fracture modes is proposed for brittle metal pipes with mixed-mode fracture. A stochastic model of the load effect is developed, and a time-dependent reliability method is employed to quantify the probability of failure and predict the remaining service life. A case study is carried out using the proposed methodology, followed by a sensitivity analysis to investigate the effects of the random variables on the probability of failure. It has been found that the larger the inclination angle or the Mode I fracture toughness is, the smaller the probability of pipe failure is. It has also been found that the multiplying and exponential coefficients k and n in the power-law corrosion model and the internal pressure have the most influence on the probability of failure for cast iron pipes. The methodology presented in this paper can assist pipe engineers and asset managers in developing a risk-informed and cost-effective strategy for better management of corrosion-affected pipelines.
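
A minimal sketch of how a time-dependent failure probability can be obtained from a power-law corrosion model d(t) = k·t^n is given below. The distributions of k and n, the wall thickness, and the failure criterion are all hypothetical placeholders, not the paper's calibrated values or its mixed-mode fracture limit state:

```python
import random

# Sample (k, n) pairs once so the probability curve is monotone by construction.
random.seed(7)
samples = [(max(random.gauss(0.3, 0.08), 0.0),   # k [mm / year^n], hypothetical
            max(random.gauss(0.6, 0.1), 0.0))    # n [-], hypothetical
           for _ in range(20000)]

def prob_of_failure(t, wall=10.0, critical=0.5):
    """Monte Carlo P(failure) at time t: corrosion depth d(t) = k * t**n
    exceeds the allowable loss of wall thickness (a simple strength-style
    criterion standing in for the paper's fracture-based limit state)."""
    limit = (1.0 - critical) * wall              # allowable corrosion depth
    fails = sum(1 for k, n in samples if k * t**n > limit)
    return fails / len(samples)

pf = [prob_of_failure(t) for t in (10, 30, 50, 80)]
```

Because corrosion depth only grows with time, the estimated probability of failure is non-decreasing, which is what makes remaining-service-life prediction possible from the same curve.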

Keywords: corrosion, inclined surface cracks, pressurized cast iron pipes, stress intensity

Procedia PDF Downloads 315
27875 Entropy Measures on Neutrosophic Soft Sets and Its Application in Multi Attribute Decision Making

Authors: I. Arockiarani

Abstract:

The focus of the paper is to furnish an entropy measure for neutrosophic sets and neutrosophic soft sets, as a measure of the uncertainty that permeates discourse and systems. Various characterizations of entropy measures are derived. Further, we exemplify this concept by applying entropy to various real-time decision-making problems.

Keywords: entropy measure, Hausdorff distance, neutrosophic set, soft set

Procedia PDF Downloads 252
27874 Leadership in the Emergence Paradigm: A Literature Review on the Medusa Principles

Authors: Everard van Kemenade

Abstract:

Many quality improvement activities are planned. Leaders are strongly involved in missions, visions and strategic planning. They use, consciously or unconsciously, the PDCA cycle, also known as the Deming cycle. After the planning, the plans are carried out and the results or effects are measured. If the results show that the goals in the plan have not been achieved, adjustments are made in the next plan or in the execution of the processes. Then, the cycle is run through again. Traditionally, the PDCA cycle is advocated as a means to an end. However, PDCA is especially fit for planned, ordered, certain contexts. It fits with the empirical and referential quality paradigm. For uncertain, unordered, unplanned processes, something else might be needed instead of Plan-Do-Check-Act. Due to the complexity of our society, the influence of context, and the uncertainty in our world nowadays, not every activity can be planned anymore. At the same time, organisations need to be more innovative than ever. That confronts leaders with ‘wicked tendencies’ and raises the question of how one can innovate without being able to plan. Complexity science studies the interactions of a diverse group of agents that bring about change in times of uncertainty, e.g. when radical innovation is co-created. This process is called emergence. This research study explores the role of leadership in the emergence paradigm. The aim of the article is to study the way that leadership can support the emergence of innovation in a complex context. First, clarity is given on the concepts used in the research question: complexity, emergence, innovation and leadership. Thereafter, a literature search is conducted to answer the research question. The topics ‘emergent leadership’ and ‘complexity leadership’ were chosen for an exploratory search in Google and Google Scholar using the berry-picking method. The exclusion criterion was emergence in disciplines other than organisational development, or in the meaning of ‘arising’. The literature search yielded 45 hits. Twenty-seven articles were excluded after reading the title and abstract because they did not research the topic of emergent leadership and complexity. After reading the remaining articles in full, one more was excluded because it used ‘emergent’ in the limited meaning of ‘arising’, and eight more were excluded because the topic did not match the research question of this article. That brings the total of the search to 17 articles. The useful conclusions from the articles were merged and grouped under overarching topics using thematic analysis. The findings are that five topics prevail when looking at possibilities for leadership to facilitate innovation: enabling, sharing values, dreaming, interacting, context sensitivity and adaptivity. Together they form, in Dutch, the acronym Medusa.

Keywords: complexity science, emergence, leadership in the emergence paradigm, innovation, the Medusa principles

Procedia PDF Downloads 23
27873 Accounting for Downtime Effects in Resilience-Based Highway Network Restoration Scheduling

Authors: Zhenyu Zhang, Hsi-Hsien Wei

Abstract:

Highway networks play a vital role in post-disaster recovery for disaster-damaged areas. Damaged bridges in such networks can disrupt recovery activities by impeding the transportation of people, cargo, and reconstruction resources. Therefore, rapid restoration of damaged bridges is of paramount importance to long-term disaster recovery. In the post-disaster recovery phase, the key to restoration scheduling for a highway network is the prioritization of bridge-repair tasks. Resilience is widely used as a measure of the ability of a network to return to its pre-disaster level of functionality. In practice, highways are temporarily blocked during the downtime of bridge restoration, decreasing highway-network functionality; the failure to take downtime effects into account can therefore lead to overestimation of network resilience. Additionally, post-disaster recovery of highway networks is generally divided into emergency bridge repair (EBR) in the response phase and long-term bridge repair (LBR) in the recovery phase, and the two differ in terms of restoration objectives, restoration duration, budget, etc. Distinguishing these two phases is important for precisely quantifying highway network resilience and for generating suitable restoration schedules in the recovery phase. To address the above issues, this study proposes a novel resilience quantification method for the optimization of long-term bridge repair schedules (LBRS), taking into account the impact of EBR activities and restoration downtime on a highway network’s functionality. A time-dependent integer program with recursive functions is formulated for optimally scheduling LBR activities. Moreover, since uncertainty always exists in the LBRS problem, this paper extends the optimization model from the deterministic case to the stochastic case. A hybrid genetic algorithm that integrates a heuristic approach into a traditional genetic algorithm to accelerate the evolution process is developed. The proposed methods are tested using data from the 2008 Wenchuan earthquake, based on a regional highway network in Sichuan, China, consisting of 168 highway bridges on 36 highways connecting 25 cities/towns. The results show that, in this case, neglecting bridge restoration downtime can lead to approximately 15% overestimation of highway network resilience. Moreover, accounting for the impact of EBR on network functionality helps to generate a more specific and reasonable LBRS. The theoretical and practical contributions are as follows. First, the proposed network recovery curve contributes to comprehensive quantification of highway network resilience by accounting for the impact of both restoration downtime and EBR activities on the recovery curve. Second, this study can improve highway network resilience along the organizational dimension by providing bridge managers with optimal LBR strategies.
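
The downtime effect can be illustrated with a toy recovery curve. Resilience is taken here as the normalized area under the functionality curve Q(t) over the control horizon, and both curves below are hypothetical, not the Wenchuan case data:

```python
def resilience(functionality):
    """Resilience as the normalized area under a discrete recovery curve,
    where each entry is Q(t) in [0, 1] for one period of the horizon."""
    return sum(functionality) / len(functionality)

# Hypothetical recovery curves (Q = fraction of pre-disaster functionality).
# Idealized schedule: functionality only ever rises as bridges are repaired.
no_downtime = [0.6, 0.7, 0.8, 0.9, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0]
# Realistic schedule: repairing a bridge blocks its highway while work is
# under way, so the curve dips during the repair periods.
with_downtime = [0.6, 0.7, 0.6, 0.7, 0.8, 1.0, 1.0, 1.0, 1.0, 1.0]

r_ideal = resilience(no_downtime)     # overestimate if downtime is ignored
r_real = resilience(with_downtime)
overestimate = (r_ideal - r_real) / r_real
```

The gap between `r_ideal` and `r_real` is exactly the kind of overestimation the abstract quantifies (about 15% in the Wenchuan case study); the numbers here are only illustrative.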

Keywords: disaster management, highway network, long-term bridge repair schedule, resilience, restoration downtime

Procedia PDF Downloads 147
27872 Redefining Solar Generation Estimation: A Comprehensive Analysis of Real Utility Advanced Metering Infrastructure (AMI) Data from Various Projects in New York

Authors: Haowei Lu, Anaya Aaron

Abstract:

Understanding historical solar generation and forecasting future solar generation from interconnected Distributed Energy Resources (DER) is crucial for utility planning and interconnection studies. The existing methodology, which relies on solar radiation, weather data, and common inverter models, is becoming less accurate. Rapid advancements in DER technologies have resulted in more diverse project sites that deviate from common patterns due to various factors such as the DC/AC ratio, solar panel performance, tilt angle, and the presence of DC-coupled battery energy storage systems. In this paper, the authors review 10,000 DER projects within the system and analyze Advanced Metering Infrastructure (AMI) data for various project types to demonstrate the impact of these parameters. An updated methodology is proposed for redefining historical and future solar generation in distribution feeders.

Keywords: photovoltaic system, solar energy, fluctuations, energy storage, uncertainty

Procedia PDF Downloads 22
27871 Solutions to Probabilistic Constrained Optimal Control Problems Using Concentration Inequalities

Authors: Tomoaki Hashimoto

Abstract:

Recently, optimal control problems subject to probabilistic constraints have attracted much attention in many research fields. Although probabilistic constraints are generally intractable in optimization problems, several methods have been proposed to deal with them. In most methods, probabilistic constraints are transformed into deterministic constraints that are tractable in optimization problems. This paper examines a method for transforming probabilistic constraints into deterministic constraints for a class of probabilistic constrained optimal control problems.
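
One standard transformation of this kind uses the one-sided Chebyshev (Cantelli) concentration inequality. The sketch below is a generic illustration of that idea under assumed values, not necessarily the specific method examined in the paper:

```python
import math
import random

def deterministic_bound(mu, sigma, eps):
    """Cantelli (one-sided Chebyshev) back-off: if the deterministic
    constraint  mu + k*sigma <= b  holds with k = sqrt((1 - eps)/eps),
    then Pr(W > b) <= eps for ANY distribution of W with mean mu and
    standard deviation sigma -- a distribution-free, often conservative,
    replacement for the original chance constraint."""
    k = math.sqrt((1.0 - eps) / eps)
    return mu + k * sigma

# Sanity check by simulation with hypothetical numbers: a Gaussian
# disturbance satisfies the transformed constraint with an empirical
# violation rate far below eps (the bound is conservative).
random.seed(0)
eps, mu, sigma = 0.05, 1.0, 0.5
b = deterministic_bound(mu, sigma, eps)   # tightest b the transform allows
violations = sum(random.gauss(mu, sigma) > b for _ in range(50000)) / 50000
```

Because the inequality holds for any distribution with the given moments, the resulting deterministic constraint is tractable in the optimization but typically conservative, which is the usual trade-off these methods make.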

Keywords: optimal control, stochastic systems, discrete-time systems, probabilistic constraints

Procedia PDF Downloads 272
27870 Qualitative Case Studies in Reading Specialist Education

Authors: Carol Leroy

Abstract:

This presentation focuses on the analysis of qualitative case studies in the graduate education of reading specialists. The presentation describes the development and application of an integrated conceptual framework for reading specialist education, drawing on Robert Stake’s work on case study research, Kenneth Zeichner’s work on professional learning, and various tools for reading assessment (e.g. the Qualitative Reading Inventory). Social constructivist theory is used to provide intersecting links between the various influences on the processes used to assess and teach reading within the case study framework. Illustrative examples are described to show the application of the framework in reading specialist education in a teaching clinic at a large urban university. Central to the education of reading specialists in this teaching clinic is the collection, analysis and interpretation of data for the design and implementation of reading and writing programs for struggling readers and writers. The case study process involves the integrated interpretation of data, which is central to qualitative case study inquiry. An emerging theme in this approach to graduate education is the ambiguity and uncertainty that governs work with the adults and children who attend the clinic for assistance. Tensions and contradictions are explored insofar as they reveal overlapping and intersecting frameworks for case study analysis in the area of literacy education. An additional theme is the interplay of multiple layers of data, with a resulting depth that goes beyond the practical needs of the client and toward the deeper pedagogical growth of the reading specialist. The presentation makes a case for the value of qualitative case studies in reading specialist education. Further, the use of social constructivism as a unifying paradigm lends robustness to the conceptual framework as a tool for understanding the pedagogy involved.

Keywords: assessment, case study, professional education, reading

Procedia PDF Downloads 452
27869 Comparison between Deterministic and Probabilistic Stability Analysis, Featuring Consequent Risk Assessment

Authors: Isabela Moreira Queiroz

Abstract:

Slope stability analyses are largely carried out by deterministic methods and evaluated through a single factor of safety. Although it is known that geotechnical parameters can present great dispersion, such analyses treat them as fixed and known. The probabilistic methods, in turn, incorporate the variability of the key input parameters (random variables), resulting in a range of values of the safety factor and thus enabling the determination of the probability of failure, which is an essential parameter in the calculation of risk (probability multiplied by the consequence of the event). Among the probabilistic methods, three are frequently used in the geotechnical community: FOSM (First-Order, Second-Moment), Rosenblueth (Point Estimates) and Monte Carlo. This paper presents a comparison between the results from deterministic and probabilistic analyses (FOSM, Monte Carlo and Rosenblueth) applied to a hypothetical slope. The aim was to evaluate the behavior of the slope and carry out the consequent risk analysis, which is used to calculate the risk and to examine mitigation and control solutions. It can be observed that the results obtained by the three probabilistic methods were quite close. It should be noted that the calculation of risk makes it possible to prioritize the implementation of mitigation measures. Therefore, it is recommended to perform a good assessment of the geological-geotechnical model, incorporating the uncertainty in feasibility, design, construction, operation and closure by means of risk management.
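
A minimal sketch of the FOSM method next to a Monte Carlo check is shown below. It uses a textbook infinite-slope limit state and hypothetical input statistics; the paper's hypothetical slope and its parameters are not reproduced here:

```python
import math
import random

def fs_infinite_slope(c, phi_deg, gamma=18.0, H=5.0, slope_deg=25.0):
    """Factor of safety for an infinite slope (illustrative limit state)."""
    t = math.radians(slope_deg)
    return (c / (gamma * H * math.sin(t) * math.cos(t))
            + math.tan(math.radians(phi_deg)) / math.tan(t))

# Hypothetical statistics: cohesion c [kPa] and friction angle phi [deg].
means, sds = (10.0, 30.0), (2.0, 3.0)

# FOSM: mean FS at mean inputs; variance from central differences (+/- 1 sd).
fs_mean = fs_infinite_slope(*means)
var = 0.0
for i, (m, s) in enumerate(zip(means, sds)):
    lo, hi = list(means), list(means)
    lo[i] -= s
    hi[i] += s
    dfs = (fs_infinite_slope(*hi) - fs_infinite_slope(*lo)) / (2.0 * s)
    var += (dfs * s) ** 2
beta = (fs_mean - 1.0) / math.sqrt(var)               # reliability index
pf_fosm = 0.5 * (1.0 - math.erf(beta / math.sqrt(2.0)))  # Phi(-beta)

# Monte Carlo check with the same (assumed normal) inputs.
random.seed(3)
n = 200000
pf_mc = sum(fs_infinite_slope(random.gauss(means[0], sds[0]),
                              random.gauss(means[1], sds[1])) < 1.0
            for _ in range(n)) / n
```

For a mildly nonlinear limit state like this one, the two estimates agree closely, which mirrors the paper's observation that the probabilistic methods give quite similar results.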

Keywords: probabilistic methods, risk assessment, risk management, slope stability

Procedia PDF Downloads 381
27868 Fault Detection and Diagnosis of the Broken Bar Problem in Induction Motors Based on Wavelet Analysis and the EMD Method: Case Study of Mobarakeh Steel Company in Iran

Authors: M. Ahmadi, M. Kafil, H. Ebrahimi

Abstract:

Nowadays, induction motors play a significant role in industries. Condition monitoring (CM) of this equipment has gained remarkable importance during recent years due to huge production losses, substantial imposed costs and increases in vulnerability, risk, and uncertainty levels. Motor current signature analysis (MCSA) is one of the most important techniques in CM, and it can be used for the detection of rotor broken bars. Signal processing methods such as the fast Fourier transform (FFT), the wavelet transform and empirical mode decomposition (EMD) are used for analyzing MCSA output data. In this study, these signal processing methods are used for broken bar detection in induction motors of the Mobarakeh Steel Company. Based on the wavelet transform method, an index for fault detection, CF, is introduced, defined as the ratio of the maximum to the mean of the wavelet transform coefficients. We find that in the broken bar condition, the CF factor is greater than in the healthy condition. Based on the EMD method, the energy of the intrinsic mode functions (IMFs) is calculated, and it is found that when motor bars become broken, the energy of the IMFs increases.
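
The CF-style index can be sketched in a few lines. As a minimal stand-in for a full wavelet decomposition, a single-level Haar detail transform is used here, and the "fault" is a hypothetical localized transient added to a clean supply-frequency sine; the signals and amplitudes are illustrative, not Mobarakeh measurement data:

```python
import math

def haar_detail(signal):
    """Single-level Haar wavelet detail coefficients (a minimal proxy for
    the wavelet transform of an MCSA current signal)."""
    return [(signal[i] - signal[i + 1]) / math.sqrt(2.0)
            for i in range(0, len(signal) - 1, 2)]

def cf_index(signal):
    """CF = max |coefficient| / mean |coefficient|: a localized fault
    component barely moves the mean but drives the maximum up."""
    d = [abs(v) for v in haar_detail(signal)]
    return max(d) / (sum(d) / len(d))

# Hypothetical stator-current signals: a 50 Hz sine sampled at 5 kHz,
# versus the same sine with a short transient standing in for a fault.
healthy = [math.sin(2 * math.pi * 50 * t / 5000.0) for t in range(1000)]
faulty = list(healthy)
faulty[400] += 0.8      # localized disturbance at one sample
```

The same intuition explains the EMD finding: a fault injects extra signal components, so the energy carried by the decomposed modes rises.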

Keywords: broken bar, condition monitoring, diagnostics, empirical mode decomposition, Fourier transform, wavelet transform

Procedia PDF Downloads 147
27867 A Bayesian Network Approach to Customer Loyalty Analysis: A Case Study of Home Appliances Industry in Iran

Authors: Azam Abkhiz, Abolghasem Nasir

Abstract:

To achieve sustainable competitive advantage in the market, it is necessary to provide and improve customer satisfaction and loyalty. To reach this objective, companies need to identify and analyze their customers; thus, it is critical to measure the level of customer satisfaction and loyalty very carefully. This study attempts to build a conceptual model that provides clear insights into customer loyalty. Using Bayesian networks (BNs), a model is proposed to evaluate customer loyalty and its consequences, such as repurchase and positive word-of-mouth. A BN is a probabilistic approach that predicts the behavior of a system based on observed stochastic events. The most relevant determinants of customer loyalty are identified from the literature review: perceived value, service quality, trust, corporate image, satisfaction, and switching costs are the most important variables that explain customer loyalty. The data are collected by means of a questionnaire-based survey of 1,430 customers of a home appliances manufacturer in Iran. Four scenarios and sensitivity analyses are performed to analyze the impact of different determinants on customer loyalty. The proposed model allows businesses not only to set their targets but also to proactively manage customer behavior.
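
The inference step behind such a network can be sketched with a toy two-edge chain (satisfaction influences loyalty, loyalty influences repurchase). The structure and all conditional probabilities below are hypothetical, not the fitted model from the survey:

```python
# Toy discrete Bayesian network: Satisfaction -> Loyalty -> Repurchase.
# All conditional probabilities are hypothetical, for illustration only.
p_loyal_given_sat = {"high": 0.85, "low": 0.30}       # P(L=1 | S)
p_repurchase_given_loyal = {True: 0.90, False: 0.25}  # P(R=1 | L)

def p_repurchase(satisfaction):
    """P(R=1 | S) by summing out the loyalty node (exact enumeration)."""
    pl = p_loyal_given_sat[satisfaction]
    return (pl * p_repurchase_given_loyal[True]
            + (1 - pl) * p_repurchase_given_loyal[False])

high = p_repurchase("high")   # 0.85*0.90 + 0.15*0.25
low = p_repurchase("low")     # 0.30*0.90 + 0.70*0.25
```

Scenario analysis in a BN works the same way at scale: clamp evidence on some nodes (e.g. high perceived value, low switching costs), sum out the rest, and compare the resulting loyalty and repurchase probabilities.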

Keywords: customer satisfaction, customer loyalty, Bayesian networks, home appliances industry

Procedia PDF Downloads 135