Search results for: uncertainty analysis
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 27524

27284 Empirical Acceleration Functions and Fuzzy Information

Authors: Muhammad Shafiq

Abstract:

In accelerated life testing, lifetime data are obtained under conditions considered more severe than normal use conditions. Classical techniques are based on precise measurements and model only the variation among the observations. In fact, there are two types of uncertainty in data: variation among the observations and fuzziness. Analysis techniques that ignore fuzziness and rely only on precise lifetime observations lead to pseudo results. This study examined the behavior of empirical acceleration functions using fuzzy lifetime data. The results showed increased fuzziness in the transformed lifetimes compared to the input data.

Keywords: acceleration function, accelerated life testing, fuzzy number, non-precise data

Procedia PDF Downloads 274
27283 Second Time’s a Charm: The Intervention of the European Patent Office on the Strategic Use of Divisional Applications

Authors: Alissa Lefebre

Abstract:

It might seem intuitive to hope for a fast decision on a patent grant. After all, a granted patent provides a monopoly position, which allows the holder to obstruct others from using the technology. However, this ignores the strategic advantages that can be obtained from keeping patent applications pending. First, there is the financial advantage of postponing certain fees, although many applicants would probably agree that this is not the main benefit. As the scope of patent protection is only decided upon at grant, the pendency period introduces uncertainty amongst rivals. This uncertainty entails not knowing whether the patent will actually be granted and what the scope of protection will be. Consequently, rivals can only rely upon limited and uncertain information when deciding which technology is worth pursuing. One way to keep patent applications pending is the use of divisional applications. These applications can be filed out of a parent application as long as that parent application is still pending. This allows the applicant to pursue (part of) the content of the parent application in another application, as the divisional application cannot exceed the scope of the parent application. In a fast-moving and complex market such as tele- and digital communications, it might allow applicants to obtain an actual monopoly position, as competitors are discouraged from pursuing a certain technology. Nevertheless, this practice also has downsides. First, it has an impact on the workload of the examiners at the patent office. As the number of patent filings has been increasing over the last decades, strategies that increase this number even further are not desirable from the patent examiner's point of view. Secondly, a pending patent does not provide the protection of a granted patent, thus creating uncertainty not only for rivals but also for the applicant.
Consequently, the European Patent Office (EPO) has come up with a “raising the bar” initiative to tackle the strategic use of divisional applications. Over the past years, two rules have been implemented. The first rule, introduced in 2010, imposed a time limit under which divisional applications could only be filed within 24 months of the first communication from the patent office. However, after carrying out a user feedback survey, the EPO abolished the rule again in 2014 and replaced it with a fee mechanism. The fee mechanism is still in place today, which might indicate a better result compared to the first rule change. This study tests the impact of these rules on the strategic use of divisional applications in the tele- and digital communication industry and provides empirical evidence on their success. Using three different survival models, we find overall evidence that divisional applications prolong the pendency time and that only the second rule is able to tackle strategic patenting and thus decrease the pendency time.

Keywords: divisional applications, regulatory changes, strategic patenting, EPO

Procedia PDF Downloads 102
27282 The Use of Random Set Method in Reliability Analysis of Deep Excavations

Authors: Arefeh Arabaninezhad, Ali Fakher

Abstract:

Since deterministic analysis methods fail to take system uncertainties into account, probabilistic and non-probabilistic methods are suggested. Geotechnical analyses are used to determine the stress and deformation caused by construction; accordingly, many input variables that depend on ground behavior are required. The Random Set approach is an applicable reliability analysis method when comprehensive sources of information are not available. Using the Random Set method, smooth extremes of the system responses are obtained with a relatively small number of simulations compared to fully probabilistic methods. Therefore, the Random Set approach has been proposed for reliability analysis in geotechnical problems. In the present study, the application of the Random Set method to the reliability analysis of deep excavations is investigated through three deep excavation projects that were monitored during the excavation process. A finite element code is utilized for numerical modeling. Two expected ranges, from different sources of information, are established for each input variable, and a specific probability assignment is defined for each range. To determine the most influential input variables, and subsequently reduce the number of required finite element calculations, a sensitivity analysis is carried out. Input data for the finite element model are obtained by combining the upper and lower bounds of the input variables. The relevant probability share of each finite element calculation is determined by considering the probability assigned to the input variables present in these combinations. The horizontal displacement of the top point of the excavation is considered the main response of the system. The result of the reliability analysis for each deep excavation is presented by constructing the Belief and Plausibility distribution functions (i.e., lower and upper bounds) of the system response obtained from the deterministic finite element calculations.
To evaluate the quality of the input variables as well as the applied reliability analysis method, the range of displacements extracted from the models was compared to the in situ measurements, and good agreement is observed. The comparison also showed that the Random Set Finite Element Method is applicable for estimating the horizontal displacement of the top point of a deep excavation. Finally, the probability of failure or unsatisfactory performance of the system is evaluated by comparing the threshold displacement with the reliability analysis results.
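The interval-combination step described above can be sketched as follows. This is a minimal illustration, not the study's finite element workflow: the response function, input ranges, and probability assignments are all hypothetical stand-ins, and a monotonic response is assumed so that interval corners bound the output.

```python
from itertools import product

# Hypothetical random sets: each input variable has two expert-sourced
# ranges with basic probability assignments summing to 1.
friction = [((28.0, 34.0), 0.6), ((30.0, 38.0), 0.4)]   # friction angle (deg)
cohesion = [((5.0, 12.0), 0.5), ((8.0, 15.0), 0.5)]     # cohesion (kPa)

def response(phi, c):
    # Placeholder for the FE model: wall displacement (mm),
    # assumed to decrease monotonically with both strength parameters.
    return 100.0 / (0.5 * phi + c)

focal_elements = []
for (phi_rng, p_phi), (c_rng, p_c) in product(friction, cohesion):
    # Bounds of the response over each joint focal element; with a
    # monotonic response it suffices to evaluate the interval corners.
    vals = [response(phi, c) for phi in phi_rng for c in c_rng]
    focal_elements.append(((min(vals), max(vals)), p_phi * p_c))

def belief(threshold):
    # Bel(d <= threshold): mass of focal elements entirely below threshold
    return sum(m for (lo, hi), m in focal_elements if hi <= threshold)

def plausibility(threshold):
    # Pl(d <= threshold): mass of focal elements intersecting (-inf, threshold]
    return sum(m for (lo, hi), m in focal_elements if lo <= threshold)
```

Evaluating `belief` and `plausibility` over a grid of thresholds traces the lower and upper bounding distribution functions of the system response.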

Keywords: deep excavation, random set finite element method, reliability analysis, uncertainty

Procedia PDF Downloads 245
27281 New Analytical Current-Voltage Model for GaN-based Resonant Tunneling Diodes

Authors: Zhuang Guo

Abstract:

In the field of GaN-based resonant tunneling diode (RTD) simulation, the traditional Tsu-Esaki formalism fails to predict the peak currents and peak voltages in the simulated current-voltage (J-V) characteristics. The main reason is that, due to the strong internal polarization fields, a two-dimensional electron gas (2DEG) accumulates at the emitter, resulting in 2D-2D resonant tunneling currents, which become the dominant part of the total J-V characteristics. By contrast, the traditional Tsu-Esaki formalism is based on a 3D-2D resonant tunneling mechanism and therefore cannot predict the J-V characteristics correctly. To overcome this shortcoming, we develop a new analytical model for the 2D-2D resonant tunneling currents generated in GaN-based RTDs. Compared with the Tsu-Esaki formalism, the new model makes the following modifications: first, considering Heisenberg uncertainty, it corrects the expression for the density of states around the 2DEG eigenenergy levels at the emitter so that it can predict the half width at half maximum (HWHM) of the resonant tunneling currents; second, taking into account the effect of bias on the wave vectors at the collector, it modifies the expression for the transmission coefficients, which brings the peak currents closer to the experimental data than the Tsu-Esaki formalism does. The new analytical model successfully predicts the J-V characteristics of GaN-based RTDs, and it also reveals the resonant tunneling mechanisms in GaN-based RTDs in more detail, which helps in designing and fabricating high-performance GaN RTDs.

Keywords: GaN-based resonant tunneling diodes, Tsu-Esaki formalism, 2D-2D resonant tunneling, Heisenberg uncertainty

Procedia PDF Downloads 55
27280 Optimizing and Evaluating Performance Quality Control of the Production Process of Disposable Essentials Using a Vague Goal Programming Approach

Authors: Hadi Gholizadeh, Ali Tajdin

Abstract:

To have effective production planning, it is necessary to control the quality of processes. This paper aims at improving the performance of the disposable essentials process using statistical quality control and goal programming in a vague environment. Uncertainty is expressed because there is always measurement error in the real world. Therefore, in this study, the conditions are examined in a vague, distance-based environment. The disposable essentials process at Kach Company was studied. Statistical control tools were used to characterize the existing process for four factor responses: the average weight, height, crater diameter, and volume of the disposable glasses. Goal programming was then utilized to find the combination of optimal factor settings in a vague environment, which accounts for the uncertainty of the initial information when some of the model parameters are vague; a fuzzy regression model is also used to predict the responses of the four described factors. Optimization results show that the process capability index values for the disposable glasses' average weights, heights, crater diameters, and volumes were improved. This increases the quality of the products and reduces waste, which lowers the cost of the finished product and ultimately brings customer satisfaction, which in turn means increased sales.
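The process capability indices improved above can be computed directly from the specification limits and the process mean and standard deviation. The sketch below is generic; the glass-weight specification values are invented for illustration and are not taken from the study.

```python
def process_capability(mean, std, lsl, usl):
    """Cp ignores centering; Cpk penalizes a mean shifted toward either limit."""
    cp = (usl - lsl) / (6.0 * std)
    cpk = min(usl - mean, mean - lsl) / (3.0 * std)
    return cp, cpk

# Hypothetical spec for disposable glass weight: 10 g target, +/- 0.5 g
cp, cpk = process_capability(mean=10.1, std=0.1, lsl=9.5, usl=10.5)
```

A centered process gives Cpk equal to Cp; the off-center mean here yields a lower Cpk, which is the quantity the optimization tries to raise.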

Keywords: goal programming, quality control, vague environment, disposable glasses’ optimization, fuzzy regression

Procedia PDF Downloads 205
27279 Gender Based Variability Time Series Complexity Analysis

Authors: Ramesh K. Sunkaria, Puneeta Marwaha

Abstract:

Nonlinear methods of heart rate variability (HRV) analysis are becoming more popular. It has been observed that complexity measures quantify the regularity and uncertainty of cardiovascular RR-interval time series. In the present work, sample entropy (SampEn) was evaluated in healthy Normal Sinus Rhythm (NSR) male and female subjects for different data lengths and tolerance levels r. It is demonstrated that SampEn is small for higher values of the tolerance r. Also, the SampEn value of the healthy female group is higher than that of the healthy male group for short data lengths; with increasing data length, the two groups overlap and become difficult to distinguish. SampEn gives inaccurate results by assigning a higher value to the female group, because male subjects have a more complex HRV pattern than female subjects. Therefore, this traditional algorithm exhibits higher complexity for healthy female subjects than for healthy male subjects, which is a misleading observation. This may be because SampEn does not account for the multiple time scales inherent in physiologic time series, leaving the hidden spatial and temporal fluctuations unexplored.
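For reference, sample entropy compares the number of matching templates of length m and m+1 within a tolerance of r times the series standard deviation. The sketch below follows the standard algorithm (it is not the authors' implementation):

```python
import math

def sample_entropy(series, m=2, r=0.2):
    # r is a fraction of the series standard deviation (the tolerance)
    n = len(series)
    mean = sum(series) / n
    sd = (sum((x - mean) ** 2 for x in series) / n) ** 0.5
    tol = r * sd

    def matches(length):
        # Compare the n - m templates of the given length (self-matches excluded)
        tmpl = [series[i:i + length] for i in range(n - m)]
        return sum(
            1
            for i in range(len(tmpl))
            for j in range(i + 1, len(tmpl))
            if max(abs(a - b) for a, b in zip(tmpl[i], tmpl[j])) <= tol
        )

    b, a = matches(m), matches(m + 1)  # length-m and length-(m+1) match counts
    return -math.log(a / b) if a and b else float("inf")
```

A perfectly periodic series yields SampEn near zero, while an irregular series yields a larger value, which is the regularity-versus-uncertainty contrast the abstract describes.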

Keywords: heart rate variability, normal sinus rhythm group, RR interval time series, sample entropy

Procedia PDF Downloads 257
27278 Impact of Changes of the Conceptual Framework for Financial Reporting on the Indicators of the Financial Statement

Authors: Nadezhda Kvatashidze

Abstract:

The International Accounting Standards Board updated the conceptual framework for financial reporting. The main reason is to address accounting issues arising from market development and business transactions with new economic content. Also, investors call for higher transparency of information and responsibility for results in order to make more accurate risk assessments and forecasts. All of this makes it necessary to further develop the conceptual framework for financial reporting so that users get useful information. Market development and certain shortcomings of the conceptual framework revealed in practice require its reconsideration and new solutions. Some issues and concepts, such as disclosure and supply of information, its qualitative characteristics, assessment, and measurement uncertainty, had to be supplemented and refined. The criteria for recognition of certain elements of reporting (assets and liabilities) had to be updated, too. All of this is set out in the updated edition of the conceptual framework for financial reporting, a comprehensive collection of concepts underlying the preparation of financial statements. The main objective of the conceptual framework revision is to improve financial reporting and to develop a clear package of concepts. This will help the International Accounting Standards Board (IASB) set a common approach to similar transactions on the basis of mutually accepted concepts. As a result, companies will be able to develop coherent accounting policies for transactions or events to which no standard applies or for which a standard allows a choice of accounting policy.

Keywords: conceptual framework, measurement basis, measurement uncertainty, neutrality, prudence, stewardship

Procedia PDF Downloads 106
27277 Production Optimization under Geological Uncertainty Using Distance-Based Clustering

Authors: Byeongcheol Kang, Junyi Kim, Hyungsik Jung, Hyungjun Yang, Jaewoo An, Jonggeun Choe

Abstract:

It is important to determine reservoir properties for better production management. Due to limited information, there are geological uncertainties in very heterogeneous or channel reservoirs. One solution is to generate multiple equiprobable realizations using geostatistical methods. However, some models have wrong properties, which need to be excluded for simulation efficiency and reliability. We propose a novel model selection scheme based on distance-based clustering for reliable application of a production optimization algorithm. Distance is defined as a degree of dissimilarity between the data. We calculate the Hausdorff distance to classify the models based on their similarity; the Hausdorff distance is useful for shape matching of the reservoir models. We use multi-dimensional scaling (MDS) to describe the models in a two-dimensional space and group them by K-means clustering. Rather than simulating all models, we choose one representative model from each cluster and find the best model, which has production rates similar to the true values. From this process, we can select good reservoir models near the best model with high confidence. We make 100 channel reservoir models using single normal equation simulation (SNESIM). Since oil and gas prefer to flow through the sand facies, it is critical to characterize the pattern and connectivity of the channels in the reservoir. After calculating Hausdorff distances and projecting the models by MDS, we can see that the models cluster depending on their channel patterns. These channel distributions affect the operation controls of each production well, so the model selection scheme improves the management optimization process. We use one of the useful global search algorithms, particle swarm optimization (PSO), for our production optimization. PSO is good at finding the global optimum of an objective function, but it takes much time due to its use of many particles and iterations.
In addition, if we use multiple reservoir models, the simulation time for PSO will soar. By using the proposed method, we can select good and reliable models that already match the production data. Considering the geological uncertainty of the reservoir, we can obtain well-optimized production controls for maximum net present value. The proposed method offers a novel solution for selecting good cases among the various realizations. The model selection scheme can be applied not only to production optimization but also to history matching or other ensemble-based methods for efficient simulations.
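The distance, projection, clustering, and representative-selection pipeline can be sketched with standard scientific Python tools (assuming scipy and scikit-learn are available). The random point sets below are stand-ins for the SNESIM channel realizations:

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff
from sklearn.manifold import MDS
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Stand-in for channel realizations: each "model" is the set of (x, y)
# cells flagged as sand facies; four synthetic pattern groups here.
models = [rng.random((50, 2)) + (i % 4) for i in range(20)]

n = len(models)
dist = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        # Symmetric Hausdorff distance between two facies point sets
        d = max(directed_hausdorff(models[i], models[j])[0],
                directed_hausdorff(models[j], models[i])[0])
        dist[i, j] = dist[j, i] = d

# Project the precomputed distance matrix to 2-D, then cluster
coords = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(dist)
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(coords)

# One representative per cluster: the member closest to its centroid
representatives = [
    int(np.where(labels == k)[0][np.argmin(np.linalg.norm(
        coords[labels == k] - coords[labels == k].mean(axis=0), axis=1))])
    for k in range(4)
]
```

Only the representative models would then be run through the flow simulator and the PSO loop, instead of the full ensemble.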

Keywords: distance-based clustering, geological uncertainty, particle swarm optimization (PSO), production optimization

Procedia PDF Downloads 117
27276 Russian Pipeline Natural Gas Export Strategy under Uncertainty

Authors: Koryukaeva Ksenia, Jinfeng Sun

Abstract:

Europe has been a traditional importer of Russian natural gas for more than 50 years. In 2021, the Russian state-owned company Gazprom supplied about a third of all gas consumed in Europe. The mutual dependence between Russia and Europe in terms of natural gas supplies has long caused concerns about the energy security of both sides. The issue has become more urgent than ever considering the recent Russian invasion of Ukraine, followed by increased large-scale geopolitical conflicts, making the future of Russian natural gas supplies, and of global gas markets as well, highly uncertain. Hence, the main purpose of this study is to gain insight into the possible futures of Russian pipeline natural gas exports using a scenario planning method based on Monte Carlo simulation within the LUSS model framework, and to propose Russian pipeline natural gas export strategies based on the scenario planning results. The scenario analysis revealed that recent geopolitical disputes disturbed the traditional, longstanding model of Russian pipeline gas exports, and, as a result, the prospects and pathways for Russian pipeline gas on world markets will differ significantly from those before 2022. Specifically, our main findings show that (i) the events of 2022 generated many uncertainties for the long-term future of Russian pipeline gas exports in both the western and eastern supply directions, including geopolitical, regulatory, economic, infrastructure, and other uncertainties; (ii) according to the scenario modelling results, Russian pipeline exports will face many challenges in the future, in both the western and eastern directions.
A decrease in pipeline gas exports will inevitably affect the country's natural gas production and significantly reduce fossil fuel export revenues, jeopardizing the energy security of the country; (iii) according to the proposed strategies, in order to ensure long-term stable export supplies in the changing environment, Russia may need to adjust its traditional export strategy by diversifying export flows and products, entering new markets, adapting its contracting mechanism, increasing competitiveness, and gaining a reputation as a reliable gas supplier.

Keywords: Russian natural gas, pipeline natural gas, uncertainty, scenario simulation, export strategy

Procedia PDF Downloads 35
27275 Analyzing the Practicality of Drawing Inferences in Automation of Commonsense Reasoning

Authors: Chandan Hegde, K. Ashwini

Abstract:

Commonsense reasoning is the simulation of the human ability to make decisions in the situations we encounter every day. Several decades have passed since the introduction of this subfield of artificial intelligence, but it has barely made significant progress. Modern computing aids have also remained ineffective in this regard due to the absence of a strong methodology for commonsense reasoning development. Among the several reasons for the lack of progress, drawing inferences from a commonsense knowledge base stands out. This review paper provides a detailed analysis of the representation of reasoning uncertainties and the feasible prospects of programming aids for drawing inferences. The difficulties in deducing and systematizing commonsense reasoning, and the substantial progress made in reasoning that influences the study, are also discussed. Additionally, the paper discusses the possible impacts of an effective inference technique on commonsense reasoning.

Keywords: artificial intelligence, commonsense reasoning, knowledge base, uncertainty in reasoning

Procedia PDF Downloads 160
27274 Alignment between Governance Structures and Food Safety Standards on the Shrimp Supply Chain in Indonesia

Authors: Maharani Yulisti, Amin Mugera, James Fogarty

Abstract:

Food safety standards have received significant attention in the global fisheries market due to health issues, free trade agreements, and increasing aquaculture production. Vertical coordination throughout the supply chain of fish-producing and exporting countries is needed to meet the food safety demands imposed by importing countries. However, the complexity of supply chain governance structures and difficulties in standard implementation can generate safety uncertainty and high transaction costs. Using a Transaction Cost Economics framework, this paper examines the alignment between food safety standards and governance structures in the shrimp supply chain in Indonesia. We find the supply chain is organized closer to a hierarchy-like governance structure where a private standard (the organic standard) is implemented, and more towards a market-like governance structure where a public standard (IndoGAP certification) is more prevalent. To verify these statements, two cases are examined from Sidoarjo district, a centre of shrimp production in Indonesia. The results show that the public baseline FSS (Food Safety Standards) needs an additional mechanism to achieve a coordinated chain-wide response, because uncertainty, asset specificity, and performance measurement problems are high in this chain. The organic standard, as a private chain-wide FSS, is more efficient because it is achieved through a hierarchy-like governance structure.

Keywords: governance structure, shrimp value chain, food safety standards, transaction costs economics

Procedia PDF Downloads 351
27273 Probabilistic Analysis of Bearing Capacity of Isolated Footing using Monte Carlo Simulation

Authors: Sameer Jung Karki, Gokhan Saygili

Abstract:

The allowable bearing capacity of foundation systems is determined by applying a factor of safety to the ultimate bearing capacity. Conventional ultimate bearing capacity calculation routines are based on deterministic input parameters, where the nonuniformity and inhomogeneity of soil and site properties are not accounted for; hence, probability calculus and statistical analysis cannot be directly applied. It is assumed that the factor of safety, typically as high as 3.0, incorporates the uncertainty of the input parameters. This factor of safety is estimated based on subjective judgement rather than objective facts; it is an ambiguous term. Hence, a probabilistic analysis of the bearing capacity of an isolated footing on a clayey soil is carried out using the Monte Carlo simulation method. The simulated model was compared with the traditional discrete model, and the bearing capacity was found to be higher for the simulated model. This was verified by a sensitivity analysis: as the number of simulations was increased, there was a significant percentage increase in bearing capacity compared with the discrete value. The bearing capacity values obtained by simulation were found to follow a normal distribution. Using the traditional factor of safety of 3, the allowable bearing capacity had a lower probability (0.03717) of occurring in the field than with the simulation-derived factor of safety of 1.5 (probability 0.15866). This means the traditional factor of safety gives a bearing capacity that is less likely to be available in the field. This shows the subjective nature of the factor of safety, and hence a probabilistic method is suggested to address the variability of the input parameters in bearing capacity equations.
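The simulation idea can be sketched as follows. This is a toy illustration, not the study's model: the soil parameters and their distributions are invented, and an undrained (phi = 0) strip-footing capacity formula with Nc = 5.14 is assumed.

```python
import random
import statistics

random.seed(42)

# Hypothetical clay site: undrained shear strength cu ~ N(50, 8) kPa,
# unit weight gamma ~ N(18, 0.5) kN/m3, footing depth 1.5 m.
NC, DEPTH, N_SIM = 5.14, 1.5, 20_000

def ultimate_capacity(cu, gamma):
    # Strip footing on clay (phi = 0): q_ult = cu * Nc + gamma * D
    return cu * NC + gamma * DEPTH

samples = [ultimate_capacity(random.gauss(50, 8), random.gauss(18, 0.5))
           for _ in range(N_SIM)]
deterministic = ultimate_capacity(50, 18)  # the discrete (mean-value) estimate

# Probability that the in-situ ultimate capacity falls below the allowable
# value implied by each candidate factor of safety
p_below = {}
for fs in (3.0, 1.5):
    allowable = deterministic / fs
    p_below[fs] = sum(q < allowable for q in samples) / N_SIM
```

A larger factor of safety lowers the allowable value and hence the probability that the field capacity falls short of it, which is the comparison the abstract draws between FS = 3 and FS = 1.5.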

Keywords: bearing capacity, factor of safety, isolated footing, Monte Carlo simulation

Procedia PDF Downloads 161
27272 Satellite LiDAR-Based Digital Terrain Model Correction using Gaussian Process Regression

Authors: Keisuke Takahata, Hiroshi Suetsugu

Abstract:

Forest height is an important parameter for forest biomass estimation, and precise elevation data is essential for accurate forest height estimation. There are several globally or nationally available digital elevation models (DEMs), such as SRTM and ASTER. However, their accuracy is reported to be low, particularly in mountainous areas with closed canopy or steep slopes. Recently, space-borne LiDAR, such as the Global Ecosystem Dynamics Investigation (GEDI), has started to provide sparse but accurate ground elevation and canopy height estimates. Several studies have reported a high degree of accuracy in these elevation products on their exact footprints, while it is not clear how this sparse information can be used over wider areas. In this study, we developed a digital terrain model correction algorithm that spatially interpolates the difference between existing DEMs and GEDI elevation products using a Gaussian Process (GP) regression model. The results show that our GP-based methodology can reduce the mean bias of the elevation data from 3.7 m to 0.3 m when airborne LiDAR-derived elevation is used as ground truth. Our algorithm is also capable of quantifying the elevation data uncertainty, which is a critical requirement for biomass inventory. Upcoming satellite LiDAR missions, such as MOLI (Multi-footprint Observation Lidar and Imager), are expected to contribute to more accurate digital terrain model generation.
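The bias-interpolation idea can be sketched with scikit-learn's GP regressor (an assumption; the paper does not name its implementation). The footprint coordinates and bias field below are synthetic stand-ins for GEDI data:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Sparse footprints where the "true" ground elevation is known; the target
# is the DEM-minus-LiDAR bias observed at those footprints.
footprints = rng.uniform(0, 10, size=(60, 2))               # coordinates (km)
bias = 3.0 + np.sin(footprints[:, 0]) \
       + 0.2 * rng.standard_normal(60)                      # bias (m), noisy

gp = GaussianProcessRegressor(
    kernel=RBF(length_scale=2.0) + WhiteKernel(noise_level=0.05),
    normalize_y=True,
)
gp.fit(footprints, bias)

# Predict the correction, and its uncertainty, on the full DEM grid
grid = np.stack(np.meshgrid(np.linspace(0, 10, 20),
                            np.linspace(0, 10, 20)), -1).reshape(-1, 2)
correction, sigma = gp.predict(grid, return_std=True)
# corrected_dem = dem_on_grid - correction  (applied per grid cell)
```

The `sigma` output is the per-cell predictive standard deviation, which is how a GP delivers the uncertainty quantification the abstract highlights.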

Keywords: digital terrain model, satellite LiDAR, Gaussian processes, uncertainty quantification

Procedia PDF Downloads 150
27271 There's No End in Sight: An Interpretative Phenomenological Analysis of Quality of Life in Burning Mouth Syndrome Sufferers

Authors: R. McGrath, A. Trace, S. Curtin, C. McCreary

Abstract:

Introduction: Although much energy has been expended on the definition and etiology of Burning Mouth Syndrome (BMS), it remains a contentious issue. There is agreement on the symptoms, but on little else, and approaches to treatment vary widely. However, it has been established that the condition has a detrimental effect on the sufferer's quality of life. Much research focus has been put on the physical impact of the syndrome; recently, some literature has turned to social, functional, and psychological factors. However, there is very little qualitative research on how burning mouth syndrome affects the lives of sufferers, and the present study seeks to remedy this. Method: The study recruited five male participants who took part in semi-structured interviews lasting between 30 and 50 minutes. Data were analysed using Interpretative Phenomenological Analysis. Results: The study identified four super-ordinate themes: Lack of Control due to Uncertainty about Condition; Disruption to Internal Sense of Self; Negative Future Expectation due to Chronic Symptoms; and Sense of BMS as an Intrusive Force. Aspects of these themes reflect areas of reduced quality of life. Conclusion: BMS damages an individual's quality of life in ways that have not been reflected in self-report surveys of health-related quality of life. The condition has serious implications for the individual's sense of self, identity, and future. The study recommends that further qualitative research be carried out in this area. The use of therapeutic interventions with BMS sufferers is also recommended, which would help not only sufferers but also best practice in relation to their treatment.

Keywords: burning mouth syndrome, interpretative phenomenological analysis, qualitative research, quality of life

Procedia PDF Downloads 414
27270 Soft Computing Employment to Optimize Safety Stock Levels in the Dairy Product Supply Chain under Supply and Demand Uncertainty

Authors: Riyadh Jamegh, Alla Eldin Kassam, Sawsan Sabih

Abstract:

In order to overcome uncertainty and the resulting inability to meet customer requests, organizations tend to reserve a certain safety stock level (SSL). This level must be chosen carefully in order to avoid increased holding cost due to an excessive SSL or shortage cost due to a too low SSL. This paper uses soft computing fuzzy logic to identify the optimal SSL; the fuzzy model uses a dynamic concept to cope with a highly complex environment. The proposed model deals with three input variables, i.e., demand stability level, raw material availability level, and on-hand inventory level, and applies dynamic fuzzy logic to obtain the best SSL as an output. In this model, demand stability, raw material, and on-hand inventory levels are described linguistically and then treated by the inference rules of the fuzzy model to extract the best safety stock level. The aim of this research is to provide a dynamic approach for identifying the safety stock level that can be implemented in different industries. A numerical case study in the dairy industry, with a 200 g yogurt cup product, is presented to validate the proposed model. The obtained results are compared with the current safety stock level, which is calculated using the traditional approach. The importance of the proposed model is demonstrated by a significant reduction in the safety stock level.
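The three-input fuzzy inference can be sketched as a tiny Mamdani-style system with weighted-centroid defuzzification. Everything here is hypothetical: the membership functions, the three rules (a real model would carry a full rule base), and the output centroids are invented for illustration.

```python
def tri(x, a, b, c):
    # Triangular membership function with peak at b
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def safety_stock_level(demand_stability, material_availability, on_hand):
    # Inputs scaled to [0, 1]; output SSL as a fraction of maximum stock.
    low = lambda x: tri(x, -0.5, 0.0, 0.5)
    mid = lambda x: tri(x, 0.0, 0.5, 1.0)
    high = lambda x: tri(x, 0.5, 1.0, 1.5)

    rules = [
        # (firing strength, recommended SSL centroid)
        (min(low(demand_stability), low(material_availability)), 0.9),  # unstable & scarce -> high SSL
        (min(mid(demand_stability), mid(material_availability)), 0.5),
        (min(high(demand_stability), high(on_hand)), 0.1),              # stable & well stocked -> low SSL
    ]
    num = sum(w * c for w, c in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.5
```

Unstable demand with scarce material pushes the recommended SSL up, while stable demand with ample on-hand inventory pulls it down, mirroring the linguistic rules described in the abstract.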

Keywords: inventory optimization, soft computing, safety stock optimization, dairy industries inventory optimization

Procedia PDF Downloads 103
27269 Emptiness Downlink and Uplink Proposal Using Space-Time Equation Interpretation

Authors: Preecha Yupapin and Somnath

Abstract:

From emptiness, vibration induces the fractal, and strings are formed, from which the first elementary particle groups, known as quarks, were established. The neutrino and electron are created by them. More elementary particles and life are formed by organic and inorganic substances. The universe is constructed, from which the multi-universe has formed in the same way. It is assumed that intense energy has escaped from the singularity cone of the multi-universes. Initially, the single mass energy is confined, after which it is disturbed by space-time distortion. It splits into an entangled pair, and circular motion is established. We consider one side of the entangled pair, where the fusion energy of the strong coupling force has formed. The growth of the fusion energy exhibits quantum physics phenomena, with the particle moving along the circumference at a speed faster than light. This introduces the wave-particle duality aspect, which saturates at the stopping point. It re-runs again and again without limit, which is to say that the universe has been created and is expanding. The Bose-Einstein condensate (BEC) is released through the singularity by the wormhole and condenses to become a mass comparable to the Sun's size, which then orbits the Sun. The uncertainty principle is applied, whereby breath control follows the uncertainty condition ∆p∆x = ∆E∆t ~ ℏ. The flow of air in and out of the body via the nose applies momentum and energy control with respect to movement and time, the target being that the distortion of space-time will vanish. Finally, the body is clean and can proceed to the next procedure, where the mind can escape from the body at the speed of light. However, the borderline between contemplation and being an Arahant is a vacuum, which will be explained.

Keywords: space-time, relativity, enlightenment, emptiness

Procedia PDF Downloads 41
27268 Fuzzy Total Factor Productivity by Credibility Theory

Authors: Shivi Agarwal, Trilok Mathur

Abstract:

This paper proposes a method to measure total factor productivity (TFP) change by credibility theory for fuzzy input and output variables. TFP change has been widely studied with crisp input and output variables; in some cases, however, the input and output data of decision-making units (DMUs) can only be measured with uncertainty. Such data can be represented as linguistic variables characterized by fuzzy numbers. The Malmquist productivity index (MPI) is widely used to estimate TFP change by calculating the total factor productivity of a DMU for different time periods using data envelopment analysis (DEA). Here, the fuzzy DEA (FDEA) model is solved using credibility theory, and the FDEA results are used to measure TFP change for fuzzy input and output variables. Finally, numerical examples are presented to illustrate the proposed method. The suggested methodology can be utilized for performance evaluation of DMUs and can help assess their level of integration. It can also be applied to rank the DMUs, identify those that are lagging behind, and make recommendations as to how they can improve their performance to bring them on par with the other DMUs.
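As a minimal illustration of the crisp Malmquist index underlying the fuzzy extension above (the credibility-theory FDEA model itself is not reproduced here, and the distance-function values below are made up), the MPI for one DMU can be computed from four DEA distance values:

```python
import math

def malmquist_index(d_t_t, d_t_t1, d_t1_t, d_t1_t1):
    """Malmquist productivity index from four distance-function values.

    d_a_b is the efficiency of the period-b input/output mix measured
    against the period-a frontier (e.g. from a DEA run).
    MPI > 1 indicates TFP growth between periods t and t+1.
    """
    return math.sqrt((d_t_t1 / d_t_t) * (d_t1_t1 / d_t1_t))

# Hypothetical crisp efficiencies for one DMU (illustrative numbers only).
mpi = malmquist_index(d_t_t=0.80, d_t_t1=0.96, d_t1_t=0.75, d_t1_t1=0.90)
print(round(mpi, 4))  # -> 1.2, i.e. 20% TFP growth
```

In the fuzzy setting each distance value would itself be an interval or fuzzy number obtained from the credibility-constrained FDEA model rather than a crisp efficiency score.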

Keywords: chance-constrained programming, credibility theory, data envelopment analysis, fuzzy data, Malmquist productivity index

Procedia PDF Downloads 331
27267 The Role of Education and Indigenous Knowledge in Disaster Preparedness

Authors: Sameen Masood, Muhammad Ali Jibran

Abstract:

The frequent flood history of Pakistan has underscored the need for disaster risk management. Various policies have been formulated and steps taken by the government to cope with flood effects. However, a more promising proactive approach, one that is globally acknowledged, is educating the masses about living with risk and uncertainty. Unfortunately, the majority of flood victims in Pakistan are poor and illiterate, which also emerges as a significant cause of their distress. An illiterate population is neither risk averse nor intellectually equipped to prepare for and protect itself against natural disasters. The current research uses a cross-disciplinary approach in which the role of education (both formal and informal) and indigenous knowledge is explored with reference to disaster preparedness. The data were collected from the flood-prone rural areas of Punjab. In the absence of a disaster curriculum taught in formal schools, informal education disseminated by NGOs and relief and rehabilitation agencies was the only education the flood victims received. Nevertheless, the educational attainment of flood victims correlated highly with their awareness of flood management and disaster preparedness. Moreover, lessons learned from past flood experience generated indigenous knowledge, on the basis of which flood victims prepared themselves for future uncertainty. If future policy on disaster preparation integrates indigenous knowledge and delivers education on that basis, it is anticipated that flood devastation can be much reduced. Education can play a vital role in amplifying the perception of risk and encouraging precautionary measures against disaster. The findings of the current research will provide practical strategies for settings where disaster preparedness through education has not yet been applied.

Keywords: education, disaster preparedness, illiterate population, risk management

Procedia PDF Downloads 453
27266 Options Trading and Crash Risk

Authors: Cameron Truong, Mikhail Bhatia, Yangyang Chen, Viet Nga Cao

Abstract:

Using a sample of U.S. firms between 1996 and 2011, this paper documents a positive association between options trading volume and future stock price crash risk. This relation is evidently more pronounced among firms with higher information asymmetry, business uncertainty, and short-sale constraints. In a dichotomous cross-sectional setting, we also document that firms with options trading have higher future crash risk than firms without options trading. We further show in a difference-in-difference analysis that firms experience an increase in crash risk immediately after the listing of options. The results suggest that options traders are capable of identifying bad news hoarding by management and choose to trade in a liquid options market in anticipation of future crashes.
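As a hedged illustration of how crash risk is commonly proxied in this literature (the paper's exact measure is not reproduced here; NCSKEW, the negative coefficient of skewness of firm returns, is a standard choice, and the return series below is made up):

```python
def ncskew(returns):
    """Negative coefficient of skewness of demeaned returns.

    Higher values indicate a more crash-prone return distribution.
    Standard crash-risk proxy; illustrative only, not the paper's code.
    """
    n = len(returns)
    mean = sum(returns) / n
    r = [x - mean for x in returns]
    s2 = sum(x ** 2 for x in r)
    s3 = sum(x ** 3 for x in r)
    num = -n * (n - 1) ** 1.5 * s3
    den = (n - 1) * (n - 2) * s2 ** 1.5
    return num / den

# Illustrative weekly returns with one sharp drawdown (a "crash" week).
rets = [0.01, 0.02, -0.15, 0.01, 0.00, 0.02, 0.01, -0.01]
print(ncskew(rets) > 0)  # left-skewed series -> positive crash-risk score
```

A cross-sectional test of the paper's hypothesis would then regress such a firm-level score on lagged options trading volume and controls.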

Keywords: bad news hoarding, cross-sectional setting, options trading, stock price crash

Procedia PDF Downloads 423
27265 Information Disclosure and Financial Sentiment Index Using a Machine Learning Approach

Authors: Alev Atak

Abstract:

In this paper, we aim to create a financial sentiment index by investigating companies' voluntary information disclosures. We retrieve structured content from BIST 100 companies' financial reports for the period 1998-2018 and extract relevant financial information for sentiment analysis through Natural Language Processing. We measure strategy-related disclosures and their cross-sectional variation and classify report content into generic sections using synonym lists divided into four main categories according to liquidity risk profile, risk positions, intra-annual information, and exposure to risk. We use Word Error Rate and Cosine Similarity to compare and measure similarity and deviation across sets of texts. In addition to performing text extraction, we provide a range of text analysis options, such as readability metrics, word counts using pre-determined lists (e.g., forward-looking, uncertainty, tone, etc.), and comparison with a reference corpus (at the word, part-of-speech and semantic level). We thereby create an adequate analytical tool and a financial dictionary that depict the importance of granular financial disclosure in helping investors correctly identify risk-taking behavior and hence make the aggregated effects traceable.
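A minimal sketch of the two text-comparison measures named above, Word Error Rate and Cosine Similarity, applied to bag-of-words inputs (the example phrases are made up, and the authors' actual tokenization pipeline is not reproduced):

```python
import math
from collections import Counter

def cosine_similarity(text_a, text_b):
    """Cosine similarity between bag-of-words vectors of two texts."""
    a, b = Counter(text_a.lower().split()), Counter(text_b.lower().split())
    dot = sum(a[w] * b[w] for w in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def word_error_rate(ref, hyp):
    """Word-level edit distance divided by the reference length."""
    r, h = ref.lower().split(), hyp.lower().split()
    # Dynamic-programming table for substitutions/insertions/deletions.
    d = [[0] * (len(h) + 1) for _ in range(len(r) + 1)]
    for i in range(len(r) + 1):
        d[i][0] = i
    for j in range(len(h) + 1):
        d[0][j] = j
    for i in range(1, len(r) + 1):
        for j in range(1, len(h) + 1):
            cost = 0 if r[i - 1] == h[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1, d[i][j - 1] + 1,
                          d[i - 1][j - 1] + cost)
    return d[len(r)][len(h)] / len(r)

a, b = "liquidity risk exposure", "liquidity risk position"
print(round(cosine_similarity(a, b), 4))   # two of three words shared
print(round(word_error_rate(a, b), 4))     # one substitution out of three
```

In the paper's setting the inputs would be whole report sections rather than short phrases, but the measures behave identically.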

Keywords: financial sentiment, machine learning, information disclosure, risk

Procedia PDF Downloads 71
27264 Modern Well Logs Technology to Improve Geological Model for Libyan Deep Sand Stone Reservoir

Authors: Tarek S. Duzan, Fisal Ben Ammer, Mohamed Sula

Abstract:

In some places within the Sirt Basin, Libya, it has been noticed that seismic data below the pre-Upper Cretaceous unconformity (PUK) fail to resolve the large-scale structural features and cannot fully determine reservoir delineation. Seismic artifacts (multiples) are observed in the reservoir zone (Nubian Formation) below the PUK, which complicates seismic interpretation. The nature of the unconformity and the structures below it are still ambiguous and not fully understood, which leaves a significant gap in characterizing the geometry of the reservoir; this uncertainty, accompanied by the lack of reliable seismic data, creates difficulties in building a robust geological model. High-resolution dipmeter data are highly useful in steeply dipping zones. This paper uses FMI and OBMI borehole images (dipmeter) to analyze the structures below the PUK unconformity from two wells drilled recently in the North Gialo field (a mature reservoir). In addition, the borehole images introduce new evidence that the PUK unconformity is angular and that the bedding planes within the Nubian Formation (below the PUK) are significantly tilted. Structural dips extracted from the high-resolution borehole images are used to construct a new geological model with the latest software technology. It is therefore important to use advanced well-log technology such as FMI-HD for any future drilling and to update the existing model in order to minimize the structural uncertainty.

Keywords: FMI (formation micro imager), OBMI (oil base mud imager), UBI (ultrasonic borehole imager), Nubian sandstone reservoir in North Gialo

Procedia PDF Downloads 299
27263 Probability-Based Damage Detection of Structures Using Model Updating with Enhanced Ideal Gas Molecular Movement Algorithm

Authors: M. R. Ghasemi, R. Ghiasi, H. Varaee

Abstract:

Model updating methods have received increasing attention for damage detection in structures based on measured modal parameters. A probability-based damage detection (PBDD) procedure built on a model updating procedure is therefore presented in this paper, in which a one-stage, model-based damage identification technique based on the dynamic features of a structure is investigated. The presented framework uses a finite element updating method with Monte Carlo simulation to account for the uncertainty caused by measurement noise. Enhanced ideal gas molecular movement (EIGMM) is used as the main model-updating algorithm. Ideal gas molecular movement (IGMM) is a multi-agent algorithm inspired by the movement of ideal gas molecules, which disperse rapidly in all directions and fill the available space, owing to their high speed and to collisions with each other and with surrounding barriers. In the IGMM algorithm, an initial population of gas molecules is randomly generated, and governing equations for molecular velocity and collisions are applied to search for optimal solutions. In this paper, an enhanced version of IGMM, which removes unchanged variables after a specified number of iterations, is developed. The proposed method is implemented on two numerical examples in the field of structural damage detection. The results show that the proposed method performs well and is competitive for PBDD of structures.
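A toy sketch of a population-based search in the spirit of IGMM (the published velocity and collision equations are not reproduced here; the random-step, drift-to-best and wall-collision rules below are illustrative stand-ins, as is the sphere objective standing in for a model-updating misfit between measured and simulated modal parameters):

```python
import random

def toy_gas_search(objective, bounds, n_molecules=20, iters=200, seed=1):
    """Toy population optimizer loosely inspired by IGMM (illustrative only).

    Each 'molecule' takes a shrinking Gaussian step (thermal motion) plus a
    drift toward the best solution found so far, and is clipped at the
    search-space walls (a crude stand-in for collisions with barriers).
    """
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds]
           for _ in range(n_molecules)]
    best = min(pop, key=objective)[:]
    sigma = [0.1 * (hi - lo) for lo, hi in bounds]
    for _ in range(iters):
        for m in pop:
            for d, (lo, hi) in enumerate(bounds):
                m[d] += rng.gauss(0.0, sigma[d]) + 0.2 * (best[d] - m[d])
                m[d] = min(max(m[d], lo), hi)   # collision with the walls
        cand = min(pop, key=objective)
        if objective(cand) < objective(best):   # elitism: keep the best
            best = cand[:]
        sigma = [0.97 * s for s in sigma]       # cooling: molecules slow down
    return best, objective(best)

# Hypothetical misfit: sphere function, minimized at zero "damage parameters".
x_best, f_best = toy_gas_search(lambda v: sum(t * t for t in v),
                                [(-5.0, 5.0)] * 3)
print(round(f_best, 6))
```

In a PBDD setting the objective would compare simulated and measured modal data, and the search would be wrapped in a Monte Carlo loop over noise realizations.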

Keywords: enhanced ideal gas molecular movement (EIGMM), ideal gas molecular movement (IGMM), model updating method, probability-based damage detection (PBDD), uncertainty quantification

Procedia PDF Downloads 256
27262 Challenges of Cryogenic Fluid Metering by Coriolis Flowmeter

Authors: Evgeniia Shavrina, Yan Zeng, Boo Cheong Khoo, Vinh-Tan Nguyen

Abstract:

The present paper provides a review of error sources in cryogenic metering by Coriolis flowmeters (CFMs). Whereas these flowmeters allow accurate water metering, high uncertainty and low repeatability are commonly observed in cryogenic fluid metering, which is often necessary for effective renewable energy production and storage. The sources of these issues can be classified into general and cryogenic-specific challenges. An analysis of experimental and theoretical studies shows that material behaviour at cryogenic temperatures, composition variety, and multiphase presence are the most significant cryogenic challenges. At the same time, pipeline diameter limitation, ambient vibration impact, and installation drawbacks may be highlighted as the most important general challenges of cryogenic metering by CFMs. Finally, techniques that mitigate the impact of these challenges are reviewed, and directions for future development are indicated.

Keywords: Coriolis flowmeter, cryogenic, multicomponent flow, multiphase flow

Procedia PDF Downloads 128
27261 Tracing Sources of Sediment in an Arid River, Southern Iran

Authors: Hesam Gholami

Abstract:

Elevated suspended sediment loads in riverine systems resulting from accelerated erosion due to human activities are a serious threat to the sustainable management of watersheds and ecosystem services therein worldwide. Therefore, mitigation of deleterious sediment effects as a distributed or non-point pollution source in the catchments requires reliable provenance information. Sediment tracing or sediment fingerprinting, as a combined process consisting of sampling, laboratory measurements, different statistical tests, and the application of mixing or unmixing models, is a useful technique for discriminating the sources of sediments. From 1996 to the present, different aspects of this technique, such as grouping the sources (spatial and individual sources), discriminating the potential sources by different statistical techniques, and modification of mixing and unmixing models, have been introduced and modified by many researchers worldwide, and have been applied to identify the provenance of fine materials in agricultural, rural, mountainous, and coastal catchments, and in large catchments with numerous lakes and reservoirs. In the last two decades, efforts exploring the uncertainties associated with sediment fingerprinting results have attracted increasing attention. The frameworks used to quantify the uncertainty associated with fingerprinting estimates can be divided into three groups comprising Monte Carlo simulation, Bayesian approaches and generalized likelihood uncertainty estimation (GLUE). Given the above background, the primary goal of this study was to apply geochemical fingerprinting within the GLUE framework in the estimation of sub-basin spatial sediment source contributions in the arid Mehran River catchment in southern Iran, which drains into the Persian Gulf. 
The accuracy of GLUE predictions generated using four different sets of statistical tests for discriminating three sub-basin spatial sources was evaluated using 10 virtual sediment (VS) samples with known source contributions, on the basis of the root mean square error (RMSE) and mean absolute error (MAE). Based on the results, the contributions modeled by GLUE for the western, central and eastern sub-basins are 1-42% (overall mean 20%), 0.5-30% (overall mean 12%) and 55-84% (overall mean 68%), respectively. According to the mean absolute fit (MAF; ≥ 95% for all target sediment samples) and goodness-of-fit (GOF; ≥ 99% for all samples), the suggested modeling approach is an accurate technique for quantifying sediment sources in catchments. Overall, the estimated source proportions can help watershed engineers plan the targeting of conservation programs for soil and water resources.
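A minimal sketch of the GLUE-style unmixing idea described above (not the study's actual code; the tracer concentrations and source proportions below are hypothetical): random source-proportion sets are drawn on the simplex, scored against the target sediment's tracer signature, and the best-fitting "behavioral" sets are retained and averaged.

```python
import random

def glue_unmix(sources, target, n_draws=20000, keep_frac=0.01, seed=42):
    """GLUE-style sediment unmixing sketch (illustrative only).

    sources: one tracer-concentration vector per sub-basin source.
    target:  tracer vector of the target sediment sample.
    Returns the mean proportions over the behavioral (best-fitting) draws.
    """
    rng = random.Random(seed)
    draws = []
    for _ in range(n_draws):
        w = [rng.random() for _ in sources]
        s = sum(w)
        p = [x / s for x in w]                  # random point on the simplex
        mix = [sum(p[i] * src[j] for i, src in enumerate(sources))
               for j in range(len(target))]
        sse = sum((m - t) ** 2 for m, t in zip(mix, target))
        draws.append((sse, p))
    draws.sort(key=lambda d: d[0])
    kept = draws[: max(1, int(keep_frac * n_draws))]  # behavioral sets
    k = len(kept)
    return [sum(p[i] for _, p in kept) / k for i in range(len(sources))]

# Hypothetical tracers: a target mixed from 20% / 10% / 70% of three sources.
srcs = [[10.0, 2.0], [4.0, 8.0], [1.0, 1.0]]
true_p = [0.2, 0.1, 0.7]
tgt = [sum(pp * s[j] for pp, s in zip(true_p, srcs)) for j in range(2)]
est = glue_unmix(srcs, tgt)
print([round(e, 2) for e in est])
```

The spread of the retained draws (not shown) is what gives GLUE its uncertainty ranges, analogous to the 1-42% / 0.5-30% / 55-84% intervals reported above.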

Keywords: sediment source tracing, generalized likelihood uncertainty estimation, virtual sediment mixtures, Iran

Procedia PDF Downloads 47
27260 Mean Monthly Rainfall Prediction at Benina Station Using Artificial Neural Networks

Authors: Hasan G. Elmazoghi, Aisha I. Alzayani, Lubna S. Bentaher

Abstract:

Rainfall is a highly non-linear phenomenon, whose accurate prediction requires powerful supervised data mining techniques. In this study, the Artificial Neural Network (ANN) technique is used to predict the mean monthly historical rainfall data collected at the BENINA station in Benghazi over 31 years, the period 1977-2006, and the results are compared against the observed values. A specific objective was to determine the best combination of weather variables to be used as inputs for the ANN model. Several statistical parameters were calculated, and an uncertainty analysis of the results is also presented. The best ANN model was then applied to the data of one year (2007) as a case study in order to evaluate its performance. Simulation results reveal that the ANN technique is promising and can provide reliable estimates of rainfall.
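A minimal, self-contained sketch of the kind of ANN used here, a one-hidden-layer network trained by batch gradient descent (the station data, input variables and network architecture of the actual study are not reproduced; the seasonal toy signal below is made up and normalized):

```python
import math
import random

def train_mlp(xs, ys, hidden=6, epochs=2000, lr=0.05, seed=0):
    """Minimal one-hidden-layer tanh network trained by batch gradient descent."""
    rng = random.Random(seed)
    w1 = [rng.uniform(-1.0, 1.0) for _ in range(hidden)]  # input -> hidden
    b1 = [0.0] * hidden
    w2 = [rng.uniform(-1.0, 1.0) for _ in range(hidden)]  # hidden -> output
    b2 = 0.0

    def forward(x):
        h = [math.tanh(w1[j] * x + b1[j]) for j in range(hidden)]
        return h, b2 + sum(w2[j] * h[j] for j in range(hidden))

    n = len(xs)
    for _ in range(epochs):
        gw1, gb1 = [0.0] * hidden, [0.0] * hidden
        gw2, gb2 = [0.0] * hidden, 0.0
        for x, y in zip(xs, ys):
            h, pred = forward(x)
            err = pred - y                  # d(loss)/d(pred) for 0.5*err^2
            gb2 += err
            for j in range(hidden):
                gw2[j] += err * h[j]
                dh = err * w2[j] * (1.0 - h[j] ** 2)
                gw1[j] += dh * x
                gb1[j] += dh
        for j in range(hidden):             # averaged batch updates
            w1[j] -= lr * gw1[j] / n
            b1[j] -= lr * gb1[j] / n
            w2[j] -= lr * gw2[j] / n
        b2 -= lr * gb2 / n
    return lambda x: forward(x)[1]

# Made-up seasonal signal: normalized mean monthly rainfall vs. month index.
months = [m / 12.0 for m in range(12)]
rain = [math.cos(2.0 * math.pi * m) for m in months]
model = train_mlp(months, rain)
rmse = math.sqrt(sum((model(x) - y) ** 2
                     for x, y in zip(months, rain)) / len(months))
print(round(rmse, 3))
```

The study's actual model would take several selected weather variables as inputs rather than a single month index, and the reported uncertainty analysis would bracket predictions such as these.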

Keywords: neural networks, rainfall, prediction, climatic variables

Procedia PDF Downloads 461
27259 Potential Impact of Climate Change on Suspended Sediment Changes in Mekong River Basin

Authors: Zuliziana Suif, Nordila Ahmad, Sengheng Hul

Abstract:

This paper evaluates the impact of climate change on suspended sediment in the Mekong River Basin. A distributed, process-based sediment transport model is used to examine the potential impact of future climate on suspended sediment dynamics in the basin. To this end, climate scenarios from two General Circulation Models (GCMs) were considered in the scenario analysis. The simulation results show that sediment load and concentration increase by 0.64% to 69% in the near future (2041-2050) and by 2.5% to 95% in the far future (2090-2099). As the projected climate change impact on sediment varies remarkably between the climate models, this uncertainty should be taken into account in sediment management. Overall, the changes in sediment load and concentration can have great implications for related sediment management.

Keywords: climate change, suspended sediment, Mekong River Basin, GCMs

Procedia PDF Downloads 413
27258 Understanding the Manifestation of Psychosocial Difficulties in Children with Developmental Language Disorder, with a Focus on Anxiety and Social Frustration

Authors: Annabel Burnley, Michelle St. Clair, Charlotte Dack, Yvonne Wren

Abstract:

Children with Developmental Language Disorder (DLD) are well documented to experience social and emotional difficulties. Despite this, there is little consensus as to how these difficulties manifest, without which the ability to develop prevention initiatives is limited. An online survey was completed by 107 parents of either children with DLD ('DLD sample'; n=57) or typically developing children ('typical sample'; n=50), all aged 6-12 years. Psychosocial symptom measures were used, alongside 11 psychosocial statements generated from previous qualitative work. Qualitative interviews were then held to understand the manifestation of key difficulties in more depth (n=4). The DLD sample scored significantly higher on all psychosocial statements than the typical sample. Experiencing anxiety (80.7%), requiring routine and sameness (75.4%) and struggling to regulate their emotions (75.4%) were the most common difficulties for a majority of children with DLD. For this DLD sample, family communication and coping styles were found not to contribute to the manifestation of these difficulties. Two separate mediation models were run to understand the role of other psychosocial difficulties in the manifestation of (1) anxiety and (2) social frustration. Intolerance of uncertainty was found to strongly mediate the relationship between DLD diagnosis and symptoms of anxiety. Emotion regulation was found to moderately mediate the relationship between DLD diagnosis and social frustration. Parents appear to cope well with their children's complex psychosocial needs, but further external intervention is needed. Intervention focusing on intolerance of uncertainty and emotion dysregulation may help the management of anxiety and social frustration. Further research is needed to understand the children's routine-seeking behaviors.

Keywords: psychosocial difficulties, developmental language disorder, specific language impairment, parent, anxiety

Procedia PDF Downloads 84
27257 Strategic Risk Issues for Film Distributors of Hindi Film Industry in Mumbai: A Grounded Theory Approach

Authors: Rashmi Dyondi, Shishir K. Jha

Abstract:

The purpose of this paper is to address the strategic risk issues surrounding Hindi film distribution in Mumbai for a film distributor, who acts as an entrepreneur when launching a product (a movie) in the market (a film territory). The paper undertakes a fundamental review of films and risk in the Hindi film industry and applies the Grounded Theory technique to understand the complex phenomenon of risk-taking behavior among film distributors (both independents and studios) in Mumbai. Rich in-depth interviews with distributors are coded to develop core categories through constant comparison, leading to conceptualization of the phenomena of interest. This paper is a first-of-its-kind attempt to understand the risk behavior of a distributor, which is akin to entrepreneurial risk behavior under conditions of uncertainty. Unlike the extensive scholarly work on the dynamics of the Hollywood motion picture industry, the Hindi film industry has so far been under-researched; in particular, how film distributors perceive risk remains unexplored. Films are unique experience products, and the film distributor acts as an entrepreneur assuming high risks given the uncertainty in the motion picture business. With the entry of mighty corporate studios and astronomical film budgets posing serious business threats to independent distributors, there is a need for an in-depth qualitative enquiry (applying the Grounded Theory technique) to unravel how the independent distributors in Mumbai define risk vis-à-vis the corporate studios. The need for good content emerged as a challenge common to both groups in the present state of the industry; however, the corporate studios, with their distinct ideologies, focus on their own productions and financial power, faced a different set of challenges than the independents (such as achieving sustainability in business).
Softer issues such as market goodwill, relations with producers, honesty in business dealings and transparency emerged as clear markers of the independents' long-run success. The findings from the qualitative analysis stress the different elements of risk and challenge as perceived by the two groups of distributors in the Hindi film industry and provide a future research agenda for empirical investigation of the determinants of the box-office success of Hindi films distributed in Mumbai.

Keywords: entrepreneurial risk behavior, film distribution strategy, Hindi film industry, risk

Procedia PDF Downloads 291
27256 The Relationship Between Soldiers’ Psychological Resilience, Leadership Style and Organisational Commitment

Authors: Rosita Kanapeckaite

Abstract:

The modern operational military environment is a combination of factors such as change, uncertainty, complexity and ambiguity. Stiehm (2002) refers to such situations as VUCA situations: VUCA is an acronym commonly used to describe the volatility, uncertainty, complexity and ambiguity of various situations and conditions. Increasingly fast-paced military operations require military personnel to demonstrate readiness and resilience under stressful conditions in order to maintain the optimum cognitive and physical performance necessary for success. Military resilience can be defined as the ability to cope with the negative effects of setbacks, and the associated stress, on military performance and combat effectiveness. In the volatile, uncertain, complex and ambiguous modern operational environment, both current and future operations require, and place a high priority on, enhancing and maintaining troop readiness and resilience to win decisively in multidimensional combat. This paper explores the phenomenon of soldiers' psychological resilience, theories of leadership, and commitment to the organisation. The aim of the study is to examine the relationship between soldiers' psychological resilience, leadership style and commitment to the organisation. The study involved 425 professional soldiers; the research method was a questionnaire survey. The instruments used were measures of psychological resilience, leadership styles and commitment to the organisation. Results: a transformational leadership style predicts higher psychological resilience, and psychologically resilient professional soldiers are more committed to the organisation. The study confirms the importance of soldiers' psychological resilience for their commitment to the organisation. Practical applications are also discussed.

Keywords: resilience, commitment, soldiers, leadership style

Procedia PDF Downloads 54
27255 Building Information Models Utilization for Design Improvement of Infrastructure

Authors: Keisuke Fujioka, Yuta Itoh, Masaru Minagawa, Shunji Kusayanagi

Abstract:

In this study, building information models of underground temporary structures and adjacent embedded pipes were constructed to show how information on underground pipes adjacent to the structures can enhance the productivity of construction execution. Next, the bar chart used in the actual construction process was converted into a Gantt chart, and critical path analysis was carried out to show that accurate information on the arrangement of existing underground pipes can be used to enhance the productivity of constructing underground structures. In the analyzed project, no significant construction delay was caused by the unforeseen existence of underground pipes, owing to the management ability of the construction manager. In many construction projects in developing countries, however, unforeseen embedded pipes often cause substantial delays. Design changes arising from uncertainty in the position information of embedded pipes can also be an important risk for contractors in domestic construction. CPM analyses were therefore performed with project-management software for a scenario in which the influence of the delay-causing tasks was assumed to be more significant. Through these analyses, the efficiency of information management on underground pipes, and of BIM analysis in the design stage, for workability improvement was indirectly confirmed.
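The critical path analysis mentioned above can be sketched as a standard CPM forward/backward pass (the task network below is hypothetical, not the analyzed project's actual schedule):

```python
def critical_path(tasks):
    """Forward/backward pass of the Critical Path Method.

    tasks: {name: (duration, [predecessor names])}.
    Returns (project duration, set of critical tasks with zero total float).
    """
    # Topological order via repeated scans (fine for small networks).
    order, placed = [], set()
    while len(order) < len(tasks):
        for name, (_, preds) in tasks.items():
            if name not in placed and all(p in placed for p in preds):
                order.append(name)
                placed.add(name)
    es, ef = {}, {}
    for name in order:                       # forward pass: earliest times
        dur, preds = tasks[name]
        es[name] = max((ef[p] for p in preds), default=0)
        ef[name] = es[name] + dur
    duration = max(ef.values())
    succs = {n: [m for m, (_, ps) in tasks.items() if n in ps] for n in tasks}
    lf, ls = {}, {}
    for name in reversed(order):             # backward pass: latest times
        dur, _ = tasks[name]
        lf[name] = min((ls[s] for s in succs[name]), default=duration)
        ls[name] = lf[name] - dur
    critical = {n for n in tasks if ls[n] == es[n]}
    return duration, critical

# Hypothetical excavation network: relocating an embedded pipe (C) gates
# wall construction (D) and therefore sits on the critical path.
net = {"A": (2, []), "B": (4, ["A"]), "C": (6, ["A"]), "D": (3, ["B", "C"])}
dur, crit = critical_path(net)
print(dur, sorted(crit))  # -> 11 ['A', 'C', 'D']
```

Lengthening the pipe-relocation task C in such a model immediately shows the knock-on delay to the project duration, which is the kind of what-if scenario the study ran in project-management software.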

Keywords: building-information modelling, construction information modelling, design improvement, infrastructure

Procedia PDF Downloads 282