Search results for: stochastic uncertainty analysis
28195 New Analytical Current-Voltage Model for GaN-based Resonant Tunneling Diodes
Authors: Zhuang Guo
Abstract:
In simulations of GaN-based resonant tunneling diodes (RTDs), the traditional Tsu-Esaki formalism fails to predict the peak currents and peak voltages of the simulated current-voltage (J-V) characteristics. The main reason is that, due to the strong internal polarization fields, a two-dimensional electron gas (2DEG) accumulates at the emitter, so that 2D-2D resonant tunneling currents become the dominant part of the total J-V characteristics. Because it is based on a 3D-2D resonant tunneling mechanism, the traditional Tsu-Esaki formalism cannot predict these J-V characteristics correctly. To overcome this shortcoming, we develop a new analytical model for the 2D-2D resonant tunneling currents generated in GaN-based RTDs. Compared with the Tsu-Esaki formalism, the new model makes the following modifications: first, by considering Heisenberg uncertainty, it corrects the expression for the density of states around the 2DEG eigenenergy levels at the emitter, so that it can predict the half width at half maximum (HWHM) of the resonant tunneling currents; second, by taking into account the effect of bias on the wave vectors at the collector, it modifies the expression for the transmission coefficients, which brings the predicted peak currents closer to the experimental data than the Tsu-Esaki formalism does. The new analytical model successfully predicts the J-V characteristics of GaN-based RTDs, and it also reveals in more detail the resonant tunneling mechanisms at work in GaN-based RTDs, which helps in the design and fabrication of high-performance GaN RTDs.
Keywords: GaN-based resonant tunneling diodes, Tsu-Esaki formalism, 2D-2D resonant tunneling, Heisenberg uncertainty
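For reference, the baseline the abstract modifies is the standard Tsu-Esaki tunneling-current integral; the expression below is its common textbook form (not quoted from the paper), with T(E_z) the transmission coefficient through the double barrier and E_F the emitter Fermi level.
```latex
% Standard Tsu-Esaki current density (textbook form, for context only)
\begin{equation}
J(V) \;=\; \frac{e\,m^{*}k_{B}T}{2\pi^{2}\hbar^{3}}
\int_{0}^{\infty} T(E_{z})\,
\ln\!\left[\frac{1+\exp\!\big((E_{F}-E_{z})/k_{B}T\big)}
               {1+\exp\!\big((E_{F}-E_{z}-eV)/k_{B}T\big)}\right]\, dE_{z}
\end{equation}
```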
Procedia PDF Downloads 76
28194 Statistical Characteristics of Distribution of Radiation-Induced Defects under Random Generation
Authors: P. Selyshchev
Abstract:
We consider fluctuations of the defect density taking into account the interaction between defects. A stochastic field of displacement generation rates gives a random defect distribution. We determine the statistical characteristics (mean and dispersion) of the random field of the point defect distribution as a function of the defect generation parameters, the temperature, and the properties of the irradiated crystal.
Keywords: irradiation, primary defects, interaction, fluctuations
Procedia PDF Downloads 343
28193 Observationally Constrained Estimates of Aerosol Indirect Radiative Forcing over Indian Ocean
Authors: Sofiya Rao, Sagnik Dey
Abstract:
Aerosol-cloud-precipitation interaction continues to be one of the largest sources of uncertainty in quantifying aerosol climate forcing, and the uncertainty grows from the global to the regional scale. The problem remains unresolved due to the large discrepancies in the representation of cloud processes in climate models. Most studies of aerosol-cloud-climate and aerosol-cloud-precipitation interaction over the Indian Ocean (e.g., the INDOEX and CAIPEEX campaigns) are restricted either to one season or to one region. Here we develop a theoretical framework to quantify aerosol indirect radiative forcing using 15 years (2000-2015) of Moderate Resolution Imaging Spectroradiometer (MODIS) aerosol and cloud products over the Indian Ocean. This framework relies on an observationally constrained estimate of the aerosol-induced change in cloud albedo. We partition the change in cloud albedo into the changes in Liquid Water Path (LWP) and cloud Effective Radius (Reff) in response to a change in aerosol optical depth (AOD). The cloud albedo response to an increase in AOD is most sensitive for LWP in the range 120-300 g/m² and Reff in the range 8-24 micrometers, which means clouds in this range of LWP and Reff are the most susceptible to aerosols. Using this framework, the aerosol forcing during a transition from the indirect to the semi-direct effect is also calculated. The analysis performs best over the Arabian Sea, compared with the Bay of Bengal and the South Indian Ocean, because of the heterogeneity of aerosol species over the Arabian Sea: more absorbing aerosols dominate there in winter, while dust (coarse-mode aerosol particles) dominates in the pre-monsoon. In winter and the pre-monsoon the aerosol forcing dominates, whereas during the monsoon and post-monsoon seasons meteorological forcing dominates. Over the South Indian Ocean, largely the same aerosol type (sea salt) is present. Over the Arabian Sea the aerosol indirect radiative forcing is about -5 ± 4.5 W/m² in winter and weakens in the other seasons. The results provide observationally constrained estimates of aerosol indirect forcing over the Indian Ocean, which can be helpful in evaluating climate model performance in the context of such complex interactions.
Keywords: aerosol-cloud-precipitation interaction, aerosol-cloud-climate interaction, indirect radiative forcing, climate model
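A minimal sketch of the kind of albedo partitioning described above, assuming the standard thin-cloud relations tau ≈ 3·LWP/(2·ρw·Reff) and albedo ≈ tau/(tau + 7.7); the numbers, function names, and the coupling coefficient are illustrative assumptions, not values from the study.
```python
import numpy as np

RHO_W = 1.0e3  # density of liquid water, kg/m^3

def cloud_albedo(lwp_g_m2, reff_um):
    """Cloud albedo from LWP (g/m^2) and effective radius (micrometres).

    Uses the standard two-stream approximations assumed here (not from the paper):
    optical depth tau ~= 3*LWP / (2*rho_w*Reff), albedo ~= tau / (tau + 7.7).
    """
    lwp = lwp_g_m2 * 1e-3          # -> kg/m^2
    reff = reff_um * 1e-6          # -> m
    tau = 3.0 * lwp / (2.0 * RHO_W * reff)
    return tau / (tau + 7.7)

def albedo_susceptibility(lwp_g_m2, reff_um, d_lnreff_d_lnaod=-0.1):
    """dA/dln(AOD) through the Reff pathway only, by finite difference.
    The coupling d ln(Reff)/d ln(AOD) would come from the satellite regression."""
    eps = 1e-3
    a0 = cloud_albedo(lwp_g_m2, reff_um)
    a1 = cloud_albedo(lwp_g_m2, reff_um * (1.0 + eps))
    dA_dlnreff = (a1 - a0) / eps
    return dA_dlnreff * d_lnreff_d_lnaod

print(cloud_albedo(200.0, 12.0), albedo_susceptibility(200.0, 12.0))
```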
Procedia PDF Downloads 175
28192 Optimizing and Evaluating Performance Quality Control of the Production Process of Disposable Essentials Using Approach Vague Goal Programming
Authors: Hadi Gholizadeh, Ali Tajdin
Abstract:
To have effective production planning, it is necessary to control the quality of processes. This paper aims at improving the performance of the disposable-essentials process using statistical quality control and goal programming in a vague environment; the uncertainty arises because there is always measurement error in the real world. Therefore, in this study the conditions are examined in a vague, distance-based environment. The disposable-essentials process at Kach Company was studied. Statistical control tools were used to characterize the existing process for four factor responses: the average weights, heights, crater diameters, and volumes of the disposable glasses. Goal programming was then utilized to find the combination of optimal factor settings in a vague environment, which captures the uncertainty of the initial information when some of the model parameters are vague; a fuzzy regression model is also used to predict the responses of the four factors described. The optimization results show that the process capability index values for the average weights, heights, crater diameters, and volumes of the disposable glasses were improved. This increases the quality of the products and reduces waste, which lowers the cost of the finished product and ultimately brings customer satisfaction, which in turn means increased sales.
Keywords: goal programming, quality control, vague environment, disposable glasses’ optimization, fuzzy regression
Procedia PDF Downloads 223
28191 Maximizing Coverage with Mobile Crime Cameras in a Stochastic Spatiotemporal Bipartite Network
Authors: (Ted) Edward Holmberg, Mahdi Abdelguerfi, Elias Ioup
Abstract:
This research details a coverage measure for evaluating the effectiveness of observer node placements in a spatial bipartite network. This coverage measure can be used to optimize the configuration of stationary or mobile spatially oriented observer nodes, or a hybrid of the two, over time in order to fully utilize their capabilities. To demonstrate the practical application of this approach, we construct a SpatioTemporal Bipartite Network (STBN) using real-time crime center (RTCC) camera nodes and NOPD calls-for-service (CFS) event nodes from New Orleans, LA (NOLA). We use the coverage measure to identify optimal placements for moving mobile RTCC camera vans to improve coverage of vulnerable areas based on temporal patterns.
Keywords: coverage measure, mobile node dynamics, Monte Carlo simulation, observer nodes, observable nodes, spatiotemporal bipartite knowledge graph, temporal spatial analysis
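A minimal sketch of one way such a coverage measure could be computed: the fraction of event (observable) nodes within sensing range of at least one camera (observer) node, averaged over Monte Carlo draws of event locations; the ranges, spreads, and names are illustrative assumptions, not the paper's definitions.
```python
import numpy as np

rng = np.random.default_rng(0)

def coverage(cameras, events, sensing_radius):
    """Fraction of event nodes within sensing_radius of at least one camera.
    cameras, events: arrays of shape (n, 2) holding planar coordinates."""
    d = np.linalg.norm(events[:, None, :] - cameras[None, :, :], axis=-1)
    return float(np.mean(d.min(axis=1) <= sensing_radius))

def expected_coverage(cameras, hotspots, spread, sensing_radius, n_draws=1000):
    """Monte Carlo estimate of expected coverage when event locations are drawn
    around historical calls-for-service hot spots."""
    vals = []
    for _ in range(n_draws):
        events = hotspots + rng.normal(scale=spread, size=hotspots.shape)
        vals.append(coverage(cameras, events, sensing_radius))
    return float(np.mean(vals))

cameras = rng.uniform(0, 10, size=(5, 2))
hotspots = rng.uniform(0, 10, size=(50, 2))
print(expected_coverage(cameras, hotspots, spread=0.5, sensing_radius=2.0))
```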
Procedia PDF Downloads 113
28190 Impact of Changes of the Conceptual Framework for Financial Reporting on the Indicators of the Financial Statement
Authors: Nadezhda Kvatashidze
Abstract:
The International Accounting Standards Board has updated the conceptual framework for financial reporting. The main reason is to resolve accounting tasks that arise from market development and from business transactions with new economic content. Investors also call for greater transparency of information and responsibility for results in order to make more accurate risk assessments and forecasts. All of this makes it necessary to develop the conceptual framework for financial reporting further so that users receive useful information. Market development and certain shortcomings of the conceptual framework revealed in practice require its reconsideration and new solutions. Some issues and concepts, such as the disclosure and supply of information, its qualitative characteristics, assessment, and measurement uncertainty, had to be supplemented and refined. The criteria for recognition of certain elements of reporting (assets and liabilities) also had to be updated, and all of this is set out in the updated edition of the conceptual framework for financial reporting, a comprehensive collection of the concepts underlying preparation of the financial statement. The main objective of the revision is to improve financial reporting and to develop a clear package of concepts. This will support the International Accounting Standards Board (IASB) in setting a common "Approach & Reflection" for similar transactions on the basis of mutually accepted concepts. As a result, companies will be able to develop coherent accounting policies for transactions or events that arise from particular deals to which no standard applies, or where a standard allows a choice of accounting policy.
Keywords: conceptual framework, measurement basis, measurement uncertainty, neutrality, prudence, stewardship
Procedia PDF Downloads 126
28189 Advanced Technologies and Algorithms for Efficient Portfolio Selection
Authors: Konstantinos Liagkouras, Konstantinos Metaxiotis
Abstract:
In this paper we present a classification of the various technologies applied to the solution of the portfolio selection problem, according to the discipline and the methodological framework followed. We provide a concise presentation of the categories that have emerged and try to identify which methods are considered obsolete and which lie at the heart of the debate. On top of that, we provide a comparative study of the different technologies applied to efficient portfolio construction, and we suggest potential paths for future work that lie at the intersection of the presented techniques.
Keywords: portfolio selection, optimization techniques, financial models, stochastic, heuristics
Procedia PDF Downloads 432
28188 Explanatory Variables for Crash Injury Risk Analysis
Authors: Guilhermina Torrao
Abstract:
An extensive number of studies have been conducted to determine the factors that influence crash injury risk (CIR); however, uncertainties inherent in the selected variables have been neglected. A review of the existing literature is required not only to obtain an overview of the variables and measures but also to ascertain the implications of comparing studies without a systematic view of variable taxonomy. Therefore, the aim of this literature review is to examine and report on peer-reviewed studies in the field of crash analysis and to understand the implications of the broad variation in variable selection in CIR analysis. The objective of this study is to demonstrate the variance in variable selection and classification when modeling injury risk involving occupants of light vehicles, by presenting an analytical review of the literature. Based on data collected from 64 journal publications reported over the past 21 years, the analytical review discusses the variables selected by each study across an organized list of predictors for CIR analysis and provides a better understanding of the contribution of accident and vehicle factors to injuries sustained by occupants of light vehicles. A cross-comparison analysis demonstrates that almost half the studies (48%) did not consider vehicle design specifications (e.g., vehicle weight), whereas, among those that did, vehicle age/model year was the most frequently selected explanatory variable, used by 41% of the studies. Among studies that included a speed risk factor in their analyses, the majority (64%) used the legal speed limit as a 'proxy' for vehicle speed at the moment of the crash, imposing limitations on CIR analysis and modeling. Despite the proven effectiveness of airbags in minimizing injury following a crash, only 22% of studies included airbag deployment data. A major contribution of this study is to highlight the uncertainty linked to explanatory variable selection and to identify opportunities for improvement in future studies in the field of road injuries.
Keywords: crash, exploratory, injury, risk, variables, vehicle
Procedia PDF Downloads 135
28187 Production Optimization under Geological Uncertainty Using Distance-Based Clustering
Authors: Byeongcheol Kang, Junyi Kim, Hyungsik Jung, Hyungjun Yang, Jaewoo An, Jonggeun Choe
Abstract:
It is important to figure out reservoir properties for better production management. Because information is limited, there are geological uncertainties in very heterogeneous or channelized reservoirs. One solution is to generate multiple equiprobable realizations using geostatistical methods. However, some models have wrong properties and need to be excluded for simulation efficiency and reliability. We propose a novel model selection scheme based on distance-based clustering for reliable application of a production optimization algorithm. Distance is defined as a degree of dissimilarity between the data. We calculate the Hausdorff distance to classify the models based on their similarity; the Hausdorff distance is useful for shape matching of the reservoir models. We use multi-dimensional scaling (MDS) to describe the models in two-dimensional space and group them by K-means clustering. Rather than simulating all models, we choose one representative model from each cluster and find the best model, the one whose production rates are most similar to the true values. From this process, we can select good reservoir models near the best model with high confidence. We generate 100 channel reservoir models using single normal equation simulation (SNESIM). Since oil and gas prefer to flow through the sand facies, it is critical to characterize the pattern and connectivity of the channels in the reservoir. After calculating Hausdorff distances and projecting the models by MDS, we can see that the models cluster according to their channel patterns. These channel distributions affect the operation controls of each production well, so the model selection scheme improves the management optimization process. We use a widely used global search algorithm, particle swarm optimization (PSO), for production optimization. PSO is good at finding the global optimum of an objective function, but it takes much time because it uses many particles and iterations; in addition, if multiple reservoir models are used, the simulation time for PSO soars. By using the proposed method, we can select good and reliable models that already match the production data. Considering the geological uncertainty of the reservoir, we can obtain well-optimized production controls for maximum net present value. The proposed method offers a novel way to select good cases among the various possibilities. The model selection scheme can be applied not only to production optimization but also to history matching or other ensemble-based methods for efficient simulation.
Keywords: distance-based clustering, geological uncertainty, particle swarm optimization (PSO), production optimization
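A minimal sketch of the clustering pipeline the abstract describes (pairwise Hausdorff distances between realizations, MDS projection, K-means grouping, one representative per cluster), using generic 2-D point sets as stand-ins for channel-facies maps; the array shapes and parameters are illustrative assumptions.
```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff
from sklearn.manifold import MDS
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
# Each "realization" is represented here by the coordinates of its channel cells.
realizations = [rng.uniform(0, 100, size=(200, 2)) + rng.normal(0, 5) for _ in range(30)]

n = len(realizations)
D = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        # Symmetric Hausdorff distance between two point sets.
        d = max(directed_hausdorff(realizations[i], realizations[j])[0],
                directed_hausdorff(realizations[j], realizations[i])[0])
        D[i, j] = D[j, i] = d

# Project models into 2-D with metric MDS on the precomputed distances.
coords = MDS(n_components=2, dissimilarity="precomputed", random_state=0).fit_transform(D)

# Group models and pick one representative (closest to centroid) per cluster.
km = KMeans(n_clusters=5, n_init=10, random_state=0).fit(coords)
representatives = [int(np.argmin(np.linalg.norm(coords - c, axis=1))) for c in km.cluster_centers_]
print("representative model indices:", representatives)
```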
Procedia PDF Downloads 143
28186 The Influence of Hydrogen Addition to Natural Gas Networks on Gas Appliances
Authors: Yitong Xie, Chaokui Qin, Zhiguang Chen, Shuangqian Guo
Abstract:
Injecting hydrogen, a competitive carbon-free energy carrier, into existing natural gas networks has become a promising step toward alleviating global warming. Considering the differences in the properties of hydrogen and natural gas, there is very little evidence showing what degree of hydrogen admixture can be accepted and how to adjust appliances to adapt to variation in gas constituents. The lack of this type of analysis adds uncertainty to injecting hydrogen into networks because it leaves burner design and adjustment on a weak basis. First, the properties of methane and hydrogen were compared for a comprehensive analysis of the impact of hydrogen addition to methane. As the main determinant of flame stability, the burning velocity was adopted for the hydrogen addition analysis. Burning velocities of hydrogen-enriched natural gas with different hydrogen percentages and equivalence ratios were calculated with the software CHEMKIN. Interchangeability methods, including single-index methods, multi-index methods, and diagram methods, were adopted to determine the limit of hydrogen percentage. Cooktops and water heaters were experimentally tested in the laboratory. Flame structures at different hydrogen percentages and equivalence ratios were observed and photographed. In addition, the changes in heat efficiency, burner temperature, and emissions with hydrogen percentage and equivalence ratio were studied. The experimental methodologies and results in this paper provide an important basis for the introduction of hydrogen into gas pipelines and the adjustment of gas appliances.
Keywords: hydrogen, methane, combustion, appliances, interchangeability
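A minimal sketch of the single-index interchangeability check mentioned above, the Wobbe index W = HHV / sqrt(relative density), evaluated for hydrogen-methane blends; the pure-gas property values are typical literature numbers assumed for illustration, not data from the paper.
```python
import numpy as np

# Typical volumetric properties (assumed literature values, not from the paper).
HHV = {"CH4": 39.8, "H2": 12.7}            # MJ/m^3, higher heating value
REL_DENSITY = {"CH4": 0.554, "H2": 0.070}  # density relative to air

def wobbe_index(h2_fraction):
    """Wobbe index of an H2/CH4 blend: W = HHV_mix / sqrt(d_mix)."""
    x = np.asarray(h2_fraction, dtype=float)
    hhv = x * HHV["H2"] + (1 - x) * HHV["CH4"]
    d = x * REL_DENSITY["H2"] + (1 - x) * REL_DENSITY["CH4"]
    return hhv / np.sqrt(d)

for frac in (0.0, 0.1, 0.2, 0.3):
    print(f"{frac:.0%} H2 -> Wobbe index {wobbe_index(frac):.1f} MJ/m^3")
```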
Procedia PDF Downloads 91
28185 Supply Chain Optimization for Silica Sand in a Glass Manufacturing Company
Authors: Ramon Erasmo Verdin Rodriguez
Abstract:
Many are the ways in which managers and gurus have historically tried to get closer to the perfect supply chain, but because the topic is so vast, and becomes more complex the bigger the company is, the task has certainly not been easy. This research looks inside the logistics of a glass manufacturing company for the number-one raw material of the process: silica sand. After a quick pass through the supply chain, the document focuses on the way raw materials flow through the system, so that analysis and research can then take place to improve the logistics. Using Operations Research techniques, the current scheme of distribution and inventories of raw materials at the glass company's plants is analyzed; after a mathematical conceptualization process, the supply chain can be optimized with the purpose of reducing the uncertainty of supply and obtaining an economic benefit at the end of this research.
Keywords: inventory management, operations research, optimization, supply chain
Procedia PDF Downloads 326
28184 Alignment between Governance Structures and Food Safety Standards on the Shrimp Supply Chain in Indonesia
Authors: Maharani Yulisti, Amin Mugera, James Fogarty
Abstract:
Food safety standards have received significant attention in the global fisheries market due to health issues, free trade agreements, and increasing aquaculture production. Vertical coordination throughout the supply chains of fish-producing and exporting countries is needed to meet the food safety demands imposed by importing countries. However, the complexity of supply chain governance structures and difficulties in standard implementation can generate safety uncertainty and high transaction costs. Using a Transaction Cost Economics framework, this paper examines the alignment between food safety standards and governance structures in the shrimp supply chain in Indonesia. We find that the supply chain is organized closer to a hierarchy-like governance structure where the private standard (organic standard) is implemented, and closer to a market-like governance structure where the public standard (IndoGAP certification) is more prevalent. To verify these statements, two cases are examined from Sidoarjo district, a centre of shrimp production in Indonesia. The results show that a public baseline FSS (Food Safety Standard) needs an additional mechanism to achieve a coordinated chain-wide response, because uncertainty, asset specificity, and performance measurement problems are high in this chain. The organic standard, as a private chain-wide FSS, is more efficient because it is achieved through a hierarchy-like governance structure.
Keywords: governance structure, shrimp value chain, food safety standards, transaction costs economics
Procedia PDF Downloads 379
28183 Empirical Acceleration Functions and Fuzzy Information
Authors: Muhammad Shafiq
Abstract:
In accelerated life testing approaches, lifetime data are obtained under various conditions that are considered more severe than the usual condition. Classical techniques are based on precise measurements and are used to model the variation among the observations. In fact, there are two types of uncertainty in data: variation among the observations and fuzziness. Analysis techniques that do not consider fuzziness and are based only on precise lifetime observations lead to pseudo-results. This study aimed to examine the behavior of empirical acceleration functions using fuzzy lifetime data. The results showed increased fuzziness in the transformed lifetimes as compared with the input data.
Keywords: acceleration function, accelerated life testing, fuzzy number, non-precise data
Procedia PDF Downloads 298
28182 A Knowledge-Based Development of Risk Management Approaches for Construction Projects
Authors: Masoud Ghahvechi Pour
Abstract:
Risk management is a systematic and regular process of identifying, analyzing, and responding to risks throughout the project's life cycle in order to achieve the optimal level of elimination, reduction, or control of risk. The purpose of project risk management is to increase the probability and effect of positive events and to reduce the probability and effect of unpleasant events on the project. Risk management is one of the most fundamental parts of project management; unmanaged or untransferred risks can be one of the primary factors of failure in a project. Effective risk management does not mean simply retreating from risk, which is apparently the cheapest option. The main problem with that option is its economic sensitivity, because what is potentially profitable is by definition risky, and what poses no risk is economically uninteresting and does not bring tangible benefits. Therefore, in relation to the implemented project, effective risk management means finding a "middle ground": on the one hand, protection against risk on the negative side by means of accurate identification and classification of risk, which leads to a comprehensive analysis; on the other hand, management using all mathematical and analytical tools, based on examining the maximum benefit of these decisions. A detailed analysis that takes into account all aspects of the company, including stakeholder analysis, will allow effective risk management to add what will become tangible benefits for the project in the future. Identifying project risk is based on determining which types of risk may affect the project, and also refers to specific parameters and to estimating the probability of their occurrence in the project. These conditions can be divided into three groups (certainty, uncertainty, and risk), which in turn correspond to three attitudes toward investment (risk preference, risk neutrality, and risk aversion) and to their measurement. The result of risk identification and project analysis is a list of events that indicates the cause and probability of each event, together with a final assessment of its impact on the environment.
Keywords: risk, management, knowledge, risk management
Procedia PDF Downloads 66
28181 Effect of Urea Deep Placement Technology Adoption on the Production Frontier: Evidence from Irrigation Rice Farmers in the Northern Region of Ghana
Authors: Shaibu Baanni Azumah, William Adzawla
Abstract:
Rice is an important staple crop, with current demand higher than the domestic supply in Ghana. This has led to a high and unfavourable import bill. Therefore, recent policies and interventions in the agricultural sub-sector aim at promoting various improved agricultural technologies in order to improve domestic production and reduce the importation of rice. In this study, we examined the effect of the adoption of Urea Deep Placement (UDP) technology by rice farmers on the position of the production frontier. This involved 200 farmers selected through a multi-stage sampling technique in the Northern region of Ghana. A Cobb-Douglas stochastic frontier model was fitted. The results showed that the adoption of UDP technology shifts the output frontier outward and also moves the farmers closer to the frontier. Farmers were also operating under diminishing returns to scale, which calls for redress. Other factors that significantly influenced rice production were farm size, labour, use of certified seeds, and NPK fertilizer. Although there was an opportunity for improvement, the farmers were highly efficient (92%) compared with previous studies. Farmers' efficiency was improved through increased education, household size, experience, access to credit, and lack of extension service provision by MoFA. The study recommends the revision of Ghana's agricultural policy to include the UDP technology. Agricultural Extension officers of the Ministry of Food and Agriculture (MoFA) should be trained on the UDP technology to support IFDC's drive to improve adoption by rice farmers. Rice farmers are also encouraged to expand their farmlands, improve plant population, and increase the usage of fertilizer to improve yields. Mechanisms through which credit can be made easily accessible and effectively utilised should be identified and promoted.
Keywords: efficiency, rice farmers, stochastic frontier, UDP technology
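A minimal sketch of how a Cobb-Douglas stochastic production frontier of the kind used here can be estimated by maximum likelihood (normal/half-normal composed error, following the Aigner-Lovell-Schmidt formulation); the simulated data and variable names are illustrative, not the study's.
```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)

# Simulated Cobb-Douglas data: ln(y) = b0 + b1*ln(land) + b2*ln(labour) + v - u
n = 200
X = np.column_stack([np.ones(n), rng.normal(1.0, 0.4, n), rng.normal(2.0, 0.5, n)])
true_beta, sigma_v, sigma_u = np.array([0.5, 0.6, 0.3]), 0.2, 0.4
y = X @ true_beta + rng.normal(0, sigma_v, n) - np.abs(rng.normal(0, sigma_u, n))

def neg_loglik(theta):
    """Normal/half-normal SFA log-likelihood for a production frontier."""
    beta, ln_sv, ln_su = theta[:3], theta[3], theta[4]
    sv, su = np.exp(ln_sv), np.exp(ln_su)
    sigma = np.sqrt(sv**2 + su**2)
    lam = su / sv
    eps = y - X @ beta  # composed error v - u
    ll = (np.log(2.0 / sigma) + norm.logpdf(eps / sigma)
          + norm.logcdf(-eps * lam / sigma))
    return -np.sum(ll)

start = np.r_[np.linalg.lstsq(X, y, rcond=None)[0], np.log(0.1), np.log(0.1)]
res = minimize(neg_loglik, start, method="BFGS")
print("beta:", res.x[:3], "sigma_v:", np.exp(res.x[3]), "sigma_u:", np.exp(res.x[4]))
```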
Procedia PDF Downloads 409
28180 Satellite LiDAR-Based Digital Terrain Model Correction using Gaussian Process Regression
Authors: Keisuke Takahata, Hiroshi Suetsugu
Abstract:
Forest height is an important parameter for forest biomass estimation, and precise elevation data are essential for accurate forest height estimation. There are several globally or nationally available digital elevation models (DEMs), such as SRTM and ASTER. However, their accuracy is reported to be low, particularly in mountainous areas with closed canopy or steep slopes. Recently, space-borne LiDAR missions, such as the Global Ecosystem Dynamics Investigation (GEDI), have started to provide sparse but accurate ground elevation and canopy height estimates. Several studies have reported a high degree of accuracy of these elevation products on their exact footprints, while it is not clear how this sparse information can be used over a wider area. In this study, we developed a digital terrain model correction algorithm that spatially interpolates the difference between existing DEMs and GEDI elevation products using a Gaussian Process (GP) regression model. The results show that our GP-based methodology can reduce the mean bias of the elevation data from 3.7 m to 0.3 m when airborne LiDAR-derived elevation information is used as ground truth. Our algorithm is also capable of quantifying the elevation data uncertainty, which is a critical requirement for biomass inventory. Upcoming satellite LiDAR missions, like MOLI (Multi-footprint Observation Lidar and Imager), are expected to contribute to more accurate digital terrain model generation.
Keywords: digital terrain model, satellite LiDAR, Gaussian processes, uncertainty quantification
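A minimal sketch of the correction idea: fit a Gaussian Process to the difference between GEDI-like ground elevations and the existing DEM at sparse footprint locations, then predict that difference, with its uncertainty, everywhere else; the kernel choice, length scales, and synthetic data are illustrative assumptions, not the study's configuration.
```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Sparse footprints: coordinates and the DEM error observed there.
xy_footprints = rng.uniform(0, 5000, size=(300, 2))           # metres
dem_error = (5.0 * np.sin(xy_footprints[:, 0] / 800.0)        # smooth bias field
             + rng.normal(0, 0.5, 300))                       # measurement noise

kernel = 1.0 * RBF(length_scale=500.0) + WhiteKernel(noise_level=0.25)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gp.fit(xy_footprints, dem_error)

# Predict the correction surface (and its 1-sigma uncertainty) on a target grid.
gx, gy = np.meshgrid(np.linspace(0, 5000, 50), np.linspace(0, 5000, 50))
grid = np.column_stack([gx.ravel(), gy.ravel()])
correction, sigma = gp.predict(grid, return_std=True)

# corrected_dem = original_dem_on_grid - correction  (sigma maps the uncertainty)
print(correction.mean(), sigma.mean())
```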
Procedia PDF Downloads 182
28179 Soft Computing Employment to Optimize Safety Stock Levels in Supply Chain Dairy Product under Supply and Demand Uncertainty
Authors: Riyadh Jamegh, Alla Eldin Kassam, Sawsan Sabih
Abstract:
In order to overcome uncertainty conditions and the inability to meet customers' requests under these conditions, organizations tend to reserve a certain safety stock level (SSL). This level must be chosen carefully in order to avoid an increase in holding cost due to an excessive SSL or shortage cost due to too low an SSL. This paper uses soft-computing fuzzy logic to identify the optimal SSL; the fuzzy model uses a dynamic concept to cope with a highly complex environment. The proposed model deals with three input variables, i.e., demand stability level, raw material availability level, and on-hand inventory level, using dynamic fuzzy logic to obtain the best SSL as an output. In this model, the demand stability, raw material, and on-hand inventory levels are described linguistically and then treated by the inference rules of the fuzzy model to extract the best level of safety stock. The aim of this research is to provide a dynamic approach for identifying the safety stock level that can be implemented in different industries. A numerical case study in the dairy industry, for a 200 g yogurt cup product, is presented to validate the proposed model. The obtained results are compared with the current level of safety stock, which is calculated using the traditional approach. The importance of the proposed model is demonstrated by the significant reduction in the safety stock level.
Keywords: inventory optimization, soft computing, safety stock optimization, dairy industries inventory optimization
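A minimal sketch of a Mamdani-style fuzzy inference step for this kind of model, with triangular membership functions for the three inputs and centroid defuzzification of the safety-stock output; the membership breakpoints and the two rules shown are illustrative assumptions, not the paper's calibrated rule base.
```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-12), (c - x) / (c - b + 1e-12)), 0.0)

def safety_stock_level(demand_stability, material_availability, on_hand,
                       ssl_grid=np.linspace(0, 100, 501)):
    # Input fuzzification (all inputs scaled 0-1): "low" and "high" sets.
    low = lambda v: tri(v, -0.2, 0.0, 0.6)
    high = lambda v: tri(v, 0.4, 1.0, 1.2)

    # Illustrative rules (Mamdani inference, min operator for AND):
    # R1: unstable demand AND scarce material AND low stock -> large SSL
    # R2: stable demand AND available material AND high stock -> small SSL
    w1 = min(low(demand_stability), low(material_availability), low(on_hand))
    w2 = min(high(demand_stability), high(material_availability), high(on_hand))

    large = np.minimum(tri(ssl_grid, 50, 100, 150), w1)
    small = np.minimum(tri(ssl_grid, -50, 0, 50), w2)
    agg = np.maximum(large, small)

    # Centroid defuzzification (falls back to a mid value if no rule fires).
    return float((ssl_grid * agg).sum() / agg.sum()) if agg.sum() > 0 else 50.0

print(safety_stock_level(0.2, 0.3, 0.1))   # turbulent situation -> larger SSL
print(safety_stock_level(0.9, 0.9, 0.8))   # calm situation      -> smaller SSL
```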
Procedia PDF Downloads 125
28178 A Novel Probablistic Strategy for Modeling Photovoltaic Based Distributed Generators
Authors: Engy A. Mohamed, Y. G. Hegazy
Abstract:
This paper presents a novel algorithm for modeling photovoltaic-based distributed generators for the purpose of optimal planning of distribution networks. The proposed algorithm utilizes a sequential Monte Carlo method in order to accurately consider the stochastic nature of photovoltaic-based distributed generators. The proposed algorithm is implemented in the MATLAB environment, and the results obtained are presented and discussed.
Keywords: cumulative distribution function, distributed generation, Monte Carlo
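A minimal sketch of the sampling idea behind such a model: draw hourly solar irradiance from a fitted distribution (a Beta distribution is a common choice in the PV-planning literature, assumed here) and map it to PV power output; the parameters and the piecewise power curve are illustrative assumptions, not the paper's algorithm.
```python
import numpy as np

rng = np.random.default_rng(42)

P_RATED = 100.0    # kW, rated PV capacity (illustrative)
G_STD   = 1000.0   # W/m^2, standard irradiance
G_C     = 150.0    # W/m^2, threshold of the low-irradiance region

def pv_power(g):
    """Piecewise PV output curve commonly used in planning studies (assumed form)."""
    g = np.asarray(g, dtype=float)
    p = np.where(g < G_C,
                 P_RATED * g**2 / (G_STD * G_C),   # quadratic low-irradiance part
                 P_RATED * g / G_STD)              # linear part up to rating
    return np.minimum(p, P_RATED)

def simulate_year(alpha=2.0, beta=3.5, hours=8760):
    """Sequential hourly draws of irradiance from a Beta model scaled to G_STD."""
    g = G_STD * rng.beta(alpha, beta, size=hours)
    return pv_power(g)

samples = simulate_year()
print("mean output %.1f kW, capacity factor %.2f" % (samples.mean(), samples.mean() / P_RATED))
```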
Procedia PDF Downloads 584
28177 Emptiness Downlink and Uplink Proposal Using Space-Time Equation Interpretation
Authors: Preecha Yupapin and Somnath
Abstract:
From the emptiness, the vibration induces the fractal, and the strings are formed. From these, the first elementary particle groups, known as quarks, were established. The neutrino and the electron are created by them. More elementary particles and life are formed by organic and inorganic substances. The universe is constructed, from which the multi-universe has formed in the same way. The universe model assumes that the intense energy has escaped from the singularity cone of the multi-universes. Initially, the single mass energy is confined, after which it is disturbed by the space-time distortion. It splits into the entangled pair, where circular motion is established. We consider one side of the entangled pair, where the fusion energy of the strong coupling force has formed. The growth of the fusion energy exhibits quantum physics phenomena, where the particle moves along the circumference with a speed faster than light. This introduces the wave-particle duality aspect, which will be saturated at the stopping point. It will re-run again and again without limitation, which is to say that the universe has been created and expanded. The Bose-Einstein condensate (BEC) is released through the singularity by the wormhole, and will be condensed to become a mass of a size associated with the Sun. It will circulate (orbit) around the Sun. The uncertainty principle is then considered, from which breath control follows the uncertainty condition ∆p∆x = ∆E∆t ~ ℏ. The flow of air in and out of the body via the nose applies momentum and energy control with respect to movement and time, with the target that the distortion of space-time will have vanished. Finally, the body is clean and can go to the next procedure, where the mind can escape from the body at the speed of light. However, the borderline between contemplation and being an Arahant is a vacuum, which will be explained.
Keywords: space-time, relativity, enlightenment, emptiness
Procedia PDF Downloads 67
28176 Russian pipeline natural gas export strategy under uncertainty
Authors: Koryukaeva Ksenia, Jinfeng Sun
Abstract:
Europe has been a traditional importer of Russian natural gas for more than 50 years. In 2021, the Russian state-owned company Gazprom supplied about a third of all gas consumed in Europe. The Russia-Europe mutual dependence in terms of natural gas supplies has long been causing concerns about the energy security of the two sides. These days the issue has become more urgent than ever, considering the recent Russian invasion of Ukraine and the ensuing large-scale geopolitical conflicts, which make the future of Russian natural gas supplies, and of global gas markets as well, highly uncertain. Hence, the main purpose of this study is to gain insight into the possible futures of Russian pipeline natural gas exports through a scenario planning method based on Monte Carlo simulation within the LUSS model framework, and to propose Russian pipeline natural gas export strategies based on the obtained scenario planning results. The scenario analysis revealed that recent geopolitical disputes disturbed the traditional, longstanding model of Russian pipeline gas exports, and, as a result, the prospects and pathways for Russian pipeline gas on world markets will differ significantly from those before 2022. Specifically, our main findings show that (i) the events of 2022 generated many uncertainties for the long-term future of Russian pipeline gas export prospects in both the western and eastern supply directions, including geopolitical, regulatory, economic, infrastructure and other uncertainties; (ii) according to the scenario modelling results, Russian pipeline exports will face many challenges in the future, in both the western and eastern directions; a decrease in pipeline gas exports will inevitably affect the country's natural gas production and significantly reduce fossil fuel export revenues, jeopardizing the energy security of the country; and (iii) according to the proposed strategies, in order to ensure long-term stable export supplies in the changing environment, Russia may need to adjust its traditional export strategy by diversifying export flows and products, entering new markets, adapting its contracting mechanism, increasing competitiveness, and gaining a reputation as a reliable gas supplier.
Keywords: Russian natural gas, pipeline natural gas, uncertainty, scenario simulation, export strategy
Procedia PDF Downloads 60
28175 The Role of Education and Indigenous Knowledge in Disaster Preparedness
Authors: Sameen Masood, Muhammad Ali Jibran
Abstract:
The history of frequent floods in Pakistan has underscored the need for disaster risk management. Various policies have been formulated, and steps are being taken by the government, in order to cope with flood effects. However, a more promising, globally acknowledged proactive approach is educating the masses about living with risk and uncertainty. Unfortunately, the majority of flood victims in Pakistan are poor and illiterate, which also emerges as a significant cause of their distress. An illiterate population is not risk averse or intellectually equipped to prepare for and protect itself against natural disasters. The current research uses a cross-disciplinary approach in which the role of education (both formal and informal) and indigenous knowledge is explored with reference to disaster preparedness. The data were collected from the flood-prone rural areas of Punjab. In the absence of a disaster curriculum taught in formal schools, informal education disseminated by NGOs and relief and rehabilitation agencies was the only education given to the flood victims. However, the educational attainment of flood victims correlated strongly with their awareness of flood management and disaster preparedness. Moreover, lessons learned from past flood experience generated indigenous knowledge on the basis of which flood victims prepared themselves for any uncertainty. If future policy on disaster preparation integrates indigenous knowledge and then delivers education on that basis, it is anticipated that flood devastation can be much reduced. Education can play a vital role in amplifying risk perception and encouraging precautionary measures for disasters. The findings of the current research will provide practical strategies where disaster preparedness through education has not yet been applied.
Keywords: education, disaster preparedness, illiterate population, risk management
Procedia PDF Downloads 486
28174 Optimization of Air Pollution Control Model for Mining
Authors: Zunaira Asif, Zhi Chen
Abstract:
Sustainable air quality management is recognized as one of the most serious environmental concerns in mining regions. Mining operations emit various types of pollutants that have significant impacts on the environment. This study presents a stochastic control strategy, developing an air pollution control model to achieve a cost-effective solution. The optimization method is formulated to predict the cost of treatment using linear programming with an objective function and multiple constraints. The constraints mainly address two factors: the production of metal should not exceed the available resources, and air quality should meet the standard criteria for each pollutant. The applicability of the model is explored through a case study of an open-pit metal mine in Utah, USA. The method simultaneously uses meteorological data as a dispersion transfer function to reflect the practical local conditions. The probabilistic analysis of the uncertainties in the meteorological conditions is accomplished by Monte Carlo simulation. Reasonable results have been obtained for selecting the optimized treatment technology for PM2.5, PM10, NOx, and SO2. An additional comparison analysis shows that the baghouse is the least-cost option for particulate matter compared with the electrostatic precipitator and wet scrubbers, whereas non-selective catalytic reduction and dry flue-gas desulfurization are suitable for NOx and SO2 reduction, respectively. Thus, this model can aid planners in reducing these pollutants at marginal cost by suggesting pollution control devices, while accounting for dynamic meteorological conditions and mining activities.
Keywords: air pollution, linear programming, mining, optimization, treatment technologies
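A minimal sketch of the kind of least-cost technology selection linear program described above, solved with scipy.optimize.linprog; the costs, removal efficiencies, and emission limits are invented for illustration and are not taken from the study.
```python
import numpy as np
from scipy.optimize import linprog

# Decision variables: fraction of flue gas routed to each control device.
devices = ["baghouse", "ESP", "wet scrubber"]
annual_cost = np.array([1.2e6, 1.8e6, 2.4e6])    # $/yr at full utilisation (illustrative)

pm_removal = np.array([0.995, 0.97, 0.93])        # PM removal efficiency per device
pm_uncontrolled = 5000.0                          # t/yr of PM entering the controls
pm_limit = 150.0                                  # t/yr allowed emission

# Minimise cost subject to: emitted PM <= limit, fractions sum to 1, fractions in [0, 1].
# Emitted PM = uncontrolled * sum_i x_i * (1 - eff_i)
c = annual_cost
A_ub = [pm_uncontrolled * (1.0 - pm_removal)]
b_ub = [pm_limit]
A_eq = [np.ones(3)]
b_eq = [1.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=[(0, 1)] * 3)
print(res.status, dict(zip(devices, np.round(res.x, 3))), "cost $%.0f" % res.fun)
```
With these illustrative numbers the solver routes everything to the baghouse, which mirrors the abstract's finding that the baghouse is the least-cost option for particulate matter.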
Procedia PDF Downloads 208
28173 Determinants of Profit Efficiency among Poultry Egg Farmers in Ondo State, Nigeria: A Stochastic Profit Function Approach
Authors: Olufunke Olufunmilayo Ilemobayo, Barakat. O Abdulazeez
Abstract:
Profit making among poultry egg farmers has been a challenge for the efficient allocation of scarce farm resources over the years, owing mainly to a low capital base, inefficient management, and technical and economic inefficiency; poultry egg production has therefore moved into an underperforming situation characterised by low profit margins. Previous studies have focused mainly on broiler production and its efficiency, however, and there is a paucity of information on profit efficiency in the study area. Hence, the determinants of profit efficiency among poultry egg farmers in Ondo State, Nigeria, were investigated. A purposive sampling technique was used to obtain primary data from poultry egg farmers in the Owo and Akure local government areas of Ondo State through a well-structured questionnaire. Socio-economic characteristics such as age, gender, educational level, marital status, household size, access to credit, and extension contact, together with input and output data such as flock size, cost of feeders and drinkers, cost of feed, cost of labour, cost of drugs and medication, cost of energy, price per crate of table eggs, and price of spent layers, were the variables used in the study. Data were analysed using descriptive statistics, budgeting analysis, and a stochastic profit function/inefficiency model. The descriptive statistics show that 52 per cent of the poultry farmers were aged 31-40 years, 62 per cent were male, 90 per cent had tertiary education, 66 per cent were primarily poultry farmers, 78 per cent were the original poultry farm owners, and 55 per cent had more than 5 years' work experience. Descriptive statistics on costs and returns indicated that 64 per cent of the returns came from egg sales, while the remaining 36 per cent came from sales of spent layers. The cost of feeding took the highest proportion of the cost of production (69 per cent) and the cost of medication the lowest (7 per cent). A positive gross margin of ₦5,518,869.76, a net farm income of ₦5,500,446.82, and a net return on investment of 0.28 indicated that poultry egg production is profitable. Equipment cost (22.757), feeding cost (18.3437), labour cost (136.698), flock size (16.209), and drug and medication cost (4.509) were factors affecting profit efficiency, while education (-2.3143), household size (-18.4291), access to credit (-16.027), and experience (-7.277) were determinants of profit efficiency. Education, household size, access to credit, and experience in poultry production were the main determinants of profit efficiency of poultry egg production in Ondo State. Other factors that affect profit efficiency were the cost of feeding, cost of labour, flock size, and cost of drugs and medication; these positively and significantly influenced profit efficiency in Ondo State, Nigeria.
Keywords: cost and returns, economic inefficiency, profit margin, technical inefficiency
Procedia PDF Downloads 129
28172 Gender Based Variability Time Series Complexity Analysis
Authors: Ramesh K. Sunkaria, Puneeta Marwaha
Abstract:
Nonlinear methods of heart rate variability (HRV) analysis are becoming more popular. It has been observed that complexity measures quantify the regularity and uncertainty of cardiovascular RR-interval time series. In the present work, SampEn has been evaluated in healthy Normal Sinus Rhythm (NSR) male and female subjects for different data lengths and tolerance levels r. It is demonstrated that SampEn is small for higher values of the tolerance r. Also, the SampEn value of the healthy female group is higher than that of the healthy male group for short data lengths; as the data length increases, the two groups overlap and become difficult to distinguish. SampEn gives inaccurate results by assigning a higher value to the female group, because male subjects have a more complex HRV pattern than female subjects. Therefore, this traditional algorithm exhibits higher complexity for healthy female subjects than for healthy male subjects, which is a misleading observation. This may be due to the fact that SampEn does not account for the multiple time scales inherent in physiologic time series, so the hidden spatial and temporal fluctuations remain unexplored.
Keywords: heart rate variability, normal sinus rhythm group, RR interval time series, sample entropy
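A minimal sketch of sample entropy as commonly defined (template length m, tolerance r as a fraction of the series standard deviation, Chebyshev distance, self-matches excluded); the RR series here is synthetic and the parameter choices are conventional defaults, not the study's data or settings.
```python
import numpy as np

def sample_entropy(x, m=2, r_frac=0.2):
    """SampEn(m, r, N) = -ln(A / B), where A and B count template matches
    of length m+1 and m within tolerance r (Chebyshev distance)."""
    x = np.asarray(x, dtype=float)
    r = r_frac * np.std(x)

    def count_matches(mm):
        templates = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        count = 0
        for i in range(len(templates)):
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(d <= r)
        return count

    B = count_matches(m)
    A = count_matches(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

rng = np.random.default_rng(0)
rr = 0.8 + 0.05 * np.sin(np.arange(500) / 10.0) + 0.02 * rng.normal(size=500)
print(sample_entropy(rr, m=2, r_frac=0.2))
```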
Procedia PDF Downloads 282
28171 Modern Well Logs Technology to Improve Geological Model for Libyan Deep Sand Stone Reservoir
Authors: Tarek S. Duzan, Fisal Ben Ammer, Mohamed Sula
Abstract:
In some places within the Sirt Basin, Libya, it has been noticed that seismic data below the pre-Upper Cretaceous unconformity (PUK) cannot resolve the large-scale structural features and cannot fully determine the reservoir delineation. Seismic artifacts (multiples) are observed in the reservoir zone (Nubian Formation) below the PUK, which complicate the process of seismic interpretation. The nature of the unconformity and the structures below it are still ambiguous and not fully understood, which generates a significant gap in characterizing the geometry of the reservoir; this uncertainty, accompanied by the lack of reliable seismic data, creates difficulties in building a robust geological model. A high-resolution dipmeter is highly useful in steeply dipping zones. This paper uses FMI and OBMI borehole images (dipmeter) to analyze the structures below the PUK unconformity in two wells drilled recently in the North Gialo field (a mature reservoir). In addition, the borehole images introduce new evidence that the PUK unconformity is angular and that the bedding planes within the Nubian Formation (below the PUK) are significantly tilted. Structural dips extracted from the high-resolution borehole images are used to construct a new geological model with the latest software technology. Therefore, it is important to use advanced well-log technology such as FMI-HD for any future drilling and to update the existing model in order to minimize the structural uncertainty.
Keywords: FMI (formation micro imager), OBMI (oil base mud imager), UBI (ultrasonic borehole imager), Nubian sandstone reservoir in North Gialo
Procedia PDF Downloads 319
28170 Identification of Factors Affecting Technical Efficiency Sugar Cane Farming in East Java
Authors: Noor Rizkiyah, Djoko Koestiono, Budi Setiawan, Nuhfil Hanani
Abstract:
This research aims to identify the factors that affect sugarcane production, the level of technical efficiency of ratoon sugarcane farming, and the factors that affect technical inefficiency. The research was carried out in Malang, East Java, with proportionally stratified non-random sampling, and obtained 172 sugarcane farming households classified by level of ratooning, i.e., ratooning I (3-4 ratoons), ratooning II (5-10 ratoons), and ratooning III (more than 10 ratoons). The method used is the stochastic production frontier approach with maximum likelihood estimation (MLE). The results show that the factors affecting production in ratoon sugarcane farming are land area, the use of ZA and Petroganic fertilizer, the use of supplementary seedlings, and labor. The average technical efficiency of the ratoon sugarcane farmers is 0.78, which is categorized as not yet technically efficient. The factors that influence technical inefficiency are age, number of family dependents, and the frequency of ratooning. Although not yet technically efficient, sugarcane farmers continue to practice ratoon cultivation; however, repeated ratooning will result in a decrease in sugarcane production. The farm-level feasibility analysis gives an R/C ratio for ratoon sugarcane of 1.15, so the practice is worth undertaking. Thus, improved technology and a better combination of inputs are needed so that technical efficiency can be achieved and the farming becomes more worthwhile to organise.
Keywords: technical efficiency, production, sugarcane, frontier
Procedia PDF Downloads 172
28169 Quadrature Mirror Filter Bank Design Using Population Based Stochastic Optimization
Authors: Ju-Hong Lee, Ding-Chen Chung
Abstract:
The paper deals with the optimal design of two-channel linear-phase (LP) quadrature mirror filter (QMF) banks using a metaheuristic-based optimization technique. Based on the theory of two-channel QMF banks using two recursive digital all-pass filters (DAFs), the design problem is formulated to yield an objective function that is a weighted sum of the group delay error of the designed QMF bank and the magnitude response error of the designed low-pass analysis filter. Through a frequency-sampling and weighted least squares approach, the optimization problem for this objective function can be solved using a particle swarm optimization algorithm. The resulting two-channel QMF banks can possess an approximately LP response without magnitude distortion. Simulation results are presented for illustration and comparison.
Keywords: quadrature mirror filter bank, digital all-pass filter, weighted least squares algorithm, particle swarm optimization
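A minimal sketch of the particle swarm optimization loop used for this kind of weighted least squares design problem; the toy objective stands in for the actual QMF-bank error function, and the constants (swarm size, inertia, acceleration coefficients, bounds) are conventional defaults, not the paper's settings.
```python
import numpy as np

rng = np.random.default_rng(0)

def objective(x):
    """Stand-in for the weighted LS design error (group delay + magnitude terms)."""
    return float(np.sum((x - np.linspace(-1.0, 1.0, x.size)) ** 2))

def pso(fun, dim, n_particles=30, iters=200, w=0.72, c1=1.49, c2=1.49, bound=2.0):
    pos = rng.uniform(-bound, bound, size=(n_particles, dim))
    vel = np.zeros_like(pos)
    pbest, pbest_val = pos.copy(), np.array([fun(p) for p in pos])
    gbest = pbest[np.argmin(pbest_val)].copy()

    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, -bound, bound)
        vals = np.array([fun(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, float(pbest_val.min())

best_x, best_f = pso(objective, dim=8)
print("best objective:", best_f)
```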
Procedia PDF Downloads 521
28168 Probability-Based Damage Detection of Structures Using Model Updating with Enhanced Ideal Gas Molecular Movement Algorithm
Authors: M. R. Ghasemi, R. Ghiasi, H. Varaee
Abstract:
Model updating methods have received increasing attention for damage detection in structures based on measured modal parameters. Therefore, a probability-based damage detection (PBDD) procedure based on a model updating procedure is presented in this paper, in which a one-stage model-based damage identification technique based on the dynamic features of a structure is investigated. The presented framework uses a finite element updating method with a Monte Carlo simulation that considers the uncertainty caused by measurement noise. Enhanced ideal gas molecular movement (EIGMM) is used as the main algorithm for model updating. Ideal gas molecular movement (IGMM) is a multi-agent algorithm based on the movement of ideal gas molecules. Ideal gas molecules disperse rapidly in different directions and cover all of the available space; this behaviour stems from the high speed of the molecules and their collisions with each other and with the surrounding barriers. In the IGMM algorithm, to reach optimal solutions, the initial population of gas molecules is randomly generated, and the governing equations for the velocity of gas molecules and for collisions between them are utilized. In this paper, an enhanced version of IGMM, which removes unchanged variables after a specified number of iterations, is developed. The proposed method is implemented on two numerical examples in the field of structural damage detection. The results show that the proposed method performs well and competitively in the PBDD of structures.
Keywords: enhanced ideal gas molecular movement (EIGMM), ideal gas molecular movement (IGMM), model updating method, probability-based damage detection (PBDD), uncertainty quantification
Procedia PDF Downloads 277
28167 Analyzing the Practicality of Drawing Inferences in Automation of Commonsense Reasoning
Authors: Chandan Hegde, K. Ashwini
Abstract:
Commonsense reasoning is the simulation of the human ability to make decisions in the situations we encounter every day. Several decades have passed since the introduction of this subfield of artificial intelligence, yet it has barely made any significant progress. Modern computing aids have also remained powerless in this regard, owing to the absence of a strong methodology for the development of commonsense reasoning. Among the several reasons accountable for the lack of progress, drawing inferences from a commonsense knowledge base stands out. This review paper emphasizes a detailed analysis of the representation of reasoning uncertainties and the feasible prospects of programming aids for drawing inferences. The difficulties in deducing and systematizing commonsense reasoning, and the substantial progress made in reasoning that influences this study, are also discussed. Additionally, the paper discusses the possible impacts of an effective inference technique on commonsense reasoning.
Keywords: artificial intelligence, commonsense reasoning, knowledge base, uncertainty in reasoning
Procedia PDF Downloads 187
28166 Tracing Sources of Sediment in an Arid River, Southern Iran
Authors: Hesam Gholami
Abstract:
Elevated suspended sediment loads in riverine systems, resulting from accelerated erosion due to human activities, are a serious threat to the sustainable management of watersheds and the ecosystem services therein worldwide. Therefore, mitigation of deleterious sediment effects as a distributed or non-point pollution source in catchments requires reliable provenance information. Sediment tracing or sediment fingerprinting, a combined process consisting of sampling, laboratory measurements, different statistical tests, and the application of mixing or unmixing models, is a useful technique for discriminating the sources of sediments. From 1996 to the present, different aspects of this technique, such as grouping of the sources (spatial and individual sources), discriminating the potential sources by different statistical techniques, and modification of mixing and unmixing models, have been introduced and modified by many researchers worldwide and have been applied to identify the provenance of fine materials in agricultural, rural, mountainous, and coastal catchments, and in large catchments with numerous lakes and reservoirs. In the last two decades, efforts exploring the uncertainties associated with sediment fingerprinting results have attracted increasing attention. The frameworks used to quantify the uncertainty associated with fingerprinting estimates can be divided into three groups comprising Monte Carlo simulation, Bayesian approaches, and generalized likelihood uncertainty estimation (GLUE). Given the above background, the primary goal of this study was to apply geochemical fingerprinting within the GLUE framework to estimate sub-basin spatial sediment source contributions in the arid Mehran River catchment in southern Iran, which drains into the Persian Gulf. The accuracy of GLUE predictions generated using four different sets of statistical tests for discriminating three sub-basin spatial sources was evaluated using 10 virtual sediment (VS) samples with known source contributions, in terms of the root mean square error (RMSE) and mean absolute error (MAE). Based on the results, the contributions modeled by GLUE for the western, central, and eastern sub-basins are 1-42% (overall mean 20%), 0.5-30% (overall mean 12%), and 55-84% (overall mean 68%), respectively. According to the mean absolute fit (MAF; ≥ 95% for all target sediment samples) and goodness-of-fit (GOF; ≥ 99% for all samples), our suggested modeling approach is an accurate technique for quantifying the sources of sediments in catchments. Overall, the estimated source proportions can help watershed engineers plan the targeting of conservation programs for soil and water resources.
Keywords: sediment source tracing, generalized likelihood uncertainty estimation, virtual sediment mixtures, Iran
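A minimal sketch of the GLUE-style source unmixing idea: draw many candidate source-contribution vectors, score each against the target sediment's tracer concentrations with a goodness-of-fit measure, keep the "behavioral" ones, and summarize their likelihood-weighted contributions; the tracer values, threshold, and Dirichlet sampler are illustrative assumptions, not the study's data or tests.
```python
import numpy as np

rng = np.random.default_rng(0)

# Mean tracer concentrations for three spatial sources (rows) and five tracers (cols).
sources = np.array([[12.0, 3.1, 45.0, 0.8, 210.0],    # western sub-basin
                    [ 9.5, 2.4, 60.0, 1.1, 180.0],    # central sub-basin
                    [15.2, 4.0, 38.0, 0.6, 260.0]])   # eastern sub-basin
target = np.array([13.8, 3.6, 42.0, 0.7, 240.0])      # target sediment sample

n_draws, threshold = 100_000, 0.85
props = rng.dirichlet(np.ones(sources.shape[0]), size=n_draws)   # candidate proportions
predicted = props @ sources

# Goodness of fit: 1 - mean relative absolute error between predicted and observed tracers.
gof = 1.0 - np.mean(np.abs(predicted - target) / target, axis=1)
behavioral = gof >= threshold

weights = gof[behavioral] / gof[behavioral].sum()
mean_contrib = weights @ props[behavioral]
lo, hi = np.percentile(props[behavioral], [5, 95], axis=0)

for name, m, l, h in zip(["western", "central", "eastern"], mean_contrib, lo, hi):
    print(f"{name}: mean {m:.2f}, 5-95% range {l:.2f}-{h:.2f}")
```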
Procedia PDF Downloads 74