Search results for: stochastic uncertainty analysis
27995 Commodity Price Shocks and Monetary Policy
Authors: Faisal Algosair
Abstract:
We examine the role of monetary policy in the presence of commodity price shocks using a dynamic stochastic general equilibrium (DSGE) model with price and wage rigidities. The model characterizes a commodity exporter by its degree of export diversification and explores the following monetary regimes: flexible domestic inflation targeting; flexible Consumer Price Index inflation targeting; exchange rate peg; and an optimal rule. An increase in the degree of diversification is found to mitigate responses to commodity shocks. The welfare comparison suggests that a flexible exchange rate regime under the optimal rule is preferred to an exchange rate peg. However, monetary policy provides limited stabilization effects in an economy with a low degree of export diversification.
Keywords: business cycle, commodity price, exchange rate, global financial cycle
Procedia PDF Downloads 94
27994 Conceptualizing the Cyber Insecurity Risk in the Ethics of Automated Warfare
Authors: Otto Kakhidze, Hoda Alkhzaimi, Adam Ramey, Nasir Memon
Abstract:
This paper provides an alternative, cybersecurity-based conceptual framework for the ethics of automated warfare. The large body of work on fully or partially autonomous warfare systems tends to overlook malicious security factors, such as the possibility of technical attacks on these systems, when it comes to moral and legal decision-making. The argument provides a risk-oriented justification for why technical malicious risks cannot be dismissed in legal, ethical, and policy considerations when warfare models are implemented and deployed. The assumptions of the paper are supported by a broader model that examines technological vulnerabilities through the lenses of game theory, just war theory, and standard and non-standard defense ethics. The paper argues that a conventional risk-benefit analysis that does not consider ethical factors is insufficient for making legal and policy decisions on automated warfare. This approach provides a substructure for security and defense experts, as well as legal scholars, ethicists, and decision theorists, to work towards common justificatory grounds that accommodate the technical security concerns overlooked in current legal and policy models.
Keywords: automated warfare, ethics of automation, inherent hijacking, security vulnerabilities, risk, uncertainty
Procedia PDF Downloads 356
27993 Effective Scheduling of Hybrid Reconfigurable Microgrids Considering High Penetration of Renewable Sources
Authors: Abdollah Kavousi Fard
Abstract:
This paper addresses the optimal scheduling of hybrid reconfigurable microgrids considering hybrid electric vehicle charging demands. A stochastic framework based on the unscented transform is proposed to model the high uncertainty of renewable energy sources, including wind turbines and photovoltaic panels, as well as the hybrid electric vehicles' charging demand. To reach the optimal schedule, network reconfiguration is employed as an effective tool for changing the power supply path and avoiding possible congestion. The simulation results are analyzed and discussed in three different scenarios: coordinated, uncoordinated, and smart charging demand of hybrid electric vehicles. A typical grid-connected microgrid is employed to demonstrate the satisfactory performance of the proposed method.
Keywords: microgrid, renewable energy sources, reconfiguration, optimization
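The unscented transform used in the abstract above can be sketched in a few lines: it propagates a mean and covariance through a nonlinearity via 2n+1 deterministically chosen sigma points. This is a minimal, generic illustration, not the paper's implementation; the cubic "power-curve" nonlinearity and all numbers are assumptions for demonstration only.

```python
import numpy as np

def unscented_transform(mu, P, f, kappa=0.0):
    """Propagate mean mu and covariance P through a nonlinearity f using
    the 2n+1 sigma points of the standard unscented transform."""
    n = len(mu)
    S = np.linalg.cholesky((n + kappa) * P)   # scaled matrix square root
    sigma = [mu] + [mu + S[:, i] for i in range(n)] + [mu - S[:, i] for i in range(n)]
    w = np.full(2 * n + 1, 1.0 / (2 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    y = np.array([f(x) for x in sigma])       # push each point through f
    mean = w @ y
    cov = sum(wi * np.outer(yi - mean, yi - mean) for wi, yi in zip(w, y))
    return mean, cov

# illustrative: wind speed ~ N(8, 1) through a cubic power-type curve;
# with n + kappa = 3 the transform matches the Gaussian third moment,
# so the mean equals E[v^3] = 8^3 + 3*8 = 536 exactly
m_out, c_out = unscented_transform(np.array([8.0]), np.array([[1.0]]),
                                   lambda v: v ** 3, kappa=2.0)
```

The heuristic choice n + kappa = 3 is the usual one for Gaussian inputs; for the paper's multivariate wind, solar, and charging uncertainties the same routine would simply be called with larger mu and P.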
Procedia PDF Downloads 269
27992 European Countries' Challenges in Value Added Tax
Authors: Fatbardha Kadiu, Nulifer Caliskan
Abstract:
The value added tax arose as a necessary substitute for the old tax on sales. Given its advantages, this tax is now used successfully in more than 140 countries around the world. The aim of the paper is to describe the nature of this tax, with its advantages and disadvantages. It also describes how the tax functions in most European countries and the current challenges these countries face with the value added tax. The paper presents the types of goods that are exempt from this tax, along with the reasons for and consequences of those exemptions. The paper is based on secondary data taken from the relevant literature. An econometric model is presented in order to identify the dependence of the value added tax on other parameters. The analysis mostly concerns the two main principles of harmonization and billing in the fiscal system and ways to restructure the system in order to minimize fiscal evasion.
Keywords: value added tax, revenues, complexity, legal uncertainty
Procedia PDF Downloads 397
27991 Overview of Risk Management in Electricity Markets Using Financial Derivatives
Authors: Aparna Viswanath
Abstract:
Electricity spot prices are highly volatile under optimal generation capacity scenarios due to factors such as the non-storability of electricity, peak demand at certain periods, generator outages, fuel uncertainty for renewable energy generators, and the huge investments and time needed for generation capacity expansion. As a result, market participants are exposed to price and volume risk, which has led to the development of risk management practices. This paper provides an overview of risk management practices by market participants in electricity markets using financial derivatives.
Keywords: financial derivatives, forward, futures, options, risk management
Procedia PDF Downloads 476
27990 Disruption Coordination of Supply Chain with Loss-Averse Retailer Under Buy-Back Contract
Abstract:
This paper investigates a two-stage supply chain with one leading supplier and one following retailer that experiences perturbations in two of the following factors: the supplier's production cost, the retailer's marginal cost, and the retail price, in a stochastic demand environment. While the risk-neutral setting has long been discussed, little attention has been given to disruptions under the premise of a risk-neutral supplier and a risk-averse retailer. We establish the optimal order quantity and reveal the profit distribution coefficient in the risk-neutral static model, make adjustments under the disruption scenario, and then adopt the utility function method for the risk-aversion model. Under a buy-back contract policy, the adjustment of parameters can achieve channel coordination in which Pareto optimality is realized.
Keywords: supply chain coordination, disruption management, buy-back contract, loss aversion
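For the risk-neutral benchmark mentioned above, the optimal order quantity under a buy-back contract follows the classical newsvendor critical fractile F(q*) = (p - w)/(p - b), where the retailer pays wholesale price w, sells at retail price p, and returns unsold units to the supplier for b (b < w < p). The sketch below assumes normally distributed demand, and all prices and demand parameters are illustrative, not taken from the paper.

```python
from statistics import NormalDist

def buyback_order_quantity(p, w, b, demand_mu, demand_sigma):
    """Risk-neutral newsvendor order quantity under a buy-back contract:
    underage cost p - w, overage cost w - b, so the critical fractile is
    F(q*) = (p - w) / (p - b), inverted through the demand distribution."""
    critical_ratio = (p - w) / (p - b)
    return NormalDist(demand_mu, demand_sigma).inv_cdf(critical_ratio)

# illustrative numbers: retail 10, wholesale 6, buy-back 3, demand ~ N(100, 20)
q_star = buyback_order_quantity(10, 6, 3, 100, 100 ** 0.5 * 2)  # sigma = 20
```

Raising the buy-back price b raises the critical ratio and hence the order quantity, which is the lever the supplier uses to coordinate the channel; the paper's contribution is how this adjustment changes under disruption and retailer loss aversion.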
Procedia PDF Downloads 325
27989 Regional Hydrological Extremes Frequency Analysis Based on Statistical and Hydrological Models
Authors: Hadush Kidane Meresa
Abstract:
Hydrological extremes frequency analysis is the foundation for hydraulic engineering design, flood protection, drought management, and water resources management and planning, enabling the available water resources to meet the objectives of different organizations and sectors in a country. The spatial variation of the statistical characteristics of extreme flood and drought events is key to regional flood and drought analysis and mitigation management. In hydro-climatic regions where the data sets are short, scarce, of poor quality, or insufficient, regionalization methods are applied to transfer at-site data to a region. This study performs regional high- and low-flow frequency analysis for Polish river basins. The frequent occurrence of hydrological extremes in the region and rapid water resources development in these basins have caused serious concern over the flood and drought magnitudes and frequencies of rivers in Poland. The magnitude and frequency of high and low flows in the basins are needed for flood and drought planning, management, and protection, both now and in the future. Hydrologically homogeneous high- and low-flow regions are formed by cluster analysis of site characteristics, using hierarchical and C-means clustering together with PCA. Statistical tests for regional homogeneity, namely discordancy and heterogeneity measures, are applied. On the basis of these tests, the river basins have been divided into ten homogeneous regions. Frequency analysis of high and low flows, using annual maximum (AM) series for high flows and 7-day minimum series for low flows, is conducted with six statistical distributions.
The L-moment and LL-moment methods showed a homogeneous region over the entire province, with the Generalized Logistic (GLOG), Generalized Extreme Value (GEV), Pearson Type III (P-III), Generalized Pareto (GPAR), Weibull (WEI), and Power (PR) distributions as the regional drought and flood frequency distributions. The 95th percentile and flow duration curves of 1, 7, 10, and 30 days have been plotted for 10 stations. The cluster analysis, however, produced two regions in the west and east of the province, where the L-moment and LL-moment methods demonstrated the homogeneity of the regions, with the GLOG and Pearson Type III (P-III) distributions as the regional frequency distributions for each region, respectively. The spatial variation and regional frequency distributions of flood and drought characteristics were derived for the 10 best catchments selected from the whole region; besides the main variable (high and low streamflow), we used variables related to physiographic and drainage characteristics to identify and delineate homogeneous pools and to derive the best regression models for ungauged sites. These are mean annual rainfall, seasonal flow, average slope, NDVI, aspect, flow length, flow direction, maximum soil moisture, elevation, and drainage order. The regional relationship between one streamflow characteristic (AM or 7-day mean annual low flows) and basin characteristics is developed using the Generalized Linear Mixed Model (GLMM) and Generalized Least Squares (GLS) regression, providing a simple and effective method for estimating floods and droughts of desired return periods for ungauged catchments.
Keywords: flood, drought, frequency, magnitude, regionalization, stochastic, ungauged, Poland
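The sample L-moments on which the homogeneity tests and distribution fitting rest can be computed from probability-weighted moments. A minimal sketch of the sample estimators only, not the full regionalization procedure:

```python
from math import comb

def sample_l_moments(data):
    """First three sample L-moments via unbiased probability-weighted
    moments b_r = n^-1 * sum_i C(i, r)/C(n-1, r) * x_(i), with x sorted
    ascending and i zero-based."""
    x = sorted(data)
    n = len(x)
    b = [sum(comb(i, r) / comb(n - 1, r) * x[i] for i in range(r, n)) / n
         for r in range(3)]
    l1 = b[0]                        # L-location (the sample mean)
    l2 = 2 * b[1] - b[0]             # L-scale
    l3 = 6 * b[2] - 6 * b[1] + b[0]  # L-skewness ratio is t3 = l3 / l2
    return l1, l2, l3

# symmetric toy sample: mean 3, L-scale 1, zero L-skewness
l1, l2, l3 = sample_l_moments([1, 2, 3, 4, 5])
```

The L-moment ratios t = l2/l1 and t3 = l3/l2 computed per site are what feed the discordancy and heterogeneity measures mentioned above.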
Procedia PDF Downloads 600
27988 The Analysis of Regulation on Sustainability in the Financial Sector in Lithuania
Authors: Dalia Kubiliūtė
Abstract:
Lithuania is known as a trusted location for global business institutions, and it attracts investors with its competitive environment for financial service providers. Along with the aspiration to offer a strong, results-oriented, and innovation-driven environment for financial service providers, Lithuanian regulatory authorities consistently implement the European Union's high regulatory standards for financial activities, including sustainability-related disclosures. Since the European Union directed its policy towards the transition to a climate-neutral, green, competitive, and inclusive economy, additional regulatory requirements for financial market participants have been adopted: disclosure of sustainable activities, transparency, prevention of greenwashing, etc. The financial sector is one of the key factors influencing the implementation of sustainability objectives in European Union policies and mitigating the negative effects of climate change: public funds are not enough to make a significant impact on sustainable investment, so directing public and private capital to green projects may help finance the necessary changes. The topic of the study is original and has not yet been widely analyzed in Lithuanian legal discourse. Quantitative and qualitative methodologies and logical, systematic, and critical analysis principles are used; the aim of this study is to reveal the problems of implementing the regulation on sustainability in the Lithuanian financial sector. Additional regulatory requirements could cause serious changes in financial business operations: additional funds, employees, and time have to be dedicated so that companies can implement these regulations. A lack of knowledge and data on how to implement the new regulatory requirements for sustainability reporting causes much uncertainty for financial market participants, and for some companies it might even be an essential point in terms of business continuity. It is considered that the supervisory authorities should find a balance between financial market needs and legal regulation.
Keywords: financial, legal, regulatory, sustainability
Procedia PDF Downloads 102
27987 Fast Terminal Sliding Mode Controller for Quadrotor UAV
Authors: Vahid Tabrizi, Reza GHasemi, Ahmadreza Vali
Abstract:
This paper presents a robust nonlinear control law for a quadrotor UAV using fast terminal sliding mode control. The fast terminal sliding mode idea is used to introduce a nonlinear sliding variable that guarantees finite-time convergence in the sliding phase. Then, in the reaching phase, a continuous approximation is used to remove chattering and produce a smooth control signal. Simulation results show that the proposed algorithm is robust against parameter uncertainty and performs better than conventional sliding mode control for a quadrotor UAV.
Keywords: quadrotor UAV, fast terminal sliding mode, second-order sliding mode
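The finite-time convergence claimed above can be illustrated on the sliding dynamic itself, xdot = -a*x - b*|x|^(q/p)*sgn(x): the linear term dominates far from the origin, while the fractional power term forces the state to zero in the finite time (p/(a(p-q))) * ln((a*x0^((p-q)/p) + b)/b). This is a generic scalar numerical sketch of the fast terminal attractor, not the paper's quadrotor controller; all gains are illustrative.

```python
def terminal_dynamics(x0, alpha, beta, gamma, dt=1e-3, tmax=10.0):
    """Forward-Euler integration of xdot = -alpha*x - beta*|x|**gamma*sgn(x),
    the fast terminal sliding dynamic, with 0 < gamma = q/p < 1.
    Returns the final state and the time at which |x| drops below 1e-6."""
    x, t = float(x0), 0.0
    while t < tmax and abs(x) > 1e-6:
        sgn = 1.0 if x > 0 else -1.0
        x += dt * (-alpha * x - beta * abs(x) ** gamma * sgn)
        t += dt
    return x, t

# alpha = beta = 1, gamma = 1/2: analytic reaching time is 2*ln(2) ~ 1.386
x_end, t_reach = terminal_dynamics(1.0, alpha=1.0, beta=1.0, gamma=0.5)
```

A purely linear sliding surface (beta = 0) would only converge exponentially, never exactly reaching zero; the fractional power is what buys finite-time convergence.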
Procedia PDF Downloads 547
27986 Regeneration of Geological Models Using Support Vector Machine Assisted by Principal Component Analysis
Authors: H. Jung, N. Kim, B. Kang, J. Choe
Abstract:
History matching is a crucial procedure for predicting reservoir performance and making future decisions. However, it is difficult due to the uncertainties of initial reservoir models. It is therefore important to have reliable initial models for successful history matching of highly heterogeneous reservoirs such as channel reservoirs. In this paper, we propose a novel scheme for regenerating geological models using a support vector machine (SVM) and principal component analysis (PCA). First, we perform PCA to identify the main geological characteristics of the models. Through this procedure, the permeability values of each model are transformed into new parameters by the principal components with eigenvalues of large magnitude. Second, the parameters are projected onto a two-dimensional plane by multi-dimensional scaling (MDS) based on Euclidean distances. Finally, we train an SVM classifier using the 20% of models whose well oil production rates (WOPR) are most similar or dissimilar to the true values (10% each). The remaining 80% of models are then classified by the trained SVM, and we select the models with low WOPR errors. One hundred channel reservoir models are initially generated by single normal equation simulation. By repeating the classification process, we can select models that share the geological trend of the true reservoir model. The average field of the selected models is used as a probability map for regeneration. Newly generated models preserve correct channel features and exclude wrong geological properties while maintaining suitable uncertainty ranges. History matching with the initial models cannot provide trustworthy results: it fails to identify the correct geological features of the true model. History matching with the regenerated ensemble, however, offers reliable characterization results by capturing the proper channel trend, and it gives dependable predictions of future performance with reduced uncertainties.
We propose a novel classification scheme that integrates PCA, MDS, and SVM for regenerating reservoir models. The scheme can easily sort out reliable models that share the reference channel trend in the lowered-dimension space.
Keywords: history matching, principal component analysis, reservoir modelling, support vector machine
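The first step of the scheme, extracting principal components from an ensemble of permeability fields, can be sketched with a plain SVD-based PCA; the MDS projection and SVM classifier would then operate on these scores. Shapes, names, and the random ensemble below are illustrative, not from the paper.

```python
import numpy as np

def pca_scores(models, k):
    """Project flattened permeability fields (one row per model) onto the
    k leading principal components via SVD of the centered data matrix."""
    X = np.asarray(models, dtype=float)
    Xc = X - X.mean(axis=0)                  # center each grid cell
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:k].T                   # low-dimensional coordinates
    explained = s[:k] ** 2 / np.sum(s ** 2)  # variance ratio per component
    return scores, explained

rng = np.random.default_rng(0)
ensemble = rng.normal(size=(100, 2500))      # e.g. 100 models on a 50x50 grid
scores, explained = pca_scores(ensemble, k=10)
```

Keeping only components with large singular values is what the abstract means by "principal components, which have eigenvalues of large magnitude": the squared singular values of the centered matrix are proportional to the covariance eigenvalues.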
Procedia PDF Downloads 158
27985 Environment-Specific Political Risk Discourse, Environmental Reputation, and Stock Price Crash Risk
Authors: Sohanur Rahman, Elisabeth Sinnewe, Larelle (Ellie) Chapple, Sarah Osborne
Abstract:
Greater political attention to global climate change exposes firms to a higher level of political uncertainty, which can lead to adverse capital market consequences. However, a higher level of discourse on environment-specific political risk (EPR) between management and investors can mitigate information asymmetry, leading to less stock price crash risk. This study examines whether EPR discourse in earnings conference calls (ECC) reduces firm-level stock price crash risk in the US market. It also explores whether adverse disclosures via media channels moderate the association between EPR discourse and crash risk. Employing a dataset of 28,933 firm-year observations from 2002 to 2020, the empirical analysis reveals that EPR discourse in ECC reduces future stock price crash risk. However, adverse disclosures via media channels can offset the favourable effect of EPR discourse on crash risk. The results are robust to potential endogeneity concerns in a quasi-natural experiment setting.
Keywords: earnings conference calls, environment, environment-specific political risk discourse, environmental disclosures, information asymmetry, reputation risk, stock price crash risk
Procedia PDF Downloads 139
27984 Conservativeness of Probabilistic Constrained Optimal Control Method for Unknown Probability Distribution
Authors: Tomoaki Hashimoto
Abstract:
In recent decades, probabilistic constrained optimal control problems have attracted much attention in many research fields. Although probabilistic constraints are generally intractable in an optimization problem, several tractable methods have been proposed to handle them. In most methods, probabilistic constraints are reduced to deterministic constraints that are tractable in an optimization problem. However, there is a gap between the transformed deterministic constraints for the cases of known and unknown probability distributions. This paper examines the conservativeness of probabilistic constrained optimization methods under an unknown probability distribution. The objective of this paper is to provide a quantitative assessment of the conservatism of tractable constraints in probabilistic constrained optimization with an unknown probability distribution.
Keywords: optimal control, stochastic systems, discrete time systems, probabilistic constraints
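The gap the abstract refers to can be seen in the simplest scalar case. To enforce P(xi <= c) >= 1 - delta deterministically, a known Gaussian distribution allows the exact quantile, while an unknown distribution with known mean and variance only admits a distribution-free (Cantelli, one-sided Chebyshev) bound, which is necessarily more conservative. This is our own illustration of the phenomenon, not the paper's method.

```python
from statistics import NormalDist
from math import sqrt

def tightened_bound(mu, sigma, delta, known_gaussian):
    """Deterministic surrogate c guaranteeing P(xi <= c) >= 1 - delta.
    Known Gaussian: the exact (1 - delta)-quantile.
    Unknown distribution: Cantelli's inequality
    P(xi - mu >= t) <= sigma^2 / (sigma^2 + t^2) gives t = sigma*sqrt((1-delta)/delta)."""
    if known_gaussian:
        return mu + sigma * NormalDist().inv_cdf(1 - delta)
    return mu + sigma * sqrt((1 - delta) / delta)

c_known = tightened_bound(0.0, 1.0, 0.05, True)     # Gaussian quantile, ~1.64
c_unknown = tightened_bound(0.0, 1.0, 0.05, False)  # Cantelli bound, ~4.36
```

At delta = 0.05 the unknown-distribution surrogate is more than 2.6 times larger than the Gaussian quantile, which is exactly the kind of conservatism the paper sets out to quantify.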
Procedia PDF Downloads 578
27983 A Coordination of Supply Chain Disruption in Different Types of Manufacturing Environments: A Case Study of Sugar Manufacturing Company
Authors: Max Moleke, Gilbert Mbonde
Abstract:
Coordinating supply chain processes within a manufacturing environment is a critical aspect of any organization. Nowadays, most manufacturing industries look only at financial indicators, while in real-life situations on the shop floor a number of supply chain disruptions are ignored. In this work, we examine different types of supply chain disruption and their respective impacts within the organization. A number of industrial engineering tools are employed, including multifactor productivity, activity-on-arrow analysis, and rescheduling plans. The final results show that supply chain disruption varies with the geographical area in which the production plant operates.
Keywords: supply chain, disruptions, flow shop scheduling, uncertainty
Procedia PDF Downloads 428
27982 Data-Driven Dynamic Overbooking Model for Tour Operators
Authors: Kannapha Amaruchkul
Abstract:
We formulate a dynamic overbooking model for a tour operator, in which most reservations contain at least two people. The cancellation rate and the timing of the cancellation may depend on the group size. We propose two overbooking policies, namely economic- and service-based. In the economic-based policy, we minimize the expected oversold and underused cost, whereas in the service-based policy we ensure that the probability of an oversold situation does not exceed a pre-specified threshold. To illustrate the applicability of our approach, we use tour package data from 2016-2018 from a tour operator in Thailand to build a data-driven robust optimization model, and we tested the proposed overbooking policy on 2019 data. We also compare the data-driven approach to the conventional approach of fitting the data to a probability distribution.
Keywords: applied stochastic model, data-driven robust optimization, overbooking, revenue management, tour operator
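A service-based limit of the kind described above can be sketched by Monte Carlo: accept the largest number of group reservations for which the estimated probability of arrivals exceeding capacity stays below the threshold. The sketch assumes each whole group independently shows with a fixed probability and sizes are i.i.d., which is a simplification (the paper lets cancellation behaviour depend on group size); all numbers are illustrative.

```python
import random

def oversell_prob(n_bookings, capacity, sizes, probs, show_prob, rng, trials):
    """Monte Carlo estimate of P(arriving people > capacity)."""
    over = 0
    for _ in range(trials):
        people = 0
        for _ in range(n_bookings):
            if rng.random() < show_prob:                # whole group shows up
                people += rng.choices(sizes, probs)[0]  # draw its size
        over += people > capacity
    return over / trials

def service_based_limit(capacity, size_pmf, show_prob, eps, trials=5000, seed=1):
    """Largest number of reservations keeping the estimated oversell
    probability at or below eps (service-based policy, under the
    i.i.d. group-show assumption stated above)."""
    rng = random.Random(seed)
    sizes, probs = zip(*size_pmf.items())
    n = 0
    while oversell_prob(n + 1, capacity, sizes, probs, show_prob,
                        rng, trials) <= eps:
        n += 1
    return n

# illustrative: 20 seats, groups of 2 or 4, 90% show rate, 5% oversell limit
limit = service_based_limit(20, {2: 0.6, 4: 0.4}, 0.9, 0.05)
```

With groups rather than individuals, arrivals jump in steps of the group size, which is why the group-size distribution, and not just the headcount show rate, drives the limit.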
Procedia PDF Downloads 131
27981 Mean-Field Type Modeling of Non-Local Congestion in Pedestrian Crowd Dynamics
Authors: Alexander Aurell
Abstract:
One of the latest trends in the modeling of human crowds is the mean-field game approach. In this approach, the motion of a human crowd is described by a nonstandard stochastic optimal control problem. It is nonstandard because congestion is considered, introduced through a dependence of the performance functional on the distribution of the crowd. This study extends the class of mean-field pedestrian crowd models to allow for non-local congestion and arbitrarily, but finitely, many interacting crowds. The new congestion feature grants pedestrians a 'personal space' where crowding is undesirable. The model is treated as a mean-field type game derived from a particle picture. This, in contrast to a mean-field game, better describes a situation where the crowd can be controlled by a central planner; the latter is suitable for decentralized situations. Solutions to the mean-field type game are characterized via a Pontryagin-type maximum principle.
Keywords: congestion, crowd dynamics, interacting populations, mean-field approximation, optimal control
Procedia PDF Downloads 443
27980 Closed-Form Sharma-Mittal Entropy Rate for Gaussian Processes
Authors: Septimia Sarbu
Abstract:
The entropy rate of a stochastic process is a fundamental concept in information theory. It provides a limit on the amount of information that can be transmitted reliably over a communication channel, as stated by Shannon's coding theorems. Recently, researchers have focused on developing new measures of information that generalize Shannon's classical theory, with the aim of designing more efficient information encoding and transmission schemes. This paper continues the study of generalized entropy rates by deriving a closed-form solution for the Sharma-Mittal entropy rate of Gaussian processes. Using the squeeze theorem, we solve the limit in the definition of the entropy rate for different values of alpha and beta, the parameters of the Sharma-Mittal entropy. In the end, we compare it with the Shannon and Rényi entropy rates for Gaussian processes.
Keywords: generalized entropies, Sharma-Mittal entropy rate, Gaussian processes, eigenvalues of the covariance matrix, squeeze theorem
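For a discrete distribution, the Sharma-Mittal entropy is H(a, b) = ((sum_i p_i^a)^((1-b)/(1-a)) - 1) / (1 - b), recovering Rényi as b -> 1 and Shannon as a, b -> 1. The paper treats the continuous Gaussian-process case, which this discrete sketch does not reproduce; it only illustrates the family and its Shannon limit numerically.

```python
from math import log

def sharma_mittal(p, alpha, beta):
    """Sharma-Mittal entropy of a discrete distribution p (natural log).
    Requires alpha != 1 and beta != 1; the limits recover Renyi/Shannon."""
    s = sum(pi ** alpha for pi in p)
    return (s ** ((1 - beta) / (1 - alpha)) - 1) / (1 - beta)

def shannon(p):
    return -sum(pi * log(pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.25]
h_sm = sharma_mittal(p, alpha=1.001, beta=1.001)  # near the Shannon limit
h_sh = shannon(p)                                 # = 1.5 * ln(2)
```

Taking alpha and beta close to 1 numerically approaches the Shannon value, mirroring the limit arguments (via the squeeze theorem) used in the paper for the entropy-rate case.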
Procedia PDF Downloads 517
27979 Single Valued Neutrosophic Hesitant Fuzzy Rough Set and Its Application
Authors: K. M. Alsager, N. O. Alshehri
Abstract:
In this paper, we propose the notion of the single valued neutrosophic hesitant fuzzy rough set, obtained by combining the single valued neutrosophic hesitant fuzzy set with the rough set. This combination is a powerful tool for dealing with the uncertainty, granularity, and incompleteness of knowledge in information systems. We present both the definition and some basic properties of the proposed model. Finally, we give a general approach applied to a decision-making problem in disease diagnosis and demonstrate the effectiveness of the approach with a numerical example.
Keywords: single valued neutrosophic fuzzy set, single valued neutrosophic fuzzy hesitant set, rough set, single valued neutrosophic hesitant fuzzy rough set
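The rough-set backbone of the proposed hybrid model is the pair of Pawlak lower and upper approximations; the neutrosophic hesitant fuzzy layer then replaces crisp membership with richer membership structures. A sketch of the crisp special case only (the fuzzy extension is not reproduced here):

```python
def rough_approximations(classes, target):
    """Pawlak rough-set approximations of a target set, given the partition
    of the universe into equivalence classes: the lower approximation
    collects classes entirely inside the target (certain members), the
    upper approximation collects classes that intersect it (possible members)."""
    X = set(target)
    lower, upper = set(), set()
    for c in classes:
        c = set(c)
        if c <= X:
            lower |= c   # certainly in X
        if c & X:
            upper |= c   # possibly in X
    return lower, upper

classes = [{1, 2}, {3, 4}, {5}]
lower, upper = rough_approximations(classes, {1, 2, 3})
```

The boundary region upper - lower ({3, 4} here) is where the classification is undecided, and it is exactly this region that the fuzzy and neutrosophic generalizations refine.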
Procedia PDF Downloads 271
27978 The Spiritual Distress of Women Coping with the End of Life and Death of Their Spouses
Authors: Szu-Mei Hsiao
Abstract:
Many nurses have concerns about the difficulties of providing spiritual care for ethnic-Chinese patients and family members within their cultural context, due to a lack of knowledge and training. Most family caregivers are female, yet there has been little research exploring the potential impact of Chinese cultural values on the spiritual distress of couple dyads in Taiwan. This study explores the spiritual issues of Taiwanese women coping with their husbands' advanced cancer from palliative care through death. Qualitative multiple case studies were used. Data were collected through participant observation and in-depth face-to-face interviews, and transcribed interview data were analyzed using qualitative content analysis. Three couples were recruited from a community-based rural hospital in Taiwan where the husbands were hospitalized in a medical ward. Four themes of spiritual distress emerged from the analysis: (1) a personal conflict in trying to come to terms with love and forgiveness, including the inability to forgive the husband's mistakes and a lack of family love and support; (2) a feeling of hopelessness due to advanced cancer, such as disappointment in destiny and karma, including expressed doubts about survival; (3) a feeling of uncertainty in facing death peacefully, such as fear of facing the unknown world; and (4) doubts leading them to question the meaning and values of their lives. This research has shown that caregivers needed the support of family, friends, social welfare, and their religion to meet their spiritual needs while coping with the final stages of life and death. The findings of this study could assist health professionals to detect the spiritual distress of ethnic-Chinese patients and caregivers in the context of their cultural or religious background as early as possible.
Keywords: advanced cancer, Buddhism, Confucianism, Taoism, qualitative research, spiritual distress
Procedia PDF Downloads 175
27977 A Comparative Study between Japan and the European Union on Software Vulnerability Public Policies
Authors: Stefano Fantin
Abstract:
The present analysis stems from the research undertaken in the course of the European-funded project EUNITY, which targets the gaps in research and development on cybersecurity and privacy between Europe and Japan. Under these auspices, it presents a study of the policy approaches of Japan, the EU, and a number of Member States of the Union with regard to the handling and discovery of software vulnerabilities, with the aim of identifying methodological differences and similarities. The research builds upon a functional comparative analysis of both public policies and legal instruments from the identified jurisdictions. The analysis draws on semi-structured interviews with EUNITY partners, as well as the researcher's participation in a recent report by the Center for EU Policy Study on software vulnerabilities. The European Union presents a rather fragmented legal framework on software vulnerabilities. The presence of several pieces of legislation at the EU level (including the Network and Information Security Directive, the Critical Infrastructure Directive, the Directive on Attacks against Information Systems, and the proposal for a Cybersecurity Act) with no clear focus on the subject makes it difficult for both national governments and end-users (software owners, researchers, and private citizens) to gain a clear understanding of the Union's approach. Additionally, the current data protection reform package (the General Data Protection Regulation) seems to create legal uncertainty around security research. To date, at the Member State level, a few efforts towards transparent practices have been made, namely by the Netherlands, France, and Latvia; this research explains the policy approach these countries have taken. Japan started implementing a coordinated vulnerability disclosure policy in 2004, and the framework has since been amended twice (2014 and 2017).
The framework is furthermore complemented by a series of instruments allowing researchers to disclose any new discovery responsibly. However, the policy has started to lose its efficiency due to a significant increase in reports made to the authority in charge. To conclude, the research reveals two asymmetric policy approaches, time-wise and content-wise. The analysis therefore concludes with a series of policy recommendations based on the lessons learned from both regions, towards a common approach to the security of European and Japanese markets, industries, and citizens.
Keywords: cybersecurity, vulnerability, European Union, Japan
Procedia PDF Downloads 156
27976 Signal Integrity Performance Analysis in Capacitive and Inductively Coupled Very Large Scale Integration Interconnect Models
Authors: Mudavath Raju, Bhaskar Gugulothu, B. Rajendra Naik
Abstract:
The rapid advances in Very Large Scale Integration (VLSI) technology have reduced the minimum feature size to sub-quarter microns and switching times to tens of picoseconds or even less. As a result, high-speed digital circuits degrade due to signal integrity issues such as coupling effects, clock feedthrough, crosstalk noise, and delay uncertainty. Crosstalk noise in VLSI interconnects is a major concern, and its reduction has become increasingly important for high-speed digital circuits. It is most significant in Deep Sub-Micron (DSM) and Ultra Deep Sub-Micron (UDSM) technologies. Increasing the spacing between the aggressor and victim lines is one technique to reduce crosstalk; guard trace or shield insertion between the aggressor and victim is another prominent option for minimizing it. In this paper, far-end crosstalk noise is estimated with a mutual-inductance and -capacitance RLC interconnect model. We also investigate the extent of crosstalk in capacitively and inductively coupled interconnects and minimize it through the shield insertion technique.
Keywords: VLSI, interconnects, signal integrity, crosstalk, shield insertion, guard trace, deep sub micron
Procedia PDF Downloads 184
27975 Spatio-Temporal Properties of p53 States Raised by Glucose
Authors: Md. Jahoor Alam
Abstract:
Recent studies suggest that glucose controls several lifesaving pathways, and the glucose molecule is reported to be responsible for the production of ROS (reactive oxygen species). In the present work, a p53-MDM2-glucose model is developed in order to study the spatiotemporal properties of the p53 pathway. The model is described mathematically and simulated numerically on a high-performance computing facility. It is observed that variation in the glucose concentration level drives the system to different states, namely oscillation death (stabilized), sustained oscillations, and damped oscillations, which correspond to various cellular states. The transition between these states induced by glucose is phase-transition-like behaviour. Further, the amplitude of the p53 dynamics as a function of the glucose concentration level follows a power law, As(k) ~ k^γ, where γ is a constant. A stochastic approach is further needed to understand the realistic behaviour of the model. The present model predicts the variation of p53 states under the influence of the glucose molecule, which is also supported by experimental findings reported in various research articles.
Keywords: oscillation, temporal behavior, p53, glucose
Procedia PDF Downloads 301
27974 Characteristics and Drivers of Greenhouse Gas (GHG) Emissions from China’s Manufacturing Industry: A Threshold Analysis
Abstract:
Only a handful of studies have used non-linear models to investigate the factors influencing greenhouse gas (GHG) emissions in China’s manufacturing sectors, and few have quantitatively and systematically investigated the mechanism linking economic development and GHG emissions while accounting for inherent differences among manufacturing sub-sectors. Given these sectoral characteristics, the varying impact of output on GHG emissions across sub-sectors may be explained by differences in development mode, such as investment scale, technology level and the degree of international competition. In order to assess the environmental impact associated with any specific level of economic development and to explore the factors that affect GHG emissions in China’s manufacturing industry during economic growth, this paper uses the threshold Stochastic Impacts by Regression on Population, Affluence and Technology (STIRPAT) model to investigate the drivers of GHG emissions for China’s manufacturing sectors at different stages of economic development. A data set from 28 manufacturing sectors covering an 18-year period was used. Results demonstrate that output per capita and investment scale increase GHG emissions, while energy efficiency, R&D intensity and FDI mitigate them. Results also verify the nonlinear effect of output per capita on emissions: (1) the Environmental Kuznets Curve (EKC) hypothesis is supported once the threshold point of RMB 31.19 million is surpassed; (2) the driving strength of output per capita on GHG emissions becomes stronger with increasing investment scale; (3) a threshold exists for energy efficiency, with a positive coefficient first and a negative coefficient later; (4) the coefficient of output per capita on GHG emissions decreases as R&D intensity increases; (5) FDI shows a reduction in elasticity once its threshold is surpassed.Keywords: China, GHG emissions, manufacturing industry, threshold STIRPAT model
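The threshold-regression idea behind such a model can be illustrated with a minimal sketch: split the sample at a candidate threshold of the regime variable, fit each regime by least squares, and pick the threshold that minimizes the pooled sum of squared residuals (Hansen-style concentrated least squares). All data and variable names here are synthetic illustrations, not the paper's 28-sector dataset.

```python
# Minimal single-threshold regression sketch (illustrative synthetic data,
# not the paper's dataset or full STIRPAT specification).
import numpy as np

rng = np.random.default_rng(0)
n = 500
output = rng.uniform(5, 60, n)  # output per capita (threshold variable), toy units
# Emissions with a kink at output = 30: slope 1.2 below, 0.4 above
emissions = np.where(output < 30, 1.2 * output, 0.4 * output + 24) + rng.normal(0, 1, n)

def ssr_at(gamma):
    """Pooled sum of squared residuals when the sample is split at gamma."""
    total = 0.0
    for mask in (output < gamma, output >= gamma):
        X = np.column_stack([np.ones(mask.sum()), output[mask]])
        beta, *_ = np.linalg.lstsq(X, emissions[mask], rcond=None)
        total += np.sum((emissions[mask] - X @ beta) ** 2)
    return total

# Grid search over interior quantiles of the threshold variable
grid = np.quantile(output, np.linspace(0.15, 0.85, 71))
gamma_hat = grid[np.argmin([ssr_at(g) for g in grid])]
print(f"estimated threshold: {gamma_hat:.2f}")
```

With the synthetic kink placed at 30, the concentrated least-squares search recovers a threshold close to that value.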
Procedia PDF Downloads 427
27973 Postmodernism and Metanarrative: Deconstruction of Narrative in a Song of Ice and Fire Fantasy TV Series
Authors: Narjes Azimi
Abstract:
Narrative and storytelling have for some time been an inevitable part of media. Narrative has many aspects, and among them the fantasy genre is a consciously challenging one, as fantasy readers are accustomed to narratives built on plots such as good versus evil. This paper analyzes the A Song of Ice and Fire (ASOIAF) TV series as a metanarrative cultural production that deconstructs the elements of a traditional narrative. The study sheds light on the grand narrative from a poststructuralist point of view. The theoretical framework is structuralism and poststructuralism; Lyotard and Barthes, two main poststructuralists, are the focus of the study, and Lyotard's grand-narrative elements are analyzed. The fantasy genre has produced a number of outstanding authors who explore innovative perspectives, and among these leading authors George R. R. Martin is one of the best. Martin's fantasy A Song of Ice and Fire pictures a brutal world in which seven kingdoms struggle for power. Since 2011 this production has been followed and watched by millions of audiences all around the world. The methodology is textual analysis of selected scenes. Martin's distinctive style makes his fantasy different from others; this shift does not negate how previous fantasy writers represented concepts such as war, but Martin's fantasy leaves mature audiences full of uncertainty.Keywords: narrative theory, metanarrative, deconstruction, post-structuralism, Lyotard, Barthes
Procedia PDF Downloads 293
27972 A Modified Shannon Entropy Measure for Improved Image Segmentation
Authors: Mohammad A. U. Khan, Omar A. Kittaneh, M. Akbar, Tariq M. Khan, Husam A. Bayoud
Abstract:
The Shannon entropy measure has been widely used for quantifying uncertainty. In practical settings, however, a histogram is used to estimate the underlying distribution, and the histogram depends on the number of bins used. In this paper, a modification is proposed that makes the histogram-based Shannon entropy consistent. To demonstrate the benefits, two applications are selected from medical image processing. Simulations are carried out to show the superiority of the modified measure for the image segmentation problem. The improvement may be attributed to the robustness it shows against uneven background in images.Keywords: Shannon entropy, medical image processing, image segmentation, modification
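The bin-count dependence the abstract refers to is easy to see in a minimal sketch of plain histogram-based Shannon entropy (this is the unmodified estimator, not the authors' consistent version; the synthetic grey-level data are illustrative):

```python
# Histogram-based Shannon entropy and its dependence on the number of bins.
import numpy as np

def shannon_entropy(data, bins):
    """Shannon entropy (bits) of the empirical histogram distribution."""
    counts, _ = np.histogram(data, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]                      # drop empty bins (0 * log 0 := 0)
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(1)
pixels = rng.normal(128, 20, 10_000)  # synthetic grey-level sample

for bins in (16, 64, 256):
    print(bins, round(shannon_entropy(pixels, bins), 3))
```

The same data yield a different entropy for each bin count, which is the inconsistency the proposed modification aims to remove.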
Procedia PDF Downloads 495
27971 An Optimal Control Method for Reconstruction of Topography in Dam-Break Flows
Authors: Alia Alghosoun, Nabil El Moçayd, Mohammed Seaid
Abstract:
Modeling dam-break flows over non-flat beds requires an accurate representation of the topography, which is the main source of uncertainty in the model. Therefore, developing robust and accurate techniques for reconstructing topography in this class of problems would reduce the uncertainty in the flow system. In many hydraulic applications, experimental techniques have been widely used to measure the bed topography; in practice, however, experimental work in hydraulics can be very demanding in both time and cost, and computational hydraulics has served as an alternative to laboratory and field experiments. Unlike the forward problem, the inverse problem identifies the bed parameters from given experimental data. In this case, the shallow water equations used for modeling the hydraulics need to be rearranged so that the model parameters can be evaluated from measured data. However, this approach is not always possible, and it suffers from stability restrictions. In the present work, we propose an adaptive optimal control technique to numerically identify the underlying bed topography from a given set of free-surface observation data. In this approach, a minimization function is defined to iteratively determine the model parameters. The proposed technique can be interpreted as a fractional-stage scheme: in the first stage, the forward problem is solved to determine the measurable parameters from known data; in the second stage, an adaptive-control Ensemble Kalman Filter is implemented to assimilate the observation data and obtain an accurate estimate of the topography. The main features of this method are, on the one hand, the ability to handle different complex geometries with no need to rearrange the original model into an explicit form and, on the other hand, strong stability for simulations of flows in different regimes containing shocks or discontinuities over any geometry.
Numerical results are presented for a dam-break flow problem over a non-flat bed using different solvers for the shallow water equations. The robustness of the proposed method is investigated using different numbers of loops, sensitivity parameters, initial samples and observation locations. The obtained results demonstrate the high reliability and accuracy of the proposed technique.Keywords: erodible beds, finite element method, finite volume method, nonlinear elasticity, shallow water equations, stresses in soil
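The second-stage update can be illustrated with a minimal sketch of a single Ensemble Kalman Filter analysis step on a toy scalar parameter; the state, observation operator and noise levels here are illustrative assumptions, not the paper's shallow-water setup.

```python
# One perturbed-observation EnKF analysis step on a toy scalar parameter
# (stand-in for a bed-elevation value; observation operator H = identity).
import numpy as np

rng = np.random.default_rng(2)
n_ens = 200
truth = 2.0                                   # "true" parameter value (toy)
ensemble = rng.normal(0.0, 1.0, n_ens)        # prior ensemble
obs = truth + rng.normal(0.0, 0.1)            # single noisy observation
obs_var = 0.1 ** 2

# Kalman gain from ensemble statistics
prior_var = np.var(ensemble, ddof=1)
gain = prior_var / (prior_var + obs_var)

# Update each member against a perturbed copy of the observation
perturbed = obs + rng.normal(0.0, 0.1, n_ens)
analysis = ensemble + gain * (perturbed - ensemble)
print("prior mean %.3f -> analysis mean %.3f" % (ensemble.mean(), analysis.mean()))
```

The analysis ensemble moves toward the observation and tightens, which is the mechanism the paper iterates over free-surface data to recover the topography.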
Procedia PDF Downloads 128
27970 Lessons Learned from Interlaboratory Noise Modelling in Scope of Environmental Impact Assessments in Slovenia
Abstract:
Noise assessment methods are regularly used in the scope of Environmental Impact Assessments to predict the expected noise emissions of planned projects, and different assessment methods can be applied. In recent years, we had the opportunity to collaborate in noise assessment procedures in which assessments by different laboratories were performed simultaneously, and we identified some significant differences between the results of laboratories in Slovenia. We estimate that, although good georeferenced input data for setting up acoustic models exist in Slovenia, there is no clear consensus on methods for predictive noise modelling of planned projects. We analyzed the input data, methods and results of predictive noise modelling for two planned industrial projects, each assessed independently by two laboratories. We also analyzed the data, methods and results of two interlaboratory collaborative noise models for two existing noise sources (a railway and a motorway). In the cases of predictive noise modelling, the acoustic models were validated by noise measurements of the surrounding existing noise sources, but with varying measurement durations; the acoustic characteristics of existing buildings were also not described identically, and the planned noise sources were described and digitized differently. Differences in noise assessment results between laboratories ranged up to 10 dBA, which considerably exceeds the acceptable uncertainty of 3 to 6 dBA. In contrast to predictive modelling, in the collaborative modelling of the two existing noise sources the possibility of performing validation noise measurements greatly increased the comparability of the results: in both cases, for the existing motorway and railway, the modelling results of the different laboratories were comparable.
Differences in noise modelling results between laboratories were below 5 dBA, within the acceptable uncertainty set by the interlaboratory noise modelling organizer. The lessons learned from the study were: 1) predictive noise calculation using the formulae of the international standard SIST ISO 9613-2:1997 is not an appropriate method to predict the noise emissions of planned projects, since due to the complexity of the procedure the formulae are not applied strictly; 2) noise measurements are an important tool to minimize noise assessment errors for planned projects and, in cases of predictive noise modelling, should be performed at least for validation of the acoustic model; 3) national guidelines should be issued on appropriate data, methods, noise source digitalization, validation of the acoustic model, etc., in order to unify predictive noise models and their results in the scope of Environmental Impact Assessments for planned projects.Keywords: environmental noise assessment, predictive noise modelling, spatial planning, noise measurements, national guidelines
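As a point of reference, the geometric-divergence term of the ISO 9613-2 prediction scheme criticised in lesson 1 can be sketched as follows; the full standard adds further attenuation terms (atmospheric absorption, ground effect, barriers) that are omitted here, which is part of the complexity the abstract refers to.

```python
# Geometric-divergence attenuation for a point source, per ISO 9613-2:
# A_div = 20 * lg(d / d0) + 11 dB, with reference distance d0 = 1 m.
import math

def a_div(d, d0=1.0):
    """Attenuation (dB) due to geometric divergence at distance d (m)."""
    return 20 * math.log10(d / d0) + 11

for d in (10, 100, 1000):
    print(f"{d:>5} m: {a_div(d):.1f} dB")  # 31.0, 51.0, 71.0 dB
```

Each doubling of distance adds about 6 dB of attenuation from this term alone; the interlaboratory spread reported above comes from the remaining, harder-to-standardize terms and input descriptions.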
Procedia PDF Downloads 233
27969 Coarse-Graining in Micromagnetic Simulations of Magnetic Hyperthermia
Authors: Razyeh Behbahani, Martin L. Plumer, Ivan Saika-Voivod
Abstract:
Micromagnetic simulations based on the stochastic Landau-Lifshitz-Gilbert equation are used to calculate dynamic magnetic hysteresis loops relevant to magnetic hyperthermia applications. With the goal of effectively simulating room-temperature loops for large iron-oxide based systems at relatively slow sweep rates, on the order of 1 Oe/ns or less, a coarse-graining scheme is proposed and tested. The scheme is derived from a previously developed renormalization-group approach. Loops associated with nanorods, used as building blocks for the larger nanoparticles that were employed in preclinical trials (Dennis et al., 2009 Nanotechnology 20 395103), serve as the model test system. The scaling algorithm is shown to produce nearly identical loops over several decades in the model grain size. Sweep-rate scaling involving the damping constant alpha is also demonstrated.Keywords: coarse-graining, hyperthermia, hysteresis loops, micromagnetic simulations
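For orientation, the deterministic part of the Landau-Lifshitz-Gilbert dynamics underlying such simulations can be sketched for a single macrospin; the stochastic thermal field and the inter-grain interactions of a full micromagnetic model are omitted, and all parameter values are toy choices.

```python
# Deterministic LLG relaxation of a single unit macrospin toward an applied
# field: dm/dt = -gamma/(1+alpha^2) * [ m x H + alpha * m x (m x H) ].
# Toy reduced units; the sLLG thermal noise term is intentionally left out.
import numpy as np

gamma, alpha = 0.176, 0.1            # reduced gyromagnetic ratio, Gilbert damping
H = np.array([0.0, 0.0, 1.0])        # applied field along z
m = np.array([1.0, 0.0, 0.0])        # initial moment along x
dt = 0.01

for _ in range(20_000):
    mxH = np.cross(m, H)
    dm = -gamma / (1 + alpha**2) * (mxH + alpha * np.cross(m, mxH))
    m = m + dt * dm
    m /= np.linalg.norm(m)           # renormalize to keep |m| = 1

print("final m_z = %.4f" % m[2])
```

The damping term drives the moment to align with the field while the precession term rotates it about the field axis; sweeping the field and adding the thermal noise term is what produces the dynamic hysteresis loops discussed above.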
Procedia PDF Downloads 146
27968 Implementation of Integrated Multi-Channel Analysis of Surface Waves and Waveform Inversion Techniques for Seismic Hazard Estimation with Emphasis on Associated Uncertainty: A Case Study at Zafarana Wind Turbine Towers Farm, Egypt
Authors: Abd El-Aziz Khairy Abd El-Aal, Yuji Yagi, Heba Kamal
Abstract:
In this study, an integrated multi-channel analysis of surface waves (MASW) technique is applied to explore the geotechnical parameters of the subsurface layers at the Zafarana wind farm. Moreover, a seismic hazard procedure based on an extended deterministic technique is used to estimate the seismic hazard load for the investigated area. The study area includes many active fault systems along the Gulf of Suez that cause many moderate and large earthquakes. Overall, the seismic activity of the area has recently become better understood following the use of new waveform inversion methods and software to develop accurate focal mechanism solutions for earthquakes recently recorded around the studied area. These earthquakes resulted in major stress drops in the Eastern Desert and the Gulf of Suez area. These findings have helped to reshape the understanding of the seismotectonic environment of the Gulf of Suez area, which is a perplexing tectonic domain. Based on the newly collected information and data, this study uses an extended deterministic approach to re-examine the seismic hazard for the Gulf of Suez region, particularly the wind turbine towers at the Zafarana Wind Farm and its vicinity. Alternate seismic sources and magnitude-frequency relationships were combined with various indigenous attenuation relationships, adapted within a logic tree formulation, to quantify and project the regional exposure on a set of hazard maps. We selected two desired exceedance probabilities (10% and 20%) for the largest median ground acceleration that any of the applied scenarios may exceed. The ground motion was calculated at the 50th and 84th percentile levels.Keywords: MASW, seismic hazard, wind turbine towers, Zafarana wind farm
Procedia PDF Downloads 401
27967 Layouting Phase II of New Priok Using Adaptive Port Planning Frameworks
Authors: Mustarakh Gelfi, Tiedo Vellinga, Poonam Taneja, Delon Hamonangan
Abstract:
The development of New Priok/Kalibaru as an expansion of the old port is being carried out by IPC (Indonesia Port Corporation) together with its subsidiary, the port developer PT Pengembangan Pelabuhan Indonesia. As stated in the master plan, of the two phases proposed, Phase I has already taken shape, and Container Terminal 1 has been in operation since 2016. The development was principally planned as Phase I (2013-2018), consisting of 3 container terminals and 2 product terminals, and Phase II (2018-2023), consisting of 4 container terminals. In fact, the master plan has had to be changed due to major uncertainties that escaped prediction. This study focuses on the design scenario of Phase II (2035 onwards) to deal with future uncertainty; the outcome is a robust design of Phase II of the Kalibaru Terminal that takes future changes into account. Flexibility has to be a major goal in a large infrastructure project like New Priok in order to manage future uncertainty, and the phasing of the project needs to be adapted and reviewed frequently before it becomes irrelevant to future challenges. One of the frameworks developed by experts in port planning is Adaptive Port Planning (APP) with scenario-based planning. The idea behind the APP framework is that adaptation may be needed at any moment in answer to a challenge; it is a continuous procedure that aims to increase the lifespan of waterborne transport infrastructure by increasing flexibility in the planning, contracting and design phases. Other methods used in this study are brainstorming with the port authority, desk study, interviews and a site visit to the project. The result of the study is expected to give the port authority of Tanjung Priok insight into the future outlook and how it will impact the design of the port, together with guidelines for designing in an uncertain environment.
Solutions for flexibility can be divided into: 1) physical solutions, covering all items related to hard infrastructure in the project, commonly involving modularity, standardization, multi-functionality, shorter or longer design lifetimes, reusability, etc.; and 2) non-physical solutions, usually related to the planning processes, decision making and management of the project. To conclude, the APP framework appears robust enough to deal with the problem of designing Phase II of the New Priok project over such a long period.Keywords: Indonesia port, port's design, port planning, scenario-based planning
Procedia PDF Downloads 238
27966 Evidence of a Negativity Bias in the Keywords of Scientific Papers
Authors: Kseniia Zviagintseva, Brett Buttliere
Abstract:
Science is fundamentally a problem-solving enterprise, and scientists pay more attention to negative things, which cause them dissonance and a negative affective state of uncertainty or contradiction. While philosophers of science agree on this, there are few empirical demonstrations. Here we examine the keywords of papers published by PLOS in 2014 and show, with several sentiment analyzers, that negative keywords are studied more than positive keywords. Our dataset consists of the 927,406 keywords of 32,870 scientific articles across all fields published in 2014 in the journal PLOS ONE (collected from Altmetric.com). By counting how often the 47,415 unique keywords are used, we can examine whether negative topics are studied more than positive ones. To determine the sentiment of the keywords, we utilized two sentiment analysis tools, Hu and Liu (2004) and SentiStrength (2014); the results below are for Hu and Liu, as these are the less convincing results. The average keyword was used 19.56 times, with half of the keywords used only once and a maximum of 18,589 uses. Keywords identified as negative were used 37.39 times on average, positive keywords 14.72 times, and neutral keywords 19.29 times. This difference is only marginally significant, with an F value of 2.82 and p = .05, but one must keep in mind that more than half of the keywords are used only once, artificially increasing the variance and driving the effect size down. To examine this more closely, we looked at the 25 most-used keywords that carry a sentiment. Among the top 25, there are only two positive words, ‘care’ and ‘dynamics’, in positions 5 and 13 respectively, with all the rest identified as negative. ‘Diseases’ is the most studied keyword, with 8,790 uses, and ‘cancer’ and ‘infectious’ are the second and fourth most used sentiment-laden keywords.
The sentiment analysis is not perfect, though: the words ‘diseases’ and ‘disease’ are split, taking the 1st and 3rd positions; combined, they remain the most common sentiment-laden keyword, used 13,236 times. Beyond split words, the sentiment analyzer logs ‘regression’ and ‘rat’ as negative, and these should probably be considered false positives. Despite these potential problems, the effect is apparent, as even positive keywords like ‘care’ could or should be considered negative, since the word most commonly appears as part of ‘health care’, ‘critical care’ or ‘quality of care’ and is generally associated with how to improve it. All in all, the results suggest that negative concepts are studied more, providing support for the notion that science is most generally a problem-solving enterprise. The results also provide evidence that negativity and contradiction are related to greater productivity and positive outcomes.Keywords: bibliometrics, keywords analysis, negativity bias, positive and negative words, scientific papers, scientometrics
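The counting step described above can be sketched as follows; the tiny keyword list and sentiment lexicon are invented for illustration and are not the PLOS ONE dataset or the Hu and Liu lexicon.

```python
# Toy sketch of the keyword-sentiment comparison: tally keyword occurrences,
# label each keyword with a small (invented) lexicon, and compare the mean
# usage count per sentiment class.
from collections import Counter

NEGATIVE = {"disease", "cancer", "infectious", "risk"}
POSITIVE = {"care", "dynamics", "health"}

# Flat list of keywords as they might appear across many articles
keywords = (["disease"] * 9 + ["cancer"] * 7 + ["infectious"] * 5 +
            ["care"] * 4 + ["dynamics"] * 2 + ["genome"] * 3)

counts = Counter(keywords)

def mean_usage(lexicon):
    """Mean usage count over lexicon words that actually occur in the data."""
    used = [counts[w] for w in lexicon if w in counts]
    return sum(used) / len(used)

print("negative:", mean_usage(NEGATIVE))   # 7.0
print("positive:", mean_usage(POSITIVE))   # 3.0
```

In this toy data, negative keywords average more uses than positive ones, mirroring the 37.39 vs 14.72 pattern reported for the real dataset.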
Procedia PDF Downloads 186