Search results for: risk estimation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7639

7279 The Systemic Approach to Risk Measurement of Drainage Systems in Urban Areas

Authors: Jadwiga Królikowska, Andrzej Królikowski, Jarosław Bajer

Abstract:

The work delineates the threats posed by the inadequate capacity of rainwater canals, designed and built in the early 20th century, when confronted with heavy rainfall, especially in summer. This mismatch is the cause of the so-called 'urban floods' and is directly related to the rapid growth of paved surfaces in cities. Resolving the problem requires a change in the philosophy of draining rainfall, with wider use of retention, infiltration and reuse of rainwater. In the systemic approach to managing the safety of urban drainage systems, risk, which is directly connected to safety failures, has been adopted as the measure. The risk level defines the probability that losses greater than those forecast for a given time frame will occur. The procedure of risk modelling, enabling its numerical analysis by using appropriate weights, is a significant issue in this paper.

Keywords: drainage system, urban areas, risk measurement, systemic approach

Procedia PDF Downloads 289
7278 Modeling Default Probabilities of the Chosen Czech Banks in the Time of the Financial Crisis

Authors: Petr Gurný

Abstract:

One of the most important tasks in risk management is the correct determination of the probability of default (PD) of particular financial subjects. In this paper, the possibility of determining a financial institution's PD according to credit-scoring models is discussed. The paper is divided into two parts. The first part is devoted to the estimation of three different models (based on linear discriminant analysis, logit regression and probit regression) from a sample of almost three hundred US commercial banks. These models are then compared and verified on a control sample with a view to choosing the best one. The second part of the paper is aimed at the application of the chosen model to a portfolio of three key Czech banks to assess their present financial stability. However, it is no less important to be able to estimate the evolution of PD in the future. For this reason, the second task in this paper is to estimate the probability distribution of the future PD for the Czech banks. To this end, the values of the particular indicators are sampled randomly and the distribution of the PDs is estimated, under the assumption that the indicators are distributed according to a multidimensional subordinated Lévy model (the Variance Gamma model and the Normal Inverse Gaussian model, in particular). Although the obtained results show that all the banks are relatively healthy, there is still a non-negligible probability that “a financial crisis” will occur. This is indicated by the estimation of various quantiles of the estimated distributions. Finally, it should be noted that the applicability of the estimated model (with respect to the data used) is limited to the recessionary phase of the financial market.

Keywords: credit-scoring models, multidimensional subordinated Lévy model, probability of default

Procedia PDF Downloads 451
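
A minimal sketch of the simulation step described in entry 7278 above: indicator values are drawn from a heavy-tailed marginal (here a univariate Normal Inverse Gaussian from scipy, standing in for the paper's multidimensional subordinated Lévy model), pushed through a hypothetical logit credit-scoring function, and the resulting PD distribution is summarized by its upper quantiles. All coefficients and NIG parameters are illustrative assumptions, not values from the paper.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical logit scoring model: PD = sigmoid(b0 + b1 * indicator)
b0, b1 = -3.0, -1.5                      # illustrative coefficients only

# Univariate Normal Inverse Gaussian stand-in for the indicator's distribution
# (the paper uses a multidimensional subordinated Levy model; parameters are assumed)
indicator = stats.norminvgauss.rvs(a=1.5, b=0.3, loc=0.0, scale=1.0,
                                   size=100_000, random_state=rng)

pd_draws = 1.0 / (1.0 + np.exp(-(b0 + b1 * indicator)))   # simulated PD distribution

# Upper quantiles indicate how likely "crisis-level" PDs are
for q in (0.50, 0.95, 0.99):
    print(f"PD quantile {q:.2f}: {np.quantile(pd_draws, q):.4f}")
```
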
7277 Understanding Narrative Transformations of Ebola in Negotiations of Epidemic Risk

Authors: N. W. Paul, M. Banerjee

Abstract:

Discussing the nexus between global health policy and local practices, this article addresses the recent Ebola outbreak as a model case for narrative co-constructions of epidemic risk. We will demonstrate to what extent a theory-driven and methodologically rooted analysis of narrativity can help to improve mechanisms of prevention and intervention whenever epidemic risk needs to be addressed locally in order to contribute to global health. Analyzing the narrative transformation of Ebola, we will also address issues of transcultural problem-solving and the normative questions at stake. In this regard, we seek to contribute to a better understanding of a key question of global health and justice as well as of the underlying ethical questions. By highlighting and analyzing the functions of narratives, this paper provides a translational approach to refining the practices by which we address epidemic risk, be it on the national, the transnational or the global scale.

Keywords: Ebola, epidemic risk, medical ethics, medical humanities

Procedia PDF Downloads 444
7276 Phillips Curve Estimation in an Emerging Economy: Evidence from Sub-National Data of Indonesia

Authors: Harry Aginta

Abstract:

Using the Phillips curve framework, this paper seeks new empirical evidence on the relationship between inflation and output in a major emerging economy. By exploiting sub-national data, the contribution of this paper is threefold. First, it resolves the issue of using on-target national inflation rates, which potentially weakens the inflation-output nexus. This is very relevant for Indonesia, as its central bank has been adopting an inflation targeting framework based on national consumer price index (CPI) inflation. Second, the study tests the relevance of the mining sector in output gap estimation. The test for the mining sector is important to control for the effects of mining regulation and the nominal effects of coal prices on real economic activity. Third, the paper applies panel econometric methods, incorporating regional variation that helps to improve the model estimation. The results confirm the strong presence of a Phillips curve in Indonesia. A positive output gap, reflecting excess demand conditions, gives rise to higher inflation rates. The elasticity with respect to the output gap is higher if the mining sector is excluded from the output gap estimation. In addition to inflation adaptation, the dynamics of the exchange rate and international commodity prices are also found to affect inflation significantly. The results are robust to alternative measurements of the output gap.

Keywords: Phillips curve, inflation, Indonesia, panel data

Procedia PDF Downloads 117
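
A minimal sketch of a regional Phillips curve regression in the spirit of entry 7276 above, using a least-squares-with-dummy-variables panel specification in statsmodels. The variable names and synthetic data are placeholders; the paper's actual specification, output-gap construction, and estimator may differ.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
regions, periods = 10, 40
df = pd.DataFrame({
    "region": np.repeat(np.arange(regions), periods),
    "output_gap": rng.normal(0, 1, regions * periods),          # % deviation from potential
    "lag_inflation": rng.normal(4, 1, regions * periods),        # adaptive-expectations term
    "exch_rate_change": rng.normal(0, 2, regions * periods),
    "commodity_price_change": rng.normal(0, 5, regions * periods),
})
# synthetic inflation consistent with an upward-sloping Phillips curve
df["inflation"] = (0.6 * df["lag_inflation"] + 0.4 * df["output_gap"]
                   + 0.1 * df["exch_rate_change"] + 0.05 * df["commodity_price_change"]
                   + rng.normal(0, 0.5, len(df)))

# Region fixed effects via dummy variables capture regional heterogeneity
model = smf.ols("inflation ~ lag_inflation + output_gap + exch_rate_change"
                " + commodity_price_change + C(region)", data=df).fit()
print(model.params[["output_gap", "lag_inflation"]])
```
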
7275 Impact of Exogenous Risk Factors into Actual Construction Price in PPP Projects

Authors: Saleh Alzahrani, Halim Boussabaine

Abstract:

Many Public Private Partnership (PPP) projects are developed on the basis that a public project is awarded to a private party within a single contractual framework. PPP project risks typically include the development and construction of a new asset as well as its operation. Certainly the most severe consequences of risks during the construction period are price and time overruns. These events are among the situations most commonly used in value-for-money analysis of risks. The sources of risk change over time in a PPP project. In traditional procurement, the public sector usually has to cover all price overruns arising from these risks, and there is plenty of evidence to suggest that such overruns are the norm in some projects delivered under traditional procurement. This paper examines the impact of exogenous risk factors on the actual construction price in PPP projects. The paper presents a brief literature review on PPP risk pricing strategies and then uses system dynamics (SD) to analyse the risks associated with the estimated project price. Based on the findings of these analyses, a risk pricing association model is presented and discussed. The paper concludes with thoughts for future research.

Keywords: public private partnership (PPP), risk, risk pricing, system dynamics (SD)

Procedia PDF Downloads 549
7274 Risk Management in an Islamic Framework

Authors: Magid Maatallah

Abstract:

The problem is that investment management in modern conditions boils down to risk management, which is very underdeveloped in Islamic financial theory and practice. Add to this the fact that, in Islamic perception, this is one of the areas of conventional finance in need of drastic reforms. This need was recently underlined by the story of Long Term Capital Management (LTCM), told by Roger Lowenstein in his book When Genius Failed (Random House, 2000). So we face a double challenge: to develop Islamic techniques of risk management and to see that these new techniques are free from the ills from which conventional methods suffer. This is different from the challenge faced in the middle of the twentieth century, to develop a method of financial intermediation free of interest. Risk was always there, especially in business, but industrialization brought risks unknown in trade and agriculture. Industrial production often involves long periods of time, and the longer the period of production, the greater the uncertainty. The scope of the market has expanded to cover the whole world, introducing new kinds of risk. More than a thousand years ago, when Islamic laws were being written, the nature and scope of risk and uncertainty were different. However, something can still be learnt which, in combination with modern experience, should enable us to realize the Shariah objectives of justice, fairness and efficiency.

Keywords: financial markets, Islamic framework, risk management, investment

Procedia PDF Downloads 544
7273 Frequency Analysis of Minimum Ecological Flow and Gage Height in Indus River Using Maximum Likelihood Estimation

Authors: Tasir Khan, Yejuan Wan, Kalim Ullah

Abstract:

Hydrological frequency analysis has been conducted to estimate the minimum flow elevation of the Indus River in Pakistan in order to protect the ecosystem. The maximum likelihood estimation (MLE) technique is used to identify the best-fitted distribution for minimum ecological flows at nine stations of the Indus River in Pakistan. Four distributions commonly used in hydrological frequency analysis are fitted at all sites: the Generalized Extreme Value (GEV) distribution, the Generalized Logistic (GLO) distribution, the Generalized Pareto (GPA) distribution, and the Pearson type 3 (PE3) distribution. The performance of these distributions is compared using goodness-of-fit tests, namely the Kolmogorov-Smirnov test, the Anderson-Darling test, and the chi-square test. The study concludes that, under the MLE method, GEV and GPA are the most suitable distributions and can be effectively applied to all the proposed sites. Quantiles are estimated for return periods from 5 to 1000 years using the MLE method, which is robust for larger sample sizes. The results of these analyses can be used for water resources research, including water quality management, designing irrigation systems, determining downstream flow requirements for hydropower, and assessing the impact of long-term drought on the country's aquatic system.

Keywords: minimum ecological flow, frequency distribution, Indus River, maximum likelihood estimation

Procedia PDF Downloads 72
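
A minimal sketch of the distribution-fitting workflow named in entry 7273 above: maximum likelihood fits of GEV and Generalized Pareto to a flow series with scipy, a Kolmogorov-Smirnov check, and quantiles for selected return periods. The data are synthetic, and the low-flow quantile convention (non-exceedance probability 1/T for minima) is an assumption; the paper's exact procedure may differ.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
flows = stats.genextreme.rvs(c=0.1, loc=300.0, scale=80.0, size=60, random_state=rng)

candidates = {"GEV": stats.genextreme, "GPA": stats.genpareto}
for name, dist in candidates.items():
    params = dist.fit(flows)                               # maximum likelihood estimation
    ks_stat, p_value = stats.kstest(flows, dist.cdf, args=params)
    print(f"{name}: params={np.round(params, 3)}, KS p-value={p_value:.3f}")

    # Quantiles for return periods T (for minima, use non-exceedance probability 1/T)
    for T in (5, 50, 1000):
        print(f"  T={T:>4} yr quantile: {dist.ppf(1.0 / T, *params):.1f}")
```
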
7272 The Effect of Supplier Trust and Top Management Involvement on Supply Chain Risk Management through Buyer-Supplier Relationship

Authors: Hotlan Siagian, Han Tae Hee

Abstract:

This study aims to examine the effect of supplier trust and top management involvement on supply chain risk management through the buyer-supplier relationship. The population of the research is 44 Korean companies domiciled in East and Central Java, Indonesia. The respondents are top-management-level representatives of each company. Data were collected using a questionnaire designed with a five-item Likert scale. The collected data were analyzed using the structural equation modeling (SEM) technique with SmartPLS software version 3.0 to examine the hypotheses. The results revealed that supplier trust has an effect on supply chain risk management, top management involvement affects supply chain risk management, supplier trust influences the buyer-supplier relationship, top management involvement affects the buyer-supplier relationship, and the buyer-supplier relationship affects supply chain risk management. The last finding is that the buyer-supplier relationship empirically mediates the effect of supplier trust and top management involvement.

Keywords: buyer supplier relationship, supplier trust, supply chain risk management, top management involvement

Procedia PDF Downloads 213
7271 Bayesian Network and Feature Selection for Rank Deficient Inverse Problem

Authors: Kyugneun Lee, Ikjin Lee

Abstract:

Parameter estimation in inverse problems often suffers from unfavorable conditions in the real world. Useless data and many input parameters make the problem complicated or insoluble. Data refinement and reformulation of the problem can resolve such difficulties. In this research, a method to solve the rank-deficient inverse problem is suggested. A multi-physics system whose rank deficiency is caused by response correlation is treated. Impeditive information is removed and the problem is reformulated into sequential estimations using a Bayesian network (BN) and subset groups. First, subset grouping of the responses is performed; feature selection with singular value decomposition (SVD) is used for the grouping. Next, BN inference is used for sequential conditional estimation according to the group hierarchy. A directed acyclic graph (DAG) structure is organized to maximize the estimation ability. The variance ratio of response to noise is used to pair the estimable parameters with each response.

Keywords: Bayesian network, feature selection, rank deficiency, statistical inverse analysis

Procedia PDF Downloads 308
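
A minimal sketch of the SVD step mentioned in entry 7271 above: detecting rank deficiency in a response-sensitivity matrix and grouping correlated responses by their dominant singular directions. The matrix and the grouping rule are generic illustrations, not the paper's multi-physics system or its exact feature-selection criterion.

```python
import numpy as np

# Hypothetical response-sensitivity matrix: rows = responses, columns = parameters.
# It is built from two low-rank factors, so only two independent response
# directions exist -> the inverse problem is rank deficient.
A = np.array([[1.0, 0.2], [0.1, 1.0], [2.0, 0.5], [0.3, 1.2]])
B = np.array([[1.0, 2.0, 0.5], [0.2, 0.1, 1.5]])
S = A @ B

U, s, Vt = np.linalg.svd(S, full_matrices=False)
numerical_rank = int(np.sum(s > 1e-8 * s[0]))
print("singular values:", np.round(s, 4), "-> numerical rank:", numerical_rank)

# Illustrative grouping heuristic: assign each response to the retained singular
# direction it loads on most strongly, so correlated responses end up together.
loadings = np.abs(U[:, :numerical_rank])
groups = loadings.argmax(axis=1)
print("response subset groups:", groups)
```
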
7270 Identifying Psychosocial, Autonomic, and Pain Sensitivity Risk Factors of Chronic Temporomandibular Disorder by Using Ridge Logistic Regression and Bootstrapping

Authors: Haolin Li, Eric Bair, Jane Monaco, Quefeng Li

Abstract:

Temporomandibular disorder (TMD) refers to a series of musculoskeletal disorders ranging from jaw pain to chronic debilitating pain, and the risk factors for the onset and maintenance of TMD are still unclear. Prior research has shown that the potential risk factors for chronic TMD are related to psychosocial factors, autonomic functions, and pain sensitivity. Using data from the baseline case-control study of the Orofacial Pain: Prospective Evaluation and Risk Assessment (OPPERA) project, we examine whether the risk factors identified by prior research remain statistically significant after taking all of the risk measures into account in one single model, and we also compare the relative influences of the risk factors from three different perspectives (psychosocial factors, autonomic functions, and pain sensitivity) on chronic TMD. The statistical analysis is conducted using ridge logistic regression and bootstrapping, and the performance of the algorithms has been assessed using extensive simulation studies. The results support most of the findings of prior research, in that many psychosocial and pain sensitivity measures have significant associations with chronic TMD. However, it is surprising that most of the autonomic-function risk factors do not present significant associations with chronic TMD, in contrast to what has been described in prior research.

Keywords: autonomic function, OPPERA study, pain sensitivity, psychosocial measures, temporomandibular disorder

Procedia PDF Downloads 178
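
A minimal sketch of the ridge logistic regression with bootstrapping described in entry 7270 above, using scikit-learn's L2-penalized logistic regression and bootstrap resampling to gauge the stability of the coefficient estimates. The synthetic predictors stand in for the OPPERA psychosocial, autonomic, and pain-sensitivity measures.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.utils import resample

rng = np.random.default_rng(1)
n, p = 500, 8                                   # hypothetical: 8 candidate risk measures
X = rng.normal(size=(n, p))
logit = 0.8 * X[:, 0] + 0.5 * X[:, 1] - 0.3     # only the first two are true risk factors
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

ridge_logit = LogisticRegression(penalty="l2", C=1.0, max_iter=1000)

# Bootstrap the coefficient estimates to obtain percentile confidence intervals
boot_coefs = []
for _ in range(500):
    Xb, yb = resample(X, y)
    boot_coefs.append(ridge_logit.fit(Xb, yb).coef_.ravel())
boot_coefs = np.array(boot_coefs)

ci_low, ci_high = np.percentile(boot_coefs, [2.5, 97.5], axis=0)
for j in range(p):
    flag = "significant" if (ci_low[j] > 0) or (ci_high[j] < 0) else "n.s."
    print(f"measure {j}: 95% CI [{ci_low[j]:+.2f}, {ci_high[j]:+.2f}] {flag}")
```
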
7269 Total Plaque Area in Chronic Renal Failure

Authors: Hernán A. Perez, Luis J. Armando, Néstor H. García

Abstract:

Background and aims: Cardiovascular disease rates are very high in patients with chronic renal failure (CRF), but the underlying mechanisms are incompletely understood. Traditional cardiovascular risk factors do not explain the increased risk, and observational studies have observed paradoxical or absent associations between classical risk factors and mortality in dialysis patients. Large randomized controlled trials, the 4D Study, AURORA and the ALERT study, found that statin therapy in CRF does not reduce cardiovascular events. These results may be the consequence of the 'accelerated atherosclerosis' observed in these patients. The objective of this study was to investigate whether carotid total plaque area (TPA), a measure of carotid plaque burden, is increased at progressively lower creatinine clearance in patients with CRF. We studied a cohort of patients with CRF not on dialysis, reasoning that risk factor associations might be more easily discerned before end-stage renal disease. Methods: The Blossom DMO Argentina ethics committee approved the study, and informed consent was obtained from each participant. We performed a cohort study in 412 patients with Stage 1, 2 and 3 CRF. Clinical and laboratory data were obtained. TPA was determined using bilateral carotid ultrasonography. The Modification of Diet in Renal Disease estimation formula was used to determine renal function. ANOVA was used when appropriate. Results: The Stage 1 CRF group (n=16, 43±2 years old) had a blood pressure of 123±2/78±2 mmHg, BMI 30±1, LDL cholesterol 145±10 mg/dl, HbA1c 5.8±0.4% and the lowest TPA, 25.8±6.9 mm2. Stage 2 CRF patients (n=231, 50±1 years old) had a blood pressure of 132±1/81±1 mmHg, LDL cholesterol 125±2 mg/dl, HbA1c 6±0.1% and TPA 48±10 mm2 (p<0.05 vs. CRF stage 1), while Stage 3 CRF patients (n=165, 59±1 years old) had a blood pressure of 134±1/81±1 mmHg, LDL cholesterol 125±3 mg/dl, HbA1c 6±0.1% and TPA 71±6 mm2 (p<0.05 vs. CRF stages 1 and 2). Conclusion: Our data indicate that TPA increases as renal function deteriorates and is not related to LDL cholesterol and triglyceride levels. We suggest that mechanisms other than the classical ones are responsible for the observed excess of cardiovascular disease in CKD patients; finally, determination of total plaque area should be used to measure the effects of antiatherosclerotic therapy.

Keywords: hypertension, chronic renal failure, atherosclerosis, cholesterol

Procedia PDF Downloads 266
7268 Leadership Styles and Adoption of Risk Governance in Insurance and Energy Industry: A Comparative Case Study

Authors: Ruchi Agarwal

Abstract:

In today’s world, companies operate in dynamic, uncertain and ambiguous business environments. Globally, more companies than ever are failing due to Environmental, Social and Governance (ESG) factors. Corporate governance and risk management are intertwined in nature and have for decades been influenced by internal and external factors. Three schools of thought have shaped risk governance for decades: agency theory, contingency theory, and institutional theory. Agency theory argues that agents have interests conflicting with principals' interests and highlights the information problem. Contingency theory suggests that risk management adoption is influenced by internal and external factors, while institutional theory suggests that organizations legitimize risk management with regulators, competitors, and professional bodies. The conflicting objectives of these theories have created problems for executives in the adoption of risk governance. So far, many studies have discussed risk culture and the role of actors in risk governance, but studies discussing the role of risk culture in the adoption of risk governance from a leadership-style perspective are rare. This study explores the adoption of risk governance in two contrasting industries, the insurance and energy businesses, to understand whether risk governance is influenced by internal or external factors or whether risk culture is shaped by leaders. We draw empirical evidence by comparing the cases of an Indian insurance company and a renewable-energy-based firm in India. We interviewed more than 20 senior executives of the companies and collected annual reports, risk management policies, and more than 10 presentations and other reports from 2017 to 2024, and we visited the companies several times for follow-up questions. The findings revealed that both companies have used risk governance for the strategic renewal of the company. The insurance company uses a transactional leadership style based on performance and reward to improve risk management, while the energy company relies rather on symbolic management to make debt restructuring meaningful for stakeholders. Overall, both companies turned from loss-making into profitable ones within a few years. This comparative study highlights the role of different leadership styles in the adoption of risk governance. The study is also distinct in that previous research has rarely examined risk governance in two contrasting industries with reference to leadership styles.

Keywords: leadership style, corporate governance, risk management, risk culture, strategic renewal

Procedia PDF Downloads 40
7267 Vehicular Emission Estimation of Islamabad by Using Copert-5 Model

Authors: Muhammad Jahanzaib, Muhammad Z. A. Khan, Junaid Khayyam

Abstract:

Islamabad is the capital of Pakistan, with a population of 1.365 million people and a vehicular fleet size of 0.75 million. The vehicular fleet is growing annually at a rate of 11%. Vehicular emissions are a major source of black carbon (BC). In developing countries like Pakistan, most vehicles consume conventional fuels like petrol, diesel, and CNG. These fuels are major emitters of pollutants such as CO, CO2, NOx, CH4, VOCs, and particulate matter (PM10). Carbon dioxide and methane are leading contributors to global warming, with global shares of 9-26% and 4-9%, respectively. NOx is the precursor of nitrates, which ultimately form aerosols that are noxious to human health. In this study, COPERT (Computer Programme to Calculate Emissions from Road Transport) was used for vehicular emission estimation in Islamabad. COPERT is a Windows-based program developed for the calculation of emissions from the road transport sector. The emissions were calculated for the year 2016 and include pollutants such as CO, NOx, VOC, and PM, together with energy consumption. Different variables were input to the model for emission estimation, including meteorological parameters, average vehicular trip length and respective time duration, fleet configuration, activity data, degradation factor, and fuel effect. The estimated emissions of CO, CH4, CO2, NOx, and PM10 were found to be 9814.2, 44.9, 279196.7, 3744.2 and 304.5 tons, respectively.

Keywords: COPERT Model, emission estimation, PM10, vehicular emission

Procedia PDF Downloads 256
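
COPERT itself is a closed Windows application, but the core bookkeeping behind the inventory in entry 7267 above is emission = activity × emission factor, summed over vehicle categories and fuels. The sketch below illustrates that accounting with entirely hypothetical fleet shares, mileages, and emission factors; it does not reproduce COPERT's speed-dependent emission factor functions.

```python
# Tier-1-style inventory sketch: E_pollutant = sum_over_categories(N * M * EF)
# All numbers below are illustrative placeholders, not COPERT values for Islamabad.
fleet = {
    #  category      vehicles   km/veh/yr   EF CO (g/km)   EF NOx (g/km)
    "petrol_car":   (400_000,   12_000,     2.5,           0.6),
    "diesel_bus":   ( 20_000,   45_000,     4.0,           8.0),
    "cng_rickshaw": (150_000,    9_000,     6.0,           0.4),
}

totals = {"CO": 0.0, "NOx": 0.0}
for name, (n_veh, km, ef_co, ef_nox) in fleet.items():
    activity = n_veh * km                               # vehicle-kilometres per year
    totals["CO"] += activity * ef_co / 1e6              # g -> tonnes
    totals["NOx"] += activity * ef_nox / 1e6

for pollutant, tonnes in totals.items():
    print(f"{pollutant}: {tonnes:,.0f} t/yr")
```
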
7266 Risk Allocation in Public-Private Partnership (PPP) Projects for Wastewater Treatment Plants

Authors: Samuel Capintero, Ole H. Petersen

Abstract:

This paper examines the use of public-private partnerships for the building and operation of wastewater treatment plants. Our research focuses on risk allocation in this kind of project. Our analysis builds on more than one hundred wastewater treatment plants built and operated through PPP projects in Aragon (Spain). The paper illustrates the consequences of inadequate management of construction risk and an unsuitable transfer of demand risk in wastewater treatment plants. It also shows that the involvement of many public bodies at the local, regional and national levels further increases the complexity of this kind of project and makes time delays more likely.

Keywords: wastewater, treatment plants, PPP, construction

Procedia PDF Downloads 643
7265 Multi-Subpopulation Genetic Algorithm with Estimation of Distribution Algorithm for Textile Batch Dyeing Scheduling Problem

Authors: Nhat-To Huynh, Chen-Fu Chien

Abstract:

The textile batch dyeing scheduling problem is complicated; it includes batch formation, batch assignment to machines, and batch sequencing with sequence-dependent setup times. Most manufacturers schedule their orders manually, which is time consuming and inefficient. More powerful methods are needed to improve the solution. Motivated by real needs, this study proposes approaches in which a genetic algorithm is developed with multiple subpopulations and hybridised with an estimation of distribution algorithm to solve the constructed problem, minimising the makespan. A heuristic algorithm is designed and embedded into the proposed algorithms to improve the ability to escape local optima. In addition, an empirical study is conducted in a textile company in Taiwan to validate the proposed approaches. The results show that the proposed approaches are more efficient than a simulated annealing algorithm.

Keywords: estimation of distribution algorithm, genetic algorithm, multi-subpopulation, scheduling, textile dyeing

Procedia PDF Downloads 294
7264 Identifying Mitigation Plans in Reducing Usability Risk Using Delphi Method

Authors: Jayaletchumi T. Sambantha Moorthy, Suhaimi bin Ibrahim, Mohd Naz’ri Mahrin

Abstract:

Most quality models define usability as a significant factor that leads to improved product acceptability, increased user satisfaction, improved product reliability, and financial benefits for companies. Usability is also the factor that best balances the technical and human aspects of a software product, which is an important aspect of defining quality during the software development process. A usability risk can be defined as a potential usability risk factor whereby a chosen action or activity may lead to a possible loss or an undesirable outcome. This could impact the usability of a software product, thereby contributing to negative user experiences and a possible software product failure. Hence, it is important to mitigate and reduce usability risks in the software development process itself. By managing the usability risks involved in the software development process, failures of software products can be reduced. Therefore, this research uses the Delphi method to identify mitigation plans to reduce potential usability risks. The Delphi method was conducted with seven experts from the fields of risk management and software development.

Keywords: usability, usability risk, risk management, risk mitigation, Delphi study

Procedia PDF Downloads 460
7263 The 10-year Risk of Major Osteoporotic and Hip Fractures Among Indonesian People Living with HIV

Authors: Iqbal Pramukti, Mamat Lukman, Hasniatisari Harun, Kusman Ibrahim

Abstract:

Introduction: People living with HIV have a higher risk of osteoporotic fracture than the general population. The purpose of this study was to predict the 10-year risk of fracture among people living with HIV (PLWH) using FRAX™ and to identify characteristics related to fracture risk. Methodology: This study consisted of 75 subjects. The ten-year probability of major osteoporotic fractures (MOF) and hip fractures (HF) was assessed using the FRAX™ algorithm. Cross-tabulation was used to identify participant characteristics related to fracture risk. Results: The overall mean 10-year probability of fracture was 2.4% (1.7) for MOF and 0.4% (0.3) for hip fractures. For the MOF score, participants with a parental history of hip fracture, smoking behavior or glucocorticoid use showed higher MOF scores than those without (3.1 vs. 2.5, 4.6 vs. 2.5, and 3.4 vs. 2.5, respectively). For the HF score, participants with a parental history of hip fracture, smoking behavior or glucocorticoid use also showed higher HF scores than those without (0.5 vs. 0.3, 0.8 vs. 0.3, and 0.5 vs. 0.3, respectively). Conclusions: The 10-year risk of fracture was higher among PLWH with several factors, including parental hip fracture history, smoking behavior and glucocorticoid use. Further analysis with multivariate regression and a larger sample size is required to confirm the factors associated with high fracture risk.

Keywords: HIV, PLWH, osteoporotic fractures, hip fractures, 10-year risk of fracture, FRAX

Procedia PDF Downloads 42
7262 A Multi-Stage Learning Framework for Reliable and Cost-Effective Estimation of Vehicle Yaw Angle

Authors: Zhiyong Zheng, Xu Li, Liang Huang, Zhengliang Sun, Jianhua Xu

Abstract:

The yaw angle plays a significant role in many vehicle safety applications, such as collision avoidance and lane-keeping systems. Although the estimation of the yaw angle has been extensively studied in the existing literature, it remains a challenge to achieve a solution that is simultaneously reliable and cost-effective in complex urban environments. This paper proposes a multi-stage learning framework to estimate the yaw angle with a monocular camera, which can deal with this challenge in a more reliable manner. In the first stage, an efficient road detection network is designed to extract the road region, providing a highly reliable reference for the estimation. In the second stage, a variational auto-encoder (VAE) is proposed to learn the distribution patterns of road regions, which is particularly suitable for modeling the changing patterns of the yaw angle under different driving maneuvers and inherently enhances the generalization ability. In the last stage, a gated recurrent unit (GRU) network is used to capture the temporal correlations of the learned patterns, which further improves the estimation accuracy because changes in the deflection angle are relatively easy to recognize across consecutive frames. Afterward, the yaw angle can be obtained by combining the estimated deflection angle and the road direction stored in a roadway map. Through effective multi-stage learning, the proposed framework achieves high reliability while maintaining better accuracy. Road-test experiments with different driving maneuvers were performed in complex urban environments, and the results validate the effectiveness of the proposed framework.

Keywords: gated recurrent unit, multi-stage learning, reliable estimation, variational auto-encoder, yaw angle

Procedia PDF Downloads 135
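
A minimal PyTorch sketch of the third stage described in entry 7262 above: a GRU that consumes a sequence of per-frame latent codes (assumed to come from the road-region VAE) and regresses the deflection angle of the most recent frame. The dimensions and the single linear head are assumptions for illustration, not the paper's architecture.

```python
import torch
import torch.nn as nn

class DeflectionGRU(nn.Module):
    """Temporal model over per-frame VAE latents -> deflection angle (radians)."""
    def __init__(self, latent_dim: int = 16, hidden_dim: int = 32):
        super().__init__()
        self.gru = nn.GRU(latent_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, 1)

    def forward(self, z_seq: torch.Tensor) -> torch.Tensor:
        # z_seq: (batch, frames, latent_dim), e.g. latents of consecutive road masks
        out, _ = self.gru(z_seq)
        return self.head(out[:, -1, :]).squeeze(-1)     # angle for the latest frame

model = DeflectionGRU()
z_seq = torch.randn(4, 10, 16)                          # 4 clips, 10 frames each
deflection = model(z_seq)                               # yaw = deflection + map road direction
print(deflection.shape)                                 # torch.Size([4])
```
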
7261 An Indoor Positioning System in Wireless Sensor Networks with Measurement Delay

Authors: Pyung Soo Kim, Eung Hyuk Lee, Mun Suck Jang

Abstract:

In the current paper, an indoor positioning system is proposed with consideration of measurement delay. First, an estimation filter with a measurement delay is designed for the indoor positioning mechanism under a weighted least squares criterion, which utilizes only a finite number of measurements on the most recent window. The proposed estimation-filtering-based scheme gives filtered estimates of the position, velocity and acceleration of the moving target in real time, while removing undesired noise effects and preserving the desired moving positions. Second, the proposed scheme is shown to have good inherent properties such as unbiasedness, efficiency, time-invariance, deadbeat, and robustness due to its finite memory structure. Finally, computer simulations show that the proposed estimation-filtering-based scheme can outperform the existing infinite-memory-filtering-based mechanism.

Keywords: indoor positioning system, wireless sensor networks, measurement delay

Procedia PDF Downloads 478
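
A minimal sketch of a finite-memory weighted least squares estimator in the spirit of entry 7261 above: a constant-acceleration motion model is fitted to the most recent window of position measurements, with the known measurement delay handled by shifting the measurement time stamps. The window length, weights, delay value, and toy trajectory are illustrative assumptions.

```python
import numpy as np

def finite_memory_wls(t_meas, z_meas, delay, weights=None):
    """Estimate position, velocity, acceleration at the latest time from a finite window.

    t_meas : arrival time stamps of the most recent window of measurements
    z_meas : delayed position measurements (1-D)
    delay  : known measurement delay, so each z was actually valid at t_meas - delay
    """
    t_now = t_meas[-1]
    tau = (t_meas - delay) - t_now                  # time offsets of the true sample instants
    H = np.column_stack([np.ones_like(tau), tau, 0.5 * tau**2])   # [pos, vel, acc] model
    if weights is None:
        weights = np.ones_like(z_meas)              # could emphasize newer samples instead
    W = np.sqrt(weights)
    theta, *_ = np.linalg.lstsq(H * W[:, None], z_meas * W, rcond=None)
    return theta

# toy example: constant-acceleration target, 0.2 s measurement delay, 10-sample window
t = np.linspace(0.0, 0.9, 10)
true_pos = 1.0 + 2.0 * t + 0.5 * 0.8 * t**2
z = true_pos + np.random.default_rng(3).normal(0, 0.01, t.size)
print(finite_memory_wls(t + 0.2, z, delay=0.2))     # [position, velocity, acceleration] now
```
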
7260 An Algorithm to Compute the State Estimation of Bilinear Dynamical Systems

Authors: Abdullah Eqal Al Mazrooei

Abstract:

In this paper, we introduce a mathematical algorithm for estimating the states of bilinear systems. The algorithm uses a special linearization of the second-order term based on the best available information about the state of the system. This technique makes our algorithm a generalization of the well-known Kalman estimators. The system used here is of the bilinear class; the evolution of this model is linear-bilinear in the state of the system. Our algorithm can be used with linear and bilinear systems. We also introduce a real application of the new algorithm to demonstrate its feasibility and efficiency.

Keywords: estimation algorithm, bilinear systems, Kalman filter, second order linearization

Procedia PDF Downloads 481
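
A minimal sketch in the spirit of entry 7260 above: for a discrete-time system whose dynamics are linear-bilinear in the state and a known input, the bilinear term can be folded into the transition matrix evaluated with the best available information (here the known input), after which a standard Kalman predict/update step applies. The matrices and noise levels are hypothetical, not the paper's application.

```python
import numpy as np

# Hypothetical model: x_{k+1} = A x_k + u_k * N x_k + w_k,   y_k = H x_k + v_k
A = np.array([[0.95, 0.10], [0.00, 0.90]])
N = np.array([[0.00, 0.05], [0.02, 0.00]])   # bilinear coupling between input and state
H = np.array([[1.0, 0.0]])
Q = 0.01 * np.eye(2)
R = np.array([[0.10]])

def bilinear_kf_step(x, P, u, y):
    F = A + u * N                            # fold the bilinear term into the transition matrix
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_upd = x_pred + K @ (y - H @ x_pred)
    P_upd = (np.eye(2) - K @ H) @ P_pred
    return x_upd, P_upd

x, P = np.zeros(2), np.eye(2)
x, P = bilinear_kf_step(x, P, u=0.5, y=np.array([0.3]))
print(x, np.diag(P))
```
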
7259 Automatic Censoring in K-Distribution for Multiple Targets Situations

Authors: Naime Boudemagh, Zoheir Hammoudi

Abstract:

Parameter estimation of the K-distribution is an essential part of radar detection. In fact, the presence of interfering targets in reference cells causes a decrease in detection performance. In such situations, the estimates of the shape and scale parameters are far from the actual values. In order to avoid interfering targets, we propose an Automatic Censoring (AC) algorithm for radar interfering targets in the K-distribution. The censoring technique used in this work offers good discrimination between homogeneous and non-homogeneous environments. The homogeneous population is then used to estimate the unknown parameters by the classical Method of Moments (MOM). The AC algorithm does not need any prior information about the clutter parameters, nor does it require the number or positions of the interfering targets. The accuracy of the parameter estimates obtained by this algorithm is validated and compared to various actual values of the shape parameter using Monte Carlo simulations; the latter show that the probabilities of censoring in multiple-target situations are in good agreement.

Keywords: parameters estimation, method of moments, automatic censoring, K distribution

Procedia PDF Downloads 369
7258 Risk Factors for Defective Autoparts Products Using Bayesian Method in Poisson Generalized Linear Mixed Model

Authors: Pitsanu Tongkhow, Pichet Jiraprasertwong

Abstract:

This research investigates risk factors for defective products in autoparts factories. Under a Bayesian framework, a generalized linear mixed model (GLMM) in which the dependent variable, the number of defective products, follows a Poisson distribution is adopted. Its performance is compared with that of a Poisson GLM under a Bayesian framework. The factors considered are production process, machines, and workers. Products coded RT50 are observed. The study found that the Poisson GLMM is more appropriate than the Poisson GLM. For the production process factor, the highest risk of producing defective products is Process 1; for the machine factor, the highest risk is Machine 5; and for the worker factor, the highest risk is Worker 6.

Keywords: defective autoparts products, Bayesian framework, generalized linear mixed model (GLMM), risk factors

Procedia PDF Downloads 563
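
A minimal PyMC sketch of a Bayesian Poisson GLMM along the lines of entry 7258 above, with a fixed effect for production process and random intercepts for machine and worker. The priors, group sizes, and synthetic counts are illustrative assumptions, not the paper's RT50 data or its exact model.

```python
import numpy as np
import pymc as pm

rng = np.random.default_rng(0)
n = 300
process = rng.integers(0, 3, n)          # 3 production processes (fixed effect)
machine = rng.integers(0, 6, n)          # 6 machines (random intercepts)
worker = rng.integers(0, 8, n)           # 8 workers (random intercepts)
defects = rng.poisson(3.0, n)            # hypothetical defect counts

with pm.Model() as poisson_glmm:
    intercept = pm.Normal("intercept", 0.0, 2.0)
    b_process = pm.Normal("b_process", 0.0, 1.0, shape=3)
    sd_machine = pm.HalfNormal("sd_machine", 1.0)
    sd_worker = pm.HalfNormal("sd_worker", 1.0)
    u_machine = pm.Normal("u_machine", 0.0, sd_machine, shape=6)
    u_worker = pm.Normal("u_worker", 0.0, sd_worker, shape=8)

    log_rate = (intercept + b_process[process]
                + u_machine[machine] + u_worker[worker])
    pm.Poisson("defects", mu=pm.math.exp(log_rate), observed=defects)

    idata = pm.sample(1000, tune=1000, target_accept=0.9)  # posterior for risk comparison
```
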
7257 Modelling Hydrological Time Series Using Wakeby Distribution

Authors: Ilaria Lucrezia Amerise

Abstract:

The statistical modelling of precipitation data for a given portion of territory is fundamental for the monitoring of climatic conditions and for Hydrogeological Management Plans (HMP). This modelling is rendered particularly complex by the changes taking place in the frequency and intensity of precipitation, presumably attributable to global climate change. This paper applies the Wakeby distribution (with 5 parameters) as a theoretical reference model. The number and quality of its parameters indicate that this distribution may be an appropriate choice for the interpolation of hydrological variables; moreover, the Wakeby is particularly suitable for describing phenomena producing heavy tails. The proposed estimation methods for determining the values of the Wakeby parameters are the same as those used for density functions with heavy tails. The commonly used procedure is the classic method of probability weighted moments (PWM), although this has often shown difficulty of convergence, or rather convergence to an inappropriate parameter configuration. In this paper, we analyze the problem of the likelihood estimation of a random variable expressed through its quantile function. The method of maximum likelihood is, in this case, more demanding than in more usual estimation situations. The reasons lie in the sampling and asymptotic properties of the maximum likelihood estimators, which improve the estimates obtained by providing indications of their variability and, therefore, their accuracy and reliability. These features are highly appreciated in contexts where poor decisions, attributable to an inefficient or incomplete information base, can cause serious damage.

Keywords: generalized extreme values, likelihood estimation, precipitation data, Wakeby distribution

Procedia PDF Downloads 134
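
For reference alongside entry 7257 above: the five-parameter Wakeby distribution is usually defined through its quantile function, and the probability weighted moments mentioned in the abstract are estimated from the ordered sample. The sketch below gives the standard quantile function and the unbiased sample PWM estimator; the parameter values used for the simulated draw are arbitrary illustrations.

```python
import numpy as np

def wakeby_quantile(F, xi, alpha, beta, gamma, delta):
    """Standard Wakeby quantile function x(F) (Hosking-Wallis parameterization)."""
    F = np.asarray(F, dtype=float)
    return (xi + (alpha / beta) * (1.0 - (1.0 - F) ** beta)
               - (gamma / delta) * (1.0 - (1.0 - F) ** (-delta)))

def sample_pwm(x, r):
    """Unbiased estimator of beta_r = E[X F(X)^r] from the ordered sample."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    j = np.arange(1, n + 1)
    w = np.ones(n)
    for k in range(1, r + 1):
        w *= (j - k) / (n - k)
    return np.mean(w * x)

# Simulate precipitation-like data by inverse transform through the quantile function
rng = np.random.default_rng(5)
x = wakeby_quantile(rng.uniform(size=2000), xi=0.0, alpha=5.0, beta=0.3, gamma=1.0, delta=0.2)
print([round(sample_pwm(x, r), 3) for r in range(4)])   # b0..b3 feed the PWM fitting step
```
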
7256 A Review of Benefit-Risk Assessment over the Product Lifecycle

Authors: M. Miljkovic, A. Urakpo, M. Simic-Koumoutsaris

Abstract:

Benefit-risk assessment (BRA) is a valuable tool that takes place at multiple stages during a medicine's lifecycle, and this assessment can be conducted in a variety of ways. The aim was to summarize current BRA methods used during approval decisions and in post-approval settings and to consider possible future directions. Relevant reviews, recommendations, and guidelines published in the medical literature and by regulatory agencies over the past five years have been examined. BRA implies the review of two dimensions: the dimension of benefits (determined mainly by therapeutic efficacy) and the dimension of risks (comprising the safety profile of a drug). Regulators, industry, and academia have developed various approaches, ranging from descriptive textual (qualitative) to decision-analytic (quantitative) models, to facilitate the BRA of medicines during the product lifecycle (from Phase I trials, to the authorization procedure, post-marketing surveillance and health technology assessment for inclusion in public formularies). These approaches can be classified into the following categories: stepwise structured approaches (frameworks); measures for benefits and risks that are usually endpoint specific (metrics); simulation techniques and meta-analysis (estimation techniques); and utility survey techniques to elicit stakeholders' preferences (utilities). All these approaches share two common goals: to assist the analysis and to improve the communication of decisions, but each is subject to its own specific strengths and limitations. Before using any method, its utility, complexity, the extent to which it is established, and the ease of interpretation of its results should be considered. Despite widespread and long-standing use, BRA is subject to debate, suffers from a number of limitations, and is still under development. The use of formal, systematic structured approaches to BRA for regulatory decision-making, and of quantitative methods to support BRA during the product lifecycle, is a standard practice in medicine that is subject to continuous improvement and modernization, not only in methodology but also in cooperation between organizations.

Keywords: benefit-risk assessment, benefit-risk profile, product lifecycle, quantitative methods, structured approaches

Procedia PDF Downloads 149
7255 Pricing the Risk Associated to Weather of Variable Renewable Energy Generation

Authors: Jorge M. Uribe

Abstract:

We propose a methodology for setting the price of an insurance contract targeted at managing the risk associated with weather conditions that affect variable renewable energy generation. The methodology relies on conditional quantile regressions to estimate the weather risk of a solar panel. It is illustrated using real daily radiation and weather data for three cities in Spain (Valencia, Barcelona and Madrid) from February 2, 2004 to January 22, 2019. We also adapt the concepts of value at risk and expected shortfall from finance to this context, to provide a complete panorama of what we label weather risk. The methodology is easy to implement and can be used by insurance companies to price a contract with the aforementioned characteristics when data about similar projects and accurate cash flow projections are lacking. Our methodology assigns a higher price to an insurance product with the stated characteristics in Madrid than in Valencia and Barcelona. This is consistent with Madrid showing the largest interquartile range of operational deficits and is unrelated to the average deficit, which illustrates the importance of our proposal.

Keywords: insurance, weather, vre, risk

Procedia PDF Downloads 142
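
A minimal sketch of the conditional quantile regression behind entry 7255 above, using statsmodels to relate a solar panel's operational deficit to weather covariates and read off a VaR-style quantile. The variable names and synthetic data are placeholders for the radiation and weather series used in the paper.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(11)
n = 1500
df = pd.DataFrame({
    "cloud_cover": rng.uniform(0, 1, n),
    "temperature": rng.normal(20, 7, n),
})
# hypothetical daily generation deficit (kWh) relative to expected output
df["deficit"] = (5.0 * df["cloud_cover"] + 0.05 * (df["temperature"] - 20) ** 2
                 + rng.gamma(2.0, 1.0, n))

# 95th conditional quantile of the deficit ~ a weather-conditional VaR of lost generation
q95 = smf.quantreg("deficit ~ cloud_cover + temperature", df).fit(q=0.95)
print(q95.params)

# Expected-shortfall-style summary: mean deficit on days exceeding the fitted quantile
exceed = df["deficit"] > q95.predict(df)
print("ES(95%):", df.loc[exceed, "deficit"].mean())
```
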
7254 Real-Time Radar Tracking Based on Nonlinear Kalman Filter

Authors: Milca F. Coelho, K. Bousson, Kawser Ahmed

Abstract:

Accurately tracking an aerospace vehicle in a time-critical situation and in a highly nonlinear environment is one of the strongest interests within the aerospace community. The tracking is achieved by accurately estimating the state of a moving target, which is composed of a set of variables that can provide a complete status of the system at a given time. One of the main ingredients of good estimation performance is the use of efficient estimation algorithms. A well-known framework is the family of Kalman filtering methods, designed for prediction and estimation problems. The success of the Kalman Filter (KF) in engineering applications is mostly due to the Extended Kalman Filter (EKF), which is based on local linearization. Despite its popularity, the EKF presents several limitations. To address these limitations, and as a possible solution to tracking problems, this paper proposes the use of the Ensemble Kalman Filter (EnKF). Although the EnKF is extensively used in the context of weather forecasting and is recognized for producing accurate and computationally effective estimates for systems of very high dimension, it is almost unknown to the tracking community. The EnKF was initially proposed as an attempt to improve the error covariance calculation, which is difficult to implement in the classic Kalman Filter. Also, in the EnKF method the prediction and analysis error covariances have ensemble representations. These ensembles have sizes which limit the number of degrees of freedom, in such a way that the filter error covariance calculations are much more practical for modest ensemble sizes. In this paper, a realistic simulation of radar tracking was performed, in which the EnKF was applied and compared with the Extended Kalman Filter. The results suggest that the EnKF is a promising tool for tracking applications, offering more advantages in terms of performance.

Keywords: Kalman filter, nonlinear state estimation, optimal tracking, stochastic environment

Procedia PDF Downloads 137
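
A minimal sketch of the Ensemble Kalman Filter analysis step discussed in entry 7254 above, with perturbed observations and a gain computed from ensemble covariances. The linear position-only observation and the noise levels are simplifications; a radar tracking study would use the full nonlinear measurement model.

```python
import numpy as np

rng = np.random.default_rng(2)

def enkf_update(ensemble, y, h, R):
    """One EnKF analysis step.  ensemble: (n_members, n_state), y: observation vector."""
    n_members = ensemble.shape[0]
    Hx = np.array([h(x) for x in ensemble])                 # forecast observations
    X = ensemble - ensemble.mean(axis=0)                    # state anomalies
    Y = Hx - Hx.mean(axis=0)                                # observation anomalies
    Pxy = X.T @ Y / (n_members - 1)
    Pyy = Y.T @ Y / (n_members - 1) + R
    K = Pxy @ np.linalg.inv(Pyy)                            # ensemble Kalman gain
    y_pert = y + rng.multivariate_normal(np.zeros(len(y)), R, size=n_members)
    return ensemble + (y_pert - Hx) @ K.T                   # analysis ensemble

# toy example: state = [position, velocity], observe position only
def h(x):
    return x[:1]

truth = np.array([10.0, 1.0])
R = np.array([[0.25]])
ensemble = truth + rng.normal(0, 2.0, size=(100, 2))        # forecast ensemble
y = h(truth) + rng.normal(0, 0.5, size=1)                   # noisy radar-style measurement
analysis = enkf_update(ensemble, y, h, R)
print("prior mean:", ensemble.mean(axis=0), "posterior mean:", analysis.mean(axis=0))
```
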
7253 Groundwater Recharge Estimation of Fetam Catchment in Upper Blue Nile Basin North-Western Ethiopia

Authors: Mekonen G., Sileshi M., Melkamu M.

Abstract:

Recharge estimation is important for the effective assessment and management of groundwater resources. This study applied the soil moisture balance and baseflow separation methods to estimate groundwater recharge in the Fetam Catchment, one of the major catchments studied among the different catchments of the upper Blue Nile River basin. Surface water is subject to high seasonal variation; because of this, groundwater is a primary option for drinking water supply to the community. This research estimates groundwater recharge using fifteen years of river flow data for the baseflow separation method and ten years of daily meteorological data for the daily soil moisture balance method. The recharge rates obtained by the two methods are 170.5 and 244.9 mm/year for the daily soil moisture balance and baseflow separation methods, respectively, and the average recharge is 207.7 mm/year. The average annual recharge in the catchment is almost equal to the national average recharge of 200 mm/year. Since each method has its own limitations, taking the average value is preferable to relying on a single value: baseflow separation provides an overestimate relative to the average of the two, and the soil moisture balance is the lowest estimator. Recharge estimation in the area should also be carried out with other estimation methods.

Keywords: groundwater, recharge, baseflow separation, soil moisture balance, Fetam catchment

Procedia PDF Downloads 352
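
The baseflow separation in entry 7253 above can be done in several ways; one widely used option is the Lyne-Hollick one-parameter recursive digital filter sketched below (the abstract does not state which separation technique the authors applied, so this is only an illustration). Recharge is then often approximated from the separated baseflow volume.

```python
import numpy as np

def lyne_hollick_baseflow(q, alpha=0.925):
    """Single forward pass of the Lyne-Hollick recursive digital filter.

    q     : daily streamflow series
    alpha : filter parameter, commonly 0.9-0.95
    """
    q = np.asarray(q, dtype=float)
    quick = np.zeros_like(q)
    for k in range(1, q.size):
        quick[k] = alpha * quick[k - 1] + 0.5 * (1 + alpha) * (q[k] - q[k - 1])
        quick[k] = min(max(quick[k], 0.0), q[k])      # constrain quickflow to [0, q]
    return q - quick                                   # baseflow component

# toy series: low flow with one storm event
q = np.concatenate([np.full(10, 5.0), [25, 60, 40, 20, 10], np.full(15, 6.0)])
baseflow = lyne_hollick_baseflow(q)
bfi = baseflow.sum() / q.sum()                         # baseflow index
print(f"baseflow index: {bfi:.2f}")                    # proxy for the recharge fraction
```
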
7252 Evaluating Performance of Value at Risk Models for the MENA Islamic Stock Market Portfolios

Authors: Abderrazek Ben Maatoug, Ibrahim Fatnassi, Wassim Ben Ayed

Abstract:

In this paper we investigate the issue of market risk quantification for Middle East and North Africa (MENA) Islamic equity markets. We use Value-at-Risk (VaR) as a measure of potential risk in Islamic stock markets, for long and short positions, based on the RiskMetrics model and conditional parametric ARCH-class volatility models with normal, Student and skewed Student distributions. The sample consists of daily data for 2006-2014 for 11 Islamic stock market indices. We conduct Kupiec and Engle and Manganelli tests to evaluate the performance of each model. The main findings of our empirical results show that (i) VaR models based on the Student and skewed Student distributions perform best, at the significance level of α=1%, for all Islamic stock market indices and for both long and short trading positions, and (ii) the RiskMetrics model and the VaR model based on conditional volatility with a normal distribution provide the most accurate VaR estimations for both long and short trading positions at a significance level of α=5%.

Keywords: value-at-risk, risk management, Islamic finance, GARCH models

Procedia PDF Downloads 589
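
A minimal sketch of the Kupiec unconditional coverage (proportion-of-failures) test used to evaluate the VaR models in entry 7252 above. The violation count and sample size are hypothetical; the Engle and Manganelli dynamic quantile test would require the full hit sequence and is not shown.

```python
import numpy as np
from scipy import stats

def kupiec_pof(n_obs, n_violations, p):
    """Kupiec proportion-of-failures LR test for VaR coverage level p."""
    x, T = n_violations, n_obs
    pi_hat = x / T
    log_l0 = (T - x) * np.log(1 - p) + x * np.log(p)               # H0: violation rate = p
    log_l1 = (T - x) * np.log(1 - pi_hat) + x * np.log(pi_hat)     # unrestricted
    lr = -2.0 * (log_l0 - log_l1)
    p_value = 1.0 - stats.chi2.cdf(lr, df=1)
    return lr, p_value

# hypothetical backtest: 1000 days, 1% VaR, 20 violations observed
lr, p_value = kupiec_pof(n_obs=1000, n_violations=20, p=0.01)
print(f"LR = {lr:.2f}, p-value = {p_value:.3f}")   # a p-value below 0.05 rejects correct coverage
```
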
7251 Correlations between Obesity Indices and Cardiometabolic Risk Factors in Obese Subgroups in Severely Obese Women

Authors: Seung Hun Lee, Sang Yeoup Lee

Abstract:

Objectives: To investigate associations between obesity indices and cardiometabolic risk factors across degrees of obesity. Methods: BMI, waist circumference (WC), fasting insulin, fasting glucose, lipids, and visceral adipose tissue (VAT) area (using computed tomographic images) were measured in 113 obese women without cardiovascular disease (CVD). Correlations between obesity indices and cardiometabolic risk factors were analyzed in obese subgroups defined using sequential obesity indices. Results: Mean BMI and WC were 29.6 kg/m2 and 92.8 cm. BMI showed significant correlations with all five cardiometabolic risk factors up to a BMI cut-off point of 27 kg/m2, but when BMI exceeded 30 kg/m2, the correlations no longer existed. WC was significantly correlated with all five cardiometabolic risk factors up to a value of 85 cm, but when WC exceeded 90 cm, the correlations no longer existed. Conclusions: Our data suggest that moderate weight-loss goals may not be enough to ameliorate cardiometabolic markers in severely obese patients. Therefore, individualized weight-loss goals should be recommended to such patients to improve health benefits.

Keywords: correlation, cardiovascular disease, risk factors, obesity

Procedia PDF Downloads 348
7250 Parameter Estimation for the Mixture of Generalized Gamma Model

Authors: Wikanda Phaphan

Abstract:

The mixture generalized gamma distribution is a combination of two distributions: the generalized gamma distribution and the length-biased generalized gamma distribution. These two distributions were presented by Suksaengrakcharoen and Bodhisuwan in 2014. The findings showed that the probability density function (pdf) is fairly complex, which creates problems in estimating parameters. The difficulty in parameter estimation is that the estimators cannot be expressed in closed form, so numerical estimation is used to find them. In this study, we present a new method of parameter estimation using the expectation-maximization (EM) algorithm, the conjugate gradient method, and the quasi-Newton method. The data were generated by the acceptance-rejection method and are used for estimating α, β, λ and p, where λ is the scale parameter, p is the weight parameter, and α and β are the shape parameters. We use the Monte Carlo technique to assess the estimators' performance. Setting the sample size to 10, 30 and 100, the simulations were repeated 20 times in each case. We evaluated the effectiveness of the estimators by considering the mean squared errors and the bias. The findings revealed that the EM algorithm gives estimates close to the actual values. Also, the maximum likelihood estimators obtained via the conjugate gradient and quasi-Newton methods are less precise than those obtained via the EM algorithm.

Keywords: conjugate gradient method, quasi-Newton method, EM-algorithm, generalized gamma distribution, length biased generalized gamma distribution, maximum likelihood method

Procedia PDF Downloads 214
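
A minimal sketch of an EM loop for a two-component generalized gamma mixture, in the spirit of entry 7250 above. For simplicity both components are plain scipy gengamma densities (the paper's second component is the length-biased form), and each M-step maximizes a weighted log-likelihood numerically; the starting values and data are synthetic.

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(1)
x = np.concatenate([
    stats.gengamma.rvs(2.0, 1.5, scale=1.0, size=300, random_state=rng),
    stats.gengamma.rvs(4.0, 1.0, scale=2.0, size=200, random_state=rng),
])

def weighted_mle(data, w, start):
    """Weighted MLE of one gengamma component (shape a, shape c, scale)."""
    def nll(log_theta):
        a, c, scale = np.exp(log_theta)               # keep parameters positive
        return -np.sum(w * stats.gengamma.logpdf(data, a, c, scale=scale))
    res = optimize.minimize(nll, np.log(start), method="Nelder-Mead")
    return np.exp(res.x)

p = 0.5                                               # mixture weight parameter
theta1, theta2 = np.array([1.5, 1.0, 1.0]), np.array([3.0, 1.0, 2.5])
for _ in range(30):                                   # EM iterations
    f1 = p * stats.gengamma.pdf(x, theta1[0], theta1[1], scale=theta1[2])
    f2 = (1 - p) * stats.gengamma.pdf(x, theta2[0], theta2[1], scale=theta2[2])
    r = f1 / (f1 + f2)                                # E-step: responsibilities
    p = r.mean()                                      # M-step: weight, then components
    theta1 = weighted_mle(x, r, theta1)
    theta2 = weighted_mle(x, 1 - r, theta2)

print(round(p, 2), np.round(theta1, 2), np.round(theta2, 2))
```
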