Search results for: risk estimation
7530 Proposal of a Model Supporting Decision-Making Based on Multi-Objective Optimization Analysis on Information Security Risk Treatment
Authors: Ritsuko Kawasaki (Aiba), Takeshi Hiromatsu
Abstract:
Management is required to understand all information security risks within an organization and to decide which information security risks should be treated, at what level, and at what cost. However, such decision-making is not usually easy, because the various measures for risk treatment must be selected at suitable application levels. In addition, some measures may have objectives that conflict with each other, which further complicates the selection. Moreover, risks generally exhibit trends, which should also be considered in risk treatment. Therefore, this paper provides an extension of the model proposed in the previous study. The original model supports the selection of measures by applying a combination of the weighted-average and goal-programming methods for multi-objective analysis to find an optimal solution. The extended model adds weights to the risks, where a larger weight indicates a higher-priority risk.
Keywords: information security risk treatment, selection of risk measures, risk acceptance, multi-objective optimization
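The weighted-average/goal-programming combination the abstract describes can be sketched as follows. This is an illustrative toy, not the paper's model: the measures, their costs, risk reductions, goals, and weights are all hypothetical, and the optimum is found by brute-force enumeration rather than a dedicated solver.

```python
# Toy multi-objective measure selection: pick an application level
# (0 = not applied, 1 = partial, 2 = full) for each security measure so that
# weighted deviations from a cost goal and a risk-reduction goal are minimized.
# All numbers are hypothetical illustrations.
from itertools import product

measures = {                      # (cost, risk reduction) per application level
    "firewall":   [(0, 0), (3, 4), (5, 6)],
    "training":   [(0, 0), (2, 3), (4, 5)],
    "encryption": [(0, 0), (4, 5), (7, 8)],
}
COST_GOAL, RISK_GOAL = 8, 12      # target budget and target total risk reduction
W_COST, W_RISK = 1.0, 2.0         # the risk goal is weighted more heavily

def deviation(levels):
    """Weighted sum of budget overrun and risk-reduction shortfall (goal programming)."""
    cost = sum(measures[m][l][0] for m, l in zip(measures, levels))
    red = sum(measures[m][l][1] for m, l in zip(measures, levels))
    return W_COST * max(0, cost - COST_GOAL) + W_RISK * max(0, RISK_GOAL - red)

best = min(product(range(3), repeat=len(measures)), key=deviation)
print(dict(zip(measures, best)), deviation(best))  # partial levels everywhere win here
```

With these numbers, applying every measure at the partial level slightly exceeds the budget but meets the risk goal, which is the cheapest compromise under the chosen weights.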
Procedia PDF Downloads 462
7529 Light-Weight Network for Real-Time Pose Estimation
Authors: Jianghao Hu, Hongyu Wang
Abstract:
An effective and efficient human pose estimation algorithm is an important requirement for real-time human pose estimation on mobile devices. This paper proposes a light-weight human key-point detection algorithm, Light-Weight Network for Real-Time Pose Estimation (LWPE). LWPE uses a light-weight backbone network and depthwise separable convolutions to reduce parameters and lower latency. LWPE uses a feature pyramid network (FPN) to fuse the high-resolution, semantically weak features with the low-resolution, semantically strong features. In the meantime, with multi-scale prediction, the result predicted from the low-resolution feature map is stacked onto the adjacent higher-resolution feature map to provide intermediate supervision of the network and continuously refine the results. In the last step, the key-point coordinates predicted at the highest resolution are used as the final output of the network. For the key points that are difficult to predict, LWPE adopts an online hard key-point mining strategy to focus on them. The proposed algorithm achieves excellent performance on the single-person dataset selected from the AI (artificial intelligence) challenge dataset. The algorithm maintains high-precision performance even though the model contains only 3.9M parameters, and it can run at 225 frames per second (FPS) on a generic graphics processing unit (GPU).
Keywords: depthwise separable convolutions, feature pyramid network, human pose estimation, light-weight backbone
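The parameter saving that depthwise separable convolutions provide can be made concrete with a short sketch (this is not the authors' LWPE code; channel counts are illustrative): a per-channel 3x3 depthwise convolution followed by a 1x1 pointwise convolution replaces a standard 3x3 convolution at a fraction of the parameters.

```python
# Depthwise separable convolution vs. a standard 3x3 convolution in PyTorch.
# A standard 3x3 conv from 64 to 128 channels needs 128*64*3*3 = 73728 weights;
# the separable version needs 64*3*3 (depthwise) + 128*64 (pointwise) = 8768.
import torch.nn as nn

def depthwise_separable(in_ch, out_ch):
    return nn.Sequential(
        nn.Conv2d(in_ch, in_ch, 3, padding=1, groups=in_ch, bias=False),  # depthwise
        nn.Conv2d(in_ch, out_ch, 1, bias=False),                          # pointwise
    )

def n_params(module):
    return sum(p.numel() for p in module.parameters())

standard = nn.Conv2d(64, 128, 3, padding=1, bias=False)
separable = depthwise_separable(64, 128)
print(n_params(standard), n_params(separable))  # 73728 vs 8768
```

The roughly 8x reduction here is what lets a 3.9M-parameter network stay accurate while running at real-time rates.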
Procedia PDF Downloads 154
7528 Conflation Methodology Applied to Flood Recovery
Authors: Eva L. Suarez, Daniel E. Meeroff, Yan Yong
Abstract:
Current flood risk modeling focuses on resilience, defined as the probability of recovery from a severe flooding event. However, the long-term damage to property and well-being caused by nuisance flooding, and its effects on communities, are not typically included in risk assessments. An approach was developed to address the probability of recovering from a severe flooding event combined with the probability of community performance during a nuisance event. A consolidated model, namely the conflation flooding recovery (&FR) model, evaluates risk-coping mitigation strategies for communities based on the recovery time from catastrophic events, such as hurricanes or extreme surges, and from everyday nuisance flooding events. The &FR model assesses the contribution of each independent input's variation and generates a weighted output that favors the distribution with minimum variation. This approach is especially useful if the input distributions have dissimilar variances. The &FR is defined as a single distribution resulting from the product of the individual probability density functions. The resulting conflated distribution resides between the parent distributions, and it infers the recovery time required by a community to return to basic functions, such as power, utilities, transportation, and civil order, after a flooding event. The &FR model is more accurate than averaging individual observations before calculating the mean and variance, or than averaging the probabilities evaluated at the input values, which assigns the same weighted variation to each input distribution. The main disadvantage of these traditional methods is that the resulting measure of central tendency is exactly equal to the average of the input distributions' means, without the additional information provided by each individual distribution's variance.
When dealing with exponential distributions, such as resilience from severe flooding events and from nuisance flooding events, conflation results are equivalent to the weighted least-squares method or best linear unbiased estimation. The combination of severe flooding risk with nuisance flooding improves flood risk management for highly populated coastal communities, such as in South Florida, USA, and provides a method to estimate community flood recovery time more accurately from two different sources: severe flooding events and nuisance flooding events.
Keywords: community resilience, conflation, flood risk, nuisance flooding
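The core conflation operation, the normalized product of the individual PDFs, can be checked numerically for the exponential case the abstract highlights. The rates below are illustrative, not from the paper; for two exponentials the product exp(-a*x)*exp(-b*x) is proportional to exp(-(a+b)*x), so the conflation is itself exponential with the summed rate and therefore leans toward the lower-variance parent.

```python
# Conflation of two exponential recovery-time PDFs: normalize their product
# and compare against the closed form Exp(a + b). Rates are illustrative.
import numpy as np

a, b = 0.5, 2.0                              # severe-event vs nuisance-event rates
dx = 1e-3
x = np.arange(0.0, 20.0, dx)

f1 = a * np.exp(-a * x)                      # severe-event recovery ~ Exp(a)
f2 = b * np.exp(-b * x)                      # nuisance-event recovery ~ Exp(b)

product = f1 * f2
conflated = product / (product.sum() * dx)   # normalized product of the PDFs

expected = (a + b) * np.exp(-(a + b) * x)    # closed form: Exp(a + b)
print(np.abs(conflated - expected).max())    # small numerical error only
```

Note the conflated mean, 1/(a + b), is smaller than either parent mean, reflecting the weighting toward the distribution with minimum variation.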
Procedia PDF Downloads 105
7527 Remote Sensing and GIS Integration for Paddy Production Estimation in Bali Province, Indonesia
Authors: Sarono, Hamim Zaky Hadibasyir, and Ridho Kurniawan
Abstract:
Estimation of paddy production is one of the areas that can be examined using remote sensing and geographic information system (GIS) techniques in agriculture. The purpose of this research is to estimate paddy production and to show how remote sensing and GIS can support such estimation in the Tegalallang and Payangan Sub-districts, Bali Province, Indonesia. The method used is land suitability analysis. This method associates physical parameters, embodied in the smallest mapping unit representing a particular field, with that field's productivity. The production estimate is derived from the FAO standard land suitability using a matching technique. The parameters used to create the land units are slope (FAO), climate classification (Oldeman), landform (Prapto Suharsono), and soil type. The land use map, consisting of paddy and non-paddy field information, was obtained from GeoEye-1 imagery using visual interpretation. Landsat imagery was used for landform interpretation; the slope classification was obtained from high-point identification with spline interpolation, whereas the climate and soil data are secondary data originating from the related institutions. The results indicate that the wetland suitability of the Tegallalang and Payangan Districts consists of S1 (very suitable), covering an area of 2,884.7 ha with a productivity of 5 tons/ha, and S2 (suitable), covering an area of 482.9 ha with a productivity of 3 tons/ha. The resulting paddy production estimate for both districts is 31,744.3 tons per year.
Keywords: production estimation, paddy, remote sensing, geography information system, land suitability
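The reported total can be reproduced from the stated areas and productivities if one assumes two cropping seasons per year, which is common for Balinese wetland paddy but is not stated explicitly in the abstract; the check below is therefore a hedged back-of-envelope calculation, not part of the study.

```python
# Back-of-envelope check of the reported annual production, assuming two
# cropping seasons per year (an assumption, not stated in the abstract).
s1_area, s1_yield = 2884.7, 5      # ha, tons/ha (S1, very suitable)
s2_area, s2_yield = 482.9, 3       # ha, tons/ha (S2, suitable)

per_season = s1_area * s1_yield + s2_area * s2_yield
per_year = 2 * per_season
print(per_season, per_year)        # 15872.2 tons/season, 31744.4 tons/year
```

The result, 31,744.4 tons, matches the reported 31,744.3 tons up to rounding.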
Procedia PDF Downloads 342
7526 Bereavement Risk Assessment of Family Caregivers of Patients with Cancer: Relationship between Bereavement Risk and Post-Loss Psychological Distress
Authors: Tomohiro Uchida, Noriaki Satake, Toshimichi Nakaho, Akira Inoue, Hidemitsu Saito
Abstract:
In this study, we assessed the bereavement risk of family caregivers of patients with cancer. In the palliative care unit of Tohoku University Hospital, we conducted a family psychoeducation session to support the family caregivers of patients with cancer. A total of 50 participants (8 males and 42 females; mean age = 62.98 years, SD = 11.10) were assessed after the session for bereavement risk using the Japanese version of the Bereavement Risk Assessment Tool (BRAT-J). According to the BRAT-J scores, eight participants were considered to have no known risk (Level 1), seventeen minimal risk (Level 2), twenty low risk (Level 3), four moderate risk (Level 4), and one high risk (Level 5). Of these participants, seven completed the follow-up postal survey that assessed their psychological distress (the Kessler Psychological Distress Scale: K6) for comparison with the bereavement risk. According to the K6 scores, three-fourths of the individuals considered to be at Level 3 on the BRAT-J scored higher than the cutoff point (>10) for the detection of depressive disorder. On the other hand, one-third of the individuals considered to be at Level 2 on the BRAT-J scored higher than the cutoff point. Therefore, it appears that the BRAT-J can predict the likelihood of difficulties or complications in bereaved family caregivers. This research was approved by the Ethics Committee of Tohoku University Graduate School of Medicine and Tohoku University Hospital.
Keywords: palliative care, family caregivers, bereavement risk, BRAT, post-loss psychological distress
Procedia PDF Downloads 458
7525 Measures for Earthquake Risk Reduction in Algeria
Authors: Farah Lazzali, Yamina Ait Meziane
Abstract:
Recent earthquakes in Algeria have demonstrated the need for seismic risk reduction. In fact, the latest major earthquake, which affected the Algiers-Boumerdes region in 2003, caused extensive loss of life and property. Economic, social, and environmental damage were also experienced. During the three days following the event, relatively weak coordination among public authorities was noted. Many localities did not receive any relief due to a lack of information from the authorities concerned and delays in reconnecting damaged roads. Following this event, the Algerian government and civil society have recognized the urgent need for an appropriate and immediate seismic risk mitigation strategy. This paper describes procedures for emergency response following past earthquakes in Algeria and provides a brief review of risk mitigation activities since 1980. The paper also aims to provide measures to reduce earthquake risk through a general strategy and the practical implementation of mitigation actions.
Keywords: earthquake, hazard, prevention, strategy, risk reduction
Procedia PDF Downloads 531
7524 Estimation and Forecasting with a Quantile AR Model for Financial Returns
Authors: Yuzhi Cai
Abstract:
This talk presents a Bayesian approach to quantile autoregressive (QAR) time series model estimation and forecasting. We establish that the joint posterior distribution of the model parameters and future values is well defined. The associated MCMC algorithm for parameter estimation and forecasting converges to the posterior distribution quickly. We also present a forecast-combination technique that produces more accurate out-of-sample forecasts by using a weighted sequence of fitted QAR models. A moving-window method to check the quality of the estimated conditional quantiles is developed. We verify our methodology using simulation studies and then apply it to currency exchange rate data. An application of the method to USD-to-GBP daily currency exchange rates will also be discussed. The results obtained show that an unequally weighted combining method performs better than the other forecasting methodologies considered.
Keywords: combining forecasts, MCMC, quantile modelling, quantile forecasting, predictive density functions
Procedia PDF Downloads 347
7523 Monitoring Systemic Risk in the Hedge Fund Sector
Authors: Frank Hespeler, Giuseppe Loiacono
Abstract:
We propose measures for systemic risk generated through intra-sectorial interdependencies in the hedge fund sector. These measures are based on variations in the average cross-effects of funds showing significant interdependency between their individual returns and the moments of the sector's return distribution. The proposed measures display a high ability to identify periods of financial distress, are robust to modifications in the underlying econometric model, and are consistent with an intuitive interpretation of the results.
Keywords: hedge funds, systemic risk, vector autoregressive model, risk monitoring
Procedia PDF Downloads 326
7522 Integrated Risk Management as a Framework for Organisational Success
Authors: Olakunle Felix Adekunle
Abstract:
Risk management is recognised as an essential tool for tackling the inevitable uncertainty associated with business and projects at all levels. But it frequently fails to meet expectations, with projects continuing to run late, over budget, or underperforming, and businesses not gaining the expected benefits. The evident disconnect which often occurs between strategic vision and tactical project delivery typically arises from poorly defined project objectives and inadequate attention to the proactive management of risks that could affect those objectives. One of the main failings in the traditional approach to risk management arises from a narrow focus on the downside, restricted to the technical or operational field, addressing tactical threats to processes, performance, or people. This shortcoming can be overcome by widening the scope of risk management to encompass both strategic risks and upside opportunities, creating an integrated approach which can bridge the gap between strategy and tactics. Integrated risk management addresses risk across a variety of levels in the organisation, including strategy and tactics, and covers both opportunity and threat. Effective implementation of integrated risk management can produce a number of benefits to the organisation which are not available from the typical limited-scope risk process. This paper explores how to expand risk management to deliver strategic advantage while retaining its use as a tactical tool.
Keywords: risk management, success, organisation, strategy, project, tactics, vision
Procedia PDF Downloads 399
7521 Estimation of OPC, Fly Ash and Slag Contents in Blended and Composite Cements by Selective Dissolution Method
Authors: Suresh Palla
Abstract:
This research paper presents the results of a study on the estimation of fly ash, slag, and cement contents in blended and composite cements by a novel selective dissolution method. The types of cement samples investigated include OPC with fly ash as a performance improver, OPC with slag as a performance improver, PPC, PSC, and composite cement conforming to the respective Indian Standards. The slag and OPC contents in PSC were estimated by selectively dissolving OPC in stage 1 and selectively dissolving slag in stage 2. In the case of the composite cement sample, the percentages of cement, slag, and fly ash were estimated systematically by selective dissolution of cement, slag, and fly ash in three stages. In the first stage, the cement is dissolved and separated, leaving a residue of slag and fly ash, designated R1. The second stage involves gravimetric estimation of the OPC fraction and the residue, together with selective dissolution of the fly ash and slag contents; the fly ash content, R2, is estimated through gravimetric analysis. Thereafter, the difference between R1 and R2 is taken as the slag content. The cement, fly ash, and slag contents obtained by the selective dissolution method agreed with the corresponding constituent percentages to within a standard deviation of 10%. The results suggest that this novel selective dissolution method can be successfully used for the estimation of OPC and SCM contents in different types of cement.
Keywords: selective dissolution method, fly ash, GGBFS slag, EDTA
Procedia PDF Downloads 157
7520 A Risk Pathway of Distal and Proximal Factors for Self-Injury among Adolescents
Authors: Sarit Gideoni Cohen
Abstract:
The aim of the study was to examine a possible risk pathway initiated by the distal risk factors of insecure attachment to the mother, the father, and peers, and then developed by means of the proximal risk factors of stressful life events and emotional distress. 275 participants (aged 13-26) from high schools, youth groups, and a university were recruited. Twenty-two percent of participants reported at least one episode of self-injury. The relationships between paternal and peer attachment and self-injury were partly mediated by stressful life events and depressive symptoms. The influence of paternal and peer attachment during adolescence as contributing to a risk pathway for self-injury was acknowledged.
Keywords: self-injury, attachment, depression, stressful life-events, adolescence
Procedia PDF Downloads 230
7519 Polynomially Adjusted Bivariate Density Estimates Based on the Saddlepoint Approximation
Authors: S. B. Provost, Susan Sheng
Abstract:
An alternative bivariate density estimation methodology is introduced in this presentation. The proposed approach involves estimating the density function associated with the marginal distribution of each of the two variables by means of the saddlepoint approximation technique and applying a bivariate polynomial adjustment to the product of these density estimates. Since the saddlepoint approximation is utilized in the context of density estimation, such estimates are determined from empirical cumulant-generating functions. In the univariate case, the saddlepoint density estimate is itself adjusted by a polynomial. Given a set of observations, the coefficients of the polynomial adjustments are obtained from the sample moments. Several illustrative applications of the proposed methodology will be presented. Since this approach relies essentially on a determinate number of sample moments, it is particularly well suited for modeling massive data sets.
Keywords: density estimation, empirical cumulant-generating function, moments, saddlepoint approximation
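The building block of this approach, a saddlepoint density estimate from an empirical cumulant-generating function (CGF), can be sketched in a few lines. This is a minimal univariate illustration on simulated data, without the polynomial adjustment step the paper adds; indeed, the raw saddlepoint density is not normalized, which is one motivation for adjusting it.

```python
# Univariate saddlepoint density estimate from the empirical CGF
# K(t) = log mean(exp(t*X)), evaluated on an Exp(1) sample. The saddle s
# solves K'(s) = x; the estimate is exp(K(s) - s*x) / sqrt(2*pi*K''(s)).
import numpy as np

rng = np.random.default_rng(1)
sample = rng.exponential(scale=1.0, size=50_000)

def K(t):                       # empirical cumulant-generating function
    return np.log(np.mean(np.exp(t * sample)))

def K1(t, h=1e-4):              # K'(t) by central finite difference
    return (K(t + h) - K(t - h)) / (2 * h)

def K2(t, h=1e-4):              # K''(t) by central finite difference
    return (K(t + h) - 2 * K(t) + K(t - h)) / h**2

def saddlepoint_density(x):
    lo, hi = -50.0, 0.999       # MGF of Exp(1) diverges at t >= 1
    for _ in range(60):         # bisection for K'(s) = x
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if K1(mid) < x else (lo, mid)
    s = 0.5 * (lo + hi)
    return np.exp(K(s) - s * x) / np.sqrt(2 * np.pi * K2(s))

print(saddlepoint_density(1.0), np.exp(-1.0))   # estimate vs true Exp(1) density
```

At x = 1 the estimate lands near 0.40 versus the true density of about 0.37; the systematic gap comes from the missing normalization, which the paper's polynomial adjustment is designed to absorb.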
Procedia PDF Downloads 280
7518 Motion Estimator Architecture with Optimized Number of Processing Elements for High Efficiency Video Coding
Authors: Seongsoo Lee
Abstract:
Motion estimation accounts for the heaviest computation in HEVC (high efficiency video coding). Many fast algorithms, such as TZS (test zone search), have been proposed to reduce this computation. Still, the huge computational load of motion estimation is a critical issue in the implementation of an HEVC video codec. In this paper, a motion estimator architecture with an optimized number of PEs (processing elements) is presented by exploiting early termination. It also reduces hardware size by exploiting parallel processing. The presented motion estimator architecture has 8 PEs, and it can efficiently perform TZS with very high utilization of the PEs.
Keywords: motion estimation, test zone search, high efficiency video coding, processing element, optimization
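The early termination the architecture exploits can be illustrated with block matching in software (this is not the TZS search order or the hardware design itself; frames, block size, and search range are made up): the sum of absolute differences (SAD) for a candidate is abandoned as soon as its running total can no longer beat the best SAD found so far.

```python
# Full-search block matching with row-wise SAD early termination.
# The current frame is the reference shifted by (2, 3), so the best match
# for a block in the current frame lies at displacement (-2, -3) in the reference.
import numpy as np

def sad_early_exit(block, cand, best_so_far):
    """Row-by-row SAD that stops once it cannot beat best_so_far."""
    total = 0
    for brow, crow in zip(block, cand):
        total += int(np.abs(brow.astype(int) - crow.astype(int)).sum())
        if total >= best_so_far:          # early termination: candidate rejected
            return None
    return total

rng = np.random.default_rng(2)
ref = rng.integers(0, 256, (64, 64), dtype=np.uint8)
cur = np.roll(ref, (2, 3), axis=(0, 1))   # current frame = reference shifted

block = cur[16:24, 16:24]                 # 8x8 block to search for
best, best_mv = float("inf"), None
for dy in range(-4, 5):
    for dx in range(-4, 5):
        cand = ref[16 + dy:24 + dy, 16 + dx:24 + dx]
        s = sad_early_exit(block, cand, best)
        if s is not None:
            best, best_mv = s, (dy, dx)
print(best_mv, best)                      # (-2, -3) with SAD 0
```

In hardware, the same idea lets a PE drop out of a candidate evaluation early, which is what allows the architecture to cover the search with fewer PEs at high utilization.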
Procedia PDF Downloads 365
7517 Human Posture Estimation Based on Multiple Viewpoints
Authors: Jiahe Liu, Hongyang Yu, Feng Qian, Miao Luo
Abstract:
This study aimed to address the problem of improving the confidence of key points by fusing multi-view information, thereby estimating human posture more accurately. We first obtained multi-view image information and then used the MvP algorithm to fuse this multi-view information to obtain a set of high-confidence human key points. We used these as the input for a spatio-temporal graph convolutional network (ST-GCN). ST-GCN is a deep learning model for processing spatio-temporal data, which can effectively capture spatio-temporal relationships in video sequences. By using the MvP algorithm to fuse multi-view information and inputting it into the spatio-temporal graph convolution model, this study provides an effective method to improve the accuracy of human posture estimation, along with strong support for further research and application in related fields.
Keywords: multi-view, pose estimation, ST-GCN, joint fusion
Procedia PDF Downloads 70
7516 Low Complexity Carrier Frequency Offset Estimation for Cooperative Orthogonal Frequency Division Multiplexing Communication Systems without Cyclic Prefix
Authors: Tsui-Tsai Lin
Abstract:
Cooperative orthogonal frequency division multiplexing (OFDM) transmission, which possesses the advantages of better connectivity, expanded coverage, and resistance to frequency-selective fading, has become a powerful solution for the physical layer in wireless communications. However, such a hybrid scheme suffers from the carrier frequency offset (CFO) effects inherited from OFDM-based systems, which lead to a significant degradation in performance. In addition, the insertion of a cyclic prefix (CP) at the head of each symbol block to combat inter-symbol interference reduces spectral efficiency. The design of CFO estimation for cooperative OFDM systems without a CP remains an open problem. This motivates us to develop a low-complexity CFO estimator for the cooperative OFDM decode-and-forward (DF) communication system without a CP over multipath fading channels. Specifically, using a block-type pilot, the CFO estimate is first derived in accordance with the least-squares criterion. Reliable performance can be obtained through an exhaustive two-dimensional (2D) search, at the penalty of heavy computational complexity. As a remedy, an alternative solution realized with an iterative approach is proposed for the CFO estimation. In contrast to the 2D-search estimator, the iterative method enjoys the advantage of substantially reduced implementation complexity without sacrificing estimation performance. Computer simulations are presented to demonstrate the efficacy of the proposed CFO estimation.
Keywords: cooperative transmission, orthogonal frequency division multiplexing (OFDM), carrier frequency offset, iteration
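This is not the paper's least-squares or iterative estimator, but a minimal illustration of why a pilot makes the CFO observable: with a pilot built from two identical halves, a CFO of eps subcarrier spacings rotates the second half relative to the first by pi*eps, so eps can be read off a correlation angle (the classic Moose-style estimate, unambiguous only for |eps| < 1). All signal parameters below are illustrative.

```python
# Moose-style CFO estimate from a repeated pilot block (simulated baseband).
import numpy as np

rng = np.random.default_rng(3)
N = 256                                   # FFT size; pilot = two identical halves
half = (rng.standard_normal(N // 2) + 1j * rng.standard_normal(N // 2)) / np.sqrt(2)
pilot = np.concatenate([half, half])

eps = 0.21                                # true CFO in subcarrier spacings
n = np.arange(N)
rx = pilot * np.exp(2j * np.pi * eps * n / N)          # CFO rotates the samples
rx += 0.01 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))  # light noise

corr = np.vdot(rx[:N // 2], rx[N // 2:])  # sum of conj(first half) * second half
eps_hat = np.angle(corr) / np.pi          # phase of the correlation -> CFO
print(eps_hat)                            # close to 0.21
```

The paper's contribution is to obtain comparable accuracy without a CP and without the exhaustive 2D search, but the correlation structure above is the signal property every such estimator relies on.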
Procedia PDF Downloads 268
7515 Parameter Estimation of False Dynamic EIV Model with Additive Uncertainty
Authors: Dalvinder Kaur Mangal
Abstract:
For the past decade, noise-corrupted output measurements have been a fundamental research problem to be investigated. On the other hand, the estimation of the parameters of linear dynamic systems when the input is also affected by noise is recognized as a more difficult problem, which has only recently received increasing attention. Representations where errors or measurement noises/disturbances are present on both the inputs and outputs are usually called errors-in-variables (EIV) models. These disturbances may also have additive effects, which are also considered in this paper. Parameter estimation of the false EIV problem using equation-error, output-error, and iterative prefiltering identification schemes, with and without additive uncertainty, when only the output observation is corrupted by noise, is dealt with in this paper. A comparative study of these three schemes has also been carried out.
Keywords: errors-in-variables (EIV), false EIV, equation error, output error, iterative prefiltering, Gaussian noise
Procedia PDF Downloads 497
7514 High-Resolution Flood Hazard Mapping Using Two-Dimensional Hydrodynamic Model Anuga: Case Study of Jakarta, Indonesia
Authors: Hengki Eko Putra, Dennish Ari Putro, Tri Wahyu Hadi, Edi Riawan, Junnaedhi Dewa Gede, Aditia Rojali, Fariza Dian Prasetyo, Yudhistira Satya Pribadi, Dita Fatria Andarini, Mila Khaerunisa, Raditya Hanung Prakoswa
Abstract:
Catastrophe risk management can only be done if we are able to calculate the exposed risks. Jakarta is an important city economically, socially, and politically, and at the same time it is exposed to severe floods. On the other hand, flood risk calculation is still very limited in the area. This study calculated the flood risk for Jakarta using the 2-dimensional model ANUGA. The 2-dimensional model ANUGA and the 1-dimensional model HEC-RAS are used to calculate the risk of flooding from 13 major rivers in Jakarta. ANUGA can simulate the physical and dynamical processes between the streamflow and the river geometry and land cover to produce a 1-meter-resolution inundation map. The streamflow values used as model input were obtained from hydrological analysis of rainfall data using the hydrologic model HEC-HMS. The probabilistic streamflow was derived from probabilistic rainfall using the statistical distributions Log-Pearson III, Normal, and Gumbel, with goodness of fit checked by Chi-Square and Kolmogorov-Smirnov tests. The flood event of 2007 is used as a comparison to evaluate the accuracy of the model output. Property damage estimates were calculated based on flood depth for the 1-, 5-, 10-, 25-, 50-, and 100-year return periods against housing value data from BPS-Statistics Indonesia and the Centre for Research and Development of Housing and Settlements, Ministry of Public Works, Indonesia. The vulnerability factor was derived from flood insurance claims. Jakarta's flood loss estimates for the return periods of 1, 5, 10, 25, 50, and 100 years are, respectively, Rp 1.30 t; Rp 16.18 t; Rp 16.85 t; Rp 21.21 t; Rp 24.32 t; and Rp 24.67 t, of a total building value of Rp 434.43 t.
Keywords: 2D hydrodynamic model, ANUGA, flood, flood modeling
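One step in the chain above, fitting an extreme-value distribution to rainfall and reading off return levels, can be sketched briefly. The rainfall values below are synthetic stand-ins, not the Jakarta data, and only the Gumbel case (one of the three distributions the study compares) is shown.

```python
# Fit a Gumbel distribution to synthetic annual-maximum rainfall and compute
# return levels for several return periods T (exceeded once per T years on
# average, i.e. the (1 - 1/T) quantile).
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
annual_max_mm = stats.gumbel_r.rvs(loc=120, scale=30, size=40, random_state=rng)

loc, scale = stats.gumbel_r.fit(annual_max_mm)      # maximum-likelihood fit
for T in (5, 25, 100):
    level = stats.gumbel_r.ppf(1 - 1 / T, loc=loc, scale=scale)
    print(T, round(level, 1))                       # return level grows with T
```

In the study, such return-period rainfall feeds HEC-HMS to produce probabilistic streamflow, which ANUGA then converts into inundation depth and, finally, loss.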
Procedia PDF Downloads 277
7513 Housing Security System and Household Entrepreneurship: Evidence from China
Authors: Wangshi Yong, Wei Shi, Jing Zou, Qiang Li, Yilin Tian
Abstract:
With the advancement of the reform of China's housing security system, its impact is becoming increasingly profound. This paper explores the relationship between the housing security system and household entrepreneurship using the 2017 China Household Finance Survey (CHFS) and conducts a large number of robustness checks, including PSM and IV estimation. The results show that assistance from the housing security system significantly promotes family entrepreneurship, increasing the probability of entrepreneurship by 2%. The internal mechanism operates mainly by relaxing liquidity constraints and increasing household social capital; no risk-preference effect is found. Heterogeneity analysis shows that the positive impact of the housing security system on family entrepreneurship is mainly reflected in areas with high housing prices and incomes, as well as in households with long-term security and social or commercial insurance. Meanwhile, it also verifies that the positive externalities of the housing security system positively affect active entrepreneurial motivation, entrepreneurial intensity, and entrepreneurial innovation.
Keywords: the housing security system, household entrepreneurship, social capital, liquidity constraints, risk preference
Procedia PDF Downloads 85
7512 Particle Filter State Estimation Algorithm Based on Improved Artificial Bee Colony Algorithm
Authors: Guangyuan Zhao, Nan Huang, Xuesong Han, Xu Huang
Abstract:
In order to solve the problem of sample impoverishment in the traditional particle filter algorithm and achieve accurate state estimation in a nonlinear system, a particle filter method based on an improved artificial bee colony (ABC) algorithm is proposed. The algorithm simulates the foraging and optimization process of a bee colony and moves particles toward the high-likelihood region of the posterior probability to improve the rationality of the particle distribution. The opposition-based learning (OBL) strategy is introduced to optimize the initial population of the artificial bee colony algorithm. A convergence factor is introduced into the neighborhood search strategy to limit the search range and improve the convergence speed. Finally, the crossover and mutation operations of the genetic algorithm are introduced into the search mechanism of the onlooker bees, which lets the algorithm quickly escape local extrema and continue searching for the global extremum, improving its optimization ability. The simulation results show that the improved method can improve the estimation accuracy of particle filters, ensure the diversity of particles, and improve the rationality of the particle distribution.
Keywords: particle filter, impoverishment, state estimation, artificial bee colony algorithm
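For reference, here is the plain bootstrap particle filter that such methods improve upon, on a 1-D random-walk model (the model, noise levels, and particle count are illustrative, and the ABC-based move step from the abstract is deliberately omitted). The multinomial resampling step at the end of each iteration is precisely what causes the impoverishment the paper targets.

```python
# Baseline bootstrap particle filter: propagate, weight by the likelihood,
# estimate, then resample. Model: x_t = x_{t-1} + q*w_t, y_t = x_t + r*v_t.
import numpy as np

rng = np.random.default_rng(5)
T, N = 50, 1000                       # time steps, particles
q, r = 0.5, 0.5                       # process / measurement noise std

x_true = np.cumsum(q * rng.standard_normal(T))        # latent random walk
y = x_true + r * rng.standard_normal(T)               # noisy observations

particles = np.zeros(N)
estimates = []
for t in range(T):
    particles = particles + q * rng.standard_normal(N)    # propagate
    w = np.exp(-0.5 * ((y[t] - particles) / r) ** 2)      # likelihood weights
    w /= w.sum()
    estimates.append(np.dot(w, particles))                # weighted-mean estimate
    idx = rng.choice(N, size=N, p=w)                      # multinomial resampling
    particles = particles[idx]                            # (impoverishment source)

rmse = np.sqrt(np.mean((np.array(estimates) - x_true) ** 2))
print(rmse)                           # well below the raw measurement noise r
```

Repeated resampling collapses the particle set onto a few ancestors; the paper's ABC move step re-spreads particles into high-likelihood regions to restore diversity.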
Procedia PDF Downloads 152
7511 The Impact of Corporate Governance on Risk Taking in European Insurance Industry
Authors: Francesco Venuti, Simona Alfiero
Abstract:
The aim of this paper is to develop empirical research on the nature and consequences of corporate governance for the risk-taking attitude of the Eurozone insurance industry. More particularly, we analyze the effect of public ownership on risk-taking relative to privately held insurance companies. We also analyze the effects on risk-taking of different degrees of ownership concentration, director compensation, and the size and diversity of the Board of Directors. Our results provide quite strong evidence that, consistent with agency theory, publicly traded insurance companies with more concentrated ownership are less risky than the corresponding privately held ones.
Keywords: agency theory, corporate governance, insurance companies, risk taking
Procedia PDF Downloads 431
7510 Development of Risk Management System for Urban Railroad Underground Structures and Surrounding Ground
Authors: Y. K. Park, B. K. Kim, J. W. Lee, S. J. Lee
Abstract:
To assess the risk of underground structures and the surrounding ground, we collect basic data by the engineering methods of measurement, exploration, and surveys, and we derive the risk through proper analysis and assessment for urban railroad underground structures and the surrounding ground, including station inflow. Basic data are obtained by fiber-optic sensors, MEMS sensors, water quantity/quality sensors, a tunnel scanner, ground-penetrating radar, and a light weight deflectometer, and are evaluated against their proper threshold values. Based on these data, we analyze the risk level of urban railroad underground structures and the surrounding ground. We also develop a risk management system to manage these data efficiently and to provide a convenient interface for data input and output.
Keywords: urban railroad, underground structures, ground subsidence, station inflow, risk
Procedia PDF Downloads 336
7509 Clustering for Detection of the Population at Risk of Anticholinergic Medication
Authors: A. Shirazibeheshti, T. Radwan, A. Ettefaghian, G. Wilson, C. Luca, Farbod Khanizadeh
Abstract:
Anticholinergic medication has been associated with events such as falls, delirium, and cognitive impairment in older patients. To assess this further, anticholinergic burden scores have been developed to quantify the risk. A risk model based on clustering was deployed in a healthcare management system to cluster patients into multiple risk groups, according to the anticholinergic burden scores of the multiple medicines prescribed to them, to facilitate clinical decision-making. To do so, anticholinergic burden scores of drugs were extracted from the literature, which categorizes the risk on a scale of 1 to 3. Given the patients' prescription data in the healthcare database, a weighted anticholinergic risk score was derived per patient based on the prescription of multiple anticholinergic drugs. This study was conducted on over 300,000 records of patients currently registered with a major regional UK-based healthcare provider. The weighted risk scores were used as inputs to an unsupervised learning algorithm (mean-shift clustering) that groups patients into clusters representing different levels of anticholinergic risk. To further evaluate the performance of the model, associations between the average risk score within each group and other factors, such as socioeconomic status (i.e., the Index of Multiple Deprivation) and an index of health and disability, were investigated. The clustering identifies a group of 15 patients at the highest risk from multiple anticholinergic medications. Our findings also show that this group of patients is located within more deprived areas of London compared to the population of the other risk groups. Furthermore, the prescription of anticholinergic medicines is more skewed toward female than male patients, indicating that females are more at risk from this kind of multiple medication.
Such risk may be monitored and controlled in healthcare management systems that are well equipped with artificial intelligence. Keywords: anticholinergic medicines, clustering, deprivation, socioeconomic status
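The scoring-and-clustering pipeline described above can be sketched in a few lines. The drug names, burden values, bandwidth, and mode-merging heuristic below are invented for illustration; this is not the deployed system, only a minimal self-contained analogue of it.

```python
# Hypothetical sketch: per-patient anticholinergic burden score, then a
# simple 1-D mean-shift to group patients into risk clusters.
DRUG_BURDEN = {"oxybutynin": 3, "amitriptyline": 3, "loratadine": 1}  # scale 1-3 (invented)

def patient_score(prescriptions):
    """Sum the burden scores over all anticholinergic drugs prescribed."""
    return sum(DRUG_BURDEN.get(drug, 0) for drug in prescriptions)

def mean_shift_1d(points, bandwidth=1.5, iters=50):
    """Shift each point to the mean of its neighbours until modes emerge."""
    shifted = [float(p) for p in points]
    for _ in range(iters):
        shifted = [
            sum(q for q in shifted if abs(q - p) <= bandwidth)
            / sum(1 for q in shifted if abs(q - p) <= bandwidth)
            for p in shifted
        ]
    # merge near-identical converged modes, then label each point by mode
    modes = sorted({round(p, 1) for p in shifted})
    return [min(range(len(modes)), key=lambda i: abs(modes[i] - p)) for p in shifted]

patients = [["loratadine"], ["oxybutynin", "amitriptyline"], ["amitriptyline"]]
scores = [patient_score(p) for p in patients]
labels = mean_shift_1d(scores)  # one cluster label per patient
```

In the real system the cluster averages, not the toy labels above, would then be compared against deprivation and health indices.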
Procedia PDF Downloads 212
7508 Proposal of a Model Supporting Decision-Making on Information Security Risk Treatment
Authors: Ritsuko Kawasaki, Takeshi Hiromatsu
Abstract:
Management is required to understand all information security risks within an organization and to decide which risks should be treated, at what level, and at what cost. However, such decision-making is not easy, because the various measures for risk treatment must be selected at suitable application levels. In addition, some measures may have conflicting objectives, which makes the selection more difficult. Therefore, this paper provides a model that supports the selection of measures by applying multi-objective analysis to find an optimal solution. Additionally, a list of measures is provided to make the selection easier and more effective without omitting any measures. Keywords: information security risk treatment, selection of risk measures, risk acceptance, multi-objective optimization
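As a toy illustration of multi-objective selection (not the paper's actual model, and with invented residual-risk and cost figures), one common scalarisation minimises a weighted sum of residual risk and cost over all combinations of application levels:

```python
# Illustrative sketch: exhaustive search over measure application levels,
# scored by a weighted sum of two conflicting objectives (risk vs. cost).
from itertools import product

# (residual_risk, cost) per application level 0..2; all values invented
MEASURES = {
    "firewall": [(9, 0), (5, 3), (2, 7)],
    "training": [(8, 0), (4, 2), (3, 5)],
}
W_RISK, W_COST = 0.7, 0.3  # management's priorities (illustrative)

def best_levels():
    """Return the application level per measure minimising the weighted sum."""
    names = list(MEASURES)
    best = min(
        product(*(range(len(MEASURES[n])) for n in names)),
        key=lambda levels: sum(
            W_RISK * MEASURES[n][l][0] + W_COST * MEASURES[n][l][1]
            for n, l in zip(names, levels)
        ),
    )
    return dict(zip(names, best))
```

Varying `W_RISK` and `W_COST` traces out different trade-offs between the conflicting objectives, which is the essence of the scalarised multi-objective approach.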
Procedia PDF Downloads 380
7507 A Comparison of Methods for Estimating Dichotomous Treatment Effects: A Simulation Study
Authors: Jacqueline Y. Thompson, Sam Watson, Lee Middleton, Karla Hemming
Abstract:
Introduction: The odds ratio (estimated via logistic regression) is a well-established and common approach for estimating covariate-adjusted binary treatment effects when comparing a treatment and a control group with dichotomous outcomes. Its popularity is primarily due to its stability and robustness to model misspecification. However, the situation is different for the relative risk and risk difference, which are arguably easier to interpret and better suited to specific designs such as non-inferiority studies. So far, there is no equivalent, widely accepted approach for estimating an adjusted relative risk or risk difference in clinical trials. This is partly due to the lack of a comprehensive evaluation of the available candidate methods. Methods/Approach: A simulation study is designed to evaluate the performance of relevant candidate methods for estimating relative risks, representing both conditional and marginal estimation approaches. We consider: log-binomial generalised linear models (GLM) with iteratively weighted least-squares (IWLS) and model-based standard errors (SEs); log-binomial GLM with convex optimisation and model-based SEs; log-binomial GLM with convex optimisation and permutation tests; modified-Poisson GLM with IWLS and robust SEs; log-binomial generalised estimating equations (GEE) with robust SEs; marginal standardisation with delta-method SEs; and marginal standardisation with permutation-test SEs. Independent and identically distributed datasets are simulated from a randomised controlled trial to evaluate these candidate methods. Simulations are replicated 10,000 times for each scenario across all combinations of sample size (200, 1000, and 5000), outcome event rate (10%, 50%, and 80%), and covariate effect (ranging from -0.05 to 0.7), representing weak, moderate, or strong relationships.
Treatment effects (0, -0.5, and 1 on the log scale) cover both null (H0) and alternative (H1) hypotheses to evaluate coverage and power in realistic scenarios. Performance measures (bias, mean square error (MSE), relative efficiency, and convergence rates) are evaluated across scenarios covering a range of sample sizes, event rates, covariate prognostic strengths, and model misspecifications. Potential Results, Relevance & Impact: Several methods exist for estimating unadjusted and adjusted relative risks. However, it is unclear which method is the most efficient, preserves the type-I error rate, is robust to model misspecification, or is the most powerful when adjusting for non-prognostic and prognostic covariates. GEE estimates may be biased when the outcome distributions do not arise from marginal binary data. It also appears that marginal standardisation and convex optimisation may perform better than the GLM IWLS log-binomial approach. Keywords: binary outcomes, statistical methods, clinical trials, simulation study
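For orientation, the unadjusted versions of the two estimands are straightforward. The sketch below computes the relative risk and risk difference from a 2x2 trial table, with the standard Katz delta-method interval for log(RR); the adjusted estimators compared in the study require the regression machinery listed above and are not reproduced here. The counts in the example are invented.

```python
# Unadjusted relative risk (RR) and risk difference (RD) from a 2x2 table,
# with a delta-method 95% confidence interval for the RR.
import math

def relative_risk(events_t, n_t, events_c, n_c):
    p_t, p_c = events_t / n_t, events_c / n_c
    rr = p_t / p_c
    rd = p_t - p_c
    # delta-method (Katz) standard error of log(RR)
    se_log_rr = math.sqrt(1 / events_t - 1 / n_t + 1 / events_c - 1 / n_c)
    lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
    hi = math.exp(math.log(rr) + 1.96 * se_log_rr)
    return rr, rd, (lo, hi)

# e.g. 30/100 events under treatment vs. 15/100 under control
rr, rd, ci = relative_risk(30, 100, 15, 100)
```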
Procedia PDF Downloads 115
7506 A Financial Analysis of the Current State of IKEA: A Case Study
Authors: Isabela Vieira, Leonor Carvalho Garcez, Adalmiro Pereira, Tânia Teixeira
Abstract:
In the present work, we aim to analyse IKEA as a company by focusing on its development, financial analysis, and future benchmarks, and by applying some of the knowledge learned in class, namely hedging and other financial risk mitigation solutions, to understand how IKEA navigates and protects itself from risk. The decision to choose IKEA for our case study stems from the company's long history since the 1940s and its high degree of internationalization across 63 different markets. The company also publishes clear financial reports, which aided us in writing the present essay and naturally contributed to our decision. Keywords: IKEA, financial risk, risk management, hedge
Procedia PDF Downloads 62
7505 On the Development of a Homogenized Earthquake Catalogue for Northern Algeria
Authors: I. Grigoratos, R. Monteiro
Abstract:
Regions with a significant percentage of non-seismically designed buildings and limited urban planning are particularly vulnerable to natural hazards. In this context, the project ‘Improved Tools for Disaster Risk Mitigation in Algeria’ (ITERATE) aims at seismic risk mitigation in Algeria. Past earthquakes in northern Algeria caused extensive damage, e.g. the 1980 El Asnam moment magnitude (Mw) 7.1 and 2003 Boumerdes Mw 6.8 earthquakes. This paper addresses a number of proposed developments and considerations made towards further improving the seismic hazard component. Specifically, an updated earthquake catalogue (through the year 2018) is compiled, and new conversion equations to moment magnitude are introduced. Furthermore, a network-based method for estimating the spatial and temporal distribution of the minimum magnitude of completeness (Mc) is applied. We found relatively large values of Mc, due to the sparse network, and a nonlinear trend between Mw and body-wave magnitude (mb) or local magnitude (ML), which are the most commonly reported scales in the region. Lastly, the resulting b-value of the Gutenberg-Richter distribution is sensitive to the declustering method. Keywords: conversion equation, magnitude of completeness, seismic events, seismic hazard
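As a hedged illustration of the b-value estimation that such a catalogue feeds into (the study's exact procedure is not specified here), Aki's maximum-likelihood estimator with a binning correction can be written as follows; the magnitudes and completeness threshold in the example are invented:

```python
# Aki (1965) maximum-likelihood b-value estimate for a Gutenberg-Richter
# distribution, with the usual half-bin correction for binned magnitudes.
import math

def b_value(magnitudes, mc, dm=0.1):
    """ML b-value from magnitudes at or above completeness mc; dm is the bin width."""
    mags = [m for m in magnitudes if m >= mc]
    mean_m = sum(mags) / len(mags)
    return math.log10(math.e) / (mean_m - (mc - dm / 2))

# toy catalogue (invented magnitudes), complete above Mc = 3.0
b = b_value([3.0, 3.2, 3.5, 4.0, 3.3], mc=3.0)
```

Because the estimate depends only on events above Mc, the large Mc values found for the sparse network directly shrink the usable sample, one reason the b-value is sensitive to catalogue processing choices such as declustering.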
Procedia PDF Downloads 166
7504 A Quantification Method of Attractiveness of Stations and an Estimation Method of Number of Passengers Taking into Consideration the Attractiveness of the Station
Authors: Naoya Ozaki, Takuya Watanabe, Ryosuke Matsumoto, Noriko Fukasawa
Abstract:
In the metropolitan areas of Japan, many stations have shopping areas, and escalators and elevators are installed to make the stations barrier-free. Further, many areas around stations are being redeveloped. Railway business operators want to know how much effect these circumstances have on the attractiveness of a station or the number of passengers using it. We therefore performed a questionnaire survey of station users in the metropolitan areas to identify factors that affect the attractiveness of stations. Based on an analysis of the survey, we developed a method to quantitatively evaluate the attractiveness of stations. We also developed a method for estimating the number of passengers based on a combination of the quantitatively evaluated attractiveness of the station and the residential and working population around the station. From these, we derived precise linear regression models estimating the attractiveness of a station and its number of passengers. Keywords: attractiveness of the station, estimation method, number of passengers of the station, redevelopment around the station, renovation of the station
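A minimal sketch of such a regression, assuming a model of the form passengers ≈ b0 + b1·attractiveness + b2·population (the variable names, data, and coefficients below are invented, not the paper's fitted model):

```python
# Ordinary least squares via the normal equations, solved with Gaussian
# elimination; no third-party libraries required.
def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))  # pivot row
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for c in range(i, n + 1):
                M[r][c] -= f * M[i][c]
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (M[i][n] - sum(M[i][c] * x[c] for c in range(i + 1, n))) / M[i][i]
    return x

def fit_ols(rows, y):
    """Fit y ≈ b0 + b1*attractiveness + b2*population by least squares."""
    X = [[1.0] + r for r in rows]  # prepend intercept column
    k = len(X[0])
    XtX = [[sum(X[i][a] * X[i][b] for i in range(len(X))) for b in range(k)] for a in range(k)]
    Xty = [sum(X[i][a] * y[i] for i in range(len(X))) for a in range(k)]
    return solve(XtX, Xty)

# toy data: [attractiveness score, surrounding population]
rows = [[3.0, 20000.0], [5.0, 50000.0], [2.0, 10000.0], [4.0, 30000.0]]
y = [450.0, 850.0, 300.0, 600.0]  # generated exactly as 100 + 50*a + 0.01*pop
coef = fit_ols(rows, y)           # recovers approximately [100.0, 50.0, 0.01]
```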
Procedia PDF Downloads 287
7503 Intellectual Property Risk Assessment in Planning Market Entry to China
Authors: Qing Cao
Abstract:
Generally speaking, China has a relatively high level of intellectual property (IP) infringement, and risk assessment is indispensable in the strategic planning process. To complement the current literature in international business, this paper sheds light on how foreign companies can assess IP risk when planning market entry into China. The evaluation of the internal and external IP environment proposed in the paper consists of external analysis, internal analysis, and further internal analysis. By positioning the company within its IP environment, the risk assessment approach enables foreign companies either to build corresponding IP strategies or to abort the entry plan beforehand to minimize IP risks. Keywords: intellectual property, IP environment, risk assessment
Procedia PDF Downloads 561
7502 Comparison of Methods of Estimation for Use in Goodness of Fit Tests for Binary Multilevel Models
Authors: I. V. Pinto, M. R. Sooriyarachchi
Abstract:
It can frequently be observed that data arising in our environment have a hierarchical or nested structure. Multilevel modelling is a modern approach to handling this kind of data. When multilevel modelling is combined with a binary response, the estimation methods become complex, and the usual techniques are derived from the quasi-likelihood method. The estimation methods compared in this study are marginal quasi-likelihood of order 1 and order 2 (MQL1, MQL2) and penalized quasi-likelihood of order 1 and order 2 (PQL1, PQL2). A statistical model is of no use if it does not reflect the given dataset; therefore, checking the adequacy of the fitted model through a goodness-of-fit (GOF) test is an essential stage in any modelling procedure. However, it is equally important to first confirm that the GOF test itself performs well and is suitable for the given model. This study assesses the suitability of the GOF test developed for binary-response multilevel models with respect to the method used in model estimation. An extensive set of simulations was conducted using MLwiN (v2.19) with varying numbers of clusters, cluster sizes, and intra-cluster correlations. The test maintained the desired type-I error for models estimated using PQL2, and it failed for almost all combinations of MQL. The power of the test was adequate for most combinations under all estimation methods except MQL1. Moreover, models were fitted using the four methods to a real-life dataset, and the performance of the test was compared for each model. Keywords: goodness-of-fit test, marginal quasi-likelihood, multilevel modelling, penalized quasi-likelihood, power, quasi-likelihood, type-I error
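A hedged sketch of the kind of data-generating process such simulations rely on: a random-intercept logistic model, where the intercept variance sets the intra-cluster correlation. All parameter values below are illustrative, not those used in the study.

```python
# Simulate clustered binary data from a random-intercept logistic model:
# logit(p_ij) = beta0 + beta1 * x_ij + u_j,  u_j ~ N(0, sigma_u^2).
import math
import random

def simulate(n_clusters=50, cluster_size=20, beta0=-1.0, beta1=0.5,
             sigma_u=1.0, seed=1):
    """Return (cluster_id, covariate, binary outcome) triples."""
    rng = random.Random(seed)
    data = []
    for j in range(n_clusters):
        u_j = rng.gauss(0, sigma_u)  # cluster-level random intercept
        for _ in range(cluster_size):
            x = rng.gauss(0, 1)
            p = 1 / (1 + math.exp(-(beta0 + beta1 * x + u_j)))
            data.append((j, x, int(rng.random() < p)))
    return data
```

Varying `n_clusters`, `cluster_size`, and `sigma_u` reproduces the kind of design grid (numbers of clusters, cluster sizes, intra-cluster correlations) over which the test's type-I error and power would be evaluated.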
Procedia PDF Downloads 143
7501 Sex Education Training Program Effect on Junior Secondary School Students Knowledge and Practice of Sexual Risk Behavior
Authors: Diyaolu Babajide Olufemi, Oyerinde Oyewole Olusesan
Abstract:
This study examined the effect of a sex education training program on the knowledge and practice of sexual risk behavior among secondary school adolescents in the Ibadan North Local Government area of Oyo State. A total of 105 students were sampled from two schools in the Local Government area: seventy (70) students constituted the experimental group, while thirty-five (35) constituted the control group. A pretest-posttest control-group quasi-experimental design was adopted. A self-developed questionnaire was used to test participants’ knowledge and practice of sexual risk behavior before and after the training (α = .62, .82, and .74). The analysis indicated a significant effect of the sex education training on participants’ knowledge and practice of sexual risk behavior and a significant gender difference in knowledge of sexual risk behavior, but no significant age or gender difference in the practice of sexual risk behavior. It was thus concluded that sex education should be taught in schools and emphasized at home, with no age or gender restrictions. Keywords: early adolescent, health risk, sexual risk behavior, sex education
Procedia PDF Downloads 143