Search results for: scoring based risk assessment method
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 43804


43744 Implementation of Risk Management System to Improve the Quality of Higher Education Institutes

Authors: Muhammad Wasif, Asif Ahmed Shaikh, Sarosh Hashmat Lodi, Muhammad Aslam Bhutto, Riazuddin

Abstract:

Risk management systems have been widely used in profit-based organizations, health and safety, and project management over the last few decades. Driven by a rapidly changing environment and the requirements of the ISO 9001:2015 standard, public-sector institutions, especially higher education institutes, now also perform risk assessment to monitor institutional performance and align it with the latest benchmarks. In this context, NED University of Engineering and Technology developed a Standard Operating Procedure (SOP) for risk assessment, monitoring, and control. In this research, risks are grouped into four sources: internal academic risks, external academic risks, internal non-academic risks, and external non-academic risks. Risks are identified by management at all levels, and the severity and likelihood of each risk are assigned based on previous audit results and customer complaints. Risk ratings are then calculated to rank the risks, and controls are designed for each risk and assigned to a responsible person. At the end of the article, the results and analysis for the different sources of risk are discussed in detail and conclusions are drawn; a few sample risks are also discussed. The research shows that a risk management system can be applied in a higher education institute to effectively control the risks that might affect the scope and quality management system of the organization.
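
A minimal sketch of the rating-and-ranking step described above, assuming the common convention that a risk rating is the product of severity and likelihood on ordinal scales (the paper's exact scales are not given):

```python
from dataclasses import dataclass

@dataclass
class Risk:
    name: str
    source: str      # e.g. "Internal Academic", "External Non-academic"
    severity: int    # assumed 1-5 ordinal scale
    likelihood: int  # assumed 1-5 ordinal scale

    @property
    def rating(self) -> int:
        # Assumed convention: rating = severity x likelihood
        return self.severity * self.likelihood

# Hypothetical risks for illustration only.
risks = [
    Risk("Outdated curriculum", "Internal Academic", 4, 3),
    Risk("Accreditation policy change", "External Academic", 5, 2),
    Risk("Lab equipment failure", "Internal Non-academic", 3, 4),
]

# Rank risks so controls are designed for the highest-rated ones first.
for r in sorted(risks, key=lambda r: r.rating, reverse=True):
    print(f"{r.rating:3d}  {r.name} ({r.source})")
```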

Keywords: higher education, quality management system, risk assessment, risk management

Procedia PDF Downloads 269
43743 Syntax and Words as Evolutionary Characters in Comparative Linguistics

Authors: Nancy Retzlaff, Sarah J. Berkemer, Trudie Strauss

Abstract:

In the last couple of decades, the advent of digitalization of all kinds of data has probably been one of the major advances across fields of study. This paves the way for analysing data even when they come from disciplines with no initial computational necessity to do so. Linguistics, in particular, has a rather manual tradition. Still, when considering studies that involve the history of language families, it is hard to overlook the striking similarities to bioinformatics (phylogenetic) approaches. Alignments of words are a fairly well-studied example of an application of bioinformatics methods to historical linguistics. In this paper we consider not only alignments of strings, i.e., words in this case, but also alignments of syntax trees of selected Indo-European languages. Based on initial, crude alignments, a sophisticated scoring model is trained on both letters and syntactic features. The aim is to gain a better understanding of which features in two languages are related, i.e., most likely to have the same root. Initially, all words in two languages are pre-aligned with a basic scoring model that primarily selects consonants and adjusts them before fitting in the vowels. Mixture models are subsequently used to filter ‘good’ alignments depending on the alignment length and the number of inserted gaps. Using these selected word alignments, it is possible to perform tree alignments of the given syntax trees and consequently find sentences that correspond well to each other across languages. The syntax alignments are then filtered for meaningful scores: ‘good’ scores contain evolutionary information and are therefore used to train the sophisticated scoring model. Further iterations of alignment and training steps are performed until the scoring model saturates, i.e., barely changes anymore. A better evaluation of the trained scoring model and its capacity to capture evolutionarily meaningful information will be given. An assessment of sentence alignment compared to possible phrase structure will also be provided. The method described here may have its flaws because of limited prior information. It may, however, offer a good starting point for studying languages where only little prior knowledge is available and a detailed, unbiased study is needed.
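
As one concrete step, the mixture-model filter mentioned above can be sketched as follows; this is a minimal illustration assuming each alignment is summarized by a normalized score, its length, and its gap count (the paper's exact features and mixture family are not specified):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Hypothetical per-alignment features: [normalized score, alignment length, inserted gaps].
features = np.array([
    [0.82, 7, 1],
    [0.31, 9, 4],
    [0.77, 5, 0],
    [0.40, 12, 6],
    [0.90, 6, 1],
    [0.25, 8, 5],
])

# Fit a two-component mixture and keep the component with the higher mean score
# as the 'good' alignments used to retrain the scoring model.
gmm = GaussianMixture(n_components=2, random_state=0).fit(features)
labels = gmm.predict(features)
good_component = int(np.argmax(gmm.means_[:, 0]))
good_alignments = features[labels == good_component]
print(good_alignments)
```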

Keywords: alignments, bioinformatics, comparative linguistics, historical linguistics, statistical methods

Procedia PDF Downloads 125
43742 The Use of Coronary Calcium Scanning for Cholesterol Assessment and Management

Authors: Eva Kirzner

Abstract:

Based on outcome studies published over the past two decades, in 2018 the ACC/AHA published new guidelines for the management of hypercholesterolemia that incorporate the use of coronary artery calcium (CAC) scanning as a decision tool for ascertaining which patients may benefit from statin therapy. This use is based on the recognition that the absence of calcium on CAC scanning (i.e., a CAC score of zero) usually signifies the absence of significant atherosclerotic deposits in the coronary arteries. Specifically, in patients at high risk for atherosclerotic cardiovascular disease (ASCVD), initiation of statin therapy is generally recommended to decrease ASCVD risk, whereas among patients with intermediate ASCVD risk, the need for statin therapy is less certain. There is therefore a need for new outcome studies providing evidence that management of hypercholesterolemia based on these new ACC/AHA recommendations is safe for patients. Based on a PubMed and Google Scholar literature search, four relevant population-based or patient-based cohort studies published between 2017 and 2021 that examined the relationship between CAC scanning, risk assessment or mortality, and statin therapy were identified (see references). In each of these studies, patients were assessed for their baseline ASCVD risk using the Pooled Cohort Equations (PCE), an ACC/AHA calculator that determines patient risk from age, gender, ethnicity, and coronary artery disease risk factors. The combined findings of these four studies provide concordant evidence that a zero CAC score defines patients who remain at low clinical risk despite the non-use of statin therapy. Thus, these new studies confirm the use of CAC scanning as a safe tool for reducing the potential overuse of statin therapy among patients with zero CAC scores. Incorporating these new data suggests the following best practice: (1) ascertain ASCVD risk according to the PCE in all patients; (2) following an initial trial to lower ASCVD risk with an optimal diet among patients with elevated ASCVD risk, initiate statin therapy for patients who have a high ASCVD risk score; (3) if the ASCVD score is intermediate, refer patients for CAC scanning; and (4) if the CAC score is zero among intermediate-risk ASCVD patients, statin therapy can be safely withheld despite the presence of an elevated serum cholesterol level.
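
The four-step best practice above can be expressed as a small decision function. This is a minimal sketch assuming ACC/AHA-style 10-year risk bands (below 7.5% low, 7.5-19.9% intermediate, 20% and above high); the thresholds and wording are illustrative, not clinical guidance:

```python
from typing import Optional

def statin_decision(ascvd_risk_pct: float, cac_score: Optional[int] = None) -> str:
    if ascvd_risk_pct >= 20.0:
        return "High ASCVD risk: initiate statin therapy (after diet optimization)."
    if ascvd_risk_pct < 7.5:
        return "Low ASCVD risk: statin generally not indicated."
    # Intermediate risk: use coronary artery calcium (CAC) scanning as a decision tool.
    if cac_score is None:
        return "Intermediate risk: refer for CAC scanning."
    if cac_score == 0:
        return "CAC = 0: statin therapy can be safely withheld; reassess periodically."
    return "CAC > 0: statin therapy is favored."

print(statin_decision(12.0))                 # intermediate risk -> refer for CAC scan
print(statin_decision(12.0, cac_score=0))    # zero CAC -> withhold statin
```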

Keywords: cholesterol, cardiovascular disease, statin therapy, coronary calcium

Procedia PDF Downloads 87
43741 Development of an Image-Based Biomechanical Model for Assessment of Hip Fracture Risk

Authors: Masoud Nasiri Sarvi, Yunhua Luo

Abstract:

Low-trauma hip fracture, usually caused by a fall from standing height, has become a major source of morbidity and mortality for the elderly. Factors affecting hip fracture include sex, race, age, body weight, height, body mass distribution, etc., and thus hip fracture risk in a fall differs widely from subject to subject. It is therefore necessary to develop a subject-specific biomechanical model to predict hip fracture risk. The objective of this study is to develop a two-level, image-based, subject-specific biomechanical model consisting of a whole-body dynamics model and a proximal-femur finite element (FE) model for more accurately assessing the risk of hip fracture in lateral falls. The information required for constructing the model is extracted from a whole-body and a hip DXA (dual-energy X-ray absorptiometry) image of the subject. The proposed model considers all parameters subject-specifically, which will provide a fast, accurate, and inexpensive method for predicting hip fracture risk.

Keywords: bone mineral density, hip fracture risk, impact force, sideways falls

Procedia PDF Downloads 504
43740 Risk Assessment of Heavy Rainfall and Development of Damage Prediction Function for Gyeonggi-Do Province

Authors: Jongsung Kim, Daegun Han, Myungjin Lee, Soojun Kim, Hung Soo Kim

Abstract:

Recently, the frequency and magnitude of natural disasters have been gradually increasing due to climate change. In Korea in particular, large-scale damage caused by heavy rainfall occurs frequently due to rapid urbanization. Therefore, this study proposed a Heavy rain Damage Risk Index (HDRI) using the PSR (Pressure-State-Response) structure for heavy rain risk assessment. We constructed pressure, state, and response indices for the risk assessment of each local government in Gyeonggi-do province, and the evaluation indices were determined by principal component analysis. The indices were standardized using the Z-score method, and HDRIs were then obtained for the 31 local governments in the province. The HDRI is categorized into three classes, with the 1st class being the safest. As a result, 15 local governments fell into the 1st class, 7 into the 2nd class, and 9 into the 3rd class. From the study, we were able to identify the heavy rainfall risk class for each local government. This classification is useful for developing a heavy rainfall damage prediction function by risk class, which was also performed in this study, and it could support decision-making for efficient disaster management. Acknowledgements: This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Science, ICT & Future Planning (2017R1A2B3005695).
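
A minimal sketch of such a PSR-style index, assuming the HDRI is an unweighted combination of Z-score standardized pressure, state, and response indicators split into terciles (the paper's PCA-based indicator selection and exact aggregation are not reproduced):

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical indicators for 31 local governments: [pressure, state, response]
data = rng.normal(size=(31, 3))

z = (data - data.mean(axis=0)) / data.std(axis=0)   # Z-score standardization
hdri = z[:, 0] + z[:, 1] - z[:, 2]                  # assumed sign: higher response lowers risk

# Split into three classes; class 1 (lowest HDRI) is the safest.
cuts = np.quantile(hdri, [1 / 3, 2 / 3])
risk_class = 1 + (hdri > cuts[0]).astype(int) + (hdri > cuts[1]).astype(int)
print(np.bincount(risk_class)[1:])                  # number of local governments per class
```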

Keywords: natural disaster, heavy rain risk assessment, HDRI, PSR

Procedia PDF Downloads 156
43739 Application of Italian Guidelines for Existing Bridge Management

Authors: Giovanni Menichini, Salvatore Giacomo Morano, Gloria Terenzi, Luca Salvatori, Maurizio Orlando

Abstract:

The “Guidelines for Risk Classification, Safety Assessment, and Structural Health Monitoring of Existing Bridges” were recently approved by the Italian Government to define technical standards for managing the national network of existing bridges. These guidelines provide a framework for risk mitigation and safety assessment of bridges, which are essential elements of the built environment and form the basis for the operation of transport systems. Within the guideline framework, a workflow based on three main points was proposed: (1) risk-based, i.e., based on typical parameters of hazard, vulnerability, and exposure; (2) multi-level, i.e., including six assessment levels of increasing complexity; and (3) multirisk, i.e., assessing structural/foundational, seismic, hydrological, and landslide risks. The paper focuses on applying the Italian Guidelines to specific case studies, aiming to identify the parameters that predominantly influence the determination of the “class of attention”. The significance of each parameter is determined via sensitivity analysis. Additionally, recommendations for enhancing the process of assigning the class of attention are proposed.

Keywords: bridge safety assessment, Italian guidelines implementation, risk classification, structural health monitoring

Procedia PDF Downloads 15
43738 Risk Tolerance and Individual Worthiness Based on Simultaneous Analysis of the Cognitive Performance and Emotional Response to a Multivariate Situational Risk Assessment

Authors: Frederic Jumelle, Kelvin So, Didan Deng

Abstract:

We present a method and system for neuropsychological performance testing, comprising a mobile terminal that interacts with a cloud server; the server stores the user information and is logged into by the user through the terminal device. The user information, which comprises facial emotion data, performance test answers, and user chronometrics, is accessed directly through the terminal device and processed by an artificial neural network. The assessment is used to evaluate the cognitive performance and emotional response of the subject to a series of dichotomous questions describing various situations of daily life and challenging the users' knowledge, values, ethics, and principles. In industrial applications, the timing of this assessment will depend on the users' need to obtain a service from a provider, such as opening a bank account, getting a mortgage or an insurance policy, authenticating clearance at work, or securing online payments.

Keywords: artificial intelligence, neurofinance, neuropsychology, risk management

Procedia PDF Downloads 109
43737 Q-Test of Undergraduate Epistemology and Scientific Thought: Development and Testing of an Assessment of Scientific Epistemology

Authors: Matthew J. Zagumny

Abstract:

The QUEST is an assessment of scientific epistemic beliefs and was developed to measure students' intellectual development with regard to beliefs about knowledge and knowing. The QUEST utilizes Q-sort methodology, which requires participants to rate the degree to which statements describe them personally. As a measure of personal theories of knowledge, the QUEST instrument is described, with the Q-sort distribution and scoring explained. A preliminary demonstration of the QUEST assessment is described, in which two samples of undergraduate students (novice/lower-division versus advanced/upper-division) were assessed and their average QUEST scores compared. The usefulness of an assessment of epistemology is discussed in terms of the principle that assessment tends to drive educational practice and university mission. The critical need for universities and academic programs to focus on developing students' scientific epistemology is briefly discussed.

Keywords: scientific epistemology, critical thinking, Q-sort method, STEM undergraduates

Procedia PDF Downloads 349
43736 Canada Deuterium Uranium Updated Fire Probabilistic Risk Assessment Model for Canadian Nuclear Plants

Authors: Hossam Shalabi, George Hadjisophocleous

Abstract:

Canadian Nuclear Power Plants (NPPs) use some portions of NUREG/CR-6850 in carrying out Fire Probabilistic Risk Assessment (PRA). An assessment of the applicability of NUREG/CR-6850 to CANDU reactors was performed, and a CANDU Fire PRA was introduced. There are 19 operating CANDU reactors in Canada at five sites (Bruce A, Bruce B, Darlington, Pickering, and Point Lepreau). A fire load density survey was done for all Fire Safe Shutdown Analysis (FSSA) fire zones at all CANDU sites in Canada. National Fire Protection Association (NFPA) Standard 557 proposes that a fire load survey be conducted by the weighing method, the inventory method, or a combination of both; the combination method results in the most accurate fire load values. An updated CANDU Fire PRA model that includes the fuel survey in all Canadian CANDU stations is demonstrated in this paper. A qualitative screening step for the CANDU Fire PRA is also illustrated, which includes any fire events that can damage any part of the emergency power supply in addition to FSSA cables.
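
A minimal sketch of the inventory-method calculation behind such a survey, computing fire load density as the total calorific content of combustibles divided by floor area; the masses and calorific values below are illustrative, not survey data:

```python
def fire_load_density(items, floor_area_m2):
    """items: iterable of (mass_kg, calorific_value_MJ_per_kg); returns MJ/m^2."""
    total_energy = sum(mass * heat for mass, heat in items)
    return total_energy / floor_area_m2

# Hypothetical combustibles inventoried in one fire zone.
zone_items = [
    (120.0, 46.0),   # cable insulation (assumed calorific value)
    (35.0, 17.0),    # wood
    (10.0, 43.0),    # lubricating oil
]
print(f"{fire_load_density(zone_items, floor_area_m2=85.0):.1f} MJ/m^2")
```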

Keywords: fire safety, CANDU, nuclear, fuel densities, FDS, qualitative analysis, fire probabilistic risk assessment

Procedia PDF Downloads 103
43735 An Effective Decision-Making Strategy Based on Multi-Objective Optimization for Commercial Vehicles in Highway Scenarios

Authors: Weiming Hu, Xu Li, Xiaonan Li, Zhong Xu, Li Yuan, Xuan Dong

Abstract:

Maneuver decision-making plays a critical role in high-performance intelligent driving. This paper proposes a risk assessment-based decision-making network (RADMN) to address the problem of driving strategy for commercial vehicles. RADMN integrates two networks, aiming to identify the risk degree of collision and rollover and to provide decisions that ensure the effectiveness and reliability of the driving strategy. In the risk assessment module, the risk degrees of backward collision, forward collision, and rollover are quantified for hazard recognition. In the decision module, a deep reinforcement learning based on multi-objective optimization (DRL-MOO) algorithm is designed, which comprehensively considers the risk degree and motion states of each traffic participant. To evaluate the performance of the proposed framework, Prescan/Simulink joint simulation was conducted in highway scenarios. Experimental results validate the effectiveness and reliability of the proposed RADMN. The output driving strategy can guarantee safety and provides key technical support for the realization of autonomous driving of commercial vehicles.

Keywords: decision-making strategy, risk assessment, multi-objective optimization, commercial vehicle

Procedia PDF Downloads 103
43734 The Effects of Weather Events and Land Use Change on Urban Ecosystems: From Risk to Resilience

Authors: Szu-Hua Wang

Abstract:

Urban ecosystems, as complex coupled human-environment systems, contain abundant natural resources that sustain natural assets while, at the same time, urban development attracts urban assets and consumes natural resources. Land use change directly reflects the interaction between human activities and the environment. The IPCC (2014), however, reports that land use change and urbanization due to human activities are major causes of climate change, leading to serious impacts on urban ecosystem resilience and risk. For this reason, risk assessment and resilience analysis are key to responding to climate change in urban ecosystems. Urban spatial planning can guide urban development through land use planning, transportation planning, and environmental planning, and it affects land use allocation and human activities by siting major constructions and protecting important national land resources. Urban spatial planning can thus aggravate climate change or, conversely, help mitigate and adapt to it. Research on the effects of spatial planning on land use change and climate change is currently an intensely studied issue. Therefore, this research focuses on developing frameworks for risk assessment and resilience analysis from an ecosystem perspective, based on typhoon precipitation in the Taipei area. An integrated method of risk assessment and resilience analysis is also addressed for application to spatial planning practice and sustainable development.

Keywords: ecosystem, land use change, risk analysis, resilience

Procedia PDF Downloads 383
43733 A Framework for Security Risk Level Measures Using CVSS for Vulnerability Categories

Authors: Umesh Kumar Singh, Chanchala Joshi

Abstract:

With increasing dependency on IT infrastructure, the main objective of a system administrator is to maintain a stable and secure network while ensuring that the network is robust enough against malicious users such as attackers and intruders. Security risk management provides a way to manage the growing threats to infrastructure and systems. This paper proposes a framework for risk level estimation that uses the National Institute of Standards and Technology (NIST) National Vulnerability Database (NVD) and the Common Vulnerability Scoring System (CVSS). The proposed framework measures the frequency of vulnerability exploitation, combines this measured frequency with the standard CVSS score, and estimates the security risk level, which supports automated and reasonable security management. An equation for the temporal score as a function of the availability of a remediation plan is derived, and the frequency of exploitation is then calculated from the determined temporal score. The frequency of exploitation, together with the CVSS score, is used to calculate the security risk level of the system. The proposed framework uses the CVSS vectors for risk level estimation and measures the security level of a specific network environment, which assists the system administrator in assessing security risks and making decisions related to their mitigation.
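
A minimal sketch of combining a CVSS temporal adjustment with an observed exploitation frequency, assuming CVSS v2-style remediation-level multipliers; the paper's exact equation and risk-level thresholds are not reproduced here:

```python
REMEDIATION_LEVEL = {          # CVSS v2 temporal metric values (assumed)
    "official_fix": 0.87,
    "temporary_fix": 0.90,
    "workaround": 0.95,
    "unavailable": 1.00,
}

def temporal_score(base_score: float, remediation: str) -> float:
    return base_score * REMEDIATION_LEVEL[remediation]

def risk_level(base_score: float, remediation: str, exploit_freq: float) -> str:
    """exploit_freq: observed exploitation frequency normalized to [0, 1] (assumed scaling)."""
    score = temporal_score(base_score, remediation) * (0.5 + 0.5 * exploit_freq)
    if score >= 7.0:
        return "High"
    if score >= 4.0:
        return "Medium"
    return "Low"

print(risk_level(base_score=9.3, remediation="unavailable", exploit_freq=0.8))
```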

Keywords: CVSS score, risk level, security measurement, vulnerability category

Procedia PDF Downloads 294
43732 Prevalence, Level and Health Risk Assessment of Mycotoxins in the Fried Poultry Eggs from Jordan

Authors: Sharaf S. Omar

Abstract:

In the current study, the levels and prevalence of deoxynivalenol (DON), aflatoxin B1 (AFB1), zearalenone (ZEN), and ochratoxin A (OTA) in fried poultry eggs in Jordan were investigated. Poultry egg samples (n = 250) were collected. The levels of DON, AFB1, ZEN, and OTA in the white and yolk of the eggs were measured using LC-MS/MS. The health risk assessment was calculated using margins of exposure (MOEs) for AFB1 and OTA and a hazard index (HI) for ZEN and DON. The highest prevalence in the yolk and white of eggs was for ZEN (96.56%) and OTA (97.44%), respectively, while the highest levels in the white and yolk were both for DON (1.07 µg/kg and 1.65 µg/kg, respectively). The level of DON in the yolk was significantly higher than in the white (P < 0.05). The risk assessment indicated that the exposed population is at high risk from AFB1 (MOE < 10,000) in fried poultry eggs.
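
A minimal sketch of the exposure metrics named above, using the standard definitions EDI = concentration x intake / body weight, MOE = reference point / EDI, and HI = sum of EDI/TDI ratios; all numerical values below are illustrative assumptions (except the DON yolk level, which is taken from the abstract), not values from the study:

```python
def estimated_daily_intake(conc_ug_per_kg, intake_kg_per_day, body_weight_kg):
    """EDI in ug per kg body weight per day."""
    return conc_ug_per_kg * intake_kg_per_day / body_weight_kg

edi_afb1 = estimated_daily_intake(0.5, 0.05, 70)    # assumed AFB1 level in eggs
edi_don = estimated_daily_intake(1.65, 0.05, 70)    # DON level in yolk (from the abstract)
edi_zen = estimated_daily_intake(0.9, 0.05, 70)     # assumed ZEN level

# Margin of exposure for the genotoxic carcinogen AFB1; MOE < 10,000 flags a health concern.
moe_afb1 = 0.4 / edi_afb1                           # assumed BMDL10 reference point, ug/kg bw/day
# Hazard index for DON and ZEN against assumed tolerable daily intakes (ug/kg bw/day).
hi = edi_don / 1.0 + edi_zen / 0.25

print(f"MOE(AFB1) = {moe_afb1:.0f} -> {'high risk' if moe_afb1 < 10_000 else 'low concern'}")
print(f"HI(DON+ZEN) = {hi:.3f} -> {'concern' if hi > 1 else 'acceptable'}")
```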

Keywords: mycotoxins, aflatoxin B1, risk assessment, poultry egg

Procedia PDF Downloads 66
43731 Credit Risk Evaluation Using Genetic Programming

Authors: Ines Gasmi, Salima Smiti, Makram Soui, Khaled Ghedira

Abstract:

Credit risk is considered one of the important issues for financial institutions, as it causes great losses for banks. To this end, numerous methods for credit risk evaluation have been proposed. Many evaluation methods are black-box models that cannot adequately reveal the information hidden in the data, and several works have therefore focused on building transparent rule-based models. For credit risk assessment, the generated rules must be not only highly accurate but also highly interpretable. In this paper, we aim to build an accurate and transparent credit risk evaluation model that produces a set of classification rules. We formulate credit risk evaluation as an optimization problem solved with a genetic programming (GP) algorithm, where the goal is to maximize the accuracy of the generated rules. We evaluate our proposed approach on the German and Australian credit datasets. We compared our findings with existing works; the results show that the proposed GP outperforms the other models.
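
A minimal sketch of the fitness evaluation such an approach implies: a candidate rule is a conjunction of attribute conditions, and its fitness is its classification accuracy on labeled credit data. The rule encoding and GP operators (crossover, mutation) used in the paper are not shown, and the applicants below are hypothetical:

```python
import operator

OPS = {"<": operator.lt, ">": operator.gt, "==": operator.eq}

def rule_predicts_bad(applicant, rule):
    """rule: list of (attribute, comparator, threshold) conditions, all of which must hold."""
    return all(OPS[cmp](applicant[attr], thr) for attr, cmp, thr in rule)

def fitness(rule, dataset):
    """Fraction of applicants whose 'bad' / 'good' label the rule classifies correctly."""
    correct = sum(rule_predicts_bad(x, rule) == (label == "bad") for x, label in dataset)
    return correct / len(dataset)

# Hypothetical applicants: (features, label)
dataset = [
    ({"duration": 36, "amount": 9000, "age": 23}, "bad"),
    ({"duration": 12, "amount": 1500, "age": 45}, "good"),
    ({"duration": 48, "amount": 12000, "age": 30}, "bad"),
    ({"duration": 6, "amount": 800, "age": 52}, "good"),
]
candidate_rule = [("duration", ">", 24), ("amount", ">", 5000)]
print(fitness(candidate_rule, dataset))   # 1.0 on this toy data
```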

Keywords: credit risk assessment, rule generation, genetic programming, feature selection

Procedia PDF Downloads 314
43730 Development of a Geomechanical Risk Assessment Model for Underground Openings

Authors: Ali Mortazavi

Abstract:

The main objective of this research project is to delve into the multitude of geomechanical risks associated with the various mining methods employed in the underground mining industry. Controlling the geotechnical design parameters and operational factors that affect the selection of suitable mining techniques for a given underground mining condition is considered from a risk assessment point of view. Important geomechanical challenges are investigated as appropriate and relevant to the commonly used underground mining methods. Given the complicated nature of the in-situ rock mass and the boundary conditions and operational complexities associated with various underground mining methods, the selection of a safe and economic mining operation is of paramount significance. Rock failure at varying scales within underground mining openings is always a threat to mining operations and causes human and capital losses worldwide. Geotechnical design is a major design component of all underground mines and largely determines the safety of an underground mine. Given the uncertainties that exist in rock characterization prior to mine development, there are always risks associated with inappropriate design as a function of mining conditions and the selected mining method. Uncertainty often results from the inherent variability of rock masses, which in turn is a function of both the geological materials and the in-situ rock mass conditions. The focus of this research is on developing a methodology that enables a geomechanical risk assessment of given underground mining conditions. The outcome of this research is a geotechnical risk analysis algorithm, which can be used as an aid in selecting the appropriate mining method as a function of mine design parameters (e.g., in-situ rock properties, design method, and governing boundary conditions such as in-situ stress and groundwater).

Keywords: geomechanical risk assessment, rock mechanics, underground mining, rock engineering

Procedia PDF Downloads 113
43729 Develop a Conceptual Data Model of Geotechnical Risk Assessment in Underground Coal Mining Using a Cloud-Based Machine Learning Platform

Authors: Reza Mohammadzadeh

Abstract:

The major challenges in geotechnical engineering in underground spaces arise from uncertainties and differing probabilities. The collection, collation, and integration of existing data into the analysis and design for a given prospect evaluation would be a reliable and practical way of solving problems under uncertainty. Machine learning (ML) is a subfield of artificial intelligence in statistical science that applies different techniques (e.g., regression, neural networks, support vector machines, decision trees, random forests, genetic programming, etc.) to data in order to learn and improve from them automatically, without being explicitly programmed, and to make decisions and predictions. In this paper, a conceptual database schema of geotechnical risks in underground coal mining based on a cloud system architecture has been designed. A new risk assessment approach using a three-dimensional risk matrix supported by the level of knowledge (LoK) is proposed in this model. Subsequently, the stages of the model workflow methodology are described. For data training and LoK model deployment, an ML platform has been implemented. IBM Watson Studio, a leading data science tool and data-driven cloud integration ML platform, is employed in this study. As a use case, a dataset of geotechnical hazards and risk assessments in underground coal mining was prepared to demonstrate the performance of the model, and the results are outlined accordingly.
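
A minimal sketch of a three-dimensional risk matrix of the kind described: likelihood and consequence give a conventional rating, and the level of knowledge (LoK) scales it so that poorly understood hazards are rated more conservatively. The scales, LoK factors, and band thresholds are illustrative assumptions, not the paper's calibrated model:

```python
LOK_FACTOR = {"high": 1.0, "medium": 1.25, "low": 1.5}   # lower knowledge -> higher rating

def risk_3d(likelihood: int, consequence: int, lok: str) -> float:
    """likelihood and consequence on assumed 1-5 scales; lok in {'high', 'medium', 'low'}."""
    return likelihood * consequence * LOK_FACTOR[lok]

def risk_band(score: float) -> str:
    if score >= 15:
        return "High"
    if score >= 6:
        return "Medium"
    return "Low"

# Example: a roof-fall hazard with sparse geotechnical data (low level of knowledge).
score = risk_3d(likelihood=3, consequence=4, lok="low")
print(score, risk_band(score))   # 18.0 High
```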

Keywords: data model, geotechnical risks, machine learning, underground coal mining

Procedia PDF Downloads 241
43728 Assessment Using Copulas of Simultaneous Damage to Multiple Buildings Due to Tsunamis

Authors: Yo Fukutani, Shuji Moriguchi, Takuma Kotani, Terada Kenjiro

Abstract:

If risk management of the assets owned by companies, risk assessment of real estate portfolios, and risk identification for an entire region are to be implemented, it is necessary to consider simultaneous damage to multiple buildings. This research focuses on the Sagami Trough earthquake tsunami, which could have a significant effect on the Japanese capital region, and proposes a method for simultaneous damage assessment using copulas that can take into consideration the correlation of tsunami depths and building damage between two sites. First, the tsunami inundation depths at two sites were simulated by using a nonlinear long-wave equation. The tsunamis were simulated by varying the slip amount (five cases) and the depth (five cases) for each of 10 sources of the Sagami Trough. For each source, the frequency distributions of the tsunami inundation depth were evaluated by using the response surface method. Then, Monte Carlo simulation was conducted, and frequency distributions of tsunami inundation depth were evaluated at the target sites for all sources of the Sagami Trough; these are the marginal distributions. Kendall's tau for the tsunami inundation simulation at the two sites was 0.83. Based on this value, Gaussian, t, Clayton, and Gumbel copulas (n = 10,000) were generated. Then, the simultaneous distributions of the damage rate were evaluated using the marginal distributions and the copulas. When the correlation of tsunami inundation depth between the two sites was accounted for, the expected value hardly changed compared with the uncorrelated case, but the ninety-ninth percentile of the damage rate was approximately 2%, and the maximum value was approximately 6%, when the Gumbel copula was used.
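
A minimal sketch of copula-based joint sampling for two sites, using a Gaussian copula for illustration (the study also compares t, Clayton, and Gumbel copulas). Kendall's tau = 0.83 is taken from the abstract, while the marginal depth distributions and the fragility function are placeholders:

```python
import numpy as np
from scipy import stats

tau = 0.83
rho = np.sin(np.pi * tau / 2)            # Gaussian-copula correlation implied by Kendall's tau
n = 10_000

rng = np.random.default_rng(0)
z = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=n)
u = stats.norm.cdf(z)                    # uniform marginals carrying the copula dependence

# Placeholder marginals for tsunami inundation depth at the two sites (lognormal assumed).
depth_site1 = stats.lognorm(s=0.6, scale=2.0).ppf(u[:, 0])
depth_site2 = stats.lognorm(s=0.5, scale=1.5).ppf(u[:, 1])

def damage_rate(depth):
    """Placeholder lognormal fragility curve: damage rate as a function of inundation depth."""
    return stats.norm.cdf((np.log(np.maximum(depth, 1e-9)) - np.log(2.0)) / 0.5)

portfolio_rate = 0.5 * (damage_rate(depth_site1) + damage_rate(depth_site2))
print(np.mean(portfolio_rate), np.quantile(portfolio_rate, 0.99))
```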

Keywords: copulas, Monte-Carlo simulation, probabilistic risk assessment, tsunamis

Procedia PDF Downloads 111
43727 Prediction and Analysis of Human Transmembrane Transporter Proteins Based on SCM

Authors: Hui-Ling Huang, Tamara Vasylenko, Phasit Charoenkwan, Shih-Hsiang Chiu, Shinn-Ying Ho

Abstract:

Knowledge of human transporters is still limited due to the technically demanding crystallization procedures required for the structural characterization of transporters by spectroscopic methods. It is therefore desirable to develop bioinformatics tools for the effective analysis of available sequences in order to identify human transmembrane transporter proteins (HMTPs). This study proposes a scoring card method (SCM) based approach for predicting HMTPs. We estimated a set of propensity scores of dipeptides to be HMTPs using SCM from the training dataset (HTS732), consisting of 366 HMTPs and 366 non-HMTPs. SCM using the estimated propensity scores of 20 amino acids and 400 dipeptides has a training accuracy of 87.63% and a test accuracy of 66.46%. The five top-ranked dipeptides are LD, NV, LI, KY, and MN, with scores of 996, 992, 989, 987, and 985, respectively. The five amino acids with the highest propensity scores are Ile, Phe, Met, Gly, and Leu, showing that hydrophobic residues are mostly highly scored. Furthermore, the obtained propensity scores were used to analyze the physicochemical properties of human transporters.
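
A minimal sketch of scoring-card classification: a query sequence's score is the mean propensity of its dipeptides, compared against a threshold. The five dipeptide scores come from the abstract; the default score for the remaining dipeptides and the decision threshold are illustrative assumptions:

```python
DIPEPTIDE_SCORE = {"LD": 996, "NV": 992, "LI": 989, "KY": 987, "MN": 985}
DEFAULT_SCORE = 500          # assumed score for dipeptides not listed above
THRESHOLD = 600              # assumed decision threshold

def scm_score(sequence: str) -> float:
    """Mean propensity score over all overlapping dipeptides of the sequence."""
    dipeptides = [sequence[i:i + 2] for i in range(len(sequence) - 1)]
    return sum(DIPEPTIDE_SCORE.get(dp, DEFAULT_SCORE) for dp in dipeptides) / len(dipeptides)

def is_hmtp(sequence: str) -> bool:
    return scm_score(sequence) >= THRESHOLD

seq = "MLDNVLIKYMN"           # toy sequence for illustration
print(scm_score(seq), is_hmtp(seq))
```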

Keywords: dipeptide composition, physicochemical property, human transmembrane transporter proteins, human transmembrane transporters binding propensity, scoring card method

Procedia PDF Downloads 339
43726 Comprehensive Risk Analysis of Decommissioning Activities with Multifaceted Hazard Factors

Authors: Hyeon-Kyo Lim, Hyunjung Kim, Kune-Woo Lee

Abstract:

The decommissioning process of nuclear facilities can be said to consist of a sequence of problem-solving activities, partly because there may be working environments contaminated by radiological exposure, and partly because there may also be industrial hazards such as fire, explosions, toxic materials, and electrical and physical hazards. For individual hazard factors, risk assessment techniques are becoming familiar to industrial workers with the advance of safety technology, but the way to integrate their results is not. Furthermore, few workers have extensive past experience of decommissioning operations. Therefore, many countries have been trying to develop appropriate techniques in order to guarantee the safety and efficiency of the process. In spite of that, there is still neither a domestic nor an international standard, since nuclear facilities are too diverse and unique. Consequently, the whole risk has to be imagined and assessed for each anticipated situation, one by one. This paper aimed to find an appropriate technique for integrating individual risk assessment results from the viewpoint of experts. Thus, on one hand, the whole risk assessment activity for decommissioning operations was modeled as a sequence of individual risk assessment steps, and on the other, a hierarchical risk structure was developed. Then, a risk assessment procedure that can elicit individual hazard factors one by one was introduced with reference to the standard operating procedure (SOP) and hierarchical task analysis (HTA). Assuming quantification and normalization of individual risks, a technique to estimate relative weight factors was tried using the conventional Analytic Hierarchy Process (AHP), and its results were reviewed against expert judgment. In addition, taking the ambiguity of human judgment into consideration, fuzzy inference was added with a mathematical case study.
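
A minimal sketch of deriving AHP weight factors from a pairwise comparison matrix via the principal eigenvector, with Saaty's consistency check; the 3x3 judgments below (radiological vs. fire/explosion vs. electrical/physical hazards) are illustrative, not elicited from experts:

```python
import numpy as np

A = np.array([
    [1.0, 3.0, 5.0],      # assumed pairwise judgments among three hazard groups
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                     # relative weight factors

# Consistency ratio (Saaty): CI = (lambda_max - n) / (n - 1), random index RI = 0.58 for n = 3.
n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)
cr = ci / 0.58
print(np.round(weights, 3), f"CR = {cr:.3f}")   # CR < 0.1 indicates acceptable consistency
```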

Keywords: decommissioning, risk assessment, analytic hierarchical process (AHP), fuzzy inference

Procedia PDF Downloads 396
43725 Assessment of Arterial Stiffness through Measurement of Magnetic Flux Disturbance and Electrocardiogram Signal

Authors: Jing Niu, Jun X. Wang

Abstract:

Arterial stiffness predicts mortality and morbidity independently of other cardiovascular risk factors and is a major risk factor for age-related morbidity and mortality. The non-invasive industry gold-standard measurement of arterial stiffness uses the pulse wave velocity method; however, the desktop device is expensive and requires a trained professional to operate. The main objective of this research is a proof of concept of the proposed non-invasive method, which uses measurements of magnetic flux disturbance and the electrocardiogram (ECG) signal to assess arterial stiffness. The method could enable accurate and easy self-assessment of arterial stiffness at home and help doctors with research, diagnosis, and prescription in hospitals and clinics. A platform for assessing arterial stiffness through the acquisition and analysis of the radial artery pulse waveform and the ECG signal has been developed based on the proposed method. The radial artery pulse waveform is acquired using magnetic-based sensing technology, while the ECG signal is acquired using two dry-contact single-arm ECG electrodes. The measurement only requires the participant to wear a wrist strap and an arm band. Participants were recruited for data collection using both the developed platform and the industry gold-standard system, and the results from both systems underwent correlation analysis. A strong positive correlation between the results of the two systems was observed. This study presents the possibility of developing an accurate, easy-to-use, and affordable measurement device for arterial stiffness assessment.
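
A minimal sketch of a pulse-wave-velocity style calculation of the kind such a platform could perform: the pulse transit time is taken from the ECG R-peak to the arrival (foot) of the radial pulse wave, and velocity is path length divided by that delay. The timing values and path length are synthetic; the platform's actual signal processing is not reproduced:

```python
t_r_peak = 0.200          # ECG R-peak time within the beat, s (synthetic)
t_pulse_foot = 0.320      # radial pulse foot arrival time, s (synthetic)
path_length_m = 0.60      # assumed heart-to-wrist path length, m

ptt = t_pulse_foot - t_r_peak          # pulse transit time, s
pwv = path_length_m / ptt              # m/s; stiffer arteries -> shorter PTT -> higher PWV
print(f"PTT = {ptt * 1000:.0f} ms, PWV = {pwv:.1f} m/s")
```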

Keywords: arterial stiffness, electrocardiogram, pulse wave velocity, magnetic flux disturbance

Procedia PDF Downloads 159
43724 Risk Prioritization in Tunneling Construction Projects

Authors: David Nantes, George Gilbert

Abstract:

Many risks can arise as a tunneling project develops, and it is crucial to be aware of them. Due to the unpredictable nature of tunneling projects and the interconnectedness of risk occurrences, the risk assessment approach presents a significant challenge. The purpose of this study is to provide a hybrid FDEMATEL-ANP model to help prioritize risks during tunnel construction projects. The ambiguity in expert judgments and the relative severity of interdependencies across risk occurrences are both taken into consideration by this model through the Fuzzy Decision-Making Trial and Evaluation Laboratory (FDEMATEL). The Analytic Network Process (ANP) method is used to rank priorities and assess project risks. The authors provide a case study of a subway tunneling construction project to support the validity of their methodology. The results showed that the proposed method successfully isolated key risk factors and elucidated their interplay in the case study. The proposed method has the potential to become a helpful resource for evaluating the risks associated with tunnel construction projects.
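
A minimal sketch of the (crisp) DEMATEL step underlying FDEMATEL: normalize the direct-influence matrix and compute the total relation matrix T = D(I - D)^-1, from which each factor's prominence (D + R) and net causal role (D - R) follow. The 3x3 influence judgments are illustrative; the fuzzy aggregation and the ANP ranking are not shown:

```python
import numpy as np

# Assumed direct influence among three hypothetical risks: collapse, water inflow, cost overrun.
X = np.array([
    [0, 3, 4],
    [2, 0, 3],
    [1, 1, 0],
], dtype=float)

D = X / max(X.sum(axis=1).max(), X.sum(axis=0).max())   # normalized direct-influence matrix
T = D @ np.linalg.inv(np.eye(3) - D)                     # total relation matrix

prominence = T.sum(axis=1) + T.sum(axis=0)               # D + R: overall importance of each risk
relation = T.sum(axis=1) - T.sum(axis=0)                 # D - R: net cause (+) or effect (-)
print(np.round(prominence, 2), np.round(relation, 2))
```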

Keywords: risk, prioritization, FDEMATEL, ANP, tunneling construction projects

Procedia PDF Downloads 59
43723 Performance of the Strong Stability Method in the Univariate Classical Risk Model

Authors: Safia Hocine, Zina Benouaret, Djamil Aïssani

Abstract:

In this paper, we study the performance of the strong stability method in the univariate classical risk model. We are interested in the stability bounds established using two approaches. The first is based on the strong stability method developed for general Markov chains; the second is based on regenerative process theory. By adopting an algorithmic procedure, we study the performance of the stability method in the case of exponentially distributed claim amounts. After presenting the stability bounds numerically and graphically, we interpret and compare the results.
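
For reference, the exponential-claim case studied algorithmically above admits a closed-form ruin probability in the classical (Cramer-Lundberg) model; the sketch below assumes Exp(1/mu) claims, Poisson arrival rate lam, premium rate c, and safety loading theta = c/(lam*mu) - 1, with illustrative parameter values:

```python
import math

def ruin_probability(u, lam=1.0, mu=1.0, c=1.2):
    """psi(u) = exp(-theta*u / ((1+theta)*mu)) / (1+theta) for exponential claims."""
    theta = c / (lam * mu) - 1.0           # relative safety loading (must be > 0)
    return math.exp(-theta * u / ((1.0 + theta) * mu)) / (1.0 + theta)

for u in (0.0, 5.0, 10.0):                 # initial reserve levels (illustrative)
    print(f"u = {u:4.1f}  psi(u) = {ruin_probability(u):.4f}")
```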

Keywords: Markov chain, regenerative process, risk model, ruin probability, strong stability

Procedia PDF Downloads 287
43722 Development of Risk Management System for Urban Railroad Underground Structures and Surrounding Ground

Authors: Y. K. Park, B. K. Kim, J. W. Lee, S. J. Lee

Abstract:

To assess the risk of underground structures and the surrounding ground, we collect basic data through engineering measurements, exploration, and surveys, and derive the risk through appropriate analysis and assessment for urban railroad underground structures and the surrounding ground, including station inflow. Basic data are obtained from fiber-optic sensors, MEMS sensors, water quantity/quality sensors, a tunnel scanner, ground-penetrating radar, and a lightweight deflectometer, and are evaluated against the appropriate threshold values. Based on these data, we analyze the risk level of urban railroad underground structures and the surrounding ground, and we develop a risk management system to manage these data efficiently and to provide a convenient interface for data input and output.

Keywords: urban railroad, underground structures, ground subsidence, station inflow, risk

Procedia PDF Downloads 308
43721 Comparison between Deterministic and Probabilistic Stability Analysis, Featuring Consequent Risk Assessment

Authors: Isabela Moreira Queiroz

Abstract:

Slope stability analyses are largely carried out by deterministic methods and evaluated through a single safety factor. Although it is known that geotechnical parameters can show great dispersion, such analyses treat them as fixed and known. Probabilistic methods, in turn, incorporate the variability of the key input parameters (random variables), resulting in a range of safety factor values and thus enabling the determination of the probability of failure, which is an essential parameter in the calculation of risk (probability multiplied by the consequence of the event). Among the probabilistic methods, three are frequently used in the geotechnical community: FOSM (first-order, second-moment), Rosenblueth (point estimates), and Monte Carlo. This paper presents a comparison between the results from deterministic and probabilistic analyses (FOSM, Rosenblueth, and Monte Carlo) applied to a hypothetical slope. The aim was to evaluate the behavior of the slope and carry out the consequent risk analysis, which is used to calculate the risk and to analyze mitigation and control solutions. It can be observed that the results obtained by the three probabilistic methods were quite close. It should be noted that calculating the risk makes it possible to prioritize the implementation of mitigation measures. Therefore, it is recommended to perform a good assessment of the geological-geotechnical model, incorporating uncertainty at the feasibility, design, construction, operation, and closure stages by means of risk management.
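
A minimal sketch of a Monte Carlo probability-of-failure estimate for a slope, assuming a simplified factor-of-safety model (resistance over demand) with lognormal inputs; the hypothetical slope in the paper and its limit-equilibrium formulation are not reproduced:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Assumed random inputs (lognormal): cohesion, friction, and an equivalent driving demand.
cohesion = rng.lognormal(mean=np.log(25.0), sigma=0.25, size=n)                 # kPa
tan_phi = rng.lognormal(mean=np.log(np.tan(np.radians(30))), sigma=0.15, size=n)
driving = rng.lognormal(mean=np.log(60.0), sigma=0.10, size=n)                  # kPa-equivalent

fs = (cohesion + 100.0 * tan_phi) / driving       # toy factor of safety: resistance over demand
pf = np.mean(fs < 1.0)                            # probability of failure

consequence = 2.0e6                               # assumed consequence of failure (currency units)
risk = pf * consequence                           # risk = probability x consequence
print(f"mean FS = {fs.mean():.2f}, Pf = {pf:.4f}, risk = {risk:,.0f}")
```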

Keywords: probabilistic methods, risk assessment, risk management, slope stability

Procedia PDF Downloads 345
43720 An Information System for Strategic Performance Scoring in Municipal Management

Authors: Emin Gundogar, Aysegul Yilmaz

Abstract:

Strategic performance scoring is a significant procedure in management. There are various methods to improve this procedure. This study introduces an information system that is developed to score performance for municipal management. The application of the system is clarified by exemplifying municipal processes.

Keywords: management information system, municipal management, performance scoring

Procedia PDF Downloads 737
43719 Enhancing a Recidivism Prediction Tool with Machine Learning: Effectiveness and Algorithmic Fairness

Authors: Marzieh Karimihaghighi, Carlos Castillo

Abstract:

This work studies how machine learning (ML) may be used to increase the effectiveness of a criminal recidivism risk assessment tool, RisCanvi. The two key dimensions of this analysis are predictive accuracy and algorithmic fairness. The ML-based prediction models obtained in this study are more accurate at predicting criminal recidivism than the manually created formula used in RisCanvi, achieving an AUC of 0.76 and 0.73 in predicting violent and general recidivism, respectively. However, the improvements are small, and it is noticed that algorithmic discrimination can easily be introduced between groups such as nationals vs. foreigners, or young vs. old. It is described how effectiveness and algorithmic fairness objectives can be balanced by applying a method in which a single error disparity, in terms of the generalized false positive rate, is minimized while calibration is maintained across groups. The results show that this bias mitigation procedure can substantially reduce generalized false positive rate disparities across multiple groups. Based on these results, it is proposed that ML-based criminal recidivism risk prediction should not be introduced without applying algorithmic bias mitigation procedures.
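
A minimal sketch of the disparity metric named above: the generalized false positive rate (GFPR) of a group is the average predicted risk score among its non-recidivists, and the disparity is the gap between groups. The scores and labels below are synthetic, not study data:

```python
import numpy as np

def generalized_fpr(scores, labels):
    """Mean predicted score over true negatives (labels == 0)."""
    scores, labels = np.asarray(scores, dtype=float), np.asarray(labels)
    return scores[labels == 0].mean()

# Synthetic predictions for two groups (e.g., nationals vs. foreigners): (scores, labels).
group_a = ([0.2, 0.4, 0.7, 0.1, 0.5], [0, 0, 1, 0, 1])
group_b = ([0.3, 0.6, 0.8, 0.4, 0.2], [0, 0, 1, 0, 0])

gfpr_a = generalized_fpr(*group_a)
gfpr_b = generalized_fpr(*group_b)
print(f"GFPR disparity = {abs(gfpr_a - gfpr_b):.3f}")
```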

Keywords: algorithmic fairness, criminal risk assessment, equalized odds, recidivism

Procedia PDF Downloads 123
43718 Assessment of Delirium, Its Possible Risk Factors and Outcome in Patients Admitted to the Medical Intensive Care Unit

Authors: Rupesh K. Chaudhary, Narinder P. Jain, Rajesh Mahajan, Rajat Manchanda

Abstract:

Introduction: Delirium is a complex, multifactorial neuropsychiatric syndrome comprising a broad range of cognitive and neurobehavioral symptoms. In critically ill patients, it may develop secondary to multiple predisposing factors. Although it can be transient and reversible, if left untreated it may lead to long-term cognitive dysfunction. Early identification and assessment of risk factors usually help in the appropriate management of delirium, which in turn reduces hospital stay, cost of therapy, and mortality. Aim and Objective: The aim of the present study was to estimate the incidence of delirium using a validated scale in medical ICU patients and to determine the associated risk factors and outcomes. Material and Method: A prospective study in an 18-bed medical intensive care unit (ICU) was undertaken. A total of 357 consecutive patients admitted to the ICU for more than 24 hours were assessed. These patients were screened with the help of the Confusion Assessment Method for the Intensive Care Unit (CAM-ICU), the Richmond Agitation and Sedation Scale, a delirium screening checklist, and APACHE II. Appropriate statistical analysis was done to evaluate the risk factors influencing mortality in delirium. Results: Delirium occurred in 54.6% of 194 patients. The risk of delirium was independently associated with a history of hypertension and diabetes, but not with the severity of illness (APACHE II score). Delirium was linked to a longer ICU stay (13.08 ± 9.6 vs. 7.07 ± 4.98 days) and higher ICU mortality (35.8% vs. 17.0%). Conclusion: Our study concluded that delirium is a major risk factor for patient outcome and carries high mortality, so timely intervention helps in addressing these issues.

Keywords: delirium, risk factors, outcome, intervention

Procedia PDF Downloads 134
43717 Simplifying Health Risk Assessment (HRA) and Its Operationalisation for Turnaround Activities

Authors: Thirumila Muthukamaru

Abstract:

The objective of a Health Risk Assessment (HRA) is to achieve a quality evaluation of health risks in a timely manner so that adequate controls can be put in place to protect workers' health, especially during turnarounds, when the many activities performed increase workers' exposure to health hazards. HRA development requires a competent team of experienced subject matter experts in the field, such as industrial hygienists, occupational health doctors, turnaround coordinators, and operations/maintenance personnel. The conventional way of conducting an HRA is not only tedious and time-consuming but also less appreciated when it is not interpreted correctly, which may contribute to its inadequate operationalization. Simplification can be the essence of timely intervention in managing health risks. This paper shares the approach taken to simplify the methodology for developing the HRA report and operationalizing it. The approach includes developing a generic HRA for turnaround activities to be used as a reference document, and empowering identified personnel through upskilling sessions to take up the role of facilitating HRA sessions. This empowerment is one of the key approaches to successfully translating the HRA into specific turnaround Job Hazard Analyses (JHA) that embed it in the Permit to Work (PTW) process. The approach used here increases awareness of and compliance with the HRA for turnaround activities through better interpretation and operationalization of the HRA report, adding value to the risk assessment for turnaround activities.

Keywords: industrial hygiene, health risk assessment, HRA, risk assessment

Procedia PDF Downloads 17
43716 Corrosion Risk Assessment/Risk Based Inspection (RBI)

Authors: Lutfi Abosrra, Alseddeq Alabaoub, Nuri Elhaloudi

Abstract:

Corrosion processes in the oil and gas industry can lead to failures that are usually costly to repair, costly in terms of loss of contaminated product and environmental damage, and possibly costly in terms of human safety. This article describes the results of the corrosion review and criticality assessment done at Mellitah Gas (SRU unit) for pressure equipment and piping systems. The information gathered through the review was intended for developing a qualitative RBI study. The corrosion criticality assessment was carried out by applying company procedures and industry recommended practices such as API 571, API 580/581, and ASME PCC-3, which provide guidelines for establishing a corrosion integrity assessment. The corrosion review is intimately related to the probability of failure (POF). During the corrosion study, the process units were reviewed by following the applicable process flow diagrams (PFDs) in the presence of Mellitah's personnel from process engineering and inspection, together with corrosion/materials and reliability engineers. The expected corrosion damage mechanisms (internal and external) were identified, and the corrosion rate was estimated for every piece of equipment and corrosion loop in the process units. A combination of consequence and likelihood of failure was used for determining the corrosion risk. A qualitative consequence of failure (COF) was assigned to each individual item, based on the flammability, toxicity, and pollution potential of the fluid, into three levels (high, medium, and low). A qualitative probability of failure (POF) was applied to evaluate the internal and external degradation mechanisms, using a high-level point-based scale (0 to 10) for risk prioritization into low, medium, and high.

Keywords: corrosion, criticality assessment, RBI, POF, COF

Procedia PDF Downloads 35
43715 Risk Identification of Investment Feasibility in Indonesia’s Toll Road Infrastructure Investment

Authors: Christo Februanto Putra

Abstract:

This paper presents the identification of risks that affect investment feasibility in toll road infrastructure in Indonesia, using a qualitative survey of expert practitioners among investors, contractors, and state officials. The problem with infrastructure investment in Indonesia, especially under the KPBU contract model, is that many risk factors in the investment plan are not calculated in thorough detail. A risk factor is a value used to provide an overview of the assessed risk level of an event, as a function of the probability of occurrence and the consequences of the risks that arise. The survey results show which risk factors directly impact investment feasibility and rank them by their impact on the investment.

Keywords: risk identification, Indonesia toll road, investment feasibility

Procedia PDF Downloads 244