Search results for: weighted risk scores
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7499

7499 Weighted Risk Scores Method Proposal for Occupational Safety Risk Assessment

Authors: Ulas Cinar, Omer Faruk Ugurlu, Selcuk Cebi

Abstract:

Occupational safety risk management is the most important element of a safe working environment, and effective risk management is only possible with accurate analyses and evaluations. Scoring-based risk assessment methods offer considerable ease of application because they convert linguistic expressions into numerical results, and they can be easily adapted to any field. Despite these advantages, important problems with scoring-based methods are frequently discussed, and effective measurability is one of the most critical. Existing methods allow experts to choose a single score for each parameter, so experts select the score of the most likely outcome of a risk while all other possible consequences are neglected. As a result, assessments made with the existing methods express the most probable level of risk, not the real risk of the enterprise. This study aims to develop a method that provides a more comprehensive evaluation than the existing methods by weighting the probability and severity scores over all sub-parameters and potential outcomes, and a new scoring-based method is proposed to the literature.
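
A minimal sketch of the core idea in Python. The abstract does not give the paper's actual scoring scheme, so the probabilities, severities, and the expectation-style aggregation below are illustrative assumptions:

```python
# Classical scoring methods keep only the most likely outcome of a hazard;
# a weighted risk score aggregates over all possible consequences.

def classical_risk(outcomes):
    # outcomes: list of (probability, severity) pairs for one hazard
    p, s = max(outcomes, key=lambda ps: ps[0])  # most likely outcome only
    return p * s

def weighted_risk(outcomes):
    # expectation over every possible consequence, not just the modal one
    return sum(p * s for p, s in outcomes)

hazard = [(0.70, 2), (0.25, 5), (0.05, 9)]  # illustrative values
print(classical_risk(hazard))  # 1.4 -> ignores rarer, severer outcomes
print(weighted_risk(hazard))   # 3.1 -> reflects the full outcome spectrum
```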

Keywords: occupational health and safety, risk assessment, scoring based risk assessment method, underground mining, weighted risk scores

Procedia PDF Downloads 112
7498 Clustering for Detection of the Population at Risk of Anticholinergic Medication

Authors: A. Shirazibeheshti, T. Radwan, A. Ettefaghian, G. Wilson, C. Luca, Farbod Khanizadeh

Abstract:

Anticholinergic medication has been associated with events such as falls, delirium, and cognitive impairment in older patients, and anticholinergic burden scores have been developed to quantify this risk. A clustering-based risk model was deployed in a healthcare management system to group patients into multiple risk levels according to the anticholinergic burden scores of the medicines prescribed to them, in order to facilitate clinical decision-making. Anticholinergic burden scores of drugs were extracted from the literature, which categorizes the risk on a scale of 1 to 3. Given the patients' prescription data in the healthcare database, a weighted anticholinergic risk score was derived per patient based on the prescription of multiple anticholinergic drugs. The study was conducted on over 300,000 records of patients currently registered with a major regional UK-based healthcare provider. The weighted risk scores were used as inputs to an unsupervised learning algorithm (mean-shift clustering) that groups patients into clusters representing different levels of anticholinergic risk. To evaluate the model further, associations between the average risk score within each group and other factors, such as socioeconomic status (i.e., Index of Multiple Deprivation) and an index of health and disability, were investigated. The clustering identified a group of 15 patients at the highest risk from multiple anticholinergic medication. Our findings also show that this group of patients is located within more deprived areas of London compared with the populations of the other risk groups. Furthermore, the prescription of anticholinergic medicines is more skewed towards female than male patients, indicating that females are more at risk from this kind of multiple medication. The risk may be monitored and controlled in healthcare management systems that are well equipped with artificial intelligence.
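
A compact sketch of the pipeline described above, assuming summation as the per-patient weighting scheme (the abstract does not specify it) and a handful of made-up drugs and burden scores:

```python
import numpy as np
from sklearn.cluster import MeanShift

# Hypothetical per-drug burden scores on the 1-3 scale from the literature
BURDEN = {"amitriptyline": 3, "paroxetine": 3, "ranitidine": 1, "warfarin": 1}

def patient_score(prescriptions):
    # weighted anticholinergic risk score: burden summed over all
    # anticholinergic drugs prescribed to the patient
    return sum(BURDEN.get(drug, 0) for drug in prescriptions)

# One score per patient (toy data, not the study's 300,000 records)
scores = np.array([[patient_score(p)] for p in [
    ["ranitidine"], ["amitriptyline", "paroxetine"], ["warfarin"],
    ["amitriptyline", "ranitidine", "paroxetine"],
]])

# mean-shift groups patients into data-driven risk clusters
labels = MeanShift(bandwidth=1.5).fit_predict(scores)
print(labels)
```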

Keywords: anticholinergic medicines, clustering, deprivation, socioeconomic status

Procedia PDF Downloads 164
7497 Bimodal Biometrics System Using Fusion of Iris and Fingerprint

Authors: Attallah Bilal, Hendel Fatiha

Abstract:

This paper proposes a bimodal biometric system for identity verification using iris and fingerprint, fused at the matching-score level with a weighted sum-of-scores technique. Features are extracted from the preprocessed iris and fingerprint images, and the features of a query image are compared with those of a database image to obtain matching scores. The individual scores generated after matching are passed to the fusion module, which consists of three major steps: normalization, generation of similarity scores, and fusion of the weighted scores. The final score is then used to declare the person genuine or an impostor. The system is tested on the CASIA database and gives an overall accuracy of 91.04% with an FAR of 2.58% and an FRR of 8.34%.
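
The fusion module lends itself to a short sketch. The normalization ranges, weights, and threshold below are assumptions for illustration, not the paper's tuned values:

```python
def min_max_normalize(score, lo, hi):
    # map a matcher's raw score onto [0, 1] so that iris and fingerprint
    # scores become comparable before fusion
    return (score - lo) / (hi - lo)

def fused_score(iris_s, finger_s, w_iris=0.6, w_finger=0.4):
    # weighted sum-of-scores fusion; weights must sum to 1
    return w_iris * iris_s + w_finger * finger_s

def decide(iris_raw, finger_raw, threshold=0.5):
    s = fused_score(min_max_normalize(iris_raw, 0, 100),
                    min_max_normalize(finger_raw, 0, 500))
    return "genuine" if s >= threshold else "impostor"

print(decide(82, 410))  # 0.82 fused score -> genuine
```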

Keywords: iris, fingerprint, sum rule, fusion

Procedia PDF Downloads 331
7496 Econophysics: The Use of Entropy Measures in Finance

Authors: Muhammad Sheraz, Vasile Preda, Silvia Dedu

Abstract:

Concepts of econophysics are usually used to solve problems related to uncertainty and nonlinear dynamics. In the theory of option pricing, risk-neutral probabilities play a very important role. The application of entropy in finance can be regarded as an extension of both information entropy and probability entropy, and it can be an important tool in various financial methods such as risk measurement, portfolio selection, option pricing, and asset pricing. Gulko applied Entropy Pricing Theory (EPT) to price stock options and introduced an alternative to the Black-Scholes framework for pricing European stock options. In this article, we present solutions to maximum entropy problems based on the Tsallis, Weighted-Tsallis, Kaniadakis, and Weighted-Kaniadakis entropies to obtain risk-neutral densities. We also obtain the values of European calls and puts in this framework.
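
For orientation, the discrete Tsallis entropy on which these maximum entropy problems are built has the standard form below; the risk-neutral density is the distribution maximizing it subject to normalization and martingale constraints. The weighted variants add outcome weights inside the sum, and their exact definitions are not given in the abstract:

```latex
S_q(p) = \frac{1}{q-1}\left(1 - \sum_i p_i^{\,q}\right),
\qquad
\lim_{q \to 1} S_q(p) = -\sum_i p_i \ln p_i \quad \text{(Shannon entropy)} .
```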

Keywords: option pricing, Black-Scholes model, Tsallis entropy, Kaniadakis entropy, weighted entropy, risk-neutral density

Procedia PDF Downloads 264
7495 Notes on Frames in Weighted Hardy Spaces and Generalized Weighted Composition Operators

Authors: Shams Alyusof

Abstract:

This work aims to enrich the study of frames, given their prominent role in pure and applied mathematics and their many applications in computer science and engineering. Recently, there have been remarkable studies of operators that preserve frames on some spaces, and this research can be considered an extension of such studies. Indeed, in this paper we characterize the weighted composition operators that preserve frames in weighted Hardy spaces on the open unit disk. Moreover, we show that this characterization does not extend to generalized weighted composition operators on such spaces. Nevertheless, the study could be extended to provide more specific characterizations.
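
For orientation, the standard definitions involved (notation may differ from the paper's): a sequence in a Hilbert space is a frame when the frame inequality below holds, and the operators studied act by composition with an analytic self-map of the disk:

```latex
\{f_k\} \text{ is a frame for } \mathcal{H} \iff
\exists\, 0 < A \le B:\quad
A\|f\|^2 \le \sum_k |\langle f, f_k\rangle|^2 \le B\|f\|^2
\quad \forall f \in \mathcal{H},
```

with the weighted composition operator $(W_{\psi,\varphi}f)(z) = \psi(z)\,f(\varphi(z))$ and its generalized version $D^n_{\psi,\varphi}f = \psi \cdot (f^{(n)} \circ \varphi)$ acting on the weighted Hardy space.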

Keywords: frames, generalized weighted composition operators, weighted Hardy spaces, analytic functions

Procedia PDF Downloads 77
7494 A Geographical Information System Supported Method for Determining Urban Transformation Areas in the Scope of Disaster Risks in Kocaeli

Authors: Tayfun Salihoğlu

Abstract:

Following Law No. 6306 on the Transformation of Areas under Disaster Risk, urban transformation in Turkey found its legal basis. In best practices all over the world, urban transformation has been shaped as part of comprehensive social programs through discourses of renewing the economically, socially, and physically degraded parts of the city, producing spaces resistant to earthquakes and other possible disasters, and creating a livable environment. In Turkish practice, a contradictory process is observed. This study aims to develop a method for better understanding urban space in terms of disaster risks, to serve as a basis for decisions in the Kocaeli Urban Transformation Master Plan being prepared by Kocaeli Metropolitan Municipality. The spatial unit used in the study is the 50x50 m grid. To reflect the multidimensionality of urban transformation, three basic components with available spatial data in Kocaeli were identified and named 'Problems in Built-up Areas', 'Disaster Risks Arising from Geological Conditions of the Ground and Problems of Buildings', and 'Inadequacy of Urban Services'. Each component was weighted and scored for each grid. To delimit urban transformation zones, Optimized Outlier Analysis (Local Moran's I) in ArcGIS 10.6.1 was conducted to test the type of distribution (clustered or scattered) and its significance on the grids, taking the weighted total score of each grid as the input feature. This analysis showed that the weighted total scores did not cluster significantly at all grids in the urban space. The grids at which the input feature clustered significantly were exported as a new database for further mapping. The Total Score Map reflects the significant clusters in terms of the weighted total scores of the three components, and the grids with the highest scores are the most likely candidates for urban transformation in this citywide study. To categorize urban space in terms of urban transformation, Grouping Analysis in ArcGIS 10.6.1 was conducted on the component scores of the significantly clustered grids. Based on the pseudo-F statistics and box plots, the six groups with the highest F statistics were extracted, and mapping these groups shows that they can be interpreted meaningfully in relation to the urban space. The method presented in this study can be extended as more spatial data become available. By integrating other data obtained during the planning process, it can contribute to the research and decision-making processes of urban transformation master plans on a more consistent basis.
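
The clustering step can be reproduced outside ArcGIS. The sketch below uses the open-source PySAL stack (libpysal and esda are assumed to be installed) to run the same Local Moran's I statistic on an illustrative grid of weighted total scores:

```python
import numpy as np
from libpysal.weights import lat2W
from esda.moran import Moran_Local

# Illustrative weighted total scores on a 20x20 grid (a stand-in for
# the study's 50x50 m grids and ArcGIS Optimized Outlier Analysis)
rng = np.random.default_rng(0)
scores = rng.gamma(shape=2.0, scale=10.0, size=400)

w = lat2W(20, 20)               # rook contiguity between grid cells
lisa = Moran_Local(scores, w)   # Local Moran's I with permutation test

significant = lisa.p_sim < 0.05  # grids with significant clustering
print(f"{significant.sum()} of 400 grids cluster significantly")
```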

Keywords: urban transformation, GIS, disaster risk assessment, Kocaeli

Procedia PDF Downloads 88
7493 Some Results for F-Minimal Hypersurfaces in Manifolds with Density

Authors: M. Abdelmalek

Abstract:

In this work, we study hypersurfaces of constant weighted mean curvature embedded in weighted manifolds, and we give a condition under which these hypersurfaces are minimal. The condition is given by the ellipticity of the weighted Newton transformations. In particular, we prove that two compact hypersurfaces of constant weighted mean curvature embedded in space forms that intersect in at least one boundary point must be transverse. The method is based on computing the matrix of the second fundamental form at a boundary point and then the matrix associated with the Newton transformations; from the resulting equality, we obtain the weighted elementary symmetric functions on the boundary of the hypersurface. We conclude with some examples and applications; in particular, in Euclidean space we use the above result to prove the Alexandrov spherical caps conjecture in the weighted case.
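
For context, on a manifold with density $e^{-f}$ the weighted mean curvature of a hypersurface with unit normal $N$ is commonly written as below (sign conventions vary across the literature), and the hypersurface is f-minimal when it vanishes:

```latex
H_f \;=\; H \;-\; \langle \nabla f, N \rangle ,
\qquad
\Sigma \text{ is } f\text{-minimal} \iff H_f \equiv 0 .
```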

Keywords: weighted mean curvature, weighted manifolds, ellipticity, Newton transformations

Procedia PDF Downloads 55
7492 The Study of Rapid Entire Body Assessment and Quick Exposure Check Correlation in an Engine Oil Company

Authors: Mohammadreza Ashouri, Majid Motamedzade

Abstract:

Rapid Entire Body Assessment (REBA) and Quick Exposure Check (QEC) are two general methods for assessing the risk factors of work-related musculoskeletal disorders (WMSDs). This study aimed to compare the ergonomic risk assessment outputs of QEC and REBA in terms of agreement in the distribution of postural loading scores, based on an analysis of working postures. This cross-sectional study was conducted in an engine oil company in which 40 jobs were studied; a trained occupational health practitioner observed all jobs. Job information was collected to ensure the completion of the ergonomic risk assessment tools, QEC and REBA. The results revealed a significant correlation between the final scores (r=0.731) and the action levels (r=0.893) of the two methods. Comparison of the action levels and final scores of the two methods showed no significant difference among working departments. Most of the studied postures acquired low or moderate risk levels in the QEC assessment (low risk=20%, moderate risk=50%, high risk=30%) and in the REBA assessment (low risk=15%, moderate risk=60%, high risk=25%). The two methods correlate strongly in identifying risky jobs and determining the potential risk for the incidence of WMSDs. Therefore, researchers may apply both methods interchangeably for postural risk assessment in appropriate working environments.
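
The reported agreement statistics come from rank correlation, which is a one-liner to reproduce on paired final scores; the numbers below are illustrative, not the study's data:

```python
from scipy.stats import spearmanr

# Final scores for the same eight jobs under both methods (toy values)
reba = [4, 9, 6, 11, 3, 8, 10, 5]
qec = [41, 70, 55, 88, 32, 62, 81, 47]

rho, p = spearmanr(reba, qec)   # rank correlation, as used for r=0.731
print(f"rho={rho:.3f}, p={p:.4f}")
```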

Keywords: observational method, QEC, REBA, musculoskeletal disorders

Procedia PDF Downloads 330
7491 Analyzing Safety Incidents Using the Fatigue Risk Index Calculator as an Indicator of Fatigue within a UK Rail Franchise

Authors: Michael Scott Evans, Andrew Smith

Abstract:

The feeling of fatigue at work can have devastating consequences. The aim of this study was to investigate whether the well-established objective indicator of fatigue used by the rail industry, the Fatigue Risk Index (FRI) calculator, is an effective indicator of the number of safety incidents in which fatigue could have been a contributing factor. The study received ethics approval from Cardiff University's Ethics Committee (EC.16.06.14.4547). A total of 901 safety incidents recorded by a single British rail franchise between 1st June 2010 and 31st December 2016 were extracted from the Safety Management Information System (SMIS). The safety incident types identified as ones in which fatigue could have been a contributing factor were: Signal Passed at Danger (SPAD), Train Protection & Warning System (TPWS) activation, Automatic Warning System (AWS) slow to cancel, failed to call, and station overrun. For the 901 recorded safety incidents, the scheduling system CrewPlan was used to extract the Fatigue Index (FI) score and Risk Index (RI) score of the train drivers on the day of each incident. Only the working rosters of 64.2% (N = 578) of the drivers (550 men and 28 women), ranging in age from 24 to 65 years (M = 47.13, SD = 7.30), were accessible for analysis. Analysis of all 578 train drivers involved in safety incidents revealed that 99.8% (N = 577) of FI scores fell within or below the guideline threshold of 45, and 97.9% (N = 566) of RI scores fell below the 1.6 threshold; such scores represent good practice within the rail industry. These findings indicate that the FRI calculator used by this British rail franchise was not an effective predictor of safety incidents in which fatigue could have been a contributing factor, as only 0.2% of the FI scores and 2.1% of the RI scores in such incidents exceeded their thresholds. Further research is needed to determine why such a large proportion of the train drivers involved in safety incidents in which fatigue could have been a contributing factor had such low FI and RI scores, and whether other factors could provide a better indication.

Keywords: fatigue risk index calculator, objective indicator of fatigue, rail industry, safety incident

Procedia PDF Downloads 149
7490 Apollo Clinical Excellence Scorecard (ACE@25): An Initiative to Drive Quality Improvement in Hospitals

Authors: Anupam Sibal

Abstract:

Whatever is measured tends to improve. With a view to objectively measuring and improving clinical quality across the Apollo Group Hospitals, the ACE@25 (Apollo Clinical Excellence@25) initiative was launched in January 2009. ACE@25 is a clinically balanced scorecard incorporating 25 clinical quality parameters, covering complication rates, mortality rates, one-year survival rates, and average length of stay after major procedures such as liver and renal transplant, CABG, TKR, THR, TURP, PTCA, endoscopy, large bowel resection, and MRM, across all major specialties. Hospital-acquired infection rates, pain satisfaction, and medication errors are also included. Benchmarks have been chosen from the world's best hospitals. Outcomes receive weighted scores, color-coded green, orange, and red, with a maximum cumulative score of 100. Data are reported monthly by 43 group hospitals online on the Lighthouse platform. Action-taken reports for parameters falling in the red are submitted quarterly and reviewed by the board, and an audit team audits the data at all locations every six months. Scores are linked to the appraisal of the medical head, and there is an 'ACE@25 Champion' award for the highest scorer. Scores for the different parameters varied from green to red at the start of the initiative. Most hospitals improved their scores over the following four years on parameters that had scored red or orange at the start, and the overall score for the group increased from 72 in 2010 to 81 in 2015.
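
A toy sketch of how such a weighted, color-banded scorecard can be aggregated. The parameters, weights, benchmarks, and banding rule below are invented for illustration; the abstract does not disclose ACE@25's actual scheme:

```python
# name: (weight, benchmark, observed) -- all values illustrative
PARAMS = {
    "CABG mortality %":         (6, 1.5, 1.2),
    "TKR infection %":          (4, 0.8, 1.1),
    "Medication errors / 1000": (5, 2.0, 2.0),
}

def band(benchmark, observed):
    # green: at or better than benchmark; orange: within 25%; else red
    if observed <= benchmark:
        return "green", 1.0
    if observed <= 1.25 * benchmark:
        return "orange", 0.5
    return "red", 0.0

total = 0.0
for name, (weight, bench, obs) in PARAMS.items():
    color, factor = band(bench, obs)
    total += weight * factor
    print(f"{name}: {color}")
print(f"weighted score: {total} of {sum(w for w, _, _ in PARAMS.values())}")
```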

Keywords: benchmarks, clinical quality, lighthouse, platform, scores

Procedia PDF Downloads 262
7489 Bereavement Risk Assessment of Family Caregivers of Patients with Cancer: Relationship between Bereavement Risk and Post-Loss Psychological Distress

Authors: Tomohiro Uchida, Noriaki Satake, Toshimichi Nakaho, Akira Inoue, Hidemitsu Saito

Abstract:

In this study, we assessed the bereavement risk of family caregivers of patients with cancer. In the palliative care unit of Tohoku University Hospital, we conducted a family psychoeducation session to support the family caregivers of patients with cancer. A total of 50 participants (8 males and 42 females; mean age = 62.98 years, SD = 11.10) were assessed after the session for bereavement risk using the Japanese version of the Bereavement Risk Assessment Tool (BRAT-J). According to the BRAT-J scores, eight participants had no known risk (Level 1), seventeen had minimal risk (Level 2), twenty had low risk (Level 3), four had moderate risk (Level 4), and one had high risk (Level 5). Of these participants, seven completed a follow-up postal survey assessing their psychological distress (Kessler Psychological Distress Scale; K6), allowing comparison with their bereavement risk. According to the K6 scores, three-fourths of the individuals at Level 3 on the BRAT-J scored above the cutoff point (>10) for the detection of depressive disorder, whereas one-third of the individuals at Level 2 did. It therefore appears that the BRAT-J can predict the likelihood of difficulties or complications in bereaved family caregivers. This research was approved by the Ethics Committee of Tohoku University Graduate School of Medicine and Tohoku University Hospital.

Keywords: palliative care, family caregivers, bereavement risk, BRAT, post-loss psychological distress

Procedia PDF Downloads 427
7488 Post-Contrast Susceptibility Weighted Imaging vs. Post-Contrast T1 Weighted Imaging for Evaluation of Brain Lesions

Authors: Sujith Rajashekar Swamy, Meghana Rajashekara Swamy

Abstract:

Although T1-weighted gadolinium-enhanced imaging (T1-Gd) has an established clinical role in diagnosing brain lesions of infectious and metastatic origin, post-contrast susceptibility-weighted imaging (SWI) has been understudied. This observational study aims to explore and compare the prominence of brain parenchymal lesions between T1-Gd and SWI-Gd images. A cross-sectional design was used to analyze 58 patients with brain parenchymal lesions using T1-Gd and SWI-Gd scanning techniques. Our results indicated that SWI-Gd enhanced the conspicuity of metastatic as well as infectious brain lesions when compared with T1-Gd. Consequently, it can be used as an adjunct to T1-Gd for post-contrast imaging, thereby avoiding additional contrast administration. Improved conspicuity of brain lesions translates directly to better patient outcomes, and hence SWI-Gd imaging proves useful toward that endpoint.

Keywords: susceptibility weighted, T1 weighted, brain lesions, gadolinium contrast

Procedia PDF Downloads 81
7487 Optimization of Smart Beta Allocation by Momentum Exposure

Authors: J. B. Frisch, D. Evandiloff, P. Martin, N. Ouizille, F. Pires

Abstract:

Smart Beta strategies aim to be an asset management revolution relative to classical cap-weighted indices. These strategies allow better control of a portfolio's risk factors and an optimized asset allocation, by taking specific risks into account or seeking to generate alpha by outperforming the indices (the 'Beta'). Among the many strategies in independent use, this paper focuses on four: the Minimum Variance Portfolio, the Equal Risk Contribution Portfolio, the Maximum Diversification Portfolio, and the Equal-Weighted Portfolio. Their efficiency has been proven under constraints such as momentum or market phenomena, suggesting a reconsideration of cap-weighting. To further increase strategy return efficiency, we propose comparing their strengths and weaknesses within time intervals corresponding to specific, identifiable market phases, in order to define adapted strategies for pre-specified situations. Results are presented as performance curves for different combinations compared with a benchmark; if a combination outperforms the applicable benchmark under well-defined actual market conditions, it is preferred. We show that such investment 'rules', based on both historical data and the evolution of Smart Beta strategies, and implemented according to available market data, provide very interesting optimal results, with higher return performance and lower risk. Such combinations have not been fully exploited yet, which justifies the present approach of identifying the relevant elements that characterize them.
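
Of the four strategies, the Minimum Variance Portfolio has a closed form in the unconstrained case, which makes a compact baseline next to equal weighting; the covariance matrix below is illustrative, and the paper's phase-dependent switching between strategies is not shown here:

```python
import numpy as np

def min_variance_weights(cov):
    # unconstrained minimum variance: w = C^{-1} 1 / (1' C^{-1} 1)
    ones = np.ones(cov.shape[0])
    w = np.linalg.solve(cov, ones)
    return w / w.sum()

def equal_weights(n):
    return np.full(n, 1.0 / n)

cov = np.array([[0.04, 0.01, 0.00],   # illustrative covariance matrix
                [0.01, 0.09, 0.02],
                [0.00, 0.02, 0.16]])

for name, w in [("MV", min_variance_weights(cov)), ("EW", equal_weights(3))]:
    vol = np.sqrt(w @ cov @ w)      # portfolio volatility
    print(name, np.round(w, 3), f"vol={vol:.3f}")
```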

Keywords: smart beta, minimum variance portfolio, equal risk contribution portfolio, maximum diversification portfolio, equal weighted portfolio, combinations

Procedia PDF Downloads 306
7486 Proposal of a Model Supporting Decision-Making Based on Multi-Objective Optimization Analysis on Information Security Risk Treatment

Authors: Ritsuko Kawasaki (Aiba), Takeshi Hiromatsu

Abstract:

Management is required to understand all information security risks within an organization and to decide which risks should be treated, at what level, and at what cost. Such decision-making is not easy, because various risk treatment measures must be selected at suitable application levels, and some measures may have conflicting objectives, which makes the selection even harder. Moreover, risks generally exhibit trends, which should also be considered in risk treatment. This paper therefore extends the model proposed in our previous study. The original model supports the selection of measures by applying a combination of the weighted average method and the goal programming method for multi-objective analysis to find an optimal solution. The extended model adds weights to the risks, with a larger weight indicating a higher-priority risk.
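
A brute-force sketch of the selection problem, reduced to a single weighted objective under a budget; the paper's actual model combines weighted averages with goal programming and risk weights, and the measures, costs, and benefit figures below are hypothetical:

```python
from itertools import product

# Hypothetical measures: name -> (cost, weighted risk reduction)
MEASURES = {
    "encrypt_backups":   (30, 8.0),
    "mfa_rollout":       (50, 14.0),
    "staff_training":    (20, 6.5),
    "network_segmented": (70, 15.0),
}
BUDGET = 100

best = None
for choice in product([0, 1], repeat=len(MEASURES)):
    names = [n for n, c in zip(MEASURES, choice) if c]
    cost = sum(MEASURES[n][0] for n in names)
    benefit = sum(MEASURES[n][1] for n in names)
    if cost <= BUDGET and (best is None or benefit > best[0]):
        best = (benefit, cost, names)

print(best)  # greatest weighted risk reduction achievable within budget
```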

Keywords: information security risk treatment, selection of risk measures, risk acceptance, multi-objective optimization

Procedia PDF Downloads 420
7485 Hybrid Fuzzy Weighted K-Nearest Neighbor to Predict Hospital Readmission for Diabetic Patients

Authors: Soha A. Bahanshal, Byung G. Kim

Abstract:

Identification of patients at high risk of hospital readmission is of crucial importance for quality health care and cost reduction. Predicting hospital readmissions among diabetic patients has been of great interest to many researchers and health decision makers. We build a model to predict hospital readmission of diabetic patients within 30 days of discharge. The core of the model is a modified k-Nearest Neighbor algorithm called the Hybrid Fuzzy Weighted k-Nearest Neighbor algorithm. The prediction is performed on a dataset of more than 70,000 patients with 50 attributes. We applied several data preprocessing techniques to handle class imbalance and to fuzzify the data to suit the prediction algorithm. The model has so far achieved a classification accuracy of 80%, compared with other models that use only the k-Nearest Neighbor algorithm.
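
The abstract does not spell out the hybrid algorithm, but it modifies the standard fuzzy weighted k-Nearest Neighbor scheme (Keller et al.), a minimal version of which looks like this, with toy features and memberships:

```python
import numpy as np

def fuzzy_weighted_knn(X_train, memberships, x, k=3, m=2):
    # memberships: (n_samples, n_classes) fuzzy class memberships.
    # Each of the k nearest neighbors votes with weight 1/d^(2/(m-1)).
    d = np.linalg.norm(X_train - x, axis=1)
    idx = np.argsort(d)[:k]
    w = 1.0 / np.maximum(d[idx], 1e-12) ** (2.0 / (m - 1))
    return (w[:, None] * memberships[idx]).sum(axis=0) / w.sum()

X = np.array([[50., 2.], [65., 5.], [70., 7.], [40., 1.]])  # toy features
U = np.array([[0.9, 0.1], [0.4, 0.6], [0.2, 0.8], [0.95, 0.05]])

# membership in [no readmission, readmission within 30 days]
print(fuzzy_weighted_knn(X, U, np.array([66., 6.])))
```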

Keywords: machine learning, prediction, classification, hybrid fuzzy weighted k-nearest neighbor, diabetic hospital readmission

Procedia PDF Downloads 148
7484 Discarding or Correcting Outlier Scores vs. Excluding Outlier Jurors to Reduce Manipulation in Classical Music Competitions

Authors: Krzysztof Kontek, Kevin Kenner

Abstract:

This paper, written by an economist and a pianist, aims to compare and analyze different methods of reducing manipulation in classical music competitions by focusing on outlier scores and outlier jurors. We first examine existing methods in competition practice and the statistical literature for discarding or correcting jurors' scores that deviate significantly from the mean or median of all scores. We then introduce a method that eliminates all scores of outlier jurors, i.e., jurors whose ratings differ significantly from those of the other jurors. The properties of these standard and proposed methods are discussed in hypothetical voting scenarios in which one or more jurors assign scores that deviate considerably from those awarded by the other jurors. Finally, we apply the various methods to real-world data from piano competitions, demonstrating the potential effectiveness and implications of each approach in reducing manipulation within these events.
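
The two families of methods are easy to contrast on a toy score matrix; the deviation rules below are simple illustrative choices, not the specific procedures analyzed in the paper:

```python
import numpy as np

scores = np.array([   # rows: jurors, cols: contestants (toy data)
    [9.0, 7.5, 8.0],
    [8.5, 7.0, 8.5],
    [9.5, 8.0, 7.5],
    [4.0, 9.8, 3.5],  # juror 3 deviates sharply from the panel
])

# (a) discard outlier scores per contestant, then average the rest
def trimmed(col, z=1.5):
    keep = np.abs(col - np.median(col)) <= z * np.std(col)
    return col[keep].mean()

per_score = [trimmed(scores[:, j]) for j in range(scores.shape[1])]

# (b) exclude the outlier juror entirely: drop the juror whose mean
# absolute deviation from the panel medians is largest
dev = np.abs(scores - np.median(scores, axis=0)).mean(axis=1)
panel = scores[dev < dev.max()]   # removes juror 3
per_juror = panel.mean(axis=0)

print(np.round(per_score, 2), np.round(per_juror, 2))
```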

Keywords: voting systems, manipulation, outlier scores, outlier jurors

Procedia PDF Downloads 45
7483 Cognitive Characteristics of Industrial Workers in Fuzzy Risk Assessment

Authors: Hyeon-Kyo Lim, Sang-Hun Byun

Abstract:

Risk assessment is carried out in most industrial plants for accident prevention, but the data available are insufficient for statistical decision-making. It is commonly said that risk can be expressed as the product of the consequence and the likelihood of a corresponding hazard factor; risk assessment therefore ultimately involves human decision-making, which cannot be objective per se. This study was carried out to understand the perceptive characteristics of workers in industrial plants. Subjects were shown a set of illustrations depicting scenes from industrial plants and were asked to assess the risk of each scene, with both linguistic variables and numeric scores, in terms of consequence and likelihood. Their responses were then formulated as fuzzy membership functions and compared with those of university students who had no experience of industrial work. The results showed that the risk levels assessed by the industrial workers were lower than those of the other groups, which implies that workers may generally tend to neglect hazard factors in their own work fields.
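
A minimal sketch of how such linguistic responses become a crisp risk score, using triangular membership functions and centroid defuzzification; the membership shapes and the min-aggregation are illustrative assumptions:

```python
import numpy as np

def tri(x, a, b, c):
    # triangular membership function for a linguistic grade on [a, c]
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0)

x = np.linspace(0, 10, 101)
likely = tri(x, 5, 8, 10)   # membership of "high likelihood"
severe = tri(x, 6, 9, 10)   # membership of "severe consequence"

# combine likelihood and consequence, then defuzzify to one number
risk = np.minimum(likely, severe)
crisp = (x * risk).sum() / risk.sum()   # centroid of the fuzzy risk set
print(f"crisp risk score: {crisp:.2f}")
```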

Keywords: fuzzy, hazard, linguistic variable, risk assessment

Procedia PDF Downloads 220
7482 Conflation Methodology Applied to Flood Recovery

Authors: Eva L. Suarez, Daniel E. Meeroff, Yan Yong

Abstract:

Current flood risk modeling focuses on resilience, defined as the probability of recovery from a severe flooding event. However, the long-term damage to property and well-being caused by nuisance flooding, and its long-term effects on communities, are not typically included in risk assessments. An approach was developed that combines the probability of recovering from a severe flooding event with the probability of community performance during a nuisance event. The consolidated model, namely the conflation flooding recovery (CFR) model, evaluates risk-coping mitigation strategies for communities based on the recovery time from catastrophic events, such as hurricanes or extreme surges, and from everyday nuisance flooding events. The CFR model assesses the variation contributed by each independent input and generates a weighted output that favors the distribution with minimum variation; this is especially useful when the input distributions have dissimilar variances. The conflation is defined as the single distribution resulting from the normalized product of the individual probability density functions. The resulting conflated distribution lies between the parent distributions, and it infers the recovery time required by a community to return to basic functions, such as power, utilities, transportation, and civil order, after a flooding event. The CFR model is more accurate than averaging the individual observations before calculating the mean and variance, or than averaging the probabilities evaluated at the input values, which assigns the same weighted variation to each input distribution; the main disadvantage of those traditional methods is that the resulting measure of central tendency is exactly the average of the input distributions' means, without the additional information provided by each distribution's variance. When dealing with exponential distributions, such as resilience from severe flooding events and from nuisance flooding events, conflation is equivalent to the weighted least squares method or best linear unbiased estimation. Combining severe flooding risk with nuisance flooding improves flood risk management for highly populated coastal communities, such as those in South Florida, USA, and provides a method to estimate community flood recovery time more accurately from two different sources: severe flooding events and nuisance flooding events.
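
Written out, the "product of the individual probability density functions" is Hill's conflation of the n input densities:

```latex
Q(x) \;=\; \frac{\prod_{i=1}^{n} f_i(x)}
                {\displaystyle\int \prod_{i=1}^{n} f_i(y)\, dy } ,
```

so the severe-event and nuisance-event recovery distributions enter as $f_1$ and $f_2$, and the normalizing integral makes $Q$ a proper density.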

Keywords: community resilience, conflation, flood risk, nuisance flooding

Procedia PDF Downloads 55
7481 Nonparametric Estimation of Risk-Neutral Densities via Empirical Esscher Transform

Authors: Manoel Pereira, Alvaro Veiga, Camila Epprecht, Renato Costa

Abstract:

This paper introduces an empirical version of the Esscher transform for risk-neutral option pricing. Traditional parametric methods require the formulation of an explicit risk-neutral model and are operational only for a few probability distributions of the returns of the underlying. In our proposal, we make only mild assumptions on the pricing kernel, and there is no need to formulate a risk-neutral model for the returns. First, we simulate sample paths of the returns under the physical distribution. Then, based on the empirical Esscher transform, the sample is reweighted, giving rise to a risk-neutralized sample from which derivative prices can be obtained as a weighted sum of the option payoffs along each path. We compare our proposal with some traditional parametric pricing methods in four experiments with artificial and real data.
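
A compact sketch of the empirical Esscher transform on simulated log-returns: the tilt parameter is chosen so that the reweighted sample prices the underlying at the risk-free rate, and the option price is then a weighted sum of payoffs. All market parameters below are illustrative:

```python
import numpy as np
from scipy.optimize import brentq

rng = np.random.default_rng(1)
r, T, S0, K = 0.03, 1.0, 100.0, 105.0
x = rng.normal(0.05, 0.2, 50_000)   # simulated log-returns (physical)

def esscher_weights(theta):
    w = np.exp(theta * x)
    return w / w.sum()

def martingale_gap(theta):
    # under the tilted weights, E[S_T] must equal S0 * exp(rT)
    return (esscher_weights(theta) * np.exp(x)).sum() - np.exp(r * T)

theta = brentq(martingale_gap, -50.0, 50.0)   # root gives the tilt
w = esscher_weights(theta)
payoff = np.maximum(S0 * np.exp(x) - K, 0.0)  # European call payoffs
price = np.exp(-r * T) * (w * payoff).sum()   # weighted sum over paths
print(f"theta={theta:.3f}, call price={price:.3f}")
```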

Keywords: esscher transform, generalized autoregressive Conditional Heteroscedastic (GARCH), nonparametric option pricing

Procedia PDF Downloads 450
7480 Research on the Risks of Railroad Receiving and Dispatching Trains Operators: Natural Language Processing Risk Text Mining

Authors: Yangze Lan, Ruihua Xv, Feng Zhou, Yijia Shan, Longhao Zhang, Qinghui Xv

Abstract:

Receiving and dispatching trains is an important part of railroad organization, yet the risk evaluation of operating personnel is still reduced to raw scores, without deeper mining of wrong answers and operating accidents. Using natural language processing (NLP) techniques, this study extracts the keywords and key phrases of 40 risk events related to receiving and dispatching trains and reclassifies the events into 8 categories, such as train approach and signal risks, dispatching command risks, and so on. Based on the historical risk data of personnel, the K-Means clustering method is used to classify personnel by risk level. The results indicate that high-risk operating personnel need strengthened training on receiving and dispatching operations for essential trains and abnormal situations.
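
A minimal sketch of the text-mining side, assuming a TF-IDF keyword representation (the abstract names NLP keyword extraction and K-Means but not the exact pipeline); the event descriptions are invented:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

events = [   # illustrative risk event descriptions (the study used 40)
    "signal passed at danger during train approach",
    "dispatching command issued to wrong track",
    "route not confirmed before receiving train",
    "signal approach misjudged in fog",
]

X = TfidfVectorizer().fit_transform(events)   # keyword-based features
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(labels)   # event categories, cf. the paper's 8 risk classes
```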

Keywords: receiving and dispatching trains, natural language processing, risk evaluation, K-means clustering

Procedia PDF Downloads 33
7479 Developing a Risk Rating Tool for Shopping Centres

Authors: Prandesha Govender, Chris Cloete

Abstract:

Purpose: The objective of this paper is to develop a tool for evaluating the financial risk of a shopping center. Methodology: Important factors that indicate the success of a shopping center were identified from the available literature. Weights were allocated to these factors, and a risk rating was calculated for 505 shopping centers in the largest province of South Africa by taking the factor scores, factor weights, and category weights into account. The ratings for ten randomly selected shopping centers were correlated with consumer feedback and standardized against External Credit Assessment Institution (ECAI) data for the same centers. The ratings were also mapped to corporate entities with the same risk rating to give a more intuitive sense of each center's inherent risk. Results: The proposed risk tool shows a strong linear correlation with consumer views and compares well with expert opinion, such as that of fund managers and REITs. Interpretation of the tool was also illustrated by correlating the risk ratings of selected shopping centers with those of reputable, established entities. Conclusions: The proposed Shopping Centre Risk Tool, used in conjunction with financial inputs from the relevant center, should prove useful to an investor considering investment in, or the expansion, renovation, or purchase of, a shopping center.

Keywords: risk, shopping centres, risk modelling, investment, rating tool, rating scale

Procedia PDF Downloads 79
7478 Association of Preoperative Pain Catastrophizing with Postoperative Pain after Lower Limb Trauma Surgery

Authors: Asish Subedi, Krishna Pokharel, Birendra Prasad Sah, Pashupati Chaudhary

Abstract:

Objectives: To evaluate the association between preoperative Nepali Pain Catastrophizing Scale (N-PCS) scores and postoperative pain intensity and total opioid consumption. Methods: In this prospective cohort study, we enrolled 135 patients with American Society of Anesthesiologists physical status I or II, aged between 18 and 65 years, and scheduled for surgery for a lower-extremity fracture under spinal anaesthesia. Maximum postoperative pain reported during the first 24 h was classified into two groups: a no-mild pain group (Numeric Rating Scale [NRS] scores 1 to 3) and a moderate-severe pain group (NRS 4-10). The Spearman correlation coefficient was used to assess the association between baseline N-PCS scores and the outcome variables, i.e., the maximum NRS pain score and total tramadol consumption within the first 24 h after surgery. Logistic regression models were used to identify predictors of postoperative pain intensity. Results: As four patients violated the protocol, the data of 131 patients were analysed. The mean N-PCS score in the moderate-severe pain group was 27.39 ±9.50, compared with 18.64 ±10 in the no-mild pain group (p<0.001). Preoperative PCS scores correlated positively with postoperative pain intensity (r=0.39, [95% CI 0.23-0.52], p<0.001) and total tramadol consumption (r=0.32, [95% CI 0.16-0.47], p<0.001). An increase in catastrophizing scores was associated with moderate-severe postoperative pain (odds ratio, 1.08 [95% confidence interval, 1.02-1.15], p=0.006) after adjusting for gender, ethnicity, and preoperative anxiety. Conclusion: Patients who reported higher pain catastrophizing preoperatively were at increased risk of experiencing moderate-severe postoperative pain.

Keywords: nepali, pain catastrophizing, postoperative pain, trauma

Procedia PDF Downloads 84
7477 Variogram Fitting Based on the Wilcoxon Norm

Authors: Hazem Al-Mofleh, John Daniels, Joseph McKean

Abstract:

Within geostatistics research, the effective estimation of variogram points has been examined, particularly in developing robust alternatives. The parametric fit of these variogram points, which ultimately defines the kriging weights, has not, however, received the same attention from a robust perspective. This paper proposes the non-linear Wilcoxon norm, rather than weighted non-linear least squares, as a robust variogram fitting alternative. First, we introduce the concepts of variogram estimation and fitting. Then, as an alternative to non-linear weighted least squares, we discuss the non-linear Wilcoxon estimator. Next, the robustness properties of the non-linear Wilcoxon estimator are demonstrated on a contaminated spatial data set. Finally, under simulated conditions, spatial processes with increasing levels of contamination have their variogram points estimated and fit; both the non-linear weighted least squares and non-linear Wilcoxon fits are examined for efficiency. At all levels of contamination (including 0%), using a robust estimation and robust fitting procedure, the non-weighted Wilcoxon outperforms weighted least squares.
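
A sketch of the proposed fit, assuming the usual rank-based Wilcoxon scores and a spherical variogram model; the lag data below include one contaminated point to show the resistance of the fit:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import rankdata

def spherical(h, nugget, sill, rng):
    # classical spherical variogram model
    hr = np.minimum(h / rng, 1.0)
    return nugget + (sill - nugget) * (1.5 * hr - 0.5 * hr**3)

def wilcoxon_norm(e):
    # rank-based Wilcoxon pseudo-norm: sum of a(R(e_i)) * e_i
    n = len(e)
    a = np.sqrt(12.0) * (rankdata(e) / (n + 1) - 0.5)
    return np.sum(a * e)

h = np.array([5., 10., 20., 30., 40., 60.])    # lag distances
g = np.array([0.4, 0.9, 1.6, 2.1, 2.2, 6.0])   # last point contaminated

def objective(p):
    return wilcoxon_norm(g - spherical(h, *p))

fit = minimize(objective, x0=[0.2, 2.0, 35.0], method="Nelder-Mead",
               bounds=[(0, None), (0, None), (1, None)])  # SciPy >= 1.7
print(np.round(fit.x, 2))   # nugget, sill, range resistant to the outlier
```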

Keywords: non-linear wilcoxon, robust estimation, variogram estimation, wilcoxon norm

Procedia PDF Downloads 423
7476 Assessment of Mortgage Applications Using Fuzzy Logic

Authors: Swathi Sampath, V. Kalaichelvi

Abstract:

The assessment of the risk posed by a borrower to a lender is one of the common problems that financial institutions have to deal with. Consumers vying for a mortgage are generally compared to each other by the use of a number called the Credit Score, which is generated by applying a mathematical algorithm to information in the applicant’s credit report. The higher the credit score, the lower the risk posed by the candidate, and the better he is to be taken on by the lender. The objective of the present work is to use fuzzy logic and linguistic rules to create a model that generates Credit Scores.

Keywords: credit scoring, fuzzy logic, mortgage, risk assessment

Procedia PDF Downloads 368
7475 Risk Measure from Investment in Finance by Value at Risk

Authors: Mohammed El-Arbi Khalfallah, Mohamed Lakhdar Hadji

Abstract:

Managing and controlling risk is a research topic in the world of finance. Faced with a risky situation, stakeholders need to compare positions and actions, and financial institutions must take measures of particular market and credit risks. In this work, we study a model of risk measurement in finance: Value at Risk (VaR), a tool for measuring an entity's risk exposure. We explain the concept of value at risk and its average and tail variants, and we describe the three methods for computing it: the parametric method, the historical method, and the Monte Carlo numerical method. Finally, we briefly describe the advantages and disadvantages of these three methods.
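
The historical method is the shortest of the three to state in code: VaR at level alpha is the empirical loss quantile. The return series below is simulated for illustration:

```python
import numpy as np

def historical_var(returns, alpha=0.99):
    # loss exceeded only (1 - alpha) of the time, read off history
    return -np.quantile(returns, 1.0 - alpha)

rng = np.random.default_rng(7)
pnl = rng.normal(0.0005, 0.012, 1_000)   # illustrative daily returns

print(f"99% 1-day VaR: {historical_var(pnl):.4f}")
# The Monte Carlo method applies the same quantile to simulated returns;
# the parametric (normal) method instead uses z_alpha * sigma - mu.
```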

Keywords: average value at risk, conditional value at risk, tail value at risk, value at risk

Procedia PDF Downloads 404
7474 A Comparative Analysis of Grade Weighted Average and Comprehensive Examination Result of Non Board Passers and Board Passers

Authors: Rob Gesley Capistrano, Jasper James Isaac, Rose Mae Moralda, Therese Anne Peleo, Danica Rillo, Maria Virginia Santillian

Abstract:

Academic background, specifically the Grade Weighted Average and the result of the comprehensive examination, is one of the indicators of individual ability. The general objective of this study is to determine whether there is a significant difference between the General Weighted Average and the Comprehensive Examination Result of Psychometrician board passers and non-board passers. The respondents of this study were board passers and non-board passers, selected through purposive sampling. An independent-samples t-test was used to compare the General Weighted Averages and Comprehensive Examination Results of the two groups. The study concluded that the General Weighted Averages of board passers and non-board passers show no significant difference, with only minimal variation between the group means, whereas the Comprehensive Examination Results differ significantly. The comprehensive examination tests an individual's overall knowledge, and more proficient examinees are likely to obtain higher scores; the result of the comprehensive examination had an impact on passing performance in the board examination.
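
The comparison itself is a standard independent-samples t-test; a minimal sketch with invented GWA values (lower GWA is typically better in the Philippine grading scale):

```python
from scipy.stats import ttest_ind

# Illustrative GWAs of board passers vs. non-passers (not the study data)
passers = [1.75, 1.90, 2.00, 1.80, 1.95]
non_passers = [1.85, 2.05, 1.95, 2.10, 1.90]

t, p = ttest_ind(passers, non_passers)   # independent-samples t-test
print(f"t={t:.2f}, p={p:.3f}")           # p >= 0.05 -> no significant diff
```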

Keywords: board passers, comprehensive examination result, grade weighted average, non board passers

Procedia PDF Downloads 147
7473 Spatial Analysis of Flood Vulnerability in Highly Urbanized Area: A Case Study in Taipei City

Authors: Liang Weichien

Abstract:

Without adequate information and mitigation plans for natural disasters, the risk to urban populated areas will increase in the future as populations grow, especially in Taiwan. Taiwan is recognized as one of the world's high-risk areas, with an average of 5.7 floods per year, and should seek to strengthen coherence and consensus on how cities can plan for floods and climate change. This study therefore aims to understand the vulnerability to flooding in Taipei City, Taiwan, by creating indicators and calculating the vulnerability of each study unit. The indicators were grouped into sensitivity and adaptive capacity, based on the Intergovernmental Panel on Climate Change's definition of vulnerability, and were weighted using Principal Component Analysis. Previous research, however, has assumed that the composition and influence of the indicators are the same in different areas, disregarding spatial correlation and potentially producing inaccurate explanations of local vulnerability. This study used Geographically Weighted Principal Component Analysis, adding a geographic weighting matrix so that the main flood impact characteristics can differ across areas. Cross-validation and the Akaike Information Criterion were used to select the bandwidth, with a Gaussian kernel as the bandwidth weighting scheme. The outcome can be used to reduce damage potential by integrating the outputs into local mitigation plans and urban planning.
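
scikit-learn has no GWPCA, but the global PCA weighting step that GWPCA localizes is easy to sketch; GWPCA repeats this at every location with a Gaussian geographic kernel. The indicator data below are random placeholders:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# rows: study units; cols: vulnerability indicators (placeholder data)
rng = np.random.default_rng(3)
X = rng.normal(size=(200, 5))

Z = StandardScaler().fit_transform(X)
pca = PCA(n_components=2).fit(Z)

# indicator weights from absolute loadings scaled by explained variance
weights = np.abs(pca.components_.T) @ pca.explained_variance_ratio_
vulnerability = Z @ (weights / weights.sum())   # composite index per unit
print(np.round(weights / weights.sum(), 3))
```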

Keywords: flood vulnerability, geographically weighted principal components analysis, GWPCA, highly urbanized area, spatial correlation

Procedia PDF Downloads 262
7472 Breast Cancer Survivability Prediction via Classifier Ensemble

Authors: Mohamed Al-Badrashiny, Abdelghani Bellaachia

Abstract:

This paper presents a classifier ensemble approach for predicting the survivability of breast cancer patients using the latest database version of the Surveillance, Epidemiology, and End Results (SEER) Program of the National Cancer Institute. The system consists of two main components: a feature selection component and a classifier ensemble component. The feature selection component divides the features in the SEER database into four groups and then searches for the most important features among the four groups, i.e., those that maximize the weighted average F-score of a given classification algorithm. The ensemble component uses three different classifiers, each of which models a different set of features from SEER chosen by the feature selection module. On top of them, another classifier gives the final decision based on the output decisions and confidence scores of the underlying classifiers. Different classification algorithms have been examined; the best setup found uses the decision tree, Bayesian network, and Naïve Bayes algorithms for the underlying classifiers and Naïve Bayes for the classifier ensemble step. The system outperforms all systems published to date when evaluated on exactly the same SEER data (period 1973-2002), giving an 87.39% weighted average F-score compared with 85.82% and 81.34% for the other published systems. When the data size is increased to cover the whole database (period 1973-2014), the overall weighted average F-score reaches 92.4% on the held-out, unseen test set.
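
The described architecture is a stacking ensemble. A runnable sketch with scikit-learn is below, on synthetic stand-in data; the paper's Bayesian network base learner is omitted because scikit-learn provides no such estimator:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

ensemble = StackingClassifier(
    estimators=[("tree", DecisionTreeClassifier(max_depth=5)),
                ("nb", GaussianNB())],
    final_estimator=GaussianNB(),   # meta-classifier over base outputs
)
ensemble.fit(X_tr, y_tr)
print(f1_score(y_te, ensemble.predict(X_te), average="weighted"))
```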

Keywords: classifier ensemble, breast cancer survivability, data mining, SEER

Procedia PDF Downloads 289
7471 Risk of Type 2 Diabetes among Female College Students in Saudi Arabia

Authors: Noor A. Hakim

Abstract:

Several studies in developed countries have investigated the prevalence of diabetes and obesity among individuals from different socioeconomic levels and suggested lower rates among the higher socioeconomic groups. However, studies evaluating diabetes risk and the prevalence of obesity among middle- to high-income populations in developing countries are limited. The aim of this study is to evaluate the risk of developing type 2 diabetes mellitus (T2DM) and the weight status of female students in private universities in Jeddah City, Saudi Arabia. A cross-sectional study of 121 female students aged ≤ 25 years was conducted; participants were recruited from two private universities. Diabetes risk was evaluated using the Finnish Diabetes Risk Score. Anthropometric measurements were taken, and the body mass index (BMI) was calculated. The risk scores indicated that 35.5% of the female students had a slightly elevated risk, and 10.8% a moderate to high risk, of developing T2DM. Almost one-third of the students (29.7%) were overweight or obese. Most of the normal-weight and underweight groups were classified as at low risk of diabetes, 22.2% of the overweight participants were classified as at moderate to high risk, and over half of the obese participants (55.5%) were classified as at moderate to high risk. Conclusions: Given that diabetes risk is alarming among the population of Saudi Arabia, healthcare providers should use a simple screening tool to identify high-risk individuals and initiate preventive strategies to prevent, or delay, the onset of T2DM and improve quality of life.

Keywords: risk of type 2 diabetes, weight status, college students, socioeconomic status

Procedia PDF Downloads 144
7470 Risk of Disrupted Eating Attitudes in Disabled Athletes

Authors: Zehra Buyuktuncer, Aylin H. Büyükkaragöz, Tuğçe N. Balcı, Nevin Ergun

Abstract:

Background: Adopting rigid dietary habits to enhance athletic performance can lead to eating disorders, and a high prevalence of eating disorders among female athletes has already been reported. However, the risk of disordered eating among disabled athletes is not known. Better knowledge of the different eating behaviors and their prevalence in disabled athletes would help in understanding the interactions between eating and health. This study aimed to examine cognitive restraint, uncontrolled eating, and emotional eating behaviors in a disabled athlete population. Method: A total of 70 disabled Turkish national athletes (33 female, 37 male) from 5 sport branches (soccer, weight lifting, shooting, table tennis, and basketball) were involved in the study. Cognitive restraint, uncontrolled eating, and emotional eating behaviors were assessed using the revised Three-Factor Eating Questionnaire (TFEQ-R18), administered by a dietitian during the athletes' preparation camps. Body weight, height, and waist circumference (WC) were measured, and body composition was analyzed by the bioelectrical impedance method. Results: The TFEQ scales showed a cognitive dietary restraint score of 13.9±4.2, an uncontrolled eating score of 17.7±5.8, and an emotional eating score of 4.9±2.5; the mean total TFEQ-R18 score was 36.5±8.62. Neither the total TFEQ-R18 score nor the subscale scores differed significantly by gender or sport branch (p>0.05 for each), and the scores were also similar across BMI groups (n=63; p>0.05). Total TFEQ, uncontrolled eating, and emotional eating scores were significantly higher among athletes with congenital disabilities than among athletes with acquired disabilities (p<0.05 for each). Moreover, the cognitive dietary restraint score was significantly higher in athletes who wanted to lose weight (p=0.009). Conclusion: Disabled athletes may be at risk of disordered eating. The eating behaviors of disabled athletes should be assessed with validated tools so that personalized nutritional strategies can be developed for them.

Keywords: disabled athletes, eating behaviour, three-factor eating questionnaire-r18, body composition

Procedia PDF Downloads 303