Search results for: risk optimization
7954 Using Risk Management Indicators in Decision Tree Analysis
Authors: Adel Ali Elshaibani
Abstract:
Risk management indicators augment the reporting infrastructure, particularly for the board and senior management, to identify, monitor, and manage risks. This enhancement facilitates improved decision-making throughout the banking organization. Decision tree analysis is a tool that visually outlines the potential outcomes, costs, and consequences of complex decisions. It is particularly beneficial for analyzing quantitative data and making decisions based on numerical values. By calculating the expected value of each outcome, decision tree analysis can help identify the best course of action. In the context of banking, decision tree analysis can assist lenders in evaluating a customer’s creditworthiness, thereby preventing losses. However, applying these tools in developing countries may face several limitations, such as data availability, lack of technological infrastructure and resources, a shortage of skilled professionals, cultural factors, and cost. Moreover, decision trees can produce overly complex models that do not generalize well to new data, a problem known as overfitting. They can also be sensitive to small changes in the data, which can result in different tree structures, and can become computationally expensive when dealing with large datasets. In conclusion, while risk management indicators and decision tree analysis are beneficial for decision-making in banks, their effectiveness is contingent upon how they are implemented and utilized by the board of directors, especially in the context of developing countries. It is important to consider these limitations when planning to implement these tools in developing countries.
Keywords: risk management indicators, decision tree analysis, developing countries, board of directors, bank performance, risk management strategy, banking institutions
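The expected-value calculation the abstract describes can be sketched in a few lines. The loan payoffs and default probability below are purely illustrative, not figures from the study:

```python
def expected_value(outcomes):
    """Expected value of a decision branch: list of (probability, payoff)."""
    return sum(p * v for p, v in outcomes)

def best_decision(branches):
    """Pick the branch (decision) with the highest expected value."""
    return max(branches.items(), key=lambda kv: expected_value(kv[1]))[0]

# Hypothetical lending decision (illustrative numbers only):
branches = {
    "approve": [(0.90, 1000.0),    # borrower repays: interest earned
                (0.10, -10000.0)], # borrower defaults: principal lost
    "reject":  [(1.00, 0.0)],      # no loan, no gain or loss
}
print(best_decision(branches))  # -> reject
```

Each branch is scored by its probability-weighted payoff; here approving would lose 100 on average, so the tree recommends rejecting the loan.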
Procedia PDF Downloads 60
7953 Application of Container Technique to High-Risk Children: Its Effect on Their Levels of Stress, Anxiety and Depression
Authors: Nguyen Thi Loan, Phan Ngoc Thanh Tra
Abstract:
Container is one of the techniques used in Eye Movement Desensitization and Reprocessing (EMDR) therapy. This paper presents the positive results of applying the Container technique to “high-risk children”. The sample for this research is composed of 60 “high-risk children” aged 11 to 18 years, housed in the Ho Chi Minh City Youth Center. They have been under the program of the Worldwide Orphans Foundation since August 2015 for various reasons such as loss of parents, anti-social behaviors, homelessness, and child labor. These “high-risk children” experience high levels of stress, anxiety, and depression. The subjects were divided into two groups, control and experimental, with 30 members each. The Container technique was applied to the experimental group, and the instruments used to measure levels of stress, anxiety, and depression were DASS-42 and ASEBA. Results show that after applying the Container technique to the experimental group, there are significant differences between the two groups’ levels of stress, anxiety, and depression: the experimental group’s levels decreased significantly. The results provide a basis for an appeal to psychologists to apply the Container technique in psychological treatment in suitable contexts.
Keywords: anxiety, depression, container technique, EMDR
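A between-group comparison of the kind reported here is typically tested with a two-sample t statistic. A minimal sketch with made-up post-intervention stress scores (not the study's DASS-42 data):

```python
import statistics as st

def welch_t(sample_a, sample_b):
    """Welch's t statistic for two independent samples (unequal variances)."""
    ma, mb = st.mean(sample_a), st.mean(sample_b)
    va, vb = st.variance(sample_a), st.variance(sample_b)
    na, nb = len(sample_a), len(sample_b)
    return (ma - mb) / ((va / na + vb / nb) ** 0.5)

# Hypothetical post-intervention stress scores (invented for illustration):
control      = [28, 30, 27, 31, 29, 26]
experimental = [19, 21, 18, 22, 20, 17]
print(round(welch_t(control, experimental), 2))  # -> 8.33
```

A large positive t here indicates the experimental group's scores are clearly lower, the direction of effect the abstract reports; a real analysis would also compute degrees of freedom and a p-value.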
7952 Surveying Earthquake Vulnerabilities of District 13 of Kabul City, Afghanistan
Authors: Mohsen Mohammadi, Toshio Fujimi
Abstract:
High population and irregular urban development in Kabul city, Afghanistan's capital, are among the factors that increase its vulnerability to earthquake disasters (on top of its location in a highly seismic region); this can lead to widespread economic loss and casualties. This study aims to evaluate earthquake risks in Kabul's 13th district based on scientific data. The research data, which include hazard curves of Kabul, vulnerability curves, and a questionnaire survey through sampling in district 13, have been combined to develop risk curves. To estimate potential casualties, we used a set of M parameters in the model developed by Coburn and Spence. The results indicate that in the worst-case scenario, more than 90% of district 13, which comprises mostly residential buildings, is exposed to high risk; this may lead to nearly one billion USD in economic loss and 120 thousand casualties (equal to 25.88% of the 13th district's population) for a nighttime earthquake. To reduce risks, we propose reconstructing the most vulnerable buildings, which are primarily adobe and masonry buildings. A comparison of the risk reduction from reconstructing adobe versus masonry buildings indicates that rebuilding adobe buildings would be more effective.
Keywords: earthquake risk evaluation, Kabul, mitigation, vulnerability
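Combining a hazard curve with a vulnerability curve into an expected loss, the core of building risk curves as described here, reduces to a probability-weighted sum. The intensity levels, annual probabilities, and damage ratios below are hypothetical:

```python
def expected_loss(hazard, vulnerability, exposure_value):
    """Expected annual loss = sum over intensity levels of
    P(intensity) * mean damage ratio(intensity) * exposed value."""
    return sum(p * vulnerability[i] * exposure_value for i, p in hazard.items())

# Hypothetical annual hazard (intensity level -> annual exceedance probability)
hazard = {"VI": 0.10, "VII": 0.03, "VIII": 0.01}
# Hypothetical mean damage ratios for adobe buildings at each intensity
vulnerability = {"VI": 0.05, "VII": 0.20, "VIII": 0.50}
print(expected_loss(hazard, vulnerability, 1_000_000))  # USD per year
```

Swapping in a lower vulnerability curve (e.g., for reconstructed buildings) and comparing the two expected losses is exactly the kind of before/after comparison the study makes for adobe versus masonry stock.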
7951 Optimization of Dez Dam Reservoir Operation Using Genetic Algorithm
Authors: Alireza Nikbakht Shahbazi, Emadeddin Shirali
Abstract:
Since optimization problems in water resources are complicated by the variety of decision-making criteria and objective functions, it is sometimes impossible to resolve them through regular optimization methods, or doing so is too time-consuming or costly. Therefore, the use of modern tools and methods is inevitable in resolving such problems. An accurate and sound utilization policy has to be determined in order to use natural resources such as water reservoirs optimally. Water reservoir programming studies aim to determine the final cultivated land area based on predefined agricultural models and water requirements; the dam utilization rule curve is also provided in such studies. The basic information applied in water reservoir programming studies generally includes meteorological, hydrological, agricultural and water-reservoir-related data, and the geometric characteristics of the reservoir. The Dez dam water resource system was simulated using this basic information in order to determine the capability of its reservoir to meet the objectives of the plan. As a metaheuristic method, a genetic algorithm was applied in order to derive utilization rule curves (intersecting the reservoir volume). MATLAB software was used to solve the model. Rule curves were first obtained through the genetic algorithm. Then the significance of using rule curves, and the resulting decrease in the number of decision variables in the system, was determined through system simulation and by comparing the results with the optimization results (Standard Operating Procedure). One of the most essential issues in the optimization of a complicated water resource system is the growing number of variables: a lot of time is required to find an optimum answer, and in some cases no desirable result is obtained. In this research, intersecting the reservoir volume has been applied as a model to reduce the number of variables.
Water reservoir programming studies were performed based on the basic information, general hypotheses, and standards, applying a monthly simulation technique over a statistical period of 30 years. Results indicated that applying the rule curve prevents extreme shortages and decreases monthly shortages.
Keywords: optimization, rule curve, genetic algorithm method, Dez dam reservoir
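A genetic algorithm of the kind applied here can be sketched in plain Python. The toy "shortage" cost below, with its optimum at rule-curve thresholds 40 and 70, is invented for illustration and stands in for the MATLAB reservoir model:

```python
import random

def genetic_minimize(cost, bounds, pop_size=30, generations=60, seed=1):
    """Minimal real-coded GA: keep the best half, blend-crossover two elite
    parents, mutate one gene, repeat."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        elite = sorted(pop, key=cost)[: pop_size // 2]   # survivors
        children = []
        while len(children) < pop_size - len(elite):
            a, b = rng.sample(elite, 2)                  # two parents
            child = [(x + y) / 2 for x, y in zip(a, b)]  # blend crossover
            j = rng.randrange(dim)                       # mutate one gene
            lo, hi = bounds[j]
            child[j] = min(hi, max(lo, child[j] + rng.gauss(0, 0.1 * (hi - lo))))
            children.append(child)
        pop = elite + children
    return min(pop, key=cost)

# Toy shortage cost: ideal rule-curve thresholds at 40 and 70 (hypothetical units)
cost = lambda v: (v[0] - 40) ** 2 + (v[1] - 70) ** 2
best = genetic_minimize(cost, [(0, 100), (0, 100)])
print([round(x, 1) for x in best])
```

In the study, each candidate would instead encode rule-curve values and the cost would come from the monthly reservoir simulation; the GA machinery itself is unchanged.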
7950 Association of Maternal Age, Ethnicity and BMI with Gestational Diabetes Prevalence in Multi-Racial Singapore
Authors: Nur Atiqah Adam, Mor Jack Ng, Bernard Chern, Kok Hian Tan
Abstract:
Introduction: Gestational diabetes (GDM) is a common pregnancy complication with short- and long-term health consequences for both mother and fetus. Factors such as family history of diabetes mellitus, maternal obesity, maternal age, ethnicity, and parity have been reported to influence the risk of GDM. In a multi-racial country like Singapore, it is worthwhile to study the GDM prevalence of different ethnicities. We aim to investigate the influence of ethnicity on GDM prevalence in Singapore. This is important as it may help us improve guidelines on GDM healthcare services according to the significant risk factors unique to Singapore. Materials and Methods: Obstetric cohort data on 926 singleton deliveries in KK Women’s and Children’s Hospital (KKH) from 2011 to 2013 were obtained. Only patients aged 18 and above without complicated pregnancies or chronic illnesses were included. Factors such as ethnicity, maternal age, parity, and maternal body mass index (BMI) at the booking visit were studied. A multivariable logistic regression model, adjusted for confounders, was used to determine which of these factors are significantly associated with an increased risk of GDM. Results: The overall GDM prevalence rate based on the WHO 1999 criteria and at-risk screening (race alone not a risk factor) was 8.86%. GDM rates were higher among women above 35 years old (15.96%), obese women (15.15%), and multiparous women (10.12%). Indians had a higher GDM rate (13.0%) than the Chinese (9.57%) and Malays (5.20%). However, in the multiple logistic regression model, the variables significantly related to GDM were maternal age (p < 0.001) and maternal BMI at the booking visit (p = 0.006). Conclusion: Maternal age (p < 0.001) and maternal booking BMI (p = 0.006) are the strongest risk factors for GDM. Ethnicity per se does not seem to have a significant influence on the prevalence of GDM in Singapore (p = 0.064).
Hence we should tailor guidelines on GDM healthcare services according to maternal age and booking BMI rather than ethnicity.
Keywords: ethnicity, gestational diabetes, healthcare, pregnancy
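The multivariable logistic regression used in studies like this can be illustrated with a minimal gradient-descent fit. The tiny 0/1 dataset below (age over 35, obesity, GDM outcome) is fabricated for demonstration and has nothing to do with the cohort:

```python
import math

def logistic_fit(X, y, lr=0.1, epochs=2000):
    """Plain gradient-descent logistic regression (w[0] is the intercept)."""
    n, d = len(X), len(X[0])
    w = [0.0] * (d + 1)
    for _ in range(epochs):
        grad = [0.0] * (d + 1)
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            p = 1.0 / (1.0 + math.exp(-z))     # predicted probability
            err = p - yi
            grad[0] += err
            for j, xj in enumerate(xi):
                grad[j + 1] += err * xj
        w = [wj - lr * g / n for wj, g in zip(w, grad)]
    return w

# Fabricated predictors: [age over 35 (0/1), obese (0/1)] -> GDM (0/1)
X = [[0, 0], [0, 0], [0, 1], [1, 0], [1, 1], [1, 1], [0, 0], [1, 0]]
y = [0, 0, 1, 0, 1, 1, 0, 1]
w = logistic_fit(X, y)
print("adjusted odds ratio, age > 35:", round(math.exp(w[1]), 2))
```

Exponentiating a coefficient gives the adjusted odds ratio for that factor, which is how "significantly associated with an increased risk" is quantified in such models; real analyses would use a statistics package that also reports p-values.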
7949 A Small-Scale Survey on Risk Factors of Musculoskeletal Disorders in Workers of Logistics Companies in Cyprus and on the Early Adoption of Industrial Exoskeletons as Mitigation Measure
Authors: Kyriacos Clerides, Panagiotis Herodotou, Constantina Polycarpou, Evagoras Xydas
Abstract:
Background: Musculoskeletal disorders (MSDs) are a very common workplace problem in Europe and are caused by multiple risk factors. In recent years, wearable devices and exoskeletons for the workplace have sought to address the risk factors associated with strenuous tasks. The logistics sector is a huge sector that includes warehousing, storage, and transportation, yet the tasks associated with logistics are not well studied in terms of MSD risk. This study looked into the MSDs affecting workers of logistics companies. It compares the prevalence of MSDs among workers and evaluates multiple risk factors that contribute to the development of MSDs. Moreover, this study seeks to obtain user feedback on the adoption of exoskeletons in such a work environment. Materials and Methods: The study was conducted among workers in logistics companies in Nicosia, Cyprus, from July to September 2022. A set of standardized questionnaires was used for collecting different types of data. Results: A high proportion of logistics professionals reported MSDs in one or more body regions, the lower back being the most commonly affected area. Working in the same position for long periods, working in awkward postures, and handling excessive loads were the most commonly reported job risk factors contributing to the development of MSDs in this study. A significant number of participants consider the back the region that would benefit most from a wearable exoskeleton device. Half of the participants would like at least a 50% reduction in their daily effort. The most important characteristics for the adoption of exoskeleton devices were found to be how comfortable the device is and its weight. Conclusion: Lower-back loading and posture were the highest risk factors among the logistics professionals assessed in this study.
A larger-scale study using quantitative analytical tools may give a more accurate estimate of MSDs, which would pave the way for more precise recommendations to eliminate the risk factors and thereby prevent MSDs. A follow-up study using exoskeletons in the workplace should be done to assess whether they assist in MSD prevention.
Keywords: musculoskeletal disorders, occupational health, safety, occupational risk, logistic companies, workers, Cyprus, industrial exoskeletons, wearable devices
7948 Levels of Selected Heavy Metals in Varieties of Vegetable Oils Consumed in Kingdom of Saudi Arabia and Health Risk Assessment of Local Population
Authors: Muhammad Waqar Ashraf
Abstract:
Selected heavy metals, namely Cu, Zn, Fe, Mn, Cd, Pb, and As, in seven popular varieties of edible vegetable oils collected from Saudi Arabia were determined by graphite furnace atomic absorption spectrometry (GF-AAS) after microwave digestion. The accuracy of the procedure was confirmed with certified reference material (NIST 1577b). The concentrations of copper, zinc, iron, manganese, lead, and arsenic were observed in the ranges of 0.035 - 0.286, 0.955 - 3.10, 17.3 - 57.8, 0.178 - 0.586, 0.011 - 0.017, and 0.011 - 0.018 µg/g, respectively. Cadmium was found in the range of 2.36 - 6.34 ng/g. The results are compared internationally and with standards laid down by world health agencies. A risk assessment study was carried out to assess exposure to these metals via the consumption of vegetable oils. A comparison was made with the safe intake levels recommended by the Institute of Medicine of the National Academies (IOM), the US Environmental Protection Agency (US EPA), and the Joint FAO/WHO Expert Committee on Food Additives (JECFA). The results indicated that the dietary intakes of the selected heavy metals from daily consumption of 25 g of edible vegetable oil by a 70 kg individual should pose no significant health risk to the local population.
Keywords: vegetable oils, heavy metals, contamination, health risk assessment
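The exposure arithmetic behind such a risk assessment is a one-liner: concentration times daily intake divided by body weight. Using the worst-case lead concentration from the abstract (0.017 µg/g), 25 g of oil per day, and a 70 kg adult (the reference dose in the hazard quotient is left as a parameter, since actual RfD values depend on the agency consulted):

```python
def estimated_daily_intake(conc_ug_per_g, intake_g_per_day, body_weight_kg):
    """EDI in µg per kg body weight per day."""
    return conc_ug_per_g * intake_g_per_day / body_weight_kg

def hazard_quotient(edi, reference_dose):
    """HQ = EDI / RfD; HQ < 1 suggests no appreciable non-carcinogenic risk."""
    return edi / reference_dose

edi_pb = estimated_daily_intake(0.017, 25, 70)
print(round(edi_pb, 5))  # µg/kg bw/day
```

Comparing each metal's EDI against the corresponding IOM/US EPA/JECFA reference value is exactly the comparison the abstract reports; here the lead EDI comes out at roughly 0.006 µg/kg bw/day.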
7947 Role of Erythrocyte Fatty Acids in Predicting Cardiometabolic Risk among the Elderly: A Secondary Analysis of the Walnut and Healthy Aging Study
Authors: Tony Jehi, Sujatha Rajaram, Nader Majzoub, Joan Sabate
Abstract:
Aging significantly increases the incidence of various cardiometabolic diseases, including cardiovascular disease (CVD). To combat CVD and its associated risk factors, it is imperative to adopt a healthy dietary pattern rich in beneficial nutrient and non-nutrient compounds. Unsaturated fats, specifically n-3 polyunsaturated fatty acids (n-3 PUFA), have cardio-protective effects; the opposite is true for saturated fatty acids. What role, if any, biomarkers of fatty acid intake (specific fatty acids in the erythrocyte) play in predicting cardiometabolic risk among the elderly, a population highly susceptible to mortality and morbidity from CVD risk factors, remains unclear. This was a secondary analysis of the Walnuts and Healthy Aging Study. Briefly, elderly participants (n = 192, mean age 69 y) followed their usual diet and were randomized into two groups, either eating walnuts daily or abstaining from walnuts, for a period of 2 years. The purpose was to identify potential associations between erythrocyte membrane fatty acids and cardiometabolic risk factors (body weight, blood pressure, blood lipids, and fasting glucose). Erythrocyte n-3 PUFA were inversely associated with total cholesterol (β = -3.83; p = 0.02), triglycerides (β = -7.66; p < 0.01), and fasting glucose (β = -0.19; p = 0.03). Specifically, erythrocyte ALA (β = -1.59; p = 0.04) and DPA (β = -0.62; p = 0.04) were inversely associated with diastolic blood pressure and fasting glucose, respectively. N-6 PUFA were positively associated with systolic blood pressure (β = 1.10; p = 0.02). Monounsaturated fatty acids were positively associated with triglycerides (β = 4.16; p = 0.03). Total saturated fatty acids were not associated with any cardiometabolic risk factor, and no association was found between any erythrocyte fatty acid and body weight.
In conclusion, erythrocyte n-3 PUFA may be used as a biomarker to predict cardiometabolic risk among healthy elders, providing support for the American Heart Association guidelines on including n-3 PUFA for preventing CVD.
Keywords: cardiometabolic diseases, erythrocyte fatty acids, elderly, n-3 PUFA
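The reported β coefficients are regression slopes. A minimal ordinary-least-squares slope with invented n-3 PUFA and triglyceride values (not the study's data) shows how an inverse association appears as a negative slope:

```python
import statistics as st

def ols_slope(x, y):
    """Simple linear-regression slope: cov(x, y) / var(x)."""
    mx, my = st.mean(x), st.mean(y)
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / (len(x) - 1)
    return cov / st.variance(x)

# Invented data: erythrocyte n-3 PUFA (% of total) vs triglycerides (mg/dL)
n3  = [4.0, 5.0, 6.0, 7.0, 8.0]
tag = [160, 155, 145, 140, 130]
print(ols_slope(n3, tag))  # -> -7.5 (inverse association)
```

The study's multivariable models additionally adjust for confounders, but the sign and magnitude of each β carry the same interpretation: the change in the outcome per unit change in the erythrocyte fatty acid.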
7946 Parametric Appraisal of Robotic Arc Welding of Mild Steel Material by Principal Component Analysis-Fuzzy with Taguchi Technique
Authors: Amruta Rout, Golak Bihari Mahanta, Gunji Bala Murali, Bibhuti Bhusan Biswal, B. B. V. L. Deepak
Abstract:
The use of industrial robots to perform welding operations is one of the hallmarks of contemporary welding. Modeling weld joint parameters and weld process parameters is one of the most crucial aspects of robotic welding. As weld process parameters affect the weld joint parameters differently, a multi-objective optimization technique has to be utilized to obtain an optimal setting of the weld process parameters. In this paper, a hybrid optimization technique, Principal Component Analysis (PCA) combined with fuzzy logic, has been proposed to find optimal settings of weld process parameters such as wire feed rate, welding current, gas flow rate, welding speed, and nozzle-tip-to-plate distance. The weld joint parameters considered for optimization are depth of penetration, yield strength, and ultimate strength. PCA is a very efficient multi-objective technique for converting correlated and dependent parameters, such as the weld joint parameters, into uncorrelated and independent variables. This approach also removes the need to check the correlation among responses, as no individual weights are assigned to the responses. A fuzzy inference engine can efficiently incorporate these aspects into its internal hierarchy, thereby overcoming various limitations of existing optimization approaches. Finally, the Taguchi method is used to obtain the optimal setting of weld process parameters. It is therefore concluded that the hybrid technique has its own advantages and can be used for quality improvement in industrial applications.
Keywords: robotic arc welding, weld process parameters, weld joint parameters, principal component analysis, fuzzy logic, Taguchi method
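The PCA step, converting two correlated weld responses into independent components, has a closed form in the bivariate case. The penetration and strength values below are hypothetical, chosen only to show the mechanics:

```python
import math
import statistics as st

def pca_2d(xs, ys):
    """Largest eigenvalue and eigenvector of the 2x2 covariance matrix,
    i.e., the first principal component of two correlated responses."""
    mx, my = st.mean(xs), st.mean(ys)
    sxx, syy = st.variance(xs), st.variance(ys)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    lam1 = tr / 2 + math.sqrt((tr / 2) ** 2 - det)   # larger eigenvalue
    v = (sxy, lam1 - sxx)                            # its eigenvector (sxy != 0)
    norm = math.hypot(*v)
    return lam1, (v[0] / norm, v[1] / norm)

# Hypothetical correlated weld responses: penetration depth vs yield strength
depth    = [3.1, 3.4, 3.9, 4.2, 4.8]
strength = [410, 420, 435, 450, 470]
lam, vec = pca_2d(depth, strength)
print(round(lam, 1), [round(c, 3) for c in vec])
```

Projecting the responses onto such eigenvectors is what yields the uncorrelated component scores that the fuzzy inference stage then aggregates; in practice the responses would first be normalized so that one large-scaled response does not dominate.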
7945 Estimating the Value of Statistical Life under the Subsidization and Cultural Effects
Authors: Mohammad A. Alolayan, John S. Evans, James K. Hammitt
Abstract:
The value of statistical life has been estimated for a Middle Eastern country with a heavily subsidized economy. In this study, in-person interviews were conducted on a stratified random sample to estimate the value of mortality risk. Double-bounded dichotomous-choice questions followed by an open-ended question were used in the interview to investigate the respondent's willingness to pay for mortality risk reduction. High willingness to pay was found to be associated with high income and education, and females were found to have lower willingness to pay than males. The estimated value of statistical life is larger than those estimated for Western countries, where taxation systems exist. This estimate provides a baseline for monetizing the health benefits of proposed policies or programs for decision makers in an Eastern country. The value of statistical life for other countries in the region can also be extrapolated from this estimate by using the benefit transfer method.
Keywords: mortality, risk, VSL, willingness-to-pay
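The VSL arithmetic behind such a survey is direct: mean willingness to pay divided by the size of the mortality-risk reduction respondents were asked about. The WTP amounts and risk reduction below are invented, not the study's data:

```python
import statistics as st

def value_of_statistical_life(wtp_amounts, risk_reduction):
    """VSL = mean willingness to pay / size of the mortality-risk reduction."""
    return st.mean(wtp_amounts) / risk_reduction

# Invented survey: WTP (USD) for a 1-in-10,000 annual risk reduction
wtp = [150, 200, 250, 300, 100]
print(value_of_statistical_life(wtp, 1 / 10_000))  # about 2 million USD
```

In a real double-bounded dichotomous-choice study the mean WTP is not averaged directly but estimated from the yes/no response intervals with a parametric model; the final division by the risk change is the same.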
7944 Parallel 2-Opt Local Search on GPU
Authors: Wen-Bao Qiao, Jean-Charles Créput
Abstract:
To accelerate the solution of large-scale traveling salesman problems (TSP), a parallel 2-opt local search algorithm with a simple implementation on the Graphics Processing Unit (GPU) is presented and tested in this paper. The parallel scheme is based on data decomposition: K processors are dynamically assigned along the complete tour to perform 2-opt local optimization on K edges simultaneously over independent sub-tours, where K can be user-defined or a function of the input size N. We implement this algorithm with a doubly linked list on the GPU; the implementation requires only O(N) memory. We compare this parallel 2-opt local optimization against a sequential exhaustive 2-opt search along the complete tour on TSP instances from TSPLIB with more than 10,000 cities.
Keywords: parallel 2-opt, double links, large scale TSP, GPU
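A sequential 2-opt pass, the baseline this paper parallelizes on the GPU, can be sketched as follows (array-based for brevity, rather than the paper's doubly linked list):

```python
import math

def tour_length(tour, dist):
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def two_opt(tour, dist):
    """Sequential 2-opt: reverse the segment between positions i and j
    whenever swapping edges (a,b) and (c,d) for (a,c) and (b,d) shortens
    the tour; repeat until no improving move exists."""
    improved = True
    while improved:
        improved = False
        for i in range(1, len(tour) - 1):
            for j in range(i + 1, len(tour)):
                a, b = tour[i - 1], tour[i]
                c, d = tour[j], tour[(j + 1) % len(tour)]
                if dist[a][c] + dist[b][d] < dist[a][b] + dist[c][d]:
                    tour[i:j + 1] = reversed(tour[i:j + 1])
                    improved = True
    return tour

# Four cities on a unit square; the crossing tour 0-2-1-3 gets uncrossed.
pts = [(0, 0), (1, 0), (1, 1), (0, 1)]
dist = [[math.dist(p, q) for q in pts] for p in pts]
print(two_opt([0, 2, 1, 3], dist))  # -> [0, 1, 2, 3]
```

The GPU version evaluates many such edge-swap candidates concurrently on disjoint sub-tours; the doubly linked list lets each processor reverse its segment without renumbering the whole array.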
7943 Multi-Objective Optimization of Electric Discharge Machining for Inconel 718
Authors: Pushpendra S. Bharti, S. Maheshwari
Abstract:
Electric discharge machining (EDM) is one of the most widely used non-conventional manufacturing processes for shaping difficult-to-cut materials. The process yield of EDM, in terms of material removal rate, surface roughness, and tool wear rate, may be improved considerably by selecting the optimal combination(s) of process parameters. This paper employs the multi-response signal-to-noise (MRSN) ratio technique to find the optimal combination(s) of process parameters during EDM of Inconel 718. Three cases, viz. high cutting efficiency, high surface finish, and normal machining, have been considered, and the optimal combinations of input parameters have been obtained for each case. Analysis of variance (ANOVA) has been employed to find the dominant parameter(s) in each of the three cases. The obtained results have also been verified experimentally. The MRSN ratio technique was found to be a simple and effective multi-objective optimization technique.
Keywords: electric discharge machining, material removal rate, surface roughness, tool wear rate, multi-response signal-to-noise ratio, optimization
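The signal-to-noise ratios underlying the MRSN technique follow Taguchi's standard formulas: larger-the-better for material removal rate, smaller-the-better for surface roughness and tool wear. The EDM response values below are hypothetical:

```python
import math

def sn_larger_better(values):
    """Taguchi larger-the-better S/N = -10*log10(mean(1/y^2))."""
    return -10 * math.log10(sum(1 / y ** 2 for y in values) / len(values))

def sn_smaller_better(values):
    """Taguchi smaller-the-better S/N = -10*log10(mean(y^2))."""
    return -10 * math.log10(sum(y ** 2 for y in values) / len(values))

# Hypothetical replicated responses at one EDM parameter setting:
mrr = [12.0, 13.5, 12.8]   # material removal rate: larger is better
ra  = [2.1, 2.3, 2.0]      # surface roughness: smaller is better
print(round(sn_larger_better(mrr), 2), round(sn_smaller_better(ra), 2))
```

The MRSN ratio then combines the per-response S/N values (typically after normalization) into a single score per parameter setting, so that one Taguchi analysis can rank settings against all three responses at once.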
7942 Optimum Dewatering Network Design Using Firefly Optimization Algorithm
Authors: S. M. Javad Davoodi, Mojtaba Shourian
Abstract:
A groundwater table close to the ground surface causes major problems in construction and mining operations. One of the methods to control groundwater in such cases is pumping wells, which remove excess water from the project site and lower the water table to a desirable level. Although the efficiency of this method is acceptable, it is expensive to apply, so even a small improvement in the design of the pumping wells can lead to substantial cost savings. In order to minimize the total cost of the pumping-well method, a simulation-optimization approach is applied. The proposed model integrates MODFLOW as the simulation model with the Firefly algorithm as the optimizer: MODFLOW computes the drawdown due to pumping in the aquifer, and the Firefly algorithm determines the optimal values of the design parameters, namely the number, pumping rates, and layout of the wells. The developed Firefly-MODFLOW model is applied to minimize the cost of the dewatering project for the ancient mosque of Kerman city in Iran. Repeated runs of the Firefly-MODFLOW model indicate that drilling two wells with a total pumping rate of 5503 m³/day solves the minimization problem. Results show that implementing the proposed solution leads to at least 1.5 m of drawdown in the aquifer beneath the mosque region, while the subsidence due to groundwater depletion is less than 80 mm. Sensitivity analyses indicate that the target groundwater depletion has an enormous impact on the total cost of the project. Moreover, in a hypothetical aquifer, decreasing the hydraulic conductivity decreases the total water extraction needed for dewatering.
Keywords: groundwater dewatering, pumping wells, simulation-optimization, MODFLOW, firefly algorithm
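The Firefly algorithm used as the optimizer here can be sketched in plain Python. The quadratic stand-in objective below replaces the MODFLOW drawdown simulation, and all parameter values (swarm size, attractiveness, the "optimum" pumping rates) are illustrative only:

```python
import math, random

def firefly_minimize(cost, bounds, n=20, iters=100, beta0=1.0, gamma=0.01,
                     alpha=0.2, seed=3):
    """Minimal firefly algorithm: each firefly moves toward every brighter
    (lower-cost) one, with attractiveness decaying in squared distance,
    plus a shrinking random walk."""
    rng = random.Random(seed)
    dim = len(bounds)
    X = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n)]
    for _ in range(iters):
        f = [cost(x) for x in X]
        for i in range(n):
            for j in range(n):
                if f[j] < f[i]:                          # j is brighter
                    r2 = sum((a - b) ** 2 for a, b in zip(X[i], X[j]))
                    beta = beta0 * math.exp(-gamma * r2)
                    for k in range(dim):
                        lo, hi = bounds[k]
                        step = beta * (X[j][k] - X[i][k]) \
                             + alpha * (rng.random() - 0.5)
                        X[i][k] = min(hi, max(lo, X[i][k] + step))
            f[i] = cost(X[i])
        alpha *= 0.97                                    # cool the random walk
    return min(X, key=cost)

# Toy dewatering-style objective: two pumping rates with a known optimum.
# In the paper, this evaluation would be a MODFLOW drawdown run instead.
cost = lambda q: (q[0] - 2.0) ** 2 + (q[1] - 3.0) ** 2
best = firefly_minimize(cost, [(0, 10), (0, 10)])
print([round(v, 1) for v in best])
```

The coupling to MODFLOW consists of replacing the lambda with a function that writes the candidate well design to a model input file, runs the simulation, and returns cost plus penalties for unmet drawdown constraints.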
7941 Economic Evaluation of Offshore Wind Project under Uncertainty and Risk Circumstances
Authors: Sayed Amir Hamzeh Mirkheshti
Abstract:
Offshore wind energy, as a strategic renewable energy, has been growing rapidly due to its availability, abundance, and clean nature. On the other hand, the budget of such a project is considerably higher than for other renewable energies, and it takes longer to complete. Accordingly, a precise estimation of time and cost is needed in order to raise awareness among developers and society and to convince them to develop this kind of energy despite its difficulties. Risks occurring during the project cause its duration and cost to change constantly. Therefore, to develop offshore wind power, it is critical to consider all potential risks that impact the project and to simulate their effect. Knowing these risks helps in selecting the most effective response strategies, such as avoidance, transfer, and acceptance, in order to decrease their probability and impact. This paper presents an evaluation of the feasibility of a 500 MW offshore wind project in the Persian Gulf under uncertain resources and risk. The purpose of this study is to evaluate the time and cost of the offshore wind project under risk circumstances and uncertain resources by using Monte Carlo simulation. We analyzed each risk and activity along with their distribution functions and their effect on the project.
Keywords: wind energy project, uncertain resources, risks, Monte Carlo simulation
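A Monte Carlo estimate of project duration of the kind described can be sketched with triangular activity distributions. The activities and their optimistic/likely/pessimistic durations below are hypothetical, not taken from the study:

```python
import random

def simulate_project(activities, n_runs=10_000, seed=7):
    """Monte Carlo estimate of total project duration: each activity's
    duration is drawn from a triangular(low, high, mode) distribution."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_runs):
        totals.append(sum(rng.triangular(lo, hi, mode)
                          for lo, mode, hi in activities))
    totals.sort()
    mean = sum(totals) / n_runs
    p90 = totals[int(0.9 * n_runs)]        # 90th-percentile duration
    return mean, p90

# Hypothetical offshore-wind activities (months): (optimistic, likely, pessimistic)
activities = [(6, 9, 15),    # permitting
              (10, 12, 20),  # foundations and turbines
              (3, 4, 8)]     # grid connection
mean, p90 = simulate_project(activities)
print(round(mean, 1), round(p90, 1))
```

Cost is simulated the same way, and risk events can be modeled as extra activities that occur only with some probability per run; the gap between the mean and the 90th percentile is what quantifies the schedule contingency.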
7940 Kriging-Based Global Optimization Method for Bluff Body Drag Reduction
Authors: Bingxi Huang, Yiqing Li, Marek Morzynski, Bernd R. Noack
Abstract:
We propose a Kriging-based global optimization method for active flow control with multiple actuation parameters, designed to converge quickly and avoid getting trapped in local minima. We follow the model-free explorative gradient method (EGM) in alternating between explorative and exploitive steps. This facilitates convergence similar to a gradient-based method together with the parallel exploration of potentially better minima. In contrast to EGM, both kinds of steps are performed with a Kriging surrogate model built from the available data. The explorative step maximizes the expected improvement, i.e., it favors regions of large uncertainty. The exploitive step identifies the best location of the cost function from the Kriging surrogate model for a subsequent weight-biased linear-gradient descent search. To verify the effectiveness and robustness of the improved Kriging-based optimization method, we examined several comparative test problems of varying dimensions with limited evaluation budgets. The results show that the proposed algorithm significantly outperforms model-free optimization algorithms such as the genetic algorithm and the differential evolution algorithm, with quicker convergence for a given budget. We also performed direct numerical simulations of the fluidic pinball (N. Deng et al. 2020 J. Fluid Mech.), three circular cylinders in an equilateral-triangular arrangement immersed in an incoming flow at Re = 100. The optimal cylinder rotations lead to 44.0% net drag power saving with 85.8% drag reduction and 41.8% actuation power. The optimal results for active flow control based on this configuration achieve a boat-tailing mechanism by employing Coanda forcing, and wake stabilization by delaying separation and minimizing the wake region.
Keywords: direct numerical simulations, flow control, kriging, stochastic optimization, wake stabilization
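The explorative step maximizes expected improvement, which for a Gaussian (Kriging) prediction has a well-known closed form. The sketch below, with invented numbers, illustrates why a point predicted slightly worse than the incumbent but very uncertain can still beat a confident, marginally better one:

```python
import math

def expected_improvement(mu, sigma, best):
    """EI for minimization at a point where the Kriging surrogate predicts
    mean mu and standard deviation sigma; `best` is the incumbent value."""
    if sigma <= 0:
        return max(best - mu, 0.0)
    z = (best - mu) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)
    cdf = 0.5 * (1 + math.erf(z / math.sqrt(2)))
    return (best - mu) * cdf + sigma * pdf

# Uncertain point, predicted slightly worse than the incumbent (best = 1.0):
print(round(expected_improvement(1.1, 1.0, best=1.0), 3))
# Confident point, predicted marginally better:
print(round(expected_improvement(0.95, 0.01, best=1.0), 3))
```

The first point wins on EI despite its worse mean, which is precisely the "favor regions of large uncertainty" behavior the abstract describes; the exploitive step instead queries where the surrogate mean itself is lowest.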
7939 Strategies of Risk Management for Smallholder Farmers in South Africa: A Case Study on Pigeonpea (Cajanus cajan) Production
Authors: Sanari Chalin Moriri, Kwabena Kingsley Ayisi, Alina Mofokeng
Abstract:
Dryland smallholder farmers in South Africa are vulnerable to risks of all kinds, which negatively affect crop productivity and profit. Pigeonpea is a leguminous, multipurpose crop that provides food, fodder, and wood for smallholder farmers. The majority of these farmers still grow pigeonpea from traditional unimproved seeds, which comprise a mixture of genotypes. The objectives of the study were to identify the key risk factors that affect pigeonpea productivity and to develop management strategies to alleviate these risk factors in pigeonpea production. The study was conducted in six municipalities across two provinces of South Africa (Limpopo and Mpumalanga) during the 2020/2021 growing season. A non-probability sampling method using purposive and snowball sampling techniques was used to collect data from farmers through a structured questionnaire. A total of 114 pigeonpea producers were interviewed individually. Key stakeholders in each municipality were also identified, invited, and interviewed to verify the information given by the farmers. The collected data were analyzed with SPSS version 25. The study found that the majority of farmers affected by risk factors were women, subsistence farmers, and elderly farmers, which resulted in low food production. Drought, the unavailability of improved pigeonpea seeds for planting, and limited access to information and processing equipment were found to be the main risk factors contributing to low crop productivity in farmers' fields. More than 80% of farmers lacked knowledge of crop improvement and of the processing techniques needed to secure high prices during the crop's off-season. Market availability, pricing, and the incidence of pests and diseases were found to be minor risk factors, triggered by the major risk factors; the minor risk factors can be corrected only if the major risk factors are first given the necessary attention.
About 10% of the farmers were found to use the crop as a mulch to reduce soil temperatures and improve soil fertility. The study revealed that most of the farmers were unaware of its other uses, such as fodder, mulch, medicine, and nitrogen fixation. The risk of frequent drought in the dry areas of South Africa, where farmers depend solely on rainfall, poses a serious threat to crop productivity. The majority of these risk factors are driven by climate change: erratic, low rainfall combined with extreme temperatures threatens food security, water, and the environment. The use of drought-tolerant, multipurpose legume crops such as pigeonpea, access to new information, the provision of processing equipment, and support from all stakeholders will help address food security for smallholder farmers. Policies should be revisited to address the prevailing risk factors faced by farmers and should involve them in addressing these factors. Awareness should be prioritized in promoting the crop to improve its production and commercialization in the dryland farming system of South Africa.
Keywords: management strategies, pigeonpea, risk factors, smallholder farmers
7938 Demographic Profile, Risk Factors and In-hospital Outcomes of Acute Coronary Syndrome (ACS) in Young Population, in Pakistan-Single Center Real World Experience
Authors: Asma Qudrat, Abid Ullah, Rafi Ullah, Ali Raza, Shah Zeb, Syed Ali Shan Ul-Haq, Shahkar Ahmed Shah, Attiya Hameed Khan, Saad Zaheer, Umama Qasim, Kiran Jamal, Zahoor khan
Abstract:
Objectives: Coronary artery disease (CAD) is a major public health issue associated with high mortality and morbidity rates worldwide. Young patients with ACS have unique characteristics, with different demographic profiles and risk factors. Precise diagnosis and early risk stratification are important in guiding treatment and predicting the prognosis of young patients with ACS. The aim was to evaluate the demographics, risk factors, and outcome profiles of ACS in young patients. Methods: The research followed a retrospective, single-centre design; patients diagnosed with a first event of ACS at a young age (over 18 and under 40 years) were included. Data collection covered the demographic profiles, risk factors, and in-hospital outcomes of young ACS patients. The patients' data were retrieved through the Electronic Medical Records (EMR) of Peshawar Institute of Cardiology (PIC), and all characteristics were assessed. Results: In this study, 77% of patients were male and 23% were female. The assessed risk factors showed significant associations with CAD (p < 0.01). The most common presentation was STEMI (45%). The angiographic pattern showed single-vessel disease (SVD) in 49%, double-vessel disease (DVD) in 17%, and triple-vessel disease (TVD) in 10%; the left anterior descending (LAD) artery (54%) was the most commonly involved artery. Conclusion: Male sex was predominant among young ACS patients, SVD was the most common coronary angiographic finding, and the assessed risk factors were significantly associated with CAD.
Keywords: coronary artery disease, non-ST elevation myocardial infarction, ST elevation myocardial infarction, unstable angina, acute coronary syndrome
Procedia PDF Downloads 165
7937 Calibration of Hybrid Model and Arbitrage-Free Implied Volatility Surface
Authors: Kun Huang
Abstract:
This paper investigates whether the combination of local and stochastic volatility models can be calibrated exactly to any arbitrage-free implied volatility surface of European options. The risk-neutral Brownian bridge density is applied to calibrate the leverage function of our hybrid model. Furthermore, the tails of the marginal risk-neutral density are generated by a Generalized Extreme Value distribution in order to capture the properties of asset returns. The local volatility is generated from the arbitrage-free implied volatility surface using the stochastic volatility inspired (SVI) parameterization.Keywords: arbitrage-free implied volatility, calibration, extreme value distribution, hybrid model, local volatility, risk-neutral density, stochastic volatility
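For reference, the Generalized Extreme Value family used above for the tail model has the standard textbook cdf (this is the generic form, not a formula taken from the paper itself):

```latex
F(x;\mu,\sigma,\xi) \;=\; \exp\!\left\{-\left[1+\xi\,\frac{x-\mu}{\sigma}\right]^{-1/\xi}\right\},
\qquad 1+\xi\,\frac{x-\mu}{\sigma}>0,\;\ \sigma>0,
```

with the Gumbel limit \(\exp\{-e^{-(x-\mu)/\sigma}\}\) as \(\xi \to 0\); the case \(\xi > 0\) gives the heavy-tailed Fréchet branch relevant for asset-return tails.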
Procedia PDF Downloads 267
7936 Mixed Integer Programming for Multi-Tier Rebate with Discontinuous Cost Function
Authors: Y. Long, L. Liu, K. V. Branin
Abstract:
One challenge faced by procurement decision-makers during the acquisition process is how to compare similar products from different suppliers and allocate orders among different products or services. This work focuses on allocating orders among multiple suppliers in the presence of rebates. The objective function minimizes the total acquisition cost, including purchasing cost and rebate benefit. Rebate benefit is complex and difficult to estimate at the ordering step; rebate rules vary across suppliers and usually change over time. In this work, we developed a system to collect rebate policies, standardized those policies, and developed two-stage optimization models for order allocation. Multi-tier rebate policies are considered in the modeling. The discontinuous cost function of the rebate benefit is formulated for different scenarios and approximated by a piecewise linear function, and a Mixed Integer Programming (MIP) model is built for the order allocation problem with multi-tier rebates. A case study is presented, showing that our optimization model can reduce the total acquisition cost by accounting for rebate rules.Keywords: discontinuous cost function, mixed integer programming, optimization, procurement, rebate
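The discontinuous, multi-tier rebate cost described in the abstract can be sketched in a few lines. The tier thresholds, prices, and the two-supplier brute-force split below are hypothetical stand-ins for the paper's actual MIP formulation and data:

```python
def rebate_cost(quantity, unit_price, tiers):
    """Total acquisition cost under a multi-tier rebate: once the spend
    reaches a tier threshold, that tier's rebate rate applies to the whole
    order, so the cost function jumps (is discontinuous) at each threshold."""
    spend = quantity * unit_price
    rate = 0.0
    for threshold, tier_rate in tiers:  # tiers sorted by ascending threshold
        if spend >= threshold:
            rate = tier_rate
    return spend * (1.0 - rate)

def best_split(total_qty, suppliers):
    """Brute-force order allocation between two suppliers -- a toy stand-in
    for solving the MIP on this small instance."""
    best = None
    for q in range(total_qty + 1):
        cost = (rebate_cost(q, *suppliers[0]) +
                rebate_cost(total_qty - q, *suppliers[1]))
        if best is None or cost < best[1]:
            best = ((q, total_qty - q), cost)
    return best

# hypothetical suppliers: (unit price, [(spend threshold, rebate rate), ...])
suppliers = [
    (10.0, [(500, 0.05), (1000, 0.10)]),   # supplier A
    (11.0, [(300, 0.08), (900, 0.15)]),    # supplier B
]
print(best_split(100, suppliers))
```

On this instance, concentrating the whole order with supplier A is optimal because it unlocks A's top rebate tier, illustrating why the rebate jumps make the problem non-trivial for naive per-unit price comparison.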
Procedia PDF Downloads 260
7935 Simulation and Optimization of Hybrid Energy System Autonomous PV-Diesel-Wind Power with Battery Storage for Relay Antenna Telecommunication
Authors: Tahri Toufik, Bouchachia Mohamed, Braikia Oussama
Abstract:
The objective of this work is the design and optimization of a hybrid PV-diesel-wind power system with battery storage to power an isolated telecommunication relay antenna in the Chlef region. The hybrid system is simulated with the HOMER software to determine the size and number of each component of the system and to identify the optimal technical and economic configuration, using monthly average values per year for a fixed relay-antenna load of 22 kWh/d.Keywords: HOMER, hybrid, PV-diesel-wind system, relay antenna telecommunication
Procedia PDF Downloads 518
7934 Radial Distribution Network Reliability Improvement by Using Imperialist Competitive Algorithm
Authors: Azim Khodadadi, Sahar Sadaat Vakili, Ebrahim Babaei
Abstract:
This study presents a numerical method to optimize the failure rate and repair time of a typical radial distribution system. Failure rate and repair time are effective parameters in the customer-based and energy-based reliability indices; decreasing these parameters improves the indices and thus boosts system stability. Penalty functions indirectly reflect the investment cost spent to improve the indices. Constraints on customer-based and energy-based indices, i.e., SAIFI, SAIDI, CAIDI, and AENS, have been considered using a new method that reduces the number of controlling parameters of the optimization algorithm. The Imperialist Competitive Algorithm (ICA) is used as the main optimization technique, and particle swarm optimization (PSO), simulated annealing (SA), and differential evolution (DE) have been applied for further investigation. These algorithms have been implemented on a test system in MATLAB, and the results have been compared. The optimized values of repair time and failure rate are much lower than the current values, which reduces investment cost; ICA also gives better answers than the other algorithms.Keywords: imperialist competitive algorithm, failure rate, repair time, radial distribution network
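The customer-based indices constrained in this study follow standard definitions; a minimal sketch of how they are computed from outage records is below (the feeder size and outage log are hypothetical, not data from the paper):

```python
def reliability_indices(outages, total_customers):
    """Customer-based reliability indices from a list of outage records,
    each (customers_affected, duration_hours).  Standard definitions:
    SAIFI = total customer interruptions / customers served,
    SAIDI = total customer interruption hours / customers served,
    CAIDI = SAIDI / SAIFI (average interruption duration)."""
    interruptions = sum(n for n, _ in outages)
    customer_hours = sum(n * d for n, d in outages)
    saifi = interruptions / total_customers
    saidi = customer_hours / total_customers
    caidi = saidi / saifi if saifi else 0.0
    return saifi, saidi, caidi

# hypothetical yearly outage log for a 1000-customer radial feeder
outages = [(200, 1.5), (50, 4.0), (500, 0.5)]
print(reliability_indices(outages, 1000))
```

Lowering failure rate shrinks the interruption count (SAIFI), while lowering repair time shrinks the customer-hours (SAIDI), which is why the paper optimizes both parameters jointly.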
Procedia PDF Downloads 669
7933 The Communication of Audit Report: Key Audit Matters in United Kingdom
Authors: L. Sierra, N. Gambetta, M. A. Garcia-Benau, M. Orta
Abstract:
Financial scandals and financial crises have led to an international debate on the value of auditing. In recent years there have been significant legislative reforms aiming to increase markets' confidence in audit services. In particular, there has been a significant debate on the need to improve the communication between auditors and audit report users as a way to improve the report's informative value and thus audit quality. The International Auditing and Assurance Standards Board (IAASB) has proposed changes to the audit report standards. International Standard on Auditing 701, Communicating Key Audit Matters (KAM) in the Independent Auditor's Report, has introduced new concepts that go beyond the auditor's opinion and requires disclosure of the risks that, from the auditor's point of view, are most significant in the audited company's information. Focusing on the companies included in the Financial Times Stock Exchange 100 index, this study analyzes the determinants of the number of KAM disclosed by the auditor in the audit report and, moreover, the determinants of the different types of KAM reported during the period 2013-2015. To test the hypotheses in the empirical research, two different models have been used. The first is a linear regression model that identifies the client characteristics, industry sector, and auditor characteristics related to the number of KAM disclosed in the audit report. Second, a logistic regression model is used to identify the determinants of the number of each KAM type disclosed in the audit report; in line with the risk-based approach to auditing financial statements, we categorized the KAM into two groups: entity-level KAM and account-level KAM. Regarding the impact of auditor characteristics on KAM disclosure, the results show that PwC tends to report a larger number of KAM while KPMG tends to report fewer KAM in the audit report.
Further, PwC reports a larger number of entity-level risk KAM while KPMG reports fewer account-level risk KAM. The results also show that companies paying higher fees tend to have more entity-level risk KAM and fewer account-level risk KAM. The materiality level is positively related to the number of account-level risk KAM. Additionally, the results show that the relationship between client characteristics and the number of KAM is more evident for account-level risk KAM than for entity-level risk KAM. A highly leveraged company carries a great deal of risk, but for that reason it is usually subject to strong monitoring by capital providers, resulting in fewer account-level risk KAM. The results reveal that the number of account-level risk KAM is strongly related to the industry sector in which the company operates. This study helps to understand the UK audit market, provides information to auditors and, finally, opens new research avenues in academia.Keywords: FTSE 100, ISA 701, key audit matters, auditor's characteristics, client's characteristics
Procedia PDF Downloads 231
7932 Multi-Criteria Optimal Management Strategy for in-situ Bioremediation of LNAPL Contaminated Aquifer Using Particle Swarm Optimization
Authors: Deepak Kumar, Jahangeer, Brijesh Kumar Yadav, Shashi Mathur
Abstract:
In-situ remediation is a technique that can treat either surface water or groundwater at the site of contamination. In the present study, a simulation-optimization approach has been used to develop a management strategy for remediating LNAPL (Light Non-Aqueous Phase Liquid) contaminated aquifers. Benzene, toluene, ethylbenzene, and xylene, collectively known as BTEX, are the main components of the LNAPL contaminant. In the in-situ bioremediation process, a set of injection and extraction wells is installed: injection wells supply oxygen and other nutrients that help indigenous soil bacteria convert BTEX into carbon dioxide and water, while extraction wells check the movement of the plume downstream. In this study, the optimal design of the system has been obtained using the PSO (Particle Swarm Optimization) algorithm. A comprehensive management strategy for the pumping of injection and extraction wells has been developed to attain maximum allowable concentrations of 5 ppm and 4.5 ppm. The management strategy comprises the pumping rates, the total pumping volume, and the total running cost incurred for each potential injection and extraction well. The results indicate a high pumping rate for injection wells during the initial management period, since it facilitates the availability of oxygen and other nutrients necessary for biodegradation; the rate is low during the third year on account of sufficient oxygen availability, because the contaminant is assumed to have biodegraded by the end of the third year, when the concentration drops to a permissible level.Keywords: groundwater, in-situ bioremediation, light non-aqueous phase liquid, BTEX, particle swarm optimization
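The abstract does not give its PSO settings, so the sketch below is a generic particle swarm minimizer on a toy objective, standing in for the pumping-rate design problem; the swarm size, coefficients, and objective are assumptions, not values from the study:

```python
import random

def pso(objective, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Minimal PSO sketch: each particle tracks its personal best, the swarm
    tracks a global best, and velocities blend inertia, cognitive pull
    toward the personal best, and social pull toward the global best."""
    dim = len(bounds)
    pos = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                lo, hi = bounds[d]
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)  # clamp to bounds
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# toy stand-in objective: minimize a sum of squares over two decision variables
random.seed(1)
best, val = pso(lambda x: sum(v * v for v in x), [(-5, 5), (-5, 5)])
print(best, val)
```

In the real problem, the decision variables would be the well pumping rates and the objective would wrap the groundwater transport simulation, which is what "simulation-optimization" refers to.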
Procedia PDF Downloads 445
7931 Gariep Dam Basin Management for Satisfying Ecological Flow Requirements
Authors: Dimeji Abe, Nonso Okoye, Gideon Ikpimi, Prince Idemudia
Abstract:
Multi-reservoir operation optimization has been a critical issue in river basin management. Water, as a scarce resource, is in high demand, and the problems associated with reservoirs as its storage facilities are enormous. The complexity of balancing the supply and demand of this prime resource has created the need to examine the best way to solve the problem using optimization techniques. The objective of this study is to evaluate the performance of a multi-objective meta-heuristic algorithm for the operation of the Gariep Dam in satisfying ecological flow requirements. The study uses an evolutionary algorithm, the backtracking search algorithm (BSA), to determine how best to optimize dam operations for hydropower production, flood control, and water supply without compromising the environmental flow requirement needed for the survival of aquatic species and the sustenance of life downstream of the dam. To achieve this objective, dam operations corresponding to different trade-offs between the objectives are optimized. The results indicate the best model from the algorithm satisfies all the objectives without any constraint violation. It is expected that hydropower generation will be improved and that more water will be available for ecological flow requirements with the use of the algorithm, which also provides farmers with more irrigation water to improve their business.Keywords: BSA evolutionary algorithm, metaheuristics, optimization, river basin management
Procedia PDF Downloads 245
7930 Systematic Review of Quantitative Risk Assessment Tools and Their Effect on Racial Disproportionality in Child Welfare Systems
Authors: Bronwen Wade
Abstract:
Over the last half-century, child welfare systems have increasingly relied on quantitative risk assessment tools, such as actuarial or predictive risk tools. These tools are developed by statistically analyzing how attributes captured in administrative data relate to future child maltreatment. Some scholars argue that attributes in administrative data can serve as proxies for race and that quantitative risk assessment tools reify racial bias in decision-making; others argue that these tools provide more "objective" and "scientific" guides for decision-making than subjective social worker judgment. This study performs a systematic review of the literature on the impact of quantitative risk assessment tools on racial disproportionality; it examines methodological biases in work on this topic, summarizes key findings, and provides suggestions for further work. A search of CINAHL, PsycINFO, the ProQuest Social Science Premium Collection, and the ProQuest Dissertations and Theses Collection was performed, including both academic and grey literature. The review includes studies that use quasi-experimental methods as well as development, validation, or re-validation studies of quantitative risk assessment tools. PROBAST (Prediction model Risk of Bias Assessment Tool) and CHARMS (CHecklist for critical Appraisal and data extraction for systematic Reviews of prediction Modelling Studies) were used to assess the risk of bias and guide data extraction for the development, validation, and re-validation studies; ROBINS-I (Risk of Bias in Non-Randomized Studies of Interventions) served the same role for the quasi-experimental studies identified. Due to heterogeneity among papers, a meta-analysis was not feasible, and a narrative synthesis was conducted. Eleven papers met the eligibility criteria, and each has an overall high risk of bias based on the PROBAST and ROBINS-I assessments.
This is deeply concerning, as major policy decisions have been made based on a limited number of studies with a high risk of bias. The findings on racial disproportionality have been mixed and depend on the tool and approach used. Authors use various definitions for racial equity, fairness, or disproportionality. These concepts of statistical fairness are connected to theories about the reason for racial disproportionality in child welfare or social definitions of fairness that are usually not stated explicitly. Most findings from these studies are unreliable, given the high degree of bias. However, some of the less biased measures within studies suggest that quantitative risk assessment tools may worsen racial disproportionality, depending on how disproportionality is mathematically defined. Authors vary widely in their approach to defining and addressing racial disproportionality within studies, making it difficult to generalize findings or approaches across studies. This review demonstrates the power of authors to shape policy or discourse around racial justice based on their choice of statistical methods; it also demonstrates the need for improved rigor and transparency in studies of quantitative risk assessment tools. Finally, this review raises concerns about the impact that these tools have on child welfare systems and racial disproportionality.Keywords: actuarial risk, child welfare, predictive risk, racial disproportionality
Procedia PDF Downloads 54
7929 Health Risk Assessment and Source Apportionment of Elemental Particulate Contents from a South Asian Future Megacity
Authors: Afifa Aslam, Muhammad Ibrahim, Abid Mahmood, Muhammad Usman Alvi, Fariha Jabeen, Umara Tabassum
Abstract:
Many factors cause air pollution in Pakistan, posing a significant threat to human health. Diesel and gasoline motor vehicles, as well as industrial plants, pollute the air in Pakistan's cities. The study's goal is to determine the level of air pollution in a Pakistani industrial city and to establish risk levels for the health of the population. We measured the intensity of air pollution by chemical characterization and examination of air samples collected at fixed monitoring sites. The PM10 levels observed at all sampling sites, including residential, commercial, high-traffic, and industrial areas, were well above the limits imposed by the Pakistan EPA, the United States EPA, and the WHO. We assessed the health risk from chemical factors using a methodology approved for risk assessment. All Igeo index values greater than one indicated moderate, or moderate to severe, contamination. The heavy metals carry a substantial risk of acute adverse effects, and in Faisalabad, Pakistan, there was an extremely high risk of chronic effects produced by heavy metal exposure. For the specified toxic metals, intolerable levels of carcinogenic risk were determined for the entire population. As a result, in most of the investigated areas of Faisalabad, the indices and hazard quotients for chronic and acute exposure exceeded the permissible level of 1.0. In the current study, re-suspended roadside mineral dust, anthropogenic exhaust emissions from traffic and industry, and industrial dust were identified as the major emission sources of the elemental particulate contents. Because of the unacceptable levels of risk in the research area, it is strongly suggested that a comprehensive study of the population's health status as affected by air pollution be conducted so that policies can be developed against these risks.Keywords: elemental composition, particulate pollution, Igeo index, health risk assessment, hazard quotient
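The two screening metrics named in the abstract have compact standard definitions; the sketch below implements them with hypothetical concentration and dose values (not the study's data):

```python
import math

def igeo(conc, background):
    """Geoaccumulation index, Igeo = log2(Cn / (1.5 * Bn)); the factor 1.5
    allows for natural fluctuation in background levels.  Igeo in (1, 2]
    is conventionally read as 'moderately contaminated'."""
    return math.log2(conc / (1.5 * background))

def hazard_index(doses, reference_doses):
    """Non-carcinogenic screening: each hazard quotient is
    HQ = average daily dose / reference dose, and the hazard index
    HI = sum of HQs; HI > 1 flags potential adverse effects."""
    hqs = [d / rfd for d, rfd in zip(doses, reference_doses)]
    return hqs, sum(hqs)

# hypothetical roadside-dust concentration vs. background (mg/kg)
print(igeo(90.0, 20.0))
# hypothetical daily doses vs. reference doses (mg/kg-day) for two metals
hqs, hi = hazard_index([2e-4, 5e-5], [3.5e-3, 3e-4])
print(hqs, hi)
```

With these toy numbers the Igeo falls in the "moderately contaminated" band while the HI stays below 1; the study reports the opposite for Faisalabad, with quotients exceeding 1.0.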
Procedia PDF Downloads 91
7928 Screening Ecological Risk Assessment at an Old Abandoned Mine in Northern Taiwan
Authors: Hui-Chen Tsai, Chien-Jen Ho, Bo-Wei Power Liang, Ying Shen, Yi-Hsin Lai
Abstract:
The former Taiwan Metal Mining Corporation site and its three associated waste flue gas tunnels, hereinafter referred to as 'TMMC', were contaminated with heavy metals, polychlorinated biphenyls (PCBs), and total petroleum hydrocarbons (TPHs) in soil. Since the contamination had been left exposed and unmanaged in the environment for more than 40 years, the contaminated area is estimated at more than 25 acres. Additionally, TMMC is located in a remote, mountainous area with almost no residents within a 1-km radius. It was therefore deemed necessary to conduct an ecological risk assessment to inform the future contaminated-site management plan. According to the winter and summer ecological investigation results, one endangered plant species and multiple vulnerable and near-threatened plants were discovered, and numerous protected bird species, such as the Crested Serpent Eagle, Crested Goshawk, Black Kite, Brown Shrike, and Taiwan Blue Magpie, were observed. Ecological soil screening levels (Eco-SSLs) developed by the USEPA were adopted as the reference for the screening assessment. Since all the protected species observed around the TMMC site were birds, the screening ecological risk assessment was conducted on birds only. The assessment was based mainly on chemical evaluation, in which the contamination in each environmental medium was compared directly with the ecological impact level (EIL) of each evaluation endpoint to obtain the respective hazard quotient (HQ) and hazard index (HI). The preliminary ecological risk assessment results indicated an HI greater than 1; in other words, the biological stressors (birds) were exposed to contamination at dosages already exceeding those that could cause unacceptable impacts to the ecological system.
This result was mainly due to the high concentrations of arsenic, lead, and other metals; thus, it was suggested that the above-mentioned contaminants be remediated as soon as possible or that proper risk management measures be taken.Keywords: screening, ecological risk assessment, ecological impact levels, risk management
Procedia PDF Downloads 134
7927 Using Predictive Analytics to Identify First-Year Engineering Students at Risk of Failing
Authors: Beng Yew Low, Cher Liang Cha, Cheng Yong Teoh
Abstract:
Due to a lack of continual assessment or grade-related data, identifying first-year engineering students in a polytechnic education at risk of failing is challenging. Our experience over the years tells us that there is no strong correlation between having good entry grades in Mathematics and the Sciences and excelling in hardcore engineering subjects. Hence, identifying students at risk of failure cannot be based on entry grades in Mathematics and the Sciences alone. These factors compound the difficulty of early identification and intervention. This paper describes the development of a predictive analytics model for the early detection of students at risk of failing and evaluates its effectiveness. Data from continual assessments conducted in term one, supplemented by data on student psychological profiles such as interests and study habits, were used. Three classification techniques, namely logistic regression, k-nearest neighbour, and random forest, were used in our predictive model. Based on our findings, random forest was the strongest predictor, with an Area Under the Curve (AUC) value of 0.994; correspondingly, its accuracy, precision, recall, and F-score were also the highest among the three classifiers. Using this random forest classification technique, students at risk of failure could be identified at the end of term one and assigned to a Learning Support Programme at the beginning of term two. This paper presents these findings and proposes further improvements to the model.Keywords: continual assessment, predictive analytics, random forest, student psychological profile
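The AUC metric used to select the strongest classifier above has a simple rank-based definition; a minimal sketch follows, with hypothetical labels and at-risk scores rather than the paper's data:

```python
def auc_score(labels, scores):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) identity:
    AUC = P(score of a random positive > score of a random negative),
    counting ties as 1/2."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# hypothetical at-risk flags and classifier probabilities for five students
labels = [1, 1, 0, 0, 0]
scores = [0.9, 0.3, 0.35, 0.2, 0.1]
print(auc_score(labels, scores))
```

An AUC of 1.0 means every at-risk student outranks every non-at-risk student; the reported 0.994 indicates the random forest ranked students almost perfectly on the term-one data.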
Procedia PDF Downloads 134
7926 Inversion of Electrical Resistivity Data: A Review
Authors: Shrey Sharma, Gunjan Kumar Verma
Abstract:
High-density electrical prospecting has been widely used in groundwater investigation, civil engineering, and environmental surveys. For efficient inversion, the forward modeling routine, sensitivity calculation, and inversion algorithm must all be efficient. This paper attempts to provide a brief summary of past and ongoing developments of the method. It includes reviews of the procedures used for data acquisition, processing, and inversion of electrical resistivity data, based on a compilation of the academic literature. In recent times there has been a significant evolution in field survey designs and data inversion techniques for the resistivity method. In general, 2-D inversion of resistivity data is carried out using the linearized least-squares method with a local optimization technique. Multi-electrode and multi-channel systems have made it possible to conduct large 2-D, 3-D, and even 4-D surveys efficiently to resolve complex geological structures that were not possible with traditional 1-D surveys. 3-D surveys play an increasingly important role in very complex areas where 2-D models suffer from artifacts due to off-line structures. Continued developments in computation technology, as well as fast data inversion techniques and software, have made it possible to use optimization techniques to obtain model parameters to higher accuracy. A brief discussion of the limitations of the electrical resistivity method is also presented.Keywords: inversion, limitations, optimization, resistivity
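The linearized least-squares update at the heart of such 2-D inversions is the damped normal-equations step dm = (JᵀJ + λI)⁻¹Jᵀr. The sketch below writes it out for a toy 2-parameter model so the 2x2 system has a closed-form solve; the Jacobian and residual values are illustrative, not from any real survey:

```python
def damped_lsq_step(J, residual, lam):
    """One damped (Tikhonov-regularized) linearized least-squares update,
    dm = (J^T J + lam*I)^-1 J^T r, expanded for a 2-parameter model so the
    2x2 normal-equations system can be solved in closed form."""
    # normal-equations matrix A = J^T J + lam*I and right-hand side b = J^T r
    a11 = sum(row[0] * row[0] for row in J) + lam
    a12 = sum(row[0] * row[1] for row in J)
    a22 = sum(row[1] * row[1] for row in J) + lam
    b1 = sum(row[0] * r for row, r in zip(J, residual))
    b2 = sum(row[1] * r for row, r in zip(J, residual))
    det = a11 * a22 - a12 * a12  # A is symmetric, so a21 == a12
    return ((a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det)

# toy sensitivity (Jacobian) for 3 data and 2 model parameters, plus residuals
J = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
r = [0.2, -0.1, 0.1]
print(damped_lsq_step(J, r, lam=0.01))
```

In a real resistivity code this step is iterated: the forward model and Jacobian are recomputed at each updated model, and the damping λ trades data fit against model stability.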
Procedia PDF Downloads 365
7925 Revised Risk Priority Number in Failure Mode and Effects Analysis Model from the Perspective of Healthcare System
Authors: Fatemeh Rezaei, Mohammad H. Yarmohammadian, Masoud Ferdosi, Abbas Haghshnas
Abstract:
Background: Failure Mode and Effects Analysis (FMEA) is now known as one of the main methods of risk assessment and is an accreditation requirement for many organizations. The Risk Priority Number (RPN) approach is generally preferred, especially for its ease of use: it does not require statistical data but is based on subjective expert evaluations of the Occurrence (Oi), Severity (Si), and Detectability (Di) of each cause of failure. Methods: This is a quantitative-qualitative study. For the qualitative dimension, focus groups with an inductive approach were used; to evaluate the qualitative results, a quantitative assessment was conducted to calculate RPN scores. Results: We studied the patient journey process in a surgery ward, and the most important phase of the process was determined to be transport of the patient from the holding area to the operating room. The highest-priority failures of this phase were determined by defining inclusion criteria covering severity (clinical effect, claim consequence, waste of time, and financial loss), occurrence (time-unit occurrence and degree of exposure to risk), and preventability (degree of preventability and defensive barriers), and by quantifying the risk priority criteria in terms of the RPN index. Reassessment of the improved RPN through root cause analysis (RCA) showed some variations. Conclusions: It could be concluded that understandable criteria should be developed according to the specialized language and communication field of the personnel. Therefore, participation of both technical and clinical groups is necessary to modify and apply these models.Keywords: failure mode and effects analysis, risk priority number (RPN), health system, risk assessment
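The classic RPN computation the abstract builds on is simply the product of the three expert scores; the sketch below ranks a few failure modes, with hypothetical scores invented for the patient-transport phase rather than taken from the study:

```python
def rpn(occurrence, severity, detectability):
    """Classic FMEA risk priority number: RPN = O * S * D, each factor
    scored by experts on a 1-10 ordinal scale."""
    for v in (occurrence, severity, detectability):
        if not 1 <= v <= 10:
            raise ValueError("each factor is scored from 1 to 10")
    return occurrence * severity * detectability

# hypothetical failure modes for transport to the operating room: (O, S, D)
failure_modes = {
    "wrong patient identification": (2, 10, 4),
    "delayed transport": (6, 4, 3),
    "missing chart/consent": (4, 7, 2),
}
ranked = sorted(failure_modes.items(), key=lambda kv: rpn(*kv[1]), reverse=True)
for name, (o, s, d) in ranked:
    print(name, rpn(o, s, d))
```

A known weakness, which motivates revised-RPN work like this paper's, is that the product treats the three ordinal scales as interchangeable: (2, 10, 4) and (10, 2, 4) yield the same RPN despite very different clinical meaning.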
Procedia PDF Downloads 313