Search results for: combined evaluation of concurrent risk events
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 15700

14740 Comprehensive Risk Analysis of Decommissioning Activities with Multifaceted Hazard Factors

Authors: Hyeon-Kyo Lim, Hyunjung Kim, Kune-Woo Lee

Abstract:

The decommissioning of nuclear facilities can be regarded as a sequence of problem-solving activities, partly because the working environment may be contaminated by radiological exposure, and partly because industrial hazards such as fire, explosions, toxic materials, and electrical and physical hazards may also be present. For individual hazard factors, risk assessment techniques have become familiar to industrial workers as safety technology has advanced, but how to integrate their results has not. Furthermore, few workers have extensive past experience of decommissioning operations. Many countries have therefore been trying to develop appropriate techniques to guarantee the safety and efficiency of the process. Even so, neither domestic nor international standards exist, since nuclear facilities are too diverse and unique. Consequently, it is inevitable to anticipate and assess the whole risk of each situation one by one. This paper aimed to find an appropriate technique for integrating individual risk assessment results from the viewpoint of experts. On the one hand, the whole risk assessment activity for decommissioning operations was modeled as a sequence of individual risk assessment steps; on the other, a hierarchical risk structure was developed. Then, a risk assessment procedure that can elicit individual hazard factors one by one was introduced with reference to the standard operating procedure (SOP) and hierarchical task analysis (HTA). Assuming quantification and normalization of individual risks, a technique to estimate relative weight factors was tried using the conventional Analytic Hierarchy Process (AHP), and its result was reviewed against expert judgment. In addition, to account for the ambiguity of human judgment, a discussion based on fuzzy inference was added with a mathematical case study.
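The AHP weighting step can be illustrated with a short sketch. The three hazard factors and the pairwise comparison matrix below are hypothetical, and the geometric-mean approximation stands in for the full principal-eigenvector computation:

```python
import math

def ahp_weights(A):
    """Approximate AHP priority weights by the geometric-mean method
    and report the consistency ratio (CR) of the pairwise matrix A."""
    n = len(A)
    # Geometric mean of each row, normalised to sum to 1.
    gm = [math.prod(row) ** (1.0 / n) for row in A]
    total = sum(gm)
    w = [g / total for g in gm]
    # Estimate the principal eigenvalue lambda_max from A.w.
    Aw = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
    lam = sum(Aw[i] / w[i] for i in range(n)) / n
    ci = (lam - n) / (n - 1)              # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]   # Saaty's random index
    return w, ci / ri                      # weights, consistency ratio

# Hypothetical pairwise comparisons of three hazard factors
# (radiological, fire/explosion, electrical), on Saaty's 1-9 scale.
A = [[1, 3, 5],
     [1/3, 1, 2],
     [1/5, 1/2, 1]]
weights, cr = ahp_weights(A)
```

A CR below 0.1 is the conventional acceptance threshold for the expert judgments encoded in the matrix.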

Keywords: decommissioning, risk assessment, analytic hierarchy process (AHP), fuzzy inference

Procedia PDF Downloads 423
14739 Self-Efficacy Perceptions and the Attitudes of Prospective Teachers towards Assessment and Evaluation

Authors: Münevver Başman, Ezel Tavşancıl

Abstract:

Making the right decisions about students depends on teachers' effective use of assessment and evaluation techniques. To do so, teachers should have positive attitudes and an adequate self-efficacy perception toward assessment and evaluation. The purpose of this study is to investigate the relationship between the self-efficacy perceptions and the attitudes of prospective teachers towards assessment and evaluation, and how these differ across a variety of demographic variables. The study group consisted of 277 prospective teachers studying in different departments of Marmara University, Faculty of Education. In this study, a 'Personal Information Form', 'A Perceptual Scale for Measurement and Evaluation of Prospective Teachers' Self-Efficacy in Education' and the 'Attitudes toward Educational Measurement Inventory' were applied. As a result, a positive correlation was found between the self-efficacy perceptions and the attitudes of prospective teachers towards assessment and evaluation. Across departments, there is a significant difference in both the mean attitude scores and the mean self-efficacy perception scores of prospective teachers. However, with respect to attendance of a statistics class and the type of high school graduated from, there is no significant difference in either the mean attitude scores or the mean self-efficacy perception scores.

Keywords: attitude, perception, prospective teacher, self-efficacy

Procedia PDF Downloads 302
14738 Performance Evaluation and Planning for Road Safety Measures Using Data Envelopment Analysis and Fuzzy Decision Making

Authors: Hamid Reza Behnood, Esmaeel Ayati, Tom Brijs, Mohammadali Pirayesh Neghab

Abstract:

Investment projects in road safety planning can benefit from an effectiveness evaluation regarding their expected safety outcomes. The objective of this study is to develop a decision support system (DSS) to support policymakers in making the right choice in road safety planning, based on the efficiency of previously implemented safety measures in a set of regions in Iran. The measures considered for each region include performance indicators on (1) police operations, (2) treated black spots, (3) freeway and highway facility supplies, (4) speed control cameras, (5) emergency medical services, and (6) road lighting projects. To this end, an inefficiency measure is calculated, defined as the proportion of fatality rates in relation to the combined measure of road safety performance indicators (i.e., road safety measures), which should be minimized. The relative inefficiency of each region is modeled by the Data Envelopment Analysis (DEA) technique. In the next step, a fuzzy decision-making system is constructed to convert the information obtained from the DEA analysis into a rule-based system that can be used by policymakers to evaluate the expected outcomes of certain alternative investment strategies in road safety.
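The DEA efficiency idea can be sketched in the special case of one aggregate input and one output, where the CCR efficiency score reduces to normalising output/input ratios; the regional figures below are invented for illustration, and the study's full multi-indicator DEA would require a linear program per region:

```python
def dea_single_ratio(inputs, outputs):
    """CCR efficiency scores for the single-input, single-output case,
    where DEA reduces to normalising the output/input ratios so the
    best-performing unit scores 1 and the rest score relative to it."""
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Hypothetical regional data: input = composite road-safety spending index,
# output = inverse fatality rate (higher is safer).
spend = [10.0, 8.0, 12.0, 6.0]
inv_fatality = [5.0, 4.8, 5.4, 2.4]
eff = dea_single_ratio(spend, inv_fatality)  # 1 - eff is the inefficiency
```

Region 2 (ratio 0.6) defines the efficiency frontier here; the other regions' scores measure how far they fall below it.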

Keywords: performance indicators, road safety, decision support system, data envelopment analysis, fuzzy reasoning

Procedia PDF Downloads 348
14737 Operationalizing the Concept of Community Resilience through Community Capitals Framework-Based Index

Authors: Warda Ajaz

Abstract:

This study uses the Community Capitals Framework (CCF) to develop a community resilience index that can serve as a useful tool for measuring the resilience of communities in diverse contexts and backgrounds. CCF is an important analytical tool for assessing holistic community change. The framework identifies seven major types of community capitals: natural, cultural, human, social, political, financial and built, and claims that communities that have been successful in supporting healthy, sustainable community and economic development have paid attention to all of these capitals. The framework therefore proposes to study community development through the identification of assets in these major capitals (stock), investment in these capitals (flow), and the interaction between these capitals. Capital-based approaches have been used extensively to assess community resilience, especially in the context of natural disasters and extreme events. Therefore, this study identifies key indicators for estimating each of the seven capitals through an extensive literature review and then develops an index to calculate a community resilience score. The CCF-based community resilience index presents an innovative way of operationalizing the concept of community resilience and will contribute toward decision-relevant research on adaptation and mitigation of community vulnerabilities to climate change-induced and other adverse events.
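As a sketch of how such a capital-based index can be assembled, the following normalises hypothetical indicators within each capital and averages them into one score per community. Only two of the seven capitals are shown, all figures are invented, and equal weighting is an assumption (the paper does not specify its aggregation scheme):

```python
def minmax(vals):
    """Min-max normalise an indicator across communities to [0, 1]."""
    lo, hi = min(vals), max(vals)
    return [(v - lo) / (hi - lo) for v in vals]

def resilience_index(capital_indicators):
    """Equal-weight composite: normalise each indicator across communities,
    average indicators within a capital, then average the capital scores
    into one resilience index per community."""
    n = len(next(iter(capital_indicators.values()))[0])
    scores = {}
    for capital, indicators in capital_indicators.items():
        norm = [minmax(ind) for ind in indicators]
        scores[capital] = [sum(col) / len(norm) for col in zip(*norm)]
    return [sum(scores[c][j] for c in scores) / len(scores) for j in range(n)]

# Hypothetical indicators for 3 communities (columns), 2 capitals shown.
data = {
    "natural": [[0.2, 0.8, 0.5],   # e.g. green-space share
                [10, 30, 20]],     # e.g. water-quality score
    "social":  [[5, 9, 7]],        # e.g. civic-group membership rate
}
idx = resilience_index(data)
```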

Keywords: adverse events, community capitals, community resilience, climate change, economic development, sustainability

Procedia PDF Downloads 264
14736 Sources and Potential Ecological Risks of Heavy Metals in Sediment Samples from the Coastal Area in Ondo, Southwest Nigeria

Authors: Ogundele Lasun Tunde, Ayeku Oluwagbemiga Patrick

Abstract:

Heavy metals are released into sediments in the aquatic environment from both natural and anthropogenic sources, and they are considered a worldwide issue due to their deleterious ecological risks and disruption of the food chain. In this study, sediment samples were collected at three major sites (Awoye, Abereke and Ayetoro) along the Ondo coastal area using a Van Veen grab sampler. The concentrations of As, Cd, Cr, Cu, Fe, Mn, Ni, Pb, V and Zn were determined by Atomic Absorption Spectroscopy (AAS). The combined concentration data were subjected to the Positive Matrix Factorization (PMF) receptor approach for source identification and apportionment. The probable risks posed by heavy metals in the sediment were estimated by potential and integrated ecological risk indices. Among the measured heavy metals, Fe had average concentrations of 20.38 ± 2.86, 23.56 ± 4.16 and 25.32 ± 4.83 µg/g at the Abereke, Awoye and Ayetoro sites, respectively. The PMF resulted in the identification of four sources of heavy metals in the sediments. The resolved sources and their percentage contributions were oil exploration (39%), industrial waste/sludge (35%), detrital processes (18%) and Mn sources (8%). Oil exploration activities and industrial wastes are the major sources contributing heavy metals to the coastal sediments. The major pollutants posing ecological risks to the local aquatic ecosystem are As, Pb, Cr and Cd (40 < Ei ≤ 80), classifying the sites as moderate risk. The integrated risk values of Awoye, Abereke and Ayetoro are 231.2, 234.0 and 236.4, respectively, suggesting that the study areas had a moderate ecological risk. The study showed the suitability of the PMF receptor model for source identification of heavy metals in sediments. Also, intensive anthropogenic activities and natural sources could largely discharge heavy metals into the study area, which may increase the heavy metal content of the sediments and further contribute to the associated ecological risk, thus affecting the local aquatic ecosystem.
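The potential and integrated ecological risk indices used above follow Hakanson's formulation. A minimal sketch, with hypothetical concentrations and background values rather than the study's data:

```python
# Hakanson toxic-response factors (dimensionless) for a few metals.
T = {"As": 10, "Cd": 30, "Pb": 5, "Cr": 2}

def ecological_risk(measured, background):
    """Potential ecological risk factor Ei = Ti * Ci / C0i per metal,
    and the integrated risk RI as the sum of the Ei values."""
    Ei = {m: T[m] * measured[m] / background[m] for m in measured}
    return Ei, sum(Ei.values())

# Hypothetical sediment concentrations vs. background values (µg/g).
measured = {"As": 7.0, "Cd": 0.45, "Pb": 60.0, "Cr": 110.0}
background = {"As": 1.0, "Cd": 0.3, "Pb": 12.5, "Cr": 60.0}
Ei, RI = ecological_risk(measured, background)
```

Under the usual classification, Ei ≤ 40 is low risk, 40 < Ei ≤ 80 moderate risk, and so on; RI aggregates the per-metal factors for a site.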

Keywords: positive matrix factorization, sediments, heavy metals, sources, ecological risks

Procedia PDF Downloads 19
14735 The Classification Performance in Parametric and Nonparametric Discriminant Analysis for Class-Unbalanced Data of Diabetes Risk Groups

Authors: Lily Ingsrisawang, Tasanee Nacharoen

Abstract:

Introduction: The problem of unbalanced data sets generally appears in real-world applications. Due to unequal class distributions, many research papers have found that the performance of existing classifiers tends to be biased towards the majority class. The k-nearest neighbors nonparametric discriminant analysis is one method that was proposed for classifying unbalanced classes with good performance. Hence, the methods of discriminant analysis are of interest in investigating misclassification error rates for class-imbalanced data of three diabetes risk groups. Objective: The purpose of this study was to compare the classification performance of parametric and nonparametric discriminant analysis in a three-class classification application with class-imbalanced data of diabetes risk groups. Methods: Data from a health project for 599 staff members in a government hospital in Bangkok were obtained for the classification problem. The staff were diagnosed into one of three diabetes risk groups: non-risk (90%), risk (5%), and diabetic (5%). The original data, with the variables diabetes risk group, age, gender, cholesterol, and BMI, were analyzed and bootstrapped up to 50 and 100 samples of 599 observations each for additional estimation of the misclassification error rate. Each data set was examined for departure from multivariate normality and for equality of the covariance matrices of the three risk groups. Both the original data and the bootstrap samples show non-normality and unequal covariance matrices. The parametric linear discriminant function, the quadratic discriminant function, and the nonparametric k-nearest neighbors discriminant function were performed over the 50 and 100 bootstrap samples and applied to the original data. In finding the optimal classification rule, the prior probabilities were set either to equal proportions (0.33:0.33:0.33) or to unequal proportions with three choices: (0.90:0.05:0.05), (0.80:0.10:0.10) or (0.70:0.15:0.15). Results: The results from the 50 and 100 bootstrap samples indicated that the k-nearest neighbors approach with k = 3 or k = 4 and prior probabilities of {non-risk:risk:diabetic} as {0.90:0.05:0.05} or {0.80:0.10:0.10} gave the smallest misclassification error rate. Conclusion: The k-nearest neighbors approach is suggested for classifying three-class imbalanced data of diabetes risk groups.
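A plain k-nearest-neighbors vote, the core of the nonparametric classifier compared here, can be sketched as follows. The two-group toy data are invented, and the prior-probability adjustment used in the study is not shown:

```python
from collections import Counter
import math

def knn_predict(train_X, train_y, x, k=3):
    """Classify x by majority vote among its k nearest training points
    (Euclidean distance)."""
    dists = sorted(
        (math.dist(x, p), label) for p, label in zip(train_X, train_y)
    )
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Two invented clusters standing in for risk groups.
train_X = [(1, 1), (1, 2), (2, 1), (8, 8), (8, 9), (9, 8)]
train_y = ["non-risk"] * 3 + ["diabetic"] * 3
pred_low = knn_predict(train_X, train_y, (1.5, 1.5))
pred_high = knn_predict(train_X, train_y, (8.5, 8.5))
```

Under class imbalance, the vote can additionally be reweighted by the chosen priors relative to the training-class proportions, which is the tuning the study explores.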

Keywords: error rate, bootstrap, diabetes risk groups, k-nearest neighbors

Procedia PDF Downloads 431
14734 Boussinesq Model for Dam-Break Flow Analysis

Authors: Najibullah M, Soumendra Nath Kuiry

Abstract:

Dams and reservoirs are valued for their considerable contributions to irrigation, water supply, flood control, electricity generation, etc., which advance the prosperity and wealth of societies across the world. At the same time, a dam breach can cause a devastating flood that threatens human lives and property. Failures of large dams fortunately remain very seldom events. Nevertheless, a number of occurrences have been recorded worldwide, corresponding on average to one or two failures every year, and some of these accidents have had catastrophic consequences. It is therefore decisive to predict dam-break flow for emergency planning and preparedness, as it poses a high risk to life and property. To mitigate the adverse impact of a dam break, modeling is necessary to gain a good understanding of the temporal and spatial evolution of dam-break floods. This study mainly deals with one-dimensional (1D) dam-break modeling. Less commonly used in the hydraulic research community, another possible option for modeling rapidly varied dam-break flows is the extended Boussinesq equations (BEs), which can describe the dynamics of short waves with reasonable accuracy. Unlike the Shallow Water Equations (SWEs), the BEs take into account wave dispersion and the non-hydrostatic pressure distribution. To capture the dam-break oscillations accurately, a numerical scheme of at least fourth-order accuracy is needed to discretize the third-order dispersion terms present in the extended BEs. The scope of this work is therefore to develop a 1D Boussinesq model for dam-break flow analysis that is fourth-order accurate in both space and time, using a combined finite-volume / finite-difference scheme. The flux term is solved using a finite-volume discretization, whereas the bed source and dispersion terms are discretized using a centered finite-difference scheme. Time integration is achieved in two stages, namely a third-order Adams-Bashforth predictor stage and a fourth-order Adams-Moulton corrector stage. The 1D Boussinesq model is implemented in Python 2.7.5. The performance of the developed model is evaluated by comparison with the volume-of-fluid (VOF) based commercial model ANSYS-CFX. The developed model is used to analyze the risk of cascading dam failures similar to the Panshet dam failure of 1961 in Pune, India. Moreover, this model can predict wave overtopping more accurately than shallow water models for designing coastal protection structures.
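The two-stage time integration can be sketched on a scalar test equation. This illustrates the Adams-Bashforth predictor / Adams-Moulton corrector pairing on y' = -y (with RK4 start-up steps), not the authors' Boussinesq solver:

```python
def f(t, y):
    return -y  # test problem y' = -y, exact solution y0 * exp(-t)

def rk4_step(t, y, h):
    """One classical Runge-Kutta step, used to bootstrap the multistep scheme."""
    k1 = f(t, y)
    k2 = f(t + h/2, y + h*k1/2)
    k3 = f(t + h/2, y + h*k2/2)
    k4 = f(t + h, y + h*k3)
    return y + h*(k1 + 2*k2 + 2*k3 + k4)/6

def ab3_am4(y0, t0, t1, n):
    """Third-order Adams-Bashforth predictor followed by the
    fourth-order Adams-Moulton corrector."""
    h = (t1 - t0) / n
    t, ys = t0, [y0]
    for _ in range(2):                       # two RK4 start-up steps
        ys.append(rk4_step(t, ys[-1], h)); t += h
    fs = [f(t0 + i*h, ys[i]) for i in range(3)]
    for i in range(2, n):
        t = t0 + i*h
        # AB3 predictor
        yp = ys[-1] + h/12*(23*fs[-1] - 16*fs[-2] + 5*fs[-3])
        fp = f(t + h, yp)
        # AM4 corrector (implicit formula evaluated at the predicted value)
        yc = ys[-1] + h/24*(9*fp + 19*fs[-1] - 5*fs[-2] + fs[-3])
        ys.append(yc); fs.append(f(t + h, yc))
    return ys[-1]

y_end = ab3_am4(1.0, 0.0, 1.0, 100)  # should be close to exp(-1)
```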

Keywords: Boussinesq equation, coastal protection, dam-break flow, one-dimensional model

Procedia PDF Downloads 228
14733 Oral Contraceptive Pill-Associated Hypertension in Reproductive-Age Women in the Andalas Public Health Center, Padang, Indonesia

Authors: Armenia Nazar, Sally M. J. Anggelya, Rose Dinda

Abstract:

Hypertension prevalence in Indonesia has increased over time since 2013, especially among women. This cross-sectional analytic study was conducted to observe the incidence of hypertension among reproductive-age women (20-49 years old) with several risk factors who use contraceptive pills. Data were collected from June to October 2016 in the Andalas Public Health Center, East Padang District, Indonesia. A total of 167 respondents, recruited using a consecutive sampling technique, participated in this study. Data on social demography, contraceptive use, duration of use, and hypertension risk factors (age, family history, central obesity, body mass index, physical activity, and stress) were collected and analyzed statistically using Chi-square analysis. Significance was taken at p < 0.05. Results showed that women using contraceptive pills tended to develop hypertension (OR = 3.90 and p < 0.001). In addition, women with a family history had an OR of 6.77 (p = 0.09), mild physical activity an OR of 3.67 (p = 0.33), moderate physical activity an OR of 3.33 (p = 0.16), and stress an OR of 5.11 (p = 0.18). These results indicate that contraceptive pill users are at 3.9 times higher risk of developing hypertension than non-users, especially those with a family history of hypertension. The other risk factors were not significantly associated with hypertension risk in these reproductive-age women.
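The odds ratios quoted above come from 2x2 tables. A minimal sketch of the computation with a Wald confidence interval, using invented counts rather than the study's data:

```python
import math

def odds_ratio(a, b, c, d, z=1.96):
    """Odds ratio from a 2x2 table with a Wald 95% CI:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, (lo, hi)

# Hypothetical counts (pill users with/without hypertension,
# non-users with/without hypertension), not the study data.
or_, ci = odds_ratio(39, 41, 20, 67)
```

An OR whose confidence interval excludes 1 corresponds to a statistically significant association, matching the p-value criterion used in the abstract.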

Keywords: hypertension, oral contraceptive, reproductive-age women, risk factors

Procedia PDF Downloads 296
14732 A Fourier Method for Risk Quantification and Allocation of Credit Portfolios

Authors: Xiaoyu Shen, Fang Fang, Chujun Qiu

Abstract:

Herewith we present a Fourier method for credit risk quantification and allocation in the factor-copula model framework. The key insight is that, compared to directly computing the cumulative distribution function of the portfolio loss via Monte Carlo simulation, it is in fact more efficient to calculate the transform of the distribution function in the Fourier domain; inverting back to the real domain can then be done in just one step and semi-analytically, thanks to the popular COS method (with some adjustments). We also show that the Euler risk allocation problem can be solved in the same way, since it can be transformed into the problem of evaluating a conditional cumulative distribution function. Once the conditional or unconditional cumulative distribution function is known, one can easily calculate various risk metrics. The proposed method not only fills, to the best of our knowledge, a niche in the literature of accurate numerical methods for risk allocation, but may also serve as a much faster alternative to Monte Carlo simulation for risk quantification in general. It can cope with various factor-copula model choices, which we demonstrate via examples of a two-factor Gaussian copula and a two-factor Gaussian-t hybrid copula. The fast error convergence is proved mathematically and then verified by numerical experiments, in which Value-at-Risk, Expected Shortfall, and conditional Expected Shortfall are taken as examples of commonly used risk metrics. The calculation speed and accuracy are tested to be significantly superior to Monte Carlo simulation for real-sized portfolios. The computational complexity is, by design, primarily driven by the number of factors instead of the number of obligors, as it is in Monte Carlo simulation. The limitation of this method lies in the "curse of dimensionality" that is intrinsic to multi-dimensional numerical integration, which, however, can be relaxed with the help of dimension-reduction techniques and/or parallel computing, as we will demonstrate in a separate paper. The potential applications of this method cover a wide range: from credit derivatives pricing to economic capital calculation of the banking book, default risk charge and incremental risk charge computation of the trading book, and even risk types other than credit risk.
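The COS inversion step can be sketched on a known case: recovering the standard normal density from its characteristic function via the Fourier-cosine expansion of Fang and Oosterlee. The truncation range [a, b] and number of terms N are illustrative choices:

```python
import cmath, math

def cos_density(phi, x, a=-10.0, b=10.0, N=64):
    """Recover a density at point x from its characteristic function phi
    with the COS method: f(x) ~ sum_k' F_k cos(k*pi*(x-a)/(b-a)), where
    F_k = (2/(b-a)) * Re{ phi(k*pi/(b-a)) * exp(-i*k*pi*a/(b-a)) }
    and the k = 0 term carries half weight."""
    total = 0.0
    for k in range(N):
        u = k * math.pi / (b - a)
        Fk = (2.0 / (b - a)) * (phi(u) * cmath.exp(-1j * u * a)).real
        if k == 0:
            Fk *= 0.5
        total += Fk * math.cos(u * (x - a))
    return total

# Standard normal: phi(u) = exp(-u^2 / 2); density at 0 is 1/sqrt(2*pi).
f0 = cos_density(lambda u: cmath.exp(-u**2 / 2), 0.0)
```

The exponential decay of the coefficients is what makes the error convergence fast; integrating the recovered density term by term gives the CDF, from which quantile-type risk metrics follow.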

Keywords: credit portfolio, risk allocation, factor copula model, the COS method, Fourier method

Procedia PDF Downloads 160
14731 A Double Acceptance Sampling Plan for Truncated Life Test Having Exponentiated Transmuted Weibull Distribution

Authors: A. D. Abdellatif, A. N. Ahmed, M. E. Abdelaziz

Abstract:

The main purpose of this paper is to design a double acceptance sampling plan under the time truncated life test when the product lifetime follows an exponentiated transmuted Weibull distribution. Here, the motive is to meet both the consumer’s risk and producer’s risk simultaneously at the specified quality levels, while the termination time is specified. A comparison between the results of the double and single acceptance sampling plans is conducted. We demonstrate the applicability of our results to real data sets.
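The acceptance probability of a double sampling plan can be sketched with a generic binomial model. In a truncated life test, the defect probability p would come from the assumed lifetime distribution (here, the exponentiated transmuted Weibull) evaluated at the termination time; the plan parameters below are hypothetical:

```python
from math import comb

def binom_pmf(d, n, p):
    return comb(n, d) * p**d * (1 - p)**(n - d)

def binom_cdf(k, n, p):
    return sum(binom_pmf(d, n, p) for d in range(k + 1))

def pa_double(n1, c1, n2, c2, p):
    """Probability of lot acceptance under a double sampling plan:
    accept outright if the first sample of n1 has <= c1 failures;
    otherwise draw a second sample of n2 and accept if the combined
    failure count stays <= c2."""
    pa = binom_cdf(c1, n1, p)
    for d1 in range(c1 + 1, c2 + 1):
        pa += binom_pmf(d1, n1, p) * binom_cdf(c2 - d1, n2, p)
    return pa
```

Evaluating pa_double at the producer's and consumer's quality levels is how one checks that both risks are met simultaneously.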

Keywords: double sampling plan, single sampling plan, producer's risk, consumer's risk, exponentiated transmuted Weibull distribution, time truncated experiment, Marshall-Olkin

Procedia PDF Downloads 485
14730 Experimental Evaluation of Succinct Ternary Tree

Authors: Dmitriy Kuptsov

Abstract:

Tree data structures, such as binary or, more generally, k-ary trees, are essential in computer science. The applications of these data structures range from data search and retrieval to sorting and ranking algorithms. Naive implementations of these data structures can consume prohibitively large volumes of random access memory, limiting their applicability in certain solutions. In these cases, a more advanced representation of these data structures is essential. In this paper we present the design of a compact version of the ternary tree data structure and present the results of an experimental evaluation using the static dictionary problem. We compare these results with results for binary and regular ternary trees. The conducted evaluation study shows that our design, in the best case, consumes up to 12 times less memory (for the dictionary used in our experimental evaluation) than a regular ternary tree, and in certain configurations shows performance comparable to that of regular ternary trees. We have evaluated the performance of the algorithms on both 32- and 64-bit operating systems.
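For reference, a conventional pointer-based ternary search tree, the kind of baseline the succinct design is compared against, can be sketched as follows; this is not the compact encoding studied in the paper:

```python
class TSTNode:
    """One node of a pointer-based ternary search tree: a character,
    three child links (lo/eq/hi), and an end-of-word flag."""
    __slots__ = ("ch", "lo", "eq", "hi", "end")
    def __init__(self, ch):
        self.ch, self.lo, self.eq, self.hi, self.end = ch, None, None, None, False

def insert(node, word, i=0):
    ch = word[i]
    if node is None:
        node = TSTNode(ch)
    if ch < node.ch:
        node.lo = insert(node.lo, word, i)
    elif ch > node.ch:
        node.hi = insert(node.hi, word, i)
    elif i + 1 < len(word):
        node.eq = insert(node.eq, word, i + 1)
    else:
        node.end = True
    return node

def contains(node, word, i=0):
    if node is None:
        return False
    ch = word[i]
    if ch < node.ch:
        return contains(node.lo, word, i)
    if ch > node.ch:
        return contains(node.hi, word, i)
    if i + 1 < len(word):
        return contains(node.eq, word, i + 1)
    return node.end

root = None
for w in ["cat", "cap", "car", "dog"]:
    root = insert(root, w)
```

The per-node overhead of the three pointers is exactly the cost a succinct (bit-encoded) representation is designed to eliminate.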

Keywords: algorithms, data structures, succinct ternary tree, performance evaluation

Procedia PDF Downloads 158
14729 Constraining Bank Risk: International Evidence on the Role of Bank Capital and Charter Value

Authors: Mamiza Haq

Abstract:

This paper examines the relevance of bank capital and charter value to bank insolvency and liquidity risks. Using an unbalanced panel of 2,111 unique local banks across 22 countries over 1998-2012, we find that both bank capital and charter value lower insolvency and liquidity risks, but this effect varies among conventional, Islamic, and Islamic-window banks. The risk-constraining effect of bank capital becomes more prominent after the 2007-2008 global financial crisis (GFC). Moreover, the relationships vary when conditioned upon other key bank-specific characteristics. For instance, the effect of capital on risk reduction diminishes in the presence of high charter value for conventional-G7 banks during the GFC and for Islamic-window banks in the pre-GFC period, respectively. Our findings have important policy implications related to bank safety. The results are robust to a range of robustness tests.

Keywords: bank capital, charter value, risk, financial crisis

Procedia PDF Downloads 272
14728 Combined Surface Tension and Natural Convection of Nanofluids in a Square Open Cavity

Authors: Habibis Saleh, Ishak Hashim

Abstract:

Combined surface-tension-driven and natural convection heat transfer in an open cavity is studied numerically in this article. The cavity is filled with water-Cu nanofluid. The left wall is kept at a low temperature, the right wall at a high temperature, and the bottom and top walls are adiabatic. The top free surface is assumed to be flat and non-deformable. A finite difference method is applied to solve the dimensionless governing equations. It is found that adding the nanoparticles has an insignificant effect at about Ma_bf = 250.

Keywords: natural convection, Marangoni convection, nanofluids, square open cavity

Procedia PDF Downloads 545
14727 Equity Risk Premiums and Risk Free Rates in Modelling and Prediction of Financial Markets

Authors: Mohammad Ghavami, Reza S. Dilmaghani

Abstract:

This paper presents an adaptive framework for modelling financial markets using equity risk premiums, risk-free rates and volatilities. The recorded economic factors are initially used to train four adaptive filters over a certain limited period of time in the past. Once the systems are trained, the adjusted coefficients are used for modelling and prediction of an important financial market index. Two different approaches, based on the least mean squares (LMS) and recursive least squares (RLS) algorithms, are investigated. A performance analysis of each method in terms of the mean squared error (MSE) is presented and the results are discussed. Computer simulations carried out using recorded data show MSEs of 4% and 3.4% for next-month prediction using the LMS and RLS adaptive algorithms, respectively. For twelve-month prediction, the RLS method shows better trend estimation than the LMS algorithm.
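The LMS adaptation rule can be sketched on a toy system-identification task; the filter length, step size, and target coefficients are invented, and this stands in for, rather than reproduces, the market-index setting:

```python
import random

def lms_identify(w_true, mu=0.05, steps=3000, seed=1):
    """LMS adaptation: estimate an unknown FIR response w_true from
    input/output pairs via the stochastic-gradient rule w += mu * e * x."""
    rng = random.Random(seed)
    n = len(w_true)
    w = [0.0] * n
    for _ in range(steps):
        x = [rng.uniform(-1, 1) for _ in range(n)]        # input vector
        d = sum(wt * xi for wt, xi in zip(w_true, x))     # desired output
        y = sum(wi * xi for wi, xi in zip(w, x))          # filter output
        e = d - y                                         # prediction error
        w = [wi + mu * e * xi for wi, xi in zip(w, x)]    # LMS update
    return w

w = lms_identify([0.5, -0.3, 0.2])  # converges toward the true coefficients
```

RLS replaces the fixed step mu with a recursively updated inverse-correlation matrix, trading higher per-step cost for faster convergence, which is consistent with the MSE ordering reported above.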

Keywords: adaptive methods, LMS, RLS, MSE, prediction of financial markets

Procedia PDF Downloads 331
14726 On a Theoretical Framework for Language Learning Apps Evaluation

Authors: Juan Manuel Real-Espinosa

Abstract:

This paper addresses the first step in evaluating language learning apps: which theoretical framework to adopt when designing the app evaluation framework. There is no single answer, since several options could be proposed. However, the question to be clarified is to what extent the learning design of apps is based on a specific learning approach or, on the contrary, on a fusion of elements from several theoretical proposals and paradigms, such as m-learning, mobile-assisted language learning, and a number of theories of language acquisition. The present study suggests that the reality is closer to the second assumption. This implies that the theoretical framework against which the learning design of the apps is evaluated must also be a hybrid theoretical framework, which integrates evaluation criteria from the different theories involved in language learning through mobile applications.

Keywords: mobile-assisted language learning, action-oriented approach, apps evaluation, post-method pedagogy, second language acquisition

Procedia PDF Downloads 200
14725 A Lower Dose of Topiramate with Enough Antiseizure Effect: A Realistic Therapeutic Range of Topiramate

Authors: Seolah Lee, Yoohyk Jang, Soyoung Lee, Kon Chu, Sang Kun Lee

Abstract:

Objective: The International League Against Epilepsy (ILAE) currently suggests a topiramate serum level range of 5-20 mg/L. However, numerous institutions have observed a substantial drug response at lower levels. This study aims to investigate the correlation between topiramate serum levels, drug responsiveness, and adverse events in order to establish a more accurate and tailored therapeutic range. Methods: We retrospectively analyzed topiramate serum samples collected between January 2017 and January 2022 at Seoul National University Hospital. Clinical data, including serum levels, antiseizure regimens, seizure frequency, and adverse events, were collected. Patient responses were categorized as "insufficient" (reduction in seizure frequency < 50%) or "sufficient" (reduction ≥ 50%). Within the "sufficient" group, further subdivisions included seizure-free and tolerable-seizure subgroups. A population pharmacokinetic model estimated serum levels from spot measurements. ROC curve analysis determined the optimal serum level cut-off. Results: A total of 389 epilepsy patients, with 555 samples, were reviewed, with a mean dose of 178.4±117.9 mg/day and a serum level of 3.9±2.8 mg/L. Of the samples, only 5.6% (n=31) exhibited an insufficient response, with a mean serum level of 3.6±2.5 mg/L. In contrast, 94.4% (n=524) of samples demonstrated a sufficient response, with a mean serum level of 4.0±2.8 mg/L. This difference was not statistically significant (p = 0.45). Among the 78 reported adverse events, logistic regression analysis identified a significant association between ataxia and serum concentration (p = 0.04), with an optimal cut-off value of 6.5 mg/L. In the subgroup of patients receiving monotherapy, those in the tolerable-seizure group exhibited a significantly higher serum level than the seizure-free group (4.8±2.0 mg/L vs 3.4±2.3 mg/L, p < 0.01). Notably, patients in the tolerable-seizure group displayed a higher likelihood of progressing to drug-resistant epilepsy during follow-up visits than the seizure-free group. Significance: This study proposed an optimal therapeutic concentration for topiramate based on the patient's responsiveness to the drug and the incidence of adverse effects. We employed a population pharmacokinetic model and analyzed topiramate serum levels to recommend a serum level below 6.5 mg/L to mitigate the risk of ataxia-related side effects. Our findings also indicated that topiramate dose elevation is unnecessary for suboptimal responders, as the drug's effectiveness plateaus at minimal doses.
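The ROC cut-off selection can be sketched with Youden's J statistic, a common way to pick an "optimal" threshold. The serum levels and ataxia labels below are invented, arranged so the cut-off lands at 6.5 mg/L merely for illustration:

```python
def youden_cutoff(scores, labels):
    """Pick the threshold maximising Youden's J = sensitivity + specificity - 1,
    the usual 'optimal cut-off' read off an ROC curve (labels: 1 = event)."""
    best_j, best_t = -1.0, None
    for t in sorted(set(scores)):          # candidate thresholds
        tp = sum(s >= t and l for s, l in zip(scores, labels))
        fn = sum(s < t and l for s, l in zip(scores, labels))
        tn = sum(s < t and not l for s, l in zip(scores, labels))
        fp = sum(s >= t and not l for s, l in zip(scores, labels))
        j = tp / (tp + fn) + tn / (tn + fp) - 1
        if j > best_j:
            best_j, best_t = j, t
    return best_t, best_j

# Hypothetical serum levels (mg/L); label 1 = ataxia reported.
levels = [2.1, 3.0, 4.2, 5.0, 6.5, 7.1, 8.0, 9.3]
ataxia = [0,   0,   0,   0,   1,   1,   1,   1]
cut, j = youden_cutoff(levels, ataxia)
```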

Keywords: topiramate, therapeutic range, low dose, antiseizure effect

Procedia PDF Downloads 50
14724 Satisfaction Evaluation on the Fundamental Public Services for a Large-Scale Indemnificatory Residential Community: A Case Study of Nanjing

Authors: Dezhi Li, Peng Cui, Bo Zhang, Tengyuan Chang

Abstract:

To solve the housing problem for low-income families, the construction of affordable housing is booming in China. However, for various reasons, the service facilities and systems in indemnificatory residential communities face many problems. This article establishes a satisfaction evaluation system for the fundamental public services of large-scale indemnificatory residential communities, based on national standards and local criteria, and develops evaluation methods and processes. Finally, for the case of the Huagang project in Nanjing, the satisfaction with basic public services is calculated according to a survey of local residents.

Keywords: indemnificatory residential community, public services, satisfaction evaluation, structural equation modeling

Procedia PDF Downloads 358
14723 Application of Public Access Two-Dimensional Hydrodynamic and Distributed Hydrological Models for Flood Forecasting in Ungauged Basins

Authors: Ahmad Shayeq Azizi, Yuji Toda

Abstract:

In Afghanistan, floods are the most frequent and recurrent events among natural disasters. At the same time, the lack of monitoring data is a severe problem, which increases the difficulty of designing appropriate flood countermeasures and of flood forecasting. This study simulates flood inundation in the Harirud River Basin by applying a distributed hydrological model, the Integrated Flood Analysis System (IFAS), and a 2D hydrodynamic model, the International River Interface Cooperative (iRIC), based on satellite rainfall combined with historical peak discharge and globally accessible data. The results of the simulation can predict the inundation area, depth and velocity, and hardware countermeasures such as the impact of levee installation can be discussed using the present method. The methodology proposed in this study is suitable for areas where hydrological and geographical data, including river survey data, are poorly observed.

Keywords: distributed hydrological model, flood inundation, hydrodynamic model, ungauged basins

Procedia PDF Downloads 164
14722 Integrating Individual and Structural Health Risk: A Social Identity Perspective on the HIV/AIDS Pandemic in Sub-Saharan Africa

Authors: Orla Muldoon, Tamaryn Nicolson, Mike Quayle, Aisling O'Donnell

Abstract:

Psychology most often considers the role of experience and behaviour in shaping health at the individual level. Epidemiology, on the other hand, has long considered risk at the wider group or structural level. Here we use the social identity approach to integrate group-level risk with individual-level behaviour. Using a social identity approach, we demonstrate that group- or macro-level factors impact implicitly and profoundly, in everyday ways, at the level of individuals, via social identities. We illustrate how identities related to race, gender and inequality intersect to affect HIV/AIDS risk and AIDS treatment behaviours, and how social identity processes drive the stigmatising consequences of HIV and AIDS as well as promote positive and effective interventions. We conclude by arguing that the social identity approach offers the field an explanatory framework that conceptualizes how social and political forces intersect with individual identity and agency to affect human health.

Keywords: social identity approach, HIV/AIDS, Africa, HIV risk, race, gender

Procedia PDF Downloads 524
14721 Familial Exome Sequencing to Decipher the Complex Genetic Basis of Holoprosencephaly

Authors: Artem Kim, Clara Savary, Christele Dubourg, Wilfrid Carre, Houda Hamdi-Roze, Valerie Dupé, Sylvie Odent, Marie De Tayrac, Veronique David

Abstract:

Holoprosencephaly (HPE) is a rare congenital brain malformation resulting from the incomplete separation of the two cerebral hemispheres. It is characterized by a wide phenotypic spectrum and a high degree of locus heterogeneity. Genetic defects in 16 genes have already been implicated in HPE, but they account for only 30% of cases, suggesting that a large part of the genetic factors remains to be discovered. HPE has recently been redefined as a complex multigenic disorder, requiring the joint effect of multiple mutational events in genes belonging to one or several developmental pathways. The onset of HPE may result from the accumulated effects of multiple rare variants in functionally related genes, each conferring a moderate increase in the risk of HPE onset. In order to decipher the genetic basis of HPE, unconventional patterns of inheritance involving multiple genetic factors need to be considered. The primary objective of this study was to uncover possible disease-causing combinations of multiple rare variants underlying HPE by performing trio-based Whole Exome Sequencing (WES) of familial cases for which no molecular diagnosis could be established. 39 families were selected with no fully penetrant causal mutation in known HPE genes, no chromosomal aberrations or copy number variants, and no implication of environmental factors. As the main challenge was to identify disease-related variants among the large number of non-pathogenic polymorphisms detected by a standard WES workflow, a novel variant prioritization approach was established. It combined WES filtering with complementary gene-level approaches: a transcriptome-driven strategy (RNA-Seq data) and a clinically driven strategy (public clinical data). Briefly, a filtering approach was performed to select variants compatible with disease segregation, population frequency, and pathogenicity prediction, yielding an exhaustive list of rare deleterious variants.
The exome search space was then reduced by restricting the analysis to candidate genes identified by either the transcriptome-driven strategy (genes sharing highly similar expression patterns with known HPE genes during cerebral development) or the clinically driven strategy (genes associated with phenotypes of interest overlapping with HPE). Deeper analyses of candidate variants were then performed on a family-by-family basis. These included the exploration of clinical information, expression studies, variant characteristics, recurrence of mutated genes, and available biological knowledge. A novel bioinformatics pipeline was designed. Applied to the 39 families, this final integrated workflow identified an average of 11 candidate variants per family. Most candidate variants were inherited from asymptomatic parents, suggesting a multigenic inheritance pattern requiring the association of multiple mutational events. The manual analysis highlighted 5 new strong HPE candidate genes recurring in distinct families. Functional validations of these genes are foreseen.
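As a rough illustration, the filtering logic described above (rarity, predicted deleteriousness, membership in a candidate gene set) can be sketched as follows; the field names, thresholds, and example variants are hypothetical assumptions, not taken from the study's actual pipeline.

```python
# Hypothetical sketch of a rare-variant prioritization filter.
# Thresholds (1% population frequency, deleteriousness score >= 20) and
# field names are illustrative assumptions, not the study's cutoffs.

def prioritize(variants, candidate_genes, max_freq=0.01, min_score=20.0):
    """Keep rare, predicted-deleterious variants in candidate genes."""
    kept = []
    for v in variants:
        if (v["pop_freq"] <= max_freq
                and v["del_score"] >= min_score
                and v["gene"] in candidate_genes):
            kept.append(v)
    return kept

variants = [
    {"gene": "GENE_A", "pop_freq": 0.0002, "del_score": 25.1},  # rare, damaging
    {"gene": "GENE_A", "pop_freq": 0.15,   "del_score": 30.0},  # too common
    {"gene": "GENE_B", "pop_freq": 0.001,  "del_score": 5.0},   # likely benign
]
hits = prioritize(variants, candidate_genes={"GENE_A"})
```

In the actual workflow, the candidate gene set would come from the transcriptome-driven or clinically driven strategies, and segregation within each trio would be checked as an additional filter.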

Keywords: complex genetic disorder, holoprosencephaly, multiple rare variants, whole exome sequencing

Procedia PDF Downloads 200
14720 Risk Factors for Fall in Elderly with Diabetes Mellitus Type 2 in Jeddah Saudi Arabia 2022: A Cross-Sectional Study

Authors: Rami S. Alasmari, Abdullah Al Zahrani, Hattan A. Hassani, Nawwaf A. Almalky, Abdullah F. Bokhari, Alwalied A. Hafez

Abstract:

Diabetes mellitus type 2 (DMT2) is a major chronic condition that is common among elderly people, with multiple potential complications that could contribute to falls. However, this relationship is not well understood; thus, the aim of this study is to determine whether diabetes is an independent risk factor for falls in the elderly. In this observational cross-sectional study, 309 diabetic patients aged 60 or more who visited the primary healthcare centers of the Ministry of National Guard Health Affairs in Jeddah were chosen via a convenience sampling method. A semi-structured fall risk assessment questionnaire and the Falls Efficacy Scale were used to collect the data. The mean age of the participants was 68.5 (SD: 7.4) years. Among the participants, 48.2% had experienced a fall before, and 63.1% of them suffered falls in the past 12 months. The results showed that gait problems were independently associated with a higher likelihood of falls among elderly patients (OR = 1.98, 95% CI 1.08 to 3.62, p = 0.026). This paper suggests that diabetes mellitus is an independent fall risk factor among the elderly. Therefore, identifying such patients as being at higher risk and promptly referring them to a specialist falls clinic is recommended.
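For readers unfamiliar with how such figures relate, the reported odds ratio and its confidence interval can be reproduced from a logistic-regression coefficient and its standard error; the coefficient and standard error below are back-calculated assumptions chosen to match the reported OR = 1.98 (95% CI 1.08 to 3.62), not values from the paper.

```python
import math

# An odds ratio and 95% CI derived from a logistic-regression coefficient.
# beta and se are back-calculated assumptions, not the study's estimates.
beta, se = 0.683, 0.308  # assumed log-odds coefficient for gait problems

odds_ratio = math.exp(beta)
ci_low = math.exp(beta - 1.96 * se)   # lower 95% limit on the OR scale
ci_high = math.exp(beta + 1.96 * se)  # upper 95% limit on the OR scale

print(f"OR = {odds_ratio:.2f}, 95% CI {ci_low:.2f} to {ci_high:.2f}")
```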

Keywords: diabetes, fall, elderly, risk factors

Procedia PDF Downloads 100
14719 A Novel Approach for Energy Utilisation in a Pyrolysis Plant

Authors: S. Murugan, Bohumil Horak

Abstract:

Pyrolysis is one of the possible technologies for deriving energy from waste organic substances. In recent years, pilot-scale and demonstration plants have been installed in a few countries. However, the heat energy lost during the process is not effectively utilized, resulting in smaller savings of energy and money. This paper proposes a novel approach to integrate a combined heat and power (CHP) unit and reduce the primary energy consumption in a tyre pyrolysis pilot plant. The proposal primarily uses the micro combined heat and power concept, which will help to produce both heat and power in the process.
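One common way to quantify the benefit of adding a CHP unit is the primary energy savings (PES) relative to separate heat and power production, in the style of the EU cogeneration directive formula; the efficiencies below are illustrative assumptions, not data from the pilot plant.

```python
def primary_energy_savings(eta_e, eta_h, ref_eta_e=0.40, ref_eta_h=0.90):
    """PES of cogeneration vs. separate production (EU 2004/8/EC-style formula).

    eta_e, eta_h: electrical and thermal efficiency of the CHP unit.
    ref_eta_*: reference efficiencies for separate production (assumed values).
    """
    return 1.0 - 1.0 / (eta_e / ref_eta_e + eta_h / ref_eta_h)

# Illustrative micro-CHP figures (assumptions, not the pilot plant's data):
pes = primary_energy_savings(eta_e=0.25, eta_h=0.55)
print(f"primary energy savings: {pes:.1%}")
```

A positive PES indicates that the cogeneration unit uses less fuel than producing the same heat and electricity separately at the reference efficiencies.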

Keywords: pyrolysis, waste tyres, waste plastics, biomass, waste heat

Procedia PDF Downloads 324
14718 Risk Assessment for Aerial Package Delivery

Authors: Haluk Eren, Ümit Çelik

Abstract:

Recent developments in unmanned aerial vehicles (UAVs) have begun to attract intense interest. UAVs are now used for many different applications, from military to civilian. Some online retail and logistics companies are testing UAV delivery. UAVs have great potential to reduce the cost and time of deliveries and to respond to emergencies quickly. Despite these clear benefits, only a few studies have addressed the routing of UAVs for package delivery. Transporting goods from one place to another may involve many hazards along the delivery route, such as falling onto ground objects or colliding with air obstacles. This places drone deliveries within the scope of shipping insurance. On the other hand, air traffic management formerly did not need to account for unmanned aerial vehicles, but they are now a reality in the airspace. In this study, the main goal is to conduct a risk analysis of drone-based package delivery services, based on delivery routes.
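The route-based risk analysis described above can be illustrated by aggregating per-cell risk from a territory risk map along candidate paths; the grid values and the additive aggregation model below are assumptions for illustration only, not the authors' method or data.

```python
# Minimal sketch of route risk estimation over a territory risk map.
# Cell values represent assumed per-cell ground risk (e.g. population,
# obstacles); total route risk is taken as the sum over traversed cells.
risk_map = [
    [0.1, 0.7, 0.9, 0.1],
    [0.1, 0.6, 0.8, 0.1],
    [0.1, 0.1, 0.1, 0.1],
]

def route_risk(path):
    """Total risk accumulated along a path of (row, col) grid cells."""
    return sum(risk_map[r][c] for r, c in path)

direct = [(0, 0), (0, 1), (0, 2), (0, 3)]  # short, but crosses high-risk cells
detour = [(0, 0), (1, 0), (2, 0), (2, 1),
          (2, 2), (2, 3), (1, 3), (0, 3)]  # longer, but over low-risk cells

print(route_risk(direct), route_risk(detour))
```

Even with this toy model, the longer detour accumulates less risk than the direct route, which is the trade-off a route risk estimator would expose to an insurer or operator.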

Keywords: aerial package delivery, insurance estimation, territory risk map, unmanned aerial vehicle, route risk estimation, drone risk assessment, drone package delivery

Procedia PDF Downloads 336
14717 Accelerated Evaluation of Structural Reliability under Tsunami Loading

Authors: Sai Hung Cheung, Zhe Shao

Abstract:

It is of great interest to quantify the risk to structural dynamic systems due to earthquake-induced tsunamis, in view of the recent earthquake-induced tsunamis in Padang (2004) and Tohoku (2011), which brought huge losses of lives and property. Despite continuous advances in the computational simulation of tsunamis and wave-structure interaction modeling, it remains computationally challenging to evaluate the reliability of a structural dynamic system when uncertainties related to the system and its modeling are taken into account. Failure of the structure in a tsunami-wave-structure system is defined as any response quantity of the system exceeding specified thresholds while the structure is subjected to dynamic wave impact from an earthquake-induced tsunami. In this paper, an approach is proposed based on a novel integration of a recently proposed moving least squares response surface method for stochastic sampling with the Subset Simulation algorithm. The effectiveness of the proposed approach is assessed by comparing its results with those obtained from the Subset Simulation algorithm without the response surface approach.
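Subset Simulation, the sampling algorithm named above, estimates a rare failure probability as a product of larger conditional probabilities estimated level by level. A one-dimensional sketch follows; the limit state (failure when a standard normal variable exceeds 3, true probability about 1.35e-3), sample sizes, and Metropolis proposal are illustrative assumptions, not the tsunami-structure model.

```python
import math
import random

def g(x):
    """Limit state: failure occurs when g(x) <= 0, i.e. when x > 3."""
    return 3.0 - x

def subset_simulation(n=2000, p0=0.1, seed=7):
    random.seed(seed)
    samples = [random.gauss(0.0, 1.0) for _ in range(n)]
    p_f = 1.0
    for _ in range(20):  # cap on the number of intermediate levels
        m = len(samples)
        thresh = sorted(g(x) for x in samples)[int(p0 * m)]
        if thresh <= 0.0:  # failure region reached at this level
            return p_f * sum(1 for x in samples if g(x) <= 0.0) / m
        p_f *= p0  # each intermediate level contributes a factor p0
        seeds = [x for x in samples if g(x) <= thresh]
        samples = []
        for s in seeds:  # Metropolis chains conditioned on g(x) <= thresh
            x = s
            for _ in range(m // len(seeds)):
                cand = x + random.gauss(0.0, 1.0)
                ratio = math.exp((x * x - cand * cand) / 2.0)  # N(0,1) density ratio
                if random.random() < ratio and g(cand) <= thresh:
                    x = cand
                samples.append(x)
    return p_f

p_fail = subset_simulation()
print(f"estimated failure probability: {p_fail:.2e}")
```

The estimate is stochastic, so individual runs scatter around the true tail probability; the response surface idea in the paper replaces expensive limit-state evaluations with a cheap surrogate inside this loop.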

Keywords: response surface, stochastic simulation, structural reliability, tsunami, risk

Procedia PDF Downloads 672
14716 VaR or TCE: Explaining the Preferences of Regulators

Authors: Silvia Faroni, Olivier Le Courtois, Krzysztof Ostaszewski

Abstract:

While much research concentrates on the merits of VaR and TCE, the two most classic risk indicators used by financial institutions, little has been written to explain why regulators favor the choice of VaR or TCE in their sets of rules. In this paper, we investigate the preferences of regulators with the aim of understanding why, for instance, a VaR with a given confidence level is ultimately retained. Further, this paper provides equivalence rules that explain how a given choice of VaR can be equivalent to a given choice of TCE. We then introduce a new risk indicator that extends TCE by providing a more versatile weighting of the constituents of probability distribution tails. All of our results are illustrated using the generalized Pareto distribution.
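For the generalized Pareto distribution used in the paper's illustrations, both indicators have simple closed forms when the shape parameter satisfies 0 < xi < 1; a minimal sketch with arbitrary illustrative parameter values (not the paper's calibration):

```python
# Closed-form VaR and TCE (tail conditional expectation) for a generalized
# Pareto distribution GPD(xi, sigma), valid for 0 < xi < 1.

def gpd_var(p, xi, sigma):
    """p-quantile of the GPD: VaR_p = (sigma / xi) * ((1 - p)**(-xi) - 1)."""
    return (sigma / xi) * ((1.0 - p) ** (-xi) - 1.0)

def gpd_tce(p, xi, sigma):
    """TCE_p = E[X | X > VaR_p] = (VaR_p + sigma) / (1 - xi)."""
    return (gpd_var(p, xi, sigma) + sigma) / (1.0 - xi)

xi, sigma = 0.25, 1.0  # illustrative shape and scale
var99 = gpd_var(0.99, xi, sigma)
tce99 = gpd_tce(0.99, xi, sigma)
print(f"VaR(99%) = {var99:.3f}, TCE(99%) = {tce99:.3f}")
```

The gap between TCE and VaR grows with the shape parameter xi, which is one concrete way to see how the two indicators encode different regulatory attitudes toward tail heaviness.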

Keywords: generalized Pareto distribution, generalized tail conditional expectation, regulator preferences, risk measure

Procedia PDF Downloads 166
14715 Performance Augmentation of a Combined Cycle Power Plant with Waste Heat Recovery and Solar Energy

Authors: Mohammed A. Elhaj, Jamal S. Yassin

Abstract:

At present, energy crises are considered a severe problem across the world. To protect the global environment and maintain ecological balance, energy saving is one of the most vital issues from the viewpoint of fuel consumption. As industrial sectors everywhere continue their efforts to improve energy efficiency, recovering waste heat losses provides an attractive opportunity for an emission-free and less costly energy resource. On the other hand, the use of solar energy has become more pressing, particularly after the sharp rise in prices and the depletion of conventional energy sources. Therefore, significant and concrete efforts should be made toward waste heat recovery as well as solar energy. For these reasons, this investigation studies and analyzes the performance of a combined cycle power plant in which the Heat Recovery Steam Generator (HRSG) obtains its energy from the waste heat of a gas turbine unit. Evaluation of the plant's performance is based on the thermal efficiencies of the main components, in addition to a second law analysis considering the exergy destruction of all components. The contribution factors, including solar as well as wasted energy, are considered in the calculations. The final results show significant exergy destruction in the solar concentrator and the combustion chamber of the gas turbine unit, while other components, such as the compressor, gas turbine, steam turbine, and heat exchangers, exhibit insignificant exergy destruction. Also, solar energy can contribute about 27% of the input energy to the plant, while the energy lost with exhaust gases can reach about 64% in the maximum cases.

Keywords: solar energy, environment, efficiency, waste heat, steam generator, performance, exergy destruction

Procedia PDF Downloads 294
14714 A Comparative Study of Motion Events Encoding in English and Italian

Authors: Alfonsina Buoniconto

Abstract:

The aim of this study is to investigate the degree of cross-linguistic and intra-linguistic variation in the encoding of motion events (MEs) in English and Italian, these being typologically different languages that both show signs of disobedience to their respective types. The traditional typological classification of ME encoding distributes languages into two macro-types, based on the preferred locus for the expression of Path, the main ME component (the others being Figure, Ground and Manner), characterized by conceptual and structural prominence. According to this model, Satellite-framed (SF) languages typically express Path information in verb-dependent items called satellites (e.g. preverbs and verb particles), with main verbs encoding Manner of motion, whereas Verb-framed (VF) languages tend to include Path information within the verbal locus, leaving Manner to adjuncts. Although this dichotomy is broadly valid, languages do not always behave according to their typical classification patterns. English, for example, is usually ascribed to the SF type due to its rich inventory of postverbal particles and phrasal verbs used to express spatial relations (e.g. the cat climbed down the tree); nevertheless, it is not uncommon to find constructions such as the fog descended slowly, which are typical of the VF type. Conversely, Italian is usually described as VF (cf. Paolo uscì di corsa 'Paolo went out running'), yet SF constructions like corse via in lacrime 'She ran away in tears' are also frequent. This paper will try to demonstrate that such typological overlapping is due to the fact that the semantic units making up MEs are distributed across several loci of the sentence (not only verbs and satellites), thus determining a number of different constructions stemming from convergent factors.
Indeed, the linguistic expression of motion events depends not only on the typological nature of languages in the traditional sense, but also on a series of morphological, lexical, and syntactic resources, as well as on inferential, discursive, usage-related, and cultural factors that make semantic information more or less accessible, frequent, and easy to process. Hence, rather than describing English and Italian in dichotomic terms, this study focuses on the investigation of cross-linguistic and intra-linguistic variation in the use of all the strategies made available by each linguistic system to express motion. Evidence for these assumptions is provided by parallel corpora analysis. The sample texts are taken from two contemporary Italian novels and their respective English translations. The 400 motion occurrences selected (200 in English and 200 in Italian) were scanned according to the MODEG (Motion Decoding Grid) methodology, which grants data comparability through the indexation and retrieval of combined morphosyntactic and semantic information at different levels of detail.

Keywords: construction typology, motion event encoding, parallel corpora, satellite-framed vs. verb-framed type

Procedia PDF Downloads 255
14713 Analysis of the Extreme Hydrometeorological Events in the Theoretical Hydraulic Potential and Streamflow Forecast

Authors: Sara Patricia Ibarra-Zavaleta, Rabindranarth Romero-Lopez, Rosario Langrave, Annie Poulin, Gerald Corzo, Mathias Glaus, Ricardo Vega-Azamar, Norma Angelica Oropeza

Abstract:

The progressive change in climatic conditions worldwide has increased the frequency and severity of extreme hydrometeorological events (EHE). Mexico is an example: it has been affected by EHEs, which have left economic, social, and environmental losses. The objective of this research was to apply a Canadian distributed hydrological model (DHM) to tropical conditions and to evaluate its capacity to predict flows in a basin on the central Gulf of Mexico. In addition, the DHM (once calibrated and validated) was used to calculate the theoretical hydraulic potential and to assess its performance in predicting streamflow ahead of an EHE. The goodness-of-fit indicators between observed and simulated flows were satisfactory in calibration (NSE = 0.83, RSR = 0.021 and BIAS = -4.3) and in temporal validation, which was assessed at two points: point one (NSE = 0.78, RSR = 0.113 and BIAS = 0.054) and point two (NSE = 0.825, RSR = 0.103 and BIAS = 0.063). The DHM showed its applicability in tropical environments and its ability to characterize the rainfall-runoff relationship in the study area. This work can serve as a tool for identifying vulnerabilities to floods and for the rational and sustainable management of water resources.
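The goodness-of-fit indicators quoted above can be computed as sketched below, using the common Moriasi-style definitions (sign conventions for percent bias vary between studies); the observed and simulated series are toy values, not the study's data.

```python
import math

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE / total variance of observations."""
    mean_o = sum(obs) / len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    return 1.0 - sse / sum((o - mean_o) ** 2 for o in obs)

def rsr(obs, sim):
    """RMSE divided by the standard deviation of observations."""
    mean_o = sum(obs) / len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    return math.sqrt(sse) / math.sqrt(sum((o - mean_o) ** 2 for o in obs))

def pbias(obs, sim):
    """Percent bias; sign convention here: positive = underestimation."""
    return 100.0 * sum(o - s for o, s in zip(obs, sim)) / sum(obs)

obs = [1.0, 2.0, 3.0, 4.0, 5.0]  # toy observed flows
sim = [1.1, 1.9, 3.2, 3.8, 5.1]  # toy simulated flows
print(nse(obs, sim), rsr(obs, sim), pbias(obs, sim))
```

NSE close to 1, RSR close to 0, and a small percent bias indicate a good fit, which is how the calibration and validation figures above should be read.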

Keywords: HYDROTEL, hydraulic power, extreme hydrometeorological events, streamflow

Procedia PDF Downloads 336
14712 Equity Investment Restrictions and Pension Replacement Rates in Nigeria: A Ruin-Risk Analysis

Authors: Uche A. Ibekwe

Abstract:

Pension funds are pooled assets established to provide income for retirees. The funds are usually regulated to check excessive risk taking by fund managers. In Nigeria, the current defined contribution (DC) pension scheme appears to contain some overly stringent restrictions that might be hampering its successful implementation. Notable among these is the 25 percent maximum limit on investment in ordinary shares of quoted companies. This paper examines the extent to which these restrictions affect pension replacement rates at retirement. Using both simulated and historical asset return distributions, together with mean-variance, regression, and ruin-risk analyses, the study found that the current equity investment restriction policy in Nigeria reduces replacement rates at retirement.
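A ruin-risk analysis of this kind can be sketched as a Monte Carlo simulation of a retiree's fund under different equity shares; every parameter below (return means and volatility, withdrawal, horizon, initial fund) is an illustrative assumption, not Nigerian pension data, and the comparison says nothing about the paper's actual estimates.

```python
import random

def ruin_probability(equity_share, n_paths=20000, years=25, seed=42):
    """Fraction of simulated paths in which the fund is exhausted early.

    The retiree withdraws a fixed amount each year from a fund split
    between a risky equity asset and a safe fixed-income asset.
    All parameters are illustrative assumptions.
    """
    random.seed(seed)
    ruins = 0
    for _ in range(n_paths):
        wealth = 100.0          # initial fund (arbitrary units)
        for _ in range(years):
            wealth -= 6.0       # annual withdrawal (replacement income)
            if wealth <= 0.0:
                ruins += 1
                break
            # blended return: risky equities vs. a safe fixed-income asset
            r = (equity_share * random.gauss(0.08, 0.20)
                 + (1.0 - equity_share) * 0.04)
            wealth *= 1.0 + r
    return ruins / n_paths

print(ruin_probability(0.25), ruin_probability(0.60))
```

Tightening or relaxing the equity cap changes both the expected growth of the fund and the dispersion of outcomes, so the ruin probability captures the trade-off that a replacement-rate analysis weighs.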

Keywords: equity investment, replacement rates, restrictions, ruin-risk

Procedia PDF Downloads 339
14711 Landslide Susceptibility Mapping Using Soft Computing in Amhara Saint

Authors: Semachew M. Kassa, Africa M. Geremew, Tezera F. Azmatch, Nandyala Darga Kumar

Abstract:

Frequency ratio (FR) and analytical hierarchy process (AHP) methods have been developed based on past landslide failure points to map landslide susceptibility, because landslides can seriously harm both the environment and society. However, it is still difficult to select the most efficient method and to correctly identify the main driving factors for particular regions. In this study, we used fourteen landslide conditioning factors (LCFs) and five soft computing algorithms, namely Random Forest (RF), Support Vector Machine (SVM), Logistic Regression (LR), Artificial Neural Network (ANN), and Naïve Bayes (NB), to predict landslide susceptibility at a 12.5 m spatial resolution. According to the classification results based on inventory landslide points, the performance of the RF (F1-score: 0.88, AUC: 0.94), ANN (F1-score: 0.85, AUC: 0.92), and SVM (F1-score: 0.82, AUC: 0.86) methods was significantly better than that of the LR (F1-score: 0.75, AUC: 0.76) and NB (F1-score: 0.73, AUC: 0.75) methods. The findings also showed that around 35% of the study region consists of areas with high or very high landslide risk (susceptibility greater than 0.5). The very high-risk locations were primarily found in the western and southeastern regions, and all five models showed good agreement and similar geographic distribution patterns of landslide susceptibility. The villages with the highest landslide risk include the western and northern parts of Amhara Saint Town and the St. Gebreal Church villages, with mean susceptibility values greater than 0.5. Rainfall, distance to road, and slope were typically among the leading factors for most villages, although the primary contributors to landslide vulnerability varied slightly across the five models. Decision-makers and policy planners can use the information from our study to make informed decisions and establish policies.
It also suggests that different areas should adopt different safeguards to reduce or prevent serious damage from landslide events.
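The two classification metrics reported above, F1-score and AUC, can be computed from first principles as sketched below, on a toy label/score set rather than the study's landslide inventory.

```python
def f1_score(y_true, y_pred):
    """Harmonic mean of precision and recall for binary labels."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return 2 * tp / (2 * tp + fp + fn)

def auc(y_true, scores):
    """AUC as the probability a positive outscores a negative (ties = 0.5)."""
    pos = [s for t, s in zip(y_true, scores) if t == 1]
    neg = [s for t, s in zip(y_true, scores) if t == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy data: 1 = landslide point, 0 = non-landslide point
y_true = [0, 0, 1, 1]
scores = [0.10, 0.40, 0.35, 0.80]           # model susceptibility scores
y_pred = [1 if s >= 0.5 else 0 for s in scores]  # threshold at 0.5
print(f1_score(y_true, y_pred), auc(y_true, scores))
```

Note that F1 depends on the chosen susceptibility threshold (0.5 here, matching the high-risk cutoff used in the abstract), while AUC summarizes ranking quality across all thresholds.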

Keywords: artificial neural network, logistic regression, landslide susceptibility, naïve Bayes, random forest, support vector machine

Procedia PDF Downloads 73