Search results for: fire probabilistic risk assessment
10277 Development of a Multi-Factorial Instrument for Accident Analysis Based on Systemic Methods
Authors: C. V. Pietreanu, S. E. Zaharia, C. Dinu
Abstract:
The present research is built on three major pillars, commencing with some considerations on accident investigation methods and pointing out both the defining aspects of and the differences between linear and non-linear analysis. The traditional linear focus of accident analysis describes accidents as a sequence of events, while the latest systemic models outline interdependencies between different factors and define the evolution of processes related to a specific (normal) situation. Linear and non-linear accident analysis methods have specific limitations, so the second point of interest is the aim to discover the drawbacks of systemic models, which becomes a starting point for developing new directions to identify risks or data closer to the cause of incidents/accidents. Since communication represents a critical issue in the interaction of the human factor and has been shown to underlie problems caused by breakdowns in different communication procedures, from this focus point, the third pillar elaborates a new error-modeling instrument suitable for risk assessment/accident analysis.
Keywords: accident analysis, multi-factorial error modeling, risk, systemic methods
Procedia PDF Downloads 208
10276 A Holistic Analysis of the Emergency Call: From in Situ Negotiation to Policy Frameworks and Back
Authors: Jo Angouri, Charlotte Kennedy, Shawnea Ting, David Rawlinson, Matthew Booker, Nigel Rees
Abstract:
Ambulance services need to balance the large volume of emergency (999 in the UK) calls they receive (e.g., West Midlands Ambulance Service reports about 4,000 999 calls per day; about 679,000 calls per year are received in Wales) with dispatching limited resources for on-site intervention to the most critical cases. The process by which Emergency Medical Dispatch (EMD) decisions are made is related to risk assessment and involves the caller and call-taker as well as clinical teams negotiating risk levels on a case-by-case basis. The Medical Priority Dispatch System (MPDS – also referred to as the Advanced Medical Priority Dispatch System, AMPDS) is used in the UK by NHS Trusts (e.g., WAST) to process and prioritise 999 calls. MPDS/AMPDS provides structured protocols for call prioritisation and call management. Protocols/policy frameworks have not been examined before in the way we propose in our project. In more detail, the risk factors that play a role in the EMD negotiation between the caller and call-taker have been analysed in both medical and social science research. Research has focused on the structural, morphological and phonological aspects that could improve, and train, human-to-human interaction or automate risk detection, as well as the medical factors that need to be captured from the caller to inform the dispatch decision. There are two significant gaps in our knowledge that we address in our work: 1. the role of backstage clinical teams in translating the caller/call-taker interaction in their internal risk negotiation and, 2. the role of policy frameworks, protocols and regulations in the framing of institutional priorities and resource allocation. We take a multi-method approach and combine the analysis of 999 calls with the analysis of policy documents. We draw on interaction analysis, corpus methodologies and thematic analysis. In this paper, we report on our preliminary findings and focus in particular on the risk factors we have identified and their relationship with the regulations that create the frame within which teams operate. We close the paper with implications of our study for providing evidence-based policy intervention and recommendations for further research.
Keywords: emergency (999) call, interaction analysis, discourse analysis, ambulance dispatch, medical discourse
Procedia PDF Downloads 103
10275 Downtime Modelling for the Post-Earthquake Building Assessment Phase
Authors: S. Khakurel, R. P. Dhakal, T. Z. Yeow
Abstract:
Downtime is one of the major sources (alongside damage and injury/death) of financial loss incurred by a structure in an earthquake. The length of downtime associated with a building after an earthquake varies depending on the time taken for the reaction (to the earthquake), decision (on the future course of action) and execution (of the decided course of action) phases. Post-earthquake assessment of buildings is a key step in the decision-making process to decide the appropriate safety placarding as well as to decide whether a damaged building is to be repaired or demolished. The aim of the present study is to develop a model to quantify downtime associated with the post-earthquake building-assessment phase in terms of two parameters: i) the duration of the different assessment phases; and ii) the probability of different colour tagging. Post-earthquake assessment of buildings includes three stages: Level 1 Rapid Assessment, comprising a fast external inspection shortly after the earthquake; Level 2 Rapid Assessment, comprising a visit inside the building; and Detailed Engineering Evaluation (if needed). In this study, the durations of all three assessment phases are first estimated from the total number of damaged buildings, the total number of available engineers and the average time needed for assessing each building. Then, the probability of different tag colours is computed from the 2010-11 Canterbury Earthquake Sequence database. Finally, a downtime model for the post-earthquake building inspection phase is proposed based on the estimated phase lengths and probability of tag colours. This model is expected to be used for rapid estimation of seismic downtime within the Loss Optimisation Seismic Design (LOSD) framework.
Keywords: assessment, downtime, LOSD, Loss Optimisation Seismic Design, phase length, tag color
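For illustration, the phase-length estimate described above reduces to simple arithmetic; the sketch below uses invented figures (building counts, engineer numbers and inspection times are placeholders, not values from the study):

```python
# Minimal sketch of the phase-length estimate described in the abstract.
# All numbers are hypothetical placeholders, not values from the study.

def phase_duration_days(n_buildings, n_engineers, hours_per_building,
                        hours_per_day=8.0):
    """Days needed for one assessment phase, assuming engineers work in
    parallel and each building needs a fixed average inspection time."""
    total_hours = n_buildings * hours_per_building
    return total_hours / (n_engineers * hours_per_day)

# Example: 10,000 damaged buildings, 50 engineers, 0.5 h per Level 1 inspection
print(phase_duration_days(10_000, 50, 0.5))  # -> 12.5 days
```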
Procedia PDF Downloads 185
10274 Temperature-Related Alterations to Mineral Levels and Crystalline Structure in Porcine Long Bone: Intense Heat Vs. Open Flame
Authors: Caighley Logan
Abstract:
The outcomes of fire-related fatalities, along with other research, have shown that fires can have a detrimental effect on the mineral and crystalline structures within bone. This study focused on the mineral and crystalline structures within porcine bone samples to analyse the changes caused, with the intent of effectively 'reverse engineering' the data collected from burned bone samples to discover what may have happened. Fourier Transform Infrared spectroscopy (FT-IR) and X-Ray Fluorescence (XRF) were applied to data collected from a controlled source of intense heat (a muffle furnace) and from an open fire of a similar temperature with a known ignition source (a gasoline lighter), set in a living room arrangement in a standard-size shipping container (8.5 ft x 8 ft). This approach analyses the changes to the samples and how the changes differ depending on the heat source. Results have found significant differences in the levels of remaining minerals for each type of heat/burning (p < 0.001), particularly phosphorus and calcium; this also includes notable additions of elements and minerals absorbed from the surrounding materials, i.e., cerium (Ce), bromine (Br) and neodymium (Nd). The analysis techniques used provide results validated in conjunction with previous studies.
Keywords: forensic anthropology, thermal alterations, porcine bone, FTIR, XRF
Procedia PDF Downloads 85
10273 Comprehensive Feature Extraction for Optimized Condition Assessment of Fuel Pumps
Authors: Ugochukwu Ejike Akpudo, Jank-Wook Hur
Abstract:
The increasing demand for improved productivity, maintainability, and reliability has prompted rapidly increasing research on the emerging condition-based maintenance concept: prognostics and health management (PHM). Varieties of fuel pumps serve critical functions in several hydraulic systems; hence, their failure can have daunting effects on productivity, safety, etc. The need for condition monitoring and assessment of these pumps cannot be overemphasized, and this has led to an upsurge in research on standard feature extraction techniques for optimized condition assessment of fuel pumps. By extracting time-based, frequency-based and the more robust time-frequency based features from these vibrational signals, a more comprehensive feature assessment (and selection) can be achieved for a more accurate and reliable condition assessment of these pumps. With the aid of emerging classification and regression algorithms like locally linear embedding (LLE), we propose a method for comprehensive condition assessment of electromagnetic fuel pumps (EMFPs). Results show that LLE as a comprehensive feature extraction technique yields better feature fusion/dimensionality reduction results for condition assessment of EMFPs than the use of single features. Also, unlike other feature fusion techniques, its capabilities as a fault classification technique were explored, and the results show an acceptable accuracy level using standard performance metrics for evaluation.
Keywords: electromagnetic fuel pumps, comprehensive feature extraction, condition assessment, locally linear embedding, feature fusion
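As a minimal sketch of the dimensionality-reduction step described above (the feature matrix, its dimensions and the LLE parameters are illustrative assumptions, not values from the study), scikit-learn's LLE implementation can fuse a multi-domain feature set as follows:

```python
import numpy as np
from sklearn.manifold import LocallyLinearEmbedding

# Hypothetical feature matrix: rows = vibration signal windows,
# columns = time-, frequency- and time-frequency-domain features.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 24))

# Fuse the 24 extracted features into a 3-dimensional embedding.
lle = LocallyLinearEmbedding(n_neighbors=10, n_components=3)
X_fused = lle.fit_transform(X)
print(X_fused.shape)  # (500, 3)
```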
Procedia PDF Downloads 117
10272 Assessment of Soil Erosion Risk Using Soil and Water Assessment Tools Model: Case of Siliana Watershed, Northwest Tunisia
Authors: Sana Dridi, Jalel Aouissi, Rafla Attia, Taoufik Hermassi, Thouraya Sahli
Abstract:
Soil erosion is an increasing issue in Mediterranean countries. In Tunisia, the capacity of dam reservoirs continues to decrease as a consequence of soil erosion. This study aims to predict sediment yield to inform soil management practices using the Soil and Water Assessment Tool (SWAT) model in the Siliana watershed (1041.6 km²), located in the northwest of Tunisia. A database was constructed using remote sensing and a Geographical Information System. Climatic and flow data were collected from the water resources directorates in Tunisia. The SWAT model was built to simulate hydrological processes and sediment transport. Sensitivity analysis, calibration, and validation were performed using the SWAT-CUP software. The model calibration of streamflow simulations shows a good performance, with NSE and R² values of 0.77 and 0.79, respectively. The model validation shows a very good performance, with NSE and R² values of 0.80 and 0.88, respectively. After calibration and validation of the streamflow simulation, the model was used to simulate soil erosion and sediment load transport. The spatial distribution of soil loss rates for determining the critical sediment source areas shows that 63% of the study area has a low soil loss rate of less than 7 t ha⁻¹y⁻¹. The annual average soil loss rate simulated with the SWAT model in the Siliana watershed is 4.62 t ha⁻¹y⁻¹.
Keywords: water erosion, SWAT model, streamflow, SWAT-CUP, sediment yield
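For reference, the two calibration metrics reported above can be computed as follows; the streamflow values in the example are illustrative, not data from the Siliana watershed:

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, values below 0 are
    worse than simply using the mean of the observations."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def r_squared(obs, sim):
    """Coefficient of determination (squared Pearson correlation)."""
    return np.corrcoef(obs, sim)[0, 1] ** 2

# Illustrative monthly streamflow values (m3/s), not from the study.
observed  = [12.1, 8.4, 6.0, 15.3, 22.8, 9.7]
simulated = [11.5, 9.0, 5.2, 14.8, 20.9, 10.4]
print(nse(observed, simulated), r_squared(observed, simulated))
```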
Procedia PDF Downloads 102
10271 Association of Selected Polymorphisms of BER Pathway with the Risk of Colorectal Cancer in the Polish Population
Authors: Jacek Kabzinski, Karolina Przybylowska, Lukasz Dziki, Adam Dziki, Ireneusz Majsterek
Abstract:
The incidence of colorectal cancer (CRC) is increasing from year to year. Despite intensive research, CRC etiology remains unknown. Studies suggest that the process of carcinogenesis may be rooted in the reduced efficiency of DNA repair mechanisms, often caused by polymorphisms in DNA repair genes. The aim of the study was to determine the relationship between the Pro242Arg polymorphism of the PolB gene and the Arg780His polymorphism of the Lig3 gene and the modulation of colorectal cancer risk in the Polish population. Determining the molecular basis of the carcinogenesis process and predicting increased risk will allow qualifying patients to an increased-risk group and including them in a preventive program. We used blood collected from 110 patients diagnosed with colorectal cancer. The control group consisted of an equal number of healthy people. Genotyping was performed by the TaqMan method. The obtained results indicate that the 780Arg/His genotype of the Lig3 gene is associated with an increased risk of colorectal cancer. On the basis of these results, we conclude that the Lig3 gene polymorphism Arg780His may be associated with an increased risk of colorectal cancer.
Keywords: BER, colorectal cancer, PolB, Lig3, polymorphisms
Procedia PDF Downloads 454
10270 Vulnerable Paths Assessment for Distributed Denial of Service Attacks in a Cloud Computing Environment
Authors: Manas Tripathi, Arunabha Mukhopadhyay
Abstract:
In a cloud computing environment, cloud servers may crash after receiving a huge number of requests, and cloud services may stop, which can create huge losses for users of those services. This situation is called a Denial of Service (DoS) attack. In a Distributed Denial of Service (DDoS) attack, an attacker targets multiple network paths by compromising various vulnerable systems (zombies) and floods the victim with a huge number of requests through these zombies. There are many solutions to mitigate this challenge, but most methods allow the attack traffic to arrive at the Cloud Service Provider (CSP) and only then take mitigation actions. In this paper, we instead focus on a preventive mechanism to deal with these attacks. We analyze the network topology and find the most vulnerable paths beforehand, without waiting for the traffic to arrive at the CSP. We have used Dijkstra's algorithm and Yen's k-shortest path algorithm. Finally, a risk assessment of these paths can be done by multiplying the probabilities of attack for these paths by the potential loss.
Keywords: cloud computing, DDoS, Dijkstra, Yen's k-shortest path, network security
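A minimal sketch of the path-ranking idea described above, using networkx (the topology, attack probabilities and loss figure are invented placeholders, not results from the paper):

```python
import itertools
import networkx as nx

# Toy topology: edge weights are hop costs; values are illustrative only.
G = nx.DiGraph()
G.add_weighted_edges_from([
    ("attacker", "r1", 1), ("attacker", "r2", 1), ("r1", "r3", 1),
    ("r2", "r3", 1), ("r1", "csp", 2), ("r3", "csp", 1),
])

# k shortest simple paths (networkx uses Yen's algorithm when weighted).
k = 3
paths = list(itertools.islice(
    nx.shortest_simple_paths(G, "attacker", "csp", weight="weight"), k))

# Hypothetical per-path attack probabilities and potential loss (USD).
attack_prob = {tuple(p): pr for p, pr in zip(paths, (0.4, 0.3, 0.1))}
potential_loss = 1_000_000
for p in paths:
    print(p, "risk =", attack_prob[tuple(p)] * potential_loss)
```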
Procedia PDF Downloads 278
10269 Discussion as a Means to Improve Peer Assessment Accuracy
Authors: Jung Ae Park, Jooyong Park
Abstract:
Writing is an important learning activity that cultivates higher-level thinking. Effective and immediate feedback is necessary to help improve students' writing skills. Peer assessment can be an effective method in writing tasks because it makes it possible for students not only to receive quick feedback on their writing but also to get a chance to examine different perspectives on the same topic. Peer assessment can be practiced frequently and has the advantage of immediate feedback. However, there is controversy about the accuracy of peer assessment. In this study, we tried to demonstrate experimentally how the accuracy of peer assessment could be improved. Participants (n=76) were randomly assigned to groups of 4 members. All the participants graded two sets of 4 essays on the same topic. They graded the first set twice, and the second set, or the posttest, once. After the first grading of the first set, each group in experimental condition 1 (the discussion group) was asked to discuss the results of the peer assessment and then to grade the essays again. Each group in experimental condition 2 (the reading group) was asked to read an expert's assessment of each essay and then to grade the essays again. In the control group, the participants were asked to grade the 4 essays twice in different orders. Afterwards, all the participants graded the second set of 4 essays. The mean score from the 4 participants was calculated for each essay. The accuracy of the peer assessment was measured by the Pearson correlation with the scores of the expert. The results were analysed by two-way repeated measures ANOVA. A main effect of grading was observed: grading accuracy improved as the number of grading experiences increased. Analysis of posttest accuracy revealed that the score variations within a group of 4 participants decreased in both the discussion and reading conditions but not in the control condition. These results suggest that having students discuss their grading together can be an efficient means to improve peer assessment accuracy. By discussing, students can learn from others what to consider in grading and whether their grading is too strict or lenient. Further research is needed to examine the exact cause of the improvement in grading accuracy.
Keywords: peer assessment, evaluation accuracy, discussion, score variations
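For illustration, the accuracy measure described above (Pearson correlation between group mean scores and the expert's scores) can be computed as follows; the scores shown are invented, not data from the experiment:

```python
from scipy.stats import pearsonr

# Illustrative scores only: mean score of one 4-member group per essay
# versus the expert's score for the same essays.
group_means  = [72.5, 81.0, 64.3, 90.8]
expert_score = [70.0, 85.0, 60.0, 92.0]

r, p_value = pearsonr(group_means, expert_score)
print(f"peer-assessment accuracy r = {r:.2f} (p = {p_value:.3f})")
```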
Procedia PDF Downloads 267
10268 An Insight into the Probabilistic Assessment of Reserves in Conventional Reservoirs
Authors: Sai Sudarshan, Harsh Vyas, Riddhiman Sherlekar
Abstract:
The oil and gas industry has been unwilling to adopt a stochastic definition of reserves. Nevertheless, Monte Carlo simulation methods have gained acceptance by engineers, geoscientists and other professionals who want to evaluate prospects or otherwise analyze problems that involve uncertainty. One of the common applications of Monte Carlo simulation is the estimation of recoverable hydrocarbon from a reservoir. Monte Carlo simulation makes use of random samples of parameters or inputs to explore the behavior of a complex system or process. It finds application whenever one needs to make an estimate, forecast or decision where there is significant uncertainty. First, the project focuses on performing Monte Carlo simulation on a given data set using the U.S. Department of Energy's MonteCarlo software, a freeware E&P tool. Further, an algorithm for simulation has been developed in MATLAB; the program performs the simulation by prompting the user for input distributions and the parameters associated with each distribution (i.e., mean, standard deviation, minimum, maximum, most likely, etc.). It also prompts the user for the desired probability for which reserves are to be calculated. The algorithm so developed and tested in MATLAB is further implemented in Python, where existing libraries for statistics and graph plotting have been imported to generate better output. With PyQt Designer, code for a simple graphical user interface has also been written. The plotted graph is then validated against already available results from the U.S. DOE MonteCarlo software.
Keywords: simulation, probability, confidence interval, sensitivity analysis
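A minimal Python sketch of the kind of Monte Carlo reserves estimate described above; the volumetric formula is the standard OOIP relation, but all distribution parameters are invented placeholders, not the study's data set:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # number of Monte Carlo trials

# Hypothetical input distributions for a volumetric estimate
# (units and parameter values are illustrative, not field data).
area      = rng.triangular(400, 600, 900, n)       # acres
thickness = rng.triangular(20, 35, 60, n)          # ft
porosity  = rng.normal(0.18, 0.03, n)              # fraction
sw        = rng.normal(0.30, 0.05, n)              # water saturation
bo        = rng.uniform(1.1, 1.4, n)               # formation volume factor, rb/stb
rf        = rng.triangular(0.15, 0.25, 0.40, n)    # recovery factor

# OOIP (stb) = 7758 * A * h * phi * (1 - Sw) / Bo; reserves = OOIP * RF
reserves = 7758 * area * thickness * porosity * (1 - sw) / bo * rf

# Reserves convention: P90 = 90% probability of exceedance = 10th percentile.
p90, p50, p10 = np.percentile(reserves, [10, 50, 90])
print(f"P90={p90:.3e}  P50={p50:.3e}  P10={p10:.3e} stb")
```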
Procedia PDF Downloads 382
10267 Existing Cardiovascular Risk among Children Diagnosed with Type 1 Diabetes Mellitus at the Emergency Clinic
Authors: Masuma Novak, Daniel Novak
Abstract:
Background: Sweden, along with other Nordic countries, has the highest incidence of type 1 diabetes mellitus (T1DM) worldwide. The trend is increasing globally. The diagnosis is often given at the emergency clinic when children arrive with cardinal symptoms of T1DM. Children with T1DM are known to have an increased risk of microvascular and macrovascular complications. A family history of cardiovascular complications may further increase their risk. Clinically evident diabetes-related vascular complications are, however, rarely visible in childhood and adolescence, whereby intensive diabetes treatment and normoglycemic control is a goal for every child. This study is a risk evaluation of children with T1DM based on their family's cardiovascular history. Method: Since 2005, the Better Diabetes Diagnosis (BDD) study, a nationwide Swedish prospective cohort study, has recruited children with new-onset T1DM who are less than 18 years old at the time of diagnosis. For each newly diagnosed child, blood samples are collected for specific HLA genotyping and islet autoantibody assays, and the family's cardiovascular history is evaluated. As part of the BDD study, during the years 2010-2013, all children diagnosed with T1DM at the Queen Silvia Children's Hospital in Sweden were asked about their family's cardiovascular history. Questions regarding maternal and paternal high blood pressure, stroke and myocardial infarction before the age of 55 years, and hyperlipidemia were answered. A maximum risk score of eight was possible. All children are observed clinically and prospectively for early functional and structural abnormalities such as proteinuria, blood pressure, and retinopathy. Results: A total of 275 children aged 0 to 18 years were diagnosed with T1DM at the Queen Silvia Children's Hospital emergency clinic during this four-year period. The participation rate was 99.7%. 26.4% of the children had no hereditary cardiovascular risk factors. 22.7% had one risk factor and 18.8% had two risk factors. 14.8% had three risk factors. 9.7% had four risk factors and 7.5% had five or more risk factors. Conclusion: Among children with T1DM in Sweden, there is a difference in hereditary cardiovascular risk factors. These results indicate that children with T1DM who also have increased hereditary cardiovascular risk factors should be monitored closely, with early screening for functional and structural cardiovascular abnormalities. This is a preliminary and ongoing study which will be complemented with a cardiovascular risk analysis among children without T1DM.
Keywords: children, type I diabetes, emergency clinic, CVD risk
Procedia PDF Downloads 365
10266 Modeling Geogenic Groundwater Contamination Risk with the Groundwater Assessment Platform (GAP)
Authors: Joel Podgorski, Manouchehr Amini, Annette Johnson, Michael Berg
Abstract:
One-third of the world’s population relies on groundwater for its drinking water. Natural geogenic arsenic and fluoride contaminate ~10% of wells. Prolonged exposure to high levels of arsenic can result in various internal cancers, while high levels of fluoride are responsible for the development of dental and crippling skeletal fluorosis. In poor urban and rural settings, the provision of drinking water free of geogenic contamination can be a major challenge. In order to efficiently apply limited resources in the testing of wells, water resource managers need to know where geogenically contaminated groundwater is likely to occur. The Groundwater Assessment Platform (GAP) fulfills this need by providing state-of-the-art global arsenic and fluoride contamination hazard maps as well as enabling users to create their own groundwater quality models. The global risk models were produced by logistic regression of arsenic and fluoride measurements using predictor variables of various soil, geological and climate parameters. The maps display the probability of encountering concentrations of arsenic or fluoride exceeding the World Health Organization’s (WHO) stipulated concentration limits of 10 µg/L or 1.5 mg/L, respectively. In addition to a reconsideration of the relevant geochemical settings, these second-generation maps represent a great improvement over the previous risk maps due to a significant increase in data quantity and resolution. For example, there is a 10-fold increase in the number of measured data points, and the resolution of predictor variables is generally 60 times greater. These same predictor variable datasets are available on the GAP platform for visualization as well as for use with a modeling tool. The latter requires that users upload their own concentration measurements and select the predictor variables that they wish to incorporate in their models. In addition, users can upload additional predictor variable datasets either as features or coverages. Such models can represent an improvement over the global models already supplied, since (a) users may be able to use their own, more detailed datasets of measured concentrations and (b) the various processes leading to arsenic and fluoride groundwater contamination can be isolated more effectively on a smaller scale, thereby resulting in a more accurate model. All maps, including user-created risk models, can be downloaded as PDFs. There is also the option to share data in a secure environment as well as the possibility to collaborate in a secure environment through the creation of communities. In summary, GAP provides users with the means to reliably and efficiently produce models specific to their region of interest by making available the latest datasets of predictor variables along with the necessary modeling infrastructure.
Keywords: arsenic, fluoride, groundwater contamination, logistic regression
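A minimal sketch of the modelling approach described above, i.e., logistic regression of exceedance (arsenic above the WHO limit of 10 µg/L) on environmental predictors; the predictors and data below are synthetic placeholders, not the GAP datasets:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical training set: each row is a well location with soil,
# geology and climate predictors; y = 1 if arsenic > 10 ug/L (WHO limit).
rng = np.random.default_rng(1)
X = rng.normal(size=(2000, 5))        # e.g., pH, clay content, aridity, ...
y = (X @ np.array([0.8, -0.5, 0.3, 0.0, 0.2]) + rng.normal(0, 1, 2000)) > 0

model = LogisticRegression().fit(X, y)

# Probability that a new cell exceeds the limit -> one hazard map value.
new_cells = rng.normal(size=(3, 5))
print(model.predict_proba(new_cells)[:, 1])
```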
Procedia PDF Downloads 348
10265 The Effective Method for Fostering Thinking Dispositions of Learners
Authors: H. Jalahi, A. Yazdanpanah Nozari
Abstract:
Background and Purpose: Assessment of learners' performance is an important factor in the teaching-learning process. When a factor is sensitive and has a high influence on life, its assessment should be done precisely. Thinking dispositions are very important factors in medical education because of its specific conditions. In this study, a model is designed for fostering the thinking dispositions of learners, in which authentic assessment is an important element. Materials and Methods: In terms of its objective, this research is developmental, as such a model had not been designed for curricula before. Data collection, comparison of assessment approaches, and analysis of current assessments yielded applied proposals. Results: Based on the research findings, current assessments are response-based; that is, instead of producing their own responses, students only offer the specific response the teacher expects. Authentic assessment, by contrast, is a form of assessment in which students are asked to perform real-world tasks that demonstrate meaningful application of essential knowledge and skills. Conclusion: Because of the difficulties and unexpected problems in life, individuals' need for lifelong learning, and the conditions in medical courses that require decision making under time constraints, we must pay attention to thinking dispositions and include them in the curriculum. Authentic assessment, as an important aspect of the curriculum, can help foster the thinking dispositions of learners. Such assessments, which focus on the application of information and skills to solve real-world tasks, have a particularly important role in medical courses.
Keywords: assessment, authentic, medical courses, developmental
Procedia PDF Downloads 365
10264 An Empirical Analysis of the Effects of Corporate Derivatives Use on the Underlying Stock Price Exposure: South African Evidence
Authors: Edson Vengesai
Abstract:
Derivative products have become essential instruments in portfolio diversification, price discovery, and, most importantly, risk hedging. Derivatives are complex instruments; their valuation, volatility implications, and real impact on the underlying assets' behaviour are not well understood. Little is documented empirically, with conflicting conclusions on how these instruments affect firm risk exposures. Given the growing interest in using derivatives in risk management and portfolio engineering, this study examines the practical impact of derivative usage on the underlying stock price exposure and systematic risk. The paper uses data from South African listed firms. The study employs GARCH models to understand the effect of derivative uses on conditional stock volatility. The GMM models are used to estimate the effect of derivatives use on stocks' systematic risk as measured by Beta and on the total risk of stocks as measured by the standard deviation of returns. The results provide evidence on whether derivatives use is instrumental in reducing stock returns' systematic and total risk. The results are subjected to numerous controls for robustness, including financial leverage, firm size, growth opportunities, and macroeconomic effects.
Keywords: derivatives use, hedging, volatility, stock price exposure
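A minimal sketch of the GARCH step described above, using the arch package; the return series is simulated, not the South African sample used in the paper:

```python
import numpy as np
from arch import arch_model  # pip install arch

# Illustrative daily returns (percent); in practice, use the stock returns
# of the derivative-using vs. non-using firms in the sample.
rng = np.random.default_rng(7)
returns = rng.normal(0.05, 1.2, 1500)

# GARCH(1,1) on the return series to obtain conditional volatility.
res = arch_model(returns, vol="Garch", p=1, q=1, mean="Constant").fit(disp="off")
print(res.params)                       # mu, omega, alpha[1], beta[1]
cond_vol = res.conditional_volatility   # sigma_t series for group comparison
```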
Procedia PDF Downloads 110
10263 Methodologies for Crack Initiation in Welded Joints Applied to Inspection Planning
Authors: Guang Zou, Kian Banisoleiman, Arturo González
Abstract:
Crack initiation and propagation threaten the structural integrity of welded joints, and normally inspections are assigned based on crack propagation models. However, the approach based on crack propagation models may not be applicable to some high-quality welded joints, because the initial flaws in them may be so small that it may take a long time for the flaws to develop into a detectable size. This raises a concern regarding the inspection planning of high-quality welded joints, as there is no generally accepted approach for modeling the whole fatigue process that includes the crack initiation period. In order to address the issue, this paper reviews treatment methods for the crack initiation period and initial crack size in crack propagation models applied to inspection planning. Generally, there are four approaches: 1) neglecting the crack initiation period and fitting a probabilistic distribution for initial crack size based on statistical data; 2) extrapolating the crack propagation stage to a very small fictitious initial crack size, so that the whole fatigue process can be modeled by crack propagation models; 3) assuming a fixed detectable initial crack size and fitting a probabilistic distribution for crack initiation time based on specimen tests; and 4) modeling the crack initiation and propagation stages separately using small-crack growth theories and the Paris law or similar models. The conclusion is that, in view of the trade-off between accuracy and computational effort, calibration of a small fictitious initial crack size to S-N curves is the most efficient approach.
Keywords: crack initiation, fatigue reliability, inspection planning, welded joints
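As an illustration of the propagation-model side of approaches 2) and 4), the sketch below numerically integrates the Paris law from a fictitious initial crack size; the material parameters and stress range are assumed order-of-magnitude placeholders, not calibrated values:

```python
import numpy as np

def cycles_to_failure(a0, a_crit, C, m, dsigma_mpa, Y=1.0, steps=200_000):
    """Integrate Paris' law da/dN = C * (dK)^m numerically, with
    dK = Y * dsigma * sqrt(pi * a) in MPa*sqrt(m) and crack size a in metres."""
    a = np.linspace(a0, a_crit, steps)
    dK = Y * dsigma_mpa * np.sqrt(np.pi * a)
    dN_da = 1.0 / (C * dK ** m)   # cycles per metre of crack growth
    # Trapezoidal integration over the crack-size grid
    return float(np.sum(0.5 * (dN_da[1:] + dN_da[:-1]) * np.diff(a)))

# Fictitious initial crack of 0.1 mm (approach 2 above), growing to a
# critical 10 mm; C and m are typical-magnitude values for steel.
N = cycles_to_failure(a0=1e-4, a_crit=1e-2, C=3e-12, m=3.0, dsigma_mpa=80.0)
print(f"{N:.2e} cycles to grow from 0.1 mm to 10 mm")
```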
Procedia PDF Downloads 353
10262 Assessment of Green Infrastructure for Sustainable Urban Water Management
Authors: Suraj Sharma
Abstract:
Green infrastructure (GI) offers a contemporary approach to reducing the risk of flooding, improving water quality, and harvesting stormwater for sustainable use. GI promotes landscape planning to enhance sustainable development and urban resilience. However, the existing literature lacks a comprehensive assessment of GI performance in terms of ecosystem functions and services for social, ecological, and economic system resilience. We propose a robust indicator set and fuzzy comprehensive evaluation (FCE) for quantitative and qualitative analysis of sustainable water management to assess the capacity of urban resilience. Green infrastructure in an urban resilience water management system (GIUR-WMS) supports decision-making for GI planning through scenario comparisons with an urban resilience capacity index. To demonstrate the GIUR-WMS, we develop five scenarios for five sectors of Chandigarh (12, 26, 14, 17, and 34) to test common types of GI (rain barrels, rain gardens, detention basins, porous pavements, and open spaces). The result shows that open spaces achieve the highest green infrastructure urban resilience index of 4.22/5. To implement the open-space scenario in urban sites, suitable vacant land can be converted to green spaces (for example, forest, low-impact recreation areas, and detention basins). GIUR-WMS is easy to replicate, customize and apply to cities of different sizes to assess environmental, social and ecological dimensions.
Keywords: green infrastructure, assessment, urban resilience, water management system, fuzzy comprehensive evaluation
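A minimal numpy sketch of the FCE step described above; the criteria, weights and membership degrees are invented for illustration and are not the study's indicator set:

```python
import numpy as np

# Criteria weights W (e.g., flood reduction, water quality, harvesting);
# values are illustrative placeholders.
W = np.array([0.4, 0.35, 0.25])

# Membership matrix R: rows = criteria, columns = rating grades
# (low / medium / high resilience) for one GI scenario, e.g. open spaces.
R = np.array([
    [0.1, 0.3, 0.6],
    [0.2, 0.4, 0.4],
    [0.1, 0.2, 0.7],
])

B = W @ R                       # composite membership over the grades
grades = np.array([1.0, 3.0, 5.0])
index = B @ grades / B.sum()    # defuzzified resilience index on a 1-5 scale
print(B, index)                 # e.g., index ~ 3.84 for these placeholders
```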
Procedia PDF Downloads 143
10261 Comprehensive Assessment of Energy Efficiency within the Production Process
Authors: S. Kreitlein, N. Eder, J. Franke
Abstract:
The importance of energy efficiency within the production process increases steadily. Unfortunately, so far no tools for a comprehensive assessment of energy efficiency within the production process exist. Therefore the Institute for Factory Automation and Production Systems of the Friedrich-Alexander-University Erlangen-Nuremberg has developed two methods with the goal of achieving transparency and a quantitative assessment of energy efficiency: EEV (Energy Efficiency Value) and EPE (Energetic Process Efficiency). This paper describes the basics and state of the art as well as the developed approaches.
Keywords: energy efficiency, energy efficiency value, energetic process efficiency, production
Procedia PDF Downloads 733
10260 Evaluation of Digital Assessment of Anal Sphincter Muscle Strength
Authors: Emmanuel Kamal Aziz Saba, Gihan Abd El-Lateif Younis El-Tantawi, Mohammed Hamdy Zahran, Ibrahim Khalil Ibrahim, Mohammed Abd El-Salam Shehata, Hussein Al-Moghazy Sultan, Medhat Mohamed Anwar
Abstract:
Examination of the voluntary contraction strength of the external anal sphincter muscle is essential in the initial assessment, and in the assessment of the efficacy of rehabilitation, of patients with faecal incontinence (FI) and obstructed defecation (OD). The present study was conducted to evaluate the digital assessment of the voluntary contraction strength of the external anal sphincter muscle using the Modified Oxford Scale (MOS) in comparison to anal manometry squeeze pressure. The present cross-sectional study included 65 patients. There were 40 patients (61.5%) with FI and 25 patients (38.5%) with OD. All patients were subjected to history taking, clinical examination including assessment of the voluntary contraction strength of the external anal sphincter muscle using the MOS, and anal manometry (mean squeeze pressure and maximal squeeze pressure). There was a statistically significant positive correlation between the MOS and anal manometry squeeze pressures, including mean squeeze pressure and maximal squeeze pressure, in both the FI group and the OD group. In conclusion, assessment of the voluntary contraction strength of the external anal sphincter muscle using the MOS is a valid method and can substitute for anal manometry assessment.
Keywords: anal manometry, external anal sphincter muscle, Modified Oxford Scale, muscle strength
Procedia PDF Downloads 416
10259 Management Tools for Assessment of Adverse Reactions Caused by Contrast Media at the Hospital
Authors: Pranee Suecharoen, Ratchadaporn Soontornpas, Jaturat Kanpittaya
Abstract:
Background: Contrast media have an important role in disease diagnosis through the detection of pathologies. Contrast media can, however, cause adverse reactions after the administration of their agents. Although non-ionic contrast media are commonly used, the incidence of adverse events is relatively low. The most common reactions found (10.5%) were mild and manageable and/or preventable. Pharmacists can play an important role in evaluating adverse reactions, including awareness of the specific preparation and the type of adverse reaction. As the most common types of adverse reactions are idiosyncratic or pseudo-allergic reactions, common standards need to be established to prevent and control adverse reactions promptly and effectively. Objective: To measure the effect of using tools for symptom evaluation in order to reduce the severity, or prevent the occurrence, of adverse reactions to contrast media. Methods: Retrospective, descriptive research with data collected on adverse reaction assessment and Naranjo's algorithm between June 2015 and May 2016. Results: 158 patients (10.53%) had adverse reactions. Of the 1,500 participants with an adverse event evaluation, 137 (9.13%) had a mild adverse reaction, including hives, nausea, vomiting, dizziness, and headache. These types of symptoms can be treated (i.e., with antihistamines or anti-emetics) and the patient recovers completely within one day. The group with moderate adverse reactions, numbering 18 cases (1.2%), had hypertension or hypotension and shortness of breath. Severe adverse reactions numbered 3 cases (0.2%) and included swelling of the larynx, cardiac arrest, and loss of consciousness, requiring immediate treatment. No other complications under close medical supervision were recorded (i.e., corticosteroid use, epinephrine, dopamine, atropine, or life-saving devices). Using the guideline, therapies are divided into general and specific and are performed according to the severity, risk factors and the contrast media agents administered. Patients with high-risk factors were screened and treated (i.e., with prophylactic premedication) to prevent severe adverse reactions, especially those with renal failure. Thus, awareness of the need for prescreening of different risk factors is necessary for early recognition and prompt treatment. Conclusion: Studying adverse reactions can be used to develop a model for reducing the level of severity and setting a guideline for a standardized, multidisciplinary approach to adverse reactions.
Keywords: role of pharmacist, management of adverse reactions, guideline for contrast media, non-ionic contrast media
Procedia PDF Downloads 303
10258 Dynamic-Cognition of Strategic Mineral Commodities: An Empirical Assessment
Authors: Carlos Tapia Cortez, Serkan Saydam, Jeff Coulton, Claude Sammut
Abstract:
Strategic mineral commodities (SMC), both energy commodities and metals, have long been fundamental for human beings. There is a strong, long-run relation between the mineral resources industry and society's evolution, with the provision of primary raw materials becoming one of the most significant drivers of economic growth. Due to the relevance of mineral resources for the entire economy and society, an understanding of SMC market behaviour to simulate price fluctuations has become crucial for governments and firms. As in any human activity, SMC price fluctuations are affected by economic, geopolitical, environmental, technological and psychological issues, where cognition has a major role. Cognition is defined as the capacity to store information in memory, process it and make decisions for problem-solving or human adaptation. Thus, it has a significant role in those systems that exhibit dynamic equilibrium through time, such as economic growth. Cognition allows not only understanding past behaviours and trends in SMC markets but also supports future expectations of demand/supply levels and prices, although speculation is unavoidable. Technological development may also be defined as a cognitive system. Since the Industrial Revolution, technological developments have had a significant influence on SMC production costs and prices, likewise allowing co-integration between commodities and market locations. This suggests a close relation between structural breaks, technology and price evolution. SMC price forecasting has commonly been addressed by econometric and Gaussian-probabilistic models. Econometric models may incorporate the relationship between variables; however, they are static, which leads to an incomplete account of price evolution through time. Gaussian-probabilistic models may evolve through time; however, price fluctuations are addressed by the assumption of random behaviour and normal distribution, which seems to be far from the real behaviour of both markets and prices. Random fluctuation ignores the evolution of market events and the technical and temporal relations between variables, giving the illusion of controlled future events. Normal distribution underestimates price fluctuations by using restricted ranges, curtailing decision making within a pre-established space. A proper understanding of SMC price dynamics, taking into account the historical-cognitive relation between economic, technological and psychological factors over time, is fundamental in attempting to simulate prices. The aim of this paper is to discuss the SMC market cognition hypothesis and empirically demonstrate its dynamic-cognitive capacity. Three of the largest and most traded SMCs (oil, copper and gold) will be assessed to examine economic, technological and psychological cognition, respectively.
Keywords: commodity price simulation, commodity price uncertainties, dynamic-cognition, dynamic systems
Procedia PDF Downloads 463
10257 Predicting the Effects of Counseling Psychology on the Sexual Risk Behavior of In-School Adolescents: Implication for National Development
Authors: Olusola Joseph Adesina, Adebayo Adeyinka Salako
Abstract:
The study adopted a descriptive research design. Two hundred (200) in-school adolescents were purposively selected in the Afijio Local Government Area of Oyo State. Two hypotheses were raised to guide the study. The researchers developed an instrument, validated by psychological experts, tagged the Counseling Psychology and Sexual Risk Behavior Questionnaire (CPSRBQ) (r = 0.78). The results were analysed using the t-test at the 0.05 level of significance. The results showed that there is a significant relationship between counseling psychology and the sexual risk behavior of in-school adolescents. It was also noticed that there is a significant difference in the sexual risk behavior of male and female adolescents. Based on the findings, it was recommended that more counselors are needed in Nigerian schools. There is a need to restructure the Nigerian curriculum, most especially on sex education and related diseases. Lastly, adolescents should be more exposed to seminars on HIV/AIDS, sex education enlightenment programmes and marital counseling.
Keywords: counseling psychology, sexual behavior, risk and adolescent, cognitive sciences
Procedia PDF Downloads 509
10256 Increasing Prevalence of CVD and Its Risk Factors in India: A Review
Authors: Deepa Shokeen, Bani Tamber Aeri
Abstract:
Non-communicable diseases in general, and cardiovascular diseases (CVD) in particular, are a big cause of concern worldwide, especially in a fast-growing economy like India. CVD is one of the leading causes of death in India. Risk factors for cardiovascular disease are now significant in all populations. At least one-third of all CVD is attributable to five risk factors: tobacco use, alcohol use, high blood pressure, high cholesterol and obesity. Methods: This article aspires to collate data gathered by relevant studies conducted after the year 2000 and provide an overview of the prevalence of CVD in India and worldwide. Results: Studies show an increased prevalence of cardiovascular risk factors in India as compared to other developing and developed countries, with recent trends showing incidence in younger age groups. CVD is seen to affect almost all sections of society, from young to old and from most affluent to least affluent. High blood pressure, high cholesterol, tobacco and alcohol use, as well as low vegetable and fruit intake, already figure among the top risk factors. Conclusion: The prevalence of risk factors associated with CVD has increased and will keep increasing in India, as indicated by studies in the last decade and as predicted by projections of future estimates. Some major risks are modifiable in that they can be prevented, treated, and controlled. There are considerable health benefits at all ages, for both men and women, in stopping smoking, reducing cholesterol and blood pressure, eating a healthy diet and increasing physical activity.
Keywords: prevalence, cardiovascular disease, India, risk factors
Procedia PDF Downloads 514
10255 Spatial and Temporal Analysis of Forest Cover Change with Special Reference to Anthropogenic Activities in Kullu Valley, North-Western Indian Himalayan Region
Authors: Krisala Joshi, Sayanta Ghosh, Renu Lata, Jagdish C. Kuniyal
Abstract:
Throughout the world, monitoring and estimating the changing pattern of forests across diverse landscapes through remote sensing is instrumental in understanding the interactions of human activities and the ecological environment with the changing climate. Forest change detection using satellite imagery has emerged as an important means to gather information on a regional scale. Kullu valley in Himachal Pradesh, India, is situated in a transitional zone between the lesser and the greater Himalayas. Thus, it presents a typical rugged mountainous terrain with moderate to high altitude, varying from 1200 meters to over 6000 meters. Due to changes in agricultural cropping patterns, urbanization, industrialization, hydropower generation, climate change, tourism, and anthropogenic forest fire, it has undergone a tremendous transformation in forest cover in the past three decades. The loss and degradation of forest cover result in soil erosion, loss of biodiversity including damage to wildlife habitats, degradation of watershed areas, and deterioration of the overall quality of nature and life. Supervised classification of LANDSAT satellite data was performed to assess the changes in forest cover in Kullu valley over the years 2000 to 2020. The Normalized Burn Ratio (NBR) was calculated to discriminate between burned and unburned areas of the forest. Our study reveals that in Kullu valley the number of forest fire incidents, specifically those due to anthropogenic activities, has risen each subsequent year. The main objective of the present study is, therefore, to estimate the change in the forest cover of Kullu valley and to address the various social aspects responsible for the anthropogenic forest fires, as well as to assess its impact on significant changes in regional climatic factors, specifically temperature, humidity, and precipitation, over three decades with the help of satellite imagery and ground data. The main outcome of the paper, we believe, will be helpful for the administration in making a quantitative assessment of forest cover area changes due to anthropogenic activities and in devising long-term measures for creating awareness among the local people of the area.
Keywords: Anthropogenic Activities, Forest Change Detection, Normalized Burn Ratio (NBR), Supervised Classification
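For reference, the NBR used above is computed from the near-infrared (NIR) and shortwave-infrared (SWIR) bands; the reflectance values in this sketch are illustrative, not measurements from the Kullu valley imagery:

```python
import numpy as np

def nbr(nir, swir):
    """Normalized Burn Ratio from Landsat NIR and SWIR reflectance bands."""
    nir, swir = np.asarray(nir, float), np.asarray(swir, float)
    return (nir - swir) / (nir + swir + 1e-10)  # epsilon avoids 0/0

# Burn severity is typically assessed as dNBR = NBR_pre - NBR_post.
pre  = nbr(np.array([0.45, 0.50]), np.array([0.20, 0.22]))
post = nbr(np.array([0.25, 0.48]), np.array([0.30, 0.21]))
print(pre - post)  # larger dNBR -> more severely burned pixel
```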
Procedia PDF Downloads 173
10254 Quality Assurance as an Educational Development Tool: Case from the European Higher Education
Authors: Maha Mourad
Abstract:
Higher education in any competitive European economy should serve the new information society by increasing the supply of good-quality education services and by creating good international brands in the international higher education market. Hence, continuous risk management through higher education reform programs became one of the top priorities within the European Union to control the quality of higher education. Risk in higher education has been studied by several researchers, who agree that risk in higher education has a direct influence on the continuity of quality education and research contribution. The focus of this research is to highlight the Internal Quality Assurance (IQA) activities in the Polish higher education system as a risk management tool used to control the quality of education. This paper presents a qualitative empirical analysis of 5 different universities in Poland. In addition, it aims to help identify globally applicable practices and create a benchmark for policy makers concerning risk management techniques based on the Polish experience.
Keywords: education development, quality assurance, sustainability, european higher education
Procedia PDF Downloads 468
10253 Regulatory and Economic Challenges of AI Integration in Cyber Insurance
Authors: Shreyas Kumar, Mili Shangari
Abstract:
Integrating artificial intelligence (AI) in the cyber insurance sector represents a significant advancement, offering the potential to revolutionize risk assessment, fraud detection, and claims processing. However, this integration introduces a range of regulatory and economic challenges that must be addressed to ensure responsible and effective deployment of AI technologies. This paper examines the multifaceted regulatory landscape governing AI in cyber insurance and explores the economic implications of compliance, innovation, and market dynamics. AI's capabilities in processing vast amounts of data and identifying patterns make it an invaluable tool for insurers in managing cyber risks. Yet, the application of AI in this domain is subject to stringent regulatory scrutiny aimed at safeguarding data privacy, ensuring algorithmic transparency, and preventing biases. Regulatory bodies, such as the European Union with its General Data Protection Regulation (GDPR), mandate strict compliance requirements that can significantly impact the deployment of AI systems. These regulations necessitate robust data protection measures, ethical AI practices, and clear accountability frameworks, all of which entail substantial compliance costs for insurers. The economic implications of these regulatory requirements are profound. Insurers must invest heavily in upgrading their IT infrastructure, implementing robust data governance frameworks, and training personnel to handle AI systems ethically and effectively. These investments, while essential for regulatory compliance, can strain financial resources, particularly for smaller insurers, potentially leading to market consolidation. Furthermore, the cost of regulatory compliance can translate into higher premiums for policyholders, affecting the overall affordability and accessibility of cyber insurance. Despite these challenges, the potential economic benefits of AI integration in cyber insurance are significant. AI-enhanced risk assessment models can provide more accurate pricing, reduce the incidence of fraudulent claims, and expedite claims processing, leading to overall cost savings and increased efficiency. These efficiencies can improve the competitiveness of insurers and drive innovation in product offerings. However, balancing these benefits with regulatory compliance is crucial to avoid legal penalties and reputational damage. The paper also explores the potential risks associated with AI integration, such as algorithmic biases that could lead to unfair discrimination in policy underwriting and claims adjudication. Regulatory frameworks need to evolve to address these issues, promoting fairness and transparency in AI applications. Policymakers play a critical role in creating a balanced regulatory environment that fosters innovation while protecting consumer rights and ensuring market stability. In conclusion, the integration of AI in cyber insurance presents both regulatory and economic challenges that require a coordinated approach involving regulators, insurers, and other stakeholders. By navigating these challenges effectively, the industry can harness the transformative potential of AI, driving advancements in risk management and enhancing the resilience of the cyber insurance market. This paper provides insights and recommendations for policymakers and industry leaders to achieve a balanced and sustainable integration of AI technologies in cyber insurance.
Keywords: artificial intelligence (AI), cyber insurance, regulatory compliance, economic impact, risk assessment, fraud detection, cyber liability insurance, risk management, ransomware
Procedia PDF Downloads 33
10252 Ports and Airports: Gateways to Vector-Borne Diseases in Portugal Mainland
Authors: Maria C. Proença, Maria T. Rebelo, Maria J. Alves, Sofia Cunha
Abstract:
Vector-borne diseases are transmitted to humans by mosquitoes, sandflies, bugs, ticks, and other vectors. Some are re-transmitted between vectors if the infected human has a new contact when his or her levels of infection are high. The vector is infected for its lifetime and can transmit infectious diseases not only between humans but also from animals to humans. Some vector-borne diseases are very disabling and globally account for more than one million deaths worldwide. Mosquitoes of the complex Culex pipiens s.l. are the most abundant in Portugal, and a data set is available from the surveillance program that has been carried out across the country since 2006. All mosquito species are included, but the large coverage of Culex pipiens s.l. and its importance for public health make this vector an interesting candidate for assessing the risk of disease amplification. This work focuses on ports and airports, identified as key areas of high vector density. Mosquitoes being ectothermic organisms, the main factor for vector survival and pathogen development is temperature. Minimum and maximum local air temperatures for each area of interest are averaged by month from data gathered on a daily basis at the national network of meteorological stations and interpolated in a geographic information system (GIS). The temperature ranges ideal for several pathogens are known, and this work shows how to use them with the meteorological data for each port and airport facility to focus an efficient implementation of countermeasures and simultaneously reduce transmission risk and mitigation costs. The results show an increased alert with decreasing latitude, which corresponds to higher minimum and maximum temperatures and a lower amplitude range of the daily temperature.
Keywords: human health, risk assessment, risk management, vector-borne diseases
Procedia PDF Downloads 419
10251 Family Management, Relations Risk and Protective Factors for Adolescent Substance Abuse in South Africa
Authors: Beatrice Wamuyu Muchiri, Monika M. L. Dos Santos
Abstract:
An increasingly recognised prevention approach for substance use entails the reduction of risk factors and the enhancement of promotive or protective factors in individuals and the environment surrounding them during their growth and development. However, in order to enhance the effectiveness of this approach, continuous study of risk aspects targeting different cultures, social groups and segments of society has been recommended. This study evaluated the impact of potential risk and protective factors associated with family management and relations on adolescent substance abuse in South Africa. Exploratory analysis and cumulative odds ordinal logistic regression modelling were performed on the data while controlling for demographic and socio-economic characteristics. The most intensively used substances were tobacco, cannabis, cocaine, heroin and alcohol, in decreasing order of use intensity. The specific protective or risk impact of family management or relations factors varied from substance to substance. Risk factors associated with demographic and socio-economic factors included being male, younger age, being in lower education grades, coloured ethnicity, adolescents from divorced parents, and unemployed or fully employed mothers. Significant family relations risk and protective factors against substance use were classified as either family functioning and conflict or family bonding and support. Several family management factors, categorised as parental monitoring, discipline, behavioural control and rewards, demonstrated either a risk or a protective effect on adolescent substance use. Some factors had either an interactive risk or protective impact on substance use or lost significance when analysed jointly with other factors such as controlled variables. Interactions among risk or protective factors, as well as the type of substance, should be considered when further considering interventions based on these factors. Studies in other geographical regions and institutions, and with better gender balance, are recommended to improve the representativeness of the results. Several other considerations to be made when formulating interventions, the shortcomings of this study, possible improvements, and future studies are also suggested.
Keywords: risk factors, protective factors, substance use, adolescents
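A minimal sketch of the cumulative odds (proportional odds) ordinal logistic regression named above, using statsmodels; the predictors and data are synthetic placeholders, not the survey data:

```python
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

# Hypothetical data: ordinal use intensity (0=none, 1=occasional, 2=frequent)
# with illustrative family-management predictors.
rng = np.random.default_rng(3)
df = pd.DataFrame({
    "monitoring": rng.normal(size=800),   # parental monitoring score
    "bonding":    rng.normal(size=800),   # family bonding score
})
latent = -0.6 * df["monitoring"] - 0.4 * df["bonding"] + rng.logistic(size=800)
df["use"] = pd.cut(latent, [-np.inf, -0.5, 1.0, np.inf],
                   labels=[0, 1, 2]).astype(int)

# Cumulative odds (proportional odds) ordinal logistic regression.
model = OrderedModel(df["use"], df[["monitoring", "bonding"]], distr="logit")
res = model.fit(method="bfgs", disp=False)
print(res.summary())  # negative coefficients here act as protective factors
```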
Procedia PDF Downloads 204
10250 A Fuzzy Inference Tool for Assessing Cancer Risk from Radiation Exposure
Authors: Bouharati Lokman, Bouharati Imen, Bouharati Khaoula, Bouharati Oussama, Bouharati Saddek
Abstract:
Ionizing radiation exposure is an established cancer risk factor. Compared to other common environmental carcinogens, it is relatively easy to determine organ-specific radiation dose and, as a result, radiation dose-response relationships tend to be highly quantified. Nevertheless, there can be considerable uncertainty about questions of radiation-related cancer risk as they apply to risk protection and public policy, and the interpretations of interested parties can differ from one person to another. The tools used in analysing the risk of developing cancer due to radiation are characterized by uncertainty. These uncertainties are related to the history of exposure and the different assumptions involved in the calculation. We believe that the results of statistical calculations are characterized by uncertainty and imprecision, having regard to the physiological variation from one person to another. In this study, we develop a tool based on fuzzy logic inference. As fuzzy logic deals with imprecise and uncertain information, its application in this area is adequate. We propose a fuzzy system with three input variables (age, sex and the organ susceptible to cancer). The output variable expresses the risk rate for each organ. A rule base is established from recorded actual data. After successful simulation, the tool will instantly predict the risk rate for each organ following chronic exposure to 0.1 Gy.
Keywords: radiation exposure, cancer, modeling, fuzzy logic
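A minimal sketch of the kind of fuzzy inference described above, simplified to two of the three inputs; the membership functions, rule base and output scaling are invented placeholders, not the ones derived from the study's recorded data:

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with peak at b."""
    return float(np.clip(min((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0))

def risk_rate(age, sensitivity):
    """Fuzzy risk-rate score (0-100) from age (years) and an assumed
    organ radiosensitivity index on a 0-1 scale."""
    # Fuzzification (placeholder membership functions)
    young, middle, old = (tri(age, -1, 10, 40), tri(age, 25, 45, 65),
                          tri(age, 50, 80, 111))
    low, high = tri(sensitivity, -0.6, 0.0, 0.6), tri(sensitivity, 0.4, 1.0, 1.6)
    # Illustrative rule base: min for AND, max for rule aggregation
    r_low  = max(min(old, low), min(middle, low))
    r_med  = max(min(middle, high), min(old, high), min(young, low))
    r_high = min(young, high)
    # Weighted-average defuzzification
    return (10 * r_low + 50 * r_med + 90 * r_high) / (r_low + r_med + r_high + 1e-9)

print(risk_rate(age=10, sensitivity=0.8))  # young + sensitive organ -> ~90
```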
Procedia PDF Downloads 311
10249 Hybrid Structure Learning Approach for Assessing the Phosphate Laundries Impact
Authors: Emna Benmohamed, Hela Ltifi, Mounir Ben Ayed
Abstract:
The Bayesian network (BN) is one of the most efficient classification methods. It is widely used in several fields (e.g., medical diagnostics, risk analysis, bioinformatics research). The BN is defined as a probabilistic graphical model that represents a formalism for reasoning under uncertainty. This classification method has a high performance rate in the extraction of new knowledge from data. The construction of this model consists of two phases: structure learning and parameter learning. For solving this problem, the K2 algorithm is one of the representative data-driven algorithms; it is based on a score-and-search approach. In addition, the integration of expert knowledge in the structure learning process allows the highest accuracy to be obtained. In this paper, we propose a hybrid approach combining an improvement of the K2 algorithm, called the K2 algorithm for Parents and Children search (K2PC), with an expert-driven method for learning the structure of the BN. The evaluation of the experimental results, using well-known benchmarks, shows that our K2PC algorithm has better performance in terms of correct structure detection. The real application of our model shows its efficiency in the analysis of the impact of phosphate laundry effluents on the watershed in the Gafsa area (southwestern Tunisia).
Keywords: Bayesian network, classification, expert knowledge, structure learning, surface water analysis
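For reference, the score underlying K2-style structure search is the Cooper-Herskovits (K2) metric; the sketch below implements its log form for a single node and compares two candidate parent sets on toy data (not the phosphate laundry dataset):

```python
import numpy as np
from math import lgamma
from itertools import product

def k2_log_score(data, child, parents, arities):
    """Cooper-Herskovits (K2) log-score of one node given a parent set.
    data: 2D int array (rows = samples); arities: states per variable."""
    r = arities[child]
    score = 0.0
    parent_states = [range(arities[p]) for p in parents]
    for combo in product(*parent_states):  # one term per parent configuration
        mask = np.ones(len(data), dtype=bool)
        for p, v in zip(parents, combo):
            mask &= data[:, p] == v
        n_ij = int(mask.sum())
        counts = np.bincount(data[mask, child], minlength=r)
        score += lgamma(r) - lgamma(n_ij + r)          # log (r-1)!/(N_ij+r-1)!
        score += sum(lgamma(int(c) + 1) for c in counts)  # log prod N_ijk!
    return score

# Toy data: X0 -> X1 (binary), X1 mostly copies X0.
rng = np.random.default_rng(0)
x0 = rng.integers(0, 2, 1000)
x1 = (x0 ^ (rng.random(1000) < 0.1)).astype(int)
data = np.column_stack([x0, x1])
print(k2_log_score(data, 1, [0], [2, 2]))  # higher (better) ...
print(k2_log_score(data, 1, [], [2, 2]))   # ... than with no parent
```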
Procedia PDF Downloads 128
10248 Vulnerability Analysis for Risk Zones Boundary Definition to Support a Decision Making Process at CBRNE Operations
Authors: Aliaksei Patsekha, Michael Hohenberger, Harald Raupenstrauch
Abstract:
An effective emergency response to accidents involving chemical, biological, radiological, nuclear, or explosive (CBRNE) materials, which represent highly dynamic situations, needs immediate action within limited time, information and resources. The aim of the study is to provide the foundation for the division of the unsafe area into risk zones according to the impact of hazardous parameters (heat radiation, thermal dose, overpressure, chemical concentrations). A decision on the boundary values for three risk zones is based on a vulnerability analysis that covered a variety of accident scenarios involving the release of a toxic or flammable substance which either evaporates, ignites and/or explodes. Critical values are selected for the boundary definition of the Red, Orange and Yellow risk zones upon examination of the harmful effects that are likely to cause injuries of varying severity to people and different levels of damage to structures. The obtained results provide the basis for creating a comprehensive real-time risk map for decision support at CBRNE operations.
Keywords: boundary values, CBRNE threats, decision making process, hazardous effects, vulnerability analysis, risk zones
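A minimal sketch of how one hazardous parameter (heat radiation) could map to the three zones; the threshold values are assumed placeholders standing in for the boundary values the vulnerability analysis derives:

```python
# Illustrative zone assignment by heat-radiation intensity. The thresholds
# are assumptions, not the study's derived boundary values.

def heat_radiation_zone(q_kw_m2: float) -> str:
    """Map heat radiation (kW/m2) to a risk zone."""
    if q_kw_m2 >= 12.5:   # assumed: severe damage / lethal exposure likely
        return "Red"
    if q_kw_m2 >= 5.0:    # assumed: injuries to unprotected people
        return "Orange"
    if q_kw_m2 >= 1.6:    # assumed: pain threshold for prolonged exposure
        return "Yellow"
    return "outside risk zones"

for q in (20.0, 7.5, 2.0, 1.0):
    print(q, "->", heat_radiation_zone(q))
```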
Procedia PDF Downloads 209