Search results for: TensorFlow probability
992 Risk Analysis of Leaks from a Subsea Oil Facility Based on Fuzzy Logic Techniques
Authors: Belén Vinaixa Kinnear, Arturo Hidalgo López, Bernardo Elembo Wilasi, Pablo Fernández Pérez, Cecilia Hernández Fuentealba
Abstract:
The expanded use of risk assessment in legislative and corporate decision-making has increased the role of expert judgement in providing data for safety-related decision-making. Expert judgements are required at most steps of risk assessment: hazard identification, risk estimation, risk evaluation, and analysis of alternatives. This paper presents a fault tree analysis (FTA), a probabilistic failure analysis, applied to oil leakage in a subsea production system. In standard FTA, the failure probabilities of system components are treated as exact values when evaluating the failure probability of the top event. However, data for estimating component failure rates are chronically insufficient in the drilling industry; fuzzy set theory can therefore be used to address the issue. The aim of this paper is to examine the leaks from the Zafiro West subsea oil facility by using fuzzy fault tree analysis (FFTA). As a result, the research has made theoretical and practical contributions to maritime safety and environmental protection. FTA has also traditionally been an effective strategy for identifying hazards in nuclear installations and the power industry. Keywords: expert judgment, probability assessment, fault tree analysis, risk analysis, oil pipelines, subsea production system, drilling, quantitative risk analysis, leakage failure, top event, off-shore industry
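The gate arithmetic behind an FTA/FFTA top-event calculation can be sketched with triangular fuzzy numbers; the basic events and probability values below are illustrative assumptions, not data from the Zafiro West study:

```python
# Minimal sketch of fuzzy fault tree gate arithmetic using triangular
# fuzzy numbers (a, m, b) = (lower bound, most likely value, upper bound).

def fuzzy_and(*events):
    """AND gate: component-wise product of triangular fuzzy probabilities."""
    a = m = b = 1.0
    for (ea, em, eb) in events:
        a, m, b = a * ea, m * em, b * eb
    return (a, m, b)

def fuzzy_or(*events):
    """OR gate: 1 - product of (1 - p), applied component-wise."""
    a = m = b = 1.0
    for (ea, em, eb) in events:
        a, m, b = a * (1 - ea), m * (1 - em), b * (1 - eb)
    return (1 - a, 1 - m, 1 - b)

def defuzzify(tfn):
    """Centroid defuzzification of a triangular fuzzy number."""
    a, m, b = tfn
    return (a + m + b) / 3

# Hypothetical basic events: seal failure and valve failure feed an OR gate.
seal = (0.001, 0.005, 0.010)
valve = (0.002, 0.004, 0.008)
top = fuzzy_or(seal, valve)
print(top, round(defuzzify(top), 6))
```

In a full FFTA, expert judgements are first mapped to fuzzy numbers, propagated through the gates as above, and defuzzified into a crisp top-event probability.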
Procedia PDF Downloads 191
991 Effects of Family Order and Informal Social Control on Protecting against Child Maltreatment: A Comparative Study of Seoul and Kathmandu
Authors: Thapa Sirjana, Clifton R. Emery
Abstract:
This paper examines family order and informal social control (ISC) by extended families as protective factors against child maltreatment. The findings are discussed using the main effects and the interaction effects of family order and informal social control by extended families. The findings suggest that intimate partner violence (IPV) against mothers is associated with child abuse and child neglect: children are neglected more at home, and physical abuse occurs, when mothers are abused by their husbands. Mothers’ experience of being abused may lead them to neglect their children. The findings suggest that family order is a significant protective factor against child maltreatment: if family order is neither too high nor too low, it can act as a protective factor. Soft ISC is significantly associated with child maltreatment, and this study suggests that soft ISC by extended families is a helpful approach to developing child protection in both countries. The study analyzes data from the Seoul and Kathmandu Families and Neighborhood Study (SKFNS). A random cluster sample of married or partnered women in 20 Kathmandu wards and 34 Seoul dongs was selected using probability proportional to size (PPS) sampling. Overall, the study compares Korea and Nepal and examines how cultural differences and similarities are associated with child maltreatment. Keywords: child maltreatment, intimate partner violence, informal social control, family order, Seoul, Kathmandu
Procedia PDF Downloads 248
990 Sequence Polymorphism and Haplogroup Distribution of Mitochondrial DNA Control Regions HVS1 and HVS2 in a Southwestern Nigerian Population
Authors: Ogbonnaya O. Iroanya, Samson T. Fakorede, Osamudiamen J. Edosa, Hadiat A. Azeez
Abstract:
The human mitochondrial DNA (mtDNA) is a circular molecule of about 16.6 kbp that contains a non-coding segment of roughly 1,200 bp known as the control region. Knowledge of variation within populations has been employed in forensic and molecular anthropology studies. The study was aimed at investigating the polymorphic nature of the two hypervariable segments (HVS) of the mtDNA, i.e., HVS1 and HVS2, and at determining the haplogroup distribution among individuals resident in Lagos, Southwestern Nigeria. Peripheral blood samples were obtained from sixty maternally unrelated individuals, followed by DNA extraction and amplification of the extracted DNA using primers specific for the regions under investigation. DNA amplicons were sequenced, and the sequence data were aligned and compared to the revised Cambridge Reference Sequence (rCRS; GenBank accession number NC_012920.1) using BioEdit software. Results obtained showed 61 and 52 polymorphic nucleotide positions for HVS1 and HVS2, respectively. While a total of three indel mutations were recorded for HVS1, there were seven for HVS2. Transition mutations predominated among the nucleotide changes observed in the study. Genetic diversity (GD) values for HVS1 and HVS2 were estimated to be 84.21% and 90.4%, respectively, while the random match probability was 0.17% for HVS1 and 0.89% for HVS2. The study also revealed mixed haplogroups specific to the African (L1-L3) and Eurasian (U and H) lineages. New polymorphic sites obtained from the study are promising for human identification purposes. Keywords: hypervariable region, indels, mitochondrial DNA, polymorphism, random match probability
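The genetic diversity and random match probability figures quoted above follow directly from haplotype frequencies; a minimal sketch, using hypothetical haplotype labels rather than the study's actual sequences:

```python
from collections import Counter

def diversity_stats(haplotypes):
    """Nei's gene diversity and random match probability from a haplotype list."""
    n = len(haplotypes)
    freqs = [c / n for c in Counter(haplotypes).values()]
    sum_p2 = sum(p * p for p in freqs)   # probability two random draws match
    gd = (n / (n - 1)) * (1 - sum_p2)    # sample-size-corrected diversity
    rmp = sum_p2                         # random match probability
    return gd, rmp

# Hypothetical sample of 6 HVS1 haplotypes (labels only, not real sequences)
sample = ["H1", "H1", "H2", "H3", "H3", "H4"]
gd, rmp = diversity_stats(sample)
```

With real data, each distinct combination of polymorphic positions relative to the rCRS defines one haplotype label.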
Procedia PDF Downloads 115
989 Corrosion Risk Assessment/Risk Based Inspection (RBI)
Authors: Lutfi Abosrra, Alseddeq Alabaoub, Nuri Elhaloudi
Abstract:
Corrosion processes in the Oil & Gas industry can lead to failures that are usually costly to repair, costly in terms of loss of contaminated product and environmental damage, and possibly costly in terms of human safety. This article describes the results of the corrosion review and criticality assessment done at Mellitah Gas (SRU unit) for pressure equipment and piping systems. The information gathered through the review was intended for developing a qualitative RBI study. The corrosion criticality assessment was carried out by applying company procedures and industry recommended practices such as API 571, API 580/581, and ASME PCC-3, which provide guidelines for establishing corrosion integrity assessment. The corrosion review is intimately related to the probability of failure (POF). During the corrosion study, the process units were reviewed by following the applicable process flow diagrams (PFDs) in the presence of Mellitah’s personnel from process engineering, inspection, corrosion/materials, and reliability engineering. The expected corrosion damage mechanisms (internal and external) were identified, and the corrosion rate was estimated for every piece of equipment and corrosion loop in the process units. A combination of consequence and likelihood of failure was used for determining the corrosion risk. A qualitative consequence of failure (COF) for each individual item was assigned based on the characteristics of the fluid, per its flammability, toxicity, and pollution potential, into three levels (High, Medium, and Low). A qualitative probability of failure (POF) was assigned to evaluate the internal and external degradation mechanisms, using a high-level point-based scale (0 to 10) to prioritize risk into Low, Medium, and High. Keywords: corrosion, criticality assessment, RBI, POF, COF
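The qualitative POF/COF combination can be sketched as a simple risk matrix; the score bands and matrix entries below are assumed for illustration and are not Mellitah's actual procedure:

```python
def pof_level(score):
    """Map a 0-10 point-based POF score to a qualitative level (assumed bands)."""
    if score <= 3:
        return "Low"
    if score <= 6:
        return "Medium"
    return "High"

# (POF level, COF level) -> qualitative risk; an illustrative 3x3 matrix
RISK_MATRIX = {
    ("Low", "Low"): "Low",          ("Low", "Medium"): "Low",
    ("Low", "High"): "Medium",      ("Medium", "Low"): "Low",
    ("Medium", "Medium"): "Medium", ("Medium", "High"): "High",
    ("High", "Low"): "Medium",      ("High", "Medium"): "High",
    ("High", "High"): "High",
}

def risk(pof_score, cof):
    """Qualitative corrosion risk from a POF score and a COF level."""
    return RISK_MATRIX[(pof_level(pof_score), cof)]

print(risk(8, "Medium"))  # a POF score of 8 with Medium COF -> High
```

Items landing in the High cell would then be prioritized for inspection in the RBI plan.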
Procedia PDF Downloads 82
988 The Moment of the Optimal Average Run Length of the Multivariate Exponentially Weighted Moving Average Control Chart for Equally Correlated Variables
Authors: Edokpa Idemudia Waziri, Salisu S. Umar
Abstract:
Hotelling’s T² is a well-known statistic for detecting a shift in the mean vector of a multivariate normal distribution, and control charts based on T² have been widely used in statistical process control for monitoring a multivariate process. Although it is a powerful tool, the T² statistic is deficient when the shift to be detected in the mean vector of a multivariate process is small and consistent. The Multivariate Exponentially Weighted Moving Average (MEWMA) control chart is one of the control statistics used to overcome this drawback of Hotelling’s T² statistic. In this paper, the run length distribution of the MEWMA control chart, when the quality characteristics exhibit substantial cross-correlation and when the process is in-control and out-of-control, was derived using the Markov chain algorithm. The probability functions and the moments of the run length distribution were also obtained, and they were consistent with existing results for the in-control and out-of-control situations. By simulation, the procedure identified a class of ARL values for the MEWMA chart when the process is in-control and out-of-control. From our study, it was observed that the MEWMA scheme is quite adequate for detecting a small shift and a good way to improve the quality of goods and services in a multivariate situation. It was also observed that as the in-control average run length ARL0 or the number of variables (p) increases, the optimal ARL increases asymptotically, and as the magnitude of the shift σ increases, the optimal ARL decreases. Finally, we use an example from the literature to illustrate our method and demonstrate its efficiency. Keywords: average run length, Markov chain, multivariate exponentially weighted moving average, optimal smoothing parameter
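The MEWMA statistic underlying the chart can be sketched as follows; the smoothing constant, covariance matrix, and observations are illustrative, and the asymptotic EWMA covariance is used for simplicity:

```python
def mewma_t2(x_series, lam, sigma):
    """MEWMA T² statistics for a bivariate process (pure-Python 2x2 algebra).

    sigma is the 2x2 in-control covariance matrix; the asymptotic EWMA
    covariance lam/(2 - lam) * sigma is used, and the in-control mean is 0."""
    c = lam / (2 - lam)
    a, b = c * sigma[0][0], c * sigma[0][1]
    d, e = c * sigma[1][0], c * sigma[1][1]
    det = a * e - b * d
    inv = [[e / det, -b / det], [-d / det, a / det]]   # inverse of scaled sigma

    z = [0.0, 0.0]                     # EWMA vector starts at the target
    t2 = []
    for x in x_series:
        z = [lam * x[0] + (1 - lam) * z[0], lam * x[1] + (1 - lam) * z[1]]
        # quadratic form z' inv z; the chart signals when this exceeds a limit h
        t2.append(z[0] * (inv[0][0] * z[0] + inv[0][1] * z[1])
                  + z[1] * (inv[1][0] * z[0] + inv[1][1] * z[1]))
    return t2

# Equally correlated pair (rho = 0.5), smoothing parameter lam = 0.2
t2 = mewma_t2([(1.0, 0.5), (0.8, 0.6)], lam=0.2, sigma=[[1.0, 0.5], [0.5, 1.0]])
```

The Markov chain approach in the paper discretizes the distribution of this statistic to obtain the run length distribution and its moments.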
Procedia PDF Downloads 422
987 From Responses of Macroinvertebrate Metrics to the Definition of Reference Thresholds
Authors: Hounyèmè Romuald, Mama Daouda, Argillier Christine
Abstract:
The present study focused on the use of benthic macrofauna to define the reference state of an anthropized lagoon (Nokoué, Benin) from the responses of relevant metrics to pressure proxies. The approach used is a combination of a joint species distribution model and Bayesian networks. The joint species distribution model was used to select the relevant metrics and generate posterior probabilities that were then converted into posterior response probabilities for each of the quality classes (pressure levels); these constitute the conditional probability tables of the probabilistic graph representing the different causal relationships between metrics and pressure proxies. For the definition of the reference thresholds, the predicted responses for low-pressure levels were read via probability density diagrams. Observations collected during high- and low-water periods spanning three consecutive years (2004-2006), sampling 33 macroinvertebrate taxa present at all seasons and sampling points, and measurements of 14 environmental parameters were used as application data. The study demonstrated reliable inferences, the selection of seven relevant metrics, and the definition of quality thresholds for each environmental parameter. The relevance of the metrics, as well as of the reference thresholds for ecological assessment despite the small sample size, suggests the potential for wider applicability of the approach in aquatic ecosystem monitoring and assessment programs in developing countries, which are generally characterized by a lack of monitoring data. Keywords: pressure proxies, Bayesian inference, bioindicators, acadjas, functional traits
Procedia PDF Downloads 84
986 Considerations for Effectively Using Probability of Failure as a Means of Slope Design Appraisal for Homogeneous and Heterogeneous Rock Masses
Authors: Neil Bar, Andrew Heweston
Abstract:
Probability of failure (PF) often appears alongside factor of safety (FS) in design acceptance criteria for rock slope, underground excavation and open pit mine designs. However, the design acceptance criteria generally provide no guidance relating to how PF should be calculated for homogeneous and heterogeneous rock masses, or what qualifies as a ‘reasonable’ PF assessment for a given slope design. Observational and kinematic methods were widely used in the 1990s until advances in computing permitted the routine use of numerical modelling. In the 2000s and early 2010s, PF in numerical models was generally calculated using the point estimate method. More recently, some limit equilibrium analysis software offers statistical parameter inputs along with Monte-Carlo or Latin-Hypercube sampling methods to automatically calculate PF. Factors including rock type and density, weathering and alteration, intact rock strength, rock mass quality and shear strength, the location and orientation of geologic structure, shear strength of geologic structure and groundwater pore pressure influence the stability of rock slopes. Significant engineering and geological judgment, interpretation and data interpolation are usually applied in determining these factors and amalgamating them into a geotechnical model which can then be analysed. Most factors are estimated ‘approximately’ or with allowances for some variability rather than ‘exactly’. When it comes to numerical modelling, some of these factors are then treated deterministically (i.e. as exact values), while others have probabilistic inputs based on the user’s discretion and understanding of the problem being analysed. This paper discusses the importance of understanding the key aspects of slope design for homogeneous and heterogeneous rock masses and how they can be translated into reasonable PF assessments where the data permit.
A case study from a large open pit gold mine in a complex geological setting in Western Australia is presented to illustrate how PF can be calculated using different methods and can yield markedly different results. Ultimately, sound engineering judgement and logic are often required to decipher the true meaning and significance (if any) of some PF results. Keywords: probability of failure, point estimate method, Monte-Carlo simulations, sensitivity analysis, slope stability
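One of the PF calculation methods the paper compares, Monte-Carlo sampling of strength parameters in a limit-equilibrium model, can be sketched as follows; the planar-failure geometry and parameter distributions are assumed for illustration, not the case-study values:

```python
import math
import random

def monte_carlo_pf(n=100_000, seed=42):
    """Monte-Carlo probability of failure for a planar slide (illustrative).

    FS = (c*A + W*cos(psi)*tan(phi)) / (W*sin(psi)); the plane area A (m²),
    block weight W (kN), and dip psi are fixed assumed values, while cohesion
    c (kPa) and friction angle phi (deg) are sampled as normal variates."""
    rng = random.Random(seed)
    A, W, psi = 20.0, 2000.0, math.radians(35)
    failures = 0
    for _ in range(n):
        c = max(rng.gauss(10.0, 3.0), 0.0)        # cohesion, truncated at 0
        phi = math.radians(rng.gauss(30.0, 3.0))  # friction angle
        fs = (c * A + W * math.cos(psi) * math.tan(phi)) / (W * math.sin(psi))
        failures += fs < 1.0
    return failures / n                            # PF = P(FS < 1)

pf = monte_carlo_pf()
```

With these assumed distributions the mean FS sits near 1.0, so PF lands near 50%; tightening or widening the input distributions shifts PF markedly even at the same mean FS, which is precisely the sensitivity the paper discusses.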
Procedia PDF Downloads 208
985 Estimation of Effective Radiation Dose Following Computed Tomography Urography at Aminu Kano Teaching Hospital, Kano, Nigeria
Authors: Idris Garba, Aisha Rabiu Abdullahi, Mansur Yahuza, Akintade Dare
Abstract:
Background: CT urography (CTU) is an efficient radiological examination for the evaluation of urinary system disorders. However, patients are exposed to a significant radiation dose, which is associated with increased cancer risks. Objectives: To determine the computed tomography dose index following CTU and to evaluate organ equivalent doses. Materials and Methods: A prospective cohort study was carried out at a tertiary institution in Kano, northwestern Nigeria. Ethical clearance was sought and obtained from the research ethics board of the institution. Demographic, scan parameter, and CT radiation dose data were obtained from patients that had the CTU procedure. Effective dose, organ equivalent doses, and cancer risks were estimated using SPSS statistical software version 16 and CT dose calculator software. Results: A total of 56 patients were included in the study, consisting of 29 males and 27 females. The most common indication for CTU examination was found to be renal cyst, seen commonly among young adults (15-44 yrs). CT radiation dose values for CTU were a DLP of 2320 mGy·cm, a CTDIw of 9.67 mGy, and an effective dose of 35.04 mSv. The probability of cancer risk was estimated to be 600 per million CTU examinations. Conclusion: In this study, the radiation dose for CTU is considered significantly high, with an increase in cancer risk probability. Wide variations between patient doses suggest that optimization has not yet been achieved. Patient radiation dose estimates should be taken into consideration when imaging protocols are established for CT urography. Keywords: CT urography, cancer risks, effective dose, radiation exposure
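Effective dose is commonly approximated from the dose-length product with a region-specific conversion coefficient; using the widely cited adult abdomen/pelvis value k ≈ 0.015 mSv/(mGy·cm) reproduces the reported figure closely (a sketch of the standard approximation, not necessarily the authors' exact software method):

```python
def effective_dose(dlp_mgy_cm, k=0.015):
    """Effective dose (mSv) from dose-length product (mGy*cm) using a
    region-specific conversion coefficient k; 0.015 is the commonly cited
    adult abdomen/pelvis value."""
    return dlp_mgy_cm * k

print(round(effective_dose(2320), 1))  # -> 34.8, close to the 35.04 mSv reported
```

The small residual difference plausibly reflects the multi-phase scan range and patient-specific factors handled by the dose calculator software.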
Procedia PDF Downloads 345
984 Frequency Interpretation of a Wave Function, and a Vertical Waveform Treated as A 'Quantum Leap'
Authors: Anthony Coogan
Abstract:
Born’s probability interpretation of wave functions would have led to nearly identical results had he chosen a frequency interpretation instead. Logically, Born may have assumed that only one electron was under consideration, making it nonsensical to propose a frequency wave. The author’s suggestion: the actual experimental results were not of a single electron; rather, they were groups of reflected x-ray photons. The vertical waveform used by Schrödinger in his Particle in the Box theory makes sense if it was intended to represent a quantum leap. The author extended the single vertical panel to form a bar chart: separate panels would represent different energy levels. The proposed bar chart would be populated by reflected photons. Expansion of basic ideas: part of Schrödinger’s ‘Particle in the Box’ theory may be valid despite negative criticism. The waveform used in the diagram is vertical, which may seem absurd because real waves decay at a measurable rate, rather than instantaneously. However, there may be one notable exception. Supposedly, the Uncertainty Principle was derived from the theory; may a quantum leap not be represented as an instantaneous waveform? The great Schrödinger must have had some reason to suggest a vertical waveform if the prevalent belief was that they did not exist. Complex waveforms representing a particle are usually assumed to be continuous. The actual observations made were of x-ray photons, some of which had struck an electron, been reflected, and then moved toward a detector. From Born’s perspective, doing similar work in the years in question, 1926-7, he would also have considered a single electron, leading him to choose a probability distribution. Probability distributions appear very similar to frequency distributions, but the former are considered to represent the likelihood of future events.
Born’s interpretation of the results of quantum experiments led (or perhaps misled) many researchers into claiming that humans can influence events just by looking at them, e.g. collapsing complex wave functions by ‘looking at the electron to see which slit it emerged from’, while in reality light reflected from the electron moved in the observer’s direction after the electron had moved away. Astronomers may say that they ‘look out into the universe’ but are actually using logic opposed to the views of Newton, Hooke, and many observers such as Romer, in that light carries information from a source or reflector to an observer, rather than the reverse. Conclusion: due to the controversial nature of these ideas, especially their implications about the nature of complex numbers used in applications in science and engineering, some time may pass before any consensus is reached. Keywords: complex wave functions not necessary, frequency distributions instead of wave functions, information carried by light, sketch graph of uncertainty principle
Procedia PDF Downloads 200
983 Comprehensive Approach to Control Virus Infection and Energy Consumption in an Occupied Classroom
Authors: SeyedKeivan Nateghi, Jan Kaczmarczyk
Abstract:
People nowadays spend most of their time in buildings. Accordingly, maintaining good indoor air quality is very important. Global concerns related to the prevalence of Covid-19 also highlight the importance of indoor air conditioning in reducing the risk of virus infection. Cooling and heating a house provides a suitable air temperature zone for residents, and energy consumption in buildings is a significant factor in overall energy demand: the building sector accounts for more than 30% of the world's primary energy requirement. As energy demand has increased, greenhouse effects have emerged that cause global warming. Beyond the environmental damage to ecosystems, global warming can spread infectious diseases such as malaria, cholera, or dengue to many other parts of the world. With the advent of Covid-19, earlier guidelines for reducing energy consumption are no longer adequate because they increase the risk of virus infection among people in a room. The two problems of high energy consumption and coronavirus infection pull in opposite directions. A classroom with 30 students and one teacher in Katowice, Poland, was considered for controlling the two objectives simultaneously. The probability of disease transmission is calculated from the carbon dioxide concentration produced by the occupants, and the energy consumption over a given period is estimated with EnergyPlus. The effects of three parameters, the number, angle, and schedule of window openings, on the probability of infection transmission and on the energy consumption of the class were investigated. The parameters were examined over wide ranges to determine the best possible conditions for simultaneous control of infection spread and energy consumption. The number of open windows is discrete (0 to 3), while the other two parameters are continuous (0 to 180 degrees, and 8 AM to 2 PM).
Preliminary results show that changes in the number, angle, and timing of window openings significantly impact the likelihood of virus transmission and class energy consumption. The greater the number, tilt angle, and duration of window openings, the lower the probability of virus transmission among students, but the higher the energy consumption. When all the windows were closed at all hours of the class, the energy consumption for the first day of January was only 0.2 megajoules, while the probability of virus transmission per person in the classroom exceeded 45%. When all windows were open at maximum angle during class, the chance of transmitting the infection was reduced to 0.35%, but the energy consumption rose to 36 megajoules. Therefore, school classrooms need an optimal schedule to control both objectives. In this article, we present a suitable plan for a classroom with natural window ventilation that controls energy consumption and the possibility of infection transmission at the same time. Keywords: Covid-19, energy consumption, building, carbon dioxide, EnergyPlus
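A CO2-based infection-probability calculation of the kind described can be sketched with the Rudnick-Milton form of the Wells-Riley model; the quanta generation rate, CO2 levels, and occupancy values below are assumptions for illustration, not the paper's inputs:

```python
import math

def infection_probability(c_avg_ppm, n_occupants, n_infectors, quanta_per_h,
                          hours, c_out_ppm=420.0, c_exhaled_ppm=38000.0):
    """Rudnick-Milton (CO2-based Wells-Riley) infection probability.

    The rebreathed-air fraction f is inferred from the excess indoor CO2
    relative to outdoors, divided by the CO2 concentration of exhaled breath."""
    f = (c_avg_ppm - c_out_ppm) / c_exhaled_ppm
    quanta_inhaled = f * (n_infectors / n_occupants) * quanta_per_h * hours
    return 1.0 - math.exp(-quanta_inhaled)

# 31 occupants (30 students + 1 teacher), one infector, 6-hour school day,
# with an assumed time-averaged classroom CO2 level of 1800 ppm
p = infection_probability(c_avg_ppm=1800, n_occupants=31, n_infectors=1,
                          quanta_per_h=25, hours=6)
```

Opening windows lowers the average CO2 level, which lowers f and hence the transmission probability, at the cost of extra heating energy; this is the trade-off the optimization explores.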
Procedia PDF Downloads 102
982 Census and Mapping of Oil Palms Over Satellite Dataset Using Deep Learning Model
Authors: Gholba Niranjan Dilip, Anil Kumar
Abstract:
Accurate and reliable mapping of oil palm plantations and a census of individual palm trees are a huge challenge. This study addresses the challenge with an optimized solution that applies deep learning techniques to remote sensing data. The oil palm is a very important tropical crop; to improve its productivity and land management, it is imperative to have an accurate census over large areas. Since manual census is costly and prone to approximation, a methodology for automated census using panchromatic images from the Cartosat-2, SkySat, and WorldView-3 satellites is demonstrated. Two different study sites in Indonesia were selected. Customized training and ground-truth datasets were created for this study from Cartosat-2 images. The pre-trained Single Shot MultiBox Detector (SSD) Lite MobileNet V2 Convolutional Neural Network (CNN) model from the TensorFlow Object Detection API was subjected to transfer learning on this customized dataset. The SSD model is able to generate a bounding box for each oil palm and count the palms with good accuracy on the panchromatic images. The detection yielded an F-score of 83.16% on seven different images. The detections are buffered and dissolved to generate polygons demarcating the boundaries of the oil palm plantations. This provided the area under the plantations and maps of their location, thereby completing the automated census with fairly high accuracy (≈100%). The trained CNN was found competent enough to detect oil palm crowns in images obtained from multiple satellite sensors and of varying temporal vintage. It helped to estimate the increase in oil palm plantations from 2014 to 2021 in the study area. The study proved that high-resolution panchromatic satellite images can successfully be used to undertake a census of oil palm plantations using CNNs. Keywords: object detection, oil palm tree census, panchromatic images, single shot multibox detector
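An F-score like the one reported for the palm detections comes from IoU-based matching of predicted and ground-truth boxes; a minimal sketch with hypothetical boxes (greedy matching and a 0.5 IoU threshold are common conventions, assumed here rather than taken from the paper):

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    return inter / (area(a) + area(b) - inter)

def detection_f_score(pred, truth, thr=0.5):
    """Greedily match predicted vs. ground-truth palm boxes by IoU,
    then return F = 2PR / (P + R)."""
    unmatched = list(truth)
    tp = 0
    for p in pred:
        best = max(unmatched, key=lambda t: iou(p, t), default=None)
        if best is not None and iou(p, best) >= thr:
            unmatched.remove(best)   # each ground-truth palm matched once
            tp += 1
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(truth) if truth else 0.0
    return 2 * precision * recall / (precision + recall) if tp else 0.0

# Two correct detections, one false positive, one missed palm
pred = [(0, 0, 10, 10), (20, 20, 30, 30), (50, 50, 60, 60)]
truth = [(1, 1, 11, 11), (20, 20, 30, 30), (80, 80, 90, 90)]
f = detection_f_score(pred, truth)
```

The same matched boxes, buffered and dissolved, yield the plantation polygons described in the abstract.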
Procedia PDF Downloads 161
981 Evolutionary Analysis of Green Credit Regulation on Greenwashing Behavior in Dual-Layer Network
Authors: Bo-wen Zhu, Bin Wu, Feng Chen
Abstract:
It has become a common measure among governments to support the green development of enterprises through Green Credit policies. In China, the Central Bank of China and other authorities have even put forward corresponding assessment requirements for the proportion of green credit in commercial banks. Policy changes might raise concerns about commercial banks turning a blind eye to greenwashing behavior by enterprises. The lack of effective regulation may lead to a diffusion of such behavior and eventually result in the phenomenon of “bad money driving out good money”, which could dampen the incentive effect of Green Credit policies. This paper employs a complex network model based on an evolutionary game analysis framework involving enterprises, banks, and regulatory authorities to investigate the inhibitory effect of Green Credit regulation on enterprises’ greenwashing behavior and banks’ opportunistic and collusive behaviors. The findings are as follows: (1) Banking opportunism rises with Green Credit evaluation criteria and requirements for the proportion of the credit balance; restrictive regulation of violating banks is necessary, as there is an increasing trend of banks adopting an opportunistic strategy. (2) Raising penalties and the probability of regulatory inspections can effectively suppress banks’ opportunistic behavior; however, it cannot entirely eradicate opportunism on the bank side. (3) Although maintaining a certain inspection probability can inhibit enterprises from adopting greenwashing behavior, enterprises choose a catering production strategy instead. (4) One-time rewards from local government have limited effects on the equilibrium state and diffusion trend of bank regulatory decision-making. Keywords: green credit, greenwashing behavior, regulation, diffusion effect
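The evolutionary-game reasoning about penalties and inspection probability can be sketched with two-strategy replicator dynamics for the bank population; the payoffs and regulatory parameters below are illustrative assumptions, not the paper's calibrated model (which also includes the network structure):

```python
def replicator_trajectory(x0, payoff_comply, payoff_opportunism,
                          penalty, inspect_prob, steps=200, dt=0.05):
    """Share x of banks playing 'comply' under discrete-time two-strategy
    replicator dynamics; opportunism carries an expected fine."""
    x = x0
    traj = [x]
    for _ in range(steps):
        f_c = payoff_comply
        f_o = payoff_opportunism - inspect_prob * penalty  # expected fine
        x += dt * x * (1 - x) * (f_c - f_o)                # replicator step
        x = min(max(x, 0.0), 1.0)
        traj.append(x)
    return traj

# With a high enough expected penalty, compliance takes over the population
traj = replicator_trajectory(x0=0.2, payoff_comply=1.0,
                             payoff_opportunism=1.5, penalty=2.0,
                             inspect_prob=0.5)
```

When `inspect_prob * penalty` falls below the opportunism premium, the dynamics flow the other way, which is the diffusion-of-misbehavior regime the abstract warns about.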
Procedia PDF Downloads 28
980 Don't Just Guess and Slip: Estimating Bayesian Knowledge Tracing Parameters When Observations Are Scant
Authors: Michael Smalenberger
Abstract:
Intelligent tutoring systems (ITS) are computer-based platforms which can incorporate artificial intelligence to provide step-by-step guidance as students practice problem-solving skills. ITS can replicate and even exceed some benefits of one-on-one tutoring, foster transactivity in collaborative environments, and lead to substantial learning gains when used to supplement the instruction of a teacher or when used as the sole method of instruction. A common facet of many ITS is their use of Bayesian Knowledge Tracing (BKT) to estimate the parameters necessary for the implementation of the artificial intelligence component and for the probability of mastery of a knowledge component relevant to the ITS. While various techniques exist to estimate these parameters and the probability of mastery, none directly and reliably asks the user to self-assess them. In this study, 111 undergraduate students used an ITS in a college-level introductory statistics course for which detailed transaction-level observations were recorded, and users were also routinely asked direct questions that would lead to such a self-assessment. Comparisons were made between these self-assessed values and those obtained using commonly used estimation techniques. Our findings show that such self-assessments are particularly relevant at the early stages of ITS usage, while transaction-level data are scant. Once a user’s transaction-level data become available after sufficient ITS usage, they can replace the self-assessments in order to eliminate the identifiability problem in BKT. We discuss how these findings are relevant to the number of exercises necessary to reach mastery of a knowledge component, the associated implications for learning curves, and the relevance to instruction time. Keywords: Bayesian Knowledge Tracing, Intelligent Tutoring System, in vivo study, parameter estimation
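The BKT machinery the abstract refers to reduces to a two-step Bayesian update with slip, guess, and learn parameters; a minimal sketch, seeding the mastery probability with a hypothetical self-assessed value in place of fitted parameters:

```python
def bkt_update(p_mastery, correct, slip, guess, learn):
    """One Bayesian Knowledge Tracing step: condition the mastery estimate
    on the observed response, then apply the learning transition."""
    if correct:
        evidence = p_mastery * (1 - slip)
        posterior = evidence / (evidence + (1 - p_mastery) * guess)
    else:
        evidence = p_mastery * slip
        posterior = evidence / (evidence + (1 - p_mastery) * (1 - guess))
    return posterior + (1 - posterior) * learn

# Hypothetical self-assessed prior and parameters in place of fitted values
p = 0.3
for obs in [True, True, False, True]:
    p = bkt_update(p, obs, slip=0.1, guess=0.2, learn=0.15)
```

The identifiability problem arises because different (slip, guess, learn, prior) combinations can produce near-identical observation likelihoods; anchoring the prior with a self-assessment constrains that ambiguity while transaction data are scant.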
Procedia PDF Downloads 174
979 Influence of Travel Time Reliability on Elderly Drivers Crash Severity
Authors: Ren Moses, Emmanuel Kidando, Eren Ozguven, Yassir Abdelrazig
Abstract:
Although older drivers (defined as those aged 65 and above) are less involved in speeding, alcohol use, and night driving, they are more vulnerable to severe crashes, with frailty and medical complications as major contributing factors. Several studies have evaluated the factors contributing to crash severity. However, few studies have established the impact of travel time reliability (TTR) on road safety. In particular, the impact of TTR on senior adults, who face several driving challenges including hearing difficulties, declining processing skills, and cognitive problems, is not well established. Therefore, this study focuses on determining the possible impacts of TTR on traffic safety with a focus on elderly drivers. Historical travel speed data from freeway links in the study area were used to calculate travel time and the associated TTR metrics, that is, the planning time index, the buffer index, the standard deviation of travel time, and the probability of congestion. Four years of information on crashes occurring on these freeway links was acquired. A binary logit model estimated using the Markov Chain Monte Carlo (MCMC) sampling technique was used to evaluate variables that could influence elderly crash severity. Preliminary results of the analysis suggest that TTR is statistically significant in affecting the severity of a crash involving an elderly driver. The results suggest that a one-unit increase in the probability of congestion reduces the likelihood of a severe crash involving an elderly driver by nearly 22%. These findings will enhance the understanding of TTR and its impact on elderly crash severity. Keywords: highway safety, travel time reliability, elderly drivers, traffic modeling
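The four TTR metrics named above can be computed directly from a sample of link travel times; a sketch with made-up travel times (the nearest-rank percentile method and the congestion threshold are assumptions, not the study's definitions):

```python
import math

def ttr_metrics(travel_times_min, free_flow_min, congested_threshold_min):
    """Planning time index, buffer index, standard deviation, and probability
    of congestion from a sample of link travel times (all in minutes)."""
    n = len(travel_times_min)
    ordered = sorted(travel_times_min)
    # 95th-percentile travel time (nearest-rank method)
    t95 = ordered[max(0, math.ceil(0.95 * n) - 1)]
    mean = sum(ordered) / n
    var = sum((t - mean) ** 2 for t in ordered) / n
    return {
        "planning_time_index": t95 / free_flow_min,
        "buffer_index": (t95 - mean) / mean,
        "std_dev": math.sqrt(var),
        "p_congestion": sum(t > congested_threshold_min for t in ordered) / n,
    }

# Hypothetical link: 10-minute free-flow time, 15 minutes counted as congested
m = ttr_metrics([10, 11, 10, 12, 20, 11, 10, 13, 11, 12],
                free_flow_min=10.0, congested_threshold_min=15.0)
```

Each metric then enters the logit model as a candidate covariate for crash severity.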
Procedia PDF Downloads 495
978 Evaluating Probable Bending of Frames for Near-Field and Far-Field Records
Authors: Majid Saaly, Shahriar Tavousi Tafreshi, Mehdi Nazari Afshar
Abstract:
Most reinforced concrete structures designed only for gravity loads have large transverse reinforcement spacing values and therefore suffer severe failure after intense ground motions. The main goal of this paper is to compare the shear and axial failure of concrete bending frames typical of Tehran using incremental dynamic analysis (IDA) under near- and far-field records. For this purpose, IDA of 5-, 10-, and 15-story concrete structures was performed under seven far-fault records and five near-fault records. The results show that in two-dimensional models of low-rise, mid-rise, and high-rise reinforced concrete frames located on Type-3 soil, increasing the spacing of the transverse reinforcement can increase the maximum inter-story drift ratio values by up to 37%. According to the results for the 5-, 10-, and 15-story reinforced concrete models located on Type-3 soil, records with characteristics such as fling-step and directivity create larger maximum inter-story drift values than far-fault earthquakes. The results indicated that under earthquakes exhibiting directivity or fling-step, the probability of failure and its rate of increase are much smaller than the corresponding values for far-fault earthquakes; however, for near-fault records, the probability of exceedance occurs at lower seismic intensities than for far-fault records. Keywords: IDA, failure curve, directivity, maximum floor drift, fling step, evaluating probable bending of frames, near-field and far-field earthquake records
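The probability of exceedance at a given seismic intensity derived from IDA results is conventionally summarized with a lognormal fragility curve; the median and dispersion values below are illustrative, not the fitted values for these frames:

```python
import math

def fragility(im, theta, beta):
    """Lognormal fragility: probability of exceeding a damage state at
    intensity measure im, with median capacity theta and dispersion beta."""
    z = math.log(im / theta) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))   # standard normal CDF

# Near-fault records often imply a lower median capacity (assumed numbers),
# so the same intensity yields a higher exceedance probability
p_far = fragility(im=0.5, theta=1.0, beta=0.4)    # far-fault curve
p_near = fragility(im=0.5, theta=0.7, beta=0.45)  # near-fault curve
```

Plotting `fragility` over a range of intensities reproduces the failure curves listed in the keywords, one per record set and frame height.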
Procedia PDF Downloads 109
977 Sustainability of Heritage Management in Aksum: Focus on Heritage Conservation and Interpretation
Authors: Gebrekiros Welegebriel Asfaw
Abstract:
The management of fragile, unique, and irreplaceable cultural heritage is becoming a major challenge as important elements of culture vanish throughout the globe. The major purpose of this study is to assess how the cultural heritage of Aksum is managed for future sustainability, from the perspectives of heritage conservation and interpretation. A descriptive research design incorporating both quantitative and qualitative methods is employed. Primary quantitative data were collected from 189 respondents (19 professionals, 88 tourism service providers, and 82 tourists), and interviews were conducted with 33 targeted informants drawn from heritage and related professions, security employees, the local community, service providers, and church representatives, applying both probability and non-probability sampling methods. Findings of the study reveal that the overall sustainable management status of the cultural heritage of Aksum is below average. The sustainability of cultural heritage management in Aksum faces many unfavorable factors: lack of long-term planning, an incompatible system of heritage administration, limited capacity and number of professionals, scant attention to community-based heritage and tourism development, dirtiness and drainage problems, problems with stakeholder involvement and cooperation, and lack of organized interpretation and presentation systems, among others. Re-organization of the management system, creating a platform for coordination among stakeholders, and developing an appropriate interpretation system could be good remedies. Introducing the concept of community-based heritage and tourism development is also recommended for long-term, win-win success in Aksum.
Keywords: Aksum, conservation, interpretation, sustainable cultural heritage management
Procedia PDF Downloads 325
976 Destination of the PhDs: Determinants of International Mobility of UK PhD Graduates
Authors: Anna Siuda-Bak
Abstract:
This paper adopts a comparative approach to examining the determinants of international mobility of German, Italian, and British researchers who completed their doctoral education in the UK. Structured sampling and data collection techniques were developed to retrieve information on participants from publicly available sources. This systematically collected data was supplemented with an online survey capturing participants’ job trajectories, including movements between positions, institutions, and countries. In total, data on 949 German, Italian, and British PhDs were collected. Logistic regression was employed to identify factors associated with one’s probability of moving outside the UK after graduation. The predictor variables included factors associated with the PhD (field of study, ranking of the university that awarded the degree) and family factors (having a child, nationality of the partner). Then, nine constrained models were estimated to test the effect of each variable on the probability of moving to a specific destination: an English-speaking country, a non-English-speaking country, or the home country. The results show that females, arts and humanities graduates, and respondents with a partner from the UK are less mobile than their counterparts. The effect of university ranking differed between the two groups. UK graduates from higher-ranked universities were more likely to move abroad than to stay in the UK after graduation. In contrast, non-UK natives from the same universities were less likely to be internationally mobile than non-UK natives from lower-ranked universities. The nationality of the partner was the most important predictor of the specific destination choice: graduates with a partner from the home country were more likely to return home, and those with a partner from a third country were least likely to return.
Keywords: doctoral graduates, international mobility, nationality, UK
Procedia PDF Downloads 323
975 Evaluation of the Mechanical Behavior of a Retaining Wall Structure on a Weathered Soil through Probabilistic Methods
Authors: P. V. S. Mascarenhas, B. C. P. Albuquerque, D. J. F. Campos, L. L. Almeida, V. R. Domingues, L. C. S. M. Ozelim
Abstract:
Retaining slope structures are increasingly considered in geotechnical engineering projects due to extensive urban growth. These kinds of engineering constructions may present instabilities over time and may require reinforcement or even rebuilding. In this context, statistical analysis is an important tool for decision making regarding retaining structures. This study addresses the failure probability of the construction of a retaining wall over the debris of an old, collapsed one. The new solution will be approximately 350 m long and will be located on the margins of Lake Paranoá in Brasília, the capital of Brazil. The building process must also account for the utilization of the ruins as a caisson. A series of in situ and laboratory experiments defined the local soil strength parameters, and a Standard Penetration Test (SPT) defined the in situ soil stratigraphy. The parameters obtained were also verified against soil data from a collection of master's and doctoral works from the University of Brasília on similar local soils. Initial studies show that a concrete wall is the proper solution for this case, taking into account technical, economic, and deterministic analyses. On the other hand, in order to better analyze the statistical significance of the factors of safety obtained, a Monte Carlo analysis was performed for the concrete wall and two other initial solutions. A comparison between the statistical and risk results generated for the different solutions indicated that a gabion solution would better fit the financial and technical feasibility of the project.
Keywords: economical analysis, probability of failure, retaining walls, statistical analysis
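A Monte Carlo probability-of-failure analysis of the kind described above can be sketched in a few lines: sample the uncertain soil parameters, compute a factor of safety (FS), and count FS < 1 outcomes. The toy resisting/driving model and every parameter value below are illustrative assumptions, not the study's data:

```python
# Minimal Monte Carlo sketch of probability of failure for a retaining wall.
# Model form and parameter distributions are hypothetical.
import math
import random

random.seed(42)

def factor_of_safety(friction_angle_deg, cohesion_kpa, driving_force_kn):
    # Hypothetical resisting force: a cohesive term plus a frictional term.
    resisting = cohesion_kpa * 10.0 + driving_force_kn * math.tan(math.radians(friction_angle_deg))
    return resisting / driving_force_kn

n = 100_000
failures = sum(
    factor_of_safety(random.gauss(28.0, 3.0),    # friction angle (degrees)
                     random.gauss(15.0, 4.0),    # cohesion (kPa)
                     random.gauss(250.0, 30.0))  # driving force (kN)
    < 1.0
    for _ in range(n)
)
p_failure = failures / n  # estimated probability of failure
```

Running the same loop for each candidate solution (concrete wall, gabion, etc.) yields comparable failure probabilities for the risk comparison.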
Procedia PDF Downloads 406
974 Bias-Corrected Estimation Methods for Receiver Operating Characteristic Surface
Authors: Khanh To Duc, Monica Chiogna, Gianfranco Adimari
Abstract:
With three diagnostic categories, assessment of the performance of diagnostic tests is achieved by analysis of the receiver operating characteristic (ROC) surface, which generalizes the ROC curve for binary diagnostic outcomes. The volume under the ROC surface (VUS) is a summary index usually employed for measuring overall diagnostic accuracy. When the true disease status can be exactly assessed by means of a gold standard (GS) test, unbiased nonparametric estimators of the ROC surface and VUS are easily obtained. In practice, unfortunately, disease status verification via the GS test may be unavailable for all study subjects due to the expensiveness or invasiveness of the GS test. Thus, often only a subset of patients undergoes disease verification. Statistical evaluations of diagnostic accuracy based only on data from subjects with verified disease status are typically biased; this bias is known as verification bias. Here, we address the problem of correcting for verification bias when continuous diagnostic tests for a three-class disease status are considered. We assume that selection for disease verification does not depend on disease status, given test results and other observed covariates, i.e., that the true disease status, when missing, is missing at random. Under this assumption, we discuss several solutions for ROC surface analysis based on imputation and re-weighting methods. In particular, verification bias-corrected estimators of the ROC surface and of VUS are proposed, namely full imputation, mean score imputation, inverse probability weighting, and semiparametric efficient estimators. Consistency and asymptotic normality of the proposed estimators are established, and their finite-sample behavior is investigated by means of Monte Carlo simulation studies. Two illustrations using real datasets are also given.
Keywords: imputation, missing at random, inverse probability weighting, ROC surface analysis
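For fully verified data, the unbiased nonparametric VUS estimator mentioned above is simply the fraction of (class 1, class 2, class 3) subject triples whose test results are correctly ordered. The sketch below counts only strict orderings for simplicity (a common convention gives half-credit for ties):

```python
# Nonparametric VUS estimate: proportion of correctly ordered triples.
import itertools

def vus(scores_1, scores_2, scores_3):
    """Volume under the ROC surface for three ordered disease classes."""
    count = 0
    total = len(scores_1) * len(scores_2) * len(scores_3)
    for x, y, z in itertools.product(scores_1, scores_2, scores_3):
        if x < y < z:  # test result increases with disease class
            count += 1
    return count / total

# Perfectly separated classes give VUS = 1; a useless test gives about 1/6.
v = vus([0.1, 0.2], [0.4, 0.5], [0.8, 0.9])
```

Under verification bias, this estimator is computed only on verified subjects, which is exactly where the imputation and re-weighting corrections of the abstract come in.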
Procedia PDF Downloads 416
973 Probabilistic Crash Prediction and Prevention of Vehicle Crash
Authors: Lavanya Annadi, Fahimeh Jafari
Abstract:
Transportation brings immense benefits to society, but it also has its costs. These include the cost of infrastructure, personnel, and equipment, but also the loss of life and property in traffic accidents, delays in travel due to traffic congestion, and various indirect costs. Much research has been done to identify the factors that affect road accidents, such as road infrastructure, traffic, sociodemographic characteristics, land use, and the environment. The aim of this research is probabilistic crash prediction for vehicles in the United States using machine learning, based on natural and structural factors and excluding behavioral causes such as overspeeding. These factors range from weather factors, such as weather conditions, precipitation, visibility, wind speed, wind direction, temperature, pressure, and humidity, to road structure factors such as bumps, roundabouts, no-exit roads, turning loops, give-way signs, etc. Probabilities are dissected into ten different classes, and all predictions are based on multiclass classification techniques, which are supervised learning. This study considers all crashes that happened in all states, as collected by the US government. To calculate the probability, the multinomial expected value was used and assigned as a classification label representing the crash probability. We applied three different classification models: multiclass logistic regression, random forest, and XGBoost. The numerical results show that XGBoost achieved a 75.2% accuracy rate, which indicates the part played by natural and structural factors in crashes. The paper also provides in-depth insights through exploratory data analysis.
Keywords: road safety, crash prediction, exploratory analysis, machine learning
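The labeling step described above (multinomial expected value assigned as one of ten probability classes) can be sketched as follows. The severity weights, the class-boundary convention, and the example counts are assumptions for illustration, not the paper's definitions:

```python
# Sketch: turn a location's crash-severity distribution into a multinomial
# expected value, then bin it into one of ten classes used as the target
# label for multiclass classifiers (logistic regression, random forest,
# XGBoost). Severity weights and ranges are hypothetical.

def expected_severity(counts, severity_values=(1, 2, 3, 4)):
    """Multinomial expected value of severity from per-severity crash counts."""
    total = sum(counts)
    return sum(v * c / total for v, c in zip(severity_values, counts))

def probability_class(value, lo=1.0, hi=4.0, n_classes=10):
    """Map an expected value onto one of ten equal-width classes (0..9)."""
    idx = int((value - lo) / (hi - lo) * n_classes)
    return min(max(idx, 0), n_classes - 1)

ev = expected_severity([5, 3, 1, 1])  # mostly minor crashes
label = probability_class(ev)         # ten-class target for the classifiers
```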
Procedia PDF Downloads 113
972 Sizing Residential Solar Power Systems Based on Site-Specific Energy Statistics
Authors: Maria Arechavaleta, Mark Halpin
Abstract:
In the United States, the costs of solar energy systems have declined to the point that they are viable options for most consumers. However, there are no consistent procedures for specifying sufficient systems. The factors that must be considered are energy consumption, potential solar energy production, and cost. The traditional method of specifying solar energy systems is based on assumed daily levels of available solar energy and average amounts of daily energy consumption. The mismatches between energy production and consumption are usually mitigated using battery energy storage systems, and energy use is curtailed when necessary. The main consumer decision question that drives the total system cost is: how much unserved (or curtailed) energy is acceptable? Of course, additional solar conversion equipment can be installed to provide greater peak energy production, and extra energy storage capability can be added to mitigate longer-lasting periods of low solar energy production. Each option increases the total cost and provides a benefit that is difficult to quantify accurately. An approach to quantifying the cost-benefit of adding additional resources, either production or storage or both, based on the statistical concepts of loss-of-energy probability and expected unserved energy, is presented in this paper. Relatively simple calculations, based on site-specific energy availability and consumption data, can be used to show the value of each additional increment of production or storage. With this incremental benefit-cost information, consumers can select the best overall performance combination for their application at a cost they are comfortable paying. The approach is based on a statistical analysis of energy consumption and production characteristics over time. The characteristics take the form of curves, with each point representing an energy consumption or production value over a period of time; a one-minute period is used in this paper. These curves are measured at the consumer location under the conditions that exist at the site, over a duration of at least one week. While greater accuracy could be obtained with longer recording periods, the examples in this paper are based on a single week for demonstration purposes. The weekly consumption and production curves are overlaid on each other, and the mismatches are used to size the battery energy storage system. Loss-of-energy probability and expected unserved energy indices are calculated in addition to the total system cost. These indices allow the consumer to recognize and quantify the benefit (probably a reduction in energy consumption curtailment) available for a given increase in cost. Consumers can then make informed decisions that are accurate for their location and conditions and consistent with their available funds.
Keywords: battery energy storage systems, loss of load probability, residential renewable energy, solar energy systems
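The two indices can be computed from the one-minute production and consumption curves with a simple battery balance. The battery logic (start full, charge clipped at capacity, deficit counted as unserved) and the toy five-minute window below are illustrative assumptions:

```python
# Sketch: loss-of-energy probability (LOEP) and expected unserved energy
# (EUE) from per-minute production/consumption curves and a battery size.

def reliability_indices(production, consumption, battery_kwh):
    """Return (LOEP, EUE in kWh) for per-minute energy curves (kWh/min)."""
    stored = battery_kwh  # assume the battery starts full
    unserved = 0.0
    deficit_minutes = 0
    for p, c in zip(production, consumption):
        net = p - c
        stored = min(battery_kwh, stored + net)  # charge, clipped at capacity
        if stored < 0:                           # demand not fully met
            unserved += -stored                  # energy shortfall this minute
            deficit_minutes += 1
            stored = 0.0
    loep = deficit_minutes / len(production)     # fraction of minutes short
    return loep, unserved

prod = [0.02, 0.03, 0.0, 0.0, 0.05]   # kWh produced each minute
cons = [0.01, 0.01, 0.02, 0.03, 0.01]  # kWh consumed each minute
loep, eue = reliability_indices(prod, cons, battery_kwh=0.02)
```

Re-running the function with incrementally larger `battery_kwh` or scaled-up `prod` values gives the per-increment benefit the abstract describes.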
Procedia PDF Downloads 235
971 Does Clinical Guidelines Affect Healthcare Quality and Populational Health: Quebec Colorectal Cancer Screening Program
Authors: Nizar Ghali, Bernard Fortin, Guy Lacroix
Abstract:
In Quebec, colonoscopy volumes have continued to rise in recent years in the absence of an effective mechanism for monitoring the appropriateness and quality of these exams. In November 2010, the Quebec government introduced a colorectal cancer screening program with the objective of controlling volume and cost imperfections. This program is based on clinical standards and was initiated for a first group of institutions. One year later, the government added financial incentives for participating institutions. In this analysis, we assess the causal effect of the two components of this program: clinical pathways and financial incentives. In particular, we assess the reform's effect on healthcare quality and population health in a context where medical remuneration does not depend directly on the additional funding offered by the program. We have data on admission episodes and deaths for eight years. We use a multistate model, analogous to a difference-in-differences approach, to estimate the reform's effect on the transition probabilities between different states for each patient. Our results show that the reform reduced length of stay without deterioration in hospital mortality or the readmission rate. On the other hand, the program contributed to a decrease in the hospitalization rate and a less invasive treatment approach for colorectal surgeries. This is a sign of improvement in healthcare quality and population health. We demonstrate in this analysis that physicians' behavior can be affected by both clinical standards and financial incentives, even if the incentives are offered to facilities.
Keywords: multi-state and multi-episode transition model, healthcare quality, length of stay, transition probability, difference in difference
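The multistate building block, estimating transition probabilities between patient states from observed episode sequences, can be sketched as below; a difference-in-differences comparison would then contrast these matrices before and after the reform for treated versus control institutions. The state names and episode sequences are illustrative assumptions:

```python
# Sketch: empirical transition probabilities between patient states.
from collections import Counter

def transition_probabilities(sequences):
    """Estimate P(next state | current state) from episode sequences."""
    counts = Counter()
    totals = Counter()
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):  # consecutive state pairs
            counts[(a, b)] += 1
            totals[a] += 1
    return {pair: n / totals[pair[0]] for pair, n in counts.items()}

episodes = [
    ["home", "hospital", "home"],
    ["home", "hospital", "hospital", "dead"],
    ["home", "home", "hospital", "home"],
]
p = transition_probabilities(episodes)
```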
Procedia PDF Downloads 214
970 Development of a Framework for Assessing Public Health Risk Due to Pluvial Flooding: A Case Study of Sukhumvit, Bangkok
Authors: Pratima Pokharel
Abstract:
When sewers overflow due to rainfall in urban areas, public health risks arise when individuals are exposed to the contaminated floodwater. Nevertheless, the extent to which such infections pose a risk to public health remains unclear. This study analyzed reported diarrheal cases by month and age in Bangkok, Thailand. The results showed that more cases are reported in the wet season than in the dry season. It was also found that in Bangkok, the probability of infection with diarrheal diseases in the wet season is higher for the age group between 15 and 44. The probability of infection is highest for children under 5 years, but they are not influenced by wet weather. Further, this study introduced a vulnerability analysis for the health risks of urban flooding and identified vulnerability variables that contribute to those risks. For the vulnerability analysis, the study chose two variables that contribute to health risk: economic status and age. Assuming that people's economic status depends on the types of houses they live in, the study shows the spatial distribution of economic status in the vulnerability maps. The vulnerability maps show that, with respect to housing type, people living in Sukhumvit have low vulnerability to health risks. In addition, the probability of diarrheal infection was analyzed by age. Moreover, a field survey was carried out to validate the vulnerability of the population; it showed that health vulnerability depends on economic status, income level, and education, and that people with low income and poor living conditions are more vulnerable to health risks. Further, the study carried out 1D hydrodynamic advection-dispersion modelling with a 2-year rainfall event to simulate the dispersion of fecal coliform concentration in the drainage network, as well as 1D/2D hydrodynamic modelling to simulate the overland flow. The 1D results show higher concentrations for dry weather flows and a large dilution at the commencement of a rainfall event, with the concentration dropping due to the runoff generated after rainfall. The model also produced flood depth, flood duration, and fecal coliform concentration maps, which were transferred to ArcGIS to produce hazard and risk maps. In addition, the study ran 5-year and 10-year rainfall simulations to show the variation in health hazards and risks. It was found that even though hazard coverage is highest with the 10-year rainfall event among the three events, the risk was observed to be the same for the 5-year and 10-year rainfall events.
Keywords: urban flooding, risk, hazard, vulnerability, health risk, framework
Procedia PDF Downloads 76
969 Predictors of Pelvic Vascular Injuries in Patients with Pelvic Fractures from Major Blunt Trauma
Authors: Osama Zayed
Abstract:
Aim of the work: The aim of this study is to assess the predictors of pelvic vascular injuries in patients with pelvic fractures from major blunt trauma. Methods: This study was conducted as a tool-assessment study. Forty-six patients with pelvic fractures from major blunt trauma arriving at the emergency department of Suez Canal University Hospital were recruited. Data were collected using a questionnaire covering the personal data of the studied patients and full medical history, clinical examinations, outcome measures (the Physiological and Operative Severity Score for the enUmeration of Mortality and morbidity, POSSUM), and laboratory and imaging studies. Patients underwent surgical interventions or further investigations based on the conventional standards for intervention. All patients were followed up during the conservative, operative, and post-operative periods in the hospital to interpret the predictive scores for vascular injuries. Results: Significant predictors of vascular injuries according to computed tomography (CT) scan included age, male gender, lower Glasgow Coma Scale (GCS) scores, occurrence of hypotension, mortality rate, higher physical POSSUM scores, presence of an ultrasound collection, type of management, higher systolic blood pressure (SBP) and diastolic blood pressure (DBP) POSSUM scores, presence of abdominal injuries, and poor outcome. Conclusions: There was a higher frequency of males than females among the studied patients, and a high probability of morbidity but a low probability of mortality. Our study demonstrates that the POSSUM score can be used as a predictor of vascular injury in patients with pelvic fractures.
Keywords: predictors, pelvic vascular injuries, pelvic fractures, major blunt trauma, POSSUM
Procedia PDF Downloads 342
968 Effect of Dimensional Reinforcement Probability on Discrimination of Visual Compound Stimuli by Pigeons
Authors: O. V. Vyazovska
Abstract:
Behavioral efficiency is one of the main principles of success in nature. The accuracy of visual discrimination is determined by attention, learning experience, and memory. Under experimental conditions, pigeons' responses to visual stimuli presented on a monitor screen are behaviorally manifested by pecking or not pecking the stimulus, the number of pecks, reaction time, etc. The higher the probability of reward, the more likely pigeons are to respond to the stimulus. We trained eight pigeons (Columba livia) on a stagewise go/no-go visual discrimination task. Sixteen visual stimuli were created from all possible combinations of four binary dimensions: brightness (dark/bright), size (large/small), line orientation (vertical/horizontal), and shape (circle/square). In the first stage, we presented S+ and four S- stimuli: the first differing from S+ in all four dimensional values, the second sharing the brightness dimension with S+, the third sharing brightness and orientation with S+, and the fourth sharing brightness, orientation, and size. Then all 16 stimuli were added. Pigeons correctly rejected six to eight of the 11 newly added S- stimuli at the beginning of the second stage. The results revealed that the pigeons' behavior at the beginning of the second stage was controlled by the probabilities of reward for the four dimensions learned in the first stage. Whether more or fewer mistakes in dimension discrimination were made at the beginning of the second stage depended on the number of S- stimuli sharing the dimension with S+ in the first stage. A significant inverse correlation was found between the number of S- stimuli sharing dimension values with S+ in the first stage and the dimensional learning rate at the beginning of the second stage. Pigeons were more confident in discriminating the shape and size dimensions: the mistakes they made at the beginning of the second stage were not associated with these dimensions. These results help elucidate the principles of dimensional stimulus control during the learning of compound multidimensional visual stimuli.
Keywords: visual go/no go discrimination, selective attention, dimensional stimulus control, pigeon
Procedia PDF Downloads 142
967 Assessing Children’s Probabilistic and Creative Thinking in a Non-formal Learning Context
Authors: Ana Breda, Catarina Cruz
Abstract:
Daily, we face unpredictable events, often attributed to chance, as there is no justification for their occurrence. Chance, understood as a source of uncertainty, is present in several aspects of human life, such as weather forecasts, dice rolling, and lotteries. Surprisingly, humans and some animals can quickly adjust their behavior to efficiently handle doubly stochastic processes (random events with two layers of randomness, like unpredictable weather affecting dice rolling). This adjustment ability suggests that the human brain has built-in mechanisms for perceiving, understanding, and responding to simple probabilities. It also explains why current trends in mathematics education include probability concepts in official curriculum programs from the third year of primary education onwards. In the first years of schooling, children learn to use a specific vocabulary, with words such as never, always, rarely, perhaps, likely, and unlikely, to help them perceive and understand the probability of events; these are keywords of crucial importance for their perception and understanding of probabilities. The development of probabilistic concepts comes from facts and cause-effect sequences resulting from the subject's actions, as well as from the notion of chance and intuitive estimates based on everyday experiences. As part of a junior summer school program that took place at a Portuguese university, a non-formal learning experiment was carried out with 18 children in the 5th and 6th grades. This experience was designed to be implemented in the dynamic of a serious ice-breaking game, to assess the children's levels of probabilistic, critical, and creative thinking in understanding impossible, certain, equally probable, likely, and unlikely events, and to gain insight into how the non-formal learning context influenced their achievements. The criteria used to evaluate probabilistic thinking included the creative ability to conceive events classified in the specified categories, the ability to properly justify the categorization, the ability to critically assess the events classified by other children, and the ability to make predictions based on a given probability. The data analysis employs a qualitative, descriptive, and interpretative-methods approach based on students' written productions, audio recordings, and researchers' field notes. This methodology allowed us to conclude that such an approach is an appropriate and helpful formative assessment tool. The promising results of this initial exploratory study call for a future research study with children at these levels of education, from different regions, attending public or private schools, to validate and expand our findings.
Keywords: critical and creative thinking, non-formal mathematics learning, probabilistic thinking, serious game
Procedia PDF Downloads 28
966 Decision Making in Medicine and Treatment Strategies
Authors: Kamran Yazdanbakhsh, Somayeh Mahmoudi
Abstract:
Three reasons justify the use of decision theory in medicine: 1. The growth and complexity of medical knowledge make it difficult to process treatment information effectively without resorting to sophisticated analytical methods, especially when it comes to detecting errors and identifying treatment opportunities in large databases. 2. There is wide geographic variability in medical practice. In a context where medical costs are borne, at least in part, by the patient, these variations raise doubts about the relevance of the choices made by physicians. The differences are generally attributed to differing estimates of the probabilities of success of the treatments involved and to differing assessments of the outcomes of success or failure. Without explicit decision criteria, it is difficult to identify precisely the sources of these variations in treatment. 3. Beyond the principle of informed consent, patients need to be involved in decision-making. For this, the decision process should be explained and broken down. A decision problem is to select the best option from a set of choices. The problem is what is meant by "best option", or knowing what criteria should guide the choice; the purpose of decision theory is to answer this question. The systematic use of decision models allows us to better understand differences in medical practice and facilitates the search for consensus. In this regard, there are three types of situations: certain situations, risky situations, and uncertain situations. 1. In certain situations, the consequence of each decision is certain. 2. In risky situations, every decision can have several consequences, and the probability of each consequence is known. 3. In uncertain situations, every decision can have several consequences, and the probabilities are not known. Our aim in this article is to show how decision theory can usefully be mobilized to meet the needs of physicians. Decision theory can make decisions more transparent: first, by systematically clarifying the data of the problem under consideration, and second, by asking which basic principles should guide the choice. Once the problem is clarified, decision theory provides operational tools to represent the available information and determine patient preferences, and thus assist the patient and doctor in their choices.
Keywords: decision making, medicine, treatment strategies, patient
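The "risky situation" case above, in which each decision has several consequences with known probabilities, is classically handled by maximizing expected utility. The treatments, outcome probabilities, and utility values below are hypothetical:

```python
# Sketch: choosing a treatment under risk by maximizing expected utility.

def expected_utility(outcomes):
    """outcomes: list of (probability, utility) pairs for one treatment."""
    return sum(p * u for p, u in outcomes)

# Hypothetical options: utility 1.0 = full recovery, 0.0 = death.
treatments = {
    "surgery":    [(0.85, 1.0), (0.10, 0.4), (0.05, 0.0)],  # cure / partial / death
    "medication": [(0.60, 1.0), (0.40, 0.5)],               # cure / partial
}
best = max(treatments, key=lambda t: expected_utility(treatments[t]))
```

Replacing the utility values with elicited patient preferences is how the same calculation supports shared decision-making.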
Procedia PDF Downloads 579
965 A Copula-Based Approach for the Assessment of Severity of Illness and Probability of Mortality: An Exploratory Study Applied to Intensive Care Patients
Authors: Ainura Tursunalieva, Irene Hudson
Abstract:
Continuous improvement of both the quality and safety of health care is an important goal in Australia and internationally. The intensive care unit (ICU) receives patients with a wide variety and severity of illnesses. Accurately identifying patients at risk of developing complications or dying is crucial to increasing healthcare efficiency. Thus, it is essential for clinicians and researchers to have a robust framework capable of evaluating the risk profile of a patient; ICU scoring systems provide such a framework. The Acute Physiology and Chronic Health Evaluation III (APACHE III) and the Simplified Acute Physiology Score II (SAPS II) are ICU scoring systems frequently used for assessing the severity of acute illness. These scoring systems collect multiple risk factors for each patient, including physiological measurements, and then render the assessment outcomes of the individual risk factors into a single numerical value, where a higher score indicates a more severe patient condition. Furthermore, the Mortality Probability Model II (MPM II) uses logistic regression on independent risk factors to predict a patient's probability of mortality. An important, overlooked limitation of SAPS II and MPM II is that, to date, they do not include interaction terms between a patient's vital signs. This is a prominent oversight, as there is likely an interplay among vital signs: the co-existence of certain conditions may pose a greater health risk than when these conditions exist independently. One barrier to including such interaction terms in predictive models is the dimensionality issue, as variable selection becomes difficult. We propose an innovative scoring system which takes into account the dependence structure among a patient's vital signs, such as systolic and diastolic blood pressures, heart rate, pulse interval, and peripheral oxygen saturation. Copulas will capture the dependence among normally distributed and skewed variables, as some of the vital sign distributions are skewed. The estimated dependence parameter will then be incorporated into the traditional scoring systems to adjust the points allocated to the individual vital sign measurements. The same dependence parameter will also be used to create an alternative copula-based model for predicting a patient's probability of mortality. The new copula-based approach will accommodate not only a patient's trajectories of vital signs but also the joint dependence probabilities among the vital signs. We hypothesise that this approach will produce more stable assessments and lead to more time-efficient and accurate predictions. We will use two data sets: (1) 250 ICU patients admitted once to the Chui Regional Hospital (Kyrgyzstan) and (2) 37 ICU patients' agitation-sedation profiles collected by the Hunter Medical Research Institute (Australia). Both the traditional scoring approach and our copula-based approach will be evaluated using the Brier score to indicate overall model performance, the concordance (or c) statistic to indicate discriminative ability (area under the receiver operating characteristic (ROC) curve), and goodness-of-fit statistics for calibration. We will also report discrimination and calibration values and provide visualizations of the copulas and of high-dimensional regions of risk interrelating two or three vital signs in so-called higher-dimensional ROCs.
Keywords: copula, intensive unit scoring system, ROC curves, vital sign dependence
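One standard way to obtain a copula dependence parameter between two vital signs, useful because it is rank-based and so robust to the skewness the abstract mentions, is to compute Kendall's tau and invert the Gaussian-copula relation rho = sin(pi * tau / 2). The choice of the Gaussian copula and the vital-sign values below are assumptions for illustration, not the authors' method or data:

```python
# Sketch: rank-based estimation of a Gaussian-copula dependence parameter
# between two vital signs (here, hypothetical systolic/diastolic readings).
import math
from itertools import combinations

def kendall_tau(x, y):
    """Kendall's tau from concordant/discordant pair counts (ties dropped)."""
    concordant = discordant = 0
    for (x1, y1), (x2, y2) in combinations(zip(x, y), 2):
        s = (x1 - x2) * (y1 - y2)
        if s > 0:
            concordant += 1
        elif s < 0:
            discordant += 1
    n_pairs = len(x) * (len(x) - 1) / 2
    return (concordant - discordant) / n_pairs

systolic = [110, 120, 130, 140, 150, 160]
diastolic = [70, 75, 80, 88, 85, 95]
tau = kendall_tau(systolic, diastolic)
rho = math.sin(math.pi * tau / 2)  # implied Gaussian-copula correlation
```

A parameter like `rho` is the kind of quantity that could then re-weight the points a scoring system allocates to each vital sign, as the proposal describes.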
Procedia PDF Downloads 153
964 Aging and Falls Profile from Hospital Databases
Authors: Nino Chikhladze, Tamar Dochviri, Nato Pitskhelauri, Maia Bitskhinashvili
Abstract:
Population aging is a key social and demographic trend of the 21st century. Falls represent a prevalent geriatric syndrome that poses significant risks to the health and independence of older adults. The World Health Organization notes a lack of comprehensive data on falls in low- and middle-income countries, complicating the creation of effective prevention programs. To the authors’ best knowledge, no such studies have been conducted in Georgia. The aim of the study is to explore the epidemiology of falls in the elderly population. The hospitalization database of the National Center for Disease Control and Public Health of Georgia was used for this retrospective study. Fall-related injuries were identified using ICD-10 codes: class XIX (S and T codes) for the nature of the injury and class XX (V-Y codes) for the external cause. Statistical analyses were done using SPSS software version 23.0. The total number of fall-related hospitalizations for individuals aged 65 and older from 2015 to 2021 was 29,697. The study revealed that falls accounted for an average of 63% (ranging from 59% to 66%) of all hospitalizations and 68% (ranging from 65% to 70%) of injury-related hospitalizations during this period. Of all patients, 69% were women and 31% were men (Chi2=4482.1, p<0.001). The highest rate of hospitalization was in the age groups 80-84 and 75-79. The probability of fall-related hospitalization was significantly higher in women (p<0.001) than in men in all age groups except 65-69 years. In the target group aged 65 years and older, the probability of hospitalization increased significantly with age (p<0.001). The study's results can be leveraged to create evidence-based awareness programs, design targeted multi-domain interventions addressing specific risk factors, and enhance the quality of geriatric healthcare services in Georgia. Keywords: elderly population, falls, geriatric patients, hospitalization, injuries
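The reported sex comparison can be illustrated with a chi-square goodness-of-fit test. The counts below are reconstructed from the abstract's rounded percentages (29,697 hospitalizations; 69% women, 31% men), so the statistic only approximates the reported Chi2=4482.1:

```python
from scipy.stats import chisquare

# Approximate counts reconstructed from the abstract's rounded figures;
# these are illustrative, not the study's exact tallies.
women, men = 20491, 9206  # sums to 29,697

# Goodness-of-fit test against an equal sex distribution (the default
# expectation in scipy.stats.chisquare is equal frequencies).
chi2, p = chisquare([women, men])
print(f"Chi2 = {chi2:.1f}, p < 0.001" if p < 0.001 else f"Chi2 = {chi2:.1f}")
```

The rounding of the percentages explains why this reconstruction lands near, but not exactly on, the published statistic.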
Procedia PDF Downloads 31963 Crack Growth Life Prediction of a Fighter Aircraft Wing Splice Joint Under Spectrum Loading Using Random Forest Regression and Artificial Neural Networks with Hyperparameter Optimization
Authors: Zafer Yüce, Paşa Yayla, Alev Taşkın
Abstract:
There are numerous analytical methods to estimate the crack growth life of a component, and soft computing methods are increasingly used to predict fatigue life. Their ability to model complex relationships and to handle large amounts of data motivates researchers and industry professionals to employ them for challenging problems. This study focuses on soft computing methods, especially random forest regressors and artificial neural networks with hyperparameter optimization algorithms such as grid search and randomized grid search, to estimate the crack growth life of an aircraft wing splice joint under variable amplitude loading. The TensorFlow and Scikit-learn libraries of Python are used to build the machine learning models for this study. The material considered in this work is 7050-T7451 aluminum, which is commonly preferred as a structural element in the aerospace industry; regarding the crack type, a corner crack is considered. A finite element model is built for the joint to calculate fastener loads and stresses on the structure. After the finite element model results are validated against analytical calculations, the findings of the finite element model are fed to the AFGROW software to calculate analytical crack growth lives. Based on the Fighter Aircraft Loading Standard for Fatigue (FALSTAFF), 90 unique fatigue loading spectra are developed for various load levels, and these spectra are then used as inputs to the artificial neural network and random forest regression models for predicting crack growth life. Finally, the crack growth life predictions of the machine learning models are compared with the analytical calculations. According to the findings, a good correlation is observed between the analytical and predicted crack growth lives. Keywords: aircraft, fatigue, joint, life, optimization, prediction
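The random-forest-with-grid-search workflow the abstract describes can be sketched with Scikit-learn. The features, target function, and parameter grid below are illustrative assumptions standing in for the study's 90 loading spectra and AFGROW lives, not the actual setup:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import GridSearchCV, train_test_split

rng = np.random.default_rng(42)

# Synthetic stand-in data: 90 samples (one per hypothetical loading spectrum),
# a few features summarising each spectrum (e.g. load level, overload ratio),
# and a crack growth life target with a smooth dependence plus noise.
X = rng.uniform(size=(90, 4))
y = 1e5 * np.exp(-2.0 * X[:, 0]) * (1 + 0.3 * X[:, 1]) + rng.normal(0, 500, 90)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

# Grid search over random forest hyperparameters, as the abstract describes;
# the grid itself is an illustrative choice.
search = GridSearchCV(
    RandomForestRegressor(random_state=0),
    param_grid={"n_estimators": [100, 300], "max_depth": [None, 5]},
    cv=5,
    scoring="r2",
)
search.fit(X_tr, y_tr)
print("best params:", search.best_params_)
print(f"held-out R^2: {search.score(X_te, y_te):.2f}")
```

Swapping `GridSearchCV` for `RandomizedSearchCV` gives the randomized variant of the search mentioned in the abstract; the artificial neural network branch would follow the same fit/score pattern with a TensorFlow model.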
Procedia PDF Downloads 178