Search results for: cohort study
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 50433

50343 The Efficacy of Pre-Hospital Packed Red Blood Cells in the Treatment of Severe Trauma: A Retrospective, Matched, Cohort Study

Authors: Ryan Adams

Abstract:

Introduction: Major trauma is the leading cause of death in 15-45 year olds and carries significant human, social and economic costs. Resuscitation is a cornerstone of trauma management, especially in the pre-hospital environment, and packed red blood cells (pRBC) are being used increasingly with the advent of permissive hypotension. The evidence in this area is lacking, and further research is required to determine its efficacy. Aim: The aim of this retrospective, matched cohort study was to determine whether major trauma patients who received pre-hospital pRBC differed in their initial emergency department cardiovascular status when compared with injury-profile matched controls. Methods: The trauma databases of the Royal Brisbane and Women's Hospital, Royal Children's Hospital (Herston) and Queensland Ambulance Service were accessed, and data on major trauma patients (ISS > 12) who received pre-hospital pRBC from January 2011 to August 2014 were collected. Patients were then matched by injury profile against control patients who had not received pRBC. The primary outcome was cardiovascular status, defined by shock index and Revised Trauma Score (RTS). Results: Data for 25 patients who received pre-hospital pRBC were accessed and their injury profiles matched against suitable controls. On admission to the emergency department, a statistically significant difference in shock index was seen (Blood = 1.42 vs. Control = 0.97, p = 0.0449). However, the same was not seen with the RTS (Blood = 4.15 vs. Control = 5.56, p = 0.291). Discussion: A worsening shock index and Revised Trauma Score were associated with pre-hospital administration of pRBC. However, due to the small sample size, limited matching protocol and associated confounding factors, it is difficult to draw any solid conclusions. Further studies with larger patient numbers are required before adequate conclusions can be drawn on the efficacy of pre-hospital packed red blood cell transfusion.
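The shock index used as the primary outcome above is conventionally computed as heart rate divided by systolic blood pressure; a minimal sketch of that calculation (the patient values below are hypothetical, not taken from the study) might look like:

```python
def shock_index(heart_rate_bpm: float, systolic_bp_mmhg: float) -> float:
    """Shock index = heart rate / systolic blood pressure.

    Values above roughly 0.9 are commonly read as a marker of
    haemodynamic compromise in trauma patients.
    """
    if systolic_bp_mmhg <= 0:
        raise ValueError("systolic blood pressure must be positive")
    return heart_rate_bpm / systolic_bp_mmhg


# Hypothetical patient: tachycardic and hypotensive after major trauma.
print(round(shock_index(120, 85), 2))  # 1.41, close to the blood cohort's reported mean of 1.42
```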

Keywords: pre-hospital, packed red blood cells, severe trauma, emergency medicine

Procedia PDF Downloads 394
50342 A Comparison of the Microbiology Profile for Periprosthetic Joint Infection (PJI) of Knee Arthroplasty and Lower Limb Endoprostheses in Tumour Surgery

Authors: Amirul Adlan, Robert A McCulloch, Neil Jenkins, Michael Parry, Jonathan Stevenson, Lee Jeys

Abstract:

Background and Objectives: The current antibiotic prophylaxis for oncological patients is based upon evidence from primary arthroplasty, despite significant differences in both patient group and procedure. The aim of this study was to compare the microbiological organisms responsible for PJI in patients who underwent two-stage revision for infected primary knee replacement with those of infected oncological endoprostheses of the lower limb in a single institution. This will subsequently guide decision-making regarding antibiotic prophylaxis at primary implantation for oncological procedures and empirical antibiotics for infected revision procedures (where the infecting organism(s) are unknown). Patients and Methods: 118 patients were treated with two-stage revision surgery for infected knee arthroplasty and lower limb endoprostheses between 1999 and 2019: 74 patients had two-stage revision for PJI of knee arthroplasty, and 44 had two-stage revision of lower limb endoprostheses. There were 68 males and 50 females. The mean ages of the knee arthroplasty and lower limb endoprosthesis cohorts were 70.2 years (50-89) and 36.1 years (12-78), respectively (p < 0.01). Patient host and extremity criteria were categorised according to the MSIS Host and Extremity Staging System. Patient microbiological cultures, the incidence of polymicrobial infection and multi-drug resistance (MDR) were analysed and recorded. Results: Polymicrobial infection was reported in 16% (12 patients) of knee arthroplasty PJI and 14.5% (8 patients) of endoprosthesis PJI (p = 0.783). There was a significantly higher incidence of MDR in endoprosthesis PJI, isolated in 36.4% of cultures, compared to knee arthroplasty PJI (17.2%) (p = 0.01). Gram-positive organisms were isolated in more than 80% of cultures from both cohorts. Coagulase-negative Staphylococcus (CoNS) was the commonest Gram-positive organism, and Escherichia coli was the commonest Gram-negative organism in both groups.
According to the MSIS staging system, the host and extremity grades of the knee arthroplasty PJI cohort were significantly better than those of the endoprosthesis PJI cohort (p < 0.05). Conclusion: Empirical antibiotic management of PJI in orthopaedic oncology is based upon evidence from PJI in arthroplasty, despite differences in both host and microbiology. Our results show a significant increase in MDR pathogens within the oncological group, despite CoNS being the most common infective organism in both groups. Endoprosthetic patients presented with poorer host and extremity criteria. These factors should be considered when managing this complex patient group, emphasising the importance of broad-spectrum antibiotic prophylaxis and preoperative sampling to ensure appropriate perioperative antibiotic cover.

Keywords: microbiology, periprosthetic joint infection, knee arthroplasty, endoprostheses

Procedia PDF Downloads 118
50341 Treatment of Feline Infectious Peritonitis in Cats with Molnupiravir: Outcomes for 54 Cases

Authors: TM Clark, SJ Coggins, R Korman, J King, R Malik

Abstract:

Objective: To evaluate the clinical applications and treatment outcomes of molnupiravir (MPV) for the treatment of naturally occurring feline infectious peritonitis (FIP). Methods: 92 client-owned cats with confirmed or presumptive FIP were retrospectively recruited from 35 veterinary practices between February 2023 and March 2024, primarily in Australia. Cats were categorised based on treatment received: Cohort A: molnupiravir treatment (monotherapy, maintenance, and rescue therapy); Cohort B: nucleoside analogue treatment (remdesivir and/or GS-441524). Seventy-eight cats were enrolled. Molnupiravir was administered orally for a median of 84 days, at a median dose of 13.3 mg/kg BID. Remission was defined as (i) the resolution of FIP-related clinical signs and normalisation of serum globulin concentrations and A:G ratio (to ≥0.6) or (ii) sustained clinical remission for at least 100 days post-treatment. Cure rate was defined as the percentage of cats achieving sustained remission without requiring rescue therapy or experiencing a relapse event. Results: Molnupiravir monotherapy resulted in a cure rate of 72% (13/18), while maintenance therapy resulted in a cure rate of 86% (25/29). Molnupiravir, utilised as rescue therapy, resulted in a cure rate of 100% (7/7). Treatment with remdesivir and/or GS-441524 resulted in a cure rate of 71% (17/24). Survival analysis revealed no significant difference in outcomes between cats treated with MPV monotherapy and those treated with nucleoside analogues. Adverse effects were uncommon but included neutropenia and transient elevations in hepatic enzymes. Conclusion and Relevance: In our study, molnupiravir demonstrated outcomes comparable to treatment with remdesivir and/or GS-441524 for treating FIP and serves as an accessible, effective option across various presentations, including ocular and neurological forms.

Keywords: FIP, molnupiravir, antiviral, nucleoside analogue

Procedia PDF Downloads 8
50340 Program of Health/Safety Integration and the Total Worker Health Concept in the Improvement of Absenteeism of the Work Accommodation Management

Authors: L. R. Ferreira, R. Biscaro, C. C. Danziger, C. M. Galhardi, L. C. Biscaro, R. C. Biscaro, I. S. Vasconcelos, L. C. R. Ferreira, R. Reis, L. H. Oliveira

Abstract:

Introduction: There is a worldwide trend for employers to invest in health promotion that goes beyond occupational hygiene approaches, through the implementation of comprehensive programs that integrate occupational health and safety with social/psychosocial responsibility in the workplace. Work accommodation is a necessity in most companies, as it allows workers to return to their roles while respecting their physical limitations. The objective of this study was to verify whether the integration of health and safety in companies, with the inclusion of the Total Worker Health (TWH) concept promoted by an occupational health service, impacted the management of absenteeism among workers in work accommodation. Method: A retrospective, paired cohort design was used, in which the impact of the implementation of the Program for Health/Safety Integration and the Total Worker Health Concept (PHSITWHC) was evaluated using indices of absenteeism, health attestations, and days and hours of sick leave of workers who underwent job accommodation/rehabilitation. Data collected from January to September 2017, prior to the initiation of the integration program, were compared with data obtained from January to September 2018, after the implementation of the program. For the statistical analysis, Student's t-test was used, with p < 0.05 considered statistically significant. Results: The results showed a 35% reduction in the absenteeism rate in 2018 compared to the same period in 2017. There was also a significant reduction in the total number of days of attestations/absences (mean of 2.8) as well as days of attestations, absences and sick leaves (mean of 5.2) in the 2018 data after the implementation of PHSITWHC, compared to means of 4.3 and 25.1, respectively, in the 2017 data prior to the program.
Conclusion: It can be concluded that the inclusion of the PHSITWHC was associated with a reduction in the absenteeism rate of workers who underwent job accommodation. It was observed that, once health and safety were integrated with the inclusion of the TWH concept, it was possible to reduce absenteeism and improve workers' quality of life, wellness, and work accommodation management.

Keywords: absenteeism, health/safety integration, work accommodation management, total worker health

Procedia PDF Downloads 159
50339 Risk of Fractures at Different Anatomic Sites in Patients with Irritable Bowel Syndrome: A Nationwide Population-Based Cohort Study

Authors: Herng-Sheng Lee, Chi-Yi Chen, Wan-Ting Huang, Li-Jen Chang, Solomon Chih-Cheng Chen, Hsin-Yi Yang

Abstract:

A variety of gastrointestinal disorders, such as Crohn's disease, ulcerative colitis, and coeliac disease, are recognized as risk factors for osteoporosis and osteoporotic fractures. One recent study suggests that individuals with irritable bowel syndrome (IBS) might also be at increased risk of osteoporosis and osteoporotic fractures. Up to now, the association between IBS and the risk of fractures at different anatomic sites is not completely clear. We conducted a population-based cohort analysis to investigate the fracture risk of IBS in comparison with a non-IBS group. We identified 29,505 adults aged ≥ 20 years with newly diagnosed IBS using the Taiwan National Health Insurance Research Database in 2000-2012. A comparison group was constructed of patients without IBS who were matched according to gender and age. The occurrence of fracture was monitored until the end of 2013. We analyzed the risk of fracture events in the IBS group using Cox proportional hazards regression models. Patients with IBS had a higher incidence of osteoporotic fractures than the non-IBS group (12.34 versus 9.45 per 1,000 person-years) and an increased risk of osteoporotic fractures (adjusted hazard ratio [aHR] = 1.27, 95% confidence interval [CI] = 1.20-1.35). Site-specific analysis showed that the IBS group had a higher risk of fractures of the spine, forearm, hip and hand than the non-IBS group. With further stratification by gender and age, a higher aHR for osteoporotic fractures in the IBS group was seen across all age groups in males, but only in elderly females. In addition, female gender, older age, low income, hypertension, coronary artery disease, cerebrovascular disease, and depressive disorders were identified as independent osteoporotic fracture risk factors in IBS patients. IBS should be considered a risk factor for osteoporotic fractures, particularly in female individuals and for fracture sites located at the spine, forearm, hip and hand.
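The crude incidence rates quoted above (12.34 versus 9.45 per 1,000 person-years) follow directly from event counts and accumulated follow-up time. A small sketch with hypothetical counts chosen to reproduce those rates (the study's aHR of 1.27 comes from a covariate-adjusted Cox model, not from this crude ratio):

```python
def incidence_rate_per_1000(events: int, person_years: float) -> float:
    """Crude incidence rate expressed per 1,000 person-years of follow-up."""
    return 1000 * events / person_years


# Hypothetical counts picked to match the reported rates.
ibs_rate = incidence_rate_per_1000(1234, 100_000)      # IBS group
control_rate = incidence_rate_per_1000(945, 100_000)   # non-IBS group
print(ibs_rate, control_rate)             # 12.34 9.45
print(round(ibs_rate / control_rate, 2))  # crude rate ratio: 1.31
```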

Keywords: irritable bowel syndrome, fracture, gender difference, longitudinal health insurance database, public health

Procedia PDF Downloads 230
50338 Development of Chronic Obstructive Pulmonary Disease (COPD) Proforma (E-ICP) to Improve Guideline Adherence in Emergency Department: Modified Delphi Study

Authors: Hancy Issac, Gerben Keijzers, Ian Yang, Clint Moloney, Jackie Lea, Melissa Taylor

Abstract:

Introduction: Chronic obstructive pulmonary disease (COPD) guideline non-adherence is associated with a reduction in patients' health-related quality of life (HRQoL). Improving guideline adherence has the potential to mitigate fragmented care, thereby sustaining pulmonary function, preventing acute exacerbations, reducing economic health burdens, and enhancing HRQoL. The development of an electronic proforma stemming from expert consensus, including digital guideline resources and direct interdisciplinary referrals, is hypothesised to improve guideline adherence and patient outcomes for emergency department (ED) patients with COPD. Aim: The aim of this study was to develop consensus among ED and respiratory staff on the correct composition of a COPD electronic proforma that aids guideline adherence and management in the ED. Methods: This study adopted a mixed-method design to identify the most important indicators of care in the ED. The study involved three phases: (1) a systematic literature review and qualitative interdisciplinary staff interviews to assess barriers and solutions for guideline adherence, (2) a modified Delphi panel to select interventions for the proforma, and (3) a consensus process through three rounds of scoring via a quantitative survey (ED and respiratory consensus) and qualitative thematic analysis of each indicator. Results: The electronic proforma achieved acceptable to good internal consistency through all iterations from national emergency department and respiratory department interdisciplinary experts. Cronbach's alpha scores for internal consistency (α) in iteration 1 were: emergency department cohort (EDC) α = 0.80 [CI = 0.89%], respiratory department cohort (RDC) α = 0.95 [CI = 0.98%]. Iteration 2 reported EDC α = 0.85 [CI = 0.97%] and RDC α = 0.86 [CI = 0.97%]. Iteration 3 revealed EDC α = 0.73 [CI = 0.91%] and RDC α = 0.86 [CI = 0.95%], respectively.
Conclusion: Electronic proformas have the potential to facilitate direct referrals from the ED, leading to reduced hospital admissions, reduced length of hospital stay, holistic care, improved healthcare and quality of life, and improved interdisciplinary guideline adherence.
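The internal-consistency statistic reported in each iteration above, Cronbach's alpha, is computed from the item variances and the variance of the summed score. A self-contained sketch on hypothetical survey data (not the study's actual responses):

```python
def cronbach_alpha(items: list[list[float]]) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum(item variances) / variance(totals)).

    `items` holds one list per survey item, each giving that item's
    scores across all respondents.
    """
    k = len(items)
    n = len(items[0])

    def sample_var(xs: list[float]) -> float:
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(sample_var(item) for item in items) / sample_var(totals))


# Three hypothetical proforma indicators scored by four raters.
scores = [[1, 2, 3, 4], [2, 3, 4, 5], [1, 3, 3, 5]]
print(round(cronbach_alpha(scores), 2))  # 0.98: high internal consistency
```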

Keywords: COPD, electronic proforma, modified Delphi study, interdisciplinary, guideline adherence, COPD-X plan

Procedia PDF Downloads 63
50337 Prevalence and Risk Factors Associated with Nutrition Related Non-Communicable Diseases in a Cohort of Males in the Central Province of Sri Lanka

Authors: N. W. I. A. Jayawardana, W. A. T. A. Jayalath, W. M. T. Madhujith, U. Ralapanawa, R. S. Jayasekera, S. A. S. B. Alagiyawanna, A. M. K. R. Bandara, N. S. Kalupahana

Abstract:

There is mounting evidence that dietary and lifestyle changes affect the incidence of non-communicable diseases (NCDs). This study was conducted to investigate the association of diet, physical activity, smoking, alcohol consumption and duration of sleep with overweight, obesity, hypertension and diabetes in a cohort of males from the Central Province of Sri Lanka. A total of 2694 individuals aged 17-68 years (mean = 31) were included in the study. Body Mass Index cutoff values for Asians were used to categorize participants as normal, overweight or obese. Dietary data were collected using a food frequency questionnaire (FFQ), and data on the level of physical activity, smoking, alcohol consumption and sleeping hours were obtained using a self-administered validated questionnaire. Systolic and diastolic blood pressure and random blood glucose levels were measured to determine the incidence of hypertension and diabetes. Among the individuals, the prevalence of overweight and obesity was 34% and 16.4%, respectively. Approximately 37% of the participants suffered from hypertension. Overweight and obesity were associated with older age (P < 0.0001), frequency of smoking (P = 0.0434), alcohol consumption level (P = 0.0287) and the quantity of lipid intake (P = 0.0081). Consumption of fish (P = 0.6983) and salty snacks (P = 0.8327), sleeping hours (P = 0.6847) and the level of physical activity (P = 0.3301) were not significantly associated with the incidence of overweight and obesity. Based on the fitted model, only age was significantly associated with hypertension (P < 0.001). Further, age (P < 0.0001) was significantly associated with diabetes, while sleeping hours (P = 0.0953) and consumption of fatty foods (P = 0.0930) showed marginal associations. Age was associated with higher odds of pre-diabetes (OR: 1.089; 95% CI: 1.053, 1.127) and diabetes (OR: 1.077; 95% CI: 1.055, 1.1), whereas 7-8 hours of sleep per day was associated with lower odds of diabetes (OR: 0.403; 95% CI: 0.184, 0.884).
The high prevalence of overweight, obesity and hypertension in working-age males is a threatening sign for this area. As this population ages and urbanization continues, the prevalence of the above risk factors will likely escalate.

Keywords: age, males, non-communicable diseases, obesity

Procedia PDF Downloads 337
50336 Gestational Vitamin D Levels Mitigate the Effect of Pre-pregnancy Obesity on Gestational Diabetes Mellitus: A Birth Cohort Study

Authors: Majeda S. Hammoud

Abstract:

Background and Aim: Gestational diabetes mellitus (GDM) is a common pregnancy complication, affecting around 14% of pregnancies globally, that carries short- and long-term consequences for the mother and her child. Pre-pregnancy overweight or obesity is the modifiable risk factor most consistently and strongly associated with GDM development. This analysis aimed to determine whether vitamin D status during pregnancy modulates the effect of pre-pregnancy obesity/overweight on GDM risk, stratifying by maternal age. Methods: Data from the Kuwait Birth Cohort (KBC) study, which enrolled pregnant women in the second or third trimester of gestation, were analyzed. Pre-pregnancy body mass index (BMI; kg/m2) was categorized as under/normal weight (<25.0), overweight (25.0 to <30.0), and obesity (≥30.0). 25-hydroxyvitamin D levels were measured in blood samples collected at recruitment and categorized as deficiency (<50 nmol/L) and insufficiency/sufficiency (≥50 nmol/L). GDM status was ascertained according to international guidelines. Logistic regression was used to evaluate associations, and adjusted odds ratios (aOR) and 95% confidence intervals (CI) were estimated. Results: The analyzed study sample included a total of 982 pregnant women, with a mean (SD) age of 31.4 (5.2) years. The prevalence of GDM was estimated to be 17.3% (95% CI: 14.9-19.7), and the prevalence of pre-pregnancy overweight and obesity was 37.8% (95% CI: 34.8-40.8) and 28.8% (95% CI: 26.0-31.7), respectively. The prevalence of gestational vitamin D deficiency was estimated to be 55.3% (95% CI: 52.2-58.4). The association between pre-pregnancy overweight or obesity and GDM risk differed according to maternal age and gestational vitamin D status (P-interaction [BMI × age × vitamin D] = 0.047).
Among pregnant women aged <35 years, pre-pregnancy obesity compared to under/normal weight was associated with increased GDM risk among women with gestational vitamin D deficiency (aOR: 3.65, 95% CI: 1.50-8.86, p = 0.004) and vitamin D insufficiency/sufficiency (aOR: 2.55, 95% CI: 1.16-5.61, p = 0.019). In contrast, among pregnant women aged ≥35 years, pre-pregnancy obesity compared to under/normal weight was associated with increased GDM risk among women with gestational vitamin D deficiency (aOR: 9.70, 95% CI: 2.01-46.69, p = 0.005), but not among women with vitamin D insufficiency/sufficiency (aOR: 1.46, 95% CI: 0.42-5.16, p = 0.553). Conclusion: The effect of pre-pregnancy obesity on GDM risk is modulated by maternal age and gestational vitamin D status, with the effect being more pronounced among older pregnant women (aged ≥35 years) with gestational vitamin D deficiency than among those with vitamin D insufficiency/sufficiency. Among younger women (aged <35 years), the effect of pre-pregnancy obesity on GDM risk was not modulated by gestational vitamin D status. Therefore, vitamin D supplementation among pregnant women, specifically older women with pre-pregnancy obesity, may mitigate the effect of pre-pregnancy obesity on GDM risk.
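The adjusted odds ratios above come from logistic regression; the unadjusted version of the same quantity can be read directly off a 2x2 table, with a Wald confidence interval. A sketch with entirely hypothetical counts (the study's aORs additionally adjust for covariates, so they would differ from this crude estimate):

```python
import math


def odds_ratio_with_ci(a: int, b: int, c: int, d: int, z: float = 1.96):
    """Unadjusted odds ratio and Wald 95% CI for a 2x2 table:

                 obese   under/normal
        GDM        a          b
        no GDM     c          d
    """
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log_or)
    hi = math.exp(math.log(or_) + z * se_log_or)
    return or_, lo, hi


# Hypothetical counts: 30/100 obese women vs. 20/190 under/normal-weight women with GDM.
or_, lo, hi = odds_ratio_with_ci(30, 20, 70, 170)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```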

Keywords: gestational diabetes mellitus, vitamin D, obesity, body mass index

Procedia PDF Downloads 42
50335 Stigmatization of Individuals Who Receive Mental Health Treatment and the Role of Social Media: A Cross-Generational Cohort Design and Extension

Authors: Denise Ben-Porath, Tracy Masterson

Abstract:

In the past, individuals who struggled with and sought treatment for mental health difficulties were stigmatized. However, the current generation holds more open attitudes around mental health issues. Indeed, public figures such as Demi Lovato, Naomi Osaka, and Simone Biles have taken to social media to break the silence around mental health, discussing their own struggles and the benefits of treatment. Thus, there is considerable reason to believe that this generation would hold fewer stigmatizing attitudes toward mental health difficulties and treatment compared to previous ones. In this study, we explored possible changes in stigma around mental health diagnosis and treatment-seeking behavior between two generations: Gen Z, the current generation, and Gen X, those born between 1965 and 1980. It was hypothesized that Gen Z would hold less stigmatizing views of mental illness than Gen X. To examine possible changes in stigma attitudes between these two generations, we conducted a cross-generational cohort design using the same methodology employed 20 years ago in the Ben-Porath (2002) study. Participants were randomly assigned to read one of the four case vignettes employed in the Ben-Porath (2002) study: (a) “Tom,” who has received psychotherapy due to depression; (b) “Tom,” who has been depressed but received no psychological help; (c) “Tom,” who has received medical treatment due to back pain; or (d) “Tom,” who had back pain but did not receive medical attention. After reading the vignette, participants rated “Tom” on various personality dimensions using the IFQ Questionnaire and answered questions about their frequency of social media use and willingness to seek mental health treatment on a scale from 1-10. Identical to the results 20 years prior, a significant main effect was found for diagnosis, with “Tom” viewed in more negative terms when described as having depression vs. a medical condition (back pain) [F(1, 376) = 126.53, p < .001]. However, in the study conducted 20 years earlier, a significant interaction was also found between diagnosis and help-seeking behavior [F(1, 376) = 8.28, p < .005]: “Tom” was viewed in the most negative terms when described as depressed and seeking treatment. The current study, in contrast, failed to find a significant interaction between depression and help-seeking behavior. These findings suggest that while individuals who hold a mental health diagnosis may still be stigmatized as they were 20 years prior, seeking treatment for mental health issues may be less stigmatized. Findings are discussed in the context of social media use and its impact on destigmatization.

Keywords: stigma, mental illness, help-seeking, social media

Procedia PDF Downloads 82
50334 A Systematic Review of Patient-Reported Outcomes and Return to Work after Surgical vs. Non-surgical Midshaft Humerus Fracture

Authors: Jamal Alasiri, Naif Hakeem, Saoud Almaslmani

Abstract:

Background: Patients with humeral shaft fractures have two different treatment options. Surgical therapy carries lower risks of non-union, mal-union, and re-intervention than non-surgical therapy. These positive clinical outcomes make the surgical approach a preferable treatment option despite the risks of radial nerve palsy and additional surgery-related risks. We aimed to evaluate patient-reported outcomes and return to work after surgical vs. non-surgical management of shaft humeral fracture. Methods: We searched databases, including PubMed, Medline, and the Cochrane Register of Controlled Trials, from 2010 to January 2022 for randomised controlled trials (RCTs) and cohort studies comparing patient-reported outcome measures and return to work between surgical and non-surgical management of humerus fracture. Results: After carefully evaluating 1352 articles, we included three RCTs (232 patients) and one cohort study (39 patients). The surgical intervention used plate/nail fixation, while the non-surgical intervention used a splint or brace to manage shaft humeral fracture. The pooled DASH effects of all three RCTs at six months (MD: -7.5 [-13.20, -1.89], p = 0.009, I² = 44%) and 12 months (MD: -1.32 [-3.82, 1.17], p = 0.29, I² = 0%) were higher in patients treated surgically than non-surgically. The pooled Constant-Murley scores at six months (MD: 7.945 [2.77, 13.10], p = 0.003, I² = 0%) and 12 months (MD: 1.78 [-1.52, 5.09], p = 0.29, I² = 0%) were higher in patients who received non-surgical than surgical therapy. However, the pooled analysis of patients returning to work in both groups remained inconclusive. Conclusion: Altogether, we found no significant evidence supporting the clinical benefits of surgical over non-surgical therapy. Thus, the non-surgical approach remains the preferred therapeutic choice for managing shaft humeral fractures due to its fewer side effects.
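The pooled mean differences above are the product of a meta-analysis; under a fixed-effect model, per-study mean differences are combined by inverse-variance weighting. A sketch with hypothetical study inputs (the review's actual per-study effect sizes are not reported in this abstract):

```python
import math


def pool_mean_differences(studies: list[tuple[float, float]]):
    """Fixed-effect inverse-variance pooling.

    `studies` is a list of (mean_difference, standard_error) pairs;
    returns the pooled MD and its 95% confidence interval.
    """
    weights = [1 / se ** 2 for _, se in studies]
    pooled = sum(w * md for (md, _), w in zip(studies, weights)) / sum(weights)
    pooled_se = math.sqrt(1 / sum(weights))
    return pooled, pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se


# Two hypothetical RCTs reporting DASH mean differences (surgical minus non-surgical).
pooled, lo, hi = pool_mean_differences([(-8.0, 3.0), (-6.0, 4.0)])
print(round(pooled, 2), round(lo, 2), round(hi, 2))  # -7.28 with its 95% CI
```

The weighting means a small, precise trial can dominate a larger but noisier one, which is why pooled estimates can sit closer to one study than the raw average of effects would suggest.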

Keywords: shaft humeral fracture, surgical treatment, patient-reported outcomes, return to work, DASH

Procedia PDF Downloads 99
50333 Management of Femoral Neck Stress Fractures at a Specialist Centre and Predictive Factors to Return to Activity Time: An Audit

Authors: Charlotte K. Lee, Henrique R. N. Aguiar, Ralph Smith, James Baldock, Sam Botchey

Abstract:

Background: Femoral neck stress fractures (FNSF) are uncommon, making up 1 to 7.2% of stress fractures in healthy subjects. FNSFs are prevalent in young women, military recruits, endurance athletes, and individuals with energy deficiency syndrome or the female athlete triad. Presentation is often non-specific, and the condition is frequently misdiagnosed at initial examination. There is limited research addressing return-to-activity time after FNSF, although previous studies have demonstrated prognostic time predictions based on various imaging techniques. Here, (1) OxSport clinic FNSF practice standards were retrospectively reviewed, (2) FNSF cohort demographics were examined, and (3) regression models were used to predict return-to-activity prognosis and, consequently, to determine bone stress risk factors. Methods: Patients with a diagnosis of FNSF attending the OxSport clinic between 01/06/2020 and 01/01/2020 were selected from the Rheumatology Assessment Database Innovation in Oxford (RhADiOn) and the OxSport Stress Fracture Database (n = 14). (1) Clinical practice was audited against five criteria based on local and National Institute for Health and Care Excellence guidance, with a 100% standard. (2) Demographics of the FNSF cohort were examined with Student's t-test. (3) Lastly, linear regression and random forest regression models were used on this patient cohort to predict return-to-activity time, and an analysis of feature importance was conducted after fitting each model. Results: OxSport clinical practice met the standard (100%) in 3/5 criteria. The criteria not met were patient waiting times and documentation of all bone stress risk factors. Importantly, analysis of patient demographics showed that, of the population with complete bone stress risk factor assessments, 53% were positive for modifiable bone stress risk factors. Lastly, linear regression analysis was used to identify demographic factors that predicted return-to-activity time [R² = 79.172%; average error 0.226].
This analysis identified four key variables that predicted return-to-activity time: vitamin D level, total hip DEXA T value, femoral neck DEXA T value, and history of an eating disorder/disordered eating. Furthermore, random forest regression models were employed for this task [R² = 97.805%; average error 0.024]. Analysis of feature importance again identified a set of four variables, three of which matched the linear regression analysis (vitamin D level, total hip DEXA T value, and femoral neck DEXA T value), and the fourth: age. Conclusion: OxSport clinical practice could be improved by more comprehensively evaluating bone stress risk factors; the importance of this evaluation is demonstrated by the proportion of the population found positive for these risk factors. Using this cohort, regression models identified potential bone stress risk factors that significantly impacted return-to-activity prognosis.
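The R² values quoted for both models measure the share of variance in return-to-activity time explained by the predictors; computing it needs only the observed and fitted values. A minimal sketch on made-up data (not the clinic's cohort):

```python
def r_squared(y_true: list[float], y_pred: list[float]) -> float:
    """Coefficient of determination: 1 - SS_residual / SS_total."""
    mean_y = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean_y) ** 2 for t in y_true)
    return 1 - ss_res / ss_tot


# Hypothetical return-to-activity times (months) vs. a model's predictions.
observed = [1.0, 2.0, 3.0, 4.0]
predicted = [1.1, 1.9, 3.2, 3.8]
print(round(r_squared(observed, predicted), 2))  # 0.98
```

With only 14 patients, a near-perfect R² on training data (as reported for the random forest) should be read cautiously, since flexible models can fit noise in samples this small.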

Keywords: eating disorder, bone stress risk factor, femoral neck stress fracture, vitamin D

Procedia PDF Downloads 183
50332 Factors Associated with Death during Tuberculosis Treatment of Patients Co-Infected with HIV at a Tertiary Care Setting in Cameroon: An 8-Year Hospital-Based Retrospective Cohort Study (2006-2013)

Authors: A. A. Agbor, Jean Joel R. Bigna, Serges Clotaire Billong, Mathurin Cyrille Tejiokem, Gabriel L. Ekali, Claudia S. Plottel, Jean Jacques N. Noubiap, Hortence Abessolo, Roselyne Toby, Sinata Koulla-Shiro

Abstract:

Background: Contributors to fatal outcomes in patients undergoing tuberculosis (TB) treatment in the setting of HIV co-infection are poorly characterized, especially in sub-Saharan Africa. Our study’s aim was to assess factors associated with death in TB/HIV co-infected patients during the first 6 months their TB treatment. Methods: We conducted a tertiary-care hospital-based retrospective cohort study from January 2006 to December 2013 at the Yaoundé Central Hospital, Cameroon. We reviewed medical records to identify hospitalized co-infected TB/HIV patients aged 15 years and older. Death was defined as any death occurring during TB treatment, as per the World Health Organization’s recommendations. Logistic regression analysis identified factors associated with death. Magnitudes of associations were expressed by adjusted odds ratio (aOR) with 95% confidence interval. A p value < 0.05 was considered statistically significant. Results: The 337 patients enrolled had a mean age of 39.3 (+/- 10.3) years and more (54.3%) were women. TB treatment outcomes included: treatment success in 60.8% (n=205), death in 29.4% (n=99), not evaluated in 5.3% (n=18), loss to follow-up in 5.3% (n=14), and failure in 0.3% (n=1) . After exclusion of patients lost to follow-up and not evaluated, death in TB/HIV co-infected patients during TB treatment was associated with: a TB diagnosis made before national implementation of guidelines regarding initiation of antiretroviral therapy (aOR = 2.50 [1.31-4.78]; p = 0.006), the presence of other AIDS-defining infections (aOR = 2.73 [1.27-5.86]; p = 0.010), non-AIDS comorbidities (aOR = 3.35 [1.37-8.21]; p = 0.008), not receiving co-trimoxazole prophylaxis (aOR = 3.61 [1.71-7.63]; p = 0.001), not receiving antiretroviral therapy (aOR = 2.45 [1.18-5.08]; p = 0.016), and CD4 cell counts < 50 cells/mm3 (aOR = 16.43 [1.05-258.04]; p = 0.047). 
Conclusions: The success rate of anti-tuberculosis treatment among hospitalized TB/HIV co-infected patients in our setting is low. Mortality in the first 6 months of treatment was high and strongly associated with specific clinical factors, including states of greater immunosuppression, highlighting the urgent need for targeted interventions, including provision of antiretroviral therapy and co-trimoxazole prophylaxis, to enhance patient outcomes.

Keywords: TB/HIV co-infection, death, treatment outcomes, factors

Procedia PDF Downloads 446
50331 Jelly and Beans: Appropriate Use of Ultrasound in Acute Kidney Injury

Authors: Raja Ezman Raja Shariff

Abstract:

Acute kidney injury (AKI) is commonly seen in inpatients and places a great cost on the NHS and patients. Timely and appropriate management is both nephron-sparing and potentially life-saving. Ultrasound scanning (USS) is a well-recognised method for stratifying patients. Accordingly, the NICE AKI guidance defines groups in whom scanning is recommended within 6 hours of request (pyonephrosis), within 24 hours (obstruction/cause unknown), and in whom routine scanning is not recommended (cause for AKI identified). This audit examined whether Stockport NHS Trust USS practice was in line with these recommendations. It evaluated 92 patients with AKI who had USS between 01/01/14 and 30/04/14. Data collection was divided into two parts: first, radiology request cards and the online imaging software (PACS) were evaluated; then the electronic case notes (ADVANTIS) were reviewed. Based on request cards, 10% of requests were for pyonephrosis; only 33% of these were scanned within 6 hours and a further 33% within 24 hours. 75% of requests were for possible obstruction and unknown cause collectively. Of those for possible obstruction, 71% of patients were scanned within 24 hours; of those with unknown cause, 50% were scanned within 24 hours. 15% of requests had a cause declared and so potentially did not require scanning. Evaluation of the patients’ notes revealed further interesting findings. Firstly, potentially 39% of patients had a known cause for AKI and therefore did not need USS; the cohort of unknown cause and possible obstruction was consequently reduced to 45% collectively. Alarmingly, the cohort with possible pyonephrosis rose to 16%, suggesting under-recognition of this life-threatening condition. We plan to highlight these findings within our institution and make changes to encourage more appropriate requesting and timely scanning. Time will tell whether we manage to save or increase costs in this cost-conscious NHS. Patient benefits, though, seem to be guaranteed.

Keywords: AKI, ARF, kidney, renal

Procedia PDF Downloads 401
50330 Occupational Heat Stress Related Adverse Pregnancy Outcome: A Pilot Study in South India Workplaces

Authors: Rekha S., S. J. Nalini, S. Bhuvana, S. Kanmani, Vidhya Venugopal

Abstract:

Introduction: Pregnant women's occupational heat exposure has been linked to foetal abnormalities and pregnancy complications. Workplace heat is expected to lead to Adverse Pregnancy Outcomes (APO), especially in tropical countries where temperatures are rising and workplace cooling interventions are minimal. Effective interventions require in-depth understanding of, and evidence on, occupational heat stress and APO. Methodology: Approximately 800 pregnant women in and around Chennai who were employed in jobs requiring moderate to hard labour participated in this cohort study. During the study period (2014-2019), environmental heat exposures were measured using a QUESTemp WBGT monitor, and heat strain markers, namely Core Body Temperature (CBT) and Urine Specific Gravity (USG), were evaluated using an infrared thermometer and a refractometer, respectively. Self-reported health symptoms were collected using the validated HOTHAPS questionnaire. In addition, a postpartum follow-up with the mothers was conducted to collect APO-related data. Major findings: Approximately 47.3% of pregnant workers had workplace WBGTs above the safe manual-work threshold value for moderate/heavy work (average WBGT of 26.6°C±1.0°C). About 12.5% of the workers had CBT levels above the normal range, and 24.8% had USG levels above 1.020, suggesting mild dehydration. Miscarriages (3%), stillbirths/preterm births (3.5%), and low birth weights (8.8%) were the most common adverse outcomes among the pregnant workers. In addition, WBGT exposures above TLVs during all trimesters were associated with a 2.3-fold increased risk of adverse fetal/maternal outcomes (95% CI: 1.4-3.8), after adjusting for potential confounders including age, education, socioeconomic status, histories of abortion, stillbirth, preterm birth and low birth weight, and BMI.
The study found that workplace WBGTs had direct short- and long-term effects on the health of both mother and foetus. Despite the study's limited scope, the findings provide valuable insights and highlight the need for future comprehensive cohort studies and extensive data in order to establish effective policies that protect vulnerable pregnant women from the dangers of heat stress and promote reproductive health.

Keywords: adverse outcome, heat stress, interventions, physiological strain, pregnant women

Procedia PDF Downloads 73
50329 Pregnancy Rate and Outcomes after Uterine Fibroid Embolization: Single-Centre Experience in the Middle East from the United Arab Emirates at Al Ain Hospital

Authors: Jamal Alkoteesh, Mohammed Zeki, Mouza Alnaqbi

Abstract:

Objective: To evaluate pregnancy outcomes, complications and neonatal outcomes in women who had previously undergone uterine artery embolization. Design: Retrospective study. In this study, most women opted for UFE as a fertility treatment after failure of myomectomy or in vitro fertilization, or because hysterectomy was the only suggested option. Background: Myomectomy is the standard approach in patients with fibroids desiring a future pregnancy. However, myomectomy may be difficult in cases of numerous interstitial and/or submucous fibroids. In these cases, UFE has the advantage of embolizing all fibroids in one procedure, and it is an accepted nonsurgical treatment for symptomatic uterine fibroids. Study methods: A retrospective study of 210 patients treated with UFE for symptomatic uterine fibroids between 2011 and 2016 was performed. UFE was performed using particles (PVA; Embozene, BeadBlock) of 500-900 µm in diameter. Pregnancies were identified using screening questionnaires and the study database. Of the 210 patients who received UFE treatment, 35 women younger than 40 years of age wanted to conceive and had been unable. All women in our study were advised to wait six months or more after UFE before attempting to become pregnant; the reported time before attempting to conceive ranged from seven to 33 months (average 20 months). Results: In a retrospective chart review of the 35 patients younger than 40 years of age, 18 patients reported 23 pregnancies, of which five were miscarriages. Two more pregnancies were complicated by premature labor. Of the 23 pregnancies, 16 were normal full-term pregnancies; 15 women had conceived once, and four had become pregnant twice. The remaining patients did not conceive. In the study, there was no reported intrauterine growth retardation in the prenatal period, fetal distress during labor, or problems related to uterine integrity.
Two patients reported minor problems during pregnancy: borderline oligohydramnios and a low-lying placenta. In the cohort of women who did conceive, overall, 16 out of 18 births proceeded normally without any complications (86%). Eight women delivered by cesarean section, and 10 women had normal vaginal delivery. In this study of 210 women, UFE had a fertility rate of 47%. Our group of 23 pregnancies was small but did confirm successful pregnancy after UFE. The 45.7% pregnancy rate in women below the age of 40 who completed a term pregnancy compares favorably with that of women who underwent myomectomy via other methods. Conclusion: Pregnancy after UFE is well-documented. The risks of infertility following embolization, premature menopause, and hysterectomy are small, as is the radiation exposure during embolization. Fertility rates appear similar to those of patients undergoing myomectomy. UFE should not be contraindicated in patients who wish to conceive, and they should be able to choose between surgical options and UFE.

Keywords: fibroid, pregnancy, therapeutic embolization, uterine artery

Procedia PDF Downloads 228
50328 Multilevel Factors Affecting Optimal Adherence to Antiretroviral Therapy and Viral Suppression amongst HIV-Infected Prisoners in South Ethiopia: A Prospective Cohort Study

Authors: Terefe Fuge, George Tsourtos, Emma Miller

Abstract:

Objectives: Maintaining optimal adherence and viral suppression in people living with HIV (PLWHA) is essential to ensure both the preventative and therapeutic benefits of antiretroviral therapy (ART). Prisoners bear a particularly high burden of HIV infection and are highly likely to transmit it to others during and after incarceration. However, the levels of adherence and viral suppression, and their associated factors, in incarcerated populations in low-income countries are unknown. This study aimed to determine the prevalence of non-adherence and viral failure, and the factors contributing to them, amongst prisoners in South Ethiopia. Methods: A prospective cohort study was conducted between June 1, 2019 and July 31, 2020 to compare the levels of adherence and viral suppression between incarcerated and non-incarcerated PLWHA. The study involved 74 inmates living with HIV (ILWHA) and 296 non-incarcerated PLWHA. Background information, including sociodemographic, socioeconomic, psychosocial, behavioural, and incarceration-related characteristics, was collected using a structured questionnaire. Adherence was determined from participants’ self-reports and pharmacy refill records, and plasma viral load measurements undertaken within the study period were prospectively extracted to determine viral suppression. Various univariate and multivariate regression models were used to analyse the data. Results: Self-reported dose adherence was approximately similar between ILWHA and non-incarcerated PLWHA (81% and 83%, respectively), but ILWHA had a significantly higher medication possession ratio (MPR) (89% vs 75%). The prevalence of viral failure (VF) was slightly higher in ILWHA (6%) than in non-incarcerated PLWHA (4.4%). Overall dose non-adherence (NA) was significantly associated with missing ART appointments, level of satisfaction with ART services, patients' ability to comply with a specified medication schedule and the type of method used to monitor the schedule.
In ILWHA specifically, accessing ART services from a hospital rather than a health centre, an inability to always attend clinic appointments, experience of depression and a lack of social support predicted NA. VF was significantly higher in males, in people aged 31-35 years and in those who experienced social stigma, regardless of incarceration status. Conclusions: This study revealed that HIV-infected prisoners in South Ethiopia were more likely to be non-adherent to doses, and thus to develop viral failure, than their non-incarcerated counterparts. A multitude of factors was found to be responsible, requiring multilevel intervention strategies focusing on the specific needs of prisoners.

Keywords: adherence, antiretroviral therapy, incarceration, South Ethiopia, viral suppression

Procedia PDF Downloads 135
50327 Neuroimaging Markers for Screening Former NFL Players at Risk for Developing Alzheimer's Disease/Dementia Later in Life

Authors: Vijaykumar M. Baragi, Ramtilak Gattu, Gabriela Trifan, John L. Woodard, K. Meyers, Tim S. Halstead, Eric Hipple, Ewart Mark Haacke, Randall R. Benson

Abstract:

NFL players, by virtue of their exposure to repetitive head injury, are at least twice as likely to develop Alzheimer's disease (AD) and dementia as the general population. Early recognition and intervention prior to the onset of clinical symptoms could potentially avert or delay the long-term consequences of these diseases. Since AD is thought to have a long preclinical incubation period, the aim of the current research was to determine whether former NFL players, referred to a depression center, showed evidence of incipient dementia in their structural imaging prior to a diagnosis of dementia. To identify neuroimaging markers of AD against which former NFL players could be compared, we conducted a comprehensive volumetric analysis using a cohort of early-stage AD patients (ADNI) to produce a set of brain regions demonstrating sensitivity to early AD pathology (i.e., the “AD fingerprint”). A cohort of 46 former NFL players’ brain MRIs was then interrogated using the AD fingerprint. Brain scans were acquired using a T1-weighted MPRAGE sequence. The FreeSurfer image analysis suite (version 6.0) was used to obtain the volumetric and cortical thickness data. A total of 55 brain regions demonstrated significant atrophy or ex vacuo dilatation bilaterally in AD patients vs. healthy controls. Of the 46 former NFL players, 19 (41%) demonstrated a greater than expected number of atrophied/dilated AD regions when compared with age-matched controls, presumably reflecting AD pathology.

Keywords: Alzheimer's disease, neuroimaging biomarkers, traumatic brain injury, FreeSurfer, ADNI

Procedia PDF Downloads 154
50326 Effect of Malnutrition at Admission on Length of Hospital Stay among Adult Surgical Patients in Wolaita Sodo University Comprehensive Specialized Hospital, South Ethiopia: Prospective Cohort Study, 2022

Authors: Yoseph Halala Handiso, Zewdi Gebregziabher

Abstract:

Background: Malnutrition in hospitalized patients remains a major public health problem in both developed and developing countries. Although malnourished patients are more prone to longer hospital stays, there are limited data on the magnitude of malnutrition and its effect on length of stay among surgical patients in Ethiopia, and nutritional assessment is often a neglected component of health service practice. Objective: This study aimed to assess the prevalence of malnutrition at admission and its effect on the length of hospital stay among adult surgical patients in Wolaita Sodo University Comprehensive Specialized Hospital, South Ethiopia, 2022. Methods: A facility-based prospective cohort study was conducted among 398 adult surgical patients admitted to the hospital. Participants were chosen using a convenience sampling technique. Subjective Global Assessment (SGA) was used to determine, within 48 hours of admission, the nutritional status of patients with a minimum stay of 24 hours. Data were collected using the open data kit (ODK) version 2022.3.3 software, while Stata version 14.1 software was employed for statistical analysis. The Cox regression model was used to determine the effect of malnutrition on the length of hospital stay (LOS) after adjusting for several potential confounders recorded at admission. The adjusted hazard ratio (AHR) with a 95% confidence interval was used to quantify the effect of malnutrition. Results: The prevalence of hospital malnutrition at admission was 64.32% (95% CI: 59%-69%) according to the SGA classification. Adult surgical patients who were malnourished at admission had a longer median LOS (12 days; 95% CI: 11-13) than well-nourished patients (8 days; 95% CI: 8-9); that is, malnourished patients were at higher risk of a reduced chance of discharge with improvement, i.e. prolonged LOS (AHR: 0.37, 95% CI: 0.29-0.47), compared with well-nourished patients.
The presence of comorbidity (AHR: 0.68, 95% CI: 0.50-0.90), polymedication (AHR: 0.69, 95% CI: 0.55-0.86), and history of admission within the previous five years (AHR: 0.70, 95% CI: 0.55-0.87) were found to be significant covariates of the length of hospital stay. Conclusion: The magnitude of hospital malnutrition at admission was high. Malnourished patients at admission had a higher risk of prolonged hospital stay than well-nourished patients. The presence of comorbidity, polymedication, and history of admission were significant covariates of LOS. All stakeholders should give attention to reducing the magnitude of malnutrition and its covariates in order to reduce the burden of prolonged LOS.

Keywords: effect of malnutrition, length of hospital stay, surgical patients, Ethiopia

Procedia PDF Downloads 66
50325 Implementation of Enhanced Recovery after Surgery (ERAS) Protocols in Laparoscopic Sleeve Gastrectomy (LSG): A Systematic Review and Meta-analysis

Authors: Misbah Nizamani, Saira Malik

Abstract:

Introduction: Bariatric surgery is the most effective treatment for patients suffering from morbid obesity, and laparoscopic sleeve gastrectomy (LSG) accounts for over 50% of all bariatric procedures. The aim of our meta-analysis is to investigate the effectiveness and safety of Enhanced Recovery After Surgery (ERAS) protocols for patients undergoing laparoscopic sleeve gastrectomy. Method: To gather data, we searched PubMed, Google Scholar, ScienceDirect, and Cochrane Central. Eligible studies were randomized controlled trials and cohort studies involving adult patients (≥18 years) undergoing bariatric surgery, i.e., laparoscopic sleeve gastrectomy. Outcome measures included length of stay (LOS), postoperative narcotic usage, postoperative pain score, postoperative nausea and vomiting, postoperative complications and mortality, emergency department visits and readmission rates. RevMan version 5.4 was used to analyze outcomes. Results: Three RCTs and three cohort studies with 1522 patients were included. The ERAS and control groups were compared on eight outcomes. LOS was significantly reduced in the intervention group (p=0.00001); readmission rates did not differ significantly (p=0.35); postoperative complications were higher in the control group, but the result was non-significant (p=0.68); and the postoperative pain score was significantly reduced (p=0.005). Total morphine milligram equivalent (MME) requirements became significant after sensitivity analysis (p=0.0004). Postoperative mortality could not be analyzed because two cohort studies reported 0% mortality, leaving no events to compare. Conclusion: This systematic review indicates that applying ERAS protocols to LSG reduces length of stay, postoperative pain and total postoperative MME requirements, supporting the feasibility of their application.
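The pooled effects RevMan reports are inverse-variance weighted averages of the per-study effects. As a minimal illustration of that computation (the mean differences and standard errors below are invented and are not taken from the included studies):

```python
import numpy as np

# Hypothetical per-study mean differences in length of stay (days, ERAS minus
# control) and their standard errors; values are invented for illustration.
md = np.array([-1.2, -0.8, -1.5])
se = np.array([0.4, 0.5, 0.6])

# Inverse-variance (fixed-effect) pooling: weight each study by 1/SE^2
w = 1 / se**2
pooled = np.sum(w * md) / np.sum(w)
pooled_se = np.sqrt(1 / np.sum(w))
ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)
print(f"pooled MD = {pooled:.2f} days, 95% CI [{ci[0]:.2f}, {ci[1]:.2f}]")
```

A random-effects analysis adds a between-study variance term to each weight, but the weighted-average structure is the same.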

Keywords: ERAS protocol, sleeve gastrectomy, bariatric surgery, enhanced recovery after surgery

Procedia PDF Downloads 45
50324 Nutrition, Dental Status and Post-Traumatic Stress Disorder among Underage Refugees in Germany

Authors: Marios Loucas, Rafael Loucas, Oliver Muensterer

Abstract:

Aim of the Study: Over the last two years, there has been a substantial rise in the number of refugees entering Germany, approximately one-third of whom are underage. Little is known about their general state of health, including nutrition, dental status and post-traumatic stress disorder. Our study assesses the general health status of underage refugees based on a large sample cohort. Methods: After ethics board approval, we used a structured questionnaire to collect demographic information and health-related elements in 3 large refugee accommodation centers, focusing on nutritional and dental status as well as symptoms of posttraumatic stress disorder. Main results: A total of 461 minor refugees were included. The majority were boys (54.5%), and the average age was 8 years. Of the 8 recorded countries of origin, most children came from Syria (33.6%), followed by Afghanistan (23.2%). Of the participants, 50.3% met DSM-5 criteria for posttraumatic stress disorder and presented with mental health-related problems. The most frequently reported mental abnormalities were concentration disturbances (15.2%), sleep disorders (6.9%) and headaches of unclear origin (5.4%). The majority of the participants showed an unfavorable nutritional and dental status. According to their families, the majority of the children rarely ate healthy foods such as fruits, vegetables and fish, whereas over 90% consumed large quantities of sugary foods and sweetened drinks such as soft drinks and confectionery at least daily. Caries was found in 63% of the children included in the study. A large proportion (47%) reported never brushing their teeth. According to their families, 78.3% of refugee children had never been evaluated by a dentist in Germany; the remainder visited a dentist mainly because of unbearable toothache.
Conclusions: Minor refugees have specific psychological, nutritional and dental problems that must be considered in order to ensure appropriate medical care. Posttraumatic stress disorder is mainly caused by physical and emotional trauma suffered either during flight or in the refugee camps in Germany. These data call for widespread screening for psychological, dental and nutritional problems in underage refugees. Dental care in this cohort is completely inadequate, and nutritional programs should focus on educating families and providing the means to obtain healthy foods for these children.

Keywords: children, nutrition, posttraumatic stress disorder, refugee

Procedia PDF Downloads 173
50323 Management of Urinary Tract Infections by Nurse Practitioners in a Canadian Pediatric Emergency Department: A Retrospective Cohort Study

Authors: T. Mcgraw, F. N. Morin, N. Desai

Abstract:

Background: Antimicrobial resistance is a critical issue in global health care and a significant contributor to increased patient morbidity and mortality. Suspected urinary tract infection (UTI) is a key area of inappropriate antibiotic prescription in pediatrics. Management patterns of infectious diseases have been shown to vary by provider type within a single setting. The aim of this study was to assess compliance with national UTI management guidelines by nurse practitioners in a pediatric emergency department (ED). Methods: This was a post-hoc analysis of a retrospective cohort study reviewing visits to a tertiary care freestanding pediatric emergency department. Patients were included if they were 60 days to 36 months old and discharged with a diagnosis of UTI or ‘rule-out UTI’ between July 2015 and July 2020. The primary outcome measure was the proportion of visits seen by nurse practitioners (NPs) that were compliant with national guidelines for the diagnosis and treatment of suspected UTI. We performed descriptive statistics and comparative analyses to determine differences in practice patterns between NPs and physicians. Results: A total of 636 charts were reviewed, of which 402 patients met inclusion criteria. 17 patients were treated by NPs; 385 were treated by either Pediatric Emergency Medicine (PEM) physicians or non-PEM physicians. Overall, the proportion of infants receiving guideline-compliant care was 25.9% (21.8-30.4%). Of those who were prescribed antibiotics, 79.6% (74.7-83.8%) received first-line guideline-recommended therapy and 58.9% (53.8-63.8%) received fully compliant therapy with respect to age, dose, duration, and frequency. Of the patients treated by NPs, 16/17 (94%; 95% CI: 73.0-99.0) required antibiotics, 15/16 (93%; 95% CI: 71.7-98.9) were treated with the first-line agent (cephalexin), and 8/16 (50%; 95% CI: 28-72) received therapy compliant with guideline dose and duration.
Of the noncompliant prescriptions, 5/8 (63%; 95% CI: 30.6-86.3) were noncompliant because the dose was too high. There was no difference between physicians and nurse practitioners in the rate of guideline-compliant empiric antibiotic therapy (OR: 0.837, 95% CI: 0.302-2.69). Conclusion: In this post-hoc analysis, guideline noncompliance by nurse practitioners was common in children tested and treated for UTIs in a pediatric emergency department. Care by a nurse practitioner was not associated with a greater rate of noncompliance than care by a Pediatric Emergency Medicine physician. Future appropriately powered studies may focus on confirming these results.

Keywords: antibiotic stewardship, infectious disease, nurse practitioner, urinary tract infection

Procedia PDF Downloads 105
50322 Unveiling Comorbidities in Irritable Bowel Syndrome: A UK BioBank Study utilizing Supervised Machine Learning

Authors: Uswah Ahmad Khan, Muhammad Moazam Fraz, Humayoon Shafique Satti, Qasim Aziz

Abstract:

Approximately 10-14% of the global population experiences irritable bowel syndrome (IBS), a functional disorder defined by persistent abdominal pain and an irregular bowel pattern. IBS significantly impairs work productivity and disrupts patients' daily lives and activities. Although IBS is widespread, its underlying pathophysiology remains incompletely understood. This study aims to help characterize the phenotype of IBS patients by differentiating the comorbidities found in IBS patients from those in non-IBS patients using machine learning algorithms. We extracted samples coding for IBS from the UK BioBank cohort and randomly selected patients without a code for IBS to create a total sample size of 18,000. We selected the comorbidity codes of these cases from 2 years before and after their IBS diagnosis and compared them to the comorbidities in the non-IBS cohort. Machine learning models, including Decision Trees, Gradient Boosting, Support Vector Machine (SVM), AdaBoost, Logistic Regression, and XGBoost, were employed to assess their accuracy in predicting IBS. The most accurate model was then chosen to identify the features associated with IBS; in our case, we used XGBoost feature importance as the feature selection method. We then applied different models to the top 10% of features, which numbered 50. Gradient Boosting, Logistic Regression and XGBoost yielded a diagnosis of IBS with optimal accuracies of 71.08%, 71.43%, and 71.53%, respectively. The comorbidities most closely associated with IBS included gut diseases (haemorrhoids, diverticular disease), atopic conditions (asthma), and psychiatric comorbidities (depressive episodes or disorder, anxiety).
This finding emphasizes the need for a comprehensive approach when evaluating the phenotype of IBS, suggesting the possibility of identifying new subsets of IBS rather than relying solely on the conventional classification based on stool type. Additionally, our study demonstrates the potential of machine learning algorithms to predict the development of IBS from comorbidities, which may enhance diagnosis and facilitate better management of modifiable risk factors for IBS. Further research is necessary to confirm our findings and establish causality. Alternative feature selection methods and even larger, more diverse datasets may lead to more accurate classification models. Despite these limitations, our findings highlight the effectiveness of Logistic Regression and XGBoost in predicting IBS diagnosis.
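The two-stage pipeline described above (rank comorbidity features by boosted-tree importance, then refit classifiers on the top 10%) can be sketched as follows on synthetic data; scikit-learn's GradientBoostingClassifier stands in for XGBoost, and the cohort size, comorbidity matrix, and feature counts are invented for illustration:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n_patients, n_codes = 1500, 200  # toy stand-in for the 18,000-patient cohort

# Binary comorbidity matrix; only the first five codes carry signal
X = rng.integers(0, 2, (n_patients, n_codes)).astype(float)
logit = X[:, :5].sum(axis=1) - 2.5
y = (rng.random(n_patients) < 1 / (1 + np.exp(-logit))).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Stage 1: rank comorbidities by boosted-tree feature importance
gb = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
top = np.argsort(gb.feature_importances_)[::-1][: n_codes // 10]  # top 10%

# Stage 2: refit a simpler classifier on the selected comorbidities only
lr = LogisticRegression(max_iter=1000).fit(X_tr[:, top], y_tr)
acc = accuracy_score(y_te, lr.predict(X_te[:, top]))
print(f"held-out accuracy on top-10% comorbidity codes: {acc:.3f}")
```

Holding out a test set, as here, guards against the selection step leaking information into the accuracy estimate.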

Keywords: comorbidities, disease association, irritable bowel syndrome (IBS), predictive analytics

Procedia PDF Downloads 119
50321 The Promise of Social Enterprise to Improve Health Outcomes in Trafficking Survivors: A Quantitative Case Study

Authors: Sean Roy, Mercedes Miller

Abstract:

A study was conducted to assess positive outcomes among Filipino human trafficking survivors working at a social enterprise. As most existing research on trafficking survivors pertains to adverse outcomes for victims, the researchers sought to fill the dearth of data on positive outcomes. A quantitative study was conducted using a convenience sample of 41 participants across three staggered cohorts of the social enterprise. A Kruskal-Wallis H test indicated that participants in the third cohort (who had been employed at the social enterprise the longest) had significantly lower anxiety scores than participants in the other cohorts. This study indicates that social enterprises hold promise for positively impacting the anxiety of human trafficking survivors, and it provides a starting point for researchers seeking ways to positively influence survivors' lives.
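The Kruskal-Wallis H test compares three or more independent groups on a ranked outcome without assuming normality, which suits small cohorts like these. A minimal sketch with SciPy on synthetic anxiety scores (the group sizes mirror the 41-participant sample, but the scores themselves are invented):

```python
import numpy as np
from scipy.stats import kruskal

rng = np.random.default_rng(1)

# Synthetic anxiety scores for three staggered cohorts (41 participants total)
cohort1 = rng.normal(60, 8, 14)  # most recently employed
cohort2 = rng.normal(55, 8, 14)
cohort3 = rng.normal(45, 8, 13)  # longest-employed: lower anxiety by construction

stat, p = kruskal(cohort1, cohort2, cohort3)
print(f"H = {stat:.2f}, p = {p:.4f}")
```

A significant H statistic only says the groups differ somewhere; pairwise post-hoc comparisons would be needed to single out the third cohort.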

Keywords: human trafficking, Philippines, quantitative analysis, self-identity

Procedia PDF Downloads 166
50320 Stress Hyperglycaemia and Glycaemic Control Post Cardiac Surgery: Relaxed Targets May Be Acceptable

Authors: Nicholas Bayfield, Liam Bibo, Charley Budgeon, Robert Larbalestier, Tom Briffa

Abstract:

Introduction: Stress hyperglycaemia is common following cardiac surgery. Its optimal management is uncertain and may differ by diabetic status. This study assesses the in-hospital glycaemic management of cardiac surgery patients and associated postoperative outcomes. Methods: A retrospective cohort analysis of all patients undergoing cardiac surgery at Fiona Stanley Hospital from February 2015 to May 2019 was undertaken. Management and outcomes of hyperglycaemia following cardiac surgery were assessed, with follow-up to 1 year postoperatively. Multivariate regression modelling was utilised. Results: 1050 non-diabetic patients and 689 diabetic patients were included. In the non-diabetic cohort, patients with mild (peak blood sugar level [BSL] < 14.3 mmol/L), transient stress hyperglycaemia managed without insulin were not at an increased risk of wound-related morbidity (P=0.899) or mortality at 1 year (P=0.483). Insulin management was associated with wound-related readmission to hospital (P=0.004) and superficial sternal wound infection (P=0.047). Prolonged or severe stress hyperglycaemia was predictive of hospital readmission (P=0.050) but not of morbidity or mortality (P=0.546). Diabetes mellitus was an independent risk factor for 1-year mortality (OR: 1.972 [1.041-3.736], P=0.037), graft harvest site wound infection (OR: 1.810 [1.134-2.889], P=0.013) and wound-related readmission (OR: 1.866 [1.076-3.236], P=0.026). In diabetics, a postoperative peak BSL > 13.9 mmol/L was predictive of graft harvest site infection (OR: 3.528 [1.724-7.217], P=0.001) and wound-related readmission (OR: 3.462 [1.540-7.783], P=0.003) regardless of the modality of management. A peak BSL of 10.0-13.9 mmol/L did not increase the risk of morbidity/mortality compared to a peak BSL < 10.0 mmol/L (P=0.557). Diabetics with a peak BSL of 13.9 mmol/L or less did not have significantly increased morbidity/mortality compared to non-diabetics (P=0.418).
Conclusion: In non-diabetic patients, transient mild stress hyperglycaemia following cardiac surgery does not uniformly require treatment. In diabetic patients, postoperative hyperglycaemia with a peak BSL exceeding 13.9 mmol/L was associated with wound-related morbidity and hospital readmission following cardiac surgery.

Keywords: cardiac surgery, pulmonary embolism, pulmonary embolectomy, cardiopulmonary bypass

Procedia PDF Downloads 163
50319 Intergenerational Trauma: Patterns of Child Abuse and Neglect Across Two Generations in a Barbados Cohort

Authors: Rebecca S. Hock, Cyralene P. Bryce, Kevin Williams, Arielle G. Rabinowitz, Janina R. Galler

Abstract:

Background: Findings have been mixed regarding whether offspring of parents who were abused or neglected as children have a greater risk of experiencing abuse or neglect themselves. In addition, many studies on this topic are restricted to physical abuse and take place in a limited number of countries, representing a small segment of the world's population. Methods: We examined relationships between childhood maltreatment history assessed in a subset (N=68) of the original longitudinal birth cohort (G1) of the Barbados Nutrition Study and their now-adult offspring (G2) (N=111) using the Childhood Trauma Questionnaire-Short Form (CTQ-SF). We used Pearson correlations to assess relationships between parent and offspring CTQ-SF total and subscale scores (physical, emotional, and sexual abuse; physical and emotional neglect). Next, we ran multiple regression analyses, using the parental CTQ-SF total score and the parental Sexual Abuse score as primary predictors separately in our models of G2 CTQ-SF (total and subscale scores). Results: G1 total CTQ-SF scores were correlated with G2 offspring Emotional Neglect and total scores. G1 Sexual Abuse history was significantly correlated with G2 Emotional Abuse, Sexual Abuse, Emotional Neglect, and Total Score. In fully-adjusted regression models, parental (G1) total CTQ-SF scores remained significantly associated with G2 offspring reports of Emotional Neglect, and parental (G1) Sexual Abuse was associated with offspring (G2) reports of Emotional Abuse, Physical Abuse, Emotional Neglect, and overall CTQ-SF scores. Conclusions: Our findings support a link between parental exposure to childhood maltreatment and their offspring's self-reported exposure to childhood maltreatment. Of note, there was not an exact correspondence between the subcategory of maltreatment experienced from one generation to the next. Compared with other subcategories, G1 Sexual Abuse history was the most likely to predict G2 offspring maltreatment. 
Further studies are needed to delineate underlying mechanisms and to develop intervention strategies aimed at preventing intergenerational transmission.
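The analysis pipeline described above (bivariate Pearson correlations followed by adjusted multiple regression) can be sketched as follows. This is an illustrative reconstruction only: the data are synthetic, the covariates are placeholders, and none of the variable names correspond to the study's actual dataset.

```python
# Sketch of the reported two-step analysis: Pearson correlation between
# parent (G1) and offspring (G2) CTQ-SF scores, then a multiple regression
# with the G1 total score as primary predictor, adjusting for covariates.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 68                                              # size of the G1 subset
g1_total = rng.normal(40, 10, n)                    # parent CTQ-SF total (synthetic)
g2_neglect = 0.4 * g1_total + rng.normal(0, 8, n)   # offspring Emotional Neglect (synthetic)
covariates = rng.normal(size=(n, 2))                # placeholder adjustment variables

# Step 1: bivariate Pearson correlation
r, p = stats.pearsonr(g1_total, g2_neglect)

# Step 2: fully adjusted model via ordinary least squares
X = np.column_stack([np.ones(n), g1_total, covariates])
beta, *_ = np.linalg.lstsq(X, g2_neglect, rcond=None)
print(f"r = {r:.2f}, p = {p:.3g}; adjusted G1 coefficient = {beta[1]:.2f}")
```

With real data, the adjusted coefficient on the G1 total score is the quantity that remained significant for offspring Emotional Neglect in the study's fully adjusted models.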

Keywords: trauma, family, adolescents, intergenerational trauma, child abuse, child neglect, global mental health, North America

Procedia PDF Downloads 85
50318 Flipped Classroom in a European Public Health Program: The Need for Students' Self-Directness

Authors: Nynke de Jong, Inge G. P. Duimel-Peeters

Abstract:

The flipped classroom, an instructional strategy and a type of blended learning that reverses the traditional learning environment by delivering instructional content off- and online, inside and outside the classroom, has been implemented in a 4-week module on ageing in Europe at Maastricht University. The main aim in organizing this module was to implement flipped-classroom principles in order to create meaningful learning opportunities, with educational technologies used to deliver content outside the classroom. The technologies used in this module were an online interactive real-time lecture from England, two interactive face-to-face lectures with visual supports, one group session including role plays, and team-based learning meetings. The 2015-2016 cohort, which used these educational technologies, was compared with the 2014-2015 cohort, which studied the same content but according to the problem-based learning strategy that is the educational basis of Maastricht University, on module evaluation items such as the organization and instructiveness of the module. The 2015-2016 cohort, with its specific organization, was also evaluated in more depth on outcomes such as (1) the duration of the lecture as experienced by students, (2) the experienced content of the lecture, (3) the experienced extent of interaction, and (4) the format of lecturing. It was important to know how students reflected on duration and content, taking their background knowledge into account, in order to distinguish between content that built sufficiently on prior knowledge and was therefore challenging, and content that did not fit the course. For the evaluation, a structured online questionnaire was used in which respondents scored the topics mentioned above on a 4-point Likert scale.
At the end, there was room for narrative feedback so that respondents could express in more detail, if they wished, what they experienced as good or poor regarding the content of the module and its organization. The response rate of the evaluation was lower than expected (54%); however, given the written feedback and exam scores, we consider it a good and reliable overview that encourages further work. The response rate may be explained by the fact that resit students were included as well, and that there may be too much evaluation at some time points in the program. Overall, students were enthusiastic about the organization and content of the module, but their level of self-directed behavior, necessary for this kind of educational strategy, was too low. Students need more training in self-directness; the module will therefore be simplified in 2016-2017, with fewer, clearer topics and extra guidance (a step-by-step procedure). More specific information on the technologies used, as well as the outcomes (minimum and maximum rankings, mean and standard deviation), will be presented at the congress.
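The descriptive summary promised for the congress (minimum and maximum rankings, mean and standard deviation per Likert item, plus the response rate) can be sketched with the standard library alone. The topics and scores below are invented; only the 54% response rate echoes a figure from the abstract.

```python
# Sketch of the module-evaluation summary: 4-point Likert responses per
# evaluation topic, reduced to min, max, mean, and standard deviation.
import statistics

responses = {                          # invented example data
    "lecture duration": [3, 4, 2, 3, 4, 3],
    "lecture content":  [4, 3, 3, 4, 4, 2],
    "interaction":      [2, 3, 3, 2, 4, 3],
}

for topic, scores in responses.items():
    print(topic, min(scores), max(scores),
          round(statistics.mean(scores), 2),
          round(statistics.stdev(scores), 2))

# response rate = completed questionnaires / invited students (hypothetical counts)
rate = 27 / 50
print(f"response rate: {rate:.0%}")
```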

Keywords: blended learning, flipped classroom, public health, self-directness

Procedia PDF Downloads 221
50317 Congenital Diaphragmatic Hernia Outcomes in a Low-Volume Center

Authors: Michael Vieth, Aric Schadler, Hubert Ballard, J. A. Bauer, Pratibha Thakkar

Abstract:

Introduction: Congenital diaphragmatic hernia (CDH) is a condition characterized by the herniation of abdominal contents into the thoracic cavity, requiring postnatal surgical repair. Previous literature suggests improved CDH outcomes at high-volume regional referral centers compared to low-volume centers. The purpose of this study was to examine CDH outcomes at Kentucky Children’s Hospital (KCH), a low-volume center, compared to the Congenital Diaphragmatic Hernia Study Group (CDHSG). Methods: A retrospective chart review was performed at KCH for neonates with CDH from 2007-2019, subdivided into two cohorts: those requiring ECMO therapy and those not requiring ECMO therapy. Basic demographic data and measures of mortality and morbidity, including ventilator days and length of stay, were compared to the CDHSG. Measures of morbidity collected for the ECMO cohort included duration of ECMO, clinical bleeding, intracranial hemorrhage, sepsis, need for continuous renal replacement therapy (CRRT), need for sildenafil at discharge, timing of surgical repair, and total ventilator days. Statistical analysis was performed using IBM SPSS Statistics version 28; one-sample t-tests and one-sample Wilcoxon signed-rank tests were used as appropriate. Results: There were 27 neonatal patients with CDH at KCH from 2007-2019; 9 of the 27 required ECMO therapy. Birth weight and gestational age were similar between KCH and the CDHSG (2.99 kg vs 2.92 kg, p = 0.655; 37.0 weeks vs 37.4 weeks, p = 0.51). About half of the patients were inborn in both cohorts (52% vs 56%, p = 0.676). The KCH cohort had significantly more Caucasian patients (96% vs 55%, p < 0.001). Unadjusted mortality was similar in both groups (KCH 70% vs CDHSG 72%, p = 0.857). Using ECMO utilization (KCH 78% vs CDHSG 52%, p = 0.118) and need for surgical repair (KCH 95% vs CDHSG 85%, p = 0.060) as proxies for severity, mortality in the two groups was comparable.
No significant difference was noted for pulmonary outcomes such as average ventilator days (KCH 43.2 vs CDHSG 17.3, p = 0.078) and home oxygen dependency (KCH 44% vs CDHSG 24%, p = 0.108). Average length of hospital stay at KCH was similar to the CDHSG (64.4 vs 49.2 days, p = 1.000). Conclusion: Our study demonstrates that outcome in CDH patients is independent of a center's case volume. Management of CDH with a standardized approach in a low-volume center can yield similar outcomes. These data support the treatment of patients with CDH at low-volume centers as opposed to transferring them to higher-volume centers.
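The single-center-versus-registry design above reduces to testing each KCH outcome against the fixed CDHSG benchmark value. A hedged sketch of that comparison, assuming the registry value is treated as a known population parameter: the KCH values below are synthetic, not the chart-review data.

```python
# One-sample comparison of a center's outcome distribution against a
# registry benchmark: parametric (t-test) and non-parametric (one-sample
# Wilcoxon signed-rank, applied to differences from the benchmark).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
kch_vent_days = rng.gamma(shape=2.0, scale=15.0, size=18)  # skewed, like LOS data
cdhsg_mean = 17.3                                          # benchmark from the abstract

t_stat, t_p = stats.ttest_1samp(kch_vent_days, popmean=cdhsg_mean)
w_stat, w_p = stats.wilcoxon(kch_vent_days - cdhsg_mean)   # median difference = 0?

print(f"t-test p = {t_p:.3f}; Wilcoxon p = {w_p:.3f}")
```

The Wilcoxon variant is the safer default here, since ventilator days and length of stay are typically right-skewed.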

Keywords: ECMO, case volume, congenital diaphragmatic hernia, congenital diaphragmatic hernia study group, neonate

Procedia PDF Downloads 96
50316 The Efficacy of an Ideal RGP Fitting on Higher Order Aberrations (HOA) in 65 Keratoconus Patients

Authors: Ghandehari-Motlagh, Mohammad

Abstract:

Purpose: To evaluate the effect of an ideal RGP fit on HOA and keratoconus indices. Methods: In this cohort study, 65 keratoconus eyes with more than 3 lines (Snellen) of improvement between BSCVA and BCVA (RGP) were imaged with the Pentacam HR, and their topometric and Zernike analysis findings without RGP were recorded. Six months or more after RGP fitting (Rose-K, Boston XO2), Pentacam imaging was repeated and the same information was recorded. Results: 65 eyes with different grades of keratoconus were enrolled, with a mean age of 27.32 years (SD ± 5.51), including 28 males (43.1%) and 37 females (56.9%). 44 patients (67.7%) had a family history of keratoconus and 21 (31.25%) did not; 54 (83.1%) had a history of ocular allergy and 11 (16.9%) did not. The most frequent age of onset of keratoconus was 15 years (29.2%). This study showed meaningful correlations between the with- and without-RGP Pentacam indices and HOA in each grade of keratoconus. 92.3% of patients reported foreign-body sensation, yet 96.9% wore their RGPs 11-20 hours per day, which underlines the psychological effect of an ideal fit on patient motivation. Conclusion: With the three-point-touch principle of RGP fitting in keratoconic corneas, patients will have a decrease in HOA and thus a delayed need for PK or LK.
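The core comparison implied above is per-eye HOA with versus without the lens, which is naturally a paired test. A minimal sketch, assuming HOA is summarized as a single RMS value per eye; all numbers are synthetic, not the study's Pentacam data.

```python
# Paired comparison of higher-order-aberration (HOA) RMS per eye,
# with vs without an RGP lens, using a paired t-test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
hoa_without_rgp = rng.normal(1.8, 0.5, 65)                  # HOA RMS (µm), bare cornea
hoa_with_rgp = hoa_without_rgp - rng.normal(0.6, 0.2, 65)   # assumed reduction with ideal fit

t_stat, p = stats.ttest_rel(hoa_without_rgp, hoa_with_rgp)
reduction = np.mean(hoa_without_rgp - hoa_with_rgp)
print(f"mean HOA reduction = {reduction:.2f} µm, p = {p:.2e}")
```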

Keywords: keratoconus, rigid gas permeable lens, aberration, fitting

Procedia PDF Downloads 416
50315 PhenoScreen: Development of a Systems Biology Tool for Decision Making in Recurrent Urinary Tract Infections

Authors: Jonathan Josephs-Spaulding, Hannah Rettig, Simon Graspeunter, Jan Rupp, Christoph Kaleta

Abstract:

Background: Recurrent urinary tract infections (rUTIs) are a global cause of emergency room visits and represent a significant burden for public health systems. Therefore, metatranscriptomic approaches to investigate metabolic exchange and crosstalk between uropathogenic Escherichia coli (UPEC), which is responsible for 90% of UTIs, and collaborating pathogens of the urogenital microbiome are necessary to better understand the pathogenetic processes underlying rUTIs. Objectives: This study aims to determine the extent to which uropathogens exploit the host urinary metabolic environment to succeed during invasion. By developing patient-specific metabolic models of infection, these observations can be leveraged for the precision treatment of human disease. Methods: To date, we have set up an rUTI patient cohort and observed various urine-associated pathogens. From this cohort, we developed patient-specific metabolic models to predict bladder microbiome metabolism during rUTIs. This was done by creating an in silico metabolomic urine environment representative of human urine. Metabolic models of uptake and cross-feeding among rUTI pathogens were created from genomes in relation to this artificial urine environment. Finally, microbial interactions were constrained by metatranscriptomics to indicate the patient-specific metabolic requirements of pathogenic communities. Results: Metabolite uptake and cross-feeding are essential for strain growth; therefore, we plan to design patient-specific treatments that adjust urinary metabolites through nutritional regimens, counteracting uropathogens by depleting metabolites essential for their growth. These methods will provide mechanistic insights into the metabolic components of rUTI pathogenesis and an evidence-based tool for infection treatment.

Keywords: recurrent urinary tract infections, human microbiome, uropathogenic Escherichia coli, UPEC, microbial ecology

Procedia PDF Downloads 136
50314 Identification of New Familial Breast Cancer Susceptibility Genes: Are We There Yet?

Authors: Ian Campbell, Gillian Mitchell, Paul James, Na Li, Ella Thompson

Abstract:

The genetic cause of the majority of multiple-case breast cancer families remains unresolved. Next-generation sequencing has emerged as an efficient strategy for identifying predisposing mutations in individuals with inherited cancer. We are conducting whole-exome sequence analysis of germline DNA from multiple affected relatives from breast cancer families, with the aim of identifying rare protein-truncating and non-synonymous variants that are likely to include novel cancer-predisposing mutations. Data from more than 200 exomes show that on average each individual carries 30-50 protein-truncating mutations and 300-400 rare non-synonymous variants. Heterogeneity among our exome data strongly suggests that numerous moderate-penetrance genes remain to be discovered, with each gene individually accounting for only a small fraction of families (~0.5%). This scenario makes validation of candidate breast cancer predisposing genes in large case-control studies the rate-limiting step in resolving the missing heritability of breast cancer. The aim of this study is to screen genes that are recurrently mutated in our exome data in a larger cohort of cases and controls, to assess the prevalence of inactivating mutations that may be associated with breast cancer risk. We are using the Agilent HaloPlex Target Enrichment System to screen the coding regions of 168 genes in 1,000 BRCA1/2 mutation-negative familial breast cancer cases and 1,000 cancer-naive controls. To date, our interim analysis has identified 21 genes that carry an excess of truncating mutations in multiple breast cancer families versus controls. The established breast cancer susceptibility gene PALB2 is the most frequently mutated (13/998 cases versus 0/1009 controls), but other interesting candidates include NPSR1, GSN, POLD2, and TOX3. These and other genes are being validated in a second cohort of 1,000 cases and controls.
Our experience demonstrates that, beyond PALB2, the prevalence of mutations in the remaining breast cancer predisposition genes is likely to be very low, making definitive validation exceptionally challenging.
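A gene-level case-control comparison like the PALB2 result above is often expressed as a 2x2 carrier table and tested with Fisher's exact test. The carrier counts below are the ones quoted in the abstract, but the test itself is our illustrative framing, not necessarily the study's exact statistical method.

```python
# Burden comparison for one gene: truncating-mutation carriers in
# cases vs controls, one-sided Fisher's exact test for enrichment.
from scipy import stats

cases_carriers, cases_total = 13, 998      # PALB2 counts from the abstract
ctrl_carriers, ctrl_total = 0, 1009

table = [[cases_carriers, cases_total - cases_carriers],
         [ctrl_carriers, ctrl_total - ctrl_carriers]]
odds_ratio, p = stats.fisher_exact(table, alternative="greater")
print(f"PALB2 burden: p = {p:.2e}")
```

With a zero cell in controls, the odds ratio is unbounded; the p-value is the interpretable quantity here, and at ~0.5% carrier frequency this is roughly the power ceiling a 1,000-vs-1,000 design can achieve for a single gene.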

Keywords: predisposition, familial, exome sequencing, breast cancer

Procedia PDF Downloads 494