Search results for: Population-based cohort study
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 48346


48256 Vancomycin-Resistant Enterococcus and Implications for Trauma and Orthopaedic Care

Authors: O. Davies, K. Veravalli, P. Panwalkar, M. Tofighi, P. Butterick, B. Healy, A. Mofidi

Abstract:

Vancomycin-resistant enterococcus (VRE) infection is a condition that usually affects ICUs and transplant, dialysis, and cancer units, often as a nosocomial infection. After an outbreak in the acute trauma and orthopaedic unit at Morriston Hospital, we aimed to assess the conditions that predispose to VRE infection in our unit. Thirteen cases of VRE infection and five cases of VRE colonisation were identified in patients treated for orthopaedic care between 1/1/2020 and 1/11/2021. Cases were reviewed to identify predisposing factors, specifically age, presenting condition and treatment, presence of infection and antibiotic care, active haemo-oncological condition, long-term renal dialysis, previous hospitalisation, VRE predisposition and clearance (PREVENT) scores, and outcome of care. Presenting condition, treatment, presence of postoperative infection, VRE scores, and age were compared between the colonised and infected cohorts. The VRE species in both the colonised and infected groups was Enterococcus faecium in all but one patient. The colonised group had a similar age (t=0.6, p>0.05) and sex distribution (χ²=0.115, p=0.74), and the same presenting condition and treatment, which consisted of peri-femoral fixation or arthroplasty in all patients. The infected group included one case of myelodysplasia and four cases of chronic renal failure requiring dialysis. All of the infected patients had sustained an infective complication of their fracture fixation or arthroplasty requiring reoperation and antibiotics. The infected group had an average VRE predisposition score of 8.5 versus 3 in the colonised group (F=36, p<0.001). The PREVENT score was 7 in the infected group and 2 in the colonised group (F=153, p<0.001). Six patients (55%) succumbed to their infection, and one VRE infection resulted in limb loss. In the orthopaedic cohort, VRE infection is a nosocomial condition with a peri-femoral predilection, seen in association with immunosuppression or renal failure.
The VRE infection cohort had been treated for an infective complication of the original surgery in the weeks prior to VRE infection. Based on our findings, we advise avoidance of infective complications, a change of practice in antibiotic use, and the use of radical surgery and surveillance for VRE infection beyond standard infection precautions. The PREVENT scores indicate that the infected group, unlike the colonised group, is unlikely to clear their VRE in the future.

Keywords: surgical site infection, enterococcus, orthopaedic surgery, vancomycin resistance

Procedia PDF Downloads 112
48255 Pregnancy Rate and Outcomes after Uterine Fibroid Embolization: Single Centre Experience in the Middle East from the United Arab Emirates at Alain Hospital

Authors: Jamal Alkoteesh, Mohammed Zeki, Mouza Alnaqbi

Abstract:

Objective: To evaluate pregnancy outcomes, complications, and neonatal outcomes in women who had previously undergone uterine artery embolization. Design: Retrospective study. In this study, most women opted for UFE as a fertility treatment after failure of myomectomy or in vitro fertilization, or because hysterectomy was the only suggested option. Background: Myomectomy is the standard approach in patients with fibroids desiring a future pregnancy. However, myomectomy may be difficult in cases of numerous interstitial and/or submucous fibroids. In these cases, UFE has the advantage of embolizing all fibroids in one procedure, and it is an accepted nonsurgical treatment for symptomatic uterine fibroids. Study Methods: A retrospective study of 210 patients treated with UFE for symptomatic uterine fibroids between 2011 and 2016 was performed. UFE was performed using particles (PVA; Embozen, Beadblock) 500-900 µm in diameter. Pregnancies were identified using screening questionnaires and the study database. Of the 210 patients who received UFE treatment, 35 women younger than 40 years of age wanted to conceive and had been unable to. All women in our study were advised to wait six months or more after UFE before attempting to become pregnant; the reported time range before attempting to conceive was seven to 33 months (average 20 months). Results: In a retrospective chart review of the 35 patients younger than 40 years of age, 18 patients reported 23 pregnancies, of which five were miscarriages. Two more pregnancies were complicated by premature labor. Of the 23 pregnancies, 16 were normal full-term pregnancies; 15 women had conceived once, and four had become pregnant twice. The remaining patients did not conceive. In the study, there was no reported intrauterine growth retardation in the prenatal period, fetal distress during labor, or problems related to uterine integrity.
Two patients reported minor problems during pregnancy: borderline oligohydramnios and a low-lying placenta. In the cohort of women who did conceive, 16 of 18 births proceeded normally without any complications (86%). Eight women delivered by cesarean section, and 10 women had a normal vaginal delivery. In this study of 210 women, UFE had a fertility rate of 47%. Our group of 23 pregnancies was small but did confirm successful pregnancy after UFE. The 45.7% rate of women below the age of 40 who completed a term pregnancy compares favorably with women who underwent myomectomy via other methods. Of the women in the cohort who did conceive, subsequent births proceeded normally (86%). Conclusion: Pregnancy after UFE is well documented. The risks of infertility following embolization, premature menopause, and hysterectomy are small, as is the radiation exposure during embolization. Fertility rates appear similar to those of patients undergoing myomectomy. UFE should not be contraindicated in patients who want to conceive; such patients should be able to choose between surgical options and UFE.

Keywords: fibroid, pregnancy, therapeutic embolization, uterine artery

Procedia PDF Downloads 208
48254 Occupational Heat Stress Related Adverse Pregnancy Outcome: A Pilot Study in South India Workplaces

Authors: Rekha S., S. J. Nalini, S. Bhuvana, S. Kanmani, Vidhya Venugopal

Abstract:

Introduction: Pregnant women's occupational heat exposure has been linked to foetal abnormalities and pregnancy complications. Workplace heat is expected to lead to Adverse Pregnancy Outcomes (APO), especially in tropical countries where temperatures are rising and workplace cooling interventions are minimal. Effective interventions require in-depth understanding of, and evidence about, occupational heat stress and APO. Methodology: Approximately 800 pregnant women in and around Chennai who were employed in jobs requiring moderate to hard labour participated in the cohort study. During the study period (2014-2019), environmental heat exposures were measured using a Questemp WBGT monitor, and heat strain markers, such as Core Body Temperature (CBT) and Urine Specific Gravity (USG), were evaluated using an infrared thermometer and a refractometer, respectively. Self-reported health symptoms were collected using the validated HOTHAPS questionnaire. In addition, a postpartum follow-up with the mothers was conducted to collect APO-related data. Major findings of the study: Approximately 47.3% of pregnant workers had workplace WBGTs over the safe manual work threshold value for moderate/heavy work (average WBGT of 26.6°C±1.0°C). About 12.5% of the workers had CBT levels above the normal range, and 24.8% had USG levels above 1.020, suggesting mild dehydration. Miscarriages (3%), stillbirths/preterm births (3.5%), and low birth weights (8.8%) were the most common adverse outcomes among pregnant employees. In addition, WBGT exposures above TLVs during all trimesters were associated with a 2.3-fold increased risk of adverse fetal/maternal outcomes (95% CI: 1.4-3.8), after adjusting for potential confounding variables including age, education, socioeconomic status, abortion history, stillbirth, preterm birth, LBW, and BMI.
The study determined that workplace WBGTs had direct short- and long-term effects on the health of both mother and foetus. Despite the study's limited scope, the findings provide valuable insights and highlight the need for future comprehensive cohort studies and extensive data in order to establish effective policies that protect vulnerable pregnant women from the dangers of heat stress and promote reproductive health.
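The 2.3-fold increased risk reported above is a risk ratio with a log-normal confidence interval. As a hedged illustration only (the abstract does not report the underlying 2x2 table, so the counts and the function name below are hypothetical), the computation can be sketched as:

```python
import math

def risk_ratio(exposed_events, exposed_total, unexposed_events, unexposed_total, z=1.96):
    """Unadjusted risk ratio with a 95% CI (log-normal approximation)."""
    r1 = exposed_events / exposed_total    # risk in the exposed group
    r0 = unexposed_events / unexposed_total  # risk in the unexposed group
    rr = r1 / r0
    # Standard error of log(RR)
    se = math.sqrt(1 / exposed_events - 1 / exposed_total
                   + 1 / unexposed_events - 1 / unexposed_total)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical counts: 60 adverse outcomes among 380 exposed workers,
# 20 among 420 unexposed workers
rr, lo, hi = risk_ratio(60, 380, 20, 420)
```

Note that the study's 2.3 (1.4-3.8) estimate was additionally adjusted for confounders, which a crude 2x2 computation like this cannot reproduce.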

Keywords: adverse outcome, heat stress, interventions, physiological strain, pregnant women

Procedia PDF Downloads 44
48253 Prevalence and Risk Factors Associated with Nutrition Related Non-Communicable Diseases in a Cohort of Males in the Central Province of Sri Lanka

Authors: N. W. I. A. Jayawardana, W. A. T. A. Jayalath, W. M. T. Madhujith, U. Ralapanawa, R. S. Jayasekera, S. A. S. B. Alagiyawanna, A. M. K. R. Bandara, N. S. Kalupahana

Abstract:

There is mounting evidence that dietary and lifestyle changes affect the incidence of non-communicable diseases (NCDs). This study was conducted to investigate the association of diet, physical activity, smoking, alcohol consumption, and duration of sleep with overweight, obesity, hypertension, and diabetes in a cohort of males from the Central Province of Sri Lanka. A total of 2694 individuals aged 17-68 years (mean = 31) were included in the study. Body Mass Index cutoff values for Asians were used to categorize participants as normal, overweight, or obese. Dietary data were collected using a food frequency questionnaire (FFQ), and data on physical activity level, smoking, alcohol consumption, and sleeping hours were obtained using a self-administered validated questionnaire. Systolic and diastolic blood pressure and random blood glucose levels were measured to determine the incidence of hypertension and diabetes. Among the individuals, the prevalence of overweight and obesity was 34% and 16.4%, respectively. Approximately 37% of the participants suffered from hypertension. Overweight and obesity were associated with older age (P<0.0001), frequency of smoking (P=0.0434), alcohol consumption level (P=0.0287), and quantity of lipid intake (P=0.0081). Consumption of fish (P=0.6983) and salty snacks (P=0.8327), sleeping hours (P=0.6847), and level of physical activity (P=0.3301) were not significantly associated with the incidence of overweight and obesity. Based on the fitted model, only age was significantly associated with hypertension (P<0.001). Further, age (P<0.0001), sleeping hours (P=0.0953), and consumption of fatty foods (P=0.0930) were associated with diabetes. Age was associated with higher odds of prediabetes (OR: 1.089; 95% CI: 1.053-1.127) and diabetes (OR: 1.077; 95% CI: 1.055-1.1), whereas 7-8 hours of sleep per day was associated with lower odds of diabetes (OR: 0.403; 95% CI: 0.184-0.884).
The high prevalence of overweight, obesity, and hypertension in working-age males is a threatening sign for this area. As this population ages and urbanization continues, the prevalence of the above risk factors will likely escalate.
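The abstract above categorizes participants using "BMI cutoff values for Asians" without stating them. A minimal sketch of such a classifier follows, assuming the commonly cited WHO Asian action points (overweight at BMI >= 23, obese at >= 27.5); the exact thresholds the authors used are not given, so these values and the function name are assumptions:

```python
def classify_bmi_asian(weight_kg, height_m):
    """Classify BMI using assumed WHO Asian cut-off points:
    <18.5 underweight, 18.5-22.9 normal, 23.0-27.4 overweight, >=27.5 obese."""
    bmi = weight_kg / height_m ** 2
    if bmi < 18.5:
        return "underweight"
    if bmi < 23.0:
        return "normal"
    if bmi < 27.5:
        return "overweight"
    return "obese"
```

For example, a 75 kg man 1.70 m tall (BMI about 26) would be labelled overweight under these cut-offs, though normal under the standard 25/30 thresholds, which is why the choice of cut-off matters for the prevalence figures reported.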

Keywords: age, males, non-communicable diseases, obesity

Procedia PDF Downloads 313
48252 Multilevel Factors Affecting Optimal Adherence to Antiretroviral Therapy and Viral Suppression amongst HIV-Infected Prisoners in South Ethiopia: A Prospective Cohort Study

Authors: Terefe Fuge, George Tsourtos, Emma Miller

Abstract:

Objectives: Maintaining optimal adherence and viral suppression in people living with HIV (PLWHA) is essential to ensure both the preventative and therapeutic benefits of antiretroviral therapy (ART). Prisoners bear a particularly high burden of HIV infection and are highly likely to transmit it to others during and after incarceration. However, the level of adherence and viral suppression, as well as the associated factors, in incarcerated populations in low-income countries is unknown. This study aimed to determine the prevalence of non-adherence and viral failure, and the factors contributing to them, amongst prisoners in South Ethiopia. Methods: A prospective cohort study was conducted between June 1, 2019 and July 31, 2020 to compare the level of adherence and viral suppression between incarcerated and non-incarcerated PLWHA. The study involved 74 inmates living with HIV (ILWHA) and 296 non-incarcerated PLWHA. Background information including sociodemographic, socioeconomic, psychosocial, behavioural, and incarceration-related characteristics was collected using a structured questionnaire. Adherence was determined based on participants' self-report and pharmacy refill records, and plasma viral load measurements undertaken within the study period were prospectively extracted to determine viral suppression. Various univariate and multivariate regression models were used to analyse the data. Results: Self-reported dose adherence was approximately similar between ILWHA and non-incarcerated PLWHA (81% and 83%, respectively), but ILWHA had a significantly higher medication possession ratio (MPR) (89% vs 75%). The prevalence of viral failure (VF) was slightly higher in ILWHA (6%) than in non-incarcerated PLWHA (4.4%). Overall dose non-adherence (NA) was significantly associated with missing ART appointments, level of satisfaction with ART services, patients' ability to comply with a specified medication schedule, and the type of method used to monitor the schedule.
In ILWHA specifically, accessing ART services from a hospital rather than a health centre, an inability to always attend clinic appointments, experience of depression, and a lack of social support predicted NA. VF was significantly higher in males, in people aged 31-35 years, and in those who experienced social stigma, regardless of incarceration status. Conclusions: This study revealed that HIV-infected prisoners in South Ethiopia were more likely than their non-incarcerated counterparts to be non-adherent to doses and thus to develop viral failure. A multitude of factors was found to be responsible, requiring multilevel intervention strategies focused on the specific needs of prisoners.
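The medication possession ratio (MPR) reported above (89% vs 75%) is conventionally computed from pharmacy refill records as the days of medication supplied divided by the days in the observation window, often capped at 1.0. A minimal sketch, with hypothetical refill data (the abstract does not describe its exact MPR formula, so the capping convention is an assumption):

```python
def medication_possession_ratio(days_supplied, period_days):
    """MPR = total days of medication supplied / days in the observation
    period, capped at 1.0 (a common convention for early refills)."""
    return min(sum(days_supplied) / period_days, 1.0)

# e.g. three 30-day refills collected over a 100-day window
mpr = medication_possession_ratio([30, 30, 30], 100)  # 0.9
```

A patient refilling early enough to accumulate more supply than days elapsed would otherwise exceed 1.0, which is why the cap is applied.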

Keywords: adherence, antiretroviral therapy, incarceration, South Ethiopia, viral suppression

Procedia PDF Downloads 96
48251 Factors Associated with Death during Tuberculosis Treatment of Patients Co-Infected with HIV at a Tertiary Care Setting in Cameroon: An 8-Year Hospital-Based Retrospective Cohort Study (2006-2013)

Authors: A. A. Agbor, Jean Joel R. Bigna, Serges Clotaire Billong, Mathurin Cyrille Tejiokem, Gabriel L. Ekali, Claudia S. Plottel, Jean Jacques N. Noubiap, Hortence Abessolo, Roselyne Toby, Sinata Koulla-Shiro

Abstract:

Background: Contributors to fatal outcomes in patients undergoing tuberculosis (TB) treatment in the setting of HIV co-infection are poorly characterized, especially in sub-Saharan Africa. Our study's aim was to assess factors associated with death in TB/HIV co-infected patients during the first 6 months of their TB treatment. Methods: We conducted a tertiary-care hospital-based retrospective cohort study from January 2006 to December 2013 at the Yaoundé Central Hospital, Cameroon. We reviewed medical records to identify hospitalized co-infected TB/HIV patients aged 15 years and older. Death was defined as any death occurring during TB treatment, as per the World Health Organization's recommendations. Logistic regression analysis identified factors associated with death. Magnitudes of association were expressed as adjusted odds ratios (aOR) with 95% confidence intervals. A p value < 0.05 was considered statistically significant. Results: The 337 patients enrolled had a mean age of 39.3 (+/- 10.3) years, and more than half (54.3%) were women. TB treatment outcomes included: treatment success in 60.8% (n=205), death in 29.4% (n=99), not evaluated in 5.3% (n=18), loss to follow-up in 5.3% (n=14), and failure in 0.3% (n=1). After exclusion of patients lost to follow-up or not evaluated, death in TB/HIV co-infected patients during TB treatment was associated with: a TB diagnosis made before national implementation of guidelines on initiation of antiretroviral therapy (aOR = 2.50 [1.31-4.78]; p = 0.006), the presence of other AIDS-defining infections (aOR = 2.73 [1.27-5.86]; p = 0.010), non-AIDS comorbidities (aOR = 3.35 [1.37-8.21]; p = 0.008), not receiving co-trimoxazole prophylaxis (aOR = 3.61 [1.71-7.63]; p = 0.001), not receiving antiretroviral therapy (aOR = 2.45 [1.18-5.08]; p = 0.016), and CD4 cell counts < 50 cells/mm3 (aOR = 16.43 [1.05-258.04]; p = 0.047).
Conclusions: The success rate of anti-tuberculosis treatment among hospitalized TB/HIV co-infected patients in our setting is low. Mortality in the first 6 months of treatment was high and strongly associated with specific clinical factors, including states of greater immunosuppression, highlighting the urgent need for targeted interventions, including provision of antiretroviral therapy and co-trimoxazole prophylaxis, to enhance patient outcomes.
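The adjusted odds ratios above come from multivariable logistic regression, which cannot be reproduced from the abstract alone. For intuition, though, a crude (unadjusted) odds ratio with Woolf's log-normal confidence interval can be computed from a 2x2 table; the function and counts below are illustrative only, not the study's data:

```python
import math

def odds_ratio(a, b, c, d, z=1.96):
    """Crude odds ratio for a 2x2 table with Woolf's 95% CI.
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    # Woolf's standard error of log(OR)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical table: 10 deaths among 30 patients without ART,
# 5 deaths among 45 patients on ART
or_, lo, hi = odds_ratio(10, 20, 5, 40)
```

The very wide CI reported for CD4 < 50 cells/mm3 (1.05-258.04) is typical of what this formula produces when one cell of the table is very small.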

Keywords: TB/HIV co-infection, death, treatment outcomes, factors

Procedia PDF Downloads 421
48250 The Promise of Social Enterprise to Improve Health Outcomes in Trafficking Survivors: A Quantitative Case Study

Authors: Sean Roy, Mercedes Miller

Abstract:

A study was conducted to assess positive outcomes among Filipino human trafficking survivors working at a social enterprise. As most existing research on trafficking survivors pertains to adverse outcomes, the researchers sought to fill the dearth of data on positive outcomes. A quantitative study was conducted using a convenience sample of 41 participants across three staggered cohorts of the social enterprise. A Kruskal-Wallis H test indicated that participants in the third cohort (who had been employed at the social enterprise the longest) had significantly lower anxiety scores than participants in the other cohorts. This study indicates that social enterprises hold promise for positively impacting anxiety in human trafficking survivors and provides a starting point for researchers seeking to assess ways to positively influence survivors' lives.

Keywords: human trafficking, Philippines, quantitative analysis, self-identity

Procedia PDF Downloads 140
48249 Effect of Malnutrition at Admission on Length of Hospital Stay among Adult Surgical Patients in Wolaita Sodo University Comprehensive Specialized Hospital, South Ethiopia: Prospective Cohort Study, 2022

Authors: Yoseph Halala Handiso, Zewdi Gebregziabher

Abstract:

Background: Malnutrition in hospitalized patients remains a major public health problem in both developed and developing countries. Despite the fact that malnourished patients are more prone to longer hospital stays, there are limited data on the magnitude of malnutrition and its effect on length of stay among surgical patients in Ethiopia, and nutritional assessment is often a neglected component of health service practice. Objective: This study aimed to assess the prevalence of malnutrition at admission and its effect on length of hospital stay among adult surgical patients in Wolaita Sodo University Comprehensive Specialized Hospital, South Ethiopia, 2022. Methods: A facility-based prospective cohort study was conducted among 398 adult surgical patients admitted to the hospital. Participants were chosen using a convenience sampling technique. Subjective Global Assessment (SGA) was used to determine the nutritional status of patients with a minimum stay of 24 hours, within 48 hours of admission. Data were collected using the Open Data Kit (ODK) version 2022.3.3 software, while Stata version 14.1 was employed for statistical analysis. A Cox regression model was used to determine the effect of malnutrition on length of hospital stay (LOS) after adjusting for several potential confounders measured at admission. Adjusted hazard ratios (HR) with 95% confidence intervals were used to express the effect of malnutrition. Results: The prevalence of hospital malnutrition at admission was 64.32% (95% CI: 59%-69%) according to the SGA classification. Adult surgical patients who were malnourished at admission had a longer median LOS (12 days; 95% CI: 11-13) than well-nourished patients (8 days; 95% CI: 8-9); that is, malnourished patients had a lower chance of discharge with improvement (prolonged LOS) (AHR: 0.37, 95% CI: 0.29-0.47) compared to well-nourished patients.
Presence of comorbidity (AHR: 0.68, 95% CI: 0.50-0.90), polymedication (AHR: 0.69, 95% CI: 0.55-0.86), and history of admission within the previous five years (AHR: 0.70, 95% CI: 0.55-0.87) were found to be significant covariates of length of hospital stay. Conclusion: The magnitude of hospital malnutrition at admission was high, and malnourished patients had a higher risk of prolonged hospital stay than well-nourished patients. The presence of comorbidity, polymedication, and history of admission were significant covariates of LOS. All stakeholders should give attention to reducing the magnitude of malnutrition and its covariates to reduce the burden of prolonged LOS.
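The median LOS figures above are the kind of quantity a Kaplan-Meier estimator yields when "discharge with improvement" is the event and remaining in hospital is "survival". As a simplified illustration with hypothetical data (the study itself fit a Cox model, which this sketch does not attempt):

```python
def km_median(times, events):
    """Kaplan-Meier median: the smallest time at which the estimated
    probability of still being in hospital drops to 0.5 or below.
    times: follow-up in days; events: 1 = discharged, 0 = censored."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    s = 1.0  # estimated probability of still being in hospital
    i = 0
    while i < len(data):
        t = data[i][0]
        d = sum(e for tt, e in data if tt == t and e == 1)  # discharges at t
        c = sum(1 for tt, e in data if tt == t)  # all leaving the risk set at t
        if d > 0:
            s *= 1 - d / n_at_risk
        n_at_risk -= c
        if s <= 0.5:
            return t
        i += c
    return None  # median not reached

# Hypothetical stays (days), all ending in discharge
median_los = km_median([3, 5, 7], [1, 1, 1])  # 5
```

Comparing such medians between the malnourished and well-nourished groups mirrors the 12-day vs 8-day contrast reported, while the Cox model additionally adjusts for admission covariates.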

Keywords: effect of malnutrition, length of hospital stay, surgical patients, Ethiopia

Procedia PDF Downloads 21
48248 A Systematic Review of Patient-Reported Outcomes and Return to Work after Surgical vs. Non-surgical Midshaft Humerus Fracture

Authors: Jamal Alasiri, Naif Hakeem, Saoud Almaslmani

Abstract:

Background: Patients with humeral shaft fractures have two treatment options. Surgical therapy carries lower risks of non-union, mal-union, and re-intervention than non-surgical therapy. These favourable clinical outcomes make the surgical approach a preferable option despite the risks of radial nerve palsy and additional surgery-related risks. We aimed to evaluate patient-reported outcomes and return to work after surgical vs. non-surgical management of humeral shaft fracture. Methods: We searched databases including PubMed, Medline, and the Cochrane Register of Controlled Trials from 2010 to January 2022 for randomised controlled trials (RCTs) and cohort studies comparing patient-reported outcome measures and return to work between surgical and non-surgical management of humeral shaft fracture. Results: After carefully evaluating 1352 articles, we included three RCTs (232 patients) and one cohort study (39 patients). The surgical interventions used plate/nail fixation, while the non-surgical interventions used a splint or brace. The pooled DASH effects of all three RCTs at six months (MD: -7.5 [-13.20, -1.89], p = 0.009, I2: 44%) and 12 months (MD: -1.32 [-3.82, 1.17], p = 0.29, I2: 0%) favoured patients treated surgically over those treated non-surgically. The pooled Constant-Murley scores at six months (MD: 7.945 [2.77, 13.10], p = 0.003, I2: 0%) and 12 months (MD: 1.78 [-1.52, 5.09], p = 0.29, I2: 0%) favoured patients who received non-surgical over surgical therapy. However, the pooled analysis of return to work for both groups remained inconclusive. Conclusion: Altogether, we found no significant evidence supporting the clinical benefits of surgical over non-surgical therapy. Thus, the non-surgical approach remains a preferred therapeutic choice for managing humeral shaft fractures due to its lesser side effects.
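The pooled mean differences and I2 values above follow the standard inverse-variance fixed-effect recipe: weight each study by 1/SE^2, pool, and derive I2 from Cochran's Q. A self-contained sketch (with illustrative per-study values, since the abstract does not report the individual study MDs and SEs):

```python
import math

def pool_fixed_effect(mds, ses):
    """Inverse-variance fixed-effect pooled mean difference,
    its 95% CI, and the I^2 heterogeneity statistic."""
    w = [1 / se ** 2 for se in ses]          # inverse-variance weights
    pooled = sum(wi * md for wi, md in zip(w, mds)) / sum(w)
    se_pooled = math.sqrt(1 / sum(w))
    ci = (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)
    q = sum(wi * (md - pooled) ** 2 for wi, md in zip(w, mds))  # Cochran's Q
    df = len(mds) - 1
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return pooled, ci, i2

# Illustrative three-study pooling (not the review's actual data)
pooled, ci, i2 = pool_fixed_effect([-9.0, -6.0, -7.0], [2.5, 3.0, 2.8])
```

An I2 of 44%, as reported for the six-month DASH pooling, indicates moderate between-study heterogeneity under the usual interpretation thresholds.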

Keywords: shaft humeral fracture, surgical treatment, patient-related outcomes, return to work, DASH

Procedia PDF Downloads 71
48247 Nutrition, Dental Status and Post-Traumatic Stress Disorder among Underage Refugees in Germany

Authors: Marios Loucas, Rafael Loucas, Oliver Muensterer

Abstract:

Aim of the Study: Over the last two years, there has been a substantial rise in refugees entering Germany, of whom approximately one-third are underage. Little is known about their general state of health, including nutrition, dental status, and post-traumatic stress disorder. Our study assesses the general health status of underage refugees based on a large sample cohort. Methods: After ethics board approval, we used a structured questionnaire to collect demographic information and health-related data in 3 large refugee accommodation centers, focusing on nutritional and dental status as well as symptoms of post-traumatic stress disorder. Main results: A total of 461 minor refugees were included. The majority were boys (54.5%), and the average age was 8 years. Of the 8 recorded countries of origin, most children came from Syria (33.6%), followed by Afghanistan (23.2%). Of the participants, 50.3% met DSM-5 criteria for post-traumatic stress disorder and presented mental health-related problems. The most frequently reported mental abnormalities were concentration disturbances (15.2%), sleep disorders (6.9%), and unexplained headaches (5.4%). The majority of participants showed an unfavorable nutritional and dental status. According to their families, the majority of the children rarely eat healthy foods such as fruits, vegetables, and fish, whereas over 90% consume large quantities of sugary foods and sweetened drinks, such as soft drinks and confectionery, at least daily. Caries was found in 63% of the children included in the study. A large proportion (47%) reported never brushing their teeth. According to their families, 78.3% of the refugee children had never been evaluated by a dentist in Germany; the remainder visited a dentist mainly because of unbearable toothache.
Conclusions: Minor refugees have specific psychological, nutritional, and dental problems that must be considered to ensure appropriate medical care. Post-traumatic stress disorder is mainly caused by physical and emotional trauma suffered either during the flight or in refugee camps in Germany. These data call for widespread screening for psychological, dental, and nutritional problems in underage refugees. Dental care in this cohort is completely inadequate. Nutritional programs should focus on educating families and providing the means to obtain healthy foods for these children.

Keywords: children, nutrition, posttraumatic stress disorder, refugee

Procedia PDF Downloads 147
48246 Unveiling Comorbidities in Irritable Bowel Syndrome: A UK BioBank Study utilizing Supervised Machine Learning

Authors: Uswah Ahmad Khan, Muhammad Moazam Fraz, Humayoon Shafique Satti, Qasim Aziz

Abstract:

Approximately 10-14% of the global population experiences irritable bowel syndrome (IBS), a functional disorder defined by persistent abdominal pain and an irregular bowel pattern. IBS significantly impairs work productivity and disrupts patients' daily lives and activities. Although IBS is widespread, understanding of its underlying pathophysiology remains incomplete. This study aims to help characterize the phenotype of IBS patients by differentiating the comorbidities found in IBS patients from those in non-IBS patients using machine learning algorithms. We extracted samples coded for IBS from the UK Biobank cohort and randomly selected patients without a code for IBS, for a total sample size of 18,000. We selected the codes for comorbidities of these cases from 2 years before and after the IBS diagnosis and compared them to the comorbidities in the non-IBS cohort. Machine learning models, including Decision Trees, Gradient Boosting, Support Vector Machines (SVM), AdaBoost, Logistic Regression, and XGBoost, were employed to assess their accuracy in predicting IBS. The most accurate model was then used to identify the features associated with IBS; in our case, we used XGBoost feature importance as the feature selection method. We then applied the models to the top 10% of features, which numbered 50. The Gradient Boosting, Logistic Regression, and XGBoost algorithms predicted a diagnosis of IBS with optimal accuracies of 71.08%, 71.427%, and 71.53%, respectively. The comorbidities most closely associated with IBS included gut diseases (haemorrhoids, diverticular disease), atopic conditions (asthma), and psychiatric comorbidities (depressive episodes or disorder, anxiety).
This finding emphasizes the need for a comprehensive approach when evaluating the IBS phenotype, suggesting the possibility of identifying new subsets of IBS rather than relying solely on the conventional classification based on stool type. Our study also demonstrates the potential of machine learning algorithms to predict the development of IBS from comorbidities, which may enhance diagnosis and facilitate better management of modifiable risk factors. Further research is necessary to confirm our findings and establish cause and effect; alternative feature selection methods and even larger, more diverse datasets may yield more accurate classification models. Despite these limitations, our findings highlight the effectiveness of Logistic Regression and XGBoost in predicting IBS diagnosis.
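The study selects the top 10% of comorbidity codes by XGBoost feature importance before refitting its models. As a rough, dependency-free illustration of that "rank features, keep the top fraction" step only, the sketch below ranks codes by absolute prevalence difference between cohorts instead of XGBoost importance; all names and prevalences are hypothetical:

```python
def top_decile_features(prev_ibs, prev_control, fraction=0.10):
    """Rank comorbidity codes by absolute prevalence difference between
    cohorts and keep the top fraction (a simple univariate stand-in for
    the XGBoost importance ranking used in the study)."""
    scores = {code: abs(prev_ibs[code] - prev_control.get(code, 0.0))
              for code in prev_ibs}
    k = max(1, round(len(scores) * fraction))
    return sorted(scores, key=scores.get, reverse=True)[:k]

# Hypothetical prevalences (fraction of each cohort carrying the code)
ibs = {"haemorrhoids": 0.12, "asthma": 0.18, "anxiety": 0.22,
       "depression": 0.20, "hypertension": 0.15, "diabetes": 0.08,
       "migraine": 0.10, "eczema": 0.09, "gout": 0.02, "anaemia": 0.05}
ctl = {"haemorrhoids": 0.05, "asthma": 0.12, "anxiety": 0.10,
       "depression": 0.11, "hypertension": 0.14, "diabetes": 0.07,
       "migraine": 0.08, "eczema": 0.07, "gout": 0.02, "anaemia": 0.04}
selected = top_decile_features(ibs, ctl)  # top 10% of 10 codes -> 1 code
```

Unlike model-based importance, a univariate ranking like this ignores interactions between comorbidities, which is one reason the authors preferred XGBoost importance.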

Keywords: comorbidities, disease association, irritable bowel syndrome (IBS), predictive analytics

Procedia PDF Downloads 86
48245 Management of Urinary Tract Infections by Nurse Practitioners in a Canadian Pediatric Emergency Department: A Retrospective Cohort Study

Authors: T. Mcgraw, F. N. Morin, N. Desai

Abstract:

Background: Antimicrobial resistance is a critical issue in global health care and a significant contributor to increased patient morbidity and mortality. Suspected urinary tract infection (UTI) is a key area of inappropriate antibiotic prescription in pediatrics, and management patterns of infectious diseases have been shown to vary by provider type within a single setting. The aim of this study was to assess compliance with national UTI management guidelines by nurse practitioners in a pediatric emergency department (ED). Methods: This was a post-hoc analysis of a retrospective cohort study reviewing visits to a tertiary care freestanding pediatric emergency department. Patients were included if they were 60 days to 36 months old and discharged with a diagnosis of UTI or 'rule-out UTI' between July 2015 and July 2020. The primary outcome measure was the proportion of visits seen by nurse practitioners (NPs) that were associated with national guideline compliance in the diagnosis and treatment of suspected UTI. We performed descriptive statistics and comparative analyses to determine differences in practice patterns between NPs and physicians. Results: A total of 636 charts were reviewed, of which 402 patients met inclusion criteria. 17 patients were treated by NPs; 385 were treated by either Pediatric Emergency Medicine (PEM) or non-PEM physicians. Overall, the proportion of infants receiving guideline-compliant care was 25.9% (95% CI: 21.8-30.4%). Of those prescribed antibiotics, 79.6% (95% CI: 74.7-83.8%) received first-line guideline-recommended therapy, and 58.9% (95% CI: 53.8-63.8%) received fully compliant therapy with respect to age, dose, duration, and frequency. In patients treated by NPs, 16/17 (94%; 95% CI: 73.0-99.0%) required antibiotics, 15/16 (93%; 95% CI: 71.7-98.9%) were treated with the first-line agent (cephalexin), and 8/16 (50%; 95% CI: 28-72%) received therapy compliant with guideline dose and duration.
Of the noncompliant prescriptions, 5/8 (63%; 95% CI: 30.6-86.3%) were noncompliant because the dose was too high. There was no difference between physicians and nurse practitioners in the odds of receiving guideline-compliant empiric antibiotic therapy (OR: 0.837, 95% CI: 0.302-2.69). Conclusion: In this post-hoc analysis, guideline noncompliance by nurse practitioners was common in children tested and treated for UTIs in a pediatric emergency department. However, care by a nurse practitioner was not associated with a greater rate of noncompliance than care by a Pediatric Emergency Medicine physician. Future appropriately powered studies may focus on confirming these results.
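The wide confidence intervals quoted above for small NP subgroups (e.g. 16/17 with 95% CI 73.0-99.0%) are characteristic of the Wilson score interval, which behaves much better than the normal approximation at small n. A minimal sketch (the abstract does not state which interval was used, so Wilson is an assumption, though the numbers are consistent with it):

```python
import math

def wilson_ci(successes, n, z=1.96):
    """Wilson score 95% confidence interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z ** 2 / n
    centre = (p + z ** 2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2)) / denom
    return centre - half, centre + half

lo, hi = wilson_ci(16, 17)  # approximately (0.730, 0.990)
```

For 16 of 17 patients, this interval works out to roughly 73.0%-99.0%, matching the figures reported in the abstract.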

Keywords: antibiotic stewardship, infectious disease, nurse practitioner, urinary tract infection

Procedia PDF Downloads 78
48244 The Efficacy of an Ideal RGP Fitting on Higher Order Aberrations (HOA) in 65 Keratoconus Patients

Authors: Ghandehari-Motlagh, Mohammad

Abstract:

Purpose: To evaluate the effect of an ideal RGP fit on HOA and keratoconus indices. Methods: In this cohort study, 65 keratoconus eyes with more than 3 lines (Snellen) of improvement between BSCVA and BCVA (RGP) were imaged with Pentacam HR, and their topometric and Zernike analysis findings without RGP were recorded. Six months or more after RGP fitting (Rose-K, Boston XO2), Pentacam imaging was repeated and the same information was recorded. Results: 65 eyes with different grades of keratoconus were enrolled, with a mean age of 27.32 years (SD ±5.51), comprising 28 males (43.1%) and 37 females (56.9%). 44 (67.7%) had a family history of keratoconus and 21 (31.25%) had no keratoconus in their families; 54 (83.1%) had a history of ocular allergy and 11 (16.9%) did not. The most common age of onset of keratoconus was 15 years (29.2%). This study showed meaningful correlations between Pentacam indices and HOA, with and without RGP, in each grade of keratoconus. 92.3% of patients reported foreign body sensation, yet 96.9% wore their RGPs 11-20 hours/day, underscoring the psychological effect of an ideal fit on patient motivation. Conclusion: With the three-point-touch principle of RGP fitting in keratoconic corneas, patients will have a decrease in HOA and thus a delayed need for PK or LK.

Keywords: keratoconus, rigid gas permeable lens, aberration, fitting

Procedia PDF Downloads 369
48243 Congenital Diaphragmatic Hernia Outcomes in a Low-Volume Center

Authors: Michael Vieth, Aric Schadler, Hubert Ballard, J. A. Bauer, Pratibha Thakkar

Abstract:

Introduction: Congenital diaphragmatic hernia (CDH) is a condition characterized by herniation of abdominal contents into the thoracic cavity requiring postnatal surgical repair. Previous literature suggests improved CDH outcomes at high-volume regional referral centers compared to low-volume centers. The purpose of this study was to examine CDH outcomes at Kentucky Children’s Hospital (KCH), a low-volume center, compared to the Congenital Diaphragmatic Hernia Study Group (CDHSG). Methods: A retrospective chart review was performed at KCH from 2007-2019 for neonates with CDH, subdivided into two cohorts: those requiring ECMO therapy and those not requiring ECMO therapy. Basic demographic data and measures of mortality and morbidity, including ventilator days and length of stay, were compared to the CDHSG. Measures of morbidity for the ECMO cohort, including duration of ECMO, clinical bleeding, intracranial hemorrhage, sepsis, need for continuous renal replacement therapy (CRRT), need for sildenafil at discharge, timing of surgical repair, and total ventilator days, were collected. Statistical analysis was performed using IBM SPSS Statistics version 28; one-sample t-tests and one-sample Wilcoxon signed-rank tests were utilized as appropriate. Results: There were a total of 27 neonatal patients with CDH at KCH from 2007-2019; 9 of the 27 required ECMO therapy. Birth weight and gestational age were similar between KCH and the CDHSG (2.99 kg vs 2.92 kg, p=0.655; 37.0 weeks vs 37.4 weeks, p=0.51). About half of the patients were inborn in both cohorts (52% vs 56%, p=0.676). The KCH cohort had significantly more Caucasian patients (96% vs 55%, p<0.001). Unadjusted mortality was similar in both groups (KCH 70% vs CDHSG 72%, p=0.857). Using ECMO utilization (KCH 78% vs CDHSG 52%, p=0.118) and need for surgical repair (KCH 95% vs CDHSG 85%, p=0.060) as proxies for severity, mortality was comparable between groups. No significant difference was noted for pulmonary outcomes such as average ventilator days (KCH 43.2 vs. CDHSG 17.3, p=0.078) and home oxygen dependency (KCH 44% vs. CDHSG 24%, p=0.108). Average length of hospital stay for patients treated at KCH was similar to the CDHSG (64.4 vs 49.2, p=1.000). Conclusion: Our study suggests that outcomes in CDH patients are not necessarily dependent on a center’s case volume. Management of CDH with a standardized approach in a low-volume center can yield similar outcomes. This data supports the treatment of patients with CDH at low-volume centers as opposed to transferring to higher-volume centers.

Keywords: ECMO, case volume, congenital diaphragmatic hernia, congenital diaphragmatic hernia study group, neonate

Procedia PDF Downloads 63
48242 Determinants of Quality of Life in Patients with Atypical Parkinsonian Syndromes: 1-Year Follow-Up Study

Authors: Tatjana Pekmezovic, Milica Jecmenica-Lukic, Igor Petrovic, Vladimir Kostic

Abstract:

Background: The atypical parkinsonian syndromes (APS) comprise a variety of rare neurodegenerative disorders characterized by reduced life expectancy, increasing disability, and considerable impact on health-related quality of life (HRQoL). Aim: In this study we wanted to answer two questions: a) which demographic and clinical factors are the main contributors to HRQoL in our cohort of patients with APS, and b) how does the quality of life of these patients change over a 1-year follow-up period? Patients and Methods: We conducted a prospective cohort study in a hospital setting. The initial study comprised all consecutive patients referred to the Department of Movement Disorders, Clinic of Neurology, Clinical Centre of Serbia, Faculty of Medicine, University of Belgrade (Serbia), from January 31, 2000 to July 31, 2013, with initial diagnoses of ‘Parkinson’s disease’, ‘parkinsonism’, ‘atypical parkinsonism’ or ‘parkinsonism plus’ made during the first 8 months from the appearance of the first symptom(s). The patients were subsequently followed at 4-6 month intervals, and diagnoses were eventually established for 46 patients fulfilling the criteria for clinically probable progressive supranuclear palsy (PSP) and 36 patients for probable multiple system atrophy (MSA). Health-related quality of life was assessed using the SF-36 questionnaire (Serbian translation). Hierarchical multiple regression analysis was conducted to identify predictors of the SF-36 composite scores. The significance of changes in quality of life scores between baseline and the follow-up time-point was quantified using the Wilcoxon signed-rank test, and the magnitude of any difference was calculated as an effect size (ES).
Results: The final models of the hierarchical regression analysis showed that apathy, measured by the Apathy Evaluation Scale (AES), accounted for 59% of the variance in the SF-36 Physical Health Composite Score and 14% of the variance in the SF-36 Mental Health Composite Score (p<0.01). Changes in HRQoL were assessed in 52 patients with APS who completed the 1-year follow-up period. The analysis of the magnitude of change in HRQoL over the one-year follow-up showed sustained medium ES (0.50-0.79) for both the Physical and Mental Health composite scores, total quality of life, and the Physical Health, Vitality, Role Emotional and Social Functioning domains. Conclusion: This study provides insight into new potential predictors of HRQoL and its changes over time in patients with APS. Additionally, identification of both prognostic markers of poor HRQoL and the magnitude of its changes should be considered when developing comprehensive treatment-related strategies and health care programs aimed at improving HRQoL and well-being in patients with APS.
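The ES banding above (medium = 0.50-0.79) follows Cohen's conventions. The abstract does not specify the exact ES formula for the baseline-to-follow-up change; one common convention (mean change divided by the baseline SD), illustrated with hypothetical SF-36 scores rather than study data, can be sketched as:

```python
import math

def change_effect_size(baseline, follow_up):
    """Effect size for change over follow-up: mean change divided by the
    standard deviation of the baseline scores (one common convention)."""
    n = len(baseline)
    mean_change = sum(f - b for b, f in zip(baseline, follow_up)) / n
    mean_base = sum(baseline) / n
    sd_base = math.sqrt(sum((x - mean_base) ** 2 for x in baseline) / (n - 1))
    return mean_change / sd_base

def magnitude(es):
    """Cohen's conventional bands, matching the 0.50-0.79 'medium' range above."""
    es = abs(es)
    return "small" if es < 0.50 else "medium" if es < 0.80 else "large"

# Hypothetical SF-36 Physical Health composite scores (0-100 scale)
baseline = [40.0, 50.0, 60.0, 50.0]
follow_up = [35.0, 45.0, 55.0, 45.0]
es = change_effect_size(baseline, follow_up)
print(magnitude(es))  # a 5-point decline against baseline SD ~8.2 -> medium
```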

Keywords: atypical parkinsonian syndromes, follow-up study, quality of life, APS

Procedia PDF Downloads 278
48241 Management of Femoral Neck Stress Fractures at a Specialist Centre and Predictive Factors to Return to Activity Time: An Audit

Authors: Charlotte K. Lee, Henrique R. N. Aguiar, Ralph Smith, James Baldock, Sam Botchey

Abstract:

Background: Femoral neck stress fractures (FNSF) are uncommon, making up 1 to 7.2% of stress fractures in healthy subjects. FNSFs are prevalent in young women, military recruits, endurance athletes, and individuals with energy deficiency syndrome or female athlete triad. Presentation is often non-specific, and the condition is frequently misdiagnosed following the initial examination. There is limited research addressing return-to-activity time after FNSF. Previous studies have demonstrated prognostic time predictions based on various imaging techniques. Here, (1) OxSport clinic FNSF practice standards were retrospectively reviewed, (2) FNSF cohort demographics were examined, and (3) regression models were used to predict return-to-activity prognosis and consequently determine bone stress risk factors. Methods: Patients with a diagnosis of FNSF attending the OxSport clinic between 01/06/2020 and 01/01/2020 were selected from the Rheumatology Assessment Database Innovation in Oxford (RhADiOn) and the OxSport Stress Fracture Database (n = 14). (1) Clinical practice was audited against five criteria based on local and National Institute for Health and Care Excellence guidance, with a 100% standard. (2) Demographics of the FNSF cohort were examined with Student's t-test. (3) Lastly, linear regression and random forest regression models were used on this patient cohort to predict return-to-activity time, and an analysis of feature importance was conducted after fitting each model. Results: OxSport clinical practice met the standard (100%) in 3 of 5 criteria. The criteria not met were patient waiting times and documentation of all bone stress risk factors. Importantly, analysis of patient demographics showed that of the population with complete bone stress risk factor assessments, 53% were positive for modifiable bone stress risk factors. Lastly, linear regression analysis was utilized to identify demographic factors that predicted return-to-activity time [R2 = 79.172%; average error 0.226].
This analysis identified four key variables that predicted return-to-activity time: vitamin D level, total hip DEXA T value, femoral neck DEXA T value, and history of an eating disorder/disordered eating. Furthermore, random forest regression models were employed for this task [R2 = 97.805%; average error 0.024]. Analysis of the importance of each feature again identified a set of 4 variables, 3 of which matched the linear regression analysis (vitamin D level, total hip DEXA T value, and femoral neck DEXA T value), and the fourth: age. Conclusion: OxSport clinical practice could be improved by more comprehensively evaluating bone stress risk factors. The importance of this evaluation is demonstrated by the proportion of the population found positive for these risk factors. Using this cohort, potential bone stress risk factors that significantly impacted return-to-activity prognosis were identified using regression models.
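The fit-and-score step behind figures like "R2 = 79.172%" can be illustrated with a single-predictor ordinary-least-squares sketch. The numbers below are hypothetical (lower vitamin D implying longer return-to-activity time), not the study data, and the actual models used several predictors:

```python
def fit_line(x, y):
    """Ordinary least squares for a single predictor; returns (slope, intercept)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    slope = sxy / sxx
    return slope, my - slope * mx

def r_squared(x, y, slope, intercept):
    """Coefficient of determination: 1 - SS_residual / SS_total."""
    my = sum(y) / len(y)
    ss_res = sum((b - (slope * a + intercept)) ** 2 for a, b in zip(x, y))
    ss_tot = sum((b - my) ** 2 for b in y)
    return 1 - ss_res / ss_tot

# Hypothetical: vitamin D level (nmol/L) vs return-to-activity time (weeks)
vit_d = [20, 35, 50, 65, 80]
weeks = [30, 24, 19, 14, 8]
slope, intercept = fit_line(vit_d, weeks)
print(round(r_squared(vit_d, weeks, slope, intercept), 3))  # -> 0.999
```

Random forest regression, as used for the second model, fits an ensemble of decision trees instead of a single line but is scored with the same R2.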

Keywords: eating disorder, bone stress risk factor, femoral neck stress fracture, vitamin D

Procedia PDF Downloads 159
48238 Implementation of Enhanced Recovery After Surgery (ERAS) Protocols in Laparoscopic Sleeve Gastrectomy (LSG): A Systematic Review and Meta-Analysis

Authors: Misbah Nizamani, Saira Malik

Abstract:

Introduction: Bariatric surgery is the most effective treatment for patients suffering from morbid obesity, and laparoscopic sleeve gastrectomy (LSG) accounts for over 50% of total bariatric procedures. The aim of our meta-analysis was to investigate the effectiveness and safety of Enhanced Recovery After Surgery (ERAS) protocols for patients undergoing laparoscopic sleeve gastrectomy. Method: To gather data, we searched PubMed, Google Scholar, ScienceDirect, and Cochrane Central. Eligible studies were randomized controlled trials and cohort studies involving adult patients (≥18 years) undergoing bariatric surgery, i.e., laparoscopic sleeve gastrectomy. Outcome measures included length of stay (LOS), postoperative narcotic usage, postoperative pain score, postoperative nausea and vomiting, postoperative complications and mortality, emergency department visits, and readmission rates. RevMan version 5.4 was used to analyze outcomes. Results: Three RCTs and three cohort studies with 1522 patients were included. The ERAS and control groups were compared across eight outcomes. LOS was significantly reduced in the intervention group (p=0.00001); readmission rates did not differ significantly (p=0.35); postoperative complications were higher in the control group, but the difference was non-significant (p=0.68); and the postoperative pain score was significantly reduced (p=0.005). The difference in total morphine milligram equivalent (MME) requirements became significant after sensitivity analysis (p=0.0004). Postoperative mortality could not be analyzed because two cohort studies reported 0% mortality, leaving no estimable effect. Conclusion: This systematic review indicates that applying ERAS protocols in LSG reduces length of stay, postoperative pain, and total postoperative MME requirements, supporting the feasibility of their application.
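Pooled estimates for continuous outcomes such as LOS are conventionally produced by inverse-variance weighting; a minimal fixed-effect sketch (the mean differences and standard errors below are hypothetical, not taken from the review, and RevMan's exact settings are not stated in the abstract):

```python
import math

def fixed_effect_pool(effects, ses):
    """Fixed-effect inverse-variance pooling of per-study effect estimates
    (e.g., mean differences in length of stay), with a 95% Wald CI."""
    weights = [1.0 / se ** 2 for se in ses]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)

# Hypothetical: three studies each reporting a shorter stay under ERAS
effects = [-1.2, -0.8, -1.0]   # mean difference in days (ERAS minus control)
ses = [0.4, 0.5, 0.3]          # standard errors
pooled, (lo, hi) = fixed_effect_pool(effects, ses)
```

Studies with smaller standard errors receive proportionally larger weights, so the pooled estimate here sits closest to the most precise study.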

Keywords: eras protocol, sleeve gastrectomy, bariatric surgery, enhanced recovery after surgery

Procedia PDF Downloads 18
48239 Jelly and Beans: Appropriate Use of Ultrasound in Acute Kidney Injury

Authors: Raja Ezman Raja Shariff

Abstract:

Acute kidney injury (AKI) is commonly seen in inpatients and places a substantial cost burden on the NHS and on patients. Timely and appropriate management is both nephron-sparing and potentially life-saving. Ultrasound scanning (USS) is a well-recognised method for stratifying patients. Accordingly, the NICE AKI guidance defines groups in whom scanning is recommended within 6 hours of request (pyonephrosis), within 24 hours (obstruction/cause unknown), and in whom routine scanning is not recommended (cause for AKI identified). This audit examined whether Stockport NHS Trust USS practice was in line with these recommendations. The audit evaluated 92 patients with AKI who underwent USS between 01/01/14 and 30/04/14. Data collection was divided into two parts: first, radiology request cards and the online imaging software (PACS) were evaluated; then, the electronic case notes (ADVANTIS) were evaluated further. Based on request cards, 10% of requests were for pyonephrosis; only 33% of these were scanned within 6 hours, and a further 33% within 24 hours. 75% were requested for possible obstruction or unknown cause collectively. Of those due to possible obstruction, 71% of patients were scanned within 24 hours; of those with unknown cause, 50% were scanned within 24 hours. 15% of requests had a cause declared and so potentially did not require scanning. Evaluation of the patients’ notes revealed further interesting findings. Firstly, potentially 39% of patients had a known cause for AKI and therefore did not need USS; the cohort with unknown cause or possible obstruction was thereby reduced to 45% collectively. Alarmingly, the cohort with possible pyonephrosis rose to 16%, suggesting under-recognition of this life-threatening condition. We plan to highlight these findings within our institution and make changes to encourage more appropriate requesting and timely scanning. Time will tell whether we reduce or increase costs in this cost-conscious NHS; patient benefits, though, seem assured.

Keywords: AKI, ARF, kidney, renal

Procedia PDF Downloads 372
48238 Stress Perception, Social Supports and Family Function among Military Inpatients with Adjustment Disorders in Taiwan

Authors: Huey-Fang Sun, Wei-Kai Weng, Mei-Kuang Chao, Hui-Shan Hsu, Tsai-Yin Shih

Abstract:

Psychosocial stress plays an important role in mental illness, and the presence of emotional and behavioral symptoms in response to an identifiable event is the central feature of adjustment disorders. However, it remains unknown whether patients with adjustment disorders were raised in families with poorer family function and social support, and whether they perceive more stress than peers who experienced a similar stressful environment. The specific aims of this study were to investigate the correlations among family function, social support, and level of stress perception, and to test the hypothesis that military patients with adjustment disorders would have lower family function, lower social support, and higher stress perception than healthy colleagues recruited in the same cohort for military service, given their common exposure to similar stressful environments. Methods: The study was conducted in four hospitals in the northern part of Taiwan from July 1, 2015 to June 30, 2017, using a matched case-control design. The inclusion criteria for patient participants were psychiatric inpatients who served in the military during the study period and met the diagnosis of adjustment disorder. Patients who had previously been admitted to a psychiatric ward or who were illiterate were excluded. A healthy military control sample, matched by military service unit, gender, and recruitment cohort, was also invited to participate. In total, 74 participants (37 patients and 37 controls) completed the consent forms and filled out the research questionnaires. Questionnaires used in the study included the Perceived Stress Scale (PSS) as a measure of stress perception, the Family APGAR as a measure of family function, and the Multidimensional Scale of Perceived Social Support (MSPSS) as a measure of social support. Pearson correlation analysis and t-tests were applied for statistical analysis.
Results: PSS level was significantly negatively correlated with all three social support subscales (family subscale, r = -.37, p < .05; friend subscale, r = -.38, p < .05; significant other subscale, r = -.39, p < .05). The negative correlation between PSS level and Family APGAR reached only borderline significance (p = .06). The t-test comparisons of PSS scores, Family APGAR levels, and the three MSPSS subscale scores between patient and control participants were all significant (p < .001, p < .05, p < .05, p < .05, p < .05, respectively): patient participants had higher stress perception scores, lower social support, and lower family function scores than the healthy control participants. Conclusions: Our study suggested that family function and social support were negatively correlated with patients’ subjective stress perception. Military patients with adjustment disorders tended to have higher stress perception and lower family function and social support than military peers who remained healthy and continued serving in their units.
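Coefficients like r = -.37 between PSS and the family support subscale come from standard Pearson correlation. A self-contained sketch with hypothetical scores (illustrative only, not the study data):

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical: higher perceived social support, lower perceived stress
support = [12, 18, 20, 25, 28]   # MSPSS-style subscale scores
stress = [30, 26, 24, 20, 15]    # PSS scores
print(round(pearson_r(support, stress), 2))  # -> -0.99
```

The toy data are nearly collinear, hence the correlation is much stronger than the modest r ≈ -.37 to -.39 reported in the abstract.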

Keywords: adjustment disorders, family function, social support, stress perception

Procedia PDF Downloads 169
48237 Stress Hyperglycaemia and Glycaemic Control Post Cardiac Surgery: Relaxed Targets May Be Acceptable

Authors: Nicholas Bayfield, Liam Bibo, Charley Budgeon, Robert Larbalestier, Tom Briffa

Abstract:

Introduction: Stress hyperglycaemia is common following cardiac surgery. Its optimal management is uncertain and may differ by diabetic status. This study assesses the in-hospital glycaemic management of cardiac surgery patients and associated postoperative outcomes. Methods: A retrospective cohort analysis of all patients undergoing cardiac surgery at Fiona Stanley Hospital from February 2015 to May 2019 was undertaken. Management and outcomes of hyperglycaemia following cardiac surgery were assessed, with follow-up to 1 year postoperatively. Multivariate regression modelling was utilised. Results: 1050 non-diabetic patients and 689 diabetic patients were included. In the non-diabetic cohort, patients with mild (peak blood sugar level [BSL] < 14.3), transient stress hyperglycaemia managed without insulin were not at increased risk of wound-related morbidity (P=0.899) or mortality at 1 year (P=0.483). Insulin management was associated with wound-related readmission to hospital (P=0.004) and superficial sternal wound infection (P=0.047). Prolonged or severe stress hyperglycaemia was predictive of hospital readmission (P=0.050) but not of morbidity or mortality (P=0.546). Diabetes mellitus was an independent risk factor for 1-year mortality (OR 1.972 [1.041-3.736], P=0.037), graft harvest site wound infection (OR 1.810 [1.134-2.889], P=0.013) and wound-related readmission (OR 1.866 [1.076-3.236], P=0.026). In diabetics, a postoperative peak BSL > 13.9 mmol/L was predictive of graft harvest site infection (OR 3.528 [1.724-7.217], P=0.001) and wound-related readmission (OR 3.462 [1.540-7.783], P=0.003) regardless of the modality of management. A peak BSL of 10.0-13.9 did not increase the risk of morbidity/mortality compared to a peak BSL of < 10.0 (P=0.557), and diabetics with a peak BSL of 13.9 or less did not have significantly worse morbidity/mortality outcomes than non-diabetics (P=0.418).
Conclusion: In non-diabetic patients, transient mild stress hyperglycaemia following cardiac surgery does not uniformly require treatment. In diabetic patients, postoperative hyperglycaemia with a peak BSL exceeding 13.9 mmol/L was associated with wound-related morbidity and hospital readmission following cardiac surgery.
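Results such as "OR 1.972 [1.041-3.736]" are the exponentiated logistic regression coefficient with a Wald confidence interval. A sketch (the β and SE below are back-derived from the reported OR for illustration, not taken from the authors' model):

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Exponentiate a logistic regression coefficient and its Wald limits
    to obtain an odds ratio with a 95% confidence interval."""
    return math.exp(beta), (math.exp(beta - z * se), math.exp(beta + z * se))

# beta = ln(1.972); SE back-derived from the reported CI width (illustrative)
or_, (lo, hi) = odds_ratio_ci(math.log(1.972), 0.326)
print(f"OR {or_:.3f} [{lo:.3f}-{hi:.3f}]")  # -> OR 1.972 [1.041-3.736]
```

Because the interval is symmetric on the log scale, the OR is the geometric (not arithmetic) midpoint of its CI.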

Keywords: cardiac surgery, pulmonary embolism, pulmonary embolectomy, cardiopulmonary bypass

Procedia PDF Downloads 135
48236 Outcomes of Naive SGLT2 Inhibitor Therapy Among ICU-Admitted Acute Stroke Patients with T2DM: A Prospective Cohort Study in NC Multispecialty Hospital, Biratnagar, Nepal

Authors: Birendra Kumar Bista, Rhitik Bista, Prafulla Koirala, Lokendra Mandal, Nikrsh Raj Shrestha, Vivek Kattel

Abstract:

Introduction: Poorly controlled diabetes is associated with both the occurrence and poor outcomes of stroke. High blood sugar reduces cerebral blood flow and increases intracranial pressure, cerebral edema and neuronal death, especially among patients with poorly controlled diabetes [1]. SGLT2 inhibitors are associated with a 50% reduction in hemorrhagic stroke compared with placebo, and they decrease cardiovascular events by reducing glucose, blood pressure, weight, arteriosclerosis, albuminuria and atrial fibrillation [2,3]. No study has been documented in low-income countries examining the role of post-stroke SGLT2 inhibitors in diabetic patients at and after ICU admission. Aims: The aim of the study was to measure the 12-month outcomes of diabetic patients with acute stroke admitted to the ICU given naive SGLT2 inhibitor add-on therapy. Method: This was a prospective cohort study carried out in a 250-bed tertiary neurology care hospital in the provincial capital, Biratnagar, Nepal. Diabetic patients with acute stroke admitted to the ICU from 1st January 2022 to 31st December 2022 who were not already on SGLT2 inhibitors were included in the study and managed as per hospital protocol. Empagliflozin was added for alternately enrolled patients and was continued at discharge and during follow-up unless contraindicated. These patients were followed up for 12 months. Outcomes measured were mortality, morbidity requiring readmission or hospital visits other than regular follow-up, SGLT2 inhibitor-related adverse events, neuropsychiatric comorbidity, functional status and biochemical parameters. Ethical approval was obtained from the hospital administration and ethical board. Results: Among 147 diabetic cases, 68 were not treated with empagliflozin, whereas 67 were started on the SGLT2 inhibitor. HbA1c level and one-year mortality were significantly lower among patients in the empagliflozin arm.
Over the 12-month period, 427 acute stroke patients were admitted to the ICU. Of these, 44% were female, 61% were hypertensive, 34% diabetic, 57% dyslipidemic, and 26% smokers, with a median age of 45 years. Among the 427 cases, 4% required neurosurgical intervention and 76% had hemorrhagic CVA. The most common reason for ICU admission was GCS < 8 (51%). The median ICU stay was 5 days. ICU mortality was 21%, whereas 1-year mortality was 41%, the most common cause being pneumonia. Empagliflozin-related adverse effects were seen in 11% of patients, most commonly lower urinary tract infection (6%). Conclusion: Empagliflozin can safely be started in acute stroke patients, with better HbA1c control and lower mortality compared to treatment without an SGLT2 inhibitor.

Keywords: diabetes, ICU, mortality, SGLT2 inhibitors, stroke

Procedia PDF Downloads 30
48235 Intergenerational Trauma: Patterns of Child Abuse and Neglect Across Two Generations in a Barbados Cohort

Authors: Rebecca S. Hock, Cyralene P. Bryce, Kevin Williams, Arielle G. Rabinowitz, Janina R. Galler

Abstract:

Background: Findings have been mixed regarding whether offspring of parents who were abused or neglected as children have a greater risk of experiencing abuse or neglect themselves. In addition, many studies on this topic are restricted to physical abuse and take place in a limited number of countries, representing a small segment of the world's population. Methods: We examined relationships between childhood maltreatment history assessed in a subset (N=68) of the original longitudinal birth cohort (G1) of the Barbados Nutrition Study and their now-adult offspring (G2) (N=111) using the Childhood Trauma Questionnaire-Short Form (CTQ-SF). We used Pearson correlations to assess relationships between parent and offspring CTQ-SF total and subscale scores (physical, emotional, and sexual abuse; physical and emotional neglect). Next, we ran multiple regression analyses, using the parental CTQ-SF total score and the parental Sexual Abuse score as primary predictors separately in our models of G2 CTQ-SF (total and subscale scores). Results: G1 total CTQ-SF scores were correlated with G2 offspring Emotional Neglect and total scores. G1 Sexual Abuse history was significantly correlated with G2 Emotional Abuse, Sexual Abuse, Emotional Neglect, and Total Score. In fully-adjusted regression models, parental (G1) total CTQ-SF scores remained significantly associated with G2 offspring reports of Emotional Neglect, and parental (G1) Sexual Abuse was associated with offspring (G2) reports of Emotional Abuse, Physical Abuse, Emotional Neglect, and overall CTQ-SF scores. Conclusions: Our findings support a link between parental exposure to childhood maltreatment and their offspring's self-reported exposure to childhood maltreatment. Of note, there was not an exact correspondence between the subcategory of maltreatment experienced from one generation to the next. Compared with other subcategories, G1 Sexual Abuse history was the most likely to predict G2 offspring maltreatment. 
Further studies are needed to delineate underlying mechanisms and to develop intervention strategies aimed at preventing intergenerational transmission.

Keywords: trauma, family, adolescents, intergenerational trauma, child abuse, child neglect, global mental health, North America

Procedia PDF Downloads 56
48234 Neuroimaging Markers for Screening Former NFL Players at Risk for Developing Alzheimer's Disease / Dementia Later in Life

Authors: Vijaykumar M. Baragi, Ramtilak Gattu, Gabriela Trifan, John L. Woodard, K. Meyers, Tim S. Halstead, Eric Hipple, Ewart Mark Haacke, Randall R. Benson

Abstract:

NFL players, by virtue of their exposure to repetitive head injury, are at least twice as likely to develop Alzheimer's disease (AD) and dementia as the general population. Early recognition and intervention prior to the onset of clinical symptoms could potentially avert or delay the long-term consequences of these diseases. Since AD is thought to have a long preclinical incubation period, the aim of the current research was to determine whether former NFL players referred to a depression center showed evidence of incipient dementia in their structural imaging prior to a diagnosis of dementia. To identify neuroimaging markers of AD against which former NFL players could be compared, we conducted a comprehensive volumetric analysis using a cohort of early-stage AD patients (ADNI) to produce a set of brain regions demonstrating sensitivity to early AD pathology (the “AD fingerprint”). The brain MRIs of 46 former NFL players were then interrogated using the AD fingerprint. Brain scans were acquired using a T1-weighted MPRAGE sequence, and the FreeSurfer image analysis suite (version 6.0) was used to obtain the volumetric and cortical thickness data. A total of 55 brain regions demonstrated significant atrophy or ex vacuo dilatation bilaterally in AD patients vs. healthy controls. Of the 46 former NFL players, 19 (41%) demonstrated a greater than expected number of atrophied/dilated AD regions when compared with age-matched controls, presumably reflecting AD pathology.
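The screening step (flagging a player's "AD fingerprint" regions as atrophied relative to age-matched controls) can be sketched as a per-region z-score check. The region names, volumes, and the -1.5 SD cutoff below are illustrative assumptions; the abstract does not specify the exact threshold used:

```python
def flag_fingerprint_regions(volumes, control_mean, control_sd, z_cut=-1.5):
    """Return the regions whose volume falls more than |z_cut| SDs below
    the age-matched control mean (hypothetical cutoff)."""
    flagged = []
    for region, vol in volumes.items():
        z = (vol - control_mean[region]) / control_sd[region]
        if z < z_cut:
            flagged.append(region)
    return flagged

# Hypothetical volumes in mm^3 for two fingerprint regions
player = {"hippocampus": 3000.0, "precuneus": 9800.0}
ctrl_mean = {"hippocampus": 3600.0, "precuneus": 10000.0}
ctrl_sd = {"hippocampus": 300.0, "precuneus": 500.0}
print(flag_fingerprint_regions(player, ctrl_mean, ctrl_sd))  # -> ['hippocampus']
```

A subject would then be classed as showing the fingerprint when the count of flagged regions exceeds the number expected in controls.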

Keywords: Alzheimer's disease, neuroimaging biomarkers, traumatic brain injury, FreeSurfer, ADNI

Procedia PDF Downloads 134
48233 Prevalence of Overweight and Obesity in Relation to Fast-Food Consumption among Private-Sector Employees (Aged 25-45 Years): A Cross-Sectional Study in Colombo, Sri Lanka

Authors: Arosha Rashmi De Silva, Ananda Chandrasekara

Abstract:

This study seeks to comprehensively examine the influence of fast-food consumption and physical activity levels on the body weight of young employees within the private sector of Sri Lanka. The escalating popularity of fast food has raised concerns about its nutritional content and associated health ramifications. To investigate this phenomenon, a cohort of 100 individuals aged between 25 and 45, employed in Sri Lanka's private sector, participated in this research. These participants provided socio-demographic data through a standardized questionnaire, enabling the characterization of their backgrounds. Additionally, participants disclosed their frequency of fast-food consumption and engagement in physical activities, utilizing validated assessment tools. The collected data was meticulously compiled into an Excel spreadsheet and subjected to rigorous statistical analysis. Descriptive statistics, such as percentages and proportions, were employed to delineate the body weight status of the participants. Employing chi-square tests, our study identified significant associations between fast-food consumption, levels of physical activity, and body weight categories. Furthermore, through binary logistic regression analysis, potential risk factors contributing to overweight and obesity within the young employee cohort were elucidated. Our findings revealed a disconcerting trend, with 6% of participants classified as underweight, 32% within the normal weight range, and a substantial 62% categorized as overweight or obese. These outcomes underscore the alarming prevalence of overweight and obesity among young private-sector employees, particularly within the bustling urban landscape of Colombo, Sri Lanka. The data strongly imply a robust correlation between fast-food consumption, sedentary behaviors, and higher body weight categories, reflective of the evolving lifestyle patterns associated with the nation's economic growth. 
This study emphasizes the urgent need for effective interventions to counter the detrimental effects of fast-food consumption. The implementation of awareness campaigns elucidating the adverse health consequences of fast food, coupled with comprehensive nutritional education, can empower individuals to make informed dietary choices. Workplace interventions, including the provision of healthier meal alternatives and the facilitation of physical activity opportunities, are essential in fostering a healthier workforce and mitigating the escalating burden of overweight and obesity in Sri Lanka.
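For a 2x2 cross-tabulation (e.g., frequent fast-food consumption vs. overweight/obese status), the chi-square association test mentioned above reduces to a closed form; a sketch with hypothetical counts, not the study's contingency table:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 contingency table
    [[a, b], [c, d]] (no continuity correction)."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical counts: rows = frequent/infrequent fast food,
# columns = overweight-obese / not
stat = chi_square_2x2(30, 10, 20, 40)
print(round(stat, 2))  # -> 16.67, exceeding 3.841 (df=1 critical value, alpha=0.05)
```

A statistic above the df=1 critical value of 3.841 would indicate a significant association at the 5% level.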

Keywords: fast food consumption, obese, overweight, physical activity level

Procedia PDF Downloads 4
48232 PhenoScreen: Development of a Systems Biology Tool for Decision Making in Recurrent Urinary Tract Infections

Authors: Jonathan Josephs-Spaulding, Hannah Rettig, Simon Graspeunter, Jan Rupp, Christoph Kaleta

Abstract:

Background: Recurrent urinary tract infections (rUTIs) are a global cause of emergency room visits and represent a significant burden for public health systems. Therefore, metatranscriptomic approaches to investigate metabolic exchange and crosstalk between uropathogenic Escherichia coli (UPEC), which is responsible for 90% of UTIs, and collaborating pathogens of the urogenital microbiome are necessary to better understand the pathogenetic processes underlying rUTIs. Objectives: This study aims to determine the extent to which uropathogens exploit the host urinary metabolic environment to succeed during invasion. By developing patient-specific metabolic models of infection, these observations can be leveraged for precision treatment of human disease. Methods: To date, we have set up an rUTI patient cohort and observed various urine-associated pathogens. From this cohort, we developed patient-specific metabolic models to predict bladder microbiome metabolism during rUTIs. This was done by creating an in silico metabolomic urine environment representative of human urine. Metabolic models of uptake and cross-feeding among rUTI pathogens were created from genomes in relation to the artificial urine environment. Finally, microbial interactions were constrained by metatranscriptomics to indicate patient-specific metabolic requirements of pathogenic communities. Results: Metabolite uptake and cross-feeding are essential for strain growth; therefore, we plan to design patient-specific treatments that adjust urinary metabolites through nutritional regimens to counteract uropathogens by depleting essential growth metabolites. These methods will provide mechanistic insight into the metabolic components of rUTI pathogenesis and an evidence-based tool for infection treatment.

Keywords: recurrent urinary tract infections, human microbiome, uropathogenic Escherichia coli, UPEC, microbial ecology

Procedia PDF Downloads 104
48231 Identification of New Familial Breast Cancer Susceptibility Genes: Are We There Yet?

Authors: Ian Campbell, Gillian Mitchell, Paul James, Na Li, Ella Thompson

Abstract:

The genetic cause of the majority of multiple-case breast cancer families remains unresolved. Next-generation sequencing has emerged as an efficient strategy for identifying predisposing mutations in individuals with inherited cancer. We are conducting whole-exome sequence analysis of germline DNA from multiple affected relatives in breast cancer families, with the aim of identifying rare protein-truncating and non-synonymous variants that are likely to include novel cancer-predisposing mutations. Data from more than 200 exomes show that, on average, each individual carries 30-50 protein-truncating mutations and 300-400 rare non-synonymous variants. Heterogeneity among our exome data strongly suggests that numerous moderate-penetrance genes remain to be discovered, with each gene individually accounting for only a small fraction of families (~0.5%). This scenario makes validation of candidate breast cancer predisposing genes in large case-control studies the rate-limiting step in resolving the missing heritability of breast cancer. The aim of this study is to screen genes that are recurrently mutated in our exome data in a larger cohort of cases and controls to assess the prevalence of inactivating mutations that may be associated with breast cancer risk. We are using the Agilent HaloPlex Target Enrichment System to screen the coding regions of 168 genes in 1,000 BRCA1/2 mutation-negative familial breast cancer cases and 1,000 cancer-naive controls. To date, our interim analysis has identified 21 genes that carry an excess of truncating mutations in multiple breast cancer families versus controls. The established breast cancer susceptibility gene PALB2 is the most frequently mutated gene (13/998 cases versus 0/1009 controls), but other interesting candidates include NPSR1, GSN, POLD2, and TOX3. These and other genes are being validated in a second cohort of 1,000 cases and controls. Our experience demonstrates that, beyond PALB2, the prevalence of mutations in the remaining breast cancer predisposition genes is likely to be very low, making definitive validation exceptionally challenging.
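The PALB2 burden comparison (13/998 cases versus 0/1009 controls) can be checked with a standard Fisher's exact test; a minimal sketch using SciPy, with counts taken from the abstract:

```python
from scipy.stats import fisher_exact

# 2x2 table: rows = cases/controls, columns = carriers/non-carriers
# of a truncating PALB2 mutation, as reported in the abstract.
table = [[13, 998 - 13],
         [0, 1009]]

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(p_value)  # well below 0.001: a significant excess in cases
```

With a zero cell in the controls row, the sample odds ratio is infinite; the exact p-value is what carries the evidence here.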

Keywords: predisposition, familial, exome sequencing, breast cancer

Procedia PDF Downloads 464
48230 The Effect of Dopamine D2 Receptor TAQ A1 Allele on Sprinter and Endurance Athlete

Authors: Öznur Özge Özcan, Canan Sercan, Hamza Kulaksız, Mesut Karahan, Korkut Ulucan

Abstract:

Genetic structure is very important for understanding the brain dopamine system, which is related to athletic performance. We hope future studies will further examine athletic performance in terms of addiction-related genetic markers. In the present study, we investigated the dopamine receptor D2 gene (DRD2) polymorphism rs1800497, which is related to the brain dopaminergic system. Ten sprinters and ten endurance athletes were enrolled in the study, and the real-time polymerase chain reaction method was used for genotyping. According to the results, the A1A1, A1A2, and A2A2 genotype counts in athletes were 0 (0%), 3 (15%), and 17 (85%), respectively. The A1A1 genotype was not found, and A2 was the dominating allele in our cohort. These findings suggest that the effects of dopaminergic mechanisms on sports genetics may be explained by a polygenic and multifactorial view.
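From the reported genotype counts, the allele frequencies follow by simple allele counting (each person contributes two alleles); a minimal sketch using the counts from the abstract:

```python
# Genotype counts reported in the abstract (n = 20 athletes).
counts = {"A1A1": 0, "A1A2": 3, "A2A2": 17}

n_alleles = 2 * sum(counts.values())            # 40 alleles in total
a1 = 2 * counts["A1A1"] + counts["A1A2"]        # A1 alleles: 3
a2 = 2 * counts["A2A2"] + counts["A1A2"]        # A2 alleles: 37

freq_a1 = a1 / n_alleles  # 0.075
freq_a2 = a2 / n_alleles  # 0.925
print(freq_a1, freq_a2)
```

This confirms A2 as the dominating allele (92.5%) in this small cohort.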

Keywords: addiction, athletic performance, genotype, sport genetics

Procedia PDF Downloads 185
48229 A False Introduction: Teaching in a Pandemic

Authors: Robert Michael, Kayla Tobin, William Foster, Rachel Fairchild

Abstract:

The COVID-19 pandemic has caused significant disruptions in education, particularly in the teaching of health and physical education (HPE). This study examined a cohort of teachers who were preservice and then first-year teachers during various stages of the pandemic. Qualitative data were collected by interviewing six teachers from different schools in the Eastern U.S. over a series of structured interviews, and thematic analysis was employed to analyze the data. The pandemic significantly changed the way HPE was taught as schools shifted to virtual and hybrid models. Findings revealed five major themes: (a) You want me to teach HOW?, (b) PE without equipment and six feet apart, (c) Behind the Scenes, (d) They’re back…I became a behavior management guru, and (e) The Pandemic Crater. Overall, this study highlights the significant challenges preservice and first-year teachers faced in teaching physical education during the pandemic and underscores the need for ongoing support and resources to help them adapt and succeed in these challenging circumstances.

Keywords: teacher education, preservice teachers, first year teachers, health and physical education

Procedia PDF Downloads 144
48228 Autosomal Dominant Polycystic Kidney Patients May Be Predisposed to Various Cardiomyopathies

Authors: Fouad Chebib, Marie Hogan, Ziad El-Zoghby, Maria Irazabal, Sarah Senum, Christina Heyer, Charles Madsen, Emilie Cornec-Le Gall, Atta Behfar, Barbara Ehrlich, Peter Harris, Vicente Torres

Abstract:

Background: Mutations in PKD1 and PKD2, the genes encoding the proteins polycystin-1 (PC1) and polycystin-2 (PC2), cause autosomal dominant polycystic kidney disease (ADPKD). ADPKD is a systemic disease associated with several extrarenal manifestations, and animal models have suggested an important role for the polycystins in cardiovascular function. The aim of the current study is to evaluate the association of various cardiomyopathies with ADPKD in a large cohort of patients. Methods: Clinical data were retrieved from medical records for all patients with ADPKD and cardiomyopathies (n=159). Genetic analysis was performed on available DNA by direct sequencing. Results: Among the 58 patients included in this case series, 39 had idiopathic dilated cardiomyopathy (IDCM), 17 had hypertrophic obstructive cardiomyopathy (HOCM), and 2 had left ventricular noncompaction (LVNC). The mean age at cardiomyopathy diagnosis was 53.3, 59.9, and 53.5 years in IDCM, HOCM, and LVNC patients, respectively. The median left ventricular ejection fraction at initial diagnosis of IDCM was 25%, and the average basal septal thickness was 19.9 mm in patients with HOCM. Genetic data were available in 19, 8, and 2 cases of IDCM, HOCM, and LVNC, respectively. PKD1 mutations were detected in 47.4%, 62.5%, and 100% of IDCM, HOCM, and LVNC cases. PKD2 mutations were detected only in IDCM cases and were overrepresented (36.8%) relative to their expected frequency in ADPKD (~15%). The prevalence of IDCM, HOCM, and LVNC in our ADPKD clinical cohort was 1:17, 1:39, and 1:333, respectively. Compared with the general population, IDCM and HOCM were each approximately 10-fold more prevalent in patients with ADPKD. Conclusions: In summary, we suggest that PKD1 or PKD2 mutations may predispose to idiopathic dilated or hypertrophic cardiomyopathy, with a trend for patients with PKD2 mutations to develop the former and for patients with PKD1 mutations to develop the latter. Predisposition to various cardiomyopathies may be another extrarenal manifestation of ADPKD.

Keywords: autosomal dominant polycystic kidney (ADPKD), polycystic kidney disease, cardiovascular, cardiomyopathy, idiopathic dilated cardiomyopathy, hypertrophic cardiomyopathy, left ventricular noncompaction

Procedia PDF Downloads 285
48227 Flipped Classroom in a European Public Health Program: The Need for Students' Self-Directness

Authors: Nynke de Jong, Inge G. P. Duimel-Peeters

Abstract:

The flipped classroom, an instructional strategy and a type of blended learning that reverses the traditional learning environment by delivering instructional content, off- and online, in- and outside the classroom, has been implemented in a four-week module on ageing in Europe at Maastricht University. The main aim in organizing this module was to implement flipped-classroom principles in order to create meaningful learning opportunities, with educational technologies used to deliver content outside the classroom. Technologies used in this module were an online interactive real-time lecture from England, two interactive face-to-face lectures with visual supports, one group session including role plays, and team-based learning meetings. The 2015-2016 cohort, which used these educational technologies, was compared on module evaluations such as organization and instructiveness with the 2014-2015 cohort, which studied the same content but followed the problem-based learning strategy that is the educational basis of Maastricht University. The 2015-2016 cohort, with its specific organization, was also evaluated in more depth on outcomes such as (1) the duration of the lecture as experienced by students, (2) the experienced content of the lecture, (3) the extent of interaction experienced, and (4) the format of lecturing. It was important to know how students reflected on duration and content, taking into account their background knowledge so far, in order to distinguish between content that was sufficiently matched to prior knowledge, and therefore challenging, and content that did not fit the course. For the evaluation, a structured online questionnaire was used in which the topics mentioned above were scored on a 4-point Likert scale. At the end, there was room for narrative feedback so that respondents could describe in more detail, if they wished, what they experienced as good or poor in the content of the module and its organization. The response rate of the evaluation was lower than expected (54%); however, based on the written feedback and exam scores, we believe the results give a reliable overview that encourages further work. The response rate may be explained by the fact that resit students were included as well, and that there may be too much evaluation at some time points in the program. Overall, students were enthusiastic about the organization and content of the module, but their level of self-directed behavior, which is necessary for this kind of educational strategy, was too low. Because students need more training in self-directedness, the module will be simplified in 2016-2017 with fewer, clearer topics and extra guidance (a step-by-step procedure). More specific information on the technologies used, as well as the outcomes (minimum and maximum rankings, means, and standard deviations), will be presented at the congress.
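The reported outcomes (minimum and maximum rankings, mean, and standard deviation of 4-point Likert scores) reduce to basic descriptive statistics. A minimal sketch with hypothetical responses, since the actual scores are not given in the abstract:

```python
from statistics import mean, stdev

# Hypothetical 4-point Likert responses for one evaluation item
# (1 = strongly disagree ... 4 = strongly agree); NOT the study's data.
responses = [3, 4, 2, 3, 4, 3, 2, 4]

print(min(responses), max(responses))  # minimum and maximum rankings
print(mean(responses))                 # mean score
print(stdev(responses))                # sample standard deviation
```

With ordinal Likert data, reporting the minimum/maximum alongside the mean and standard deviation, as the authors propose, helps flag floor or ceiling effects that the mean alone hides.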

Keywords: blended learning, flipped classroom, public health, self-directness

Procedia PDF Downloads 195