Search results for: Population-based cohort study
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 48343

48313 Mild Hypothermia Versus Normothermia in Patients Undergoing Cardiac Surgery: A Propensity Matched Analysis

Authors: Ramanish Ravishankar, Azar Hussain, Mahmoud Loubani, Mubarak Chaudhry

Abstract:

Background and Aims: Currently, there are no strict guidelines on cardiopulmonary bypass temperature management in cardiac surgery not involving the aortic arch. The aim of this study was to compare outcomes between patients undergoing mild hypothermia and normothermia during on-pump cardiac surgery not involving the aortic arch. Methods: This was a retrospective cohort study from January 2015 until May 2023. Patients who underwent cardiac surgery with cardiopulmonary bypass temperatures ≥32°C were included and stratified into mild hypothermia (32°C–35°C) and normothermia (>35°C) cohorts. Propensity matching was applied through the nearest-neighbour method (1:1) using the risk factors detailed in the EuroSCORE, implemented in RStudio. The primary outcome was mortality. Secondary outcomes included post-operative stay, intensive care unit readmission, re-admission, stroke, and renal complications. Patients who had major aortic surgery and off-pump operations were excluded. Results: Each cohort had 1675 patients. Overall mortality was significantly higher in the mild hypothermia cohort (3.59% vs. 2.32%; p=0.04912), as were stroke incidence (2.09% vs. 1.13%; p=0.0396) and transient ischaemic attack (TIA) risk (3.1% vs. 1.49%; p=0.0027). There was no significant difference in renal complications (9.13% vs. 7.88%; p=0.2155). Conclusions: Patients who underwent mild hypothermia during cardiopulmonary bypass had significantly greater mortality, stroke, and TIA incidence. Mild hypothermia does not appear to provide any benefit, neuroprotective or otherwise, over normothermia. These results differ from those of other major studies; further trials and studies need to be conducted to reach a consensus.
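The nearest-neighbour (1:1) propensity-matching step described above can be sketched in Python. This is a hedged illustration only: the authors worked in RStudio, and the function name, patient IDs and propensity scores below are hypothetical, not the study's EuroSCORE-based model.

```python
# Illustrative greedy 1:1 nearest-neighbour matching on pre-computed
# propensity scores, without replacement. All data are hypothetical.

def match_nearest_neighbour(treated, controls):
    """treated, controls: dicts mapping patient id -> propensity score.
    Returns a list of (treated_id, control_id) pairs."""
    available = dict(controls)  # controls not yet used
    pairs = []
    # Match treated patients in descending score order (a common heuristic).
    for t_id, t_score in sorted(treated.items(), key=lambda kv: -kv[1]):
        if not available:
            break
        # Pick the unused control with the closest propensity score.
        c_id = min(available, key=lambda c: abs(available[c] - t_score))
        pairs.append((t_id, c_id))
        del available[c_id]
    return pairs

hypo = match_nearest_neighbour(
    {"T1": 0.80, "T2": 0.35},
    {"C1": 0.78, "C2": 0.40, "C3": 0.10},
)
```

In this toy example T1 pairs with C1 and T2 with C2; real implementations (e.g. with a caliper on the score distance) add further refinements.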

Keywords: cardiac surgery, therapeutic hypothermia, neuroprotection, cardiopulmonary bypass

Procedia PDF Downloads 43
48312 Exclusive Breastfeeding Abandonment among Adolescent Mothers: A Cohort Study

Authors: Maria I. Nuñez-Hernández, Maria L. Riesco

Abstract:

Background: Exclusive breastfeeding (EBF) up to 6 months of age has been considered one of the most important factors in the overall development of children. Nevertheless, as resources are scarce, it is essential to identify the most vulnerable groups at greatest risk of EBF abandonment in order to deliver the best strategies. Children of adolescent mothers are among these groups. Aims: To determine the EBF abandonment rate among adolescent mothers and to analyze the associated factors. Methods: Prospective cohort study of adolescent mothers in the southern area of Santiago, Chile, conducted in primary care services of the public health system. The cohort was established from 2014 to 2015, with a sample of 105 adolescent mothers and their children at 2 months of life. The inclusion criteria were: adolescent mother aged 14 to 19 years; no twin babies; mother and baby leaving the hospital together after childbirth; correct attachment of the baby to the breast; no difficulty understanding the Spanish language or communicating. Follow-up was performed at 4 and 6 months of age. Data were collected by interviews, considering EBF as breastfeeding only, without other milk, tea, juice, water, or any product other than breast milk, except drugs. Data were analyzed by descriptive and inferential statistics, using the Kaplan-Meier estimator and log-rank test, with a type I error probability of 5% (p-value = 0.05). Results: The cumulative EBF abandonment rate at 2, 4 and 6 months was 33.3%, 52.2% and 63.8%, respectively. Factors associated with EBF abandonment were maternal perception of the quality of milk as poor (p < 0.001), maternal perception that the child was not satisfied after breastfeeding (p < 0.001), use of a pacifier (p < 0.001), maternal consumption of illicit drugs after delivery (p < 0.001), the mother's return to school (p = 0.040) and presence of nipple trauma (p = 0.045).
Conclusion: The EBF abandonment rate was highest in the first 4 months of life and exceeds that of the general population of breastfeeding women. Among the EBF abandonment factors, one is related to the adolescent condition, and two are related to the maternal subjective perception.
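The Kaplan-Meier estimator used in the abstract above can be illustrated with a minimal sketch: the probability of still exclusively breastfeeding at time t is the running product of (1 - d/n) over event times. The follow-up data below are made up for illustration, not the study's.

```python
def kaplan_meier(times, events):
    """Kaplan-Meier estimate of S(t), the probability of still being on
    exclusive breastfeeding at time t (in months).
    events: 1 = EBF abandoned at that time, 0 = censored."""
    out, surv = [], 1.0
    for t in sorted(set(times)):
        # d = abandonments at t, n = mothers still at risk just before t.
        d = sum(1 for ti, e in zip(times, events) if ti == t and e == 1)
        n = sum(1 for ti in times if ti >= t)
        if d:
            surv *= 1 - d / n
            out.append((t, surv))
    return out

# Hypothetical (months of follow-up, abandonment indicator) pairs.
curve = kaplan_meier([2, 2, 4, 4, 6], [1, 0, 1, 1, 0])
```

With this toy input, S(2) = 4/5 and S(4) = 4/5 x 1/3; a log-rank test comparing groups would be layered on top of such curves.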

Keywords: adolescent, breastfeeding, midwifery, nursing

Procedia PDF Downloads 294
48311 Dietary Vitamin D Intake and the Bladder Cancer Risk: A Pooled Analysis of Prospective Cohort Studies

Authors: Iris W. A. Boot, Anke Wesselius, Maurice P. Zeegers

Abstract:

Diet may play an essential role in the aetiology of bladder cancer (BC). Vitamin D is involved in various biological functions which have the potential to prevent BC development. Vitamin D also influences the uptake of calcium and phosphorus, thereby possibly indirectly influencing the risk of BC. The aim of the present study was to investigate the relation between vitamin D intake and BC risk. Individual dietary data were pooled from three cohort studies. Food item intake was converted to daily intakes of vitamin D, calcium and phosphorus. Pooled multivariate hazard ratios (HRs), with corresponding 95% confidence intervals (CIs), were obtained using Cox regression models. Analyses were adjusted for gender, age and smoking status (Model 1), and additionally for the food groups fruit, vegetables and meat (Model 2). Dose-response relationships (Model 1) were examined using a nonparametric test for trend. In total, 2,871 cases and 522,364 non-cases were included in the analyses. The present study showed an overall increased BC risk for high dietary vitamin D intake (HR: 1.14, 95% CI: 1.03-1.26). A similar increase in BC risk with high vitamin D intake was observed among women and for the non-muscle-invasive BC subtype (HR: 1.41, 95% CI: 1.15-1.72 and HR: 1.13, 95% CI: 1.01-1.27, respectively). High calcium intake decreased the BC risk among women (HR: 0.81, 95% CI: 0.67-0.97). A combined inverse effect on BC risk was observed for low vitamin D intake and high calcium intake (HR: 0.67, 95% CI: 0.48-0.93), while a positive effect was observed for high vitamin D intake in combination with low, moderate and high phosphorus intake (HR: 1.31, 95% CI: 1.09-1.59; HR: 1.17, 95% CI: 1.01-1.36; and HR: 1.16, 95% CI: 1.03-1.31, respectively).
Combining all nutrients showed a decreased BC risk for low vitamin D, high calcium and moderate phosphorus intake (HR: 0.37, 95% CI: 0.18-0.75), and an increased BC risk for moderate intake of all the nutrients (HR: 1.18, 95% CI: 1.02-1.38), for high vitamin D with low calcium and phosphorus intake (HR: 1.28, 95% CI: 1.01-1.62), and for moderate vitamin D and calcium with high phosphorus intake (HR: 1.27, 95% CI: 1.01-1.59). No significant dose-response relationships were observed. The findings of this study show an increased BC risk for high dietary vitamin D intake and a decreased risk for high calcium intake. The study also highlights the importance of examining the effect of a nutrient in combination with complementary nutrients for risk assessment. Future research should focus on nutrients in a wider context and on nutritional patterns.

Keywords: bladder cancer, nutritional oncology, pooled cohort analysis, vitamin D

Procedia PDF Downloads 57
48310 Breech Versus Cephalic Elective Caesarean Deliveries – A Comparison of Immediate Neonatal Outcomes

Authors: Genevieve R. Kan, Jolyon Ford

Abstract:

Background: Caesarean section has become the routine route of delivery for breech fetuses, but breech cesarean deliveries are hypothesized to have poorer immediate neonatal outcomes when compared to cephalic deliveries. Accordingly, in many Australian hospitals, the pediatric team is routinely required to attend every elective breech cesarean section in case urgent resuscitation is required. Our study aimed to determine whether term elective breech deliveries indeed have worse immediate neonatal outcomes at delivery, which would justify the necessity of pediatric staff presence at every elective breech cesarean delivery and influence the workload of the pediatric team. Objective: Elective breech cesarean deliveries were compared to elective cephalic cesarean deliveries at 37 weeks gestation or above to evaluate the immediate neonatal outcomes (Apgar scores <7 at 5 minutes, and Special Care Nursery admissions on Day 1 of life) of each group. Design: A retrospective cohort study. Method: This study examined 2035 elective breech and cephalic singleton cesarean deliveries at term over 5 years from July 2017 to July 2022 at Frankston Hospital, a metropolitan hospital in Melbourne, Australia. There were 260 breech deliveries and 1775 cephalic deliveries. De-identified patient data were collected retrospectively from the hospital's electronically integrated pregnancy and birth records to assess demographics and neonatal outcomes. Results: Apgar scores <7 at 5 minutes of life were more frequent in the breech group than in the cephalic group (3.4% vs 1.6%). Special Care Nursery admissions on Day 1 of life were also higher in the breech cohort than in the cephalic cohort (9.6% vs 8.7%). Conclusions: Our results support the expected finding that breech deliveries are associated with worse immediate neonatal outcomes. They therefore suggest that routine attendance at elective breech cesarean deliveries by the pediatric team is indeed required to assist with the potentially higher need for neonatal resuscitation and special care nursery admission.

Keywords: breech, cesarean section, Apgar scores, special care nursery admission

Procedia PDF Downloads 76
48309 Self-rated Health as a Predictor of Hospitalizations in Patients with Bipolar Disorder and Major Depression: A Prospective Cohort Study of the United Kingdom Biobank

Authors: Haoyu Zhao, Qianshu Ma, Min Xie, Yunqi Huang, Yunjia Liu, Huan Song, Hongsheng Gui, Mingli Li, Qiang Wang

Abstract:

Rationale: Bipolar disorder (BD) and major depressive disorder (MDD), severe chronic illnesses that restrict patients' psychosocial functioning and reduce their quality of life, are both categorized as mood disorders. Emerging evidence suggests that self-rated health (SRH) is well validated and can predict the risk of various health outcomes, including mortality and health care costs. Compared with other, lengthier multi-item patient-reported outcome (PRO) measures, SRH has comparable ability to predict mortality and healthcare utilization. However, to our knowledge, no study has assessed the association between SRH and hospitalization among people with mental disorders. Therefore, our study aims to determine the association between SRH and subsequent all-cause hospitalizations in patients with BD and MDD. Methods: We conducted a prospective cohort study of people with BD or MDD in the UK from 2006 to 2010 using UK Biobank touchscreen questionnaire data and linked administrative health databases. The association between SRH and 2-year all-cause hospitalizations was assessed using proportional hazards regression after adjustment for sociodemographics, lifestyle behaviors, previous hospitalization use, the Elixhauser comorbidity index, and environmental factors. Results: A total of 29,966 participants were identified, experiencing 10,279 hospitalization events. Among the cohort, the average age was 55.88 (SD 8.01) years, 64.02% were female, and 3,029 (10.11%), 15,972 (53.30%), 8,313 (27.74%), and 2,652 (8.85%) reported excellent, good, fair, and poor SRH, respectively. Among patients reporting poor SRH, 54.19% had a hospitalization event within 2 years compared with 22.65% of those reporting excellent SRH.
In the adjusted analysis, patients with good, fair, and poor SRH had 1.31 (95% CI 1.21-1.42), 1.82 (95% CI 1.68-1.98), and 2.45 (95% CI 2.22-2.70) times higher hazards of hospitalization, respectively, than those with excellent SRH. Conclusion: SRH was independently associated with subsequent all-cause hospitalizations in patients with BD or MDD. This large study facilitates rapid interpretation of SRH values and underscores the need for proactive SRH screening in this population, which might inform resource allocation and enhance high-risk population detection.

Keywords: severe mental illnesses, hospitalization, risk prediction, patient-reported outcomes

Procedia PDF Downloads 134
48308 Bridging the Divide: Mixed-Method Analysis of Student Engagement and Outcomes in Diverse Postgraduate Cohorts

Authors: A. Knox

Abstract:

Student diversity in postgraduate classes poses major challenges for educators seeking to encourage student engagement and desired learning outcomes. This paper outlines the impact of a set of teaching initiatives aimed at addressing challenges associated with teaching and learning in an environment characterized by diversity in the student cohort. The study examines postgraduate students completing the core capstone unit within a specialized business degree. Although relatively small, the student cohort is highly diverse in terms of cultural backgrounds represented, prior learning and/or qualifications, and the duration and type of work experience relevant to the degree being completed. The wide range of cultures, existing knowledge and experience creates enormous challenges with respect to students' learning needs and outcomes. Consequently, a suite of teaching innovations has been adopted to enhance curriculum content/delivery and the design of assessments. This paper explores the impact of these specific teaching and learning practices, examining the ways they have supported students' diverse needs and enhanced students' learning outcomes. Data from surveys and focus groups are used to assess the effectiveness of these practices. The results highlight the effectiveness of peer-assisted learning, cultural competence-building, and advanced assessment options in addressing diverse student needs and enhancing student engagement and learning outcomes. These findings suggest that such practices would benefit students' learning in environments marked by diversity in the student cohort. Specific recommendations are offered for other educators working with diverse classes.

Keywords: assessment design, curriculum content, curriculum delivery, student diversity

Procedia PDF Downloads 82
48307 A Cohort Study of Early Cardiologist Consultation by Telemedicine on the Critical Non-STEMI Inpatients

Authors: Wisit Wichitkosoom

Abstract:

Objectives: To determine the effect of early cardiologist consultation, using simple telemedicine technology, on the diagnosis and early proper management of patients with Non-STEMI at the emergency departments of district hospitals without a cardiologist on site, before transfer. Methods: A cohort study was performed at Udonthani general hospital, Udonthani province, from 1 October 2012 to 30 September 2013, covering 892 patients diagnosed with Non-STEMI (mean age 46.8 years) who had been transferred because of a Non-STEMI diagnosis over a 12-week study period. Transferred patients, in addition to receiving standard care, were offered a cardiologist consultation; the average transfer time to Udonthani hospital was 1.5 hours. The main outcome measure was length of hospital stay; mortality at 3 months, inpatient investigations, and the rate of transfer to a higher-level facility were also studied. Results: Hospital stay was significantly shorter for those who did not consult a cardiologist (hazard ratio 1.19; approximate 95% CI 1.001 to 1.251; p = 0.039). A total of 136 cases were transferred to a higher-level facility. There was no statistically significant difference in overall mortality between the groups (p = 0.068). Conclusions: Early cardiologist consultation can reduce the length of hospital stay for patients with cardiovascular conditions outside of a cardiac center. This basic technology can be applied to improve patient safety.

Keywords: critical, telemedicine, safety, non STEMI

Procedia PDF Downloads 393
48306 Neighborhood Linking Social Capital as a Predictor of Drug Abuse: A Swedish National Cohort Study

Authors: X. Li, J. Sundquist, C. Sjöstedt, M. Winkleby, K. S. Kendler, K. Sundquist

Abstract:

Aims: This study examines the association between the incidence of drug abuse (DA) and linking (communal) social capital, a theoretical concept describing the amount of trust between individuals and societal institutions. Methods: We present results from an 8-year population-based cohort study that followed all residents of Sweden, aged 15-44, from 2003 through 2010, for a total of 1,700,896 men and 1,642,798 women. Social capital was conceptualized as the proportion of people in a geographically defined neighborhood who voted in local government elections. Multilevel logistic regression was used to estimate odds ratios (ORs) and between-neighborhood variance. Results: We found robust associations between linking social capital (scored as a three-level variable) and DA in men and women. For men, the OR for DA in the crude model was 2.11 [95% confidence interval (CI) 2.02-2.21] for those living in areas with the lowest vs. highest level of social capital. After accounting for neighborhood-level deprivation, the OR fell to 1.59 (95% CI 1.51-1.68), indicating that neighborhood deprivation lies in the pathway between linking social capital and DA. The ORs remained significant after accounting for age, sex, family income, marital status, country of birth, education level, and region of residence, and after further accounting for comorbidities and family history of DA. For women, the OR decreased from 2.15 (2.03-2.27) in the crude model to 1.31 (1.22-1.40) in the final model, adjusted for multiple neighborhood-level and individual-level variables. Conclusions: Our study suggests that low linking social capital may have important independent effects on DA.
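The crude odds ratios above come from multilevel logistic models; as a simpler hedged illustration of the underlying quantity, an OR with a Wald 95% CI can be computed directly from a 2x2 exposure-outcome table. The counts below are invented for illustration and are not the study's data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR and Wald 95% CI for a 2x2 table:
    a = exposed cases,   b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) via the Woolf formula.
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 20/100 cases among exposed, 10/100 among unexposed.
estimate = odds_ratio_ci(20, 80, 10, 90)
```

A multilevel model additionally partitions the variance between neighborhoods, which this single-table sketch does not attempt.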

Keywords: drug abuse, linking social capital, environment, family

Procedia PDF Downloads 449
48305 Incidence and Risk Factors of Central Venous Associated Infections in a Tunisian Medical Intensive Care Unit

Authors: Ammar Asma, Bouafia Nabiha, Ghammam Rim, Ezzi Olfa, Ben Cheikh Asma, Mahjoub Mohamed, Helali Radhia, Sma Nesrine, Chouchène Imed, Boussarsar Hamadi, Njah Mansour

Abstract:

Background: Central venous catheter-associated infections (CVC-AI) are among the serious hospital-acquired infections. The aims of this study were to determine the incidence of CVC-AI and their risk factors among patients followed in a Tunisian medical intensive care unit (ICU). Materials/Methods: A prospective cohort study conducted between September 15th, 2015 and November 15th, 2016 in an 8-bed medical ICU, including all patients admitted for more than 48 h. CVC-AI were defined according to the CDC Atlanta criteria. Enrollment was based on clinical and laboratory diagnosis of CVC-AI. For all subjects, age, sex, underlying diseases, SAPS II score, ICU length of stay, and exposure to CVC (number of CVCs placed, site of insertion and duration of catheterization) were recorded. Risk factors were analyzed by conditional stepwise logistic regression. A p-value of < 0.05 was considered significant. Results: Among 192 eligible patients, 144 (75%) had a central venous catheter. Twenty-eight patients (19.4%) developed CVC-AI, with an incidence density of 20.02 per 1,000 CVC-days. Among these infections, 60.7% (n=17) were systemic CVC-AI (with negative blood culture), and 35.7% (n=10) were bloodstream CVC-AI. The mean SAPS II of patients with CVC-AI was 32.76 ± 14.48; their mean Charlson index was 1.77 ± 1.55; their mean duration of catheterization was 15.46 ± 10.81 days; and the mean duration of one central line was 5.8 ± 3.72 days. Gram-negative bacteria were identified in 53.5% of CVC-AI (n=15), dominated by multi-drug-resistant Acinetobacter baumannii (n=7). Staphylococci were isolated in 3 CVC-AI. Fourteen (50%) patients with CVC-AI died. Univariate analysis identified male sex (p=0.034), referral from another hospital department (p=0.03), tobacco use (p=0.006), duration of sedation (p=0.003) and duration of catheterization (p<0.001) as possible risk factors for CVC-AI.
Multivariate analysis showed that the independent factors for CVC-AI were male sex (OR=5.73, 95% CI [2; 16.46], p=0.001), Ramsay score (OR=1.57, 95% CI [1.036; 2.38], p=0.033), and duration of catheterization (OR=1.093, 95% CI [1.035; 1.15], p=0.001). Conclusion: In a monocentric cohort, CVC-AI had a high incidence density and was associated with poor outcome. Identifying the risk factors is necessary to find solutions for this major health problem.

Keywords: central venous catheter associated infection, intensive care unit, prospective cohort studies, risk factors

Procedia PDF Downloads 338
48304 Near-Peer Mentoring/Curriculum and Community Enterprise for Environmental Restoration Science

Authors: Lauren B. Birney

Abstract:

The BOP-CCERS (Billion Oyster Project - Curriculum and Community Enterprise for Restoration Science) Near-Peer Mentoring Program provides the long-term (five-year) support network to motivate and guide students toward restoration-science-based CTE pathways. Students are selected from middle schools with actively participating BOP-CCERS teachers. Teachers nominate students from grades 6-8 to join cohorts of between 10 and 15 students each. Cohorts are comprised primarily of students from the same school in order to facilitate mentors' travel logistics as well as to sustain connections with students and their families. Each cohort is matched with an exceptional undergraduate or graduate student, either a BOP research associate or a STEM mentor recruited from collaborating City University of New York (CUNY) partner programs. In rare cases, an exceptional high school junior or senior may be matched with a cohort in addition to a research associate or graduate student. In no case is a high school student or minor placed individually with a cohort. Mentors meet with students at least once per month and provide at least one offsite field visit per month, either to a local STEM Hub or a research lab. In keeping with its five-year trajectory, the near-peer mentoring program seeks to retain students in the same cohort with the same mentor for the full duration of middle school and for at least two additional years of high school. Upon reaching the final quarter of 8th grade, the mentor develops a meeting plan for each individual mentee. The mentee and the mentor are required to meet individually or in small groups once per month. Once per quarter, individual meetings are replaced by full-cohort professional outings, in which the mentor organizes the entire cohort on a field visit or educational workshop with a museum or aquarium partner.
In addition to the mentor-mentee relationship, each participating student is also asked to conduct and present his or her own BOP field research. This research is ideally carried out with the support of the student's regular high school STEM subject teacher; however, in cases where the teacher or school does not permit independent study, the student is asked to conduct the research on an extracurricular basis. Near-peer mentoring affects students' social identities and helps them to connect with role models from similar groups, ultimately giving them a sense of belonging. Qualitative and quantitative analyses were performed throughout the study, along with interviews and focus groups. Additionally, an external evaluator was engaged to ensure project efficacy, efficiency, and effectiveness throughout the entire project. The BOP-CCERS Near-Peer Mentoring program is a peer support network in which high school students with interest or experience in BOP (Billion Oyster Project) topics and activities (such as classroom oyster tanks, STEM Hubs, or digital platform research) provide mentorship and support for middle school or high school freshman mentees. Peer mentoring not only empowers the students being taught but also increases the content knowledge and engagement of the mentors. This support provides the necessary resources, structure, and tools to assist students in finding success.

Keywords: STEM education, environmental science, citizen science, near peer mentoring

Procedia PDF Downloads 63
48303 Adherence to Dietary Approaches to Stop Hypertension-Style Diet and Risk of Mortality from Cancer: A Systematic Review and Meta-Analysis of Cohort Studies

Authors: Roohallah Fallah-Moshkani, Mohammad Ali Mohsenpour, Reza Ghiasvand, Hossein Khosravi-Boroujeni, Seyed Mehdi Ahmadi, Paula Brauer, Amin Salehi-Abargouei

Abstract:

Purpose: Several investigations have proposed a protective association between the dietary approaches to stop hypertension (DASH)-style diet and the risk of cancers; however, they have led to inconsistent results. The present study aimed to systematically review the prospective cohort studies conducted in this regard and, where possible, to quantify the overall effect using meta-analysis. Methods: PubMed, EMBASE, Scopus, and Google Scholar were searched for cohort studies published up to December 2017. Relative risks (RRs) reported for fully adjusted models and their confidence intervals were extracted for meta-analysis. A random-effects model was used to combine the RRs. Results: Sixteen studies were eligible for inclusion in the systematic review, of which eight examined the effect of DASH on the risk of mortality from all cancer types, four the risk of colorectal cancer, and three the risk of colon and rectal cancer. Four studies examined the association with other cancers (breast, hepatic, endometrial, and lung cancer). Meta-analysis showed that high concordance with DASH significantly decreases the risk of all cancer types (RR=0.83, 95% confidence interval (95% CI): 0.80-0.85); furthermore, participants who adhered most closely to DASH had a lower risk of developing colorectal (RR=0.79, 95% CI: 0.75-0.83), colon (RR=0.81, 95% CI: 0.74-0.87) and rectal (RR=0.79, 95% CI: 0.63-0.98) cancer compared to those with the lowest adherence. Conclusions: A DASH-style diet should be suggested as a healthy approach to protect against cancer in the community. Prospective studies exploring the effect on other cancer types and in regions other than the United States are highly recommended.
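The random-effects pooling step can be sketched as follows. This is a generic DerSimonian-Laird implementation under the assumption that each study reports an RR with a symmetric 95% CI on the log scale; the input values below are illustrative, not the review's extracted data.

```python
import math

def pool_random_effects(rrs, cis, z=1.96):
    """DerSimonian-Laird random-effects pooling of relative risks.
    rrs: point estimates; cis: matching (lower, upper) 95% CIs.
    Returns (pooled RR, lower, upper)."""
    logs = [math.log(r) for r in rrs]
    # SE of each log-RR recovered from the CI width on the log scale.
    ses = [(math.log(u) - math.log(l)) / (2 * z) for l, u in cis]
    w = [1 / s**2 for s in ses]                      # fixed-effect weights
    mean_fixed = sum(wi * li for wi, li in zip(w, logs)) / sum(w)
    # Cochran's Q and the between-study variance tau^2.
    q = sum(wi * (li - mean_fixed)**2 for wi, li in zip(w, logs))
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(rrs) - 1)) / c)
    w_re = [1 / (s**2 + tau2) for s in ses]          # random-effects weights
    mean_re = sum(wi * li for wi, li in zip(w_re, logs)) / sum(w_re)
    se_re = math.sqrt(1 / sum(w_re))
    return (math.exp(mean_re),
            math.exp(mean_re - z * se_re),
            math.exp(mean_re + z * se_re))

# Two hypothetical identical studies: tau^2 collapses to 0 and the
# pooled RR equals the common estimate.
pooled = pool_random_effects([0.8, 0.8], [(0.64, 1.0), (0.64, 1.0)])
```

When the studies disagree more than sampling error predicts, tau^2 grows and the pooled CI widens accordingly, which is the point of choosing a random-effects over a fixed-effect model.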

Keywords: cancer, DASH-style diet, dietary patterns, meta-analysis, systematic review

Procedia PDF Downloads 156
48302 Frailty Patterns in the US and Implications for Long-Term Care

Authors: Joelle Fong

Abstract:

Older persons are at greatest risk of becoming frail. As survival to age 80 and beyond continues to increase, the health and frailty of older Americans have garnered much recent attention among policy makers and healthcare administrators. This paper examines patterns in old-age frailty within a multistate actuarial model that characterizes the stochastic process of biological ageing. Using aggregate population-level U.S. mortality data, we implement a stochastic ageing model to examine cohort trends and gender differences in frailty distributions for older Americans born 1865-1894. The stochastic ageing model, which draws from the fields of actuarial science and gerontology, is well established in the literature. The implications for public health insurance programs are also discussed. Our results suggest that, on average, women tend to be frailer than men at older ages, and they reveal useful insights about the magnitude of the male-female differential at critical age points. Specifically, we note that the frailty statuses of males and females are quite comparable from ages 65 to 80. Beyond age 80, however, frailty levels start to diverge considerably, implying that women move into worse states of health more quickly than men. Tracking average frailty by gender over 30 successive birth cohorts, we also find that frailty levels for both genders follow a distinct peak-and-trough pattern. For instance, frailty among 85-year-old American survivors increased in 1954-1963, decreased in 1964-1971, and again started to increase in 1972-1979. A number of factors may have accounted for these cohort differences, including differences in cohort life histories, disease prevalence, lifestyle and behavior, differential access to medical advances, and changes in environmental risk factors over time.
We conclude with a discussion of the implications of our findings for spending on long-term care programs within the broader health insurance system.

Keywords: actuarial modeling, cohort analysis, frail elderly, health

Procedia PDF Downloads 216
48301 Improving Patient Outcomes for Aspiration Pneumonia

Authors: Mary Farrell, Maria Soubra, Sandra Vega, Dorothy Kakraba, Joanne Fontanilla, Moira Kendra, Danielle Tonzola, Stephanie Chiu

Abstract:

Pneumonia is the most common infectious cause of hospitalizations in the United States, with more than one million admissions annually and costs of $10 billion every year, making it the 8th leading cause of death. Aspiration pneumonia is an aggressive type of pneumonia that results from inhalation of oropharyngeal secretions and/or gastric contents, and it is preventable. The authors hypothesized that an evidence-based aspiration pneumonia clinical care pathway could reduce 30-day hospital readmissions and mortality rates while improving the overall care of patients. We conducted a retrospective chart review of 979 patients discharged with aspiration pneumonia from January 2021 to December 2022 at Overlook Medical Center (OMC). The authors identified patients who were coded with aspiration pneumonia and/or stable sepsis, and secondarily identified 30-day readmission rates for aspiration pneumonia from a skilled nursing facility (SNF). The Aspiration Pneumonia Clinical Care Pathway starts in the emergency department (ED) with the initiation of antimicrobials within 4 hours of admission and early recognition of aspiration. Once aspiration is identified, a swallow test is initiated by the bedside nurse; if the patient demonstrates dysphagia, they are kept strictly nothing by mouth (NPO) and referred to a speech and language pathologist (SLP) for an appropriate modified diet recommendation. Aspiration prevention techniques included the avoidance of straws, 45-degree positioning, no talking during meals, taking small bites, placement of an aspiration wrist band, and consuming meals out of bed in a chair. Nursing education was conducted with a newly created online learning module about aspiration pneumonia. The authors identified 979 patients, with an average age of 73.5 years, who were diagnosed with aspiration pneumonia on the index hospitalization.
These patients were reviewed for 30-day readmission for aspiration pneumonia or stable sepsis, and for mortality, from January 2021 to December 2022 at OMC. The 30-day readmission rates were significantly lower in the cohort that received the clinical care pathway (35.0% vs. 27.5%, p = 0.011). When evaluating the mortality rates in the pre- and post-intervention cohorts, the authors found that mortality was lower in the post-intervention cohort (23.7% vs. 22.4%, p = 0.61). Mortality among non-white (self-reported) patients was lower in the post-intervention cohort (34.4% vs. 21.0%, p = 0.05). Patients who reported being current smokers/vapers had higher mortality in the post-intervention cohort (5.9% vs. 22%). There was a decrease in mortality for men but an increase for women across the pre and post cohorts (19% vs. 25%). The authors attributed the increase in mortality among women in the post-intervention cohort to more active smokers, more former smokers, and more admissions from a SNF. This research identified that implementation of an Aspiration Pneumonia Clinical Care Pathway produced a statistically significant decrease in readmission rates overall and in mortality rates among non-white patients.

Keywords: aspiration pneumonia, mortality, quality improvement, 30-day pneumonia readmissions

Procedia PDF Downloads 27
48300 A Retrospective Cohort Study on an Outbreak of Gastroenteritis Linked to a Buffet Lunch Served during a Conference in Accra

Authors: Benjamin Osei Tutu, Sharon Annison

Abstract:

On 21st November, 2016, an outbreak of foodborne illness occurred after a buffet lunch served during a stakeholders’ consultation meeting held in Accra. An investigation was conducted to characterise the affected people, determine the etiologic food, the source of contamination, and the etiologic agent, and to implement appropriate public health measures to prevent future occurrences. A retrospective cohort study was conducted via telephone interviews, using a structured questionnaire developed from the buffet menu. A case was defined as any person suffering from symptoms of foodborne illness (e.g., diarrhoea and/or abdominal cramps) after eating food served during the stakeholder consultation meeting in Accra on 21st November, 2016. The exposure status of all members of the cohort was assessed by taking the food history of each respondent during the telephone interview. The data obtained were analysed using Epi Info 7. An environmental risk assessment was conducted to ascertain the source of the food contamination. Risks of foodborne infection from the foods eaten were determined using attack rates and odds ratios. Data were obtained from 54 people who consumed food served during the stakeholders’ meeting. Of this population, 44 people reported symptoms of food poisoning, an overall attack rate of 81.48% (44/54). The peak incubation period was seven hours, with minimum and maximum incubation periods of four and 17 hours, respectively. The most commonly reported symptoms were diarrhoea (97.73%, 43/44), vomiting (84.09%, 37/44) and abdominal cramps (75.00%, 33/44). From the incubation period, duration of illness and the symptoms, toxin-mediated food poisoning was suspected. The environmental risk assessment of the implicated catering facility indicated a lack of time/temperature control, inadequate food safety knowledge among workers, and sanitation issues. Only a limited number of food samples was received for microbiological analysis. 
Multivariate analysis indicated that illness was significantly associated with consumption of the snacks served (OR 14.78, P < 0.001). No stool, blood, or etiologic food samples were available for organism isolation; however, the suspected etiologic agent was Staphylococcus aureus or Clostridium perfringens. The outbreak was probably due to consumption of an unwholesome snack (tuna sandwich or chicken). The contamination and/or growth of the etiologic agent in the snack may be due to breakdowns in cleanliness, time/temperature control, and good food-handling practices. Training of food handlers in basic food hygiene and safety is recommended.
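
The two risk measures used in this investigation are straightforward to compute from a 2x2 exposure table. The overall attack rate below uses the abstract's own counts (44 ill of 54 respondents); the snack table is hypothetical, chosen only to illustrate the odds-ratio calculation, since the abstract reports the OR (14.78) without the underlying counts.

```python
def attack_rate(ill, total):
    """Proportion of the cohort who became ill."""
    return ill / total

def odds_ratio(a, b, c, d):
    """OR from a 2x2 table: a=ill/exposed, b=well/exposed,
    c=ill/unexposed, d=well/unexposed."""
    return (a * d) / (b * c)

overall = attack_rate(44, 54)        # overall attack rate from the abstract
# Hypothetical snack table (ill/well among eaters and non-eaters).
or_snack = odds_ratio(42, 6, 2, 4)   # = 14.0 for these illustrative counts
```

With these toy counts the OR comes out at 14.0, the same order of magnitude as the reported 14.78; an OR well above 1 is what flags the snack as the likely vehicle.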

Keywords: Accra, buffet, conference, C. perfringens, cohort study, food poisoning, gastroenteritis, office workers, Staphylococcus aureus

Procedia PDF Downloads 193
48299 Epileptic Seizures in Patients with Multiple Sclerosis

Authors: Anat Achiron

Abstract:

Background: Multiple sclerosis (MS) is a chronic autoimmune disease that affects the central nervous system in young adults. It involves the immune system attacking the protective covering of nerve fibers (myelin), leading to inflammation and damage. MS can result in various neurological symptoms, such as muscle weakness, coordination problems, and sensory disturbances. Seizures are not common in MS; their frequency over the disease course is estimated at 0.4% to 6.4%. Objective: To investigate the frequency of seizures in individuals with multiple sclerosis and to identify associated risk factors. Methods: We evaluated the frequency of seizures in a large cohort of 5686 MS patients followed at the Sheba Multiple Sclerosis Center and studied associated risk factors and comorbidities. Data were collected using a cohort study design, and logistic regression analysis was applied to assess the strength of associations. Results: We found that younger age at onset, longer disease duration, and prolonged time to initiation of immunomodulatory treatment were associated with increased seizure risk. Conclusions: Our findings suggest that seizures in people with MS are directly related to the demyelination process and not to other factors such as medication side effects or comorbid conditions. Therefore, initiating immunomodulatory treatment early in the disease course could reduce not only disease activity but also seizure risk.

Keywords: epilepsy, seizures, multiple sclerosis, white matter, age

Procedia PDF Downloads 32
48298 Long-Term Otitis Media with Effusion and Related Hearing Loss and Its Impact on Developmental Outcomes

Authors: Aleema Rahman

Abstract:

Introduction: This study aims to estimate the prevalence of long-term otitis media with effusion (OME) and hearing loss in a prospective longitudinal cohort study and to examine the relationship between the condition and educational and psychosocial outcomes. Methods: Analysis of data from the Avon Longitudinal Study of Parents and Children (ALSPAC) will be undertaken. ALSPAC is a longitudinal birth cohort study carried out in the UK, which has collected detailed measures of hearing on ~7000 children from the age of seven. A descriptive analysis of the data will be undertaken to estimate the prevalence of OME and hearing loss (defined as having average hearing levels > 20 dB and a type B tympanogram) at 7, 9, 11, and 15 years, as well as that of long-term OME and hearing loss. Logistic and linear regression analyses will be conducted to examine associations between long-term OME and hearing loss and educational outcomes (grades obtained from standardised national attainment tests) and psychosocial outcomes such as anxiety, social fears, and depression at ages 10-11 and 15-16 years. Results: Results will be presented in terms of the prevalence of OME and hearing loss in the population at each age. The prevalence of long-term OME and hearing loss, defined as having OME and hearing loss at two or more time points, will also be reported. Furthermore, any associations between long-term OME and hearing loss and the educational and psychosocial outcomes will be presented. Analyses will take into account demographic factors such as sex and social deprivation and relevant confounders, including socioeconomic status, ethnicity, and IQ. Discussion: Findings from this study will provide new epidemiological information on the prevalence of long-term OME and hearing loss. The research will provide new knowledge on the impact of OME on the small group of children who do not grow out of the condition by age 7 but continue to have hearing loss and need clinical care through later childhood. 
The study could have clinical implications and may influence service delivery for this group of children.
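
The case definition used in this study (average hearing level > 20 dB plus a type B tympanogram, at two or more of the four assessment ages) can be expressed as a small helper. The child record shown is hypothetical, purely to illustrate the definition.

```python
def has_ome_hearing_loss(mean_db, tymp_type):
    """OME with hearing loss per the study definition: average hearing
    level > 20 dB together with a type B tympanogram."""
    return mean_db > 20 and tymp_type == "B"

def long_term_ome(assessments):
    """Long-term OME/hearing loss = meeting the definition at two or
    more time points."""
    return sum(has_ome_hearing_loss(db, t) for db, t in assessments) >= 2

# Hypothetical record: (hearing level in dB, tympanogram type) at 7, 9, 11, 15 y.
child = [(25, "B"), (22, "B"), (15, "A"), (12, "A")]
```

For this hypothetical child, long_term_ome(child) is True, since the definition is met at ages 7 and 9 but not thereafter.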

Keywords: educational attainment, hearing loss, otitis media with effusion, psychosocial development

Procedia PDF Downloads 109
48297 Utility of Optical Coherence Tomography (OCT) and Visual Field Assessment in Neurosurgical Patients

Authors: Ana Ferreira, Ines Costa, Patricia Polónia, Josué Pereira, Olinda Faria, Pedro Alberto Silva

Abstract:

Introduction: Optical coherence tomography (OCT) and visual field tools are pivotal in evaluating neurological deficits and predicting potential visual improvement following surgical decompression in neurosurgical patients. Despite their clinical significance, a comprehensive understanding of their utility in this context is lacking in the literature. This study aims to elucidate the applications of OCT and visual field assessment, delineating distinct patterns of visual deficit presentations within the studied cohort. Methods: This retrospective analysis considered all adult patients who underwent a single surgery for pituitary adenoma or anterior skull base meningioma with optic nerve involvement, coupled with neuro-ophthalmology evaluation, between July 2020 and January 2023. A minimum follow-up period of 6 months was deemed essential. Results: A total of 24 patients, with a median age of 61, were included in the analysis. Three primary patterns emerged: 1) Low visual field involvement with compromised OCT, 2) High visual field involvement with relatively unaffected OCT, and 3) Significant compromise observed in both OCT and visual fields. Conclusion: This study delineates various findings in OCT and visual field assessments with illustrative examples. Based on the current findings, a prospective cohort will be systematically collected to further investigate and validate these patterns and their prognostic significance, enhancing our understanding of the utility of OCT and visual fields in neurosurgical patients.

Keywords: OCT, neurosurgery, visual field, optic nerve

Procedia PDF Downloads 25
48296 Psychological Stress and Accelerated Aging in SCI Patients - A Longitudinal Pilot Feasibility Study

Authors: Simona Capossela, Ramona Schaniel, Singer Franziska, Aquino Fournier Catharine, Daniel Stekhoven, Jivko Stoyanov

Abstract:

A spinal cord injury (SCI) is a traumatic life event that often results in ageing-associated health conditions such as muscle mass decline, adipose tissue increase, decline in immune function, frailty, systemic chronic inflammation, and psychological distress and depression. Psychological, oxidative, and metabolic stressors may facilitate accelerated ageing in the SCI population, with reduced life expectancy. Research designs using biomarkers of aging and stress are needed to elucidate the role of psychological distress in accelerated aging. The aim of this project is a feasibility pilot study to observe changes in stress biomarkers and correlate them with aging markers in SCI patients during their first rehabilitation (longitudinal cohort study). Biological samples were collected in the SwiSCI (Swiss Spinal Cord Injury Cohort Study) Biobank in Nottwil at 4 weeks ± 12 days after the injury (T1) and at the end of the first rehabilitation (discharge, T4). The "distress thermometer" is used as a self-assessment tool for psychological distress. Stress biomarkers, such as cortisol and protein carbonyl content (PCC), and markers of cellular aging, such as telomere length, will be measured. Preliminary results showed that SCI patients (N=129) were still generally distressed at the end of rehabilitation; however, we found a statistically significant (p<0.001) median decrease in distress from 6 (T1) to 5 (T4) during the rehabilitation. In addition, an explorative transcriptomics analysis will be conducted on N=50 SCI patients to compare groups of persons with SCI who have different trajectories of self-reported distress at the beginning and end of the first rehabilitation after the trauma. We identified 4 groups: very high chronic stress (stress thermometer values above 7 at T1 and T4; n=14); transient stress (high to low; n=14); low stress (values below 5 at T1 and T4; n=14); and increasing stress (low to high; n=8). 
The study will attempt to identify and address issues that may arise in the design and conceptualization of future studies on stress and aging in the SCI population.

Keywords: stress, aging, spinal cord injury, biomarkers

Procedia PDF Downloads 74
48295 A Clinical Study of Tracheobronchopathia Osteochondroplastica: Findings from a Large Chinese Cohort

Authors: Ying Zhu, Ning Wu, Hai-Dong Huang, Yu-Chao Dong, Qin-Ying Sun, Wei Zhang, Qin Wang, Qiang Li

Abstract:

Background and study aims: Tracheobronchopathia osteochondroplastica (TO) is an uncommon disease of the tracheobronchial system that leads to narrowing of the airway lumen from cartilaginous and/or osseous submucosal nodules. The aim of this study is to perform a detailed review of this rare disease in a large cohort of patients with TO proven by fiberoptic bronchoscopy from China. Patients and Methods: Retrospective chart review was performed on 41,600 patients who underwent bronchoscopy in the Department of Respiratory Medicine of Changhai Hospital between January 2005 and December 2012. Cases of TO were identified based on characteristic features during bronchoscopic examination. Results: 22 cases of bronchoscopic TO were identified. One-half were male, and the mean age was 47.45 ± 10.91 years. The most frequent symptoms at presentation were chronic cough (n=14) and increased sputum production (n=10). Radiographic abnormalities were observed in 3/18 patients, and findings on computed tomography consistent with TO, such as beaded intraluminal calcifications and/or increased luminal thickenings, were observed in 18/22 patients. Patients were classified into the following categories based on the severity of bronchoscopic findings: Stage I (n=2), Stage II (n=6), and Stage III (n=14). Bronchoscopic improvement observed in 2 patients treated with inhaled corticosteroids suggests that resolution of this disease is possible. Conclusions: TO is a benign disease with slow progression, which can be roughly divided into 3 stages on the basis of characteristic endoscopic features and histopathologic findings. Chronic inflammation is thought to be more important than the other existing plausible hypotheses in the course of TO. Inhaled corticosteroids might have some impact on patients at Stage I/II.

Keywords: airway obstruction, bronchoscopy, etiology, Tracheobronchopathia osteochondroplastica (TO), treatment

Procedia PDF Downloads 431
48294 The Effects of a Mathematics Remedial Program on Mathematics Success and Achievement among Beginning Mathematics Major Students: A Regression Discontinuity Analysis

Authors: Kuixi Du, Thomas J. Lipscomb

Abstract:

Proficiency in mathematics is fundamental to success in the STEM disciplines. In the US, beginning college students who are placed in remedial/developmental mathematics courses frequently struggle to achieve academic success. Mathematics remediation in college has therefore become an important concern, and providing remediation is a prevalent way to help students who may not be fully prepared for college-level courses. Programs vary, however, and the effectiveness of a particular remedial mathematics program must be empirically demonstrated. The purpose of this study was to apply the sharp regression discontinuity (RD) technique to determine the effectiveness of the Jack Leaps Summer (JLS) mathematics remediation program in supporting improved mathematics learning outcomes among newly admitted mathematics students at South Dakota State University. The researchers studied the newly admitted Fall 2019 cohort of mathematics majors (n=423). The results indicated that students whose pretest score was lower than the cut-off point, and who were therefore assigned to the JLS program, achieved significantly higher scores on the post-test (Math 101 final score). Based on these results, there is evidence that the JLS program is effective in meeting its primary objective.
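
A sharp RD estimate of the kind described can be sketched as fitting separate regression lines on either side of the pretest cut-off and taking the jump in fitted outcome at the cut-off. The data below are simulated with a known effect; they are not the JLS cohort, and real RD analyses would add bandwidth selection and inference.

```python
import random

def ols(xs, ys):
    """Least-squares fit of y = a + b*x, returning (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def sharp_rd_effect(scores, outcomes, cutoff):
    """Jump in fitted outcome at the cutoff; students below the cutoff
    are the treated (remediated) group."""
    below = [(s, y) for s, y in zip(scores, outcomes) if s < cutoff]
    above = [(s, y) for s, y in zip(scores, outcomes) if s >= cutoff]
    a0, b0 = ols([s for s, _ in below], [y for _, y in below])
    a1, b1 = ols([s for s, _ in above], [y for _, y in above])
    return (a0 + b0 * cutoff) - (a1 + b1 * cutoff)

# Simulated pretest scores and outcomes with a true treatment effect of +8
# at a cutoff of 50 (illustrative numbers only).
random.seed(1)
scores = [random.uniform(0, 100) for _ in range(400)]
outcomes = [0.5 * s + (8 if s < 50 else 0) + random.gauss(0, 3) for s in scores]
effect = sharp_rd_effect(scores, outcomes, 50)
```

The estimated jump recovers the simulated +8 effect to within sampling noise; the identifying assumption is that, absent treatment, the outcome would be smooth through the cut-off.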

Keywords: causal inference, mathematics remedial program evaluation, quasi-experimental research design, regression discontinuity design, cohort studies

Procedia PDF Downloads 64
48293 Predicting High-Risk Endometrioid Endometrial Carcinomas Using Protein Markers

Authors: Yuexin Liu, Gordon B. Mills, Russell R. Broaddus, John N. Weinstein

Abstract:

The lethality of endometrioid endometrial cancer (EEC) is primarily attributable to high-stage disease. However, there are no available biomarkers that predict EEC patient staging at the time of diagnosis. We aimed to develop a predictive scheme to help in this regard. Using reverse-phase protein array expression profiles for 210 EEC cases from The Cancer Genome Atlas (TCGA), we constructed a Protein Scoring of EEC Staging (PSES) scheme for surgical stage prediction. We validated and evaluated its diagnostic potential in an independent cohort of 184 EEC cases obtained at MD Anderson Cancer Center (MDACC) using receiver operating characteristic (ROC) curve analyses. Kaplan-Meier survival analysis was used to examine the association of PSES score with patient outcome, and Ingenuity pathway analysis was used to identify relevant signaling pathways. Two-sided statistical tests were used. PSES robustly distinguished high- from low-stage tumors in the TCGA cohort (area under the ROC curve [AUC]=0.74; 95% confidence interval [CI], 0.68 to 0.82) and in the validation cohort (AUC=0.67; 95% CI, 0.58 to 0.76). Even among grade 1 or 2 tumors, PSES was significantly higher in high- than in low-stage tumors in both the TCGA (P = 0.005) and MDACC (P = 0.006) cohorts. Patients with a positive PSES score had significantly shorter progression-free survival than those with a negative PSES in the TCGA (hazard ratio [HR], 2.033; 95% CI, 1.031 to 3.809; P = 0.04) and validation (HR, 3.306; 95% CI, 1.836 to 9.436; P = 0.0007) cohorts. The ErbB signaling pathway was the most significantly enriched in the PSES proteins and downregulated in high-stage tumors. PSES may provide clinically useful prediction of high-risk tumors and offer new insights into tumor biology in EEC.
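
The AUC values reported here are equivalent to the probability that a randomly chosen high-stage tumor receives a higher PSES score than a randomly chosen low-stage tumor. A minimal rank-based computation, with hypothetical scores, illustrates this reading of the statistic.

```python
def auc(scores_pos, scores_neg):
    """AUC as the Mann-Whitney probability that a positive (high-stage)
    case outscores a negative (low-stage) case, ties counting 1/2."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            wins += 1.0 if p > n else 0.5 if p == n else 0.0
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical PSES scores; positive values should indicate high stage.
high_stage = [1.2, 0.8, 0.3, -0.1]
low_stage = [-0.9, -0.4, 0.1, 0.5]
value = auc(high_stage, low_stage)   # 0.8125 for these toy scores
```

An AUC of 0.5 corresponds to a useless marker and 1.0 to perfect separation, which is why the reported 0.74 and 0.67 indicate moderate but real discrimination.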

Keywords: endometrial carcinoma, protein, protein scoring of EEC staging (PSES), stage

Procedia PDF Downloads 197
48292 Schoolwide Implementation of Schema-Based Instruction for Mathematical Problem Solving: An Action Research Investigation

Authors: Sara J. Mills, Sally Howell

Abstract:

The field of special education has long struggled to bridge the research-to-practice gap. There is ample evidence from research of effective strategies for students with special needs, but these strategies are not routinely implemented in schools in ways that yield positive results for students. In recent years, the field has turned its focus to implementation science, that is, the study of effective methods of implementing evidence-based practices in school settings. Teacher training is a critical factor in implementation. This study aimed to implement Schema-Based Instruction (SBI) for math problem solving in four classrooms in a special primary school serving students with language deficits, including students with Autism Spectrum Disorders (ASD) and Intellectual Disabilities (ID). Using an action research design that allowed for adjustments and modifications to be made over the year-long study, two cohorts of teachers across the school were trained and supported in six-week learning cycles to implement SBI in their classrooms. The learning cycles included a one-day training followed by six weeks of one-on-one or team coaching and three fortnightly cohort group meetings. After the first cohort of teachers completed the learning cycle, modifications and adjustments were made to lesson materials in an attempt to improve their effectiveness with the second cohort. Fourteen teachers participated in the study, including master special educators (n=3), special education instructors (n=5), and classroom assistants (n=6). Thirty-one students participated in the study (21 boys and 10 girls), ranging in age from 5 to 12 years (M = 9 years). Twenty-one students had a diagnosis of ASD and 20 had a diagnosis of mild or moderate ID, with 13 of these students having both ASD and ID. The remaining students had diagnosed language disorders. To evaluate the effectiveness of the implementation approach, both student and teacher data were collected. 
Student data included pre- and post-tests of math word problem solving. Teacher data included fidelity of treatment checklists and pre-post surveys of teacher attitudes and efficacy for teaching problem solving. Finally, artifacts were collected throughout the learning cycle. Results from cohort 1 and cohort 2 revealed similar outcomes. Students improved in the number of word problems they answered correctly and in the number of problem-solving steps completed independently. Fidelity of treatment data showed that teachers implemented SBI with acceptable levels of fidelity (M = 86%). Teachers also reported increases in the amount of time spent teaching problem solving, their confidence in teaching problem solving and their perception of students’ ability to solve math word problems. The artifacts collected during instruction indicated that teachers made modifications to allow their students to access the materials and to show what they knew. These findings are in line with research that shows student learning can improve when teacher professional development is provided over an extended period of time, actively involves teachers, and utilizes a variety of learning methods in classroom contexts. Further research is needed to evaluate whether these gains in teacher instruction and student achievement can be maintained over time once the professional development is completed.

Keywords: implementation science, mathematics problem solving, research-to-practice gap, schema based instruction

Procedia PDF Downloads 103
48291 Association of Human Immunodeficiency Virus with Incident Autoimmune Hemolytic Anemia: A Population-Based Cohort Study in Taiwan

Authors: Yung-Feng Yen, I-an Jen, Yi-Ming Arthur Chen

Abstract:

The molecular mimicry between human immunodeficiency virus (HIV) proteins and red blood cell (RBC) antigens could induce the production of anti-RBC autoantibodies. However, the association between HIV infection and subsequent development of autoimmune hemolytic anemia (AIHA) remains unclear. This nationwide population-based cohort study aimed to determine the association between incident AIHA and HIV in Taiwan. From 2000 to 2012, we identified adult people living with HIV/AIDS (PLWHA) from the Taiwan Centers for Disease Control HIV Surveillance System. HIV-infected individuals were defined by a positive HIV-1 Western blot. Age- and sex-matched controls without HIV infection were selected from the Taiwan National Health Insurance Research Database for comparison. All patients were followed until Dec. 31, 2012, and observed for occurrence of AIHA. Of 171,468 subjects (19,052 PLWHA, 152,416 controls), 30 (0.02%) had incident AIHA during a mean follow-up of 5.45 years, including 23 (0.12%) PLWHA and 7 (0.01%) controls. After adjusting for potential confounders, HIV infection was found to be an independent risk factor for incident AIHA (adjusted hazard ratio [AHR], 20.9; 95% confidence interval [CI], 8.34-52.3). Moreover, PLWHA receiving HAART were more likely to develop AIHA than those not receiving HAART (AHR, 10.8; 95% CI, 2.90-40.1). Additionally, the risk of AIHA was significantly increased in those taking the efavirenz (AHR, 3.15; 95% CI, 1.18-8.43) or atazanavir (AHR, 6.58; 95% CI, 1.88-22.9) component of HAART. In conclusion, HIV infection is an independent risk factor for incident AIHA. Clinicians need to be aware of the higher risk of AIHA in PLWHA.
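
As a rough plausibility check on the adjusted hazard ratio, the crude incidence rates implied by the abstract's counts and mean follow-up can be computed directly. This back-of-envelope sketch ignores censoring, matching, and adjustment, so it is only meant to show the figures are mutually consistent.

```python
def incidence_per_100k_py(cases, n, mean_followup_years):
    """Crude incidence per 100,000 person-years, approximating
    person-years as n * mean follow-up."""
    return cases / (n * mean_followup_years) * 100_000

rate_plwha = incidence_per_100k_py(23, 19_052, 5.45)    # ~22 per 100,000 PY
rate_ctrl = incidence_per_100k_py(7, 152_416, 5.45)     # ~0.8 per 100,000 PY
crude_ratio = rate_plwha / rate_ctrl
```

The crude rate ratio comes out around 26, the same order of magnitude as the adjusted HR of 20.9, as expected when the confounders only partially attenuate the association.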

Keywords: autoimmune disease , hemolytic anemia, HIV, highly active antiretroviral treatment

Procedia PDF Downloads 206
48290 Continuous Glucose Monitoring Systems and the Improvement in Hypoglycemic Awareness Post-Islet Transplantation: A Single-Centre Cohort Study

Authors: Clare Flood, Shareen Forbes

Abstract:

Background: Type 1 diabetes mellitus (T1DM) is an autoimmune disorder affecting >400,000 people in the UK alone, with the global prevalence expected to double in the next decade. Islet transplant offers a minimally invasive procedure with very low morbidity and almost no mortality, and is now as effective as whole pancreas transplant. The procedure was introduced to the UK in 2011 for patients with the most severe T1DM: those with unstable blood glucose, frequent episodes of severe hypoglycemia, and impaired awareness of hypoglycemia (IAH). Objectives: To evaluate the effectiveness of islet transplantation in improving glycemic control, reducing the burden of hypoglycemia and improving awareness of hypoglycemia through a single-centre cohort study at the Royal Infirmary of Edinburgh. Glycemic control and degree of hypoglycemic awareness were determined and monitored pre- and post-transplantation to assess the effectiveness of the procedure. Methods: A retrospective analysis of data collected over three years from the 16 patients who have undergone islet transplantation in Scotland. Glycated haemoglobin (HbA1c) was measured and continuous glucose monitoring systems (CGMS) were utilised to assess glycemic control, while Gold and Clarke score questionnaires tested IAH. Results: All patients had improved glycemic control following transplant, with optimal control seen at 3 months post-transplant. Glycemic control significantly improved, as illustrated by the percentage of time in hypoglycemia in the months following transplant (p=0.0211) and by HbA1c (p=0.0426). Improved Clarke (p=0.0034) and Gold (p=0.0001) scores indicate improved hypoglycemia awareness following transplant. 
Conclusion: While the small sample of islet transplant recipients at the Royal Infirmary of Edinburgh prevents definitive conclusions, our retrospective single-centre cohort study of 16 patients indicates that islet transplantation can improve glycemic control and reduce the burden of hypoglycemia and IAH post-transplant. These data can be combined with similar trials at other centres to increase statistical power, but from the research in Edinburgh it can be suggested that the minimally invasive procedure of islet transplantation offers selected patients with extremely unstable T1DM the opportunity to regain control of their condition and improve their quality of life.
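
Percentage of time in hypoglycemia, the CGMS-derived metric tested above, is simply the share of (evenly spaced) sensor readings below a hypoglycemia threshold. The short trace and the 3.9 mmol/L (70 mg/dL) threshold below are illustrative assumptions, not study data.

```python
def percent_time_below(readings_mmol, threshold=3.9):
    """Percentage of CGM readings below a hypoglycemia threshold.
    Evenly spaced readings are assumed, so reading share ~ time share."""
    below = sum(1 for g in readings_mmol if g < threshold)
    return 100.0 * below / len(readings_mmol)

# Hypothetical short trace of glucose readings in mmol/L.
trace = [5.6, 4.1, 3.2, 3.7, 6.8, 7.4, 4.9, 3.5]
print(percent_time_below(trace))  # 3 of 8 readings below 3.9 -> 37.5
```

In practice a 24-hour trace at 5-minute intervals contains 288 such readings, and the pre- vs. post-transplant comparison is made on this percentage.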

Keywords: diabetes, islet, transplant, CGMS

Procedia PDF Downloads 244
48289 Gender-Based Violence Public Art Projects: An Analysis of the Value of Including Social Justice Topics in Tertiary Courses

Authors: F. Saptouw

Abstract:

This paper will examine the value of introducing social justice issues into the tertiary fine art curriculum at first-year level. The paper will present details of the conceptual impetus and the logistics related to the execution of a collaborative teaching project. The cohort of students was registered for the Fine Art Foundation course at the Michaelis School of Fine Art at the University of Cape Town. The course is dedicated to the development of critical thinking, communication skills, and varied approaches to knowledge construction within the first-year cohort. A core component of the course is the examination of the representation of gender, identity, politics, and power. These issues are examined within a range of public and private representations such as art galleries, museum spaces, and contemporary popular culture. This particular project was a collaboration with the Office of Inclusivity and Change, and the project leaders were Fabian Saptouw and Gabriel Khan. The paper will conclude by presenting an argument for the importance of such projects within the tertiary environment.

Keywords: art, education, gender-based violence, social responsiveness

Procedia PDF Downloads 110
48288 A Short Dermatoscopy Training Increases Diagnostic Performance in Medical Students

Authors: Magdalena Chrabąszcz, Teresa Wolniewicz, Cezary Maciejewski, Joanna Czuwara

Abstract:

BACKGROUND: Dermoscopy is a clinical tool known to improve the early detection of melanoma and other malignancies of the skin. Over the past few years, melanoma has grown into a disease of socio-economic importance due to its increasing incidence and persistently high mortality rates. Early diagnosis remains the best way to reduce melanoma- and non-melanoma-skin-cancer-related mortality and morbidity. Dermoscopy is a noninvasive technique that consists of viewing pigmented skin lesions through a hand-held lens. This simple procedure increases melanoma diagnostic accuracy by up to 35%. Dermoscopy is currently the standard for clinical differential diagnosis of cutaneous melanoma and for qualifying a lesion for excision biopsy. Like any clinical tool, it requires training for effective use. The introduction of small and handy dermatoscopes contributed significantly to establishing dermatoscopy as a useful first-line tool. Non-dermatologist physicians are well positioned for opportunistic melanoma detection; however, education in the skin cancer examination is limited during medical school and traditionally lecture-based. AIM: The aim of this randomized study was to determine whether adding dermoscopy to the standard fourth-year medical curriculum improves the ability of medical students to distinguish between benign and malignant lesions, and to assess acceptability of and satisfaction with the intervention. METHODS: We performed a prospective study in 2 cohorts of fourth-year medical students at the Medical University of Warsaw. Groups taking the dermatology course were randomly assigned to either cohort A, with limited access to dermatoscopy through their teacher only (1 dermatoscope per 15 people), or cohort B, with full access to dermatoscopy during their clinical classes (1 dermatoscope per 4 people, available constantly) plus a 15-minute dermoscopy tutorial. 
Students in both study arms completed an image-based test of 10 lesions assessing their ability to differentiate benign from malignant lesions, and a post-intervention survey collecting minimal background information, attitudes about the skin cancer examination, and course satisfaction. RESULTS: Cohort B had higher scores than cohort A in recognition of nonmelanocytic (P < 0.05) and melanocytic (P < 0.05) lesions. Medical students who had the opportunity to use a dermatoscope themselves also reported higher satisfaction after the dermatology course than the group with limited access to this diagnostic tool. Moreover, according to our results, they were more motivated to learn dermatoscopy and to use it in their future everyday clinical practice. LIMITATIONS: The number of participants was limited. Further study of the application in clinical practice is still needed. CONCLUSION: Although the use of the dermatoscope in dermatology as a specialty is widely accepted, sufficiently validated clinical tools for the examination of potentially malignant skin lesions are lacking in general practice. Introducing medical students to dermoscopy in the fourth-year curriculum of medical school may improve their ability to differentiate benign from malignant lesions. It can also encourage students to use dermatoscopy in their future practice, which can significantly improve early recognition of malignant lesions and thus decrease melanoma mortality.

Keywords: dermatoscopy, early detection of melanoma, medical education, skin cancer

Procedia PDF Downloads 93
48287 Review of Consecutive Patients Treated with a Combination of Vancomycin and Rifaximin for Diarrhea Predominant Irritable Bowel Syndrome (IBS-D)

Authors: Portia Murphy, Danica Vasic, Anoja W. Gunaratne, Encarnita Sitchon, Teresita Tugonon, Marou Ison, Antoinette Le Busque, Christelle Pagonis, Thomas J. Borody

Abstract:

Irritable bowel syndrome (IBS) is a chronic gastrointestinal disorder that affects an estimated 11% of the population globally, with the most predominant symptoms being abdominal pain, bloating and altered bowel movements. All age groups suffer from IBS, although the prevalence decreases in age groups over 50 years. Women are more likely to suffer from IBS than men. IBS can be categorized into 3 groups based on the type of altered bowel movement: diarrhea-predominant IBS (IBS-D), constipation-predominant IBS (IBS-C) and IBS with mixed bowel habit (IBS-M). The contribution of the gut microbiome to the etiology of IBS is becoming increasingly recognized with the rising use of anti-microbial agents. Previous studies on vancomycin and rifaximin, used as monotherapy or in combination, have been conducted mainly in IBS-C and showed marked improvements in symptoms. To our knowledge, no studies have reported using this combination of antibiotics for IBS-D. Here, we report a consecutive cohort of 18 patients treated with both vancomycin and rifaximin for IBS-D. These patients' records were reviewed retrospectively. In this cohort, patients' ages were between 24 and 74 years (mean 44 years) and 9 were female. At baseline, all patients had diarrhea, 4 with mucus and one with blood. Other reported symptoms were abdominal pain (n=11), bloating (n=9), flatulence (n=7), fatigue (n=4) and nausea (n=3). Treatment was personalized according to symptom severity and tolerability: patients received a combination of rifaximin (500-3000 mg/d) and vancomycin (500-1500 mg/d) for an ongoing period. Follow-ups were conducted at 2-32 weeks. Of all patients, 89% reported improvement of symptoms, 1 reported no change, and 1 patient's symptoms got worse. The mechanisms of action of vancomycin and rifaximin involve the inhibition of bacterial cell wall synthesis and protein synthesis, respectively. 
The role of these medications in improving the symptoms of this cohort suggests that IBS-D may be microbiome infection driven. In this cohort, similar patient presentations to Clostridium difficile, as well as symptom improvement with the use of rifaximin and particularly vancomycin, suggest that the infectious agent may be an unidentified Clostridium. These preliminary results offer an alternative etiology for IBS-D not previously considered and open the avenue for new research.

Keywords: clostridium difficile, diarrhea-predominant irritable bowel syndrome, microbiome, vancomycin/rifaximin combination

Procedia PDF Downloads 100
48286 Prophylactic Replacement of Voice Prosthesis: A Study to Predict Prosthesis Lifetime

Authors: Anne Heirman, Vincent van der Noort, Rob van Son, Marije Petersen, Lisette van der Molen, Gyorgy Halmos, Richard Dirven, Michiel van den Brekel

Abstract:

Objective: Voice prosthesis leakage significantly impacts laryngectomy patients' quality of life, causing insecurity, frequent unplanned hospital visits, and costs. In this study, the concept of prophylactic voice prosthesis replacement was explored to prevent leakages. Study Design: Retrospective cohort study. Setting: Tertiary hospital. Methods: Device lifetimes and voice prosthesis replacements in a retrospective cohort, including all patients who underwent laryngectomy between 2000 and 2012 at the Netherlands Cancer Institute, were used to calculate the number of voice prostheses needed per patient per year if 70% of leakages were to be prevented by prophylactic replacement. Various strategies for the timing of prophylactic replacement were considered: adaptive strategies based on the individual patient's replacement history, and fixed strategies based on the results of patients with similar voice prosthesis or treatment characteristics. Results: Patients used a median of 3.4 voice prostheses per year (range 0.1-48.1), with high inter- and intrapatient variability in device lifetime. With prophylactic replacement, this would become a median of 9.4 voice prostheses per year, i.e., replacement every 38 days, implying more than six additional voice prostheses per patient per year. The individual adaptive model showed that preventing 70% of leakages was impossible for most patients; only a median of 25% could be prevented. Monte Carlo simulations showed that prophylactic replacement is not feasible due to the high coefficient of variation (standard deviation/mean) in device lifetime. Conclusion: Based on our simulations, prophylactic replacement of voice prostheses is not feasible due to high inter- and intrapatient variation in device lifetime.
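The Monte Carlo reasoning in this abstract can be sketched in a few lines of Python: with a fixed prophylactic replacement interval, the fraction of leakages prevented falls as the coefficient of variation (CV) of device lifetime grows. This is only an illustrative sketch, not the authors' actual simulation; the lognormal distribution, the 107-day mean lifetime (≈3.4 replacements/year), and the CV values are assumptions chosen for demonstration.

```python
import math
import random

random.seed(42)

def fraction_prevented(cv, replace_day, mean_life=107.0, n=20_000):
    """Fraction of leakages prevented when every prosthesis is replaced
    prophylactically at `replace_day` days, for lifetimes drawn from a
    lognormal distribution with the given mean and coefficient of
    variation (CV = standard deviation / mean)."""
    # Parameterise the lognormal to hit the target mean and CV.
    sigma2 = math.log(1.0 + cv ** 2)
    mu = math.log(mean_life) - sigma2 / 2.0
    sigma = math.sqrt(sigma2)
    lifetimes = (random.lognormvariate(mu, sigma) for _ in range(n))
    # A leakage is prevented only if the device would have outlived
    # the scheduled replacement day.
    return sum(t > replace_day for t in lifetimes) / n

# Replacing every 38 days (the 9.4 prostheses/year figure reported above):
for cv in (0.3, 1.0, 2.0):
    print(f"CV={cv:.1f}: {fraction_prevented(cv, 38):.0%} of leakages prevented")
```

Even this crude sketch reproduces the qualitative conclusion: at low CV a fixed schedule pre-empts almost every leakage, while at high CV many devices fail before the scheduled replacement day, so a large share of leakages cannot be prevented.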

Keywords: voice prosthesis, voice rehabilitation, total laryngectomy, prosthetic leakage, device lifetime

Procedia PDF Downloads 99
48285 Relationship between Different Heart Rate Control Levels and Risk of Heart Failure Rehospitalization in Patients with Persistent Atrial Fibrillation: A Retrospective Cohort Study

Authors: Yongrong Liu, Xin Tang

Abstract:

Background: Persistent atrial fibrillation is a common arrhythmia closely related to heart failure, and heart rate control is an essential strategy in its treatment; still, the relationship between different heart rate control levels and the risk of heart failure rehospitalization is poorly understood. Objective: To determine the relationship between different levels of heart rate control and the risk of readmission for heart failure in patients with persistent atrial fibrillation. Methods: We conducted a retrospective dual-centre cohort study, collecting data from patients with persistent atrial fibrillation who received outpatient treatment at two tertiary hospitals in central and western China from March 2019 to March 2020. The collected data included age, gender, body mass index (BMI), medical history, and hospitalization frequency due to heart failure. Patients were divided into three groups based on their heart rate control level: Group I, resting heart rate below 80 beats per minute; Group II, resting heart rate between 80 and 100 beats per minute; and Group III, resting heart rate above 100 beats per minute. Readmission rates due to heart failure within one year after discharge were analyzed after 1:1 propensity score matching, and differences in readmission rates among the groups were compared using one-way ANOVA. The impact of the level of heart rate control on the risk of readmission for heart failure was assessed using the Cox proportional hazards model, and binary logistic regression was employed to control for potential confounding factors. Results: We enrolled a total of 1136 patients with persistent atrial fibrillation. One-way ANOVA showed differences in readmission rates among groups with different levels of heart rate control.
The readmission rates due to heart failure were: Group I (n=432): 31 (7.17%); Group II (n=387): 43 (11.11%); Group III (n=317): 90 (28.50%) (F=54.3, P<0.001). After 1:1 propensity score matching across the groups, 223 pairs were obtained. Cox proportional hazards analysis showed that, compared with Group I, the hazard ratio for readmission was 1.372 for Group II (95% CI: 1.125-1.682, P<0.001) and 2.053 for Group III (95% CI: 1.006-5.437, P<0.001). Furthermore, binary logistic regression including digoxin use, hypertension, smoking, coronary heart disease, and chronic obstructive pulmonary disease (COPD) as independent variables revealed that coronary heart disease and COPD also had a significant impact on readmission due to heart failure (p<0.001). Conclusion: The level of heart rate control in patients with persistent atrial fibrillation is positively associated with the risk of heart failure rehospitalization; reasonable heart rate control may significantly reduce this risk.
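The 1:1 propensity score matching step described in this abstract (and in the cardiac-surgery study above) can be sketched as a greedy nearest-neighbour pass over estimated propensity scores. In practice the scores come from a model fitted on the covariates (age, gender, BMI, medical history); here the scores, patient IDs, and the 0.05 caliper are purely illustrative assumptions.

```python
def nearest_neighbour_match(treated, control, caliper=0.05):
    """Greedy 1:1 nearest-neighbour matching on propensity scores.
    `treated` and `control` are lists of (patient_id, score) pairs.
    Each control is used at most once, and a pair is kept only if the
    score gap is within the caliper."""
    pairs = []
    available = list(control)
    # Process treated patients in score order for determinism.
    for tid, ts in sorted(treated, key=lambda t: t[1]):
        if not available:
            break
        best = min(available, key=lambda c: abs(c[1] - ts))
        if abs(best[1] - ts) <= caliper:
            pairs.append((tid, best[0]))
            available.remove(best)  # each control matched at most once
    return pairs

# Toy example with hypothetical propensity scores:
treated = [("t1", 0.30), ("t2", 0.52)]
control = [("c1", 0.29), ("c2", 0.55), ("c3", 0.90)]
print(nearest_neighbour_match(treated, control))
# [('t1', 'c1'), ('t2', 'c2')] -- c3 has no close treated partner and is dropped
```

Unmatched patients are discarded, which is why the matched analysis above proceeds on 223 pairs rather than the full cohort of 1136.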

Keywords: heart rate control levels, heart failure rehospitalization, persistent atrial fibrillation, retrospective cohort study

Procedia PDF Downloads 45
48284 The Impact of CYP2C9 Gene Polymorphisms on Warfarin Dosing

Authors: Weaam Aldeeban, Majd Aljamali, Lama A. Youssef

Abstract:

Background & Objective: Warfarin is considered a problematic drug due to its narrow therapeutic window and wide inter-individual variation in response, attributed to demographic, environmental, and genetic factors, particularly single nucleotide polymorphisms (SNPs) in the genes encoding VKORC1 and CYP2C9, which are involved in warfarin's mechanism of action and metabolism, respectively. The CYP2C9*2 (rs1799853) and CYP2C9*3 (rs1057910) alleles are linked to reduced enzyme activity; carriers of either or both alleles are classified as moderate or slow metabolizers and therefore exhibit higher sensitivity to warfarin compared with the wild type (CYP2C9*1*1). Our study aimed to assess the frequency of the *1, *2, and *3 alleles of the CYP2C9 gene in a cohort of Syrian patients receiving a maintenance dose of warfarin for different indications, the impact of genotype on warfarin dosing, and the frequency of adverse effects (i.e., bleeding). Subjects & Methods: This retrospective cohort study encompassed 94 patients treated with warfarin. Patients' genotypes were identified by sequencing CYP2C9-specific polymerase chain reaction (PCR) products, and the effects on warfarin therapeutic outcomes were investigated. Results: Sequencing revealed that 43.6% of the study population carried the *2 and/or *3 SNPs. The mean weekly maintenance dose of warfarin was 37.42 ± 15.5 mg for patients with the wild-type genotype (CYP2C9*1*1), whereas patients with one or both variants (*2 and/or *3) required a significantly lower dose (28.59 ± 11.58 mg) (P=0.015). A higher percentage (40.7%) of patients carrying the *2 and/or *3 allele experienced hemorrhagic events, compared with only 17.9% of patients with the wild type *1*1 (P=0.04). Conclusions: Our study demonstrates an association between the *2 and *3 genotypes and higher sensitivity to warfarin and a tendency to bleed, which necessitates lowering the dose.
These findings emphasize the significance of CYP2C9 genotyping prior to commencing warfarin therapy in order to achieve optimal and faster dose control and to ensure effectiveness and safety.
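The genotype-to-phenotype grouping this abstract relies on (wild type versus carriers of one or both reduced-function alleles) can be written as a small lookup. This is a minimal sketch: the function name and the star-allele string encoding are illustrative assumptions, and the normal/moderate/slow labels follow the abstract's own wording.

```python
def cyp2c9_metabolizer(allele1, allele2):
    """Classify CYP2C9 metabolizer status from two star alleles,
    following the grouping in the abstract: *1/*1 is wild type
    ("normal"); one reduced-function allele (*2 or *3) gives a
    "moderate" metabolizer; two give a "slow" metabolizer."""
    reduced = {"*2", "*3"}
    n_reduced = sum(a in reduced for a in (allele1, allele2))
    return ("normal", "moderate", "slow")[n_reduced]

# Carriers of *2 and/or *3 are the group that required a lower dose:
print(cyp2c9_metabolizer("*1", "*1"))  # normal
print(cyp2c9_metabolizer("*1", "*3"))  # moderate
print(cyp2c9_metabolizer("*2", "*3"))  # slow
```

A pre-treatment genotyping workflow would branch on this classification to pick a starting maintenance dose, with the moderate and slow groups starting lower, consistent with the dose difference reported above.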

Keywords: warfarin, CYP2C9, polymorphisms, Syrian, hemorrhage

Procedia PDF Downloads 123